
Iron Man Knows It’s the Age of Cold Fusion


In last year’s adaptation of Marvel Comics’ Iron Man, billionaire inventor Tony Stark — a goateed playboy, all aswagger — creates a palm-sized generator that can produce massive amounts of nuclear power in order to fuel his cyborg-like metallic exoskeleton. Stark calls it an “arc reactor” — an extremely efficient power supply that generates huge amounts of energy without consuming typical fuel or producing much waste. For those who follow such things, Tony Stark’s arc reactor is as close to science’s elusive griffin — the cold fusion reactor — as you can get.

Both scientists and science fiction writers have long looked to cold fusion as the solution to most of mankind’s ills. Fusion occurs when two atomic nuclei come together to form a single, heavier nucleus. But as far as science has been able to prove, it only happens in extremely violent, high-energy situations: A fusion reactor, for example, or the core of the sun; hot places, in other words.
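(For the curious, here’s what that looks like on paper. These are the textbook deuterium-deuterium fusion branches, which split roughly 50/50; they’re a standard physics illustration, not something pulled from the film or any particular experiment.)

```latex
% Standard deuterium-deuterium fusion branches (roughly a 50/50 split):
\begin{align}
  \mathrm{D} + \mathrm{D} &\;\rightarrow\; {}^{3}\mathrm{He}\,(0.82\ \mathrm{MeV}) + \mathrm{n}\,(2.45\ \mathrm{MeV}) \\
  \mathrm{D} + \mathrm{D} &\;\rightarrow\; \mathrm{T}\,(1.01\ \mathrm{MeV}) + \mathrm{p}\,(3.02\ \mathrm{MeV})
\end{align}
```

That 2.45 MeV neutron in the first branch matters later in this story: energetic neutrons are one of the telltale signatures experimenters look for when they claim fusion is happening.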

Cold fusion, on the other hand (and as its name suggests), is cold: The fusion reaction occurs at room temperature, and the potential benefits for mankind are therefore huge. If cold fusion is possible and can be mastered, every home and automobile in America could produce its own electricity with a small generator. It’s free energy in a test tube.

Iron Man isn’t the only flick to latch on to cold fusion as its scifi MacGuffin. The 1997 adaptation of Leslie Charteris’ pulp character The Saint had the eponymous master thief purloining the formula in order to create cheap heat for the freezing Russian people. And 1996’s Keanu Reeves vehicle Chain Reaction similarly deals with a plot to steal the secret to bubble fusion, which is a form of nuclear fusion hypothesized to occur within a collapsing bubble at room temperature (although it’s not quite cold fusion, since the core of an imploding bubble is, counter-intuitively, extremely hot).

But despite Hollywood’s fascination with the technology, is cold fusion possible in the real world? The Navy’s Space and Naval Warfare Systems Command (SPAWAR) seems to think so: It’s been cramming cold fusion experiments into its research program for quite some time. Likewise, DARPA funded an experiment just last year to muck about with cold fusion. But both agencies have traditionally been mum about their experiments. Cold fusion, however desirable, is met with a collective scoff by most atomic scientists, mostly fallout from a 1989 announcement that the technology had been mastered, a claim that was later discredited.

But on March 22nd, 2009, during a four-day symposium on “New Energy Technology”, SPAWAR broke its silence, claiming to have detected energetic neutrons in a palladium-deuterium co-deposition cell using CR-39, a plastic polymer best known as the material in eyeglass lenses. If that’s too technical, here’s the eye-popping gist: SPAWAR was claiming to have detected the first cold fusion reaction since the embarrassment of 1989’s false announcement.

Has SPAWAR finally cracked mankind’s energy woes? There’s plenty of reason to be skeptical: “[The research paper] fails to provide a theoretical rationale to explain how fusion could occur at room temperatures,” says Rice University physicist Paul Padley. “The whole point of fusion is, you’re bringing things of like charge together. As we all know, like things repel, and you have to overcome that repulsion somehow.”
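To put rough numbers on Padley’s objection, here’s a back-of-the-envelope calculation (ours, not his): compare the electrostatic barrier two deuterium nuclei have to overcome with the thermal energy actually available at room temperature. The 2-femtometer separation is a conventional ballpark for where the strong nuclear force takes over, not a figure from the SPAWAR paper.

```python
# Back-of-the-envelope: Coulomb barrier between two deuterons vs. the
# thermal energy available at room temperature. Ballpark figures only.

COULOMB_CONSTANT = 8.988e9     # N*m^2/C^2
ELEMENTARY_CHARGE = 1.602e-19  # C
BOLTZMANN = 1.381e-23          # J/K
JOULES_PER_EV = 1.602e-19

# Two deuterons must approach within a few femtometers before the strong
# force can bind them; ~2 fm is a common rough figure.
separation_m = 2e-15

# Electrostatic potential energy of two +1 charges at that separation.
barrier_j = COULOMB_CONSTANT * ELEMENTARY_CHARGE**2 / separation_m
barrier_kev = barrier_j / JOULES_PER_EV / 1e3

# Typical thermal energy per particle at room temperature (~300 K).
thermal_j = BOLTZMANN * 300
thermal_ev = thermal_j / JOULES_PER_EV

print(f"Coulomb barrier:         ~{barrier_kev:.0f} keV")        # roughly 700 keV
print(f"Room-temperature energy: ~{thermal_ev:.3f} eV")          # roughly 0.026 eV
print(f"Shortfall factor:        ~{barrier_j / thermal_j:.0e}")  # tens of millions
```

In other words, the repulsion Padley mentions is tens of millions of times stronger than the push a room-temperature particle can muster, which is why any claim of fusion in a tabletop cell gets such a skeptical reception.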

Only scientists duplicating the experiment in a lab will be able to tell if SPAWAR’s cold fusion is for real, or if the neutrons were produced through some other method. But we can only hope: Cold fusion represents an essentially infinite source of clean energy (buck up, Russia, heat’s on its way!), and as scifi fans, we’re all twitching to join Tony Stark’s elusive club of flying, repulsor-blasting mech suit adventurers. Cold fusion is the first step towards red and gold robot suits for all.
