They say that good things come in threes, and the U.S. is definitely banking on the Lawrence Livermore National Laboratory (LLNL) to deliver just that when it comes to nuclear fusion. Having achieved a second successful fusion ignition with an energy surplus (meaning that more energy was produced than was required to trigger the fusion reaction itself) at the lab on July 30th, the U.S. now aims to spur research and facilitate a successful third ignition, and beyond. To do that, the country is ready to invest a further $112M across a dozen supercomputing projects.
Fusion (short for nuclear fusion) refers to fusing two light atoms into a single, heavier one: a process that, when successful, releases massive amounts of energy, carried away mostly by fast neutrons and helium nuclei. Unlike fission (which works by splitting heavy elements such as uranium or plutonium), nuclear fusion is expected to be a safe, nearly unlimited source of energy. Done right, fusing two light atoms (such as deuterium and tritium, each a hydrogen isotope that carries additional neutrons compared to “plain” hydrogen) yields roughly four times more energy per kilogram of fuel than fission can generate, and about four million times more than burning coal on a per-kilogram basis; its merits are obvious.
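Those multiples are easy to sanity-check. Below is a rough back-of-envelope sketch in Python, using textbook values (17.6 MeV per D-T reaction, roughly 200 MeV per U-235 fission, roughly 24 MJ/kg for coal) drawn from standard references rather than from the announcement itself; the exact multiples depend on which heating values you assume.

```python
# Back-of-envelope comparison of per-kilogram energy densities.
# All input figures are textbook values, not from the DoE announcement.

MEV_TO_J = 1.602176634e-13    # joules per MeV
AMU_TO_KG = 1.66053907e-27    # kilograms per atomic mass unit

# D-T fusion: deuterium (~2.014 u) + tritium (~3.016 u) releases 17.6 MeV
fusion_j_per_kg = (17.6 * MEV_TO_J) / ((2.014 + 3.016) * AMU_TO_KG)

# U-235 fission: ~200 MeV released per nucleus of ~235 u
fission_j_per_kg = (200.0 * MEV_TO_J) / (235.0 * AMU_TO_KG)

coal_j_per_kg = 24e6  # typical heating value of coal, ~24 MJ/kg

print(f"fusion:  {fusion_j_per_kg:.2e} J/kg")                            # ~3.4e14
print(f"fission: {fission_j_per_kg:.2e} J/kg")                           # ~8.2e13
print(f"fusion vs. fission: {fusion_j_per_kg / fission_j_per_kg:.1f}x")  # ~4.1x
print(f"fusion vs. coal:    {fusion_j_per_kg / coal_j_per_kg:.1e}x")     # ~1e7x
```

With these inputs, the fusion-to-fission ratio lands right around the quoted four times, and the fusion-to-coal ratio comes out in the millions, the same ballpark as the commonly cited figure.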
It’s on the back of that promise that the Scientific Discovery through Advanced Computing (SciDAC) partnership brings together two pre-existing Department of Energy programs, Fusion Energy Sciences (FES) and Advanced Scientific Computing Research (ASCR), with the aim of streamlining efforts to solve complex fusion energy problems using supercomputing resources, including exascale systems.
“The modeling and simulation work of these partnerships will offer insight into the multitude of physical processes that plasmas experience under extreme conditions and will also guide the design of fusion pilot plants,” said DoE Associate Director of Science for FES, Jean Paul Allain.
There’s still a lot of work left, however, before a sustained, surplus-energy ignition actually rockets humanity into a clean, energy-conscious and abundant future. The July 30th ignition did produce more energy than was delivered into the light-atom fuel capsule (although it’s not yet known by how much it beat the 2.05 megajoules in, 3.15 megajoules out achieved in December 2022), but that only accounts for the energy delivered onto the pellet itself. Unfortunately, the way that energy is delivered (via 192 lasers) is still extremely inefficient: LLNL needed to draw a staggering 322 megajoules from the grid to fire the lasers themselves, which still left the process at an overall energy deficit.
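To make that bookkeeping concrete, here is a minimal sketch of the two different gain figures in play, using the December 2022 numbers quoted above (the variable names are our own):

```python
# Target gain vs. facility ("wall-plug") gain for the December 2022 shot.
# Figures are the ones quoted in this article.

laser_energy_on_target_mj = 2.05   # laser energy delivered to the fuel capsule
fusion_yield_mj = 3.15             # fusion energy the capsule released
wall_plug_energy_mj = 322.0        # grid energy needed to fire the 192 lasers

target_gain = fusion_yield_mj / laser_energy_on_target_mj    # ~1.54
facility_gain = fusion_yield_mj / wall_plug_energy_mj        # ~0.0098

print(f"target gain (ignition means > 1.0):          {target_gain:.2f}")
print(f"facility gain (plant-level break-even, 1.0): {facility_gain:.4f}")
```

Crossing 1.0 on the first number is what counts as ignition; a practical power plant would eventually need the second number to exceed 1.0 as well.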
But the way forward is to better understand the quantum-scale processes surrounding fusion. Until quantum computers can provide a viable computing platform to crack that code (and there’s no telling how long that will take, though it’s likely a decade or more away), classical supercomputers are the best tool we have to look into the ordered chaos of processes that occur when the lasers strike the pellet.
The $112M will certainly be a boon there, but it definitely won’t be enough. Yet we humans have this strange way of looking farther ahead, of chasing the carrot, rather than simply focusing on what is right in front of us. This grant is part of that push, and a healthy injection into the High-Performance Computing (HPC) landscape, however small a slice of the total it ultimately turns out to be.