70 Years On, Supercomputing Helping Clean Up Manhattan Project Waste
May 16, 2016
More than seven decades after the end of WWII, radioactive waste from the Manhattan Project is still awaiting cleanup.
Progress at sites around the country—the largest is Hanford in southeastern Washington—has been slow, costly and plagued with problems.
Cleaning up radioactive waste is incredibly complicated. It’s a little like separating certain grains of sand from the rest of a beach. And it’s not enough to separate radioactive elements from other waste. Some elements stay radioactive for thousands of years; others for hundreds. Scientists have to separate these from one another for safe storage.
The job is so complicated that some of the cleanup methods have yet to be invented.
Now a coalition of scientists is using GPU-accelerated supercomputing to better understand the radioactive materials inside storage tanks and find safe, inexpensive ways to remove and store them.
“When they built atomic weapons, nobody knew how dangerous this stuff was,” said David Dixon, a chemistry professor at The University of Alabama who is principal investigator on the project.
Supercomputing Speeds Experiments
The scientists are using one of the world’s most powerful supercomputers, Titan at Oak Ridge National Laboratory, to study the chemistry of radioactive elements called actinides — uranium, plutonium and other metals that release huge amounts of energy when their atoms are split. Equipped with NVIDIA Tesla GPUs, Titan gives scientists the speed they need to conduct many experiments in a short time.
Actinides are highly radioactive, which makes them difficult to handle in a lab. But they must be separated out before the rest of the waste can be decontaminated.
With Titan, scientists can simulate how actinides respond chemically to different methods of removing them from the rest of the waste. These simulations are designed to help researchers develop new ways to decontaminate it.
Why Cleanup Is So Hard
The Hanford site manufactured plutonium for the bomb dropped on Nagasaki in 1945. Today, it holds 56 million gallons of highly radioactive, chemically hazardous waste in aging storage tanks, about a third of which are leaking. There’s also waste in the ground, some dumped and some leaked. Other sites holding waste from the dawn of the atomic age include Los Alamos, New Mexico, and St. Louis.
The problem is that scientists don’t know exactly what’s inside the storage tanks or in the ground, Dixon said, and actinide chemistry itself is poorly understood. Some of the Hanford waste is unique to that site, so scientists have to create cleanup methods before work can begin.
As Oak Ridge National Lab moves to its next-generation supercomputer, Dixon said the team will tailor its software to take full advantage of GPUs.
“We want to get reliable predictions of complex phenomena, and that takes a lot of computation,” Dixon said. Putting more code on GPUs will allow scientists to simulate larger, more complex systems.
Besides the University of Alabama, the coalition includes researchers from Lawrence Berkeley National Laboratory and other universities, among them Washington State University, The State University of New York at Buffalo, the University of Minnesota and Rice University. The research was supported by the U.S. Department of Energy’s Office of Basic Energy Sciences.