by Axel Koehler

Life’s full of unanswered questions, but at least one is being addressed with the help of a new GPU supercomputer at Bielefeld University, near Hannover, Germany.

The university is delving deep into the physics of matter in the moments after the Big Bang. To carry out the complex simulations needed to research quantum chromodynamics (QCD) – the theory of strongly interacting matter in extreme states – Bielefeld has built an NVIDIA GPU-powered supercomputer with the help of NVIDIA and sysGen.

Bielefeld University’s new GPU supercomputer.
[Image Courtesy of Bielefeld University]

Bielefeld’s supercomputer will use a total of 400 GPUs to deliver 500 teraflops of processing power. To keep the university at the forefront of QCD research, the new GPU cluster will provide 125 times the performance of its predecessor, a 4,000-gigaflop (4-teraflop) supercomputer built in 2005.
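The quoted figures are internally consistent, as a quick back-of-the-envelope check shows (an illustrative sketch using only the numbers in the article, not measured benchmarks):

```python
# Sanity check of the performance figures quoted above.
# All values come from the article; nothing here is a measurement.

OLD_SYSTEM_GFLOPS = 4_000   # 2005 supercomputer: 4,000 gigaflops
NEW_SYSTEM_TFLOPS = 500     # new GPU cluster: 500 teraflops
NUM_GPUS = 400

old_tflops = OLD_SYSTEM_GFLOPS / 1_000        # convert gigaflops to teraflops
speedup = NEW_SYSTEM_TFLOPS / old_tflops      # 500 / 4
per_gpu_tflops = NEW_SYSTEM_TFLOPS / NUM_GPUS # average contribution per GPU

print(f"Speedup over 2005 system: {speedup:.0f}x")      # 125x
print(f"Average per GPU: {per_gpu_tflops:.2f} TFLOPS")  # 1.25 TFLOPS
```

The 125× figure follows directly from 500 teraflops divided by 4 teraflops; the per-GPU average of 1.25 teraflops is simply the aggregate divided by the GPU count.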

The system was announced at a conference at Bielefeld, where leading scientists discussed the benefits of GPU computing with a packed audience of more than 175 students and academics from all over Europe.

GPU evangelist Richard Brower, of Boston University, focused on the relevance of GPUs to QCD computing. Speakers repeatedly stressed the importance of energy efficiency in parallel computing performance.

A QCD simulation showing quark and gluon field fluctuations.

The universe, in the moments after its inception, was extremely hot and dense. Under these conditions, the elementary particles that make up protons and neutrons, known as “quarks”, formed an extreme state of matter. The world’s largest particle accelerators, including those at CERN in Geneva, are being used to study this state of matter experimentally. The Bielefeld University GPU supercomputer will be used to research it through simulations.

“We are excited about the new possibilities the GPU cluster will bring to research”, says Edwin Laermann, a professor of theoretical physics at Bielefeld. “Using GPUs in this system also means lower power consumption for better energy efficiency. We have been working with CUDA for some years, and this had a strong bearing on our decision to use NVIDIA GPUs.”