Come for Supercomputers, Stay for Science: 3 Amazing Talks We’re Hosting at SC14

by Brian Caulfield

Feeds. Speeds. Code. That’s not what supercomputing is about. At least, it’s not all that it’s about.

It’s also about detecting breast cancer, modeling the galaxy, and saving money on fuel. In a word, science.

That’s why we’re bringing more than 40 speakers to our GPU Technology Theater at SC14, the annual supercomputing conference in New Orleans, starting Nov. 17.

Many of our speakers are using GPUs to accomplish things computational researchers have never done before.

Here’s a quick look at just three of the talks you’ll be able to hear at our booth, 1727, starting Monday and running throughout the week.

Better Breast Cancer Detection – Researcher Dan Ciresan of IDSIA, the Swiss artificial intelligence lab, will talk about how GPUs accelerate the training of deep convolutional neural networks that recognize mitosis – cell division – in breast cancer histology images. His approach won both the ICPR 2012 mitosis detection competition and the MICCAI 2013 Grand Challenge on Mitosis Detection, outperforming the other contestants and achieving a level of accuracy comparable to that of professional histologists. (Tuesday, Nov. 18, 2:30pm CT)
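At its core, this kind of detector works as a sliding-window pixel classifier: a network scores a patch of tissue centered on each location, producing a mitosis probability map over the image. The sketch below illustrates only that scan-and-score loop; the `score_patch` function, patch size, and stride are stand-ins of our own, not details from Ciresan’s talk, and a trained CNN would replace the toy scorer.

```python
import numpy as np

def mitosis_probability_map(image, score_patch, patch=21, stride=8):
    """Slide a window over a grayscale image and score each patch.

    score_patch is a stand-in for the trained classifier: it maps a
    (patch, patch) array to a probability in [0, 1].
    """
    half = patch // 2
    # Pad so that every scored pixel has a full patch around it.
    padded = np.pad(image, half, mode="reflect")
    h, w = image.shape
    ys = list(range(0, h, stride))
    xs = list(range(0, w, stride))
    probs = np.zeros((len(ys), len(xs)))
    for i, y in enumerate(ys):
        for j, x in enumerate(xs):
            window = padded[y:y + patch, x:x + patch]
            probs[i, j] = score_patch(window)
    return probs

# Toy scorer: mean intensity stands in for the trained network.
img = np.random.rand(64, 64)
pmap = mitosis_probability_map(img, lambda p: float(p.mean()))
```

The expensive part in practice is evaluating the network at every window position, which is exactly the embarrassingly parallel workload GPUs excel at.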

Using GPUs to Save Fuel – Small savings add up fast in the world of jet engines. That’s why GE has moved its in-house computational fluid dynamics software, known as Tacoma, to the GPU-powered Titan supercomputer at Oak Ridge National Laboratory. Brian Mitchell, a senior engineer at GE Global Research, will talk about the process of porting Tacoma to Titan, and how Titan’s massive GPU acceleration will help GE improve the efficiency of turbine engines. (Wednesday, Nov. 19, 2pm CT)

Modeling Galaxies with GPUs – Modeling the development of entire galaxies may sound like science fiction, but it’s a reality, thanks in part to GPUs. Simon Portegies Zwart, a professor of computational astrophysics in the Netherlands, will talk about how scientists are doing just that. A pair of high-performance, GPU-accelerated supercomputers – Piz Daint at the Swiss National Supercomputing Centre and Titan at Oak Ridge National Laboratory – have simulated the formation of the Milky Way for the first time, using some 242 billion bodies. This groundbreaking work made Portegies Zwart and his colleagues finalists for the coveted Gordon Bell Prize, often called the Nobel Prize of computational science. (Wednesday, Nov. 19, 4pm CT)
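Galaxy simulations like this are gravitational N-body problems: every body feels the pull of every other. The sketch below shows one softened, direct-summation leapfrog step in NumPy as a minimal illustration of the physics only – production codes reaching hundreds of billions of bodies rely on GPU-accelerated tree algorithms rather than this O(N²) sum, and all names and parameters here are our own.

```python
import numpy as np

def nbody_step(pos, vel, mass, dt, eps=1e-3, G=1.0):
    """One leapfrog step of direct-summation, softened gravity.

    pos, vel: (n, 3) arrays; mass: (n,) array. The softening length
    eps avoids singular forces when two bodies pass close together.
    """
    # Pairwise separation vectors: r[i, j] = pos[j] - pos[i]
    r = pos[None, :, :] - pos[:, None, :]
    d2 = (r ** 2).sum(axis=-1) + eps ** 2    # softened squared distances
    inv_d3 = d2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)            # a body exerts no self-force
    acc = G * (r * (mass[None, :, None] * inv_d3[:, :, None])).sum(axis=1)
    vel = vel + acc * dt                     # kick
    pos = pos + vel * dt                     # drift
    return pos, vel

rng = np.random.default_rng(0)
n = 100
pos = rng.normal(size=(n, 3))
vel = np.zeros((n, 3))
mass = np.full(n, 1.0 / n)
pos, vel = nbody_step(pos, vel, mass, dt=0.01)
```

Even this toy version hints at why GPUs matter: the force calculation is a dense, uniform arithmetic workload that maps naturally onto thousands of parallel cores.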

For more, you can browse all the talks we’ll be hosting in the GPU Technology Theater here.