How AI, Machine Learning Are Advancing Academic Research

Academics are taking up GPUs, data science and AI to advance research.
by Cheryl Martin

Insulin. The polio vaccine. The periodic table of elements. Countless discoveries across every field of research have their origins in academia.

Universities and research institutes around the world are key drivers of discovery and innovation, with professors and researchers looking for answers to the biggest questions facing each academic discipline.

With powerful GPU computing resources, academics can use AI, machine learning and data science to more swiftly advance knowledge in their respective fields.

How AI Is Used in Astrophysics and Astronomy

Innumerable questions remain about the origins of the universe, and about the workings of cosmic bodies such as black holes. A team at the University of Toronto is harnessing deep learning to parse satellite images of lunar craters, helping scientists evaluate theories of solar system history.

Running on NVIDIA GPUs on SciNet HPC Consortium’s P8 supercomputer, the neural network was able to spot 6,000 new craters in just a few hours — nearly double the number that scientists have manually identified over decades of research.

At the National Center for Supercomputing Applications at the University of Illinois, Urbana-Champaign, researchers are using deep learning to detect and analyze gravitational waves, which are caused by massive stellar events like the collision of black holes.

And scientists at the University of California, Santa Cruz, and Princeton University have been using NVIDIA GPUs to gain a better understanding of galaxy formation.

How GPUs Are Used for Biology

Deep learning is also giving scientists powerful tools to understand organisms back on Earth. Researchers from the Smithsonian Institution in the U.S. and the Costa Rica Institute of Technology are using big data analytics and GPU-accelerated deep learning for plant identification, training an image classification model to identify species from digitized museum specimens.

University of Maryland researchers are using NVIDIA GPUs to power phylogenetic inference, the study of organisms’ evolutionary history. Using a software tool called BEAGLE, the team examines underlying connections between different viruses.

And at Australia’s Monash University, researchers are developing superdrugs to fight antibiotic-resistant superbugs, using cryo-electron microscopy, a technique that lets them analyze molecules at extremely high resolution. Using a supercomputer powered by more than 150 NVIDIA GPUs, the team is able to resolve its image models in days instead of months.

How AI Is Used in Earth and Climate Science

Geologists and climate scientists work with streams of data to analyze natural phenomena and predict how the environment will change over time.

Hundreds of natural disasters occur each year, striking different corners of the world. While some, like hurricanes, can be spotted days before hitting land, others, like earthquakes and tornadoes, take humans by surprise.

At Caltech, researchers are using deep learning to analyze seismograms from more than 250,000 earthquakes. This work could lead to the development of an earthquake early warning system that can warn government agencies, transportation officials and energy companies when an earthquake is on the way — giving them time to mitigate damage by shutting off trains and power lines.

In the aftermath of a natural disaster, deep learning can be used to analyze satellite imagery to gauge impact and help first responders direct their efforts to the areas that need it most. DFKI, the German Research Center for Artificial Intelligence, is using the NVIDIA DGX-2 AI supercomputer to do just that.

Climate scientists, too, rely heavily on GPUs to crunch complex datasets and project global temperature decades into the future. A researcher at Columbia University is using deep learning to better represent clouds in climate models, enabling a finer-resolution model with improved predictions for precipitation extremes.

How AI Is Used in the Humanities

The usefulness of AI and GPU acceleration goes beyond the biological and physical sciences, extending into the fields of archaeology, history and literature as well.

In a legendary volcanic eruption nearly two millennia ago, Mount Vesuvius buried Pompeii and nearby towns in volcanic ash. The eruption also hit a library filled with papyrus scrolls, fusing them together with its intense heat. A University of Kentucky computer science professor has developed a deep learning tool to automatically detect each layer of these scrolls and virtually unfurl them so the contents can be read by scholars, centuries after their discovery.

For texts from a few centuries ago, humanities researchers often rely on scans or photographs of physical pages to read these works digitally. But these texts, printed in antiquated fonts, aren't machine-readable. This means scholars can't use a search engine to find a specific passage or analyze how the usage of a particular word changed over time.

Instead of relying on the lengthy and expensive process of hiring individuals to transcribe manuscripts by hand, researchers across Europe are applying AI to early German printed texts and to 12th-century papal correspondence from the Vatican Secret Archives.

How AI Is Used in Medicine

AI and GPUs are used broadly throughout healthcare and medical research. At universities, too, these technologies are being used to develop new tools for medical imaging, drug discovery and beyond.

MIT researchers are using neural networks to assess breast density from mammograms, creating a tool to aid radiologists in their readings and improve the consistency of density assessments across readers.

In the field of drug discovery, deep learning and the computational power of GPUs can help scientists mine through billions of potential drug compounds to more quickly discover treatments for currently incurable diseases.

A professor at the University of Pittsburgh is using neural networks to improve the speed and accuracy of molecular docking, a technique to digitally model how well a drug molecule will bind with a target protein in the body.

How GPUs Are Used for Physics

Physics researchers simulate some of the trickiest, most complex particle interactions to test theories of how the world works. These experiments require massive computational power — like the deep learning work done by Princeton University and Portugal’s Técnico Lisboa to study and predict plasma behavior in a nuclear fusion reactor.

Being able to anticipate dangerous disruptive events during a fusion reaction even 30 milliseconds before they occur could help scientists control the reaction long enough to harness this potential source of carbon-free energy.

And at Switzerland’s University of Bern, a research team is analyzing the impact of gravity on antimatter, a rare kind of material that annihilates upon collision with ordinary particles, releasing energy. With GPUs, the scientists have been able to improve their ability to study the way particles interact during matter-antimatter collisions.

RAPIDS Powers Machine Learning, Data Analytics

Beyond deep learning, researchers rely heavily on machine learning and data analytics to drive their work. RAPIDS, powered by CUDA-X AI GPU acceleration, allows data scientists to take advantage of GPU acceleration with a robust platform of software libraries.

An open-source platform, RAPIDS integrates Python data science libraries with CUDA at its lowest level. It can shrink training times from days to hours, and hours to minutes — so data scientists can iterate their analytics workflow faster, ask more questions from their datasets and more quickly reach answers.

The ability to keep data resident in GPU memory lets academics try different algorithmic approaches on their datasets without the time-consuming process of moving data between GPU and host memory. RAPIDS also offers interoperability among libraries for data analytics, machine learning, graph analytics and deep learning through a single shared data format.
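As an illustrative sketch (not from the article), the workflow below uses pandas with a toy dataset. Because RAPIDS' cuDF library mirrors much of the pandas API, swapping the pandas import for cuDF is often all it takes to run the same filter-and-aggregate steps on the GPU, with the data staying in GPU memory between operations.

```python
# Hypothetical example of a typical dataframe workflow.
# With RAPIDS installed, replacing "import pandas as pd" with
# "import cudf as pd" runs much of this same code on the GPU.
import pandas as pd

# Toy dataset standing in for a larger research data table
df = pd.DataFrame({
    "site": ["A", "A", "B", "B", "C"],
    "reading": [1.2, 1.4, 3.1, 2.9, 5.0],
})

# Filter rows, then aggregate per group; with cuDF the
# intermediate results stay in GPU memory between these steps
filtered = df[df["reading"] > 1.3]
means = filtered.groupby("site")["reading"].mean()
print(means)
```

The same pattern extends to joins, sorts and rolling windows, which is where keeping data on the GPU between steps pays off most.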

Professors and researchers interested in teaching kits, the NVIDIA Deep Learning Institute and the University Ambassador Program can visit our academic programs website to learn more.

See the NVIDIA higher education and research page for additional AI resources for developers and educators.