How AI, Machine Learning Are Advancing Academic Research

Academics are turning to GPUs, data science and AI to advance research.
by Cheryl Martin

Insulin. The polio vaccine. The periodic table of elements. Countless discoveries across every field of research have their origins in academia.

Universities and research institutes around the world are key drivers of discovery and innovation, with professors and researchers looking for answers to the biggest questions facing each academic discipline.

With powerful GPU computing resources, academics can use AI, machine learning and data science to more swiftly advance knowledge in their respective fields.

How AI Is Used in Astrophysics and Astronomy

Innumerable questions remain about the origins of the universe and the workings of cosmic bodies such as black holes. A team at the University of Toronto is harnessing deep learning to identify craters in satellite imagery of the moon, helping scientists evaluate theories of solar system history.

Running on NVIDIA GPUs on SciNet HPC Consortium’s P8 supercomputer, the neural network was able to spot 6,000 new craters in just a few hours — nearly double the number that scientists have manually identified over decades of research.

At the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, researchers are using deep learning to detect and analyze gravitational waves, which are caused by massive stellar events like the collision of black holes.

And scientists at the University of California, Santa Cruz, and Princeton University have been using NVIDIA GPUs to gain a better understanding of galaxy formation.

How GPUs Are Used for Biology

Deep learning is also giving scientists powerful tools to understand organisms back on Earth. Researchers from the Smithsonian Institution in the U.S. and the Costa Rica Institute of Technology are using big data analytics and GPU-accelerated deep learning for plant identification, training an image classification model to recognize the organisms recorded in digitized museum specimens.

University of Maryland researchers are using NVIDIA GPUs to power phylogenetic inference, the study of organisms’ evolutionary history. Using a software tool called BEAGLE, the team examines underlying connections between different viruses.

And at Australia’s Monash University, researchers are developing superdrugs to combat antibiotic-resistant superbugs using cryo-electron microscopy, a process that allows researchers to analyze molecules at extremely high resolution. Using a supercomputer powered by more than 150 NVIDIA GPUs, the team is able to resolve its models of these molecules in days instead of months.

How AI Is Used in Earth and Climate Science

Geologists and climate scientists work with streams of data to analyze natural phenomena and predict how the environment will change over time.

Hundreds of natural disasters occur each year, striking different corners of the world. While some, like hurricanes, can be spotted days before hitting land, earthquakes, tornadoes and others take humans by surprise.

At Caltech, researchers are using deep learning to analyze seismograms from more than 250,000 earthquakes. This work could lead to an earthquake early warning system that alerts government agencies, transportation officials and energy companies when an earthquake is on the way, giving them time to mitigate damage by halting trains and shutting off power lines.
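To make the idea concrete, here is a minimal sketch in PyTorch of the kind of network that could classify windows of seismogram data as earthquake or noise. It is an illustrative assumption, not the Caltech team's actual architecture; the layer sizes, window length and class labels are all hypothetical.

    # Hypothetical 1D CNN for three-component seismogram windows.
    # Not the Caltech model; shapes and sizes are illustrative only.
    import torch
    import torch.nn as nn

    class SeismogramClassifier(nn.Module):
        def __init__(self, channels=3):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(channels, 32, kernel_size=7, padding=3),  # local waveform patterns
                nn.ReLU(),
                nn.MaxPool1d(4),                                    # downsample in time
                nn.Conv1d(32, 64, kernel_size=7, padding=3),
                nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),                            # one feature vector per window
            )
            self.head = nn.Linear(64, 2)  # two classes: earthquake vs. noise

        def forward(self, x):
            return self.head(self.features(x).squeeze(-1))

    # A batch of 8 windows, 3 channels (north, east, vertical), 400 samples each.
    model = SeismogramClassifier()
    logits = model(torch.randn(8, 3, 400))
    print(logits.shape)  # torch.Size([8, 2])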

In the aftermath of a natural disaster, deep learning can be used to analyze satellite imagery to gauge impact and help first responders direct their efforts to the areas that need it most. DFKI, the German Research Center for Artificial Intelligence, is using the NVIDIA DGX-2 AI supercomputer to do just that.

Climate scientists, too, rely heavily on GPUs to crunch complex datasets and project global temperature decades into the future. A researcher at Columbia University is using deep learning to better represent clouds in climate models, enabling a finer-resolution model with improved predictions for precipitation extremes.

How AI Is Used in the Humanities

The usefulness of AI and GPU acceleration goes beyond the biological and physical sciences, extending into the fields of archaeology, history and literature as well.

In a legendary volcanic eruption nearly two millennia ago, Mount Vesuvius buried Pompeii and nearby towns in volcanic ash. The eruption also engulfed a library filled with papyrus scrolls, carbonizing them and fusing their layers together. A University of Kentucky computer science professor has developed a deep learning tool to automatically detect each layer of these scrolls and virtually unfurl them so the contents can be read by scholars, more than two and a half centuries after their discovery.

For texts printed a few centuries ago, humanities researchers often rely on scans or photographs of physical pages to read the works digitally. But these texts, set in antiquated fonts, aren’t machine-readable. That means scholars can’t use a search engine to find a specific passage or analyze the usage of a particular word over time.

Instead of relying on the lengthy and expensive process of hiring people to transcribe manuscripts into typed text, researchers across Europe are using AI to read early German printed texts and 12th century papal correspondence from the Vatican Secret Archives.
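As a rough illustration of the underlying task, the sketch below uses the open-source pytesseract wrapper around the Tesseract OCR engine to turn a scanned page into searchable text. The researchers' actual pipelines rely on models trained specifically for historical typefaces and handwriting; the file name here is hypothetical, and Tesseract with its German ("deu") or Fraktur ("frk") language data must be installed.

    # Illustrative OCR only; not the research teams' actual tooling.
    from PIL import Image
    import pytesseract

    page = Image.open("scanned_page.png")  # hypothetical scan of a printed page
    # lang="deu" selects German; "frk" covers blackletter (Fraktur) type.
    text = pytesseract.image_to_string(page, lang="deu")
    print(text[:200])  # once digitized, the text is searchable and countable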

How AI Is Used in Medicine

AI and GPUs are used broadly throughout healthcare and medical research. At universities, too, these technologies are being used to develop new tools for medical imaging, drug discovery and beyond.

MIT researchers are using neural networks to assess breast density from mammograms, creating a tool to aid radiologists in their readings and improve the consistency of density assessments from one reader to the next.

In the field of drug discovery, deep learning and the computational power of GPUs can help scientists mine through billions of potential drug compounds to more quickly discover treatments for currently incurable diseases.

A professor at the University of Pittsburgh is using neural networks to improve the speed and accuracy of molecular docking, a technique to digitally model how well a drug molecule will bind with a target protein in the body.

How GPUs Are Used for Physics

Physics researchers simulate some of the trickiest, most complex interactions in nature to test theories of how the world works. These experiments require massive computational power, like the deep learning work done by Princeton University and Portugal’s Técnico Lisboa to study and predict plasma behavior in a nuclear fusion reactor.

Being able to anticipate dangerous disruptive events during a fusion reaction even 30 milliseconds before they occur could help scientists control the reaction long enough to harness this potential source of carbon-free energy.

And at Switzerland’s University of Bern, a research team is analyzing the impact of gravity on antimatter, a rare kind of material that annihilates upon collision with ordinary particles, releasing energy. With GPUs, the scientists have been able to improve their ability to study the way particles interact during matter-antimatter collisions.

RAPIDS Powers Machine Learning, Data Analytics

Beyond deep learning, researchers rely heavily on machine learning and data analytics to drive their work. RAPIDS, part of the CUDA-X AI stack, gives data scientists a robust platform of GPU-accelerated software libraries.

An open-source platform, RAPIDS integrates Python data science libraries with CUDA at its lowest level. It can shrink training times from days to hours, and hours to minutes, so data scientists can iterate their analytics workflows faster, ask more questions of their datasets and reach answers sooner.

The ability to keep data in GPU memory lets academics try different algorithmic approaches on their datasets without the time-consuming step of moving data between GPU memory and the host. RAPIDS libraries for data analytics, machine learning, graph analytics and deep learning also interoperate through a single shared data format.
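As a minimal sketch of what this looks like in practice, the snippet below loads a CSV into GPU memory with cuDF and clusters it with cuML without ever copying the data back to the host. It assumes the open-source cudf and cuml packages and a CUDA-capable GPU; the file and column names are hypothetical.

    # GPU-resident RAPIDS workflow: dataframe ops and ML share one data format.
    import cudf
    from cuml.cluster import KMeans

    # Load data directly into GPU memory as a cuDF dataframe.
    df = cudf.read_csv("observations.csv")

    # Feature columns stay on the GPU; no round trip to host memory.
    features = df[["flux", "magnitude", "redshift"]].fillna(0)

    # cuML consumes the cuDF dataframe as-is, so clustering runs on the
    # same GPU-resident data the dataframe operations just produced.
    model = KMeans(n_clusters=8, random_state=0)
    df["cluster"] = model.fit_predict(features)

    print(df["cluster"].value_counts())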

Professors and researchers interested in teaching kits, the NVIDIA Deep Learning Institute and the University Ambassador Program can visit our academic programs website to learn more.

See the NVIDIA higher education and research page for additional AI resources for developers and educators.

NVIDIA Brings AI-Powered Cybersecurity to World’s Critical Infrastructure

Akamai, Forescout, Palo Alto Networks, Siemens and Xage Security integrate NVIDIA accelerated computing and AI to advance OT cybersecurity.
by Itay Ozery

As technologies and systems around the world become more digitalized and connected, operational technology (OT) environments and industrial control systems (ICS) in sectors from energy and manufacturing to transportation and utilities increasingly depend on enterprise networks and the cloud. This expands OT and ICS capabilities, but it also expands their exposure to cyber threats.

Unlike traditional IT environments that manage data and applications, OT systems control real-world processes where cyber incidents can have immediate consequences for safety, availability and operational continuity.

Many of these systems were originally designed for reliability and longevity, not for today’s threat landscape, which can widen the gap between modern attacks and existing defenses. Even as OT and ICS environments modernize with improved automation, connectivity and analytics, most were not built to withstand adaptive, software-driven cyberattacks that evolve in real time.

NVIDIA is collaborating with leading cybersecurity providers Akamai, Forescout, Palo Alto Networks and Xage Security, as well as industrial automation innovator Siemens, to bring accelerated computing and AI to OT cybersecurity, advancing real-time threat detection and response across critical infrastructure.

These efforts represent a fundamental shift in OT and ICS cybersecurity: security is embedded into and distributed across the infrastructure itself, enforced at the edge and coordinated through centralized, AI-driven intelligence. The result brings modern cybersecurity to the systems that keep the physical world running.

Forescout and NVIDIA Bring Zero Trust to OT and ICS Environments

Zero trust is a security model that removes implicit trust from networks. Every user, device and workload must be continuously verified and authorized, regardless of where it originates.
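The deny-by-default principle behind zero trust can be illustrated with a toy sketch: every request is checked against an explicit policy, and nothing is trusted because of where it sits on the network. All of the names below are hypothetical, and this is not any vendor's API.

    # Toy zero-trust check: explicit allow-list, deny by default.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Request:
        identity: str  # verified user or device identity
        resource: str  # asset the request targets
        action: str    # e.g. "read" or "write"

    # Anything not explicitly listed is denied.
    POLICY = {
        ("plc-gateway-01", "historian-db", "read"),
        ("engineer-anna", "plc-gateway-01", "write"),
    }

    def authorize(req: Request) -> bool:
        # Network origin alone never grants access.
        return (req.identity, req.resource, req.action) in POLICY

    print(authorize(Request("plc-gateway-01", "historian-db", "read")))  # True
    print(authorize(Request("unknown-host", "historian-db", "read")))    # False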

While zero trust has been widely adopted to secure enterprise IT environments, applying its principles to OT environments has traditionally been difficult. Legacy devices, proprietary protocols and safety-critical operations limit the use of intrusive controls or AI-driven enforcement, even as increased connectivity to IT and cloud environments expands the attack surface.

Forescout is working with NVIDIA to make zero trust practical for OT. Forescout provides continuous, agentless discovery and classification of OT, internet of things and IT assets, delivering real-time risk assessment and policy-based enforcement. With deep visibility into network activity, Forescout applies network segmentation to contain lateral movement and enforce zero trust controls precisely where they matter most, without impacting operations.

At the industrial edge, NVIDIA BlueField DPUs run security services on dedicated hardware, keeping protection separate from operational systems so critical processes remain unaffected.

Siemens and Palo Alto Networks Embed Security Into Industrial Automation

Industrial automation environments demand consistent performance, low latency and high availability — requirements that traditional IT security tools often struggle to meet.

At the S4x26 security conference, Siemens will demonstrate its AI-ready Industrial Automation DataCenter, a unified solution that consolidates decades of cross-industry automation expertise into one robust IT/OT platform. The solution contains the core elements of an edge data center: virtualized compute, data archiving and reporting, resilient disaster recovery, and a cybersecurity architecture aligned with IEC 62443. Integrating NVIDIA BlueField makes it possible to deliver an AI-ready, zero-trust solution tailored to the demands of industrial automation.

Palo Alto Networks’ Prisma AIRS AI Runtime Security delivers deep visibility into industrial traffic and continuous monitoring for abnormal behavior. Running these security services on NVIDIA BlueField puts inspection and enforcement directly at the infrastructure level, closer to the workloads. This AI-powered approach strengthens security coverage and improves operational uptime where it matters most.

Akamai Extends Segmentation to OT and ICS With NVIDIA

Akamai Technologies has extended the Akamai Guardicore Platform to run on NVIDIA BlueField, enabling agentless segmentation, which isolates applications, devices or workloads into tightly controlled security zones, and the enforcement of zero-trust policies directly at the edge. This removes the need for agents that may be incompatible with legacy OT systems or safety-certified devices.

Segmentation is enforced at full network speed directly within the infrastructure, without introducing latency or disrupting time-sensitive workloads in centralized data centers or remote edge locations. This helps contain threats quickly, limit their spread and keep mission-critical operations running smoothly.

Xage Security Protects the Energy Infrastructure That Powers AI With NVIDIA

As AI scales into a pillar of critical infrastructure, securing the energy systems that power AI factories is as essential as securing the compute itself.

Modern energy supply chains are complex, distributed and deeply interconnected with AI operations, and they operate largely within the operational technology domain. In this environment, cyber-physical systems, legacy assets and real-time controls demand security approaches purpose-built for critical infrastructure protection.

Xage Security is working with NVIDIA to help address this need by bringing zero-trust security to both energy infrastructure and the AI systems it supports. At S4x26, Xage will demonstrate a new integration running on NVIDIA BlueField, showing how zero trust enforcement can be embedded directly into energy and AI infrastructure environments.

Xage already protects about 60% of U.S. midstream pipeline infrastructure and works with utilities and energy operators worldwide. By combining Xage’s distributed, identity-based security platform with NVIDIA BlueField, operators can protect energy assets, manage third-party access and secure AI-driven operations at scale without compromising performance, reliability or resilience.

A New Class of OT Cybersecurity

Across these environments, a consistent OT cybersecurity architecture is taking shape. Security services run at the edge on NVIDIA BlueField DPUs, close to the operational systems they protect. By executing inspection and enforcement on dedicated, hardware-isolated infrastructure, BlueField enables continuous protection without disrupting time-sensitive operations.

OT data generated at the edge is sent to centralized AI factories, where it’s analyzed across many sites to identify patterns, anomalies and emerging threats. Security actions are enforced locally at the edge while insights are shared centrally, creating a coordinated defense that improves visibility, accelerates response and scales protection consistently across OT and IT environments.
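A schematic sketch of that split, with every name and threshold hypothetical and no vendor API implied: detection and enforcement run locally, while summarized telemetry is forwarded for central analysis.

    # Local detection and enforcement at the edge; telemetry shared centrally.
    import json, time

    def detect_anomaly(flow: dict) -> bool:
        # Stand-in check; a real deployment would run a trained model here.
        return flow["bytes_per_sec"] > 1_000_000

    def enforce_locally(flow: dict) -> None:
        # Act immediately at the edge, without waiting on the central tier.
        print(f"blocking {flow['src']} -> {flow['dst']}")

    def forward_to_central(event: dict) -> None:
        # In practice this would publish to a message bus or AI factory.
        print("telemetry:", json.dumps(event))

    flow = {"src": "10.0.0.5", "dst": "plc-7", "bytes_per_sec": 2_500_000}
    if detect_anomaly(flow):
        enforce_locally(flow)
        forward_to_central({"ts": time.time(), **flow, "verdict": "blocked"})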

This architecture helps detect and contain threats faster while strengthening resilience across distributed environments, maintaining consistent performance and protecting uptime.

The result is a new standard for securing critical infrastructure — where AI-driven protection and operational excellence move forward together.

NVIDIA-powered OT cybersecurity solutions are delivered through a global ecosystem of trusted partners. Read this OT cybersecurity use case and solution overview for more.

Join NVIDIA at S4x26, running Feb. 24–26 in Miami, to see how accelerated computing and AI are transforming cybersecurity for OT and critical infrastructure.