Michael Kirk and Raphael Attie, scientists at NASA’s Goddard Space Flight Center, regularly face terabytes of data in their quest to analyze images of the sun.
This computational challenge, which could take a year or more on a CPU, has been reduced to less than a week on Quadro RTX data science workstations. Kirk and Attie spoke to AI Podcast host Noah Kravitz about the workflow they follow to study these images, and what they hope to find.
The lessons they’ve learned are useful for those in both science and industry grappling with how to best put torrents of data to work.
The researchers study images captured by telescopes on satellites, such as the Solar Dynamics Observatory spacecraft, as well as those from ground-based observatories.
They analyze these images to spot particles in Earth’s orbit that could damage interplanetary spacecraft, and to track solar surface flows, which help them develop models for predicting space weather.
Currently, these images are captured in space and sent to Earth for computation. But Kirk and Attie aim to shoot for the stars: their long-term goal is the ultimate form of edge computing, putting high-performance computers in space.
Key Points From This Episode:
- The primary instrument that Kirk and Attie use to see images of the sun is the Solar Dynamics Observatory, a spacecraft that has four telescopes to take images of the extreme ultraviolet light of the sun, as well as an additional instrument to measure its magnetic fields.
- Researchers such as Kirk and Attie have developed machine learning algorithms for a variety of projects, such as creating synthetic images of the sun’s surface and its flow fields.
“We take an image about once every 1.3 seconds of the sun … that entire data archive — we’re sitting at about 18 petabytes right now.” — Michael Kirk [6:50]
“What AI is really offering us is a way to crunch through terabytes of data that are very difficult to move back to Earth.” — Raphael Attie [34:34]
You Might Also Like
How the Breakthrough Listen Project Harnessed AI in the Search for Aliens
UC Berkeley’s Gerry Zhang talks about his work using deep learning to analyze signals from space for signs of intelligent extraterrestrial civilizations. And while we haven’t found aliens yet, the doctoral student has already made some extraordinary discoveries.
Forget Storming Area 51, AI’s Helping Astronomers Scour the Skies for Habitable Planets
Astronomer Olivier Guyon and professor Damien Gratadour speak about the quest to discover nearby habitable planets using GPU-powered extreme adaptive optics in very large telescopes.
Astronomers Turn to AI as New Telescopes Come Online
To turn the vast quantities of data that will be pouring out of new telescopes into world-changing scientific discoveries, Brant Robertson, a visiting professor at the Institute for Advanced Study in Princeton and an associate professor of astronomy at UC Santa Cruz, is turning to AI.