The world depends on coral reefs, but they’re disappearing – ravaged by climate change, coastal development, overfishing and pollution.
With a quarter of Earth’s reefs already gone, scientists are racing to save them, and they’re getting a big boost from GPU-accelerated deep learning.
There’s a lot at stake. Although reefs cover less than one percent of the ocean floor, they provide food and shelter for more than a quarter of all marine species, support fish stocks that feed more than a billion people and provide jobs to millions of people in coastal areas.
Scientists study images of coral reefs to measure reef health and track changes over time. Today that analysis is done by human experts, but it’s costly and time-consuming.
Researchers unveiled today a deep learning process that automatically analyzes reef photos. It’s 900X faster than the traditional method but just as accurate, according to the XL Catlin Global Reef Record, an online resource for coral reef images and information.
The new technology “will allow the world’s scientists to more quickly assess the health of coral reefs at scales never dreamed of before,” said Ove Hoegh-Guldberg, chief scientist of the global reef record and a professor at the University of Queensland. With that information, they can more effectively take steps to protect and save them.
Game-Changer for Coral Reefs
The automated analysis grew from a partnership of the University of California, Berkeley’s Artificial Intelligence Research Center and the University of Queensland’s Global Change Institute. The data came from the XL Catlin Seaview Survey, a four-year-old project to collect information about the world’s reefs.
Oscar Beijbom, a postdoctoral scholar at Berkeley, developed the image-analysis and deep learning algorithms that made it possible for computers to identify as many as 40 different categories of corals, sponges, algae and other elements in about 225,000 reef images.
Beijbom used our CUDA 7.5 programming model and three GeForce GTX TITAN X GPUs to achieve the 900X speedup over human image analysis. The dramatic speedup is a potential game-changer.
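The core task Beijbom’s system automates — assigning each annotated point in a reef photo to one of roughly 40 categories of coral, sponge or algae — can be sketched in outline. The example below is a minimal, hypothetical illustration, not CoralNet’s actual code: it uses NumPy as a CPU stand-in for the trained network, scoring a whole batch of image-patch features in one call. All names, shapes and weights here are illustrative assumptions.

```python
import numpy as np

N_CATEGORIES = 40   # corals, sponges, algae and other benthic classes
FEATURE_DIM = 128   # hypothetical descriptor size per image patch

rng = np.random.default_rng(0)
# Stand-in for learned network weights (in the real system, produced by
# GPU-accelerated deep learning training).
W = rng.normal(size=(FEATURE_DIM, N_CATEGORIES))

def classify_patches(features: np.ndarray) -> np.ndarray:
    """Label each patch feature vector with one of N_CATEGORIES classes.

    features: (n_patches, FEATURE_DIM) array of precomputed descriptors.
    Returns an (n_patches,) array of category indices.
    """
    logits = features @ W            # score all patches in one batched matmul
    return np.argmax(logits, axis=1) # most likely category per patch

# Score 1,000 patches in a single batched call — this kind of batching is
# what lets GPU inference outpace point-by-point human annotation.
patches = rng.normal(size=(1000, FEATURE_DIM))
labels = classify_patches(patches)
```

On a GPU, the same batched matrix multiply runs across thousands of cores at once, which is where speedups on the order reported here come from.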
“The better we can measure and monitor the reefs, the better we can address their decline,” said Beijbom.
The More Deep Learning, the Better
Beijbom is putting the finishing touches on a second deep learning-powered resource for coral reef images, adding it to the CoralNet web portal that he helped develop while pursuing his doctorate.
Researchers use the portal to upload and annotate their coral reef images – nearly 240,000 so far. When Beijbom rolls out deep learning on CoralNet in the next few months, any user will be able to take advantage of automated analysis.
For scientists, coastal authorities and others scrambling to save coral reefs, more information about more reefs – what is unique about each, how it reacts to stresses such as pollution and how it’s changing over time – boosts their ability to develop reef protections, regulate coastal activity and design experiments, said Manuel González-Rivero, lead scientist on the XL Catlin Seaview Survey.
Some researchers are experimenting with growing coral in nurseries; they could use the reef data to determine where transplants have the best chance of surviving, he said.
“If we’re going to save the coral reefs, we have to take a different path than the one we’ve been on,” said González-Rivero.