AI in the Sky Aids Feet on the Ground Spotting Human Rights Violations

by Isha Salian

In a traditional human rights investigation, researchers travel to a region, conduct interviews, visit crime scenes, examine court records, and collect hospital or autopsy records.

While that painstaking approach still constitutes a major part of Human Rights Watch’s work, the U.S.-based nonprofit is also exploring new technological methods — including AI — for its investigations, said Fred Abrahams, an associate director.

“It would be irresponsible of us not to do that,” Abrahams said in a talk attended by more than 100 people at last month’s GPU Technology Conference. “We must explore every opportunity we can to get the goods to report on these human rights violations.”

These new tools include remote sensing via satellite and drone data, analytics from public datasets, and investigations using videos and photos posted to social media. Remote sensing is essential in situations where researchers can’t access a conflict zone or closed country — a major issue for the human rights and humanitarian community.

“We can’t document it if we can’t get there,” said Josh Lyons, director of geospatial analysis at Human Rights Watch. “If the people are in hiding or they’re dead, there’s no way to document that case.”

To push this work forward, the nonprofit is partnering with Element AI, a global AI software provider cofounded in 2016 by deep learning pioneer Yoshua Bengio. The company has a team in London focused on building AI for social good.

In addition to using NVIDIA GPUs in Element AI’s data center, Human Rights Watch is using two NVIDIA DGX Stations, provided by NVIDIA in 2018, to further its efforts.

“The hardware will allow us to make it work,” Abrahams said.

Where There’s Smoke

There are hundreds of satellites orbiting and observing the Earth. Aerial imagery can show geographic features, human settlements and events like floods and fires. Comparing how a region looks at one moment in time with how it looked at another can be critical for human rights investigations, but the influx of data is too vast for any individual to go through.
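At its simplest, comparing a region across two dates is pixel differencing. The sketch below is a toy illustration of that idea on synthetic grayscale grids (the function name, grids, and threshold are illustrative assumptions, not Human Rights Watch's pipeline, which works on georegistered multispectral imagery):

```python
# Minimal before/after change detection on two synthetic grayscale grids.
# Real workflows align (georegister) the two acquisitions first; this toy
# version just flags pixels whose brightness changed by more than a threshold.

def changed_pixels(before, after, threshold=50):
    """Return (row, col) coordinates where |after - before| > threshold."""
    flagged = []
    for r, (row_b, row_a) in enumerate(zip(before, after)):
        for c, (b, a) in enumerate(zip(row_b, row_a)):
            if abs(a - b) > threshold:
                flagged.append((r, c))
    return flagged

# Synthetic 3x3 scenes: one pixel darkens sharply between the two dates,
# as a burned structure might in visible-band imagery.
scene_before = [[200, 200, 200],
                [200, 200, 200],
                [200, 200, 200]]
scene_after  = [[200, 200, 200],
                [200,  40, 200],
                [200, 200, 200]]

print(changed_pixels(scene_before, scene_after))  # [(1, 1)]
```

In practice the hard problems are upstream of this step: cloud cover, seasonal change and sensor differences all produce pixel changes that are not events of interest, which is one reason learned models outperform fixed thresholds.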

At GTC, Lyons shared how Human Rights Watch was able to use thermal data from environmental satellites to begin monitoring the outbreak of ethnic violence in Myanmar in 2017, just hours after the first reports of conflict. Combined with aerial images, the organization was able to detect a pattern of burned Rohingya villages across the region.
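Thermal-band monitoring of this kind rests on a simple physical contrast: active fires read far hotter than the land background, while cloud tops read cold. A minimal sketch of that screening logic, on synthetic brightness temperatures (the function, thresholds and scene values are illustrative assumptions, not the actual detection algorithm):

```python
# Toy thermal-anomaly screen: flag grid cells whose brightness temperature
# (in kelvin) sits far above the scene background. Cloud tops read cold in
# thermal bands, so a hot threshold separates fires from cloud cover.

def hot_spots(temps_k, background_k=300.0, delta_k=40.0):
    """Return coordinates of cells hotter than background by delta_k."""
    return [(r, c)
            for r, row in enumerate(temps_k)
            for c, t in enumerate(row)
            if t - background_k > delta_k]

# Synthetic scene: mostly near-background land, one cold cloud top (230 K),
# one burning-structure hot spot (380 K). Only the fire is flagged.
scene = [[300.0, 298.0, 230.0],
         [301.0, 380.0, 299.0]]

print(hot_spots(scene))  # [(1, 1)]
```

Operational fire products use multiple bands and contextual statistics rather than a single fixed threshold, but the core idea, thermal anomaly relative to background, is the same.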

This digital evidence helped on-the-ground researchers corroborate the testimony of the Muslim minority community targeted by the authorities. By pinpointing the exact date and time that a village began burning, investigators could better quantify the scale of violence and begin to determine who the perpetrators were.

But it takes an expert eye — or a neural network — to tell the difference between smoke plumes and puffy white clouds.

“Most of the time, it’s my eyes that are doing the analysis,” Lyons said. “The DGX immediately gives us the ability to scale.”

A deployed deep learning model that analyzes satellite or social media data could one day identify potential human rights abuses automatically from text and images and alert Human Rights Watch and humanitarian agencies.

However, though the proliferation of satellites and social media has led to a massive amount of new data for human rights investigators to parse, there’s still little labeled data to train neural networks. Looking at a satellite image of a smoke plume, “I know it’s a crime,” Lyons said. “But how do I tell the computer it’s a crime?”

That’s where Element AI’s expertise in deep learning can help. “By essentially cloning Josh’s visual cortex, we can have a huge impact,” said Julien Cornebise, director of research at Element AI.

Cornebise and his team have also worked with Amnesty International on two projects: one to build neural networks to detect burned villages in Sudan, and another to parse Twitter data to study online abuse against women.

Putting AI to Good Use

Human Rights Watch has been using the DGX Stations for photogrammetry, or converting 2D footage into 3D models, based on data collected from the nonprofit’s drones. The team is also developing and testing deep learning models to parse aerial imagery and social media data.
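The geometric core of photogrammetry is triangulation: the same point seen in two images, taken from known camera positions, pins down a 3D location. A minimal sketch under strong simplifying assumptions (two pinhole cameras with identity rotation, known centers and normalized image coordinates; the helper names and camera setup are illustrative, not the nonprofit's actual toolchain):

```python
# Minimal two-view triangulation: given normalized image coordinates (u, v)
# of one point seen from two pinhole cameras (identity rotation, known
# centers), recover the 3D point by linear least squares. Real photogrammetry
# pipelines handle many views, lens distortion and unknown camera poses.

def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(3):
            if r != i:
                f = M[r][i] / M[i][i]
                M[r] = [mr - f * mi for mr, mi in zip(M[r], M[i])]
    return [M[i][3] / M[i][i] for i in range(3)]

def triangulate(obs):
    """obs: list of (camera_center, (u, v)) sightings. With identity rotation,
    u = (X - cx) / (Z - cz), so each sighting yields two linear equations in
    (X, Y, Z); solve the stacked system via normal equations A^T A x = A^T b."""
    A, b = [], []
    for (cx, cy, cz), (u, v) in obs:
        A.append([1.0, 0.0, -u]); b.append(cx - u * cz)
        A.append([0.0, 1.0, -v]); b.append(cy - v * cz)
    AtA = [[sum(row[i] * row[j] for row in A) for j in range(3)]
           for i in range(3)]
    Atb = [sum(row[i] * bi for row, bi in zip(A, b)) for i in range(3)]
    return solve3(AtA, Atb)

# A point at (1, 2, 5) seen from cameras at the origin and at (1, 0, 0):
point = triangulate([((0.0, 0.0, 0.0), (0.2, 0.4)),
                     ((1.0, 0.0, 0.0), (0.0, 0.4))])
print([round(x, 6) for x in point])  # [1.0, 2.0, 5.0]
```

Drone photogrammetry repeats this across thousands of matched feature points and jointly refines the camera poses as well, which is where GPU compute pays off.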

“We’re data rich and drowning in potential applications,” Lyons said. “The simple challenge is to prioritize.”

Potential uses include AI tools for processing archival footage dating back nearly 50 years, and for making handwritten notes from Human Rights Watch investigators easier to translate or search.

These archives, particularly researchers’ notebooks, are “more or less locked in hard copy, paper form,” Lyons said. “Having such a system in place would be quite useful. It would give immeasurable value to future investigations.”

Having powerful deep learning systems onsite is also critical for Human Rights Watch to build AI tools analyzing sensitive datasets. For certain data such as forensic photographs or personal information, the organization is often not authorized to share the information with third parties — or host it on a remote server that falls within a specific geographic area or legal jurisdiction.

Lyons said, “The DGX Station hits that perfect sweet spot of being able to do large, robust data analysis in-house with sensitive data in a way that meets all of our legal and ethical privacy concerns.”

The above satellite image may look like clouds over a coastal community. However, an expert eye, or AI, can tell that the image shows smoke — revealing building fires in five villages in Myanmar’s Maungdaw township on the morning of September 15, 2017. Image courtesy of Human Rights Watch and Planet Labs Inc.