How UCSF Researchers Are Using AI on Some of Healthcare’s Toughest Problems

A faster path to the diagnosis of Alzheimer’s is just the beginning.
by Tony Kontzer

Hospitals produce huge volumes of medical data that, when handled with care for privacy and security, could be used to re-examine everything from hospital administration to patient care. Doing so, however, means finding careful ways to assemble that data so it is no longer locked in organizational silos.

A research team at the University of California, San Francisco, is demonstrating just how powerful the combination of deep learning and data can be — and the potential this research holds for improving the healthcare system.

“Medicine is still in the beginning stages of using deep learning and other AI technologies,” said Dexter Hadley, assistant professor of pediatrics, pathology and laboratory medicine at UCSF’s Bakar Computational Health Sciences Institute. “But the technology industry is starting to show how useful it can be.”

Hadley cited the work of Google, UCSF and other academic medical centers to develop methods that predict which patients are likely to be readmitted to the hospital more accurately than hospitals’ current algorithms do.

“With deep learning and a little bit of insight, you can do a lot,” he said.

Hadley’s initial work identified Alzheimer’s disease every time it was present, more than six years before doctors reached a final diagnosis.

Powerful First Steps

The effort started when Benjamin Franc, a former UCSF professor and radiologist who has since left the hospital, approached Hadley with a desire to demonstrate how tapping reservoirs of imaging data could lead to advances in diagnoses.

Franc zeroed in on a particular pool of shared data: the Alzheimer’s Disease Neuroimaging Initiative (ADNI), a major multisite study funded by the National Institutes of Health and focused on clinical trials to improve prevention and treatment of the disease.

The early work was promising: After training a deep learning algorithm using more than 2,000 brain images from 1,000 Alzheimer’s patients, the team achieved 100 percent sensitivity in detecting the disease an average of 75 months earlier than a final diagnosis. (More details on the findings can be found in the paper the team published earlier this year.)
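Sensitivity here is the true-positive rate: the share of actual Alzheimer’s cases the model flags. A minimal sketch of the metric (the counts below are hypothetical, not the study’s data):

```python
def sensitivity(true_positives: int, false_negatives: int) -> float:
    """Sensitivity (recall) = TP / (TP + FN): fraction of real cases detected."""
    return true_positives / (true_positives + false_negatives)

# Hypothetical counts: every actual case flagged, none missed -> 100% sensitivity.
print(sensitivity(true_positives=92, false_negatives=0))   # -> 1.0

# If the model had missed one case in four, sensitivity would drop to 0.75.
print(sensitivity(true_positives=3, false_negatives=1))    # -> 0.75
```

Note that 100 percent sensitivity says nothing about false positives; that is captured by a separate metric, specificity.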

This, Franc and Hadley believe, represents the tip of an iceberg of what may be possible.

“If we are to scale up molecular imaging to benefit medicine worldwide, we have to find ways to use all the information we have,” Franc said. “This is where AI can help.”

For the Alzheimer’s algorithm, both training and inference occurred on a six-core server running four NVIDIA TITAN X GPUs. Training was done using 90 percent of the ADNI images, while 10 percent were held out for validation. The team also made use of the TensorFlow and Keras deep learning libraries, as well as Google’s Inception neural network architecture for image classification and detection.
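The 90/10 hold-out split described above can be sketched as follows. This is an illustration only: the image IDs are made up, and the team’s actual TensorFlow/Keras/Inception training pipeline is not reproduced here.

```python
import random

def train_validation_split(items, validation_fraction=0.10, seed=42):
    """Shuffle the dataset and hold out a fraction for validation,
    mirroring the 90/10 train/validation split described above."""
    shuffled = list(items)
    random.Random(seed).shuffle(shuffled)  # seeded for reproducibility
    n_val = int(len(shuffled) * validation_fraction)
    return shuffled[n_val:], shuffled[:n_val]  # (train, validation)

# Hypothetical scan IDs standing in for the ~2,000 ADNI brain images.
scans = [f"scan_{i:04d}" for i in range(2000)]
train, validation = train_validation_split(scans)
print(len(train), len(validation))  # -> 1800 200
```

Holding out images the model never sees during training is what lets the team report sensitivity on genuinely unseen cases rather than on memorized examples.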

The Data Is All There

In addition to being potentially game-changing, the Alzheimer’s findings have been a source of frustration for Hadley, whose mother has the disease. He says all the neuroimaging information needed to speed up diagnoses is there, but hard to access because of an industry-wide reluctance to share the information, largely out of concerns for patient privacy.

Healthcare providers, he says, are forgetting about the “portability” aspect of HIPAA: the regulations are designed to ensure that data is shared appropriately, not to prevent sharing altogether.

Hadley believes this is leading to unnecessary challenges for Alzheimer’s patients and their families.

“If we knew this six years ago, it would have been totally different,” said Hadley.

That’s what he believes makes the work he and Franc started so important. By showing what can be accomplished by pooling existing data, Hadley’s hoping deep learning can be the answer to early detection of numerous diseases.

He cited breast cancer as an example of a disease that could be diagnosed faster. If the data were shared, he noted, researchers could simulate retrospective, multi-institutional clinical trials across millions of patients.

If the data could be used more effectively and deep learning methods applied, fewer patients would need to be subjected to trials and diagnoses could be sped up.

“Technology isn’t the limiting factor in medicine anymore — it’s politics and policy,” said Hadley. “If those bottlenecks are solved, the future is quite bright.”