AI Pathologist Helps Zero in on Correct Cancer Diagnosis
Editor’s note: This is one of five profiles of finalists for NVIDIA’s 2017 Global Impact Award, which provides $150,000 to researchers using NVIDIA technology for groundbreaking work that addresses social, humanitarian and environmental problems.
For more than a century, pathologists have diagnosed cancer by studying stained tissue slides under a microscope. Amit Sethi thinks it’s time for a change.
Sethi, a professor at the Indian Institute of Technology (IIT) Guwahati, is creating an AI pathologist to supplement human specialists. This could lead to more accurate diagnoses and more effective treatments for two of the most common types of cancer — breast cancer among women and prostate cancer among men.
“I want patients to get the treatment that’s right for them,” Sethi said.
Sethi’s research has placed him and a team of IIT Guwahati researchers among five finalists for NVIDIA’s 2017 Global Impact Award. Our annual grant, totaling $150,000, goes to researchers using NVIDIA technology for groundbreaking work that addresses social, humanitarian and environmental problems.
AI Matches the Experts
A pathologist’s diagnosis is a critical element of determining cancer therapy, but it’s also highly subjective and time-consuming.
“Our goal is to do better than a single pathologist,” Sethi said. “When you have several pathologists, you usually come up with a good decision.”
Using GPU-accelerated deep learning, he created algorithms to analyze slides for patterns associated with the two most aggressive types of breast cancer. In tests on one of the types, his AI pathologist agreed with an expert panel 90 percent of the time.
Sethi then put his AI pathologist to work to help patients.
Beating Breast Cancer
Those with breast cancer often have more than one type of cancer within the same tumor, but pathology reports typically identify only the type considered dominant, Sethi said. That’s because the idea that a tumor may incorporate more than one cancer variety is relatively recent, he added.
Even when doctors analyze the genomics of tumor cells, they’re looking at a few tissue samples rather than the whole tumor and may miss the signs of multiple cancer types.
“When cancer comes back after treatment, it’s usually the cells that weren’t treated,” Sethi said.
For example, a quarter of patients with the aggressive HER2-positive type of cancer don’t respond to Herceptin, the drug most commonly used to treat it. That’s often because the tumor includes a second aggressive cancer type known as triple-negative, meaning it lacks receptors for estrogen, progesterone and the HER2 protein. The two types require different treatments.
Sethi and the team used NVIDIA GPUs and deep learning to detect patterns in stained tissue slide images showing the presence of HER2-positive and triple-negative cancers. For individual patches of tissue, their accuracy rates were 84 percent for HER2-positive and 91 percent for triple-negative; for all of a patient’s tissue samples combined, accuracy reached 100 percent. Significantly, the algorithm also found both types of cancer within the same tumor.
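Combining per-patch predictions into a single patient-level call, as described above, could be sketched along these lines. This is a minimal illustration, not the team’s actual pipeline; the label names and the 20 percent reporting threshold are assumptions made for the example.

```python
from collections import Counter

def aggregate_patient_diagnosis(patch_labels):
    """Combine per-patch subtype predictions into a patient-level call.

    patch_labels: predicted subtype for each tissue patch, e.g.
    "HER2-positive" or "triple-negative" (hypothetical label names).
    Returns the dominant subtype plus every subtype present in a
    meaningful share of patches, so a tumor containing more than one
    cancer type is not reduced to a single diagnosis.
    """
    counts = Counter(patch_labels)
    dominant = counts.most_common(1)[0][0]
    # Report any subtype covering at least 20% of patches -- an
    # illustrative threshold, not one from the published work.
    present = {label for label, n in counts.items()
               if n / len(patch_labels) >= 0.2}
    return dominant, present

dominant, present = aggregate_patient_diagnosis(
    ["HER2-positive"] * 7 + ["triple-negative"] * 3)
```

With a mixed tumor like the one simulated here, the dominant call is HER2-positive, but the triple-negative component also surfaces in the reported set rather than being silently dropped.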
“If we can identify these patients, we can offer better treatment,” Sethi said.
Will the Cancer Return?
Some 20-30 percent of men treated for prostate cancer see their cancer return, according to the Prostate Cancer Foundation. Even when men have surgery to remove the prostate gland, more than one out of six patients suffer a relapse, Sethi said. Yet tests to predict whether the cancer will return are not always accurate.
Sethi wants to overcome that problem. Using our Tesla GPU accelerators, he trained a neural network to distinguish cancers likely to come back from those less likely to return based on stained tissue images. That could help doctors more effectively treat prostate cancer.
“If doctors know the cancer is likely to recur, they can treat it aggressively with chemotherapy and radiation,” he said. “But if a patient is at low risk, we don’t want to put them through these treatments.”
Check out the work of last year’s Global Impact Award winner.