Chest Defense: Deep Learning Spots Disease Early Using Chest X-Rays

Want to analyze a chest X-ray? There’s a neural network for that.

Researchers from the National Institutes of Health in Bethesda, Maryland, have developed a deep learning-powered framework that detects diseases from chest X-rays. Their system then creates detailed captions for the X-rays, making it easier for doctors to screen patients and detect critical diseases early.

The team used our CUDA programming model and GPUs to train their neural network, which identifies diseases and describes their context, such as location, severity, size or affected organs.
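To make that concrete, here is a minimal sketch of the kind of structured annotation such a system might emit for a single X-ray. The field names and values are illustrative assumptions, not the NIH team's actual output format.

```python
# Hypothetical example of an auto-generated annotation for one chest X-ray.
# Field names and values are illustrative assumptions, not the team's format.
annotation = {
    "finding": "calcified granuloma",   # disease identified from the image
    "location": "right upper lobe",     # context described in the caption
    "severity": "mild",
    "size": "small",
    "caption": "Small calcified granuloma in the right upper lobe.",
}
print(annotation["caption"])
```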

Bringing Deep Learning to Medical Images

Image caption generation has made huge progress with the advancement of deep learning approaches (see “Accelerating AI with GPUs: A New Computing Model”). Most applications, however, have used publicly available tagged images to train neural networks that caption “natural” images, such as those of pets, nature scenes or city landmarks, not medical ones.

Comparably well-annotated datasets for medical images are hard to come by. And crowdsourcing chest X-rays isn’t an option. The average person can easily label images of trees, animals and buildings — but identifying lung diseases like cardiomegaly or calcified granulomas takes an expert.

The NIH researchers used a public dataset of chest X-ray images to train a convolutional neural network to recognize diseases. Then, in what may be a first for radiology images, they trained a recurrent neural network to describe the context of the disease.
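As a rough illustration of this pairing, the sketch below wires a CNN image encoder to an RNN caption decoder. It is written in PyTorch purely for readability (the team built their system with the Lua-based Torch framework and cuDNN, as noted below), and the backbone, layer sizes and vocabulary size are assumptions rather than details from the researchers' work.

```python
# Minimal sketch of a CNN encoder + RNN decoder captioning pipeline, in the
# spirit of the paired networks described above. PyTorch is used here only
# for illustration; architecture choices below are assumptions.
import torch
import torch.nn as nn
from torchvision import models

class ChestXrayCaptioner(nn.Module):
    def __init__(self, vocab_size, embed_dim=256, hidden_dim=512):
        super().__init__()
        # CNN encoder: a standard image backbone repurposed to produce a
        # feature vector for each X-ray.
        backbone = models.resnet18(weights=None)
        backbone.fc = nn.Linear(backbone.fc.in_features, embed_dim)
        self.encoder = backbone
        # RNN decoder: generates the disease-context description as a word
        # sequence conditioned on the image features.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, images, captions):
        feats = self.encoder(images)                 # (B, embed_dim)
        words = self.embed(captions)                 # (B, T, embed_dim)
        # Feed the image features as the first step of the sequence.
        inputs = torch.cat([feats.unsqueeze(1), words], dim=1)
        hidden, _ = self.rnn(inputs)
        return self.out(hidden)                      # (B, T+1, vocab_size)

# Example forward pass on dummy data: a batch of 2 images and captions of
# length 10 drawn from an assumed 1,000-word vocabulary.
model = ChestXrayCaptioner(vocab_size=1000)
images = torch.randn(2, 3, 224, 224)
captions = torch.randint(0, 1000, (2, 10))
logits = model(images, captions)
print(logits.shape)  # torch.Size([2, 11, 1000])
```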

[Image: Chest X-ray annotations. In what may be a first for radiology images, NIH researchers trained a recurrent neural network to describe the context of diseases in chest X-rays.]

The paired networks, built using our cuDNN libraries and the Torch deep learning framework, produce richer, more accurate image annotations. The researchers trained the model on NVIDIA Tesla K40 GPUs, gaining significant speedups.
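For context, the snippet below shows the general pattern of training such a model on a GPU with cuDNN enabled, reusing the ChestXrayCaptioner class sketched earlier. This is a generic PyTorch illustration, not the team's Torch code; the loss setup and hyperparameters are assumptions.

```python
# Generic GPU training step for the captioner sketched above (assumed setup).
import torch
import torch.nn as nn

torch.backends.cudnn.benchmark = True        # let cuDNN pick fast kernels
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = ChestXrayCaptioner(vocab_size=1000).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

images = torch.randn(2, 3, 224, 224, device=device)
captions = torch.randint(0, 1000, (2, 10), device=device)

optimizer.zero_grad()
logits = model(images, captions)             # (B, T+1, vocab_size)
# Each position predicts the next caption word; drop the final logit so the
# prediction and target sequences line up.
loss = criterion(logits[:, :-1].reshape(-1, 1000), captions.reshape(-1))
loss.backward()
optimizer.step()
print(float(loss))
```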

The team’s system needs further training and higher prediction accuracy before hospitals and clinics can deploy it. But once auto-annotation systems like this are up and running, doctors will be able to search electronic records for all X-rays with a particular disease.
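As a simple sketch of that kind of search, the snippet below filters a small list of hypothetical annotation records by disease label. The record format follows the illustrative example earlier in this post, not any real electronic-records schema.

```python
# Hypothetical search over auto-generated annotations: find every X-ray
# record tagged with a given disease. Records and fields are illustrative.
records = [
    {"study_id": "xr-001", "finding": "cardiomegaly",
     "caption": "Moderate cardiomegaly."},
    {"study_id": "xr-002", "finding": "calcified granuloma",
     "caption": "Small calcified granuloma in the right upper lobe."},
    {"study_id": "xr-003", "finding": "cardiomegaly",
     "caption": "Mild cardiomegaly, stable."},
]

def find_by_disease(records, disease):
    """Return all annotated studies whose finding matches the given disease."""
    return [r for r in records if r["finding"] == disease]

for r in find_by_disease(records, "cardiomegaly"):
    print(r["study_id"], "-", r["caption"])
```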

Their system could even help countries with limited clinical resources screen large numbers of patients for diseases.
