
Catdiology? Cat pictures are helping AI get better at recognizing X-rays

Sergey Nivens/123RF

It’s easy to joke that the internet was invented to give people around the world the opportunity to share pictures of cats. However, according to a new report, those kitty pictures may one day turn out to save your life.

That is based on work being done by Dr. Alvin Rajkomar, an assistant professor at the University of California, San Francisco Medical Center. Rajkomar trained a deep neural network to automatically detect life-threatening abnormalities in chest X-rays.

“When I was a medical resident, I ordered a stat X-ray of a patient who I suspected had a life-threatening pneumothorax — air outside of his lung compressing his heart — and happened to be standing next to the digital X-ray machine as it was being taken,” he told Digital Trends. “Seeing the finding in real-time, I was able to immediately thrust a needle into his chest to evacuate the air, saving his life. I wondered if we could create an algorithm that could identify emergent findings so that radiographs don’t sit dormant in a database, waiting for a doctor to finally read the study and contact someone to take action.”

So what does this have to do with pictures of cats? According to Rajkomar, deep learning systems need to be trained on vast numbers of images, but the right kind of X-ray images aren't available in great supply, for a variety of reasons.


“It wasn’t easy to collect that many radiology images, and even when we did, we discovered that incorrect metadata about the images made it difficult to harness in algorithms,” he continued.

Instead, he decided to have a go at plugging the holes with other images. Working with four Titan X GPUs and the CUDA parallel computing platform, he trained a deep learning neural network on more than 1 million color images taken from the ImageNet public database. He then retrained the network on a portion of those photos converted to grayscale, before retraining it once more on actual chest X-rays.
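The intermediate grayscale stage bridges a real gap: networks pretrained on ImageNet expect three-channel color input, while X-rays are single-channel grayscale. A common trick, sketched below with NumPy, is to convert color photos to luminance and replicate that channel three times, so the same network architecture sees grayscale data in the format it expects. (The BT.601 luminance weights here are a standard choice for illustration, not necessarily the ones used in Rajkomar's paper.)

```python
import numpy as np

def to_grayscale_3ch(rgb):
    """Convert an RGB image of shape (H, W, 3) to grayscale, then
    replicate the single luminance channel back to three channels so
    the image still fits a network pretrained on color inputs."""
    # ITU-R BT.601 luminance weights (an illustrative assumption)
    gray = rgb @ np.array([0.299, 0.587, 0.114])
    return np.stack([gray] * 3, axis=-1)

# Tiny stand-in "image": 2x2 pixels
rgb = np.array([[[255, 0, 0], [0, 255, 0]],
                [[0, 0, 255], [128, 128, 128]]], dtype=np.float64)
gray3 = to_grayscale_3ch(rgb)
print(gray3.shape)                                # (2, 2, 3)
print(np.allclose(gray3[..., 0], gray3[..., 1]))  # True: channels identical
```

Feeding grayscale versions of the same photos lets the network adapt its color-sensitive filters before it ever sees a radiograph, which is the staged retraining the article describes.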

Crazily enough, it worked. “We were able to show that by mixing thousands of radiology images with millions of images of everyday objects, like cats and fungi, we could get excellent performance,” he said.

You can find more details on Rajkomar's work in his recently co-authored paper, published in the Journal of Digital Imaging.

“Our hope is that we can generate algorithms that will go much further than just creating metadata,” he noted, concerning his plans for what’s next. “I envision a future where we can immediately and automatically flag radiographs with critical findings so that patients can get the care they need without delay.”

And all this time you thought cat pictures were just cute, wasted bandwidth!