Image-scanning software developed at Stanford University can distinguish deadly skin cancers from benign ones as accurately as top dermatologists, according to a study published Wednesday.
The potentially life-saving technology could soon be incorporated into a smart phone, the researchers said, an advance reminiscent of the diagnostic device wielded by Dr McCoy in the 1960s Star Trek sci-fi series.
Adapting a Google algorithm designed to distinguish between categories of objects based on images — telling a cat from a dog, for example — the Stanford team compiled a database of nearly 130,000 photos of skin disease.
To be effective, the software would need to tell a benign lesion from a malignant carcinoma.
Computer scientists “trained” the algorithm to make that distinction, combining visual processing with a type of artificial intelligence called deep learning.
From the very outset, the results were startlingly good.
“That’s when our thinking changed,” said senior author Sebastian Thrun, a professor in the Stanford Artificial Intelligence Laboratory.
“We said, ‘Look, this is not just a class project for students; this is an opportunity to do something great for humanity’.”
Fine-tuned with the help of physicians, the app they created performed just as well as a panel of 21 board-certified dermatologists, the researchers reported in the science journal Nature.
In the United States alone, more than five million new cases of skin cancer are diagnosed every year.
For melanoma detected in its earliest stages, the five-year survival rate is about 97 percent. When the disease is found only at a later stage, that rate drops to about 14 percent.
– The next level –
Dermatologists inspect skin for signs of cancer, relying on their training and experience.
If a lesion is spotted, the next step is typically a closer look with a hand-held microscope called a dermatoscope.
If doubt remains, the final phase of diagnosis is a biopsy — taking a skin sample to be tested in a lab.
The images used in the new app — representing over 2,000 different skin diseases — were gathered from the internet and vetted by dermatologists.
“There’s no huge dataset of skin cancer, so we had to make our own,” said Brett Kuprel, co-author and Stanford graduate student.
In a final contest, the app was pitted against 21 dermatologists to identify cancerous and non-cancerous lesions in over 370 images.
Human and machine performed equally well.
The next step is to create a smart phone version, the researchers said.
“Everyone has a supercomputer in their pockets with a number of sensors in it, including a camera,” noted co-lead author Andre Esteva, also a graduate student.
A smart phone app of this kind “might enable effective, easy and low-cost medical assessments of more individuals than is possible with existing medical-care systems,” Sancy Leachman of Oregon Health and Science University and Glenn Merlino of the US National Cancer Institute wrote in a comment also published in Nature.
“Star Trek presented a vision of the future in which (McCoy) used a portable diagnostic device known as a tricorder, to assess the medical condition of Captain James Kirk and other Enterprise crew members.
“Although fanciful then, machines capable of the non-invasive diagnosis of human disease are becoming a reality,” they added.