AI Can Now Do a Doctor’s Job
Look at how far we’ve come with AI in healthcare. Not long ago, I surveyed the telehealth scene and noted that no algorithms were making direct diagnoses of illness.
There’s Dr. Google, a symptom checker that taps data from the Mayo Clinic and Harvard Medical School and uses Google’s AI to make recommendations. Then there’s Amazon’s foray into healthcare: Alexa can now deliver “first aid” information through the Mayo Clinic’s First Aid Alexa Skill. WebMD has an integration that basically does what WebMD already does, just through Alexa, while Dr. A.I. by HealthTap uses machine learning to make recommendations based on what doctors have previously suggested to people with similar symptoms.
But none of these do what a doctor does, per se. A doctor uses her own observations and tests to make diagnoses, while Dr. A.I. and other symptom checkers rely only on the information you provide. These are murky waters. Symptom checkers may do more harm than good: Harvard Medical School tests found that symptom checkers built on Bayesian inference algorithms gave the right diagnosis on the first try only 34 percent of the time. After three attempts, that figure rose to 51 percent. And when the checkers advised the researchers either to seek medical attention or to take care of the problem themselves, they made the correct recommendation only 57 percent of the time.
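To make the challenge concrete, here is a minimal sketch of the kind of Bayesian reasoning a symptom checker might use. This is not any particular vendor’s algorithm, and every condition, prior, and likelihood below is an invented placeholder rather than clinical data; the point is only that the ranking rests entirely on whatever the patient self-reports.

```python
# Minimal sketch of Bayesian symptom checking (naive Bayes over reported symptoms).
# All conditions, priors, and likelihoods are illustrative placeholders, not real data.

PRIORS = {            # P(condition) before any symptoms are reported
    "common_cold": 0.30,
    "flu": 0.10,
    "strep_throat": 0.05,
}

LIKELIHOODS = {       # P(symptom | condition)
    "common_cold":  {"sore_throat": 0.50, "fever": 0.10, "cough": 0.60},
    "flu":          {"sore_throat": 0.40, "fever": 0.80, "cough": 0.70},
    "strep_throat": {"sore_throat": 0.95, "fever": 0.60, "cough": 0.10},
}

def rank_conditions(reported_symptoms):
    """Return conditions sorted by posterior probability given the reported symptoms."""
    scores = {}
    for condition, prior in PRIORS.items():
        score = prior
        for symptom in reported_symptoms:
            # Unlisted symptoms get a small default likelihood.
            score *= LIKELIHOODS[condition].get(symptom, 0.01)
        scores[condition] = score
    total = sum(scores.values())
    return sorted(((c, s / total) for c, s in scores.items()),
                  key=lambda pair: pair[1], reverse=True)

print(rank_conditions(["sore_throat", "fever"]))
```

With only a handful of self-reported symptoms to condition on, several diagnoses can end up with similar posteriors, which helps explain why first-try accuracy is so low.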
That’s not very encouraging unless all you’re dealing with is a sore throat and a stuffy nose (in which case, you certainly don’t need a symptom checker). If you’re dealing with diabetic retinopathy, however, the FDA now believes that AI can tell you what’s wrong.
The FDA has approved the first AI software that can identify disease, marking a major milestone for medical technology. The program is called IDx-DR. If you’re diabetic, the program analyzes a photograph of your retina (the back of your eye) for signs of the disease, and if it returns a positive result, it recommends you see an eye specialist. For now, any clinician can operate the program with very little training; all the clinician has to do is take the picture with a special retinal camera. Down the line, it may be easy for an AI-driven robot to handle the photography as well, eliminating the need for a human doctor to diagnose diabetic retinopathy.
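The company’s software and model are proprietary, so purely as a hypothetical illustration, here is roughly what that screening workflow looks like in code. The dummy model, the 0.5 cutoff, and the recommendation wording are all stand-ins for the real system’s internals, not its actual API.

```python
# Hypothetical sketch of an automated retinopathy screening workflow.
# Not IDx-DR's actual code, model, or decision threshold.

REFERRAL_THRESHOLD = 0.5  # illustrative cutoff only


def dummy_retinopathy_model(fundus_image):
    """Placeholder for a trained classifier that returns a probability of disease."""
    # A real system would run a trained image model over the retinal photograph.
    return 0.82 if fundus_image.get("lesions_detected") else 0.07


def screen_patient(fundus_image, model=dummy_retinopathy_model):
    """Return a screening recommendation for one retinal photograph."""
    probability = model(fundus_image)
    if probability >= REFERRAL_THRESHOLD:
        return "Positive screen: refer the patient to an eye specialist."
    return "Negative screen: rescreen at the next routine diabetes visit."


# The clinician's only job is to capture the image; the software does the rest.
print(screen_patient({"lesions_detected": True}))
```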
It looks like IDx-DR does a better job at this than flesh-and-blood ophthalmologists. IDx-DR gets the diagnosis wrong about 13 percent of the time, compared with ophthalmologists, who misdiagnose about 20 to 30 percent of the time. To make the FDA’s grade, the software had to correctly identify whether patients have the disease 87.5 percent of the time.
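If you want to see what “correct 87.5 percent of the time” cashes out to, this is the standard screening arithmetic. The patient counts below are invented for the example (chosen so the sensitivity lands on 87.5 percent); they are not figures from the IDx-DR trial.

```python
# What "correct X percent of the time" means for a screening test.
# The counts below are made up for illustration, not trial data.

true_positives = 175   # diseased patients the software flagged
false_negatives = 25   # diseased patients the software missed
true_negatives = 180   # healthy patients correctly cleared
false_positives = 20   # healthy patients incorrectly flagged

sensitivity = true_positives / (true_positives + false_negatives)
specificity = true_negatives / (true_negatives + false_positives)

print(f"Sensitivity (diseased patients caught): {sensitivity:.1%}")   # 87.5%
print(f"Specificity (healthy patients cleared): {specificity:.1%}")   # 90.0%
```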
About 80 to 85 percent of diabetics develop diabetic retinopathy, and treatment can be successful if it’s caught before the retina is damaged. Because the condition is so common and demand for screening is so high, diabetic retinopathy is an excellent problem for AI to tackle. The program can free ophthalmologists up to do other work.
For now, AI isn’t going to leave primary care physicians out of the picture, because someone still has to operate the camera, and it won’t put ophthalmologists out of a job, because diagnosis is only one part of treatment. The software does reduce costs because, according to Laura Shoemaker, director of marketing and communications for the company that makes the software, “You’re making things more productive because then only the patients who really need to see the ophthalmologist go there.”
IDx-DR is encouraging to the medical world because it could provide a model for future diagnostic imaging programs. As the baby boomer population ages, demand for practitioners increases. Any help that AI can contribute will be welcome. It will free up humans to do what humans do best: communicate, innovate, and care for each other.
By Daniel Matthews