What happens when diagnosis is automated? In a future where computer-aided detection swells to include machine-learning algorithms with millions of reads of experience, what happens to physicians?
According to computer scientist Sebastian Thrun, machine learning will augment the physician’s brain, not replace it.
“When you use a phone, you amplify the power of human speech. You cannot shout from New York to California,” he said in an article published in The New Yorker. “And yet this rectangular device in your hand allows the human voice to be transmitted across 3,000 miles. Did the phone replace the human voice? No, the phone is an augmentation device. The cognitive revolution will allow computers to amplify the capacity of the human mind in the same manner.”
When experienced radiologists read an imaging study and make a diagnosis, they don’t scan quadrants and weigh evidence to reach their decisions. Instead, they recognize the finding instantly, much as one recognizes a familiar animal or a letter of the alphabet.
Thrun and his team of computer scientists designed a machine-learning algorithm to distinguish skin cancer from benign skin conditions such as acne or moles, hoping the system would learn to recognize cancer the same way a radiologist does—as a familiar picture.
Read the full article at The New Yorker by following the link below.