Researchers at Stanford University have made a potential breakthrough in the diagnosis of common skin cancers by developing software capable of identifying their common signs and symptoms. They used a database of 129,450 images to train a deep-learning algorithm to identify 2,032 different skin diseases. This programming technique mimics brain function by creating artificial neural networks, allowing researchers to ‘teach’ algorithms to identify patterns in training images and then apply this knowledge to new photos. The resulting algorithm performed on par with 21 expert clinicians over a series of 130,000 tests.
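To give a flavour of the underlying idea, the sketch below trains a single artificial ‘neuron’ on labelled examples and then applies what it has learned to classify inputs. This is a deliberately tiny toy with synthetic data, not the Stanford team’s actual model, which uses a deep network with many layers trained on real clinical photographs.

```python
import math
import random

# Toy illustration of supervised learning: one artificial "neuron"
# adjusts its weights from labelled examples, then classifies inputs.
# Features and labels are synthetic stand-ins, not dermatology data.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(examples, lr=0.5, epochs=200):
    """Fit weights so the neuron separates the two labelled classes."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in examples:
            p = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
            err = p - y  # gradient of the log-loss for this example
            w[0] -= lr * err * x[0]
            w[1] -= lr * err * x[1]
            b -= lr * err
    return w, b

def predict(w, b, x):
    return 1 if sigmoid(w[0] * x[0] + w[1] * x[1] + b) >= 0.5 else 0

random.seed(0)
# Class 0 clusters near (0, 0); class 1 clusters near (2, 2).
data = [((random.gauss(0, 0.3), random.gauss(0, 0.3)), 0) for _ in range(50)]
data += [((random.gauss(2, 0.3), random.gauss(2, 0.3)), 1) for _ in range(50)]

w, b = train(data)
accuracy = sum(predict(w, b, x) == y for x, y in data) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

Real image classifiers work on the same learn-from-labelled-examples principle, but with millions of weights and pixel inputs rather than two hand-made features.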
These new algorithmic tools have the potential to transform the medical field due to their adaptable and scalable nature. Medical specialists may be able to draw on such clinical diagnostic algorithms to support their diagnoses and decision-making in a wide range of fields. The Israeli start-up Zebra Medical Vision, for example, is developing a suite of tools to detect breast cancer and predict cardiovascular trouble.
Although the algorithm currently runs only on computers, the Stanford team is keen to bring it to mobile phones so that users can submit photos they have taken themselves for diagnosis. The algorithm is currently trained on high-quality images captured in clinical settings, so for this scaling to work, the software will have to be adapted and trained to process images from mobile phone cameras. If the team succeeds, these new algorithms could transform the way primary care and diagnosis work across the medical sector.