Skin cancer can now be diagnosed using artificial intelligence. A neural network classified sets of skin-lesion pictures with 91 percent accuracy, matching the performance of actual dermatologists.
Artificial intelligence, or simply AI, has become a staple of modern computing, extending into everyday use on mobile phones and other applications. The best-known AIs today are Google Assistant, Amazon's Alexa, Microsoft's Cortana, and Apple's Siri.
AIs are envisioned as aids to human activity, built around natural communication between a computer and a person. But beyond conversation and basic assistance, AI can also be put to practical use in health care and diagnosis.
A team from Stanford University, co-led by Andre Esteva, used a branch of artificial intelligence called machine learning to have a computer analyze pictures that may or may not show signs of skin cancer. The objective was to let the computer diagnose the skin condition on its own.
The team did not need to hand-code every detail of a photo to flag which ones need further medical attention. Instead, they wrote an algorithm that "learns" on its own which photos display skin cancer. A dataset of labeled images was fed to the computer, and as the neural network picked up recurring patterns, it came to "understand" which features in a picture to look for in order to give a proper diagnosis.
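To make that idea concrete, here is a minimal sketch of how labeled photos are typically fed to a learning algorithm. This is not the Stanford team's code; the folder layout, class names, and settings are assumptions for illustration.

```python
# Minimal sketch of supervised image data loading with PyTorch.
# The directory layout and class names below are hypothetical.
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Each subfolder name acts as the label for the images inside it:
#   skin_photos/benign/...jpg
#   skin_photos/malignant/...jpg
transform = transforms.Compose([
    transforms.Resize((224, 224)),  # uniform size for the network
    transforms.ToTensor(),          # pixel values as tensors
])
dataset = datasets.ImageFolder("skin_photos", transform=transform)
loader = DataLoader(dataset, batch_size=32, shuffle=True)

# The network never receives hand-coded rules, only (image, label)
# pairs; it infers the distinguishing visual features itself.
for images, labels in loader:
    print(images.shape, labels[:5])
    break
```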
Known as a deep convolutional neural network, the algorithm can make classification decisions on its own. By the time it was tested, it had been trained on 1.28 million images spanning thousands of categories. When pitted against dermatologists, the neural network made diagnoses with similar accuracy, and at some points slightly better.
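A deep convolutional network of this kind is usually not trained from scratch on medical photos alone. A common recipe, and an assumption here since the article does not name the team's architecture, is to start from a network pretrained on a large general image collection and fine-tune its final layer for the new task:

```python
# Illustrative transfer-learning sketch; the ResNet-18 architecture
# and all hyperparameters are assumptions, not the team's setup.
import torch
from torch import nn
from torchvision import models

# Load a network pretrained on a large generic image dataset and
# replace its final layer for a two-class skin-lesion task.
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, 2)  # e.g. benign vs. malignant

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

# A random batch stands in for real lesion photos so the sketch runs
# end to end; in practice this would come from a labeled DataLoader.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))

model.train()
optimizer.zero_grad()
loss = loss_fn(model(images), labels)
loss.backward()
optimizer.step()
print(f"one fine-tuning step done, loss = {loss.item():.3f}")
```

The appeal of this approach is that the pretrained layers already detect generic visual features such as edges and textures, so only a relatively small labeled medical dataset is needed to adapt the network to the new diagnosis task.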
This development could make it possible to build smartphone apps that diagnose skin cancer and other skin problems. However, the current algorithm is reliable mainly for lighter skin, since it was trained largely on such images and may misclassify lesions on darker skin. A dataset of darker complexions must first be "learned" before the app can become a global medical tool.