As a recent graduate of a UK medical school and current Foundation Year One doctor, I have had little exposure to dermatology, and even less to artificial intelligence (AI). However, more and more doctors are becoming exposed to and involved in AI, and many suggest it will occupy a large part of our working lives by the time current trainees are fully qualified. In this article I will assess some of the larger research studies involving AI in dermatology and look at what seems likely for the future, from the perspective of a junior doctor.
AI describes technology that can mimic the way the human brain solves problems, and it remains almost as difficult to explain as the brain itself. It stretches from basic algorithms with set outputs for set inputs to machine learning, in which computer programs are, in essence, able to rewrite their own rules as they are fed new data. In dermatology, as in other fields, programmers have developed convolutional neural networks (CNNs), in which multiple hidden layers allow the technology to learn different ways of interpreting the data. A simple neural network has one hidden layer that assesses the data; a CNN can have thousands. In the case of dermatology, the input is a group of pixels from an image, and the networks are trained on images with a known output: the diagnosis. By processing thousands of images with known diagnoses, they learn to weight the neuron connections between the hidden layers, creating a working neural network.
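The mechanics described above can be sketched in a few lines. The following is a toy illustration only: the "images" are synthetic four-pixel inputs with an invented labelling rule, and there is a single small hidden layer. Real dermatology CNNs use convolutional layers over far larger images and millions of parameters, but the principle of adjusting connection weights from labelled examples is the same.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "images": 4 pixels each; label 1 if mean brightness > 0.5.
# Purely illustrative, standing in for images with a known diagnosis.
X = rng.random((200, 4))
y = (X.mean(axis=1) > 0.5).astype(float)

# One hidden layer of 8 neurons (a CNN stacks many such layers).
W1 = rng.normal(0, 0.5, (4, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(2000):
    # Forward pass: weighted connections into the hidden layer, then output.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2).ravel()
    # Backward pass: nudge every weight to reduce the prediction error.
    grad_out = (p - y)[:, None] / len(X)
    grad_h = (grad_out @ W2.T) * (1 - h**2)
    W2 -= lr * (h.T @ grad_out); b2 -= lr * grad_out.sum(axis=0)
    W1 -= lr * (X.T @ grad_h);   b1 -= lr * grad_h.sum(axis=0)

# Evaluate the trained network on its training data.
h = np.tanh(X @ W1 + b1)
p = sigmoid(h @ W2 + b2).ravel()
accuracy = ((p > 0.5) == (y == 1)).mean()
print(f"training accuracy: {accuracy:.2f}")
```

The point is only that "training" means repeatedly adjusting the connection weights until the network's outputs match the known labels; nothing here reflects any commercial dermatology AI.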
A multitude of research studies have assessed different AI software in different ways. Most commonly they have focused on an AI's ability to detect melanoma from a single image. There are retrospective trials on suspicious lesions comparing an AI's image-based diagnoses with those of dermatologists, and studies simply evaluating an AI's ability to distinguish melanomas, using biopsy as the definitive comparison. The headline from most of these studies, and from subsequent systematic reviews, is that AI outperforms dermatologists in these research scenarios in both sensitivity and specificity for melanoma [1-3]. In studies without a dermatologist comparison, sensitivity and specificity are impressive, generally both over 90%, with the lowest around 70% [4-6]. However, these studies do not escape critical analysis. Most obviously, they use historic data sets; none of them are randomised controlled trials. Pre-processing, the practice of identifying and excluding certain images before analysis, is also common, which raises the question of how this would be done in the clinical environment.
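For readers unfamiliar with the two headline metrics, here is a minimal illustration of how sensitivity and specificity are computed against a biopsy-confirmed ground truth. The numbers are invented for illustration only.

```python
def sens_spec(predictions, truths):
    """predictions/truths: 1 = melanoma, 0 = benign (truth from biopsy)."""
    tp = sum(p == 1 and t == 1 for p, t in zip(predictions, truths))
    tn = sum(p == 0 and t == 0 for p, t in zip(predictions, truths))
    fn = sum(p == 0 and t == 1 for p, t in zip(predictions, truths))
    fp = sum(p == 1 and t == 0 for p, t in zip(predictions, truths))
    sensitivity = tp / (tp + fn)  # proportion of melanomas correctly flagged
    specificity = tn / (tn + fp)  # proportion of benign lesions correctly cleared
    return sensitivity, specificity

# Ten hypothetical lesions: four melanomas, six benign.
truths      = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
predictions = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]
sens, spec = sens_spec(predictions, truths)
print(sens, spec)  # 0.75 and ~0.833
```

In a screening context a missed melanoma (a false negative, lowering sensitivity) is far more costly than an unnecessary referral (a false positive, lowering specificity), which is why the later discussion treats sensitivity as the key measure.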
A further common critique of the studies that compare an AI's assessment of a single image against a dermatologist's is the lack of collateral information available to the dermatologist. A dermatologist will ask questions about a lesion; finding out the time of onset, or how the lesion has changed, may affect the diagnosis. One thing I have wondered, without detailed knowledge of the software, is why this quantifiable information has not been built into an AI. A picture, along with two or three binary questions, could provide a more rounded data set. However, this would require a significant step back from the data sets AI companies have accrued so far.
These data sets are themselves a further issue: an AI will only assess a piece of information in reference to the information it has previously been given, so ensuring that data sets are broad is vital. If fewer than 10% of melanomas are amelanotic, data sets will contain less information on amelanotic melanomas, possibly making such obscure diagnoses harder for an AI [7]. The same goes for lesions on different skin colours. The data set an AI is tested on is also questionable: one AI that reported an average sensitivity and specificity of 85.1% and 81.3% respectively showed a sensitivity of only 29% when an external group tested it on a different data set [8]. This raises questions about the true applicability of a given AI to different populations. The hope is that this will be solved by more inclusive data sets, which companies are growing by the hundreds of thousands of images, along with further RCT research assessing AI in new, previously unseen populations.
AI companies are working on the critiques above; data sets grow and software improves continuously. The main gap in the evidence is the lack of randomised controlled trials (RCTs): as a recent BJD review noted, none are even registered as in progress [9]. As the gold standard of medical research, RCTs would likely allow a more complete acceptance of AI in the medical community; strong RCT evidence would be much harder for clinicians to ignore or criticise.
Delving into the plethora of publications on inflammatory skin disease is beyond the scope of this article, but systematic reviews show great potential in this area too [10].
One argument against much of the negative critical analysis is that multiple companies have already been granted CE medical device status in the EU. With governing bodies allowing AI technology to be implemented, this will be enough for many doctors and patients to trust it. For this group, the question now is implementation.
Applications for AI in the British Healthcare system
Direct to consumers
In most trials of AI technology, the AI is set against a control of qualified dermatologists, or sometimes trainees, all of whom have had significantly more training in clinical dermatology than the General Practitioners (GPs) who see patients' skin issues first. This is, in my opinion, one of the key reasons why AI's most applicable current use is in assisting, or even bypassing, GP services. As a direct-to-consumer product, AI technology is an alternative not to dermatologists but to GPs: in the UK, the GP is the first healthcare professional consulted about a skin issue.
Throughout five years of medical school I received just one week of mandatory clinical placement in dermatology, and this is similar across the UK. If I choose to become a GP I will receive no further dermatology placements. Yet GPs in the UK will see around 13 million dermatological cases this year alone [11], leading to around 870,000 referrals to dermatologists a year. Any relative reduction here would produce vast absolute savings for GPs. To show this crudely, 13 million dermatological consultations at the mean GP consultation time of 7 minutes amounts to over 1.5 million hours of work. This is also time GPs could spend using skills in which they are adequately trained, benefitting not just dermatology but healthcare as a whole; having such a large share of GP time spent on something they have so little training in is inefficient.

Smartphone apps pose a viable option here. Some criticism stems from the varying sensitivities and specificities of these apps [12], but one cannot use the varying quality of a class of products as evidence against a single product: each piece of technology can be, and is, assessed individually by governing bodies. A 2019 NHS review asked medical schools to increase dermatological training to a minimum of four weeks because GP caseload is so heavily dermatological. This has clearly not been implemented, and with pressure from many larger specialties such as general practice and psychiatry for increased training time, it seems unlikely it will be. There must, then, be a solution for GPs seeing so many dermatological cases without adequate training, and AI looks a perfect candidate.
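The crude workload arithmetic above can be checked directly, using the figures as quoted in the text:

```python
# Back-of-envelope check of the quoted GP workload figures:
# ~13 million dermatology consultations/year at a ~7-minute mean consultation.
consultations = 13_000_000
minutes_each = 7
hours = consultations * minutes_each / 60
print(f"{hours:,.0f} GP hours per year")  # ≈ 1.5 million hours

# Quoted referral volume, as a share of dermatology consultations.
referrals = 870_000
print(f"referral rate: {referrals / consultations:.1%}")  # ≈ 6.7%
```

Even a modest relative reduction in either figure therefore frees an enormous absolute amount of GP time.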
An important note: whilst some still question the relative sensitivity and specificity of dermatologists versus AI because of the quality of the evidence, GP sensitivity and specificity come nowhere near either. Dermatologists find skin cancer in only 11.8% of the referrals they receive from GPs. Making an AI the gold standard for referral to dermatology would obviously be harder to implement than having GPs use it as a tool, though it is likely to be more accurate in referrals than GPs alone. In this screening setting sensitivity matters most, and a comparison of GPs against AI with sensitivity as the key measure could prove useful research. Current research suggests that with sensitivity pushed into the high 90s, specificity is sacrificed, but not to the extent implied by that 11.8% figure [2,13]. Implementing AI in GP practices, both to reduce GP pressure and to cut unnecessary referrals to dermatologists, seems a clear improvement in the quality of healthcare provision in the UK.
There are further uses for AI as a diagnostic tool. Dermatologists are extremely good at diagnosing melanomas, with biopsy as the gold standard. Currently around seven skin lesions are biopsied for every melanoma found, so the "number needed to biopsy" is seven [14]. AI as an assistive tool has been shown to increase sensitivity, preventing missed melanoma diagnoses [4]. If AI also increases specificity, there is scope for the number needed to biopsy to fall, reducing not only strain on the system but also the stress on patients who must undergo a procedure and wait for potentially life-changing results. In this application the quality of the technology should come under greater scrutiny, and it does. For diagnostic use it seems likely that a more standardised photograph would be needed; whether through dermoscopic images or standardised lighting and camera quality, research shows that higher-quality images lead to higher sensitivity and specificity. This makes sense from the technology's point of view: a higher-quality image contains more information, and with more information as input, more accurate outputs can be obtained.
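The "number needed to biopsy" is simply the reciprocal of the positive predictive value of the decision to biopsy: biopsies performed per melanoma found. The sketch below shows how a gain in specificity could lower it. The prevalence and the sensitivity/specificity values are invented for illustration, chosen only so that the baseline reproduces the quoted figure of seven.

```python
def number_needed_to_biopsy(sensitivity, specificity, prevalence):
    # Positive predictive value of the biopsy decision, then its reciprocal.
    tp = sensitivity * prevalence              # melanomas correctly biopsied
    fp = (1 - specificity) * (1 - prevalence)  # benign lesions biopsied
    ppv = tp / (tp + fp)
    return 1 / ppv

# Hypothetical figures: 5% melanoma prevalence among biopsy candidates,
# 95% sensitivity held constant, specificity improved from 70% to 85%.
current = number_needed_to_biopsy(0.95, 0.70, 0.05)   # -> 7.0
with_ai = number_needed_to_biopsy(0.95, 0.85, 0.05)   # -> 4.0
print(round(current, 1), round(with_ai, 1))
```

Under these invented assumptions, a 15-point gain in specificity at constant sensitivity would cut biopsies per melanoma from seven to four; the real effect would depend on the actual operating point of any deployed AI.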
Nor should we forget the benefit in diagnosing inflammatory dermatological conditions. GPs see inflammatory skin disease even less often than skin lesions, so distinguishing eczema from psoriasis, or acne from rosacea, can be difficult, and later diagnosis and intervention of course worsen outcomes. AI technology can remind GPs of obscure inflammatory differentials they have not studied for 15 years, and so reduce mistreatment.
How AI is perceived
For these "direct to consumer" AI products, such as smartphone applications, public opinion is clearly vital. The option to see a GP is unlikely to be removed, so it will be the patient's choice whether to use a GP or an AI. Recent studies show that people's main issues with AI are privacy and mistrust [15]. Perhaps this will push companies to reduce the personal data they collect. An open culture between the medical world, technology companies and the public must develop if the public is to engage with AI.
The opinion of healthcare professionals is also important. A recent review of dermatologists' opinions on AI revealed that their largest concern was liability in cases of machine error [16]. There does not yet seem to be any case law in this area, and the European Commission has yet to produce a detailed report on how it expects liability to be handled, though it is aware of the issue and looking to reach a consensus in the near future [17].
The NHS wants AI use to start from the idea of identifying a problem and producing a solution [18]. As one example, AI can be a solution for a stretched healthcare system with melanoma survival rates below those of other developed nations. There are further applications still: from reducing GP consultations to lowering the number needed to biopsy, the benefits lie not just in taking pressure off doctors but in making their decisions more accurate. To strengthen the evidence base, RCTs should be performed; the existing evidence is solid, but RCTs would provide evidence the wider community could not ignore. As a newcomer to the field I may be missing an obvious reason why these have not occurred, but with the suggestions being made by multiple research reviews, I would hope they are possible soon.
How AI will be implemented is not yet clear. Some private companies are working directly with dermatologists, the public and GPs; others are gaining contracts with the NHS. Personally, I believe the implementation with the largest scope for change and benefit is in assisting GPs' onward referrals. For all who are aware of the potential changes AI could bring, it will be very interesting to watch how developments unfold.
About the author
My name is Robert Hill and I am a Foundation Year One doctor in the UK. I heard about First Derm shortly after discovering AI in dermatology while reading the NHS 2019 review of AI. I have an interest in dermatology and hope to pursue it through internal medicine training in the UK after completing my Foundation Year Two training.
As a junior doctor with little exposure to dermatology, and even less to AI, I am by no means an expert in this field, so please feel free to comment with any corrections, opinions or questions below.
1. Han SS, Lim W, Kim MS, Park I, Park GH, Chang SE. Interpretation of the outputs of deep learning model trained with skin cancer dataset. J Invest Dermatol. 2018.
2. Esteva A, Kuprel B, Novoa RA, et al. Dermatologist-level classification of skin cancer with deep neural networks. Nature. 2017.
3. Marchetti MA, Codella NCF, Dusza SW, et al. Results of the 2016 international skin imaging collaboration international symposium on biomedical imaging challenge: comparison of the accuracy of computer algorithms to dermatologists for the diagnosis of melanoma from dermoscopic images. J Am Acad Dermatol. 2018.
4. Thomsen K, Iversen L, Titlestad TL, Winther O. Systematic review of machine learning for diagnosis and prognosis in dermatology. J Dermatol Treat. 2019.
5. Brinker TJ, Hekler A, Enk AH, Klode J, Hauschild A, Berking C, et al. Deep learning outperformed 136 of 157 dermatologists in a head-to-head dermoscopic melanoma image classification task. Eur J Cancer. 2019.
6. Souza S, Abe JM. Nevus and melanoma paraconsistent classification. Stud Health Technol Inform. 2014.
7. Pizzichetta MA, Talamini R, Stanganelli I, et al. Amelanotic/hypomelanotic melanoma: clinical and dermoscopic features. Br J Dermatol. 2004;150:1117-24. DOI:10.1111/j.1365-2133.2004.05928.x.
8. Navarrete-Dechent C, Dusza SW, Liopyris K, Marghoob AA, Halpern AC, Marchetti MA. Automated dermatological diagnosis: hype or reality? J Invest Dermatol. 2018.
9. Charalambides M, Flohr C, Bahadoran P, Matin R. New international reporting guidelines for clinical trials evaluating effectiveness of artificial intelligence interventions in dermatology: strengthening the SPIRIT of robust trial reporting. Br J Dermatol. 2021;184(3):381-383.
10. Gomolin A, Netchiporouk E, Gniadecki R, Litvinov I. Artificial intelligence applications in dermatology: where do we stand? Front Med. 2020;7.
11. Bad.org.uk [Internet]. 2021 [cited 4 July 2021]. Available from: https://www.bad.org.uk/library-media/documents/consultant%20physicians%20working%20with%20patients%202013.pdf
12. Chuchu N, Takwoingi Y, Dinnes J, Matin RN, Bassett O, Moreau JF, et al. Smartphone applications for triaging adults with skin lesions that are suspicious for melanoma. Cochrane Database Syst Rev. 2018.
13. Maron R, Utikal J, Hekler A, Hauschild A, Sattler E, Sondermann W, et al. Artificial intelligence and its effect on dermatologists' accuracy in dermoscopic melanoma image classification: web-based survey study. J Med Internet Res. 2020;22(9):e18091.
14. Klebanov N, Shaughnessy M, Gunasekera N, Tan S, Tsao H. 15881 Number-needed-to-treat analysis of skin cancers among referrals for suspicious lesions. J Am Acad Dermatol. 2020;83(6):AB49.
15. Sangers T, Wakkee M, Kramer-Noels E, Nijsten T, Lugtenberg M. Views on mobile health apps for skin cancer screening in the general population: an in-depth qualitative exploration of perceived barriers and facilitators. Br J Dermatol. 2021.
16. Scheetz J, Rothschild P, McGuinness M, Hadoux X, Soyer H, Janda M, et al. A survey of clinicians on the use of artificial intelligence in ophthalmology, dermatology, radiology and radiation oncology. Sci Rep. 2021;11(1).
17. Gerke S, Minssen T, Cohen I. Ethical and legal challenges of artificial intelligence-driven health care. SSRN Electronic Journal. 2020.
18. Nhsx.nhs.uk [Internet]. 2021 [cited 25 June 2021]. Available from: https://www.nhsx.nhs.uk/media/documents/NHSX_AI_report.pdf