Before delving into the content of this essay, it is crucial to understand why this argument matters. Cancer can be a fatal disease, and the chances of survival, especially for skin cancer, vary greatly depending on how early and accurately it is diagnosed (The Skin Cancer Foundation, 2019). Distinguishing cancerous cells from benign ones requires not only medical knowledge but precision and intelligence. Would a machine be able to diagnose a patient with skin cancer?
When arguing whether Artificial Intelligence (AI) should or should not be used in the diagnosis of skin cancer, it is vital to compare machine diagnosis against a pathologist with a microscope in order to see which is, essentially, better. The categories of comparison are as follows: accuracy of diagnosis; empathy for the patient; and feasibility and accessibility (Gordon A, 2018).
Artificial Intelligence is becoming dominant across the world, which makes this project all the more captivating. This essay explains how vital early diagnosis of skin cancer is to the survival of the patient. However, the main purpose of this research is to present the arguments for and against the use of apps, machinery and algorithms in skin cancer diagnostics. Questions regarding ethics, patient care, accuracy and more will be addressed in the findings of this research.
What is Artificial Intelligence?
In order to comprehend what Artificial Intelligence means, it is vital to first define intelligence. Not only is the meaning of intelligence subjective, but it is constantly changing. In the 15th century it was already perceived as a form of 'superior understanding', and by 1580 it was frequently used in military circles to mean 'secret information from spies' (De Spiegeleire, Maas & Sweijs, 2017, pp. 23-42). In 2007, Marcus Hutter and Shane Legg conducted a survey of over 70 definitions of intelligence, one of them being 'to collect information and form an impression thus leading one to finally understand' (Smith C, 2006). The study concluded that it is difficult to narrow intelligence down to a single definition, but the general idea conveyed in all of them is the same. The functional definition devised by Stanford University's Formal Reasoning Group (2017) covers both natural and artificial forms of intelligence: 'Intelligence is the computational part of the ability to achieve goals in the world'. The purpose of this ability known as intelligence is, as stated above, to achieve goals. Intelligence can be applied to anything from solving a puzzle or buying a train ticket to analysing samples to diagnose skin cancer.
Now that the definition of intelligence has been clarified, the concept of Artificial Intelligence can be understood. AI is a truly broad subject with many aspects: it ranges from complex machinery that can think and comprehend like humans all the way to simple algorithms used to play board games. The term Artificial Intelligence was first coined by John McCarthy in 1956, when he held the first academic conference on the subject. However, the journey to understand whether machines can truly think began much earlier. In 1950, Alan Turing wrote a paper on the notion of machines being able to simulate human beings and to do intelligent things, such as play chess (D. L. Dowe, 1998). Technological advancement in this field began decades ago, which shows that a great deal of research and development was required to get to where it is today.
In a basic sense, Artificial Intelligence is the 'automation of intelligent behavior' (S. Bringsjord, 2003) but, like the term intelligence, it has many deeper layers of meaning and countless definitions. A commonly quoted definition formulated by the US Defence Science Board and upheld in the recent Summer Study on Autonomy is 'the capability of computer systems to perform tasks that normally require human intelligence' (2016). However, it is vital to contextualise and use the definition best suited to healthcare. Artificial intelligence in healthcare is 'the use of algorithms and software to approximate human cognition in the analysis of complex medical data'; specifically, it is the ability of computer algorithms to approximate conclusions without direct human input (M. S. Ali, 2012). This definition is most appropriate because skin cancer diagnostics requires exactly that: the analysis of complex medical data.
Types of Skin Cancer
In the UK, 37 people are diagnosed with melanoma every day (Macmillan editorial team, 2017). When treated early it can usually be cured, but the disease still claims tens of thousands of lives every year. According to the World Health Organisation (2018), skin cancer accounts for one in every three cancers diagnosed worldwide, making it one of the most common cancers. Skin cancer is split into three main categories: basal cell carcinoma, squamous cell carcinoma and melanoma. Basal cell carcinoma (BCC) is a cancer of the basal cells at the bottom of the epidermis. It is occasionally called a rodent ulcer, and about 75% of all skin cancers in the UK are BCCs (NHS, 2017). Most BCCs are very slow-growing and almost never spread to other parts of the body. If a patient has a mole which they suspect to be BCC, it is recommended to get a check-up and, if necessary, a diagnosis within 18 weeks (); this shows that this cancer is not the most fatal, as diagnosis is not of high urgency. Almost all patients with BCC who receive treatment are completely cured.
Squamous cell carcinoma (SCC) is a cancer of the cells in the outer layer of the skin. It is the second most common type of skin cancer in the UK (NHS, 2017). As with BCC, most people treated for SCC are completely cured. Usually, SCCs are slow-growing and only spread to other parts of the body if they are left untreated for a long time. Occasionally, though, they can behave more aggressively and spread at an earlier stage. Both BCC and SCC are non-melanoma cancers and account for, by far, the majority of skin cancer cases. However, the most dangerous and urgent of skin cancers falls into the category of malignant melanoma.
Melanoma develops from melanocytes that start to grow and divide more quickly than usual. When they grow out of control, they usually look like a dark spot or an unusual, odd-shaped mole on the skin. Although this cancer is more common in lighter complexions, it is not exclusive to them. It is also twice as common in females as in males; however, more men die from it. In the UK alone, more than 2,100 Britons die each year from malignant melanoma (). It is important to find and treat melanoma as early as possible. If a melanoma is not removed, the cells can grow deeper into the layers of the skin, and if the melanoma cells get into the blood or lymphatic vessels, they can travel to other parts of the body. As the information above displays, the quality and speed of diagnosis play a large role in a successful recovery, underlining the relevance of this essay question. Melanomas are the most dangerous type of skin cancer as they root furthest down into the skin; given time, a melanoma will reach bones and vital tissues, which is life-threatening.
The Role of AI in the Healthcare industry
As young as AI may be, it is growing rapidly and becoming more and more commonly used. Three centuries ago the UK underwent an industrial revolution (Madsen, 2010), and today the world has entered a digital revolution. One industry that has recently begun using this technology is transport, with the invention of self-driving cars such as Tesla's and the introduction of driverless trains like the DLR service in the UK. This reduces costs and improves efficiency, as a human is no longer required to carry out the job. The financial industry also follows technological advancement with keen interest: big banks such as JP Morgan have been early adopters of disruptive technologies like blockchain (C Hudson, 2018). The use of AI across industries is growing rapidly, and it has many applications in healthcare as well.
If there is one industry that reaches everybody in the world, it is healthcare. With the power to save lives, this industry must be on a continual path towards excellence. In a world which is becoming more and more digitised, it is only natural that many industries will also use technology to improve the efficiency and accuracy of their work. The two main branches of Artificial Intelligence in medicine are virtual and physical. The virtual component includes machine learning (ML) and algorithms, whereas physical AI includes medical devices and robots for delivering care. AI is used successfully in tumour segmentation, histopathological diagnosis, tracking tumour development, and prognosis prediction (Y Vaishali, 2019). Major disease areas that use AI tools include cancer, neurology and cardiology (Jiang F, 2013). Within the last ten years, AI has become prominent in the healthcare industry, and there must be reasons for this.
Skin cancer diagnosing technology
For many years, skin cancer has been detected visually, and this is still the first step towards a diagnosis. Moles are checked for abnormalities and, typically, a biopsy of the skin is taken and examined by specialists under a microscope. The cells are analysed and compared with several other samples before a conclusion is reached. Cancer-diagnosing technology can range from data-analysing software on computers to apps, machines and more. The technology being developed for medical use is typically complicated pattern matching: an algorithm is shown many medical scans of organs with tumours, as well as tumour-free images, and tasked with learning the patterns that differentiate the two categories. The algorithms were shown nearly 200,000 images of malignant, benign, and tumour-free CT scans, in both 3D and 2D. The accuracy of the nodule-detection algorithm, both as it learns to find these tumours and as it would be deployed in a specialist's office, is measured with a metric called 'recall'. Recall tells us the percentage of nodules the algorithm catches, given a set number of false alarms (D Gershgorn, 2018). The engineering behind this technology requires specialist understanding and a great deal of research and development, especially when it is used in diagnosis, as it is a matter of life and death.
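To make the 'recall' metric concrete, the short sketch below shows one way it could be computed from a set of scored detections. The function name, the toy detections and the false-alarm budget are illustrative assumptions, not details taken from the system described above.

```python
# Minimal sketch (hypothetical data): recall at a fixed false-alarm budget.
# Each detection is (confidence_score, is_true_nodule); ground_truth_total is
# the number of real nodules present in the scan set.

def recall_at_false_alarms(detections, ground_truth_total, max_false_alarms):
    """Fraction of real nodules caught before exceeding the false-alarm budget."""
    true_hits = 0
    false_alarms = 0
    # Walk through detections from most to least confident.
    for score, is_true_nodule in sorted(detections, reverse=True):
        if is_true_nodule:
            true_hits += 1
        else:
            false_alarms += 1
            if false_alarms > max_false_alarms:
                break
    return true_hits / ground_truth_total

# Toy example: 5 real nodules, 6 detections, budget of 1 false alarm.
detections = [(0.97, True), (0.91, False), (0.88, True),
              (0.80, True), (0.55, False), (0.40, True)]
print(recall_at_false_alarms(detections, ground_truth_total=5, max_false_alarms=1))
# -> 0.6 (3 of the 5 nodules are caught before the second false alarm)
```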
As the figure above displays, diagnostic imaging is the most heavily considered category of medical data. For skin cancer, this means skin images and biopsies are digitised, and this is becoming common. The benefit of transferring the image onto a computer is that a specialist anywhere in the world can view the patient's sample and analyse it.

App 1 - Mole Detect Pro
In recent years, as technology has developed, dermatologists and software creators have come together to create apps that the average person can use to check themselves for skin cancer. Despite some controversy about skin cancer-diagnosing apps, performing self-exams is recommended by The Skin Cancer Foundation. Skin cancers that are detected early can almost always be stopped from spreading and cured, which is why it is important to examine yourself regularly and stay alert to any changes in your skin (Glynn S, 2018). The aim of skin cancer-detecting apps is to make it easier for the average person to identify whether a mole could be cancerous and then make an appointment with a doctor. An example of such an app is Mole Detect Pro, which claims to 'provide its users with a remote professional diagnosis within 24 hours' using an advanced algorithm to assess the probability of a potential melanoma (Glynn S, 2018). Dr Ashworth said, 'The technology behind this app is pretty impressive'. AI should be used in the diagnosis of skin cancer as it makes the process much quicker and more efficient.

Another app which has been debated over by many analysts and dermatologists is SkinVision, which claims the following: 'SkinVision helps you check your skin for signs of skin cancer with instant results on your phone. Our clinically-proven technology, combined with the knowledge of dermatologists specialized in skin cancer, helps you keep your skin healthy' (SkinVision, 2018). The only major difference between the statements made by Mole Detect Pro and SkinVision is that Mole Detect Pro explicitly claims to provide a diagnosis, whereas SkinVision claims that a proper diagnosis can only be made in combination with the knowledge of a professional. AI should be used in the diagnosis of skin cancer because the net result of using such an app is that many more people with a potential problem will end up seeking a professional diagnosis. This also saves costs and time, as fewer people will visit dermatologists with non-cancerous moles.
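As a purely illustrative sketch of how an app of this kind might turn a model's output into advice, the snippet below thresholds a hypothetical melanoma probability. The model output, the threshold value and the wording of the advice are all assumptions; they do not describe the actual algorithms of Mole Detect Pro or SkinVision.

```python
# Hypothetical sketch only: not the logic of any real app.
# Assume an image-classification model has already produced a melanoma probability.

RISK_THRESHOLD = 0.3  # assumed value; a real app would tune this clinically


def triage_advice(melanoma_probability: float) -> str:
    """Turn a model's risk estimate into a plain-language recommendation."""
    if melanoma_probability >= RISK_THRESHOLD:
        return "This mole shows features of concern: please see a dermatologist."
    return "No features of concern detected: keep monitoring and re-check regularly."


print(triage_advice(0.72))  # high-risk example
print(triage_advice(0.05))  # low-risk example
```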
The controversy around allowing apps to diagnose stems from researchers' belief that there is a lack of rigorous published trials showing that these apps are reliable and therefore safe to use, a shortage of mole images feeding the advanced algorithm that each app is built upon, and general flaws in the technology. Dr Mary Martini, an associate professor of dermatology at Northwestern University's Feinberg School of Medicine, commented: 'When an app tells you something is benign when it isn't, that's a major problem'. Dr Martini's concern is logical and shared by a wide community of oncologists. Instead of saving more lives, this has the potential to do the opposite; however, the accuracy of AI diagnosis compared with a doctor's is higher, so this should not be an issue.
Success rate in diagnosis - Machine vs Man
AI machines can be programmed to see things far better than a human ever will. For example, it is possible for AI to analyse a person's retina and detect whether they are male or female, or to analyse a colonoscopy and pick up polyps that were missed by doctors (Topol E, 2019). This shows that AI is, in simple terms, smarter than a human. A parallel can be seen in simple calculators, which solve problems faster and with 100% accuracy compared to a human.
Now let us look specifically at the accuracy of AI in the diagnosis of skin cancer. A study was conducted by the University of Birmingham (2018) in which a team from the United States, France and Germany instructed an AI system to distinguish fatal skin lesions from benign ones; more than 100,000 images were used. To make this test fair, 58 dermatologists, over half of whom were at 'expert' level with more than 5 years of experience, were also given these images and asked to distinguish between malignant and benign lesions. On average, dermatologists accurately detected 86.6% of skin cancers from the images presented to them, while the AI system successfully identified 95% of the melanomas (British Association of Dermatologists, 2018). This shows that the machine was more sensitive and precise while analysing the images than the professionals were. Many dermatologists have been forced to question whether a machine really can do the job they spent years training for better than they can.
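For clarity, the sensitivity reported here is simply the proportion of actual melanomas correctly flagged. The short calculation below illustrates the metric with made-up counts chosen to match the reported percentages; they are not the study's raw data.

```python
# Illustrative only: hypothetical counts, not the raw data from the study.

def sensitivity(true_positives: int, false_negatives: int) -> float:
    """Proportion of actual melanomas that were correctly identified."""
    return true_positives / (true_positives + false_negatives)

# Suppose 1,000 of the test images contain melanoma.
print(sensitivity(true_positives=950, false_negatives=50))   # 0.95  -> the AI system
print(sensitivity(true_positives=866, false_negatives=134))  # 0.866 -> average dermatologist
```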
Lack of empathy using AI and its implications - Is AI harming the patient-doctor bond?
Despite the results of this study showing that AI is more accurate in diagnosis than dermatologists, there are other important aspects of diagnosing skin cancer that need considering, particularly for malignant melanoma. One of these is the quality of care. A key part of a doctor's role is being able to console patients when conveying distressing news to them or their families.
Whether a machine is able to provide the empathy required while diagnosing skin cancer is not a matter of much controversy: it cannot, unless it is programmed to, and even then ethical issues arise with allowing machines to become comforting robots. The problem with allowing robots to, essentially, display emotions is that those emotions are not real. Patient care is regarded as highly important not only within the foundations of the NHS but also by the World Health Organisation (WHO). If the use of AI becomes predominant in the field of diagnostics, then in order to comply with the 7 Principles of Care (23), the machinery would need to be taught to empathise and communicate in a comforting fashion.
Danielle Krettek, the founder of Google's Empathy Lab, said that her work has contributed to some of the Google Assistant's ability to attune to one's mood. She explained this idea further at the design conference Semi Permanent in Sydney, Australia: 'When you say "I'm feeling depressed", instead of giving you a description of what depression is, it might say, "You know what, a lot of people feel that way; you are not alone".' The ability to empathise is a social skill which can be taught to artificial intelligence and then practised with skin cancer patients during diagnosis. A component of affective empathy is being able to share the emotions of others. Dr Pascal Molenberghs, a social neuroscientist at the University of Melbourne, said, 'We simulate in ourselves the emotions we observe in others', and implied in his speech that if a robot is designed to alter its tone and speech in order to empathise, it may come across as mimicry rather than empathy. Another issue with diagnosing cancer patients using artificial intelligence is that the patient may feel restricted in conveying their internal emotions, and therefore the specialist treating that individual may lose empathy, which could reduce the quality of care further (M Robson, 2018). These professionals believe that AI cannot display empathy the way a doctor can and may essentially sabotage the bond between the patient and the doctor.
In contrast to the notion that AI sabotages the patient-doctor bond, Dr Eric Topol's book 'Deep Medicine: How Artificial Intelligence Can Make Healthcare Human Again' presents an optimistic view of the future of algorithms and medicine. The relationship between patient and doctor has deteriorated over the years; medicine has become depersonalised, and with this new technology individual patients can be deeply understood and given the care they need. The main argument in his book rests on the idea that doctors are 'burned out', 'tired' and 'depressed', now more than ever. In the UK, this could be due to increasing work and study hours with little pay. Doctors simply cannot sustain a friendly and comforting relationship with each and every one of their patients, especially when their role requires empathy, as diagnosing skin cancer does. Dr Topol believes AI can help enhance the human aspect of diagnosis: AI can focus on accuracy and efficiency, so the doctor is able to focus on his or her patient as an individual.
After considering the reasons for and against the use of AI in diagnostics and its impact on the patient-doctor bond, it is debatable whether this bond is lost or resuscitated by AI.