Can AI Enhance Physician Empathy? Exploring the Pros and Cons

Written by Mireia Cuevas Crespo (Reporter)

The fact that AI is shaping our current understanding of healthcare is undeniable.

Thanks to AI, many areas of modern medical care, including vaccine development, the diagnosis and treatment of depression, and cancer care, are seeing promising developments that are shaping current preventive, diagnostic and therapeutic medical procedures.

However, concerns surrounding physician empathy in modern medicine are receiving growing attention. Fortunately, AI might be able to help address this issue.

An Issue of Unempathetic Healthcare

Lately, training courses aimed at enhancing communication skills among healthcare professionals have proliferated. However, under pressure to see large numbers of patients quickly whilst keeping up with time-consuming administrative duties, physicians often struggle to convey empathy to patients and their families during critical circumstances.

Physician burnout, and the negative effect it has on medical empathy, lowers the current standard of patient care. However, research suggests that enhanced physician empathic skills could benefit patients suffering from a range of clinical disorders.

The Potential of AI in Boosting Physician Empathy

Several recent studies are investigating whether implementing AI in medical care could enhance physician empathy. Some of the potential benefits include:

  • AI might free doctors from healthcare administrative responsibilities, allowing them to invest more time in communicating with patients and engaging with them more empathically. This extra time could be critical for better assisting patients and families during times of crisis.
  • Artificial intelligence systems are already being developed to improve medical responses to patients' needs. This could help doctors provide improved emotional support to their patients.
  • AI can quickly analyse large volumes of patient data. This could enable doctors to rapidly evaluate specific individual needs and design personalised treatment plans for different patients suffering from the same disease.
  • AI might be used to provide training to healthcare personnel to enhance their empathic communication skills, promoting the implementation of more tailored care plans for different illnesses.

“With AI, doctors would have more time to look patients in the eyes during consultations.” – Ricard Mesia, Head of the Medical Oncology Department at the Catalan Institute of Oncology-Badalona

Could AI Help Doctors Be More Empathetic?

Despite the potential advantages of using AI to improve physician empathy, AI has inherent limitations that cannot be ignored. These are some of the main concerns:

  • There is considerable controversy surrounding AI’s capacity to successfully emulate human empathy, which is characterised by a genuine understanding of human emotions and a real flow of emotion between doctor and patient. Several researchers maintain that genuinely empathic AI faces obstacles in principle.
  • Over-relying on technology to address empathy concerns may lead to the dehumanisation of the patient-doctor relationship, in which human-to-human contact is considered essential to maintaining effective healthcare.
  • There are concerns about the reliability of some AI devices currently employed in several domains of healthcare. Some of these tools do not appear to meet clinical validation requirements, so the safety of their use in medical care is unclear.
  • To function correctly, AI-powered devices need access to vast datasets containing private patient information. Patients are often unclear about how their confidential data is being used, which raises significant concerns regarding patient privacy and trust in AI.

“It will take a long time until we can fully trust the implementation of AI in any area of medicine. Technology can always fail, and AI-driven diagnoses are less likely to be trusted by patients.” – Laura M. Carbonell, Health Psychologist and Lecturer at the London School of Social Science and Technology

Bright yet Uncertain Future Prospects

The application of AI to promote medical empathy could represent an innovative way to improve current patient care. Relying on AI-powered technologies may reduce the time physicians spend on administrative duties, potentially allowing them to provide more effective emotional support to their patients.

However, these promising prospects are accompanied by significant challenges. Numerous regulatory, privacy and device reliability concerns must be addressed, and further research is needed to ensure that the use of AI tools in healthcare safeguards the needs of patients.

To overcome these challenges, AI tools must be appropriately adapted and regulated in accordance with the needs of modern, human-led healthcare. To fully exploit the potential benefits of AI in medicine, physicians must therefore learn to take advantage of AI instruments to create a more efficient system whilst preserving the human aspect of healthcare.

Sources

  1. Botrugno C. (2021) “Information technologies in healthcare: Enhancing or dehumanising doctor–patient interaction?” Health, 25(4), 475-493. https://doi.org/10.1177/1363459319891213
  2. Crudden G., Margiotta F., Doherty A.M. (2023) “Physician burnout and symptoms of anxiety and depression: Burnout in Consultant Doctors in Ireland Study (BICDIS).” PLoS ONE, 18(3), e0276027. https://doi.org/10.1371/journal.pone.0276027
  3. Howick J., Moscrop A., Mebius A., et al. (2018) “Effects of empathic and positive communication in healthcare consultations: a systematic review and meta-analysis.” Journal of the Royal Society of Medicine, 111(7), 240-252. https://doi.org/10.1177/0141076818769477
  4. Ayers J.W., et al. (2023) “Comparing physician and chatbot responses to patient questions.” JAMA Internal Medicine. https://jamanetwork.com/journals/jamainternalmedicine/fullarticle/2804309
  5. Montemayor C., Halpern J., Fairweather A. (2022) “In principle obstacles for empathic AI: why we can’t replace human empathy in healthcare.” AI & Society, 37, 1353-1359. https://doi.org/10.1007/s00146-021-01230-z
  6. Reddy S., Fox J., Purohit M.P. (2019) “Artificial intelligence-enabled healthcare delivery.” Journal of the Royal Society of Medicine, 112(1), 22-28. https://doi.org/10.1177/0141076818815510
  7. Williamson S.M., Prybutok V. (2024) “Balancing Privacy and Progress: A Review of Privacy Challenges, Systemic Oversight, and Patient Perceptions in AI-Driven Healthcare.” Applied Sciences, 14(2), 675. https://doi.org/10.3390/app14020675