How can AI detect our feelings of pain?
Researchers have developed an automated AI-driven pain recognition system that could provide an unbiased method of recognizing patient pain prior to, during and following surgery.
An ache, a sudden sharp twinge, a dull throb… there are countless ways our bodies let us know we are in pain. These sensations are subjective, personal experiences that can be difficult to describe to anyone who is not going through them. But now, an automated AI-driven pain recognition system may help convey those feelings of pain.
Presented at the ANESTHESIOLOGY® 2023 annual meeting (13–17 October, CA, USA), new research demonstrates the potential of AI to provide an unbiased means of detecting pain.
Patient pain is currently evaluated using subjective examinations and questionnaires, such as the Critical-Care Pain Observation Tool (CPOT) and the Visual Analog Scale (VAS). With the CPOT, healthcare professionals rate a patient’s pain by observing their facial expressions, body movements, muscle tension and ventilator compliance; with the VAS, patients rate their own pain.
These traditional pain assessment methods can be affected by cultural and racial biases, which can in turn lead to poorer pain management and health outcomes. Previous studies have shown that prompt detection and treatment of pain shorten hospital stays and help patients avoid long-term health conditions, including anxiety, depression and chronic pain.
The automated pain recognition system combines two types of AI: computer vision, which captures and processes images of the patient’s face, and deep learning, which analyzes those images for signs of pain.
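To make that division of labor concrete, here is a minimal sketch of how such a pipeline might be wired together. It is not the researchers’ code: the PyTorch framework, the pretrained ResNet-18 backbone and the preprocessing values are all illustrative assumptions.

```python
# Illustrative sketch (assumed architecture, not the study's actual model):
# a computer-vision stage turns a face image into a normalized tensor, and
# a deep-learning classifier maps that tensor to "pain" vs. "no pain".
import torch.nn as nn
from torchvision import models, transforms

# Computer-vision stage: resize and normalize the detected face crop.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),                      # assumed input size
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],    # ImageNet statistics
                         std=[0.229, 0.224, 0.225]),
])

# Deep-learning stage: a pretrained backbone with a two-class output head.
class PainClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, 2)  # pain / no pain

    def forward(self, face_tensor):
        return self.backbone(face_tensor)
```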
To test the system, the team used 143,293 facial images from 115 pain episodes and 159 non-pain episodes in 69 patients who had undergone various surgical procedures, ranging from simple knee replacements to complex heart surgeries. The system was trained by being told whether each facial image represented pain or no pain, until it began to recognize the patterns on its own.
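In machine-learning terms, this is supervised training on labeled images. A minimal sketch of such a loop is below; the optimizer, batch size and learning rate are assumptions for illustration, not details reported in the study.

```python
# Illustrative supervised training loop: each face image carries a label of
# 1 (from a pain episode) or 0 (from a non-pain episode), and the model
# learns the mapping by minimizing cross-entropy on those labels.
import torch
from torch.utils.data import DataLoader

def train(model, labeled_faces, epochs=5, lr=1e-4, device="cpu"):
    loader = DataLoader(labeled_faces, batch_size=32, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = torch.nn.CrossEntropyLoss()
    model.to(device).train()
    for _ in range(epochs):
        for images, labels in loader:      # labels: 0 = no pain, 1 = pain
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
    return model
```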
The team discovered that the AI tools concentrated on facial expressions and muscles in specific areas, including the lips, eyebrows and nose. When the team analyzed the pain recognition performance of the trained system, it was, remarkably, consistent with VAS results 66% of the time and with CPOT results 88% of the time.
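Agreement here is simply the share of assessments in which the AI’s call matches the clinical one. A hypothetical calculation, using made-up labels rather than the study’s data, might look like this:

```python
# Hypothetical agreement calculation: the fraction of assessments where the
# AI's pain / no-pain call matches the clinical (CPOT or VAS-based) label.
def agreement_rate(ai_calls, clinical_calls):
    matches = sum(a == c for a, c in zip(ai_calls, clinical_calls))
    return matches / len(clinical_calls)

# Toy example: 88 matching calls out of 100 would give 0.88, i.e. 88% agreement.
```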
“The VAS is less accurate compared to CPOT because VAS is a subjective measurement that can be more heavily influenced by emotions and behaviors than CPOT might be,” said Timothy Heintz, a medical student at the University of California San Diego (USA). “However, our models were able to predict VAS to some extent, indicating there are very subtle cues that the AI system can identify that humans cannot.”
“Concerns about privacy would need to be addressed to ensure patient images are kept private, but the system could eventually include other monitoring features, such as brain and muscle activity to assess unconscious patients.”
“Further, there is a gap in perioperative care due to the absence of continuous observable methods for pain detection. Our proof-of-concept AI model could help improve patient care through real-time, unbiased pain detection.”
If these results are validated, the system could vastly improve patient care and health outcomes by using cameras to monitor patients’ surgical recovery. This would ease the workload of healthcare professionals and free them to spend more time on other areas of care.