Understanding emotion with the help of AI?

Written by Lauren Coyle (Future Science Group)

UWS academics have developed an AI emotion recognition tool that interprets signals from brain and facial analysis, with the potential to become a novel resource for clinicians, therapists and caregivers.

University of the West of Scotland (UWS) (UK) academics have created a tool for recognizing emotions that could assist individuals with neurodiverse conditions, including autism. Emotion recognition has historically been a complex and challenging area of study. However, advances in vision processing and the availability of affordable devices such as wearable electroencephalogram (EEG) and electrocardiogram (ECG) sensors have enabled UWS academics to develop AI capable of accurately interpreting emotion-related signals from brain and facial analysis.
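The article does not describe the model behind the tool, but a common way to combine EEG, ECG and facial cues is feature-level fusion followed by a standard classifier. The sketch below is purely illustrative: it uses randomly generated stand-in features and a scikit-learn random forest, and none of the feature names, dimensions or labels come from the UWS system.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-trial features (placeholders, not real recordings):
# EEG band powers, ECG heart-rate-variability measures and facial action-unit intensities.
n_trials = 200
eeg_features = rng.normal(size=(n_trials, 32))   # e.g. band power per channel
ecg_features = rng.normal(size=(n_trials, 8))    # e.g. HRV statistics
face_features = rng.normal(size=(n_trials, 17))  # e.g. action-unit intensities

# Hypothetical emotion labels (0 = low arousal, 1 = high arousal).
labels = rng.integers(0, 2, size=n_trials)

# Feature-level fusion: concatenate the modalities into one vector per trial.
X = np.hstack([eeg_features, ecg_features, face_features])

X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("Held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

With real recordings, the random arrays would be replaced by features extracted from the synchronized EEG, ECG and camera streams for each stimulus presentation.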

“Emotions are a fundamental aspect of the human experience, and understanding the signals that trigger different emotions can have a profound impact on various aspects of our lives,” said Professor Naeem Ramzan, Director of the Affective and Human Computing for SMART Environments Research Centre at UWS.

UWS researchers have created a multimodal database for the system, incorporating signals recorded during a study involving audio-visual stimuli. Participants in the study were recorded while self-assessing their emotional responses to each stimulus, rating dimensions such as valence, arousal and dominance. The signals were captured using a camera and wireless wearable equipment, an approach that holds promise for integrating affective computing techniques into everyday applications.
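As a rough illustration of how a single record in such a multimodal database might be organized, the sketch below defines a per-trial data structure holding the synchronized signals and the self-assessment ratings. The field names, sampling rates and rating scale are assumptions made for illustration, not details published by UWS.

```python
from dataclasses import dataclass
import numpy as np


@dataclass
class EmotionTrial:
    """One stimulus presentation for one participant (illustrative schema)."""
    participant_id: str
    stimulus_id: str
    eeg: np.ndarray          # shape: (channels, samples), from the wearable EEG
    ecg: np.ndarray          # shape: (leads, samples), from the wearable ECG
    face_video_path: str     # path to the synchronized camera recording
    valence: int             # self-assessed rating, e.g. on a 1-9 scale
    arousal: int
    dominance: int


# Example record with placeholder signal arrays.
trial = EmotionTrial(
    participant_id="P01",
    stimulus_id="clip_03",
    eeg=np.zeros((14, 128 * 60)),   # e.g. 14 channels at 128 Hz for 60 s
    ecg=np.zeros((2, 256 * 60)),    # e.g. 2 leads at 256 Hz for 60 s
    face_video_path="recordings/P01_clip_03.mp4",
    valence=6, arousal=4, dominance=5,
)
print(trial.participant_id, trial.valence, trial.arousal, trial.dominance)
```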

The research has produced an extensive dataset that can be used alongside wearable technology. Combined with multi-sensor inputs and AI, this dataset becomes a crucial tool for recognizing emotions. It also serves as a valuable resource for researchers and industry experts, granting them deeper insight into emotional triggers and acting as a reference point for new applications in fields such as health and well-being, education, and security.

This advancement presents a novel resource for clinicians, therapists and caregivers, providing enhanced insight into the emotional well-being of individuals with various neurodiverse conditions. It holds the potential to improve mental health assessments and enable earlier intervention for emotional difficulties, expanding the opportunities for personalized therapeutic interventions.

Moreover, this technology paves the way for augmented reality, virtual reality and robotics applications specifically designed to recognize and express emotions, offering individuals more meaningful support.