Emotion AI: Do we want AI to understand our feelings?

Image credit: iStock


Companies are heavily investing in AI technologies to capitalize on machines being able to analyze human emotions.
By Quantumrun Foresight
September 6, 2022


    Artificial intelligence (AI) systems are learning to recognize human emotions and leverage that information across sectors, from healthcare to marketing campaigns. For example, websites use emoticon reaction buttons to gauge how viewers respond to their content. But is emotion AI everything it claims to be?

    Emotion AI context

    Emotion AI (also known as affective computing or artificial emotional intelligence) is a subset of AI that measures, understands, simulates, and responds to human emotions. The discipline dates back to 1995, when MIT Media Lab professor Rosalind Picard coined the term; she expanded the idea in her 1997 book “Affective Computing.” According to the MIT Media Lab, emotion AI allows for more natural interaction between people and machines. Emotion AI attempts to answer two questions: what is the human’s emotional state, and how will they react? The answers shape how machines deliver services and products.
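    As a rough illustration of that two-question framing (and not any vendor's actual pipeline), the sketch below estimates a toy emotional state from observed signals and then predicts a likely reaction. Every feature name, weight, and reaction label here is a hypothetical placeholder.

```python
# A minimal sketch of emotion AI's two questions as a two-stage pipeline:
# (1) estimate the emotional state from observed signals,
# (2) predict a likely reaction from that estimate.
# All names, features, and weights are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class Observation:
    smile_intensity: float   # 0.0-1.0, e.g., derived from facial landmarks
    voice_pitch_var: float   # 0.0-1.0, normalized pitch variance from audio

def estimate_state(obs: Observation) -> dict:
    """Question 1: what is the person's emotional state?
    Returns a toy probability distribution over a few basic emotions."""
    scores = {
        "joy": obs.smile_intensity,
        "anger": obs.voice_pitch_var * (1.0 - obs.smile_intensity),
        "neutral": 1.0 - max(obs.smile_intensity, obs.voice_pitch_var),
    }
    scores = {e: max(s, 0.0) for e, s in scores.items()}
    total = sum(scores.values()) or 1.0  # guard against division by zero
    return {e: s / total for e, s in scores.items()}

def predict_reaction(state: dict) -> str:
    """Question 2: how will they react? A toy policy keyed to the top emotion."""
    top = max(state, key=state.get)
    reactions = {"joy": "likely to engage", "anger": "likely to escalate or churn"}
    return reactions.get(top, "no strong reaction expected")

print(predict_reaction(estimate_state(Observation(0.8, 0.2))))  # likely to engage
```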

    Artificial emotional intelligence is often used interchangeably with sentiment analysis, but the two differ in the data they collect. Sentiment analysis focuses on language, such as determining people’s opinions about specific topics from the tone of their social media posts, blogs, and comments. Emotion AI, by contrast, relies on facial recognition and expressions to determine sentiment. Other affective computing inputs include voice patterns and physiological data such as changes in eye movement. Some experts consider sentiment analysis a subset of emotion AI, but one with fewer privacy risks.
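    That difference in inputs can be sketched in a few lines of code: sentiment analysis consumes text, while emotion AI consumes signals such as facial-expression and eye-movement features. The lexicon and feature names below are invented for illustration and do not reflect any real product's inputs.

```python
# Toy contrast: sentiment analysis reads text; emotion AI reads facial,
# vocal, and physiological signals. Both sides are deliberately simplistic;
# real systems use trained models, not hand-written rules.
POS_WORDS = {"good", "great", "love"}
NEG_WORDS = {"bad", "terrible", "hate"}

def sentiment_of_text(post: str) -> str:
    """Sentiment analysis: opinion polarity inferred from language alone."""
    words = set(post.lower().split())
    score = len(words & POS_WORDS) - len(words & NEG_WORDS)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

def emotion_from_signals(brow_furrow: float, gaze_shift_rate: float) -> str:
    """Emotion AI: affect inferred from facial/physiological measurements
    (two made-up features, each normalized to the 0.0-1.0 range)."""
    if brow_furrow > 0.6:
        return "anger"
    if gaze_shift_rate > 0.7:
        return "anxiety"
    return "calm"

print(sentiment_of_text("I love this product"))                     # positive
print(emotion_from_signals(brow_furrow=0.8, gaze_shift_rate=0.3))   # anger
```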

    Disruptive impact

    In 2019, researchers from several universities, including Northeastern University in the US and the University of Glasgow, published studies showing that emotion AI lacks a solid scientific foundation. The studies highlighted that, whether humans or AI conduct the analysis, it is difficult to accurately predict emotional states from facial expressions. The researchers argue that expressions are not fingerprints that provide definitive and unique information about an individual. However, some experts disagree with this analysis. Alan Cowen, founder of Hume AI, argued that modern algorithms have produced datasets and prototypes that accurately correspond to human emotions. Hume AI, which raised USD $5 million in investment funding, trains its emotion AI system on datasets of people from the Americas, Africa, and Asia.

    Other emerging players in the emotion AI field include HireVue, Entropik, Emteq, and Neurodata Lab. Entropik uses facial expressions, eye gaze, voice tones, and brainwaves to measure the impact of marketing campaigns. A Russian bank uses Neurodata Lab's technology to analyze client sentiment during calls with customer service representatives.

    Even Big Tech is starting to capitalize on the potential of emotion AI. In 2016, Apple purchased Emotient, a San Diego-based firm that analyzes facial expressions. Alexa, Amazon’s virtual assistant, apologizes and clarifies its responses when it detects that a user is frustrated. Meanwhile, Nuance, Microsoft’s speech recognition firm, can analyze drivers’ emotions based on their facial expressions.

    Implications of emotion AI

    Wider implications of emotion AI may include: 

    • Big Tech purchasing more startups to expand their AI research and capabilities, including the use of emotion AI in self-driving vehicles.
    • Call center customer service departments using emotion AI to anticipate customer behavior based on the tone of their voice and changes in their facial expressions.
    • Increasing investments in affective computing research, including expanded partnerships with global universities and research institutions.
    • Increasing pressure for governments to regulate how facial and biological data are collected, stored, and used.
    • Deepening racial and gender discrimination through misinformation or erroneous analyses.

    Questions to comment on

    • Would you consent to have emotion AI apps scan your facial expressions and voice tone to anticipate your emotions?
    • What are the possible risks of AI potentially misreading emotions?


    Insight references

    The following popular and institutional links were referenced for this insight:

    MIT Sloan School of Management | Emotion AI, explained