Most people know that Artificial Intelligence can handle specific personal tasks, like locating the closest ice cream shop, and business tasks, like synthesizing large volumes of data. AI’s cognitive abilities, such as learning and identifying patterns, are certainly its most famous attributes. But if AI is the imitation of human intelligence, we cannot proceed without considering one important component: our emotions.
Emotions like happiness, excitement, anger and grief shape all aspects of life, from how well we learn to how we bond with other people and maintain our well-being. Yet we are immersed in a hi-tech society where machines can analyze our data but cannot understand how we feel.
Making a machine emotionally intelligent is appealing because it would make our experience with technology more natural and enable highly personalized user experiences. The idea was first explored in 1995 by Rosalind Picard, an American scholar, in her report titled ‘Affective Computing’. By examining the key challenges in computing that relate to, arise from or influence emotions, she laid the foundation for Artificial Emotional Intelligence, or Emotion AI. Years later in 2020, the field is still nascent but very valuable for industries where we interact with technology regularly.
How Does Emotion AI work?
In our daily lives, we communicate so much beyond our words: gestures, tone of voice, body language and facial expressions. But in the broad spectrum of emotional expressions, each one is unique and varies subtly from the next. To create emotionally intelligent machines, an AI algorithm is shown thousands of images of people smiling, and it steadily learns the almost imperceptible characteristics of a human smile. The same process must be repeated for other expressions, such as a frown or a smirk, because AI needs to both recognize a particular emotion and distinguish it from the others. Thanks to deep learning networks, the algorithm can learn to detect emotions efficiently and in real time.
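The learn-from-labeled-examples idea above can be sketched in miniature. The snippet below is a toy illustration, not a production system: it assumes each face has already been reduced to two hypothetical numeric features (mouth curvature and eye openness, both invented for this example) and trains a simple logistic-regression classifier on synthetic data to separate smiles from frowns. Real emotion AI uses deep networks on raw pixels, but the principle of gradually adjusting weights from labeled examples is the same.

```python
import math
import random

random.seed(0)

# Hypothetical toy features: (mouth_curvature, eye_openness).
# Positive curvature roughly means "smile", negative means "frown".
def make_sample(label):
    if label == 1:  # smile
        return (random.gauss(0.6, 0.15), random.gauss(0.5, 0.2)), 1
    return (random.gauss(-0.6, 0.15), random.gauss(0.5, 0.2)), 0

data = [make_sample(random.choice([0, 1])) for _ in range(500)]

# Logistic-regression classifier trained with plain gradient descent:
# each labeled example nudges the weights toward the correct answer.
w = [0.0, 0.0]
b = 0.0
lr = 0.5
for _ in range(100):
    for (x1, x2), y in data:
        z = w[0] * x1 + w[1] * x2 + b
        p = 1.0 / (1.0 + math.exp(-z))   # predicted probability of "smile"
        err = p - y
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

def predict(x1, x2):
    z = w[0] * x1 + w[1] * x2 + b
    return "smile" if 1.0 / (1.0 + math.exp(-z)) > 0.5 else "frown"

print(predict(0.7, 0.5))   # clearly upturned mouth
print(predict(-0.7, 0.5))  # clearly downturned mouth
```

Because the synthetic classes are well separated, a few passes over the data are enough for the classifier to pick up the pattern; with real images and subtler expressions, far more data and a much richer model are needed.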
The State of Emotion AI Today
Gartner, the world’s leading research and advisory company, predicts that 10% of the world’s smartphones will have emotion AI capabilities by 2022. “By 2022, your personal device will know more about your emotional state than your own family”, reports Annette Zimmermann, the research vice president at Gartner.
Artificial emotional intelligence is very useful in business, especially in customer support and advertising. As of now, beyond a handful of successful startups specializing in emotion AI, larger organizations including Google and Amazon are taking an interest.
Customer support: Many businesses today use smart conversational systems such as virtual personal assistants, chatbots and interactive voice response (IVR) to answer customers’ queries. While these systems are effective at specific tasks, they lack context about the person’s question and emotional state. Making them capable of understanding how each customer feels could greatly enhance the user experience.
Advertising: One of the biggest pain points in modern advertising is that businesses spend millions on ad campaigns aiming to connect emotionally with viewers, yet have no way of knowing exactly how people received those ads. Beyond ambiguous mouse-click activity, they don’t really know whether people felt happy, sad, bored or disgusted.
Emotion AI can monitor a person’s minute facial expressions as he or she watches an advertisement and measure the emotions expressed. It can also capture additional metrics, such as the viewer’s level of engagement and attentiveness, and map emotions over the duration of the video. All of this data helps advertisers and marketers gauge the effectiveness of their advertisements so they can strike the right emotional chord with people.
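Mapping emotions over a video’s duration can be illustrated with a small sketch. Everything here is hypothetical: the per-second labels stand in for the output of a face-analysis model watching a viewer, and the `summarize` helper is invented for this example; it simply reports the dominant emotion in each fixed-length segment of the ad.

```python
from collections import Counter

# Hypothetical per-second emotion labels for a viewer watching a
# 12-second ad (purely illustrative data, not real model output).
timeline = ["neutral", "neutral", "neutral", "happy", "happy", "happy",
            "surprised", "happy", "sad", "sad", "sad", "neutral"]

def summarize(labels, segment=4):
    """Return the dominant emotion in each fixed-length segment."""
    return [Counter(labels[i:i + segment]).most_common(1)[0][0]
            for i in range(0, len(labels), segment)]

print(summarize(timeline))  # one dominant emotion per 4-second segment
```

A marketer reading the summary can see, second by second, where the ad held attention and where it lost the viewer, which is exactly the kind of emotional timeline the paragraph above describes.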
Applications in Other Industries: For certain service providers, knowledge of a customer’s emotional state can be very valuable. An emerging use case is the automotive industry, especially AI self-driving cars. In addition to driving assistance, the software can monitor the driver and raise an alert for dangerous states such as drowsiness. Another is education, where AI software can adapt difficulty levels in response to students’ emotions such as confusion, boredom and frustration. Other popular applications include call centres, fraud detection, the connected home, retail and healthcare.
While emotionally intelligent machines can prove very valuable for certain use cases, some circumstances raise ethical and legal concerns. An extreme example is a profit-oriented organization installing emotion AI in the workplace for its lucrative opportunities, at the cost of employees’ dignity.
Andrew McStay, a professor who teaches the ethics and the impact of emerging technologies on society, brings critical awareness to emotion-recognition technology in his latest book. He explains that despite its usefulness, the emerging field could prompt the “datafication” of our emotional lives. As a safeguard, he strongly advocates that modern organizations follow guidelines for the ethical use of emotion AI.
Key Points to Remember
AI is going mainstream, but most of the emphasis falls on its cognitive abilities; its social and emotional competence is often forgotten. As emotions are a vital part of our lives, the scope of AI without emotional intelligence is very limited. Leaders should therefore consider how the technology could change their industries in the future and the kind of role it may play in their firms.