Today, many people have heard about Artificial Intelligence. However, only a few know about Affective Computing, or Emotion Artificial Intelligence. What does this concept mean, how does it work, and why is it important for each of us? Let's figure it out together.
Emotion Artificial Intelligence – what is it?
Emotion Artificial Intelligence (AI), or Affective Computing, is an area of AI that explores and develops systems able to recognize, interpret, process, and model human emotions. Such systems are an important part of human-computer interaction. From a research point of view, emotion AI draws not only on computer science, but also on the psychology of emotions, sociology, neuroscience, cognitive science, and other disciplines. Emotion AI is of great interest to both scientists and practitioners.
Dr. Rosalind Picard, a researcher and professor at the Massachusetts Institute of Technology, is the founder of the emotion AI concept. In 1997 she published the book "Affective Computing", in which she described the importance of human emotions and, for the first time, outlined the possibilities of recognizing and modeling emotions using computer systems. Emotion AI aims for the computer to be able to interpret a person's emotional state and adapt its behavior accordingly.
Emotion AI is seen as one of the technologies that provides new business opportunities. In 2019, emotion AI was mentioned by Gartner in their famous study "Hype Cycle for Emerging Technologies" as a rapidly developing technology. This study suggests that emotion AI will continue to develop, spread, and grow in popularity in the coming years.
How does a computer understand people’s emotions?
When we talk about human emotions, we most often mean facial expressions. However, people express emotions in various ways: clenched fists may indicate internal tension, while an emoji in a text message may indicate happiness. In other words, we use different channels to express emotions, and AI uses these same channels to recognize and model them. These channels are called "emotion modalities": facial expressions, body movements and gestures, verbal and nonverbal speech signals, and physiological signals.
To recognize emotions through these modalities, different approaches and technological tools are used. For example, video is used to recognize facial expressions and gestures, audio is used for speech analysis, and special biometric sensors capture physiological signals. These are all ways of collecting and processing the primary data.
Emotion AI systems then use machine learning algorithms to search for specific patterns, identify emotions, and generate the system's response. The exact response depends on the specific technology and the purpose of the system: it can be either a report for the user or a reaction from the system itself, informed by the user's emotional state.
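As a rough sketch of this pipeline, imagine features already extracted from the modalities above and a classifier that maps them to an emotion and a response. The feature vectors, labels, and thresholds below are illustrative assumptions for the sketch, not any real product's model or API:

```python
from math import dist

# Illustrative prototype feature vectors per emotion:
# [voice pitch (normalized), voice energy, smile intensity].
# All values are made-up assumptions, not real calibration data.
PROTOTYPES = {
    "happy":   [0.7, 0.8, 0.9],
    "angry":   [0.8, 0.9, 0.1],
    "sad":     [0.3, 0.2, 0.1],
    "neutral": [0.5, 0.5, 0.5],
}

def classify_emotion(features):
    """Nearest-prototype classification: return the emotion whose
    prototype vector is closest to the observed feature vector."""
    return min(PROTOTYPES, key=lambda label: dist(features, PROTOTYPES[label]))

def system_response(features):
    """Turn the recognized emotion into a response, as described above:
    either report the state or adapt the system's behavior."""
    emotion = classify_emotion(features)
    if emotion == "angry":
        return "Routing to a human agent."
    return f"Detected emotion: {emotion}."
```

Real systems replace the hand-made prototypes with models trained on large labeled datasets, but the overall flow (features in, emotion label out, response chosen from the label) is the same.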
A simple example of this technology is voice assistants. Voice assistants can already distinguish and remember the voices of the specific people who talk to them, and thus use different information for different users. The technology is also learning to adapt the volume of its speech to the speaker's voice and, for example, to respond to a whispered comment by whispering back.
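The whisper-matching behavior can be sketched as a simple rule on the loudness of the incoming speech. The decibel thresholds here are arbitrary assumptions chosen for illustration, not values from any assistant's implementation:

```python
def reply_volume(input_loudness_db):
    """Match the assistant's output volume to the speaker's input:
    quiet input (a whisper) gets a quiet reply. Thresholds are
    illustrative, not calibrated values."""
    if input_loudness_db < 30:   # roughly whisper level
        return "whisper"
    if input_loudness_db < 60:   # normal conversation level
        return "normal"
    return "loud"
```

For example, a 25 dB whispered question would be answered in a whisper, while a 70 dB shout would get a loud reply.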
How does business use emotion AI?
For a long time, decision-making in business was approached purely rationally. It was believed that to make the most effective decision, one must set emotions aside; more than that, it was believed that people actually can discard all emotions and be guided by logic alone. However, scientists later found that any human decision, including the decision to buy a product or to hire a candidate, is directly influenced by emotions. That is why companies today try to account for the emotional component and its impact on decision-making in different areas, from analyzing consumer behavior to developing new products and services. The "human factor" is not so much a source of error as the presence of human emotions that, until now, could not be predicted or modeled. We can say that today emotions are one of the driving forces in business.
In recent years, emotion AI technologies have started to move from scientific laboratories into commercial development. Teams of scientists from research labs launch startups, while large companies create dedicated projects and research areas for emotion AI and its application in business.
Although the applications of such technologies are not yet fully explored, some areas of use are more common than others: for example, healthcare, automotive, telecommunications, sales, and customer support. Let's take a closer look at how emotion AI can be used in some of these industries.
Healthcare and medicine are areas of great demand for developing and applying emotion AI. The use of biometric sensors in personal devices can help to predict and prevent depression or sudden epilepsy attacks in advance. Emotion recognition technologies are also widely used to help people with autism communicate with others. For instance, companies such as BioStream Technologies, Empatica, and Sentio Solutions specialize in this field.
Another application area is the automotive industry. When we talk about modern cars, we mean not only the vehicles themselves, but also the built-in onboard software and equipment. Computer systems control the air temperature in the cabin, turn on the driver's favorite music, and suggest the optimal route. Using emotion AI technologies, the software will also be able to determine the driver's emotional state, stress level, or sudden health problems. This can make trips safer for drivers and passengers, as well as for others on the road. Companies such as Gestion HQ, Eyeris, and Eyesight apply emotion AI in the automotive industry.
Support centers and customer services also use emotion AI systems, especially ones that automatically assess the caller's emotional state and give tips to call center employees. Examples of companies offering such systems are audEERING, Behavioral Signals, and Neurodata Lab.
Source: Empatica company official website
Source: Eyesight company official website
Source: Neurodata Lab company official website
The use of emotion AI is not limited to a few specific industries. Any business deals with people, whether they are customers in the store, app users, or employees of the client company, as in the case of B2B relationships. This means that any business will benefit from understanding people’s emotions. Emotion AI technologies contribute to a deeper understanding of clients, both existing and potential, and a more accurate definition of their needs. This will allow the company to improve the user experience with the product and avoid additional mistakes when changing and developing the product in the future. On the other hand, emotion AI can also be used internally in a company. Special applications can provide support for employees in the workplace, assess their physical and emotional state, and assist in case of a crisis.
SAMSONOWA & Partners team tracks the development of emotion AI technologies and recent commercial projects in this area. We approach the study of emotion AI from the business point of view. How are the companies that bring emotion AI products to market organized internally? We study what business models they use, how they form a value proposition in the market, who they identify as their customers, and how they approach the issue of ethics. In 2019 we presented the first results of our research in Cambridge at the 8th International Conference on Affective Computing & Intelligent Interaction (ACII), dedicated to emotion AI technologies.
Get free market research
Based on an analysis of open sources, the IPERF institute, in collaboration with SAMSONOWA & Partners, conducted market research on emotion AI providers and created a marketing report. You can get this free report by leaving your contact details below.
- Tao J., Tan T. 2005. "Affective Computing: A Review". Affective Computing and Intelligent Interaction.
- Picard R. 1997. Affective Computing. MIT Press.
- Gartner. 2019. Hype Cycle for Emerging Technologies.
- Ormandy R. 2019. "From Synapses to Ephapsis: Embodied Cognition and Wearable Personal Assistants". Artificial Intelligence in the Age of Neural Networks and Brain Computing.
- Schwarz N. 2010. "Emotion, cognition, and decision making". Cognition and Emotion, vol. 14.
- Affective Computing Market: Opportunity and Forecast, 2019–2026. 2019. Allied Market Research. https://www.alliedmarketresearch.com/affective-computing-market (accessed 18.02.2020).