
Emotionally intelligent machines to be used for everyday applications by 2020: Frost & Sullivan

Affective computing can be embedded into almost any application, says Frost & Sullivan’s Debarun Guha Thakurta

Artificial intelligence that includes an emotional component will be embedded into everyday applications within the next five years, says Debarun Guha Thakurta, senior research analyst at Frost & Sullivan’s TechVision.

Emotionally intelligent machines fall under ‘affective computing’, a field in which computers are able to recognise the emotional state of a user. Such systems rely mostly on computer vision and natural language processing, combined with other sensor inputs, to detect human emotions. Those inputs can include measurements of the user’s body temperature, which can indicate certain emotional states.
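To make that architecture concrete, the sketch below shows one common way such signals can be combined, often called late fusion: each modality produces its own emotion scores, and a weighted sum picks the overall winner. The function names, weights and example scores are illustrative assumptions, not anything described by Frost & Sullivan.

    # A minimal late-fusion sketch: combine per-modality emotion scores.
    # All names, weights and example scores are illustrative assumptions.
    EMOTIONS = ["happiness", "sadness", "anger", "fear", "surprise", "disgust"]

    def fuse_modalities(vision, speech, temperature, weights=(0.5, 0.3, 0.2)):
        """Weighted combination of emotion scores from three modalities.

        Each argument maps emotion -> confidence in [0, 1]: vision scores
        would come from a facial-expression model, speech scores from a
        speech/NLP model, and temperature scores from a simple heuristic
        over body-temperature readings.
        """
        fused = {
            emotion: weights[0] * vision.get(emotion, 0.0)
            + weights[1] * speech.get(emotion, 0.0)
            + weights[2] * temperature.get(emotion, 0.0)
            for emotion in EMOTIONS
        }
        # Report the highest-scoring emotion and its fused confidence.
        return max(fused.items(), key=lambda item: item[1])

    # Example: face and voice both point to anger; temperature is uninformative.
    print(fuse_modalities(
        {"anger": 0.7, "fear": 0.2},
        {"anger": 0.5, "sadness": 0.3},
        {emotion: 1 / 6 for emotion in EMOTIONS},
    ))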

Understanding human emotion improves the quality of the decisions machines make from data, says Thakurta.

“Affective computing is where machines or applications become intelligent enough to understand the human factor, the human emotion or feelings. This makes the decision making more realistic and more acceptable, so that the chances of [resistance] become lesser. So it brings the emotion into decision making.”

One area of affective computing gaining rapid traction, according to Thakurta, is facial recognition emotional analytics. He described the scenario of a robotic retail store assistant that picks up on customers’ emotional reactions to products to determine whether they are satisfied or dissatisfied, lost and confused, or bored.
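The decision logic of such an assistant can be sketched as a simple lookup from detected emotional state to response. The states and responses below are our illustration of the scenario, not a design from Frost & Sullivan.

    # Illustrative rule table: map a detected customer state to an assistant
    # response. States and responses are assumptions made for this sketch.
    RESPONSES = {
        "satisfied": "log positive feedback on the product",
        "dissatisfied": "offer an alternative product or a staff member's help",
        "confused": "approach the customer and offer directions",
        "bored": "suggest a promotion or a new arrival",
    }

    def respond(detected_state):
        # Fall back to passive observation for states the table does not cover.
        return RESPONSES.get(detected_state, "continue passive observation")

    print(respond("confused"))  # -> approach the customer and offer directions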

He gave further examples of how affective computing could be applied in industries such as healthcare and education.

“It could be used in psychology, where the patient is not able to express their feelings or thoughts. Affective computing can be used to understand what is going on inside the patient’s mind.

“In the education sector, where students are not able to understand the concepts in the lecture and are shy to raise a question, affective computing systems could help understand student’s thoughts and whether they are actually satisfied [to then address this].”

Thakurta said some facial recognition systems today detect emotions with an accuracy of around 80 per cent.

“However, facial recognition emotional analytics still has a limited spectrum; it just detects six or seven basic emotions,” he pointed out. These are typically the basic emotions catalogued by psychologist Paul Ekman: happiness, sadness, anger, fear, surprise and disgust, with contempt sometimes added as a seventh.

“Machines should be equipped with more sensor inputs, more than one. Computer vision alone might not be enough to understand emotion; it also requires speech processing, gesture recognition and body temperature measurement.

“There is still plenty of scope for development.”