Enterprises, emotion and the rise of the ‘empathy economy’

Artificial intelligence can now detect human emotion better than people can. It’s time to get emotional about AI.

Big business is getting emotional.

User interfaces and other aspects of enterprise computing are increasingly being designed to detect users’ emotional states or moods, and to simulate emotion when communicating back to them.

A Gartner report published in January said that within four years, your devices will “know more about your emotional state than your own family.”

Deep learning has advanced emotion detection from the basic emotions — happiness, surprise, anger, sadness, fear and disgust — to more than 20 subtler ones, including awe, happy surprise and hate. (Psychologists say people have 27 distinct emotions.)

Researchers at Ohio State University developed a program that recognizes 21 emotions from facial expressions in photographs. Here’s the shocking part: They claim their system is better than humans at detecting these emotions.

There’s a good reason and a great reason for emotional interfaces in the enterprise.

First, the good reason.

Empathy boosts business

The “empathy economy” is the monetary or business value created by AI that detects and simulates human emotion, a capability that will transform customer service, virtual assistants, robotics, factory safety, healthcare and transportation.

A survey conducted for Cogito by Frost & Sullivan found that 93% of respondents said customer service interactions affect their perception of a company. And empathy, according to the company, is one of the key factors in a quality interaction.

Cogito’s AI software, which is based extensively on behavioral science research from MIT’s Human Dynamics Lab, analyzes the emotional state of customers and gives instant feedback to human call center agents, enabling them to empathize more easily with customers.

This kind of technology gives call center staff superpowers of empathy, which could greatly improve public perception of a company.

Companies such as Affectiva and Realeyes offer cloud-based solutions that use webcams to track facial expressions and heart rate (by detecting the pulse in the skin of the face). One use is market research (consumers watch ads, and the technology detects how they feel about the images or words in the ads).
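To see how a plain webcam can pick up a pulse from skin tone at all, here is a toy remote-photoplethysmography sketch in Python. It is purely illustrative and not how Affectiva or Realeyes build their products; the OpenCV face detector, the 10-second capture window and the heart-rate frequency band are my own assumptions.

```python
import cv2
import numpy as np

# Toy remote photoplethysmography: estimate pulse from the average green-channel
# intensity of a detected face region across webcam frames.
cap = cv2.VideoCapture(0)
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
samples = []

for _ in range(int(fps * 10)):  # roughly 10 seconds of video
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) > 0:
        x, y, w, h = faces[0]
        face = frame[y:y + h, x:x + w]
        samples.append(face[:, :, 1].mean())  # green channel tracks blood volume best

cap.release()

signal = np.array(samples) - np.mean(samples)
freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
spectrum = np.abs(np.fft.rfft(signal))
band = (freqs > 0.7) & (freqs < 4.0)  # plausible heart rates: roughly 42-240 bpm
bpm = 60 * freqs[band][np.argmax(spectrum[band])]
print(f"Estimated pulse: {bpm:.0f} bpm")
```

Commercial systems do far more signal cleaning than this, but the underlying idea — tiny, periodic color changes in the skin reveal the heartbeat — is the same.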

The companies are looking to expand into other fields, such as healthcare, where automated call centers might detect depression or pain in the voice of a caller, even if the caller never puts it into words.

A robot called Forpheus, created by Japan’s Omron Automation and demonstrated at CES in January, plays Ping-Pong. Part of its arsenal of table tennis skills is its ability to read body language to figure out both the mood and skill level of its human opponent.

The point of the robot isn’t Ping-Pong, but industrial machines that work “in harmony” with humans, boosting both productivity and safety. By reading the body language of factory workers, for example, industrial robots could anticipate how and where people might move.

Robots such as Qihan Technology’s Sanbot, SoftBank Robotics’ Pepper and Honda’s 3E-A18 exist to help people in general ways, essentially acting as rolling information booths at airports, malls, hotels and other locations. They’re built to understand and mimic human emotion, so they can provide more satisfying answers to any questions posed. These robots might read body language, voice patterns and facial expressions to figure out how the user is feeling, or whether the user is confused.

Another major and obvious application for emotion detection is in cars and trucks. By using biometric sensors and cameras in dashboards, seats or seat belts, on-board AI can tell if a driver is stressed out or too tired to drive. Emotion-detecting cars could reduce accidents and lower insurance premiums.

Ford, for example, is working with the EU to develop such a system.

Emotion detection isn’t just the analysis of photos, videos and spoken language.

IBM’s Watson has a “Tone Analyzer” feature that can detect emotion and even sarcasm in written communication.
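For a sense of what that looks like in practice, here is a minimal sketch using IBM’s ibm-watson Python SDK. The API key, service URL, version date and sample text are placeholders, and the exact tones returned depend on the service.

```python
from ibm_watson import ToneAnalyzerV3
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

# Placeholder credentials -- substitute your own IBM Cloud values.
authenticator = IAMAuthenticator("your-api-key")
tone_analyzer = ToneAnalyzerV3(version="2017-09-21", authenticator=authenticator)
tone_analyzer.set_service_url(
    "https://api.us-south.tone-analyzer.watson.cloud.ibm.com")

text = "I've waited forty minutes and nobody has called me back. Great service, really."
result = tone_analyzer.tone(
    {"text": text}, content_type="application/json").get_result()

# document_tone.tones lists the detected tones (e.g., anger, sadness) with scores.
for tone in result["document_tone"]["tones"]:
    print(tone["tone_name"], tone["score"])
```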

While the applications for emotion AI are myriad in the enterprise, the public will first encounter them in consumer and public spheres.

Facebook was recently granted patents for “emotion detecting selfie filters.” The idea is to auto-select an appropriate “mask” for selfies based on the emotion detected in the photo. For example, if the person looks sad, the filter defaults to cartoonish teardrops. A person who looks happy gets the “Happy Panda” filter.
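The mechanics of such a feature are simple once the emotion is detected: a lookup from label to mask. The sketch below is hypothetical; apart from the teardrops and “Happy Panda” examples mentioned above, the emotion labels and filter names are invented for illustration.

```python
# Hypothetical emotion-to-filter mapping, illustrating the idea behind the patent.
DEFAULT_FILTERS = {
    "sad": "cartoon_teardrops",
    "happy": "happy_panda",
    "surprised": "wide_eyes",
    "angry": "storm_cloud",
}

def pick_filter(detected_emotion: str) -> str:
    # Fall back to no filter when the classifier is unsure or the emotion is unmapped.
    return DEFAULT_FILTERS.get(detected_emotion, "none")

print(pick_filter("sad"))  # -> cartoon_teardrops
```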

A video game called Nevermind uses biofeedback to detect players’ moods and adjust difficulty level accordingly — it gets harder the more afraid the player becomes. (It uses technology from Affectiva.)

These are trivial applications, but they’ll help acclimate the public to the idea of software that detects emotions.

In the U.K., the government is using a service built by FlyingBinary and Emrays, available through the G-Cloud 10 procurement framework, to keep its finger on the pulse of the population based on emotions detected in social media posts and comments.

And for people who live in China, the first encounter with emotion AI might be in the schools. In March, a school in China added classroom cameras that use AI to monitor the emotional states of students. (The system also takes attendance and tracks what the kids are doing at each moment.) It even grades students on how distracted they are, a version of Orwell’s concept of “facecrime”: they need to not only pay attention, but also look like they’re paying attention.

Emotion and empathy boost business in many ways. That’s the good reason for this technology.

Here’s the great reason.

Emotion is necessary for communication

AI and robots are machines, but people aren’t.

It’s a widespread misconception that the content of our words is the sum total of human communication. In fact, people communicate with words, vocal intonations, facial expressions, hand gestures and body language.

That’s why email is fraught with the potential for miscommunication. Without nonverbal cues, much of the intended meaning is lost.

In the sweeping history of computing, the perpetual trend is this: As computers get more powerful, a big part of that added power goes toward improving the quality of interaction between machine and user.

Once upon a time, the “interface” was switches, punch cards and tape, and the human had to work hard to speak the binary language of computers. Fast-forward through the command line, the GUI and spoken-language interfaces: with each step, the machines work harder to make interfacing easier for the humans.

Trouble is, today’s spoken-word interfaces fail to satisfy because they deal almost exclusively with the words, not the nonverbal cues. When you talk to a virtual assistant, for example, your spoken words are converted into text, and it’s the text that’s analyzed. In the future, both the text and the vocal intonations will be processed to derive meaning and context.
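To make that concrete, here is a toy sketch of the kind of signal that gets thrown away today: it pulls two prosodic features (pitch and loudness) from a recording with librosa and pairs them with the transcript. The file name, thresholds and the crude “agitated” rule are assumptions for illustration, not anyone’s production pipeline.

```python
import librosa
import numpy as np

# Hypothetical recording of a user's request to a voice assistant.
audio_path = "user_request.wav"
transcript = "Set a timer for ten minutes."  # what today's assistants actually analyze

y, sr = librosa.load(audio_path, sr=16000)

# Pitch contour (fundamental frequency) and loudness (RMS energy) -- two simple
# nonverbal cues that text-only processing discards.
f0, voiced_flag, voiced_prob = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr)
rms = librosa.feature.rms(y=y)[0]

features = {
    "pitch_mean_hz": float(np.nanmean(f0)),
    "pitch_variability": float(np.nanstd(f0)),
    "loudness_mean": float(rms.mean()),
}

# Crude illustrative rule: high pitch variability plus high loudness might be read
# as agitation, which an assistant could use to change how it responds.
agitated = features["pitch_variability"] > 40 and features["loudness_mean"] > 0.1
print(transcript, features, "agitated" if agitated else "calm")
```

A real system would feed features like these into a trained model rather than a threshold, but the point stands: the same sentence means something different when it is shouted.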

The most powerful benefit from emotional AI is that human/machine interaction will increasingly involve both verbal and nonverbal communication, which will improve understanding on both sides.

The most interesting use for emotion detection and simulation could be in our everyday virtual assistants. Future versions of Siri, Google Assistant, Alexa and Cortana will interpret interaction differently based on the emotion of the user.

I told you back in October that the iPhone X’s camera system and Face ID features opened the door for high-quality emotion detection by smartphones. Now the whole industry is working fast toward empathetic phones.

China’s top smartphone maker, Huawei, is working on updating its existing virtual assistant (which currently has 110 million users, according to the company) with AI that detects the emotions of users.

All the major virtual assistant makers, including Apple, Amazon, Google and Samsung, are working feverishly on improving voice interaction with AI-based emotion sensors and simulators.

The bottom line is that emotional machines boost the bottom line. They do this in a hundred ways, and the most important is improving communication between people and the systems that power business.
