When You Will Encounter Genuine Emotional AI


The tremendous recent growth of mobile technologies and the advancement of artificial intelligence (AI) across virtually every industry will give a boost to some quite unexpected applications of AI in the coming decades. Emotional AI is one of the fields where machine learning and algorithms offer huge potential, although reading and controlling emotions through AI is a challenging task. At this stage, researchers focus on developing AI algorithms that can read and respond to human emotions. Once we have viable technology that accurately interprets our emotional state and reactions, we can start thinking about building AI-powered systems that convincingly simulate emotions or even have emotions of their own.

Reading Emotions and Reacting to Them

Possible applications of emotional AI span areas ranging from self-driving cars to personal robots and virtual assistants, to IoT computing and literally all major consumer electronic devices. Any of these devices or apps could use AI to provide enhanced human-machine interaction, emotional awareness, or machine emotional intimacy. This is also a way to overcome prejudices about relying on AI-powered systems as trustworthy tools that assist humans in their personal and work lives. In fact, the development of emotionally intelligent devices and software is a crucial factor in the overall growth of AI applications across every industry.

Androids should not be the ultimate goal; one could reasonably argue that building human-like robots makes little sense in itself. Nevertheless, a machine-learning algorithm coupled with AI capabilities can be extremely useful in human-machine interactions. American psychologist Paul Ekman identified six basic emotions in the 1970s: anger, disgust, fear, happiness, sadness, and surprise. Later, Professor Robert Plutchik extended the list to eight basic emotions, grouping them into four pairs of polar opposites: joy-sadness, anger-fear, trust-disgust, and surprise-anticipation.
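
For illustration only, here is a minimal sketch of how such a taxonomy could be encoded in software. The emotion names and polar pairs come from Plutchik's model above; the `dominant_emotion` helper and its score format are purely hypothetical.

```python
# Plutchik's eight basic emotions encoded as four polar pairs.
# The pairing follows the taxonomy described above; the helper function
# and label-scoring format are hypothetical illustrations, not a real API.
PLUTCHIK_OPPOSITES = {
    "joy": "sadness", "sadness": "joy",
    "anger": "fear", "fear": "anger",
    "trust": "disgust", "disgust": "trust",
    "surprise": "anticipation", "anticipation": "surprise",
}

def dominant_emotion(scores: dict) -> tuple:
    """Given per-emotion confidence scores from some classifier,
    return the strongest emotion and its polar opposite."""
    label = max(scores, key=scores.get)
    return label, PLUTCHIK_OPPOSITES[label]

# Example: scores produced by a hypothetical emotion classifier.
print(dominant_emotion({"joy": 0.7, "sadness": 0.1, "fear": 0.2}))
# -> ('joy', 'sadness')
```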

A few AI startups are working on algorithms that can recognize these human emotions and, eventually, develop similar emotions in intelligent machines. Although founders of emotional AI startups such as Patrick Levy-Rosenthal claim their algorithms achieve up to 98 percent emotion-recognition accuracy on conversations, we still do not have viable emotional AI for widespread use. Facial recognition of emotional states is progressing rapidly, but most human-machine interactions take place without a camera. Hence, natural language understanding appears to be the more crucial technology to develop if we want to build viable solutions that utilize emotional AI.
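
As a rough sketch of what text-based emotion recognition looks like in practice, the snippet below runs an utterance through the Hugging Face `transformers` text-classification pipeline. The specific model name is only one example of a publicly shared emotion classifier, not something prescribed here; any model fine-tuned on emotion-labelled text could be swapped in.

```python
# Text-based emotion recognition with the Hugging Face transformers pipeline.
# The model name is an example of a publicly shared emotion classifier.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="j-hartmann/emotion-english-distilroberta-base",  # example model
    top_k=None,  # return scores for every emotion label, not just the best one
)

utterance = "I can't believe the delivery is late again."
scores = classifier(utterance)[0]

# Print the emotion labels sorted by confidence.
for item in sorted(scores, key=lambda s: s["score"], reverse=True):
    print(f"{item['label']}: {item['score']:.2f}")
```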

How to Trust Emotional AI?

Making a device that develops a unique personality through reinforcement learning and machine learning is achievable today. Nonetheless, such a device is useful mostly as entertainment unless it can recognize human emotions with 100 percent accuracy and take meaningful actions that depend not solely on emotions but also on a complex set of other factors.

Imagine an autonomous vehicle that can also be controlled using voice commands. The vehicle is speeding toward a crosswalk where a pedestrian is crossing the road. The emotional, instinctive reaction of the driver would be to shout “Stop” or give a similar command. What if the better decision is to speed up and steer around the pedestrian on the left or the right? How should the car’s AI weigh the emotional voice command against its own algorithms? Obviously, priority for computed algorithmic actions can be programmed for such a situation, but this is a relatively simple case. What if the driver or passenger experiences a momentary lapse of reason and issues a command that has nothing to do with reality and the immediate surroundings? Emotion recognition and interpretation is much more complicated than simply reading facial and language expressions.
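
To make the arbitration question concrete, here is a purely hypothetical sketch of how a priority rule between a recognized voice command and the planner's computed action might look. All class and function names below are invented for illustration and do not correspond to any real autonomous-driving API.

```python
# Hypothetical arbitration between an emotional voice command and the planner.
from dataclasses import dataclass

@dataclass
class VoiceCommand:
    text: str              # e.g. "stop"
    emotion: str           # label from an emotion classifier, e.g. "fear"
    confidence: float      # how sure the recognizer is about the command

@dataclass
class PlannerDecision:
    action: str            # e.g. "swerve_left"
    collision_risk: float  # estimated risk in [0, 1] if this action is taken

def arbitrate(command: VoiceCommand, planner: PlannerDecision) -> str:
    """Prefer the planner when it predicts a safe outcome; defer to the human
    command only when it is confident and the planner's own risk is high."""
    if planner.collision_risk < 0.1:
        return planner.action          # planner is clearly safe
    if command.confidence > 0.8 and command.emotion in {"fear", "surprise"}:
        return command.text            # urgent, confident human input
    return planner.action              # default to the computed action

print(arbitrate(VoiceCommand("stop", "fear", 0.95),
                PlannerDecision("swerve_left", 0.3)))
# -> "stop"
```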

Nonetheless, emotional AI is progressing rapidly, and Alexei Samsonovich, a professor in the Cybernetics Department at the Moscow Engineering Physics Institute, has already proposed a multi-part test intended to answer a critical question: how do we know that such an intelligence actually experiences real, human-like emotions?

“Virtual agents and robots should be human-like so that humans could trust them and cooperate with them as with their equals. Therefore, artificial intelligence must be socially and emotionally responsive and able to think and learn like humans. And that implies such mechanisms as narrative thinking, autonomous goal setting, creative reinterpreting, active learning, and the ability to generate emotions and maintain interpersonal relationships,” Samsonovich explains.

Prepare for Emotional AI

Creation of such a test is possible in a completely virtual environment where no face-to-face contact is available. However, emotional AI also faces a multi-attribute classification problem related to facial expressions. Reading a human facial expression involves the eyes, brow, and lips, for instance, but the interpretation also depends on age, race, and gender. AI researchers and developers must additionally account for factors such as lighting and the subject’s orientation to the camera.
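
As a sketch of the kind of pre-processing such a system needs, the snippet below uses OpenCV’s bundled Haar cascade to detect a face and normalize the crop before handing it to an emotion classifier. `emotion_model` is a placeholder for whatever trained model is used, not a real library object, and the grayscale and equalization steps only partially compensate for the lighting and camera-orientation issues mentioned above.

```python
# Face detection and normalization ahead of emotion classification.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def extract_face(frame, size=(48, 48)):
    """Return a normalized grayscale face crop, or None if no face is found.
    Grayscale conversion and histogram equalization reduce (but do not remove)
    sensitivity to lighting conditions."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.equalizeHist(gray)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    return cv2.resize(gray[y:y + h, x:x + w], size)

# Usage (assuming `frame` comes from cv2.VideoCapture and `emotion_model`
# is a hypothetical trained classifier expecting 48x48 grayscale inputs):
# face = extract_face(frame)
# if face is not None:
#     label = emotion_model.predict(face[None, ..., None])
```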

Body movement, i.e. body language, is even more challenging for AI to interpret. Humans have learned to decipher body language over millions of years of evolution. However, it is not simply about reading body language but about putting it into context. Shrugging your shoulders is a common gesture, but it can express different emotions in different contexts.

Finally, you will have to wait for an android to join your family for the time being, but a mass-market device capable of understanding basic human emotions and using AI to respond and take actions based on them should be available somewhere around 2020.

By Kiril V. Kirilov
