When You Will Encounter Genuine Emotional AI

Genuine Emotional AI

The tremendous recent growth of mobile technologies and the advancement of artificial intelligence (AI) in virtually every industry will give a boost to some quite unexpected applications of AI in the coming decades. Emotional AI is one field where machine learning and algorithms offer huge potential, although reading and controlling emotions through AI is quite a challenging task. At this stage, researchers are focused on developing AI algorithms that can read and respond to human emotions. Once we have viable technology that accurately interprets our emotional state and reactions, we can start thinking about building AI-powered systems capable of convincingly simulating emotions, or even having emotions of their own.

Reading Emotions and Reacting to Them

Possible applications of emotional AI span areas ranging from self-driving cars to personal robots and virtual assistants, to IoT computing and literally all major consumer electronic devices. Any of those devices or apps could use AI to provide enhanced human-machine interaction, emotional awareness, or machine emotional intimacy. This is also a way to overcome prejudices about using AI-powered systems as reliable tools that assist humans in their personal and work lives. In fact, the development of emotionally intelligent devices and software is a crucial factor for the overall growth of AI applications across every industry.

Androids should not be the ultimate goal; one could reasonably argue that having human-like robots does not make sense by any means. Nevertheless, machine-learning algorithms coupled with AI capabilities can be extremely useful in human-machine interactions. American psychologist Paul Ekman identified six basic emotions in the 1970s: anger, disgust, fear, happiness, sadness, and surprise. Later, Professor Robert Plutchik extended the set of basic emotions to eight, grouping them into four pairs of polar opposites: joy-sadness, anger-fear, trust-disgust, and surprise-anticipation.
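Plutchik's polar pairs lend themselves to a very simple data structure. As a minimal sketch (the names and lookup function are our own illustration, not any real library's API):

```python
# Plutchik's eight basic emotions, modeled as four pairs of polar opposites.
OPPOSITES = {
    "joy": "sadness",
    "anger": "fear",
    "trust": "disgust",
    "surprise": "anticipation",
}

def polar_opposite(emotion: str) -> str:
    """Return the polar opposite of a basic emotion, looking in both directions."""
    reverse = {v: k for k, v in OPPOSITES.items()}
    return OPPOSITES.get(emotion) or reverse[emotion]
```

For example, `polar_opposite("sadness")` resolves to `"joy"` even though the pair is stored only once.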

Several AI startups are working on algorithms that can recognize these human emotions and eventually develop similar emotions in intelligent machines. Although founders of emotional AI startups such as Patrick Levy-Rosenthal claim their algorithms achieve up to 98 percent emotion-recognition accuracy on conversations, we still do not have viable emotional AI for widespread use. Facial recognition of emotional states is progressing rapidly, but most human-machine interactions take place without a camera. Hence, natural language understanding appears to be the more crucial technology to develop if we want to build viable solutions that utilize emotional AI.
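As a toy illustration of what text-based emotion recognition has to do, here is a tiny keyword-lexicon classifier. Real NLU models are vastly more sophisticated; the lexicon, scoring rule, and labels below are entirely illustrative assumptions:

```python
# A toy emotion lexicon: each emotion maps to a set of cue words.
LEXICON = {
    "happiness": {"great", "love", "wonderful"},
    "anger": {"hate", "furious", "awful"},
    "sadness": {"miss", "lonely", "cry"},
}

def classify_emotion(utterance: str) -> str:
    """Pick the emotion whose cue words overlap the utterance the most."""
    words = set(utterance.lower().split())
    scores = {emotion: len(words & cues) for emotion, cues in LEXICON.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "neutral"
```

A sentence with no cue words falls back to `"neutral"`, which hints at why robust emotion recognition from free-form language is so hard: most of what people say carries no explicit emotional vocabulary at all.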

How to Trust Emotional AI?

Making a device that develops a unique personality through reinforcement learning and machine learning is achievable today. Nonetheless, such a device will remain mostly an entertainment product unless it can recognize human emotions with 100 percent accuracy and take meaningful actions based not solely on emotions but on a complex set of other factors as well.

Imagine an autonomous vehicle that can also be controlled using voice commands. The vehicle is speeding toward a crosswalk where a pedestrian is crossing the road. The emotional, instinctive reaction of the driver would be to shout “Stop” or give a similar command. What if the better decision is to speed up and steer around the pedestrian to the left or the right? How should the car’s AI weigh the emotional voice command against its algorithms? Obviously, priority can be given to computed algorithmic actions in such a situation, but this case is a relatively simple one. What if the driver or passenger experiences a momentary lapse of reason and issues a command that has nothing to do with reality and the immediate surroundings? Emotion recognition and interpretation is much more complicated than simply reading facial and language expressions.
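The priority scheme described above could be sketched roughly as follows. The command names, confidence threshold, and arbitration rule are all illustrative assumptions, not a real autonomous-driving API:

```python
def arbitrate(voice_command: str, planner_action: str, planner_confidence: float) -> str:
    """Decide between an emotional voice command and the planner's computed action.

    When the planner is highly confident its maneuver is safer, the computed
    action takes priority over the instinctive human command; otherwise the
    system defers to the human.
    """
    CONFIDENCE_THRESHOLD = 0.9  # illustrative cutoff, not a calibrated value
    if planner_confidence >= CONFIDENCE_THRESHOLD:
        return planner_action  # e.g. swerve instead of a hard stop
    return voice_command
```

For instance, `arbitrate("stop", "swerve_left", 0.95)` would override the shouted “Stop” with the planner's evasive maneuver. The hard part the article points to is everything hidden inside that one confidence number: the real system must judge when a human command reflects the situation and when it is a momentary lapse of reason.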

Nonetheless, emotional AI is progressing rapidly and Alexei Samsonovich, a professor in the Cybernetics Department at the Moscow Engineering Physics Institute, already proposed a multi-part test intended to answer a critical question: How do we know that such intelligence actually experiences real, human-like emotions?

“Virtual agents and robots should be human-like so that humans can trust them and cooperate with them as with their equals. Therefore, artificial intelligence must be socially and emotionally responsive and able to think and learn like humans. And that implies such mechanisms as narrative thinking, autonomous goal setting, creative reinterpreting, active learning, and the ability to generate emotions and maintain interpersonal relationships,” Samsonovich explains.

Prepare for Emotional AI

Such a test could be conducted in a completely virtual environment with no face-to-face contact. However, emotional AI also faces a multi-attribute classification problem related to facial expressions. Human facial expressions involve the eyes, brow, and lips, for instance, but how they read also depends on age, race, and gender. AI researchers and developers must also account for factors like lighting and the face’s orientation to the camera.
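To see why this is a multi-attribute problem, consider what a single observation fed to a facial-expression classifier might contain. The field names and scales below are illustrative assumptions, not a real computer-vision schema:

```python
from dataclasses import dataclass

@dataclass
class FaceObservation:
    # Expression attributes the article mentions
    eyes: float          # e.g. eye openness, 0.0-1.0
    brow: float          # brow raise/furrow score
    lips: float          # lip curvature score
    # Confounding attributes that change how the same expression reads
    age: int
    lighting: float      # scene brightness, 0.0-1.0
    camera_angle: float  # degrees off frontal

def feature_vector(obs: FaceObservation) -> list:
    """Flatten a multi-attribute observation into a single classifier input."""
    return [obs.eyes, obs.brow, obs.lips, float(obs.age), obs.lighting, obs.camera_angle]
```

The same brow and lip scores can mean different things for a brightly lit frontal face than for a dim, angled one, which is exactly why the confounding attributes have to travel with the expression features rather than being handled separately.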

Body movement, i.e. body language, is even more challenging for AI to interpret. Humans have learned to decipher body language over millions of years of evolution. However, it is not simply a matter of reading body language but of putting it into context. Shrugging your shoulders is a common gesture, but it can express different emotions in different contexts.

Finally, you’ll have to wait a while for an android to join your family, but a mass-market device capable of understanding basic human emotions and using AI to respond and act on them should be available somewhere around 2020.

By Kiril V. Kirilov

