When You Will Encounter Genuine Emotional AI


The tremendous recent growth of mobile technologies and the advancement of artificial intelligence (AI) across virtually every industry will give a boost to some quite unexpected applications of AI in the coming decades. Emotional AI is one field where machine learning and algorithms offer huge potential, although reading and controlling emotions through AI is quite a challenging task. At this stage, researchers are focused on developing AI algorithms that can read and respond to human emotions. Once we have viable technology that accurately interprets our emotional state and reactions, we can start thinking about building AI-powered systems that are capable of convincingly simulating emotions, or even of having emotions of their own.

Reading Emotions and Reacting to Them

Possible applications of emotional AI span areas ranging from self-driving cars to personal robots and virtual assistants, to IoT computing and literally all major consumer electronic devices. Any of those devices or apps could use AI to provide enhanced human-machine interaction, emotional awareness, or machine emotional intimacy. This is also a way to overcome prejudices about using AI-powered systems as reliable tools that assist humans in their personal and work lives. In fact, the development of emotionally intelligent devices and software is a crucial factor in the overall growth of AI applications across every industry.

Androids should not be the ultimate goal; one could reasonably argue that having human-like robots does not make sense at all. Nevertheless, a machine-learning algorithm coupled with AI capabilities can be extremely useful in human-machine interactions. American psychologist Paul Ekman identified six basic emotions in the 1970s: anger, disgust, fear, happiness, sadness, and surprise. Later, Professor Robert Plutchik extended the set to eight basic emotions, grouping them into four pairs of polar opposites: joy-sadness, anger-fear, trust-disgust, and surprise-anticipation.
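
For illustration only, here is a minimal sketch of how Plutchik's eight basic emotions and their polar pairings could be represented as a simple data structure that an emotion-recognition system might build on. The enum and helper below are assumptions made for clarity, not part of any system described in this article.

```python
from enum import Enum

class Emotion(Enum):
    """Plutchik's eight basic emotions."""
    JOY = "joy"
    SADNESS = "sadness"
    ANGER = "anger"
    FEAR = "fear"
    TRUST = "trust"
    DISGUST = "disgust"
    SURPRISE = "surprise"
    ANTICIPATION = "anticipation"

# The four pairs of polar opposites described above.
OPPOSITES = {
    Emotion.JOY: Emotion.SADNESS,
    Emotion.ANGER: Emotion.FEAR,
    Emotion.TRUST: Emotion.DISGUST,
    Emotion.SURPRISE: Emotion.ANTICIPATION,
}
# Make the mapping symmetric so either member of a pair resolves to the other.
OPPOSITES.update({v: k for k, v in OPPOSITES.items()})

def opposite(emotion: Emotion) -> Emotion:
    """Return the polar opposite of a basic emotion."""
    return OPPOSITES[emotion]

print(opposite(Emotion.TRUST))  # Emotion.DISGUST
```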

A few AI startups are working on algorithms that can recognize these human emotions and, eventually, develop similar emotions in intelligent machines. Although founders of emotional AI startups such as Patrick Levy-Rosenthal claim their algorithms achieve up to 98 percent emotion recognition accuracy on conversations, we still do not have viable emotional AI for widespread use. Facial recognition of emotional states is progressing rapidly, but most human-machine interactions are performed without a camera. Hence, natural language understanding appears to be the more crucial technology to develop if we want to build viable solutions that make use of emotional AI.
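
To make the natural language angle concrete, here is a minimal, hypothetical sketch of a text-based emotion classifier built with scikit-learn. The tiny inline dataset and labels are invented for illustration; this is not the pipeline of any particular startup, nor the system behind the 98 percent figure quoted above.

```python
# A minimal sketch of text-based emotion recognition, assuming scikit-learn is installed.
# The toy utterances and labels below are invented purely for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "I can't believe you did that, this is outrageous!",  # anger
    "This is the best day I've had in years.",            # joy
    "I'm really worried something will go wrong.",        # fear
    "I miss how things used to be.",                      # sadness
]
train_labels = ["anger", "joy", "fear", "sadness"]

# TF-IDF features feeding a simple linear classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(train_texts, train_labels)

print(model.predict(["I'm so happy you called!"]))  # e.g. ['joy']
```

A real system would of course need far more data, context tracking across turns of a conversation, and careful evaluation, which is exactly why viable emotional AI is still out of reach.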

How to Trust Emotional AI?

Making a device that develops a unique personality through reinforcement learning and machine learning is achievable today. Nonetheless, such a device serves mostly as entertainment unless it is able to recognize human emotions with 100 percent accuracy and take meaningful actions that depend not solely on those emotions but also on a complex set of other factors.

Imagine an autonomous vehicle that can also be controlled using voice commands. The vehicle is speeding toward a crosswalk where a pedestrian is crossing the road. The emotional, instinctive reaction of the driver would be to shout “Stop” or give a similar command. What if the better decision is to speed up and steer around the pedestrian on the left or the right? How should the car’s AI weigh the emotional voice command against its algorithms? Obviously, a priority of computed algorithmic actions can be programmed for such a situation, but the above case is a relatively simple one. What if the driver or passenger experiences a momentary lapse of reason and issues a command that has nothing to do with reality and the immediate surroundings? Emotion recognition and interpretation is much more complicated than simply reading facial and language expressions.
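
One way to picture that priority of computed algorithmic actions is a simple arbitration rule that compares the risk the planner assigns to each candidate action, whether it came from the driver's voice or from the driving stack. The sketch below is a hypothetical illustration; the class names, risk numbers, and tie-breaking rule are assumptions, not any real vehicle's control logic.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    action: str   # e.g. "stop", "swerve_left", "accelerate"
    risk: float   # estimated collision risk in [0, 1], from the planner's model
    source: str   # "planner" or "voice"

def arbitrate(candidates: list[Candidate]) -> Candidate:
    """Pick the lowest-risk action; on an exact tie, defer to the planner,
    whose estimate is based on sensors rather than on instinct."""
    return min(candidates, key=lambda c: (c.risk, c.source != "planner"))

# Illustrative situation: the driver shouts "stop" while the planner judges
# that swerving left around the pedestrian carries less collision risk.
options = [
    Candidate("stop", risk=0.35, source="voice"),
    Candidate("swerve_left", risk=0.10, source="planner"),
]
print(arbitrate(options).action)  # swerve_left
```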

Nonetheless, emotional AI is progressing rapidly, and Alexei Samsonovich, a professor in the Cybernetics Department at the Moscow Engineering Physics Institute, has already proposed a multi-part test intended to answer a critical question: how do we know that such an intelligence actually experiences real, human-like emotions?

“Virtual agents and robots should be human-like so that humans could trust them and cooperate with them as with their equals. Therefore, artificial intelligence must be socially and emotionally responsive and able to think and learn like humans. And that implies such mechanisms as narrative thinking, autonomous goal setting, creative reinterpreting, active learning, and the ability to generate emotions and maintain interpersonal relationships,” Samsonovich explains.

Prepare for Emotional AI

Creating such a test is possible in a completely virtual environment where no face-to-face contact is available. However, emotional AI also faces a multi-attribute classification problem related to facial expressions. Human facial expressions, for instance, involve the eyes, brows, and lips, but their interpretation also depends on age, race, and gender. AI researchers and developers must also take into account factors like lighting and orientation to the camera.
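
As a rough illustration of that multi-attribute nature, the sketch below folds hypothetical facial-landmark features together with the context attributes mentioned above (age, lighting, orientation to the camera) into a single feature vector before classification. The feature names and the randomly generated data are assumptions made purely for illustration, not a real training pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical per-sample features: landmark distances around the eyes, brows
# and lips, plus the context attributes discussed above.
n_samples = 200
landmarks = rng.normal(size=(n_samples, 6))            # eye/brow/lip geometry
age = rng.uniform(10, 80, size=(n_samples, 1))         # subject age
lighting = rng.uniform(0, 1, size=(n_samples, 1))      # illumination level
head_yaw = rng.uniform(-45, 45, size=(n_samples, 1))   # orientation to camera

X = np.hstack([landmarks, age, lighting, head_yaw])
y = rng.integers(0, 6, size=n_samples)  # one of six basic expressions (fake labels)

# A single classifier over facial and context features; in practice the context
# attributes would be used to normalize or condition the expression model.
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict(X[:3]))
```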

Body movement, i.e. body language, is even more challenging for AI to interpret. Humans have learned to decipher body language through millions of years of evolution. However, it is not simply about reading body language but about putting it into context. Shrugging your shoulders is a common gesture, but it can express different emotions in different contexts.

Finally, you will have to wait a while before an android joins your family, but a mass-market device capable of understanding basic human emotions and using AI to respond and act on them will be available somewhere around 2020.

By Kiril V. Kirilov
