December 13, 2017
From digital personal assistants like Siri to autonomous cars, it’s all Artificial Intelligence (AI). In fact, AI has grown dramatically over the last ten years and is now used in major industries such as banking and finance, medical science, heavy manufacturing, and air transport.
Today, many of us have made AI a part of our daily lives. It handles narrow tasks such as driving to new places, facial recognition, and internet searches, as well as more advanced functions, from offering suggestions to making real-time decisions. For such advanced, human-like interactions, however, AI requires Emotional Intelligence (EQ).
Frankly, I’ve never been a fan of AI that requires EQ fed to it by humans. Simply put, the purpose of EQ is to understand human feelings. From body language to tone of voice, EQ attempts to comprehend real-life human behavior, just as humans do. Keep in mind, though, that poorly taught empathy can cost relationships, friendships, and even jobs.
For these reasons, it is vital to understand the imperfections of emotional intelligence: EQ can be used for empathic ends or to manipulate others. One leading company in this space, Affectiva, is renowned for sensing emotions in video. Its work revolves around affective computing, the use of technology to recognize and interpret human emotions.
Rana el Kaliouby, the CEO of Affectiva, says the company will make it possible for AI to analyze human moods and emotions, enabling more human-like interactions that improve quality of life. She was deeply influenced by the portrayal of EQ in the movie “Her,” in which Joaquin Phoenix’s character falls in love with a Siri-like AI assistant named Samantha.
Rana el Kaliouby spoke with VentureBeat last month about how the company is working to make AI more attuned to human emotions through EQ, while keeping human ethics and morality in mind.
She also talked about overcoming the overlaps in reading the human face: because the face is so dynamic, it can be hard to distinguish positive from negative expressions. The human voice, by contrast, reveals slightly more genuine emotion than the face. Still, she said, facial expressions can expose the feelings behind a smile, showing when people are genuinely laughing and when they are actually sad.
EQ will not only help AI understand human actions but also allow it to predict people’s anger from their facial expressions. The voice carries an even wider spectrum of emotion, from hot anger and cold anger to frustration and irritation, and Rana el Kaliouby believes the full spectrum of human emotion comes through much more clearly in the voice channel.
Frankly, it would be amazing technology if robots could actually understand human emotions. Yet Naveen Joshi, the CEO of enterprise development company Allerin, speaking about how EQ will change the way AI works, cautions that some grey areas may never be resolved. He said, “Even the most advanced Artificial Intelligence (AI) with the help of Emotional Intelligence (EQ) would still lack critical factors like the ability to comprehend the emotions like human beings.”