Joy, anger and sadness – deciphering the emotions of AI robots

We have always thought that the biggest difference between robots and humans is that robots do not have human emotions. With the development of AI technology, this boundary between robots and humans is being broken: today’s AI robots can have emotions of their own. Like humans, they not only have their own “joy, anger, and sadness” but can even understand human emotions and respond to them.

Emotion is a profound technological gap

Human beings are advanced animals: out of the primitive, instinct-driven consciousness of lower animals, we have developed a rich variety of emotions, such as love, happiness, hatred, disgust, and a sense of beauty. These emotions often show irrational characteristics, which makes them hard for robots to imitate. A robot’s operation is based on rational logic, with no innate emotional sense, so making robots feel human emotions has never been a simple matter.

For a simple example, Siri, the voice assistant on the iPhone, is a highly intelligent voice robot developed by Apple. Yet in an earlier version, if you told Siri “I’m hurt, call an ambulance”, Siri would reply, “Okay, from now on, I’ll call you ‘ambulance’.” Evidently, the “smart” Siri failed to read the emotion in the caller’s words (Figure 1).


Figure 1 Siri voice assistant

The joy and sorrow behind highly intelligent robots

As humans demand more and more from robots, simple robots can no longer meet our needs, so scientists have put a great deal of work into teaching robots to understand emotion. How, then, can we give a logically running robot emotional capabilities?

First, let’s look at how robots acquire the ability to learn. To learn a complex skill (emotional learning included), a robot must first extract patterns from a large amount of data, then train on the extracted results through an algorithmic model, continuously optimizing to improve accuracy. With the development of AI, neural networks, and big data technologies, today’s AI robots first acquire a basic learning ability through complex algorithmic models fed with massive data, and then develop the ability to learn on their own, adjusting data weights and reading relevant data by themselves to better fit real-world patterns and achieve artificial intelligence (Figure 2).


Figure 2 Illustration of AI learning capabilities
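To make this loop concrete, here is a minimal sketch in Python with NumPy. All of the data is invented for the example; it simply shows the cycle described above: extract a pattern from labeled examples, make a guess, measure the error, and adjust the weights step by step.

```python
import numpy as np

# Toy data (invented): each row is a two-number "observation", labeled 1 or 0.
# The hidden pattern the model must extract: the label is 1 when the sum is positive.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X.sum(axis=1) > 0).astype(float)

# A tiny logistic-regression model: the weights the robot adjusts by itself.
w = np.zeros(2)
b = 0.0
lr = 0.1  # learning rate: how big each optimization step is

for step in range(500):
    # Forward pass: the model's current guess for every example.
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    # Gradient of the cross-entropy loss: which way to nudge the weights.
    grad_w = X.T @ (p - y) / len(y)
    grad_b = (p - y).mean()
    # Optimization step: adjust the data weights to fit the pattern better.
    w -= lr * grad_w
    b -= lr * grad_b

# Check how well the learned weights fit the pattern.
p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
accuracy = ((p > 0.5) == y).mean()
print(f"learned weights: {w}, accuracy: {accuracy:.0%}")
```

Real systems use far larger models and far more data, but the loop is the same: guess, measure the error, adjust, repeat.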

AI robots learn emotion in the same way. Scientists first prepare a large body of emotional data for the robot to learn from: everyday conversations between people, human reactions to things they like or hate, data on the many expressions of the human face, and so on. Many human emotions are expressed through facial expressions. To let robots recognize them, scientists have robots capture facial expression data through cameras, sensors, and other recognition devices, build data models from those expressions, and use specific algorithms to find the mapping between the outward appearance of an emotion and the inner emotional state, so that a person’s current emotion can be identified (Figure 3).


Figure 3 Facial emotion data capture
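As a simple illustration of that mapping, the sketch below (every feature value and emotion label is invented for the example) takes a pair of “facial measurements”, such as how far the mouth corners lift and how low the brows sit, and maps a new face to the nearest known emotional state. Real systems use far richer features and models, but the idea of mapping outward data to an inner state is the same.

```python
import numpy as np

# Hypothetical training data: [mouth-corner lift, brow lowering],
# as measured from face images by a camera and landmark detector (not shown).
examples = {
    "happy": np.array([[0.8, 0.1], [0.7, 0.2], [0.9, 0.0]]),
    "angry": np.array([[0.1, 0.9], [0.2, 0.8], [0.0, 0.7]]),
    "sad":   np.array([[0.1, 0.2], [0.2, 0.3], [0.0, 0.1]]),
}

# "Data modeling": summarize each emotion as the average of its examples.
centroids = {label: pts.mean(axis=0) for label, pts in examples.items()}

def identify_emotion(face_features: np.ndarray) -> str:
    """Map outward expression data to the nearest inner emotional state."""
    return min(centroids, key=lambda lbl: np.linalg.norm(face_features - centroids[lbl]))

print(identify_emotion(np.array([0.75, 0.15])))  # -> "happy"
```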

Scientists digitize the emotional data collected above, turning it into data that AI robots can “read”. After learning from a large amount of this data, followed by a long period of self-directed training, the AI robot gains the ability to recognize, understand, and adapt to human emotions, so that in the end the computer can interact with people in a natural, friendly, and vivid way.

Take the Google Assistant as an example. When you say to it, “I feel sad”, it replies, “I wish I had arms so I could give you a hug.” In this exchange, the assistant has understood your emotion and offered timely comfort; such a warm, emotional reply makes us no longer see it as a cold robot (Figure 4).


Figure 4 Google Assistant dialogue
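This is not Google’s actual implementation, but the flow can be sketched with a minimal keyword-based example: detect the emotion in the user’s words, then choose a reply that matches it. The keyword table and replies below are invented for illustration; a real assistant would use a trained model for the detection step.

```python
# Hypothetical keyword table standing in for a trained emotion classifier.
EMOTION_KEYWORDS = {
    "sad": {"sad", "unhappy", "hurt", "lonely"},
    "happy": {"great", "happy", "glad", "excited"},
}
REPLIES = {
    "sad": "I wish I had arms so I could give you a hug.",
    "happy": "That's wonderful to hear!",
    "neutral": "Tell me more.",
}

def reply(utterance: str) -> str:
    """Detect the emotion in the utterance, then pick a matching reply."""
    words = set(utterance.lower().split())
    for emotion, keywords in EMOTION_KEYWORDS.items():
        if words & keywords:  # any emotion keyword present?
            return REPLIES[emotion]
    return REPLIES["neutral"]

print(reply("I feel sad"))  # -> "I wish I had arms so I could give you a hug."
```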

Of course, human emotions are extremely rich, and in the example above, the same warm response, however many times it is repeated, may not feel warm enough to some people, while others may not want a warm response at all. But with its powerful artificial intelligence and self-learning ability, once you have used it for a while, an AI robot will come to understand and know you, and can provide more personalized service thereafter.

When AI has emotions, life is easier

When AI robots can hear and understand human emotions, they can help us far more in daily life. Take one phenomenon in today’s society: with the popularity of the Internet, people interact with each other less and less, and many suffer from psychological problems. Unwilling to talk to real people, they prefer to pour out their hearts to a chatbot, expressing their joys, angers, and sorrows and confiding secrets they can tell no one else. Once an AI bot understands the user’s emotions, it can monitor such a person’s mood swings during a chat and offer comfort through various interactive means, almost like a professional psychiatrist (Figure 5).


Figure 5 The Woebot robot that monitors mood swings
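How might a bot “monitor mood swings”? One simple approach, shown in the sketch below (a toy illustration, not Woebot’s actual method), is to score the emotion of each message, compare the average of the last few messages against the long-run average, and treat a sharp drop as the cue to send comfort. The scores and threshold are invented for the example.

```python
from collections import deque

class MoodMonitor:
    """Track per-message mood scores (-1 sad .. +1 happy) over a chat."""

    def __init__(self, window: int = 3, drop_threshold: float = 0.3):
        self.recent = deque(maxlen=window)  # scores of the last few messages
        self.all_scores = []                # scores of the whole conversation
        self.drop_threshold = drop_threshold

    def record(self, score: float) -> bool:
        """Add a score; return True if mood has swung sharply downward."""
        self.recent.append(score)
        self.all_scores.append(score)
        baseline = sum(self.all_scores) / len(self.all_scores)
        recent_avg = sum(self.recent) / len(self.recent)
        return baseline - recent_avg > self.drop_threshold

monitor = MoodMonitor()
for score in [0.6, 0.5, 0.4, -0.6, -0.8]:  # mood scores from successive messages
    if monitor.record(score):
        print("Sharp drop detected - send a comforting message.")
```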

Of course, there are many similar robots, such as Microsoft’s “Xiao Bing” (XiaoIce), which can chat with you to relieve boredom and even compose songs and write articles for you. In short, as AI technology develops, AI robots will become more and more capable, and will certainly bring more convenience to our lives (Figure 6).


Figure 6 The powerful Xiao Bing robot
