
Artificial Emotional Intelligence: Teaching AI to Detect and Express Human Emotions


In this article, see how to teach AI to detect and express human emotions.


Artificial emotional intelligence is a set of practices, techniques, and tools used to teach artificial intelligence how to detect, assess, and identify human emotions. Emotional AI techniques typically involve facial and body recognition, voice and language recognition, and data management. To achieve these goals, you can use deep learning for facial emotion recognition and implement unsupervised emotion recognition in text.

What Is Emotional AI?

Emotional AI uses machine learning to detect and interpret emotions in text, audio, or video data. It employs a variety of technologies to collect and analyze data related to facial expressions, gestures, tone of voice, language use, and situational context. Emotional AI also incorporates psychological research as the basis on which interpretations are trained and reported.

Emotional AI: Key Components

There are three main components when developing and employing emotional AI:

  • Facial and body recognition—facial recognition is used to detect emotions from facial features and expressions. Body recognition is used to detect emotions from gestures and the cadence (speed or jerkiness) of movement. 
  • Voice and language recognition—voice recognition is used to detect tone, pitch, and cadence of voices. Language recognition is used to detect emotion according to the content of language, such as words used and the context of words. This is accomplished through natural language processing (NLP). 
  • Data management—it is impossible to train or deploy an emotional AI model without sufficient data management. The complexity of human emotions requires large amounts of real-time data to analyze interactions accurately. Even if you are only interested in point-in-time analyses, the amount of data needed is still significant.

Depending on the implementation, there are some other methods that can be used to detect emotion, including biometric feedback, attention (gaze or responsiveness), and interaction. These methods can also be used to improve analyses based on video, audio, or text components. 

Emotion Recognition Methods

When developing emotional AI, there are a variety of methods you can use. Below are some widely adopted methods.

Deep Learning for Facial Emotion Recognition

Neural networks are often used to more accurately classify data, including emotional content. With these tools, researchers can rely on the network to automatically engineer features and refine classifications in a way that mirrors how humans may learn to interpret emotions. 

For example, convolutional neural networks (CNNs) can be fed an array of images from which the network learns to identify which features are relevant for the detection of a specific emotion. After many repetitions and with a broad enough dataset, these networks should then be able to accurately detect emotion in static images. Using a 3D CNN, they can also detect emotions and emotional changes in moving images.
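As a rough illustration, the sketch below shows what such a network might look like in TensorFlow.js (the same library used in the tutorial later in this article). The 48x48 grayscale input and the seven output classes are assumptions borrowed from common facial-expression datasets, not the configuration of any specific published model.

JavaScript

// Minimal sketch of a CNN for facial emotion classification in TensorFlow.js.
// The input size (48x48 grayscale) and seven emotion classes are assumptions.
import * as tf from '@tensorflow/tfjs';

function buildEmotionCnn() {
  const model = tf.sequential();
  // Convolution + pooling layers learn which facial features matter.
  model.add(tf.layers.conv2d({
    inputShape: [48, 48, 1], filters: 32, kernelSize: 3, activation: 'relu'
  }));
  model.add(tf.layers.maxPooling2d({ poolSize: 2 }));
  model.add(tf.layers.conv2d({ filters: 64, kernelSize: 3, activation: 'relu' }));
  model.add(tf.layers.maxPooling2d({ poolSize: 2 }));
  // Dense layers map the learned features to emotion classes.
  model.add(tf.layers.flatten());
  model.add(tf.layers.dense({ units: 128, activation: 'relu' }));
  model.add(tf.layers.dense({ units: 7, activation: 'softmax' }));
  model.compile({
    optimizer: 'adam',
    loss: 'categoricalCrossentropy',
    metrics: ['accuracy'],
  });
  return model;
}

// Training would then look something like:
// await buildEmotionCnn().fit(images, labels, { epochs: 20 });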

Unsupervised Emotion Recognition in Text

While deep learning can be supervised or semi-supervised, there are also unsupervised methods that researchers can apply. These methods are more common for text-based detection and are being developed not to depend on established affect lexicons, like WordNet-Affect. By eliminating this reliance, models can extend past the limitations of existing emotional models and identify more variations or nuance in emotion.
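As a toy illustration of one possible lexicon-free approach (not the specific methods referenced above), the sketch below embeds sentences with the Universal Sentence Encoder and groups the embeddings with a few iterations of k-means. Sentences that land in the same cluster are candidates to share an emotion, even though no labels or affect lexicon were used. The cluster count and the naive seeding strategy are arbitrary assumptions.

JavaScript

// Toy sketch: unsupervised grouping of sentences by emotion-like similarity.
import * as use from '@tensorflow-models/universal-sentence-encoder';

async function clusterSentences(sentences, k = 3, iterations = 10) {
  const model = await use.load();
  const embeddings = await model.embed(sentences); // shape [n, 512]
  const vectors = await embeddings.array();

  // Naive seeding: use the first k sentence vectors as initial centroids.
  let centroids = vectors.slice(0, k);
  let assignments = new Array(vectors.length).fill(0);

  const dist = (a, b) =>
    Math.sqrt(a.reduce((s, ai, i) => s + (ai - b[i]) ** 2, 0));

  for (let it = 0; it < iterations; it++) {
    // Assignment step: each sentence goes to its nearest centroid.
    assignments = vectors.map(v => {
      let best = 0;
      for (let c = 1; c < k; c++) {
        if (dist(v, centroids[c]) < dist(v, centroids[best])) best = c;
      }
      return best;
    });
    // Update step: move each centroid to the mean of its members.
    centroids = centroids.map((centroid, c) => {
      const members = vectors.filter((_, i) => assignments[i] === c);
      if (members.length === 0) return centroid;
      return centroid.map((_, d) =>
        members.reduce((s, m) => s + m[d], 0) / members.length);
    });
  }
  return assignments; // cluster index per input sentence
}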

Quick Tutorial: Use TensorFlow.js to Build an Emotion Recognition Application

In the following tutorial, you will learn how to build an emotion recognition app with real-time responsiveness. The application is built using Pusher and TensorFlow.js. When finished, a user takes an image of their face, the model predicts their emotion, and the result is pushed to a dashboard.

One possible use for an application like the one you are creating here is to gauge user feedback as a user explores updated features or webpages. This tutorial is adapted from a longer tutorial by Oreoluwa Ogundipe, which you can find here.

1. Prerequisites

To begin, you should install the Vue CLI, the command-line tooling for Vue.js, the framework used to develop the application interface. To install the CLI, you can use the following command:

yarn global add @vue/cli

Next, you need to create your application project. 

vue create realtime-feedback

Then, you can install the remaining JavaScript (JS) libraries you’ll be using.  

yarn add axios @tensorflow/tfjs @tensorflow-models/knn-classifier @tensorflow-models/mobilenet

2. Create a Homepage Component

After your tools are set up, you can create your Camera component in the src/components folder. This component controls the webcam your application uses.

When finished, your homepage and camera component should provide the following functionality:

  • A webcam feed
  • User can take a picture
  • User can select how to use the picture: for testing or training dataset
  • User can train model based on the sample images obtained
  • User can display the emotion predicted by the model for the current image

The markup is straightforward — you can see the code in the original tutorial.
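As a rough sketch (the Camera component name, emotion options, and layout here are illustrative assumptions rather than the original tutorial's exact markup), the home component's template might contain something like the following; the element ids and method names match the script shown in the next step.

HTML

<!-- Rough, illustrative sketch of the home component's template. -->
<template>
  <div id="home">
    <!-- Webcam feed provided by the Camera child component -->
    <Camera />

    <!-- Choose whether a captured image is used for training or testing -->
    <select id="use_case" @change="changeOption">
      <option value="training">Training</option>
      <option value="testing">Testing</option>
    </select>

    <!-- Emotion label to attach to a training image (labels are assumptions) -->
    <select id="emotion_options">
      <option value="0">Happy</option>
      <option value="1">Sad</option>
      <option value="2">Neutral</option>
    </select>

    <button @click="trainModel">Add training example</button>
    <button @click="getEmotion">Predict emotion</button>

    <!-- Prediction from the classifier -->
    <p>Detected emotion: {{ detected_e }}</p>
  </div>
</template>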

3. Add Tensorflow Methods to Your Home Component

Now you will add methods that: 

  • Allow your home component to feed the images taken by the user to TensorFlow
  • Add images to testing and training datasets
  • Process images
  • Use images to train the model
JavaScript

mounted: function(){
  // Initialize the classifier and MobileNet model when the component mounts.
  this.init();
},
methods: {
  // Create an (initially empty) KNN classifier and load MobileNet.
  async init(){
    this.classifier = knnClassifier.create();
    this.mobilenet = await mobilenetModule.load();
  },
  // Read the emotion class selected by the user, then capture a training example.
  trainModel(){
    let selected = document.getElementById("emotion_options");
    this.class = selected.options[selected.selectedIndex].value;
    this.addExample();
  },
  // Convert the current webcam frame to a tensor, run it through MobileNet to
  // get an embedding, and add that embedding to the KNN classifier.
  addExample(){
    const img = tf.fromPixels(this.$children[0].webcam.webcamElement);
    const logits = this.mobilenet.infer(img, 'conv_preds');
    this.classifier.addExample(logits, parseInt(this.class));
  },

Now that you can classify your emotion, you can collect your image with getEmotion() and send the value to your backend server. This is done using the registerEmotion() method seen below.

JavaScript

[...]
  // Embed the current webcam frame with MobileNet, ask the KNN classifier which
  // emotion class it is closest to, and report the result.
  async getEmotion(){
    const img = tf.fromPixels(this.$children[0].webcam.webcamElement);
    const logits = this.mobilenet.infer(img, 'conv_preds');
    const pred = await this.classifier.predictClass(logits);
    this.detected_e = this.emotions[pred.classIndex];
    this.registerEmotion();
  },
  // Switch between training and testing modes based on the dropdown selection.
  changeOption(){
    const selected = document.getElementById("use_case");
    this.mode = selected.options[selected.selectedIndex].value;
  },
  // Send the detected emotion to the backend server.
  registerEmotion(){
    axios.post('http://localhost:3128/callback', {
      'emotion': this.detected_e
    }).then( () => {
      alert('Thanks for letting us know how you feel');
    });
  }
}
};



And that’s it! You should now have a simple application that lets you take images, train a dataset, and automatically detect the emotion in new, unseen images.
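For completeness, here is a hedged sketch of the kind of backend endpoint registerEmotion() posts to, using Express and Pusher to push each detected emotion to the dashboard. The channel and event names, credentials, and middleware choices are placeholders rather than the original tutorial's values; only the /callback route and port 3128 come from the code above.

JavaScript

// Minimal sketch of a backend for the /callback route; names and keys are placeholders.
const express = require('express');
const bodyParser = require('body-parser');
const cors = require('cors');
const Pusher = require('pusher');

const app = express();
app.use(cors());
app.use(bodyParser.json());

const pusher = new Pusher({
  appId: 'YOUR_APP_ID',
  key: 'YOUR_APP_KEY',
  secret: 'YOUR_APP_SECRET',
  cluster: 'YOUR_CLUSTER',
});

// Receive the emotion predicted in the browser and push it to the dashboard.
app.post('/callback', (req, res) => {
  pusher.trigger('feedback-channel', 'new-emotion', {
    emotion: req.body.emotion,
  });
  res.json({ status: 'ok' });
});

app.listen(3128, () => console.log('Listening on port 3128'));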

Emotional AI Case Studies

Although emotional AI is still a growing field, there have already been several notable applications. In particular, many companies are working to leverage emotional AI to improve marketing and customer outreach. While the inclusion of artificial intelligence in marketing tools and the collection of vast amounts of consumer data are common, the inclusion of emotion detection is new.

Below are a few case studies highlighting how this technology is being applied, both in marketing and less commercial ways. 

Video

Huawei, in partnership with the Polish Blind Association, recently created an app that leverages emotional AI to enable blind people to “see” the emotions of those they’re interacting with. The app, called Facing Emotions, uses video detection methods to classify emotions and facial responses, and report back the analyses in an audio format. It does this by evaluating the position of facial features and interpreting the correlation in their positions. 

Audio

Amazon has a well-known product in which the company is trying to include emotional AI: its voice assistant, Alexa. This product enables customers to perform a variety of tasks, from turning on lights to ordering merchandise, with just their voice. According to patent reports, Amazon has developed tools that will enable Alexa to recognize a range of emotions through voice analysis. This is significant considering the access that Alexa has to users' daily lives and the breadth of audio data that can be collected.

Text

Nationwide, a UK banking company, has implemented text analysis of emotions to help interpret customer emails. The company is using a model developed by SAS to increase understanding of customer issues and to improve the speed of issue resolution. The model uses NLP and several other AI technologies to perform sentiment analysis on customer correspondence.

Conclusion

Emotional AI is quickly becoming a critical part of AI development. There is a wide range of ongoing applications of artificial emotional intelligence across video, audio, and text technologies, from helping blind people perceive the emotions of the people they interact with to interpreting customer feedback. As a whole, the expectation and hope are that once emotional AI reaches full maturity, it will help AI-based technologies better serve humans: once software understands human emotions, it can respond with content relevant to the emotions a person displays.

Topics:
ai, artificial intelligence, deep learning, emotional ai, facial recognition

