Recognizing Facial Emotions

Our facial expressions reveal what we feel, which makes them a vital mode of communication. From social media to video chat applications, our emotions are tracked everywhere, and medical research uses them widely, mainly to understand mental health by analyzing emotions. AI has made this possible: computer vision has progressed from merely recognising faces to decoding facial gestures. Face recognition has been around for ages, but have you ever wondered how interesting it would be to decode facial expressions, mainly happy, sad, fear, anger, surprise, neutral and disgust?

In this article, I'll discuss how to create a face emotion recognizer using the 'FER' library in Python.

FER library

FER performs facial expression recognition with a deep neural network built on the TensorFlow and Keras libraries and implemented in Python. The model was trained on the dataset from the Kaggle competition Challenges in Representation Learning: Facial Expression Recognition Challenge.

Dependencies:

OpenCV 3.2+

Tensorflow 1.7+

Python 3.6+

Installation

pip install fer

Predicting emotions on a static image

from fer import FER
import matplotlib.pyplot as plt

# Load the image and run the MTCNN-based detector on it
img = plt.imread("happy.jpg")
detector = FER(mtcnn=True)
print(detector.detect_emotions(img))
plt.imshow(img)

[OrderedDict([('box', (172, 31, 93, 93)), ('emotions', {'angry': 0.0, 'disgust': 0.0, 'fear': 0.0, 'happy': 1.0, 'sad': 0.0, 'surprise': 0.0, 'neutral': 0.0})])]


The detector returns a list containing, for each detected face, an ordered dictionary with the bounding-box coordinates of the face and scores for all seven emotions as decimal values from 0 to 1. FER wraps a Keras model built with convolutional neural networks, with weights saved in an HDF5 file; alternatively, the Peltarion API can be used as the backend in place of the Keras model. MTCNN stands for Multi-task Cascaded Convolutional Networks, an advanced technique for detecting faces. If mtcnn=False, the OpenCV Haar Cascade classifier is used by default. Lastly, the detect_emotions() function is called to score the face against each of the emotions 'angry', 'disgust', 'fear', 'happy', 'sad', 'surprise' and 'neutral'.
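Because the result is a list, an image with several faces can be handled by iterating over its entries. Here is a minimal sketch; the file name group.jpg is a hypothetical example, not from the original article:

from fer import FER
import matplotlib.pyplot as plt

img = plt.imread("group.jpg")  # hypothetical multi-face image
detector = FER(mtcnn=True)

for face in detector.detect_emotions(img):
    x, y, w, h = face["box"]
    # max() over the emotion dict gives the dominant label and its score
    emotion, score = max(face["emotions"].items(), key=lambda kv: kv[1])
    print(f"Face at ({x}, {y}, {w}, {h}): {emotion} ({score:.2f})")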

If we only want the emotion with the highest score, we can get it directly with the top_emotion() function.

emotion, score = detector.top_emotion(img)
print(emotion,score)

happy 1.0

Let's look at other emotions:

img = plt.imread("sad.jpg")
detector = FER()
print(detector.top_emotion(img))
plt.imshow(img)

('sad', 0.91)

Similarly, running the same code on other images yields the remaining emotions; a sketch for scoring several images in one loop follows these outputs.

('angry', 0.96)

('neutral', 0.88)

('surprise', 0.95)

('fear', 0.92)
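These outputs can be reproduced in a single loop. A minimal sketch, assuming the hypothetical file names below exist alongside the script:

from fer import FER
import matplotlib.pyplot as plt

detector = FER()  # default OpenCV Haar cascade face detector
for filename in ["sad.jpg", "angry.jpg", "neutral.jpg", "surprise.jpg", "fear.jpg"]:
    img = plt.imread(filename)
    # top_emotion() returns a (label, score) tuple for the strongest emotion
    print(filename, detector.top_emotion(img))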

The 'fer' library also has a separate module for analyzing facial emotions in videos. It extracts frames and performs emotion analysis over the detected faces in each frame via the video.analyze() function.

from fer import FER, Video

video_filename = "D:/python/YouTube.mp4"
video = Video(video_filename)

# Analyze the video frame by frame, displaying the annotated output
detector = FER(mtcnn=True)
video.analyze(detector, display=True)
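The per-frame scores can also be kept for further analysis. The fer library's Video class exposes a to_pandas() helper for this; a minimal sketch follows, though the exact method names are an assumption worth verifying against the fer documentation:

from fer import FER, Video

video = Video("D:/python/YouTube.mp4")
detector = FER(mtcnn=True)

# Capture the per-frame emotion scores instead of discarding them
raw_data = video.analyze(detector, display=False)

# to_pandas() flattens the raw results into one row per frame (assumed API)
df = video.to_pandas(raw_data)
print(df.head())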

Storing the emotion and detected face

For each detected face, a bounding box is drawn, and the emotion with the highest score is shown in a brighter text colour than the others.

import cv2
from fer import FER

# Load the image with OpenCV and detect faces and their emotions
detector = FER(mtcnn=True)
image = cv2.imread("andrew_ng.jpg")
result = detector.detect_emotions(image)

As discussed above, the result list contains the bounding box and emotion values for each face; here they are stored in separate variables.

bounding_box = result[0]["box"]
emotions = result[0]["emotions"]

# Draw the bounding box around the face
cv2.rectangle(
    image,
    (bounding_box[0], bounding_box[1]),
    (bounding_box[0] + bounding_box[2], bounding_box[1] + bounding_box[3]),
    (0, 155, 255),
    2,
)

Now each emotion is written beneath the box, with the highest scores highlighted:

for idx, (emotion, score) in enumerate(emotions.items()):
    # Grey out near-zero scores; highlight the rest in colour
    color = (211, 211, 211) if score < 0.01 else (255, 0, 0)
    emotion_score = "{}: {}".format(
        emotion, "{:.2f}".format(score) if score > 0.01 else ""
    )
    cv2.putText(
        image,
        emotion_score,
        (bounding_box[0], bounding_box[1] + bounding_box[3] + 30 + idx * 15),
        cv2.FONT_HERSHEY_SIMPLEX,
        0.5,
        color,
        1,
        cv2.LINE_AA,
    )
cv2.imwrite("emotion.jpg", image)
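Since OpenCV stores images in BGR channel order while matplotlib expects RGB, the saved annotated image can be displayed inline with a quick conversion:

import cv2
import matplotlib.pyplot as plt

# Reload the annotated image and convert BGR -> RGB for matplotlib
annotated = cv2.imread("emotion.jpg")
plt.imshow(cv2.cvtColor(annotated, cv2.COLOR_BGR2RGB))
plt.axis("off")
plt.show()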

Conclusion

A face emotion detector can be implemented in a few lines of code using the fer library and easily integrated with other applications. Robotics has evolved over the years, and many bots can now mimic human reactions. Another use case could be online proctored exams, where candidates' reactions can be tracked and monitored.

The complete code of the above implementation is available at AIM's GitHub repository. Please visit this link to find the notebook of this code.
