Recognizing Facial Emotions


What we feel inside often shows on our faces, which makes facial expressions a vital mode of communication. From social media to video-chat applications, our emotions are tracked everywhere, and medical research uses them widely, mainly to assess mental health by analyzing emotions. AI has made this possible: computer vision has moved from merely recognising faces to decoding facial gestures. Face recognition has been around for ages, but have you ever wondered how interesting it would be to decode facial expressions, namely happy, sad, fear, anger, surprise, neutral, and disgust?

In this article, I'll discuss how to create a face emotion recognizer using the 'FER' library in Python.

FER library

FER performs Facial Expression Recognition with a deep neural network built on the TensorFlow and Keras libraries and implemented in Python. The model is trained on the dataset from the Kaggle competition Challenges in Representation Learning: Facial Expression Recognition Challenge.

Dependencies:

OpenCV 3.2+

Tensorflow 1.7+

Python 3.6+

Installation

pip install fer
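
Depending on your environment, TensorFlow and OpenCV may not be pulled in automatically and can be installed separately. The package names below are the usual PyPI names (opencv-python for OpenCV's Python bindings), not taken from the article:

pip install tensorflow opencv-python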

Predicting emotions from a static image

from fer import FER
import matplotlib.pyplot as plt
# Load the test image and run the MTCNN-based detector on it
img = plt.imread("happy.jpg")
detector = FER(mtcnn=True)
print(detector.detect_emotions(img))
plt.imshow(img)

[OrderedDict([('box', (172, 31, 93, 93)), ('emotions', {'angry': 0.0, 'disgust': 0.0, 'fear': 0.0, 'happy': 1.0, 'sad': 0.0, 'surprise': 0.0, 'neutral': 0.0})])]

<matplotlib.image.AxesImage at 0x1ad9f99fd30>

The detector returns a list with one ordered dictionary per detected face, containing the bounding-box coordinates of the face and all 7 emotions as decimal values between 0 and 1. FER contains a Keras model built with convolutional neural networks, with weights saved in HDF5 format. Alternatively, the Peltarion API can be used as the backend in place of the Keras model. MTCNN stands for Multi-task Cascaded Convolutional Networks, an advanced face-detection technique; if mtcnn=False, the OpenCV Haar Cascade classifier is used by default. Lastly, the detect_emotions() function classifies the emotion as 'happy', 'sad', 'disgust', 'angry', 'fear', 'surprise' or 'neutral', with a score for each.
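
Because detect_emotions() returns one entry per detected face, an image with several faces can be handled by iterating over the list. Below is a minimal sketch; the file group.jpg and the variable names are illustrative, not from the original code:

from fer import FER
import matplotlib.pyplot as plt

img = plt.imread("group.jpg")  # hypothetical image containing several faces
detector = FER(mtcnn=True)     # set mtcnn=False to fall back to the Haar cascade
for face in detector.detect_emotions(img):
    x, y, w, h = face["box"]   # bounding box of this face
    # Pick the emotion with the highest score for this face
    top = max(face["emotions"], key=face["emotions"].get)
    print("Face at ({}, {}, {}, {}): {} ({:.2f})".format(x, y, w, h, top, face["emotions"][top]))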

If we only want the emotion with the highest score, we can get it directly with the top_emotion() function.

emotion, score = detector.top_emotion(img)
print(emotion, score)

happy 1.0

Let's look at other emotions:

img = plt.imread("sad.jpg")
detector = FER()
print(detector.top_emotion(img))
plt.imshow(img)

('sad', 0.91)

Similarly, the top emotion is returned for the other sample images.

('angry', 0.96)

('neutral', 0.88)

('surprise', 0.95)

('fear', 0.92)

The fer library also has a separate module for analysing facial emotions in videos. It extracts frames from the video and performs emotion analysis over the detected faces using the video.analyze() function.

from fer import Video
from fer import FER
video_filename = "D:/python/YouTube.mp4"
video = Video(video_filename)
# Analyze video, displaying the output
detector = FER(mtcnn=True)
video.analyze(detector, display=True)
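
Per the fer documentation, analyze() also returns the raw per-frame results, which the Video class can convert into a pandas DataFrame for plotting or aggregation. A short sketch (check that your installed version exposes to_pandas()):

raw_data = video.analyze(detector, display=True)
df = video.to_pandas(raw_data)  # one row per frame, one column per emotion
print(df.head())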

Storing the emotions and the detected face

A bounding box is drawn around the detected face, and each emotion is written with its score; emotions with meaningful scores appear in a brighter text colour than the others.

import cv2
from fer import FER
detector = FER(mtcnn=True) 
image = cv2.imread("andrew_ng.jpg")
result = detector.detect_emotions(image)

As discussed above, the result is a list whose entries hold the bounding box and the emotion scores; we store these in separate variables.

bounding_box = result[0]["box"]
emotions = result[0]["emotions"]
# Draw the bounding box around the detected face
cv2.rectangle(image, (bounding_box[0], bounding_box[1]),
              (bounding_box[0] + bounding_box[2], bounding_box[1] + bounding_box[3]),
              (0, 155, 255), 2)

Now each emotion, along with its score, is written below the face, with the stronger scores highlighted:

for idx, (emotion, score) in enumerate(emotions.items()):
    color = (211, 211, 211) if score < 0.01 else (255, 0, 0)
    emotion_score = "{}: {}".format(
        emotion, "{:.2f}".format(score) if score > 0.01 else ""
    )
    cv2.putText(
        image,
        emotion_score,
        (bounding_box[0], bounding_box[1] + bounding_box[3] + 30 + idx * 15),
        cv2.FONT_HERSHEY_SIMPLEX,
        0.5,
        color,
        1,
        cv2.LINE_AA,
    )
cv2.imwrite("emotion.jpg", image)
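
To verify the result, we can read the strongest emotion straight from the scores we already have and display the annotated image. This is a small follow-up to the code above, not part of the original; note that OpenCV stores images as BGR while matplotlib expects RGB:

import matplotlib.pyplot as plt
# Strongest emotion from the existing scores dictionary
top_emotion = max(emotions, key=emotions.get)
print(top_emotion, emotions[top_emotion])
# Convert BGR to RGB before displaying with matplotlib
plt.imshow(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))
plt.axis("off")
plt.show()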

Conclusion

Face emotion detectors are easy to implement with a few lines of code using the fer library and can be integrated with other applications. Robotics has evolved over the years, and many bots can now mimic human reactions. Another use case is online proctored exams, where candidates' reactions can be tracked and monitored.

The complete code of the above implementation is available in AIM's GitHub repository; please visit this link to find the notebook for this code.

