Survey of Emotion Detection Based on Text and Facial Modalities
Published: 2019
Author(s) Name: Archi Agarwal, Harshita Dadhich and Anushree Shrivastava
Author(s) Affiliation: Dept. of Comp. Science & Engg, SET, Mody Univ. of Science and Tech., Rajasthan, India.
Abstract
Humans express their feelings directly or
indirectly through facial expressions, speech, writing
or gestures. In the current era, people express their
feelings through social media, news articles and
microblogs. The means of emoting varies with location,
culture, gender, etc. Advances in computer robotics have
opened new angles in human-computer interaction that
affect our day-to-day lives. Real-time face-to-face
communication requires quick and accurate assessments
if computers are to comprehend human needs and improve
their ability to communicate. An important basis for
these assessments is human emotion. To tackle the
emotion detection problem more precisely, researchers
around the world are seeking effective approaches through
text, speech, psychology and facial modalities. This
survey covers a few of the existing emotion recognition
models, along with their datasets, techniques and
features. The paper focuses on reviewing emotion
detection based on text and facial modalities. It recaps
achievements in the field of emotion detection and
highlights extensions for better outcomes.
Keywords: Deep learning, Emotion detection, Facial recognition, Machine learning.