List of facial expression databases

A facial expression database is a collection of images or video clips depicting facial expressions across a range of emotions. Well-annotated (emotion-tagged) media content of facial behavior is essential for training, testing, and validating algorithms used to develop expression recognition systems. Emotions can be annotated with discrete labels or on a continuous scale. Most databases follow the basic emotions theory of Paul Ekman, which assumes the existence of six discrete basic emotions (anger, fear, disgust, surprise, joy, sadness). Some databases, however, tag emotion on a continuous arousal-valence scale.
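The two annotation schemes can be made concrete with a small sketch. The record types and field names below are illustrative assumptions, not the schema of any particular database:

```python
from dataclasses import dataclass

# Hypothetical record types for the two annotation schemes described above;
# all names and value ranges here are assumptions for illustration only.

@dataclass
class DiscreteAnnotation:
    """A categorical tag: one of Ekman's six basic emotions (or neutral)."""
    image_id: str
    label: str  # e.g. "anger", "fear", "disgust", "surprise", "joy", "sadness"

@dataclass
class ContinuousAnnotation:
    """Affect as a point in the two-dimensional valence-arousal space."""
    image_id: str
    valence: float  # pleasantness, often normalized to [-1.0, 1.0]
    arousal: float  # activation/intensity, often normalized to [-1.0, 1.0]

# The same smiling face might carry either kind of tag:
d = DiscreteAnnotation(image_id="img_0001", label="joy")
c = ContinuousAnnotation(image_id="img_0001", valence=0.8, arousal=0.5)
```

The continuous scheme can express blends and intensities (e.g. mildly pleasant but calm) that a single discrete label cannot.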

In posed expression databases, participants are asked to display different basic emotional expressions, whereas in spontaneous expression databases the expressions are natural. Spontaneous expressions differ markedly from posed ones in intensity, configuration, and duration. Moreover, some action units (AUs) are barely achievable without the person actually undergoing the associated emotional state. As a result, posed expressions are in most cases exaggerated, while spontaneous ones are subtle and differ in appearance.

Many publicly available databases have been catalogued.[1][2] The table below summarizes the main facial expression databases.

| Database | Facial expressions | Subjects | Images / videos | Gray / Color | Resolution, frame rate | Ground truth | Type |
|---|---|---|---|---|---|---|---|
| FERG-3D-DB (Facial Expression Research Group 3D Database), for stylized characters[3] | angry, disgust, fear, joy, neutral, sad, surprise | 4 | 39,574 annotated examples | Color | | Emotion labels | Frontal pose |
| Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS)[4] | Speech: calm, happy, sad, angry, fearful, surprise, disgust, and neutral. Song: calm, happy, sad, angry, fearful, and neutral. Each expression at two levels of emotional intensity. | 24 | 7,356 video and audio files | Color | 1280×720 (720p) | Facial expression labels; ratings provided by 319 human raters | Posed |
| Extended Cohn-Kanade Dataset (CK+)[5] | neutral, sadness, surprise, happiness, fear, anger, contempt, and disgust | 123 | 593 image sequences (327 with discrete emotion labels) | Mostly gray | 640×490 | Facial expression labels and FACS (AU label for the final frame of each sequence) | Posed; spontaneous smiles |
| Japanese Female Facial Expressions (JAFFE)[6] | neutral, sadness, surprise, happiness, fear, anger, and disgust | 10 | 213 static images | Gray | 256×256 | Facial expression labels | Posed |
| MMI Database[7] | | 43 | 1,280 videos and over 250 images | Color | 720×576 | AU label for the frame with the apex facial expression in each sequence | Posed and spontaneous |
| Belfast Database, Set 1[8] | disgust, fear, amusement, frustration, surprise | 114 | 570 video clips | Color | 720×576 | | Natural emotion |
| Belfast Database, Set 2[8] | disgust, fear, amusement, frustration, surprise, anger, sadness | 82 | 650 video clips | Color | | | Natural emotion |
| Belfast Database, Set 3[8] | disgust, fear, amusement | 60 | 180 video clips | Color | 1920×1080 | | Natural emotion |
| Indian Semi-Acted Facial Expression Database (iSAFE)[9] | happy, sad, fear, surprise, angry, neutral, disgust | 44 | 395 clips | Color | 1920×1080, 60 fps | Emotion labels | Spontaneous |
| DISFA[10] | – | 27 | 4,845 video frames | Color | 1024×768, 20 fps | AU intensity for each video frame (12 AUs) | Spontaneous |
| Multimedia Understanding Group (MUG)[11] | neutral, sadness, surprise, happiness, fear, anger, and disgust | 86 | 1,462 sequences | Color | 896×896, 19 fps | Emotion labels | Posed |
| Indian Spontaneous Expression Database (ISED)[12] | sadness, surprise, happiness, and disgust | 50 | 428 videos | Color | 1920×1080, 50 fps | Emotion labels | Spontaneous |
| Radboud Faces Database (RaFD)[13] | neutral, sadness, contempt, surprise, happiness, fear, anger, and disgust | 67 | three gaze directions and five camera angles (8×67×3×5 = 8,040 images) | Color | 681×1024 | Emotion labels | Posed |
| Oulu-CASIA NIR-VIS database | surprise, happiness, sadness, anger, fear, and disgust | 80 | three illumination conditions (normal, weak, dark); 2,880 video sequences in total | Color | 320×240 | | Posed |
| FERG-DB (Facial Expression Research Group Database), for stylized characters[14] | angry, disgust, fear, joy, neutral, sad, surprise | 6 | 55,767 annotated images | Color | 768×768 | Emotion labels | Frontal pose |
| AffectNet[15] | neutral, happy, sad, surprise, fear, disgust, anger, contempt | | ~450,000 manually annotated; ~500,000 automatically annotated | Color | Various | Emotion labels, valence, arousal | Wild setting |
| IMPA-FACE3D[16] | neutral frontal, joy, sadness, surprise, anger, disgust, fear, opened, closed, kiss, left side, right side, neutral sagittal left, neutral sagittal right, nape and forehead (acquired sometimes) | 38 | 534 static images | Color | 640×480 | Emotion labels | Posed |
| FEI Face Database | neutral, smile | 200 | 2,800 static images | Color | 640×480 | Emotion labels | Posed |
| Aff-Wild[17][18] | valence and arousal | 200 | ~1,250,000 manually annotated frames | Color | Various (average 640×360) | Valence, arousal | In-the-wild setting |
| Aff-Wild2[19][20] | neutral, happiness, sadness, surprise, fear, disgust, anger; valence-arousal; action units 1, 2, 4, 6, 12, 15, 20, 25 | 458 | ~2,800,000 manually annotated frames | Color | Various (average 1030×630) | Valence, arousal, 7 basic expressions, and action units for each video frame | In-the-wild setting |
| Real-world Affective Faces Database (RAF-DB)[21][22] | 6 classes of basic emotions (surprised, fear, disgust, happy, sad, angry) plus neutral, and 12 classes of compound emotions (fearfully surprised, fearfully disgusted, sadly angry, sadly fearful, angrily disgusted, angrily surprised, sadly disgusted, disgustedly surprised, happily surprised, sadly surprised, fearfully angry, happily disgusted) | | 29,672 annotated examples | Color | Various (original); 100×100 (aligned) | Emotion labels | Posed and spontaneous |
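Some of the databases above (e.g. DISFA, MMI, Aff-Wild2) provide FACS action-unit annotations rather than, or in addition to, discrete emotion labels; DISFA in particular codes a per-frame intensity for each of 12 AUs. The sketch below shows one minimal way to consume such annotations. The `frame,intensity` CSV layout and the 0-5 intensity scale used here are assumptions for illustration, not the exact file format of any specific release:

```python
import csv
import io

def parse_au_intensity(text: str) -> dict[int, int]:
    """Map frame number -> AU intensity code (0 = AU absent, 5 = maximum).

    Assumes one "frame,intensity" pair per line, as in the hypothetical
    sample below; real releases may use a different on-disk layout.
    """
    intensities = {}
    for frame, value in csv.reader(io.StringIO(text)):
        intensities[int(frame)] = int(value)
    return intensities

# Hypothetical annotation file for one AU over a five-frame clip.
sample = "1,0\n2,0\n3,2\n4,4\n5,1\n"
per_frame = parse_au_intensity(sample)
active = [f for f, v in per_frame.items() if v > 0]  # frames where the AU fires
```

Keeping intensities rather than thresholding to present/absent preserves the ordinal information that makes AU-coded databases useful for subtle, spontaneous expressions.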

Notes and References

  1. Web site: collection of emotional databases. Archived at https://web.archive.org/web/20180325205102/http://emotion-research.net/wiki/Databases (2018-03-25); original link dead.
  2. Web site: facial expression databases.
  3. Aneja, Deepali, et al. "Learning to generate 3D stylized character expressions from humans." 2018 IEEE Winter Conference on Applications of Computer Vision (WACV). IEEE, 2018.
  4. Livingstone & Russo (2018). The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS): A dynamic, multimodal set of facial and vocal expressions in North American English.
  5. P. Lucey, J. F. Cohn, T. Kanade, J. Saragih, Z. Ambadar and I. Matthews, "The Extended Cohn-Kanade Dataset (CK+): A complete facial expression dataset for action unit and emotion-specified expression," in 3rd IEEE Workshop on CVPR for Human Communicative Behavior Analysis, 2010
  6. Lyons, Michael; Kamachi, Miyuki; Gyoba, Jiro (1998). The Japanese Female Facial Expression (JAFFE) Database. doi:10.5281/zenodo.3451524.
  7. M. Valstar and M. Pantic, "Induced disgust, happiness and surprise: an addition to the MMI facial expression database," in Proc. Int. Conf. Language Resources and Evaluation, 2010
  8. I. Sneddon, M. McRorie, G. McKeown and J. Hanratty, "The Belfast induced natural emotion database," IEEE Trans. Affective Computing, vol. 3, no. 1, pp. 32-41, 2012
  9. Singh, Shivendra; Benedict, Shajulin (2020). "Indian Semi-Acted Facial Expression (iSAFE) Dataset for Human Emotions Recognition," in Thampi, S. M.; Hegde, R. M.; Krishnan, S.; Mukhopadhyay, J.; Chaudhary, V.; Marques, O.; Piramuthu, S.; Corchado, J. M. (eds.), Advances in Signal Processing and Intelligent Recognition Systems, Communications in Computer and Information Science, vol. 1209, Springer, Singapore, pp. 150–162. doi:10.1007/978-981-15-4828-4_13. ISBN 978-981-15-4828-4. https://link.springer.com/chapter/10.1007/978-981-15-4828-4_13
  10. S. M. Mavadati, M. H. Mahoor, K. Bartlett, P. Trinh and J. Cohn., "DISFA: A Spontaneous Facial Action Intensity Database," IEEE Trans. Affective Computing, vol. 4, no. 2, pp. 151–160, 2013
  11. N. Aifanti, C. Papachristou and A. Delopoulos, The MUG Facial Expression Database, in Proc. 11th Int. Workshop on Image Analysis for Multimedia Interactive Services (WIAMIS), Desenzano, Italy, April 12–14, 2010.
  12. S L Happy, P. Patnaik, A. Routray, and R. Guha, "The Indian Spontaneous Expression Database for Emotion Recognition," IEEE Transactions on Affective Computing, 2016.
  13. Langner, O., Dotsch, R., Bijlstra, G., Wigboldus, D.H.J., Hawk, S.T., & van Knippenberg, A. (2010). Presentation and validation of the Radboud Faces Database. Cognition & Emotion, 24(8), 1377–1388.
  14. Web site: Facial Expression Research Group Database (FERG-DB), grail.cs.washington.edu. Retrieved 2016-12-06.
  15. Mollahosseini, A.; Hasani, B.; Mahoor, M. H. (2017). "AffectNet: A Database for Facial Expression, Valence, and Arousal Computing in the Wild." IEEE Transactions on Affective Computing, vol. PP, no. 99, pp. 18–31. doi:10.1109/TAFFC.2017.2740923. arXiv:1708.03985. ISSN 1949-3045.
  16. Web site: IMPA-FACE3D Technical Reports, visgraf.impa.br. Retrieved 2018-03-08.
  17. Zafeiriou, S.; Kollias, D.; Nicolaou, M. A.; Papaioannou, A.; Zhao, G.; Kotsia, I. (2017). "Aff-Wild: Valence and Arousal 'In-the-Wild' Challenge," in 2017 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), pp. 1980–1987. doi:10.1109/CVPRW.2017.248. ISBN 978-1-5386-0733-6. https://eprints.mdx.ac.uk/22045/1/aff_wild_kotsia.pdf
  18. Kollias, D.; Tzirakis, P.; Nicolaou, M. A.; Papaioannou, A.; Zhao, G.; Schuller, B.; Kotsia, I.; Zafeiriou, S. (2019). "Deep Affect Prediction in-the-wild: Aff-Wild Database and Challenge, Deep Architectures, and Beyond." International Journal of Computer Vision, 127(6–7), 907–929. doi:10.1007/s11263-019-01158-4. arXiv:1804.10938.
  19. Kollias, D.; Zafeiriou, S. (2019). "Expression, affect, action unit recognition: Aff-Wild2, multi-task learning and ArcFace." British Machine Vision Conference (BMVC), 2019. arXiv:1910.04855.
  20. Kollias, D.; Schulc, A.; Hajiyev, E.; Zafeiriou, S. (2020). "Analysing Affective Behavior in the First ABAW 2020 Competition," in 2020 15th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2020), pp. 637–643. doi:10.1109/FG47880.2020.00126. arXiv:2001.11409. ISBN 978-1-7281-3079-8.
  21. Web site: Li, S. RAF-DB: Real-world Affective Faces Database.
  22. Li, S.; Deng, W.; Du, J. (2017). "Reliable Crowdsourcing and Deep Locality-Preserving Learning for Expression Recognition in the Wild," in 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 2584–2593. doi:10.1109/CVPR.2017.277. ISBN 978-1-5386-0457-1. https://ieeexplore.ieee.org/document/8099760