Dynamic Affect Recognition Evaluation
The Dynamic Affect Recognition Evaluation (DARE) is a standardized tool for assessing facial emotion recognition [1]. DARE uses the Cohn-Kanade Action Unit-Coded Facial Expression Database [2] to create video sequences of facial expressions. The stimuli include uncompressed video files (i.e., series of still images) depicting the six basic emotions. The images are morphed so that each video shows a face starting from a neutral expression and slowly transitioning into one of the six target emotions (sadness, fear, surprise, disgust, anger, and happiness). Video durations range from 15 to 33 seconds. Participants indicate when they can identify the emotion (latency) and then verbally identify the emotion (accuracy). DARE administration takes 10 to 20 minutes, depending on each participant's average latency to respond to the stimuli.
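The two outcome measures described above can be illustrated with a minimal scoring sketch. This is not the DARE software's actual data format or code; the `Trial` record, field names, and `score` function are hypothetical, shown only to make the latency and accuracy definitions concrete.

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical trial record for a DARE-style task (illustrative only;
# the actual DARE software defines its own data format).
@dataclass
class Trial:
    target: str        # emotion shown in the video
    response: str      # emotion named by the participant
    latency_s: float   # seconds from video onset to the participant's signal

def score(trials):
    """Return overall accuracy and mean latency on correctly identified trials."""
    correct = [t for t in trials if t.response == t.target]
    accuracy = len(correct) / len(trials)
    mean_latency = mean(t.latency_s for t in correct) if correct else None
    return accuracy, mean_latency

trials = [
    Trial("happiness", "happiness", 8.2),
    Trial("fear", "surprise", 12.5),   # misidentified: counts against accuracy
    Trial("anger", "anger", 10.1),
]
acc, lat = score(trials)  # acc = 2/3, lat = 9.15
```

Restricting mean latency to correct trials is one common convention; an analysis could equally report latency over all trials.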
For repeated DARE administrations, shorter versions of the program are available in which the videos are divided between two administrations.
For more information on the DARE, or to request use of the software and stimulus videos, please contact Dr. Keri J. Heilman at firstname.lastname@example.org.
- [1] Porges, S.W., Cohn, J.F., Bal, E., Lamb, D., & Lewis, G.F. (2016). The Dynamic Affect Recognition Evaluation software v2. Brain-Body Center for Psychophysiology and Bioengineering, University of North Carolina, Chapel Hill, NC.
- [2] Cohn, J.F., Zlochower, A., Lien, J., & Kanade, T. (1999). Automated face analysis by feature point tracking has high concurrent validity with manual FACS coding. Psychophysiology, 36, 35-43.
- Kanade, T., Cohn, J.F., & Tian, Y. (2000). Comprehensive database for facial expression analysis. Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition (FG'00), Grenoble, France, 46-53.