The automatic recognition of human emotions is of great interest for multimedia applications and brain-computer interfaces. While users' emotions could be assessed with questionnaires, the results may be biased because the answers can be influenced by social expectations. More objective measures of emotion can be obtained by studying the users' physiological responses. The present database was constructed in particular to evaluate the usefulness of electroencephalography (EEG) for emotion recognition in the context of audio-visual stimuli, but it also contains simultaneous physiological recordings (electrocardiogram, respiration, blood oxygen level, pulse rate, galvanic skin response) in addition to the EEG data. To the best of our knowledge, this is the first publicly available database containing high-resolution (HR) EEG recordings for emotion studies.
To study the emotions elicited by audio-visual stimuli (film excerpts), we recruited 40 subjects to participate in an experiment. During the experiment, we presented 13 emotional videos (7 eliciting positive emotions and 6 eliciting negative emotions) and 13 neutral videos to the subjects. The subjects were seated comfortably in a chair at a distance of about 1 m from the 21'' computer screen used for the presentation of the stimuli.
The experiment was implemented in E-Prime 2.0 (Psychology Software Tools, Pittsburgh, PA) according to the experimental protocol shown in Figure 1. Except for the test trial, both the neutral and the emotional videos were presented in arbitrary order. The emotional stimuli were selected from the FilmStim database (A. Schaefer, F. Nils, X. Sanchez, and P. Philippot, "Assessing the effectiveness of a large database of emotion-eliciting films: a new tool for emotion researchers," Cognition & Emotion, vol. 24, no. 7, pp. 1153–1172, 2010, http://nemo.psp.ucl.ac.be/FilmStim/) and were between 40 s and 6 min long. After each emotional video, the subjects rated the emotions felt during the video on a discrete scale from negative (-1) through neutral (0) to positive (1). HR-EEG recordings and physiological measurements (1-channel electrocardiogram, respiration, blood oxygen level, pulse rate) were acquired with a 257-channel EGI system (Electrical Geodesics Inc., Eugene, USA) at a sampling rate of 1000 Hz. Galvanic skin response measurements were recorded simultaneously using a SenseWear sensor (BodyMedia Inc.) at a sampling rate of 31 Hz. The experiment took place in a shielded room at Pontchaillou Hospital, Rennes, France, and was approved by the local ethical review board of Inserm.
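Because the galvanic skin response (31 Hz) and the EGI recordings (1000 Hz) are sampled at different rates, joint analyses require bringing the signals onto a common time base. The following sketch shows one simple way to do this with linear interpolation; the signal, durations, and variable names are illustrative only and are not taken from the database files.

```python
import numpy as np

# Illustrative sampling rates matching the acquisition setup:
# 31 Hz for the GSR sensor, 1000 Hz for the EGI system.
fs_gsr, fs_eeg = 31.0, 1000.0
duration = 2.0  # seconds (arbitrary, for demonstration)

# Synthetic GSR-like signal on its native 31 Hz time base.
t_gsr = np.arange(0.0, duration, 1.0 / fs_gsr)
gsr = np.sin(2.0 * np.pi * 0.5 * t_gsr)

# Resample onto the 1000 Hz EEG time base by linear interpolation,
# so each EEG sample has a corresponding GSR value.
t_eeg = np.arange(0.0, duration, 1.0 / fs_eeg)
gsr_upsampled = np.interp(t_eeg, t_gsr, gsr)

print(gsr_upsampled.shape)
```

For slowly varying signals such as skin conductance, linear interpolation is usually adequate; band-limited resampling (e.g. a polyphase filter) would be preferable for faster signals.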
The database includes the EEG and other physiological recordings of the 40 subjects, collected during the viewing of the neutral and emotional videos and during the black-screen periods. The data are provided in Matlab (.mat) file format. Further information on the subjects (age and gender), the individual ratings of the videos (self-assessment labels), and the experiment (order of presentation of the videos) is also included in the Matlab files, along with all the details on preprocessing and data analysis required to reproduce the results presented in (H. Becker, J. Fleureau, P. Guillotel, F. Wendling, I. Merlet, and L. Albera, "Emotion recognition based on high-resolution EEG recordings and reconstructed brain sources", submitted to IEEE Transactions on Affective Computing, 2016). Moreover, we provide a Matlab script to read these data. More details about the data and the contents of the Matlab files are given here.
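The provided Matlab reading script documents the actual file structure; for users working in Python, the .mat files can also be opened with SciPy. The sketch below is self-contained, so it first writes a small .mat file and reads it back; the file name and variable names ('eeg', 'labels') are placeholders, not the actual keys used in the database.

```python
import numpy as np
from scipy.io import loadmat, savemat

# Write a small placeholder .mat file so the example is runnable.
# 257 channels matches the EGI system; the sample count and the
# variable names are hypothetical.
savemat('demo_subject.mat', {'eeg': np.zeros((257, 1000)),
                             'labels': np.array([1, 0, -1])})

# Read the file back and inspect its contents, skipping the
# '__header__' / '__version__' / '__globals__' metadata entries.
data = loadmat('demo_subject.mat')
eeg = data['eeg']                # channels x samples
labels = data['labels'].ravel()  # self-assessment ratings (-1/0/1)
print(eeg.shape, labels.tolist())
```

`loadmat` returns a dictionary mapping Matlab variable names to NumPy arrays, so iterating over `data.items()` is a quick way to discover the keys actually stored in each file.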
Citing the database
Please cite the following paper in your publications making use of the HR-EEG4EMO database:
H. Becker, J. Fleureau, P. Guillotel, F. Wendling, I. Merlet, and L. Albera, “Emotion recognition based on high-resolution EEG recordings and reconstructed brain sources”, submitted to IEEE Transactions on Affective Computing, 2016