EEG Signal Classification during Listening to Emotional Music

[Figure: Illustration of the locations of the 32 electrodes on the head.]


An approach to recognizing emotion responses during multimedia presentation using electroencephalogram (EEG) signals is proposed. The association between EEG signals and music-induced emotion responses was investigated with respect to three factors: 1) the type of features, 2) the temporal resolution of features, and 3) the EEG frequency components. The results showed that the spectral power asymmetry index of the EEG signal was a sensitive marker of brain activation related to emotion responses, especially in the low-frequency delta, theta, and alpha bands. In addition, a maximum classification accuracy of about 92.73% was obtained with a support vector machine (SVM) using 60 features derived from all EEG components at a feature temporal resolution of one second. These results could provide key clues for developing EEG-inspired multimedia applications in which content is offered interactively according to the user's immediate feedback.
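The feature pipeline described above can be illustrated with a minimal sketch: band power is estimated per left/right electrode pair over one-second segments, converted to an asymmetry index, and fed to an SVM. This is an assumed reconstruction, not the paper's implementation; the sampling rate, band edges, the (L − R)/(L + R) asymmetry formula, and the synthetic data are all illustrative choices.

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 256  # assumed sampling rate in Hz; the study's actual rate may differ
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}  # illustrative band edges

def band_power(x, fs, lo, hi):
    """Mean Welch PSD of signal x within [lo, hi) Hz."""
    f, pxx = welch(x, fs=fs, nperseg=fs)  # one-second analysis window
    mask = (f >= lo) & (f < hi)
    return pxx[mask].mean()

def asymmetry_features(left, right, fs=FS):
    """Spectral power asymmetry index (L - R) / (L + R) per band
    for one left/right electrode pair over a one-second segment."""
    feats = []
    for lo, hi in BANDS.values():
        pl = band_power(left, fs, lo, hi)
        pr = band_power(right, fs, lo, hi)
        feats.append((pl - pr) / (pl + pr))
    return feats

# Synthetic demo: two classes with opposite alpha (10 Hz) asymmetry.
rng = np.random.default_rng(0)
t = np.arange(FS) / FS
alpha_wave = np.sin(2 * np.pi * 10 * t)
X, y = [], []
for label in (0, 1):
    for _ in range(40):
        noise = rng.normal(size=(2, FS))
        left = noise[0] + (3 * alpha_wave if label else 0.0)
        right = noise[1] + (0.0 if label else 3 * alpha_wave)
        X.append(asymmetry_features(left, right))
        y.append(label)

clf = SVC(kernel="rbf")
scores = cross_val_score(clf, np.array(X), np.array(y), cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

In the paper, five bands across multiple electrode pairs yield the reported 60 features; the sketch uses a single pair (five features) only to keep the example short.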


Reference:

Yuan-Pin Lin, Chi-Hong Wang, Tien-Lin Wu, Shyh-Kang Jeng and Jyh-Horng Chen, "Support Vector Machine for EEG Signal Classification during Listening to Emotional Music," IEEE International Workshop on Multimedia Signal Processing (MMSP'08), Queensland, Australia, October 8-10, 2008.

pub/dsp/eeg_signal_classification_during_listening_to_emotional_music.txt · Last modified: 2009/04/09 22:21 by dspadmin