EEG-based emotion recognition in music listening

Yuan-Pin Lin, Chi-Hong Wang, Tzyy-Ping Jung*, Tien-Lin Wu, Shyh-Kang Jeng, Jeng-Ren Duann, Jyh-Horng Chen

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

498 Scopus citations

Abstract

Ongoing brain activity can be recorded as an electroencephalogram (EEG) to discover the links between emotional states and brain activity. This study applied machine-learning algorithms to categorize EEG dynamics according to subject self-reported emotional states during music listening. A framework was proposed to optimize EEG-based emotion recognition by systematically 1) seeking emotion-specific EEG features and 2) exploring the efficacy of the classifiers. A support vector machine was employed to classify four emotional states (joy, anger, sadness, and pleasure) and obtained an average classification accuracy of 82.29% ± 3.06% across 26 subjects. Further, this study identified 30 subject-independent features that were most relevant to emotional processing across subjects and explored the feasibility of using fewer electrodes to characterize the EEG dynamics during music listening. The identified features were primarily derived from electrodes placed near the frontal and the parietal lobes, consistent with many findings in the literature. This study might lead to a practical system for noninvasive assessment of emotional states in practical or clinical applications.
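The classification step described in the abstract can be illustrated with a minimal sketch: a multi-class SVM trained on feature vectors and scored by cross-validation. Everything below is hypothetical scaffolding — the synthetic "band-power" features, class separation, and SVM hyperparameters are placeholders, not the authors' actual data or pipeline; only the overall shape (30 features, four emotion classes, an RBF-kernel SVM) echoes the abstract.

```python
# Hedged sketch: SVM classification of synthetic "EEG feature" vectors
# into four emotion classes (joy, anger, sadness, pleasure).
# The data here is randomly generated for illustration only.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_features = 200, 30          # 30 features, echoing the abstract
X = rng.normal(size=(n_trials, n_features))
y = rng.integers(0, 4, size=n_trials)   # 4 emotion-class labels (0..3)
X += y[:, None] * 0.5                   # shift class means so classes are separable

# RBF-kernel SVM with default-style hyperparameters (assumed, not from the paper)
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

In a real pipeline, `X` would hold per-trial spectral features (e.g. band power per electrode) and cross-validation would typically be organized per subject rather than pooled as shown here.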

Original language: English
Article number: 5458075
Pages (from-to): 1798-1806
Number of pages: 9
Journal: IEEE Transactions on Biomedical Engineering
Volume: 57
Issue number: 7
DOIs
State: Published - Jul 2010

Keywords

  • EEG
  • emotion
  • machine learning
  • music

