A novel method for music retrieval by integrating content-based and emotion-based features

Cheng Che Lu, Vincent Shin-Mu Tseng

Research output: Contribution to journal › Article


Although a number of studies have addressed music retrieval, most have focused only on monophonic MIDI music. In practice, retrieval methods designed for monophonic music are unrealistic for popular music. Furthermore, current music retrieval techniques are insufficient for users who want to obtain music pieces that match their preferred emotions. Users may wish to specify both musical segments and musical emotions simultaneously when searching for music. In this paper, we propose a novel method called Integrated Music Information Retrieval (IMIR) that utilizes both content-based and emotion-based features to match users' needs. To retrieve music from large digital collections more efficiently, all music is transformed into the proposed music representations and then recorded in indexes. The experimental results show that the proposed method substantially outperforms existing methods in efficiency for content-based music retrieval, and we also show that it is highly effective for emotion-based music retrieval.
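The paper's own representations and index structures are not detailed in this abstract, so the following is only a minimal sketch of the general idea of combining a content index with an emotion index. The class name, the use of pitch-interval 3-grams as content keys, and the emotion labels are all illustrative assumptions, not the IMIR method itself.

```python
from collections import defaultdict

class MusicIndex:
    """Hypothetical combined content/emotion index (not the actual IMIR design)."""

    def __init__(self):
        # emotion label -> set of track ids
        self.emotion_index = defaultdict(set)
        # content key (a pitch-interval 3-gram) -> set of track ids
        self.content_index = defaultdict(set)

    def add(self, track_id, intervals, emotion):
        self.emotion_index[emotion].add(track_id)
        # index every 3-gram of pitch intervals as a content key
        for i in range(len(intervals) - 2):
            self.content_index[tuple(intervals[i:i + 3])].add(track_id)

    def query(self, intervals, emotion):
        # tracks matching the requested emotion
        candidates = self.emotion_index.get(emotion, set())
        # tracks sharing at least one content 3-gram with the query segment
        matched = set()
        for i in range(len(intervals) - 2):
            matched |= self.content_index.get(tuple(intervals[i:i + 3]), set())
        # intersection: both content and emotion must match
        return candidates & matched

idx = MusicIndex()
idx.add("song-a", [2, 2, -1, 2, 2], "happy")
idx.add("song-b", [2, 2, -1, 2, 2], "sad")
print(idx.query([2, 2, -1], "happy"))  # {'song-a'}
```

The intersection step reflects the abstract's premise that a query can constrain both a musical segment and a musical emotion at once; a real system would also rank the surviving candidates.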

Original language: English
Pages (from-to): 4077-4091
Number of pages: 15
Journal: International Journal of Innovative Computing, Information and Control
Issue number: 9
State: Published - 1 Sep 2010


  • Music indexing
  • Music representation
  • Music retrieval
  • Music searching

