Although many studies have addressed music retrieval, most have focused only on monophonic MIDI music, and retrieval methods designed for monophonic music are impractical for popular music. Moreover, current music retrieval techniques are insufficient for users who want to obtain pieces that match their preferred emotions. In practice, users may wish to specify both a musical segment and a musical emotion when searching for the music they need. In this paper, we propose a novel method called Integrated Music Information Retrieval (IMIR) that utilizes both content-based and emotion-based features for music retrieval in order to match users' needs. To retrieve music from large digital music collections more efficiently, we transform all music into the proposed representations and record them in indexes. The experimental results show that the proposed method substantially outperforms existing methods in terms of efficiency for content-based music retrieval, and that it is also highly effective for emotion-based music retrieval.
Number of pages: 15
Journal: International Journal of Innovative Computing, Information and Control
State: Published - 1 Sep 2010
- Music indexing
- Music representation
- Music retrieval
- Music searching