Music is closely tied to human affect, and several kinds of emotions may be embedded in a single musical work simultaneously. Consequently, how to extract emotions from music has been a prominent topic in music information retrieval over the past few decades, and a considerable number of multi-label learning studies have addressed the tagging of music emotions. In this paper, we conduct a comparative analysis of state-of-the-art methods for music emotion annotation through extensive experimental evaluation. The comparative experiments were performed on the real-world CAL500 dataset under several evaluation metrics. Moreover, to assess robustness, the compared algorithms, which are drawn from different annotation domains, were examined on both simple and complex types of emotions. The experimental results provide researchers with insights into algorithm design for music emotion annotation from a technical point of view.
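Since the abstract mentions evaluating multi-label emotion tagging with different metrics but does not specify them, the sketch below illustrates two metrics commonly used in this setting, Hamming loss and micro-averaged F1, on toy data. The emotion tags and annotations are purely hypothetical and are not taken from CAL500.

```python
# Illustrative sketch of two common multi-label evaluation metrics.
# Each song is encoded as a binary vector over a fixed emotion tag set;
# the tag set and labels here are invented for the example.

def hamming_loss(y_true, y_pred):
    """Fraction of individual label slots predicted incorrectly."""
    total = sum(len(row) for row in y_true)
    wrong = sum(t != p
                for ts, ps in zip(y_true, y_pred)
                for t, p in zip(ts, ps))
    return wrong / total

def micro_f1(y_true, y_pred):
    """F1 with true/false positives and negatives pooled over all labels."""
    tp = fp = fn = 0
    for ts, ps in zip(y_true, y_pred):
        for t, p in zip(ts, ps):
            tp += t and p
            fp += (not t) and p
            fn += t and (not p)
    denom = 2 * tp + fp + fn
    return 2 * tp / denom if denom else 0.0

# 3 songs, 4 hypothetical emotion tags: [happy, sad, calm, angry]
y_true = [[1, 0, 1, 0], [0, 1, 0, 0], [1, 1, 0, 1]]
y_pred = [[1, 0, 0, 0], [0, 1, 0, 1], [1, 1, 0, 1]]

print(hamming_loss(y_true, y_pred))  # 2 wrong slots out of 12 -> 0.1667
print(micro_f1(y_true, y_pred))      # tp=5, fp=1, fn=1 -> 0.8333
```

Lower Hamming loss and higher micro-F1 indicate better annotation quality; reporting several such metrics together, as the paper describes, guards against conclusions that hold under only one view of multi-label performance.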