This study presents an approach to analysing the inherent emotional content of polyphonic music signals and applies it to soundscape emotion analysis. The proposed real-time music emotion trajectory tracking systems are built with machine learning techniques and music signal processing, using a two-dimensional emotion plane integrated with an emotion taxonomy as the recognition model. Two training sets of 192 emotion-labelled excerpts each are collected, one of popular music and one of Western classical music. Five acoustical parameters (volume, onset density, mode, dissonance, and timbre) are extracted to characterise the music signal. Experimental results verify that different training sets lead to different decision boundaries in the two emotion recognition models. This study also proposes an approach to environmental sound design based on emotion recognition and psychoacoustics, focusing on the needs of various fields for commercial purposes or auditory atmosphere creation. The soundscape study evaluates how the emotion locus of selected urban soundscapes varies when they are blended with music signals. Simulating the playback of background music in authentic settings exploits the emotional characteristics of music to help people alter their emotional states and states of mind, and in turn to influence human behaviour and decision-making.
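The abstract does not specify the extraction algorithms, so the sketch below is only an illustrative assumption of how three of the five acoustical parameters might be computed from a signal: RMS energy as a proxy for volume, counting sharp energy rises as a proxy for onset density, and the spectral centroid as a proxy for timbre (mode and dissonance require pitch and harmony analysis and are omitted). None of this is the authors' implementation.

```python
import numpy as np

def extract_features(signal, sr=22050, frame_len=1024, hop=512):
    """Illustrative proxies for three acoustical parameters.

    volume        -> mean RMS energy over frames
    onset density -> sharp frame-energy rises per second
    timbre        -> spectral centroid (brightness proxy)
    """
    frames = [signal[i:i + frame_len]
              for i in range(0, len(signal) - frame_len, hop)]
    rms = np.array([np.sqrt(np.mean(f ** 2)) for f in frames])

    # Onset proxy: frames whose energy rises well above the typical change.
    diff = np.diff(rms)
    threshold = diff.mean() + diff.std()
    onsets = int(np.sum(diff > threshold))
    duration = len(signal) / sr
    onset_density = onsets / duration  # onsets per second

    # Timbre proxy: magnitude-weighted mean frequency of the spectrum.
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / sr)
    centroid = float(np.sum(freqs * spectrum) / np.sum(spectrum))

    return {"volume": float(rms.mean()),
            "onset_density": onset_density,
            "spectral_centroid": centroid}

# Usage: a steady 440 Hz tone, two seconds long.
sr = 22050
t = np.linspace(0, 2.0, 2 * sr, endpoint=False)
tone = 0.5 * np.sin(2 * np.pi * 440 * t)
feats = extract_features(tone, sr)
```

In a real-time tracker such features would be computed on a sliding window and fed to the trained model, yielding the emotion trajectory on the two-dimensional plane.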