Shot-by-shot segmentation of match video is essential for the fine-grained data annotation and collection that underpin strategic analysis. With deep-learning vision techniques, the shuttlecock trajectory can be extracted from broadcast video with an accuracy of around 78%. In this work, toward automatic labeling of badminton match video, we applied Artificial Neural Networks (ANNs) to match strategy data collection to speed up the labeling procedure. The proposed ANN was trained to detect badminton shot events, namely serving, hitting, and dead ball, from shuttlecock trajectories in the match video. With these detected shot events, strategy analysts can annotate strategic information more efficiently and significantly reduce labor costs.
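As a minimal illustration of the idea only (not the paper's actual model), the sketch below maps a fixed-length window of shuttlecock (x, y) positions to one of the three shot-event classes with a tiny feedforward network. The window length, hidden width, weight initialization, and feature encoding are all hypothetical assumptions for the sake of the example.

```python
import math
import random

EVENTS = ["serve", "hit", "dead_ball"]  # shot-event classes from the abstract
WINDOW = 8    # hypothetical: trajectory points per input window
HIDDEN = 16   # hypothetical hidden-layer width

random.seed(0)

def init_layer(n_in, n_out):
    """Small random weight matrix and zero biases (illustrative, untrained)."""
    w = [[random.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_out)]
    b = [0.0] * n_out
    return w, b

W1, b1 = init_layer(2 * WINDOW, HIDDEN)   # input: flattened (x, y) window
W2, b2 = init_layer(HIDDEN, len(EVENTS))  # output: one score per event class

def forward(trajectory_window):
    """Classify a window of (x, y) shuttlecock positions into a shot event."""
    x = [c for point in trajectory_window for c in point]  # flatten to 2*WINDOW
    h = [math.tanh(sum(w * v for w, v in zip(row, x)) + bi)
         for row, bi in zip(W1, b1)]
    scores = [sum(w * v for w, v in zip(row, h)) + bi
              for row, bi in zip(W2, b2)]
    return EVENTS[scores.index(max(scores))]

# Usage: a made-up, smoothly descending trajectory window
window = [(0.1 * t, 1.0 - 0.05 * t) for t in range(WINDOW)]
print(forward(window))
```

In the paper's setting, such a classifier would be trained on labeled trajectory windows; the event boundaries it predicts are then used to segment the match video shot by shot for the annotators.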