Indoor positioning has been intensively studied in recent years, driven by the rapidly growing demand for indoor mobile applications. While numerous works employ wireless signals or dead-reckoning techniques, wearable computing poses new opportunities as well as challenges for the localization problem. This research studies wearable localization by proposing a particle filter-based scheme that fuses the inputs of inertial and visual sensors worn on the human body. Specifically, the filter takes inertial measurements, wireless signals, visual landmarks, and indoor floor plans as inputs for location tracking. The inertial signals capture human body movement, the wireless signals indicate a rough absolute region inside a building, and the visual landmarks provide the relative angles from particular positions to these markers. In addition, a head-mounted display offers an intuitive and friendly user interface. The proposed system has been prototyped and tested on our campus, and the experiments demonstrate an average localization error of about one meter.
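The fusion scheme summarized above could be sketched as a minimal particle filter, under simplifying assumptions not taken from the paper: a 2D position state, a step-and-heading motion model driven by inertial measurements, a uniform prior over the rough region implied by wireless signals, and Gaussian noise on landmark bearing observations. All class and parameter names here are illustrative.

```python
import math
import random

class ParticleFilter:
    """Illustrative sketch: fuse step/heading, a wireless region prior,
    and landmark bearings; not the paper's exact design."""

    def __init__(self, n, region):
        # Initialize particles uniformly over the rough absolute region
        # indicated by wireless signals: (xmin, xmax, ymin, ymax).
        xmin, xmax, ymin, ymax = region
        self.particles = [
            (random.uniform(xmin, xmax), random.uniform(ymin, ymax))
            for _ in range(n)
        ]
        self.weights = [1.0 / n] * n

    def predict(self, step_len, heading, noise=0.1):
        # Propagate each particle by one detected step (from inertial
        # sensing), perturbed with Gaussian position noise.
        self.particles = [
            (x + step_len * math.cos(heading) + random.gauss(0, noise),
             y + step_len * math.sin(heading) + random.gauss(0, noise))
            for (x, y) in self.particles
        ]

    def update_bearing(self, landmark, observed_bearing, sigma=0.2):
        # Reweight particles by how well the bearing from the particle to
        # a known visual landmark matches the observed relative angle.
        lx, ly = landmark
        new_w = []
        for (x, y), w in zip(self.particles, self.weights):
            expected = math.atan2(ly - y, lx - x)
            # Wrap the angular error into (-pi, pi].
            err = math.atan2(math.sin(observed_bearing - expected),
                             math.cos(observed_bearing - expected))
            new_w.append(w * math.exp(-0.5 * (err / sigma) ** 2))
        total = sum(new_w) or 1.0
        self.weights = [w / total for w in new_w]

    def resample(self):
        # Stratified resampling to counter particle degeneracy.
        n = len(self.particles)
        positions = [(i + random.random()) / n for i in range(n)]
        cumulative, c = [], 0.0
        for w in self.weights:
            c += w
            cumulative.append(c)
        new_particles, i = [], 0
        for p in positions:
            while i < n - 1 and cumulative[i] < p:
                i += 1
            new_particles.append(self.particles[i])
        self.particles = new_particles
        self.weights = [1.0 / n] * n

    def estimate(self):
        # The location estimate is the weighted mean of the particles.
        x = sum(w * px for (px, _), w in zip(self.particles, self.weights))
        y = sum(w * py for (_, py), w in zip(self.particles, self.weights))
        return x, y
```

A typical cycle would call `predict` on each detected step, `update_bearing` whenever a landmark is recognized, then `resample` before reading off `estimate`. A real system would additionally reject particles that cross walls in the floor plan; that constraint is omitted here for brevity.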