Wearable technologies have attracted considerable attention in mobile computing. This paper proposes a localization system for wearable devices such as smart glasses. Existing approaches to localization, such as AoA, ToA, TDoA, GPS, and RF-based solutions, rely on auxiliary signals transmitted from external reference points. This limitation constrains the availability of location-based services (LBS) whenever a mobile device cannot receive these auxiliary signals; GPS, for example, is unavailable in indoor environments. This paper proposes a system for 'self-contained' localization, i.e., the capability of a device to determine its own location without relying on auxiliary signals, and demonstrates the feasibility of using an inertial measurement unit (IMU) and visual sensors in a smartphone to achieve this goal. Based on a concept called inertial sensor-assisted localization (ISAL), IMU sensors are triggered by user motions, and visual cues are taken from existing objects in the environment. Using augmented reality (AR) techniques, objects captured by the smartphone camera are 'tagged' manually or automatically by image-processing tools; the angles of these objects relative to the user are then measured by the IMU sensors. From these angles, we develop an angulation algorithm that determines the user's location.
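The angulation idea above can be illustrated with a minimal 2-D sketch: if the bearing angle from the user to each of several tagged objects at known positions is available, the user lies on the line through each object along that bearing, and the intersection of these lines (in least squares, for noisy angles) gives the position. This is only an illustration under assumed inputs, not the paper's actual algorithm; the function `angulate`, the planar model, and the landmark coordinates are assumptions introduced here.

```python
import math

def angulate(landmarks, bearings):
    """Estimate a 2-D position from bearing angles to known landmarks.

    Each bearing theta_i is the world-frame angle (radians) from the user
    toward landmark i, so the user lies on the line through landmark l_i
    with direction (cos theta_i, sin theta_i). Stacking the perpendicular
    constraints n_i . p = n_i . l_i, with n_i = (-sin theta_i, cos theta_i),
    and solving the 2x2 normal equations yields the least-squares position.
    """
    s_aa = s_ab = s_bb = s_ac = s_bc = 0.0
    for (lx, ly), th in zip(landmarks, bearings):
        nx, ny = -math.sin(th), math.cos(th)   # unit normal to the bearing line
        c = nx * lx + ny * ly                  # line offset n_i . l_i
        s_aa += nx * nx
        s_ab += nx * ny
        s_bb += ny * ny
        s_ac += nx * c
        s_bc += ny * c
    det = s_aa * s_bb - s_ab * s_ab
    if abs(det) < 1e-12:
        raise ValueError("bearings are (near-)parallel; position is ambiguous")
    # Solve the 2x2 normal equations (A^T A) p = A^T b by Cramer's rule.
    x = (s_bb * s_ac - s_ab * s_bc) / det
    y = (s_aa * s_bc - s_ab * s_ac) / det
    return x, y

# Example: a user at (1, 2) sees one landmark due east and one due north.
landmarks = [(5.0, 2.0), (1.0, 6.0)]
bearings = [math.atan2(0.0, 4.0), math.atan2(4.0, 0.0)]
print(angulate(landmarks, bearings))  # → (1.0, 2.0)
```

With more than two tagged objects the same least-squares formulation averages out IMU angle noise, which is why an over-determined set of visual cues is useful in practice.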