3D instrument reconstruction and tracking are critical steps in minimally invasive surgery. Several image-based methods have been proposed; to minimize damage to the patient's body, these systems rely on a single camera as the primary sensor for tracking and reconstructing the instrument, and consequently sacrifice performance. To improve performance, we integrate an inertial measurement unit (IMU) into the proposed system, for two reasons: first, the IMU can be mounted on the instrument without causing additional bodily damage; second, the IMU provides direct motion information for tracking. However, IMU measurements are far from perfect owing to gyroscope and accelerometer biases. We therefore propose to fuse the information from the camera system and the IMU system to estimate the position, velocity, and orientation of the instrument. An extended Kalman filter (EKF) is adopted to integrate the information from these different sources, compensate for the IMU biases, and track the instrument in a unified framework. Experimental results demonstrate the effectiveness of our method compared with purely image-based and purely IMU-based approaches.
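The fusion scheme described above can be sketched in a minimal one-dimensional form: an IMU-driven prediction step that carries position, velocity, and an accelerometer-bias state, followed by a camera position-measurement update. This is a simplified illustration, not the paper's implementation; the state layout, function names, noise parameters, and the 1D/linear setting (where the EKF reduces to a standard Kalman filter) are all assumptions made for clarity, and the full system additionally estimates orientation and gyroscope biases.

```python
import numpy as np

def predict(x, P, a_meas, dt, sigma_a=0.05, sigma_b=1e-3):
    """IMU-driven prediction. State x = [position, velocity, accel_bias].

    The model treats the true acceleration as (a_meas - bias), so the bias
    state is subtracted from the measured acceleration inside the dynamics.
    """
    F = np.array([[1.0, dt, -0.5 * dt**2],
                  [0.0, 1.0, -dt],
                  [0.0, 0.0, 1.0]])
    B = np.array([0.5 * dt**2, dt, 0.0])
    # Heuristic process noise: accelerometer noise mapped into position and
    # velocity, plus a slow random walk on the bias (illustrative values).
    Q = np.diag([(0.5 * dt**2 * sigma_a)**2,
                 (dt * sigma_a)**2,
                 (dt * sigma_b)**2])
    x = F @ x + B * a_meas
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z_pos, sigma_cam=0.01):
    """Camera update: the camera observes the instrument position directly."""
    H = np.array([[1.0, 0.0, 0.0]])
    R = np.array([[sigma_cam**2]])
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + (K @ (np.array([z_pos]) - H @ x)).ravel()
    P = (np.eye(3) - K @ H) @ P
    return x, P

# Demo: constant-velocity motion (1 m/s) with a +0.2 m/s^2 accelerometer bias.
dt, bias = 0.01, 0.2
x, P = np.array([0.0, 1.0, 0.0]), np.eye(3)
for k in range(5000):
    x, P = predict(x, P, 0.0 + bias, dt)      # biased IMU reading
    x, P = update(x, P, (k + 1) * dt * 1.0)   # camera sees the true position
```

Because the accelerometer bias is observable through its accumulating effect on position, the filter's bias state drifts toward the true bias over time, which is the mechanism by which the camera measurements compensate the IMU errors in the unified framework.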