Intelligent transportation systems have been widely studied for autonomous driving in recent years. To localize an autonomous vehicle and perceive its environment, Simultaneous Localization and Mapping (SLAM) constructs a model of an unknown environment while estimating sensor motion during movement. Major challenges for monocular SLAM include fast motion and image stitching on a dual-fisheye camera: the field of view of a typical monocular camera is limited to barely 50 to 70 degrees. For these reasons, this paper adopts dual back-to-back fisheye cameras to overcome the narrow field of view and capture 360-degree images for monitoring and SLAM applications. However, the fisheye lens introduces large distortion, and the relative angle of the two fisheye cameras causes image disparity during unwrapping and blending into a 360-degree panoramic image. This paper proposes an algorithm that calibrates, projects, stitches, and blends the two images to produce equirectangular panoramic images. The experimental results show that this algorithm is effective, fast (27.7 FPS), and overcomes the large scene changes across consecutive frames when the dual-fisheye camera is in motion, running on Xavier without using the graphics card. This fast and accurate image stitching method can also be applied to the bird's-eye view of a fast-moving autonomous vehicle.
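The projection step described above, mapping a dual back-to-back fisheye pair onto an equirectangular panorama, can be sketched as follows. This is a minimal illustration assuming an ideal equidistant (f·θ) fisheye model and an assumed 190-degree field of view per lens; the function name and parameters are illustrative, not the paper's calibrated pipeline.

```python
import math

def equirect_to_fisheye(u, v, out_w, out_h, fish_w, fish_h, fov_deg=190.0):
    """Map an equirectangular output pixel (u, v) to a source pixel in one
    of two back-to-back fisheye images, assuming an ideal equidistant
    fisheye model. Returns (lens_index, fx, fy). All names and the
    190-degree FOV are assumptions for illustration only."""
    # Equirectangular pixel -> spherical angles (longitude, latitude)
    lon = (u / out_w) * 2.0 * math.pi - math.pi      # [-pi, pi)
    lat = math.pi / 2.0 - (v / out_h) * math.pi      # [pi/2, -pi/2]
    # Spherical angles -> unit ray; +x is the front lens' optical axis
    x = math.cos(lat) * math.cos(lon)
    y = math.cos(lat) * math.sin(lon)
    z = math.sin(lat)
    lens = 0 if x >= 0.0 else 1
    if lens == 1:
        # Rotate the ray into the rear lens' coordinate frame
        x, y = -x, -y
    # Angle from the optical axis; equidistant model maps it linearly
    # to a radius in the fisheye image (r = 1.0 at the FOV boundary)
    theta = math.acos(max(-1.0, min(1.0, x)))
    r = theta / (math.radians(fov_deg) / 2.0)
    phi = math.atan2(z, y)
    fx = fish_w / 2.0 + r * (fish_w / 2.0) * math.cos(phi)
    fy = fish_h / 2.0 - r * (fish_h / 2.0) * math.sin(phi)
    return lens, fx, fy
```

Evaluating this mapping for every output pixel yields a lookup table that can be precomputed once, so per-frame stitching reduces to a remap plus blending in the overlap band where both lenses see the same rays (r near 1.0), which is one common way such pipelines reach real-time rates.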