Project Overview
This project implements a real-time stereo visual odometry system on the KITTI dataset. It estimates the camera trajectory by computing stereo disparity (depth) maps, detecting ORB features, tracking them with optical flow, and solving the Perspective-n-Point (PnP) problem for frame-to-frame motion estimation. The system visualizes the trajectory and saves it for further use in SLAM or mapping pipelines.
Key Features
- Stereo image rectification and disparity computation using OpenCV's StereoSGBM (a minimal parameter sketch follows this list).
- ORB feature detection and optical flow-based tracking using the Lucas-Kanade method.
- 3D point reconstruction using stereo depth and camera intrinsics.
- Pose estimation via PnP + RANSAC and real-time trajectory visualization.
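The disparity step can be sketched as follows, assuming the KITTI grayscale pairs are already rectified. The StereoSGBM parameter values shown here (number of disparities, block size, smoothness penalties) are illustrative assumptions, not this project's tuned settings:

```python
import cv2
import numpy as np

# Illustrative StereoSGBM parameters (assumed, not the project's tuned values).
block_size = 9
stereo = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=128,          # must be a multiple of 16
    blockSize=block_size,
    P1=8 * block_size ** 2,      # smoothness penalty for small disparity changes
    P2=32 * block_size ** 2,     # smoothness penalty for large disparity changes
    disp12MaxDiff=1,
    uniquenessRatio=10,
    speckleWindowSize=100,
    speckleRange=2,
)

def compute_disparity(left_gray, right_gray):
    # StereoSGBM returns 16-bit fixed-point disparities scaled by 16
    return stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
```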
Methodology
- Preprocessing: Load grayscale stereo image pairs from KITTI. Resize the images and scale the intrinsic parameters by the same factor.
- Disparity Computation: Generate disparity maps using StereoSGBM to estimate pixel-wise depth.
- Feature Detection & Tracking: Detect ORB features in the left image and track them into the next left frame using optical flow.
- 3D-2D Correspondences: Back-project tracked features into 3D points using the disparity map and the camera intrinsics.
- Pose Estimation: Solve the PnP problem with RANSAC and convert the resulting rotation and translation into a 4x4 transformation matrix.
- Trajectory Estimation: Update the global pose by chaining frame-to-frame transforms and plot the trajectory on a 2D canvas. Illustrative sketches of the tracking, correspondence, and pose-estimation steps follow this list.
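A minimal sketch of the detection-and-tracking step: ORB keypoints are detected in the previous left frame and tracked into the current left frame with pyramidal Lucas-Kanade flow. The feature budget and LK window settings are assumed values:

```python
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=2000)   # assumed feature budget

def detect_and_track(prev_left, curr_left):
    # Detect ORB keypoints in the previous left frame
    keypoints = orb.detect(prev_left, None)
    prev_pts = np.float32([kp.pt for kp in keypoints]).reshape(-1, 1, 2)

    # Track them into the current left frame with pyramidal Lucas-Kanade flow
    curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_left, curr_left, prev_pts, None,
        winSize=(21, 21), maxLevel=3,
        criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01),
    )

    # Keep only points that were tracked successfully
    good = status.ravel() == 1
    return prev_pts[good].reshape(-1, 2), curr_pts[good].reshape(-1, 2)
```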
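The correspondence step can be sketched as below, using the standard stereo relation Z = fx * baseline / d to back-project tracked points from the previous frame's disparity map. The function name and the `baseline` parameter are illustrative, and the intrinsics are assumed to be scaled consistently with any image resizing:

```python
import numpy as np

def build_correspondences(prev_pts, curr_pts, disparity, fx, fy, cx, cy, baseline):
    """Back-project tracked points from the previous left frame to 3D and
    pair them with their 2D locations in the current frame.

    If the input images are resized during preprocessing, fx, fy, cx, cy
    must be scaled by the same factor."""
    pts3d, pts2d = [], []
    for (u0, v0), (u1, v1) in zip(prev_pts, curr_pts):
        d = disparity[int(v0), int(u0)]
        if d <= 0:                        # skip invalid / unmatched disparities
            continue
        z = fx * baseline / d             # depth from disparity
        x = (u0 - cx) * z / fx
        y = (v0 - cy) * z / fy
        pts3d.append([x, y, z])
        pts2d.append([u1, v1])
    return np.float32(pts3d), np.float32(pts2d)
```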
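Finally, a sketch of pose estimation and trajectory accumulation, assuming `K` is the 3x3 left-camera intrinsic matrix and that the PnP solution maps previous-frame 3D points into the current camera frame, so the global pose is updated with its inverse. The RANSAC thresholds are illustrative:

```python
import cv2
import numpy as np

def estimate_pose(pts3d, pts2d, K):
    """Solve PnP with RANSAC and pack the result into a 4x4 transform."""
    if len(pts3d) < 6:                    # need enough correspondences for a stable solve
        return None
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        pts3d, pts2d, K, None,
        iterationsCount=100, reprojectionError=2.0, confidence=0.99,
    )
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)            # rotation vector -> rotation matrix
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = tvec.ravel()
    return T

# Illustrative usage: accumulate the global pose and plot the (x, z) components.
# K = np.array([[fx, 0, cx], [0, fy, cy], [0, 0, 1]], dtype=np.float64)
# cur_pose = np.eye(4)
# T = estimate_pose(pts3d, pts2d, K)
# if T is not None:
#     cur_pose = cur_pose @ np.linalg.inv(T)
# x, z = cur_pose[0, 3], cur_pose[2, 3]   # points for the 2D trajectory canvas
```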
Results
The final output includes real-time disparity map visualization, tracked keypoints, and the camera trajectory displayed as a 2D plot. The system demonstrates stable motion estimation across long KITTI sequences.
Conclusion
This stereo visual odometry pipeline provides a robust, real-time method for ego-motion estimation using passive vision. It is suitable for robot navigation, SLAM systems, and autonomous vehicle applications.