PROJECT TITLE:
Perception in Disparity: An Efficient Navigation Framework for Autonomous Vehicles With Stereo Cameras
Stereo cameras are widely employed in autonomous vehicles as environmental perception sensors because of their availability and low cost. However, efficiently utilizing the obtained disparity images to find a desirable local path for the vehicle still remains a difficult problem. In this paper, we present a novel navigation framework for autonomous vehicles equipped with stereo cameras, in which all of the perception and path planning tasks are performed directly in the disparity space. Compared with the popular 3-D counterpart, disparity is a projected geometric space that contains more primitive information directly computed from the stereo images. Furthermore, a disparity image is a more compact representation of a large field of 3-D Cartesian space, which makes perception and path planning at longer distances possible. The proposed framework is composed of three modules, namely, local disparity map building, slope analysis and obstacle detection, and path planning. Two important properties concerning motion and slope in disparity space are presented for the first time, i.e., the motion model and the slope model in disparity space. With the motion model, the framework first fuses consecutive disparity maps to construct a more reliable and complete local map. Then, a novel slope analysis method called V-Intercept is developed based on the slope model. It can efficiently analyze slopes and obstacles, generating a reasonable cost map directly from the disparity image. Finally, the obstacles in the cost map are expanded properly, and a customized A* search algorithm is performed to find a feasible path in disparity space. The experimental results show that our framework works well in various types of environments. The resulting system can efficiently perceive and plan over a much larger range and react to obstacles farther away than traditional Cartesian-based methods.
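The slope-and-obstacle analysis described above builds on the idea of a V-disparity representation: accumulating, for every image row, a histogram of the disparity values in that row. In this space the ground plane projects to a slanted line while vertical obstacles project to near-vertical segments, which is what makes row-wise slope analysis cheap. The sketch below shows only this generic V-disparity accumulation step, not the paper's V-Intercept method itself; the function name and the `max_disp` parameter are illustrative assumptions.

```python
import numpy as np

def v_disparity(disparity, max_disp=64):
    """Build a V-disparity histogram from a dense disparity image.

    Each output row v is a histogram (with max_disp unit-width bins)
    of the valid disparity values found in image row v. This is the
    generic representation that slope/obstacle analyses in disparity
    space, such as the V-Intercept method mentioned above, start from.
    """
    h, _ = disparity.shape
    vdisp = np.zeros((h, max_disp), dtype=np.int32)
    for v in range(h):
        row = disparity[v]
        # Keep only disparities inside the representable range.
        valid = row[(row >= 0) & (row < max_disp)]
        hist, _ = np.histogram(valid, bins=max_disp, range=(0, max_disp))
        vdisp[v] = hist
    return vdisp
```

A flat road seen by a forward-facing stereo camera produces disparities that decrease with image row height, so the dominant bins in `vdisp` trace a slanted line; fitting that line (and measuring where obstacle columns intercept it) is the basis for turning a disparity image into a cost map without reconstructing 3-D points.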