Journal: International Journal of Advanced Robotic Systems
Print ISSN: 1729-8806
Electronic ISSN: 1729-8814
Publication year: 2016
Volume: 13
DOI: 10.5772/64011
Language: English
Publisher: SAGE Publications
Abstract: State estimation is a fundamental necessity for any application involving autonomous robots. This paper describes a visual-aided inertial navigation and mapping system for autonomous robots. The system, which relies on Kalman filtering, is designed to fuse the measurements obtained from a monocular camera, an inertial measurement unit (IMU) and a position sensor (GPS). The estimated state consists of the full state of the vehicle: the position, the orientation, their first derivatives and the parameter errors of the inertial sensors (i.e., the biases of the gyroscopes and accelerometers). The system also provides the spatial locations of the visual features observed by the camera. The proposed scheme was designed with the limited resources commonly available in small mobile robots in mind, and it is intended for cluttered environments, where it performs fully vision-based navigation during periods when the position sensor is not available. Moreover, the estimated map of visual features would be suitable for multiple tasks: i) terrain analysis; ii) three-dimensional (3D) scene reconstruction; iii) localization, as well as detection of obstacles and generation of trajectories to navigate around them; and iv) autonomous exploration. In this work, simulations and experiments with real data are presented in order to validate and demonstrate the performance of the proposed system.
Keywords: Visual-based Navigation; Mapping; Autonomous Vehicles; Sensor Fusion; State Estimation
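
The abstract describes a Kalman-filter-based fusion of IMU, GPS and camera data, with a state vector that holds the vehicle position, velocity, orientation and the inertial sensor biases. The following is a minimal Python sketch of that kind of loosely coupled GPS/IMU extended Kalman filter, shown only to illustrate the state layout and the predict/update cycle; the class name, the simplified motion model and the noise tuning are assumptions for illustration and do not reproduce the authors' implementation (the camera feature states and the visual update are omitted).

```python
# Hypothetical sketch of a loosely coupled GPS/IMU EKF with the state layout
# [position(3), velocity(3), orientation(3), gyro bias(3), accel bias(3)].
# All models here (small-angle attitude integration, simplified Jacobian,
# constant process noise) are illustrative assumptions, not the paper's filter.
import numpy as np

STATE_DIM = 15
GRAVITY = np.array([0.0, 0.0, -9.81])


def rotation_from_euler(th):
    """Rotation matrix from roll-pitch-yaw angles (ZYX convention)."""
    cr, sr = np.cos(th[0]), np.sin(th[0])
    cp, sp = np.cos(th[1]), np.sin(th[1])
    cy, sy = np.cos(th[2]), np.sin(th[2])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx


class InsEkf:
    def __init__(self):
        self.x = np.zeros(STATE_DIM)        # state estimate
        self.P = np.eye(STATE_DIM) * 0.1    # state covariance
        self.Q = np.eye(STATE_DIM) * 1e-4   # process noise (tuning assumption)

    def predict(self, gyro, accel, dt):
        """Propagate the state with bias-corrected IMU measurements."""
        p, v, th = self.x[0:3], self.x[3:6], self.x[6:9]
        bg, ba = self.x[9:12], self.x[12:15]
        R = rotation_from_euler(th)
        acc_world = R @ (accel - ba) + GRAVITY
        self.x[0:3] = p + v * dt + 0.5 * acc_world * dt ** 2
        self.x[3:6] = v + acc_world * dt
        self.x[6:9] = th + (gyro - bg) * dt      # crude attitude integration
        # Covariance propagation with a simplified transition Jacobian.
        F = np.eye(STATE_DIM)
        F[0:3, 3:6] = np.eye(3) * dt
        self.P = F @ self.P @ F.T + self.Q * dt

    def update_gps(self, gps_pos, gps_cov):
        """Correct the state with an absolute position fix when GPS is available."""
        H = np.zeros((3, STATE_DIM))
        H[:, 0:3] = np.eye(3)                    # GPS observes position directly
        y = gps_pos - H @ self.x                 # innovation
        S = H @ self.P @ H.T + gps_cov
        K = self.P @ H.T @ np.linalg.inv(S)      # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(STATE_DIM) - K @ H) @ self.P
```

In this sketch the IMU drives the prediction step at a high rate, while GPS fixes correct the drift whenever they arrive; in the paper's system, camera observations of mapped features would additionally correct the state during GPS outages.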