Data fusion for localization and mapping


We design and evaluate a data fusion system for localization of a mobile skid-steer robot intended for USAR missions. We exploit a rich sensor suite including both proprioceptive (inertial measurement unit and track odometry) and exteroceptive sensors (omnidirectional camera and rotating laser rangefinder). To cope with the specific characteristics of each sensing modality (such as significantly differing sampling frequencies), we introduce a novel fusion scheme based on an extended Kalman filter that estimates orientation and position in six degrees of freedom.
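As a loose illustration of the multi-rate aspect of such a fusion scheme, the sketch below combines a high-rate proprioceptive stream with a slower exteroceptive one in a single Kalman filter, processing measurements in timestamp order and predicting forward to each measurement time. This is a minimal sketch, not the six-degrees-of-freedom error-state formulation from the paper: the state, motion model, rates, and all names (MultiRateEKF, H_vel, H_pos) are illustrative assumptions.

```python
import numpy as np

class MultiRateEKF:
    """Minimal Kalman filter illustrating asynchronous multi-rate fusion:
    a high-rate proprioceptive sensor drives frequent updates, while
    slower exteroceptive fixes are incorporated whenever they arrive.
    The state here is a simplified [position, velocity] along one axis,
    not the full 6-DOF pose used in the actual system."""

    def __init__(self, q=1e-3):
        self.x = np.zeros(2)   # state: [position, velocity]
        self.P = np.eye(2)     # state covariance
        self.q = q             # process noise intensity

    def predict(self, dt):
        # Constant-velocity motion model; dt varies with sensor timing.
        F = np.array([[1.0, dt],
                      [0.0, 1.0]])
        Q = self.q * np.array([[dt**3 / 3, dt**2 / 2],
                               [dt**2 / 2, dt]])
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q

    def update(self, z, H, R):
        # Standard Kalman update; each sensor supplies its own H and R.
        y = z - H @ self.x                   # innovation
        S = H @ self.P @ H.T + R             # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ H) @ self.P


# Sensors report at different rates: odometry at 100 Hz (observes
# velocity), a slower exteroceptive fix at 2 Hz (observes position).
# Events are merged and processed in timestamp order.
ekf = MultiRateEKF()
H_vel = np.array([[0.0, 1.0]])   # odometry measurement model
H_pos = np.array([[1.0, 0.0]])   # laser/visual fix measurement model

events = sorted(
    [(i * 0.01, "odom", np.array([1.0])) for i in range(1, 101)] +
    [(k * 0.5, "fix", np.array([k * 0.5])) for k in range(1, 3)]
)
t_prev = 0.0
for t, kind, z in events:
    ekf.predict(t - t_prev)
    t_prev = t
    if kind == "odom":
        ekf.update(z, H_vel, R=np.array([[0.01]]))
    else:
        ekf.update(z, H_pos, R=np.array([[0.05]]))

print("estimated [position, velocity]:", ekf.x)
```

The point of the sketch is the event loop: rather than forcing all sensors onto one clock, the filter predicts over the variable interval since the last measurement and then applies whichever sensor's update is due, which is how significantly differing sampling frequencies can be accommodated.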

V. Kubelka, L. Oswald, F. Pomerleau, F. Colas, T. Svoboda, and M. Reinstein. Robust data fusion of multi-modal sensory information for mobile robots. Journal of Field Robotics, 32(4), June 2015. [pdf]

Responsible person: Petr Pošík