Description: Unmanned Aerial Vehicles (UAVs) are expanding from open outdoor environments into more constrained indoor spaces thanks to recent improvements in the accuracy and precision of localization, navigation, and control algorithms. New possibilities emerge for deploying autonomous UAV swarms in indoor environments, driving the development of high-level mission-oriented algorithms. Our team is particularly interested in mapping and documenting historic buildings to assess the condition of ceilings, murals, statues, stained glass, etc.

A reliable position estimate of the UAV is needed for any indoor autonomous flight. Since global navigation satellite system (GNSS) services are generally unavailable indoors, the UAV must be localized using onboard sensors only. One type of sensor widely used for UAV localization is the passive stereo camera. While such cameras work well outdoors, their deployment indoors is problematic due to insufficient light. A solution is the structured-light approach, in which a near-infrared (IR) pattern of parallel lines is projected onto the scene and registered by two cameras mounted on a fixed baseline. Depth is then estimated from the deformation of the narrow-band pattern in the images.
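As an illustration of the geometry behind such a sensor (not part of the project statement), once the projected pattern yields a per-pixel disparity between the two cameras, depth follows the standard triangulation relation z = f·b/d. A minimal Python sketch, with purely illustrative focal-length and baseline values:

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Triangulate per-pixel depth z = f * b / d from a stereo disparity map."""
    disparity = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(disparity, np.nan)
    valid = disparity > 0  # zero disparity: no match found / point at infinity
    depth[valid] = focal_length_px * baseline_m / disparity[valid]
    return depth

# Illustrative values, not from any specific camera:
disp = np.array([[40.0, 20.0],
                 [ 0.0, 10.0]])  # disparities in pixels
z = depth_from_disparity(disp, focal_length_px=600.0, baseline_m=0.05)
# 40 px disparity with f = 600 px and b = 0.05 m gives z = 0.75 m
```

In a structured-light camera the narrow-band IR pattern mainly serves to make this disparity search reliable on texture-poor indoor surfaces (plaster walls, ceilings) where passive stereo matching would otherwise fail.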
The goal of this project is to develop a simultaneous localization and mapping (SLAM) system based on a structured-light depth camera. The system will read depth images from the camera, extract stable features, and match them against features in the simultaneously built map to estimate the current position of the UAV within that map. A loop-closure detection algorithm will be employed to correct the accumulated position drift when the UAV returns to a previously visited part of the map.
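One core step of such a pipeline, estimating the camera pose from correspondences between observed 3D features and map features, can be sketched with the Kabsch/Umeyama SVD alignment. This is only one possible building block under the assumption that data association has already been solved; the point names and synthetic data below are illustrative:

```python
import numpy as np

def estimate_pose(map_pts, obs_pts):
    """Kabsch/Umeyama: find the rigid (R, t) with map_pts[i] ~= R @ obs_pts[i] + t.

    map_pts, obs_pts: (N, 3) arrays of corresponding 3D feature positions.
    """
    mu_m = map_pts.mean(axis=0)                  # centroid of matched map features
    mu_o = obs_pts.mean(axis=0)                  # centroid of observed features
    H = (obs_pts - mu_o).T @ (map_pts - mu_m)    # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_m - R @ mu_o
    return R, t

# Synthetic check: a known rotation about the z-axis plus a translation.
obs = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.],
                [0., 0., 1.], [1., 1., 1.]])
c, s = np.cos(0.3), np.sin(0.3)
R_true = np.array([[c, -s, 0.], [s, c, 0.], [0., 0., 1.]])
t_true = np.array([0.5, -0.2, 1.0])
map_pts = obs @ R_true.T + t_true
R_est, t_est = estimate_pose(map_pts, obs)       # recovers R_true, t_true
```

In a full SLAM system this alignment would sit inside a robust loop (e.g. RANSAC over candidate correspondences), and the same machinery can verify loop-closure hypotheses before the drift correction is applied to the map.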