Multi-sensor (IR) data fusion for mobile robot navigation using occupancy grid method.
Abstract: The main topic of this thesis is multi-sensor data fusion in the context of mobile robot navigation. The work presented is part of ongoing research in the field of mobile robots. In that respect, a mobile robot platform with onboard manipulation capability has been developed as an experimental platform for a multi-sensor system for teleautonomous applications in an unstructured environment. Several types of sensors gather information about the environment: infrared (IR) range finders, vision, tactile, and position sensors. While vision and tactile sensing were addressed in other related work, this thesis aims to solve the following problems: (1) mobile robot navigation, in terms of electronic interfacing and computerized control with real-time range-data acquisition for obstacle avoidance, using a GUI (Graphical User Interface); (2) implementation of the occupancy grid method using Bayesian analysis for the case of an IR ranging sensor; (3) unstructured environment mapping, in the context of multi-sensor data fusion of mobile robot views taken from different given positions; (4) integration of the global map with other types of sensory information (vision).
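The occupancy grid method named in the abstract is commonly implemented as a Bayesian update of each grid cell, carried out in log-odds form so that successive sensor readings simply add. The sketch below is a minimal, hypothetical illustration of that idea for a single 1D IR beam; the inverse-sensor-model values `p_occ` and `p_free`, and the function names, are assumptions for illustration, not details taken from the thesis.

```python
import math

def logit(p):
    # Convert a probability to log-odds.
    return math.log(p / (1.0 - p))

def inv_logit(l):
    # Convert log-odds back to a probability.
    return 1.0 / (1.0 + math.exp(-l))

def update_ray(grid, hit_index, p_occ=0.7, p_free=0.3):
    """Bayesian log-odds update of a 1D occupancy grid for one range reading.

    Cells between the sensor and the measured range are updated as probably
    free; the cell at the measured range as probably occupied. p_occ and
    p_free are illustrative inverse-sensor-model values, not parameters
    taken from the thesis.
    """
    for i in range(hit_index):
        grid[i] += logit(p_free)    # beam passed through: evidence for "free"
    grid[hit_index] += logit(p_occ) # beam echoed here: evidence for "occupied"
    return grid

# One IR reading whose echo falls in cell 3 of a 5-cell grid
# (log-odds 0.0 corresponds to the uninformed prior P = 0.5).
grid = update_ray([0.0] * 5, hit_index=3)
```

Because the update is additive in log-odds, readings taken from different robot poses can be fused into one global map by transforming each beam into the global frame and applying the same per-cell update, which is what makes the representation convenient for the multi-view fusion the abstract describes.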
Collection: Theses, 1910 - 2010