Visual Localization and Mapping for Autonomous Robots. Techniques and Applications

Date: Fri, Sep 25 2009

Hour: 11:00

Location: Bizkaia Technology Park, Building 500 E-48160 DERIO - Basque Country- Spain

Speaker: Joan Solà

My talk at BCAM will focus on the problem of simultaneous localization and mapping (SLAM) in autonomous robotics, with special attention to visual perception. I will start by stating the general SLAM problem and presenting a well-known solution based on the Extended Kalman Filter (EKF). I will then expose the challenges of solving it using vision sensors alone, particularly in the case of monocular vision: a single camera has the benefits of simplicity and compact size, but the robot loses the ability to directly measure distances to objects, making the mapping problem much more challenging. I will introduce the severe non-linearity imposed by a projective sensor such as a camera, and explain why it poses enormous challenges for the EKF.
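To fix ideas, the EKF machinery underlying this family of SLAM solutions can be sketched as a standard predict/update cycle. The sketch below is illustrative only: it uses a hypothetical linear motion model and a generic measurement function, not the full 6DOF visual-SLAM state or the specific models discussed in the talk.

```python
import numpy as np

def ekf_predict(x, P, u, F, Q):
    """Propagate the state mean and covariance through a (linearized) motion model.

    x: state mean, P: state covariance, u: control input,
    F: motion Jacobian, Q: process-noise covariance.
    """
    x = F @ x + u            # predicted mean
    P = F @ P @ F.T + Q      # predicted covariance
    return x, P

def ekf_update(x, P, z, h, H, R):
    """Correct the prediction with a measurement z = h(x) + noise.

    h: measurement function, H: its Jacobian at x,
    R: measurement-noise covariance.
    """
    y = z - h(x)                        # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ y                       # corrected mean
    P = (np.eye(len(x)) - K @ H) @ P    # corrected covariance
    return x, P
```

In EKF-SLAM the state stacks the robot pose and all landmark parameters, so each update corrects robot and map jointly; the non-linearity problem mentioned above enters through the Jacobians `F` and `H`, which are only local approximations of the true models.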

I will highlight the differing degrees of maturity reached in the two sub-problems of 6DOF SLAM, localization and mapping, and show how SLAM can effectively be solved online when we are interested primarily in the localization part and can therefore afford a very simplistic map.

To increase the representational power of the map, I will show ways of parametrizing useful primitives that describe the robot's surroundings, such as points and line segments in 3D space, while always ensuring the linearity conditions the EKF needs to work.
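One well-known point parametrization that improves EKF linearity under monocular observation is the inverse-depth form, which stores the camera position at first sighting, the direction of the observation ray, and the inverse of the (initially unknown) depth. The sketch below converts such a point to Euclidean coordinates; it is an illustrative implementation with assumed angle conventions, not necessarily the exact formulation used in the talk.

```python
import numpy as np

# Inverse-depth point: [x0, y0, z0, azimuth, elevation, rho], where
# (x0, y0, z0) is the camera position when the point was first observed,
# (azimuth, elevation) encode the observation ray, and rho = 1/depth.
# Angle and axis conventions here are assumptions and vary in practice.

def ray_direction(azimuth, elevation):
    """Unit vector of the observation ray for the given angles."""
    return np.array([
        np.cos(elevation) * np.cos(azimuth),
        np.cos(elevation) * np.sin(azimuth),
        np.sin(elevation),
    ])

def inverse_depth_to_euclidean(p):
    """Convert an inverse-depth point to a Euclidean 3D point."""
    anchor = p[:3]                     # camera position at first sighting
    m = ray_direction(p[3], p[4])      # observation-ray direction
    rho = p[5]                         # inverse depth
    return anchor + m / rho
```

The appeal of this form is that a point at unknown, possibly very large depth is represented with `rho` near zero, where the projection equations remain close to linear, so the EKF can handle the feature from the first observation onward.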

Finally, I will present ongoing work on humanoid robotics that incorporates these techniques to observe the robot's 6DOF dynamics, yielding a state observer useful for real-time stabilization tasks based on visual perception. To this end, I will briefly review current stabilization and walking techniques and highlight the importance of incorporating a visual state observer into the control loop.
