Visual Servoing for Mobile Robots



Singularity-free Visual Servoing    

A novel approach for visual control of wheeled mobile robots has been proposed, extending existing works that use the trifocal tensor as a source of measurements. In this approach, the singularities typically encountered in this kind of method are removed by formulating the control problem in terms of the trifocal tensor and by using a virtual target, vertically translated from the real target.



A single controller is designed that regulates the robot pose towards the desired configuration without local minima. Additionally, the proposed approach is valid for perspective cameras as well as for catadioptric systems obeying a central camera model.
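The virtual-target idea can be illustrated with a minimal sketch (the pose representation and the offset `h` are illustrative assumptions, not values from the papers): lifting the target pose vertically out of the motion plane guarantees the robot's camera never coincides with it, which is what removes the degeneracy at the goal.

```python
import numpy as np

def virtual_target(T_target, h=0.5):
    """Return a virtual target pose translated vertically (world z-axis)
    from the real target. h is an illustrative offset, not a value from
    the papers. Because the robot moves in the plane, its camera centre
    can never coincide with the lifted virtual target, so the geometric
    degeneracy at the goal configuration is avoided."""
    T_virtual = T_target.copy()
    T_virtual[2, 3] += h  # shift only the z-component of the translation
    return T_virtual

# Example: a target at the origin of the motion plane, lifted by 0.5 m
T_goal = np.eye(4)
T_virt = virtual_target(T_goal, h=0.5)
```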

  1. H. M. Becerra, J. B. Hayet and C. Sagüés, "A Single Visual-Servo Controller of Mobile Robots with Super-Twisting Control," Robotics and Autonomous Systems, accepted, March 2014.
  2. H. M. Becerra, J. B. Hayet and C. Sagüés, "Virtual target formulation for singularity-free visual control using the trifocal tensor," Lecture Notes in Computer Science 7914, Proc. of the 5th Mexican Conference on Pattern Recognition (MCPR'13), J. A. Carrasco-Ochoa et al. (Eds.), Springer-Verlag, pp. 30--39, 2013.

Pose-Estimation-Based Visual Servoing     

We propose a new visual servoing scheme based on pose estimation to drive mobile robots to a desired location specified by a target image. The scheme recovers the camera location (position and orientation) using an Extended Kalman Filter (EKF) with the 1D trifocal tensor (TT) or the epipolar geometry as measurement model. An estimated-state feedback control law is designed to solve a tracking problem for the lateral and longitudinal robot coordinates. The desired trajectories to be tracked ensure total correction of both position and orientation with a single-step control law, although the orientation is an additional degree of freedom of the control system.




The interest of this approach is that it allows knowing the real-world path followed by the robot without the computational load introduced by position-based approaches. The effectiveness of the approach is tested via simulations and real-world experiments using a hypercatadioptric imaging system.
 1. Simulation of the pose-estimation-based control. 2. Real-world experiment of the pose-estimation-based control.
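One iteration of the estimation scheme described above can be sketched generically for a unicycle-like robot. The actual measurement model in the papers is built from the 1D trifocal tensor or the epipolar geometry; here it is abstracted into placeholder functions `h` and `H_jac`, and the noise covariances are illustrative assumptions.

```python
import numpy as np

def ekf_step(x, P, u, z, h, H_jac, Q, R, dt):
    """One EKF iteration for a unicycle robot, state x = [x, y, theta].
    h(x) is the measurement model (in the papers, elements of the 1D
    trifocal tensor or of the epipolar geometry; left generic here) and
    H_jac(x) its Jacobian. Q, R are the process and measurement noise
    covariances; u = (v, w) are the linear/angular velocity commands."""
    v, w = u
    # --- prediction with unicycle kinematics ---
    x_pred = x + dt * np.array([v * np.cos(x[2]), v * np.sin(x[2]), w])
    F = np.array([[1.0, 0.0, -dt * v * np.sin(x[2])],
                  [0.0, 1.0,  dt * v * np.cos(x[2])],
                  [0.0, 0.0,  1.0]])
    P_pred = F @ P @ F.T + Q
    # --- correction with the visual measurement z ---
    H = H_jac(x_pred)
    S = H @ P_pred @ H.T + R            # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(3) - K @ H) @ P_pred
    return x_new, P_new
```

The estimated state feeds the tracking control law, in place of the unmeasurable metric pose.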

  1. H. M. Becerra and C. Sagüés, "Exploiting the Trifocal Tensor in Dynamic Pose Estimation for Visual Control," IEEE Transactions on Control Systems Technology, Vol. 21, No. 5, pp. 1931--1939, September 2013.
  2. H. M. Becerra and C. Sagüés, "Dynamic Pose-Estimation from the Epipolar Geometry for Visual Servoing of Mobile Robots," IEEE International Conference on Robotics and Automation (ICRA'11), pp. 417--422, Shanghai, China, May 2011.
  3. H. M. Becerra and C. Sagüés, "Pose-Estimation-Based Visual Servoing for Differential-Drive Robots using the 1D Trifocal Tensor," IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS'09), pp. 5942--5947, St. Louis, MO, USA, October 2009.

1D Trifocal Tensor-Based Visual Servoing    

We have introduced an image-based approach to visual control of wheeled mobile robots that, for the first time, uses the elements of the 1D trifocal tensor directly in the control law. The visual control follows the usual teach-by-showing strategy without requiring any a priori knowledge of the scene and does not need any auxiliary image.
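As a rough illustration of the geometric entity the control law relies on, the 2x2x2 1D trifocal tensor can be estimated linearly from bearing correspondences in three views. This DLT-style sketch is a standard construction, not the authors' specific pipeline:

```python
import numpy as np

def estimate_1d_tt(u, v, w):
    """Linear estimation of the 1D trifocal tensor from bearings in
    three 1D views (a textbook DLT-style sketch). u, v, w are (N, 2)
    arrays of homogeneous 1D image points, N >= 7. Each correspondence
    gives one trilinear equation sum_{ijk} T_ijk u_i v_j w_k = 0, so
    the tensor is the null vector of the stacked system, recovered via
    SVD and normalized to unit norm (it is defined up to scale)."""
    A = np.stack([np.einsum('i,j,k->ijk', ui, vi, wi).ravel()
                  for ui, vi, wi in zip(u, v, w)])
    _, _, Vt = np.linalg.svd(A)
    T = Vt[-1].reshape(2, 2, 2)
    return T / np.linalg.norm(T)
```

Omnidirectional bearings map naturally to such 1D homogeneous points, which is why the tensor pairs well with central catadioptric cameras.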



The main contribution of this work is that the proposed two-step control law ensures total correction of both position and orientation without switching to any visual constraint other than the 1D trifocal tensor. The approach can be applied with any visual sensor obeying approximately a central projection model, presents good robustness to image noise, and avoids the short-baseline problem by exploiting the information of three views. We exploit the property of omnidirectional images of preserving bearing information, which is precisely what the 1D trifocal tensor encodes. The control law exploits the sliding mode control technique in a square system, ensuring stability and robustness of the closed loop.
 1. Simulation of the trifocal tensor-based control. 2. Real-world experiment of the trifocal tensor-based control.
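The sliding mode technique mentioned above can be sketched generically for a square two-input system (the gains, surfaces and Euler simulation below are illustrative assumptions, not the papers' exact design):

```python
import numpy as np

def smc_input(s, G, k=(0.3, 0.3)):
    """First-order sliding-mode law for a square system s_dot = G(x) u,
    where s is the 2-vector of sliding surfaces (in the papers, errors
    built from tensor elements; generic here). The switching term
    -K sign(s) drives both surfaces to zero in finite time and rejects
    bounded matched disturbances; G must be invertible away from
    singularities. Gains k are illustrative."""
    K = np.diag(k)
    return -np.linalg.solve(G, K @ np.sign(s))

# Illustrative closed loop: constant decoupling matrix, Euler integration
G = np.array([[1.0, 0.2],
              [0.0, 1.0]])
s = np.array([0.5, -0.4])
dt = 0.01
for _ in range(300):
    s = s + dt * (G @ smc_input(s, G))
# both surfaces reach a small neighbourhood of zero in finite time
```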

  1. H. M. Becerra, G. López-Nicolás and C. Sagüés, "Omnidirectional Visual Control of Mobile Robots based on the 1D Trifocal Tensor," Robotics and Autonomous Systems, Vol. 58, Issue 6, pp. 796--808, June 2010.
  2. H. M. Becerra and C. Sagüés, "A Novel 1D Trifocal Tensor-Based Control for Differential-Drive Robots," IEEE International Conference on Robotics and Automation (ICRA'09), pp. 1104--1109, Kobe, Japan, May 2009.

Epipolar-Based Visual Servoing     

We have proposed a novel control law based on sliding mode theory to perform visual servoing for mobile robots. The control law exploits the epipolar geometry extended to three views within an image-based visual servoing framework. The control strategy is performed in two steps: the first step achieves total correction of both orientation and lateral error, while the second step performs depth correction.



     Our main contributions on this topic are: 1) the control law performs orientation, lateral-error and depth correction with no auxiliary images and without switching to any visual constraint other than the epipolar geometry; 2) the control law deals with the singularities induced by the epipolar geometry in the first step while keeping the inputs always bounded; and 3) it is the first time that a robust control law is proposed to face uncertainty in the parameters of the vision system, so that the approach does not need a precise camera calibration. Moreover, the target location is reached even in the presence of image noise.
1. Simulation of the robust epipolar-based control. 2. Real-world experiment of the robust epipolar-based control.
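For illustration, the epipoles that feed such schemes can be extracted from a fundamental (or essential) matrix as its null vectors; a minimal sketch, independent of the papers' control design:

```python
import numpy as np

def epipoles(F):
    """Recover both epipoles from a rank-2 fundamental matrix F:
    F e1 = 0 gives the epipole in the first image and F.T e2 = 0 the
    one in the second. Each is the right null vector of the respective
    matrix, obtained via SVD and normalized to unit length. Only the
    extraction step is shown here, not the servoing law itself."""
    _, _, Vt = np.linalg.svd(F)
    e1 = Vt[-1]                       # right null vector of F
    _, _, Vt2 = np.linalg.svd(F.T)
    e2 = Vt2[-1]                      # right null vector of F^T
    return e1 / np.linalg.norm(e1), e2 / np.linalg.norm(e2)
```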

  1. H. M. Becerra, G. López-Nicolás and C. Sagüés, "A Sliding Mode Control Law for Mobile Robots based on Epipolar Visual Servoing from Three Views," IEEE Transactions on Robotics, Vol. 27, No. 1, pp. 175--183, February 2011.
  2. H. M. Becerra and C. Sagüés, "Sliding Mode Control for Visual Servoing of Mobile Robots using a Generic Camera," Chapter 12 in Sliding Mode Control, A. Bartoszewicz (Ed.), ISBN 978-953-307-162-6, INTECH, pp. 221--236, April 2011.
  3. H. M. Becerra and C. Sagüés, "A Sliding Mode Control Law for Epipolar Visual Servoing of Differential-Drive Robots," IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS'08), pp. 3058--3063, Nice, France, September 2008.



