Visual servoing for a robotic arm for the mine-detecting robot Marwa



    LUMS Mine Detector Project


Using visual information to control a robot (Hutchinson et al. 1996). Vision may or may not be used in the feedback loop.

Visual (image-based) features such as points, lines and regions can be used to, for example, enable the alignment of a manipulator / gripping mechanism with an object.

[Figure: Robot Movement and Vision System]


Can I move the manipulator so that the current image matches the reference image?

[Figure: current image vs. reference image]

Measurements: corners, lines, regions, corner features


Open-loop robot control

The extraction of the image information and the control of the robot are two separate tasks.

Once the information is extracted, a control sequence is generated and the robot moves blindly, assuming that there is no change in the environment. Vision information is extracted only once.

[Figure: Vision System producing a Control Sequence]


Visual Servoing (Hill & Park 1979)

Dynamic look-and-move systems

Control of the robot is done in two stages. The vision system provides the input to the robot controller, which in turn uses joint feedback to internally stabilize the robot. Visual information is extracted continuously.

[Figure: Vision System feeding the Robot Controller, which uses joint feedback]


Visual Servoing (Hill & Park 1979)

Direct visual servo systems

Here the visual controller directly computes the input to the robot joints, and the robot controller is eliminated altogether.

[Figure: Visual Controller driving the joints, with joint feedback]


Image-based visual servo systems: 2D image measurements are used directly. The aim is to reduce the error between a set of current and desired image features.

Position-based visual servo systems: 3D information about the scene is estimated with a known camera model. The control task is defined in 3D world coordinates.

Hybrid visual servo systems: a combination of the previous two approaches, also called 2½D visual servoing.
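
As background (not from the original slides), the classical image-based control law from the Chaumette and Hutchinson tutorials listed in the references drives the feature error to zero with a simple proportional law; the symbols below follow their notation:

```latex
% Classical image-based visual servoing law (Chaumette & Hutchinson, 2006).
% s : current image features        s^* : desired image features
% e : feature error                 v_c : commanded camera velocity
% \hat{L}_e^{+} : pseudo-inverse of the estimated interaction matrix
% \lambda : positive gain
e = s - s^{*}, \qquad v_c = -\lambda \, \hat{L}_e^{+} \, e
```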


Classification by number of cameras:

1 camera: Eye-In-Hand or Stand-Alone
2 cameras: Eye-In-Hand or Stand-Alone
More than 2 cameras: Redundant-Camera-System


Maintain a fixed distance and orientation w.r.t. the ground.

Two main tasks:
Visual perception for ground profiling
Arm joint control for obtaining the desired wrist configuration

[Figure: arm visual servoing loop with visual feedback and arm joint control]


The system is binocular, stand-alone, position-based, and uses dynamic look-and-move control.

[Figure: robot arm with sensor payload, vision system, Joint 1 and Joint 2]


Stereo Vision

Custom-built rig
2 Logitech C500 webcams
Total cost < $100
OpenCV library


Motivation

Stereo vision is used for 3D reconstruction of a scene captured simultaneously by 2 cameras. Depth information is not available from a single image.


Motivation

By capturing images of a scene from 2 viewpoints we can calculate the depth through triangulation. The depth of a point is inversely proportional to its disparity.


Camera calibration

Estimate the camera matrix and distortion coefficients, i.e. the following parameters:
The focal lengths of both cameras
Principal point offsets
Radial and tangential distortion coefficients

This is done by capturing images of a known 3D object and solving the equations of the pinhole camera model for the required unknowns.

[Figure: the calibration object]
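
The slides show no code; the following is a minimal sketch of how this step is typically done with the OpenCV library used in the project, assuming a chessboard target and hypothetical image file names:

```python
# Single-camera calibration sketch (OpenCV Python bindings).
# The 9x6 chessboard, the square size and the file names are assumptions,
# not details taken from the slides.
import glob
import cv2
import numpy as np

pattern = (9, 6)            # inner corners per row and per column
square = 0.025              # square size in metres

# 3D coordinates of the chessboard corners in the board's own frame
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_points, img_points = [], []
for fname in glob.glob("calib/left_*.png"):       # hypothetical file names
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Solve the pinhole-camera equations for the focal lengths, principal point
# offsets and radial/tangential distortion coefficients.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)
```

The same procedure is run for each of the two webcams; the reprojection error gives a quick sanity check on the result.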


[Figure: the calibration process]


Stereo calibration

After the calibration of the individual cameras, the stereo parameters must be estimated. These relate to the relative placement of the two cameras in space. The parameters include:
The translation vector
The rotation matrix
The essential matrix
The fundamental matrix

The procedure is the same as for single-camera calibration. OpenCV provides routines for both single-camera and stereo calibration.
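
As an illustration only, a stereo-calibration sketch with OpenCV; it reuses the per-camera results and corner lists from the previous sketch (all variable names are illustrative):

```python
# Stereo calibration sketch: estimates the rotation R, translation T,
# essential matrix E and fundamental matrix F between the two cameras.
# K1, dist1, K2, dist2, the corner lists and image_size are assumed to come
# from running the single-camera sketch on both webcams.
import cv2

flags = cv2.CALIB_FIX_INTRINSIC     # keep the single-camera intrinsics fixed
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 100, 1e-5)

rms, K1, dist1, K2, dist2, R, T, E, F = cv2.stereoCalibrate(
    obj_points, img_points_left, img_points_right,
    K1, dist1, K2, dist2, image_size,
    flags=flags, criteria=criteria)

print("Stereo RMS error:", rms)
print("Baseline (translation between cameras):", T.ravel())
```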


Image rectification for faster correspondences

Use the epipolar constraint to reduce the search space. We can even transform the images so that the epipolar lines are horizontal and the images are row-aligned.

[Figure: epipolar geometry]


Image rectification for faster correspondences

OpenCV provides 2 methods for image rectification:

Uncalibrated rectification: the stereo pair may not be calibrated; the calibration parameters are estimated along with the rest of the unknowns.

Calibrated rectification: the stereo pair is calibrated beforehand; more accurate than uncalibrated rectification.

As the stereo calibration parameters are available beforehand, we have used calibrated rectification, also known as Bouguet's method; a sketch follows.
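
A sketch of the calibrated (Bouguet) rectification path in OpenCV, continuing with the variable names used above (all illustrative):

```python
# Calibrated rectification sketch: compute rectification transforms from the
# stereo calibration results, then row-align each incoming frame pair.
import cv2

R1, R2, P1, P2, Q, roi1, roi2 = cv2.stereoRectify(
    K1, dist1, K2, dist2, image_size, R, T, alpha=0)

# Per-camera remapping tables, computed once and reused for every frame.
map1x, map1y = cv2.initUndistortRectifyMap(K1, dist1, R1, P1, image_size, cv2.CV_32FC1)
map2x, map2y = cv2.initUndistortRectifyMap(K2, dist2, R2, P2, image_size, cv2.CV_32FC1)

# After remapping, the epipolar lines are horizontal and row-aligned.
left_rect  = cv2.remap(left_frame,  map1x, map1y, cv2.INTER_LINEAR)
right_rect = cv2.remap(right_frame, map2x, map2y, cv2.INTER_LINEAR)
```

The Q matrix returned here is also what later turns the disparity map into a 3D point cloud.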

[Figure: some rectification results from local outdoor experiments]


Finding correspondences and generating the disparity maps

The disparity can be calculated easily once the images are row-aligned: it is the difference between the values xL and xR. Disparity is inversely proportional to depth.

d = xL - xR
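
Making the relationship explicit: for a rectified pair with focal length f and baseline B (symbols not on the slide, but standard), the depth Z of a matched point is

```latex
% Depth from disparity for a rectified stereo pair.
% f : focal length (pixels), B : baseline between the cameras,
% d = x_L - x_R : disparity of the matched point.
Z = \frac{f\,B}{d}
```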


Finding correspondences and generating the disparity maps

OpenCV provides 3 algorithms for correspondences:
Block matching
Semi-global block matching
Graph-cut algorithm

Block matching: matching through correlation. The correlation function is a simple Sum of Squared Differences (SSD) window. It does not find a lot of correspondences but gives results in real time.
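
A minimal block-matching sketch with OpenCV's StereoBM (shown with the current Python API; the project used the C/C++ API of its day):

```python
# Block-matching disparity sketch. Inputs are the rectified, row-aligned
# 8-bit grayscale images from the rectification step (names illustrative).
import cv2

# numDisparities must be a multiple of 16; blockSize must be odd.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)

disp16 = stereo.compute(left_rect, right_rect)   # fixed-point, scaled by 16
disparity = disp16.astype("float32") / 16.0      # disparity in pixels
```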

[Figure: disparity maps]


Generating the 3D point cloud

The disparity map can be used to obtain the point cloud with the help of the extrinsic and intrinsic camera parameters derived from the calibration process.
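
A sketch of this step, reprojecting the disparity map with the Q matrix obtained from cv2.stereoRectify (variable names follow the earlier sketches):

```python
# Point-cloud sketch: every pixel with a valid disparity becomes a 3D point
# expressed in the left camera frame.
import cv2

points = cv2.reprojectImageTo3D(disparity, Q)    # H x W x 3 array of (X, Y, Z)

mask = disparity > disparity.min()               # drop pixels with no match
cloud = points[mask]                             # N x 3 point cloud
print("Points in cloud:", cloud.shape[0])
```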

[Figure: the generated 3D point clouds]


Plane fitting through PCA

The point cloud can now be used to calculate the normal vector of the visible terrain. This vector will eventually be used to adjust the angle of the arm.

The normal is simply the singular vector with the smallest singular value.
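
A minimal sketch of the PCA plane fit via SVD (the function name and its use of NumPy are illustrative, not taken from the slides):

```python
# PCA plane fit: the terrain normal is the direction of least variance of the
# point cloud, i.e. the right singular vector with the smallest singular value.
import numpy as np

def fit_plane_normal(cloud):
    """cloud: N x 3 array of 3D points. Returns (unit normal, centroid)."""
    centroid = cloud.mean(axis=0)
    centered = cloud - centroid
    # Rows of Vt are the principal directions, sorted by singular value.
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    normal = Vt[-1]                                  # least-variance direction
    return normal / np.linalg.norm(normal), centroid
```

This normal is what the arm controller then uses to adjust the wrist angle, as the following slides describe.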


2-DoF P-R configuration

Sensory feedback through National Instruments hardware

[Figure: arm with sensor payload, vision system, Joint 1 and Joint 2]


Lab experimental setup


Sensors and Circuitry

Rotary encoder
Linear encoder
sbRIO (NI)
Power distribution
Motor drive (C Series module)
Interface


National Instruments Single-Board RIO (sbRIO)

Real-time processor
Reconfigurable FPGA
Analog and digital I/O, C Series connectivity
Stand-alone
Communication
Programmable with LabVIEW


Programming environment

LabVIEW 2010: graphical, real-time module, parallelism

Interfacing: OpenCV code with LabVIEW; sbRIO with PC

[Figure: program structure]


Main control loop

Simple on-off control (sketched below).

Two tasks:
Visual ground profiling through stereo
Joint motor control (critical)
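
The slides give no controller code; purely as an illustration of the on-off idea (the project implements this in LabVIEW on the sbRIO), a sketch with hypothetical I/O functions:

```python
# On-off (bang-bang) joint control illustration. Every I/O function here is a
# hypothetical placeholder; the real controller runs in LabVIEW on the sbRIO.
DEADBAND = 2.0        # tolerated joint error in degrees (assumed value)

def control_step(desired_angle, read_joint_angle, drive_joint):
    """One on-off step: drive the joint toward desired_angle."""
    error = desired_angle - read_joint_angle()
    if error > DEADBAND:
        drive_joint(+1)     # move in the positive direction
    elif error < -DEADBAND:
        drive_joint(-1)     # move in the negative direction
    else:
        drive_joint(0)      # inside the deadband: hold position

# In the real system the vision task updates desired_angle from the terrain
# normal, while this step runs at a fixed rate in the time-critical motor task.
```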


    The speed breaker experiment


References

Chaumette and Hutchinson (2006)
Chaumette and Hutchinson (2007)
Kragic and Christensen (?)
Learning OpenCV by Bradski and Kaehler
ni.com
