Cj CDC Poster


UAV Tracking with a Monocular Camera

Jian Chen and Darren Dawson

Department of Naval Architecture and Marine Engineering, University of Michigan, Ann Arbor, MI 48109, E-mail: [email protected]

Department of Electrical and Computer Engineering, Clemson University, Clemson, SC 29634

The four feature points, denoted by $O_i$, $i = 1, 2, 3, 4$, are considered to be coplanar and not collinear. From the geometry between the coordinate frames and the feature points, a Euclidean homography relating the two views can be obtained.

Each feature point on $\pi$ and $\pi^*$ will have a projected pixel coordinate expressed in terms of the image frame, denoted by $p_i(t) \triangleq [\,u_i(t) \;\; v_i(t) \;\; 1\,]^T$ for $\pi$ and $p_i^* \triangleq [\,u_i^* \;\; v_i^* \;\; 1\,]^T$ for $\pi^*$, where $u_i(t), v_i(t) \in \mathbb{R}$.
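As a concrete illustration of this projection, the pin-hole model maps a Euclidean feature point to pixel coordinates through the camera calibration matrix. The following sketch uses made-up calibration values and point coordinates (nothing here comes from the poster's data):

```python
import numpy as np

# Hypothetical intrinsic calibration matrix A (focal lengths and
# principal point are illustrative values only).
A = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# A feature point expressed in the camera frame: m_bar = [x, y, z]^T.
m_bar = np.array([0.2, -0.1, 2.0])

# Pin-hole projection: p = [u, v, 1]^T = A @ (m_bar / z).
p = A @ (m_bar / m_bar[2])
u, v = p[0], p[1]
print(u, v)  # pixel coordinates of the projected feature point
```

Dividing by the depth $z$ is what makes the projection nonlinear in the Euclidean coordinates.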

Remark: Since the ultimate bound can be made arbitrarily small by choosing the control gains large enough, the tracking errors can be driven to arbitrarily small ultimate values. Therefore, the following UAV can track the leading UAV with relative position and orientation arbitrarily close to the desired relative position and orientation $\bar x_f^d$ and $\bar R^d$, respectively.

To calculate the Euclidean homography from the pixel information, the projective homography is constructed as follows.

The Faugeras decomposition algorithm can be used to compute $\alpha_i(t)$, $\bar R(t)$, and $x_h(t)$ from the images of the four feature-point pairs on $\pi$ and $\pi^*$.
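Before a decomposition such as Faugeras' can be applied, the projective homography must be estimated from the four feature-point pairs. A minimal direct-linear-transformation (DLT) sketch in numpy, using synthetic correspondences rather than real image data (the decomposition step itself is omitted):

```python
import numpy as np

def dlt_homography(p_star, p):
    """Estimate H (up to scale) such that p_i ~ H p_i*, from >= 4
    pixel-coordinate pairs. p_star, p: (N, 2) arrays."""
    rows = []
    for (us, vs), (u, v) in zip(p_star, p):
        rows.append([us, vs, 1, 0, 0, 0, -u * us, -u * vs, -u])
        rows.append([0, 0, 0, us, vs, 1, -v * us, -v * vs, -v])
    # The homography entries span the null space of the stacked system.
    _, _, Vt = np.linalg.svd(np.array(rows, float))
    return Vt[-1].reshape(3, 3)

def apply_h(H, pts):
    """Apply a homography to (N, 2) pixel coordinates."""
    q = (H @ np.column_stack([pts, np.ones(len(pts))]).T).T
    return q[:, :2] / q[:, 2:3]

# Synthetic ground-truth homography and four non-collinear points.
G_true = np.array([[1.1,  0.02,  5.0],
                   [-0.01, 0.95, -3.0],
                   [1e-4, 2e-4,   1.0]])
pts = np.array([[100.0, 100.0], [400.0, 120.0],
                [380.0, 300.0], [120.0, 280.0]])

G_est = dlt_homography(pts, apply_h(G_true, pts))
G_est /= G_est[2, 2]  # fix the free scale for comparison
print(np.allclose(G_est, G_true / G_true[2, 2], atol=1e-6))
```

Four point pairs give eight linear constraints on the nine homography entries, which is why four coplanar, non-collinear features are the minimum required.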

The rotational error is defined as follows.
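Rotation-error constructions of this kind are typically built from the axis-angle representation of a rotation matrix. A sketch of the standard extraction (not necessarily the poster's exact parameterization, and valid only for 0 < θ < π):

```python
import numpy as np

def axis_angle(R):
    """Extract (theta, mu) with R = exp(theta * [mu]_x), 0 < theta < pi."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    # The off-diagonal differences of R give 2 * sin(theta) * mu.
    w = np.array([R[2, 1] - R[1, 2],
                  R[0, 2] - R[2, 0],
                  R[1, 0] - R[0, 1]])
    mu = w / (2.0 * np.sin(theta))
    return theta, mu

# Example: rotation of 0.3 rad about the z-axis.
c, s = np.cos(0.3), np.sin(0.3)
Rz = np.array([[c, -s, 0.0],
               [s,  c, 0.0],
               [0.0, 0.0, 1.0]])
theta, mu = axis_angle(Rz)
print(theta, mu)  # ~0.3 and ~[0, 0, 1]
```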

    The open-loop dynamics can be expressed as follows

Based on the structure of the open-loop error systems, the angular and linear velocity control inputs for the following UAV are designed as follows.

Theorem: Provided the linear and angular velocities of the leading UAV are bounded for all time $t \geq 0$, the control inputs ensure globally uniformly ultimately bounded (GUUB) position and orientation tracking in the sense that

$$\|e(t)\| \leq \varepsilon_0 \exp(-\varepsilon_1 t) + \varepsilon_2,$$

where $\varepsilon_0$, $\varepsilon_1$, and $\varepsilon_2$ are positive constants.
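The GUUB bound can be illustrated with a scalar analogue of the error dynamics: a stable first-order system driven by a bounded disturbance stays below an exponentially decaying envelope plus a constant offset. All gains and disturbance values below are illustrative:

```python
import numpy as np

def simulate(lam, beta_bar, e0=2.0, dt=1e-3, T=10.0):
    """Euler-integrate e' = -lam*e + beta(t), with |beta| <= beta_bar."""
    n = int(T / dt)
    t = np.arange(n) * dt
    e = np.empty(n)
    e[0] = e0
    for k in range(n - 1):
        beta = beta_bar * np.sin(3.0 * t[k])  # bounded disturbance
        e[k + 1] = e[k] + dt * (-lam * e[k] + beta)
    return t, e

t, e = simulate(lam=2.0, beta_bar=0.5)
# GUUB-style envelope: eps0*exp(-eps1*t) + eps2, with eps2 = beta_bar/lam.
bound = 2.0 * np.exp(-2.0 * t) + 0.5 / 2.0
print(np.all(np.abs(e) <= bound + 1e-6))
```

Increasing the gain shrinks the constant offset (here $\varepsilon_2 = \bar\beta/\lambda$), which is the mechanism behind the remark above that the ultimate bound can be made arbitrarily small.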

The objective is to develop a visual servo controller that ensures the following UAV, denoted by $\mathcal{F}$, tracks a leading UAV, denoted by $\mathcal{F}^*$, with a fixed relative translation $\bar x_f^d$ and a fixed relative orientation $\bar R^d$ between $\mathcal{F}$ and $\mathcal{F}^*$, in the sense that $\bar R(t) \to \bar R^d$ and $\bar x_f(t) \to \bar x_f^d$ as $t \to \infty$.

From a control point of view, the objective is to ensure that $\mathcal{F}$ coincides with a desired coordinate frame $\mathcal{F}_d$.

Assumption: The linear and angular velocities of the leading UAV are bounded.

Abstract

A visual servo tracking controller is developed in this paper for a monocular camera system mounted on an Unmanned Aerial Vehicle (UAV) to track a leading UAV with a fixed relative position and orientation. A theoretical framework is developed for a homography-based visual servo technique for the general case in which both the

Problem Formulation

    UAV Tracking Systems

    UAV coordinate systems

Vision System Model

Tracking Control Development

Conclusion

$\lambda_1, \lambda_2 \in \mathbb{R}$ denote positive constant control gains.

The relative rotation and translation between the two UAVs are defined in terms of $R(t)$, $x_f(t)$ and $R^*(t)$, $x_f^*(t)$ as

$$\bar R \triangleq R^* (R)^T, \qquad \bar x_f \triangleq x_f^* - \bar R x_f.$$

The Euclidean homography relating the feature points in the two views is

$$\bar m_i = \underbrace{\frac{z_i^*}{z_i}}_{\alpha_i} \underbrace{\left( \bar R + x_h n^{*T} \right)}_{H} \bar m_i^*, \qquad x_h \triangleq \frac{\bar x_f}{d^*}.$$

The rotation and translation tracking errors are defined as

$$e_\omega \triangleq \theta \mu - \theta^d \mu^d, \qquad e_v \triangleq x_h - x_h^d,$$

with $\theta\mu$ the axis-angle representation of the relative rotation. The angular and linear velocity control inputs for the following UAV are designed as

$$\omega_c = \lambda_1 e_\omega, \qquad v_c = -\lambda_2 e_v.$$

camera and the object are moving relative to an inertial coordinate frame. This technique can also be applied to visual estimation of the velocity or Euclidean structure of a moving object by a moving camera.

$$p_i = \alpha_i \underbrace{A H A^{-1}}_{G} p_i^*, \qquad p_i = \frac{A \bar m_i}{z_i}, \quad p_i^* = \frac{A \bar m_i^*}{z_i^*},$$

where $A$ is the constant, invertible intrinsic camera calibration matrix, and

$$p_i \triangleq [\,u_i \;\; v_i \;\; 1\,]^T, \qquad p_i^* \triangleq [\,u_i^* \;\; v_i^* \;\; 1\,]^T.$$

In this paper, the position and orientation of a UAV is forced to track a leading UAV with a fixed relative position and orientation. The impact of the development in this paper is that a new analytical approach, based on the relative position and orientation concept, enables the homography-based visual servo approach to be applied to a general vision system in which neither the camera nor the object is required to be stationary. Another contribution of the paper is that the following UAV does not require the 6-DOF velocity of the leading UAV, so the approach is decentralized.

$\theta(t) \in \mathbb{R}$ and $\mu(t) \in \mathbb{R}^3$ denote the axis-angle representation of $\bar R(t)$.

The information obtained by decomposing the Euclidean homography is used to develop a robust kinematic controller that eliminates the need for knowledge of the leading UAV's velocities.

A Lyapunov-based analysis is used to show that the proposed control strategy achieves globally uniformly ultimately bounded (GUUB) position and orientation tracking.
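The Lyapunov argument can be sketched on a scalar analogue: with $V = e^2/2$ along $\dot e = -\lambda e + \beta$ and $|\beta| \leq \bar\beta$, one gets $\dot V \leq -\lambda e^2 + |e|\bar\beta$, which is negative whenever $|e| > \bar\beta/\lambda$. A quick numeric check with illustrative values:

```python
import numpy as np

lam, beta_bar = 2.0, 0.4
# Sample error values just outside the residual ball |e| > beta_bar/lam.
es = np.linspace(beta_bar / lam + 1e-6, 5.0, 1000)
# Worst-case V-dot: e * (-lam*e + beta) maximized by beta = sign(e)*beta_bar.
vdot_worst = es * (-lam * es + beta_bar)
print(vdot_worst.max() < 0)  # True: V decreases outside the ball
```

Since $V$ decreases everywhere outside the residual ball, trajectories converge to (and remain in) that ball, which is exactly the GUUB property.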

Pin-hole camera model: each feature point projects onto the image plane as $p_i = A \bar m_i / z_i$ for $i = 1, 2, 3, 4$.

The open-loop error dynamics can be expressed as

$$\dot e_\omega = -\omega_c + \beta_1, \qquad d^* \dot x_h = v_c + d^* [x_h]_\times \omega_c + \beta_2,$$

where

$$\beta_1 = \bar R^T \omega_{cd}, \qquad \beta_2 = R^T v_{cd} - R^T \omega_{cd} \times \bar R \bar x_f,$$

and $\omega_{cd}(t)$ and $v_{cd}(t)$ denote the angular and linear velocities of the leading UAV, which are bounded for all $t \geq 0$.