
Autonomous Navigation of Unmanned Aerial Vehicles Based on Multi-Sensor Data Fusion

Abstract: During the development of Unmanned Aerial Vehicles (UAVs), one of the major concerns has been improving the accuracy, coverage, and reliability of the automatic navigation system within the imposed weight and cost limitations. Standard aerial navigation systems often rely on the Global Positioning System (GPS) and an Inertial Measurement Unit (IMU), alone or in combination. In aerial vehicles, the GPS signal can become unreliable, blocked, or jammed by intentional interference (especially for a GPS operating on civilian frequencies). On the other hand, a stand-alone IMU drifts with time and becomes unacceptable after a few seconds (especially for small aerial vehicles that use low-cost IMUs). In this respect, much research has been conducted to improve the efficiency and robustness of GPS/IMU navigation by visual aiding; this can be achieved by combining inertial measurements from an IMU with the position resulting from visual observations. This paper presents a method for multi-sensor based navigation of aerial vehicles that determines precise pose parameters of the vehicle in real time. In this context, a Vision-Based Navigation (VBN) system provides attitude and position observations to an Extended Kalman Filter (EKF) algorithm for precisely determining the pose parameters of the vehicle using an IMU motion model. The pose estimation strategy has been tested on a number of different sites, and experimental results demonstrate the feasibility and robustness of the proposed method.

Keywords: Autonomous Outdoor Navigation System, Extended Kalman Filter, Image Geo-referencing, Multi-Sensor Data Fusion, Strapdown Inertial Navigation, Vision-Based Navigation.

1. Introduction

Recently, Unmanned Aerial Vehicle (UAV) systems have begun to be used in civil applications such as surveying and mapping, disaster management, spatial information acquisition, data collection from inaccessible areas, and geophysical exploration [1].

During the development of UAVs, one of the major concerns has been improving the accuracy, coverage, and reliability of the automatic navigation system within the imposed weight and cost limitations. Nevertheless, the state of the art of these systems is still not able to guarantee a level of automation in the navigation system sufficient to authorize their use in different civilian application areas [2–6].

Airborne navigation systems can be divided into two categories: inertial (or dead-reckoning) navigation, and reference (or absolute) based navigation [7–9]. An Inertial Navigation System (INS) makes use of an Inertial Measurement Unit (IMU) to sense the vehicle's rotation rate and acceleration, which are used to obtain vehicle states such as position, velocity, and attitude; it also provides these at high data rates, which is crucial for guidance and control. However, its diverging error behaviour due to the integration process requires absolute sensors (e.g. the Global Positioning System (GPS)) in order to constrain the drift [3], [7–9]. Absolute sensors are categorized into two groups: beacon based and terrain based [7], [8]. The GPS is the most common beacon based navigation system, and there have been extensive research activities in the fusion of INS and GPS systems [10–15]. The GPS-aided INS provides long-term stability with high accuracy and has worldwide coverage in any weather condition. Its main drawback is its dependency on external satellite signals, which can be easily blocked or jammed by intentional interference [3], [7–9]. An alternative approach is passive estimation of the absolute position based on a geo-referenced image registration algorithm, which has been explored by another group of researchers [1], [3], [6], [16], [17]. The passive estimation strategy provides absolute vehicle states with high accuracy and has worldwide coverage. However, its accuracy is highly dependent on the accuracy of the database information stored on the platform, and on automatic registration despite differing capture times, viewing angles, and structures of the aerial images and reference data [18].

All navigation systems have their own shortcomings and disadvantages, but the fact that redundant measurements are available for pose estimation greatly enhances their reliability [9]. This paper presents a method for multi-sensor based navigation of aerial vehicles that determines precise pose parameters of the vehicle in real time. In this context, a Vision-Based Navigation (VBN) system provides attitude and position observations to an Extended Kalman Filter (EKF) algorithm that precisely determines the pose parameters of the vehicle using an IMU motion model. In the following, the concept of the proposed method, inertial navigation, vision-based pose estimation, and sensor fusion using the EKF are described. Finally, experiments and results obtained with this system are presented.

*Department of Surveying and Geomatics Engineering, College of Engineering, University of Tehran, Tehran, Iran, [email protected]
**Department of Surveying and Geomatics Engineering, College of Engineering, University of Tehran, Tehran, Iran, [email protected]

2. Proposed Method

The proposed method is divided into two main phases. The pre-mission phase is based on the extraction of salient features and descriptor vectors in the 3D object space (ortho-rectified image); in the during-mission phase, the VBN system provides attitude and position observations to the EKF algorithm for precisely determining the pose parameters of the vehicle using the IMU motion model. A schematic illustration of the proposed method is given in Fig. 1. In the following, the main components of each phase are described.

Fig. 1: Diagram of the proposed method [3]

2.1 Pre-mission Phase

Ortho-rectified images are an important layer of topographic information, generated from satellite imagery and a corresponding Digital Surface Model (DSM) by a rectification process [17]. A schematic illustration of the rectification process is given in Fig. 2.

Fig. 2: Diagram of the rectification process [17], [18]

2.1.1 Feature and Descriptor Extraction

This step is based on the extraction of salient features and descriptor vectors in the 3D object space. Significant points (salient points, region corners, line intersections, etc.) are understood as features here: points that are distinct, spread all over the image, and efficiently detectable in both image and object space [17].

The automatic selection of image points is based on the Scale Invariant Feature Transform (SIFT) algorithm proposed by Lowe (2004), which is invariant to image translation, scaling, and rotation, and partially invariant to illumination changes and affine 3D projection [19]. Finally, the derived coordinates and descriptor vectors of the features are stored in the geo-referenced database.
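As a rough sketch of how stored descriptor vectors can later be matched against descriptors from a new aerial image, the nearest-neighbour matching with Lowe's ratio test can be written as follows (function and array names are hypothetical; 128-dimensional SIFT descriptors are assumed):

```python
import numpy as np

def match_descriptors(query, database, ratio=0.8):
    """Nearest-neighbour matching with Lowe's ratio test.

    query    : (Nq, 128) array of SIFT descriptors from the aerial image
    database : (Nd, 128) array of descriptors from the geo-referenced database
    Returns a list of (query_index, database_index) candidate correspondences.
    """
    matches = []
    for i, d in enumerate(query):
        dists = np.linalg.norm(database - d, axis=1)  # L2 distance to all DB descriptors
        j1, j2 = np.argsort(dists)[:2]                # two closest neighbours
        if dists[j1] < ratio * dists[j2]:             # accept only unambiguous matches
            matches.append((i, int(j1)))
    return matches
```

The ratio test discards a candidate when its nearest and second-nearest database descriptors are almost equally close, which is exactly the ambiguous case that tends to produce outlying correspondences.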

2.2 During-mission Phase

The during-mission phase is divided into three main modules, namely the nonlinear process model, the linear observation model, and the estimation process. In the following, the main components of each module are described.

2.2.1 Nonlinear Process Model

The process model includes the vehicle dynamic model and can be written as a first-order vector difference equation in discrete time:

$$x(k) = f\big(x(k-1),\, u(k),\, w(k)\big) \qquad (1)$$

where $f$ is a nonlinear state transition function at time $k$, which forms the current state $x(k)$ from the previous state $x(k-1)$, the current control input $u(k)$, and the process noise $w(k)$, modelled as zero-mean Gaussian noise with covariance $Q(k)$.

The nonlinear vehicle model is a strapdown inertial navigation algorithm that computes the pose parameters of the vehicle from the inertial measurement inputs. In the earth-fixed local-tangent frame formulation, the vehicle model becomes [7], [8], [20–22]:

$$
\begin{aligned}
\Psi^n(k+1) &= \Psi^n(k) + E_b^n(k)\,\omega^b(k)\,\Delta t \\
V^n(k+1) &= V^n(k) + \big[C_b^n(k)\,f^b(k) + g^n\big]\,\Delta t \\
P^n(k+1) &= P^n(k) + V^n(k+1)\,\Delta t
\end{aligned}
\qquad (2)
$$

where $P^n(k)$, $V^n(k)$, and $\Psi^n(k)$ are the position, velocity, and attitude in the navigation frame, respectively, and $f^b(k)$ and $\omega^b(k)$ are the acceleration and rotation rates measured in the body frame. $C_b^n$ is the direction cosine matrix (DCM) and $E_b^n$ is the matrix which transforms the rotation rates in the body frame to Euler angle rates:

$$
C_b^n = \begin{bmatrix}
C_\psi C_\theta & C_\psi S_\theta S_\phi - S_\psi C_\phi & C_\psi S_\theta C_\phi + S_\psi S_\phi \\
S_\psi C_\theta & S_\psi S_\theta S_\phi + C_\psi C_\phi & S_\psi S_\theta C_\phi - C_\psi S_\phi \\
-S_\theta & C_\theta S_\phi & C_\theta C_\phi
\end{bmatrix},
\qquad
E_b^n = \begin{bmatrix}
1 & S_\phi S_\theta / C_\theta & C_\phi S_\theta / C_\theta \\
0 & C_\phi & -S_\phi \\
0 & S_\phi / C_\theta & C_\phi / C_\theta
\end{bmatrix}
\qquad (3)
$$

where $S$ and $C$ represent the sine and cosine of the subscripted Euler angle (roll $\phi$, pitch $\theta$, yaw $\psi$), respectively.
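A minimal numerical sketch of one strapdown integration step per Equations (2)–(3) follows. The function names, the NED navigation frame, and the +Z gravity convention are assumptions for illustration, not taken from the paper:

```python
import numpy as np

def dcm(phi, theta, psi):
    """Body-to-navigation direction cosine matrix C_b^n (roll, pitch, yaw)."""
    S, C = np.sin, np.cos
    return np.array([
        [C(psi)*C(theta), C(psi)*S(theta)*S(phi)-S(psi)*C(phi), C(psi)*S(theta)*C(phi)+S(psi)*S(phi)],
        [S(psi)*C(theta), S(psi)*S(theta)*S(phi)+C(psi)*C(phi), S(psi)*S(theta)*C(phi)-C(psi)*S(phi)],
        [-S(theta),       C(theta)*S(phi),                      C(theta)*C(phi)],
    ])

def euler_rate_matrix(phi, theta):
    """E_b^n: maps body rotation rates to Euler angle rates."""
    S, C, T = np.sin, np.cos, np.tan
    return np.array([
        [1, S(phi)*T(theta),  C(phi)*T(theta)],
        [0, C(phi),          -S(phi)],
        [0, S(phi)/C(theta),  C(phi)/C(theta)],
    ])

def strapdown_step(P, V, att, f_b, w_b, dt, g=np.array([0.0, 0.0, 9.81])):
    """One discrete strapdown update (Equation (2)); NED frame assumed, gravity along +Z."""
    phi, theta, psi = att
    att_new = att + euler_rate_matrix(phi, theta) @ w_b * dt   # attitude update
    V_new = V + (dcm(phi, theta, psi) @ f_b + g) * dt          # velocity update
    P_new = P + V_new * dt                                     # position update
    return P_new, V_new, att_new
```

Note that the accelerometer senses specific force, so for a stationary level vehicle it reads roughly $-g$ along the down axis and the gravity term in Equation (2) cancels it.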


2.2.2 Linear Observation Model

Various sensors can measure relative and absolute observations between the vehicle and the terrain. The observation model relates these observations to the state as:

$$z(k) = h\big(x(k),\, v(k)\big) \qquad (4)$$

where $h$ is the observation model at time $k$, and $v(k)$ is the observation noise, modelled as zero-mean Gaussian noise with covariance $R(k)$. In the following, we address a method for vision-based pose estimation, which estimates the absolute position and attitude of the platform by registering aerial images to the geo-referenced database in real time (Fig. 3).

Fig. 3: Diagram of the vision-based pose estimation algorithm

Firstly, the SIFT algorithm is used to detect and describe local features in the sensed images, which are invariant with respect to translation, rotation, and scale. Next, given a set of keypoints detected in the aerial images and the geo-referenced satellite imagery, a simple matching scheme based on nearest neighbours in the SIFT descriptor feature space is employed. This scheme considers only the SIFT descriptors and may produce outlying correspondences. Therefore, another method is needed to search for the true keypoint correspondences by imposing a geometric constraint on the potential matches. To efficiently reject the outliers, RANdom SAmple Consensus (RANSAC) [23] is used with the Direct Linear Transformation (DLT) equations [24], [25] as the geometric model. Finally, we use the well-known DLT equations for the mathematical transformation between the 2D image and the 3D object space:

$$
x = \frac{D_1 X + D_2 Y + D_3 Z + D_4}{D_9 X + D_{10} Y + D_{11} Z + 1},
\qquad
y = \frac{D_5 X + D_6 Y + D_7 Z + D_8}{D_9 X + D_{10} Y + D_{11} Z + 1}
\qquad (5)
$$

where the image coordinates $(x, y)$ and the geo-referenced object-space coordinates $(X, Y, Z)$ of the conjugate points can be measured after the registration procedure. The unknown Exterior Orientation Parameters (EOPs) consist of the position of the projection centre of the camera in the ground coordinate system and the elements of the rotation matrix between the camera and the geo-referenced object space. These EOPs are extracted from the DLT parameters (Equation (6)). The DLT parameters are estimated from conjugate points measured in both the aerial image and the geo-referenced database. Since two DLT equations can be written for each conjugate pair of points, at least six well-distributed conjugate points are required to estimate the eleven unknown DLT parameters from twelve equations using a single-image resection algorithm [25]. The introduction of more than six points increases the redundancy, gives an RMSE measure for the estimated parameters, and strengthens the solution [3].
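The single-image resection can be sketched as a linear least-squares problem: multiplying Equation (5) through by its denominator makes each conjugate point contribute two equations linear in $D_1 \ldots D_{11}$. The function names below are assumptions for illustration:

```python
import numpy as np

def dlt_resection(obj_pts, img_pts):
    """Estimate the 11 DLT parameters from >= 6 conjugate points (Equation (5)).

    obj_pts : (N, 3) geo-referenced ground coordinates (X, Y, Z)
    img_pts : (N, 2) image coordinates (x, y)
    """
    rows, rhs = [], []
    for (X, Y, Z), (x, y) in zip(obj_pts, img_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -x*X, -x*Y, -x*Z]); rhs.append(x)
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -y*X, -y*Y, -y*Z]); rhs.append(y)
    D, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return D  # D[0:11] correspond to D1..D11

def dlt_project(D, obj_pts):
    """Project ground points into the image with Equation (5)."""
    X, Y, Z = np.asarray(obj_pts).T
    den = D[8]*X + D[9]*Y + D[10]*Z + 1
    x = (D[0]*X + D[1]*Y + D[2]*Z + D[3]) / den
    y = (D[4]*X + D[5]*Y + D[6]*Z + D[7]) / den
    return np.column_stack([x, y])
```

With more than six points the system is overdetermined, and the least-squares residuals supply the RMSE measure mentioned above.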

$$
\begin{bmatrix} X_C \\ Y_C \\ Z_C \end{bmatrix}
= -\begin{bmatrix} D_1 & D_2 & D_3 \\ D_5 & D_6 & D_7 \\ D_9 & D_{10} & D_{11} \end{bmatrix}^{-1}
\begin{bmatrix} D_4 \\ D_8 \\ 1 \end{bmatrix},
\qquad
\begin{aligned}
\kappa &= \tan^{-1}\!\left(D_{10}/D_{11}\right) \\
\varphi &= \sin^{-1}\!\left(D_9\Big/\sqrt{D_9^2+D_{10}^2+D_{11}^2}\right) \\
\omega &= \cos^{-1}\!\left(\frac{D_1 - x_0 D_9}{c\,\cos\varphi\,\sqrt{D_9^2+D_{10}^2+D_{11}^2}}\right)
\end{aligned}
\qquad (6)
$$

In the above equation, $x_0$, $y_0$ and the camera constant $c$ are defined as follows:

$$
\begin{aligned}
x_0 &= \frac{D_1 D_9 + D_2 D_{10} + D_3 D_{11}}{D_9^2+D_{10}^2+D_{11}^2}, &
c_x &= \sqrt{\frac{D_1^2+D_2^2+D_3^2}{D_9^2+D_{10}^2+D_{11}^2} - x_0^2}, \\
y_0 &= \frac{D_5 D_9 + D_6 D_{10} + D_7 D_{11}}{D_9^2+D_{10}^2+D_{11}^2}, &
c_y &= \sqrt{\frac{D_5^2+D_6^2+D_7^2}{D_9^2+D_{10}^2+D_{11}^2} - y_0^2}, \\
& & c &= (c_x + c_y)/2
\end{aligned}
\qquad (7)
$$

The EOPs estimated by the geo-referencing process determine accurate angles and positions of the camera station with respect to the geo-referenced object coordinate system. Since the navigation angles, defined according to the aviation standard norm "ARINC 705", differ from those used in geo-referencing, a transformation must be applied to convert the EOPs resulting from the DLT equations into quantities that can be fused with the IMU measurements [26]. Fig. 4 schematically demonstrates the different coordinate systems used in geo-referencing and pose estimation.


Fig. 4: Schematic definition of the coordinate systems used for vehicle pose estimation [3]
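The extraction of $x_0$, $y_0$, $c$ and the projection centre from the DLT parameters (Equations (6)–(7)) can be sketched numerically as follows. The function name and the 11-vector layout are assumptions:

```python
import numpy as np

def dlt_to_eop(D):
    """Recover interior elements and the projection centre from the 11 DLT
    parameters (Equations (6)-(7)). D is the vector (D1..D11)."""
    L2 = D[8]**2 + D[9]**2 + D[10]**2                      # D9^2 + D10^2 + D11^2
    x0 = (D[0]*D[8] + D[1]*D[9] + D[2]*D[10]) / L2         # principal point x
    y0 = (D[4]*D[8] + D[5]*D[9] + D[6]*D[10]) / L2         # principal point y
    cx = np.sqrt((D[0]**2 + D[1]**2 + D[2]**2) / L2 - x0**2)
    cy = np.sqrt((D[4]**2 + D[5]**2 + D[6]**2) / L2 - y0**2)
    c = (cx + cy) / 2                                      # mean camera constant
    M = np.array([[D[0], D[1], D[2]],
                  [D[4], D[5], D[6]],
                  [D[8], D[9], D[10]]])
    centre = -np.linalg.solve(M, np.array([D[3], D[7], 1.0]))  # projection centre
    return x0, y0, c, centre
```

The projection centre is the point where both numerators and the denominator of Equation (5) vanish, which gives the linear system solved in the last step.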

2.2.3 Estimation Process

The fusion filter maintains an estimate of the vehicle pose by combining the vision-based measurements with the IMU dynamic state. The Kalman filter, one of the most widely used fusion filters in aerial navigation applications, is an efficient approximation of the Bayesian recursive filter that estimates the state of a linear dynamic system from a series of noisy measurements [3], [7–9]. Because the mechanization equation of the IMU is nonlinear, we use the EKF algorithm to estimate the pose parameters of the vehicle. The EKF algorithm is recursive and is broken into two steps, namely prediction and update.

In the prediction step, the vehicle pose parameters are predicted forward in time using Equation (1) and Equation (2) with data supplied by the inertial sensors. The state covariance $P$ is propagated forward via [7], [8]:

$$P(k|k-1) = \nabla F_x\, P(k-1|k-1)\, \nabla F_x^T + \nabla F_w\, Q(k)\, \nabla F_w^T \qquad (8)$$

where $\nabla F_x$ and $\nabla F_w$ are the Jacobians of the state transition function with respect to the vehicle state $\hat{x}(k)$ and the noise input $w(k)$, respectively.

In the update step, the observation model runs at discrete time steps to correct the process model's estimates using visual aiding, which relates to some elements of the state vector, namely position and attitude. Therefore, by comparing predicted values of the measurement vector with actual measurements from the VBN system, the EKF algorithm maintains the estimates of the IMU position and attitude via [7], [8]:

$$\hat{x}(k|k) = \hat{x}(k|k-1) + W(k)\,\nu(k) \qquad (9)$$

where the gain matrix $W(k)$ and the innovation $\nu(k)$ are calculated as:

$$
\begin{aligned}
\nu(k) &= z(k) - h\big(\hat{x}(k|k-1)\big) \\
S(k) &= \nabla h_x\, P(k|k-1)\, \nabla h_x^T + R(k) \\
W(k) &= P(k|k-1)\, \nabla h_x^T\, S(k)^{-1}
\end{aligned}
\qquad (10)
$$

where $\nabla h_x(k)$ is the Jacobian of the observation model with respect to the predicted state vector $\hat{x}(k|k-1)$. The state covariance $P(k|k)$ after the observation is updated via [7], [8]:

$$
P(k|k) = \big[I - W(k)\nabla h_x(k)\big]\, P(k|k-1)\, \big[I - W(k)\nabla h_x(k)\big]^T + W(k)\, R(k)\, W(k)^T
\qquad (11)
$$
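A compact sketch of one EKF prediction-update cycle per Equations (8)–(11) follows. This is a generic toy implementation with precomputed Jacobians, not the authors' code; all names are assumptions:

```python
import numpy as np

def ekf_step(x, P, f, F_x, F_w, Q, z, h, H, R):
    """One EKF cycle: predict with the process model (Equations (1), (8)),
    then correct with an observation (Equations (9)-(11)).
    F_x, F_w, H are the Jacobians evaluated at the current estimate."""
    # Prediction
    x_pred = f(x)
    P_pred = F_x @ P @ F_x.T + F_w @ Q @ F_w.T
    # Update (Joseph form of Equation (11), numerically stable)
    nu = z - h(x_pred)                    # innovation
    S = H @ P_pred @ H.T + R              # innovation covariance
    W = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + W @ nu
    I_WH = np.eye(len(x)) - W @ H
    P_new = I_WH @ P_pred @ I_WH.T + W @ R @ W.T
    return x_new, P_new
```

In the paper's setting, `f` would be the strapdown mechanization of Equation (2) driven by the IMU, and `z`/`h` the position and attitude delivered by the vision-based geo-referencing step.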

3. Experiments and Results

The potential of the proposed method was evaluated through comprehensive experimental tests conducted on a wide variety of datasets. In this paper, the tests are conducted on a geo-referenced database consisting of geo-referenced panchromatic IRS-P5 imagery with a ground resolution of about 2.5 m. The test images were taken over the city of Isfahan, Iran, in 2008 (Fig. 5). The relief variation in the area is in the range of 1550 m to 1980 m above sea level. The CCD chip of the aerial video camera is 2048×2048 pixels with a 0.6 m pixel size. The gyroscope bias is ±1 deg/s, the accelerometer bias is ±25 mg, the scale-factor error of both is 1%, and the bandwidth is 40 Hz. Fig. 5 shows the satellite imagery and the aerial camera images.

Fig. 5: Reference geo-database used for the evaluation of the proposed method. a) IRS-P5 geo-panchromatic image; b, c, d) images captured by the aerial vehicle's camera

In the image registration procedure, while searching for point correspondences, an initial estimate of the camera position (provided by auxiliary data from the IMU) is used simultaneously to decrease the matching search region. Moreover, to keep the image registration algorithm robust when a scene does not have enough information content, a mosaic of several images is used to provide more information.
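The search-region restriction described above can be sketched as a simple distance gate on the ground coordinates of the database keypoints; the function name, array layout, and radius parameter are assumptions:

```python
import numpy as np

def restrict_search_region(db_xy, predicted_xy, radius):
    """Keep only database keypoints whose ground coordinates lie within
    `radius` of the IMU-predicted camera footprint centre.

    db_xy        : (N, 2) ground (X, Y) coordinates of database keypoints
    predicted_xy : (2,) predicted footprint centre from the IMU
    Returns the indices of the keypoints to be considered for matching.
    """
    d = np.linalg.norm(db_xy - predicted_xy, axis=1)  # planimetric distance
    return np.flatnonzero(d <= radius)
```

Gating the candidates this way both speeds up the nearest-neighbour search and removes distant, potentially ambiguous matches before RANSAC runs.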


Fig. 6 and TABLE I show the results of the image registration scheme using the RANSAC algorithm.

Fig. 6: Corresponding keypoints after RANSAC image registration (panels a–f)

TABLE I: Results of the Image Registration Module Using SIFT Feature Extraction and the RANSAC Algorithm

Aerial Image | Image Keypoints | Satellite Keypoints | Matched Points | RMSE (X Axis) | RMSE (Y Axis)
1            | 5084            | 4109                | 107            | 0.47          | 0.41
2            | 6527            | 4041                | 158            | 0.45          | 0.47
3            | 8550            | 5121                | 185            | 0.48          | 0.40

After the image registration process, the pose parameters of the vehicle are determined using at least six well-distributed conjugate points. Then, the IMU data are filtered by the EKF to determine the 3D position and attitude of the vehicle. As reference data, we compare the outcome of the proposed system with GPS/IMU results for the same trajectory. Fig. 7 illustrates the relative error of the proposed system in comparison with GPS/IMU. The VBN position remained close to the GPS position over the whole path, which indicates that the system can be used as an alternative approach when GPS fails. A maximum error of ±5 m in position and ±3 deg in attitude was achieved by this system.

Fig. 7: Relative error of the proposed system in comparison with GPS/IMU. a, b, c) Three components of position error (North, East, and Height); d, e, f) three components of attitude error (Roll, Pitch, and Yaw).

As can be seen from the position error plot in Fig. 7, the position errors increase when the image registration algorithm is not robust enough because of weak or repetitive texture. Therefore, the accuracy of the proposed system is highly dependent on the accuracy of image registration, which in turn derives from the accuracy of the image matching, the geo-referenced database, and the mathematical models.

4. Conclusion

This paper presented a new approach for multi-sensor based navigation that affords reliable estimates of the pose parameters. In the proposed methodology, an algorithm robustly aligns an aerial image to a geo-referenced ortho satellite image while realistically updating the observation model in an EKF algorithm. The method uses SIFT feature based registration, which is invariant with respect to translation, rotation, and scale. Furthermore, the system uses local features and is therefore robust to changes in the scene. The implemented methodology has proved to be efficient and reliable for automatic navigation of aerial vehicles, and its accuracy is comparable with a GPS/IMU system. However, the accuracy of the proposed method is highly dependent on the accuracy of image registration, which derives from the accuracy of the image matching, the geo-referenced database, and the mathematical models.


References

[1] B. Kamel, M. C. S. Santana, and T. C. De Almeida, “Position estimation of autonomous aerial navigation based on Hough transform and Harris corners detection,” in Proceedings of the 9th WSEAS international conference on Circuits, systems, electronics, control & signal processing, 2010, pp. 148–153.

[2] R. Karlsson, T. Schon, D. Tornqvist, G. Conte, and F. Gustafsson, “Utilizing model structure for efficient simultaneous localization and mapping for a UAV application,” in Aerospace Conference, 2008 IEEE, 2008, pp. 1–10.

[3] S. Saeedi, F. Samadzadegan, and N. El-Sheimy, “Vision-aided inertial navigation for pose estimation of aerial vehicles,” in Proceedings of the 22nd International Technical Meeting of The Satellite Division of the Institute of Navigation (ION GNSS 2009), 2009, pp. 453–459.

[4] R. J. Prazenica, A. Watkins, A. J. Kurdila, Q. F. Ke, and T. Kanade, “Vision-based kalman filtering for aircraft state estimation and structure from motion,” in 2005 AIAA Guidance, Navigation, and Control Conference and Exhibit, 2005, pp. 1–13.

[5] T. Lemaire, C. Berger, I. K. Jung, and S. Lacroix, “Vision-based slam: Stereo and monocular approaches,” International Journal of Computer Vision, vol. 74, no. 3, pp. 343–364, 2007.

[6] G. Conte and P. Doherty, “Vision-based unmanned aerial vehicle navigation using geo-referenced information,” EURASIP Journal on Advances in Signal Processing, vol. 2009, p. 10, 2009.

[7] J. Kim and S. Sukkarieh, “Autonomous airborne navigation in unknown terrain environments,” Aerospace and Electronic Systems, IEEE Transactions on, vol. 40, no. 3, pp. 1031–1045, 2004.

[8] J. Kim, “Autonomous navigation for airborne applications,” Dept. of Aerospace, Mechanical and Mechatronic Engineering, Graduate School of Engineering, University of Sydney, 2004.

[9] N. El-Sheimy, “Report on kinematic and integrated positioning systems,” TS5, vol. 1, pp. 19–26, 2002.

[10] S. Sukkarieh, E. M. Nebot, and H. F. Durrant-Whyte, “A high integrity IMU/GPS navigation loop for autonomous land vehicle applications,” Robotics and Automation, IEEE Transactions on, vol. 15, no. 3, pp. 572–578, 1999.

[11] Z. H. Lewantowicz, “Architectures and GPS/INS integration: Impact on mission accomplishment,” Aerospace and Electronic Systems Magazine, IEEE, vol. 7, no. 6, pp. 16–20, 1992.

[12] S. Snyder, B. Schipper, L. Vallot, N. Parker, and C. Spitzer, “Differential GPS/inertial navigation approach/landing flight test results,” Aerospace and Electronic Systems Magazine, IEEE, vol. 7, no. 5, pp. 3–11, 1992.

[13] J. H. Kim and S. Sukkarieh, “Flight test results of GPS/INS navigation loop for an autonomous unmanned aerial vehicle (UAV),” in ION GPS 2002: 15th International Technical Meeting of the Satellite Division of The Institute of Navigation; Portland, OR, 2002.

[14] R. Greenspan, “GPS and inertial integration,” Global Positioning System: Theory and applications., vol. 2, pp. 187–220, 1996.

[15] R. E. Phillips and G. T. Schmidt, "GPS/INS integration," AGARD Lecture Series, pp. 9–9, 1996.

[16] D. G. Sim, R. H. Park, R. C. Kim, S. U. Lee, and I. C. Kim, “Integrated position estimation using aerial image sequences,” IEEE transactions on pattern analysis and machine intelligence, pp. 1–18, 2002.

[17] F. Samadzadegan, M. Hahn, and S. Saeedi, “Position estimation of aerial vehicle based on a vision aided navigation system,” 2007.

[18] S. Saeedi, “Pose estimation of aerial vehicles based on a vision aided navigation system,” Dept. of Surveying and Geomatics Engineering, University College of Engineering, University of Tehran, 2007.

[19] D. G. Lowe, “Distinctive image features from scale-invariant keypoints,” International journal of computer vision, vol. 60, no. 2, pp. 91–110, 2004.

[20] J. Kim and S. Sukkarieh, “6DoF SLAM aided GNSS/INS navigation in GNSS denied and unknown environments,” Journal of Global Positioning Systems, vol. 4, no. 1-2, pp. 120–128, 2005.

[21] S. Sukkarieh, "Low cost, high integrity, aided inertial navigation systems for autonomous land vehicles," Australian Centre for Field Robotics, University of Sydney, March 2000.

[22] J. Kim and S. Sukkarieh, “SLAM aided GPS/INS navigation in GPS denied and unknown environments,” in The 2004 International Symposium on GNSS/GPS, Sydney, 2004, pp. 6–8.

[23] M. A. Fischler and R. C. Bolles, “Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography,” Communications of the ACM, vol. 24, no. 6, pp. 381–395, 1981.

[24] Y. Abdel-Aziz and H. Karara, "Direct linear transformation from comparator coordinates into object space coordinates," Urbana, IL: American Society of Photogrammetry, pp. 1–18, 1971.

[25] B. Molnár, “Developing a web based photogrammetry software using DLT,” Pollack Periodica, vol. 5, no. 2, pp. 49–56, 2010.

[26] F. Samadzadegan and S. Saeedi, “A feature based image registration method for aerial vision based navigation,” presented at the Fourth Canadian Conference on Computer and Robot Vision, CRV 2007, 2007.
