Research Article
Moving Target Positioning Based on a Distributed Camera Network
Long Zhao,1,2 Zhen Liu,1 Tiejun Li,1 Baoqi Huang,2 and Lihua Xie2
1 School of Automation Science and Electrical Engineering, Beihang University, Beijing 100191, China
2 School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore 639798
Correspondence should be addressed to Long Zhao; buaa_dnc@buaa.edu.cn
Received 14 February 2014; Accepted 18 April 2014; Published 13 May 2014
Academic Editor: Guoqiang Hu
Copyright © 2014 Long Zhao et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
We propose a systematic framework for moving target positioning based on a distributed camera network. In the proposed framework, low-cost static cameras are deployed to cover a large region; moving targets are detected and then tracked using corresponding algorithms; target positions are estimated by making use of the geometrical relationships among those cameras after calibrating them; and finally, for each target, the position estimates obtained from different cameras are unified into the world coordinate system. This system can function as a complementary positioning information source to realize moving target positioning in indoor or outdoor environments when global navigation satellite system (GNSS) signals are unavailable. Experiments are carried out using practical indoor and outdoor environment data, and the experimental results show that the systematic framework and its inclusive algorithms are both effective and efficient.
1. Introduction
The theory of navigation and positioning has been used in various application fields, such as positioning equipment to monitor its working state, or positioning a car or a person in order to guide them to a certain place. In these applications, it is required that targets be positioned and tracked, which can be realized using the global navigation satellite system (GNSS) or a GNSS-aided inertial navigation system (INS). However, the GNSS system is subject to various limitations, the most critical of which is jamming: GNSS signals are not always available due to blockage by high buildings, canyons, and forests, among others. For this reason, a number of alternative technologies, including optical [1], radio [2-4], RFID [5], and acoustic [6], have been proposed for indoor and outdoor positioning systems over the years. Most efforts have focused on WiFi-based localization, which takes advantage of existing WiFi infrastructures. By using a user's smartphone to measure the signal strength from multiple WiFi access points, the user's location can be constrained within a relatively small region in a large indoor environment. However, these systems rely on infrastructures installed beforehand, and their accuracy critically depends on the number of available access points, which results in certain restrictions [5]. In addition, many results related to the application of camera-based positioning systems have been reported in the last few years [1], including simultaneous localization and mapping (SLAM) [7] and visual odometry [8]. Although SLAM is becoming a standard technique for indoor robotic applications, it is still challenging to apply SLAM in large outdoor environments.
In recent years, the size of camera networks has grown quickly with the development of safe and smart cities. These cameras can provide complementary positioning information for moving target positioning in both indoor and outdoor environments when GNSS signals are unavailable. In this paper, we deal with moving target positioning based on a distributed camera network in a three-dimensional space. In existing camera networks, static cameras are generally deployed to cover a large region, and moving targets can only be detected and tracked via certain algorithms running in a central supervisor unit, but their positions are not determined [9-12]. In some practical applications, the total number of cameras is usually restricted by various factors, such as the cost and placement of cameras. To address this problem, multiple pan-tilt-zoom (PTZ)
Hindawi Publishing Corporation, Mathematical Problems in Engineering, Volume 2014, Article ID 803743, 11 pages, http://dx.doi.org/10.1155/2014/803743
cameras, or a combination of PTZ cameras and static cameras, can be deployed to fulfill some practical tasks [13-18].
Target detection, target tracking, and camera calibration are key to the moving target positioning process. To extract moving targets from a video frame of a static camera, background subtraction is the most widely used approach [19, 20]. When the camera is stationary, the background scene is unchanging, so that it is convenient to construct a background model [21, 22]. The capability of efficiently and accurately estimating background images is critical for any robust background subtraction algorithm. A well-known method presented by Stauffer and Grimson [21] uses an adaptive strategy for modeling the background. Therein, each pixel is modeled using a separate Gaussian mixture, which is continuously learnt based on online approximations. Target detection in the current frame is then performed at the pixel level by comparing each pixel's value against the most probable background Gaussians. However, the adaptive Gaussian mixture algorithm suffers from a low convergence speed in the learning process, especially in complicated environments. For this reason, an improved adaptive Gaussian mixture learning algorithm was introduced in [23].
Moving target tracking is an important component in the field of computer vision and has been widely used in many applications, such as video surveillance [16], intelligent transportation [24], and multiagent system tracking and control [25]. Target tracking aims to estimate the position and the shape of a target or a region in subsequent frames. During target tracking, a target is continuously tracked by correctly associating a target detected in subsequent frames with the same identified track. These methods and their variations commonly make use of the one-to-one assumption, in the sense that a target can generate at most one measurement in each frame and a measurement can originate from at most one target. However, the one-to-one assumption rarely holds in practical applications, due to splitting and merging processes as well as multiple targets existing in a common scene. In order to overcome these shortcomings, several approaches have been proposed for multitarget tracking in recent years [26-29].
Camera calibration is an essential procedure in distributed multitarget positioning and determines the mapping between 3D world coordinates and 2D image coordinates in practical applications. The basic task of camera calibration is to compute the camera's extrinsic and intrinsic parameters, which determine the imaging model and the relationships between multiple camera coordinate frames. With respect to different applications, the corresponding calibration algorithms include the direct linear transformation (DLT) algorithm [9], the Tsai algorithm [10], the vanishing point algorithm [11], and the Zhang algorithm [12]. These algorithms have respective advantages and disadvantages in various practical applications. In this paper, we focus on a fast calibration algorithm based on the vanishing point theory, which overcomes the defects of traditional measurements.
The paper is organized as follows. The system framework is presented in Section 2. Section 3 focuses on target detection and tracking. The fast calibration algorithm is presented in Section 4. The test results of target positioning based on a distributed camera network are reported in Section 5. Finally, we draw some conclusions and shed light on future work in Section 6.
2. Systematic Framework and Problem
The work presented in this paper originates from a research project on moving target tracking and positioning in the Digital Navigation Center (DNC) at Beihang University. The primary goal of the project is to develop a target positioning platform to realize monitoring and positioning of targets in a large region. The systematic framework of moving target positioning based on a distributed camera network is shown in Figure 1. Due to field-of-view and price limitations, a mass of static cameras is installed in a practical application environment. In order to realize moving target positioning in a large region, it is necessary that the system support target detection and tracking. Since a target may no longer be detected, either because it leaves the field of view, because it stops and becomes static, or because it can no longer be distinguished from the background, it is reasonable to take target splitting and merging into account and to detect multiple targets. Therefore, the performance of the target detection, tracking, and association algorithms influences the reliability of the target positioning, and it is necessary that the target positioning results from different cameras be fused into a world coordinate system $X_w Y_w Z_w$.
In this paper, we tackle several problems in the moving target positioning system based on a distributed camera network in practical applications, including target detection, tracking, and association, as well as fast camera calibration and target positioning.
3. Target Detection and Tracking
3.1. Target Detection. Target detection is the basis of target tracking, target positioning, target recognition, action recognition, and so forth. Common algorithms in practical applications include the optical flow algorithm [30], the frame difference algorithm [31], and the background subtraction algorithm [32]. The most well-known and most widely used one for static cameras is background subtraction, because it is convenient to construct a background model and extract moving targets.
Background modeling techniques can be divided into two categories: parametric techniques, which use a parametric model for each pixel location, and samples-based techniques, which build their model by aggregating previously observed values for each pixel location [33]. The most popular parametric technique is based on the Gaussian mixture model (GMM) presented by Stauffer and Grimson [21]. This algorithm relies on the principle that the pixel value at the same location in an image sequence satisfies a Gaussian distribution, as illustrated in Figure 2.
While updating the background model, each pixel of a scene image is independently modelled by a mixture of at most $K$ Gaussian distributions, and the algorithm employs an adaptive
Figure 1: The systematic framework of the moving target positioning based on a distributed camera network.
Figure 2: The model of pixel distribution.
strategy, with the result that the algorithm is adaptive and able to deal with multimodal backgrounds in a dynamic environment (e.g., changing time of day, clouds, and swaying tree leaves). However, since its sensitivity cannot be properly tuned, its ability to successfully handle both high- and low-frequency changes in the background is debatable. To overcome these shortcomings, samples-based techniques [34] circumvent part of the parameter estimation step by building their models from observed pixel values, which enhances their robustness to noise. They provide fast responses to high-frequency events in the background by directly including newly observed values in their pixel models. However, since they update their pixel models in a first-in first-out manner, their ability to successfully handle concomitant
Figure 3: The target detection results of video data set 1. (a) Original video; (b) Gaussian mixture model; (c) random background model.
Figure 4: The target detection results of video data set 2. (a) Original video; (b) Gaussian mixture model; (c) random background model.
Figure 5: The target detection results including the ghost. (a) Original video; (b) foreground image with ghost; (c) the target detection result.
Figure 6: The target detection results for frame 80. (a) Original video; (b) Gaussian mixture model; (c) our method.
Figure 7: The target detection results for frame 105. (a) Original video; (b) Gaussian mixture model; (c) our method.
Figure 8: The target detection results for frame 145. (a) Original video; (b) Gaussian mixture model; (c) our method.
events evolving at various speeds is limited, similarly to the limitation in their adaptive ability to deal with concurrent events of different frequencies. In order to address this issue, random background modeling, which is intuitively an improved samples-based algorithm, is found in [33]. This algorithm assumes $p_t(x)$ to be the value of the pixel $x$ at time $t$ and imposes the constraint that the influence of a value on the polychromatic space is restricted within the local neighborhood. Then a set of sample values is used as a pixel model to classify a value $p_t(x)$ as either a background or a foreground pixel value.

An experiment was carried out using video data and compared with the Gaussian mixture model presented by Stauffer and Grimson [21]. Video data set 1 comes from the evaluation data of the Performance Evaluation of Tracking and Surveillance (PETS) database, with a video image resolution of 768 x 576 pixels and a frame rate of 25 frames per second (fps); video data set 2 comes from a practical surveillance system in the DNC of Beihang University, with a video image resolution of 352 x 288 pixels and a frame rate of 25 fps. The experimental results are shown in Figures 3 and 4. As can be seen, since trees were swinging in the wind, such movements were classified as foreground motion by the Gaussian mixture model, while the random background model effectively detected the trees as background. However, neither algorithm takes into account the shadow of the target, and both thus suffer severely in terms of the reliability and robustness of target detection and tracking, as illustrated in Figure 5, in which video data set 3 comes from a practical surveillance system in the DNC of Beihang University, with a video image resolution of 352 x 288 pixels and a frame rate of 25 fps.
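To make the per-pixel Gaussian modeling concrete, the following sketch keeps a running mean and variance for every pixel and flags values deviating by more than $k$ standard deviations as foreground. It is a deliberately simplified single-Gaussian variant of the mixture model of Stauffer and Grimson [21]; the learning rate and threshold are illustrative choices, not values from this paper.

```python
import numpy as np

class GaussianBackground:
    """Per-pixel running Gaussian background model (single mode)."""

    def __init__(self, first_frame, alpha=0.05, k=2.5):
        self.mean = first_frame.astype(np.float64)
        self.var = np.full(first_frame.shape, 15.0 ** 2)  # broad initial variance
        self.alpha = alpha  # learning rate
        self.k = k          # foreground threshold, in standard deviations

    def apply(self, frame):
        d = frame.astype(np.float64) - self.mean
        foreground = d ** 2 > (self.k ** 2) * self.var
        bg = ~foreground                      # update only where background matched
        self.mean[bg] += self.alpha * d[bg]
        self.var[bg] += self.alpha * (d[bg] ** 2 - self.var[bg])
        return foreground

# toy usage: a flat noisy scene with one bright moving block
rng = np.random.default_rng(0)
model = GaussianBackground(100 + rng.normal(0, 2, (20, 20)))
for _ in range(30):                           # let the model settle
    model.apply(100 + rng.normal(0, 2, (20, 20)))
frame = 100 + rng.normal(0, 2, (20, 20))
frame[5:10, 5:10] = 200                       # the "moving target"
mask = model.apply(frame)
print(mask[5:10, 5:10].all(), mask[0:3, 0:3].any())   # -> True False
```

A full mixture model would keep several (mean, variance, weight) triples per pixel and match each new value against the most probable components, which is what gives the GMM its tolerance to multimodal backgrounds.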
In order to remove the damage caused by shadows to target detection and tracking, we propose an algorithm combining the random background model and the frame difference algorithm. The mathematical model is described as follows:

$$\text{Mask}(x, y) = \begin{cases} 1, & \left|\text{Dilate}(D(x, y)) - \text{Erode}(D(x, y))\right| = 1, \\ 0, & \text{otherwise}, \end{cases} \tag{1}$$

where $D(x, y)$ denotes the mask image of the background differencing, $\text{Dilate}(\ast)$ denotes the dilation operation on the target region block, $\text{Erode}(\ast)$ denotes the erosion operation on the target region, and $\text{Mask}(x, y)$ denotes the mask image of the difference between the dilated and the eroded images.

Suppose that the number of pixels with value 1 in $\text{Mask}(x, y)$ is $N_1$, and that the number of pixels that are detected as foreground in the differencing image and whose values at $(x, y)$ in the template $\text{Mask}(x, y)$ equal 1 is $N_2$. If $N_2 / N_1 > T$, where $T$ denotes a threshold, then the target region block is a foreground target; otherwise it is the shadow of the target.
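A minimal sketch of this boundary-ring shadow test, assuming simple 3 x 3 binary morphology and an illustrative threshold $T$ (the helper names are ours, not from the paper). The intuition is that the frame-difference mask responds strongly along the textured boundary of a real target but only weakly around a smooth shadow region, which is what the ratio of ring hits measures.

```python
import numpy as np

def _shift_reduce(mask, op, pad_value):
    """Combine all 3x3-shifted copies of a boolean mask with op (binary morphology)."""
    h, w = mask.shape
    p = np.pad(mask, 1, constant_values=pad_value)
    out = p[1:1 + h, 1:1 + w].copy()
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out = op(out, p[1 + dy:1 + dy + h, 1 + dx:1 + dx + w])
    return out

def dilate3(mask):
    return _shift_reduce(mask, np.logical_or, False)

def erode3(mask):
    return _shift_reduce(mask, np.logical_and, True)

def is_true_target(bg_diff_mask, frame_diff_mask, T=0.3):
    """Boundary-ring test in the spirit of (1): a real target produces a strong
    frame-difference response along the boundary of its background-difference
    blob, while a smooth shadow region does not. T is an illustrative value."""
    ring = dilate3(bg_diff_mask) & ~erode3(bg_diff_mask)   # Mask(x, y)
    n1 = int(ring.sum())                                   # N1: ring pixels
    n2 = int((ring & frame_diff_mask).sum())               # N2: frame-diff hits on ring
    return n1 > 0 and n2 / n1 > T

# toy blobs: a textured "target" versus a smooth "shadow"
blob = np.zeros((10, 10), dtype=bool)
blob[3:7, 3:7] = True
edges = blob & ~erode3(blob)                      # frame difference fires on edges
print(is_true_target(blob, edges))                # textured blob -> True
print(is_true_target(blob, np.zeros_like(blob)))  # smooth shadow -> False
```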
An experiment was carried out using video data set 3 and compared with the GMM. The experimental results are shown in Figures 6, 7, and 8. As can be seen, the algorithm presented in this paper effectively removes the shadow of the target.
Figure 9: The multitarget tracking results for video data set 4.
Figure 10: The target tracking results across multiple cameras for video data set 3.
Figure 11: The relationship between the world coordinates and pixel coordinates.
3.2. Target Tracking. Once moving targets are detected, a track initialization event is triggered, such that the moving targets can be continuously tracked by the tracking algorithm during the living period of a track (which starts from its initialization and ends at its termination [35]). The termination of a track occurs when a target can no longer be detected, because it leaves the field of view, it stops and becomes static, or it can
Figure 12: The fast camera calibration tool software.
no longer be distinguished from the background. Detected targets are not confirmed to be true moving targets until they have been consistently tracked for a period of time before their target tracks are initialized. We create a dynamic list of potential tracks using all detected targets. Associations are established between targets detected in a new image frame and potential tracking targets. When a potential target has been tracked over several continuous frames, it is recognized as a true moving target and a track is initialized.
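The confirm-before-initialize logic described above can be sketched as follows; the greedy nearest-neighbour association, the confirmation threshold, and the miss limit are illustrative assumptions rather than the paper's exact rules.

```python
from dataclasses import dataclass

CONFIRM_FRAMES = 5   # frames a potential track must persist before confirmation
MAX_MISSES = 3       # consecutive missed frames before a track is terminated

@dataclass
class Track:
    position: tuple
    hits: int = 1
    misses: int = 0
    confirmed: bool = False

def update_tracks(tracks, detections, max_dist=20.0):
    """Greedy nearest-neighbour association of detections to potential tracks."""
    unmatched = list(detections)
    for t in tracks:
        best = None
        for d in unmatched:
            dist = ((t.position[0] - d[0]) ** 2 + (t.position[1] - d[1]) ** 2) ** 0.5
            if dist <= max_dist and (best is None or dist < best[0]):
                best = (dist, d)
        if best:
            t.position = best[1]
            t.hits += 1
            t.misses = 0
            unmatched.remove(best[1])
            if t.hits >= CONFIRM_FRAMES:
                t.confirmed = True    # recognized as a true moving target
        else:
            t.misses += 1
    tracks = [t for t in tracks if t.misses < MAX_MISSES]  # terminate lost tracks
    tracks += [Track(d) for d in unmatched]                # new potential tracks
    return tracks

tracks = []
for k in range(6):                        # target moves right 5 px per frame
    tracks = update_tracks(tracks, [(10 + 5 * k, 20)])
print(len(tracks), tracks[0].confirmed)   # -> 1 True
```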
Compared to single target tracking, the multitarget problem poses additional difficulties: data association needs to be solved, that is, it has to be decided which observation
Figure 13: The schematic diagram of the target positioning.
corresponds to which target, and constraints between targets need to be taken into account. Multitarget tracking algorithms can be roughly divided into two categories: recursive algorithms and nonrecursive algorithms. Recursive algorithms base their estimate only on the state of the previous frame, as in Kalman filtering [36] and particle filtering [27], in which different strategies are used to obtain an optimal solution over multiple frames and can thus better cope with ambiguous multimodal distributions. Nonrecursive algorithms seek optimality over an extended period of time [37, 38].
In practical applications, target tracking must take target splitting and merging into account, because of factors such as illumination changes and occlusion. Since the one-to-one tracking assumption rarely holds, the multitarget tracking problem is still challenging. In order to realize multitarget tracking, we propose a solution combining the pyramid Lucas-Kanade feature tracker [39] and a Kalman filter. The mathematical model of the Kalman filter is described as follows:
$$X(k + 1) = A(k + 1, k)X(k) + W(k),$$
$$Z(k) = H(k)X(k) + V(k), \tag{2}$$

where $X(k) = [x(k), y(k), w(k), h(k), v_x(k), v_y(k), v_w(k), v_h(k)]^T$ denotes the state, $A(k + 1, k)$ denotes the state transition matrix, $W(k)$ denotes the system noise, $H(k)$ denotes the measurement matrix, $Z(k) = [x(k), y(k), w(k), h(k)]^T$ denotes the measurement, and $V(k)$ denotes the measurement noise; $x(k)$ and $y(k)$ denote the horizontal and vertical coordinates of the target centroid, $w(k)$ and $h(k)$ denote the width and height of the target's enclosing rectangle, and $v_x(k)$, $v_y(k)$, $v_w(k)$, and $v_h(k)$ denote the rates of change of $x(k)$, $y(k)$, $w(k)$, and $h(k)$, respectively.

To realize multitarget tracking across multiple cameras, we construct a similarity function for target matching, which realizes target association and target tracking across multiple cameras. The similarity function is described as follows:
$$f(a, b) = \alpha M_d(a, b) + \beta M_s(a, b) + \gamma M_h(a, b),$$
$$M_d(a, b) = \frac{W_a + W_b}{(W_a + W_b) + D_x} \cdot \frac{H_a + H_b}{(H_a + H_b) + D_y},$$
$$M_s(a, b) = \frac{2 W_a H_a W_b H_b}{(W_a H_a)^2 + (W_b H_b)^2},$$
$$M_h(a, b) = \frac{2 h_a h_b}{h_a^2 + h_b^2}, \tag{3}$$

where $a$ and $b$ denote the targets to be matched; $M_d(a, b) \in [0, 1]$ denotes their position similarity (the larger $M_d(a, b)$ is, the closer their positions are); $M_s(a, b) \in [0, 1]$ denotes the similarity of their sizes (the larger $M_s(a, b)$ is, the closer their sizes are); $M_h(a, b) \in [0, 1]$ denotes the similarity of their heights (the larger $M_h(a, b)$ is, the closer their heights are); and $\alpha \in [0, 1]$, $\beta \in [0, 1]$, and $\gamma \in [0, 1]$ denote weight coefficients satisfying $\alpha + \beta + \gamma = 1$. $X_a$ and $Y_a$ denote the horizontal and vertical coordinates of target $a$ in the world coordinate system, and $X_b$ and $Y_b$ denote those of target $b$; $W_a$ and $W_b$ denote half the widths of targets $a$ and $b$, respectively, and $H_a$ and $H_b$ half their heights; $D_x = |X_a - X_b|$ denotes the absolute difference between $X_a$ and $X_b$, and $D_y = |Y_a - Y_b|$ that between $Y_a$ and $Y_b$; $h_a$ and $h_b$ denote their heights in the world coordinate system. $f(a, b) \in [0, 1]$: the larger $f(a, b)$ is, the higher the matching similarity, and vice versa. In practical applications, $\alpha$, $\beta$, and $\gamma$ can be adjusted according to the accuracy of $M_d(a, b)$, $M_s(a, b)$, and $M_h(a, b)$.
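A direct transcription of the matching score (3) might look like the following; the dictionary layout and the example numbers are illustrative assumptions.

```python
def match_similarity(a, b, alpha=0.4, beta=0.3, gamma=0.3):
    """Similarity of two target observations in world coordinates, as in (3).
    Each target is a dict with centre (X, Y), half-width W, half-height H,
    and world-frame height h; alpha + beta + gamma must equal 1."""
    dx = abs(a["X"] - b["X"])
    dy = abs(a["Y"] - b["Y"])
    m_d = ((a["W"] + b["W"]) / (a["W"] + b["W"] + dx)) * \
          ((a["H"] + b["H"]) / (a["H"] + b["H"] + dy))    # position similarity
    sa, sb = a["W"] * a["H"], b["W"] * b["H"]
    m_s = 2 * sa * sb / (sa ** 2 + sb ** 2)               # size similarity
    m_h = 2 * a["h"] * b["h"] / (a["h"] ** 2 + b["h"] ** 2)  # height similarity
    return alpha * m_d + beta * m_s + gamma * m_h

same = {"X": 0.0, "Y": 0.0, "W": 0.5, "H": 1.0, "h": 1.7}
print(round(match_similarity(same, same), 3))   # identical targets -> 1.0
far = dict(same, X=50.0)
print(match_similarity(same, far) < match_similarity(same, same))  # -> True
```

Each of the three terms equals 1 only when the corresponding attributes coincide, so the weighted sum peaks at 1 for a perfect match.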
An experiment was carried out using video data sets 3 and 4, which come from a practical surveillance system in the DNC of Beihang University, with a video image resolution of 352 x 288 pixels and a frame rate of 25 fps. The experimental results are shown in Figures 9 and 10, in which the rectangle with the dotted blue lines denotes the overlapping area between two cameras. As can be seen, the target tracking algorithm presented in this paper is effective and can track multiple targets across multiple cameras.
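The constant-velocity bounding-box model of (2) can be sketched as a textbook Kalman filter over the 8-dimensional state; the noise covariances below are illustrative assumptions, not the paper's tuned values.

```python
import numpy as np

dt = 1.0                                       # one frame step
A = np.eye(8)
A[:4, 4:] = dt * np.eye(4)                     # state transition of (2)
H = np.hstack([np.eye(4), np.zeros((4, 4))])   # measure (x, y, w, h) only
Q = 1e-2 * np.eye(8)                           # system noise cov (illustrative)
Rm = 1.0 * np.eye(4)                           # measurement noise cov (illustrative)

x = np.zeros(8)                                # state: [x, y, w, h, vx, vy, vw, vh]
P = 10.0 * np.eye(8)

def kalman_step(x, P, z):
    # predict
    x = A @ x
    P = A @ P @ A.T + Q
    # update with measurement z = [x, y, w, h]
    S = H @ P @ H.T + Rm
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(8) - K @ H) @ P
    return x, P

# track a box drifting right at 2 px per frame
for k in range(20):
    z = np.array([2.0 * k, 50.0, 30.0, 60.0])
    x, P = kalman_step(x, P, z)
print(round(x[4], 1))   # estimated horizontal velocity, approximately 2.0
```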
4. Fast Camera Calibration and Target Positioning
4.1. Fast Camera Calibration. Camera calibration is a key technology in determining the mapping between 3D world coordinates and 2D image coordinates for various computer vision applications. A schematic diagram describing this mapping is shown in Figure 11. $(u_d, v_d)$ is the image coordinate of $(x_w, y_w, z_w)$ if a perfect pinhole camera model is used; $(u, v)$ is the actual image coordinate, which deviates from $(u_d, v_d)$ due to lens distortion. The distance between them is termed the radial distortion. Therefore, the mathematical
Figure 14: The experimental results in an indoor environment.
model from 3D world coordinates to 2D image coordinates is expressed by [40]:

$$u + \delta u = f_x \frac{r_1(x_w - x_c) + r_2(y_w - y_c) + r_3(z_w - z_c)}{r_7(x_w - x_c) + r_8(y_w - y_c) + r_9(z_w - z_c)},$$
$$v + \delta v = f_y \frac{r_4(x_w - x_c) + r_5(y_w - y_c) + r_6(z_w - z_c)}{r_7(x_w - x_c) + r_8(y_w - y_c) + r_9(z_w - z_c)}, \tag{4}$$

where $r_i$ ($i = 1, 2, \ldots, 9$) denotes the elements of the rotation matrix from the world coordinate frame to the camera coordinate frame; $f_x$ and $f_y$ denote the focal lengths of the camera in the $x$ and $y$ directions; $\delta u$ and $\delta v$ denote the photogrammetric distortions; $O_c(x_c, y_c, z_c)$ denotes the coordinate of the camera in the world coordinate frame; and $P(x_w, y_w, z_w)$ denotes the coordinate of the target in the world coordinate frame.
Traditional calibration algorithms, for example, the DLT algorithm [9] and the Tsai algorithm [10], utilize a series of mathematical transformations to obtain the parameters of the camera model. They have been widely used because of their simple mathematical model and theory. However, these algorithms require a large amount of work to record and check calibration points during calibration and are thus inefficient in practical applications. For this reason, a fast calibration algorithm based on the vanishing point theory was introduced in [11, 41], in which the photogrammetric distortions $\delta u$ and $\delta v$ are left unconsidered.
According to the mathematical model of the fast camera calibration presented in [11], we developed software to calibrate cameras quickly; the accuracy of the camera parameters can be promptly checked via (4). The calibration process and results for a practical camera are shown in Figure 12. As can be seen, the focal length of the camera is 3998.3535 pixels, the calibration error of the line segment $AP$ is 3.98 mm, and the rotation matrix $R$ and translation vector $T$ (unit: mm) are as follows:

$$R = \begin{bmatrix} 0.642 & -0.766 & 0.024 \\ -0.222 & -0.216 & -0.951 \\ 0.734 & 0.605 & -0.309 \end{bmatrix}, \quad T = \begin{bmatrix} -93.497 \\ 190.111 \\ 62305.937 \end{bmatrix}. \tag{5}$$
4.2. Target Positioning. Once targets are continuously tracked, the space coordinates of the targets in the camera coordinate frame can be computed from the imaging model with the camera parameters as follows:

$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} = Z_c \begin{bmatrix} \dfrac{dx}{f_x} & 0 & 0 \\ 0 & \dfrac{dy}{f_y} & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} + \begin{bmatrix} -\dfrac{u_0 Z_c \, dx}{f_x} \\ -\dfrac{v_0 Z_c \, dy}{f_y} \\ 0 \end{bmatrix}, \tag{6}$$

where $u_0$ and $v_0$ denote the principal point in the pixel frame, and $dx$ and $dy$ denote the pixel sizes of the camera in the $x$ and $y$ directions.

When the targets are seen across multiple cameras, the space coordinates of the targets in each camera coordinate frame are computed by (6), respectively, and then unified into the world coordinate system, as illustrated in Figure 13. The mathematical model is described as follows:
$$\begin{bmatrix} X_w \\ Y_w \\ Z_w \end{bmatrix} = R_i \left( \begin{bmatrix} X_{ci} \\ Y_{ci} \\ Z_{ci} \end{bmatrix} - T_i \right), \quad i = 1, 2, \ldots, N, \tag{7}$$

where $i$ denotes the $i$th camera and $N$ denotes the number of cameras.
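Assuming the depth $Z_c$ of a target in the camera frame is available (e.g., from the flat-ground constraint discussed in Section 5), (6) and (7) can be chained as follows; all numeric parameters here are illustrative, not calibration results from the paper.

```python
import numpy as np

def pixel_to_camera(u, v, zc, fx, fy, dx, dy, u0, v0):
    """Back-project a pixel to camera-frame coordinates via (6)."""
    xc = zc * dx * (u - u0) / fx
    yc = zc * dy * (v - v0) / fy
    return np.array([xc, yc, zc])

def camera_to_world(pc, R, T):
    """Unify a camera-frame point into the world frame via (7)."""
    return R @ (pc - T)

# illustrative parameters: identity rotation, camera 5 m below the world origin
R = np.eye(3)
T = np.array([0.0, 0.0, -5000.0])              # mm
pc = pixel_to_camera(u=400, v=300, zc=10000.0, fx=800.0, fy=800.0,
                     dx=1.0, dy=1.0, u0=320.0, v0=240.0)
pw = camera_to_world(pc, R, T)
print(pw)   # the camera-frame point shifted into world coordinates
```

With several calibrated cameras, each camera $i$ applies its own $(R_i, T_i)$, so position estimates of the same target land in one common world frame and can be fused.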
5. Experimental Tests
The target positioning system presented in this paper was tested in indoor and outdoor environments, in which all the low-cost static cameras were calibrated and the coordinates of
Figure 15: The experimental results in an outdoor environment. (a) 3D scene of the new main building; (b) Camera 1; (c) Camera 2; (d) Camera 3; (e) Camera 4; (f) Camera 5; (g) Camera 6; (h) Camera 7.
the cameras were unified into the world coordinate system. Target detection and tracking were performed by the target detection and tracking algorithms. As a result, the targets are positioned by the imaging model and camera parameters in real time, and their trajectories are displayed in a three-dimensional scene. The test results are shown in Figures 14 and 15 for the indoor and outdoor environments, respectively. As can be seen from Figure 14, when a target is continuously moving within an indoor corridor, the positioning system, consisting of six distributed cameras, is able to position this target in real
time and display its trajectory in a three-dimensional space model. Likewise, as can be seen from Figure 15, when a target is continuously moving outdoors, the positioning system, consisting of seven distributed cameras, is able to position this target and display its trajectory in a three-dimensional space model in real time as well. The experimental results confirm that the systematic framework and its inclusive algorithms are both effective and efficient.
In this paper, we assume that the ground is flat, which rarely holds in practical applications over a large region. In order to solve this problem, it is necessary to use a digital elevation model (DEM) to describe the topographic relief of large regions.
6. Conclusion and Future Work
This paper presented the comprehensive design and implementation of a moving target positioning system based on a distributed camera network. The system is composed of low-cost static cameras, which provide complementary positioning information for moving target positioning in indoor and outdoor environments when GNSS signals are unavailable. In this system, static cameras cover a large region; moving targets are detected and then tracked using corresponding algorithms; target positions are estimated by making use of the geometrical relationships among those cameras after calibrating them; and finally, for each target, the position estimates obtained from different cameras are unified into the world coordinate system. The experimental results of the target detection, tracking, and positioning system were reported based on real video data.
Target positioning and tracking with multiple static cameras were verified in both indoor and outdoor environments. However, the reliability and accuracy of target tracking and positioning suffer from several environmental factors. Hence, it is necessary to fuse information from various sensors, such as radar, infrared cameras, inertial measurement units (IMU), and wireless location systems. Regarding future work, it is meaningful to develop and test these algorithms in practical applications.
Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments

This project is supported by the Key Program of the National Natural Science Foundation of China (Grant no. 61039003), the National Natural Science Foundation of China (Grant no. 41274038), the Aeronautical Science Foundation of China (Grant no. 2013ZC51027), the Aerospace Innovation Foundation of China (CASC201102), and the Fundamental Research Funds for the Central Universities.
References
[1] R. Mautz and S. Tilch, "Survey of optical indoor positioning systems," in Proceedings of the International Conference on Indoor Positioning and Indoor Navigation (IPIN '11), Guimaraes, Portugal, September 2011.
[2] G. Retscher, E. Moser, D. Vredeveld et al., "Performance and accuracy test of a WiFi indoor positioning system," Journal of Applied Geodesy, vol. 1, no. 2, pp. 103–110, 2007.
[3] S. J. Ingram, D. Harmer, and M. Quinlan, "Ultra wide band indoor positioning systems and their use in emergencies," in Proceedings of the Position Location and Navigation Symposium (PLANS '04), pp. 706–715, Monterey, Calif, USA, April 2004.
[4] G. Anastasi, R. Bandelloni, M. Conti et al., "Experimenting an indoor bluetooth-based positioning service," in Proceedings of the 23rd International Conference on Distributed Computing Systems Workshops, pp. 480–483, Providence, RI, USA, 2003.
[5] H. Liu, H. Darabi, P. Banerjee, and J. Liu, "Survey of wireless indoor positioning techniques and systems," IEEE Transactions on Systems, Man, and Cybernetics C: Applications and Reviews, vol. 37, no. 6, pp. 1067–1080, 2007.
[6] H.-S. Kim and J.-S. Choi, "Advanced indoor localization using ultrasonic sensor and digital compass," in Proceedings of the International Conference on Control, Automation and Systems (ICCAS '08), pp. 223–226, Seoul, Korea, October 2008.
[7] R. Karlsson, T. B. Schon, D. Tornqvist, G. Conte, and F. Gustafsson, "Utilizing model structure for efficient simultaneous localization and mapping for a UAV application," in Proceedings of the IEEE Aerospace Conference (AC '08), Rome, Italy, March 2008.
[8] D. Nister, O. Naroditsky, and J. Bergen, "Visual odometry," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '04), pp. 652–659, Washington, DC, USA, July 2004.
[9] Y. I. Abdel-Aziz and H. M. Karara, "Direct linear transformation into object space coordinates in close-range photogrammetry," in Proceedings of the Symposium on Close-Range Photogrammetry, pp. 1–18, Urbana, Ill, USA, 1971.
[10] R. Y. Tsai, "A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses," IEEE Journal of Robotics and Automation, vol. RA-3, no. 4, pp. 323–344, 1987.
[11] J. Long, X. Zhang, and L. Zhao, "A fast calibration algorithm based on vanishing point for scene camera," Applied Mechanics and Materials, vols. 58–60, pp. 1148–1153, 2011.
[12] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
[13] P. D. Z. Varcheie and G.-A. Bilodeau, "People tracking using a network-based PTZ camera," Machine Vision and Applications, vol. 22, no. 4, pp. 671–690, 2011.
[14] C.-H. Chen, Y. Yao, D. Page, B. Abidi, A. Koschan, and M. Abidi, "Heterogeneous fusion of omnidirectional and PTZ cameras for multiple object tracking," IEEE Transactions on Circuits and Systems for Video Technology, vol. 18, no. 8, pp. 1052–1063, 2008.
[15] X. Clady, F. Collange, F. Jurie, and P. Martinet, "Object tracking with a pan-tilt-zoom camera: application to car driving assistance," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '01), pp. 1653–1658, Seoul, Korea, May 2001.
[16] N. Bellotto, E. Sommerlade, B. Benfold et al., "A distributed camera system for multi-resolution surveillance," in Proceedings of the 3rd ACM/IEEE International Conference on Distributed Smart Cameras (ICDSC '09), Como, Italy, September 2009.
[17] C. Ding, B. Song, A. Morye et al., "Collaborative sensing in a distributed PTZ camera network," IEEE Transactions on Image Processing, vol. 21, no. 7, pp. 3282–3295, 2012.
[18] E. B. Ermis, P. Clarot, P.-M. Jodoin, and V. Saligrama, "Activity based matching in distributed camera networks," IEEE Transactions on Image Processing, vol. 19, no. 10, pp. 2595–2613, 2010.
[19] D.-M. Tsai and S.-C. Lai, "Independent component analysis-based background subtraction for indoor surveillance," IEEE Transactions on Image Processing, vol. 18, no. 1, pp. 158–167, 2009.
[20] J. Migdal and W. E. L. Grimson, "Background subtraction using Markov thresholds," in Proceedings of the IEEE Workshop on Motion and Video Computing (MOTION '05), pp. 58–65, Breckenridge, Colo, USA, January 2005.
[21] C. Stauffer and W. E. L. Grimson, "Learning patterns of activity using real-time tracking," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 8, pp. 747–757, 2000.
[22] D.-S. Lee, "Effective Gaussian mixture learning for video background subtraction," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 5, pp. 827–832, 2005.
[23] L. Zhao and X. He, "Adaptive Gaussian mixture learning for moving object detection," in Proceedings of the 3rd IEEE International Conference on Broadband Network and Multimedia Technology (IC-BNMT '10), pp. 1176–1180, Beijing, China, October 2010.
[24] S. Sivaraman and M. M. Trivedi, "Looking at vehicles on the road: a survey of vision-based vehicle detection, tracking, and behavior analysis," IEEE Transactions on Intelligent Transportation Systems, vol. 14, no. 4, pp. 1773–1795, 2013.
[25] G. H. Wen, Z. S. Duan, G. R. Chen, and W. W. Yu, "Consensus tracking of multi-agent systems with Lipschitz-type node dynamics and switching topologies," IEEE Transactions on Circuits and Systems, vol. 60, no. 9, pp. 1–13, 2013.
[26] S. Oh, S. Russell, and S. Sastry, "Markov chain Monte Carlo data association for multi-target tracking," IEEE Transactions on Automatic Control, vol. 54, no. 3, pp. 481–497, 2009.
[27] H. Zhang and L. Zhao, "Integral channel features for particle filter based object tracking," in Proceedings of the 5th International Conference on Intelligent Human-Machine Systems and Cybernetics, pp. 190–193, Hangzhou, China, 2013.
[28] M. Taj and A. Cavallaro, "Distributed and decentralized multicamera tracking," IEEE Signal Processing Magazine, vol. 28, no. 3, pp. 46–58, 2011.
[29] N. Krahnstoever, T. Yu, S. Lim et al., "Collaborative real-time control of active cameras in large scale surveillance systems," in Proceedings of the Workshop on Multi-camera and Multi-modal Sensor Fusion Algorithms and Applications, pp. 1–12, Marseille, France, 2008.
[30] C. Braillon, C. Pradalier, J. L. Crowley, and C. Laugier, "Real-time moving obstacle detection using optical flow models," in Proceedings of the IEEE Intelligent Vehicles Symposium (IV '06), pp. 466–471, Tokyo, Japan, June 2006.
[31] C. Kim and J. N. Hwang, "Fast and automatic video object segmentation and tracking for content-based applications," IEEE Transactions on Circuits and Systems for Video Technology, vol. 12, no. 2, pp. 122–129, 2002.
[32] R. Jain and H. H. Nagel, "On the analysis of accumulative difference pictures from image sequences of real world scenes," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 1, no. 2, pp. 206–214, 1979.
[33] O. Barnich and M. Van Droogenbroeck, "ViBE: a powerful random technique to estimate the background in video sequences," in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '09), pp. 945–948, Taipei, Taiwan, April 2009.
[34] A. Elgammal, D. Harwood, and L. Davis, "Non-parametric model for background subtraction," in Proceedings of the 6th European Conference on Computer Vision, Part II, pp. 751–767, London, UK, 2000.
[35] H. W. Mao, C. H. Yang, G. P. Abousleman, and J. Si, "Automated multiple target detection and tracking in UAV videos," in Airborne Intelligence, Surveillance, Reconnaissance (ISR) Systems and Applications VII, vol. 7668 of Proceedings of SPIE, Orlando, Fla, USA, 2010.
[36] H. Medeiros, J. Park, and A. C. Kak, "Distributed object tracking using a cluster-based Kalman filter in wireless camera networks," IEEE Journal on Selected Topics in Signal Processing, vol. 2, no. 4, pp. 448–463, 2008.
[37] A. Andriyenko and K. Schindler, "Multi-target tracking by continuous energy minimization," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR '11), pp. 1265–1272, Colorado Springs, Colo, USA, June 2011.
[38] A. Andriyenko and K. Schindler, "Globally optimal multi-target tracking on a hexagonal lattice," in Proceedings of the 11th European Conference on Computer Vision, pp. 466–479, Crete, Greece, 2010.
[39] J. Y. Bouguet, "Pyramidal implementation of the affine Lucas Kanade feature tracker: description of the algorithm," Intel Corporation, pp. 1–9, 2001.
[40] Y. Wang, G. Hu, and Z. Chen, "Calibration of CCD camera in image matching experimental equipment," in Proceedings of the 2nd International Symposium on Instrumentation Science and Technology, p. 3146, Jinan, China, August 2002.
[41] B. Caprile and V. Torre, "Using vanishing points for camera calibration," International Journal of Computer Vision, vol. 4, no. 2, pp. 127–139, 1990.
cameras, or the combination of PTZ cameras and static cameras, can be deployed to fulfill some practical tasks [13–18].
Target detection, target tracking, and camera calibration are key to the moving target positioning process. To extract moving targets from a video frame of a static camera, background subtraction is the most widely used approach [19, 20]. When the camera is stationary, the background scene is unchanging, so that it is convenient to construct a background model [21, 22]. The capability of efficiently and accurately estimating background images is critical for any robust background subtraction algorithm. A well-known method presented by Stauffer and Grimson [21] uses an adaptive strategy for modeling the background. Therein, each pixel is modeled using a separate Gaussian mixture, which is continuously learnt based on online approximations. Target detection in the current frame is then performed at the pixel level by comparing each pixel value against the most probable background Gaussians. However, the adaptive Gaussian mixture algorithm suffers from a low convergence speed in the learning process, especially in complicated environments. For this reason, an improved adaptive Gaussian mixture learning algorithm was introduced in [23].
Moving target tracking is an important component in the field of computer vision and has been widely used in many applications such as video surveillance [16], intelligent transportation [24], and multiagent systems tracking and control [25]. Target tracking aims to estimate the position and the shape of a target or a region in subsequent frames. During target tracking, a target is continuously tracked by correctly associating a target detected in subsequent frames with the same identified track. These methods and their variations commonly make use of the one-to-one assumption, in the sense that a target can generate at most one measurement in each frame and a measurement can originate from at most one target. However, the one-to-one assumption rarely holds in practical applications, due to splitting and merging processes as well as multiple targets existing in a common scene. In order to overcome these shortcomings, several approaches have been proposed for multitarget tracking in recent years [26–29].
Camera calibration is an essential procedure in distributed multitarget positioning and determines the mapping between 3D world coordinates and 2D image coordinates in practical applications. The basic task of camera calibration is to compute the camera extrinsic and intrinsic parameters, which determine the imaging model and the relationship between multiple camera coordinates. With respect to different applications, the corresponding calibration algorithms include the direct linear transformation (DLT) algorithm [9], the Tsai algorithm [10], the vanishing point algorithm [11], and the Zhang algorithm [12]. These algorithms have respective advantages and disadvantages in various practical applications. In this paper, we focus on a fast calibration algorithm based on the vanishing point theory, which overcomes the defects of traditional measurements.
The paper is organized as follows. The system framework is presented in Section 2. Section 3 focuses on target detection and tracking. The fast calibration algorithm is presented in Section 4. The test results of target positioning based on a distributed camera network are reported in Section 5. Finally, we draw some conclusions and shed light on future work in Section 6.
2. Systematic Framework and Problem
The work presented in this paper originates from a research project on moving target tracking and positioning in the Digital Navigation Center (DNC) at Beihang University. The primary goal of the project is to develop a target positioning platform to realize monitoring and positioning of targets in a large region. The systematic framework of moving target positioning based on a distributed camera network is shown in Figure 1. Due to field-of-view and price limitations, a large number of static cameras are installed in a practical application environment. In order to realize moving target positioning in a large region, it is necessary that the system supports target detection and tracking. Since a target can no longer be detected when it leaves the field of view, when it stops and becomes static, or when it can no longer be distinguished from the background, it is reasonable to take target splitting and merging into account and to detect multiple targets. Therefore, the performance of the target detection, tracking, and association algorithms influences the reliability of the target positioning, and it is necessary that the target positioning results from different cameras are fused into a world coordinate system X_w Y_w Z_w.
In this paper, we tackle several problems arising in a moving target positioning system based on a distributed camera network in practical applications, including target detection, tracking, and association, as well as fast camera calibration and target positioning.
3. Target Detection and Tracking
3.1. Target Detection. Target detection is the basis of target tracking, target positioning, target recognition, action recognition, and so forth. Common algorithms in practical applications include the optical flow algorithm [30], the frame difference algorithm [31], and the background subtraction algorithm [32]. The most well-known and most widely used one for static cameras is background subtraction, because it is convenient to construct a background model and extract moving targets.
Background modeling techniques can be divided into two categories: parametric techniques, which use a parametric model for each pixel location, and samples-based techniques, which build their model by aggregating previously observed values for each pixel location [33]. The most popular parametric technique is based on the Gaussian mixture model (GMM) presented by Stauffer and Grimson [21]. This algorithm relies on the principle that the pixel value at the same location in the image sequence satisfies a Gaussian distribution, as illustrated in Figure 2.
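The per-pixel mixture idea can be sketched in a few lines. The following is a simplified, single-pixel illustration in the spirit of Stauffer and Grimson [21], not the paper's implementation; K, the learning rate, and the matching threshold are illustrative choices.

```python
import numpy as np

K = 3               # Gaussians per pixel (illustrative)
ALPHA = 0.05        # learning rate (illustrative)
MATCH_SIGMA = 2.5   # match if within 2.5 standard deviations

class PixelGMM:
    """Toy per-pixel Gaussian mixture; spare modes start empty (weight 0)."""
    def __init__(self, init_value):
        self.mu = np.array([float(init_value), 0.0, 0.0])
        self.var = np.full(K, 15.0 ** 2)
        self.w = np.array([1.0, 0.0, 0.0])

    def update(self, x):
        """Online update; returns True if x is classified as background."""
        matched = np.abs(x - self.mu) < MATCH_SIGMA * np.sqrt(self.var)
        if matched.any():
            k = int(np.argmax(matched))                 # first matching mode
            self.mu[k] += ALPHA * (x - self.mu[k])
            self.var[k] += ALPHA * ((x - self.mu[k]) ** 2 - self.var[k])
            self.w = (1.0 - ALPHA) * self.w             # decay all weights...
            self.w[k] += ALPHA                          # ...reinforce the match
            self.w /= self.w.sum()
            return k == int(np.argmax(self.w))          # background if dominant
        k = int(np.argmin(self.w))                      # no match: replace weakest
        self.mu[k], self.var[k], self.w[k] = float(x), 15.0 ** 2, ALPHA
        self.w /= self.w.sum()
        return False                                    # a brand-new mode -> foreground

p = PixelGMM(100.0)
for v in [100.0, 101.0, 99.0, 100.0] * 10:              # a stable pixel value
    p.update(v)
print(p.update(100.0), p.update(200.0))  # stable value -> True, sudden jump -> False
```

A full detector runs one such mixture per pixel location; the improved learning in [23] mainly changes how ALPHA is scheduled during the early frames.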
While updating the background model, each pixel of a scene image is independently modelled by a mixture of at most K Gaussian distributions and employs an adaptive
Figure 1: The systematic framework of the moving target positioning based on a distributed camera network.
Figure 2: The model of pixel distribution (pixel values sampled at one location across image frames follow a Gaussian distribution within μ ± 3σ).
strategy, with the result that the algorithm is adaptive and able to deal with multimodal backgrounds in a dynamic environment (e.g., changing time of day, clouds, and swaying tree leaves). However, since its sensitivity cannot be properly tuned, its ability to successfully handle high- and low-frequency changes in the background is debatable. To overcome these shortcomings, samples-based techniques [34] circumvent part of the parameter estimation step by building their models from observed pixel values, which enhances their robustness to noise. They provide fast responses to high-frequency events in the background by directly including newly observed values in their pixel models. However, since they update their pixel models in a first-in first-out manner, their ability to successfully handle concomitant
Figure 3: The target detection results of video data set 1. (a) Original video; (b) Gaussian mixture model; (c) random background model.
Figure 4: The target detection results of video data set 2. (a) Original video; (b) Gaussian mixture model; (c) random background model.
Figure 5: The target detection results including the ghost. (a) Original video; (b) foreground image with ghost; (c) the target detection result.
Figure 6: The target detection results for frame 80. (a) Original video; (b) Gaussian mixture model; (c) our method.
Figure 7: The target detection results for frame 105. (a) Original video; (b) Gaussian mixture model; (c) our method.
Figure 8: The target detection results for frame 145. (a) Original video; (b) Gaussian mixture model; (c) our method.
events evolving at various speeds is limited, as is their adaptive ability to deal with concurrent events of different frequencies. In order to address this issue, random background modeling, which is intuitively an improved samples-based algorithm, was proposed in [33]. This algorithm takes p_t(x) to be the value of the pixel x at time t and imposes the constraint that the influence of a value on the polychromatic space is restricted to the local neighborhood. A set of sample values is then used as a pixel model to classify a value p_t(x) as either a background or a foreground pixel value.

An experiment was carried out using video data and compared with the Gaussian mixture model presented by Stauffer and Grimson [21]. Video data set 1 comes from the evaluation data of the Performance Evaluation of Tracking and Surveillance (PETS) database, with a video image resolution of 768 × 576 pixels and a frame rate of 25 frames per second (fps); video data set 2 comes from a practical surveillance system in the DNC of Beihang University, with a video image resolution of 352 × 288 pixels and a frame rate of 25 fps. The experimental results are shown in Figures 3 and 4. As can be seen, since trees were swinging in the wind, such movements were classified as foreground motions by the Gaussian mixture model, while the random background model effectively detected the trees as the background. However, neither algorithm takes the shadow of the target into account, and both are thus severely impaired in terms of the reliability and robustness of target detection and tracking, as illustrated in Figure 5, in which the used video data set 3 comes from a practical surveillance system in the DNC of Beihang University, with a video image resolution of 352 × 288 pixels and a frame rate of 25 fps.
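The sample-based classification step of such a random background model can be sketched as follows. The sample count, matching radius, and minimum match count are illustrative values, not those of [33], and the model here covers a single pixel for clarity.

```python
import numpy as np

# ViBe-style sketch [33]: a pixel value p_t(x) is compared against N stored
# background samples and declared background when it is close to at least
# MIN_MATCHES of them. N, RADIUS, and MIN_MATCHES are illustrative.
rng = np.random.default_rng(0)
N, RADIUS, MIN_MATCHES = 20, 20.0, 2

def classify(samples, value):
    """Return 'background' or 'foreground' for one pixel value."""
    close = np.abs(samples - value) < RADIUS
    return "background" if close.sum() >= MIN_MATCHES else "foreground"

def update(samples, value):
    """Conservative in-place update: overwrite one randomly chosen sample."""
    samples[rng.integers(len(samples))] = value

# Pixel model built from previously observed values around 120 (clipped so the
# toy samples stay well inside the matching radius of nearby values).
model = 120.0 + np.clip(rng.normal(0.0, 3.0, size=N), -10.0, 10.0)
print(classify(model, 122.0))   # close to many stored samples -> background
print(classify(model, 200.0))   # far from all samples -> foreground
```

The random replacement in `update` is what lets old samples survive for a variable time instead of being flushed first-in first-out.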
In order to remove the damage caused by shadows to target detection and tracking, we propose an algorithm combining the random background model and the frame difference algorithm; the mathematical model is described as follows:
\[
\operatorname{Mask}(x,y)=
\begin{cases}
1, & \left|\operatorname{Dilate}\bigl(D(x,y)\bigr)-\operatorname{Erode}\bigl(D(x,y)\bigr)\right|=1,\\
0, & \text{otherwise},
\end{cases}\tag{1}
\]

where D(x, y) denotes the mask image of the background differencing, Dilate(·) denotes the dilation operation on the target region block, Erode(·) denotes the erosion operation on the target region, and Mask(x, y) denotes the mask image of the difference between the dilated and eroded images.

Suppose that the number of pixels with values equal to 1 in Mask(x, y) is N_1, and the number of pixels that are detected as foreground from the differencing image and whose values at (x, y) in the template Mask(x, y) equal 1 is N_2. If N_1/N_2 > T, where T denotes a threshold, then the target region block is a foreground target; otherwise, it is the shadow of the target.
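The border statistics N_1 and N_2 used by the shadow test can be computed with plain array operations. The 3 × 3 structuring element and the toy region below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def _shifted(m, fill):
    """The nine 8-neighborhood shifts of a boolean mask (padded with `fill`)."""
    p = np.pad(m, 1, constant_values=fill)
    h, w = m.shape
    return [p[1 + dy : 1 + dy + h, 1 + dx : 1 + dx + w]
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)]

def dilate(m):
    return np.logical_or.reduce(_shifted(m, False))

def erode(m):
    return np.logical_and.reduce(_shifted(m, True))

def border_statistics(D, frame_diff):
    """N1: pixels on the region border Mask(x, y) from (1);
    N2: frame-difference foreground pixels falling on that border."""
    border = dilate(D) ^ erode(D)           # Mask(x, y) in (1)
    n1 = int(border.sum())
    n2 = int((frame_diff & border).sum())
    return n1, n2

D = np.zeros((9, 9), dtype=bool)
D[2:7, 2:7] = True                          # a detected 5x5 region block
print(border_statistics(D, dilate(D)))      # strong edges everywhere -> (40, 40)
```

The caller then thresholds the N_1/N_2 ratio as in the text; a shadow's soft edges leave far fewer frame-difference hits on the border than a real target's sharp edges.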
An experiment is carried out using video data set 3 and compared with the GMM. The experimental results are shown in Figures 6, 7, and 8. As can be seen, the algorithm presented in this paper effectively removes the shadow of the target.
Figure 9: The multitarget tracking results for video data set 4.
Figure 10: The target tracking results across multiple cameras for video data set 3.
Figure 11: The relationship between the world coordinates and pixel coordinates.
3.2. Target Tracking. Once moving targets are detected, the track initialization event is triggered, so that the moving targets can be continuously tracked by the tracking algorithm during the living period of a track (which lasts from its initialization to its termination [35]). The termination of a track occurs when a target can no longer be detected because it leaves the field of view, it stops and becomes static, or it can
Figure 12: The fast camera calibration tool software.
no longer be distinguished from the background. Detected targets are not confirmed as true moving targets until they have been consistently tracked for a period of time before their target tracks are initialized. We create a dynamic list of potential tracks from all detected targets. Associations are established between targets detected in a new image frame and potential tracking targets. When a potential target is tracked over several continuous frames, it is recognized as a true moving target and a track is initialized.
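The potential-track confirmation logic described above can be sketched as follows. The gate distance, the confirmation length, and the nearest-neighbour association rule are illustrative simplifications, and track termination is omitted for brevity.

```python
CONFIRM_FRAMES = 3   # frames of consistent association before confirmation
GATE = 30.0          # max pixel distance to associate a detection with a candidate

class TrackManager:
    def __init__(self):
        self.potential = []   # candidate tracks: {"pos": (x, y), "hits": frames seen}
        self.confirmed = []

    def step(self, detections):
        """Feed one frame of detected centroids; returns the confirmed-track count."""
        unmatched = list(detections)
        for track in self.potential:
            best = None
            for d in unmatched:   # nearest unmatched detection within the gate
                dist2 = (d[0] - track["pos"][0]) ** 2 + (d[1] - track["pos"][1]) ** 2
                if dist2 <= GATE ** 2 and (best is None or dist2 < best[1]):
                    best = (d, dist2)
            if best is not None:
                track["pos"] = best[0]
                track["hits"] += 1
                unmatched.remove(best[0])
        still_potential = []
        for track in self.potential:   # promote consistently tracked candidates
            (self.confirmed if track["hits"] >= CONFIRM_FRAMES
             else still_potential).append(track)
        self.potential = still_potential
        for d in unmatched:            # unseen detections start new candidates
            self.potential.append({"pos": d, "hits": 1})
        return len(self.confirmed)

tm = TrackManager()
for frame in ([(10.0, 10.0)], [(12.0, 11.0)], [(14.0, 12.0)]):
    n = tm.step(frame)
print(n)   # 1: the candidate was confirmed after three consistent frames
```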
Compared to single target tracking, the multitarget problem poses additional difficulties: data association needs to be solved, that is, it has to be decided which observation
Figure 13: The schematic diagram of the target positioning.
corresponds to which target, and constraints between targets need to be taken into account. Multitarget tracking algorithms can be roughly divided into two categories: recursive algorithms and nonrecursive algorithms. The recursive algorithms base their estimate only on the state of the previous frame, such as Kalman filtering [36] and particle filtering [27], in which different strategies are used to obtain an optimal solution over multiple frames and which can thus better cope with ambiguous multimodal distributions. The nonrecursive algorithms seek optimality over an extended period of time [37, 38].
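The recursive predict/update cycle can be illustrated with a small linear Kalman filter over the bounding-box state later used in (2). The noise covariances here are illustrative tuning values, not taken from the paper.

```python
import numpy as np

dt = 1.0                      # one frame
A = np.eye(8)
A[:4, 4:] = dt * np.eye(4)    # [x, y, w, h] each advanced by its own velocity
H = np.hstack([np.eye(4), np.zeros((4, 4))])   # we measure [x, y, w, h] only
Q = 1e-2 * np.eye(8)          # process noise (illustrative)
R = 1.0 * np.eye(4)           # measurement noise (illustrative)

def kf_step(x, P, z):
    """One predict/update cycle; returns the new state and covariance."""
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    S = H @ P_pred @ H.T + R                     # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)          # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(8) - K @ H) @ P_pred
    return x_new, P_new

x = np.array([100.0, 50.0, 20.0, 40.0, 0.0, 0.0, 0.0, 0.0])
P = np.eye(8)
for k in range(1, 6):         # a box drifting 2 px/frame to the right
    x, P = kf_step(x, P, np.array([100.0 + 2.0 * k, 50.0, 20.0, 40.0]))
print(round(float(x[4]), 2))  # estimated vx, approaching the true 2 px/frame
```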
In practical applications, target tracking takes target splitting and merging into account because of factors such as illumination changes and occlusion. Since the one-to-one tracking assumption rarely holds, the multitarget tracking problem remains challenging. In order to realize multitarget tracking, we propose a solution combining the pyramid Lucas-Kanade feature tracker [39] and the Kalman filter. The mathematical model of Kalman filtering is described as follows:
\[
\begin{aligned}
X(k+1) &= A(k+1,k)\,X(k) + W(k),\\
Z(k) &= H(k)\,X(k) + V(k),
\end{aligned}\tag{2}
\]

where X(k) = [x(k), y(k), w(k), h(k), v_x(k), v_y(k), v_w(k), v_h(k)]^T denotes the state; A(k+1, k) denotes the state transition matrix; W(k) denotes the system noise; H(k) denotes the measurement matrix; Z(k) = [x(k), y(k), w(k), h(k)]^T denotes the measurement value; V(k) denotes the measurement noise; x(k) and y(k) denote the horizontal and vertical ordinates of the target centroid; w(k) and h(k) denote the width and height of the target envelope rectangle; and v_x(k), v_y(k), v_w(k), and v_h(k) denote the speeds of x(k), y(k), w(k), and h(k), respectively.

To realize multitarget tracking across multiple cameras, we construct a similarity function of target matching to realize target association and target tracking across multiple cameras. The similarity function is described as follows:
\[
\begin{aligned}
f(a,b) &= \alpha M_d(a,b) + \beta M_s(a,b) + \gamma M_h(a,b),\\
M_d(a,b) &= \frac{W_a+W_b}{(W_a+W_b)+D_x}\cdot\frac{H_a+H_b}{(H_a+H_b)+D_y},\\
M_s(a,b) &= \frac{2\,W_aH_aW_bH_b}{(W_aH_a)^2+(W_bH_b)^2},\\
M_h(a,b) &= \frac{2\,h_ah_b}{h_a^2+h_b^2},
\end{aligned}\tag{3}
\]

where a and b denote the targets to be matched; M_d(a, b) ∈ [0, 1] denotes their position similarity (the larger M_d(a, b) is, the closer their positions are); M_s(a, b) ∈ [0, 1] denotes the similarity of their sizes (the larger M_s(a, b) is, the closer their sizes are); M_h(a, b) ∈ [0, 1] denotes the similarity of their heights (the larger M_h(a, b) is, the closer their heights are); and α ∈ [0, 1], β ∈ [0, 1], and γ ∈ [0, 1] denote the weight coefficients and satisfy α + β + γ = 1. X_a and Y_a denote the horizontal and vertical ordinates of target a in the world coordinate system, and X_b and Y_b those of target b; W_a and W_b denote half of the widths of targets a and b, respectively, and H_a and H_b half of their heights; D_x = fabs(X_a − X_b) denotes the absolute difference between X_a and X_b, and D_y = fabs(Y_a − Y_b) that between Y_a and Y_b; h_a and h_b denote their heights in the world coordinate system. The matching score satisfies f(a, b) ∈ [0, 1]: the larger f(a, b) is, the higher the matching similarity, and vice versa. In practical applications, α, β, and γ can be adjusted according to the accuracy of M_d(a, b), M_s(a, b), and M_h(a, b).
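Equation (3) transcribes directly into code; the weight values and the example target geometry below are illustrative.

```python
def similarity(a, b, alpha=0.4, beta=0.3, gamma=0.3):
    """Matching score f(a, b) from (3).
    a, b: dicts with world-frame X, Y, half-width W, half-height H, height h.
    alpha + beta + gamma must equal 1 (illustrative weights here)."""
    dx = abs(a["X"] - b["X"])
    dy = abs(a["Y"] - b["Y"])
    m_d = ((a["W"] + b["W"]) / (a["W"] + b["W"] + dx)
           * (a["H"] + b["H"]) / (a["H"] + b["H"] + dy))   # position similarity
    m_s = (2 * a["W"] * a["H"] * b["W"] * b["H"]
           / ((a["W"] * a["H"]) ** 2 + (b["W"] * b["H"]) ** 2))   # size similarity
    m_h = 2 * a["h"] * b["h"] / (a["h"] ** 2 + b["h"] ** 2)       # height similarity
    return alpha * m_d + beta * m_s + gamma * m_h

a = {"X": 1.0, "Y": 2.0, "W": 0.4, "H": 0.9, "h": 1.8}
print(round(similarity(a, a), 6))   # 1.0: identical targets give m_d = m_s = m_h = 1
```

Note that each term equals 1 when the corresponding attributes coincide and decays smoothly as they diverge, so f(a, b) stays in [0, 1] for any weights summing to 1.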
An experiment is carried out using video data sets 3 and 4, which come from a practical surveillance system in the DNC of Beihang University, with a video image resolution of 352 × 288 pixels and a frame rate of 25 fps. The experimental results are shown in Figures 9 and 10, in which the rectangle with the dotted blue lines denotes the overlapping area between two cameras. As can be seen, the target tracking algorithm presented in this paper is effective and can track multiple targets across multiple cameras.
4. Fast Camera Calibration and Target Positioning
4.1. Fast Camera Calibration. Camera calibration is a key technology in determining the mapping between 3D world coordinates and 2D image coordinates for various computer vision applications. A schematic diagram describing this mapping is shown in Figure 11: (u_d, v_d) is the image coordinate of (x_w, y_w, z_w) under a perfect pinhole camera model, and (u, v) is the actual image coordinate, which deviates from (u_d, v_d) due to lens distortion. The distance between them is termed the radial distortion. Therefore, the mathematical
Figure 14: The experimental results in indoor environment.
model from 3D world coordinates to 2D image coordinates is expressed by [40]:

\[
\begin{aligned}
u+\delta u &= f_x\,\frac{r_1(x_w-x_c)+r_2(y_w-y_c)+r_3(z_w-z_c)}{r_7(x_w-x_c)+r_8(y_w-y_c)+r_9(z_w-z_c)},\\
v+\delta v &= f_y\,\frac{r_4(x_w-x_c)+r_5(y_w-y_c)+r_6(z_w-z_c)}{r_7(x_w-x_c)+r_8(y_w-y_c)+r_9(z_w-z_c)},
\end{aligned}\tag{4}
\]

where r_i (i = 1, 2, ..., 9) denotes the elements of the rotation matrix from the world coordinate frame to the pixel coordinate frame; f_x and f_y denote the focal lengths of the camera in the x and y directions; δu and δv denote the photogrammetric distortions; O_c(x_c, y_c, z_c) denotes the coordinate of the camera in the world coordinate frame; and P(x_w, y_w, z_w) denotes the coordinate of the target in the world coordinate frame.
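With the distortion terms set to zero, the projection in (4) can be sketched as follows; the rotation, camera position, and focal lengths below are made-up toy values, not calibration results from the paper.

```python
import numpy as np

def project(point_w, cam_pos, R, fx, fy):
    """Map a world-frame point to (undistorted) image coordinates via (4).
    The rows of R supply the r1..r3, r4..r6, and r7..r9 terms."""
    d = R @ (np.asarray(point_w) - np.asarray(cam_pos))
    return fx * d[0] / d[2], fy * d[1] / d[2]

R = np.eye(3)   # toy case: camera axes aligned with the world frame
u, v = project([1.0, 2.0, 10.0], [0.0, 0.0, 0.0], R, fx=800.0, fy=800.0)
print(u, v)     # 80.0 160.0 for this toy configuration
```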
Traditional calibration algorithms, for example, the DLT algorithm [9] and the Tsai algorithm [10], utilize a series of mathematical transformations to obtain the parameters of the camera model. They have been widely used because of their simple mathematical models and theory. However, these algorithms require a large amount of work to record and check calibration points during calibration and are thus inefficient in practical applications. For this reason, a fast calibration algorithm based on the vanishing point theory was introduced in [11, 41], in which the photogrammetric distortions δu and δv are not considered.
According to the mathematical model of the fast camera calibration presented in [11], we developed software to calibrate cameras quickly; the accuracy of the camera parameters can then be promptly checked with (4). The calibration process and results for a practical camera are shown in Figure 12. As can be seen, the focal lengths of the camera are 39983535 pixels, the calibration error of the line segment AP is 398 mm, and
the rotation matrix R and translation vector T (unit: mm) are as follows:

\[
R=\begin{bmatrix}
0.642 & -0.766 & 0.024\\
-0.222 & -0.216 & -0.951\\
0.734 & 0.605 & -0.309
\end{bmatrix},\qquad
T=\begin{bmatrix}
-93497\\
190111\\
62305937
\end{bmatrix}.\tag{5}
\]
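A quick plausibility check that can be run on a calibration result like (5): a valid rotation matrix must be orthonormal with determinant +1. The tolerances are loose because the reported entries are rounded to three decimals.

```python
import numpy as np

# The rotation matrix reported in (5).
R = np.array([[ 0.642, -0.766,  0.024],
              [-0.222, -0.216, -0.951],
              [ 0.734,  0.605, -0.309]])

# R R^T should be the identity and det(R) should be +1 (up to rounding error).
orthonormal = np.allclose(R @ R.T, np.eye(3), atol=5e-3)
print(orthonormal, round(float(np.linalg.det(R)), 3))   # True 1.0
```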
4.2. Target Positioning. Once targets are continuously tracked, the space coordinates of the targets in the camera coordinate frame can be computed through the imaging model with the camera parameters as follows:
\[
\begin{bmatrix} X_c\\ Y_c\\ Z_c \end{bmatrix}
= Z_c
\begin{bmatrix}
\dfrac{d_x}{f_x} & 0 & 0\\[4pt]
0 & \dfrac{d_y}{f_y} & 0\\[4pt]
0 & 0 & 1
\end{bmatrix}
\begin{bmatrix} u\\ v\\ 1 \end{bmatrix}
-
\begin{bmatrix}
\dfrac{u_0 Z_c d_x}{f_x}\\[4pt]
\dfrac{v_0 Z_c d_y}{f_y}\\[4pt]
0
\end{bmatrix},\tag{6}
\]
where u_0 and v_0 denote the principal point in the pixel frame, and d_x and d_y denote the pixel sizes of the camera in the x and y directions.

When the targets move across multiple cameras, the space coordinates of the targets in each camera coordinate frame are computed by (6) and then unified into the world coordinate system, as illustrated in Figure 13. The mathematical model is described as follows:
\[
\begin{bmatrix} X_w\\ Y_w\\ Z_w \end{bmatrix}
= R_i\left(
\begin{bmatrix} X_{ci}\\ Y_{ci}\\ Z_{ci} \end{bmatrix}
- T_i\right),\qquad i=1,2,\ldots,N,\tag{7}
\]

where i denotes the ith camera and N denotes the number of cameras.
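Equations (6) and (7) chain into a short pixel-to-world sketch, with (6) expanded per component as X_c = Z_c d_x (u - u_0)/f_x, the standard pinhole back-projection. All parameter values below (principal point, pixel size, focal length, R_i, T_i, and the depth Z_c) are made up for illustration.

```python
import numpy as np

def pixel_to_camera(u, v, Zc, fx, fy, dx, dy, u0, v0):
    """Equation (6): camera-frame coordinates of the target at known depth Zc."""
    Xc = Zc * dx / fx * (u - u0)
    Yc = Zc * dy / fy * (v - v0)
    return np.array([Xc, Yc, Zc])

def camera_to_world(p_c, R_i, T_i):
    """Equation (7): unify a camera-frame point into the world frame."""
    return R_i @ (p_c - T_i)

# Toy camera i: axes aligned with the world frame, 1 m behind the origin (mm).
R_i = np.eye(3)
T_i = np.array([0.0, 0.0, -1000.0])
p_c = pixel_to_camera(u=400.0, v=300.0, Zc=5000.0,
                      fx=800.0, fy=800.0, dx=0.01, dy=0.01, u0=320.0, v0=240.0)
p_w = camera_to_world(p_c, R_i, T_i)
print(p_w)   # approximately [5.0, 3.75, 6000.0]
```

In the real system the depth comes from the ground-plane assumption discussed in Section 5, and (R_i, T_i) come from the calibration of each camera i.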
5. Experimental Tests
Mathematical Problems in Engineering 9

Figure 15: The experimental results in the outdoor environment. (a) 3D scene of the new main building; (b) Camera 1; (c) Camera 2; (d) Camera 3; (e) Camera 4; (f) Camera 5; (g) Camera 6; (h) Camera 7.

The target positioning system presented in this paper is tested in indoor and outdoor environments, in which all the low-cost static cameras are calibrated and the coordinates of the cameras are unified into the world coordinate system. Target detection and tracking are performed by the corresponding detection and tracking algorithms. As a result, the targets are positioned by the imaging model and camera parameters in real time, and their trajectories are displayed in a three-dimensional scene.
The test results are respectively shown in Figures 14 and 15 for the indoor and outdoor environments. As can be seen from Figure 14, when a target is continuously moving within an indoor corridor, the positioning system consisting of six distributed cameras is able to position this target in real time and display its trajectory in a three-dimensional space model. Likewise, as can be seen from Figure 15, when a target is continuously moving outdoors, the positioning system consisting of seven distributed cameras is able to position this target and display its trajectory in a three-dimensional space model in real time as well. The experimental results confirm that the systematic framework and inclusive algorithms presented in this paper are both effective and efficient.
In this paper, we assume that the ground is flat, which rarely holds in practical applications over a large region. In order to solve this problem, it is necessary to use a digital elevation model (DEM) to describe the topographic relief in large regions.
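As a sketch of how a DEM would enter the pipeline, the ground elevation at a planar position can be bilinearly interpolated from a regular height grid; the grid values and cell size below are made-up illustration values:

```python
# Hypothetical sketch: replacing the flat-ground assumption with a DEM
# lookup. Ground elevation at planar position (x, y) is bilinearly
# interpolated from a regular grid of heights; the 2x2 grid and 10 m cell
# size are illustrative.
def dem_height(x, y, grid, cell=10.0):
    i, j = int(y // cell), int(x // cell)          # grid cell containing (x, y)
    ty, tx = (y - i * cell) / cell, (x - j * cell) / cell
    top = grid[i][j] * (1 - tx) + grid[i][j + 1] * tx
    bottom = grid[i + 1][j] * (1 - tx) + grid[i + 1][j + 1] * tx
    return top * (1 - ty) + bottom * ty

grid = [[0.0, 4.0],     # heights (m) at the four corners of one cell
        [2.0, 6.0]]
print(dem_height(5.0, 5.0, grid))   # centre of the cell → 3.0
```

The interpolated height would then replace the constant ground height when intersecting each camera ray with the terrain.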
6. Conclusion and Future Work
This paper presented the comprehensive design and implementation of a moving target positioning system based on a distributed camera network. The system is composed of low-cost static cameras, which provide complementary positioning information for moving target positioning in indoor and outdoor environments when GNSS signals are unavailable. In this system, static cameras cover a large region; moving targets are detected and then tracked using the corresponding algorithms; target positions are estimated by making use of the geometrical relationships among those cameras after calibrating them; and finally, for each target, the position estimates obtained from different cameras are unified into the world coordinate system. The experimental results of the target detection, tracking, and positioning system were reported based on real video data.
Target positioning and tracking with multiple static cameras were verified in both indoor and outdoor environments. However, the reliability and accuracy of target tracking and positioning suffer from several environmental factors. Hence, it is necessary to fuse information from various sensors, such as radar, infrared cameras, inertial measurement units (IMU), and wireless location systems. Regarding future work, it is meaningful to develop and test these algorithms in practical applications.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This project is supported by the Key Program of the National Natural Science Foundation of China (Grant no. 61039003), the National Natural Science Foundation of China (Grant no. 41274038), the Aeronautical Science Foundation of China (Grant no. 2013ZC51027), the Aerospace Innovation Foundation of China (CASC201102), and the Fundamental Research Funds for the Central Universities.
References
[1] R. Mautz and S. Tilch, "Survey of optical indoor positioning systems," in Proceedings of the International Conference on Indoor Positioning and Indoor Navigation (IPIN '11), Guimaraes, Portugal, September 2011.
[2] G. Retscher, E. Moser, D. Vredeveld, et al., "Performance and accuracy test of a WiFi indoor positioning system," Journal of Applied Geodesy, vol. 1, no. 2, pp. 103–110, 2007.
[3] S. J. Ingram, D. Harmer, and M. Quinlan, "Ultra wide band indoor positioning systems and their use in emergencies," in Proceedings of the Position Location and Navigation Symposium (PLANS '04), pp. 706–715, Monterey, Calif, USA, April 2004.
[4] G. Anastasi, R. Bandelloni, M. Conti, et al., "Experimenting an indoor bluetooth-based positioning service," in Proceedings of the 23rd International Conference on Distributed Computing Systems Workshops, pp. 480–483, Providence, RI, USA, 2003.
[5] H. Liu, H. Darabi, P. Banerjee, and J. Liu, "Survey of wireless indoor positioning techniques and systems," IEEE Transactions on Systems, Man, and Cybernetics C: Applications and Reviews, vol. 37, no. 6, pp. 1067–1080, 2007.
[6] H.-S. Kim and J.-S. Choi, "Advanced indoor localization using ultrasonic sensor and digital compass," in Proceedings of the International Conference on Control, Automation and Systems (ICCAS '08), pp. 223–226, Seoul, Korea, October 2008.
[7] R. Karlsson, T. B. Schon, D. Tornqvist, G. Conte, and F. Gustafsson, "Utilizing model structure for efficient simultaneous localization and mapping for a UAV application," in Proceedings of the IEEE Aerospace Conference (AC '08), Rome, Italy, March 2008.
[8] D. Nister, O. Naroditsky, and J. Bergen, "Visual odometry," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '04), pp. 652–659, Washington, DC, USA, July 2004.
[9] Y. I. Abdel-Aziz and H. M. Karara, "Direct linear transformation into object space coordinates in close-range photogrammetry," in Proceedings of the Symposium on Close-Range Photogrammetry, pp. 1–18, Urbana, Ill, USA, 1971.
[10] R. Y. Tsai, "A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses," IEEE Journal of Robotics and Automation, vol. RA-3, no. 4, pp. 323–344, 1987.
[11] J. Long, X. Zhang, and L. Zhao, "A fast calibration algorithm based on vanishing point for scene camera," Applied Mechanics and Materials, vols. 58–60, pp. 1148–1153, 2011.
[12] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
[13] P. D. Z. Varcheie and G.-A. Bilodeau, "People tracking using a network-based PTZ camera," Machine Vision and Applications, vol. 22, no. 4, pp. 671–690, 2011.
[14] C.-H. Chen, Y. Yao, D. Page, B. Abidi, A. Koschan, and M. Abidi, "Heterogeneous fusion of omnidirectional and PTZ cameras for multiple object tracking," IEEE Transactions on Circuits and Systems for Video Technology, vol. 18, no. 8, pp. 1052–1063, 2008.
[15] X. Clady, F. Collange, F. Jurie, and P. Martinet, "Object tracking with a pan-tilt-zoom camera: application to car driving assistance," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '01), pp. 1653–1658, Seoul, Korea, May 2001.
[16] N. Bellotto, E. Sommerlade, B. Benfold, et al., "A distributed camera system for multi-resolution surveillance," in Proceedings of the 3rd ACM/IEEE International Conference on Distributed Smart Cameras (ICDSC '09), Como, Italy, September 2009.
[17] C. Ding, B. Song, A. Morye, et al., "Collaborative sensing in a distributed PTZ camera network," IEEE Transactions on Image Processing, vol. 21, no. 7, pp. 3282–3295, 2012.
[18] E. B. Ermis, P. Clarot, P.-M. Jodoin, and V. Saligrama, "Activity based matching in distributed camera networks," IEEE Transactions on Image Processing, vol. 19, no. 10, pp. 2595–2613, 2010.
[19] D.-M. Tsai and S.-C. Lai, "Independent component analysis-based background subtraction for indoor surveillance," IEEE Transactions on Image Processing, vol. 18, no. 1, pp. 158–167, 2009.
[20] J. Migdal and W. E. L. Grimson, "Background subtraction using Markov thresholds," in Proceedings of the IEEE Workshop on Motion and Video Computing (MOTION '05), pp. 58–65, Breckenridge, Colo, USA, January 2005.
[21] C. Stauffer and W. E. L. Grimson, "Learning patterns of activity using real-time tracking," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 8, pp. 747–757, 2000.
[22] D.-S. Lee, "Effective Gaussian mixture learning for video background subtraction," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 5, pp. 827–832, 2005.
[23] L. Zhao and X. He, "Adaptive Gaussian mixture learning for moving object detection," in Proceedings of the 3rd IEEE International Conference on Broadband Network and Multimedia Technology (IC-BNMT '10), pp. 1176–1180, Beijing, China, October 2010.
[24] S. Sivaraman and M. M. Trivedi, "Looking at vehicles on the road: a survey of vision-based vehicle detection, tracking, and behavior analysis," IEEE Transactions on Intelligent Transportation Systems, vol. 14, no. 4, pp. 1773–1795, 2013.
[25] G. H. Wen, Z. S. Duan, G. R. Chen, and W. W. Yu, "Consensus tracking of multi-agent systems with Lipschitz-type node dynamics and switching topologies," IEEE Transactions on Circuits and Systems, vol. 60, no. 9, pp. 1–13, 2013.
[26] S. Oh, S. Russell, and S. Sastry, "Markov chain Monte Carlo data association for multi-target tracking," IEEE Transactions on Automatic Control, vol. 54, no. 3, pp. 481–497, 2009.
[27] H. Zhang and L. Zhao, "Integral channel features for particle filter based object tracking," in Proceedings of the 5th International Conference on Intelligent Human-Machine Systems and Cybernetics, pp. 190–193, Hangzhou, China, 2013.
[28] M. Taj and A. Cavallaro, "Distributed and decentralized multi-camera tracking," IEEE Signal Processing Magazine, vol. 28, no. 3, pp. 46–58, 2011.
[29] N. Krahnstoever, T. Yu, S. Lim, et al., "Collaborative real-time control of active cameras in large scale surveillance systems," in Proceedings of the Workshop on Multi-camera and Multi-modal Sensor Fusion Algorithms and Applications, pp. 1–12, Marseille, France, 2008.
[30] C. Braillon, C. Pradalier, J. L. Crowley, and C. Laugier, "Real-time moving obstacle detection using optical flow models," in Proceedings of the IEEE Intelligent Vehicles Symposium (IV '06), pp. 466–471, Tokyo, Japan, June 2006.
[31] C. Kim and J. N. Hwang, "Fast and automatic video object segmentation and tracking for content-based applications," IEEE Transactions on Circuits and Systems for Video Technology, vol. 12, no. 2, pp. 122–129, 2002.
[32] R. Jain and H. H. Nagel, "On the analysis of accumulative difference pictures from image sequences of real world scenes," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 1, no. 2, pp. 206–214, 1978.
[33] O. Barnich and M. Van Droogenbroeck, "ViBE: a powerful random technique to estimate the background in video sequences," in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '09), pp. 945–948, Taipei, Taiwan, April 2009.
[34] A. Elgammal, D. Harwood, and L. Davis, "Non-parametric model for background subtraction," in Proceedings of the 6th European Conference on Computer Vision, Part II, pp. 751–767, London, UK, 2000.
[35] H. W. Mao, C. H. Yang, G. P. Abousleman, and J. Si, "Automated multiple target detection and tracking in UAV videos," in Airborne Intelligence, Surveillance, Reconnaissance (ISR) Systems and Applications VII, vol. 7668 of Proceedings of SPIE, Orlando, Fla, USA, 2010.
[36] H. Medeiros, J. Park, and A. C. Kak, "Distributed object tracking using a cluster-based Kalman filter in wireless camera networks," IEEE Journal on Selected Topics in Signal Processing, vol. 2, no. 4, pp. 448–463, 2008.
[37] A. Andriyenko and K. Schindler, "Multi-target tracking by continuous energy minimization," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR '11), pp. 1265–1272, Colorado Springs, Colo, USA, June 2011.
[38] A. Andriyenko and K. Schindler, "Globally optimal multi-target tracking on a hexagonal lattice," in Proceedings of the 11th European Conference on Computer Vision, pp. 466–479, Crete, Greece, 2010.
[39] J. Y. Bouguet, "Pyramidal implementation of the affine Lucas Kanade feature tracker: description of the algorithm," Intel Corporation, pp. 1–9, 2001.
[40] Y. Wang, G. Hu, and Z. Chen, "Calibration of CCD camera in image matching experimental equipment," in Proceedings of the 2nd International Symposium on Instrumentation Science and Technology, p. 3146, Jinan, China, August 2002.
[41] B. Caprile and V. Torre, "Using vanishing points for camera calibration," International Journal of Computer Vision, vol. 4, no. 2, pp. 127–139, 1990.
Figure 1: The systematic framework of the moving target positioning based on a distributed camera network.

Figure 2: The model of pixel distribution.
strategy, with the result that the algorithm is adaptive and able to deal with multimodal backgrounds in a dynamic environment (e.g., changing time of day, clouds, and swaying tree leaves). However, since its sensitivity cannot be properly tuned, its ability to successfully handle high- and low-frequency changes in the background is debatable. To overcome these shortcomings, samples-based techniques [34] circumvent a part of the parameter estimation step by building their models from observed pixel values, which enhances their robustness to noise. They provide fast responses to high-frequency events in the background by directly including newly observed values in their pixel models. However, since they update their pixel models in a first-in first-out manner, their ability to successfully handle concomitant
Figure 3: The target detection results of video data set 1. (a) Original video; (b) Gaussian mixture model; (c) Random background model.

Figure 4: The target detection results of video data set 2. (a) Original video; (b) Gaussian mixture model; (c) Random background model.

Figure 5: The target detection results including the ghost. (a) Original video; (b) Foreground image with ghost; (c) The target detection result.

Figure 6: The target detection results for frame 80. (a) Original video; (b) Gaussian mixture model; (c) Our method.

Figure 7: The target detection results for frame 105. (a) Original video; (b) Gaussian mixture model; (c) Our method.

Figure 8: The target detection results for frame 145. (a) Original video; (b) Gaussian mixture model; (c) Our method.
events evolving at various speeds is limited, similarly to the limitation in their adaptive ability to deal with concurrent events of different frequencies. In order to address this issue, random background modeling, which is intuitively an improved samples-based algorithm, was introduced in [33]. This algorithm assumes $p_t(x)$ to be the value of the pixel $x$ at time $t$ and imposes the constraint that the influence of a value on the polychromatic space is restricted within the local neighborhood. Then, a set of sample values is used as a pixel model to classify a value $p_t(x)$ as either a background or a foreground pixel value.

An experiment was carried out using video data and compared with the Gaussian mixture model presented by Stauffer and Grimson [21], in which video data set 1 comes from the evaluation data of the Performance Evaluation of Tracking and Surveillance (PETS) database, with a video image resolution of 768 × 576 pixels and a frame rate of 25 frames per second (f/s), and video data set 2 comes from a practical surveillance system in the DNC of Beihang University, with a video image resolution of 352 × 288 pixels and a frame rate of 25 f/s. The experimental results are shown in Figures 3 and 4. As can be seen, since trees were swinging in the wind, such movements were classified as foreground motions by the Gaussian mixture model, while the random background model effectively detected the trees as the background. However, both algorithms do not take into account the shadow of the target and thus are severely damaged in terms of the reliability and robustness of target detection and tracking, as illustrated in Figure 5, in which video data set 3 comes from a practical surveillance system in the DNC of Beihang University, with a video image resolution of 352 × 288 pixels and a frame rate of 25 f/s.
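The per-pixel classification step of the samples-based model [33] described above can be sketched in a few lines; the sample count, matching radius, and match threshold below are illustrative, not the values used in [33]:

```python
import random

# Sketch of the samples-based classification in [33]: a pixel value is
# background if it lies within RADIUS of at least MIN_MATCHES of the N
# stored samples; matched background pixels randomly refresh one sample so
# stale values fade out. N, RADIUS, and MIN_MATCHES are illustrative.
N, RADIUS, MIN_MATCHES = 20, 20, 2

def classify_and_update(samples, value, rng=random):
    matches = sum(1 for s in samples if abs(s - value) < RADIUS)
    if matches >= MIN_MATCHES:
        samples[rng.randrange(len(samples))] = value   # conservative update
        return "background"
    return "foreground"

model = [120 + random.randint(-5, 5) for _ in range(N)]   # samples near grey 120
print(classify_and_update(model, 122), classify_and_update(model, 240))
# → background foreground
```

The random (rather than first-in first-out) replacement is what gives this family of models its smoothly decaying memory.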
In order to remove the damage caused by shadows to target detection and tracking, we propose an algorithm combining the random background model and the frame difference algorithm; the mathematical model is described as follows:
$$\operatorname{Mask}(x, y) = \begin{cases} 1, & \left| \operatorname{Dilate}(D(x, y)) - \operatorname{Erode}(D(x, y)) \right| = 1, \\ 0, & \text{otherwise}, \end{cases} \tag{1}$$
where $D(x, y)$ denotes the mask image of the background differencing, $\operatorname{Dilate}(\ast)$ denotes the dilation operation on the target region block, $\operatorname{Erode}(\ast)$ denotes the erosion operation on the target region, and $\operatorname{Mask}(x, y)$ denotes the mask image of the difference between the dilated and eroded images.

Suppose that the number of pixels with values equal to 1 in $\operatorname{Mask}(x, y)$ is $N_1$, and the number of pixels that are detected as foreground from the differencing image and whose values at $(x, y)$ in the template $\operatorname{Mask}(x, y)$ equal 1 is $N_2$. If $N_2 / N_1 > T$, where $T$ denotes a threshold, then the target region block is a foreground target; otherwise, it is the shadow of the target.
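A minimal sketch of this shadow test on a toy binary mask; the morphological band of (1) is built with a 3×3 structuring element (out-of-bounds neighbours ignored), and the ratio threshold $T$ is illustrative:

```python
# Toy illustration of the shadow test built on (1): the mask is the band
# between a dilated and an eroded copy of the background-difference mask D.
# A region is kept as a true foreground target only if the frame-difference
# response inside that band is strong enough: N2 / N1 > T. T = 0.3 is
# illustrative.

def dilate(m):
    h, w = len(m), len(m[0])
    return [[1 if any(m[i + di][j + dj]
                      for di in (-1, 0, 1) for dj in (-1, 0, 1)
                      if 0 <= i + di < h and 0 <= j + dj < w) else 0
             for j in range(w)] for i in range(h)]

def erode(m):
    h, w = len(m), len(m[0])
    return [[1 if all(m[i + di][j + dj]
                      for di in (-1, 0, 1) for dj in (-1, 0, 1)
                      if 0 <= i + di < h and 0 <= j + dj < w) else 0
             for j in range(w)] for i in range(h)]

def is_foreground(d_mask, frame_diff, threshold=0.3):
    dil, ero = dilate(d_mask), erode(d_mask)
    h, w = len(d_mask), len(d_mask[0])
    mask = [[abs(dil[i][j] - ero[i][j]) for j in range(w)] for i in range(h)]
    n1 = sum(map(sum, mask))                                 # pixels with Mask = 1
    n2 = sum(frame_diff[i][j] for i in range(h) for j in range(w)
             if mask[i][j])                                  # moving pixels in the band
    return n1 > 0 and n2 / n1 > threshold

D = [[0, 0, 0, 0, 0],
     [0, 1, 1, 1, 0],
     [0, 1, 1, 1, 0],
     [0, 1, 1, 1, 0],
     [0, 0, 0, 0, 0]]
moving = [row[:] for row in D]          # frame difference fires on a real target
static = [[0] * 5 for _ in range(5)]    # no frame difference: a cast shadow
print(is_foreground(D, moving), is_foreground(D, static))   # → True False
```

The intuition is that a genuine target produces strong frame-to-frame changes along its boundary band, while a shadow region does not.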
An experiment is carried out using video data set 3 and compared with the GMM. The experimental results are shown in Figures 6, 7, and 8. As can be seen, the algorithm presented in this paper is effective in removing the shadow of the target.
Figure 9: The multitarget tracking results for video data set 4.

Figure 10: The target tracking results across multiple cameras for video data set 3.
Figure 11: The relationship between the world coordinates and pixel coordinates.
Figure 12: The fast camera calibration tool software.

3.2. Target Tracking. Once moving targets are detected, the track initialization event is triggered such that the moving targets can be continuously tracked by the tracking algorithm during the living period of a track (which starts from its initialization and ends at its termination [35]). The termination of a track occurs when a target can no longer be detected: because it leaves the field of view, it stops and becomes static, or it can no longer be distinguished from the background. Detected targets are not confirmed to be true moving targets until they have been consistently tracked for a period of time before their target tracks are initialized. We create a dynamic list of potential tracks from all detected targets. Associations are established between targets detected in a new image frame and potential tracking targets. When a potential target is tracked over several continuous frames, it is recognized as a true moving target and a track is initialized.
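The track-initialization rule described above can be sketched as follows; the confirmation count, gating distance, and nearest-neighbour association are illustrative simplifications, not the paper's full data-association step:

```python
# Sketch of the track-initialization list: a detection only becomes a
# confirmed track after being associated for CONFIRM consecutive frames.
# The nearest-neighbour gate is an illustrative stand-in; CONFIRM and GATE
# are made-up values.
CONFIRM, GATE = 3, 20.0

def dist(p, q):
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def update_tracks(potential, detections):
    """potential: list of [position, hit_count]. Returns (potential, confirmed)."""
    confirmed, kept = [], []
    remaining = list(detections)
    for pos, hits in potential:
        match = min(remaining, key=lambda det: dist(pos, det), default=None)
        if match is not None and dist(pos, match) < GATE:
            remaining.remove(match)
            if hits + 1 >= CONFIRM:
                confirmed.append(match)          # promote to a real track
            else:
                kept.append([match, hits + 1])
        # else: the potential track has no supporting detection and is dropped
    kept.extend([det, 1] for det in remaining)   # unmatched detections start anew
    return kept, confirmed

tracks, confirmed = [], []
for detections in [[(100, 100)], [(102, 101)], [(104, 103)]]:
    tracks, done = update_tracks(tracks, detections)
    confirmed += done
print(len(confirmed))   # the target is confirmed on its third consecutive frame
```

Delaying confirmation this way suppresses one-frame false detections at the cost of a short initialization latency.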
Compared to single target tracking, the multitarget problem poses additional difficulties: data association needs to be solved, that is, it has to be decided which observation
Figure 13: The schematic diagram of the target positioning.
corresponds to which target, and constraints between targets need to be taken into account. Multitarget tracking algorithms can be roughly divided into two categories: recursive algorithms and nonrecursive algorithms. The recursive algorithms, such as Kalman filtering [36] and particle filtering [27], base their estimates only on the state of the previous frame, in which different strategies are used to obtain an optimal solution over multiple frames and can thus better cope with ambiguous multimodal distributions. The nonrecursive algorithms seek optimality over an extended period of time [37, 38].
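For the recursive case, the constant-velocity Kalman model introduced in (2) below is block-structured: each of $x$, $y$, $w$, $h$ evolves as an independent position/velocity pair, so a compact per-component sketch suffices. The noise levels here are illustrative, not tuned values from the paper:

```python
# Minimal 2-state (position, velocity) Kalman filter matching the
# per-component structure of the model in (2). Process noise q and
# measurement noise r are illustrative values.
def kf_step(state, cov, z, dt=1.0, q=1e-2, r=1.0):
    p, v = state
    (p11, p12), (p21, p22) = cov
    # Predict: A = [[1, dt], [0, 1]], P <- A P A^T + Q with Q = diag(q, q).
    p_pred, v_pred = p + dt * v, v
    a11 = p11 + dt * (p12 + p21) + dt * dt * p22 + q
    a12 = p12 + dt * p22
    a21 = p21 + dt * p22
    a22 = p22 + q
    # Update with a position-only measurement: H = [1, 0].
    s = a11 + r                      # innovation variance
    k1, k2 = a11 / s, a21 / s        # Kalman gain
    resid = z - p_pred
    new_state = (p_pred + k1 * resid, v_pred + k2 * resid)
    new_cov = (((1 - k1) * a11, (1 - k1) * a12),
               (a21 - k2 * a11, a22 - k2 * a12))
    return new_state, new_cov

# Track the x-coordinate of a target moving at roughly 2 pixels per frame.
state, cov = (0.0, 0.0), ((10.0, 0.0), (0.0, 10.0))
for z in [2.1, 3.9, 6.0, 8.1, 10.0]:
    state, cov = kf_step(state, cov, z)
# The filter should settle near position 10 and velocity 2.
```

In the full tracker the filtered prediction also gates the data association discussed above.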
In practical applications, target tracking must take target splitting and merging into account because of factors such as illumination changes and occlusion. Since the one-to-one tracking assumption rarely holds, the multitarget tracking problem is still challenging. In order to realize multitarget tracking, we propose a solution combining the pyramid Lucas-Kanade feature tracker [39] and the Kalman filter. The mathematical model of the Kalman filter is described as follows:
$$X(k + 1) = A(k + 1, k)X(k) + W(k),$$
$$Z(k) = H(k)X(k) + V(k), \tag{2}$$

where $X(k) = [x(k)\; y(k)\; w(k)\; h(k)\; v_x(k)\; v_y(k)\; v_w(k)\; v_h(k)]^T$ denotes the state; $A(k+1, k)$ denotes the state transition matrix; $W(k)$ denotes the system noise; $H(k)$ denotes the measurement matrix; $Z(k) = [x(k)\; y(k)\; w(k)\; h(k)]^T$ denotes the measurement; $V(k)$ denotes the measurement noise; $x(k)$ and $y(k)$ denote the horizontal and vertical coordinates of the target centroid; $w(k)$ and $h(k)$ denote the width and height of the target's enveloping rectangle; and $v_x(k)$, $v_y(k)$, $v_w(k)$, and $v_h(k)$ denote the rates of change of $x(k)$, $y(k)$, $w(k)$, and $h(k)$, respectively.
To realize multitarget tracking across multiple cameras, we construct a similarity function for target matching that realizes target association across those cameras. The similarity function is described as follows:
$$f(a, b) = \alpha M_d(a, b) + \beta M_s(a, b) + \gamma M_h(a, b),$$
$$M_d(a, b) = \frac{W_a + W_b}{(W_a + W_b) + D_x} \cdot \frac{H_a + H_b}{(H_a + H_b) + D_y},$$
$$M_s(a, b) = \frac{2 W_a H_a W_b H_b}{(W_a H_a)^2 + (W_b H_b)^2},$$
$$M_h(a, b) = \frac{2 h_a h_b}{h_a^2 + h_b^2}, \tag{3}$$
where $a$ and $b$ denote the targets to be matched; $M_d(a, b) \in [0, 1]$ denotes their position similarity (the larger $M_d(a, b)$ is, the closer their positions are); $M_s(a, b) \in [0, 1]$ denotes the similarity of their sizes (the larger $M_s(a, b)$ is, the closer their sizes are); $M_h(a, b) \in [0, 1]$ denotes the similarity of their heights (the larger $M_h(a, b)$ is, the closer their heights are); $\alpha \in [0, 1]$, $\beta \in [0, 1]$, and $\gamma \in [0, 1]$ denote the weight coefficients and satisfy $\alpha + \beta + \gamma = 1$. $X_a$ and $Y_a$ denote the horizontal and vertical coordinates of the target $a$ in the world coordinate system; $X_b$ and $Y_b$ denote the horizontal and vertical coordinates of the target $b$ in the world coordinate system; $W_a$ and $W_b$ denote half of the widths of the targets $a$ and $b$, respectively; $H_a$ and $H_b$ denote half of their heights, respectively; $D_x = \operatorname{fabs}(X_a - X_b)$ denotes the absolute difference between $X_a$ and $X_b$; $D_y = \operatorname{fabs}(Y_a - Y_b)$ denotes the absolute difference between $Y_a$ and $Y_b$; and $h_a$ and $h_b$ denote their heights in the world coordinate system. $f(a, b) \in [0, 1]$: the larger $f(a, b)$ is, the higher the matching similarity is, and vice versa. In practical applications, $\alpha$, $\beta$, and $\gamma$ can be adjusted according to the accuracy of $M_d(a, b)$, $M_s(a, b)$, and $M_h(a, b)$.
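A direct transcription of (3) as code; the target tuples and default weights below are illustrative values:

```python
# Direct transcription of (3). Targets are (X, Y, W, H, h) tuples: planar
# world position, half-width, half-height, and height; the tuples and
# default weights are illustrative.
def similarity(a, b, alpha=0.4, beta=0.3, gamma=0.3):
    xa, ya, wa, ha, hta = a
    xb, yb, wb, hb, htb = b
    d_x, d_y = abs(xa - xb), abs(ya - yb)
    m_d = (wa + wb) / ((wa + wb) + d_x) * (ha + hb) / ((ha + hb) + d_y)
    m_s = 2 * (wa * ha) * (wb * hb) / ((wa * ha) ** 2 + (wb * hb) ** 2)
    m_h = 2 * hta * htb / (hta ** 2 + htb ** 2)
    return alpha * m_d + beta * m_s + gamma * m_h

t1 = (10.0, 5.0, 0.4, 0.9, 1.8)                    # a target seen by camera 1
print(round(similarity(t1, t1), 6))                 # identical observations → 1.0
print(similarity(t1, (20.0, 5.0, 0.4, 0.9, 1.8)))   # same shape, 10 m away: lower
```

Because $M_s$ and $M_h$ are ratios of products to sums of squares, each factor peaks at 1 exactly when the two observations agree.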
An experiment is carried out using video data sets 3 and 4, which come from a practical surveillance system in the DNC of Beihang University, with a video image resolution of 352 × 288 pixels and a frame rate of 25 f/s. The experimental results are shown in Figures 9 and 10, in which the rectangle with the dotted blue lines denotes the overlapping area between two cameras. As can be seen, the target tracking algorithm presented in this paper is effective and can track multiple targets across multiple cameras.
4. Fast Camera Calibration and Target Positioning

Figure 14: The experimental results in the indoor environment.

4.1. Fast Camera Calibration. Camera calibration is a key technology in determining the mapping between 3D world coordinates and 2D image coordinates for various computer vision applications. A schematic diagram describing this mapping is shown in Figure 11: $(u_d, v_d)$ is the image coordinate of $(x_w, y_w, z_w)$ if a perfect pinhole camera model is used, while $(u, v)$ is the actual image coordinate, which deviates from $(u_d, v_d)$ due to lens distortion. The distance between them is termed the radial distortion. Therefore, the mathematical model from 3D world coordinates to 2D image coordinates is expressed by [40]:
$$u + \delta u = f_x \, \frac{r_1 (x_w - x_c) + r_2 (y_w - y_c) + r_3 (z_w - z_c)}{r_7 (x_w - x_c) + r_8 (y_w - y_c) + r_9 (z_w - z_c)},$$
$$v + \delta v = f_y \, \frac{r_4 (x_w - x_c) + r_5 (y_w - y_c) + r_6 (z_w - z_c)}{r_7 (x_w - x_c) + r_8 (y_w - y_c) + r_9 (z_w - z_c)}, \tag{4}$$
where $r_i$ $(i = 1, 2, \ldots, 9)$ denotes the elements of the rotation matrix from the world coordinate frame to the camera coordinate frame; $f_x$ and $f_y$ denote the focal lengths of the camera in the $x$ and $y$ directions; $\delta u$ and $\delta v$ denote the photogrammetric distortions; $O_c(x_c, y_c, z_c)$ denotes the coordinate of the camera in the world coordinate frame; and $P(x_w, y_w, z_w)$ denotes the coordinate of the target in the world coordinate frame.
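A sketch of the forward projection (4) with the distortions $\delta u$, $\delta v$ set to zero (as in the fast calibration of [11]); the camera pose and focal lengths are toy values, and the principal point is taken at the pixel origin for brevity:

```python
# Sketch of the forward projection (4) with du = dv = 0. The camera pose
# (R = identity, centred at the world origin) and focal lengths are toy
# values, not calibrated parameters.
def project(pw, cam_center, r, fx, fy):
    d = [pw[k] - cam_center[k] for k in range(3)]
    num_u = sum(r[0][k] * d[k] for k in range(3))   # r1..r3 row
    num_v = sum(r[1][k] * d[k] for k in range(3))   # r4..r6 row
    den = sum(r[2][k] * d[k] for k in range(3))     # r7..r9 row
    return fx * num_u / den, fy * num_v / den       # (u + du, v + dv), du = dv = 0

I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
u, v = project([1.0, 2.0, 10.0], [0.0, 0.0, 0.0], I3, 1000.0, 1000.0)
print(u, v)   # → 100.0 200.0
```

Checking calibrated parameters amounts to projecting known world points this way and comparing against their measured pixel coordinates.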
Traditional calibration algorithms for example DLTalgorithm [9] and Tsai algorithm [10] utilize a series ofmathematical transformations and algorithms to obtainparameters of the camera model They have been widelyused because of their simple mathematical model and theoryHowever these algorithms require a large amount of work torecord and check calibration points during calibration andthus are inefficient in practical applications For this reason afast calibration algorithmbased on the vanishing point theorywas introduced in [11 41] in which the photogrammetricdistortions 120575119906 and 120575V are unconsidered
According to the mathematical model of the fast cameracalibration presented in [11] we develop software to calibratecameras fast and can promptly check the accuracy of cameraparameters by (4) The calibration process and results of apractical camera are shown in Figure 12 As can be seenthe focal lengths of the camera are 39983535 pixels thecalibration error of the line segment 119860119875 is 398mm and
the rotation matrix 119877 and translation vector 119879 (unit mm) areas the following
119877 = [
[
0642 minus0766 0024
minus0222 minus0216 minus0951
0734 0605 minus0309
]
]
119879 = [
[
minus93497
190111
62305937
]
]
(5)
42 Target Positioning Once targets are continuouslytracked the space coordinates of the targets in the cameracoordinate frame can be computed by imaging model withcamera parameters as follows
[
[
119883119888
119884119888
119885119888
]
]
= 119885119888
[[[[[
[
119889119909
119891119909
0 0
0
119889119910
119891119910
0
0 0 1
]]]]]
]
[
[
119906
V1
]
]
+
[[[[[[
[
1199060119885119888119889119909
119891119909
V0119885119888119889119910
119891119910
0
]]]]]]
]
(6)
where 1199060and V0denote the principle point in the pixel frame
and 119889119909 and 119889119910 denote the pixel sizes of the camera in the 119909
and 119910 directionsWhen the targets are across multiple cameras the space
coordinates of the targets in each camera coordinate frameare computed respectively by (6) and then are unified intothe world coordinated system as illustrated in Figure 13 Themathematical model is described as follows
[
[
119883119908
119884119908
119885119908
]
]
= 119877119894([
[
119883119888119894
119884119888119894
119885119888119894
]
]
minus 119879119894) 119894 = 1 2 119873 (7)
where 119894 denotes the 119894th camera and 119873 denotes the number ofcameras
5 Experiment Test
The target positioning system presented in this paper is testedin indoor and outdoor environments in which all the low-cost static cameras are calibrated and the coordinates of
Mathematical Problems in Engineering 9
(a) 3D scene of new main building (b) Camera 1
(c) Camera 2 (d) Camera 3
(e) Camera 4 (f) Camera 5
(g) Camera 6 (h) Camera 7
Figure 15 The experimental results in outdoor environment
cameras are unified into the world coordinate system Targetsdetection and tracking are done by the target detection andtracking algorithm As a result the targets are positionedby imaging model and camera parameters in real time andtheir trajectories are displayed in a three-dimensional scene
The test results are respectively shown in Figures 14 and15 for indoor and outdoor environments As can be seenfrom Figure 14 when a target is continuously moving withinan indoor corridor the positioning system consisting of sixdistributed cameras is able to position this target in real
10 Mathematical Problems in Engineering
time and display its trajectory in a three-dimensional spacemodel Likewise as can be seen from Figure 15 when a targetis continuously moving outdoors the positioning systemconsisting of seven distributed cameras is able to position thistarget and display its trajectory in a three-dimensional spacemodel in real time as well The experimental results confirmthat the systematic framework and inclusive algorithmspresented in this paper are both effective and efficient
In this paper, we assume that the ground is flat, which rarely holds in practical applications over a large region. To address this problem, it is necessary to use a digital elevation model (DEM) to describe the topographic relief of large regions.
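As a hint of what a DEM-based correction would involve, the height lookup such an extension needs can be sketched as a bilinear interpolation over a grid DEM. This is an illustrative sketch only; the grid values and cell spacing below are invented sample data, not from the paper.

```python
# Bilinear interpolation of terrain height from a grid DEM: the lookup
# a DEM-based positioning step would use in place of the flat-ground
# assumption. Grid heights and spacing here are invented sample data.

def dem_height(dem, x, y, cell=1.0):
    """Interpolate the height at world position (x, y); dem[row][col]
    holds the height at world point (col * cell, row * cell)."""
    cx, cy = x / cell, y / cell
    i, j = int(cy), int(cx)          # indices of the enclosing cell
    ty, tx = cy - i, cx - j          # fractional position inside the cell
    h00, h01 = dem[i][j], dem[i][j + 1]
    h10, h11 = dem[i + 1][j], dem[i + 1][j + 1]
    top = h00 * (1 - tx) + h01 * tx  # interpolate along x on both rows,
    bot = h10 * (1 - tx) + h11 * tx  # then along y between the rows
    return top * (1 - ty) + bot * ty

dem = [[0.0, 0.0, 0.0],
       [0.0, 4.0, 0.0],
       [0.0, 0.0, 0.0]]
print(dem_height(dem, 1.0, 1.0))  # exactly on the 4 m node -> 4.0
print(dem_height(dem, 1.5, 1.0))  # halfway to the next node -> 2.0
```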
6. Conclusion and Future Work
This paper presented the comprehensive design and implementation of a moving target positioning system based on a distributed camera network. The system is composed of low-cost static cameras, which provide complementary positioning information for moving target positioning in indoor and outdoor environments when GNSS signals are unavailable. In this system, static cameras cover a large region; moving targets are detected and then tracked using the corresponding algorithms; target positions are estimated by exploiting the geometrical relationships among the cameras after calibrating them; and finally, for each target, the position estimates obtained from different cameras are unified into the world coordinate system. The experimental results of the target detection, tracking, and positioning system were reported based on real video data.
Target positioning and tracking with multiple static cameras were verified in both indoor and outdoor environments. However, the reliability and accuracy of target tracking and positioning suffer from several environmental factors. Hence, it is necessary to fuse information from various sensors, such as radar, infrared cameras, inertial measurement units (IMUs), and wireless location systems. Regarding future work, it is meaningful to develop and test these algorithms in practical applications.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This project is supported by the Key Program of the National Natural Science Foundation of China (Grant no. 61039003), the National Natural Science Foundation of China (Grant no. 41274038), the Aeronautical Science Foundation of China (Grant no. 2013ZC51027), the Aerospace Innovation Foundation of China (CASC201102), and the Fundamental Research Funds for the Central Universities.
References
[1] R. Mautz and S. Tilch, "Survey of optical indoor positioning systems," in Proceedings of the International Conference on Indoor Positioning and Indoor Navigation (IPIN '11), Guimaraes, Portugal, September 2011.
[2] G. Retscher, E. Moser, D. Vredeveld, et al., "Performance and accuracy test of a WiFi indoor positioning system," Journal of Applied Geodesy, vol. 1, no. 2, pp. 103–110, 2007.
[3] S. J. Ingram, D. Harmer, and M. Quinlan, "Ultra wide band indoor positioning systems and their use in emergencies," in Proceedings of the Position Location and Navigation Symposium (PLANS '04), pp. 706–715, Monterey, Calif, USA, April 2004.
[4] G. Anastasi, R. Bandelloni, M. Conti, et al., "Experimenting an indoor bluetooth-based positioning service," in Proceedings of the 23rd International Conference on Distributed Computing Systems Workshops, pp. 480–483, Providence, RI, USA, 2003.
[5] H. Liu, H. Darabi, P. Banerjee, and J. Liu, "Survey of wireless indoor positioning techniques and systems," IEEE Transactions on Systems, Man, and Cybernetics C: Applications and Reviews, vol. 37, no. 6, pp. 1067–1080, 2007.
[6] H.-S. Kim and J.-S. Choi, "Advanced indoor localization using ultrasonic sensor and digital compass," in Proceedings of the International Conference on Control, Automation and Systems (ICCAS '08), pp. 223–226, Seoul, Korea, October 2008.
[7] R. Karlsson, T. B. Schon, D. Tornqvist, G. Conte, and F. Gustafsson, "Utilizing model structure for efficient simultaneous localization and mapping for a UAV application," in Proceedings of the IEEE Aerospace Conference (AC '08), Rome, Italy, March 2008.
[8] D. Nister, O. Naroditsky, and J. Bergen, "Visual odometry," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '04), pp. 652–659, Washington, DC, USA, July 2004.
[9] Y. I. Abdel-Aziz and H. M. Karara, "Direct linear transformation into object space coordinates in close-range photogrammetry," in Proceedings of the Symposium on Close-Range Photogrammetry, pp. 1–18, Urbana, Ill, USA, 1971.
[10] R. Y. Tsai, "A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses," IEEE Journal of Robotics and Automation, vol. RA-3, no. 4, pp. 323–344, 1987.
[11] J. Long, X. Zhang, and L. Zhao, "A fast calibration algorithm based on vanishing point for scene camera," Applied Mechanics and Materials, vol. 58–60, pp. 1148–1153, 2011.
[12] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
[13] P. D. Z. Varcheie and G.-A. Bilodeau, "People tracking using a network-based PTZ camera," Machine Vision and Applications, vol. 22, no. 4, pp. 671–690, 2011.
[14] C.-H. Chen, Y. Yao, D. Page, B. Abidi, A. Koschan, and M. Abidi, "Heterogeneous fusion of omnidirectional and PTZ cameras for multiple object tracking," IEEE Transactions on Circuits and Systems for Video Technology, vol. 18, no. 8, pp. 1052–1063, 2008.
[15] X. Clady, F. Collange, F. Jurie, and P. Martinet, "Object tracking with a pan-tilt-zoom camera: application to car driving assistance," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '01), pp. 1653–1658, Seoul, Korea, May 2001.
[16] N. Bellotto, E. Sommerlade, B. Benfold, et al., "A distributed camera system for multi-resolution surveillance," in Proceedings of the 3rd ACM/IEEE International Conference on Distributed Smart Cameras (ICDSC '09), Como, Italy, September 2009.
[17] C. Ding, B. Song, A. Morye, et al., "Collaborative sensing in a distributed PTZ camera network," IEEE Transactions on Image Processing, vol. 21, no. 7, pp. 3282–3295, 2012.
[18] E. B. Ermis, P. Clarot, P.-M. Jodoin, and V. Saligrama, "Activity based matching in distributed camera networks," IEEE Transactions on Image Processing, vol. 19, no. 10, pp. 2595–2613, 2010.
[19] D.-M. Tsai and S.-C. Lai, "Independent component analysis-based background subtraction for indoor surveillance," IEEE Transactions on Image Processing, vol. 18, no. 1, pp. 158–167, 2009.
[20] J. Migdal and W. E. L. Grimson, "Background subtraction using Markov thresholds," in Proceedings of the IEEE Workshop on Motion and Video Computing (MOTION '05), pp. 58–65, Breckenridge, Colo, USA, January 2005.
[21] C. Stauffer and W. E. L. Grimson, "Learning patterns of activity using real-time tracking," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 8, pp. 747–757, 2000.
[22] D.-S. Lee, "Effective Gaussian mixture learning for video background subtraction," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 5, pp. 827–832, 2005.
[23] L. Zhao and X. He, "Adaptive Gaussian mixture learning for moving object detection," in Proceedings of the 3rd IEEE International Conference on Broadband Network and Multimedia Technology (IC-BNMT '10), pp. 1176–1180, Beijing, China, October 2010.
[24] S. Sivaraman and M. M. Trivedi, "Looking at vehicles on the road: a survey of vision-based vehicle detection, tracking, and behavior analysis," IEEE Transactions on Intelligent Transportation Systems, vol. 14, no. 4, pp. 1773–1795, 2013.
[25] G. H. Wen, Z. S. Duan, G. R. Chen, and W. W. Yu, "Consensus tracking of multi-agent systems with Lipschitz-type node dynamics and switching topologies," IEEE Transactions on Circuits and Systems, vol. 60, no. 9, pp. 1–13, 2013.
[26] S. Oh, S. Russell, and S. Sastry, "Markov chain Monte Carlo data association for multi-target tracking," IEEE Transactions on Automatic Control, vol. 54, no. 3, pp. 481–497, 2009.
[27] H. Zhang and L. Zhao, "Integral channel features for particle filter based object tracking," in Proceedings of the 5th International Conference on Intelligent Human-Machine Systems and Cybernetics, pp. 190–193, Hangzhou, China, 2013.
[28] M. Taj and A. Cavallaro, "Distributed and decentralized multi-camera tracking," IEEE Signal Processing Magazine, vol. 28, no. 3, pp. 46–58, 2011.
[29] N. Krahnstoever, T. Yu, S. Lim, et al., "Collaborative real-time control of active cameras in large scale surveillance systems," in Proceedings of the Workshop on Multi-camera and Multi-modal Sensor Fusion Algorithms and Applications, pp. 1–12, Marseille, France, 2008.
[30] C. Braillon, C. Pradalier, J. L. Crowley, and C. Laugier, "Real-time moving obstacle detection using optical flow models," in Proceedings of the IEEE Intelligent Vehicles Symposium (IV '06), pp. 466–471, Tokyo, Japan, June 2006.
[31] C. Kim and J. N. Hwang, "Fast and automatic video object segmentation and tracking for content-based applications," IEEE Transactions on Circuits and Systems for Video Technology, vol. 12, no. 2, pp. 122–129, 2002.
[32] R. Jain and H. H. Nagel, "On the analysis of accumulative difference pictures from image sequences of real world scenes," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 1, no. 2, pp. 206–214, 1979.
[33] O. Barnich and M. Van Droogenbroeck, "ViBe: a powerful random technique to estimate the background in video sequences," in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '09), pp. 945–948, Taipei, Taiwan, April 2009.
[34] A. Elgammal, D. Harwood, and L. Davis, "Non-parametric model for background subtraction," in Proceedings of the 6th European Conference on Computer Vision, Part II, pp. 751–767, London, UK, 2000.
[35] H. W. Mao, C. H. Yang, G. P. Abousleman, and J. Si, "Automated multiple target detection and tracking in UAV videos," in Airborne Intelligence, Surveillance, Reconnaissance (ISR) Systems and Applications VII, vol. 7668 of Proceedings of SPIE, Orlando, Fla, USA, 2010.
[36] H. Medeiros, J. Park, and A. C. Kak, "Distributed object tracking using a cluster-based Kalman filter in wireless camera networks," IEEE Journal of Selected Topics in Signal Processing, vol. 2, no. 4, pp. 448–463, 2008.
[37] A. Andriyenko and K. Schindler, "Multi-target tracking by continuous energy minimization," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR '11), pp. 1265–1272, Colorado Springs, Colo, USA, June 2011.
[38] A. Andriyenko and K. Schindler, "Globally optimal multi-target tracking on a hexagonal lattice," in Proceedings of the 11th European Conference on Computer Vision, pp. 466–479, Crete, Greece, 2010.
[39] J. Y. Bouguet, "Pyramidal implementation of the affine Lucas Kanade feature tracker: description of the algorithm," Intel Corporation, pp. 1–9, 2001.
[40] Y. Wang, G. Hu, and Z. Chen, "Calibration of CCD camera in image matching experimental equipment," in Proceedings of the 2nd International Symposium on Instrumentation Science and Technology, p. 3146, Jinan, China, August 2002.
[41] B. Caprile and V. Torre, "Using vanishing points for camera calibration," International Journal of Computer Vision, vol. 4, no. 2, pp. 127–139, 1990.
Figure 3: The target detection results for video data set 1: (a) original video; (b) Gaussian mixture model; (c) random background model.

Figure 4: The target detection results for video data set 2: (a) original video; (b) Gaussian mixture model; (c) random background model.

Figure 5: The target detection results including the ghost: (a) original video; (b) foreground image with ghost; (c) the target detection result.

Figure 6: The target detection results for frame 80: (a) original video; (b) Gaussian mixture model; (c) our method.
Figure 7: The target detection results for frame 105: (a) original video; (b) Gaussian mixture model; (c) our method.

Figure 8: The target detection results for frame 145: (a) original video; (b) Gaussian mixture model; (c) our method.
events evolving at various speeds is limited, as is its ability to adapt to concurrent events occurring at different frequencies. To address this issue, a random background modeling algorithm, intuitively an improved sample-based algorithm, was proposed in [33]. This algorithm takes $p_t(x)$ to be the value of pixel $x$ at time $t$ and imposes the constraint that the influence of a value on the polychromatic space is restricted to its local neighborhood. A set of sample values is then used as a per-pixel model to classify each value $p_t(x)$ as either a background or a foreground pixel value.

An experiment was carried out using video data and compared with the Gaussian mixture model presented by Stauffer and Grimson [21]. Video data set 1 comes from the evaluation data of the Performance Evaluation of Tracking and Surveillance (PETS) database, with a video image resolution of 768 × 576 pixels and a frame rate of 25 frames per second (fps); video data set 2 comes from a practical surveillance system in the DNC of Beihang University, with a video image resolution of 352 × 288 pixels and a frame rate of 25 fps. The experimental results are shown in Figures 3 and 4. As can be seen, since trees were swaying in the wind, such movements were classified as foreground motion by the Gaussian mixture model, while the random background model effectively detected the trees as background. However, neither algorithm takes the shadow of the target into account, so the reliability and robustness of target detection and tracking are severely degraded, as illustrated in Figure 5, in which video data set 3 comes from a practical surveillance system in the DNC of Beihang University, with a video image resolution of 352 × 288 pixels and a frame rate of 25 fps.
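The per-pixel classification rule of the random background model [33] can be sketched as follows. This is a minimal illustration rather than the authors' implementation; the sample count, matching radius, and match threshold are assumed values.

```python
import random

# Per-pixel background model: a set of past sample values, as in [33].
N_SAMPLES = 20      # samples kept per pixel (assumed value)
RADIUS = 20         # a value matches a sample if |v - s| < RADIUS
MIN_MATCHES = 2     # matches required for a background decision

def classify(pixel_value, samples):
    """Return True if pixel_value is background under the sample model."""
    matches = sum(1 for s in samples if abs(pixel_value - s) < RADIUS)
    return matches >= MIN_MATCHES

def update(pixel_value, samples):
    """Randomly replace one stored sample, so old values decay smoothly."""
    samples[random.randrange(len(samples))] = pixel_value

# A pixel that has hovered around intensity 100 (e.g., swaying foliage):
samples = [100 + random.randint(-10, 10) for _ in range(N_SAMPLES)]
print(classify(103, samples))  # close to many stored samples -> True
print(classify(200, samples))  # far from every stored sample -> False
```

The random replacement in `update` is what gives the model its smooth temporal decay: no fixed-length sliding window, so old background values survive with geometrically decreasing probability.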
In order to remove the damaging effect of shadows on target detection and tracking, we propose an algorithm that combines the random background model and the frame difference algorithm. The mathematical model is described as follows:

$$\mathrm{Mask}(x, y) = \begin{cases} 1, & \left|\mathrm{Dilate}(D(x, y)) - \mathrm{Erode}(D(x, y))\right| = 1, \\ 0, & \text{otherwise}, \end{cases} \tag{1}$$

where $D(x, y)$ denotes the mask image of the background differencing, $\mathrm{Dilate}(\cdot)$ denotes the dilation operation on the target region block, $\mathrm{Erode}(\cdot)$ denotes the erosion operation on the target region, and $\mathrm{Mask}(x, y)$ denotes the mask image of the difference between the dilated and eroded images.

Suppose that the number of pixels whose values equal 1 in $\mathrm{Mask}(x, y)$ is $N_1$, and the number of pixels that are detected as foreground in the difference image and whose values at $(x, y)$ in the template $\mathrm{Mask}(x, y)$ equal 1 is $N_2$. If $N_1 / N_2 > T$, where $T$ denotes a threshold, then the target region block is a foreground target; otherwise, it is the shadow of the target.
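A toy rendering of the test built on (1): form the boundary mask from the dilation and erosion of the background-difference mask $D$, then apply the ratio test. The 3×3 structuring element and the threshold value are assumptions for illustration, not taken from the paper.

```python
# Sketch of the shadow test of (1) on small binary grids (lists of lists).

def dilate(img):
    """3x3 binary dilation: a pixel becomes 1 if any neighbour is 1."""
    h, w = len(img), len(img[0])
    return [[max(img[yy][xx]
                 for yy in range(max(0, y - 1), min(h, y + 2))
                 for xx in range(max(0, x - 1), min(w, x + 2)))
             for x in range(w)] for y in range(h)]

def erode(img):
    """3x3 binary erosion: a pixel stays 1 only if all neighbours are 1."""
    h, w = len(img), len(img[0])
    return [[min(img[yy][xx]
                 for yy in range(max(0, y - 1), min(h, y + 2))
                 for xx in range(max(0, x - 1), min(w, x + 2)))
             for x in range(w)] for y in range(h)]

def boundary_mask(D):
    """Mask(x, y) of (1): 1 where dilation and erosion of D differ."""
    Dd, De = dilate(D), erode(D)
    return [[1 if abs(Dd[y][x] - De[y][x]) == 1 else 0
             for x in range(len(D[0]))] for y in range(len(D))]

def is_foreground(D, frame_diff, T=0.5):
    """Apply the N1/N2 ratio test to one region block (T is assumed)."""
    mask = boundary_mask(D)
    n1 = sum(map(sum, mask))
    n2 = sum(mask[y][x] * frame_diff[y][x]
             for y in range(len(D)) for x in range(len(D[0])))
    return n2 > 0 and n1 / n2 > T

D = [[0] * 5 for _ in range(5)]
for yy in range(1, 4):
    for xx in range(1, 4):
        D[yy][xx] = 1                       # one 3x3 foreground blob
moved = [[1] * 5 for _ in range(5)]         # frame difference fires broadly
print(is_foreground(D, moved))              # boundary backed by motion -> True
```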
An experiment is carried out using video data set 3 and compared with the GMM. The experimental results are shown in Figures 6, 7, and 8. As can be seen, the algorithm presented in this paper is effective in removing the shadow of the target.
Figure 9: The multitarget tracking results for video data set 4.

Figure 10: The target tracking results across multiple cameras for video data set 3.
Figure 11: The relationship between the world coordinates and the pixel coordinates.
3.2. Target Tracking. Once moving targets are detected, the track initialization event is triggered so that the moving targets can be continuously tracked by the tracking algorithm during the living period of a track (which extends from its initialization to its termination [35]). The termination of a track occurs when a target can no longer be detected: it leaves the field of view, it stops and becomes static, or it can no longer be distinguished from the background. Detected targets are not confirmed to be true moving targets until they have been consistently tracked for a period of time before their target tracks are initialized. We create a dynamic list of potential tracks from all detected targets. Associations are established between targets detected in a new image frame and the potential tracking targets. When a potential target is tracked over several consecutive frames, it is recognized as a true moving target and a track is initialized.

Figure 12: The fast camera calibration tool software.
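The confirmation logic above can be sketched as follows. The gating distance, the frame-count threshold, and the greedy nearest-neighbour association are illustrative assumptions, not the paper's tuned implementation.

```python
# Toy sketch of track initialization: a detection must be re-associated
# for CONFIRM_FRAMES consecutive frames before its track is confirmed.

CONFIRM_FRAMES = 5   # hits needed before confirmation (assumed value)
GATE = 30.0          # max centroid distance in pixels for association

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

class PotentialTrack:
    def __init__(self, centroid):
        self.centroid = centroid
        self.hits = 1
        self.confirmed = False

def associate(tracks, detections):
    """Greedy nearest-neighbour association of detections to tracks;
    unmatched detections spawn new potential tracks."""
    unmatched = []
    for det in detections:
        best = min(tracks, key=lambda t: dist(t.centroid, det), default=None)
        if best is not None and dist(best.centroid, det) < GATE:
            best.centroid = det
            best.hits += 1
            best.confirmed = best.confirmed or best.hits >= CONFIRM_FRAMES
        else:
            unmatched.append(det)
    tracks.extend(PotentialTrack(d) for d in unmatched)

tracks = []
for frame in range(6):                  # one target seen in 6 frames
    associate(tracks, [(100 + 2 * frame, 50)])
print(tracks[0].confirmed)              # -> True after >= 5 hits
```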
Compared to single target tracking, the multitarget problem poses additional difficulties: data association needs to be solved, that is, it has to be decided which observation corresponds to which target, and constraints between targets need to be taken into account.

Figure 13: The schematic diagram of the target positioning.

Multitarget tracking algorithms can be roughly divided into two categories: recursive algorithms and nonrecursive algorithms. The recursive algorithms base their estimate only on the state of the previous frame, for example, Kalman filtering [36] and particle filtering [27]. The nonrecursive algorithms seek optimality over an extended period of time [37, 38], in which different strategies are used to obtain an optimal solution over multiple frames; they can thus better cope with ambiguous multimodal distributions.
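To make the recursive category concrete, here is a minimal one-dimensional constant-velocity Kalman predict/update cycle. It is an illustrative sketch with assumed noise magnitudes `q` and `r`, not the tracker used in the paper (whose full eight-dimensional state appears in (2)).

```python
# Minimal 1D constant-velocity Kalman filter: each estimate depends only
# on the previous state, the defining property of recursive trackers.
# Noise magnitudes are assumed, not taken from the paper.

def kalman_step(x, v, P, z, dt=1.0, q=1e-2, r=1.0):
    """One predict/update cycle for the state [position x, velocity v].
    P is the 2x2 covariance [[p00, p01], [p10, p11]]; z is the measured
    position. Returns the updated (x, v, P)."""
    # Predict: x' = x + v*dt (constant velocity); covariance grows by q.
    x = x + v * dt
    p00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q
    p01 = P[0][1] + dt * P[1][1]
    p10 = P[1][0] + dt * P[1][1]
    p11 = P[1][1] + q
    # Update with a measurement of the position only (H = [1, 0]).
    s = p00 + r                        # innovation covariance
    k0, k1 = p00 / s, p10 / s          # Kalman gain
    y = z - x                          # innovation
    x, v = x + k0 * y, v + k1 * y
    P = [[(1 - k0) * p00, (1 - k0) * p01],
         [p10 - k1 * p00, p11 - k1 * p01]]
    return x, v, P

x, v, P = 0.0, 0.0, [[1.0, 0.0], [0.0, 1.0]]
for z in [1.0, 2.1, 2.9, 4.2, 5.0]:    # noisy unit-velocity motion
    x, v, P = kalman_step(x, v, P, z)
# After a few steps the estimate tracks the unit-velocity motion.
```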
In practical applications, target tracking must take target splitting and merging into account because of factors such as illumination changes and occlusion. Since the one-to-one tracking assumption rarely holds, the multitarget tracking problem is still challenging. In order to realize multitarget tracking, we propose a solution combining the pyramid Lucas-Kanade feature tracker [39] and the Kalman filter. The mathematical model of Kalman filtering is described as follows:

$$X(k+1) = A(k+1, k)\, X(k) + W(k),$$
$$Z(k) = H(k)\, X(k) + V(k), \tag{2}$$

where $X(k) = [x(k), y(k), w(k), h(k), v_x(k), v_y(k), v_w(k), v_h(k)]^T$ denotes the state, $A(k+1, k)$ denotes the state transition matrix, $W(k)$ denotes the system noise, $H(k)$ denotes the measurement matrix, $Z(k) = [x(k), y(k), w(k), h(k)]^T$ denotes the measurement, $V(k)$ denotes the measurement noise, $x(k)$ and $y(k)$ denote the horizontal and vertical coordinates of the target centroid, $w(k)$ and $h(k)$ denote the width and height of the target's enclosing rectangle, and $v_x(k)$, $v_y(k)$, $v_w(k)$, and $v_h(k)$ denote the rates of change of $x(k)$, $y(k)$, $w(k)$, and $h(k)$, respectively.

To realize multitarget tracking across multiple cameras, we construct a similarity function for target matching that realizes target association and target tracking across the cameras. The similarity function is described as follows:
$$f(a, b) = \alpha M_d(a, b) + \beta M_s(a, b) + \gamma M_h(a, b),$$
$$M_d(a, b) = \frac{W_a + W_b}{(W_a + W_b) + D_x} \cdot \frac{H_a + H_b}{(H_a + H_b) + D_y},$$
$$M_s(a, b) = \frac{2\, W_a H_a\, W_b H_b}{(W_a H_a)^2 + (W_b H_b)^2},$$
$$M_h(a, b) = \frac{2\, h_a h_b}{h_a^2 + h_b^2}, \tag{3}$$
where $a$ and $b$ denote the targets to be matched; $M_d(a, b) \in [0, 1]$ denotes their position similarity (the larger $M_d(a, b)$ is, the closer their positions are); $M_s(a, b) \in [0, 1]$ denotes the similarity of their sizes (the larger $M_s(a, b)$ is, the closer their sizes are); $M_h(a, b) \in [0, 1]$ denotes the similarity of their heights (the larger $M_h(a, b)$ is, the closer their heights are); $\alpha, \beta, \gamma \in [0, 1]$ denote the weight coefficients and satisfy $\alpha + \beta + \gamma = 1$; $X_a$ and $Y_a$ denote the horizontal and vertical coordinates of target $a$ in the world coordinate system, and $X_b$ and $Y_b$ those of target $b$; $W_a$ and $W_b$ denote half the widths of targets $a$ and $b$, respectively, and $H_a$ and $H_b$ half their heights; $D_x = |X_a - X_b|$ and $D_y = |Y_a - Y_b|$ denote the absolute coordinate differences; $h_a$ and $h_b$ denote their heights in the world coordinate system; and $f(a, b) \in [0, 1]$, where the larger $f(a, b)$ is, the higher the matching similarity is, and vice versa. In practical applications, $\alpha$, $\beta$, and $\gamma$ can be adjusted according to the accuracies of $M_d(a, b)$, $M_s(a, b)$, and $M_h(a, b)$.

An experiment is carried out using video data sets 3 and 4, which come from a practical surveillance system in the DNC of Beihang University, with a video image resolution of 352 × 288 pixels and a frame rate of 25 fps. The experimental results are shown in Figures 9 and 10, in which the rectangle with the dotted blue lines denotes the overlapping area between two cameras. As can be seen, the target tracking algorithm presented in this paper is effective and can track multiple targets across multiple cameras.
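As a concrete reading of (3), the following sketch computes $f(a, b)$ for two targets. The weight values and the dictionary layout are assumptions for illustration, not the paper's tuned settings.

```python
# Sketch of the matching score in (3); weights are assumed values.

def similarity(a, b, alpha=0.4, beta=0.3, gamma=0.3):
    """f(a, b) of (3). a and b are dicts with world-frame fields:
    X, Y (centroid), W, H (half width / half height), h (height)."""
    Dx, Dy = abs(a["X"] - b["X"]), abs(a["Y"] - b["Y"])
    Md = ((a["W"] + b["W"]) / (a["W"] + b["W"] + Dx)
          * (a["H"] + b["H"]) / (a["H"] + b["H"] + Dy))     # position term
    sa, sb = a["W"] * a["H"], b["W"] * b["H"]
    Ms = 2 * sa * sb / (sa ** 2 + sb ** 2)                  # size term
    Mh = 2 * a["h"] * b["h"] / (a["h"] ** 2 + b["h"] ** 2)  # height term
    return alpha * Md + beta * Ms + gamma * Mh

t1 = {"X": 1.0, "Y": 2.0, "W": 0.3, "H": 0.9, "h": 1.8}
print(round(similarity(t1, t1), 6))   # identical targets -> 1.0
t2 = dict(t1, X=4.0)                  # same shape, seen 3 m away
print(similarity(t1, t2) < 1.0)       # position term drops the score -> True
```

Each term is a ratio bounded by 1 that peaks when the two targets agree exactly, so the weighted sum stays in $[0, 1]$ as required by (3).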
4. Fast Camera Calibration and Target Positioning
Figure 14: The experimental results in the indoor environment.

4.1. Fast Camera Calibration. Camera calibration is a key technology for determining the mapping between 3D world coordinates and 2D image coordinates in various computer vision applications. A schematic diagram describing this mapping is shown in Figure 11: $(u_d, v_d)$ is the image coordinate of $(x_w, y_w, z_w)$ under a perfect pinhole camera model, while $(u, v)$ is the actual image coordinate, which deviates from $(u_d, v_d)$ due to lens distortion; the distance between them is termed the radial distortion. Therefore, the mathematical model from 3D world coordinates to 2D image coordinates is expressed by [40]:
$$u + \delta_u = f_x \, \frac{r_1 (x_w - x_c) + r_2 (y_w - y_c) + r_3 (z_w - z_c)}{r_7 (x_w - x_c) + r_8 (y_w - y_c) + r_9 (z_w - z_c)},$$
$$v + \delta_v = f_y \, \frac{r_4 (x_w - x_c) + r_5 (y_w - y_c) + r_6 (z_w - z_c)}{r_7 (x_w - x_c) + r_8 (y_w - y_c) + r_9 (z_w - z_c)}, \tag{4}$$
where $r_i$ $(i = 1, 2, \ldots, 9)$ denotes the elements of the rotation matrix from the world coordinate frame to the camera coordinate frame, $f_x$ and $f_y$ denote the focal lengths of the camera in the $x$ and $y$ directions, $\delta_u$ and $\delta_v$ denote the photogrammetric distortions, $O_c(x_c, y_c, z_c)$ denotes the coordinate of the camera in the world coordinate frame, and $P(x_w, y_w, z_w)$ denotes the coordinate of the target in the world coordinate frame.
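A minimal sketch of the projection model (4), with the distortion terms $\delta_u$ and $\delta_v$ ignored as in the fast calibration of [11]. All numeric values below are illustrative, not calibrated parameters.

```python
# Project a world point to image coordinates via (4), distortion ignored.

def project(P_w, O_c, R, fx, fy):
    """R holds rows (r1..r3), (r4..r6), (r7..r9) of (4); O_c is the
    camera position in the world frame; returns (u, v)."""
    d = [P_w[i] - O_c[i] for i in range(3)]          # P - O_c
    num_u = sum(R[0][i] * d[i] for i in range(3))
    num_v = sum(R[1][i] * d[i] for i in range(3))
    den = sum(R[2][i] * d[i] for i in range(3))      # depth along the axis
    return fx * num_u / den, fy * num_v / den

# Identity rotation, camera at the origin: a point on the optical axis
# projects to (0, 0).
u, v = project([0.0, 0.0, 5.0], [0.0, 0.0, 0.0],
               [[1, 0, 0], [0, 1, 0], [0, 0, 1]], fx=800, fy=800)
print(u, v)  # -> 0.0 0.0
```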
Traditional calibration algorithms, for example, the DLT algorithm [9] and the Tsai algorithm [10], utilize a series of mathematical transformations to obtain the parameters of the camera model. They have been widely used because of their simple mathematical models and theory. However, these algorithms require a large amount of work to record and check calibration points during calibration and are thus inefficient in practical applications. For this reason, a fast calibration algorithm based on the vanishing point theory was introduced in [11, 41], in which the photogrammetric distortions $\delta_u$ and $\delta_v$ are not considered.
According to the mathematical model of the fast camera calibration presented in [11], we developed software to calibrate cameras quickly and to promptly check the accuracy of the camera parameters by (4). The calibration process and results for a practical camera are shown in Figure 12. As can be seen, the focal lengths of the camera are 39983535 pixels, the calibration error of the line segment $AP$ is 398 mm, and the rotation matrix $R$ and translation vector $T$ (unit: mm) are as follows:

$$R = \begin{bmatrix} 0.642 & -0.766 & 0.024 \\ -0.222 & -0.216 & -0.951 \\ 0.734 & 0.605 & -0.309 \end{bmatrix}, \qquad T = \begin{bmatrix} -93497 \\ 190111 \\ 62305937 \end{bmatrix}. \tag{5}$$
4.2. Target Positioning. Once targets are continuously tracked, the space coordinates of the targets in the camera coordinate frame can be computed by the imaging model with the camera parameters as follows:

$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} = Z_c \begin{bmatrix} \dfrac{dx}{f_x} & 0 & 0 \\ 0 & \dfrac{dy}{f_y} & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} - \begin{bmatrix} \dfrac{u_0 Z_c\, dx}{f_x} \\ \dfrac{v_0 Z_c\, dy}{f_y} \\ 0 \end{bmatrix}, \tag{6}$$

where $u_0$ and $v_0$ denote the principal point in the pixel frame, and $dx$ and $dy$ denote the pixel sizes of the camera in the $x$ and $y$ directions.

When the targets move across multiple cameras, the space coordinates of the targets in each camera coordinate frame are computed by (6), respectively, and are then unified into the world coordinate system, as illustrated in Figure 13. The mathematical model is described as follows:

$$\begin{bmatrix} X_w \\ Y_w \\ Z_w \end{bmatrix} = R_i \left( \begin{bmatrix} X_{ci} \\ Y_{ci} \\ Z_{ci} \end{bmatrix} - T_i \right), \qquad i = 1, 2, \ldots, N, \tag{7}$$

where $i$ denotes the $i$th camera and $N$ denotes the number of cameras.
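The two positioning steps, back-projection by (6) and unification into the world frame by (7), can be sketched together as follows. The intrinsic parameters and camera pose used here are illustrative values, not the calibrated ones from (5).

```python
# Sketch of (6) and (7): recover camera-frame coordinates from a pixel
# (u, v) at a known depth Zc, then transform them into the world frame.

def pixel_to_camera(u, v, Zc, fx, fy, dx, dy, u0, v0):
    """Equation (6): back-project pixel (u, v) at depth Zc. With dx and
    dy folded in, this reduces to Xc = Zc*dx*(u - u0)/fx and similarly
    for Yc."""
    Xc = Zc * dx * (u - u0) / fx
    Yc = Zc * dy * (v - v0) / fy
    return [Xc, Yc, Zc]

def camera_to_world(Pc, R, T):
    """Equation (7): world coordinates R_i (Pc - T_i) for camera i."""
    d = [Pc[j] - T[j] for j in range(3)]
    return [sum(R[k][j] * d[j] for j in range(3)) for k in range(3)]

# Illustrative camera: fx = fy = 500 pixels (dx = dy = 1), principal
# point (320, 240), identity pose, depth given in millimetres.
Pc = pixel_to_camera(u=420, v=300, Zc=6000.0,
                     fx=500, fy=500, dx=1.0, dy=1.0, u0=320, v0=240)
Pw = camera_to_world(Pc, R=[[1, 0, 0], [0, 1, 0], [0, 0, 1]],
                     T=[0.0, 0.0, 0.0])
print(Pw)  # -> [1200.0, 720.0, 6000.0]
```

With several calibrated cameras, `camera_to_world` is simply applied with each camera's own $(R_i, T_i)$, which is what unifies the per-camera estimates of (6) into one world frame.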
5 Experiment Test
The target positioning system presented in this paper is testedin indoor and outdoor environments in which all the low-cost static cameras are calibrated and the coordinates of
Mathematical Problems in Engineering 9
(a) 3D scene of new main building (b) Camera 1
(c) Camera 2 (d) Camera 3
(e) Camera 4 (f) Camera 5
(g) Camera 6 (h) Camera 7
Figure 15 The experimental results in outdoor environment
cameras are unified into the world coordinate system Targetsdetection and tracking are done by the target detection andtracking algorithm As a result the targets are positionedby imaging model and camera parameters in real time andtheir trajectories are displayed in a three-dimensional scene
The test results are respectively shown in Figures 14 and15 for indoor and outdoor environments As can be seenfrom Figure 14 when a target is continuously moving withinan indoor corridor the positioning system consisting of sixdistributed cameras is able to position this target in real
10 Mathematical Problems in Engineering
time and display its trajectory in a three-dimensional spacemodel Likewise as can be seen from Figure 15 when a targetis continuously moving outdoors the positioning systemconsisting of seven distributed cameras is able to position thistarget and display its trajectory in a three-dimensional spacemodel in real time as well The experimental results confirmthat the systematic framework and inclusive algorithmspresented in this paper are both effective and efficient
In this paper we assume that the ground is flat whichrarely holds in practical applications given a large regionIn order to solve this problem it is necessary to use digitalelevationmodel (DEM) to describe topographic relief in largeregions
6 Conclusion and Future Work
This paper presented the comprehensive design and imple-mentation of a moving target positioning system basedon a distributed camera network The system is composedof low-cost static cameras which provide complementarypositioning information for moving target positioning inindoor and outdoor environments when GNSS signals areunavailable In this system static cameras can cover a largeregion moving targets are detected and then tracked usingcorresponding algorithms target positions are estimated bymaking use of the geometrical relationships among thosecameras after calibrating those cameras and finally foreach target its position estimates obtained from differentcameras are unified into the world coordinate system Theexperimental results of the target detection tracking andpositioning system were reported based on real video data
Targets positioning and trackingwithmultiple static cam-eras were verified in both indoor and outdoor environmentsHowever the reliability and accuracy of target tracking andpositioning suffer from several environment factors Henceit is necessary to fuse information from various sensorssuch as radar infrared camera inertial measure unit (IMU)and wireless location system Regarding future work it ismeaningful to develop and test these algorithms in thepractical applications
Conflict of Interests
The authors declare that there is no conflict of interestsregarding the publication of this paper
Acknowledgments
This project is supported by the Key Program of the NationalNatural Science Foundation of China (Grant no 61039003)the National Natural Science Foundation of China (Grantno 41274038) the Aeronautical Science Foundation of China(Grant no 2013ZC51027) the Aerospace Innovation Founda-tion of China (CASC201102) and the Fundamental ResearchFunds for the Central Universities
References
[1] R. Mautz and S. Tilch, "Survey of optical indoor positioning systems," in Proceedings of the International Conference on Indoor Positioning and Indoor Navigation (IPIN '11), Guimaraes, Portugal, September 2011.
[2] G. Retscher, E. Moser, D. Vredeveld, et al., "Performance and accuracy test of a WiFi indoor positioning system," Journal of Applied Geodesy, vol. 1, no. 2, pp. 103–110, 2007.
[3] S. J. Ingram, D. Harmer, and M. Quinlan, "Ultra wide band indoor positioning systems and their use in emergencies," in Proceedings of the Position Location and Navigation Symposium (PLANS '04), pp. 706–715, Monterey, Calif, USA, April 2004.
[4] G. Anastasi, R. Bandelloni, M. Conti, et al., "Experimenting an indoor bluetooth-based positioning service," in Proceedings of the 23rd International Conference on Distributed Computing Systems Workshops, pp. 480–483, Providence, RI, USA, 2003.
[5] H. Liu, H. Darabi, P. Banerjee, and J. Liu, "Survey of wireless indoor positioning techniques and systems," IEEE Transactions on Systems, Man, and Cybernetics C: Applications and Reviews, vol. 37, no. 6, pp. 1067–1080, 2007.
[6] H.-S. Kim and J.-S. Choi, "Advanced indoor localization using ultrasonic sensor and digital compass," in Proceedings of the International Conference on Control, Automation and Systems (ICCAS '08), pp. 223–226, Seoul, Korea, October 2008.
[7] R. Karlsson, T. B. Schon, D. Tornqvist, G. Conte, and F. Gustafsson, "Utilizing model structure for efficient simultaneous localization and mapping for a UAV application," in Proceedings of the IEEE Aerospace Conference (AC '08), Rome, Italy, March 2008.
[8] D. Nister, O. Naroditsky, and J. Bergen, "Visual odometry," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '04), pp. 652–659, Washington, DC, USA, July 2004.
[9] Y. I. Abdel-Aziz and H. M. Karara, "Direct linear transformation into object space coordinates in close-range photogrammetry," in Proceedings of the Symposium on Close-Range Photogrammetry, pp. 1–18, Urbana, Ill, USA, 1971.
[10] R. Y. Tsai, "A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses," IEEE Journal of Robotics and Automation, vol. RA-3, no. 4, pp. 323–344, 1987.
[11] J. Long, X. Zhang, and L. Zhao, "A fast calibration algorithm based on vanishing point for scene camera," Applied Mechanics and Materials, vol. 58–60, pp. 1148–1153, 2011.
[12] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
[13] P. D. Z. Varcheie and G.-A. Bilodeau, "People tracking using a network-based PTZ camera," Machine Vision and Applications, vol. 22, no. 4, pp. 671–690, 2011.
[14] C.-H. Chen, Y. Yao, D. Page, B. Abidi, A. Koschan, and M. Abidi, "Heterogeneous fusion of omnidirectional and PTZ cameras for multiple object tracking," IEEE Transactions on Circuits and Systems for Video Technology, vol. 18, no. 8, pp. 1052–1063, 2008.
[15] X. Clady, F. Collange, F. Jurie, and P. Martinet, "Object tracking with a pan-tilt-zoom camera: application to car driving assistance," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '01), pp. 1653–1658, Seoul, Korea, May 2001.
[16] N. Bellotto, E. Sommerlade, B. Benfold, et al., "A distributed camera system for multi-resolution surveillance," in Proceedings
Mathematical Problems in Engineering 11
of the 3rd ACM/IEEE International Conference on Distributed Smart Cameras (ICDSC '09), Como, Italy, September 2009.
[17] C. Ding, B. Song, A. Morye, et al., "Collaborative sensing in a distributed PTZ camera network," IEEE Transactions on Image Processing, vol. 21, no. 7, pp. 3282–3295, 2012.
[18] E. B. Ermis, P. Clarot, P.-M. Jodoin, and V. Saligrama, "Activity based matching in distributed camera networks," IEEE Transactions on Image Processing, vol. 19, no. 10, pp. 2595–2613, 2010.
[19] D.-M. Tsai and S.-C. Lai, "Independent component analysis-based background subtraction for indoor surveillance," IEEE Transactions on Image Processing, vol. 18, no. 1, pp. 158–167, 2009.
[20] J. Migdal and W. E. L. Grimson, "Background subtraction using Markov thresholds," in Proceedings of the IEEE Workshop on Motion and Video Computing (MOTION '05), pp. 58–65, Breckenridge, Colo, USA, January 2005.
[21] C. Stauffer and W. E. L. Grimson, "Learning patterns of activity using real-time tracking," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 8, pp. 747–757, 2000.
[22] D.-S. Lee, "Effective Gaussian mixture learning for video background subtraction," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 5, pp. 827–832, 2005.
[23] L. Zhao and X. He, "Adaptive Gaussian mixture learning for moving object detection," in Proceedings of the 3rd IEEE International Conference on Broadband Network and Multimedia Technology (IC-BNMT '10), pp. 1176–1180, Beijing, China, October 2010.
[24] S. Sivaraman and M. M. Trivedi, "Looking at vehicles on the road: a survey of vision-based vehicle detection, tracking, and behavior analysis," IEEE Transactions on Intelligent Transportation Systems, vol. 14, no. 4, pp. 1773–1795, 2013.
[25] G. H. Wen, Z. S. Duan, G. R. Chen, and W. W. Yu, "Consensus tracking of multi-agent systems with Lipschitz-type node dynamics and switching topologies," IEEE Transactions on Circuits and Systems, vol. 60, no. 9, pp. 1–13, 2013.
[26] S. Oh, S. Russell, and S. Sastry, "Markov chain Monte Carlo data association for multi-target tracking," IEEE Transactions on Automatic Control, vol. 54, no. 3, pp. 481–497, 2009.
[27] H. Zhang and L. Zhao, "Integral channel features for particle filter based object tracking," in Proceedings of the 5th International Conference on Intelligent Human-Machine Systems and Cybernetics, pp. 190–193, Hangzhou, China, 2013.
[28] M. Taj and A. Cavallaro, "Distributed and decentralized multi-camera tracking," IEEE Signal Processing Magazine, vol. 28, no. 3, pp. 46–58, 2011.
[29] N. Krahnstoever, T. Yu, S. Lim, et al., "Collaborative real-time control of active cameras in large scale surveillance systems," in Proceedings of the Workshop on Multi-camera and Multi-modal Sensor Fusion Algorithms and Applications, pp. 1–12, Marseille, France, 2008.
[30] C. Braillon, C. Pradalier, J. L. Crowley, and C. Laugier, "Real-time moving obstacle detection using optical flow models," in Proceedings of the IEEE Intelligent Vehicles Symposium (IV '06), pp. 466–471, Tokyo, Japan, June 2006.
[31] C. Kim and J. N. Hwang, "Fast and automatic video object segmentation and tracking for content-based applications," IEEE Transactions on Circuits and Systems for Video Technology, vol. 12, no. 2, pp. 122–129, 2002.
[32] R. Jain and H. H. Nagel, "On the analysis of accumulative difference pictures from image sequences of real world scenes," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 1, no. 2, pp. 206–214, 1978.
[33] O. Barnich and M. Van Droogenbroeck, "ViBE: a powerful random technique to estimate the background in video sequences," in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '09), pp. 945–948, Taipei, Taiwan, April 2009.
[34] A. Elgammal, D. Harwood, and L. Davis, "Non-parametric model for background subtraction," in Proceedings of the 6th European Conference on Computer Vision, Part II, pp. 751–767, London, UK, 2000.
[35] H. W. Mao, C. H. Yang, G. P. Abousleman, and J. Si, "Automated multiple target detection and tracking in UAV videos," in Airborne Intelligence, Surveillance, Reconnaissance (ISR) Systems and Applications VII, vol. 7668 of Proceedings of SPIE, Orlando, Fla, USA, 2010.
[36] H. Medeiros, J. Park, and A. C. Kak, "Distributed object tracking using a cluster-based Kalman filter in wireless camera networks," IEEE Journal on Selected Topics in Signal Processing, vol. 2, no. 4, pp. 448–463, 2008.
[37] A. Andriyenko and K. Schindler, "Multi-target tracking by continuous energy minimization," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR '11), pp. 1265–1272, Colorado Springs, Colo, USA, June 2011.
[38] A. Andriyenko and K. Schindler, "Globally optimal multi-target tracking on a hexagonal lattice," in Proceedings of the 11th European Conference on Computer Vision, pp. 466–479, Crete, Greece, 2010.
[39] J. Y. Bouguet, "Pyramidal implementation of the affine Lucas Kanade feature tracker: description of the algorithm," Intel Corporation, pp. 1–9, 2001.
[40] Y. Wang, G. Hu, and Z. Chen, "Calibration of CCD camera in image matching experimental equipment," in Proceedings of the 2nd International Symposium on Instrumentation Science and Technology, p. 3146, Jinan, China, August 2002.
[41] B. Caprile and V. Torre, "Using vanishing points for camera calibration," International Journal of Computer Vision, vol. 4, no. 2, pp. 127–139, 1990.
Figure 7: The target detection results for frame 105: (a) original video; (b) Gaussian mixture model; (c) our method.
Figure 8: The target detection results for frame 145: (a) original video; (b) Gaussian mixture model; (c) our method.
events evolving at various speeds is limited, as is its ability to adapt to concurrent events occurring at different frequencies. To address this issue, random background modeling, which is intuitively an improved sample-based algorithm, was proposed in [33]. This algorithm takes $p_t(x)$ to be the value of pixel $x$ at time $t$ and imposes the constraint that the influence of a value on the polychromatic space is restricted to its local neighborhood. A set of sample values is then used as a pixel model to classify a value $p_t(x)$ as either a background or a foreground pixel value.

An experiment was carried out using video data and
compared with the Gaussian mixture model presented by Stauffer and Grimson [21]. Video data set 1 comes from the evaluation data of the Performance Evaluation of Tracking and Surveillance (PETS) database, with a video image resolution of 768 × 576 pixels and a frame rate of 25 frames per second (fps); video data set 2 comes from a practical surveillance system in the DNC of Beihang University, with a video image resolution of 352 × 288 pixels and a frame rate of 25 fps. The experimental results are shown in Figures 3 and 4. As can be seen, since trees were swinging in the wind, such movements were classified as foreground motion by the Gaussian mixture model, while the random background model effectively detected the trees as background. However, neither algorithm takes the shadow of the target into account, and thus the reliability and robustness of target detection and tracking are severely degraded, as illustrated in Figure 5; the video data set 3 used there comes from a practical surveillance system in the DNC of Beihang University, with a video image resolution of 352 × 288 pixels and a frame rate of 25 fps.
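The sample-based classification rule of [33] described above can be sketched compactly; the matching radius and match-count threshold below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def classify_pixel(samples, value, radius=20, min_matches=2):
    """Sample-based background test in the spirit of [33]: a pixel value is
    background if it lies within `radius` of at least `min_matches` of the
    stored background samples for that pixel. Both thresholds here are
    illustrative assumptions."""
    matches = np.sum(np.abs(np.asarray(samples, dtype=float) - value) <= radius)
    return "background" if matches >= min_matches else "foreground"
```

For instance, a pixel whose stored samples are [100, 102, 98, 101] classifies the value 99 as background, while an abrupt value of 180 is classified as foreground.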
To remove the damaging effect of shadows on target detection and tracking, we propose an algorithm that combines the random background model and the frame difference algorithm; the mathematical model is described as follows:

$$\text{Mask}(x, y) = \begin{cases} 1, & \left| \text{Dilate}(D(x, y)) - \text{Erode}(D(x, y)) \right| = 1, \\ 0, & \text{otherwise}, \end{cases} \quad (1)$$

where $D(x, y)$ denotes the mask image of the background differencing, $\text{Dilate}(\cdot)$ denotes the dilation operation on the target region block, $\text{Erode}(\cdot)$ denotes the erosion operation on the target region, and $\text{Mask}(x, y)$ denotes the mask image of the difference between the dilated and eroded images.
Suppose that the number of pixels with value 1 in $\text{Mask}(x, y)$ is $N_1$, and the number of pixels that are detected as foreground by frame differencing and whose values at $(x, y)$ in the template $\text{Mask}(x, y)$ equal 1 is $N_2$. If $N_1 / N_2 > T$, where $T$ denotes a threshold, then the target region block is a foreground target; otherwise, it is the shadow of the target.
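A minimal sketch of the shadow test in (1) follows; the 3×3 structuring element and the threshold value are illustrative assumptions, since the paper does not specify either.

```python
import numpy as np

def _shift_stack(img):
    # Stack the 3x3 neighborhood of every pixel (zero-padded borders).
    p = np.pad(img, 1)
    h, w = img.shape
    return np.stack([p[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)])

def is_foreground(D, frame_diff, T=0.5):
    """Shadow test of (1): build the boundary mask as the difference between
    the dilated and eroded background-difference mask D, then compare
    N1 (boundary pixels) against N2 (frame-difference foreground pixels
    inside the boundary). The 3x3 element and T are assumptions."""
    win = _shift_stack(D)
    dilated, eroded = win.max(axis=0), win.min(axis=0)
    mask = (np.abs(dilated - eroded) == 1).astype(np.uint8)
    n1 = int(mask.sum())
    n2 = int(((frame_diff == 1) & (mask == 1)).sum())
    # Foreground target if the ratio test passes; shadow otherwise.
    return n2 > 0 and n1 / n2 > T
```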
An experiment is carried out using video data set 3 and compared with the GMM. The experimental results are shown in Figures 6, 7, and 8. As can be seen, the algorithm presented in this paper effectively removes the shadow of the target.
Figure 9: The multitarget tracking results for video data set 4.
Figure 10: The target tracking results across multiple cameras for video data set 3.
Figure 11: The relationship between the world coordinates and pixel coordinates.
3.2 Target Tracking. Once moving targets are detected, the track initialization event is triggered so that the moving targets can be continuously tracked by the tracking algorithm during the living period of a track (which lasts from its initialization to its termination [35]). A track terminates when a target can no longer be detected: it leaves the field of view, it stops and becomes static, or it can no longer be distinguished from the background. Detected targets are not confirmed as true moving targets until they have been consistently tracked for a period of time before their tracks are initialized. We create a dynamic list of potential tracks from all detected targets. Associations are then established between targets detected in a new image frame and the potential tracks. When a potential target has been tracked over several consecutive frames, it is recognized as a true moving target and a track is initialized.

Figure 12: The fast camera calibration tool software.
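The track-initialization logic described above (a dynamic list of potential tracks, per-frame association, confirmation after several consecutive frames) can be sketched as follows; the nearest-centroid association rule and the numeric thresholds are illustrative assumptions.

```python
from dataclasses import dataclass

CONFIRM_FRAMES = 5   # assumed number of frames before a detection is confirmed

@dataclass
class PotentialTrack:
    centroid: tuple
    hits: int = 1
    confirmed: bool = False

def update_potential_tracks(tracks, detections, max_dist=30.0):
    """Associate each new detection with the nearest potential track;
    a track is confirmed once it has been seen in CONFIRM_FRAMES
    consecutive frames. Thresholds are assumptions for illustration."""
    for det in detections:
        best = min(tracks, default=None,
                   key=lambda t: (t.centroid[0] - det[0])**2 + (t.centroid[1] - det[1])**2)
        if best and ((best.centroid[0] - det[0])**2 +
                     (best.centroid[1] - det[1])**2) ** 0.5 <= max_dist:
            best.centroid, best.hits = det, best.hits + 1
            best.confirmed = best.confirmed or best.hits >= CONFIRM_FRAMES
        else:
            tracks.append(PotentialTrack(det))   # start a new potential track
    return tracks
```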
Compared to single target tracking, the multitarget problem poses additional difficulties: data association needs to be solved; that is, it has to be decided which observation
Figure 13: The schematic diagram of the target positioning.
corresponds to which target, and constraints between targets need to be taken into account. Multitarget tracking algorithms can be roughly divided into two categories: recursive and nonrecursive algorithms. Recursive algorithms base their estimates only on the state of the previous frame, as in Kalman filtering [36] and particle filtering [27], in which different strategies are used to obtain an optimal solution over multiple frames and can thus better cope with ambiguous multimodal distributions. Nonrecursive algorithms seek optimality over an extended period of time [37, 38].
In practical applications, target tracking must take target splitting and merging into account because of factors such as illumination changes and occlusion. Since the one-to-one tracking assumption rarely holds, the multitarget tracking problem remains challenging. To realize multitarget tracking, we propose a solution combining the pyramid Lucas-Kanade feature tracker [39] and the Kalman filter. The mathematical model of Kalman filtering is described as follows:
$$X(k + 1) = A(k + 1, k) X(k) + W(k),$$
$$Z(k) = H(k) X(k) + V(k), \quad (2)$$

where $X(k) = [x(k), y(k), w(k), h(k), v_x(k), v_y(k), v_w(k), v_h(k)]^T$ denotes the state, $A(k + 1, k)$ denotes the state transition matrix, $W(k)$ denotes the system noise, $H(k)$ denotes the measurement matrix, $Z(k) = [x(k), y(k), w(k), h(k)]^T$ denotes the measurement value, $V(k)$ denotes the measurement noise, $x(k)$ and $y(k)$ denote the horizontal and vertical coordinates of the target centroid, $w(k)$ and $h(k)$ denote the width and height of the target envelope rectangle, and $v_x(k)$, $v_y(k)$, $v_w(k)$, and $v_h(k)$ denote the rates of change of $x(k)$, $y(k)$, $w(k)$, and $h(k)$, respectively.

To realize multitarget tracking across multiple cameras, we construct a similarity function for target matching to realize target association and target tracking across multiple cameras. The similarity function is described as follows:
$$f(a, b) = \alpha M_d(a, b) + \beta M_s(a, b) + \gamma M_h(a, b),$$
$$M_d(a, b) = \frac{W_a + W_b}{(W_a + W_b) + D_x} \cdot \frac{H_a + H_b}{(H_a + H_b) + D_y},$$
$$M_s(a, b) = \frac{2 W_a H_a W_b H_b}{(W_a H_a)^2 + (W_b H_b)^2},$$
$$M_h(a, b) = \frac{2 h_a h_b}{h_a^2 + h_b^2}, \quad (3)$$
where $a$ and $b$ denote the targets to be matched; $M_d(a, b) \in [0, 1]$ denotes their position similarity (the larger $M_d(a, b)$ is, the closer their positions are); $M_s(a, b) \in [0, 1]$ denotes the similarity of their sizes (the larger $M_s(a, b)$ is, the closer their sizes are); $M_h(a, b) \in [0, 1]$ denotes the similarity of their heights (the larger $M_h(a, b)$ is, the closer their heights are); $\alpha \in [0, 1]$, $\beta \in [0, 1]$, and $\gamma \in [0, 1]$ denote the weight coefficients and satisfy $\alpha + \beta + \gamma = 1$. $X_a$ and $Y_a$ denote the horizontal and vertical coordinates of target $a$ in the world coordinate system, and $X_b$ and $Y_b$ those of target $b$; $W_a$ and $W_b$ denote half of the widths of targets $a$ and $b$, respectively, and $H_a$ and $H_b$ half of their heights; $D_x = |X_a - X_b|$ denotes the absolute difference between $X_a$ and $X_b$, and $D_y = |Y_a - Y_b|$ that between $Y_a$ and $Y_b$; $h_a$ and $h_b$ denote their heights in the world coordinate system; and $f(a, b) \in [0, 1]$ (the larger $f(a, b)$ is, the higher their matching similarity, and vice versa). In practical applications, $\alpha$, $\beta$, and $\gamma$ can be adjusted according to the accuracy of $M_d(a, b)$, $M_s(a, b)$, and $M_h(a, b)$.
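The similarity function (3) transcribes directly into code; the weight values below are illustrative and only need to satisfy α + β + γ = 1.

```python
def similarity(a, b, alpha=0.4, beta=0.3, gamma=0.3):
    """Similarity function of (3) for matching targets across cameras.
    Each target is a dict with world-frame centroid (X, Y), half-width W,
    half-height H, and height h; the weights here are assumed values."""
    dx, dy = abs(a["X"] - b["X"]), abs(a["Y"] - b["Y"])
    m_d = ((a["W"] + b["W"]) / (a["W"] + b["W"] + dx)
           * (a["H"] + b["H"]) / (a["H"] + b["H"] + dy))      # position term
    m_s = (2 * a["W"] * a["H"] * b["W"] * b["H"]
           / ((a["W"] * a["H"])**2 + (b["W"] * b["H"])**2))   # size term
    m_h = 2 * a["h"] * b["h"] / (a["h"]**2 + b["h"]**2)       # height term
    return alpha * m_d + beta * m_s + gamma * m_h
```

Two identical targets score exactly 1, and the score decays toward 0 as their positions, sizes, or heights diverge.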
An experiment is carried out using video data sets 3 and 4, which come from a practical surveillance system in the DNC of Beihang University, with a video image resolution of 352 × 288 pixels and a frame rate of 25 fps. The experimental results are shown in Figures 9 and 10, in which the rectangle with dotted blue lines denotes the overlapping area between two cameras. As can be seen, the target tracking algorithm presented in this paper is effective and can track multiple targets across multiple cameras.
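For reference, one predict/update cycle of the constant-velocity Kalman model in (2) can be sketched as follows; the unit time step and the noise covariances are illustrative assumptions, not values from the paper.

```python
import numpy as np

def kalman_step(x, P, z, q=1e-2, r=1.0):
    """One predict/update cycle for the 8-state model of (2):
    state [x, y, w, h, vx, vy, vw, vh], measurement [x, y, w, h].
    Process noise q and measurement noise r are assumed values."""
    A = np.eye(8)
    A[:4, 4:] = np.eye(4)          # position/size advance by their rates (dt = 1)
    H = np.hstack([np.eye(4), np.zeros((4, 4))])
    Q, R = q * np.eye(8), r * np.eye(4)
    # Predict
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Update with measurement z = [x, y, w, h]
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(8) - K @ H) @ P_pred
    return x_new, P_new
```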
4 Fast Camera Calibration and Target Positioning

4.1 Fast Camera Calibration. Camera calibration is a key technology for determining the mapping between 3D world coordinates and 2D image coordinates in various computer vision applications. A schematic diagram describing this mapping is shown in Figure 11: $(u_d, v_d)$ is the image coordinate of $(x_w, y_w, z_w)$ under a perfect pinhole camera model, while $(u, v)$ is the actual image coordinate, which deviates from $(u_d, v_d)$ due to lens distortion. The distance between them is termed the radial distortion. Therefore, the mathematical
Figure 14: The experimental results in the indoor environment.
model from 3D world coordinates to 2D image coordinates is expressed by [40]:

$$u + \delta u = f_x \frac{r_1 (x_w - x_c) + r_2 (y_w - y_c) + r_3 (z_w - z_c)}{r_7 (x_w - x_c) + r_8 (y_w - y_c) + r_9 (z_w - z_c)},$$
$$v + \delta v = f_y \frac{r_4 (x_w - x_c) + r_5 (y_w - y_c) + r_6 (z_w - z_c)}{r_7 (x_w - x_c) + r_8 (y_w - y_c) + r_9 (z_w - z_c)}, \quad (4)$$

where $r_i$ $(i = 1, 2, \ldots, 9)$ denotes an element of the rotation matrix from the world coordinate frame to the pixel coordinate frame; $f_x$ and $f_y$ denote the focal lengths of the camera in the $x$ and $y$ directions; $\delta u$ and $\delta v$ denote the photogrammetric distortions; $O_c(x_c, y_c, z_c)$ denotes the coordinate of the camera in the world coordinate frame; and $P(x_w, y_w, z_w)$ denotes the coordinate of the target in the world coordinate frame.
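The forward projection (4) is straightforward to evaluate once the rotation matrix, the camera position, and the focal lengths are known; this sketch ignores the distortion terms by default.

```python
import numpy as np

def project(point_w, cam_pos, R, fx, fy, du=0.0, dv=0.0):
    """Forward projection of (4): map a world point onto the image plane.
    R is the 3x3 rotation matrix whose rows hold r1..r9; du, dv are the
    distortion terms (zero here, matching the fast calibration of [11, 41])."""
    d = np.asarray(point_w, float) - np.asarray(cam_pos, float)
    denom = R[2] @ d                 # r7..r9 row
    u = fx * (R[0] @ d) / denom - du
    v = fy * (R[1] @ d) / denom - dv
    return u, v
```

For example, with an identity rotation, a camera at the origin, and focal lengths of 100 pixels, the world point (1, 2, 10) projects to (10, 20).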
Traditional calibration algorithms, for example, the DLT algorithm [9] and the Tsai algorithm [10], utilize a series of mathematical transformations to obtain the parameters of the camera model. They have been widely used because of their simple mathematical models and theory. However, these algorithms require a large amount of work to record and check calibration points during calibration and are thus inefficient in practical applications. For this reason, a fast calibration algorithm based on the vanishing point theory was introduced in [11, 41], in which the photogrammetric distortions $\delta u$ and $\delta v$ are not considered.
According to the mathematical model of the fast camera calibration presented in [11], we developed software to calibrate cameras quickly, and the accuracy of the camera parameters can be promptly checked using (4). The calibration process and results for a practical camera are shown in Figure 12. As can be seen, the focal lengths of the camera are 39983535 pixels, the calibration error of the line segment $AP$ is 398 mm, and the rotation matrix $R$ and translation vector $T$ (unit: mm) are as follows:

$$R = \begin{bmatrix} 0.642 & -0.766 & 0.024 \\ -0.222 & -0.216 & -0.951 \\ 0.734 & 0.605 & -0.309 \end{bmatrix}, \quad T = \begin{bmatrix} -93497 \\ 190111 \\ 62305937 \end{bmatrix}. \quad (5)$$
4.2 Target Positioning. Once targets are continuously tracked, the space coordinates of the targets in the camera coordinate frame can be computed from the imaging model with the camera parameters as follows:

$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} = Z_c \begin{bmatrix} dx/f_x & 0 & 0 \\ 0 & dy/f_y & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} + \begin{bmatrix} -u_0 Z_c\, dx/f_x \\ -v_0 Z_c\, dy/f_y \\ 0 \end{bmatrix}, \quad (6)$$

where $u_0$ and $v_0$ denote the principal point in the pixel frame, and $dx$ and $dy$ denote the pixel sizes of the camera in the $x$ and $y$ directions.

When the targets are seen across multiple cameras, the space coordinates of the targets in each camera coordinate frame are computed, respectively, by (6) and then unified into the world coordinate system, as illustrated in Figure 13. The mathematical model is described as follows:
$$\begin{bmatrix} X_w \\ Y_w \\ Z_w \end{bmatrix} = R_i \left( \begin{bmatrix} X_{ci} \\ Y_{ci} \\ Z_{ci} \end{bmatrix} - T_i \right), \quad i = 1, 2, \ldots, N, \quad (7)$$

where $i$ denotes the $i$th camera and $N$ denotes the number of cameras.
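Equations (6) and (7) together give the positioning pipeline from a pixel observation to world coordinates; this sketch assumes the depth $Z_c$ is available (e.g., from the flat-ground assumption discussed in Section 5).

```python
import numpy as np

def camera_coords(u, v, Zc, fx, fy, u0, v0, dx, dy):
    """Back-projection of (6): recover camera-frame coordinates of a
    tracked target from its pixel position (u, v) and known depth Zc."""
    Xc = (u - u0) * Zc * dx / fx
    Yc = (v - v0) * Zc * dy / fy
    return np.array([Xc, Yc, Zc])

def world_coords(p_cam, R_i, T_i):
    """Unification step of (7): map a camera-frame point into the shared
    world frame using camera i's calibration (R_i, T_i)."""
    return R_i @ (np.asarray(p_cam, float) - np.asarray(T_i, float))
```

Running the two steps in sequence undoes the forward projection of (4): a pixel at (10, 20) with depth 10 and unit pixel sizes recovers the camera-frame point (1, 2, 10).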
5 Experiment Test
The target positioning system presented in this paper is testedin indoor and outdoor environments in which all the low-cost static cameras are calibrated and the coordinates of
Figure 15: The experimental results in the outdoor environment: (a) 3D scene of the new main building; (b)–(h) cameras 1–7.
cameras are unified into the world coordinate system. Target detection and tracking are performed by the corresponding algorithms. As a result, the targets are positioned by the imaging model and camera parameters in real time, and their trajectories are displayed in a three-dimensional scene. The test results are shown in Figures 14 and 15 for the indoor and outdoor environments, respectively. As can be seen from Figure 14, when a target is continuously moving within an indoor corridor, the positioning system consisting of six distributed cameras is able to position this target in real
time and display its trajectory in a three-dimensional space model. Likewise, as can be seen from Figure 15, when a target is continuously moving outdoors, the positioning system consisting of seven distributed cameras is able to position this target and display its trajectory in a three-dimensional space model in real time as well. The experimental results confirm that the systematic framework and inclusive algorithms presented in this paper are both effective and efficient.
In this paper, we assume that the ground is flat, which rarely holds in practical applications over a large region. To solve this problem, it is necessary to use a digital elevation model (DEM) to describe the topographic relief of large regions.
[29] N Krahnstoever T Yu S Lim et al ldquoCollaborative real-timecontrol of active cameras in large scale surveillance systemsrdquo inProceedings of the Workshop on Multi-camera and Multi-modalSensor Fusion Algorithms and Applications pp 1ndash12 MarseilleFrance 2008
[30] C Braillon C Pradalier J L Crowley and C Laugier ldquoReal-time moving obstacle detection using optical flow modelsrdquo inProceedings of the IEEE Intelligent Vehicles Symposium (IV rsquo06)pp 466ndash471 Tokyo Japan June 2006
[31] C Kim and J N Hwang ldquoFast and automatic video objectsegmentation and tracking for content-based applicationsrdquoIEEE Transactions on Circuits and Systems for Video Technologyvol 12 no 2 pp 122ndash129 2002
[32] R Jain and H H Nagel ldquoOn the analysis of accumulativedifference pictures from image sequences of real world scenesrdquoIEEE Transactions on Pattern Analysis andMachine Intelligencevol 1 no 2 pp 206ndash214 1978
[33] O Barnich andM Van Droogenbroeck ldquoViBE a powerful ran-dom technique to estimate the background in video sequencesrdquoin Proceedings of the IEEE International Conference onAcousticsSpeech and Signal Processing (ICASSP rsquo09) pp 945ndash948 TaibeiTaiwan April 2009
[34] A Elgammal D Harwood and L Davis ldquoNon-parametricmodel for background subtractionrdquo in Proceedings of the the 6thEuropean Conference on Computer Vision-Part II pp 751ndash767London UK 2000
[35] HWMao C H Yang G P Abousleman and J Si ldquoAutomatedmultiple target detection and tracking in UAV videosrdquo inAirborne Intelligence Surveillance Reconnaissance (ISR) Systemsand Applications VII vol 7668 of Proceedings of SPIE OrlandoFla USA 2010
[36] H Medeiros J Park and A C Kak ldquoDistributed objecttracking using a cluster-based Kalman filter in wireless cameranetworksrdquo IEEE Journal on Selected Topics in Signal Processingvol 2 no 4 pp 448ndash463 2008
[37] A Andriyenko and K Schindler ldquoMulti-target tracking bycontinuous energy minimizationrdquo in Proceedings of the IEEEConference on Computer Vision and Pattern Recognition (CVPRrsquo11) pp 1265ndash1272 Colorado Springs Colo USA June 2011
[38] A Andriyenko and K Schindler ldquoGlobally optimal multi targettracking on a hexagonal latticerdquo in Proceedings of the 11thEuropean Conference on Computer Vision pp 466ndash479 CreteGreece 2010
[39] J Y Bouguet ldquoPyramidal implementation of the affine lucaskanade feature tracker description of the algorithmrdquo IntelCorporation pp 1ndash9 2001
[40] Y Wang G Hu and Z Chen ldquoCalibration of CCD camera inimage matching experimental equipmentrdquo in Proceedings of the2nd International Symposium on Instrumentation Science andTechnology p 3146 Jinan China August 2002
[41] B Caprile and V Torre ldquoUsing vanishing points for cameracalibrationrdquo International Journal of Computer Vision vol 4 no2 pp 127ndash139 1990
6 Mathematical Problems in Engineering
Figure 9: The multitarget tracking results for video data set 4.
Figure 10: The target tracking results across multiple cameras for video data set 3.
Figure 11: The relationship between the world coordinates and pixel coordinates.
3.2. Target Tracking. Once moving targets are detected, the track initialization event is triggered so that the moving targets can be continuously tracked by the tracking algorithm during the living period of a track (which starts from its initialization and lasts until its termination [35]). A track terminates when a target can no longer be detected: it leaves the field of view, it stops and becomes static, or it can no longer be distinguished from the background. Detected targets are not confirmed to be true moving targets until they have been consistently tracked for a period of time before their target tracks are initialized. We create a dynamic list of potential tracks from all detected targets. Associations are established between targets detected in a new image frame and the potential tracks. When a potential target has been tracked over several consecutive frames, it is recognized as a true moving target and a track is initialized.

Figure 12: The fast camera calibration tool software.
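The confirmation logic described above can be sketched as follows. The confirmation count, the pixel gating distance, and the nearest-centroid association rule are illustrative assumptions, not values given in the paper:

```python
CONFIRM_FRAMES = 5   # frames of consistent association before confirmation (assumed value)
GATE = 30.0          # maximum centroid distance in pixels for association (assumed value)

class PotentialTrack:
    def __init__(self, tid, centroid):
        self.tid = tid
        self.centroid = centroid
        self.hits = 1
        self.confirmed = False

def update_tracks(tracks, detections, next_id):
    """One frame of association: match each potential track to the nearest
    unmatched detection; confirm a track after CONFIRM_FRAMES matches;
    open a new potential track for every unmatched detection."""
    unmatched = list(detections)
    for trk in tracks:
        if not unmatched:
            break
        dist, best = min(
            ((abs(c[0] - trk.centroid[0]) + abs(c[1] - trk.centroid[1]), c)
             for c in unmatched),
            key=lambda p: p[0])
        if dist <= GATE:
            trk.centroid = best
            trk.hits += 1
            if trk.hits >= CONFIRM_FRAMES:
                trk.confirmed = True
            unmatched.remove(best)
    for c in unmatched:
        tracks.append(PotentialTrack(next_id, c))
        next_id += 1
    return tracks, next_id
```

A target that keeps reappearing near the same place is promoted to a confirmed track; stray single-frame detections never accumulate enough hits and can later be pruned.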
Compared to single-target tracking, the multitarget problem poses additional difficulties: data association needs to be solved; that is, it has to be decided which observation
Figure 13: The schematic diagram of the target positioning.
corresponds to which target, and constraints between targets need to be taken into account. Multitarget tracking algorithms can be roughly divided into two categories: recursive and nonrecursive algorithms. Recursive algorithms, such as Kalman filtering [36] and particle filtering [27], base their estimate only on the state of the previous frame; among them, particle filtering uses a different strategy to obtain an optimal solution over multiple frames and can thus better cope with ambiguous, multimodal distributions. Nonrecursive algorithms seek optimality over an extended period of time [37, 38].
In practical applications, target tracking must take target splitting and merging into account because of factors such as illumination changes and occlusion. Since the one-to-one tracking assumption rarely holds, the multitarget tracking problem is still challenging. In order to realize multitarget tracking, we propose a solution combining the pyramid Lucas-Kanade feature tracker [39] and a Kalman filter. The mathematical model of the Kalman filter is described as follows:
X(k + 1) = A(k + 1, k) X(k) + W(k),
Z(k) = H(k) X(k) + V(k),   (2)

where X(k) = [x(k), y(k), w(k), h(k), v_x(k), v_y(k), v_w(k), v_h(k)]^T denotes the state; A(k + 1, k) denotes the state transition matrix; W(k) denotes the system noise; H(k) denotes the measurement matrix; Z(k) = [x(k), y(k), w(k), h(k)]^T denotes the measurement value; V(k) denotes the measurement noise; x(k) and y(k) denote the horizontal and vertical coordinates of the target centroid; w(k) and h(k) denote the width and height of the target envelope rectangle; and v_x(k), v_y(k), v_w(k), and v_h(k) denote the rates of change of x(k), y(k), w(k), and h(k), respectively.

To realize multitarget tracking across multiple cameras
we construct a similarity function for target matching to realize target association and target tracking across multiple cameras. The similarity function is described as follows:

f(a, b) = α M_d(a, b) + β M_s(a, b) + γ M_h(a, b),

M_d(a, b) = (W_a + W_b) / ((W_a + W_b) + D_x) · (H_a + H_b) / ((H_a + H_b) + D_y),

M_s(a, b) = 2 W_a H_a W_b H_b / ((W_a H_a)^2 + (W_b H_b)^2),

M_h(a, b) = 2 h_a h_b / (h_a^2 + h_b^2),   (3)

where a and b denote the targets to be matched; M_d(a, b) ∈ [0, 1] denotes their position similarity (the larger M_d(a, b) is, the closer their positions are); M_s(a, b) ∈ [0, 1] denotes the similarity of their sizes (the larger M_s(a, b) is, the closer their sizes are); M_h(a, b) ∈ [0, 1] denotes the similarity of their heights (the larger M_h(a, b) is, the closer their heights are); α ∈ [0, 1], β ∈ [0, 1], and γ ∈ [0, 1] denote the weight coefficients and satisfy α + β + γ = 1; X_a and Y_a denote the horizontal and vertical coordinates of target a in the world coordinate system, and X_b and Y_b those of target b; W_a and W_b denote half of the widths of targets a and b, respectively, and H_a and H_b half of their heights; D_x = |X_a − X_b| and D_y = |Y_a − Y_b| denote the absolute coordinate differences; h_a and h_b denote their heights in the world coordinate system; and f(a, b) ∈ [0, 1] (the larger f(a, b) is, the higher the matching similarity, and vice versa). In practical applications, α, β, and γ can be adjusted according to the accuracy of M_d(a, b), M_s(a, b), and M_h(a, b).
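A minimal numerical sketch of the constant-velocity model behind (2) and the similarity function (3); the default weights and the dict-based target representation are assumptions for illustration, not prescribed by the paper:

```python
import numpy as np

def constant_velocity_model(dt=1.0):
    """A(k+1, k) and H(k) for the state in (2):
    X = [x, y, w, h, vx, vy, vw, vh]^T, Z = [x, y, w, h]^T."""
    A = np.eye(8)
    A[:4, 4:] = dt * np.eye(4)          # x <- x + dt*vx, ..., h <- h + dt*vh
    H = np.hstack([np.eye(4), np.zeros((4, 4))])
    return A, H

def similarity(a, b, alpha=0.4, beta=0.3, gamma=0.3):
    """f(a, b) from (3); targets are dicts with world coordinates X, Y,
    half-width W, half-height H, and world height h (representation assumed)."""
    Dx = abs(a["X"] - b["X"])
    Dy = abs(a["Y"] - b["Y"])
    Md = ((a["W"] + b["W"]) / (a["W"] + b["W"] + Dx)) * \
         ((a["H"] + b["H"]) / (a["H"] + b["H"] + Dy))
    Ms = 2 * a["W"] * a["H"] * b["W"] * b["H"] / \
         ((a["W"] * a["H"]) ** 2 + (b["W"] * b["H"]) ** 2)
    Mh = 2 * a["h"] * b["h"] / (a["h"] ** 2 + b["h"] ** 2)
    return alpha * Md + beta * Ms + gamma * Mh
```

An identical pair of targets scores f(a, b) = α + β + γ = 1, and the score decays as positions, sizes, or heights diverge.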
An experiment is carried out using the video data sets 3 and 4, which come from a practical surveillance system in the DNC of Beihang University, with a video image resolution of 352 × 288 pixels and a frame rate of 25 fps. The experimental results are shown in Figures 9 and 10, in which the rectangle with dotted blue lines denotes the overlapping area between two cameras. As can be seen, the target tracking algorithm presented in this paper is effective and can track multiple targets across multiple cameras.
4. Fast Camera Calibration and Target Positioning
Figure 14: The experimental results in the indoor environment.

4.1. Fast Camera Calibration. Camera calibration is a key technology for determining the mapping between 3D world coordinates and 2D image coordinates in various computer vision applications. A schematic diagram describing this mapping is shown in Figure 11: (u_d, v_d) is the image coordinate of (x_w, y_w, z_w) under a perfect pinhole camera model, while (u, v) is the actual image coordinate, which deviates from (u_d, v_d) due to lens distortion. The distance between them is termed the radial distortion. Therefore, the mathematical model from 3D world coordinates to 2D image coordinates is expressed by [40]:
u + δu = f_x (r_1(x_w − x_c) + r_2(y_w − y_c) + r_3(z_w − z_c)) / (r_7(x_w − x_c) + r_8(y_w − y_c) + r_9(z_w − z_c)),

v + δv = f_y (r_4(x_w − x_c) + r_5(y_w − y_c) + r_6(z_w − z_c)) / (r_7(x_w − x_c) + r_8(y_w − y_c) + r_9(z_w − z_c)),   (4)
where r_i (i = 1, 2, …, 9) denotes an element of the rotation matrix from the world coordinate frame to the pixel coordinate frame; f_x and f_y denote the focal lengths of the camera in the x and y directions; δu and δv denote the photogrammetric distortions; O_c(x_c, y_c, z_c) denotes the coordinate of the camera in the world coordinate frame; and P(x_w, y_w, z_w) denotes the coordinate of the target in the world coordinate frame.
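Ignoring the distortion terms δu and δv, the forward model (4) reduces to a standard pinhole projection and can be sketched as follows (the function name and argument layout are illustrative):

```python
import numpy as np

def project(Pw, Oc, R, fx, fy):
    """Distortion-free form of (4): delta_u and delta_v are set to zero,
    so the result is the ideal image coordinate of world point Pw.
    R is the world-to-camera rotation with rows (r1 r2 r3), (r4 r5 r6),
    (r7 r8 r9); Oc is the camera position in the world frame."""
    d = R @ (np.asarray(Pw, dtype=float) - np.asarray(Oc, dtype=float))
    u = fx * (d[0] / d[2])   # numerator r1..r3, denominator r7..r9
    v = fy * (d[1] / d[2])   # numerator r4..r6, denominator r7..r9
    return u, v
```

With R = I and the camera at the origin, a point at (1, 2, 10) and f_x = f_y = 100 projects to (10, 20), which matches the ratio structure of (4).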
Traditional calibration algorithms, for example, the DLT algorithm [9] and the Tsai algorithm [10], utilize a series of mathematical transformations and algorithms to obtain the parameters of the camera model. They have been widely used because of their simple mathematical models and theory. However, these algorithms require a large amount of work to record and check calibration points during calibration and are thus inefficient in practical applications. For this reason, a fast calibration algorithm based on the vanishing point theory was introduced in [11, 41], in which the photogrammetric distortions δu and δv are not considered.
According to the mathematical model of the fast camera calibration presented in [11], we developed software to calibrate cameras quickly and to promptly check the accuracy of the camera parameters using (4). The calibration process and results for a practical camera are shown in Figure 12. As can be seen, the focal lengths of the camera are 39983535 pixels, the calibration error of the line segment AP is 398 mm, and
the rotation matrix R and translation vector T (unit: mm) are as follows:

R = [  0.642  −0.766   0.024
      −0.222  −0.216  −0.951
       0.734   0.605  −0.309 ],

T = [  −93.497
       190.111
     62305.937 ].   (5)
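One practical check on calibration output such as (5) is to verify that the estimated R is approximately a proper rotation matrix. The entries below are the values of R read from (5) with decimal points restored; the tolerance is an assumption chosen to accommodate three-decimal rounding:

```python
import numpy as np

def is_rotation(R, tol=5e-3):
    """Check R^T R ~= I and det(R) ~= +1, i.e. R is a proper rotation.
    A loose tolerance accommodates rounded, three-decimal entries."""
    R = np.asarray(R, dtype=float)
    return (np.allclose(R.T @ R, np.eye(3), atol=tol)
            and abs(np.linalg.det(R) - 1.0) < tol)

# Rotation matrix from the calibration result (5), decimals restored:
R = np.array([[ 0.642, -0.766,  0.024],
              [-0.222, -0.216, -0.951],
              [ 0.734,  0.605, -0.309]])
```

A matrix that fails this check (e.g. a scaled identity) indicates a calibration or transcription error rather than a valid camera orientation.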
4.2. Target Positioning. Once targets are continuously tracked, the space coordinates of the targets in the camera coordinate frame can be computed through the imaging model with the camera parameters as follows:
[X_c, Y_c, Z_c]^T = Z_c [ dx/f_x   0      0
                            0     dy/f_y  0
                            0       0     1 ] [u, v, 1]^T − [ u_0 Z_c dx/f_x
                                                              v_0 Z_c dy/f_y
                                                              0 ],   (6)

that is, X_c = Z_c (u − u_0) dx/f_x and Y_c = Z_c (v − v_0) dy/f_y,
where u_0 and v_0 denote the principal point in the pixel frame, and dx and dy denote the pixel sizes of the camera in the x and y directions.

When the targets move across multiple cameras, the space coordinates of the targets in each camera coordinate frame are computed by (6) and then unified into the world coordinate system, as illustrated in Figure 13. The mathematical model is described as follows:
[X_w, Y_w, Z_w]^T = R_i ([X_ci, Y_ci, Z_ci]^T − T_i),  i = 1, 2, …, N,   (7)

where i denotes the ith camera and N denotes the number of cameras.
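The two positioning steps can be sketched together, assuming the usual pinhole convention in which the principal point (u_0, v_0) is subtracted and the depth Z_c along the optical axis is known (e.g. from the flat-ground assumption used later in the paper):

```python
import numpy as np

def pixel_to_camera(u, v, Zc, fx, fy, dx, dy, u0, v0):
    """Model (6): camera-frame coordinates of pixel (u, v) given the
    target depth Zc along the optical axis."""
    Xc = Zc * dx * (u - u0) / fx
    Yc = Zc * dy * (v - v0) / fy
    return np.array([Xc, Yc, Zc])

def camera_to_world(pc, R, T):
    """Model (7): unify a camera-frame point pc into the world frame
    for one camera with rotation R (R_i) and translation T (T_i)."""
    return R @ (np.asarray(pc, dtype=float) - np.asarray(T, dtype=float))
```

Running a detection through both functions, once per camera, yields the per-camera world-frame estimates that the system then fuses into a single trajectory.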
5. Experimental Tests
The target positioning system presented in this paper is tested in both indoor and outdoor environments, in which all the low-cost static cameras are calibrated and the coordinates of
Figure 15: The experimental results in the outdoor environment: (a) 3D scene of the new main building; (b)–(h) views from cameras 1–7.
the cameras are unified into the world coordinate system. Target detection and tracking are performed by the corresponding detection and tracking algorithms. As a result, the targets are positioned in real time through the imaging model and the camera parameters, and their trajectories are displayed in a three-dimensional scene. The test results for the indoor and outdoor environments are shown in Figures 14 and 15, respectively. As can be seen from Figure 14, when a target is continuously moving within an indoor corridor, the positioning system consisting of six distributed cameras is able to position this target in real time and display its trajectory in a three-dimensional space model. Likewise, as can be seen from Figure 15, when a target is continuously moving outdoors, the positioning system consisting of seven distributed cameras is able to position this target and display its trajectory in a three-dimensional space model in real time as well. The experimental results confirm that the systematic framework and the inclusive algorithms presented in this paper are both effective and efficient.
In this paper, we assume that the ground is flat, which rarely holds in practical applications over a large region. To address this problem, it is necessary to use a digital elevation model (DEM) to describe the topographic relief in large regions.
6. Conclusion and Future Work
This paper presented the comprehensive design and implementation of a moving target positioning system based on a distributed camera network. The system is composed of low-cost static cameras, which provide complementary positioning information for moving target positioning in indoor and outdoor environments when GNSS signals are unavailable. In this system, static cameras cover a large region; moving targets are detected and then tracked using the corresponding algorithms; target positions are estimated by making use of the geometrical relationships among the cameras after calibrating them; and, finally, for each target, its position estimates obtained from different cameras are unified into the world coordinate system. The experimental results of the target detection, tracking, and positioning system were reported based on real video data.
Target positioning and tracking with multiple static cameras were verified in both indoor and outdoor environments. However, the reliability and accuracy of target tracking and positioning suffer from several environmental factors. Hence, it is necessary to fuse information from various sensors, such as radar, infrared cameras, inertial measurement units (IMUs), and wireless location systems. Regarding future work, it is meaningful to develop and test these algorithms in practical applications.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This project is supported by the Key Program of the National Natural Science Foundation of China (Grant no. 61039003), the National Natural Science Foundation of China (Grant no. 41274038), the Aeronautical Science Foundation of China (Grant no. 2013ZC51027), the Aerospace Innovation Foundation of China (CASC201102), and the Fundamental Research Funds for the Central Universities.
References
[1] R. Mautz and S. Tilch, "Survey of optical indoor positioning systems," in Proceedings of the International Conference on Indoor Positioning and Indoor Navigation (IPIN '11), Guimaraes, Portugal, September 2011.
[2] G. Retscher, E. Moser, D. Vredeveld et al., "Performance and accuracy test of a WiFi indoor positioning system," Journal of Applied Geodesy, vol. 1, no. 2, pp. 103–110, 2007.
[3] S. J. Ingram, D. Harmer, and M. Quinlan, "Ultra wide band indoor positioning systems and their use in emergencies," in Proceedings of the Position Location and Navigation Symposium (PLANS '04), pp. 706–715, Monterey, Calif, USA, April 2004.
[4] G. Anastasi, R. Bandelloni, M. Conti et al., "Experimenting an indoor bluetooth-based positioning service," in Proceedings of the 23rd International Conference on Distributed Computing Systems Workshops, pp. 480–483, Providence, RI, USA, 2003.
[5] H. Liu, H. Darabi, P. Banerjee, and J. Liu, "Survey of wireless indoor positioning techniques and systems," IEEE Transactions on Systems, Man, and Cybernetics C: Applications and Reviews, vol. 37, no. 6, pp. 1067–1080, 2007.
[6] H.-S. Kim and J.-S. Choi, "Advanced indoor localization using ultrasonic sensor and digital compass," in Proceedings of the International Conference on Control, Automation and Systems (ICCAS '08), pp. 223–226, Seoul, Korea, October 2008.
[7] R. Karlsson, T. B. Schon, D. Tornqvist, G. Conte, and F. Gustafsson, "Utilizing model structure for efficient simultaneous localization and mapping for a UAV application," in Proceedings of the IEEE Aerospace Conference (AC '08), Rome, Italy, March 2008.
[8] D. Nister, O. Naroditsky, and J. Bergen, "Visual odometry," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '04), pp. 652–659, Washington, DC, USA, July 2004.
[9] Y. I. Abdel-Aziz and H. M. Karara, "Direct linear transformation into object space coordinates in close-range photogrammetry," in Proceedings of the Symposium on Close-Range Photogrammetry, pp. 1–18, Urbana, Ill, USA, 1971.
[10] R. Y. Tsai, "A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses," IEEE Journal of Robotics and Automation, vol. RA-3, no. 4, pp. 323–344, 1987.
[11] J. Long, X. Zhang, and L. Zhao, "A fast calibration algorithm based on vanishing point for scene camera," Applied Mechanics and Materials, vol. 58–60, pp. 1148–1153, 2011.
[12] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
[13] P. D. Z. Varcheie and G.-A. Bilodeau, "People tracking using a network-based PTZ camera," Machine Vision and Applications, vol. 22, no. 4, pp. 671–690, 2011.
[14] C.-H. Chen, Y. Yao, D. Page, B. Abidi, A. Koschan, and M. Abidi, "Heterogeneous fusion of omnidirectional and PTZ cameras for multiple object tracking," IEEE Transactions on Circuits and Systems for Video Technology, vol. 18, no. 8, pp. 1052–1063, 2008.
[15] X. Clady, F. Collange, F. Jurie, and P. Martinet, "Object tracking with a pan-tilt-zoom camera: application to car driving assistance," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '01), pp. 1653–1658, Seoul, Korea, May 2001.
[16] N. Bellotto, E. Sommerlade, B. Benfold et al., "A distributed camera system for multi-resolution surveillance," in Proceedings of the 3rd ACM/IEEE International Conference on Distributed Smart Cameras (ICDSC '09), Como, Italy, September 2009.
[17] C. Ding, B. Song, A. Morye et al., "Collaborative sensing in a distributed PTZ camera network," IEEE Transactions on Image Processing, vol. 21, no. 7, pp. 3282–3295, 2012.
[18] E. B. Ermis, P. Clarot, P.-M. Jodoin, and V. Saligrama, "Activity based matching in distributed camera networks," IEEE Transactions on Image Processing, vol. 19, no. 10, pp. 2595–2613, 2010.
[19] D.-M. Tsai and S.-C. Lai, "Independent component analysis-based background subtraction for indoor surveillance," IEEE Transactions on Image Processing, vol. 18, no. 1, pp. 158–167, 2009.
[20] J. Migdal and W. E. L. Grimson, "Background subtraction using Markov thresholds," in Proceedings of the IEEE Workshop on Motion and Video Computing (MOTION '05), pp. 58–65, Breckenridge, Colo, USA, January 2005.
[21] C. Stauffer and W. E. L. Grimson, "Learning patterns of activity using real-time tracking," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 8, pp. 747–757, 2000.
[22] D.-S. Lee, "Effective Gaussian mixture learning for video background subtraction," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 5, pp. 827–832, 2005.
[23] L. Zhao and X. He, "Adaptive Gaussian mixture learning for moving object detection," in Proceedings of the 3rd IEEE International Conference on Broadband Network and Multimedia Technology (IC-BNMT '10), pp. 1176–1180, Beijing, China, October 2010.
[24] S. Sivaraman and M. M. Trivedi, "Looking at vehicles on the road: a survey of vision-based vehicle detection, tracking, and behavior analysis," IEEE Transactions on Intelligent Transportation Systems, vol. 14, no. 4, pp. 1773–1795, 2013.
[25] G. H. Wen, Z. S. Duan, G. R. Chen, and W. W. Yu, "Consensus tracking of multi-agent systems with Lipschitz-type node dynamics and switching topologies," IEEE Transactions on Circuits and Systems, vol. 60, no. 9, pp. 1–13, 2013.
[26] S. Oh, S. Russell, and S. Sastry, "Markov chain Monte Carlo data association for multi-target tracking," IEEE Transactions on Automatic Control, vol. 54, no. 3, pp. 481–497, 2009.
[27] H. Zhang and L. Zhao, "Integral channel features for particle filter based object tracking," in Proceedings of the 5th International Conference on Intelligent Human-Machine Systems and Cybernetics, pp. 190–193, Hangzhou, China, 2013.
[28] M. Taj and A. Cavallaro, "Distributed and decentralized multi-camera tracking," IEEE Signal Processing Magazine, vol. 28, no. 3, pp. 46–58, 2011.
[29] N. Krahnstoever, T. Yu, S. Lim et al., "Collaborative real-time control of active cameras in large scale surveillance systems," in Proceedings of the Workshop on Multi-camera and Multi-modal Sensor Fusion Algorithms and Applications, pp. 1–12, Marseille, France, 2008.
[30] C. Braillon, C. Pradalier, J. L. Crowley, and C. Laugier, "Real-time moving obstacle detection using optical flow models," in Proceedings of the IEEE Intelligent Vehicles Symposium (IV '06), pp. 466–471, Tokyo, Japan, June 2006.
[31] C. Kim and J. N. Hwang, "Fast and automatic video object segmentation and tracking for content-based applications," IEEE Transactions on Circuits and Systems for Video Technology, vol. 12, no. 2, pp. 122–129, 2002.
[32] R. Jain and H. H. Nagel, "On the analysis of accumulative difference pictures from image sequences of real world scenes," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 1, no. 2, pp. 206–214, 1979.
[33] O. Barnich and M. Van Droogenbroeck, "ViBE: a powerful random technique to estimate the background in video sequences," in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '09), pp. 945–948, Taipei, Taiwan, April 2009.
[34] A. Elgammal, D. Harwood, and L. Davis, "Non-parametric model for background subtraction," in Proceedings of the 6th European Conference on Computer Vision, Part II, pp. 751–767, London, UK, 2000.
[35] H. W. Mao, C. H. Yang, G. P. Abousleman, and J. Si, "Automated multiple target detection and tracking in UAV videos," in Airborne Intelligence, Surveillance, Reconnaissance (ISR) Systems and Applications VII, vol. 7668 of Proceedings of SPIE, Orlando, Fla, USA, 2010.
[36] H. Medeiros, J. Park, and A. C. Kak, "Distributed object tracking using a cluster-based Kalman filter in wireless camera networks," IEEE Journal on Selected Topics in Signal Processing, vol. 2, no. 4, pp. 448–463, 2008.
[37] A. Andriyenko and K. Schindler, "Multi-target tracking by continuous energy minimization," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR '11), pp. 1265–1272, Colorado Springs, Colo, USA, June 2011.
[38] A. Andriyenko and K. Schindler, "Globally optimal multi-target tracking on a hexagonal lattice," in Proceedings of the 11th European Conference on Computer Vision, pp. 466–479, Crete, Greece, 2010.
[39] J. Y. Bouguet, "Pyramidal implementation of the affine Lucas Kanade feature tracker: description of the algorithm," Intel Corporation, pp. 1–9, 2001.
[40] Y. Wang, G. Hu, and Z. Chen, "Calibration of CCD camera in image matching experimental equipment," in Proceedings of the 2nd International Symposium on Instrumentation Science and Technology, p. 3146, Jinan, China, August 2002.
[41] B. Caprile and V. Torre, "Using vanishing points for camera calibration," International Journal of Computer Vision, vol. 4, no. 2, pp. 127–139, 1990.
Mathematical Problems in Engineering 7
Figure 13: The schematic diagram of the target positioning (the world frame O_w(X_w, Y_w, Z_w), the camera frames O_c1(X_c1, Y_c1, Z_c1) and O_c2(X_c2, Y_c2, Z_c2), and the vectors r_w^c1, r_w^c2, and r_c1^c2 relating them).
corresponds to which target, constraints between targets need to be taken into account. Multitarget tracking algorithms can be roughly divided into two categories: recursive algorithms and nonrecursive algorithms. Recursive algorithms base their estimate only on the state of the previous frame, such as Kalman filtering [36] and particle filtering [27]. Nonrecursive algorithms seek optimality over an extended period of time, using different strategies to obtain an optimal solution over multiple frames, and can thus better cope with ambiguous multimodal distributions [37, 38].
In practical applications, target tracking must take target splitting and merging into account because of factors such as illumination changes and occlusion. Since the one-to-one tracking assumption rarely holds, the multitarget tracking problem is still challenging. In order to realize multitarget tracking, we propose a solution combining the pyramid Lucas-Kanade feature tracker [39] and a Kalman filter. The mathematical model of the Kalman filter is described as follows:
X(k + 1) = A(k + 1, k) X(k) + W(k),
Z(k) = H(k) X(k) + V(k),
(2)

where X(k) = [x(k), y(k), w(k), h(k), v_x(k), v_y(k), v_w(k), v_h(k)]^T denotes the state, A(k + 1, k) denotes the state transition matrix, W(k) denotes the system noise, H(k) denotes the measurement matrix, Z(k) = [x(k), y(k), w(k), h(k)]^T denotes the measurement, V(k) denotes the measurement noise, x(k) and y(k) denote the horizontal and vertical coordinates of the target centroid, w(k) and h(k) denote the width and height of the target envelope rectangle, and v_x(k), v_y(k), v_w(k), and v_h(k) denote the rates of change of x(k), y(k), w(k), and h(k), respectively.

To realize multitarget tracking across multiple cameras, we construct a similarity function for target matching to realize target association and target tracking across multiple cameras. The similarity function is described as follows:
f(a, b) = α M_d(a, b) + β M_s(a, b) + γ M_h(a, b),

M_d(a, b) = (W_a + W_b) / ((W_a + W_b) + D_x) · (H_a + H_b) / ((H_a + H_b) + D_y),

M_s(a, b) = 2 W_a H_a W_b H_b / ((W_a H_a)^2 + (W_b H_b)^2),

M_h(a, b) = 2 h_a h_b / (h_a^2 + h_b^2),
(3)
where a and b denote the targets to be matched; M_d(a, b) ∈ [0, 1] denotes their position similarity (the larger M_d(a, b) is, the closer their positions are); M_s(a, b) ∈ [0, 1] denotes the similarity of their sizes (the larger M_s(a, b) is, the closer their sizes are); M_h(a, b) ∈ [0, 1] denotes the similarity of their heights (the larger M_h(a, b) is, the closer their heights are); α ∈ [0, 1], β ∈ [0, 1], and γ ∈ [0, 1] denote the weight coefficients and satisfy α + β + γ = 1. X_a and Y_a denote the horizontal and vertical coordinates of target a in the world coordinate system, and X_b and Y_b denote those of target b; W_a and W_b denote half of the widths of targets a and b, respectively, and H_a and H_b denote half of their heights; D_x = |X_a − X_b| denotes the absolute difference between X_a and X_b, and D_y = |Y_a − Y_b| denotes the absolute difference between Y_a and Y_b; h_a and h_b denote their heights in the world coordinate system. f(a, b) ∈ [0, 1]: the larger f(a, b) is, the higher the matching similarity is, and vice versa. In practical applications, α, β, and γ can be adjusted according to the accuracy of M_d(a, b), M_s(a, b), and M_h(a, b).
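As a concrete illustration, the similarity function (3) reduces to a few lines of code. This is a minimal sketch: the dictionary field names, default weights, and target attributes are illustrative assumptions, not taken from the paper.

```python
def similarity(a, b, alpha=0.4, beta=0.3, gamma=0.3):
    """Target-matching similarity f(a, b) of Eq. (3).

    a and b are dicts with world-frame centroid (X, Y), half-width W,
    half-height H, and world height h (field names are illustrative).
    The weights must satisfy alpha + beta + gamma = 1.
    """
    Dx = abs(a["X"] - b["X"])
    Dy = abs(a["Y"] - b["Y"])
    # Position similarity M_d: equals 1 when the centroids coincide.
    Md = ((a["W"] + b["W"]) / (a["W"] + b["W"] + Dx)) * \
         ((a["H"] + b["H"]) / (a["H"] + b["H"] + Dy))
    # Size similarity M_s: equals 1 when the envelope areas are equal.
    sa, sb = a["W"] * a["H"], b["W"] * b["H"]
    Ms = 2 * sa * sb / (sa ** 2 + sb ** 2)
    # Height similarity M_h: equals 1 when the world heights are equal.
    Mh = 2 * a["h"] * b["h"] / (a["h"] ** 2 + b["h"] ** 2)
    return alpha * Md + beta * Ms + gamma * Mh
```

For identical targets every term is 1, so f(a, b) = α + β + γ = 1; any positional or size mismatch pulls the score below 1.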
An experiment is carried out using video data sets 3 and 4, which come from a practical surveillance system in the DNC of Beihang University, with a video image resolution of 352 × 288 pixels and a frame rate of 25 fps. The experimental results are shown in Figures 9 and 10, in which the rectangle with dotted blue lines denotes the overlapping area between two cameras. As can be seen, the target tracking algorithm presented in this paper is effective and can track multiple targets across multiple cameras.
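For completeness, the constant-velocity Kalman model of (2) used in the tracker above can be sketched numerically. The noise covariances Q and R and the unit time step are illustrative assumptions; in practice they are tuned to the detector's jitter.

```python
import numpy as np

# Constant-velocity Kalman filter for the state of Eq. (2):
# X = [x, y, w, h, vx, vy, vw, vh]^T, measurement Z = [x, y, w, h]^T.
dt = 1.0                                    # time step of one frame (assumed)
A = np.eye(8)
A[:4, 4:] = dt * np.eye(4)                  # x_{k+1} = x_k + dt * vx_k, etc.
H = np.hstack([np.eye(4), np.zeros((4, 4))])
Q = 1e-2 * np.eye(8)                        # system-noise covariance (assumed)
R = 1.0 * np.eye(4)                         # measurement-noise covariance (assumed)

def kf_step(x, P, z):
    """One predict/update cycle of Eq. (2); returns filtered state and covariance."""
    x_pred = A @ x                           # predict with the transition matrix
    P_pred = A @ P @ A.T + Q
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    x_new = x_pred + K @ (z - H @ x_pred)    # correct with the measured box
    P_new = (np.eye(8) - K @ H) @ P_pred
    return x_new, P_new
```

Feeding the filter the tracked bounding boxes smooths detection jitter, and the prediction step can bridge short gaps when a target is briefly lost.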
4. Fast Camera Calibration and Target Positioning
4.1. Fast Camera Calibration. Camera calibration is a key technology in determining the mapping between 3D world coordinates and 2D image coordinates for various computer vision applications. A schematic diagram describing this mapping is shown in Figure 11. (u_d, v_d) is the image coordinate of (x_w, y_w, z_w) under a perfect pinhole camera model; (u, v) is the actual image coordinate, which deviates from (u_d, v_d) due to lens distortion. The distance between them is termed the radial distortion. Therefore, the mathematical
Figure 14 The experimental results in indoor environment
model from 3D world coordinates to 2D image coordinates is expressed by [40]:

u + δu = f_x [r_1(x_w - x_c) + r_2(y_w - y_c) + r_3(z_w - z_c)] / [r_7(x_w - x_c) + r_8(y_w - y_c) + r_9(z_w - z_c)],

v + δv = f_y [r_4(x_w - x_c) + r_5(y_w - y_c) + r_6(z_w - z_c)] / [r_7(x_w - x_c) + r_8(y_w - y_c) + r_9(z_w - z_c)],
(4)
where r_i (i = 1, 2, ..., 9) denotes an element of the rotation matrix from the world coordinate frame to the camera coordinate frame; f_x and f_y denote the focal lengths of the camera in the x and y directions; δu and δv denote the photogrammetric distortions; O_c(x_c, y_c, z_c) denotes the coordinate of the camera in the world coordinate frame; and P(x_w, y_w, z_w) denotes the coordinate of the target in the world coordinate frame.
Traditional calibration algorithms, for example, the DLT algorithm [9] and the Tsai algorithm [10], utilize a series of mathematical transformations to obtain the parameters of the camera model. They have been widely used because of their simple mathematical model and theory. However, these algorithms require a large amount of work to record and check calibration points during calibration and thus are inefficient in practical applications. For this reason, a fast calibration algorithm based on the vanishing point theory was introduced in [11, 41], in which the photogrammetric distortions δu and δv are not considered.
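The vanishing-point approach rests on a classical identity from [41]: for square pixels and a known principal point p, the vanishing points v1 and v2 of two orthogonal scene directions satisfy (v1 - p)·(v2 - p) = -f². A hedged sketch of the focal-length step only (the function name is ours; a full calibration must also estimate the vanishing points themselves and the extrinsic parameters):

```python
import numpy as np

def focal_from_vanishing_points(v1, v2, p):
    """Focal length (pixels) from the vanishing points of two orthogonal
    scene directions, assuming square pixels and principal point p."""
    d = -np.dot(np.asarray(v1, float) - p, np.asarray(v2, float) - p)
    if d <= 0:
        raise ValueError("vanishing points not consistent with orthogonal directions")
    return float(np.sqrt(d))
```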
Based on the mathematical model of the fast camera calibration presented in [11], we developed software to calibrate cameras quickly and to promptly check the accuracy of the camera parameters using (4). The calibration process and results for a practical camera are shown in Figure 12. As can be seen, the focal lengths of the camera are 39983535 pixels, the calibration error of the line segment AP is 398 mm, and the rotation matrix R and translation vector T (unit: mm) are as follows:
R = [  0.642  -0.766   0.024
      -0.222  -0.216  -0.951
       0.734   0.605  -0.309 ],

T = [ -93497
       190111
       62305937 ].
(5)
4.2. Target Positioning. Once targets are continuously tracked, the space coordinates of the targets in the camera coordinate frame can be computed from the imaging model with the camera parameters as follows:
[X_c, Y_c, Z_c]^T = Z_c [[d_x/f_x, 0, 0], [0, d_y/f_y, 0], [0, 0, 1]] [u, v, 1]^T - [u_0 Z_c d_x/f_x, v_0 Z_c d_y/f_y, 0]^T,
(6)

that is, X_c = (u - u_0) Z_c d_x / f_x and Y_c = (v - v_0) Z_c d_y / f_y,
where u_0 and v_0 denote the principal point in the pixel frame, and d_x and d_y denote the pixel sizes of the camera in the x and y directions.

When the targets move across multiple cameras, the space coordinates of the targets in each camera coordinate frame are computed by (6), respectively, and then unified into the world coordinate system, as illustrated in Figure 13. The mathematical model is described as follows:
[X_w, Y_w, Z_w]^T = R_i ([X_ci, Y_ci, Z_ci]^T - T_i),  i = 1, 2, ..., N,
(7)

where i denotes the i-th camera and N denotes the number of cameras.
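Put together, (6) and (7) lift a pixel to camera coordinates at a known depth Z_c and then map it into the world frame. A sketch using the standard convention X_c = (u - u_0) Z_c d_x / f_x; the camera-parameter field names are illustrative assumptions:

```python
import numpy as np

def pixel_to_world(u, v, Zc, cam):
    """Eqs. (6)-(7): back-project pixel (u, v) at depth Zc for one camera.

    cam holds fx, fy (focal lengths), dx, dy (pixel sizes), the principal
    point (u0, v0), and the per-camera R, T of Eq. (7).
    """
    Xc = (u - cam["u0"]) * Zc * cam["dx"] / cam["fx"]   # Eq. (6)
    Yc = (v - cam["v0"]) * Zc * cam["dy"] / cam["fy"]
    Pc = np.array([Xc, Yc, Zc])
    return cam["R"] @ (Pc - cam["T"])                    # Eq. (7)
```

Estimates of the same target from several cameras then live in one world frame, where they can be associated and fused.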
5. Experimental Test
The target positioning system presented in this paper is tested in indoor and outdoor environments, in which all the low-cost static cameras are calibrated and the coordinates of
Figure 15: The experimental results in outdoor environment: (a) 3D scene of the new main building; (b)-(h) cameras 1-7.
cameras are unified into the world coordinate system. Target detection and tracking are performed by the target detection and tracking algorithms. As a result, the targets are positioned via the imaging model and camera parameters in real time, and their trajectories are displayed in a three-dimensional scene.
The test results are shown in Figures 14 and 15 for indoor and outdoor environments, respectively. As can be seen from Figure 14, when a target is continuously moving within an indoor corridor, the positioning system consisting of six distributed cameras is able to position this target in real
time and display its trajectory in a three-dimensional space model. Likewise, as can be seen from Figure 15, when a target is continuously moving outdoors, the positioning system consisting of seven distributed cameras is able to position this target and display its trajectory in a three-dimensional space model in real time as well. The experimental results confirm that the systematic framework and its inclusive algorithms are both effective and efficient.
In this paper, we assume that the ground is flat, which rarely holds in practical applications over a large region. In order to solve this problem, it is necessary to use a digital elevation model (DEM) to describe the topographic relief in large regions.
6. Conclusion and Future Work
This paper presented the comprehensive design and implementation of a moving target positioning system based on a distributed camera network. The system is composed of low-cost static cameras, which provide complementary positioning information for moving target positioning in indoor and outdoor environments when GNSS signals are unavailable. In this system, static cameras cover a large region; moving targets are detected and then tracked using the corresponding algorithms; target positions are estimated by making use of the geometrical relationships among the cameras after calibrating them; and, finally, for each target, its position estimates obtained from different cameras are unified into the world coordinate system. The experimental results of the target detection, tracking, and positioning system were reported based on real video data.
Target positioning and tracking with multiple static cameras were verified in both indoor and outdoor environments. However, the reliability and accuracy of target tracking and positioning suffer from several environmental factors. Hence, it is necessary to fuse information from various sensors, such as radar, infrared cameras, inertial measurement units (IMUs), and wireless location systems. Regarding future work, it is meaningful to develop and test these algorithms in practical applications.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This project is supported by the Key Program of the National Natural Science Foundation of China (Grant no. 61039003), the National Natural Science Foundation of China (Grant no. 41274038), the Aeronautical Science Foundation of China (Grant no. 2013ZC51027), the Aerospace Innovation Foundation of China (CASC201102), and the Fundamental Research Funds for the Central Universities.
References
[1] R. Mautz and S. Tilch, "Survey of optical indoor positioning systems," in Proceedings of the International Conference on Indoor Positioning and Indoor Navigation (IPIN '11), Guimaraes, Portugal, September 2011.
[2] G. Retscher, E. Moser, D. Vredeveld et al., "Performance and accuracy test of a WiFi indoor positioning system," Journal of Applied Geodesy, vol. 1, no. 2, pp. 103–110, 2007.
[3] S. J. Ingram, D. Harmer, and M. Quinlan, "Ultra wide band indoor positioning systems and their use in emergencies," in Proceedings of the Position Location and Navigation Symposium (PLANS '04), pp. 706–715, Monterey, Calif, USA, April 2004.
[4] G. Anastasi, R. Bandelloni, M. Conti et al., "Experimenting an indoor bluetooth-based positioning service," in Proceedings of the 23rd International Conference on Distributed Computing Systems Workshops, pp. 480–483, Providence, RI, USA, 2003.
[5] H. Liu, H. Darabi, P. Banerjee, and J. Liu, "Survey of wireless indoor positioning techniques and systems," IEEE Transactions on Systems, Man, and Cybernetics C: Applications and Reviews, vol. 37, no. 6, pp. 1067–1080, 2007.
[6] H.-S. Kim and J.-S. Choi, "Advanced indoor localization using ultrasonic sensor and digital compass," in Proceedings of the International Conference on Control, Automation and Systems (ICCAS '08), pp. 223–226, Seoul, Korea, October 2008.
[7] R. Karlsson, T. B. Schon, D. Tornqvist, G. Conte, and F. Gustafsson, "Utilizing model structure for efficient simultaneous localization and mapping for a UAV application," in Proceedings of the IEEE Aerospace Conference (AC '08), Rome, Italy, March 2008.
[8] D. Nister, O. Naroditsky, and J. Bergen, "Visual odometry," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '04), pp. 652–659, Washington, DC, USA, July 2004.
[9] Y. I. Abdel-Aziz and H. M. Karara, "Direct linear transformation into object space coordinates in close-range photogrammetry," in Proceedings of the Symposium on Close-Range Photogrammetry, pp. 1–18, Urbana, Ill, USA, 1971.
[10] R. Y. Tsai, "A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses," IEEE Journal of Robotics and Automation, vol. RA-3, no. 4, pp. 323–344, 1987.
[11] J. Long, X. Zhang, and L. Zhao, "A fast calibration algorithm based on vanishing point for scene camera," Applied Mechanics and Materials, vol. 58–60, pp. 1148–1153, 2011.
[12] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
[13] P. D. Z. Varcheie and G.-A. Bilodeau, "People tracking using a network-based PTZ camera," Machine Vision and Applications, vol. 22, no. 4, pp. 671–690, 2011.
[14] C.-H. Chen, Y. Yao, D. Page, B. Abidi, A. Koschan, and M. Abidi, "Heterogeneous fusion of omnidirectional and PTZ cameras for multiple object tracking," IEEE Transactions on Circuits and Systems for Video Technology, vol. 18, no. 8, pp. 1052–1063, 2008.
[15] X. Clady, F. Collange, F. Jurie, and P. Martinet, "Object tracking with a pan-tilt-zoom camera: application to car driving assistance," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '01), pp. 1653–1658, Seoul, Korea, May 2001.
[16] N. Bellotto, E. Sommerlade, B. Benfold et al., "A distributed camera system for multi-resolution surveillance," in Proceedings of the 3rd ACM/IEEE International Conference on Distributed Smart Cameras (ICDSC '09), Como, Italy, September 2009.
[17] C. Ding, B. Song, A. Morye et al., "Collaborative sensing in a distributed PTZ camera network," IEEE Transactions on Image Processing, vol. 21, no. 7, pp. 3282–3295, 2012.
[18] E. B. Ermis, P. Clarot, P.-M. Jodoin, and V. Saligrama, "Activity based matching in distributed camera networks," IEEE Transactions on Image Processing, vol. 19, no. 10, pp. 2595–2613, 2010.
[19] D.-M. Tsai and S.-C. Lai, "Independent component analysis-based background subtraction for indoor surveillance," IEEE Transactions on Image Processing, vol. 18, no. 1, pp. 158–167, 2009.
[20] J. Migdal and W. E. L. Grimson, "Background subtraction using Markov thresholds," in Proceedings of the IEEE Workshop on Motion and Video Computing (MOTION '05), pp. 58–65, Breckenridge, Colo, USA, January 2005.
[21] C. Stauffer and W. E. L. Grimson, "Learning patterns of activity using real-time tracking," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 8, pp. 747–757, 2000.
[22] D.-S. Lee, "Effective Gaussian mixture learning for video background subtraction," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 5, pp. 827–832, 2005.
[23] L. Zhao and X. He, "Adaptive Gaussian mixture learning for moving object detection," in Proceedings of the 3rd IEEE International Conference on Broadband Network and Multimedia Technology (IC-BNMT '10), pp. 1176–1180, Beijing, China, October 2010.
[24] S. Sivaraman and M. M. Trivedi, "Looking at vehicles on the road: a survey of vision-based vehicle detection, tracking, and behavior analysis," IEEE Transactions on Intelligent Transportation Systems, vol. 14, no. 4, pp. 1773–1795, 2013.
[25] G. H. Wen, Z. S. Duan, G. R. Chen, and W. W. Yu, "Consensus tracking of multi-agent systems with Lipschitz-type node dynamics and switching topologies," IEEE Transactions on Circuits and Systems, vol. 60, no. 9, pp. 1–13, 2013.
[26] S. Oh, S. Russell, and S. Sastry, "Markov chain Monte Carlo data association for multi-target tracking," IEEE Transactions on Automatic Control, vol. 54, no. 3, pp. 481–497, 2009.
[27] H. Zhang and L. Zhao, "Integral channel features for particle filter based object tracking," in Proceedings of the 5th International Conference on Intelligent Human-Machine Systems and Cybernetics, pp. 190–193, Hangzhou, China, 2013.
[28] M. Taj and A. Cavallaro, "Distributed and decentralized multi-camera tracking," IEEE Signal Processing Magazine, vol. 28, no. 3, pp. 46–58, 2011.
[29] N. Krahnstoever, T. Yu, S. Lim et al., "Collaborative real-time control of active cameras in large scale surveillance systems," in Proceedings of the Workshop on Multi-camera and Multi-modal Sensor Fusion Algorithms and Applications, pp. 1–12, Marseille, France, 2008.
[30] C. Braillon, C. Pradalier, J. L. Crowley, and C. Laugier, "Real-time moving obstacle detection using optical flow models," in Proceedings of the IEEE Intelligent Vehicles Symposium (IV '06), pp. 466–471, Tokyo, Japan, June 2006.
[31] C. Kim and J. N. Hwang, "Fast and automatic video object segmentation and tracking for content-based applications," IEEE Transactions on Circuits and Systems for Video Technology, vol. 12, no. 2, pp. 122–129, 2002.
[32] R. Jain and H. H. Nagel, "On the analysis of accumulative difference pictures from image sequences of real world scenes," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 1, no. 2, pp. 206–214, 1978.
[33] O. Barnich and M. Van Droogenbroeck, "ViBE: a powerful random technique to estimate the background in video sequences," in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '09), pp. 945–948, Taipei, Taiwan, April 2009.
[34] A. Elgammal, D. Harwood, and L. Davis, "Non-parametric model for background subtraction," in Proceedings of the 6th European Conference on Computer Vision, Part II, pp. 751–767, London, UK, 2000.
[35] H. W. Mao, C. H. Yang, G. P. Abousleman, and J. Si, "Automated multiple target detection and tracking in UAV videos," in Airborne Intelligence, Surveillance, Reconnaissance (ISR) Systems and Applications VII, vol. 7668 of Proceedings of SPIE, Orlando, Fla, USA, 2010.
[36] H. Medeiros, J. Park, and A. C. Kak, "Distributed object tracking using a cluster-based Kalman filter in wireless camera networks," IEEE Journal on Selected Topics in Signal Processing, vol. 2, no. 4, pp. 448–463, 2008.
[37] A. Andriyenko and K. Schindler, "Multi-target tracking by continuous energy minimization," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR '11), pp. 1265–1272, Colorado Springs, Colo, USA, June 2011.
[38] A. Andriyenko and K. Schindler, "Globally optimal multi-target tracking on a hexagonal lattice," in Proceedings of the 11th European Conference on Computer Vision, pp. 466–479, Crete, Greece, 2010.
[39] J. Y. Bouguet, "Pyramidal implementation of the affine Lucas Kanade feature tracker: description of the algorithm," Intel Corporation, pp. 1–9, 2001.
[40] Y. Wang, G. Hu, and Z. Chen, "Calibration of CCD camera in image matching experimental equipment," in Proceedings of the 2nd International Symposium on Instrumentation Science and Technology, p. 3146, Jinan, China, August 2002.
[41] B. Caprile and V. Torre, "Using vanishing points for camera calibration," International Journal of Computer Vision, vol. 4, no. 2, pp. 127–139, 1990.
[16] N Bellotto E Sommerlade B Benfold et al ldquoA dstributedcamera system formulti-resolution surveillancerdquo in Proceedings
Mathematical Problems in Engineering 11
of the 3rd ACMIEEE International Conference on DistributedSmart Cameras (ICDSC rsquo09) Como Italy September 2009
[17] C Ding B Song A Morye et al ldquoCollaborative sensing in adistributed ptz camera networkrdquo IEEE Transactions on ImageProcessing vol 21 no 7 pp 3282ndash3295 2012
[18] E B Ermis P Clarot P-M Jodoin and V Saligrama ldquoActivitybasedmatching in distributed camera networksrdquo IEEE Transac-tions on Image Processing vol 19 no 10 pp 2595ndash2613 2010
[19] D-M Tsai and S-C Lai ldquoIndependent component analysis-based background subtraction for indoor surveillancerdquo IEEETransactions on Image Processing vol 18 no 1 pp 158ndash167 2009
[20] J Migdal and W E L Grimson ldquoBackground subtractionusing Markov thresholdsrdquo in Proceedings of the IEEE Workshopon Motion and Video Computing (MOTION rsquo05) pp 58ndash65Breckenridge Colo USA January 2005
[21] C Stauffer andW E L Grimson ldquoLearning patterns of activityusing real-time trackingrdquo IEEETransactions on Pattern Analysisand Machine Intelligence vol 22 no 8 pp 747ndash757 2000
[22] D-S Lee ldquoEffective Gaussian mixture learning for video back-ground subtractionrdquo IEEE Transactions on Pattern Analysis andMachine Intelligence vol 27 no 5 pp 827ndash832 2005
[23] L Zhao and X He ldquoAdaptive Gaussian mixture learningfor moving object detectionrdquo in Proceedings of the 3rd IEEEInternational Conference on Broadband Network and Multime-dia Technology (IC-BNMT rsquo10) pp 1176ndash1180 Beijing ChinaOctober 2010
[24] S Sivaraman and M M Trivedi ldquoLooking at vehicles on theroad a survey of vision-based vehicle detection tracking andbehavior analysisrdquo IEEE Transactions on Intelligent Transporta-tion Systems vol 14 no 4 pp 1773ndash1795 2013
[25] G H Wen Z S Duan G R Chen and W W Yu ldquoConsen-sus tracking of multi-agent systems with lipschitz-type nodedynamics and switching topologiesrdquo IEEE Transactions onCircuits and Systems vol 60 no 9 pp 1ndash13 2013
[26] S Oh S Russell and S Sastry ldquoMarkov chain Monte Carlodata association formulti-target trackingrdquo IEEETransactions onAutomatic Control vol 54 no 3 pp 481ndash497 2009
[27] H Zhang and L Zhao ldquoIntegral channel features for particlefilter based object trackingrdquo in Proceedings of the 5th Interna-tional Conference on Intelligent Human-Machine Systems andCybernetics pp 190ndash193 Hangzhou China 2013
[28] M Taj and A Cavallaro ldquoDistributed and decentralized multi-camera trackingrdquo IEEE Signal Processing Magazine vol 28 no3 pp 46ndash58 2011
[29] N Krahnstoever T Yu S Lim et al ldquoCollaborative real-timecontrol of active cameras in large scale surveillance systemsrdquo inProceedings of the Workshop on Multi-camera and Multi-modalSensor Fusion Algorithms and Applications pp 1ndash12 MarseilleFrance 2008
[30] C Braillon C Pradalier J L Crowley and C Laugier ldquoReal-time moving obstacle detection using optical flow modelsrdquo inProceedings of the IEEE Intelligent Vehicles Symposium (IV rsquo06)pp 466ndash471 Tokyo Japan June 2006
[31] C Kim and J N Hwang ldquoFast and automatic video objectsegmentation and tracking for content-based applicationsrdquoIEEE Transactions on Circuits and Systems for Video Technologyvol 12 no 2 pp 122ndash129 2002
[32] R Jain and H H Nagel ldquoOn the analysis of accumulativedifference pictures from image sequences of real world scenesrdquoIEEE Transactions on Pattern Analysis andMachine Intelligencevol 1 no 2 pp 206ndash214 1978
[33] O Barnich andM Van Droogenbroeck ldquoViBE a powerful ran-dom technique to estimate the background in video sequencesrdquoin Proceedings of the IEEE International Conference onAcousticsSpeech and Signal Processing (ICASSP rsquo09) pp 945ndash948 TaibeiTaiwan April 2009
[34] A Elgammal D Harwood and L Davis ldquoNon-parametricmodel for background subtractionrdquo in Proceedings of the the 6thEuropean Conference on Computer Vision-Part II pp 751ndash767London UK 2000
[35] HWMao C H Yang G P Abousleman and J Si ldquoAutomatedmultiple target detection and tracking in UAV videosrdquo inAirborne Intelligence Surveillance Reconnaissance (ISR) Systemsand Applications VII vol 7668 of Proceedings of SPIE OrlandoFla USA 2010
[36] H Medeiros J Park and A C Kak ldquoDistributed objecttracking using a cluster-based Kalman filter in wireless cameranetworksrdquo IEEE Journal on Selected Topics in Signal Processingvol 2 no 4 pp 448ndash463 2008
[37] A Andriyenko and K Schindler ldquoMulti-target tracking bycontinuous energy minimizationrdquo in Proceedings of the IEEEConference on Computer Vision and Pattern Recognition (CVPRrsquo11) pp 1265ndash1272 Colorado Springs Colo USA June 2011
[38] A Andriyenko and K Schindler ldquoGlobally optimal multi targettracking on a hexagonal latticerdquo in Proceedings of the 11thEuropean Conference on Computer Vision pp 466ndash479 CreteGreece 2010
[39] J Y Bouguet ldquoPyramidal implementation of the affine lucaskanade feature tracker description of the algorithmrdquo IntelCorporation pp 1ndash9 2001
[40] Y Wang G Hu and Z Chen ldquoCalibration of CCD camera inimage matching experimental equipmentrdquo in Proceedings of the2nd International Symposium on Instrumentation Science andTechnology p 3146 Jinan China August 2002
[41] B Caprile and V Torre ldquoUsing vanishing points for cameracalibrationrdquo International Journal of Computer Vision vol 4 no2 pp 127ndash139 1990
Submit your manuscripts athttpwwwhindawicom
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
MathematicsJournal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Mathematical Problems in Engineering
Hindawi Publishing Corporationhttpwwwhindawicom
Differential EquationsInternational Journal of
Volume 2014
Applied MathematicsJournal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Journal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Mathematical PhysicsAdvances in
Complex AnalysisJournal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
OptimizationJournal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014
International Journal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Operations ResearchAdvances in
Journal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Function Spaces
Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014
International Journal of Mathematics and Mathematical Sciences
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Algebra
Discrete Dynamics in Nature and Society
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Decision SciencesAdvances in
Discrete MathematicsJournal of
Hindawi Publishing Corporationhttpwwwhindawicom
Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Stochastic AnalysisInternational Journal of
Mathematical Problems in Engineering 9
Figure 15: The experimental results in the outdoor environment. (a) 3D scene of the new main building; (b)-(h) views from Cameras 1-7.
cameras are unified into the world coordinate system. Target detection and tracking are performed by the target detection and tracking algorithms. As a result, the targets are positioned through the imaging model and the camera parameters in real time, and their trajectories are displayed in a three-dimensional scene.
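The positioning step described above, back-projecting a tracked image point through the calibrated imaging model onto the ground, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: it assumes a pinhole camera with intrinsic matrix K and a world-to-camera pose (R, t), and a flat ground plane Z = 0 as in the paper's experiments.

```python
import numpy as np

def pixel_to_ground(u, v, K, R, t):
    """Back-project pixel (u, v) onto the ground plane Z = 0.

    Uses the pinhole model x ~ K (R X + t): the camera ray through
    the pixel is expressed in world coordinates and scaled so that
    it meets the plane Z = 0.
    """
    # Ray direction through the pixel, rotated into world coordinates.
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    ray_world = R.T @ ray_cam
    # Camera center in world coordinates.
    cam_center = -R.T @ t
    # Scale factor that brings the ray down to Z = 0.
    s = -cam_center[2] / ray_world[2]
    return cam_center + s * ray_world

# Toy calibration: unit intrinsics, camera 10 m above the origin
# looking straight down (all values are illustrative).
K = np.eye(3)
R = np.diag([1.0, -1.0, -1.0])
t = np.array([0.0, 0.0, 10.0])
ground_point = pixel_to_ground(1, 0, K, R, t)
```

With this toy pose, the principal point maps to the ground directly below the camera, and a unit pixel offset maps 10 m away, matching the similar-triangles geometry of the imaging model.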
The test results are shown in Figures 14 and 15 for the indoor and outdoor environments, respectively. As can be seen from Figure 14, when a target moves continuously along an indoor corridor, the positioning system consisting of six distributed cameras is able to position the target in real time and display its trajectory in a three-dimensional space model. Likewise, as can be seen from Figure 15, when a target moves continuously outdoors, the positioning system consisting of seven distributed cameras is able to position the target and display its trajectory in a three-dimensional space model in real time as well. The experimental results confirm that the systematic framework and the inclusive algorithms presented in this paper are both effective and efficient.
In this paper, we assume that the ground is flat, which rarely holds in practical applications over a large region. To address this problem, it is necessary to use a digital elevation model (DEM) to describe the topographic relief of large regions.
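The paper does not implement this DEM extension, but one common way to realize it is to march along the camera ray until it falls below the terrain surface, then refine the crossing by interpolation. The sketch below assumes a callable `dem(x, y)` returning terrain height; `step` and `max_range` are illustrative parameters.

```python
import numpy as np

def intersect_ray_with_dem(origin, direction, dem, step=0.5, max_range=200.0):
    """March along a camera ray until it drops below the DEM surface.

    `dem(x, y)` returns the terrain height at (x, y). The crossing
    point is refined by linear interpolation between the last sample
    above the surface and the first sample below it.
    """
    direction = direction / np.linalg.norm(direction)
    prev_s = 0.0
    prev_diff = origin[2] - dem(origin[0], origin[1])  # height above terrain
    s = step
    while s <= max_range:
        p = origin + s * direction
        diff = p[2] - dem(p[0], p[1])
        if diff <= 0.0:  # the ray has passed below the terrain
            frac = prev_diff / (prev_diff - diff)
            return origin + (prev_s + frac * (s - prev_s)) * direction
        prev_s, prev_diff = s, diff
        s += step
    return None  # no intersection within range

# Example with a flat terrain at height 0 (reduces to the plane case).
origin = np.array([0.0, 0.0, 10.0])
direction = np.array([1.0, 0.0, -1.0])
hit = intersect_ray_with_dem(origin, direction, lambda x, y: 0.0)
```

For a flat DEM this reduces to the ground-plane case; with a real elevation grid, `dem` would interpolate the stored heights, and the step size trades accuracy against cost.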
6 Conclusion and Future Work
This paper presented the comprehensive design and implementation of a moving target positioning system based on a distributed camera network. The system is composed of low-cost static cameras, which provide complementary positioning information for moving target positioning in indoor and outdoor environments when GNSS signals are unavailable. In this system, static cameras cover a large region; moving targets are detected and then tracked using the corresponding algorithms; target positions are estimated by exploiting the geometrical relationships among the cameras after calibrating them; and finally, for each target, the position estimates obtained from different cameras are unified into the world coordinate system. The experimental results of the target detection, tracking, and positioning system were reported based on real video data.
Target positioning and tracking with multiple static cameras were verified in both indoor and outdoor environments. However, the reliability and accuracy of target tracking and positioning suffer from several environmental factors. Hence, it is necessary to fuse information from various sensors, such as radar, infrared cameras, inertial measurement units (IMUs), and wireless location systems. Regarding future work, it is meaningful to develop and test these algorithms in practical applications.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This project is supported by the Key Program of the National Natural Science Foundation of China (Grant no. 61039003), the National Natural Science Foundation of China (Grant no. 41274038), the Aeronautical Science Foundation of China (Grant no. 2013ZC51027), the Aerospace Innovation Foundation of China (CASC201102), and the Fundamental Research Funds for the Central Universities.
References
[1] R. Mautz and S. Tilch, "Survey of optical indoor positioning systems," in Proceedings of the International Conference on Indoor Positioning and Indoor Navigation (IPIN '11), Guimaraes, Portugal, September 2011.
[2] G. Retscher, E. Moser, D. Vredeveld et al., "Performance and accuracy test of a WiFi indoor positioning system," Journal of Applied Geodesy, vol. 1, no. 2, pp. 103-110, 2007.
[3] S. J. Ingram, D. Harmer, and M. Quinlan, "Ultra wide band indoor positioning systems and their use in emergencies," in Proceedings of the Position Location and Navigation Symposium (PLANS '04), pp. 706-715, Monterey, Calif, USA, April 2004.
[4] G. Anastasi, R. Bandelloni, M. Conti et al., "Experimenting an indoor bluetooth-based positioning service," in Proceedings of the 23rd International Conference on Distributed Computing Systems Workshops, pp. 480-483, Providence, RI, USA, 2003.
[5] H. Liu, H. Darabi, P. Banerjee, and J. Liu, "Survey of wireless indoor positioning techniques and systems," IEEE Transactions on Systems, Man, and Cybernetics C: Applications and Reviews, vol. 37, no. 6, pp. 1067-1080, 2007.
[6] H.-S. Kim and J.-S. Choi, "Advanced indoor localization using ultrasonic sensor and digital compass," in Proceedings of the International Conference on Control, Automation and Systems (ICCAS '08), pp. 223-226, Seoul, Korea, October 2008.
[7] R. Karlsson, T. B. Schon, D. Tornqvist, G. Conte, and F. Gustafsson, "Utilizing model structure for efficient simultaneous localization and mapping for a UAV application," in Proceedings of the IEEE Aerospace Conference (AC '08), Rome, Italy, March 2008.
[8] D. Nister, O. Naroditsky, and J. Bergen, "Visual odometry," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '04), pp. 652-659, Washington, DC, USA, July 2004.
[9] Y. I. Abdel-Aziz and H. M. Karara, "Direct linear transformation into object space coordinates in close-range photogrammetry," in Proceedings of the Symposium on Close-Range Photogrammetry, pp. 1-18, Urbana, Ill, USA, 1971.
[10] R. Y. Tsai, "A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses," IEEE Journal of Robotics and Automation, vol. RA-3, no. 4, pp. 323-344, 1987.
[11] J. Long, X. Zhang, and L. Zhao, "A fast calibration algorithm based on vanishing point for scene camera," Applied Mechanics and Materials, vol. 58-60, pp. 1148-1153, 2011.
[12] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330-1334, 2000.
[13] P. D. Z. Varcheie and G.-A. Bilodeau, "People tracking using a network-based PTZ camera," Machine Vision and Applications, vol. 22, no. 4, pp. 671-690, 2011.
[14] C.-H. Chen, Y. Yao, D. Page, B. Abidi, A. Koschan, and M. Abidi, "Heterogeneous fusion of omnidirectional and PTZ cameras for multiple object tracking," IEEE Transactions on Circuits and Systems for Video Technology, vol. 18, no. 8, pp. 1052-1063, 2008.
[15] X. Clady, F. Collange, F. Jurie, and P. Martinet, "Object tracking with a pan-tilt-zoom camera: application to car driving assistance," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '01), pp. 1653-1658, Seoul, Korea, May 2001.
[16] N. Bellotto, E. Sommerlade, B. Benfold et al., "A distributed camera system for multi-resolution surveillance," in Proceedings of the 3rd ACM/IEEE International Conference on Distributed Smart Cameras (ICDSC '09), Como, Italy, September 2009.
[17] C. Ding, B. Song, A. Morye et al., "Collaborative sensing in a distributed PTZ camera network," IEEE Transactions on Image Processing, vol. 21, no. 7, pp. 3282-3295, 2012.
[18] E. B. Ermis, P. Clarot, P.-M. Jodoin, and V. Saligrama, "Activity based matching in distributed camera networks," IEEE Transactions on Image Processing, vol. 19, no. 10, pp. 2595-2613, 2010.
[19] D.-M. Tsai and S.-C. Lai, "Independent component analysis-based background subtraction for indoor surveillance," IEEE Transactions on Image Processing, vol. 18, no. 1, pp. 158-167, 2009.
[20] J. Migdal and W. E. L. Grimson, "Background subtraction using Markov thresholds," in Proceedings of the IEEE Workshop on Motion and Video Computing (MOTION '05), pp. 58-65, Breckenridge, Colo, USA, January 2005.
[21] C. Stauffer and W. E. L. Grimson, "Learning patterns of activity using real-time tracking," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 8, pp. 747-757, 2000.
[22] D.-S. Lee, "Effective Gaussian mixture learning for video background subtraction," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 5, pp. 827-832, 2005.
[23] L. Zhao and X. He, "Adaptive Gaussian mixture learning for moving object detection," in Proceedings of the 3rd IEEE International Conference on Broadband Network and Multimedia Technology (IC-BNMT '10), pp. 1176-1180, Beijing, China, October 2010.
[24] S. Sivaraman and M. M. Trivedi, "Looking at vehicles on the road: a survey of vision-based vehicle detection, tracking, and behavior analysis," IEEE Transactions on Intelligent Transportation Systems, vol. 14, no. 4, pp. 1773-1795, 2013.
[25] G. H. Wen, Z. S. Duan, G. R. Chen, and W. W. Yu, "Consensus tracking of multi-agent systems with Lipschitz-type node dynamics and switching topologies," IEEE Transactions on Circuits and Systems, vol. 60, no. 9, pp. 1-13, 2013.
[26] S. Oh, S. Russell, and S. Sastry, "Markov chain Monte Carlo data association for multi-target tracking," IEEE Transactions on Automatic Control, vol. 54, no. 3, pp. 481-497, 2009.
[27] H. Zhang and L. Zhao, "Integral channel features for particle filter based object tracking," in Proceedings of the 5th International Conference on Intelligent Human-Machine Systems and Cybernetics, pp. 190-193, Hangzhou, China, 2013.
[28] M. Taj and A. Cavallaro, "Distributed and decentralized multi-camera tracking," IEEE Signal Processing Magazine, vol. 28, no. 3, pp. 46-58, 2011.
[29] N. Krahnstoever, T. Yu, S. Lim et al., "Collaborative real-time control of active cameras in large scale surveillance systems," in Proceedings of the Workshop on Multi-camera and Multi-modal Sensor Fusion Algorithms and Applications, pp. 1-12, Marseille, France, 2008.
[30] C. Braillon, C. Pradalier, J. L. Crowley, and C. Laugier, "Real-time moving obstacle detection using optical flow models," in Proceedings of the IEEE Intelligent Vehicles Symposium (IV '06), pp. 466-471, Tokyo, Japan, June 2006.
[31] C. Kim and J. N. Hwang, "Fast and automatic video object segmentation and tracking for content-based applications," IEEE Transactions on Circuits and Systems for Video Technology, vol. 12, no. 2, pp. 122-129, 2002.
[32] R. Jain and H. H. Nagel, "On the analysis of accumulative difference pictures from image sequences of real world scenes," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 1, no. 2, pp. 206-214, 1978.
[33] O. Barnich and M. Van Droogenbroeck, "ViBE: a powerful random technique to estimate the background in video sequences," in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '09), pp. 945-948, Taipei, Taiwan, April 2009.
[34] A. Elgammal, D. Harwood, and L. Davis, "Non-parametric model for background subtraction," in Proceedings of the 6th European Conference on Computer Vision, Part II, pp. 751-767, London, UK, 2000.
[35] H. W. Mao, C. H. Yang, G. P. Abousleman, and J. Si, "Automated multiple target detection and tracking in UAV videos," in Airborne Intelligence, Surveillance, Reconnaissance (ISR) Systems and Applications VII, vol. 7668 of Proceedings of SPIE, Orlando, Fla, USA, 2010.
[36] H. Medeiros, J. Park, and A. C. Kak, "Distributed object tracking using a cluster-based Kalman filter in wireless camera networks," IEEE Journal on Selected Topics in Signal Processing, vol. 2, no. 4, pp. 448-463, 2008.
[37] A. Andriyenko and K. Schindler, "Multi-target tracking by continuous energy minimization," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR '11), pp. 1265-1272, Colorado Springs, Colo, USA, June 2011.
[38] A. Andriyenko and K. Schindler, "Globally optimal multi-target tracking on a hexagonal lattice," in Proceedings of the 11th European Conference on Computer Vision, pp. 466-479, Crete, Greece, 2010.
[39] J. Y. Bouguet, "Pyramidal implementation of the affine Lucas Kanade feature tracker: description of the algorithm," Intel Corporation, pp. 1-9, 2001.
[40] Y. Wang, G. Hu, and Z. Chen, "Calibration of CCD camera in image matching experimental equipment," in Proceedings of the 2nd International Symposium on Instrumentation Science and Technology, p. 3146, Jinan, China, August 2002.
[41] B. Caprile and V. Torre, "Using vanishing points for camera calibration," International Journal of Computer Vision, vol. 4, no. 2, pp. 127-139, 1990.
[37] A Andriyenko and K Schindler ldquoMulti-target tracking bycontinuous energy minimizationrdquo in Proceedings of the IEEEConference on Computer Vision and Pattern Recognition (CVPRrsquo11) pp 1265ndash1272 Colorado Springs Colo USA June 2011
[38] A Andriyenko and K Schindler ldquoGlobally optimal multi targettracking on a hexagonal latticerdquo in Proceedings of the 11thEuropean Conference on Computer Vision pp 466ndash479 CreteGreece 2010
[39] J Y Bouguet ldquoPyramidal implementation of the affine lucaskanade feature tracker description of the algorithmrdquo IntelCorporation pp 1ndash9 2001
[40] Y Wang G Hu and Z Chen ldquoCalibration of CCD camera inimage matching experimental equipmentrdquo in Proceedings of the2nd International Symposium on Instrumentation Science andTechnology p 3146 Jinan China August 2002
[41] B Caprile and V Torre ldquoUsing vanishing points for cameracalibrationrdquo International Journal of Computer Vision vol 4 no2 pp 127ndash139 1990
Mathematical Problems in Engineering 11
of the 3rd ACM/IEEE International Conference on Distributed Smart Cameras (ICDSC '09), Como, Italy, September 2009.
[17] C. Ding, B. Song, A. Morye et al., "Collaborative sensing in a distributed PTZ camera network," IEEE Transactions on Image Processing, vol. 21, no. 7, pp. 3282–3295, 2012.
[18] E. B. Ermis, P. Clarot, P.-M. Jodoin, and V. Saligrama, "Activity-based matching in distributed camera networks," IEEE Transactions on Image Processing, vol. 19, no. 10, pp. 2595–2613, 2010.
[19] D.-M. Tsai and S.-C. Lai, "Independent component analysis-based background subtraction for indoor surveillance," IEEE Transactions on Image Processing, vol. 18, no. 1, pp. 158–167, 2009.
[20] J. Migdal and W. E. L. Grimson, "Background subtraction using Markov thresholds," in Proceedings of the IEEE Workshop on Motion and Video Computing (MOTION '05), pp. 58–65, Breckenridge, Colo, USA, January 2005.
[21] C. Stauffer and W. E. L. Grimson, "Learning patterns of activity using real-time tracking," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 8, pp. 747–757, 2000.
[22] D.-S. Lee, "Effective Gaussian mixture learning for video background subtraction," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 5, pp. 827–832, 2005.
[23] L. Zhao and X. He, "Adaptive Gaussian mixture learning for moving object detection," in Proceedings of the 3rd IEEE International Conference on Broadband Network and Multimedia Technology (IC-BNMT '10), pp. 1176–1180, Beijing, China, October 2010.
[24] S. Sivaraman and M. M. Trivedi, "Looking at vehicles on the road: a survey of vision-based vehicle detection, tracking, and behavior analysis," IEEE Transactions on Intelligent Transportation Systems, vol. 14, no. 4, pp. 1773–1795, 2013.
[25] G. H. Wen, Z. S. Duan, G. R. Chen, and W. W. Yu, "Consensus tracking of multi-agent systems with Lipschitz-type node dynamics and switching topologies," IEEE Transactions on Circuits and Systems, vol. 60, no. 9, pp. 1–13, 2013.
[26] S. Oh, S. Russell, and S. Sastry, "Markov chain Monte Carlo data association for multi-target tracking," IEEE Transactions on Automatic Control, vol. 54, no. 3, pp. 481–497, 2009.
[27] H. Zhang and L. Zhao, "Integral channel features for particle filter based object tracking," in Proceedings of the 5th International Conference on Intelligent Human-Machine Systems and Cybernetics, pp. 190–193, Hangzhou, China, 2013.
[28] M. Taj and A. Cavallaro, "Distributed and decentralized multi-camera tracking," IEEE Signal Processing Magazine, vol. 28, no. 3, pp. 46–58, 2011.
[29] N. Krahnstoever, T. Yu, S. Lim et al., "Collaborative real-time control of active cameras in large scale surveillance systems," in Proceedings of the Workshop on Multi-camera and Multi-modal Sensor Fusion Algorithms and Applications, pp. 1–12, Marseille, France, 2008.
[30] C. Braillon, C. Pradalier, J. L. Crowley, and C. Laugier, "Real-time moving obstacle detection using optical flow models," in Proceedings of the IEEE Intelligent Vehicles Symposium (IV '06), pp. 466–471, Tokyo, Japan, June 2006.
[31] C. Kim and J. N. Hwang, "Fast and automatic video object segmentation and tracking for content-based applications," IEEE Transactions on Circuits and Systems for Video Technology, vol. 12, no. 2, pp. 122–129, 2002.
[32] R. Jain and H. H. Nagel, "On the analysis of accumulative difference pictures from image sequences of real world scenes," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 1, no. 2, pp. 206–214, 1979.
[33] O. Barnich and M. Van Droogenbroeck, "ViBE: a powerful random technique to estimate the background in video sequences," in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '09), pp. 945–948, Taipei, Taiwan, April 2009.
[34] A. Elgammal, D. Harwood, and L. Davis, "Non-parametric model for background subtraction," in Proceedings of the 6th European Conference on Computer Vision—Part II, pp. 751–767, London, UK, 2000.
[35] H. W. Mao, C. H. Yang, G. P. Abousleman, and J. Si, "Automated multiple target detection and tracking in UAV videos," in Airborne Intelligence, Surveillance, Reconnaissance (ISR) Systems and Applications VII, vol. 7668 of Proceedings of SPIE, Orlando, Fla, USA, 2010.
[36] H. Medeiros, J. Park, and A. C. Kak, "Distributed object tracking using a cluster-based Kalman filter in wireless camera networks," IEEE Journal on Selected Topics in Signal Processing, vol. 2, no. 4, pp. 448–463, 2008.
[37] A. Andriyenko and K. Schindler, "Multi-target tracking by continuous energy minimization," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR '11), pp. 1265–1272, Colorado Springs, Colo, USA, June 2011.
[38] A. Andriyenko and K. Schindler, "Globally optimal multi-target tracking on a hexagonal lattice," in Proceedings of the 11th European Conference on Computer Vision, pp. 466–479, Crete, Greece, 2010.
[39] J. Y. Bouguet, "Pyramidal implementation of the affine Lucas Kanade feature tracker: description of the algorithm," Intel Corporation, pp. 1–9, 2001.
[40] Y. Wang, G. Hu, and Z. Chen, "Calibration of CCD camera in image matching experimental equipment," in Proceedings of the 2nd International Symposium on Instrumentation Science and Technology, p. 3146, Jinan, China, August 2002.
[41] B. Caprile and V. Torre, "Using vanishing points for camera calibration," International Journal of Computer Vision, vol. 4, no. 2, pp. 127–139, 1990.