PROJECT REPORT: Video Tracking


HARDWARE IMPLEMENTATION OF AUTOMATIC OBJECT TRACKING

CHAPTER 1

INTRODUCTION AND PROBLEM STATEMENT

MMCOE E&TC Page 1


CHAPTER 1

INTRODUCTION

1.1 PROBLEM STATEMENT:

“Hardware implementation of automatic object tracking system using video tracking”.

This addresses several problems and challenges, such as operation and communication of the camera with the microprocessor unit, efficiency and speed constraints of the pan-and-tilt unit, BeagleBoard implementation, tracking of the object path, and the variables affecting efficient object tracking. The sophistication of the hardware used is limited to college-project level, which should approximately satisfy the cost constraints. The problem statement does not include a specific application area; one may be added in future stages of development.

1.2 OBJECTIVE:

The objective is “to move the camera unit with the help of the pan-and-tilt unit so as to follow the object to be tracked along the path travelled by that object”.

The objective includes:

- Taking the live video feed from the camera unit.
- Communicating the information between the microprocessor and peripheral hardware.
- Reading and processing the input video to identify the object.
- Implementing an algorithm for tracking the object in the video, producing as output information related to the identified object, mainly the angle and direction parameters.
- Moving the pan-and-tilt unit depending upon the output of the tracking algorithm.
- Tracking the object by repeating the above procedure for as long as the scope of the hardware permits.

1.3 GENERAL INTRODUCTION (REQUIREMENTS AND CONCEPT):

When it comes to truth, one’s natural reaction is that “we believe what we see”. It is now accepted that vision will play an exceptional role in both supervised and unsupervised applications.

Most systems currently installed consist of arrays of fixed-position still cameras. By implementing video tracking we can solve many questions related to the proper identification and movement of an object. Traditionally, surveillance


systems are implemented using complex networks that include different types of sensors, such as infrared sensors, pressure sensors, electronic contact sensors, and many more. Nowadays interest is shifting to smarter systems with visual support, which use video tracking employing one or more cameras. A camera provides the option of replacing different types of sensors in different applications, changing the direction from which we look at the problem. We can decide the level of sophistication (single or multiple cameras, color or grayscale, tracking single or multiple objects) depending on the application. Let us first discuss the basic concept behind video tracking. Before we come to the concept, let us take a quick look at what a video is:

A video is a sequence of frames captured over time. For tracking we consider individual frames; thus our image data becomes a function of space (x, y) and time (t).

Knowing that, we can now move to the main problem. As mentioned earlier, video tracking is the technique used for detecting (identifying) and tracking an object in the video frames, and in turn in the video. It is a complex problem in which we try to follow the identified object or area from frame to frame. A tracking system must address two basic problems:

Motion problem: Identify the search region in the next video frame where the object is most likely to be found (the neighborhood of the object).

Matching problem: Identify the same object in the next frame within the designated search region.

The search area in the next frame is generally a fixed- or variable-size region surrounding the position where the object was identified in the previous frame. Objects may be tracked based on various features, such as shape, color, etc. The hardware also includes a smart device such as a microprocessor unit with local scope and/or complete computer automation.
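To make the two problems concrete, here is a minimal illustrative sketch (ours, not the project's implementation) of search-region tracking on toy grayscale frames stored as nested Python lists. The search window stands for the motion problem, and sum-of-squared-differences (SSD) template matching stands for the matching problem; the function names and window size are our own assumptions.

```python
def ssd(patch_a, patch_b):
    """Sum of squared differences between two equally sized patches."""
    return sum((a - b) ** 2
               for ra, rb in zip(patch_a, patch_b)
               for a, b in zip(ra, rb))

def extract(frame, top, left, h, w):
    """Cut an h x w patch whose top-left corner is (top, left)."""
    return [row[left:left + w] for row in frame[top:top + h]]

def track_in_window(prev_frame, next_frame, obj_top, obj_left, h, w, radius):
    """Motion problem: limit the search to a window of `radius` pixels around
    the previous location. Matching problem: pick the candidate patch in the
    next frame with the lowest SSD against the object template."""
    template = extract(prev_frame, obj_top, obj_left, h, w)
    best = None
    for dt in range(-radius, radius + 1):
        for dl in range(-radius, radius + 1):
            t, l = obj_top + dt, obj_left + dl
            if t < 0 or l < 0 or t + h > len(next_frame) or l + w > len(next_frame[0]):
                continue  # candidate patch falls outside the frame
            cost = ssd(template, extract(next_frame, t, l, h, w))
            if best is None or cost < best[0]:
                best = (cost, t, l)
    return best[1], best[2]  # new top-left corner of the object
```

For example, a bright 2x2 blob that moves from (2, 2) to (3, 3) between two 6x6 frames is relocated exactly, because only the true position gives an SSD of zero.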


CHAPTER 2

LITERATURE SURVEY

Over the last decade, video motion tracking has attracted strenuous research effort in computer vision. Some of its applications are: on-site surveillance, robot navigation, human-computer interaction, public security and security systems, missile guidance, and many more.

To date, many researchers have tackled this problem in various ways using many different techniques. While we can safely say that video tracking is not a completely new concept, we must also admit that we are still noticeably far from the optimal solution.

Target representation and localization is mostly a bottom-up process. These methods provide a variety of tools for identifying the moving object. Successfully locating and tracking the target object depends on the algorithm; for example, blob tracking is useful for identifying human movement because a person’s profile changes dynamically. The computational complexity of these algorithms is typically low. In this section we review the algorithms already implemented to achieve video tracking, which are as follows:

Blob tracking: Segmentation of the object interior (for example, blob detection, block-based correlation, or optical flow). In computer vision, blob detection refers to visual modules aimed at detecting points and/or regions in the image that are either brighter or darker than their surroundings. There are two main classes of blob detectors: (1) differential methods based on derivative expressions, and (2) methods based on local extrema in the intensity landscape. With the more recent terminology used in the field, these operators can also be referred to as interest point operators, or alternatively interest region operators.
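The second class of detectors mentioned above can be sketched very simply (our toy illustration, not a production detector): scan a grayscale image and report pixels that are strictly brighter than all eight neighbours, i.e. local extrema of the intensity landscape.

```python
def local_maxima(image):
    """Return (row, col) positions of interior pixels that are strictly
    brighter than all 8 of their neighbours (bright-blob candidates)."""
    blobs = []
    rows, cols = len(image), len(image[0])
    for r in range(1, rows - 1):          # borders are skipped
        for c in range(1, cols - 1):
            centre = image[r][c]
            neighbours = [image[r + dr][c + dc]
                          for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                          if (dr, dc) != (0, 0)]
            if all(centre > n for n in neighbours):
                blobs.append((r, c))
    return blobs
```

A real detector would smooth the image first (e.g. Laplacian-of-Gaussian over several scales) so that noise pixels are not reported as blobs; this sketch only shows the local-extremum idea.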

In early work in the area, blob detection was used to obtain regions of interest for further processing. These regions could signal the presence of objects or parts of objects in the image domain with application to object recognition and/or object tracking. In other domains, such as histogram analysis, blob descriptors can also be used for peak detection with application to segmentation. Another common use of blob descriptors is as main primitives for texture analysis and texture recognition. In more recent work,


blob descriptors have found increasingly popular use as interest points for wide-baseline stereo matching and for signaling the presence of informative image features for appearance-based object recognition based on local image statistics.

Contour tracking: Detection of the object boundary (e.g. active contours or the Condensation algorithm). Here we consider the problem of tracking the boundary contour of a moving and deforming object through a sequence of images. If the motion of the object or region of interest is constrained (e.g. rigid or approximately rigid), the contour motion can be efficiently represented by a small number of parameters.

Deforming contours occur either due to changing regions of partial occlusion or when the object of interest is actually deforming its shape over a time or space sequence of images. Examples of the second kind are a beating heart, moving animals or humans, or the cross-sections of different parts of a 3D object, such as the brain, in consecutive MRI slices. Most biological images contain deforming objects or regions. Contour tracking has many applications in medical image analysis, e.g. sequential segmentation of volume images, tracking heart regions, or image-guided surgery. The observation likelihood is often multimodal due to background objects (clutter) partially occluded by the object of interest, due to an object that partially occludes the object of interest, or due to low-contrast imagery.

Kernel-based tracking (mean-shift tracking): An iterative localization procedure based on the maximization of a similarity measure (the Bhattacharyya coefficient). Mean shift is a procedure for locating the maxima of a density function given discrete data sampled from that function; it is useful for detecting the modes of this density. It derives the target candidate that is most similar to a given target model.

The dissimilarity between the target model and the target candidates is expressed by a metric based on the Bhattacharyya coefficient.
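The two quantities involved can be sketched in a few lines (our illustration, not the report's code): the Bhattacharyya coefficient between two normalised histograms, the distance metric built from it, and a one-dimensional flat-kernel mean shift that climbs from a starting point to a mode of sampled data. The bandwidth and iteration count are assumed values.

```python
import math

def bhattacharyya_coefficient(p, q):
    """Similarity between two normalised histograms p and q (1.0 = identical)."""
    return sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))

def bhattacharyya_distance_metric(p, q):
    """Dissimilarity metric used by mean-shift trackers: 0 when identical,
    1 when the histograms share no mass."""
    return math.sqrt(max(0.0, 1.0 - bhattacharyya_coefficient(p, q)))

def mean_shift_1d(samples, start, bandwidth=1.0, iters=50):
    """Flat-kernel mean shift: repeatedly move to the mean of the samples
    within `bandwidth`, converging on a mode of the sample density."""
    x = start
    for _ in range(iters):
        window = [s for s in samples if abs(s - x) <= bandwidth]
        if not window:
            break
        x = sum(window) / len(window)
    return x
```

In a real tracker, p is the colour histogram of the target model and q that of a candidate window; the mean-shift iterations move the window toward the position that maximises the coefficient.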

Filtering methods allow the tracking of complex objects and more complex object interactions, such as tracking objects moving behind obstructions. The complexity increases further if the video tracker (also called a TV tracker or target tracker) is mounted not on a rigid foundation (on-shore) but on a moving ship (off-shore), where typically an inertial measurement system is used to pre-stabilize the video tracker and reduce the required dynamics and bandwidth of the camera system. The computational complexity of these algorithms is usually much higher. The following are some common filtering algorithms:


Kalman filter: An optimal recursive Bayesian filter for linear systems subject to Gaussian noise. Its purpose is to use measurements observed over time, containing noise (random variations) and other inaccuracies, and produce values that tend to be closer to the true values of the measurements and their associated calculated values.

The Kalman filter produces estimates of the true values of measurements and their associated calculated values by predicting a value, estimating the uncertainty of the predicted value, and computing a weighted average of the predicted value and the measured value. The most weight is given to the value with the least uncertainty. The estimates produced by the method tend to be closer to the true values than the original measurements because the weighted average has a better estimated uncertainty than either of the values that went into the weighted average.

The Kalman filter can help computers track objects in video with low latency (not to be confused with a low number of latent variables). Object tracking is a dynamic problem, using data from sensors and camera images that always suffer from noise. The noise can sometimes be reduced by using higher-quality cameras and sensors but can never be eliminated, so a noise reduction method is often desirable. The iterative predictor-corrector nature of the Kalman filter is helpful here, because at each time instant only one constraint on the state variable need be considered; this process repeats, considering a different constraint at every time instant. All the measured data accumulate over time and help in predicting the state. The video can also be pre-processed, perhaps using a segmentation technique, to reduce the computation and hence the latency.
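The predictor-corrector cycle described above can be sketched for a one-dimensional state (a toy illustration under a random-walk motion model; the process noise q and measurement noise r values are our own assumptions, not tuned for the project hardware):

```python
def kalman_1d(measurements, q=1e-3, r=0.5, x0=0.0, p0=1.0):
    """Minimal 1-D Kalman filter. Each step predicts, then corrects with a
    weighted average in which the Kalman gain k gives the most weight to
    whichever of prediction and measurement has the least uncertainty."""
    x, p = x0, p0                 # state estimate and its variance
    estimates = []
    for z in measurements:
        p = p + q                 # predict: uncertainty grows by process noise
        k = p / (p + r)           # gain: large when prediction is uncertain
        x = x + k * (z - x)       # correct: weighted average with measurement
        p = (1 - k) * p           # corrected estimate is more certain
        estimates.append(x)
    return estimates
```

Fed a stream of identical noisy-free measurements, the estimate converges toward the measured value, with the gain shrinking as confidence in the state grows.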

Particle filter: Useful for sampling the underlying state-space distribution of nonlinear and non-Gaussian processes. The particle filter aims to estimate the sequence of hidden parameters xk for k = 0, 1, 2, 3, …, based only on the observed data yk for k = 0, 1, 2, 3, …. All Bayesian estimates of xk follow from the posterior distribution p(xk | y0, y1, …, yk).

There are many particle filter implementations, such as the Gaussian particle filter, unscented particle filter, Monte Carlo particle filter, cost-reference particle filter, Rao-Blackwellized particle filter, auxiliary particle filter, etc.
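The simplest variant, the bootstrap (sequential importance resampling) filter, can be sketched for a 1-D random-walk state observed with Gaussian noise. This is our own toy illustration, not any of the named variants above in full; the particle count, noise levels, and prior spread are assumed values.

```python
import math
import random

def bootstrap_particle_filter(observations, n=500, obs_noise=1.0, seed=0):
    """Estimate the posterior mean of a 1-D hidden state x_k after each
    observation y_k, using propagate / weight / resample steps."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 5.0) for _ in range(n)]   # diffuse prior
    estimates = []
    for y in observations:
        # Propagate each particle through the random-walk motion model.
        particles = [p + rng.gauss(0.0, 0.5) for p in particles]
        # Weight by the Gaussian observation likelihood p(y | x).
        weights = [math.exp(-((y - p) ** 2) / (2.0 * obs_noise ** 2))
                   for p in particles]
        total = sum(weights)
        if total == 0.0:                  # degenerate: all weights underflowed
            weights, total = [1.0] * n, float(n)
        weights = [w / total for w in weights]
        # Resample with replacement according to the weights.
        particles = rng.choices(particles, weights=weights, k=n)
        estimates.append(sum(particles) / n)
    return estimates
```

Because no linearity or Gaussianity of the posterior is assumed, the same propagate/weight/resample loop works unchanged for nonlinear motion and observation models, which is the filter's main appeal for tracking.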


CHAPTER 3

BLOCK DIAGRAM AND DESCRIPTION

3.1 BLOCK DIAGRAM:

Fig. 1: Block diagram of the video tracking system. Blocks: camera unit, interfacing board, display unit, pan & tilt unit, power supply.


3.2 BLOCK DIAGRAM DESCRIPTION:

1. CAMERA UNIT: The camera unit is mounted on the pan/tilt unit and connected to the processor for taking the video feed. We will be using a Logitech V-UBQ42 web camera.

2. PAN AND TILT UNIT: This unit provides the mounting platform for the camera unit and controls the orientation of the camera with the help of servo motors. It consists of a pan/tilt bracket and two servo motors.

3. INTERFACING BOARD: This is the brain of the system and is responsible for video processing and for generating the control signals for peripheral control. We are using an Arduino board as our interfacing board, which carries an ATmega328 microcontroller: a high-performance, low-power 8-bit AVR.

4. POWER SUPPLY: This unit provides the power requirements of all the hardware, including peripherals.

5. DISPLAY UNIT: The display unit, which includes a monitor for the purpose of display, may be a remote computer connected to the main board present locally.


CHAPTER 4

HARDWARE DESIGN

4.1 ARDUINO BOARD

DESCRIPTION:

The Arduino Uno is a microcontroller board based on the ATmega328. It has 14 digital input/output pins (of which 6 can be used as PWM outputs), 6 analog inputs, a 16 MHz crystal oscillator, a USB connection, a power jack, an ICSP header, and a reset button. It contains everything needed to support the microcontroller; simply connect it to a computer with a USB cable, or power it with an AC-to-DC adapter or battery, to get started.

The Uno differs from all preceding boards in that it does not use the FTDI USB-to-serial driver chip. Instead, it features the Atmega8U2 programmed as a USB-to-serial converter. Revision 2 of the Uno board has a resistor pulling the 8U2 HWB line to ground, making it easier to put into DFU mode.

"Uno" means one in Italian and is named to mark the upcoming release of Arduino 1.0. The Uno and version 1.0 will be the reference versions of Arduino, moving forward. The Uno is the latest in a series of USB Arduino boards.

Fig 2: Arduino board


FEATURES OF ARDUINO:
- Microcontroller: ATmega328
- Operating Voltage: 5 V
- Input Voltage (recommended): 7-12 V
- Input Voltage (limits): 6-20 V
- Digital I/O Pins: 14 (of which 6 provide PWM output)
- Analog Input Pins: 6
- DC Current per I/O Pin: 40 mA
- DC Current for 3.3 V Pin: 50 mA
- Flash Memory: 32 KB (ATmega328), of which 0.5 KB is used by the bootloader
- SRAM: 2 KB (ATmega328)
- EEPROM: 1 KB (ATmega328)
- Clock Speed: 16 MHz

4.2 CAMERA :

The camera unit consists of a basic web camera with good resolution and a DirectX or VFW (Video for Windows) driver.

The camera is mounted on the pan/tilt bracket so it can be oriented in all directions. The weight of the camera should be minimal for easy movement on the pan & tilt unit. For our project we have selected the Logitech V-UBQ42 webcam, as it meets all our requirements.

Fig 3: Logitech V-UBQ42 camera.


4.3 PAN AND TILT UNIT:

1) PAN/TILT BRACKET:

This pan/tilt bracket consists of two brackets and all the hardware needed to attach them together into a pan/tilt mechanism driven by two servo motors.

The pan/tilt bracket is used for the movement of the camera mounted on it.

2) SERVO MOTORS:

Two servo motors are used for the pan and tilt movement of the bracket on which the camera is mounted.

The servos give the desired orientation and speed for the camera movement, and should be capable of handling the weight of the camera and bracket so as to move freely in all directions. We will be using Futaba S3003 servo motors, as they meet all our requirements.

Features (Futaba S3003):
- Control System: +Pulse Width Control, 1520 usec Neutral
- Current Drain (4.8 V): 7.2 mA idle
- Current Drain (6.0 V): 8 mA idle
- Operating Voltage: 4.8-6.0 V
- Direction: Counter Clockwise / Pulse Traveling 1520-
- Operating Temperature Range: -20 to +60 °C
- Motor Type: 3-Pole Ferrite
- Operating Speed (4.8 V): 0.23 sec/60 degrees at no load
- Operating Speed (6.0 V): 0.19 sec/60 degrees at no load
- Bearing Type: Plastic Bearing
- Stall Torque (4.8 V): 44 oz-in (3.2 kg-cm)
- Stall Torque (6.0 V): 56.8 oz-in (4.1 kg-cm)
- Connector Wire Length: 12"
- Operating Angle: 45 degrees one side, pulse traveling 400 usec
- Dimensions: 1.6" x 0.8" x 1.4" (41 x 20 x 36 mm)
- 360° Modifiable: Yes
- Weight:
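From the figures quoted above (1520 usec neutral pulse, roughly 45 degrees of travel per 400 usec on one side), the pulse width for a desired deflection can be sketched as a linear mapping. This is our own illustrative assumption of linearity between the quoted endpoints, not code from the servo documentation.

```python
# Assumed from the datasheet figures quoted above.
NEUTRAL_US = 1520      # pulse width (microseconds) at the neutral position
US_PER_45_DEG = 400    # additional microseconds per 45 degrees of deflection

def angle_to_pulse_us(angle_deg, max_deg=45):
    """Clamp the requested deflection from neutral to the servo's range,
    then map it linearly to a pulse width in microseconds."""
    angle_deg = max(-max_deg, min(max_deg, angle_deg))
    return round(NEUTRAL_US + angle_deg * US_PER_45_DEG / 45)
```

For example, 0 degrees gives 1520 us, +45 degrees gives 1920 us, and -45 degrees gives 1120 us; requests beyond the range are clamped rather than passed through.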


3) ASSEMBLY OF PAN AND TILT UNIT:

The assembly of the pan and tilt unit comprises constructing the pan/tilt bracket, attaching the hardware together, and joining the two servo motors to the bracket.

The camera is mounted on the pan/tilt bracket so that its free movement in all directions is possible.

Fig 4: Assembly of pan & tilt unit


CHAPTER 5

SOFTWARE DESIGN

5.1 SOFTWARE REQUIRED:

We use the following software:

1) Microsoft Visual Studio C++ 2008 Express Edition: C++ compiler and development environment.

2) OpenCV: OpenCV (Open Source Computer Vision Library) is a library of programming functions mainly aimed at real-time computer vision. It has a BSD license (free for commercial or research use). OpenCV was originally developed by Intel and written in C, but it now has a full C++ interface and all new development is in C++.

3) Processing: Processing is an open source programming language and environment for people who want to create images, animations, and interactions. Initially developed to serve as a software sketchbook and to teach fundamentals of computer programming within a visual context, Processing also has evolved into a tool for generating finished professional work.


4) Arduino: The open-source Arduino environment makes it easy to write code and upload it to the i/o board. It runs on Windows, Mac OS X, and Linux. The environment is written in Java and based on Processing, avr-gcc, and other open source software.

5) MATLAB: provides video and image processing libraries (used as per requirement).


5.2 FLOW CHART AND ALGORITHM:

5.2.1 ALGORITHM:

1. Start.
2. Read the input video in the form of a live stream.
3. Identify and mark the object to be tracked.
4. Extract the frames at time instances (t) and (t+∂t) to be analyzed by the tracker.
5. Perform preprocessing and background noise reduction.
6. Perform final marking and extraction of moving regions.
7. Identify the moving point vectors.
8. Generate the tracking information and variables.
9. Detect pan and tilt angles and directions.
10. Send the panning and tilting angles and directions to the motor control unit.
11. If the object is still in the frame scope and time t < T (the time constraint), then go to step 2.
12. Stop.
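The steps above can be sketched as a control-loop skeleton (our own hypothetical code, with the video source, detector (steps 3-8), and motor interface stubbed out; the function names, frame size, and deadband value are assumptions, not the project's implementation):

```python
def pan_tilt_command(cx, cy, frame_w, frame_h, deadband=10):
    """Step 9: turn the object's pixel offset from the frame centre into
    pan/tilt directions ('left'/'right'/'up'/'down', or 'hold' when the
    object is within the deadband around the centre)."""
    dx = cx - frame_w // 2
    dy = cy - frame_h // 2
    pan = 'hold' if abs(dx) <= deadband else ('right' if dx > 0 else 'left')
    tilt = 'hold' if abs(dy) <= deadband else ('down' if dy > 0 else 'up')
    return pan, tilt

def tracking_loop(detections, frame_w=320, frame_h=240, max_frames=100):
    """Steps 2-11: `detections` stands in for the per-frame object centres
    that the real detector would produce (None = object lost). Returns the
    motor commands that step 10 would send."""
    commands = []
    for t, centre in enumerate(detections):
        if centre is None or t >= max_frames:   # step 11: lost, or t >= T
            break
        commands.append(pan_tilt_command(*centre, frame_w, frame_h))
    return commands
```

In the real system, the commands would drive the two servos (pan and tilt), re-centring the object so the next frame's search window stays valid.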


5.2.2 FLOWCHART :

Fig 5: Flowchart


CHAPTER 6

FUTURE SCOPE


The future scope of this project expands into both the hardware and the software parts.

- The cameras used may be upgraded to more sophisticated hardware such as PTZ (Pan-Tilt-Zoom) or PTZ dome surveillance cameras for a better video feed and zooming, providing a better tracking environment.
- Instead of a single camera, an interconnected and centrally controlled array of cameras may be used, depending on the area of application and the area under observation. With this type of implementation, tracking over very large areas may be achieved.
- Instead of stationary hardware, mobile hardware may be used so as to actually “follow” the object, combining robotic systems with the current hardware.
- By implementing more advanced and modified algorithms, flexibility in object tracking may be achieved.
- Tracking multiple objects and switching between tracked objects are some of the near-term advancements.
- If the application requires it, a database of tracked objects and their paths, in the form of coordinates, may be maintained on a centralized computer for security reasons and future reference.
- Manual identification of the object may be provided for easy switching between tracked objects.


CHAPTER 7

APPLICATIONS AND SUMMARY

7.1 APPLICATIONS:

1) On-site surveillance.
2) Robot navigation.
3) Human-computer interaction.
4) Public security and security systems.
5) Missile guidance.
6) Smart consumer appliances and devices.

7.2 SUMMARY:

We have successfully studied how to implement servo motor movement through a program for the panning and tilting actions.

We have also identified low-cost hardware for the implementation of the required procedure, which can be enhanced further if necessary. We have successfully run the face-detection program as part of object detection.

The camera and motors have been finalized, which will further enable the successful completion of our project.


REFERENCES


IEEE papers:


[1] Ken Chen, Dong Li, and Chul Gyu Jhun, “Video Motion Predictive Tracking Quality: Kalman Filters vs. Low-Pass Filters.”

[2] J. Shajeena and K. Ramar, “A Novel Way of Tracking Moving Objects in Video Scenes.”

[3] Emanuele Trucco and Konstantinos Plakas, “Video Tracking: A Concise Survey.”

[4] P. De Smet and I. Bruyland, “Motion Estimation and Segmentation Framework for Digital Video Processing.”

[5] Hongsheng Zhang, “A Hardware Platform for an Automatic Video Tracking System Using Multiple PTZ Cameras.”

[6] Abhishek Rawat and Tulip Kumar Toppo, “Feature-Based Tracking Using PTZ Cameras.”

Web references:

[1] http://www.microchip.com

[2] http://www.ti.com

[3] http://www.rhydolabz.com

[4] http://www.servocity.com

[5] http://www.sparkfun.com

[6] http://www.idasystems.net

[7] http://www.societyofrobots.com

Book references:

[1] Video Tracking: Theory and Practice, E. Maggio and A. Cavallaro (Wiley).

[2] Learning OpenCV, Gary Bradski and Adrian Kaehler (O’Reilly).

[3] OpenCV Application Programming Cookbook, Robert Laganière.


DATASHEETS
