BLUE EYES TECHNOLOGY
A Seminar Report
Submitted in partial fulfillment of the
requirement for the award of the degree
of
Bachelor of Technology
In
ELECTRONICS & COMMUNICATION ENGINEERING
BY
MUNISH BANSAL (06429)
DEPARTMENT OF ELECTRONICS & COMMUNICATION ENGINEERING
NATIONAL INSTITUTE OF TECHNOLOGY
HAMIRPUR-177005, HP (INDIA)
November, 2009
Abstract - Human error is still one of the
most frequent causes of catastrophes and
ecological disasters. The main reason is
that the monitoring systems concern only
the state of the processes whereas human
contribution to the overall performance
of the system is left unsupervised. Since
the control instruments are automated to
a large extent, a human – operator
becomes a passive observer of the
supervised system, which results in
weariness and a drop in vigilance. Thus, he
may not notice important changes in the
indications, causing financial or ecological
damage or a threat to human life. It is
therefore crucial to ensure that the
operator's conscious attention stays engaged
in active system supervision over the
whole working period.
It is possible to measure indirectly
the level of the operator’s conscious brain
involvement using eye motility analysis.
Although there are capable sensors
available on the market, a complex
solution enabling transformation, analysis
and reasoning based on measured signals
still does not exist. In large control rooms,
wiring the operator to the central system
seriously limits his mobility and hinders
his work, so the use of wireless
technology becomes essential.
I. INTRODUCTION
The Blue eyes system provides
technical means for monitoring and
recording the operator’s basic
physiological parameters. The most
important parameter is saccadic activity,
which enables the system to monitor
the status of the operator’s visual
attention along with head acceleration,
which accompanies large displacement
of the visual axis (saccades larger
than 15 degrees). A complex industrial
environment can expose the operator to
toxic substances, which can affect his
cardiac, circulatory and pulmonary
systems. Thus, on the
grounds of plethysmographic signal
taken from the forehead skin surface,
the system computes heart beat rate and
blood oxygenation.
The Blue Eyes system checks these
parameters against abnormal (e.g.
a low level of blood oxygenation or a
high pulse rate) or undesirable (e.g. a
longer period of lowered visual
attention) values and triggers user-
defined alarms when necessary.
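The threshold check described above can be sketched as follows; the limit values here are illustrative assumptions, not the system's actual alarm thresholds:

```python
# Hypothetical sketch of the parameter check described above. The
# thresholds (120 bpm, 92 % SpO2) are invented for illustration only.

def check_vitals(heart_rate_bpm, spo2_percent,
                 max_hr=120, min_spo2=92.0):
    """Return a list of alarm strings for abnormal readings."""
    alarms = []
    if heart_rate_bpm > max_hr:
        alarms.append("high pulse rate: %d bpm" % heart_rate_bpm)
    if spo2_percent < min_spo2:
        alarms.append("low blood oxygenation: %.1f%%" % spo2_percent)
    return alarms

print(check_vitals(135, 96.0))  # only the pulse rate is abnormal here
```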
Quite often in an emergency
situation operators speak to themselves
expressing their surprise or stating
verbally the problem. Therefore, the
operator’s voice, physiological
parameters and an overall view of the
operating room are recorded. This helps
to reconstruct the course of operators’
work and provides data for long-term
analysis.
BlueEyes consists of a mobile
measuring device and a central analytical
system. The mobile device is integrated
with a Bluetooth module providing a
wireless interface between sensors worn by the
operator and the central unit. ID cards
assigned to each of the operators and
adequate user profiles on the central unit
side provide necessary data
personalization so different people can
use a single mobile device (called
hereafter DAU – Data Acquisition
Unit). The overall system diagram is
shown in Figure 1. The tasks of the
mobile Data Acquisition Unit are to
maintain the Bluetooth connection, to
collect information from the sensors and
send it over the wireless link, to deliver
the alarm messages sent from the Central
System Unit to the operator, and to handle
personalized ID cards. The Central System
Unit maintains the other side of the
Bluetooth connection, buffers incoming
sensor data, performs on-line data
analysis, records the conclusions for
further exploration, and provides the
visualization interface.
Figure 1. Overall system diagram
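The DAU duty cycle listed above (fetch sensor data, forward it over the link, deliver alarms back to the operator) can be sketched as follows; the sensor, link and display classes are hypothetical stand-ins for the real firmware interfaces, not the actual Blue Eyes API:

```python
# Illustrative sketch of the Data Acquisition Unit duty cycle. All the
# collaborating objects here are invented stand-ins for the hardware.

class DataAcquisitionUnit:
    def __init__(self, sensor, link, display):
        self.sensor, self.link, self.display = sensor, link, display

    def step(self):
        # 1. fetch one physiological frame from the sensor
        frame = self.sensor.read()
        # 2. forward it over the Bluetooth link to the Central System Unit
        self.link.send(frame)
        # 3. deliver any alarm message coming back from the CSU
        alarm = self.link.receive_alarm()
        if alarm is not None:
            self.display.show(alarm)

# Tiny fakes standing in for the hardware, to show the data flow:
class FakeSensor:
    def read(self):
        return {"pulse": 72, "spo2": 98}

class FakeLink:
    def __init__(self):
        self.sent, self.inbox = [], ["low blood oxygenation"]
    def send(self, frame):
        self.sent.append(frame)
    def receive_alarm(self):
        return self.inbox.pop(0) if self.inbox else None

class FakeDisplay:
    def __init__(self):
        self.lines = []
    def show(self, message):
        self.lines.append(message)

link, display = FakeLink(), FakeDisplay()
DataAcquisitionUnit(FakeSensor(), link, display).step()
print(link.sent, display.lines)
```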
II. PERFORMANCE REQUIREMENTS
The portable nature of the
mobile unit results in a number of
performance requirements. As the device is
intended to run on batteries, low power
consumption is the most important
constraint. Moreover, it is necessary to
assure proper timing while receiving and
transmitting sensor signals. To make the
operation comfortable the device should be
lightweight and electrically safe. Finally, the
use of standard, inexpensive ICs will
keep the price of the device relatively
low.
The priority of the central unit
is to provide real-time buffering of
incoming sensor signals and semi-real-time
processing of the data, which requires
speed-optimized filtering and reasoning
algorithms. Moreover, the design should
assure the possibility of distributing the
processing among two or more central unit
nodes (e.g. to offload the database system
related tasks to a dedicated server).
III. SYSTEM OVERVIEW
Blue eyes system monitors the status
of the operator’s visual attention through
measurement of saccadic activity. The
system checks parameters such as heart
rate and blood oxygenation against
abnormal values and triggers user-defined alarms.
BlueEyes system consists of a
mobile measuring device and a central
analytical system. The mobile device is
integrated with Bluetooth module providing
wireless interface between sensors worn by
the operator and the central unit. ID cards
assigned to each of the operators and
adequate user profiles on the central unit
side provide the necessary data
personalization. The system consists of:
> Mobile measuring device (DAU)
> Central System Unit (CSU)
Figure 2. System overview
IV. THE HARDWARE
4.1. DATA ACQUISITION UNIT:-
Data Acquisition Unit is a mobile
part of the Blue eyes system. Its main task
is to fetch the physiological data from the
sensor and to send it to the central system to
be processed. To accomplish the task the
device must manage wireless Bluetooth
connections (connection establishment,
authentication and termination). Personal
ID cards and PIN codes provide
the operator's authorization. Communication
with the operator is carried out using a
simple 5-key keyboard, a small LCD
display and a beeper. When an exceptional
situation is detected the device uses them
to notify the operator. Voice data is
transferred using a small headset,
interfaced to the DAU with standard
minijack plugs. The Data Acquisition Unit
comprises several hardware modules:
> Atmel 89C52 microcontroller - system core
> Bluetooth module (based on ROK101008)
> HD44780 - small LCD display
> 24C16 - I2C EEPROM (on a removable ID card)
> MC145483 - 13-bit PCM codec
> Jazz Multisensor interface
> Beeper and LED indicators
> 6 AA batteries and voltage level monitor
Figure 3. DAU components
4.2. CENTRAL SYSTEM UNIT:-
Central System Unit hardware is
the second peer of the wireless
connection. The box contains a Bluetooth
module (based on ROK101008) and a PCM
codec for voice data transmission. The
module is interfaced to a PC using
parallel, serial and USB cables. The audio
data is accessible through standard mini-jack
sockets. To program the operators' personal
ID cards, a simple programming device
was developed. The programmer is
interfaced to a PC using serial and PS/2
(power source) ports. Inside, there is an
Atmel 89C2051 microcontroller, which
handles UART transmission and I2C
EEPROM (ID card) programming.
Figure 4. CSU components
V. THE SOFTWARE
The main task of the Blue Eyes software is
to watch over the working operators'
physiological condition. To ensure an instant
reaction to changes in an operator's
condition, the software performs real-time
buffering of the incoming data, real-time
physiological data analysis and alarm
triggering. The Blue Eyes software
comprises several functional modules.
The System Core facilitates the data flow
between the other system modules (e.g. it
transfers raw data from the
Connection Manager to the data analyzers,
and processed data from the data analyzers to
GUI controls, other data analyzers, the
data logger, etc.). The System Core's
fundamental structures are single-producer
multi-consumer thread-safe queues. Any number
of consumers can register to receive the
data supplied by a producer. Every single
consumer can register at any number of
producers, receiving therefore different
types of data. Naturally, every consumer
may be a producer for other consumers.
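A minimal sketch of such a single-producer multi-consumer scheme, assuming Python's thread-safe queue.Queue as the underlying buffer (the class and variable names are illustrative, not the actual module names):

```python
import queue
import threading

# Sketch of the single-producer multi-consumer idea described above:
# each consumer registers its own thread-safe queue at a producer, and
# every published item is delivered to all registered consumers.

class Producer:
    def __init__(self):
        self._consumers = []
        self._lock = threading.Lock()

    def register(self, consumer_queue):
        # any number of consumers may register at this producer
        with self._lock:
            self._consumers.append(consumer_queue)

    def publish(self, item):
        # every registered consumer receives the item through its own queue
        with self._lock:
            for q in self._consumers:
                q.put(item)

raw_data = Producer()
analyzer_in, logger_in = queue.Queue(), queue.Queue()
raw_data.register(analyzer_in)
raw_data.register(logger_in)
raw_data.publish({"pulse": 72})
print(analyzer_in.get(), logger_in.get())
```

Because a consumer may itself expose a Producer for its processed output, the same pattern chains data analyzers, GUI controls and the data logger together.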
The Connection Manager is
responsible for managing the wireless
communication between the mobile Data
Acquisition Units and the central system.
The Data Analysis module performs the
analysis of the raw sensor data in order
to obtain information about the operator's
physiological condition.
Visualization module provides a
user interface for the supervisors. It
enables them to watch each working
operator's physiological condition along
with a preview of selected video source
and related sound stream. All the
incoming alarm messages are instantly
signaled to the supervisor. The Visualization
module can be set in an offline mode,
where all the data is fetched from the
database. By reviewing all the recorded
physiological parameters, alarms, video
and audio data, the supervisor is able
to reconstruct the course of the selected
operator's duty.
Figure 5. Software analysis diagram
VI. EMOTION COMPUTING
Rosalind Picard (1997) describes
why emotions are important to the
computing community. There are two
aspects of affective computing: giving the
computer the ability to detect emotions and
giving the computer the ability to express
emotions. Not only are emotions crucial for
rational decision making as Picard describes,
but emotion detection is an important step to
an adaptive computer system. An adaptive,
smart computer system has been driving
our efforts to detect a person’s emotional
state. An important element of
incorporating emotion into computing is
improving the productivity of the computer
user. A study
(Dryer & Horowitz, 1997) has shown that
people with personalities that are similar
or complement each other collaborate
well. Dryer (1999) has also shown that
people view their computer as having a
personality. For these reasons, it is
important to develop computers that can
work well with their users.
VII. TYPES OF EMOTIONAL
SENSORS:
For hand: Emotion Mouse, Sentic Mouse
For eyes: Expression Glasses, MAGIC Pointing, Eye Tracking
For voice: Artificial Intelligence, Speech Recognition
VIII. EMOTION MOUSE
One proposed, non-invasive
method for gaining user information through
touch is via a computer input device, the
mouse. This then allows the user to relate
the cardiac rhythm, the body temperature,
electrical conductivity of the skin and other
physiological attributes with the mood. This
has led to the creation of the “Emotion
Mouse”. The device can measure heart
rate, temperature, galvanic skin response
and minute bodily movements and matches
them with six emotional states: happiness,
surprise, anger, fear, sadness and disgust.
The mouse includes a set of sensors,
including infrared detectors and
temperature-sensitive chips. These
components, the researchers stress, will
also be built into other commonly used
items such as the office chair, the steering
wheel, the keyboard and the telephone
handset. Integrating the system into the
steering wheel, for instance, could allow an
alert to be sounded when a driver becomes
drowsy.
Figure 6. Emotion mouse
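The mapping from measured signals to the six emotional states can be illustrated with a toy nearest-prototype classifier; the prototype vectors below (heart rate in bpm, skin temperature in deg C, galvanic skin response in uS) are invented for illustration and are not measured data:

```python
import math

# Toy nearest-prototype classifier in the spirit of the Emotion Mouse.
# The prototype vectors are invented placeholders, not real calibration.

PROTOTYPES = {
    "happiness": (75, 34.0, 4.0),
    "surprise":  (90, 33.5, 6.5),
    "anger":     (95, 35.0, 7.0),
    "fear":      (100, 32.5, 8.0),
    "sadness":   (65, 33.0, 3.0),
    "disgust":   (80, 34.5, 5.5),
}

def classify(sample):
    # pick the emotional state whose prototype is closest to the sample
    return min(PROTOTYPES, key=lambda e: math.dist(sample, PROTOTYPES[e]))

print(classify((98, 32.6, 7.8)))
```

A real system would train such prototypes per user, since resting heart rate and skin response vary widely between individuals.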
IX. SENTIC MOUSE
It is a modified computer mouse that
includes a directional pressure sensor to
aid in the recognition of emotional valence
(liking/attraction vs. disliking/avoidance).
Figure 7. Sentic mouse
X. EXPRESSION GLASSES
Expression Glasses are a wearable device
that allows any viewer to visualize the
confusion and interest levels of the wearer.
Another recent development in related
technology is the attempt to learn the needs
of the user just by following the interaction
between the user and the computer, in order
to know what he/she is interested in at any
given moment. For example, by remembering
the type of websites that the user visits
according to the mood and time of day,
the computer could search related sites
and suggest the results to the user.
Figure 8. Expression Glasses
XI. MAGIC POINTING
This work explores a new direction
in utilizing eye gaze for computer input.
Gaze tracking has long been considered
as an alternative or potentially superior
pointing method for computer input.
We believe that many fundamental
limitations exist with traditional gaze
pointing. In particular, it is unnatural to
overload a perceptual channel such as vision
with a motor control task. We
therefore propose an alternative
approach, dubbed MAGIC (Manual And
Gaze Input Cascaded) pointing. With such
an approach, pointing appears to the user to
be a manual task, used for fine
manipulation and selection. However, a
large portion of the cursor movement is
eliminated by warping the cursor to the
eye gaze area, which encompasses the
target. Two specific MAGIC pointing
techniques, one conservative and one
liberal, were designed, analyzed, and
implemented with an eye tracker we
developed. They were then tested in a pilot
study. This early stage exploration
showed that the MAGIC pointing
techniques might offer many advantages,
including reduced physical effort and
fatigue as compared to traditional manual
pointing, greater accuracy and naturalness
than traditional gaze pointing, and
possibly faster speed than manual
pointing. In our view, there are two
fundamental shortcomings to the
existing gaze pointing techniques,
regardless of the maturity of eye tracking
technology.
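The cursor-warping idea behind the liberal MAGIC technique can be sketched as follows; the 120-pixel threshold is an assumption for illustration, not a value from the study:

```python
import math

# Sketch of the "liberal" MAGIC idea: when a new gaze fixation lands far
# from the cursor, warp the cursor to the gaze area and let the hand
# finish the fine selection manually. The threshold is illustrative.

WARP_THRESHOLD_PX = 120

def update_cursor(cursor, fixation, threshold=WARP_THRESHOLD_PX):
    """Return the new cursor position given the latest gaze fixation."""
    if math.dist(cursor, fixation) > threshold:
        return fixation          # coarse travel is done by the gaze warp
    return cursor                # fine manipulation stays with the hand

print(update_cursor((0, 0), (500, 300)))      # distant fixation: warp
print(update_cursor((500, 300), (520, 310)))  # nearby fixation: no warp
```

The conservative variant would additionally wait for a small manual movement before warping, to avoid reacting to every stray glance.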
XII. EYE TRACKING
Since the goal of this work is to
explore MAGIC pointing as a user interface
technique, we started out by purchasing a
commercial eye tracker (ASL Model 5000)
after a market survey. In comparison to the
system reported in early studies this system
is much more compact and reliable.
However, we felt that it was still not robust
enough for a variety of people with
different eye characteristics, such as
pupil brightness and correction glasses. We
hence chose to develop and use our own
eye tracking system. Available
commercial systems, such as those
made by ISCAN Incorporated, LC
Technologies, and Applied Science
Laboratories (ASL), rely on a single
light source that is positioned either
off the camera axis in the case of the
ISCAN ETL-400 systems, or on-axis in the
case of the LCT and the ASL E504 systems.
Figure 10. Geometric facial data extraction
XIII. ARTIFICIAL INTELLIGENCE
Artificial intelligence (AI)
involves two basic ideas. First, it
involves studying the thought processes of
human beings. Second, it deals with
representing those processes via machines
(like computers, robots, etc.). AI is the
behavior of a machine which, if
performed by a human being, would be
called intelligent. It makes machines smarter
and more useful, and is less expensive than
natural intelligence. Natural language
processing (NLP) refers to artificial
intelligence methods of communicating with
a computer in a natural language like
English. The main objective of an NLP
program is to understand the input and initiate
action. The input words are scanned and
matched against internally stored known
words. Identification of a key word causes
some action to be taken. In this way, one can
communicate with the computer in one’s
language. No special commands or
computer language are required. There is
no need to enter programs in a special
language for creating software.
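The keyword-matching behaviour described above can be sketched as follows; the command table is invented for illustration:

```python
# Minimal keyword-spotting sketch of the NLP behaviour described above:
# input words are scanned and matched against internally stored key
# words, and the first match triggers its action. The table is invented.

ACTIONS = {
    "open":  "opening file",
    "close": "closing file",
    "exit":  "shutting down",
}

def interpret(sentence):
    # scan the input words and match them against the stored key words
    for word in sentence.lower().split():
        if word in ACTIONS:
            return ACTIONS[word]   # a recognized key word triggers action
    return "no key word recognized"

print(interpret("Please open the report"))
```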
XIV. SPEECH RECOGNITION
The user speaks to the computer
through a microphone, and the signal is
passed through a bank of audio filters; a
simple system may contain a minimum of
three filters. The greater the number of filters
used, the higher the probability of accurate
recognition. Presently, switched capacitor
digital filters are used because these can be
custom-built in integrated circuit form.
These are smaller and cheaper than active
filters using operational amplifiers. The
filter output is then fed to an ADC to
translate the analogue signal into a digital
word. The ADC samples the filter outputs
many times a second. Each sample
represents a different amplitude of the
signal. Evenly spaced vertical lines
represent the amplitude of the audio filter
output at the instant of sampling. Each
value is then converted to a binary number
proportional to the amplitude of the sample.
A central processing unit (CPU) controls the
input circuits that are fed by the ADCs. A
large RAM (random access memory) stores
all the digital values in a buffer area. This
digital information, representing the spoken
word, is now accessed by the CPU to
process it further. Normal speech has a
frequency range of 200 Hz to 7 kHz.
Recognizing a telephone call is more
difficult, as it has a bandwidth limitation of
300 Hz to 3.3 kHz.
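The quantization step performed by the ADC, converting each sampled amplitude to a binary number proportional to it, can be illustrated as follows; the 8-bit width and unit full-scale range are illustrative assumptions:

```python
# Toy illustration of the ADC step described above: each filter-output
# sample is quantized to an n-bit binary code proportional to its
# amplitude. The bit width and full-scale range are assumptions.

def quantize(sample, full_scale=1.0, bits=8):
    """Map an amplitude in [0, full_scale] to an n-bit binary string."""
    levels = 2 ** bits - 1
    clipped = min(max(sample, 0.0), full_scale)       # clip out-of-range input
    code = round(clipped / full_scale * levels)       # nearest integer level
    return format(code, "0%db" % bits)

print([quantize(a) for a in (0.0, 0.5, 1.0)])
```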
XV. APPLICATIONS OF BLUE
EYES TECHNOLOGY
Engineers at IBM's Almaden
Research Center in San Jose, CA, report that
a number of large retailers have
implemented surveillance systems that
record and interpret customer movements,
using software from Almaden's BlueEyes
research project. BlueEyes is developing
ways for computers to anticipate users'
wants by gathering video data on eye
movement and facial expression. Your gaze
might rest on a Web site heading, for
example, and that would prompt your
computer to find similar links and to call
them up in a new window. But the first
practical use for the research turns out to
be snooping on shoppers.
Another application would be in the
automobile industry. By simply touching
a computer input device such as a mouse, the
computer system is designed to be able to
determine a person's emotional state. For
cars, it could be useful to help with critical
decisions like: "I know you want to get into
the fast lane, but I'm afraid I can't do
that. You're too upset right now," and therefore
assist in driving safely.
Current interfaces between
computers and humans can present
information vividly, but have no sense of
whether that information is ever viewed or
understood. In contrast, new real-time
computer vision techniques for perceiving
people allow us to create "Face-responsive
Displays" and "Perceptive
Environments", which can sense and
respond to users that are viewing them.
Using stereo-vision techniques, we are able
to detect, track, and identify users robustly
and in real time. This information can
make spoken language interface more
robust, by selecting the acoustic information
from a visually-localized source.
Environments can become aware of how
many people are present, what activity is
occurring, and therefore what display or
messaging modalities are most appropriate
to use in the current situation. The results
of our research will allow the interface
between computers and human users to
become more natural and intuitive.
The familiar and useful come from
things we recognize. The appearance of many
of our favorite things communicates their use;
they show the change in their value through
patina. As technologists, we are now poised
to imagine a world where computing objects
communicate with us in-situ; where we are.
We use our looks, feelings, and actions to
give the computer the experience it needs to
work with us. Keyboards and mice will not
continue to dominate computer user
interfaces. Keyboard input will be replaced
in large measure by systems that know what
we want and require less explicit
communication. Sensors are gaining fidelity
and ubiquity to record presence and
actions; sensors will notice when we enter
a space, sit down, lie down, pump iron, etc.
Pervasive infrastructure will record all of it.
Projects from the Context Aware Computing
Group at the MIT Media Lab explore this
direction.
XVI. CONCLUSIONS
The nineties witnessed quantum
leaps in interface design for improved man-
machine interaction. The BLUE EYES
technology ensures a convenient way of
simplifying life by providing more
delicate and user-friendly facilities in
computing devices. Instead of using
cumbersome modules to gather information
about the user, it will be better to use
smaller and less intrusive units. Ordinary
household devices -- such as televisions,
refrigerators, and ovens -- may be able to do
their jobs when we look at them and speak
to them. It is only a technological forecast.
ACKNOWLEDGEMENT
The author is thankful to the
management of National Institute of
Technology, Hamirpur, for their guidance
and support. The author gratefully
acknowledges the support and constant
encouragement of the HOD and the Faculty
of E&CED, NIT Hamirpur. Finally he
would like to thank his parents for their love
and blessings, which have been instrumental
in the compilation of this report.