
EOG technique to guide a wheelchair

R. Barea 1, L. Boquete, M. Mazo, E. López, A. García-Lledó

1 Electronics Department, University of Alcalá, Alcalá de Henares, Madrid, Spain. e-mail: [email protected]

Abstract. This paper presents a method to control and guide a wheelchair for disabled people using electrooculography (EOG), so that control is exercised through the ocular position (the displacement of the eye within its orbit). An eye model based on the electrooculographic signal is proposed and its validity is studied. Different guidance techniques are then presented, and the advantages and disadvantages of each are discussed. The system consists of a standard electric wheelchair with an on-board computer, sensors and a graphical user interface running on the computer [1]. This control technique can be useful in many applications, such as aids to mobility and communication for handicapped persons.

Key words: Electrooculography (EOG), wheelchair, control, guidance, interface.

1 Introduction

In recent years, applications that help people with severe disabilities have multiplied, and great advances have been made in communication systems between humans and machines. These advances have mainly concerned communication from machines to humans, through graphical user interfaces or multimedia applications (sound). Advances in communication from humans to machines have been more modest, relying on keyboards, mice, joysticks or touch screens, all of which are operated by hand. Nevertheless, many communication systems based on voice recognition or visual information are currently being developed [2] and will be launched onto the market in the coming years. People make a large number of eye movements every day that allow them to perform different tasks: reading, writing, learning new things, acquiring information about the environment, handling objects, communicating with other people, etc. This ability to control gaze direction can be used to communicate with machines.

Electrooculography is a method for sensing eye movement based on recording the steady corneal-retinal potential that arises from hyperpolarizations and depolarizations between the cornea and the retina; the recording is commonly known as an electrooculogram (EOG) [3]. This potential can be considered a steady electrical dipole with a negative pole at the fundus and a positive pole at the cornea. The EOG ranges from 0.05 to 3.5 mV in humans and is linearly proportional to eye displacement for angles of less than ±30º. The EOG derivations are obtained by placing two electrodes on the outer side of the eyes to detect horizontal movement and another pair above and below one eye to detect vertical movement; a reference electrode is placed on the forehead. Figure 1 shows the electrode placement. The EOG signal changes by approximately 20 microvolts for each degree of eye movement. In our system, the signal is sampled 10 times per second. Recording the EOG signal presents several problems [2]. First, the signal is seldom deterministic, even for the same person in different experiments. The EOG signal is the result of a number of factors, including eyeball rotation and movement, eyelid movement, different sources of artefact such as the EEG, electrode placement, head movements,

the influence of luminance, etc. For these reasons, it is necessary to eliminate the shifting resting potential (mean value), because this value changes over time. To avoid this problem, an AC differential amplifier is used together with a high-pass filter with a cutoff frequency of 0.05 Hz and a relatively long time constant. In addition, Ag-AgCl floating metal body-surface electrodes are used.
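
As a rough illustration of these figures, the following sketch (not taken from the original system) removes the drifting baseline with a first-order 0.05 Hz high-pass filter and converts the drift-free amplitude to an approximate gaze angle using the ~20 µV per degree sensitivity. The constants and function names are assumptions.

```python
# Minimal EOG pre-processing sketch, assuming the values quoted in the text:
# 10 Hz sampling, ~20 uV per degree, 0.05 Hz high-pass to remove drift.
import math

FS_HZ = 10.0          # sampling rate (paper: 10 samples per second)
CUTOFF_HZ = 0.05      # high-pass cutoff (paper: 0.05 Hz)
UV_PER_DEGREE = 20.0  # sensitivity (linear for angles below +/-30 degrees)

def highpass(samples_uv):
    """First-order IIR high-pass: removes the slowly drifting resting potential."""
    rc = 1.0 / (2.0 * math.pi * CUTOFF_HZ)
    dt = 1.0 / FS_HZ
    alpha = rc / (rc + dt)
    out, prev_x, prev_y = [], samples_uv[0], 0.0
    for x in samples_uv:
        y = alpha * (prev_y + x - prev_x)   # standard RC high-pass recurrence
        out.append(y)
        prev_x, prev_y = x, y
    return out

def to_degrees(samples_uv):
    """Convert drift-free EOG amplitude (uV) to an approximate gaze angle (deg)."""
    return [max(-30.0, min(30.0, v / UV_PER_DEGREE)) for v in highpass(samples_uv)]
```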

Figure 1. Electrode placement: A) reference electrode; D-E) horizontal derivation electrodes; B-C) vertical derivation electrodes.

This paper is organised as follows: Section 2 proposes an eye model for detecting saccadic eye movements; Section 3 studies a semiautomatic sweep technique for guidance; and Section 4 presents the main conclusions and lays down the main lines of future work.

2 Eye model based on EOG (BiDiM-EOG)

Our aim is to design a system capable of obtaining the gaze direction by detecting eye movements. To this end, a model of the ocular motor system based on electrooculography is proposed (figure 2): the bidimensional dipolar EOG model, BiDiM-EOG. This model allows us to separate saccadic and smooth eye movements and to calculate the position of the eye within its orbit with good accuracy (error of less than 2º). The filter eliminates the effects of other biopotentials, such as blinks, on the EOG signal. The security block detects when the eyes are closed and, in that case, disables the output. The EOG signal is then classified as a saccadic or a smooth eye movement by means of two detectors. If a saccadic movement is detected, a position control is used; if a smooth movement is detected, a speed control is used to calculate the eye position. The final position (angle) is computed as the sum of the saccadic and smooth contributions. In addition, the model has to adapt itself to possible variations in the acquisition conditions (electrode placement, electrode-skin contact, etc.); to do so, the model parameters are adjusted according to the detected angle.
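
The following is a minimal, illustrative skeleton of the processing chain just described (security check, saccadic/smooth classification, and summing of the two contributions). The class name, thresholds and the eyes-closed test are assumptions; the paper does not give the actual parameter values.

```python
# Illustrative skeleton of the BiDiM-EOG processing chain; parameters are assumed.
class BiDiMEOG:
    def __init__(self, saccade_threshold_deg_per_sample=2.0):
        self.saccade_threshold = saccade_threshold_deg_per_sample
        self.saccadic_angle = 0.0   # accumulated by the position control
        self.smooth_angle = 0.0     # accumulated by the speed control
        self.prev_angle = 0.0

    def eyes_closed(self, raw_sample_uv):
        # Security block: placeholder test; the paper suggests using blinks and
        # EEG alpha waves to detect closed eyelids.
        return abs(raw_sample_uv) > 3500.0   # outside the 0.05-3.5 mV EOG range

    def update(self, angle_deg, raw_sample_uv):
        """Feed one filtered gaze-angle sample; return the total eye angle or None."""
        if self.eyes_closed(raw_sample_uv):
            return None                      # output disabled by the security block
        delta = angle_deg - self.prev_angle
        self.prev_angle = angle_deg
        if abs(delta) >= self.saccade_threshold:
            self.saccadic_angle += delta     # saccadic movement: position control
        else:
            self.smooth_angle += delta       # smooth movement: speed control
        return self.saccadic_angle + self.smooth_angle
```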

A person can only make saccadic movements voluntarily, unless he or she is trying to follow a moving object. Therefore, to control an interface it is convenient to focus on the detection of saccadic (rapid) movements. This can be done by processing the derivative of the EOG signal. To avoid problems with the variability of the signal (the isoelectric line drifts over time, even when the user keeps the gaze at the same position), a high-pass filter with a very low cutoff frequency (0.05 Hz) is used. The process followed is shown in figure 3. Figure 4 shows the results of an experiment in which the user made a sequence of saccadic movements of ±10º to ±40º. The derivative of the electrooculographic signal makes it possible to determine when a sudden change in eye gaze occurs, and this variation can easily be translated into angles (figure 4.d).
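
A possible sketch of this derivative-based detection, assuming the angle signal produced by the pre-processing above and an illustrative threshold value:

```python
# Sketch of derivative-based saccade detection: differentiate the filtered angle
# signal and treat samples whose derivative exceeds a threshold as saccades.
def detect_saccades(angles_deg, threshold_deg_per_sample=3.0):
    """Return a reconstructed gaze angle that only follows sudden (saccadic) jumps."""
    derivative = [b - a for a, b in zip(angles_deg, angles_deg[1:])]
    gaze, reconstructed = 0.0, [0.0]
    for d in derivative:
        if abs(d) >= threshold_deg_per_sample:   # sudden change -> saccade detected
            gaze += d
        reconstructed.append(gaze)
    return reconstructed
```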

Figure 4. Process results: a) EOG, b) derivative, c) maximum derivative, d) angle (horizontal axis: samples).

The following sections present the results achieved so far in the Electronics Department of the University of Alcalá. This technique can be used to help disabled people, because an accuracy better than ±2º has been obtained. Although this paper discusses the results obtained in the guidance of a wheelchair (a mobility aid), other applications have been developed to facilitate communication for handicapped people (communication aids) [4][5].

Figure 2. Bidimensional dipolar model (BiDiM-EOG).

Figure 3. Process to detect saccadic movements.


Figure 5. Control scheme

Figure 6. User interface

3 Guidance of a wheelchair using electrooculography

The goal of this control system is to guide an autonomous mobile robot using the position of the eye within its orbit, obtained from the EOG signal [6][7]. In our case, the autonomous robot is a wheelchair for disabled people; figure 5 shows a diagram of the control system. The EOG signal is recorded by an acquisition system and the data are sent to a PC, where they are processed to calculate the eye gaze direction. Then, in accordance with the guidance control strategy, the control commands are sent to the wheelchair. Visual feedback is provided by a tactile screen placed in front of the user. Figure 6 shows the user interface; the commands the user can generate are Forward, Backwards, Left, Right and Stop.

There are multiple options for controlling the robot's movements: direct-access guidance, semiautomatic or automatic sweep, or the interpretation of different commands generated by eye movements. We use the semiautomatic sweep option because it allows a simple code to be generated for controlling the wheelchair, using an eye movement (a “tick”) to select among the different commands. Actions are validated by time: once a command is selected, if no further “tick” is generated within a given time interval, the command is validated and the guidance action is executed [8], as sketched below.
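
A minimal sketch of this validation-by-time mechanism, assuming a get_tick() event source and an illustrative two-second validation interval (the paper does not state the interval actually used):

```python
# Semiautomatic sweep with validation by time: a 'tick' advances the highlighted
# command; silence for VALIDATE_S seconds validates and executes it.
import time

COMMANDS = ["Forward", "Backwards", "Left", "Right", "Stop"]  # interface of figure 6
VALIDATE_S = 2.0   # assumed validation interval

def run_sweep(get_tick, execute):
    """get_tick() -> True when an eye 'tick' occurs; execute(cmd) runs the command."""
    index, last_tick = 0, time.monotonic()
    while True:
        if get_tick():
            index = (index + 1) % len(COMMANDS)   # move to the next command
            last_tick = time.monotonic()
        elif time.monotonic() - last_tick >= VALIDATE_S:
            execute(COMMANDS[index])              # no new tick -> command validated
            last_tick = time.monotonic()
        time.sleep(0.1)                           # 10 Hz, matching the EOG sampling
```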

Figure 7 shows the state machine of the control system: it starts in an inactive state and, when the activation action (looking right-left-right) is generated, the system moves to an active state. Then, to select commands, it is only necessary to generate a simple action (looking to the right). Figure 8 shows an example of the selection of the Left command. A sketch of this state machine follows.
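
The following sketch illustrates this state machine, assuming discrete 'left'/'right' eye events produced by the saccade detector; the class layout and event names are not taken from the authors' implementation.

```python
# Illustrative activation state machine: inactive until a right-left-right
# sequence is seen; once active, a single rightward movement selects a command.
class GuidanceStateMachine:
    ACTIVATION = ("right", "left", "right")

    def __init__(self):
        self.active = False
        self.history = []

    def on_eye_event(self, direction):
        """direction is 'left' or 'right'; returns 'select' when a command is chosen."""
        if not self.active:
            self.history = (self.history + [direction])[-3:]
            if tuple(self.history) == self.ACTIVATION:
                self.active = True          # activation gesture recognised
                self.history = []
            return None
        if direction == "right":
            return "select"                 # advance/select in the semiautomatic sweep
        return None
```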

Several safety elements are necessary, such as alarm and stop commands, to avoid dangerous situations. These codes can be generated from blinks and from the alpha waves in the EEG, which indicate when the eyelids are closed. In addition, the automatic wheelchair system must be able to navigate indoor and outdoor environments and should switch automatically between navigation modes for these environments. The system can therefore be used with different navigation modes depending on the degree of disability of the user, always using the most efficient technique for each person. Different support systems are needed to avoid collisions (bumpers, ultrasonic and infrared sensors, etc.), and the robotic system can automatically take over and control the wheelchair autonomously; for example, if the user loses control and the system becomes unstable, the wheelchair should switch to autonomous control, as sketched below. This work is part of a general-purpose navigational assistant for environments with features accessible to a wheelchair: the SIAMO project [1].
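
As a hedged illustration of this supervision idea, the sketch below arbitrates between the user's EOG command and the autonomous navigation system; all sensor inputs and the fallback policy shown are hypothetical, not the SIAMO implementation.

```python
# Hypothetical safety supervisor: low-level sensors can veto or replace the EOG
# command, and the chair falls back to autonomous control when needed.
def supervise(eog_command, bumper_hit, obstacle_close, eyes_closed_long):
    """Return the command actually sent to the wheelchair controller."""
    if bumper_hit or eyes_closed_long:
        return "Stop"                 # hard safety stop (alarm condition)
    if obstacle_close:
        return "Autonomous"           # hand control to the navigation system
    return eog_command                # otherwise pass the user's EOG command through
```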

Figure 8. Selection of the Left command using semiautomatic sweep.


Figure 7. State machine


Two examples of guidance are shown in figure 9. The red line represents the spline curve to follow; this trajectory is obtained using a trajectory spline generator developed within the SIAMO project. The blue line represents the trajectory obtained when the wheelchair is guided using EOG. The desired trajectory is followed with a small lateral error. We have not yet tested this control with disabled people, but we consider that the control commands are not difficult to learn; using this system is an acquired skill, and some studies show that disabled people usually need about 15 minutes to learn to use this kind of system [7].

Figure 9. Examples of wheelchair guidance

4 Conclusions

In this paper, the main characteristics of electrooculography have been presented: the acquisition and processing of the EOG signal and its applications in assistive systems for the disabled. An eye model based on EOG has been proposed and its ability to determine the position of the eye within its orbit has been studied. It is also possible to encode ocular actions as commands and to apply them in mobility and communication aids for handicapped people. We present a system that gives handicapped users, especially those with only eye-motor coordination, a means of control that allows them to live more independently. Some previous wheelchair robotics research is restricted to a particular location, and in many areas of robotics environmental assumptions can be made that simplify the navigation problem; however, a person using a wheelchair with the EOG technique should not be limited by the assistive device as long as the environment has accessible features.

Many applications can be developed using EOG, because this technique gives users an additional degree of freedom in their environment; improving the comfort of this technique can therefore be of great utility in the future, as it opens up many possibilities. If the eye gaze is known, different user interfaces can be developed to control different tasks: spell-and-speak software programs allow users to write a letter or a message, after which a control system can interpret the message and generate different commands. A similar code could be generated for deaf people, etc.

5 Acknowledgments

The authors would like to express their gratitude to the "Comisión Interministerial de Ciencia y Tecnología" (CICYT) for its support through project TER96-1957-C03-01.

References

[1] SIAMO Project. "Sistema de Ayuda a la Movilidad" (CICYT). Departamento de Electrónica, Universidad de Alcalá, Madrid, Spain.

[2] Robert J.K. Jacob. "Eye Movement-Based Human-Computer Interaction Techniques: Toward Non-Command Interfaces". Human-Computer Interaction Lab. Naval Research Laboratory. Washington, D.C. 1995.

[3] M.C. Nicolau, J. Burcet, R.V. Rial. Manual de técnicas de Electrofisiología clínica. Universidad de las Islas Baleares, Spain, 1995.

[4] S. Palazuelos, S. Aguilera, J.L. Rodrigo, J. Godino, J. Martín. "PREDICE: Editor predictivo para personas con discapacidad física". Jornadas sobre Comunicación Aumentativa y Alternativa, ISAAC, Vitoria, Spain, September 1999.

[5] R. Barea, L. Boquete, M. Mazo, E. López, L. Bergasa. "Aplicación de electrooculografía para ayuda a minusválidos". Revista Española de Electrónica, Spain, October 1999.

[6] R. Barea, L. Boquete, M. Mazo, E. López. "Guidance of a wheelchair using electrooculography". Proceedings of the 3rd IMACS International Multiconference on Circuits, Systems, Communications and Computers (CSCC'99), Greece, July 1999.

[7] EagleEyes Project. James Gips, Philip DiMattia, Francis X. Curran and Peter Olivieri. Computer Science Department, Boston College, Chestnut Hill, Mass., USA.

[8] Holly A. Yanco and James Gips. "Driver performance using single switch scanning with a powered wheelchair: robotic assisted control versus traditional control". MIT Artificial Intelligence Laboratory, Cambridge, MA 02139, and Computer Science Department, Boston College, Chestnut Hill, MA 02167. RESNA '98, Pittsburgh, Pennsylvania.