Modulation of Spatial Attention by Emotional Expression

Maziyar Molavi Faculty of Health Science and Biomedical Engineering,

University Technology Malaysia Johor, Malaysia

E-mail: [email protected]

Jasmy bin Yunus Faculty of Health Science and Biomedical Engineering,

University Technology Malaysia Johor, Malaysia

E-mail: [email protected]

Abstract—When gaze shifts toward a peripheral event or object, the attention mechanism follows it. This study investigates the dynamics of emotion and spatial attention through the neural processing of successive expression stimuli. The Posner gaze-cueing paradigm was used to study the spatial attention mechanism. We examined the effect of facial expression cues and meaningful word targets on event-related potentials (ERPs) extracted from recorded electroencephalography (EEG) as indices of gaze-directed orienting. Analysis of the ERP components (P1 and N1) shows that spatial attention and motivational significance are markedly enhanced by happy emotion. The results have implications for understanding the mental chronometry of joint spatial attention and emotion.

Keywords- Spatial attention; emotion; event-related potential

I. INTRODUCTION

One of the most important components of communication in a complex social environment is the ability to detect and determine the direction of other people's attention. The ability to predict the actions of others and their responses to environmental stimuli depends on inferring the internal mental states of conspecifics [1]. People sense where others are looking, follow their eye gaze to the same position, look away from what they dislike, and focus on what they like [2]. By observing expressive faces, an observer tries to draw inferences about mental state, emotion, and the attention mechanism [3].

The interaction between gaze direction and emotion recognition, as induced by facial expression, is thus a research area investigated at the level of neural architecture [4]. We conducted the experiment using the attentional cueing paradigm, in which sad, happy, or neutral facial expressions were presented looking left or right. Additionally, a semantic cue was used with a neutral face looking left or right. Positive and negative words were used as targets to assess whether the effect of facial expression and gaze direction on attentional shifts depends on the presence of an evaluative goal.

II. BACKGROUND

Most articles focusing on the attention mechanism as influenced by emotional stimuli have been based on measuring reaction time [5, 6]. However, other researchers have used EEG analyses such as the event-related potential (ERP) method. For example, it has been shown that a regular pattern of ERP components is modulated by spatial attention: the first positive deflection (P1) and first negative peak (N1) were the EEG features affected by spatial attention [7, 8, 9].

Although some studies reject a significant relation between emotion and spatial attention [10, 11], others provide evidence of a strong effect of facial expression on gaze cueing compared to neutral faces [12, 13]. Furthermore, recent studies revealed that the modulation of emotion and attention may also be sensitive to individual differences such as self-esteem [14]. In addition, the eyes of fearful faces produced stronger gaze-cueing effects than a neutrally expressive face in a group of participants with higher-than-average levels of anxiety [15].

Overall, research on the attention mechanism with regard to gaze–emotion interactions in attentional orienting remains inconclusive. Moreover, discrepancies in reported emotion and spatial attention effects arise from variations in experimental method, including differences in task design (discrimination, localization, or detection of the target) and timing (fixation–cue–target durations).

III. NEW IDEA

There are few reports on the interaction of emotion and spatial attention based on EEG analysis. This work aims to investigate this further. In the current study, we modified the Posner paradigm by incorporating gaze shifts to evaluate the interaction between emotional expression and spatial attention when the target is defined as an emotional word. We used emotional word targets, which should elicit a more involuntary appraisal of the emotional content of the facial expression stimuli. We extended previous studies in this area, which examined the modulation of attention by emotion using reaction time as an external feature [12, 13], by studying the neural mechanism and analyzing ERP components as an internal feature to find more precise evidence of the modulation. We found that a positive emotional face provoked a stronger gaze-cueing effect than a neutral face.

2012 Third International Conference on Intelligent Systems Modelling and Simulation

978-0-7695-4668-1/12 $26.00 © 2012 IEEE

DOI 10.1109/ISMS.2012.124


IV. METHODOLOGY

A. Participants

We studied 16 postgraduate students from Universiti Teknologi Malaysia. Two subjects were excluded because of uncontrolled artifacts due to eye blinking. The subjects were between 25 and 32 years old, with a mean age of 28 years (SD 3.1).

We measured participants' responses to spatial attention stimuli under the influence of emotional faces to evaluate the effect of combined spatial-emotional stimuli on the attention mechanism. All participating subjects had normal (or corrected-to-normal) vision, no serious head injury, no epilepsy or seizures, and no psychiatric illness.

B. Paradigm

Posner in 1980 introduced a spatial attention paradigm that measures the reaction time for detecting the position of a target after presentation of a cue [16]. Reaction time is measured for two conditions, namely:

(i) Valid condition - when the cue correctly points to the position of the target
(ii) Invalid condition - when the cue points to the opposite side of the target

A cue is used to orient spatial attention, and reaction time is recorded by measuring how long the participant takes to react to a target presented in either the valid or the invalid condition. Results show that participants respond faster in the valid condition than in the invalid condition.
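As an illustration, the validity labeling just described can be sketched in a few lines of Python (the function name and trial encoding are ours, not from the paper):

```python
# Illustrative sketch of cue-validity labeling in a Posner-style task.
# The function name and trial encoding are hypothetical.

def classify_trial(cue_direction, target_side):
    """Label a trial 'valid' when the gaze cue points at the target
    location and 'invalid' when it points to the opposite side."""
    return "valid" if cue_direction == target_side else "invalid"

# A left cue with a left target is valid; a left cue with a
# right target is invalid.
labels = [classify_trial(c, t) for c, t in
          [("left", "left"), ("left", "right"), ("right", "right")]]
# labels == ["valid", "invalid", "valid"]
```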

To evaluate the influence of emotional stimuli on spatial attention, we conducted the experiment using the attentional cueing paradigm, in which sad, happy, or neutral facial expressions were presented looking left or right. Additionally, a semantic cue was used involving a neutral face looking to the left or right side. Positive and negative pictures or words were used as targets to assess whether the effect of facial expression and gaze direction on attentional shifts depends on the presence of an evaluative goal.

C. Stimulus materials and design

On each trial, a white fixation cross was presented on a black screen. Next, a face with left, right, or direct gaze and an emotional expression (sad, happy, or neutral) appeared in the center of the screen. The target then appeared in one of the placeholders.

Figure 1. Screen view of the emotional task presented to test subjects

The entire screen is then cleared, and a further interval elapses before the next trial begins. Figure 1 shows a simple diagram of the experiment.

Participants were instructed to maintain fixation at the center of the screen throughout each trial, and to ignore the direction of gaze, the emotional expression of the face, and the content of the target. It was impressed upon the participants that the only relevant feature of the trial was the location of the target, and that they were to respond to it as quickly and as accurately as possible by pressing one specific key on a standard keyboard with their left index finger and another specific key with their right index finger for left and right targets, respectively. It is important to note that the target remained on the screen until the start of the next trial irrespective of whether the participant responded, ensuring that each participant was exposed to each picture for the same amount of time.

All pictures were stored in a database and loaded randomly by the program. To synchronize the EEG recorder and the PC that displayed the pictures, a program was written in LabVIEW. A synchronization signal with one-millisecond resolution and 0.3 V amplitude was sent to two DC channels of the EEG recorder. Additionally, this program sent an indicator signal to the EEG recorder whenever a participant pressed a button.

The scalp electroencephalogram (EEG) was recorded according to the international 10-20 system using Ag/AgCl electrode caps at 16 electrode sites (F3, Fz, F4, T3, C3, Cz, C4, T4, T5, P3, Pz, P4, T6, O1, Oz, O2) referenced to linked earlobes. The horizontal electrooculogram (EOG) was recorded at the outer canthi of both eyes, and the vertical EOG was recorded from electrodes placed below and above the left eye. The band-pass filter was set at 0.15-30 Hz and the sampling rate was 500 Hz. EEGs and EOGs were analyzed using MATLAB.

Nine photographs were selected from the Pictures of Facial Affect set for display at the cueing stage. The depicted faces showed a happy, sad, or neutral facial expression, and the pictures were adjusted to look left or right.

A total of 70 target words were selected from the Affective Norms for English Words on the basis of valence and arousal: half were positive (valence: M = 7.36, SD = 0.7; range = 5.68–8.72; arousal: M = 5.25, SD = 1.06; range = 2.39–8.1), and the other half were negative (valence: M = 2.55, SD = 0.66; range = 1.25–3.91; arousal: M = 5.68, SD = 0.93; range = 3.83–7.86) [5]. Each word was presented twice, once in a valid gaze-cued trial and once in an invalid gaze-cued trial.

Stimuli were presented on a 17-in monitor (1024 × 768 pixels, 60 Hz) connected to a 1-GHz Pentium computer. The faces subtended 7° of vertical visual angle on screen. Words were printed in 18-point Arial. After informed consent was obtained, participants sat in front of the computer screen at a distance of approximately 60 cm in a dimly lit room. After 10 practice trials, participants completed 60 trials.

In each block, immediately after the face, a target word was presented either to the left or right, at the same height as the eyes, and remained on screen until a response was made or 3000 msec elapsed. The inter-trial interval was 200 msec. Trials appeared in a new random order for every participant, each of whom was instructed to press one of two keys using the index and middle finger of their dominant hand depending on whether the word denoted something positive or negative. The keys "E" and "D" were chosen because they are perpendicular to the positive–negative target axis, and key assignment was counterbalanced. Keys were labeled "P" and "N" (positive vs. negative). Both speed and accuracy were emphasized. Participants were explicitly told that the eye gaze did not predict where the target word would appear. The experiment had a 3 (facial expression: sad, happy, neutral) × 2 (gaze cue: valid, invalid) × 2 (target valence: positive, negative) within-participant design.
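The 3 × 2 × 2 design and the per-participant randomization can be sketched as follows (a hypothetical illustration; the original presentation program was written in LabVIEW):

```python
import itertools
import random

# Hypothetical sketch of the 3 x 2 x 2 within-participant design:
# facial expression x gaze-cue validity x target valence.
EXPRESSIONS = ["sad", "happy", "neutral"]
VALIDITY = ["valid", "invalid"]
VALENCE = ["positive", "negative"]

def make_trial_list(seed=None):
    """Return all condition combinations in a fresh random order,
    since each participant saw trials in a new random order."""
    trials = list(itertools.product(EXPRESSIONS, VALIDITY, VALENCE))
    random.Random(seed).shuffle(trials)
    return trials

trials = make_trial_list(seed=1)
# 12 unique cells: 3 expressions x 2 validities x 2 valences
```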

V. RESULTS AND DISCUSSION

This work focuses on measuring reaction time (RT) and ERPs as the main external and internal features for investigating the modulation of emotion and spatial attention. RT is the time interval between the stimulus being shown and the participant pressing the keyboard button. An ERP is a measured brain response that is the direct result of a perceptual or cognitive event, identified from peaks in the recorded electrical waveform in the time domain.

A. Reaction Time (RT)

For data reduction, trials on which an error was made (1%) and trials with RTs faster than 100 msec or slower than 1500 msec (1.5%) were excluded from the analyses. Mean RTs were computed for each of the equally probable factorial combinations of facial expression (happy, neutral, sad), target position (left, right), target valence (positive, negative), and gaze direction (left, right). The direction of gaze was equally likely to look toward (valid gaze cue) or away from (invalid gaze cue) the target word.
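The RT trimming step described above amounts to a simple range filter; a minimal sketch (the function name is ours):

```python
def trim_rts(rts, lo=100, hi=1500):
    """Keep only RTs inside the 100-1500 msec window used in the
    data-reduction step; everything else is excluded."""
    return [rt for rt in rts if lo <= rt <= hi]

# An anticipatory 85 msec press and a 1620 msec lapse are dropped.
clean = trim_rts([85, 430, 912, 1620, 1010])
# clean == [430, 912, 1010]
```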

Each trial began with the presentation of a central fixation point (1000 msec) followed by a face (250 msec). The face could look left or right (the eyes located at the same height as the fixation point) and show one of three facial expressions (happy, neutral, or sad). The recorded RTs are shown in Table I.

TABLE I. REACTION TIME (msec)

Target      Cue        Neutral face      Happy face        Sad face
                       Mean     S.D.     Mean     S.D.     Mean     S.D.
Positive    Valid      927.7    20       883.3    21       960.1    15
Positive    Invalid    1004.8   13       970.71   17       1045.6   12
Negative    Valid      996.3    19       908.1    22       1063.1   21
Negative    Invalid    1033.7   17       1024.5   18       1065.4   16

Analysis of variance (ANOVA) showed a significant main effect of facial expression (F(1, 14) = 36.19, p < 0.001), with longer RTs when the face showed a negative expression (sad) than when it showed a happy or neutral expression. Overall, RTs were longer for targets presented at invalid gaze-cued locations than at valid gaze-cued locations (F(1, 14) = 69.3, p < 0.007), and longer for negative target words than for positive targets (F(1, 14) = 79.81, p < 0.0021). The results of this experiment confirm our assumption that gaze cueing can be modulated by facial expression as a function of emotional context. Remarkably, the context congruity effect was driven exclusively by positive emotional stimulation: gaze cueing is modulated by emotional context when the targets include positive stimuli. Explicitly, when all the targets in the Posner paradigm are pleasant, happy faces create stronger gaze-cueing effects than neutral or sad faces.
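For illustration, the basic validity contrast on per-participant mean RTs can be tested with a paired comparison. The numbers below are synthetic; the paper's analysis is a full within-participant ANOVA, which this simple sketch does not reproduce:

```python
import numpy as np
from scipy import stats

# Synthetic per-participant mean RTs (msec) for 14 retained
# subjects; the values are invented for illustration only.
rng = np.random.default_rng(0)
valid_rt = rng.normal(900, 30, size=14)
invalid_rt = rng.normal(1010, 30, size=14)

# A paired t-test respects the within-participant structure.
t_stat, p_value = stats.ttest_rel(invalid_rt, valid_rt)
# Invalid cues yield reliably longer RTs in this toy data.
```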

B. Event-Related Potential (ERP)

In the next step, the recorded EEG was analyzed with the ERP method. Before ERP analysis, the recorded waveforms were preprocessed. The artifact rejection criterion was any positive or negative deflection above 75 µV. Each epoch extended from -1000 msec to 1000 msec, and the baseline was corrected using the -500 msec to 0 msec window. Electrode impedances were kept constant and below 5 kΩ for all electrodes and below 10 kΩ for the electrooculogram. Severe contamination of EEG activity by eye movements, blinks, muscle, heart, and line noise is a serious problem for EEG interpretation and analysis. We used independent component analysis (ICA) to remove these artifacts. Although ICA was effective in removing artifacts, subsequent tests showed that in our experiments most (86%) eye movements occurred after the interval from which we extracted the ERP components (0-300 msec). Scalp maps of the recorded EEG data are shown in Figure 2. During their respective intervals, most of the positive P1 and N1 activity occurred around the occipital area, while negative activity appeared in the parietal area.
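The epoching, baseline correction, and ±75 µV rejection steps can be sketched as follows (Python rather than the MATLAB used in the study; the function names are ours):

```python
import numpy as np

FS = 500  # Hz, sampling rate of the recordings

def make_epoch(signal, event_sample, fs=FS):
    """Cut a -1000..+1000 msec epoch around an event sample and
    baseline-correct it with the mean of the -500..0 msec window."""
    ep = signal[event_sample - fs:event_sample + fs].astype(float)
    baseline = ep[fs // 2:fs].mean()  # samples covering -500..0 msec
    return ep - baseline

def is_rejected(epoch, threshold_uv=75.0):
    """Reject the epoch if any deflection exceeds +/-75 microvolts."""
    return bool(np.abs(epoch).max() > threshold_uv)

# Demo on synthetic data (microvolts): a flat epoch passes, while a
# 120 uV spike after stimulus onset triggers rejection.
signal = np.zeros(5000)
clean_epoch = make_epoch(signal, 2500)   # 1000 samples = 2 s at 500 Hz
signal[2600] = 120.0
spiky_epoch = make_epoch(signal, 2500)
```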

The ERP components showed significant validity effects when the recording site was ipsilateral to the visual field of presentation (F(1,14) = 19.42, p < 0.0021 for P1; F(1,14) = 82.1, p < 0.0001 for N1) and contralateral (F(1,14) = 13.21, p < 0.006 for N1). The main effects of positive words, which influence spatial attention by decreasing P1, were significant (F(1, 14) = 46.48, p < 0.0001 for the Cz electrode; F(1, 14) = 39.79, p < 0.0031 for the P4 electrode; F(1, 14) = 53.79, p < 0.0001 for the O1 electrode), as shown in Figure 3. In the occipital area, the increase in N1 was significant (F(1, 14) = 57.95, p < 0.0001 for the O1 electrode). We suggest that the systems giving rise to joint attention are imbued with a great degree of flexibility and allow the regulation of gaze-triggered orienting responses as a function of emotional context. The central effect of facial expression on both the P1 and N1 amplitudes of the recorded EEG and on RT suggests that, through a dynamic change in emotional context, the happy face induced a state of vigilance that increased the processing of all targets. The P1 is assumed to reflect the processing of a visual stimulus in extrastriate visual cortex [17]. Its amplitude has been shown to increase in response to additional resources in a positive emotional context. This result suggests that emotional expression can be a relevant signal accompanying eye gaze for creating attentional sets during social exchanges.
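A simplified version of the component measurement picks the largest positive (P1-like) and negative (N1-like) deflections in the 0-300 msec post-stimulus window of a baseline-corrected epoch. The sketch below is illustrative only; the function and the synthetic waveform are ours:

```python
import numpy as np

def p1_n1(erp, fs=500, window_ms=(0, 300)):
    """Return the largest positive and negative deflections in the
    post-stimulus window; `erp` is assumed to start at 0 msec."""
    a = int(window_ms[0] * fs / 1000)
    b = int(window_ms[1] * fs / 1000)
    segment = np.asarray(erp[a:b], dtype=float)
    return segment.max(), segment.min()

# Synthetic waveform with a positive bump near 100 msec and a
# larger negative bump near 170 msec (values in microvolts).
t = np.arange(0, 0.4, 1 / 500.0)
wave = (2.0 * np.exp(-((t - 0.10) / 0.02) ** 2)
        - 3.0 * np.exp(-((t - 0.17) / 0.02) ** 2))
p1_amp, n1_amp = p1_n1(wave)
# p1_amp is about +2 uV, n1_amp about -3 uV
```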

VI. CONCLUSION

The aim of this work was to investigate whether the presence of a motivational goal during the assessment of neural information processing is necessary for both emotional expression and gaze to affect attention. To this end, we presented happy, sad, and neutral faces looking either left or right. Word targets appeared either at the spatial position gazed at by the face or on the opposite side. We used ERPs to differentiate the effects of two distinct social cues, facial expression and gaze direction, on the orienting of attention. Facial signals play important roles in the attention mechanism and in social referencing. A novel attention-orienting paradigm was developed to investigate dynamic facial cue displays that appeared to change in gaze direction and emotional appearance in advance of target detection. Reaction times were slower under invalid cues. The effects of the central-cue Posner paradigm on ERPs have been extensively described: typically, P1 and N1 show a larger deflection for stimuli presented at relevant positions compared to stimuli presented at irrelevant display locations. Testing the effect of emotion on spatial attention revealed that a happy face cueing detection of a positive-valence word produced faster reaction times and significant ERP effects in comparison with a sad face cueing detection of a negative-valence word. Furthermore, in the ERP features this effect appeared as a significantly greater P1 for positive-meaning words, while the occipital N1 differed for positive words relative to the other conditions. As a next step, we suggest including further facial expressions (fearful, disgusted, etc.) in the emotional set.

ACKNOWLEDGMENT

The authors gratefully acknowledge the experimental assistance from the Faculty of Health Science and Biomedical Engineering, Universiti Teknologi Malaysia.

This work has been supported by an International Doctoral Fellowship (IDF), School of Postgraduate Studies, Universiti Teknologi Malaysia (UTM), Malaysia.

REFERENCES

[1] N.J. Emery, “The eyes have it: The neuroethology, function and evolution of social gaze,” Neuroscience and Biobehavioral Reviews, 2000, vol.24, pp.581–604.

[2] S. Baron-Cohen, R. Campbell, A. Karmiloff-Smith, J. Grant and J. Walker, “Are children with autism blind to the mentalistic significance of the eyes?,” British Journal of Developmental Psychology, 1995, vol.13, pp.379-398.

[3] A. Pecchinenda and M. Heil, “Role of working memory load on selective attention to affectively valent information,” The European Journal of Cognitive Psychology, 2007, vol.19, pp.898-909.

[4] R. Graham,C.K. Friesen,H. Fichtenholtz and K. LaBar, “Modulation of reflexive orienting to gaze direction by facial expressions,” Visual Cognition, 2010, vol.18(3), pp.331–368.

[5] A. Pecchinenda, M. Pes, F. Ferlazzo and P. Zoccolotti, “The combined effect of gaze direction and facial expression on cueing spatial attention,” Emotion , 2008, vol.8 , pp. 628-634.

[6] E. Fox, A. Mathews, A.J. Calder and J. Yiend, “Anxiety and sensitivity to gaze direction in emotionally expressive faces,” Emotion, 2007, vol.7, pp.478–486.

[7] S. Luck, G. F. Woodman and E. K. Vogel, “Event-related potential studies of attention,” Trends in Cognitive Sciences, 2000, vol.4, pp.432-440.

[8] A. Schuller, and B. Rossion, “Perception of static eye gaze direction facilitates subsequent early visual processing,” Clinical Neurophysiology, 2004, vol.115, pp. 1161-1168.

[9] A. Schuller and B. Rossion, “Spatial attention triggered by eye gaze enhances and speeds up visual processing in upper and lower fields beyond early striate visual processing,” Clinical Neurophysiology, 2005, vol.116, pp.2565-2576.

[10] J.K. Hietanen and J.M. Leppänen, “Does facial expression affect attention orienting by gaze direction cues?” Journal of Experimental Psychology: Human Perception and Performance, 2003, vol.29, pp.1228–1243.

[11] R. Graham, C.K. Friesen, H. Fichtenholtz and K. LaBar, “Modulation of reflexive orienting to gaze direction by facial expressions,” Visual Cognition, 2010, vol.18, pp.331–368.

[12] A. Holmes, A. Richards and S. Green, “Anxiety and sensitivity to eye gaze in emotional faces,” Brain and Cognition, 2006, vol.60, pp.282–294.

[13] A. Mathews,E. Fox, J. Yiend and A. Calder, “The face of fear: Effects of eye gaze and emotion on visual attention,” Visual Cognition, 2003, vol.10, pp.823–835.

[14] B.M. Wilkowski, M.D. Robinson and C.K. Friesen, “Gaze-triggered orienting as a tool of the belongingness self-regulation system,” Psychological Science, 2009, vol.20(4), pp.495-501.

[15] A. Mathews, E. Fox, J. Yiend and A. Calder, “The face of fear: Effects of eye gaze and emotion on visual attention,” Visual Cognition, 2003, vol.10, pp.823–835.

[16] M.I. Posner, “Orienting of attention,” Quarterly Journal of Experimental Psychology, 1980, vol.32, pp.3-25.

[17] H.J. Heinze, G.R. Mangun, W. Burchert, H. Hinrichs, M. Sholz, T.F. Munte and H. Hundeshagen, “Combined spatial and temporal imaging of brain activity during selective attention in humans,” Nature, 1994, vol.372(6506), pp.543-546.


Figure 2. Scalp activity during the influence of the positive word (left) and the negative word (right)

Figure 3. Cz (left) and P4 (right) activity; negative word effect (blue) and positive word effect (red)
