Brain and Cognition 91 (2014) 79–86


Neural correlates of emotional intelligence in a visual emotional oddball task: An ERP study

Sivan Raz a,b,*, Orrie Dan a, Leehu Zysberg b,c

a Department of Psychology, The Center for Psychobiological Research, The Max Stern Yezreel Valley College, 19300, Israel
b Department of Psychology, Tel Hai College, 12208, Israel
c Graduate School, Gordon College of Education, Haifa, Israel


Article history: Accepted 5 September 2014

Keywords: Event Related Potentials (ERPs) Emotional intelligence Visual oddball

Abstract

The present study was aimed at identifying potential behavioral and neural correlates of Emotional Intelligence (EI) by using scalp-recorded Event-Related Potentials (ERPs). EI levels were defined according to both a self-report questionnaire and a performance-based ability test. We identified ERP correlates of emotional processing by using a visual-emotional oddball paradigm, in which subjects were confronted with one frequent standard stimulus (a neutral face) and two deviant stimuli (a happy and an angry face). The effects of these faces were then compared across groups with low and high EI levels. The ERP results indicate that participants with high EI exhibited significantly greater mean amplitudes of the P1, P2, N2, and P3 ERP components in response to emotional and neutral faces, at frontal, posterior-parietal and occipital scalp locations. P1, P2 and N2 are considered indexes of attention-related processes and have been associated with early attention to emotional stimuli. The later P3 component has been thought to reflect more elaborative, top-down, emotional information processing including emotional evaluation and memory encoding and formation. These results may suggest greater recruitment of resources to process all emotional and non-emotional faces at early and late processing stages among individuals with higher EI. The present study underscores the usefulness of ERP methodology as a sensitive measure for the study of emotional stimuli processing in the research field of EI.

© 2014 Elsevier Inc. All rights reserved.

1. Introduction

In recent years, the literature has seen increasing interest in the neurobiological basis of emotion perception and regulation in humans. Neuroscientists and cognitive psychologists acknowledge the important role of emotions in cognitive processes such as judgment, decision-making, problem solving and interpersonal perception (Damasio, 1994; Grewal, Bracket, & Salovey, 2006). Emotional Intelligence (EI) is a relatively new concept proposing a framework that integrates aspects of emotional information processing, emotion regulation and effective behavioral responses to emotional stimuli. This framework, originally proposed by Mayer and Salovey (1997), defines EI as a cluster of abilities related to an individual's capabilities of (1) identifying emotions in self and others, (2) integrating emotions into thought processes, (3) effectively processing complex emotions and (4)

regulating one's own emotions and those of others. While some authors described EI as a personality trait (Petrides & Furnham, 2000), others attempted to describe it as part of human abilities, using the equivalence with scholastic intelligence as a guideline (Mayer, Caruso, & Salovey, 1999). Skills linked to emotional intelligence are directly associated with positive social interaction and well-being, while emotion dysregulation is considered a key mechanism underlying various psychopathologies (Davidson, 1998, 2002; Phillips, Ladouceur, & Drevets, 2008). While research on the consequences of EI has been quite prolific, less is known about the physiological and neurological correlates of EI. Empirical evidence is lacking when it comes to linking the physiological and neurological processes we generally associate with emotion and emotional regulation to the context of EI. Neurological correlates of measured EI may help deepen our understanding of the mechanisms behind the concept, thus contributing to the study of EI's nature. Existing evidence from neuropsychological studies of patients with brain damage and from functional neuroimaging studies of healthy individuals suggests that several brain regions may be of particular importance for EI (Tarasuik, Ciorciari, & Stough, 2009).


Bar-On, Tranel, Denburg, and Bechara (2003) reported significantly lower levels of EI among patients with lesions of the ventromedial prefrontal cortex, right amygdala and right insular cortex. Other studies emphasized the role other brain areas play in EI, among them the orbital frontal cortex and the anterior cingulate cortex (Hornak et al., 2003). Researchers have also found that individuals with higher EI exert less brain activity to solve emotional problems, as indicated by brain wave activity (Jausovec & Jausovec, 2005; Jausovec, Jausovec, & Gerli, 2001). Killgore and Yurgelun-Todd (2007) reported that higher levels of EI in adolescents were associated with greater activity in the cerebellum and visual association cortex, as well as with decreased activity in a variety of emotion-related limbic and paralimbic regions during the perception of fearful faces. They suggested that EI in adolescents may involve greater neural efficiency of these key emotional-processing structures. All of the above-mentioned studies used self-report measures to evaluate levels of EI. The current study attempts to add to the literature by exploring the potential neural correlates of EI using scalp-recorded Event-Related Potentials (ERPs) among individuals with higher and lower levels of EI, as measured both by a self-report questionnaire and a performance-based ability test. The last two decades have witnessed a notable rise in the application of ERPs to examine neurophysiological mechanisms associated with emotional information processing. Collectively, these data suggest that emotional stimuli elicit a sustained increase in attention and receive increased processing resources. The data also indicate that the facilitated processing of emotional stimuli can be indexed by modulation of positivities and/or negativities in the stimulus-locked ERP. Early ERP components, peaking around 100–300 ms following stimulus presentation (e.g.
P1, N1, P2, N2), are considered indexes of attention-related processes and have been associated with early attention to emotional stimuli. Late ERP components (>300 ms) are long-lasting slow waves (e.g. P3). These components are considered indexes of more elaborative, top-down, emotional information processing including emotional evaluation and memory encoding and formation (Hajcak & MacNamara, 2010; Luck, 2012; Olofsson, Nordin, Sequeira, & Polich, 2008). Several of the ERP components described above can be used to characterize the role of EI in the processing of emotional stimuli. Yet the literature on EI and ERP activity is extremely limited. To the best of our knowledge, only one study has examined ERP responses to emotional stimuli (an emotion-word Stroop task) in relation to EI (Fisher et al., 2010). The present study examined differences in behavioral and ERP patterns among young adults with high and low EI in response to emotional and neutral facial expressions. In addition, all participants completed a standard non-emotional continuous performance test (the OCPT) so as to examine and control for possible between-group differences in global attention to task demands. Based on the literature on EI we hypothesized that EI levels would interact with behavioral responses and ERP patterns in response to emotional stimuli.

a. At the electrophysiological level, we expected higher EI to be reflected in greater amplitudes of emotion processing-related early and late ERP components (e.g. P1, P2, N2 and P3) in response to both angry and happy target faces. Due to the novelty of this investigation, our analyses remain exploratory in nature.
b. At the behavioral level, based on previous findings (Campanella et al., 2004; Fox et al., 2000), we expected lower error rates and faster reaction times in response to angry faces compared with happy faces among both EI groups. We also expected individuals with high EI to have lower

rates of omission and commission errors and faster reaction times in response to emotional faces compared with individuals with low EI.
c. Finally, we expected both EI groups to have similar results on the OCPT.

2. Material and methods

2.1. Participants

The participants included 41 undergraduate students (30 females, 11 males; mean age 24.51 ± 1.99 years) selected from a pre-screening sample of 195 undergraduate students on the basis of their EI scores on the Audiovisual Test of Emotional Intelligence (AVEI) (Zysberg, Levy, & Zisberg, 2011) and the Schutte self-report Emotional Intelligence Scale (EIS) (Schutte et al., 1998). The participants were assigned to two EI groups. Those with EI scores in the bottom quartile of the distribution (among the lowest 25%) on both the AVEI and the EIS were allocated to the low EI group (n = 20; 15 females), and those with EI scores in the top quartile of the distribution (among the highest 25%) on both the AVEI and the EIS were allocated to the high EI group (n = 21; 15 females). Those with EI scores in the bottom or top quartile of the distribution on only one EI measure (AVEI or EIS) were not included in the study. All the participants were healthy, with normal or corrected-to-normal vision, and none had a prior history of neurological or psychiatric disorders. All gave their informed consent. The experiment was approved by the academic committee of the college's IRB.

2.2. Measures

2.2.1. EIS
The EIS is a 33-item questionnaire developed by Schutte et al. (1998) that includes statements regarding an individual's perceptions of his or her own social and emotional abilities. Responses are rated on a 5-point Likert-style scale ranging from 1 (full agreement with the item) to 5 (full disagreement with the item); lower scores therefore indicate higher self-reported EI.

2.2.2. AVEI
The AVEI is a 27-item computer-based test of EI predicated on the ability-EI approach (Zysberg et al., 2011).
The test contains still images and short video clips depicting various persons in diverse social and emotional situations. Test-takers are asked to identify the emotion experienced by a target person in the picture/clip from a list of ten options. Scores range from 0 to 27; higher scores represent higher levels of emotional recognition and integration, two of the four aspects of ability-EI. The test takes about 12–15 min to complete.

2.2.3. OCPT
The OCPT (Raz, Bar-Haim, Sadeh, & Dan, 2012) is a standard (non-emotional) CPT designed and programmed for delivery over the Internet. Total net test time of the OCPT is 19 min. Four primary measures are extracted for analyses: errors of omission (the number of targets to which a participant did not respond), errors of commission (the number of times a participant incorrectly responded to a non-target), response times, and response time consistency (standard deviations of response times). A detailed description of this test can be found elsewhere (Raz et al., 2012).

2.2.4. Visual emotional oddball task
Photographs of the faces of 8 different individuals, 4 male and 4 female, were used as stimuli. All faces were taken from a standard


set of pictures of facial affect, the NimStim face stimulus set (Tottenham, Borscheid, Ellertsen, Marcus, & Nelson, 2002). Facial expressions were angry, happy or neutral; each actor presented both emotional (angry, happy) and neutral expressions. Each angry and happy face was presented 5 times and each neutral face 30 times. We used a three-stimulus visual emotional oddball paradigm in which participants were confronted with one regularly repeated standard stimulus, a neutral face, presented with a probability of 0.75, and two "target" deviant stimuli, angry and happy faces, each presented with a probability of 0.125. A total of 320 stimuli (240 neutral, 40 angry and 40 happy faces) were divided into two blocks of 160 trials each (120 neutral, 20 angry and 20 happy faces). One block was composed of male faces and the other of female faces. The order of the two blocks varied across participants, and the presentation of face stimuli was randomized between trials. All trials began with a 1000 ms fixation display (a white cross on a black background), followed by a 500 ms display of the face (angry, happy or neutral). Following the face display, the screen went blank for an inter-trial interval (ITI) of 1000 ms. The experiment began with a short practice block containing neutral, happy and angry faces corresponding to those presented in the experimental blocks.
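As a rough illustration of the design described above, one block can be assembled per actor and then shuffled; the following Python sketch (our own; actor labels and function names are illustrative, not from the study) reproduces the trial counts and probabilities.

```python
import random

def build_block(actors, seed=0):
    """One oddball block: each of the 4 actors contributes 30 neutral,
    5 angry and 5 happy presentations -> 120 standards and 20 + 20 deviants."""
    rng = random.Random(seed)
    trials = []
    for actor in actors:
        trials += [(actor, "neutral")] * 30   # standard, p = 0.75
        trials += [(actor, "angry")] * 5      # deviant,  p = 0.125
        trials += [(actor, "happy")] * 5      # deviant,  p = 0.125
    rng.shuffle(trials)                       # randomized between trials
    return trials

# One male-face block and one female-face block; in the study the block
# order varied across participants.
session = build_block(["m1", "m2", "m3", "m4"]) + \
          build_block(["f1", "f2", "f3", "f4"], seed=1)

counts = {e: sum(1 for _, emo in session if emo == e)
          for e in ("neutral", "angry", "happy")}
# counts == {"neutral": 240, "angry": 40, "happy": 40}; 320 trials in total
```

Each block satisfies the per-actor constraint (30 neutral, 5 angry, 5 happy presentations per face), so the overall 0.75/0.125/0.125 probabilities fall out of the counts rather than being sampled independently per trial.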

2.3. Procedure

One to two weeks prior to the ERP experiment, 195 undergraduate students completed the AVEI and the EIS. Those who met the criteria for either the high or the low EI group were invited to the ERP session. Participants were seated in a comfortable chair in a dimly lit room at a distance of 80 cm from a 19″ computer screen. They were instructed to focus their gaze on the face stimuli presented at the center of the screen, and to indicate as quickly as possible (without compromising accuracy) the occurrence of a "target" (deviant) emotional face by pressing the spacebar on the computer's keyboard with their right index finger, while withholding responses to the "non-target" (standard) neutral faces. Response time and error rate were recorded. There were two categories of errors: omissions and commissions. Responses could be made during face presentation as well as during inter-trial intervals. Participants were asked to refrain from making eye movements throughout the session. Participants were allowed a rest period after the practice block and between the two experimental blocks. In addition to the ERP session, each participant completed the OCPT; half of the participants completed it before the ERP session and half after.

2.4. EEG/ERP recording; data acquisition

EEG was recorded continuously using a 64-channel HydroCel Geodesic Sensor Net, a Net Amps 300 amplifier, and Net Station software, Version 4.2 (Electrical Geodesics Inc., Eugene, OR) at 250 Hz with 0.1 Hz high-pass and 100 Hz low-pass filtering. Electrode impedances were maintained below 60 kΩ. All channels were referenced to Cz during acquisition. After acquisition, during offline processing, the continuous EEG was re-referenced to an average reference, filtered with a 1–30 Hz band-pass filter and segmented by condition into 900 ms stimulus-locked epochs, ranging from 100 ms pre-stimulus to 800 ms post-stimulus. Epochs contaminated with vertical eye movement (eye blinks; ±140 μV) and horizontal eye movement (±55 μV) artifacts, as identified by a computerized algorithm and verified by visual inspection, were eliminated. In addition, a recording segment was marked bad if it contained more than ten bad channels. Individual bad channels were replaced on a segment-by-segment basis with spherical spline interpolation. After artifact correction, an average of 81.43% of the 320 trials were retained in the analyses. The mean percentage of artifact-free trials per condition was 83.52% for angry faces, 82.99% for happy faces and 77.78% for neutral faces. Averaged ERP data were baseline corrected and re-referenced into an average reference frame. All stimulus presentation and behavioral response collection was controlled by a PC running E-Prime 2.0 software (Psychology Software Tools Inc., PA).

2.5. Target-evoked ERP components

Following inspection of the grand average ERPs, we decided to quantify the mean amplitudes of five consecutive ERP components within specified latency windows (centered on each component's peak): P1, 100–150 ms post-stimulus; N170, 155–200 ms; P2, 205–290 ms; N2, 290–340 ms; and P3, 350–480 ms. Mean amplitudes of these components were quantified for ten channels at the frontal scalp location (average of channels 2, 3, 5, 6, 8, 9, 10, 11, 12, 60), six channels at the posterior-parietal scalp location (average of channels 31, 33, 34, 36, 38, 40), and three channels at the occipital location (average of channels 35, 37, 39). For the electrode array, see Fig. 1.

2.6. Data analysis

2.6.1. ERP
To assess the relationship between EI and brain activity, we used a 3 × 2 mixed-design analysis of variance (ANOVA) to analyze mean amplitudes of the pre-selected P1, N170, P2, N2 and P3 components at frontal, posterior-parietal and occipital channels. Face (angry/happy/neutral) was the within-subject factor, and EI group (high EI/low EI) was the between-subject factor. Follow-up paired-sample t-tests were used to break down within-subject effects.

2.6.2. Behavioral results
To examine group differences in reaction times and error rates we conducted a 3 × 2 mixed-design ANOVA, with face (angry/happy/neutral) as the within-subject factor and EI group (high EI/low EI) as the between-subject factor. Follow-up paired-sample t-tests were used to break down within-subject effects. Numeric electrophysiological and behavioral results are presented as mean ± SEM (standard error of the mean) in both the text and the figures.

3. Results

In both the behavioral and the ERP results of the oddball task there were no significant differences between responses to male and female faces. The two blocks were therefore combined in further analyses. One female participant from the high EI group and one from the low EI group were excluded from analyses due to excessive artifacts in the EEG data.
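To make the per-window mean-amplitude quantification described above concrete, here is a minimal numpy sketch. The 250 Hz sampling rate and 100 ms pre-stimulus baseline follow the recording description; the function, its interface and the synthetic array shapes are our own illustration, not the authors' code.

```python
import numpy as np

FS = 250          # sampling rate (Hz), as in the recording description
N_BASELINE = 25   # 100 ms pre-stimulus at 250 Hz = 25 samples

def mean_amplitude(epochs, channels, window_ms):
    """Mean amplitude of an ERP component for one condition.

    epochs    -- array (n_trials, n_channels, n_samples); each epoch runs
                 from -100 ms to +800 ms around stimulus onset (225 samples)
    channels  -- 0-based channel indices forming the scalp region of interest
    window_ms -- (start, end) latency window in ms after stimulus onset
    """
    erp = epochs.mean(axis=0)                               # average over trials
    erp -= erp[:, :N_BASELINE].mean(axis=1, keepdims=True)  # baseline-correct
    s0 = N_BASELINE + int(window_ms[0] * FS / 1000)
    s1 = N_BASELINE + int(window_ms[1] * FS / 1000)
    return erp[channels, s0:s1].mean()   # average over region and window

# e.g. P3 (350-480 ms) over the three occipital channels 35, 37, 39
# (channel numbers in the paper are 1-based, hence the -1):
occipital = [ch - 1 for ch in (35, 37, 39)]
```

The same call, repeated per component window, per region and per condition, yields the values that enter the 3 × 2 mixed-design ANOVA.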

3.1. Electrophysiological results

Fig. 2 depicts the grand averaged ERPs to angry, happy and neutral faces by EI group at frontal, posterior-parietal and occipital channels.

3.1.1. P1 (100–150 ms)
Analysis of the P1 component at frontal and at posterior-parietal channels revealed a main effect of EI group, such that the P1 mean amplitude was greater (more negative at frontal

Fig. 1. Layout of the electrode array and the electrodes chosen for analysis.

location and more positive at the posterior-parietal location) in the high EI group than in the low EI group, regardless of the facial expression condition (frontal: F(1,37) = 7.03, p = 0.012, ηp² = 0.16; posterior-parietal: F(1,37) = 6.77, p = 0.013, ηp² = 0.16) (Fig. 3). The pattern of results was similar at occipital channels, but the difference between EI groups did not reach statistical significance (p = 0.10). No within-subject or interaction effects were found for P1.
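The effect sizes reported throughout this section are partial eta-squared values, which can be recovered from each F statistic and its degrees of freedom via the standard identity ηp² = F·df1 / (F·df1 + df2). A one-line sketch of that identity (ours, for illustration):

```python
def partial_eta_squared(F, df_effect, df_error):
    """Standard identity: eta_p^2 = F*df1 / (F*df1 + df2)."""
    return (F * df_effect) / (F * df_effect + df_error)

# Reproduces two of the reported between-group effects with F(1,37):
print(round(partial_eta_squared(7.03, 1, 37), 2))  # 0.16 (P1, frontal)
print(round(partial_eta_squared(8.76, 1, 37), 2))  # 0.19 (P2, frontal)
```

This is a convenient consistency check when reading the F values below, since the paper reports F, p and ηp² together.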

3.1.2. N170 (155–200 ms)
Analysis of the N170 component at frontal, posterior-parietal and occipital channels revealed a significant within-subject main effect of facial expression (frontal: F(1.65,61.11) = 5.89, p = 0.007, ηp² = 0.14; posterior-parietal: F(1.49,55.05) = 4.00, p = 0.03, ηp² = 0.10; occipital: F(1.63,60.13) = 10.17, p = 0.0001, ηp² = 0.22). N170 mean amplitude was greater for angry and happy faces than for neutral faces. No between-group or interaction effects were seen in N170 mean amplitude.

3.1.3. P2 (205–290 ms)
Analysis of the P2 component at frontal, posterior-parietal and occipital channels revealed a main effect of EI group, such that the P2 mean amplitude was significantly greater (more negative at frontal channels and more positive at posterior-parietal and occipital channels) in the high EI group than in the low EI group, regardless of the facial expression condition (frontal: F(1,37) = 8.76, p = 0.005, ηp² = 0.19; posterior-parietal: F(1,37) = 11.57, p = 0.002, ηp² = 0.24; occipital: F(1,37) = 7.12, p = 0.011, ηp² = 0.16). In addition to the between-group effect, analysis also revealed a significant within-subject main effect of facial expression at all three scalp locations (frontal: F(1.72,63.72) = 11.68, p = 0.0001, ηp² = 0.24; posterior-parietal: F(1.40,51.86) = 7.58, p = 0.004, ηp² = 0.17; occipital: F(1.41,52.03) = 13.21, p = 0.0001, ηp² = 0.26). P2 mean amplitude was greater for neutral faces than for angry and happy faces (Fig. 4).

3.1.4. N2 (290–340 ms)
Analysis revealed a main effect of EI group, such that the N2 mean amplitude was more positive at frontal channels and more negative at posterior-parietal and occipital channels in the low EI group than in the high EI group, regardless of the facial expression condition (frontal: F(1,37) = 5.58, p = 0.024, ηp² = 0.13; posterior-parietal: F(1,37) = 12.34, p = 0.001, ηp² = 0.25; occipital: F(1,37) = 5.88, p = 0.02, ηp² = 0.14). There was also a significant within-subject main effect of facial expression at all three scalp locations (frontal: F(2,74) = 10.60, p = 0.0001, ηp² = 0.22; posterior-parietal: F(1.72,63.74) = 4.52, p = 0.019, ηp² = 0.11; occipital: F(2,74) = 18.75, p = 0.0001, ηp² = 0.34). N2 mean amplitude was greater for the emotional faces than for the neutral faces (Fig. 5).

3.1.5. P3 (350–480 ms post stimulus onset)
Analysis of the P3 component at frontal, posterior-parietal and occipital channels revealed a main effect of EI group, such that the P3 mean amplitude was significantly greater (more negative at frontal channels and more positive at posterior-parietal and occipital channels) in the high EI group than in the low EI group, regardless of the facial expression condition (frontal: F(1,37) = 4.74, p = 0.036, ηp² = 0.11; posterior-parietal: F(1,37) = 6.16, p = 0.018, ηp² = 0.14; occipital: F(1,37) = 4.42, p = 0.042, ηp² = 0.11). There was also a significant within-subject main effect of facial expression at all three scalp locations (frontal: F(2,74) = 19.10, p = 0.0001, ηp² = 0.34; posterior-parietal: F(1.68,62.13) = 34.23, p = 0.0001, ηp² = 0.48; occipital: F(1.49,55.02) = 5.20, p = 0.015, ηp² = 0.12). P3 mean amplitude was greater for the emotional faces than for the neutral faces (Fig. 6).

Fig. 2. Grand averaged ERPs to angry, happy and neutral faces at frontal ((a) average of channels 2, 3, 5, 6, 8, 9, 10, 11, 12, 60), posterior-parietal ((b) average of channels 31, 33, 34, 36, 38, 40) and occipital ((c) average of channels 35, 37, 39) scalp locations for the high and low EI groups.

Fig. 3. Mean amplitudes of the P1 (100–150 ms) component in response to angry, happy and neutral faces within the high and low EI groups at frontal (a) and posterior-parietal (b) channels.

3.2. Behavioral results

In testing our hypotheses regarding error rates and reaction times at the behavioral level, analysis revealed significant within-subject effects of face: omission error rates were lower for angry faces (2.82% ± 0.70) than for happy faces (4.17% ± 0.92), F(1,37) = 6.18, p = 0.018, ηp² = 0.15, and RTs were faster in response to angry faces (470.92 ms ± 7.10) than to happy faces (490.19 ms ± 8.11), F(1,37) = 10.68, p = 0.002, ηp² = 0.22. No between-group or interaction effects were found in rates of omission errors, commission errors or reaction times.

3.3. OCPT results

Consistent with our hypothesis, analysis of both the low-target and the high-target frequency sections of the OCPT revealed no differences between the low and the high EI groups in average rates of omission errors, commission errors, reaction times or response time consistencies (see Table 1).

4. Discussion

The present study set out to examine the neurological correlates of emotional intelligence by recording ERPs during an emotional

oddball task in two groups of participants: those with high EI and those with low EI (based on a self-report questionnaire and a performance-based test). Our first hypothesis suggested that individuals with high EI would differ from those with low EI in their mean amplitudes of early and late ERP components in response to both angry and happy target faces. The ERP results largely supported this hypothesis, indicating modulation of stimulus processing by EI level. More specifically, participants with high EI exhibited significantly greater amplitudes of the P1, P2, N2 and P3 ERP components in response to all faces, which was evident at frontal, posterior-parietal and occipital scalp locations. This difference between EI groups was not found for the face-sensitive N170 component. P1, P2 and N2 have been associated with early attention to emotional stimuli, stimulus discrimination and response selection processes. Augmentation of these components is thought to reflect the recruitment of additional perceptual processing for emotion-inducing stimuli (Codispoti, Ferrari, & Bradley, 2007; Hillyard & Anllo-Vento, 1998; Hillyard, Vogel, & Luck, 1998; Junghöfer, Bradley, Elbert, & Lang, 2001; Olofsson et al., 2008; Thomas, Johnstone, & Gonsalvez, 2007; Vogel & Luck, 2000). The P3 component has been thought to reflect the attentional capacity allocated to the categorization of significant events (Kok, 2001), and may be viewed as an index of the brain activities that underlie revision of the mental representation induced by incoming stimuli (Linden, 2005; Polich, 2007). Hence, the current findings indicating that P1, P2, N2 and P3 are enhanced among participants with high EI compared to those with low EI may be cautiously interpreted as suggesting that visual emotional stimuli elicit greater mobilization of attention resources, and subsequently more elaborative emotional information processing, in individuals with high EI compared with those with low EI.
Importantly, however, greater P1, P2, N2 and P3 amplitudes within the high EI group were evident not only in response to

Fig. 4. Mean amplitudes of the P2 (205–290 ms) component in response to angry, happy and neutral faces within the high and low EI groups at frontal (a), posterior-parietal (b) and occipital (c) channels.

emotional faces but also in response to neutral faces. This is in line with Fisher et al. (2010). Nevertheless, this finding implicates modulation of P1, P2, N2 and P3 as an indicator of the commitment of attentional resources to the processing of facial expressions of emotion. There are a number of possible explanations for this puzzling finding. First, higher EI may lead to vigilance toward all upcoming stimuli, especially in the context of an emotional task (in which emotional and neutral stimuli are presented rapidly in random order). This may suggest greater recruitment of resources to process all emotional and non-emotional stimuli at early and late processing stages. Second, reduced AVEI performance and reduced P1, P2, N2 and P3 amplitudes within the low EI group may simply reflect globally decreased attention to task demands. However, this explanation seems less plausible in the context of the current study, since the high and low EI groups did not differ in their OCPT performance. Future studies may attempt to further explore and differentiate among these explanations. At the behavioral level, consistent with our hypothesis, participants exhibited lower error rates and faster reaction times in

-3 -3.5 -4

Fig. 5. Mean amplitudes of the N2 (290–340 ms) component in response to angry, happy and neutral faces within the high and low EI groups at frontal (a), posterior-parietal (b) and occipital (c) channels.

response to angry faces compared with happy faces. However, our hypothesis regarding differences in error rates and reaction times between the high and the low EI groups was not supported by the current results. The fact that the expected between-group differences were detected only in the ERP data suggests that ERP methodology may provide a more sensitive measure of the emotional processing correlates of EI. It is also possible that the straightforward oddball task used in this study was too simple to elicit behavioral differences among EI groups. Future studies may use more complicated emotional tasks to that end. The concept of EI is associated with greater sensitivity to emotional cues and improved acuity in emotion identification and response (Zysberg et al., 2011; Goleman, 1995). The current results largely support our hypothesized association between EI and neurological processes during an emotional task. They suggest that emotionally intelligent individuals do differ from others in their electrophysiological reactions to emotional stimuli. Such result patterns may support the theoretical equivalence between

ability-EI and theories of scholastic intelligence linking cognitive ability with neurological and cognitive efficiency (Nevo, 1997).

In the interpretation of our results a few more considerations need to be taken into account. First, participants were assigned to the high and low EI groups based on two measures of EI that addressed two different approaches: a self-report questionnaire (trait EI) and the Audiovisual Test of Emotional Intelligence (ability EI). On the one hand, this procedure may enhance the validity of the division into groups; on the other hand, it is hard to attribute our results specifically either to self-report or to ability EI testing. In other words, we cannot conclude that the neural differences between the low and high EI groups found in this study are related directly to trait EI or to ability EI. Future studies may attempt to examine whether ERP differences are more strongly related to, or better explained by, trait EI, ability EI or the combination of the two. Second, in the current task, emotional faces were always the target stimuli and neutral faces were non-target stimuli. This might make the comparison between emotional and non-emotional faces less straightforward since, in the case of angry and happy faces, the effects due to their emotional content may be conflated with the effects due to their role as targets. Continued studies may control for this potential confound of emotion with target status.

In conclusion, to the best of our knowledge, the current study is among the first ERP studies to examine emotional visual stimuli processing with respect to EI. It represents an important preliminary investigation of brain mechanisms and dynamics underlying different levels of EI. The present study also underscores the usefulness of ERP methodology as a sensitive measure for the study of emotional stimuli processing in the research field of EI.

Fig. 6. Mean amplitudes of the P3 (350–480 ms) component in response to angry, happy and neutral faces within the high and low EI groups at frontal (a), posterior-parietal (b) and occipital (c) channels.

Table 1
OCPT performance among high and low EI groups: means and standard deviations for omission errors, commission errors, reaction times (ms) and response time consistencies (RTSD), and the corresponding between-groups t and p values.

                   High EI           Low EI
                   Mean      sd      Mean      sd        t      p
Low target
  Omissions         0.47     0.70     0.55     1.31     0.22   0.82
  Commissions       0.89     1.29     0.60     1.14     0.76   0.45
  RT (ms)         440.28    76.53   424.60    68.32     0.68   0.50
  RTSD (ms)        77.35    27.09    68.47    29.03     0.99   0.33
High target
  Omissions         1.79     2.15     1.25     2.31     0.75   0.46
  Commissions       4.79     4.46     4.85     4.97     0.04   0.97
  RT (ms)         401.70    57.78   387.32   104.21     0.53   0.60
  RTSD (ms)        88.21    24.82    83.88    74.65     0.24   0.81

Acknowledgment

This study was supported by a Migal-Tel Hai research authority grant.

References

Bar-On, R., Tranel, D., Denburg, N. L., & Bechara, A. (2003). Exploring the neurological substrate of emotional and social intelligence. Brain, 126, 1790–1800.
Campanella, S., Rossignol, M., Mejias, S., Joassin, F., Maurage, P., Debatisse, D., et al. (2004). Human gender differences in an emotional visual oddball task: An ERP study. Neuroscience Letters, 367, 14–18.
Codispoti, M., Ferrari, V., & Bradley, M. M. (2007). Repetition and event-related potentials: Distinguishing early and late processes in affective picture perception. Journal of Cognitive Neuroscience, 19(4), 577–586.
Damasio, A. R. (1994). Descartes' error: Emotion, reason, and the human brain. New York: Putnam.
Davidson, R. J. (1998). Affective style and affective disorders: Perspectives from affective neuroscience. Cognition and Emotion, 12(3), 307–330.
Davidson, R. J. (2002). Anxiety and affective style: Role of prefrontal cortex and amygdala. Biological Psychiatry, 51, 68–80.
Fisher, J. E., Sass, S. M., Heller, W., Levin Silton, R., Edgar, J. C., Stewart, J. L., et al. (2010). Time course of processing emotional stimuli as a function of perceived emotional intelligence, anxiety, and depression. Emotion, 10(4), 486–497.
Fox, E., Lester, V., Russo, R., Bowles, R. J., Pichler, A., & Dutton, K. (2000). Facial expressions of emotion: Are angry faces detected more efficiently? Cognition and Emotion, 14(1), 61–92.
Goleman, D. (1995). Emotional intelligence. New York, NY: Bantam.
Grewal, D., Brackett, M., & Salovey, P. (2006). Emotional intelligence and the self-regulation of affect. In D. K. Snyder, J. Simpson, & J. N. Hughes (Eds.), Emotion regulation in couples and families: Pathways to dysfunction and health (pp. 37–55). Washington, DC: American Psychological Association.
Hajcak, G., & MacNamara, A. (2010). Event-related potentials, emotion, and emotion regulation: An integrative review. Developmental Neuropsychology, 35(2), 129–155.
Hillyard, S. A., & Anllo-Vento, L. (1998). Event-related brain potentials in the study of visual selective attention. Proceedings of the National Academy of Sciences of the United States of America, 95, 781–787.
Hillyard, S. A., Vogel, E. K., & Luck, S. J. (1998). Sensory gain control (amplification) as a mechanism of selective attention: Electrophysiological and neuroimaging evidence. Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences, 353, 1257–1270.
Hornak, J., Bramham, J., Rolls, E. T., Morris, R. G., O'Doherty, J., Bullock, P. R., et al. (2003). Changes in emotion after circumscribed surgical lesions of the orbitofrontal and cingulate cortices. Brain, 126, 1691–1712.
Jausovec, N., & Jausovec, K. (2005). Differences in induced gamma and alpha oscillation in the human brain related to verbal/performance and emotional intelligence. International Journal of Psychophysiology, 56, 223–235.


Jausovec, K., Jausovec, N., & Gerli, I. (2001). Differences in event-related and induced EEG patterns in the theta and alpha frequency bands related to human emotional intelligence. Neuroscience Letters, 311, 93–96.
Junghöfer, M., Bradley, M. M., Elbert, T. R., & Lang, P. J. (2001). Fleeting images: A new look at early emotion discrimination. Psychophysiology, 38, 175–178.
Killgore, W. D. S., & Yurgelun-Todd, D. A. (2007). Neural correlates of emotional intelligence in adolescent children. Cognitive, Affective and Behavioral Neuroscience, 7(2), 140–151.
Kok, A. (2001). On the utility of P3 amplitude as a measure of processing capacity. Psychophysiology, 38, 557–577.
Linden, D. (2005). The P300: Where in the brain is it produced and what does it tell us? The Neuroscientist, 11(6), 563–576.
Luck, S. J. (2012). Event-related potentials. In D. L. Long (Ed.), APA handbook of research methods in psychology (pp. 1–18). Washington, DC: American Psychological Association.
Mayer, J. D., Caruso, D. R., & Salovey, P. (1999). Emotional intelligence meets traditional standards for an intelligence. Intelligence, 27(4), 267–298.
Mayer, J. D., & Salovey, P. (1997). What is emotional intelligence? In P. Salovey & D. Sluyter (Eds.), Emotional development and emotional intelligence: Educational implications (pp. 3–31). New York: Basic Books.
Nevo, B. (1997). Human intelligence [Hebrew edition]. Tel Aviv: Open University Press.
Olofsson, J. K., Nordin, S., Sequeira, H., & Polich, J. (2008). Affective picture processing: An integrative review of ERP findings. Biological Psychology, 77(3), 247–265.
Petrides, K. V., & Furnham, A. (2000). On the dimensional structure of emotional intelligence. Personality and Individual Differences, 29, 313–320.
Phillips, M. L., Ladouceur, C. D., & Drevets, W. C. (2008). A neural model of voluntary and automatic emotion regulation: Implications for understanding the pathophysiology and neurodevelopment of bipolar disorder. Molecular Psychiatry, 13(9), 833–857.
Polich, J. (2007). Updating P300: An integrative theory of P3a and P3b. Clinical Neurophysiology, 118(10), 2128–2148.
Raz, S., Bar-Haim, Y., Sadeh, A., & Dan, O. (2012). Reliability and validity of the online continuous performance test among young adults. Assessment. http://dx.doi.org/10.1177/1073191112443409.
Schutte, N. S., Malouff, J. M., Hall, L. E., Haggerty, D. J., Cooper, J. T., Golden, C. J., et al. (1998). Development and validation of a measure of emotional intelligence. Personality and Individual Differences, 25, 167–177.
Tarasuik, J., Ciorciari, J., & Stough, C. (2009). Understanding the neurobiology of emotional intelligence: A review. In Assessing emotional intelligence: Theory, research, and applications (pp. 307–320). New York, NY: Springer Science+Business Media.
Thomas, S. J., Johnstone, S. J., & Gonsalvez, C. J. (2007). Event-related potentials during an emotional Stroop task. International Journal of Psychophysiology, 63(3), 221–231.
Tottenham, N., Borscheid, A., Ellertsen, K., Marcus, D. J., & Nelson, C. A. (2002). Categorization of facial expressions in children and adults: Establishing a larger stimulus set [Abstract]. Journal of Cognitive Neuroscience, 14, S74.
Vogel, E. K., & Luck, S. J. (2000). The visual N1 component as an index of a discrimination process. Psychophysiology, 37, 123–190.
Zysberg, L., Levy, A., & Zisberg, A. (2011). Emotional intelligence in applicant selection for care-oriented programs. Journal of Psychoeducational Assessment, 29, 27–38.
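Computational note. The between-group comparisons reported for the OCPT (omissions, commissions, RT, RTSD) are independent-samples t-tests. As an illustration only, the sketch below computes Welch's t statistic and its approximate degrees of freedom; the per-participant scores shown are hypothetical, since the article reports only group summaries, and the paper does not state whether the pooled-variance or Welch variant was used.

```python
import math
from statistics import mean, variance


def welch_t(x, y):
    """Welch's t statistic and Welch-Satterthwaite degrees of freedom
    for two independent samples (unequal variances allowed)."""
    nx, ny = len(x), len(y)
    vx, vy = variance(x), variance(y)   # sample variances (n - 1 denominator)
    se2 = vx / nx + vy / ny             # squared standard error of the mean difference
    t = (mean(x) - mean(y)) / math.sqrt(se2)
    df = se2 ** 2 / ((vx / nx) ** 2 / (nx - 1) + (vy / ny) ** 2 / (ny - 1))
    return t, df


# Hypothetical reaction times (ms) for two EI groups -- illustration only,
# not the study's data.
high_ei_rt = [430.0, 455.2, 448.1, 420.5, 447.6]
low_ei_rt = [410.3, 433.8, 425.0, 418.9, 430.1]
t, df = welch_t(high_ei_rt, low_ei_rt)
print(f"t = {t:.2f}, df = {df:.1f}")
```

The p value then follows from the t distribution with `df` degrees of freedom (e.g., via `scipy.stats.t.sf`, if SciPy is available).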
