International Journal of Psychophysiology 93 (2014) 398–410


The role of encoding and attention in facial emotion memory: An EEG investigation

Colleen A. Brenner a,⁎, Samuel P. Rumak a, Amy M.N. Burns a, Paul D. Kieffaber b

a University of British Columbia, Department of Psychology, 2136 West Mall, Vancouver, British Columbia V6T 1Z4, Canada
b College of William & Mary, Department of Psychology, P.O. Box 8795, Williamsburg, VA 23187-8795, USA

Article history: Received 28 January 2014; Received in revised form 29 April 2014; Accepted 10 June 2014; Available online 17 June 2014.

Keywords: Event-related potential; N170; P100; N250; Theta; Emotion; Attention; Memory.

Abstract

Facial expressions are encoded via sensory mechanisms, but meaning extraction and salience of these expressions involve cognitive functions. We investigated the time course of sensory encoding and subsequent maintenance in memory via EEG. Twenty-nine healthy participants completed a facial emotion delayed match-to-sample task. P100, N170 and N250 ERPs were measured in response to the first stimulus, and evoked theta power (4–7 Hz) was measured during the delay interval. Negative facial expressions produced larger N170 amplitudes and greater theta power early in the delay. N170 amplitude correlated with theta power; however, larger N170 amplitude coupled with greater theta power predicted behavioural performance for only one (very happy) of the six emotion conditions tested (see Supplemental Data). These findings indicate that the N170 ERP may be sensitive to emotional facial expressions when task demands require encoding and retention of this information. Furthermore, sustained theta activity may represent continued attentional processing that supports short-term memory, especially of negative facial stimuli. Further study is needed to investigate the potential influence of these measures, and their interaction, on behavioural performance.

Crown Copyright © 2014 Published by Elsevier B.V. All rights reserved.

1. Introduction

Emotional facial expressions are an efficient way of communicating one's emotional state. This information is extremely important in situations where socially appropriate responses require an accurate reading of the emotional states of others. Emotional expressions are quickly translated from sensory signals to higher order cognitive networks for further processing and integration with broader executive processes, including memory (Adolphs, 2003). Thus far, there have been very few systematic studies specifically investigating the role of emotion in early sensory processing of facial emotions, their maintenance in short-term memory via sustained EEG activity, and whether the interplay between sensory- and maintenance-related activity affects performance. Electrophysiological methods have the advantage of capturing the brain's response to facial expressions on a millisecond timescale, matching the timing of facial expression recognition in the course of a natural interaction. Scalp-recorded event-related potentials (ERPs) reflect the synchronized firing of large populations of neurons that are time-locked to a stimulus. Previous research has identified that the P100, the N170 and the N250 ERPs can be elicited by visual stimuli, with the N170 and N250 particularly sensitive to facial stimuli (Bentin et al., 1996; Herrmann et al., 2005a, 2005b; Streit et al., 2000). While

⁎ Corresponding author. Tel.: +1 604 822 4650.
E-mail addresses: [email protected] (C.A. Brenner), [email protected] (S.P. Rumak), [email protected] (A.M.N. Burns), [email protected] (P.D. Kieffaber).


several other face-sensitive ERPs have been reported in the literature (EPN, N400, and LPC), we focus on the P100, N170 and N250 as representations of processes that are morphologically well-characterized in the current study, are consistent with our recording parameters and choice of reference, and occur early enough to ostensibly reflect sensory rather than cognitive processing.

1.1. Relevant ERPs: P100, N170 and N250

The P100 ERP is a positive deflection that peaks between 80 and 120 ms after a visual stimulus, and is thought to reflect attention-based early visual processing (Mangun and Hillyard, 1991; Mangun, 1995). The P100 is larger in response to expected rather than unexpected stimuli, and varies depending on stimulus properties and location (Nakamura et al., 2001; Regan, 1989). Localization studies place the generator of the P100 in bilateral occipital areas and the fusiform gyrus (Herrmann et al., 2005a; Taylor et al., 2011; T. K. W. Wong et al., 2009). It is therefore considered an early index of attention-modulated sensory processing. The data regarding face processing and the P100 ERP are somewhat inconsistent, with some studies finding P100 amplitude sensitive to faces compared to non-face stimuli, while others fail to find such modulation (Herrmann et al., 2005b; Jacques and Rossion, 2006; Liu et al., 2002; Utama et al., 2009a, 2009b; A. C.-N. Wong et al., 2009). By presenting intact and scrambled faces versus objects, Rossion and Caharel (2011) demonstrated that the P100 reflects processing of low-level visual cues that are not related to the experience


of viewing a face. A few studies have reported facial emotion modulation as early as 90 ms post-stimulus onset (Batty and Taylor, 2003; Brosch et al., 2008; Pourtois et al., 2004); however, the tasks employed in these studies were either covert spatial attention or implicit emotion tasks. Furthermore, one study reported P100 modulation in response to a bar that replaced emotional face stimuli, but not in response to the face itself (Pourtois et al., 2004), while another found a main effect of emotion but post-hoc analyses showed that P100 amplitude did not significantly differ between any two stimuli (Batty and Taylor, 2003). Given these findings, the P100 may reflect attention-modulated sensory processing of the visual cues found in faces, but it likely does not reflect the structural encoding of a face, the phenomenological experience of seeing a face, or the explicit emotional categorization of facial stimuli in most circumstances.

The N170 ERP is a negative deflection that peaks approximately 170 ms after stimulus presentation. The N170 is larger in response to intact faces compared to objects or scrambled faces (Bentin et al., 1996; Chang and Huang, 2012; Herrmann et al., 2005a, 2005b; Rossion and Caharel, 2011). A study by Jonas et al. (2012) found that only stimulation of the right inferior occipital gyrus resulted in transient prosopagnosia in an epileptic patient who did not show prosopagnosia before or after stimulation. They also recorded an N170 in response to facial stimuli in this area, indicating that the right inferior occipital gyrus is necessary for the generation of this seemingly face-sensitive ERP. Source localization studies lend support to the notion of face selectivity in the superior temporal sulcus and the fusiform gyrus (Deffke et al., 2007; Utama et al., 2009a, 2009b). The data concerning emotional modulation of the N170 are mixed: some studies report amplitude modulation of the N170 ERP, especially by negative facial expressions, as well as sensitivity to expression intensity, while others report no modulation by emotional expression (Batty and Taylor, 2003; Eimer and Holmes, 2002; Rellecke et al., 2012; Righi et al., 2012; Utama et al., 2009a, 2009b; A. C.-N. Wong et al., 2009). One explanation for this inconsistency may be that the N170 is influenced by the temporally overlapping early posterior negativity (EPN) ERP. Rellecke et al. (2013) reported that the EPN and the N170 can be distinguished by distinct scalp distributions during the same time window at the same electrode sites. They suggest that the scalp distribution related to the EPN is sensitive to emotional facial expressions while the parallel distribution associated with the N170 is sensitive only to the structural processing of facial stimuli, and that topographies are largely affected by the choice of reference. The EPN is more negative in response to emotional compared to neutral facial expressions and is characterized by a more posterior spatial distribution than the N170 ERP (Rellecke et al., 2013). Since studies that measured the N170 used different paradigms and different references, any summary of emotion modulation of the N170 ERP must be tentative. In general, these findings indicate that the N170 reflects more detailed processing of facial structure that may include emotional expression, depending on specific recording parameters and task demands.
Early models of facial emotion processing propose that the structural encoding of faces and the detection of different facial expressions are separate but parallel processes (Bruce and Young, 1986; Martens et al., 2010). More recent work has found an interaction between identity and expression processing of faces, consistent with parallel but intersecting networks (Streit et al., 2003; Vuilleumier and Pourtois, 2007). Given this context, the N250 ERP, which is typically recorded at frontal–central sites, is thought to use the structural information gleaned from the processing step reflected by the N170 ERP and to superimpose additional complex information associated with identity, such as facial affect or gender (Wynn et al., 2013). Streit et al. (2000) reported larger N250 amplitudes in response to a facial emotion recognition task compared to blurred faces in an object recognition task. Since both tasks evoked an N170 ERP but only the explicit facial emotion task elicited an N250, they interpreted these findings to indicate that the N250 waveform is specific to facial emotion. Furthermore, Zheng et al. (2012) found


that the N250 was sensitive to the differentiation of one face from another, while earlier components such as the P100 and N170 were not. They suggested that the N170 may be associated with the perception of face characteristics (i.e. facial layout) whereas the N250 may reflect individual face recognition within a memory paradigm. Activity associated with the N250 ERP in response to happy faces may arise from bilateral fusiform gyri, although EEG localization studies for other emotions are scarce (Williams et al., 2006). Much less data have been gathered on the N250 ERP in general, and evidence for its modulation by emotion is mixed (Lee et al., 2010; Liu et al., 2012; Sato et al., 2001; T. K. W. Wong et al., 2009). Given these findings, the N250 may reflect the coordination of neural activity associated with more complex, detailed processing of facial stimuli, such as facial expressions.

1.2. ERPs in emotion processing, memory and attention

There is a large amount of research focused on the impact of emotion on attention and memory. The data indicate that emotional stimuli are encoded and retrieved more accurately than neutral stimuli and that emotional stimuli capture more attention than neutral stimuli (Adolphs et al., 2000; Calvo et al., 2007; Dolan, 2002; Gomes et al., 2013; Kensinger, 2007; Koenig and Mecklinger, 2008; LaBar and Cabeza, 2006; Phelps, 2004; Sato et al., 2001; Schupp et al., 2003; Todd et al., 2012a). It is hypothesized that angry and fearful facial expressions represent threat stimuli that may be more salient than neutral or positive expressions because they may require action on the part of the observer (Krombholz et al., 2007; Palermo and Rhodes, 2007; Rellecke et al., 2012; Schupp et al., 2004). Electrophysiological findings are consistent with this picture. Negatively valenced stimuli produce larger old/new effects compared to neutral or positive stimuli (Johansson et al., 2004; Schaefer et al., 2011; van Strien et al., 2009; Weymar et al., 2010). The N2pc ERP reflects attention allocation, and is larger and earlier in response to angry compared to happy faces (Feldmann-Wüstefeld et al., 2011; Weymar et al., 2011).

In addition to ERPs, electrophysiological studies of short-term memory indicate that oscillations in the theta frequency range may support memory maintenance. Sustained theta activity and increased phase-locking recorded via several methods (scalp, local field potentials and single-unit recordings) during the maintenance phase of memory tasks in both humans and macaques predict behavioural performance (Reinhart et al., 2012). Both sustained theta activity and increased theta rhythmicity have been recorded during the delay of short-term memory tasks (Lee et al., 2005; Tsoneva et al., 2011). Klimesch et al. (2001) showed that items that were later remembered evoked a longer sustained theta response compared to those that were rated as only familiar. Furthermore, the confidence with which subjects reported seeing a stimulus was associated with dorsolateral prefrontal and superior parietal theta activity (Klimesch et al., 2006). However, in the studies reported above, memory and attentional focus are confounded, and there is evidence that sustained theta activity is also associated with attention in the absence of working memory load. For example, Kirschner et al. (2012) found increased theta power and phase locking values when subjects were on-task compared to non-attentional mind wandering. Chang and Huang (2012) reported higher theta synchronization in a high- compared to a low-attention task. Therefore, it is possible that increased sustained theta power recorded during short-term memory tasks reflects either the maintenance of a memory trace itself, or the increased concentration and focused attention required to maintain a memory trace. Deiber et al. (2007) reported increased transient frontal theta power within 500 ms of stimulus onset that was associated with obligatory stimulus processing and focused attention, whereas increased induced frontal theta between 965 and 2390 ms after stimulus onset varied with working memory load and was interpreted to reflect working memory maintenance. Therefore, differentiation between sustained theta activity associated with focused


attention and that associated with the maintenance of a memory trace may require detailed investigation of the timing of the sustained activity. Finally, a host of data suggests that theta oscillations may also be associated with emotion discrimination. Presentation of emotionally laden stimuli, such as International Affective Picture System (IAPS; Lang et al., 2008) slides or faces exhibiting emotional expressions, induces greater time-locked theta activity in response to emotional compared to neutral stimuli (Aftanas et al., 2001; González-Roldan et al., 2011; Knyazev et al., 2009; Zhang et al., 2012). Increased theta in response to emotional stimuli is associated with subjective intensity of the stimuli and individual sensitivity to emotional content (González-Roldan et al., 2011; Knyazev et al., 2009). These authors interpreted their findings to represent stronger engagement of attentional resources in response to biologically meaningful stimuli. In addition, many studies have found these early increases in theta activity in response to emotional facial stimuli to be somewhat right-lateralized (Balconi and Lucchiari, 2006; Balconi and Pozzoli, 2009; Güntekin and Başar, 2009). While these studies generally demonstrate that event-related theta activity within the first 500 ms of stimulus presentation can discriminate between "emotional" and "neutral" expressions, it is unclear whether the magnitude of the theta response differentiates between the different emotions that make up these categories. Furthermore, Knyazev et al. (2010) postulated that this theta response likely represents the complex operations (including memory, motivation and attention) associated with processing emotional information. While these studies demonstrate that time-locked theta activity is broadly sensitive to emotional content when measured shortly after the presentation of an emotional stimulus (overlapping in time with the P1, N170 and N250 ERPs described above), it is unclear whether sustained theta activity (i.e. > 500 ms after stimulus presentation) maintains this sensitivity.

1.3. The current study

Despite the large literature on the impact of emotion on ERPs, memory and attention, and the extensive literature on the role of theta activity in memory, emotion processing and attention, no studies have investigated the impact of emotion on sustained theta activity during an emotion memory task. A study by Todd et al. (2012b) linked the concepts of attention and memory in a single emotion paradigm. They found that larger P200 amplitudes in response to emotional (compared to neutral) stimuli mirrored behavioural reports of perceived vividness. These more vivid emotional stimuli were also recalled more accurately, indicating that emotional saliency modulates attention in order to enhance visual encoding and subsequent memory of the stimulus. Given the importance of emotional facial expressions, one would anticipate a similar interaction between attention and memory in response to facial stimuli. Sauseng et al. (2010) argued that theta activity during a working memory task is essential for the integration of bottom-up sensory and top-down template-matching information, and Zhang et al. (2012) suggested that theta activity may be associated with orienting attention toward important emotional cues. While these studies provide important insights about the neural involvement during encoding and maintenance of stimuli, the literature still lacks a study in which sensory processing of, and sustained responses to, emotional stimuli are compared. It is possible that emotional, and particularly negative, facial expressions engage more attentional resources at encoding, which may subsequently benefit short-term memory maintenance.

The first aim of this study is to investigate the role of emotion on sensory ERPs and maintenance-related EEG activity. To this end, sensory-related ERPs during the encoding of facial emotion stimuli and the sustained theta activity following facial emotion presentation were evaluated. Previous ERP studies that have investigated memory for emotional faces have used implicit emotion tasks that required participants to remember the face, rather than just the emotional expression, of the facial stimuli (Johansson et al., 2004; Langeslag et al., 2009). The current study addresses this issue by administering a unique task that requires participants to encode and maintain the emotional expression rather than the specific face of an individual. We utilized this novel delayed match-to-sample paradigm and put forth the following hypotheses: (1) the N170 ERP and the N250 ERP amplitudes would be modulated by emotion, particularly negative facial expressions, while the P100 ERP would not, and (2) we expected increased sustained theta activity in response to emotional compared to neutral stimuli. The second aim of this study is to evaluate whether these parameters contribute to short-term memory performance on this task. For this aim we hypothesized (3) negative correlations between the ERP amplitudes modulated by emotion (N170 and N250) and sustained theta power, and no statistical relationship between theta power and P100 amplitude, and (4) that the interaction between sensory ERPs and sustained theta activity would predict behavioural performance for the N170 and N250 but not for the P100.

2. Materials and methods

2.1. Participants

Participants were 29 undergraduate students (16 women; aged 18 to 22 years, mean age 20 years) at the University of British Columbia, who participated for course credit. This study was approved by the ethics committee at the University of British Columbia, and all subjects provided written informed consent. All subjects had normal or corrected-to-normal vision, no self-reported psychological or neurological disorders, and all but one were right-handed by self-report.

2.2. Procedure

Participants were seated in a quiet, darkened room and stimuli were presented on a CRT monitor 110 cm in front of the participants. Facial affect stimuli for the delayed match-to-sample task were taken from the NimStim Face Stimulus Set and consisted of an equal number of Caucasian female and male faces (Tottenham et al., 2009). Six emotions (very happy, somewhat happy, neutral, sad, fearful, and angry) were depicted by 22 different models, and pictures from every emotional category were selected for each model. The facial stimuli subtended 12.97° of visual angle, and the color and contrast of the pictures were not modified. The first face stimulus was presented for 200 ms, followed by a delay interval in which a black fixation cross remained on screen for the entire 2000 ms delay. The second stimulus was then presented for 200 ms and the participants responded, via button press with their dominant hand (right for all but one participant), whether the affective expression of the second stimulus was the same as or different from that of the first stimulus (Fig. 1). The first and second stimuli were always different models of the same sex, and the identity of the models was counterbalanced across matched and mismatched trials. The task consisted of 120 trials: 60 matched expressions and 60 mismatched expressions, balanced across emotional expression (20 trials per emotion).
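To make the trial counterbalancing concrete, the sketch below builds a hypothetical trial list matching the numbers described above (120 trials; 20 per emotion; half matched, half mismatched). The function name and the random assignment of mismatch expressions are illustrative assumptions; the counterbalancing of model identity and sex is omitted.

```python
import random

EMOTIONS = ["very happy", "somewhat happy", "neutral", "sad", "fearful", "angry"]

def build_trials(seed=0):
    """Hypothetical reconstruction of the 120-trial list: for each of the six
    emotions, 10 matched trials (same expression on both faces) and 10
    mismatched trials (a different expression on the second face)."""
    rng = random.Random(seed)
    trials = []
    for emotion in EMOTIONS:
        for i in range(20):
            matched = i < 10
            second = emotion if matched else rng.choice(
                [e for e in EMOTIONS if e != emotion])
            trials.append({"first": emotion, "second": second, "match": matched})
    rng.shuffle(trials)  # randomize presentation order
    return trials
```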

2.3. EEG data acquisition Brainwave activity was recorded from 31 electrode sites using Brain Vision QuickAmps. Data were captured at 1000 Hz using an averaged reference and all impedances were kept below 10 kΩ. Eye blink activity was recorded using bipolar electrodes placed above and below the left eye and on each temple for offline eye-blink correction (Gratton et al., 1983). Only trials in which participants provided a correct response were analyzed further.
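For readers unfamiliar with the Gratton et al. (1983) procedure cited above, the following is a minimal sketch of the regression idea behind it: a propagation coefficient from the EOG to each EEG channel is estimated, and the scaled EOG is subtracted. It is illustrative only; the single-EOG-channel simplification, array shapes and function name are assumptions, not the recording pipeline actually used (which recorded both vertical and horizontal EOG).

```python
import numpy as np

def gratton_ocular_correction(eeg, eog):
    """Regression-based eye-blink correction (after Gratton et al., 1983).

    eeg : (n_channels, n_samples) array of EEG
    eog : (n_samples,) array of bipolar EOG
    Returns EEG with the EOG-propagated activity subtracted.
    """
    eog_c = eog - eog.mean()
    # Per-channel propagation factor b = cov(EEG, EOG) / var(EOG)
    b = (eeg - eeg.mean(axis=1, keepdims=True)) @ eog_c / (eog_c @ eog_c)
    return eeg - np.outer(b, eog_c)
```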


Fig. 1. The facial emotion memory task.

2.4. ERP amplitudes Data for ERP analysis were filtered using a 1–24 Hz (24 dB/octave) bandpass filter (Brain Vision LLC) and segmented into 500 ms epochs (100 ms pre-stimulus baseline + 400 ms post-stimulus) from the onset of the first facial stimulus. Artefacts greater than ±150 μV were excluded and data were then averaged and baseline correction was applied prior to a semi-automatic baseline-to-peak detection procedure. The ERPs of interest were analyzed at sites with clear morphology and consistent with current literature (Batty and Taylor, 2003; Liu et al., 2012; Rossion and Caharel, 2011). ERP amplitudes were then averaged to create clusters that were used in subsequent analyses: P100 and N170 left hemisphere (P7, TP7, and O1), P100 and N170 right hemisphere (P8, TP8, and O2), N250 left hemisphere (FP1, F3, and FC3) and N250 right hemisphere (FP2, F4, and FC4). Only correct trials were included in the clusters. Participants were quite accurate; there was an average of 17.49/20 correct trials per emotion (range from 16.3 to 18.2). Outliers were identified as those with amplitude values three times the interquartile range of the group for each cluster, and were excluded from further analyses.
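As an illustration of the epoch-level processing just described (artifact rejection at ±150 μV, baseline correction of the average, baseline-to-peak measurement, and the 3× interquartile-range outlier rule), here is a minimal sketch. The actual analysis used Brain Vision's semi-automatic routines; function names, the indexing convention, and the reading of the outlier rule (three interquartile ranges beyond the quartiles) are assumptions.

```python
import numpy as np

BASELINE = 100  # samples in the 100 ms pre-stimulus baseline (1 sample = 1 ms at 1000 Hz)

def average_erp(epochs):
    """epochs: (n_trials, 500) single-channel segments spanning -100..400 ms.
    Rejects epochs exceeding +/-150 uV, averages, and baseline-corrects."""
    keep = np.abs(epochs).max(axis=1) <= 150.0
    avg = epochs[keep].mean(axis=0)
    return avg - avg[:BASELINE].mean()

def peak_amplitude(erp, t_min_ms, t_max_ms, polarity):
    """Baseline-to-peak amplitude within a post-stimulus latency window."""
    window = erp[BASELINE + t_min_ms:BASELINE + t_max_ms]
    return window.max() if polarity > 0 else window.min()

def iqr_outlier(value, group):
    """One reading of the outlier rule: flag values more than three
    interquartile ranges beyond the group's quartiles."""
    q1, q3 = np.percentile(group, [25, 75])
    iqr = q3 - q1
    return value < q1 - 3 * iqr or value > q3 + 3 * iqr
```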

2.5. Delay interval theta power

Separate segmentation and data processing were performed for theta power analyses. Raw data were segmented into 6100 ms epochs (100 ms baseline) for each emotion condition. Baseline correction, ocular correction and artifact rejection procedures were the same as those for ERP analyses. A Morlet wavelet was run over the entire 6100 ms averaged waveform from 1 to 60 Hz in 60 steps (c = 6) and normalized with respect to sampling rate and baseline activity. Power between 4 and 7 Hz in three 500 ms bins during the delay interval was averaged and exported at the same electrode sites at which the ERPs were measured (bin 1 = 500–1000 ms post-stimulus onset, bin 2 = 1001–1500 ms post-stimulus onset, bin 3 = 1501–2000 ms post-stimulus onset).
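A minimal sketch of the Morlet wavelet power computation described above (c = 6, applied to the averaged waveform so that the result reflects evoked power). The paper's normalization "with respect to sampling rate and baseline activity" is not fully specified, so the sketch uses a common energy normalization; names and the bin-averaging example are assumptions.

```python
import numpy as np

def morlet_power(signal, fs, freqs, n_cycles=6):
    """Time-frequency power from convolution with complex Morlet wavelets
    (c = 6). `signal` is the averaged waveform, so the result is evoked power.
    Returns an (n_freqs, n_samples) array."""
    power = np.empty((len(freqs), signal.size))
    for i, f in enumerate(freqs):
        sigma_t = n_cycles / (2 * np.pi * f)          # temporal SD of the wavelet
        t = np.arange(-4 * sigma_t, 4 * sigma_t, 1 / fs)
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma_t**2))
        wavelet /= np.sqrt(sigma_t * np.sqrt(np.pi))  # energy normalization (one convention)
        power[i] = np.abs(np.convolve(signal, wavelet, mode="same")) ** 2
    return power

# e.g., average 4-7 Hz power, then average 500 ms delay bins; with a 100 ms
# baseline at 1000 Hz, 500-1000 ms post-onset corresponds to samples 600:1100:
# theta = morlet_power(avg_waveform, 1000, np.arange(4, 8)).mean(axis=0)
# bin1 = theta[600:1100].mean()
```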

Bins started at 500 ms in an effort to capture memory-related oscillations rather than the slow fronto-central activity, or late positive potential, reported in some studies as being modulated by emotional expression (Batty and Taylor, 2003; Hajcak et al., 2009). Clusters of theta power were created by averaging individual electrode sites: left occipital–parietal (TP7, P7, and O1), right occipital–parietal (TP8, P8, and O2), left frontal (FP1, F3, and FC3) and right frontal (FP2, F4, and FC4). Outliers were identified as those with amplitude values three times the interquartile range of the group for each cluster, and were excluded from further analyses.

2.6. Statistics

Repeated measures ANOVAs with within-subjects factors of emotion (positive [very happy and somewhat happy], neutral, and negative [sad, fearful, and angry]) and hemisphere (left versus right) were used to evaluate ERP amplitudes. For theta power analyses, a repeated measures ANOVA with within-subjects factors of emotion (positive, neutral, and negative), hemisphere (left, right), and time (500–1000, 1000–1500, 1500–2000 ms) was used. If sphericity was violated, Greenhouse–Geisser corrections were used to interpret the data. ERP and theta power analyses for each emotion separately are included in the Supplemental Data. Pearson product–moment correlation coefficients were used to examine the relationship between (1) ERP clusters and evoked theta power clusters for each time bin for all emotion categories, (2) ERP clusters and behavioural performance for all emotion categories and (3) evoked theta power clusters for each time bin and behavioural performance for all emotion categories. Finally, regression analyses for each emotion category, time bin, and hemisphere were used to test whether the relationship between theta power and behavioural performance is moderated by ERP amplitude. For each regression analysis, behavioural performance for a given emotional category was regressed on the corresponding N170 amplitude, theta power, and the N170 × theta power interaction term.
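A brief sketch of two of the statistical steps described above: the planned contrasts (paired-samples t-tests) and the Bonferroni-corrected Pearson correlations (e.g., .05/3 bins ≈ .016). Data layout and function names are assumed for illustration.

```python
import numpy as np
from scipy import stats

def planned_contrast(cond_a, cond_b):
    """Paired-samples t-test used as a planned contrast
    (e.g., negative vs. positive cluster amplitudes across subjects)."""
    return stats.ttest_rel(cond_a, cond_b)

def corrected_correlations(erp_amp, theta_by_bin, alpha=0.05):
    """Pearson correlations between an ERP cluster amplitude and theta power
    in each delay bin, flagged against a Bonferroni-adjusted threshold."""
    threshold = alpha / len(theta_by_bin)   # e.g., .05 / 3 bins ~= .0167
    return [(r, p, p < threshold)
            for r, p in (stats.pearsonr(erp_amp, power) for power in theta_by_bin)]
```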


Interaction terms were computed by multiplying the centered N170 amplitude and theta power variables (Aiken and West, 1991). In these analyses, a significant interaction term indicates moderation of the relationship between theta power and behavioural performance by N170 amplitude. All regression analyses were conducted using the Interaction!© software package.
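The moderated regression just described can be expressed compactly; the sketch below is a Python equivalent of the model (not the Interaction! package the authors used), with mean-centered predictors and their product term per Aiken and West (1991).

```python
import numpy as np
import statsmodels.api as sm

def moderation_model(accuracy, n170, theta):
    """Regress behavioural accuracy on N170 amplitude, theta power and their
    interaction. Predictors are mean-centered before forming the product term
    (Aiken and West, 1991); a significant interaction coefficient indicates
    that N170 amplitude moderates the theta-performance relationship."""
    n170_c = n170 - n170.mean()
    theta_c = theta - theta.mean()
    X = sm.add_constant(np.column_stack([n170_c, theta_c, n170_c * theta_c]))
    return sm.OLS(accuracy, X).fit()

# Example: model = moderation_model(acc, n170_amp, theta_bin1);
# model.pvalues[3] is the p-value of the interaction term.
```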

Fig. 2. Grand average waveforms for all facial emotion expressions at frontal (FP1, FP2, F3, F4, FC3, FC4) and occipital–parietal (TP7, TP8, P7, P8, O1, O2) sites. Note the different scales between frontal and parietal–occipital sites. Red = very happy, orange = somewhat happy, yellow = neutral, teal = sad, blue = fear, green = anger.

3. Results

3.1. Behavioural responses

Participants were asked to respond, via button press, whether the facial expression of the second stimulus was the same as or different from that of the first. Participants' mean performance across all emotions was 88.0% (SD = 0.043) and the average reaction time for correct responses across all emotions was 834 ms. A repeated measures ANOVA using a within-subjects factor of emotion revealed a main effect of emotion [F(2,50) = 9.68, p < .001]. Planned contrasts showed that participants responded significantly more accurately to neutral expressions compared to emotional stimuli (i.e., positive and negative), t(25) = 4.80, p < .001, and that accuracy was not significantly different between positive and negative emotions, t(25) = −0.144, p = .887.


Table 1
Mean (SD) ERP cluster amplitudes (μV) for facial emotion categories.

ERP     Hemisphere    Positive         Neutral          Negative
P100    Left          4.79 (2.54)      4.33 (2.17)      4.62 (1.87)
P100    Right         5.44 (2.43)      5.50 (2.55)      5.77 (2.77)
N170    Left          −5.16 (3.00)     −4.94 (3.59)     −5.81 (3.63)
N170    Right         −4.80 (3.76)     −4.54 (3.81)     −5.58 (3.60)
N250    Left          −4.18 (1.70)     −4.52 (2.49)     −4.49 (2.11)
N250    Right         −4.47 (1.89)     −4.85 (2.36)     −4.89 (2.29)

3.2. Event-related potentials

3.2.1. P100 ERP (Fig. 2)
A repeated measures ANOVA yielded a main effect of hemisphere [F(1,28) = 5.90, p = .022], indicating that P100 amplitudes were larger in the right than in the left hemisphere (Table 1, Fig. 3). Neither the main effect of emotion nor the emotion × hemisphere interaction reached significance.

3.2.2. N170 ERP
A repeated measures ANOVA revealed a main effect of emotion, F(2,56) = 5.39, p = .007 (Fig. 2). Planned contrasts revealed that N170 amplitude was larger (i.e., more negative) in response to emotional images (M = −5.41, SE = 0.57) than to neutral images (M = −4.74, SE = 0.61), t(28) = 2.49, p = .019. This analysis also revealed a significant increase in N170 amplitude in response to negative (M = −5.69, SE = 0.61) compared with positive (M = −4.98, SE = 0.55) images, t(28) = 2.41, p = .022 (Table 1, Fig. 3).

3.2.3. N250 ERP (Fig. 2)
A repeated measures ANOVA did not reveal a main effect of emotion or hemisphere (Table 1, Fig. 3).

Fig. 3. Bar graphs of ERP amplitudes for all facial emotion expressions. A. P100 cluster amplitudes (SE) in each hemisphere across emotion categories. B. Absolute value of N170 cluster amplitudes (SE) in each hemisphere across emotion categories. C. Absolute value of N250 cluster amplitudes (SE) in each hemisphere across emotion categories. SE = standard error.

3.3. Theta power

A repeated measures ANOVA with emotion (positive, neutral, and negative), hemisphere (left and right) and time (501–1000, 1001–1500, and 1501–2000 ms) of theta power measured over occipital–parietal sites revealed a main effect of emotion, F(2,48) = 3.31, p = .045, and a main effect of time, F(1.18,28.35) = 8.81, p < .001 (Table 2). Pair-wise comparisons indicated that theta power in the first time bin was significantly larger than in the second (p = .005) and third (p = .005) time bins, which did not significantly differ from each other (p = .815). Planned contrasts to investigate the main effect of emotion revealed that theta power was significantly larger in response to negative (M = 75.11, SE = 7.41) than to positive (M = 66.16, SE = 5.90) images, t(24) = 2.95, p = .007, but that theta power in response to the emotional (positive/negative) stimuli (M = 70.64, SE = 6.5) was not significantly different from the magnitude of the response to neutral stimuli (M = 68.65, SE = 6.32; Fig. 4).¹

¹ The pattern of ERP and theta power results did not change when the one left-handed participant was excluded.


Table 2
Mean (SD) theta power in μV² for facial emotion categories in all time bins.

Cluster                       0–500 ms           Bin 1              Bin 2              Bin 3
Positive
  Left occipital–parietal     353.59 (168.25)    79.73 (45.64)      72.93 (46.22)      71.74 (46.91)
  Right occipital–parietal    461.84 (269.65)    84.84 (56.45)      76.12 (45.03)      77.66 (53.74)
  Left frontal                144.01 (64.99)     72.13 (57.87)      53.07 (35.05)      48.31 (33.95)
  Right frontal               177.13 (102.98)    101.43 (113.91)    63.53 (46.47)      55.67 (42.43)
Neutral
  Left occipital–parietal     284.17 (137.53)    87.73 (63.79)      71.13 (46.32)      73.06 (46.29)
  Right occipital–parietal    392.61 (265.88)    87.77 (58.07)      74.64 (43.59)      75.35 (45.39)
  Left frontal                154.86 (90.34)     77.85 (67.89)      40.39 (25.24)      37.07 (19.25)
  Right frontal               152.99 (91.57)     67.56 (68.74)      45.01 (29.08)      38.77 (23.13)
Negative
  Left occipital–parietal     366.11 (185.57)    98.05 (60.67)      82.29 (62.56)      70.80 (45.44)
  Right occipital–parietal    479.66 (314.01)    101.05 (63.85)     82.57 (47.46)      78.36 (41.52)
  Left frontal                162.36 (93.07)     167.24 (375.71)    98.29 (197.36)     58.75 (45.04)
  Right frontal               176.67 (86.83)     136.51 (213.28)    68.97 (55.03)      59.65 (48.49)

Notes: Bins refer to delay interval times; Bin 1 = 500–1000 ms, Bin 2 = 1000–1500 ms, and Bin 3 = 1500–2000 ms.

A repeated measures ANOVA using emotion (positive, neutral, and negative), hemisphere (left and right) and time (501–1000, 1001–1500, and 1501–2000 ms) of theta power measured over frontal sites revealed a main effect of time, F(1.49,29.73) = 20.29, p < .001. Pair-wise comparisons indicated that theta power in the first bin was significantly larger than that in the second (p < .001) and third (p < .001) bins, and that theta power in the second bin was significantly larger than that in the third bin (p = .025). There was also a significant emotion × hemisphere × time interaction, F(4,80) = 3.27, p = .015. Follow-up ANOVAs revealed that for positive stimuli, a significant main effect of time, F(2,40) = 8.81, p = .001, indicated that power in the first time bin was larger than that in the second (p = .009) and third (p = .003) bins, which did not significantly differ from one another (p = .106). For neutral stimuli there was a main effect of time, F(1.41,29.59) = 9.63, p = .002, and a hemisphere × time interaction, F(1.33,28.04) = 3.86, p = .049, indicating that in the left hemisphere power in time bin 1 was larger than in bin 2 (p = .001) and bin 3 (p = .001), which did not significantly differ from one another (p = .198), but in the right hemisphere power in time bin 1 was larger than in bin 3 (p = .018), while power in bin 2 did not significantly differ from that in bin 1 (p = .077) or bin 3 (p = .090). Finally, in response to the negative stimuli, there was a marginally significant main effect of hemisphere, F(1,21) = 4.19, p = .053, indicating larger power in the right compared to the left hemisphere, and a main effect of time, F(1.48,31.02) = 13.14, p < .001. Pairwise comparisons indicated that power in bin 1 was significantly larger than in bins 2 (p = .001) and 3 (p = .001), which did not significantly differ from one another (p = .108).

A repeated measures ANOVA using emotion (positive, neutral, and negative) and hemisphere (left and right) of theta power measured over occipital–parietal sites for the 0–500 ms time bin was calculated separately from those of the later delay interval time bins. This analysis was run separately because this bin contained theta activity associated with the ERPs in response to the presentation of the first stimulus and therefore likely contains sensory-related activity.

Fig. 4. Occipital–parietal theta power (SE) for facial emotion categories in each time bin (0–500, 500–1000, 1000–1500, and 1500–2000 ms). Note the different scales between the 0–500 ms graph and all others. SE = standard error.

The ANOVA revealed a main effect of emotion, F(1.52,37.96) = 4.19, p = .032, and a main effect of hemisphere, F(1.00,25.00) = 7.96, p = .009. Pair-wise comparisons indicated that theta power was greater in the right than in the left hemisphere. Planned contrasts to investigate the main effect of emotion revealed that theta power was significantly larger in response to emotional (positive/negative) (M = 416.82, SE = 37.28) than to neutral (M = 338.39, SE = 35.67) images, t(25) = 3.85, p = .001, that theta power in response to positive stimuli (M = 407.71, SE = 37.96) was not significantly different from the magnitude of the response to negative stimuli (M = 422.87, SE = 43.33), t(25) = −0.42, p = .678, and that while the difference in theta power in response to positive and neutral stimuli was marginally significant, t(25) = 2.02, p = .054, the difference in theta power in response to negative and neutral stimuli was highly significant, t(25) = 4.16, p < .001 (Fig. 4). A repeated measures ANOVA on frontal sites using the 0–500 ms time bin did not reveal significant main effects or interactions.

3.4. ERP and power correlations

Correlations were calculated between P100, N170 and N250 amplitudes and theta power (4–7 Hz) within the same electrode clusters for each emotion category (Table 3). Significance values were Bonferroni corrected (to p < .016) for each correlation. As Table 3 indicates, N170 amplitudes were significantly correlated with evoked theta power across the delay interval for all emotion categories. All of the significant correlations with N170 amplitude were negative, indicating that smaller ERP amplitude values (representing larger, more negative N170 ERPs) were associated with greater evoked theta power. There were no significant correlations between P100 or N250 amplitude and theta power.

Table 3
p-Values for all significant Pearson correlations between ERP amplitude and theta power for emotion categories and time bins.

ERP cluster – theta cluster              Bin 1       Bin 2       Bin 3
Positive
  P100 left avg – left theta             ns          ns          ns
  P100 right avg – right theta           ns          ns          ns
  N170 left avg – left theta             ns          p = .005    ns
  N170 right avg – right theta           p = .002    p = .001    p = .002
  N250 left avg – left theta             ns          ns          ns
  N250 right avg – right theta           ns          ns          ns
Neutral
  P100 left avg – left theta             ns          ns          ns
  P100 right avg – right theta           ns          ns          ns
  N170 left avg – left theta             ns          ns          ns
  N170 right avg – right theta           p = .001    p = .001    ns
  N250 left avg – left theta             ns          ns          ns
  N250 right avg – right theta           ns          ns          ns
Negative
  P100 left avg – left theta             ns          ns          ns
  P100 right avg – right theta           ns          ns          ns
  N170 left avg – left theta             ns          ns          p = .003
  N170 right avg – right theta           p < .001    p < .001    p < .001
  N250 left avg – left theta             ns          ns          ns
  N250 right avg – right theta           ns          ns          ns

Notes: Bin 1 = 500–1000 ms, Bin 2 = 1000–1500 ms, and Bin 3 = 1500–2000 ms. Significance values were Bonferroni corrected to p < .016.

3.5. Predicting behaviour


Correlations were calculated between P100, N170 and N250 amplitudes and behavioural performance for each emotion category separately, and between theta power (4–7 Hz) and behavioural performance for each time bin and each emotion category separately. There were no statistically significant correlations (all p > .008, based on Bonferroni correction). Hierarchical linear regression analyses were performed to determine whether N170 ERP amplitude, theta power, and their interaction predicted behavioural performance for each emotion category (i.e. positive, neutral, negative). None of the models were significant using emotion categories, although the reader is referred to the Supplemental Data for the outcome of emotion-specific analyses.

4. Discussion

Several of our initial hypotheses were supported. First, we found that ERPs reflecting different stages of stimulus processing were differentially affected by emotional expressions. Only N170 amplitudes were modulated by emotion, with negative emotions producing larger amplitudes compared to positive or neutral expressions. In contrast, neither P100 nor N250 amplitude was modulated by emotion. Our second hypothesis was partially supported; theta power decreased over time across emotions, and was generally larger in response to emotional compared to neutral expressions, especially negative expressions. Our third hypothesis was supported in that N170 amplitudes were negatively correlated with theta power during the delay interval, while no such relationship existed for P100 and N250 (Table 3). Finally, N170 amplitude did not moderate the relationship between theta power and behavioural performance for the emotion categories (although see Supplemental Data).

4.1. Behavioural findings

Behavioural performance indicated that our participants were most accurate in response to neutral expressions, and did not differ in their accuracy between emotional (positive or negative) expressions. This is relatively consistent with several findings in the literature regarding facial expression identification. Calvo and Lundqvist (2008) presented emotionally expressive faces to sixty-three student participants in a free-viewing condition and found the most accurate identification of happy and neutral expressions and the least accurate identification of fearful expressions. Ebner et al. (2010) found the highest accuracy for happy faces, with the next highest accuracy for neutral expressions. Similarly, Palermo and Coltheart (2004) reported that participants identified happy faces most accurately, followed by neutral expressions, while fearful faces were identified with the least accuracy. Finally, the validation study for the NimStim facial stimuli used in the current study reported the highest accuracy for happy faces, followed by neutral expressions, while fearful and sad expressions were the least accurately identified (Tottenham et al., 2009). Our results follow a similar, although not exact, pattern of behavioural performance that indicates typical facial emotion identification.

4.2. ERP findings

As predicted, the P100 amplitude was not modulated by emotion. While the P100 seems to show larger amplitudes in response to faces compared to non-face stimuli (Herrmann et al., 2005a), and some studies have reported a negative/threat bias in response to fearful facial expressions (Luo et al., 2010; Vuilleumier and Pourtois, 2007; Williams et al., 2006), modulation of P100 amplitude across a variety of facial emotions is not typically reported (Batty and Taylor, 2003). Rossion and Caharel (2011) presented compelling evidence that the face-sensitivity of P100 amplitude can be accounted for by low-level visual cues.
Their findings are consistent with data indicating that the face-sensitive P100 disappears when black-and-white ("Mooney") faces are presented instead of photographs (George et al., 2005). Together, these findings indicate that the P100 ERP largely reflects sensory-related (bottom-up) processing that is not associated with the conscious perception of a face. Given the explicit focus on emotional


expression in this task, the lack of P100 modulation may suggest that low-level visual cues did not vary systematically across stimuli depicting different emotional expressions. In contrast, the N170 amplitude demonstrated modulation by emotion. While many studies investigating the N170 have not found it to be modulated by facial emotion, studies in which facial emotional expression was the explicit focus of the task have reported changes in N170 amplitude. Krombholz et al. (2007) used a semantic priming task to differentiate neural activity in response to line drawings of happy and angry faces. They reported larger N170 amplitudes in response to angry faces in both congruent and incongruent conditions. Similarly, Rellecke et al. (2012) examined the impact of intentionality on emotional face processing and also found that emotion modulated ERP amplitudes (N170 and the superimposed EPN) in tasks that required explicit processing of facial emotions. Together, these findings suggest that early structural processing of facial stimuli can be influenced by emotional expression when the task requires processing of emotional content. Larger amplitudes in response to negative emotions may be ecologically adaptive (Krombholz et al., 2007; Schupp et al., 2004). Facial expressions of anger and fear in particular may require action on the part of the observer, which may increase their salience over more positive or neutral expressions (Palermo and Rhodes, 2007; Schupp et al., 2004). The N170 amplitude modulation in response to emotional faces found in this study therefore has some support in the literature: explicit focus on facial emotional expression can lead to early modulation of ERPs, and this modulation is subject to modification by the potential importance or salience of the specific emotion being presented. However, the spatial distribution of the N170 reported in this

study (Fig. 5) implies that N170 amplitude may have been affected by the temporally and spatially overlapping EPN (Rellecke et al., 2013). As the emotion-sensitive topography of the EPN depends on the choice of reference, future studies that compare emotion modulation of N170 and EPN amplitudes using averaged versus linked-mastoid references (the latter not recorded during this paradigm) are needed to elucidate the possible impact of the EPN on N170 amplitude in this paradigm. Surprisingly, we did not find emotion modulation of the N250 ERP. Given that previous studies have indicated that the N250 reflects processing of unique facial characteristics associated with identity, including emotional expression and gender, we expected that emotion would modulate N250 amplitude. One possible explanation is that the unique demands of our task and the characteristics which the N250 is known to represent were confounded. Studies of facial learning indicate that the N250 ERP may represent facial learning in response to repeated facial stimuli (Kaufmann et al., 2009; Tanaka et al., 2006). In this context the N250r is thought to reflect the process by which previously seen structural information about faces is compared against currently viewed structural information (Haxby et al., 2000). While initial findings indicated that this process occurs independent of facial details (such as expression), a study by Schweinberger et al. (2002) showed that subtly different images of the same individual resulted in larger N250r amplitudes compared to repetitions of the identical picture. In the current paradigm, the same models were used to represent all six emotions, meaning that slightly different pictures of the same individuals were presented multiple times throughout the paradigm. Given that facial identity and emotional expression are processed by parallel but intersecting networks, it is possible that the emotion modulation

Fig. 5. Spatial distribution of activity at peak time points for P100, N170 and N250.


of the N250 ERP was not discernible in this paradigm because of its sensitivity to the repetition of individual models.

4.3. Theta power findings

As previously described, increased theta power has been linked to short-term memory in both humans and animals (Fujisawa and Buzsáki, 2011; Jensen and Tesche, 2002; Kahana, 2006; Klimesch et al., 2001; Raghavachari et al., 2001). However, the dissociation between short-term memory and attention in these (and other) studies remains unclear (Cowan, 2011). Studies that have attempted to dissociate the effects of attention on short-term memory processes report a substantial influence of attention on evoked theta activity. For example, Missonnier et al. (2006) found that evoked theta was increased for detection and short-term memory tasks compared to a passive viewing condition. They also reported that theta was greater early in the delay interval during the detection task, rather than increasing with working memory load, indicating that early theta may reflect attention allocation related to the presentation of target stimuli rather than the maintenance of information in working memory. Others have also reported evoked theta activity after stimulus onset that was not modulated by working memory demands, and was thought to reflect obligatory processing and continued stimulus evaluation (Deiber et al., 2007; Yordanova et al., 2002). These findings are similar to what we report here with the main effect of time at both occipital–parietal and frontal clusters, indicating that theta power was largest early in the delay interval across emotions. Given the findings that negative facial expressions are processed faster and induce a larger neurophysiological response, and that attention is preferentially allocated to stimuli of negative valence compared to positive or neutral valence, it is possible that the larger theta power in response to negative emotions reported here is an effect of preferential attention allocation associated with memory processes (Feldmann-Wüstefeld et al., 2011; Schaefer et al., 2011; Weymar et al., 2011). Balconi and Pozzoli (2009) also found increased theta power within 150–350 ms in response to emotional compared to neutral faces that predicted N200 amplitude. They interpreted these findings to indicate that theta power may reflect the orienting of attention toward the emotional significance of the stimulus. Combined with the literature reporting early increased evoked theta power in response to emotional compared to neutral stimuli, our findings indicate that negative facial expressions elicit enhanced processing early on, and that this augmentation continues for at least 500 ms post-stimulus onset. It is possible that early evoked theta activity reflects sustained attention towards, and continued obligatory processing of, stimuli that are particularly salient (González-Roldan et al., 2011; Knyazev et al., 2010). Previous speculation about the role of theta activity in memory tasks has suggested that theta plays a regulatory, executive role in the coordination of cognitive processes across distributed networks and that it is inherently involved in top-down and bottom-up integration (Sauseng et al., 2010).
Given our findings that facial expressions are particularly salient, as reflected by emotion modulation of early ERPs, and that evoked theta activity may reflect attention allocation associated with continued processing and memory for these stimuli, we hypothesized that the junction between ERPs modulated by emotion and early evoked theta activity may reflect an important integration between top-down cognitive processes and bottom-up stimulus features. We found highly significant correlations between N170 amplitude and theta power throughout the delay interval across all emotions (Table 3). These relationships were not found between the P100 ERP and theta power, or between the N250 and theta power. These findings are consistent with those of Todd et al. (2012b), who found enhanced P200 amplitude in response to emotional stimuli that was localized to the lateral occipital complex. The timing of their results (approximately 200 ms post-stimulus onset) and ours (approximately 170 ms post-stimulus onset) is consistent with the extraction of content from the image that is not entirely sensory-driven (bottom-up) and not so late as to be entirely


cognitively driven (top-down). Given the latency of these effects within the encoding stream, and their correlation with later evoked theta activity, it is possible that these findings represent integration between bottom-up and top-down processes. When decomposed into frequency components, the N170 ERP itself encompasses the theta range and spans between 0 and 10 Hz (Tang et al., 2008). However, these authors also demonstrated that the dominant frequency shifts to a lower range, 0–4 Hz, at approximately 350 ms post-stimulus. Therefore, whether the increased evoked theta power early in the delay interval reported here (500–1000 ms) represents a continuation of the oscillatory activity associated with the N170 ERP, or is a distinct phenomenon potentially reflecting separate attention mechanisms required for memory, remains unclear. Our findings are also consistent with reports of neural network oscillations during face perception. Furl et al. (2013) reported increased theta power coupling between the occipital face area and the superior temporal sulcus, particularly in response to fearful faces. These structures comprise the dorsal stream of face processing thought to subserve facial emotion processing. This theta coupling, along with theta cross-frequency suppression of beta activity, was not found in the ventral stream (i.e. occipital face area and fusiform face area), which is thought to subserve facial identity processing. These findings were interpreted to represent independent, hierarchical feedforward networks associated with facial processing. As our task required individuals to focus on the emotional expression rather than the identity of the faces, the increased sustained theta power reported here may represent activation of this dorsal face processing network. The more consistent effects of emotion on occipital–parietal compared to frontal theta power partially support this notion. Furthermore, phase-amplitude cross-frequency coupling has been associated with memory maintenance (Friese et al., 2012; Kaplan et al., 2014; Lisman and Idiart, 1995). Mizuhara and Yamaguchi (2011) found that frontal–central theta oscillations were associated with the rehearsal of sensory stimuli represented by phase-coupled gamma activity. While amplitude- and phase-coupling analyses within and across frequencies are beyond the scope of this study, it is likely that dynamic neural coupling comprises the functional networks necessary for facial emotion processing and attention-modulated maintenance of facial emotion stimuli in this task (Başar et al., 2006; Luo et al., 2014; Vuilleumier and Pourtois, 2007). Future studies investigating the relationship between sustained theta activity and higher frequency activity in this or similar tasks are warranted.

4.4. Study limitations

There are several limitations in this study. First, it is possible that this paradigm, which uses different models within a trial, inadvertently encourages the translation of the visual expression into a verbal category. Since participants know that no individuals will be the same within a trial, they may encode the verbal representation for "happy" instead of a visual representation of a smile during the encoding period. Similarly, there was no passive viewing control task against which to compare delay interval activity, so the effects of the motor responses required on every trial cannot be dissociated from the theta response.
However, the main purpose of this study was to investigate the impact of facial emotion on initial ERPs and subsequent delay interval activity. With this in mind, the visual-to-verbal translation and motor preparatory effects possibly embedded in the delay interval activity are held constant across all emotion conditions. For example, we have no reason to believe that a visual-to-verbal translation would occur for positive expressions but not for negative expressions. Furthermore, Missonnier et al. (2006) found no difference in evoked theta power between trials in which participants made a motor response and those in which they did not. Therefore, we do not think that these issues affected the emotion-specific findings reported above. A second limitation is that we did not modify the stimuli to control for low-level visual cues such as color, luminance and contrast. However, the P100 ERP, which is


known to be sensitive to these low-level visual cues, was not modulated by emotion. Therefore, we do not think that low-level visual cues can explain the emotion-related results reported here. A third limitation is that we could not investigate Dm (difference in memory) effects because of high accuracy rates. Subjects performed very well on this task, making ERPs to the very few incorrect trials unreliable and unsuitable for further analyses.

4.5. Conclusions

In summary, with respect to the first two hypotheses of the study, we found that evoked activity within the timeframe of the N170 ERP was modulated by facial emotion, and that sustained theta activity was similarly modulated by facial emotional expression. Evoked theta activity decreased throughout the memory delay interval and may represent continued attentional focus on particularly salient (i.e. negative) emotional stimuli required for short-term memory. With respect to the third hypothesis, we found large correlations between sensory-related N170 ERP amplitudes and sustained theta activity across the delay interval, despite these two measures being temporally distinct. Finally, we did not find that the interaction between face-sensitive ERPs and sustained evoked theta power affects behavioural performance (although see Supplemental Data). To our knowledge, this is the first study to use a delayed match-to-sample paradigm of explicit facial emotion recognition, which allowed for the investigation of delay interval activity and its relationship with early sensory processing. Given the functional importance of facial emotion encoding and maintenance during social interactions, our finding that early, enhanced neural responsiveness to emotional expressions is correlated with sustained attention-related activity is an important and novel contribution. Further investigation of these relationships with regard to induced activity and phase coupling with other frequency bands (gamma and beta bands in particular) is warranted.

Acknowledgments

The authors would like to dedicate this manuscript to Sara Grant-Weaver, who brought a smile to everyone she met. This project was partially supported by the Natural Sciences and Engineering Research Council (C.A.B.).

Appendix A. Supplementary data

Supplementary data to this article can be found online at http://dx.doi.org/10.1016/j.ijpsycho.2014.06.006.

References

Adolphs, R., 2003. Cognitive neuroscience of human social behaviour. Nat. Rev. Neurosci. 4, 165–178. http://dx.doi.org/10.1038/nrn1056.
Adolphs, R., Tranel, D., Denburg, N., 2000. Impaired emotional declarative memory following unilateral amygdala damage. Learn. Mem. 7, 180–186.
Aftanas, L.I., Varlamov, A.A., Pavlov, S.V., Makhnev, V.P., Reva, N.V., 2001. Affective picture processing: event-related synchronization within individually defined human theta band is modulated by valence dimension. Neurosci. Lett. 303, 115–118.
Aiken, L.S., West, S.G., 1991. Multiple Regression: Testing and Interpreting Interactions. Sage, Newbury Park, CA.
Balconi, M., Lucchiari, C., 2006. EEG correlates (event-related desynchronization) of emotional face elaboration: a temporal analysis. Neurosci. Lett. 392, 118–123. http://dx.doi.org/10.1016/j.neulet.2005.09.004.
Balconi, M., Pozzoli, U., 2009. Arousal effect on emotional face comprehension: frequency band changes in different time intervals. Physiol. Behav. 97, 455–462. http://dx.doi.org/10.1016/j.physbeh.2009.03.023.