SAFETYLIT WEEKLY UPDATE


Journal Article

Citation

Hirai M, Watanabe S, Honda Y, Miki K, Kakigi R. Brain Res. Bull. 2008; 77(5): 264-273.

Affiliation

Department of Integrative Physiology, National Institute for Physiological Sciences, Myodaiji, Okazaki 444-8585, Japan; Japan Society for the Promotion of Science, Japan.

Copyright

(Copyright © 2008, Elsevier Publishing)

DOI

10.1016/j.brainresbull.2008.08.011

PMID

18793705

Abstract

To understand the processing of facial expressions in terms of social communication, it is important to clarify how that processing is influenced by environmental stimuli such as natural scenes or objects. We investigated how and when neural responses to facial expressions were modulated by a natural scene and by an object containing emotional information. A facial expression stimulus (fearful/neutral) was presented after a scene or object stimulus (fearful/neutral), and event-related potentials were recorded from the onset of both the scene/object and the facial expression presentations. As in previous studies, during the presentation of the scenes and objects, positive-going waves at around 200-500 ms were observed for unpleasant visual stimuli at the Pz and Cz electrodes when the stimuli were intact; no such response was observed when the stimuli were scrambled. During the subsequent facial expression presentation period, although no significant interaction between the contextual information and the facial expression was found in the N170 component, a significant interaction was observed in the P2 component: the P2 amplitude in the fearful-cue condition was significantly larger than that in the neutral-cue condition when the face was fearful, and the P2 amplitude for the neutral face was significantly larger than that for the fearful face when the preceding stimulus was neutral. These findings show that an adjacent, non-face stimulus containing emotional information influences the subsequent processing of facial expressions up to 260 ms, even when the two stimulus categories differ.



Language: en
