PMID- 25429141
OWN - NLM
STAT- MEDLINE
DCOM- 20150126
LR  - 20220309
IS  - 1529-2401 (Electronic)
IS  - 0270-6474 (Print)
IS  - 0270-6474 (Linking)
VI  - 34
IP  - 48
DP  - 2014 Nov 26
TI  - A common neural code for perceived and inferred emotion.
PG  - 15997-6008
LID - 10.1523/JNEUROSCI.1676-14.2014 [doi]
AB  - Although the emotions of other people can often be perceived from overt reactions (e.g., facial or vocal expressions), they can also be inferred from situational information in the absence of observable expressions. How does the human brain make use of these diverse forms of evidence to generate a common representation of a target's emotional state? In the present research, we identify neural patterns that correspond to emotions inferred from contextual information and find that these patterns generalize across different cues from which an emotion can be attributed. Specifically, we use functional neuroimaging to measure neural responses to dynamic facial expressions with positive and negative valence and to short animations in which the valence of a character's emotion could be identified only from the situation. Using multivoxel pattern analysis, we test for regions that contain information about the target's emotional state, identifying representations specific to a single stimulus type and representations that generalize across stimulus types. In regions of medial prefrontal cortex (MPFC), a classifier trained to discriminate emotional valence for one stimulus (e.g., animated situations) could successfully discriminate valence for the remaining stimulus (e.g., facial expressions), indicating a representation of valence that abstracts away from perceptual features and generalizes across different forms of evidence. Moreover, in a subregion of MPFC, this neural representation generalized to trials involving subjectively experienced emotional events, suggesting partial overlap in neural responses to attributed and experienced emotions. These data provide a step toward understanding how the brain transforms stimulus-bound inputs into abstract representations of emotion.
CI  - Copyright (c) 2014 the authors 0270-6474/14/3415997-12$15.00/0.
FAU - Skerry, Amy E
AU  - Skerry AE
AD  - Department of Psychology, Harvard University, Cambridge, Massachusetts 02138, and amy.skerry@gmail.com.
FAU - Saxe, Rebecca
AU  - Saxe R
AD  - Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139.
LA  - eng
GR  - R01 MH096914/MH/NIMH NIH HHS/United States
GR  - 1R01 MH096914-01A1/MH/NIMH NIH HHS/United States
PT  - Journal Article
PT  - Randomized Controlled Trial
PT  - Research Support, N.I.H., Extramural
PT  - Research Support, U.S. Gov't, Non-P.H.S.
PL  - United States
TA  - J Neurosci
JT  - The Journal of neuroscience : the official journal of the Society for Neuroscience
JID - 8102140
SB  - IM
MH  - Adult
MH  - Brain/*physiology
MH  - Emotions/*physiology
MH  - Female
MH  - Humans
MH  - Magnetic Resonance Imaging/methods
MH  - Male
MH  - Nerve Net/*physiology
MH  - Photic Stimulation/*methods
MH  - Pilot Projects
MH  - Psychomotor Performance/*physiology
MH  - Young Adult
PMC - PMC4244468
OTO - NOTNLM
OT  - abstraction
OT  - concepts
OT  - emotion attribution
OT  - multimodal
OT  - social cognition
OT  - theory of mind
EDAT- 2014/11/28 06:00
MHDA- 2015/01/27 06:00
PMCR- 2015/05/26
CRDT- 2014/11/28 06:00
PHST- 2014/11/28 06:00 [entrez]
PHST- 2014/11/28 06:00 [pubmed]
PHST- 2015/01/27 06:00 [medline]
PHST- 2015/05/26 00:00 [pmc-release]
AID - 34/48/15997 [pii]
AID - 1676-14 [pii]
AID - 10.1523/JNEUROSCI.1676-14.2014 [doi]
PST - ppublish
SO  - J Neurosci. 2014 Nov 26;34(48):15997-6008. doi: 10.1523/JNEUROSCI.1676-14.2014.