Feeling strong emotions synchronizes people’s brains

May 30, 2012


Valence (pleasant–unpleasant) modulates intersubject correlations in the default-mode network (yellow), while arousal modulates them in the visual network (violet) and dorsal attention network (green). The figure shows brain regions where intersubject correlations were statistically significantly correlated with self-reported valence (blue to turquoise) and arousal (red to yellow) scores during viewing of film clips. (Credit: L. Nummenmaa/PNAS)

Experiencing strong emotions synchronizes brain activity across individuals, research at Aalto University and Turku PET Centre has revealed.

The results revealed that feeling strong unpleasant emotions, in particular, synchronized the brain's emotion-processing networks in frontal and midline regions. By contrast, experiencing highly arousing events synchronized activity in the networks supporting vision, attention, and the sense of touch.

Experimental design for fMRI (Upper) and subjective emotional ratings (Lower). Participants watched short movie clips depicting pleasant, unpleasant, and neutral events. Each movie was preceded by a 5-s presentation of a fixation cross and followed by a 15-s presentation of text that described the general context of the upcoming movie without revealing details of its actual events. After fMRI, the participants watched the movies again and rated their moment-to-moment experiences of valence (pleasantness–unpleasantness) and arousal. (Credit: L. Nummenmaa/PNAS)

During movie viewing, participants’ brain activity was synchronized in lower- and higher-order sensory areas and in corticolimbic emotion circuits. Negative valence was associated with increased intersubject correlations (ISC) in the emotion-processing network (thalamus, ventral striatum, insula) and in the default-mode network (precuneus, temporoparietal junction, medial prefrontal cortex, posterior superior temporal sulcus).
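The core measure here, intersubject correlation (ISC), quantifies how similarly a given brain region's activity unfolds over time across people watching the same stimulus. A minimal sketch of a pairwise ISC computation follows; the function name and the toy "subjects" (a shared sinusoidal signal plus independent noise) are illustrative assumptions, not the study's actual pipeline, which analyzed real BOLD time series voxel by voxel.

```python
import numpy as np

def intersubject_correlation(data):
    """Pairwise intersubject correlation (ISC) for one voxel/region.

    data: array of shape (n_subjects, n_timepoints) holding each
    subject's time series for the same voxel or region.
    Returns the mean Pearson correlation over all subject pairs.
    """
    n = data.shape[0]
    # Correlation matrix across subjects (rows = subjects).
    r = np.corrcoef(data)
    # Average the upper triangle so each pair is counted once.
    iu = np.triu_indices(n, k=1)
    return r[iu].mean()

# Toy demo: three "subjects" viewing the same clip share a
# stimulus-driven signal, plus independent noise per subject.
rng = np.random.default_rng(0)
shared = np.sin(np.linspace(0, 8 * np.pi, 200))
subjects = np.stack(
    [shared + 0.5 * rng.standard_normal(200) for _ in range(3)]
)
isc = intersubject_correlation(subjects)  # well above zero here
```

In the study, such ISC values were then related to the viewers' moment-to-moment valence and arousal ratings, which is how valence- and arousal-linked networks were identified.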

High arousal was associated with increased ISC in the somatosensory cortices and visual and dorsal attention networks comprising the visual cortex, bilateral intraparietal sulci, and frontal eye fields.

Sharing others' emotional states provides observers with a somatosensory and neural framework that facilitates understanding of others' intentions and actions and allows them to "tune in" or "sync" with those around them.

Such automatic tuning facilitates social interaction and group processes, says Adjunct Professor Lauri Nummenmaa from Aalto University.

The results have major implications for current neural models of human emotions and group behavior, and also deepen our understanding of mental disorders involving abnormal socio-emotional processing, Nummenmaa says.

The project was supported by the Academy of Finland and Aalto University (aivoAALTO-project).

Ref.: Lauri Nummenmaa et al., Emotions promote social interaction by synchronizing brain activity across individuals, PNAS, 2012, DOI: 10.1073/pnas.1206095109 (open access)