Integrating multiple sensory modalities during dyadic interactions drives self-other differentiation at the behavioral and electrocortical level


Abstract

Interpersonal motor interactions represent a key setting for processing signals from multiple sensory channels simultaneously, potentially modulating cross-modal multisensory integration, a crucial perceptual mechanism by which different sources of sensory information are combined into a single percept. Here we explored whether integrating sensorimotor signals while interacting with a partner can lead to shared sensorimotor representations and, possibly, to a recalibration of individuals' multisensory perception. In particular, we investigated whether engaging individuals in dyadic activities relying on either a single sensory modality (e.g., visual or tactile/proprioceptive) or combined modalities (e.g., visuo-tactile/proprioceptive) would affect the behavioral and electrocortical markers associated with interpersonal cross-modal integration. We show that interactions requiring the integration of multiple sensory modalities lead to greater interpersonal differentiation, resulting in reduced interpersonal cross-modal integration in a subsequent spatial detection task and in altered distributed neural representations of that integration. Specifically, the neural patterns elicited by interpersonal visuo-tactile stimuli vary with the sensory nature of the preceding interpersonal interaction: interactions involving multiple sensory modalities yield improved performance of a neural classifier. These findings suggest new avenues for sensorimotor approaches in social neuroscience, emphasizing the malleability of self-other representations depending on the nature of interpersonal interactions.

SIGNIFICANCE STATEMENT

Successful social interaction relies on the dynamic integration of sensory and motor signals between individuals. Here, we show that the sensory channels engaged during interpersonal coordination recalibrate how the brain subsequently processes multisensory information about the self and others. Specifically, interactions involving integrated visual and proprioceptive feedback sharpen self-other distinction and reduce cross-modal interference. These effects manifest in both behavioral responses and early neural activity patterns, revealing plastic, offline changes in multisensory integration shaped by social sensorimotor contingencies. Our findings highlight the role of interpersonal experience in shaping functional responses to multisensory stimuli and provide novel insights into the sensorimotor foundations of social embodied cognition.
