Facial Muscle Mapping and Expression Prediction using a Conformal Surface-Electromyography Platform
Abstract
Facial muscles are unique in their attachment to the skin, dense innervation, and complex co-activation patterns, enabling fine motor control in various physiological tasks. Facial surface electromyography (sEMG) is a valuable tool for assessing muscle function, yet traditional setups remain restrictive, requiring meticulous electrode placement and limiting mobility due to susceptibility to mechanical artifacts. Additionally, sEMG signal extraction is hindered by noise and cross-talk from adjacent muscles. Owing to these limitations, associating facial muscle activity with facial expressions has been challenging. Here, we leverage a novel 16-channel conformal sEMG system to extract meaningful electrophysiological data. By applying denoising and source separation techniques, we decomposed data from 32 healthy participants into independent sources and clustered them by spatial distribution to create a facial muscle atlas. Furthermore, we established a functional mapping between these clusters and specific muscle units, providing a comprehensive framework for understanding facial muscle activation patterns. Using this foundation, we demonstrated a participant-specific deep-learning model capable of predicting facial expressions from sEMG signals. This approach opens new avenues for facial muscle monitoring, with potential applications in rehabilitation and in medical and psychological settings, where a precise understanding of facial muscle function is crucial.
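The abstract outlines a processing pipeline of denoising, source separation, and spatial clustering of multi-channel facial sEMG. The following is a minimal sketch of that kind of pipeline, not the authors' implementation: the sampling rate, filter cut-offs, number of extracted sources, number of clusters, and the specific algorithms (band-pass/notch filtering, FastICA, k-means) are illustrative assumptions rather than values or methods reported in the paper.

```python
# Sketch of a denoise -> source-separate -> spatially cluster pipeline for
# 16-channel facial sEMG. All parameters below are assumptions for illustration.
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch
from sklearn.decomposition import FastICA
from sklearn.cluster import KMeans

FS = 2000              # assumed sampling rate (Hz)
N_CHANNELS = 16        # channels of the conformal electrode array
N_SOURCES = 10         # assumed number of independent sources to extract
N_CLUSTERS = 6         # assumed number of spatial clusters ("muscle units")


def denoise(emg: np.ndarray) -> np.ndarray:
    """Band-pass (20-450 Hz) plus 50 Hz notch filtering, applied per channel."""
    b, a = butter(4, [20, 450], btype="bandpass", fs=FS)
    filtered = filtfilt(b, a, emg, axis=1)
    bn, an = iirnotch(50.0, Q=30.0, fs=FS)
    return filtfilt(bn, an, filtered, axis=1)


def separate_sources(emg: np.ndarray):
    """Unmix channels into independent sources; return sources and spatial maps."""
    ica = FastICA(n_components=N_SOURCES, random_state=0)
    sources = ica.fit_transform(emg.T).T      # (n_sources, n_samples)
    spatial_maps = ica.mixing_                # (n_channels, n_sources)
    return sources, spatial_maps


def cluster_spatial_maps(spatial_maps: np.ndarray) -> np.ndarray:
    """Group sources with similar electrode-space footprints (an 'atlas'-like step)."""
    maps = spatial_maps.T
    maps = maps / np.linalg.norm(maps, axis=1, keepdims=True)
    return KMeans(n_clusters=N_CLUSTERS, n_init=10, random_state=0).fit_predict(maps)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    raw = rng.standard_normal((N_CHANNELS, 10 * FS))   # placeholder recording
    clean = denoise(raw)
    sources, maps = separate_sources(clean)
    labels = cluster_spatial_maps(maps)
    print("cluster label per source:", labels)
```

In a real recording, the cluster labels would index groups of sources with similar spatial distributions over the electrode array, which is the kind of grouping the abstract associates with specific muscle units; the downstream participant-specific expression classifier is not sketched here.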