Human-Robot Variable Impedance Skill Transfer Learning Based on Dynamic Movement Primitives and Vision System


Abstract

To enhance robotic adaptability in dynamic environments, this study proposes a multimodal framework for skill transfer. The framework integrates vision-based kinesthetic teaching with surface electromyography (sEMG) signals to estimate human impedance. We establish a Cartesian-space model of upper-limb stiffness, linearly mapping sEMG signals to endpoint stiffness. For flexible task execution, dynamic movement primitives (DMPs) generalize learned skills across varying scenarios. An adaptive admittance controller, incorporating sEMG-modulated stiffness, is developed and validated on a UR5 robot. Experiments involving elastic band stretching demonstrate that the system successfully transfers human impedance characteristics to the robot, enhancing stability, environmental adaptability, and safety during physical interaction.
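The dynamic movement primitives mentioned in the abstract can be illustrated with a minimal sketch. The snippet below implements a standard 1-D discrete DMP (spring-damper transformation system plus a phase-driven forcing term learned by locally weighted regression) and shows how a demonstrated trajectory generalizes to a new goal. All gains, basis counts, and class names here are illustrative assumptions, not values or code from the paper.

```python
import numpy as np

class DMP1D:
    """Minimal 1-D discrete Dynamic Movement Primitive (illustrative sketch)."""

    def __init__(self, n_basis=30, alpha_z=25.0, alpha_s=4.0):
        self.alpha_z = alpha_z
        self.beta_z = alpha_z / 4.0                 # critically damped spring-damper
        self.alpha_s = alpha_s                      # canonical-system decay rate
        # Gaussian basis centers spaced along the phase variable s in (0, 1]
        self.c = np.exp(-alpha_s * np.linspace(0.0, 1.0, n_basis))
        self.h = n_basis / self.c                   # heuristic basis widths
        self.w = np.zeros(n_basis)

    def _psi(self, s):
        return np.exp(-self.h * (s - self.c) ** 2)

    def fit(self, y, dt):
        """Learn forcing-term weights from one demonstrated trajectory y[t]."""
        T = len(y)
        self.tau = (T - 1) * dt
        self.y0, self.g = y[0], y[-1]
        yd = np.gradient(y, dt)
        ydd = np.gradient(yd, dt)
        t = np.arange(T) * dt
        s = np.exp(-self.alpha_s * t / self.tau)    # canonical phase trajectory
        # Forcing term the demonstration implies for the transformation system
        f_target = (self.tau ** 2 * ydd
                    - self.alpha_z * (self.beta_z * (self.g - y) - self.tau * yd))
        xi = s * (self.g - self.y0)                 # phase- and goal-scaled input
        psi = np.array([self._psi(si) for si in s]) # (T, n_basis) activations
        # Locally weighted regression: one scalar weight per basis function
        self.w = (psi.T @ (xi * f_target)) / (psi.T @ (xi ** 2) + 1e-10)
        return self

    def rollout(self, g=None, dt=0.01):
        """Integrate the DMP toward goal g (defaults to the demonstrated goal)."""
        g = self.g if g is None else g
        y, v, s = self.y0, 0.0, 1.0
        traj = [y]
        for _ in range(int(self.tau / dt)):
            psi = self._psi(s)
            f = s * (g - self.y0) * (psi @ self.w) / (psi.sum() + 1e-10)
            vd = (self.alpha_z * (self.beta_z * (g - y) - v) + f) / self.tau
            v += vd * dt
            y += (v / self.tau) * dt
            s += (-self.alpha_s * s / self.tau) * dt
            traj.append(y)
        return np.array(traj)
```

Usage: fitting on a minimum-jerk demonstration from 0 to 1 and rolling out with `g=2.0` reproduces the demonstrated velocity profile scaled to the new goal, which is the generalization property the framework relies on when adapting taught skills to varying scenarios.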
