Simulating Infant First-Person Sensorimotor Experience via Motion Retargeting from Babies to Humanoids
Using robots to recreate what babies actually feel and sense while moving
Researchers developed a method to transfer infant movements from videos onto humanoid robots and virtual body models, recreating not just the motion but also the sensory feedback that babies experience: touch, muscle sense (proprioception), and visual input. The technique reconstructs a baby's full 3D body pose from a single video, then retargets those movements onto different robot platforms with sub-centimeter accuracy, generating realistic streams of multimodal sensory data.
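To make the pipeline concrete, here is a minimal sketch of the pose-to-retargeting-to-sensors idea, with the per-frame 3D pose estimator stubbed out by random keypoints. Every function, keypoint, and joint name here is hypothetical and illustrative; this is not the paper's actual implementation.

```python
import numpy as np

# Hypothetical keypoint set for a monocular 3D pose estimator (stubbed below);
# the names and skeleton structure are illustrative, not the paper's model.
KEYPOINTS = ["hip", "knee", "ankle", "shoulder", "elbow", "wrist"]

def estimate_infant_pose(frame_index):
    """Stub for per-frame 3D pose estimation from a single video.
    A real system would run a learned model on the image instead."""
    rng = np.random.default_rng(frame_index)
    return {k: rng.normal(scale=0.1, size=3) for k in KEYPOINTS}

def retarget_to_joint_angles(pose, joint_map):
    """Toy retargeting step: turn keypoint positions into joint angles by
    measuring the angle between adjacent limb segments. A real retargeter
    would also respect the robot's joint limits and differing limb lengths."""
    angles = {}
    for joint, (parent, child) in joint_map.items():
        v1 = pose[parent] - pose[joint]
        v2 = pose[child] - pose[joint]
        cos = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
        angles[joint] = float(np.arccos(np.clip(cos, -1.0, 1.0)))
    return angles

def sensory_record(pose, angles, floor_z=-0.05):
    """Bundle a first-person sensory sample: proprioception (joint angles)
    plus a crude touch signal (ankle near an assumed floor plane)."""
    return {
        "proprioception": angles,
        "touch_floor_contact": bool(pose["ankle"][2] < floor_z),
    }

if __name__ == "__main__":
    joint_map = {"knee": ("hip", "ankle"), "elbow": ("shoulder", "wrist")}
    for t in range(3):  # pretend the video has three frames
        pose = estimate_infant_pose(t)
        angles = retarget_to_joint_angles(pose, joint_map)
        print(t, sensory_record(pose, angles))
```

In the actual method, the retargeting stage would additionally account for the large difference in body proportions between an infant and a humanoid platform, and the sensory streams would come from the robot's or simulator's real touch, joint, and camera sensors rather than the placeholder heuristics used above.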
Scientists can now study how babies develop motor skills from the inside, through what a robot's sensors register as it reproduces the movement, rather than only by watching from the outside. This opens new ways to detect early signs of developmental disorders, helps roboticists design machines that learn more like humans do, and gives developmental psychologists direct access to the sensory world of infancy, something previously impossible to measure or replicate.