Automating contact perception and embedding human-derived motion strategies can enable humanoid robots to operate safely in complex, unstructured environments, unlocking new commercial and societal applications.
The IROS 2025 keynote by Eiichi Yoshida examined how contact‑rich human motions can be harvested to advance humanoid robot mechanisms and control. Yoshida traced the evolution from a handful of humanoid platforms in 2022 to a burgeoning ecosystem of commercial robots, emphasizing that mastering multi‑contact locomotion and manipulation remains the next frontier.
He outlined a hybrid research agenda that blends model‑based planning with data‑driven learning. By instrumenting humans with tactile skins from the Technical University of Munich, the team captures whole‑body force distributions, maps them onto standard human shape models, and uses inverse optimal control to infer the latent cost functions underlying human motion. A self‑attention BQVA network trained on motion‑capture datasets reconstructs ground‑reaction forces and automatically annotates foot‑ground contact states, dramatically reducing the manual labeling effort.
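The talk did not specify how reconstructed ground‑reaction forces are turned into contact labels; a minimal illustrative sketch, assuming a simple per‑foot force threshold with hysteresis (the function name and thresholds are hypothetical, not from the keynote):

```python
def annotate_contacts(grf, on_threshold=30.0, off_threshold=10.0):
    """Label foot-ground contact from a sequence of vertical
    ground-reaction force samples (newtons).

    Uses hysteresis: contact switches ON above on_threshold and OFF
    only below off_threshold, suppressing flicker around a single cutoff.
    """
    labels = []
    in_contact = False
    for f in grf:
        if not in_contact and f > on_threshold:
            in_contact = True
        elif in_contact and f < off_threshold:
            in_contact = False
        labels.append(in_contact)
    return labels

# Example: swing -> stance -> swing, with a noisy sample (20 N) during unloading
print(annotate_contacts([0.0, 5.0, 40.0, 200.0, 20.0, 8.0, 0.0]))
# [False, False, True, True, True, False, False]
```

A learned annotator like the one described in the talk would replace this threshold rule, but the hysteresis idea illustrates why automatic labeling beats frame‑by‑frame manual annotation for long motion‑capture sequences.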
Demonstrations included a robot maintaining balance on a narrow beam using full‑body tactile feedback, and a visual‑tactile transformer that combines RGB images with skin data to imitate fragile‑object manipulation from a few tele‑operation demos. The system can predict contact forces and adjust grip softness even for unseen objects, showcasing the power of integrated tactile sensing and imitation learning.
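The keynote did not detail the grip‑adjustment law; one plausible sketch, assuming the controller backs off grip force proportionally when the predicted contact force exceeds a per‑object fragility limit (function name, gain, and limits are illustrative assumptions):

```python
def soften_grip(current_grip, predicted_force, fragile_limit, gain=0.5):
    """Reduce the commanded grip force (newtons) when the predicted
    contact force exceeds a per-object fragility limit; otherwise
    hold the current grip unchanged."""
    excess = predicted_force - fragile_limit
    if excess > 0:
        # Proportional backoff, clamped so the grip never goes negative.
        return max(0.0, current_grip - gain * excess)
    return current_grip

# Fragile object: predicted force 12 N exceeds an 8 N limit,
# so a 10 N grip is softened to 8 N.
print(soften_grip(10.0, 12.0, 8.0))  # 8.0
```

For unseen objects, the fragility limit itself would have to come from the visual‑tactile model's prediction rather than a lookup table; the controller above only shows how such a prediction could close the loop on grip force.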
The work signals a shift toward robots that navigate tight spaces, handle delicate items, and transfer skills across embodiments with minimal human supervision. By automating contact annotation and embedding human‑derived cost metrics, developers can accelerate the deployment of safe, adaptable humanoids for logistics, healthcare, and service sectors.