Embedding tactile perception and compliance transforms robots from data‑driven imitators into agents that can safely learn and adapt in real‑world environments, accelerating the path to true embodied AI.
The keynote highlighted the limits of purely deep-learning approaches to robot cognition, arguing that general intelligence requires embodied, tactile experience. Perla Maiolino described how artificial skin (CySkin) and distributed proximity sensors give robots a closed-loop sense-act-perceive cycle, enabling safe, reactive interaction without reliance on pre-collected visual data.

Key demonstrations included a manipulator navigating clutter using only skin feedback, whole-body manipulation of objects under combined force-position control, and self-localization from fused tactile and time-of-flight data. Maiolino also showed a soft, 3D-printed hand whose morphology and embedded barometric sensors support object recognition through emergent tactile patterns, mimicking human exploratory touch.

Notable quotes stressed that "the body shapes perception" and that learning emerges through experience, not merely through data scaling. The research integrates vision, proprioception, and touch into a unified representation, allowing robots to modulate contact forces according to material stiffness and to plan paths that exploit compliant obstacles. The implication is clear: future robotic platforms must be built with whole-body tactile sensing and compliance so they can gather their own data, collaborate safely with humans, and move toward genuine embodied intelligence.
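To make the sense-act-perceive loop concrete, here is a minimal sketch of skin-driven reactive control. The taxel API, normals, and gain below are hypothetical stand-ins rather than the lab's actual interface; the point is only that distributed pressure readings can map directly to a corrective motion command with no vision in the loop.

```python
import numpy as np

def skin_repulsion_velocity(pressures, normals, threshold=0.05, gain=0.5):
    """Map distributed skin contacts to a reactive Cartesian velocity.

    pressures: (N,) normalized taxel readings in [0, 1]
    normals:   (N, 3) outward surface normal of each taxel
    """
    pressures = np.asarray(pressures, dtype=float)
    normals = np.asarray(normals, dtype=float)
    active = pressures > threshold            # taxels currently in contact
    if not np.any(active):
        return np.zeros(3)                    # free space: no correction
    # Retreat along the pressure-weighted average contact normal.
    weights = pressures[active][:, None]
    direction = -(weights * normals[active]).sum(axis=0)
    direction /= np.linalg.norm(direction) + 1e-9
    return gain * pressures[active].max() * direction

# Two taxels on the left flank report contact; the command points away (+x).
v = skin_repulsion_velocity(
    pressures=[0.0, 0.3, 0.6],
    normals=[[0, 0, 1], [-1, 0, 0], [-1, 0, 0]],
)
```

A real controller would run such a rule at the skin's sampling rate and blend the correction with the nominal task velocity.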
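Stiffness-aware force modulation admits an equally small sketch. Assuming the robot logs force-indentation pairs during a probing touch (the function names here are illustrative, not from the talk), stiffness is a least-squares slope and the commanded grip force scales with it:

```python
import numpy as np

def estimate_stiffness(forces, displacements):
    """Least-squares slope of force vs. indentation depth: k ≈ dF/dx."""
    slope, _intercept = np.polyfit(np.asarray(displacements),
                                   np.asarray(forces), deg=1)
    return slope

def grip_force(stiffness, target_indentation=0.002, max_force=20.0):
    """Command the force producing a fixed small indentation, so softer
    materials automatically receive gentler grasps."""
    return min(max_force, stiffness * target_indentation)

k_foam = estimate_stiffness([0.1, 0.2, 0.3], [0.001, 0.002, 0.003])  # ~100 N/m
grip_force(k_foam)  # ~0.2 N; a rigid object would saturate at max_force
```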
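Finally, the soft hand's recognition-through-morphology idea reduces, in its simplest reading, to classifying time series from the barometric taxels. The talk did not specify a model, so the features and nearest-prototype classifier below are assumptions chosen for brevity:

```python
import numpy as np

def tactile_features(readings):
    """Collapse a (T, N) barometric-taxel sequence into one vector:
    per-taxel mean, peak, and variance summarize the contact pattern
    that an exploratory grasp leaves on the hand."""
    r = np.asarray(readings, dtype=float)
    return np.concatenate([r.mean(axis=0), r.max(axis=0), r.var(axis=0)])

class NearestPrototype:
    """1-nearest-prototype classifier: one mean feature vector per object."""
    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        self.prototypes = {label: X[y == label].mean(axis=0)
                           for label in np.unique(y)}
        return self

    def predict(self, x):
        return min(self.prototypes,
                   key=lambda label: np.linalg.norm(x - self.prototypes[label]))
```

Because the hand's compliance shapes which taxels fire and when, even coarse statistics like these can separate objects; that, rather than any particular classifier, is the sense in which "the body shapes perception."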