
Robotics Pulse


IROS 2025 Keynotes - Perception and Sensors: Perla Maiolino

February 11, 2026
IEEE Robotics & Automation Society

Why It Matters

Embedding tactile perception and compliance transforms robots from data‑driven imitators into agents that can safely learn and adapt in real‑world environments, accelerating the path to true embodied AI.

Summary

The keynote highlighted the limits of pure deep‑learning approaches to robot cognition, arguing that true general intelligence requires embodied, tactile experience. Perla Maiolino described how artificial skin (CySkin) and distributed proximity sensors give robots a closed‑loop sense‑act‑perceive cycle, allowing safe, reactive interaction without reliance on pre‑collected visual data. Key demonstrations included a manipulator navigating clutter using only skin feedback, whole‑body manipulation of objects under force‑position control, and self‑localization through combined tactile and time‑of‑flight data. She also showcased a soft, 3‑D‑printed hand whose morphology and barometric sensors enable object recognition via emergent tactile patterns, mimicking human exploratory touch. Notable quotes emphasized that "the body shapes perception" and that learning emerges through experience, not just data scaling.

The research integrated vision, proprioception, and touch into a unified representation, allowing robots to adjust forces based on material stiffness and to plan paths that exploit compliant obstacles. The implications are clear: future robotic platforms must be built with whole‑body tactile sensing and compliance so they can gather their own data, collaborate safely with humans, and move toward genuine embodied intelligence.
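The closed‑loop sense‑act‑perceive cycle and stiffness‑adaptive force control described above can be sketched in miniature. The toy Python loop below is an illustration, not code from the talk: a gripper iteratively adjusts its commanded force to reach a target indentation on an object whose stiffness it does not know, using only a simulated tactile (deflection) reading. All function and variable names are hypothetical.

```python
def sense_deflection(force: float, stiffness: float) -> float:
    """Simulated tactile reading: deflection of a compliant object
    under the applied force (a simple Hooke's-law stand-in)."""
    return force / stiffness


def adapt_grip(target_deflection: float, stiffness_guess: float,
               true_stiffness: float, steps: int = 20) -> float:
    """Converge on the force that produces the target deflection.

    The controller only ever sees tactile feedback; the object's true
    stiffness is hidden inside the simulated sensor.  Each iteration is
    one pass of the sense -> perceive -> act cycle.
    """
    # Act: initial force based on a (possibly wrong) stiffness guess.
    force = target_deflection * stiffness_guess
    gain = 0.5 * stiffness_guess  # proportional gain from the same guess

    for _ in range(steps):
        deflection = sense_deflection(force, true_stiffness)  # sense
        error = target_deflection - deflection                # perceive
        force += gain * error                                 # act
    return force


if __name__ == "__main__":
    # Object is twice as stiff as guessed; the loop still converges.
    f = adapt_grip(target_deflection=0.002,
                   stiffness_guess=100.0,
                   true_stiffness=200.0)
    print(f"converged force: {f:.4f} N")
    print(f"resulting deflection: {sense_deflection(f, 200.0):.6f} m")
```

The point of the sketch is the one the keynote makes: the robot does not need a prior model (here, the true stiffness) because grounded tactile feedback lets it correct its own actions online.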

Original Description

"Keynote Title: ""Shaping Intelligence: Soft Bodies, Sensors, and Experience""
Speaker Biography
Prof. Perla Maiolino is Deputy Director of the Oxford Robotics Institute and Associate Professor in the Department of Engineering Science at the University of Oxford. She is a globally recognized leader in soft robotics and tactile sensing, known for pioneering research on safe, intelligent, and adaptive robotic interaction. She holds a B.Eng., M.Eng., and Ph.D. in robotics from the University of Genoa, Italy, and leads the Soft Robotics Lab at Oxford Robotics Institute, which was awarded the Queen’s Anniversary Prize for excellence and impact in robotics research. Prof. Maiolino’s work has introduced key innovations such as CySkin, an advanced tactile sensing technology exhibited at the Science Museum in London. Her research has been featured in the Royal Institution Christmas Lectures 2024 on BBC, the Wall Street Journal, and several robotics podcasts, and her soft robotic hand has been showcased as an example of cutting-edge engineering to the wider public. She is an Associate Editor for IEEE RA-L and Soft Robotics journal, has served as editor for ICRA in Medical and Rehabilitation robotics, and was previously Associate Editor for IEEE Robotics and Automation Magazine. She has organized workshops at ICRA, IROS, RoboSoft, and NeurIPS, and is part of the organizing committee for the EI conference, UK-RAS TAROS conference and IEEE RoboSoft 2025 and 2026. Prof. Maiolino's research continues to advance the transformative potential of soft and tactile robotics for embodied intelligence, autonomy, and safe human–robot interaction.
Abstract
""Robot intelligence does not emerge from data alone. Much like humans, robots can be instructed through explicit teaching or learn by imitation. Yet the most profound form of learning arises through experience, through acting, sensing, and adapting in the world. To build robots that truly learn, we must give them the capacity to generate their own data through physical interaction. In this keynote, I will discuss how equipping robots with advanced sensors, compliant morphologies, and artificial skins can transform their bodies into perceptive surfaces. These designs enable robots to explore, adapt, and interact safely with humans. Unlike pre-collected datasets, this data is grounded in physical experience: robots bump, grasp, yield, and recover, constructing their own understanding of themselves and their environments. Such sensorized and adaptive bodies make it possible for robots to continuously gather the experiential data that supports learning while ensuring safety in human–robot interaction. In this emerging paradigm, the body is not just a container for sensors, it is the generator of data, the mediator of safe interaction, and the foundation of robotic intelligence.""
"