#347 Let's Get Physical with AI with Ivan Poupyrev, CEO at Archetype AI

DataFramed • February 23, 2026 • 45 min

Tags: Big Data, AI, Hardware

Why It Matters

Physical AI promises to turn the flood of sensor data into real‑world value, improving safety, efficiency, and cost savings across industries and homes. With the underlying IoT infrastructure already in place, the next wave of AI can deliver immediate, tangible benefits, making this a pivotal moment for businesses and consumers eager to harness intelligent, interconnected environments.

Key Takeaways

  • Physical AI expands AI beyond robots to everyday devices.
  • Foundation models turn raw sensor data into actionable insights.
  • Three tiers: insight, recommendation, and full automation for physical systems.
  • Physical AI models prioritize measurement accuracy over text generation.
  • Generalizable models enable sensor fusion across diverse hardware.

Pulse Analysis

Physical AI marks a shift from traditional robotics to embedding intelligence in any connected object. While the Internet of Things linked sensors to the cloud, it left raw data idle. Ivan Poupyrev explains that foundation models now translate those streams into meaningful insights, turning ordinary appliances, HVAC units, and industrial machines into proactive agents. This evolution leverages the same transformer architecture that powers large language models, but adapts it to the sensor‑rich, time‑series world where accuracy isn’t optional—it’s critical.

The conversation outlines three progressive layers of physical AI: first, generating insights from measurements such as a smartwatch’s step count or a factory’s vibration data. Second, delivering concrete recommendations—like optimizing a failing turbine or guiding a driver through a maintenance issue—by compressing human expertise into model outputs. The final tier envisions full automation, where intelligent devices coordinate across homes, factories, and grids to maximize safety, efficiency, and cost savings. Poupyrev cites smart factories, autonomous city infrastructure, and interconnected home systems as emerging real‑world deployments that illustrate this trajectory toward physical superintelligence.
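The three tiers above can be made concrete with a toy sketch. This is purely illustrative and not Archetype AI's actual system: the `SensorReading` class, the vibration threshold, and the tier functions are all hypothetical stand-ins for how a single measurement might flow from insight to recommendation to automated action.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    asset_id: str
    vibration_rms: float  # mm/s, a common rotating-machinery health metric

# Illustrative alarm threshold (mm/s); real limits depend on the machine class.
VIBRATION_ALARM = 7.1

def insight(r: SensorReading) -> str:
    """Tier 1: turn a raw measurement into a human-readable insight."""
    status = "elevated" if r.vibration_rms > VIBRATION_ALARM else "normal"
    return f"{r.asset_id}: vibration {r.vibration_rms:.1f} mm/s ({status})"

def recommendation(r: SensorReading) -> str:
    """Tier 2: compress domain expertise into a concrete next step."""
    if r.vibration_rms > VIBRATION_ALARM:
        return f"{r.asset_id}: schedule bearing inspection"
    return f"{r.asset_id}: no action needed"

def automate(r: SensorReading) -> str:
    """Tier 3: act directly, e.g. derate the turbine pending inspection."""
    if r.vibration_rms > VIBRATION_ALARM:
        return f"{r.asset_id}: derated to 80% load"
    return f"{r.asset_id}: running at full load"
```

Each tier reuses the same measurement but raises the stakes of the output, which is why the episode frames them as a progression rather than three separate products.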

Technically, physical AI diverges from text‑centric models because half of its input is raw sensor measurements, with video and minimal text following. Training requires architectures that respect time‑series dynamics and physical constraints, ensuring predictions are physically plausible—hallucinations are unacceptable in contexts like power‑plant control. Poupyrev highlights the breakthrough of models that generalize across sensor types, locations, and even unseen physical phenomena, enabling robust sensor fusion and anomaly detection without per‑device retraining. With the IoT backbone already in place, these foundation models are poised to unlock the next wave of AI‑driven automation across industries.
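The internals of physical foundation models aren't public, but the classical baseline they aim to beat is instructive: per-stream anomaly detection that must be tuned for every device. The sketch below is an assumed rolling z-score detector (window size and threshold are arbitrary choices), flagging points that deviate sharply from recent history; a generalizable model would replace this per-device tuning with one model across sensor types.

```python
import statistics
from collections import deque

def zscore_anomalies(stream, window=50, threshold=4.0):
    """Flag indices whose value is far from the rolling mean, measured
    in rolling standard deviations. A per-stream classical baseline:
    window and threshold must be hand-tuned for each sensor type,
    which is exactly the retraining burden generalizable models avoid."""
    history = deque(maxlen=window)
    flagged = []
    for i, x in enumerate(stream):
        if len(history) >= 10:  # require a minimal history before scoring
            mu = statistics.fmean(history)
            sigma = statistics.pstdev(history) or 1e-9  # guard flat signals
            if abs(x - mu) / sigma > threshold:
                flagged.append(i)
        history.append(x)
    return flagged

# A flat signal with one spike: only the spike index is flagged.
spikes = zscore_anomalies([1.0] * 50 + [10.0] + [1.0] * 10)  # -> [50]
```

Note the detector says nothing about physical plausibility; it flags statistical outliers only, which is one reason the episode stresses physics-aware models for settings like power-plant control.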

Episode Description

Physical AI is showing up across the industry as sensors, connected devices, and foundation models move from the cloud into the real world. After years of IoT wiring everything to the internet, the big shift is turning raw measurements and video into meaning, not just dashboards. For day-to-day teams, that changes how you monitor equipment, detect failures, and decide what to do next. When thousands of sensor streams hit storage, who turns them into insights and recommendations fast enough to matter? Can one model generalize across different sensors and conditions? And what must run on the asset versus the cloud?

Dr. Ivan Poupyrev is CEO and Founder of Archetype AI, where he is building a multimodal AI foundation model that combines real-time sensor data and natural language to help people and organizations better understand and act on the physical world. The company is developing a developer platform to unlock new applications of Physical AI across industries.

Previously, he was Director of Engineering at Google’s Advanced Technology and Projects (ATAP) division, where he founded and led large cross-functional teams to create Soli, a radar-based sensing platform, and Jacquard, a connected apparel platform powered by smart textiles and embedded ML. These technologies shipped in more than 15 products across 33 countries, including collaborations with Levi’s, YSL, Adidas, and Samsonite, and were integrated into flagship devices such as Pixel 4 and Nest products. His work has been widely published, recognized with major international awards, and featured in global media.

In the episode, Richie and Ivan explore physical AI beyond robotics, turning IoT sensor streams into insights, recommendations, and automation, why physical foundation models differ from LLMs, sensor-fusion wins like wind-turbine failure alerts, edge deployment and privacy, how to pick a first project in practice, and much more.

Links Mentioned in the Show:

Archetype AI

Attention Is All You Need (Original Transformer Architecture Paper)

A Mathematical Theory of Communication (Shannon, 1948)

Connect with Ivan

AI-Native Course: Intro to AI for Work

Related Episode: Enterprise AI Agents with Jun Qian, VP of Generative AI Services at Oracle

Explore AI-Native Learning on DataCamp

New to DataCamp?

Learn on the go using the DataCamp mobile app: https://www.datacamp.com/mobile
