Physical AI Is the Next Frontier - and It's Already All Around You

Why It Matters

Physical AI enables machines to perform real‑world tasks, opening new automation markets while creating data‑driven ecosystems that boost productivity and innovation.


Kerry Wan (ZDNET) · January 6, 2026

[Image: RayNeo Air 4 Pro at CES 2026]


ZDNET's key takeaways

  • Physical AI is the technology industry's latest trending frontier.

  • It leverages real‑world data for more autonomous robots.

  • Its early stages could be on your face right now.


ChatGPT's release over three years ago triggered an AI frenzy. AI models continue to become more capable, but to be genuinely helpful in people's everyday lives, they need access to everyday tasks. That's only possible if they move beyond a chatbot on your laptop screen and into your physical environment.

Enter the industry's latest buzzword: physical AI. The term was on full display at the Consumer Electronics Show (CES) last week, with nearly every company, including Nvidia, touting a new model or piece of hardware meant to advance the space. During the company's keynote, CEO Jensen Huang even compared the significance of physical AI to that of ChatGPT's release.

“The ChatGPT moment for physical AI is here — when machines begin to understand, reason, and act in the real world,” he said.

What is physical AI?

Physical AI can be generally defined as AI implemented in hardware that can perceive the world around it and then reason to perform or orchestrate actions. Popular examples include autonomous vehicles and robots — but robots that utilize AI to perform tasks have existed for decades. So what's the difference?

“The whole idea of a chain of thoughts, a reasoning, a brain, which will work in a context and take some actions as humans would — that's the real definition of physical AI,” said Anshuman Saxena, VP and GM of automated driving and robotics at Qualcomm.

For instance, a humanoid robot would go a step beyond moving materials or packages as directed: it would perceive its environment and intuitively work out how to perform the task.
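
To make that perceive-reason-act idea concrete, here is a minimal, purely illustrative sketch of the loop Saxena describes. The class and method names are hypothetical and not tied to any vendor's robotics stack.

```python
# Hypothetical sketch of a perceive-reason-act loop.
# None of these classes map to a real product API; they only
# illustrate the control flow behind "physical AI".

from dataclasses import dataclass


@dataclass
class Observation:
    image: bytes   # camera frame
    audio: bytes   # microphone buffer


class PhysicalAgent:
    def perceive(self) -> Observation:
        """Read raw sensor data from the environment (stubbed here)."""
        return Observation(image=b"", audio=b"")

    def reason(self, obs: Observation) -> str:
        """Turn the observation into a planned action, e.g. via a
        vision-language model. Stubbed with a fixed plan."""
        return "pick_up_package"

    def act(self, plan: str) -> None:
        """Send the plan to the robot's actuators (stubbed as a print)."""
        print(f"executing: {plan}")


if __name__ == "__main__":
    agent = PhysicalAgent()
    for _ in range(3):   # on a real robot this loop runs continuously
        observation = agent.perceive()
        plan = agent.reason(observation)
        agent.act(plan)
```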

“Smartglasses are the best representation already of physical AI,” said Ziad Asghar, SVP & GM of XR, wearables, and personal AI at Qualcomm. “They are a device that basically are present and are able to see what you are seeing; they're able to hear what you're hearing, so they're in your physical world.”

A symbiotic data relationship

Saxena adds that while humanoid robots will be useful in instances where humans don’t want to perform a task—either because it is too tedious or too risky—they will not replace humans. That’s where AI wearables, such as smart glasses, play an important role, as they can augment human capabilities.

Beyond that, AI wearables might actually be able to feed back into other physical AI devices, such as robots, by providing a high‑quality dataset based on real‑life perspectives and examples.

“Why are LLMs so great? Because there is a ton of data on the internet, for a lot of the contextual information and whatnot, but physical data does not exist,” Saxena explained.

The problem he describes is one that often hinders physical AI developments. Because it is too risky to train robots in the real world—such as by putting autonomous cars on the road—companies must create synthetic‑data simulations to train and test these models. Many companies attempted to tackle this issue at CES.

Nvidia released new models that understand the real world around you and can be used to create synthetic data and simulations that emulate realistic life scenarios. Qualcomm offers a comprehensive physical AI stack that combines a new Qualcomm Dragonwing IQ10 Series processor, released at CES, with the necessary tools for AI data collection and training.
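
For readers curious what "training in simulation" looks like in practice, the sketch below shows the general shape of a train-in-simulation loop. The Simulator and Policy classes are generic, hypothetical stand-ins, not part of Nvidia's or Qualcomm's tooling.

```python
# Hypothetical sketch: training a robot's controller against synthetic
# scenarios instead of on real roads or factory floors.

import random


class Simulator:
    """Stand-in for a synthetic-data simulator (e.g. rendered street scenes)."""

    def reset(self) -> list[float]:
        return [random.random() for _ in range(4)]   # fake sensor vector

    def step(self, action: int) -> tuple[list[float], float, bool]:
        # Return the next observation, a reward signal, and a done flag.
        obs = [random.random() for _ in range(4)]
        return obs, random.random(), random.random() < 0.1


class Policy:
    """Stand-in for the robot's learned controller."""

    def act(self, obs: list[float]) -> int:
        return 0 if sum(obs) < 2.0 else 1

    def update(self, obs: list[float], action: int, reward: float) -> None:
        pass   # real training would adjust model weights here


sim, policy = Simulator(), Policy()
for episode in range(5):          # many synthetic episodes, zero real-world risk
    obs = sim.reset()
    for _ in range(200):          # cap episode length
        action = policy.act(obs)
        obs, reward, done = sim.step(action)
        policy.update(obs, action, reward)
        if done:
            break
```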

Creating datasets for this training is often a time‑consuming and costly process. However, robots could use the data from the wearables people already use every day, which is effectively physical‑AI data that is true to human experiences.

“Think about these sensors, the glasses, so many things that are there, which, if I have the glasses on, and I take an action based on, ‘Oh, I saw something here,’ so much information is immediately generated, which can help the robots as well, creating a new set of information today,” Saxena said.

Given the privacy concerns that come with having your everyday data used to train robots, Saxena stressed that data from your wearables should always be handled with the highest level of privacy. Once anonymized by the wearable company, that data could still be very helpful in training robots, and those robots can then generate more data of their own, resulting in a healthy ecosystem.

“This sharing of context, this sharing of AI between that robot and the wearable AI devices that you have around you is, I think, the benefit that you are going to be able to accrue,” added Asghar.
