New Chip Lets Robots See in 4D by Tracking Distance and Speed Simultaneously

Tech Xplore Robotics
Mar 13, 2026

Why It Matters

The ability to acquire real‑time distance and velocity data on a single chip could dramatically lower the cost and size of perception systems, accelerating autonomous robot and drone deployment. It also opens pathways for advanced imaging in consumer devices such as smartphones.

Key Takeaways

  • 61,952-pixel focal-plane array integrates LiDAR transmit and receive functions.
  • Simultaneous distance and velocity measurement achieved on-chip.
  • 4D sensor maps up to 65 m, indoor 6‑11 m.
  • Frequency-modulated continuous-wave (FMCW) operation reduces size and cost versus pulsed LiDAR.
  • Potential uses include robots, drones, smartphones, digital cameras.

Pulse Analysis

LiDAR has become the backbone of autonomous navigation, yet conventional systems rely on separate transmit and receive arrays and pulsed lasers that add bulk, power draw, and expense. The new 4D imaging chip sidesteps these constraints by embedding both functions into a single focal-plane array and employing frequency-modulated continuous-wave (FMCW) ranging. This architecture not only shrinks the sensor footprint but also enables simultaneous extraction of range and radial velocity, a capability traditionally reserved for radar, thereby delivering richer situational awareness in a single sweep.

The chip’s 61,952 stationary pixels are wired to an on‑chip optical switching network that directs a shared laser beam across the array. Each pixel measures the frequency shift of the reflected light, converting it into precise distance and speed data. In laboratory trials the sensor rendered high‑density point clouds of indoor spaces up to 11 meters and resolved structural details on a building façade 65 meters away, while also tracking the instantaneous speed of a rotating disk. These results demonstrate a viable path toward compact, high‑resolution perception modules that can operate at meter‑scale ranges without the mechanical scanning heads typical of legacy LiDAR.
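The frequency-shift measurement described above follows the standard FMCW principle: with a triangular (up/down) chirp, the range-induced beat frequency is the average of the two measured beats, while the Doppler shift is half their difference. The sketch below illustrates that arithmetic only; the function name, parameter values, and sign convention are illustrative assumptions, not details taken from the chip itself.

```python
# Hedged sketch of FMCW range/velocity recovery from the beat
# frequencies of a triangular (up- and down-) chirp. All parameter
# values are illustrative, not from the published device.

C = 299_792_458.0  # speed of light, m/s


def range_and_velocity(f_up_hz, f_down_hz, chirp_bandwidth_hz,
                       chirp_duration_s, wavelength_m):
    """Combine up- and down-chirp beat frequencies (hypothetical helper).

    The range beat is the mean of the two measurements; the Doppler
    shift is half their difference (sign convention: positive velocity
    means the target is approaching).
    """
    f_range = 0.5 * (f_up_hz + f_down_hz)
    f_doppler = 0.5 * (f_down_hz - f_up_hz)
    distance = C * f_range * chirp_duration_s / (2.0 * chirp_bandwidth_hz)
    velocity = 0.5 * wavelength_m * f_doppler
    return distance, velocity
```

For example, with a 1 GHz chirp over 10 µs at a 1550 nm wavelength, a target 10 m away approaching at 2 m/s produces up- and down-chirp beats a few MHz apart, from which both quantities fall out of the same two measurements, which is what lets a single pixel report distance and speed at once.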

For industry, the implications are twofold. First, the reduced component count and reliance on a continuous‑wave source promise lower bill‑of‑materials, making advanced perception affordable for mass‑produced robots, delivery drones, and even consumer electronics. Second, the dual‑mode data stream opens new algorithmic opportunities in sensor fusion, enabling tighter integration with visual and inertial systems. While resolution and long‑range performance still require refinement, the technology signals a shift toward ubiquitous 4D sensing that could accelerate autonomous deployments across logistics, manufacturing, and mobile imaging markets.
