
Autonomy Pulse

Autonomy · Robotics · AI · Aerospace

Precision Landing with PX4 and ROS 2 Using Aruco Markers

PX4 Autopilot • February 13, 2026

Why It Matters

This integration unlocks ROS 2’s rich ecosystem for drone developers, enabling high‑precision, vision‑based landing without sacrificing the reliability of PX4’s flight controller.

Key Takeaways

  • PX4's µORB middleware now bridges to ROS 2 via the µXRCE-DDS bridge.
  • External flight modes let ROS 2 nodes define control behavior beyond the built-in onboard modes.
  • The PX4 ROS 2 Interface Library abstracts low-level PX4 topics for custom navigation and landing.
  • Precision landing is demonstrated using ArUco markers and OpenCV vision processing.
  • Simulation in Gazebo validates the workflow before real-world deployment.

Summary

The session, led by PX4 maintainer Benjamin and the Linux Foundation's Ramon, introduced a workflow for achieving precision landing on drones by integrating the PX4 autopilot stack with ROS 2.

Key technical points included PX4's µORB messaging, extended to ROS 2 through the µXRCE-DDS bridge (with experimental CycloneDDS support); external flight modes that allow ROS 2 nodes to define custom control logic; and the PX4 ROS 2 Interface Library, which abstracts low-level PX4 topics into simple APIs for navigation, control, and mission planning.
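The value of such an interface library is the abstraction pattern: task-level calls replace hand-built setpoint messages. The toy sketch below illustrates that pattern only; the class and message names are invented for illustration and are not the real Auterion API.

```python
# Toy sketch of the abstraction idea behind a PX4/ROS 2 interface library:
# a task-level call is translated into a low-level setpoint message.
# All names here are illustrative, NOT the real library API.

from dataclasses import dataclass

@dataclass
class TrajectorySetpoint:
    """Stand-in for the low-level setpoint message a DDS bridge would carry."""
    x: float
    y: float
    z: float  # NED convention: negative z is up

class DroneApi:
    """Hypothetical high-level facade over raw setpoint topics."""

    def __init__(self, publish):
        self._publish = publish  # callable that forwards a message to the autopilot

    def goto(self, north: float, east: float, altitude: float):
        # Translate an intuitive "go to this altitude" call into a NED setpoint.
        self._publish(TrajectorySetpoint(north, east, -altitude))

sent = []
api = DroneApi(sent.append)
api.goto(1.0, 2.0, 10.0)  # fly to (1 m N, 2 m E) at 10 m altitude
```

The facade hides the NED sign convention from the caller, which is the kind of footgun these abstraction layers exist to remove.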

The presenters demonstrated the end-to-end pipeline in Gazebo: a down-facing camera streams images to a ROS 2 node that detects ArUco markers with OpenCV, publishes the tag pose, transforms it into the world frame, and commands the drone to hover over and land precisely on the marker. The take-off and landing sequence was shown both in simulation and from the accompanying control panel.
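The frame-transform step can be sketched in plain Python: the tag position measured in the camera frame is rotated into the body frame, then into the world frame, and offset by the drone's position. This is a simplified sketch under stated assumptions (camera rigidly mounted at the body origin looking straight down, drone level so only yaw matters, NED world frame); the camera-to-body rotation shown is one common convention, not necessarily the one used in the workshop code.

```python
import math

def mat_vec(R, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def tag_world_position(tag_cam, drone_pos, drone_yaw):
    """Transform a detected tag position from the camera frame to the world frame.

    Assumptions (illustrative): camera at the body origin looking straight down
    (camera x right, y down in the image, z out of the lens towards the ground),
    drone level so body->world is a pure yaw rotation, world frame is NED.
    """
    # Fixed camera->body rotation for this down-facing mounting convention.
    R_bc = [[0, -1, 0],
            [1,  0, 0],
            [0,  0, 1]]
    tag_body = mat_vec(R_bc, tag_cam)

    # Body->world: rotate about the z axis by the drone's yaw.
    c, s = math.cos(drone_yaw), math.sin(drone_yaw)
    R_wb = [[c, -s, 0],
            [s,  c, 0],
            [0,  0, 1]]
    tag_world = mat_vec(R_wb, tag_body)

    # Offset by the drone's own position in the world frame.
    return [tag_world[i] + drone_pos[i] for i in range(3)]

# A tag seen 5 m straight below a drone hovering at 5 m altitude (NED z = -5)
# over world position (10, 20) lands at world (10, 20, 0) — ground level.
p = tag_world_position([0.0, 0.0, 5.0], [10.0, 20.0, -5.0], 0.0)
```

In the real pipeline this transform would typically be handled by ROS 2's tf2 rather than hand-rolled matrices, but the arithmetic underneath is the same.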

By offloading heavy perception and planning to Linux‑level ROS 2 while retaining PX4’s safety‑critical core, developers can prototype sophisticated autonomous behaviors faster, reduce firmware complexity, and bring advanced applications such as indoor GPS‑denied navigation to market more quickly.

Original Description

Ever wanted to build a drone that can find and land on a visual target all on its own? This tutorial walks you through a complete precision landing pipeline using PX4, ROS 2, and OpenCV. You'll learn how PX4's internal architecture works, how to detect ArUco markers with a downward-facing camera, and how to build a custom external flight mode that searches for, approaches, and lands precisely on a target. Everything runs in Gazebo simulation using PX4 SITL, and the same code works on real hardware with no changes.
Along the way, you'll dig into the key building blocks of the Dronecode stack: µORB messaging, the µXRCE-DDS bridge to ROS 2, coordinate frame transforms, and the PX4 ROS 2 Interface Library by Auterion. The landing mode uses a four-state machine with a PI velocity controller for precision descent, and it fails gracefully if the target is lost, handing control back to PX4's safety systems. The complete source code, Docker containers, and workshop materials are all available on the Dronecode GitHub so you can follow along and reproduce everything yourself.
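The landing mode described above — a four-state machine feeding a PI velocity controller, with a graceful fallback when the marker is lost — can be sketched in isolation. State names, thresholds, and gains below are illustrative assumptions; the actual implementation is in the Dronecode workshop materials, and the real mode hands control back to PX4's safety systems rather than merely re-entering a search state.

```python
from enum import Enum, auto

class State(Enum):
    SEARCH = auto()    # loiter until the marker is seen
    APPROACH = auto()  # center horizontally over the marker
    DESCEND = auto()   # sink while holding position over the marker
    FINISHED = auto()  # touchdown reached

class PI:
    """Simple PI controller: position error in, clamped velocity command out."""

    def __init__(self, kp, ki, limit):
        self.kp, self.ki, self.limit = kp, ki, limit
        self.integral = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        out = self.kp * error + self.ki * self.integral
        return max(-self.limit, min(self.limit, out))  # clamp the output

class LandingMode:
    """Toy four-state precision-landing logic (thresholds/gains illustrative)."""

    CENTERED = 0.1    # m: max horizontal error before descending
    TOUCHDOWN = 0.05  # m: altitude below which we consider ourselves landed

    def __init__(self):
        self.state = State.SEARCH
        self.ctrl_x = PI(kp=0.8, ki=0.05, limit=1.0)
        self.ctrl_y = PI(kp=0.8, ki=0.05, limit=1.0)

    def step(self, tag_visible, err_x, err_y, altitude, dt):
        """Return a (vx, vy, vz) velocity setpoint; NED, so vz > 0 sinks."""
        if not tag_visible and self.state is not State.FINISHED:
            self.state = State.SEARCH  # target lost: stop and search again
            return (0.0, 0.0, 0.0)
        if self.state is State.SEARCH:
            self.state = State.APPROACH
        if self.state is State.APPROACH:
            if max(abs(err_x), abs(err_y)) < self.CENTERED:
                self.state = State.DESCEND
        if self.state is State.DESCEND and altitude < self.TOUCHDOWN:
            self.state = State.FINISHED
            return (0.0, 0.0, 0.0)
        vz = 0.3 if self.state is State.DESCEND else 0.0
        return (self.ctrl_x.update(err_x, dt),
                self.ctrl_y.update(err_y, dt), vz)
```

Keeping the state machine free of any ROS or PX4 types, as here, is what makes the same logic runnable in SITL simulation and on real hardware without changes.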