
PX4 and ROS are moving closer together every day. Even as aerial robotics specialization grows, the need to integrate more closely with general robotics systems grows at an even greater pace. In a recent Drone Software Meetup Group session, Godfrey Nolan, President at RIIS, gave a tutorial on the custom flight modes made possible by the latest PX4/ROS integration.
This written version explores how to integrate ROS 2 with the PX4 autopilot to create custom flight modes, focusing specifically on implementing a flight pattern that traces the letter ‘R’. We’ll walk through the setup process, explore the fundamental concepts behind both platforms, and provide practical code examples that demonstrate how to extend PX4’s functionality using ROS 2 components. As always, you can follow along with the video if you wish.
Understanding PX4 and ROS 2
What is PX4?
For those coming from the ROS 2 side, PX4 is an open-source flight controller software that powers many modern drones. It provides:
Flight controller software with extensive configurability
Firmware, tools, and integration for software-in-the-loop and hardware-in-the-loop simulations
Hardware-agnostic design supporting numerous flight controllers
Communication with ground control stations via the MAVLink protocol
A flight stack built on the NuttX RTOS (real-time operating system)
PX4 is often compared to ArduPilot, another major flight control software. The key difference lies in licensing: ArduPilot uses a strong copyleft license requiring source code distribution, while PX4 employs a more permissive license, which has led to its use by commercial companies.
What is ROS 2?
ROS 2 (Robot Operating System 2) represents a significant evolution from the original ROS:
It’s an open-source ecosystem of packages rather than an actual operating system
Uses a decentralized communication model based on DDS (Data Distribution Service)
Offers a multi-platform framework for managing communications between system components
Employs a publish/subscribe model between nodes, the smallest units of a ROS 2 system
Supports both C++ and Python officially, with the examples in this article using C++
Each subsystem is represented as a node, with a publish/subscribe model connecting them. This architecture makes ROS 2 ideal for complex robotics applications where different subsystems need to communicate reliably.
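To make the node and topic concepts concrete, here is a minimal, illustrative C++ publisher node. It is not part of the PX4 examples, and the node and topic names are arbitrary; any node that subscribes to the same topic will receive its messages.

```cpp
// Minimal rclcpp publisher: one node, one topic, one message per second.
#include <chrono>
#include <memory>
#include <rclcpp/rclcpp.hpp>
#include <std_msgs/msg/string.hpp>

class TalkerNode : public rclcpp::Node
{
public:
  TalkerNode() : Node("talker")
  {
    publisher_ = create_publisher<std_msgs::msg::String>("chatter", 10);
    timer_ = create_wall_timer(std::chrono::seconds(1), [this] {
      std_msgs::msg::String msg;
      msg.data = "hello from ROS 2";
      publisher_->publish(msg);
    });
  }

private:
  rclcpp::Publisher<std_msgs::msg::String>::SharedPtr publisher_;
  rclcpp::TimerBase::SharedPtr timer_;
};

int main(int argc, char ** argv)
{
  rclcpp::init(argc, argv);
  rclcpp::spin(std::make_shared<TalkerNode>());
  rclcpp::shutdown();
  return 0;
}
```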

Ways to Extend PX4 with ROS 2
Historically, extending PX4 functionality involved either directly modifying the firmware or using a companion computer to send MAVLink commands. The ROS 2 integration approach represents a new path. If PX4 doesn’t do something out of the box, you can now leverage the capabilities of ROS 2 to open up a host of new possibilities by interacting with uORB, PX4’s internal messaging API.
The PX4-ROS2-interface library enables several powerful capabilities:
Creating custom flight modes
Creating flight mode executors (sequenced state machines)
Overriding existing flight modes with new behaviors
Controlling drone movements via setpoints (position, velocity, and altitude targets)
Integrating with other ROS packages for advanced functionality
Today we’ll be using Auterion’s px4-ros2-interface-lib, which handles the interface between ROS 2 on a companion computer and PX4. The key advantages of this approach are modularity and access to a larger library of functions and features from general robotics.
Setting Up PX4 with ROS 2
To get started with PX4 and ROS 2 integration, you’ll need:
Ubuntu 22.04 (recommended operating system)
PX4 toolchain
ROS 2 Humble distribution
Micro XRCE-DDS Agent (the bridge between PX4 and ROS 2)
Installing PX4
First, clone the PX4-Autopilot repository and set up the development environment:
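A typical install, following the PX4 documentation (pin a release tag such as v1.15.4 if you want to match the firmware version used later in this article):

```bash
git clone https://github.com/PX4/PX4-Autopilot.git --recursive
bash ./PX4-Autopilot/Tools/setup/ubuntu.sh
# Restart the computer (or at least log out and back in) once the setup script completes.
```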
Installing ROS 2 Humble

While detailed ROS 2 installation instructions are beyond the scope of this article, the recommended approach is to follow the official ROS 2 Humble installation guide or use the command snippets provided in the PX4 documentation.
Installing the Micro XRCE-DDS Agent

PX4 and ROS 2 use different messaging systems: PX4 uses uORB (micro Object Request Broker) internally, while ROS 2 uses its own messaging system. To bridge these systems, you’ll need the Micro XRCE-DDS Agent:
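The agent can be built from source as described in the PX4 documentation; the commands below follow that standard procedure (the PX4 docs may pin a specific release tag):

```bash
git clone https://github.com/eProsima/Micro-XRCE-DDS-Agent.git
cd Micro-XRCE-DDS-Agent
mkdir build && cd build
cmake ..
make
sudo make install
sudo ldconfig /usr/local/lib/
```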
The client for this bridge is built into the PX4 firmware, while the agent runs on the companion computer.
Creating a Workspace
Next, create a ROS 2 workspace for your PX4 integration:
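One way to lay out the workspace (the directory name ~/ws is arbitrary; the px4_msgs branch should match your firmware version, as noted below):

```bash
mkdir -p ~/ws/src && cd ~/ws/src
git clone https://github.com/PX4/px4_msgs.git
cd ~/ws
source /opt/ros/humble/setup.bash
colcon build
```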
If everything went well, you should have your custom workspace with a tree similar to this:
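Assuming the layout above, the tree will look roughly like this after the build:

```
ws/
├── build/
├── install/
├── log/
└── src/
    └── px4_msgs/
```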

When creating your own custom code, you’ll add it to the src directory alongside the px4_msgs package. The version of the px4_msgs package should match the version of your PX4 firmware to ensure compatibility.
Running Your Project
Now that we have our workspace ready, let’s go over the steps for running a project: start the bridge agent in a new terminal, load the project’s environment, and then run your node.
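A sketch of those three steps, assuming the workspace from the previous section; the package and node names in the last command are placeholders for your own code:

```bash
# 1. In a new terminal, start the bridge agent (UDP port 8888 is the PX4 default).
MicroXRCEAgent udp4 -p 8888

# 2. In your workspace, load the project's environment.
cd ~/ws
source install/setup.bash

# 3. Run your node (substitute your actual package and executable names).
ros2 run <your_package> <your_node>
```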
Keep these steps in mind for later.
Example 1: Implementing a Custom Flight Mode
Now, let’s explore the implementation of a custom flight mode to trace the letter ‘R’. This example demonstrates how to create a drone behavior that:
Arms the drone when loaded
Takes off
Traces the letter ‘R’
Returns to the starting position and lands
The result will look something like this:


On the left-hand side (top image) is our menu with the [ Draw R ] button; selecting it executes a series of commands, creating custom waypoints for the drone to draw an R. You’ll need QGroundControl installed locally, or you won’t see the flight mode options.
Setting Up the Environment
To implement our custom flight mode, we need to set up:
A new workspace with our custom package
The simulation environment (using Gazebo)
Our custom component, which we then run
Setting Up the ‘Draw R’ Project
First, let’s set up our workspace and dependencies:
Create a new workspace:
Clone the dependencies:
Clone the custom package:
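A sketch of those three steps, using the repositories named below; the workspace path and the release/1.15 branch name for px4_msgs are assumptions, so adjust them to your setup:

```bash
mkdir -p ~/draw_r_ws/src && cd ~/draw_r_ws/src

# Dependencies: PX4 message definitions (matching the firmware) and the interface library.
git clone -b release/1.15 https://github.com/PX4/px4_msgs.git
git clone https://github.com/Auterion/px4-ros2-interface-lib.git

# Custom flight mode package.
git clone https://github.com/godfreynolan/px4_ros2_custom_flight_mode.git

# Build and activate the workspace.
cd ~/draw_r_ws
source /opt/ros/humble/setup.bash
colcon build
source install/setup.bash
```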
These commands create a dedicated workspace directory and clone three essential repositories: the PX4 message definitions (px4_msgs) at release version 1.15, Auterion's ROS 2 interface library for PX4 communication, and a custom flight mode package developed by Nolan. After setting up the repository structure, we source the ROS Humble environment, build the workspace using colcon, and activate the newly built packages through the setup script, making the custom flight mode functionality available for use with our PX4 drone.
Starting the Simulation Environment

To test our custom flight mode, we need to set up a simulation environment:
Start the Micro XRCE-DDS Agent:
Start the PX4 SITL simulation:
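A minimal sketch of both steps, assuming the default UDP port and the x500 Gazebo airframe (run the SITL command from your PX4-Autopilot checkout):

```bash
# Terminal 1: the bridge agent.
MicroXRCEAgent udp4 -p 8888

# Terminal 2: PX4 SITL with a Gazebo quadcopter.
cd ~/PX4-Autopilot
make px4_sitl gz_x500
```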
Note: other versions of PX4 will probably work, but we tested on 1.15.4. If everything was done correctly, Gazebo should launch when you run the SITL command.
Now, launch QGroundControl, which should automatically connect to the simulator.

Running the Custom Flight Mode
Once you’ve set up your environment and implemented the custom flight mode, you can run it using:
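Something along these lines, from your Draw R workspace; the package and executable names are placeholders, so check the repo's package.xml and CMakeLists.txt for the real ones:

```bash
source install/setup.bash
ros2 run <custom_flight_mode_package> <draw_r_executable>
```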
When executed, the drone will arm, take off, trace the letter ‘R’ in the air according to the state machine logic defined in your code, and then return to land at the starting position.
Examining the Custom Flight Mode Code
Earlier you cloned the godfreynolan/px4_ros2_custom_flight_mode repo. Let’s dive into it and take a closer look. Here’s our basic package structure:
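The exact layout of the repo may differ, but a small ROS 2 C++ package like this one typically looks something like the following (file names other than mode.hpp are assumptions):

```
px4_ros2_custom_flight_mode/
├── CMakeLists.txt
├── package.xml
└── src/
    ├── mode.hpp    # DrawFlightMode and DrawModeExecutor
    └── main.cpp    # node entry point
```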
Most of the action is going to take place in mode.hpp. Our custom flight mode will inherit from a base class provided by the Auterion library.
DrawFlightMode
Now, let’s take a look at the different functions within the large DrawFlightMode class.
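A sketch of the class declaration, modeled on the structure of the px4-ros2-interface-lib examples; the member names, the State enum, and the exact override signatures are assumptions and may differ between library versions:

```cpp
#include <memory>
#include <Eigen/Core>
#include <rclcpp/rclcpp.hpp>
#include <px4_ros2/components/mode.hpp>
#include <px4_ros2/control/setpoint_types/goto.hpp>
#include <px4_ros2/odometry/local_position.hpp>

class DrawFlightMode : public px4_ros2::ModeBase
{
public:
  explicit DrawFlightMode(rclcpp::Node & node);
  ~DrawFlightMode() override = default;

  void onActivate() override;                // mode selected
  void onDeactivate() override;              // another mode took over
  void updateSetpoint(float dt_s) override;  // called regularly while active

private:
  // Waypoint state machine for tracing the 'R'.
  enum class State { LeftEdge, TopEdge, /* ...remaining segments... */ Done };

  bool positionReached(const Eigen::Vector3f & target_m) const;

  State _state{State::LeftEdge};
  std::shared_ptr<px4_ros2::GotoSetpointType> _goto_setpoint;
  std::shared_ptr<px4_ros2::OdometryLocalPosition> _vehicle_local_position;
};
```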
So you see here, we have our constructor, destructor, onActivate, onDeactivate, and updateSetpoint, all of which control our drone at various points of its operation. Here’s our constructor in more detail:
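A constructor sketch in the style of the library's goto example; the mode name string is an assumption:

```cpp
DrawFlightMode::DrawFlightMode(rclcpp::Node & node)
: ModeBase(node, Settings{"Draw R"})
{
  // Goto setpoint type: position/heading/speed-limit targets that PX4 flies to.
  _goto_setpoint = std::make_shared<px4_ros2::GotoSetpointType>(*this);

  // Local position estimate, used to decide when a waypoint has been reached.
  _vehicle_local_position = std::make_shared<px4_ros2::OdometryLocalPosition>(*this);
}
```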
Goto setpoints allow you to set an end position, yaw, and max speed targets for the autopilot to handle. Essentially, this is what allows us to draw our R.
Looking back up at updateSetpoint, we can see that it controls the drone’s movement through different states:
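In outline it is a switch over the waypoint states; only two segments are sketched here, and the real code has one case per segment of the ‘R’:

```cpp
void DrawFlightMode::updateSetpoint(float dt_s)
{
  (void)dt_s;  // not needed here: the goto setpoint lets PX4 handle the trajectory

  switch (_state) {
    case State::LeftEdge:
      // vertical stroke of the 'R' (shown in detail below)
      break;
    case State::TopEdge:
      // across the top of the 'R'
      break;
    // ...one case per remaining segment, then back to the start...
    case State::Done:
      completed(px4_ros2::Result::Success);  // hand control back to the executor
      break;
  }
}
```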
Each of the movements and waypoints the drone needs to execute to draw our letter ‘R’ is handled here.
Let’s take a closer look at one of those states. Here’s what’s under the hood in the LeftEdge case:
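A sketch of that case, assuming NED local coordinates (negative z is up) and illustrative numbers:

```cpp
// Inside the switch in updateSetpoint():
case State::LeftEdge: {
  const Eigen::Vector3f target{0.f, 0.f, -10.f};  // 10 m above the start: the vertical stroke
  _goto_setpoint->update(target);                 // let the autopilot fly to the target

  if (positionReached(target)) {
    _state = State::TopEdge;                      // advance to the next waypoint
  }
  break;
}
```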
Here we have our target position, and positionReached() checks whether we are within a certain threshold of the target; if so, it sets the state to the next waypoint, which is our TopEdge.
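The helper itself can be as simple as a distance check against the local position estimate, as in the library's examples (the threshold value here is illustrative):

```cpp
bool DrawFlightMode::positionReached(const Eigen::Vector3f & target_m) const
{
  static constexpr float kPositionThreshold = 0.5f;  // [m]
  const Eigen::Vector3f error = target_m - _vehicle_local_position->positionNed();
  return error.norm() < kPositionThreshold;
}
```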
DrawModeExecutor
In addition to the flight mode itself, we need an executor that manages the overall flight sequence, including arming, takeoff, executing our custom mode, returning to launch (RTL), and landing:
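A sketch of the executor declaration, modeled on the library's mode-with-executor example; the Settings flag for activating the owned mode immediately is an assumption and its exact form varies between library versions:

```cpp
#include <px4_ros2/components/mode_executor.hpp>

class DrawModeExecutor : public px4_ros2::ModeExecutorBase
{
public:
  enum class State { Arming, TakingOff, Drawing, RTL, WaitUntilDisarmed };

  DrawModeExecutor(rclcpp::Node & node, px4_ros2::ModeBase & owned_mode)
  : ModeExecutorBase(node, Settings{Settings::Activation::ActivateImmediately}, owned_mode),
    _node(node)
  {}

  void onActivate() override { runState(State::Arming, px4_ros2::Result::Success); }
  void onDeactivate(DeactivateReason reason) override { (void)reason; }

private:
  void runState(State state, px4_ros2::Result previous_result);

  rclcpp::Node & _node;
};
```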
An executor needs an owned_mode; when that mode is selected, the executor gains the ability to switch to other modes. In the constructor, we configure the custom flight mode to activate as soon as the ROS node is loaded, mainly because there currently isn’t an easy way to select a custom flight mode from QGC, as mentioned earlier. Then we move to our business logic in runState. When a step returns px4_ros2::Result::Success, it moves on to the next state.
Let’s take a closer look at runState:
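A sketch following the chained-callback pattern from the library's executor example; the helper calls (arm, takeoff, scheduleMode, rtl, waitUntilDisarmed) mirror that example, and the takeoff altitude is illustrative, so treat the details as assumptions:

```cpp
void DrawModeExecutor::runState(State state, px4_ros2::Result previous_result)
{
  if (previous_result != px4_ros2::Result::Success) {
    RCLCPP_ERROR(_node.get_logger(), "State %i: previous step failed, aborting",
      static_cast<int>(state));
    return;
  }

  switch (state) {
    case State::Arming:
      arm([this](px4_ros2::Result result) {runState(State::TakingOff, result);});
      break;
    case State::TakingOff:
      takeoff([this](px4_ros2::Result result) {runState(State::Drawing, result);}, 10.f);
      break;
    case State::Drawing:
      // Run our custom DrawFlightMode until it reports completion.
      scheduleMode(ownedMode().id(),
        [this](px4_ros2::Result result) {runState(State::RTL, result);});
      break;
    case State::RTL:
      rtl([this](px4_ros2::Result result) {runState(State::WaitUntilDisarmed, result);});
      break;
    case State::WaitUntilDisarmed:
      waitUntilDisarmed([this](px4_ros2::Result result) {
        (void)result;
        RCLCPP_INFO(_node.get_logger(), "Flight sequence complete");
      });
      break;
  }
}
```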
This code shows how the executor transitions between states based on the success or failure of previous operations. It handles arming, takeoff, executing our custom flight mode, returning to launch, and waiting for disarming.
That’s enough looking at the code. Let’s launch and see what happens in the sim. This video will go through the whole setup and simulation demonstration:
Example 2: Object Detection
In addition to custom flight modes, another powerful capability enabled by the PX4-ROS2 integration is object detection using computer vision. We can implement this by using YOLOv8, a state-of-the-art object detection model, in conjunction with ROS 2 and Gazebo simulation. This example is going to be much more streamlined.
Extending with Object Detection Capabilities
Head over to monemati/PX4-ROS2-Gazebo-YOLOv8 and clone the repo. This demo video will give you an idea of just what’s possible:
Once you complete the setup, you’ll be able to control a drone via keyboard directional keys while tracking an object in real-time.
The process involves:
Setting up a simulation environment with a drone equipped with a camera
Streaming camera frames to ROS 2 topics
Applying YOLOv8 for object detection on these frames
Controlling the drone based on detection results
The implementation uses Docker to simplify the setup, creating:
A MicroXRCEAgent instance for PX4-ROS2 communication
A custom Gazebo simulation with PX4
A ROS2 node for camera image bridging
Two Python scripts for object detection and keyboard control
ROS 2 and Gazebo Integration
The integration between ROS 2 and Gazebo is a key component. For the demo, you will need Gazebo Garden, which you can install as follows:
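These are the standard Gazebo Garden install steps for Ubuntu 22.04 (adding the OSRF package repository first):

```bash
sudo apt-get update && sudo apt-get install -y curl lsb-release gnupg
sudo curl https://packages.osrfoundation.org/gazebo.gpg --output /usr/share/keyrings/pkgs-osrf-archive-keyring.gpg
echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/pkgs-osrf-archive-keyring.gpg] http://packages.osrfoundation.org/gazebo/ubuntu-stable $(lsb_release -cs) main" | sudo tee /etc/apt/sources.list.d/gazebo-stable.list > /dev/null
sudo apt-get update
sudo apt-get install -y gz-garden
```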
Next, run the setup command for ROS:
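We take this to mean sourcing the ROS 2 Humble environment in the terminal that will run the bridge (check the repo's README if it expects something else):

```bash
source /opt/ros/humble/setup.bash
```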
And finally, we can set up our image bridge:
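Using the ros_gz image bridge on the /camera topic:

```bash
ros2 run ros_gz_image image_bridge /camera
```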
This all sets up a ROS 2 node that can read frames from the simulated camera in Gazebo and publish them to a ROS 2 topic called /camera.
Running the Python Files
In the repo, you will find two important Python files you will need to run separately.
Our first script, uav_camera_det.py, creates a ROS node that subscribes to the /camera topic, processes incoming video frames through a YOLOv8 object detection model, and displays the detection results visually with bounding boxes and labels. Give that a run in the command line.
The second script, keyboard-mavsdk-test.py, monitors keyboard inputs and translates them into movement commands, which are then transmitted to the drone using the MAVLink protocol via the MAVSDK library.
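Both scripts are run from the repo directory, in separate terminals (the exact Python environment and dependencies come from the repo's setup instructions):

```bash
# Terminal 1: YOLOv8 detection on the /camera stream.
python uav_camera_det.py

# Terminal 2: keyboard control via MAVSDK.
python keyboard-mavsdk-test.py
```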
Give It a Test Drive!
Now you should be able to run your simulation. If it has been set up correctly, you can use the directional keys to control your drone and you should be able to detect the truck on the track.

Conclusion
Through this tutorial, you've learned to combine PX4 and ROS 2 for advanced drone development. You've implemented a custom flight pattern (tracing the letter 'R') without modifying firmware and incorporated object detection using computer vision. These skills will now enable you to create sophisticated autonomous behaviors and open up entire worlds of possibilities that would be hard or even impossible with PX4 alone.