
Planet ROS

Planet ROS - http://planet.ros.org



ROS Discourse General: Is there a way to identify which network card the ros2 program data goes through (network port or WIFI)?

Is there a way to identify which network card the ros2 program data goes through (network port or WIFI)?

2 posts - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/is-there-a-way-to-identify-which-network-card-the-ros2-program-data-goes-through-network-port-or-wifi/49932

ROS Discourse General: No Meeting 2025-09-08 (Cloud Robotics Working Group)

The group will not be holding its regular meeting this coming Monday 8th September 2025. Meetings will resume as normal from the 22nd September.

Last meeting, the CRWG discussed the similarities and differences between the tech of the companies that had come to talk about Logging & Observability. The group agreed that we need to talk to consumers of these technologies, so effort will be made to contact some of the companies with large fleets to request they talk to us. If you’d like to see the meeting, the recording is available on YouTube.

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/no-meeting-2025-09-08-cloud-robotics-working-group/49924

ROS Discourse General: ROS 2 Rust Meeting: September 2025

The next ROS 2 Rust Meeting will be Mon, Sep 8, 2025 2:00 PM UTC

The meeting room will be at https://meet.google.com/rxr-pvcv-hmu

In the unlikely event that the room needs to change, we will update this thread with the new info!

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/ros-2-rust-meeting-september-2025/49923

ROS Discourse General: Curve and Cube Recognition with AgileX PIPER Robotic Arm

Hi everyone,

We’d like to share a simple program we developed for color block recognition and 3D curve extraction using the AgileX PIPER robotic arm. The project leverages OpenCV with depth and color data from a depth camera, enabling 3D coordinate extraction of color targets that can be applied to basic robotic grasping and trajectory tracking tasks.

We hope this example can serve as a starting point for anyone exploring computer vision–based manipulation with lightweight robotic arms. Feedback, improvements, and pull requests are always welcome!

Abstract

A simple program for color block recognition and 3D coordinate extraction, utilizing the OpenCV library. It performs recognition using depth information and color data from a depth camera, and the extracted 3D coordinates of color blocks can be applied to simple robotic arm grasping tasks.

Keywords: Color Recognition, 3D Curve Extraction, Curve Fitting, Curve Interpolation, AgileX PIPER

Code Repository

GitHub link: AgileX-College/piper/cubeAndLineDet

Function Demonstration

Teaching a Robot Arm to Spot Colors & Draw Curves

Preparation Before Use

Hardware Preparation

Software Environment Configuration

  1. Compile and install the PCL point cloud library. For Linux compilation examples, refer to the official documentation. The on_nurbs option needs to be specified during compilation.
  2. For PIPER manipulator driver deployment, refer to: piper_sdk
  3. For PIPER manipulator ROS control node deployment, refer to: piper_ros

Functional Operation Steps

1. Start the Color Block Detection Node

  1. Start the depth camera ROS driver node (example: Orbbec Petrel):
roslaunch astra_camera dabai_dc1.launch
  2. Start the color block detection node. Two image windows will pop up: hsv_image and origin_image. Adjust the HSV parameters in hsv_image to isolate a specific color, or click in origin_image to extract the target color automatically (a minimal sketch of this kind of HSV masking follows these steps):
rosrun cube_det cube_det

  3. Example of automatic color search:

  4. The 3D coordinate information of the detected point will be visualized in rviz:
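
The following is a rough, hedged sketch (not the cube_det source code) of the HSV masking and depth back-projection described above: threshold a color block in HSV, take the blob centroid, and lift it to a 3D point with pinhole intrinsics. The HSV bounds and camera intrinsics are illustrative placeholders.

# Minimal sketch (not the cube_det source): HSV-mask a color block and lift
# the blob centroid to a 3D point using depth and pinhole intrinsics.
# The HSV bounds and camera intrinsics below are illustrative placeholders.
import cv2
import numpy as np

def block_centroid_3d(bgr, depth_m, lower_hsv, upper_hsv, fx, fy, cx, cy):
    """Return (X, Y, Z) of the largest blob within the HSV range, or None."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower_hsv, upper_hsv)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    u, v = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
    z = float(depth_m[v, u])                 # depth at the centroid, in meters
    if z <= 0.0:
        return None
    x = (u - cx) * z / fx                    # pinhole back-projection
    y = (v - cy) * z / fy
    return x, y, z

# Example with synthetic data and placeholder intrinsics:
bgr = np.zeros((480, 640, 3), np.uint8)
bgr[200:240, 300:340] = (0, 0, 255)          # a red square
depth = np.full((480, 640), 0.8, np.float32)
print(block_centroid_3d(bgr, depth, (0, 120, 70), (10, 255, 255), 525.0, 525.0, 319.5, 239.5))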


2. Start the Curve Detection

  1. Start the depth camera driver:
roslaunch astra_camera dabai_dc1.launch
  2. Start the curve detection node. It uses a built-in curve fitter and interpolator to make the output curve continuous and smooth (a minimal fitting sketch follows at the end of this section):
rosrun cube_det line_det
  3. Click the target curve in origin_image to extract and search automatically.

  4. Open rviz to view the converted point cloud curve:

  5. Start the manipulator’s ROS control node:

# Find the robotic arm CAN port
./find_all_can_port.sh
# Connect to the CAN port
./can_activate.sh
  6. Start the manipulator motion node:
roslaunch piper start_single_piper.launch
  7. Start inverse kinematics. Reference: piper_ros Pinocchio README
python piper_pinocchio.py
  8. Set a manipulator home position:
rostopic pub /pos_cmd piper_msgs/PosCmd "{
x: -0.344,
y: 0.0,
z: 0.110,
roll: 0.0,
pitch: 0.0,
yaw: 0.0,
gripper: 0.0,
mode1: 1,
mode2: 0
}"
  9. Visualize the generated end-effector control path /line_path in rviz:


  10. Start the manipulator motion path loading and end-effector control program:

rosrun cube_det path_confer.py

path_confer.py supports three operations:

  1. After recording (r), the transformed path /transformed_cloud will be shown in rviz:

  2. Press s or p to control the manipulator to follow the generated trajectory.
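
As a rough illustration of the fitting and interpolation step mentioned above (this is not the line_det source code), a smoothing B-spline can be fit through noisy 3D samples of a curve and resampled densely so the resulting path is continuous and smooth. The point set below is synthetic.

# Minimal sketch (not the line_det source): fit a smoothing B-spline through
# noisy 3D points sampled along a curve, then resample it densely.
import numpy as np
from scipy.interpolate import splprep, splev

# Synthetic noisy samples along a 3D arc (stand-in for the extracted curve points).
t = np.linspace(0.0, np.pi, 25)
pts = np.stack([np.cos(t), np.sin(t), 0.1 * t]) + 0.01 * np.random.randn(3, t.size)

# Fit a parametric spline (s controls smoothing) and resample 200 points.
tck, _ = splprep([pts[0], pts[1], pts[2]], s=0.01)
u_fine = np.linspace(0.0, 1.0, 200)
x, y, z = splev(u_fine, tck)
smooth_path = np.stack([x, y, z], axis=1)  # 200 x 3 waypoints for the end effector
print(smooth_path.shape)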

Conclusion

This project demonstrates how the AgileX PIPER robotic arm can be combined with depth cameras and OpenCV to achieve:

All source code and setup instructions are available in the GitHub repository.

If you try this out or extend the functions, we’d love to hear your feedback and results. Please feel free to share your experiences or raise issues in the repo.

Thanks for reading, and happy coding! :rocket:

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/curve-and-cube-recognition-with-agilex-piper-robotic-arm/49919

ROS Discourse General: Introduction to ROS YouTube Video Series

Hi everyone! I’d like to let you know that I dove into ROS earlier this year and have been working on a “Getting Started” video series for DigiKey’s YouTube channel. There will be 12 videos, and they should be released weekly on DigiKey’s channel. I start off with basic communication between nodes (as you do) before diving into parameters, launch files, and some TF2. I mention talking to microcontrollers at the end, but I don’t get into micro-ROS (as I feel that should probably be its own series). I welcome feedback and suggestions for future videos!

The first video is here:

Introduction to ROS Part 1: What is the Robot Operating System? | DigiKey

2 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/introduction-to-ros-youtube-video-series/49907

ROS Discourse General: How to connect TurtleBot3 to Windows (WSL2)

I’m setting up a lab environment for my students and have replaced the Raspberry Pi 4 on the TurtleBot3 with a Raspberry Pi 5, so all logic now runs directly on the robot.

The challenge: most of my students use Windows PCs. I’d like to use WSL2 and a Docker container to connect to the TurtleBot3 via USB or over a Wi-Fi access point. However, I haven’t been able to get communication working between the TurtleBot3 and WSL2.

Is this even possible? Has anyone managed to get this setup running? I’d appreciate any guidance or tips.

2 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/how-to-connect-turtlebot3-to-windows-wsl2/49906

ROS Discourse General: Another MCP Server to analyze your rosbags with LLMs + a UI to benchmark it against different LLM providers

Hi!

We are releasing yet another MCP server for rosbags, and an MCP Lab UI so that you can see which LLMs (closed or open) are able to properly use the available tools.

Quick demos

With Claude Desktop
claude-desktop-trajectory-analysis

With MCP Lab running locally
mcp-lab-gpt-oss-tf-tree

2d-trajectory-claude

rosbags-mcp

The rosbags-mcp server allows LLMs to pull data from your rosbags and plot it in interactive plots with plotly. It is a pure Python package with no ROS dependencies (based on the rosbags library).
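
For readers unfamiliar with the underlying rosbags library, here is a minimal, hedged sketch of reading messages from a ROS 2 bag without a ROS installation; the bag path and topic name are placeholders.

# Minimal sketch using the rosbags library (no ROS installation required).
# The bag path and topic name below are placeholders.
from pathlib import Path
from rosbags.highlevel import AnyReader

with AnyReader([Path("my_bag")]) as reader:
    odom_conns = [c for c in reader.connections if c.topic == "/odom"]
    for conn, timestamp, rawdata in reader.messages(connections=odom_conns):
        msg = reader.deserialize(rawdata, conn.msgtype)
        print(timestamp, msg.pose.pose.position.x, msg.pose.pose.position.y)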

Some of the features include:

We also believe that there is value in creating domain-specific tools that are useful across ROS robots. We started with basic functionality:

We also decided to add a few tools to visualize data (even though we’d love to have foxglove integration, for example!)

Finally, you can also return an image with an LLM (if you are using a multi-modal model).

All this is not entirely new. For example, Bagel is an amazing tool that has been posted already here (release), here (update), and here (DuckDB update). However, we started working on this before we became aware of Bagel and thought it might have a place out there! We are working on other ROS-related MCP servers and hope to release more soon. We see a lot of potential, for example, in the DuckDB integration from Bagel.

MCP Lab Web UI

Together with the rosbags MCP server, we are releasing MCP-Lab, a Web UI where you can choose OpenAI, Anthropic, or open-source models provided by Groq (for now), to test their ability to call tools in any MCP server.

You can also run MCP Lab on your robot and access it remotely.

lidar-plot-gpt5-mini

velocities-claude

Benchmarking

We will release a short technical paper. In the meantime, here’s a preview of selected queries and whether different LLMs are able to use the MCP server properly or not:

Any feedback is welcome!
The MCP server and MCP Lab are both projects under active development so not everything will be working perfectly.

Contributors: @sahars93, @sisqui, @jopequ

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/another-mcp-server-to-analyze-your-rosbags-with-llms-a-ui-to-benchmark-it-against-different-llm-providers/49897

ROS Discourse General: Is it time to discuss rosidl?

TL;DR: I am of the opinion that some of the performance bottlenecks we still see in ROS 2 can be traced back to rosidl design. There are benefits to language-specific runtime representations and vendor-specific serialization formats, but efficiency is not one of them. Other designs may be better suited to the kind of data streams that are common in robotics. In that sense, GitHub - Ekumen-OS/flatros2 may be an interesting conversation starter.


Howdy! I don’t post often on ROS Discourse but I thought this may be worthwhile. The Physical AI rebranding of robotics is drawing attention and resources, and in that whirlwind I keep seeing new libraries and frameworks showcasing performance figures that seemingly obliterate those of ROS 2 (like dora-rs/dora-benchmark, but there are others). Why? The total amount of engineering resources invested by this community in ROS 2 far exceeds that of any other new project, and yet I still find myself second-guessing ros2 topic hz output. Well, I’ve been around and about for a good while now, and I have a hypothesis.

rosidl is one of the oldest corners of ROS 2. C and C++ message generators were first released with Alpha 1, each with their own runtime representations: simple enough, language-specific. The first few DDS-based middlewares (like opensplice) had their own vendor-specific IDL to comply with, for interoperability and full feature support, and so type support code and internal runtime representations had to be generated. A decade later ROS 2 is still bound by this design.

Zero-copy transports cannot cross the language boundary because there’s no common runtime representation, and because in-memory layouts are nonlocal, even for the same language their scope of application is extremely narrow (so narrow that not even standard messages qualify). Middlewares (de)serialize messages to vendor-specific formats that keep them traceable on their domain and afford us features like keyed topics, whether or not that makes sense for a given form of data. Repeated (de)serialization of images and pointclouds (and other forms of multi-dimensional data) certainly does not help speed.

I honestly didn’t know if there was a way out of this. Some of these shortcomings cannot be fixed out of tree. So I started Ekumen-OS/flatros2 with some Ekumen colleagues as an experiment. It turns out there are lots of pitfalls and limitations, but there is a way. Ekumen-OS/flatros2 is NOT it, however. A true solution (I believe) must be a core solution, and Ekumen-OS/flatros2 is just an exercise in what that future solution may look like, i.e. messages as language-specific views into contiguous memory layouts, bounded on write, unbounded on read. The choice of flatbuffers and iceoryx2 was informed but arbitrary. Both have interesting properties nonetheless.
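
As a toy illustration of that idea (this is not how flatros2 is implemented, just a sketch of a message acting as a typed view over one contiguous buffer), consider a structured numpy dtype over a shared byte buffer: a reader maps the same bytes without any per-field (de)serialization.

# Toy illustration (not flatros2): a "message" as a typed view over one
# contiguous buffer. A reader maps the same bytes without copying fields.
import numpy as np

image_msg = np.dtype([
    ("stamp_ns", np.uint64),
    ("height",   np.uint32),
    ("width",    np.uint32),
    ("data",     np.uint8, (480, 640)),
])

buf = bytearray(image_msg.itemsize)           # could live in shared memory instead
writer = np.frombuffer(buf, dtype=image_msg)  # writable, zero-copy view
writer["stamp_ns"][0] = 123456789
writer["height"][0], writer["width"][0] = 480, 640
writer["data"][0, :, :] = 7

reader = np.frombuffer(buf, dtype=image_msg)  # another zero-copy view of the same bytes
print(reader["stamp_ns"][0], reader["data"][0].mean())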

Hope this helps kickstart a discussion. There’s no fundamental reason why ROS 2 cannot perform near I/O limits. And who knows, maybe there’s enough momentum to sort out message compatibility too while we are at it (and I’d very much appreciate backward- and forward-compatible bags).

11 posts - 6 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/is-it-time-to-discuss-rosidl/49867

ROS Discourse General: Announcing Fast DDS Pro – Advanced Features for ROS 2 and Vulcanexus

Hello ROS 2 community,

We are excited to announce the upcoming release of Fast DDS Pro, the new commercial edition of Fast DDS.

Fast DDS Pro extends the capabilities of the community edition with advanced networking features designed for industrial and defense-grade deployments. These features can be used directly from ROS 2, or through Vulcanexus, our all-in-one ROS 2 toolset.

:rocket: Key features at launch:

:soon_arrow: Release planned for September 2025.

:pushpin: Learn more and sign up for updates here.

We believe this new edition will provide the ROS 2 community with tools to tackle more demanding networking challenges in production systems, while continuing to support the community with the open-source Fast DDS.

We’d love to hear your feedback and discuss use cases where these features could bring value to your projects.

2 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/announcing-fast-dds-pro-advanced-features-for-ros-2-and-vulcanexus/49862

ROS Discourse General: Join Us for the Next Virtual ROS Meetup in Nigeria (September 13)! 🇳🇬

Hello ROS Developers,

We’re excited to announce that we’ll be hosting another Virtual ROS Meetup in Nigeria!

Whether you’re just starting and want to learn how to set up a ROS 2 development environment without changing your PC’s OS, or you’re an enthusiast or expert eager to dive deeper into advanced topics like Kalman filters for robot localization, this session is for you.

We’ll be joined by two amazing speakers, both contributors to the great Nav2 project with extensive experience using and applying ROS 2:

Sakshay Mahna will demonstrate how to quickly set up a ROS 2 development environment using the ros2env VS Code Extension, which he developed :blush:.

Stevedan Ogochukwu Omodolor will guide us through the application of Kalman filters to fuse multiple sensors for improved robot localization :raising_hands: .

Kindly Register here: Robot Operating System (ROS) Meetup, Lagos

We look forward to seeing you there!

Venue: Virtual only

Date: Sat, Sep 13, 2025, 11:00 AM to 1:00 PM UTC

Contact Email Address: rosnaija.ng@gmail.com

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/join-us-for-the-next-virtual-ros-meetup-in-nigeria-september-13/49861

ROS Discourse General: Robot Steering & video streaming on new iOS App (Robot Steering)

I launched a new iOS app, “Robot Steering,” using the web_video_server & rosbridge_suite libraries.
Please download this free app to control your robot with video streaming features.

Robot Motion Capture

More description on Github

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/robot-steering-video-steaming-on-new-ios-app-robot-steering/49849

ROS Discourse General: RobotCAD integrated Dynamic World Generator

Added a “Create custom world” tool based on the Dynamic World Generator repo. It lets you create Gazebo worlds with static and dynamic obstacles directly in RobotCAD.

RobotCAD - Dynamic world creating — Video | VK - video demo

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/robotcad-integrated-dynamic-world-generator/49815

ROS Discourse General: ManyMove integration with Isaac SIM 5.0

ManyMove, an open-source framework built on ROS 2, MoveIt 2, and BehaviorTree.CPP, can now interact with NVIDIA Isaac Sim 5.0.

With this integration, you can:
:heavy_check_mark: Execute behavior-tree–based motion control directly in the physics simulation.
:heavy_check_mark: Prototype and validate manipulation tasks like pick-and-place in a high-fidelity digital twin.
:heavy_check_mark: Streamline the path from simulation to real-world deployment.

ManyMove is designed for industrial applications and offers a wide range of examples with single or double robot configuration.
Its command structure is similar to industrial robot languages, easing the transition from classic robotics frameworks to ROS 2.
Behavior Trees power the logic cycles for flexibility and clarity.


:movie_camera: Demos:

ManyMove_IsaacSim_interaction-BTs

Combining ManyMove with Isaac Sim’s physics and rendering and with Isaac ROS reference workflows aims to make it easier to deploy GPU-accelerated packages for perception-driven manipulation.


:tools: How to run the examples

  1. Install & source the Humble or Jazzy version of ManyMove by following the instructions in the GitHub README.

  2. Install Isaac Sim 5.0 and enable the ROS2 Simulation Control extension.

  3. Launch the example:

    ros2 launch manymove_bringup lite_isaac_sim_movegroup_fake_cpp_trees.launch.py
    
  4. Load .../manymove/manymove_planner/config/cuMotion/example_scene_lite6_grf_ros2.usd and start the simulation.

  5. Press START on ManyMove’s HMI


Tested on Ubuntu 22.04 with ROS 2 Humble and on Ubuntu 24.04 with ROS2 Jazzy.

Feedback welcome!

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/manymove-integration-with-isaac-sim-5-0/49814

ROS Discourse General: Anyone here going to SWITCH 2025 after ROSCon 2025?

Hey everyone!

I noticed that SWITCH Conference is happening the same week as ROSCon in Singapore. It is a major deep‑tech innovation and startup event that brings together founders, investors, corporates, and researchers focused on research commercialization…

I’d love to know if anyone in the ROS community has been there before, and if so, would you recommend going (as a robotics enthusiast and aspiring entrepreneur)?

Also, is anyone here planning to attend this year?

Thanks in advance for any thoughts or advice! :folded_hands:

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/anyone-here-going-to-switch-2025-after-roscon-2025/49805

ROS Discourse General: Guidance Nav2 for similar project: straight lines trajectory

Hi there,

I am currently working on a project similar to something showcased on the Nav2 website landing page. I am after recommendations for global planner and local planner/controller plugins to execute a trajectory made of parallel passes spaced by a given constant offset distance, similar to what is shown below (visible on the Nav2 website):


I was not sure which localization method was employed in the example above (GPS-based for outdoor use, as it seems close to an agricultural application?). I am also unsure who or which company is behind it (I would have liked to credit them for the illustration above), so I would be thankful if the community knows who produced it.

In my case, at the end of each straight pass (in blue), the robot could reposition via a curved trajectory, or pause and execute two turns in place for the repositioning pass. I do not need to maintain speed during the turns.

So far I have explored the straight-line global planner plugin as shown in the Nav2 example and trialed Regulated Pure Pursuit and Vector Pursuit as local planners/controllers. Are there other local planners I should consider for this type of application (straight line, then repositioning to the next straight line)? The goal is to maintain a precise trajectory on the straight passes while following an acceleration, cruise, deceleration speed profile. I am currently trialing these in Gazebo, simulating the environment where the robot will be deployed. I am able to generate the required waypoints via the Simple Commander API and am using AMCL for now, as my application is indoor.
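
For reference, a minimal, hedged sketch of the kind of Simple Commander waypoint-following setup described above (the poses, frame, and spacing are illustrative; the planner/controller plugins come from the Nav2 parameter file, not from this script):

# Minimal sketch of waypoint following via the Nav2 Simple Commander API.
# Poses are illustrative: two parallel passes spaced 0.5 m apart.
import rclpy
from geometry_msgs.msg import PoseStamped
from nav2_simple_commander.robot_navigator import BasicNavigator

def make_pose(nav, x, y):
    pose = PoseStamped()
    pose.header.frame_id = "map"
    pose.header.stamp = nav.get_clock().now().to_msg()
    pose.pose.position.x = x
    pose.pose.position.y = y
    pose.pose.orientation.w = 1.0
    return pose

rclpy.init()
nav = BasicNavigator()
nav.waitUntilNav2Active()

waypoints = [make_pose(nav, x, y)
             for x, y in [(0.0, 0.0), (5.0, 0.0), (5.0, 0.5), (0.0, 0.5)]]
nav.followWaypoints(waypoints)
while not nav.isTaskComplete():
    feedback = nav.getFeedback()   # e.g. current waypoint index
print(nav.getResult())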

Using Gazebo and AMCL, here are the kind of things I obtain via PlotJuggler:

First off, there are differences between AMCL and ground truth, so AMCL would require some improvement, but I can see that AMCL passes by each corner, which are waypoints, so from its own perception the robot reaches all waypoints. Second, the trajectory followed (ground truth) is within +/-10 cm of the target. I am not sure whether another or better-tuned controller can improve the accuracy by an order of magnitude; I would be aiming for about +/-1 cm at 1 m/s. This is a combination of localization and control method, so I am open to discussing whether I need to improve localization first (AMCL or something else, such as visual + IMU odometry, UWB, …). I am also using Humble, which does not have the Python API driveOnHeading, to trial a pure straight-line command (maybe similar to broadcasting a positive x component on /cmd_vel).

Thank you very much for your help and guidance,

Nicolas

PS: first post on my side, so apologies if this is the wrong platform, I was trying to find out about a specific Nav2 platform but could not join the Nav2 slack.

2 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/guidance-nav2-for-similar-project-straight-lines-trajectory/49766

ROS Discourse General: ROS 2 Desktop Apps with rclnodejs + Electron

Looking for an alternative to traditional ROS visualization tools? Want to create custom robotics interfaces that are modern, interactive, and cross-platform?

The rclnodejs + Electron combination offers a powerful solution for building ROS 2 desktop applications using familiar web technologies, giving you the flexibility to create exactly the interface your project needs.

Why Use This?

Architecture

Example: Turtle TF2 Demo

Real-time TF2 coordinate frame visualization with interactive 3D graphics:

turtle-tf2-demo

Perfect for: Robot dashboards, monitoring tools, educational apps, research prototypes.

Learn more: https://github.com/RobotWebTools/rclnodejs

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/ros-2-desktop-apps-with-rclnodejs-electron/49748

ROS Discourse General: Think Outside the Box: Controlling a Robotic Arm with a Mobile Phone Gyroscope

:waving_hand: Hi ROS Community,

We’d like to share an experimental project where we explored controlling a robotic arm using the gyroscope and IMU sensors of a mobile phone. By streaming the phone’s sensor data via WebSocket, performing attitude estimation with an EKF, and mapping the results to the robotic arm, we managed to achieve intuitive motion control without the need for traditional joysticks or external controllers.

This post documents the setup process, environment configuration, and code usage so you can try it out yourself or adapt it to your own robotic applications. :rocket:


Abstract

This project implements robotic arm control using mobile phone sensor data (accelerometer, IMU, magnetometer).
The data is transmitted from the mobile phone to a local Python script in real time via WebSocket.
After attitude calculation, the script controls the movement of the robotic arm.


Tags

Mobile Phone Sensor, Attitude Remote Control, IMU, Attitude Calculation, EKF


Code Repository

GitHub: Agilex-College/piper/mobilePhoneCtl


Function Demonstration

Robots by phone?!


Environment Configuration

git clone https://github.com/agilexrobotics/Agilex-College.git
cd Agilex-College/piper/mobilePhoneCtl/
pip install -r requirements.txt --upgrade

Mobile Phone App Installation

We recommend using Sensor Stream IMU+ (a paid app) for mobile phone–side data collection and transmission:


App Usage Instructions

  1. Open the Sensor Stream IMU+ app.
  2. Enter the IP address and port of the computer running this script (default port: 5000), e.g., 192.168.1.100:5000.
  3. Select sensors to transmit: Accelerometer, Gyroscope, Magnetometer.
  4. Set an update interval (e.g., 20 ms).
  5. Tap Start Streaming to begin data transmission.

Python Script Usage

  1. Connect the robotic arm and activate the CAN module:
sudo ip link set can0 up type can bitrate 1000000
  2. Run main.py in this directory:
python3 main.py
  3. The script will display the local IP address and port—fill this into the app.
  4. Once the app starts streaming, the script performs attitude calculation and sends the results to the robotic arm via the EndPoseCtrl interface of piper_sdk (a minimal receiver sketch follows these steps).
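
The following is a rough sketch (not the repository’s main.py) of receiving streamed IMU samples over WebSocket and estimating roll/pitch from the accelerometer; the JSON field names are assumptions, and the real app’s message format may differ.

# Minimal sketch (not the repository's main.py): receive streamed IMU samples
# over WebSocket and estimate roll/pitch from the accelerometer.
# The JSON field name "accelerometer" is an assumption about the app's format.
import asyncio
import json
import math

import websockets

async def handler(websocket):
    async for text in websocket:
        sample = json.loads(text)
        ax, ay, az = sample["accelerometer"]            # assumed field name
        roll = math.atan2(ay, az)                       # gravity-based tilt
        pitch = math.atan2(-ax, math.hypot(ay, az))
        print(f"roll={math.degrees(roll):6.1f} deg  pitch={math.degrees(pitch):6.1f} deg")
        # A real controller would fuse the gyroscope (e.g. with an EKF) and map
        # the attitude to an end-effector pose via piper_sdk's EndPoseCtrl.

async def main():
    async with websockets.serve(handler, "0.0.0.0", 5000):  # same port as in the app
        await asyncio.Future()                               # run forever

asyncio.run(main())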

Data Transmission & Control Explanation


Precautions


Related Projects

Check out these previous tutorials on controlling the AgileX PIPER robotic arm:


:white_check_mark: That’s all for this demo!
If you’re interested in experimenting with mobile-sensor-based robot control, feel free to check out the GitHub repo (again) and give it a try.

We’d love to hear your thoughts, suggestions, or ideas for extending this approach—whether it’s integrating ROS2 teleoperation, improving the EKF pipeline, or adding gesture control.

Looking forward to your feedback and discussions! :raising_hands:

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/think-outside-the-box-controlling-a-robotic-arm-with-a-mobile-phone-gyroscope/49747

ROS Discourse General: Don't trust LLM math with your rosbags? Bagel has a solution

Hi ROS Community,

A couple of weeks ago, we launched an open source tool called Bagel. It allows you to troubleshoot rosbags using natural language, much like you would use ChatGPT.

We released a new feature today to address a common concern raised by the ROS community.

:speaking_head: “We don’t trust LLM math.”:man_gesturing_no:

That’s feedback we kept hearing from the ROS community regarding Bagel.:bagel:

:ear: We heard you, so we built a solution: whenever Bagel needs to do a mathematical calculation on a robotics log, it now creates a DuckDB table and runs a clear SQL query against the data.

:brain: That makes Bagel’s math:

• Transparent – you see the query and verify it manually.

• Deterministic – no “LLM guesswork.”
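
As an illustration of that pattern (this is not Bagel’s internal code), a calculation over bag-derived rows can be expressed as a visible SQL query through DuckDB’s Python API; the table layout and values below are made up.

# Illustration of the "math as SQL" pattern (not Bagel's internal code):
# load bag-derived rows into DuckDB and compute statistics with an auditable query.
import duckdb

con = duckdb.connect()
con.execute("CREATE TABLE odom (t DOUBLE, vx DOUBLE, vy DOUBLE)")
con.executemany("INSERT INTO odom VALUES (?, ?, ?)",
                [(0.0, 0.50, 0.00), (0.1, 0.52, 0.01), (0.2, 0.48, 0.00)])

query = "SELECT avg(sqrt(vx * vx + vy * vy)) AS mean_speed FROM odom"
print(query)                         # the query itself stays visible and verifiable
print(con.execute(query).fetchone())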

We’re launching this today and would love community feedback :rocket: . Love it :heart: or hate it :-1: ? Let us know! Or yell at us in our Discord :rofl:.

https://github.com/Extelligence-ai/bagel

3 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/dont-trust-llm-math-with-your-rosbags-bagel-has-a-solution/49737

ROS Discourse General: Image_transport and point_cloud_transport improvements

Over the last few months we have been improving the point_cloud_transport and image_transport APIs, in particular to close some major ROS 2 feature gaps.

Related PRs:

  • https://github.com/ros-perception/image_common/pull/364
  • Support lifecycle node - NodeInterfaces by ahcorde · Pull Request #352 · ros-perception/image_common · GitHub
  • Removed deprecated code by ahcorde · Pull Request #356 · ros-perception/image_common · GitHub
  • Update subscriber filter by elsayedelsheikh · Pull Request #126 · ros-perception/point_cloud_transport · GitHub
  • Simplify NodeInterface API mehotd call by ahcorde · Pull Request #129 · ros-perception/point_cloud_transport · GitHub
  • Deprecated rmw_qos_profile_t by ahcorde · Pull Request #125 · ros-perception/point_cloud_transport · GitHub
  • Feat/Add LifecycleNode Support by elsayedelsheikh · Pull Request #109 · ros-perception/point_cloud_transport · GitHub

Some of these changes break ABI (only on Rolling). If you maintain a point_cloud_transport or image_transport plugin, you should probably update your code a little to remove the new deprecation warnings and avoid potential segfaults due to these changes.

Thank you to everyone involved in these changes, and we hope this helps!

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/image-transport-and-point-cloud-transport-improvements/49732

ROS Industrial: Industrial Calibration Refresh

A number of ease-of-use updates have been made to the ROS-Industrial Industrial Calibration repository, along with a significant update to the documentation. The updates include:

  1. Cleaning up the intrinsic and extrinsic calibration widgets so they now share common infrastructure, reducing total code.
  2. The widgets are now in the application main window, so they can more easily be used in RViz, where the image display cannot or does not need to be in the main widget.
  3. The update also includes a widget that separates the calibration configuration/data viewer from the results page. This was added to support usage in other contexts (like RViz) that are more limited on vertical space, and it makes both pages easier to read.
  4. An “instructions” action and tool bar button has been added to provide information on how to run the calibration apps.

A new documentation page has been created that includes information and examples to help users get the most out of the tools and achieve the most accurate hand-eye calibration. This includes an example and unit test for camera intrinsic calibration using the 10 x 10 modified circle grid data set.

The updated documentation includes a primer that covers the basics of calibration and a “getting started” page with guidance on building the application and the ROS 1 and ROS 2 interfaces, as well as links to the GUI applications. Docker support has also been provided for those who prefer to work from a Docker container.

We look forward to feedback on this latest update and hope the community finds it useful for providing robust industrial calibration for robotic perception systems and applications.

[WWW] https://rosindustrial.org/news/2025/8/22/industrial-calibration-refresh

ROS Discourse General: ros-python-wheels: pip-installable ROS 2 packages

Hello ROS community,

I’d like to share that I’ve been working on making ROS 2 packages pip-installable with a project called ros-python-wheels! Furthermore, a select number of ROS packages are made available on a Python package index that I’ve hosted at Cloudsmith.io.

Getting started with ROS 2 in Python is as simple as running:

pip install --extra-index-url https://dl.cloudsmith.io/public/ros-python-wheels/kilted/python/simple ros-rclpy[fastrtps]
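
Once installed, the usual rclpy workflow should apply. A minimal sketch, assuming the wheels expose the standard rclpy API and a working RMW (fastrtps, per the extra above):

# Minimal publisher sketch; assumes the pip-installed wheels provide the
# standard rclpy API and a working RMW implementation.
import rclpy
from rclpy.node import Node
from std_msgs.msg import String

class Talker(Node):
    def __init__(self):
        super().__init__("talker")
        self.pub = self.create_publisher(String, "chatter", 10)
        self.timer = self.create_timer(0.5, self.tick)

    def tick(self):
        msg = String()
        msg.data = "hello from pip-installed ROS 2"
        self.pub.publish(msg)

def main():
    rclpy.init()
    node = Talker()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()

if __name__ == "__main__":
    main()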

Key Benefits

This enables a first-class developer experience when working with ROS in Python projects:

Comparison

This approach provides a distinct alternative to existing solutions:


I’d love to hear your thoughts on this project. I’d also appreciate a star on my project if you find this useful or if you think this is a good direction for ROS!

4 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/ros-python-wheels-pip-installable-ros-2-packages/49688

ROS Discourse General: Announcement: rclrs 0.5.0 Release

We are thrilled to announce the latest official release of rclrs (0.5.0), the Rust client library for ROS 2! This latest version brings significant improvements and enhancements to rclrs, making it easier than ever to develop robotics applications in Rust.

Some of the highlights of this release include:

In addition to the new release of rclrs, we’re also happy to announce that rosidl_generator_rs is now released on the ROS buildfarm ( ROS Package: rosidl_generator_rs ), paving the way for shipping generated Rust messages alongside the current messages for C++ and Python.

I’ll be giving a talk at ROSCon UK in Edinburgh and at ROSCon in Singapore about ros2-rust, rclrs and all the projects that we’ve been working on to bring Rust support to ROS. Happy to chat with anyone interested in our work, or in Rust in ROS in general :slight_smile:

And if you want to be part of the development of the next release, you’re more than welcome to join us on our Matrix chat channel!

I can’t emphasize enough how much this release was a community effort; we’re happy to have people from so many different affiliations contributing to rclrs. This release wouldn’t have been possible without:

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/announcement-rclrs-0-5-0-release/49681

ROS Discourse General: [Project] ROS 2 MediaPipe Suite — parameterized multi-model node with landmarks & gesture events

We’re releasing a small ROS 2 suite that turns Google MediaPipe Tasks into reusable components.
A single parameterized node switches between hand / pose / face, publishes landmarks plus high-level gesture events, and ships with RViz visualization (overlay + MarkerArray) and a turtlesim demo for perception → event → behavior.

Newcomer note: This is my first ROS 2 package—I kept the setup minimal so students and prototypers can plug MediaPipe into ROS 2 in minutes.

Turtlesim Demo:

Repo: https://github.com/PME26Elvis/mediapipe_ros2_suite · ROS 2: Humble (CI), Jazzy (experimental) · License: Apache-2.0
CI: Humble — passing (required) · Jazzy — experimental (non-blocking)
Quick start:

ros2 run v4l2_camera v4l2_camera_node
ros2 launch mediapipe_ros2_py mp_node.launch.py model:=hand image_topic:=/image_raw start_rviz:=true
ros2 run turtlesim turtlesim_node & ros2 run mediapipe_ros2_py gesture_to_turtlesim

Feedback & contributions welcome.

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/project-ros-2-mediapipe-suite-parameterized-multi-model-node-with-landmarks-gesture-events/49680

ROS Discourse General: New packages and patch release for Jazzy Jalisco 2025-08-20

We’re happy to announce 82 new packages and 705 updates are now available on Ubuntu Noble on amd64 for Jazzy Jalisco.

This sync was tagged as jazzy/2025-08-20 .

:jazzy::jazzy::jazzy:

Package Updates for jazzy

Note that package counts include dbgsym packages which have been filtered out from the list below

Added Packages [82]:

Updated Packages [705]:

Removed Packages [1]:

Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:

4 posts - 4 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/new-packages-and-patch-release-for-jazzy-jalisco-2025-08-20/49667

ROS Discourse General: Continuous Trajectory Recording and Replay for AgileX PIPER Robotic Arm

Hi ROS Community,

I’m excited to share details about implementing continuous trajectory recording and replay for the AgileX PIPER robotic arm. This solution leverages time-series data to accurately replicate complex motion trajectories, with full code, usage guides, and step-by-step demos included. It’s designed to support teaching demonstrations and automated operations, and I hope it brings value to your projects.

Abstract

This article implements continuous trajectory recording and replay on the AgileX PIPER robotic arm. By recording and reproducing time-series data, it faithfully replicates the arm’s complex motion trajectories. In this article, we analyze the code implementation and provide complete code, usage guidelines, and step-by-step demonstrations.

Keywords

Trajectory control; Continuous motion; Time series; Motion reproduction; AgileX PIPER

Code Repository

github link: https://github.com/agilexrobotics/Agilex-College.git

Function Demonstration

From Code → to Motion 🤖 The Magic of Robotic Arm

1. Preparation Before Use

1.1. Preparation Work

1.2. Environment Configuration

2. Operation Steps for Continuous Trajectory Recording and Replay Function

2.1. Operation Steps

  1. Power on the robotic arm and connect the USB-to-CAN module to the computer (ensure that only one CAN module is connected).

  2. Open the terminal and activate the CAN module.

    sudo ip link set can0 up type can bitrate 1000000
    
  3. Clone the remote code repository.

    git clone https://github.com/agilexrobotics/Agilex-College.git
    
  4. Switch to the recordAndPlayTraj directory.

    cd Agilex-College/piper/recordAndPlayTraj/
    
  5. Run the recording program.

    python3 recordTrajectory_en.py
    
  6. Short-press the teach button to enter the teaching mode.

  7. Set the initial position of the robotic arm. After pressing Enter in the terminal, drag the robotic arm to record the trajectory.

  8. After recording, short-press the teach button again to exit the teaching mode.

  9. Precautions before replay:
    When exiting the teaching mode for the first time, a specific initialization process is required to switch from the teaching mode to the CAN mode. Therefore, the replay program will automatically perform a reset operation to return joints 2, 3, and 5 to safe positions (zero points) to prevent the robotic arm from suddenly falling due to gravity and causing damage. In special cases, manual assistance may be needed to return joints 2, 3, and 5 to zero points.

  10. Run the replay program.

    python3 playTrajectory_en.py
    
  11. After successful enabling, press Enter in the terminal to play the trajectory.

2.2. Recording Techniques and Strategies

Motion Planning Strategies:

Before starting the recording, the trajectory to be recorded should be planned:

  1. Starting Position Selection:

    • Select a safe position of the robotic arm as the starting point.
    • Ensure that the starting position is convenient for initialization during subsequent replay.
    • Avoid choosing a position close to the joint limit.
  2. Trajectory Path Design:

    • Plan a smooth motion path to avoid sharp direction changes.
    • Consider the kinematic constraints of the robotic arm to avoid singular positions.
    • Reserve sufficient safety margins to prevent collisions.
  3. Speed Control:

    • Maintain a moderate movement speed: fast enough to avoid an overly long recording, but slow enough to ensure recording quality.
    • Appropriately slow down at key positions to improve accuracy.
    • Avoid sudden acceleration or deceleration.

3. Problems and Solutions

Problem 1: No Piper Class


Reason: The currently installed SDK is not the version with API.

Solution: Execute pip3 uninstall piper_sdk to uninstall the current SDK, and then install the 1_0_0_beta version of the SDK according to the method in 1.2. Environment Configuration.

Problem 2: The Robotic Arm Does Not Move, and the Terminal Outputs an Error

Reason: The teach button was short-pressed during the program operation.

Solution: Check whether the indicator light of the teach button is off. If it is, re-run the program. If not, short-press the teach button to exit the teaching mode first and then run the program.

4. Implementation of Trajectory Recording Program

The trajectory recording program is the data collection module of the system, responsible for capturing the position information of the continuous joint movements of the robotic arm in the teaching mode.

4.1. Program Initialization and Configuration

4.1.1. Parameter Configuration Design

# Whether there is a gripper
have_gripper = True
# Maximum recording time in seconds (0 = unlimited, stop by terminating program)
record_time = 10.0
# Teach mode detection timeout in seconds
timeout = 10.0
# CSV file path for saving trajectory
CSV_path = os.path.join(os.path.dirname(__file__), "trajectory.csv")

Analysis of Configuration Parameters:

4.1.2. Robotic Arm Connection and Initialization

# Initialize and connect to robotic arm
piper = Piper("can0")
interface = piper.init()
piper.connect()
time.sleep(0.1)

Analysis of Connection Mechanism:

4.1.3. Position Acquisition and Data Storage

4.1.3.1. Position Acquisition Function
def get_pos():
    joint_state = piper.get_joint_states()[0]
    if have_gripper:
        '''Get current joint angles and gripper opening distance'''
        return joint_state + (piper.get_gripper_states()[0][0], )
    return joint_state
4.1.3.2. Position Change Detection
current_pos = get_pos()
if current_pos != last_pos:  # Record only when position changes
    wait_time = round(time.time() - last_time, 4)
    print(f"INFO: Wait time: {wait_time:0.4f}s, current position: {current_pos}")
    csv.write(f"{wait_time}," + ",".join(map(str, current_pos)) + "\n")
    last_pos = current_pos
    last_time = time.time()

Position Processing:

Time Processing:

4.1.4. Mode Detection and Switching

print("step 1: Press teach button to enter teach mode")
while interface.GetArmStatus().arm_status.ctrl_mode != 2:
    over_time = time.time() + timeout
    if over_time < time.time():
        print("ERROR: Teach mode detection timeout. Please check if teach mode is enabled")
        exit()
    time.sleep(0.01)

Status Polling Strategy:
The program uses the polling method to detect the control mode. This method has the following characteristics:

Timeout Protection Mechanism:
The 10-second timeout setting takes into account the needs of actual operations:

Safety Features of Teaching Mode:

4.1.5. Data Storage

csv = open(CSV_path, "w")
# ... Recording loop ...
csv.write(f"{wait_time}," + ",".join(map(str, current_pos)) + "\n")
# ... End of recording ...
csv.close()

Data Integrity Guarantee:
After each recording, the data is immediately written to the file, and the buffer is refreshed to ensure that the data will not be lost due to abnormal program exit.

Data Format Selection:
Reasons for Choosing CSV Format for Data Storage:

Data Column Attributes:
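
Based on the recording code above, each line of trajectory.csv holds the wait time since the previous sample, the six joint angles (radians, consistent with the safety-limit comparison in the replay program), and, if a gripper is present, the gripper opening, giving the 8 columns implied by the len(pos) == 8 check. An illustrative parse (the numeric values are made up):

# Illustrative trajectory.csv row: wait_time, six joint angles, gripper opening.
# Values are made up; the column layout follows the recording/replay code above.
row = "0.0213,0.012,-0.351,0.704,0.001,0.298,0.005,0.042"
wait_time, *rest = [float(v) for v in row.split(",")]
joints, gripper = rest[:-1], rest[-1]
print(wait_time, joints, gripper)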

4.1.6. Recording Control Logic

input("step 2: Press Enter to start recording trajectory")
csv = open(CSV_path, "w")
last_pos = get_pos()
last_time = time.time()
over_time = last_time + record_time
while record_time == 0 or time.time() < over_time:
    # Recording logic
    time.sleep(0.01)
csv.close()

Time Control Strategy:
The system supports two recording modes:

  1. Timed recording: when record_time > 0, the recording stops automatically after the specified duration.
  2. Infinite recording: When record_time == 0, the program needs to be manually closed.

The flexibility of this design:

4.1.7. Complete Code Implementation of Trajectory Recording Program

#!/usr/bin/env python3
# -*-coding:utf8-*-
# Record continuous trajectory
import os, time
from piper_sdk import *

if __name__ == "__main__":
    # Whether there is a gripper
    have_gripper = True
    # Maximum recording time in seconds (0 = unlimited, stop by terminating program)
    record_time = 10.0
    # Teach mode detection timeout in seconds
    timeout = 10.0
    # CSV file path for saving trajectory
    CSV_path = os.path.join(os.path.dirname(__file__), "trajectory.csv")

    # Initialize and connect to robotic arm
    piper = Piper("can0")
    interface = piper.init()
    piper.connect()
    time.sleep(0.1)

    def get_pos():
        joint_state = piper.get_joint_states()[0]
        if have_gripper:
            '''Get current joint angles and gripper opening distance'''
            return joint_state + (piper.get_gripper_states()[0][0], )
        return joint_state

    print("step 1: Press teach button to enter teach mode")
    over_time = time.time() + timeout
    while interface.GetArmStatus().arm_status.ctrl_mode != 2:
        if over_time < time.time():
            print("ERROR: Teach mode detection timeout. Please check if teach mode is enabled")
            exit()
        time.sleep(0.01)

    input("step 2: Press Enter to start recording trajectory")
    csv = open(CSV_path, "w")
    last_pos = get_pos()
    last_time = time.time()
    over_time = last_time + record_time

    while record_time == 0 or time.time() < over_time:
        current_pos = get_pos()
        if current_pos != last_pos:  # Record only when position changes
            wait_time = round(time.time() - last_time, 4)
            print(f"INFO: Wait time: {wait_time:0.4f}s, current position: {current_pos}")
            csv.write(f"{wait_time}," + ",".join(map(str, current_pos)) + "\n")
            last_pos = current_pos
            last_time = time.time()
        time.sleep(0.01)

    csv.close()
    print("INFO: Recording complete. Press teach button again to exit teach mode")

5. Implementation of Trajectory Replay Program

The trajectory replay program is the execution module of the system, responsible for reading recorded position data and controlling the robotic arm to reproduce these positions.

5.1. Parameter Configuration and Data Loading

Replay Parameter Configuration:

# replay times (0 means infinite loop)
play_times = 1
# replay interval in seconds
play_interval = 1.0
# Motion speed percentage (recommended range: 10-100)
move_spd_rate_ctrl = 100
# replay speed multiplier (recommended range: 0.1-2)
play_speed = 1.0

Play times control:
The play_times parameter supports three modes:

Dual Mechanism for Speed Control:
The system provides two speed control methods:

  1. move_spd_rate_ctrl: Controls the overall movement speed of the robotic arm.
  2. play_speed: Controls the time scaling of trajectory replay (see the worked example below).
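
A quick worked example of how these two controls interact (numbers are illustrative):

# Illustrative only: how the two speed controls interact during replay.
move_spd_rate_ctrl = 50      # each segment is executed at 50% of the arm's max speed
play_speed = 2.0             # recorded time gaps are compressed 2x
recorded_gap = 0.50          # seconds between two recorded samples
sleep_between_points = recorded_gap / play_speed
print(sleep_between_points)  # 0.25 s, matching time.sleep(pos[0] / play_speed)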

Advantages of this dual control:

Role of Play Interval:
The play_interval parameter plays an important role in continuous replay:

5.2. Data Loading: Reading Data Files

try:
    with open(CSV_path, 'r', encoding='utf-8') as f:
        track = list(csv.reader(f))
        if not track:
            print("ERROR: Trajectory file is empty")
            exit()
        track = [[float(j) for j in i] for i in track]  # Convert to float lists
except FileNotFoundError:
    print("ERROR: Trajectory file not found")
    exit()

Exception Handling Strategy:
The program adopts a comprehensive exception handling mechanism to cover common file operation errors:

Data type conversion: The program uses list comprehensions to convert string data to floating-point numbers.

5.3. Safety Stop Function

def stop():
    '''Stop robotic arm; must call this function when first exiting teach mode before using CAN mode'''
    interface.EmergencyStop(0x01)
    time.sleep(1.0)
    limit_angle = [0.1745, 0.7854, 0.2094]  # Arm only restored when joints 2,3,5 are within safe range
    pos = get_pos()
    while not (abs(pos[1]) < limit_angle[0] and abs(pos[2]) < limit_angle[0] and pos[4] < limit_angle[1] and pos[4] > limit_angle[2]):
        time.sleep(0.01)
        pos = get_pos()
    # Restore arm
    piper.disable_arm()
    time.sleep(1.0)

Staged Stop Strategy:
The stop function adopts a phased safety stop strategy:

  1. Emergency stop phase: EmergencyStop(0x01) sends an emergency stop command to immediately halt all joint movements (joints with impedance).
  2. Safe position waiting: Waits for key joints (joints 2, 3, and 5) to move within the safe range.
  3. System recovery phase: Sends a recovery command to reactivate the control system.

Safety Range Design:
The program focuses on the positions of joints 2, 3, and 5 based on the mechanical structure characteristics of the PIPER robotic arm:

The safety angle ranges (10°, 45°, 12°) are set based on the following considerations:

Real-time monitoring mechanism: The program uses real-time polling to monitor joint positions, ensuring the next operation is performed only when safety conditions are met.

5.4. System Enable Function

def enable():
    while not piper.enable_arm():
        time.sleep(0.01)
    if have_gripper:
        interface.ModeCtrl(0x01, 0x01, move_spd_rate_ctrl, 0x00)
        piper.enable_gripper()
        time.sleep(0.01)
    print("INFO: Enable successful")

Robotic Arm Enable: enable_arm()

Gripper Enable: enable_gripper()

Control Mode Settings:
Parameters for ModeCtrl(0x01, 0x01, move_spd_rate_ctrl, 0x00):

5.5. Replay Control Logic

count = 0
input("step 2: Press Enter to start trajectory replay")
while play_times == 0 or abs(play_times) != count:
    for n, pos in enumerate(track):
        piper.move_j(pos[1:-1], move_spd_rate_ctrl)
        if have_gripper and len(pos) == 8:
            piper.move_gripper(pos[-1], 1)
        print(f"INFO: replay #{count + 1}, wait time: {pos[0] / play_speed:0.4f}s, target position: {pos[1:]}")
        if n == len(track) - 1:
            time.sleep(play_interval)
        else:
            time.sleep(pos[0] / play_speed)
    count += 1

Joint Control: move_j()

Gripper Control: move_gripper()

Replay Speed Adjustment Mechanism:
pos[0] / play_speed enables dynamic adjustment of trajectory replay speed:

Advantages of this implementation:

5.6. Complete Code of the Trajectory Replay Program

#!/usr/bin/env python3
# -*-coding:utf8-*-
# Play continuous trajectory
import os, time, csv
from piper_sdk import *

if __name__ == "__main__":
    # Whether there is a gripper
    have_gripper = True
    # replay times (0 means infinite loop)
    play_times = 1
    # replay interval in seconds
    play_interval = 1.0
    # Motion speed percentage (recommended range: 10-100)
    move_spd_rate_ctrl = 100
    # replay speed multiplier (recommended range: 0.1-2)
    play_speed = 1.0
    # CAN mode switch timeout in seconds
    timeout = 5.0
    # CSV file path for saved trajectory
    CSV_path = os.path.join(os.path.dirname(__file__), "trajectory.csv")

    # Read trajectory file
    try:
        with open(CSV_path, 'r', encoding='utf-8') as f:
            track = list(csv.reader(f))
            if not track:
                print("ERROR: Trajectory file is empty")
                exit()
            track = [[float(j) for j in i] for i in track]  # Convert to float lists
    except FileNotFoundError:
        print("ERROR: Trajectory file not found")
        exit()

    # Initialize and connect to robotic arm
    piper = Piper("can0")
    interface = piper.init()
    piper.connect()
    time.sleep(0.1)

    def get_pos():
        joint_state = piper.get_joint_states()[0]
        if have_gripper:
            '''Get current joint angles and gripper opening distance'''
            return joint_state + (piper.get_gripper_states()[0][0], )
        return joint_state

    def stop():
        '''Stop robotic arm; must call this function when first exiting teach mode before using CAN mode'''
        interface.EmergencyStop(0x01)
        time.sleep(1.0)
        limit_angle = [0.1745, 0.7854, 0.2094]  # Arm only restored when joints 2,3,5 are within safe range
        pos = get_pos()
        while not (abs(pos[1]) < limit_angle[0] and abs(pos[2]) < limit_angle[0] and pos[4] < limit_angle[1] and pos[4] > limit_angle[2]):
            time.sleep(0.01)
            pos = get_pos()
        # Restore arm
        piper.disable_arm()
        time.sleep(1.0)

    def enable():
        while not piper.enable_arm():
            time.sleep(0.01)
        if have_gripper:
            interface.ModeCtrl(0x01, 0x01, move_spd_rate_ctrl, 0x00)
            piper.enable_gripper()
            time.sleep(0.01)
        print("INFO: Enable successful")

    print("step 1: Ensure robotic arm has exited teach mode before replay")
    if interface.GetArmStatus().arm_status.ctrl_mode != 1:
        over_time = time.time() + timeout
        stop()  # Required when first exiting teach mode
        while interface.GetArmStatus().arm_status.ctrl_mode != 1:
            if over_time < time.time():
                print("ERROR: CAN mode switch failed. Please confirm teach mode is exited")
                exit()
            interface.ModeCtrl(0x01, 0x01, move_spd_rate_ctrl, 0x00)
            time.sleep(0.01)
    enable()

    count = 0
    input("step 2: Press Enter to start trajectory replay")
    while play_times == 0 or abs(play_times) != count:
        for n, pos in enumerate(track):
            piper.move_j(pos[1:-1], move_spd_rate_ctrl)
            if have_gripper and len(pos) == 8:
                piper.move_gripper(pos[-1], 1)
            print(f"INFO: replay #{count + 1}, wait time: {pos[0] / play_speed:0.4f}s, target position: {pos[1:]}")
            if n == len(track) - 1:
                time.sleep(play_interval)  # Final point delay
            else:
                time.sleep(pos[0] / play_speed)  # Point-to-point delay
        count += 1

6. Summary

The above implements continuous trajectory recording and replay on the AgileX PIPER robotic arm. Using the Python SDK, the arm’s trajectory can be recorded and executed repeatedly, providing strong technical support for teaching demonstrations and automated operations.

If you have any questions or feedback about this implementation, feel free to share in the comments. Let’s discuss and improve it together! You can also contact us directly at support@agilex.ai for further assistance.

Thanks,
AgileX Robotics

2 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/continuous-trajectory-recording-and-replay-for-agilex-piper-robotic-arm/49656

