August 08, 2025
ROS 2 Rust Meeting: August 2025

The next ROS 2 Rust Meeting will be Mon, Aug 11, 2025 2:00 PM UTC

The meeting room will be at https://meet.google.com/rxr-pvcv-hmu

In the unlikely event that the room needs to change, we will update this thread with the new info!

With the recent announcement about OSRF funding for adding Cargo dependency management to the buildfarm, and a few people having questions on that, I would like to reiterate that this meeting is open to everyone - working group member or not. If you want to learn what we’re trying to accomplish, please drop by! We’d love to have you!

1 post - 1 participant

Read full topic

by jhdcs on August 08, 2025 11:45 AM

August 06, 2025
ROS 2 Cross-compilation / Multi architecture development

Hi,

I’m in the process of looking into migrating our indoor service robot from an amd64 based system to the Jetson Orin Nano.

How are you doing development when targeting aarch64/arm64 machines?

My development machine is not the newest, but reasonably powerful (AMD Ryzen 9 3900X, 32 GB RAM), yet it struggles with the officially recommended QEMU-based approach. Even the vanilla osrf/ros Docker image is choppy under emulation; building the actual image or stack, or running a simulated environment, is totally out of the question.

The different pathways I investigated so far are:

  • Using QEMU emulation - unusable

  • Using the target platform as the development machine - slow build, but reasonable runtime performance

  • Cloud building the development container - a bit pricey, and the question of building the actual stack still remains. Maybe CMake cross-compilation in a native container.

  • Using Apple Silicon for development - haven’t looked into it

I’m interested in your approach to this problem. I imagine that using ARM-based systems in production robots is fairly common practice given the recent advances in this field.
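For the CMake cross-compilation pathway mentioned above, a minimal toolchain file is often the starting point. This is a generic sketch: the sysroot path and compiler names are assumptions (they match the Ubuntu `gcc-aarch64-linux-gnu` packages), not a recommendation specific to the Orin Nano.

```cmake
# aarch64-toolchain.cmake -- illustrative cross-compilation toolchain file.
set(CMAKE_SYSTEM_NAME Linux)
set(CMAKE_SYSTEM_PROCESSOR aarch64)

# Distro cross-compilers (from the gcc-aarch64-linux-gnu package).
set(CMAKE_C_COMPILER aarch64-linux-gnu-gcc)
set(CMAKE_CXX_COMPILER aarch64-linux-gnu-g++)

# Example sysroot copied from the target; adjust to your setup.
set(CMAKE_SYSROOT /opt/aarch64-sysroot)

# Find headers/libraries only in the sysroot, never in host paths.
set(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)
set(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)
set(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)
set(CMAKE_FIND_ROOT_PATH_MODE_PACKAGE ONLY)
```

It would then be passed through to the build with something like `colcon build --cmake-args -DCMAKE_TOOLCHAIN_FILE=$PWD/aarch64-toolchain.cmake`, keeping the compile itself native-speed while only the target runs aarch64 binaries.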

7 posts - 6 participants

Read full topic

by emilnovak on August 06, 2025 12:35 PM

Why do robotics companies choose not to contribute to open source?

Hi all!

We wrote a blog post at Henki Robotics to share some of our thoughts on open-source collaboration, based on what we’ve seen and learned so far. We thought that it would be interesting for the community to hear and discuss the challenges open-source contributions pose from a company standpoint, while also highlighting the benefits of doing so and encouraging more companies to collaborate together.

We’d be happy to hear your thoughts and if you’ve had similar experiences!

1 post - 1 participant

Read full topic

by jak on August 06, 2025 12:22 PM

August 05, 2025
A Dockerfile and a systemd service for starting a rmw-zenoh server

While there’s no official method for autostarting the rmw-zenoh server, this might be useful:
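For comparison, a minimal sketch of such a systemd unit, assuming a ROS 2 Jazzy install and the `rmw_zenohd` router executable shipped by `rmw_zenoh_cpp` (paths are examples, adjust to your installation):

```ini
# /etc/systemd/system/rmw-zenohd.service -- illustrative sketch.
[Unit]
Description=rmw_zenoh router
After=network-online.target
Wants=network-online.target

[Service]
Type=simple
# Source the ROS environment, then start the Zenoh router.
ExecStart=/bin/bash -c "source /opt/ros/jazzy/setup.bash && ros2 run rmw_zenoh_cpp rmw_zenohd"
Restart=on-failure
RestartSec=2

[Install]
WantedBy=multi-user.target
```

Enable it with `sudo systemctl enable --now rmw-zenohd.service`.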

4 posts - 2 participants

Read full topic

by xopxe on August 05, 2025 08:12 PM

August 04, 2025
How to Implement End-to-End Tracing in ROS 2 (Nav2) with OpenTelemetry for Pub/Sub Workflows?

I’m working on implementing end-to-end tracing for robotic behaviors using OpenTelemetry (OTel) in ROS 2. My goal is to trace:

  1. High-level requests (e.g., “move to location”) across components to analyze latency

  2. Control commands (e.g., teleop) through the entire pipeline to motors

Current Progress:

  • Successfully wrapped ROS 2 Service and Action servers to generate OTel traces

  • Basic request/response flows are visible in tracing systems

Challenges with Nav2:

  • Nav2 heavily uses pub/sub patterns where traditional instrumentation falls short

  • Difficult to maintain context propagation across:

    • Multiple subscribers processing the same message

    • Chained topic processing (output of one node becomes input to another)

    • Asynchronous publisher/subscriber relationships

Questions:

  1. Are there established patterns for OTel context propagation in ROS 2 pub/sub systems?

  2. How should we handle fan-out scenarios (1 publisher → N subscribers)?

  3. Any Nav2-specific considerations for tracing (e.g., lifecycle nodes, behavior trees)?

  4. Alternative approaches besides OTel that maintain compatibility with observability tools?
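On question 1, one pattern (a sketch of the general idea, not an established ROS 2 or OTel API) is to carry a W3C Trace Context `traceparent` string in a spare field of your own message definition, and re-parent it on each subscriber. The field name and helpers below are invented for illustration:

```python
import random

# Illustrative W3C Trace Context propagation through a pub/sub message,
# assuming your .msg carries a spare string field (e.g. 'traceparent').
# Not tied to any OTel SDK; the format is the standard traceparent header.

def make_traceparent(trace_id=None, span_id=None):
    """Build a W3C traceparent value: version-trace_id-span_id-flags."""
    trace_id = trace_id or f"{random.getrandbits(128):032x}"
    span_id = span_id or f"{random.getrandbits(64):016x}"
    return f"00-{trace_id}-{span_id}-01"

def extract(traceparent):
    """Parse a traceparent value back into (trace_id, span_id)."""
    _version, trace_id, span_id, _flags = traceparent.split("-")
    return trace_id, span_id

def child_traceparent(parent):
    """Subscriber side: keep the trace_id, open a new span under it."""
    trace_id, _ = extract(parent)
    return make_traceparent(trace_id=trace_id)
```

For the fan-out case (question 2), each of the N subscribers calls `child_traceparent` on the same incoming value, yielding N sibling spans that share one trace_id; for chained topics, the republished message carries the child's value so the chain stays in one trace.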

2 posts - 2 participants

Read full topic

by lcmasdf on August 04, 2025 06:44 PM

Space ROS Jazzy 2025.07.0 Release

Hello ROS community!

The Space ROS team is excited to announce Space ROS Jazzy 2025.07.0 was released last week and is available as osrf/space-ros:jazzy-2025.07.0 on DockerHub.

Release details

This release includes a significant refactor of our base image’s build, making the main container over 60% smaller! Additionally, development images are now pushed to DockerHub to make building with Space ROS as an underlay easier than ever. For an exhaustive list of all the issues addressed and PRs merged, check out the GitHub Project Board for this release here.

Code

Current versions of all packages released with Space ROS are available at:

What’s Next

This release comes 3 months after the last release. The next release is planned for October 31, 2025. If you want to contribute to features, tests, demos, or documentation of Space ROS, get involved on the Space ROS GitHub issues and discussion board.

All the best,

The Space ROS Team

1 post - 1 participant

Read full topic

by bkempa on August 04, 2025 03:28 PM

Bagel, the Open Source Project | Guest Speakers Arun Venkatadri and Shouheng Yi | Cloud Robotics WG Meeting 2025-08-11

Please come and join us for this coming meeting from Mon, Aug 11, 2025 4:00 PM UTC to 5:00 PM UTC,
where guest speakers Arun Venkatadri and Shouheng Yi will be presenting Bagel. Bagel is a new open source project that lets you chat with your robotics data by using AI to search through recorded data. Bagel was recently featured in ROS News for the Week, and there’s a follow-up post giving more detail.

Last meeting, we tried out the service from Heex Technologies, which allows you to deploy agents to your robots or search through recorded data for set events. The software then records data around those events and uploads it to the cloud, allowing you to view events from your robots. If you’d like to see the meeting, it is available on YouTube.

The meeting link for next meeting is here, and you can sign up to our calendar or our Google Group for meeting notifications or keep an eye on the Cloud Robotics Hub.

Hopefully we will see you there!

2 posts - 2 participants

Read full topic

by mikelikesrobots on August 04, 2025 09:12 AM

August 01, 2025
What if your Rosbags could talk? Meet Bagel🥯, the open-source tool we just released!

Huge thanks to @Katherine_Scott and @mrpollo for hosting us at the Joint ROS / PX4 Meetup at Neros in El Segundo, CA! It was an absolute blast connecting with the community in person!

:backhand_index_pointing_down: Missed the demo? No worries! Here’s the scoop on what we unveiled (we showed it with PX4 ULogs, but yes, ROS 2 and ROS 1 are fully supported!)


The problem? We felt the pain of wrestling with robotics data and LLMs. Unlike PDF files, we’re talking about massive sensor arrays, complex camera feeds, dense LiDAR point clouds – making LLMs truly useful here has been a real challenge… at least for us.

The solution? Meet Bagel ( GitHub - shouhengyi/bagel: Bagel is ChatGPT for physical data. Just ask questions. No Fuss. )! We built this powerful open-source tool to bridge that gap. Imagine simply asking questions about your robotics data, instead of endless parsing and plotting.

With Bagel, loaded with your ROS2 bag or PX4 ULog, you can ask things like:

  • “Is this front left camera calibrated?”
  • “Were there any hard decelerations detected in the IMU data?”

Sound like something that could change your workflow? We’re committed to building Bagel in the open, with your help! This is where you come in:

  • Dive In! Clone the repo, give Bagel a spin, and tell us what you think.
  • Speak Your Mind! Got an idea? File a feature request. Your insights are crucial to Bagel’s evolution.
  • Code with Us! Open a PR and become a core contributor. Let’s build something amazing together.
  • Feeling the Love? If Bagel sparks joy (or solves a big headache!), please consider giving us a star on GitHub :star:. It’s a huge motivator!

Thanks a lot for being part of this journey. Happy prompting!

1 post - 1 participant

Read full topic

by shouheng on August 01, 2025 08:34 AM

July 31, 2025
ROS Naija LinkedIn Group

:rocket: Exciting News for Nigerian Roboticists!

We now have a ROS Naija Community group on LinkedIn, a space for engineers, developers, and enthusiasts passionate about ROS (Robot Operating System) and robotics.

Whether you’re a student, hobbyist, researcher, or professional, this is the place to:
:robot: Connect with like-minded individuals
:books: Share knowledge, resources, and opportunities
:light_bulb: Collaborate on robotics and ROS-based projects
:brain: Ask questions and learn from others in the community

If you’re interested in ROS and robotics, you’re welcome to join:

:link: Join here: LinkedIn Login, Sign in | LinkedIn

Let’s build and grow the Nigerian robotics ecosystem together!

#ROS #robotics #ROSNaija #NigeriaTech #Engineering #ROSCommunity #RobotOperatingSystem

1 post - 1 participant

Read full topic

by Davis_Ogunsina on July 31, 2025 07:16 PM

[Case Study] Cross-Morphology Policy Learning with UniVLA and PiPER Robotic Arm

We’d like to share a recent research project where our AgileX Robotics PiPER 6-DOF robotic arm was used to validate UniVLA, a novel cross-morphology policy learning framework developed by the University of Hong Kong and OpenDriveLab.

Paper: Learning to Act Anywhere with Task-Centric Latent Actions
arXiv: [2505.06111] UniVLA: Learning to Act Anywhere with Task-centric Latent Actions
Code: GitHub - OpenDriveLab/UniVLA: [RSS 2025] Learning to Act Anywhere with Task-centric Latent Actions


Motivation

Transferring robot policies across platforms and environments is difficult due to:

  • High dependence on manually annotated action data
  • Poor generalization between different robot morphologies
  • Visual noise (camera motion, background movement) causing instability

UniVLA addresses this by learning latent action representations from videos, without relying on action labels.


Framework Overview

UniVLA introduces a task-centric, latent action space for general-purpose policy learning. Key features include:

  • Cross-hardware and cross-environment transfer via a unified latent space
  • Unsupervised pretraining from video data
  • Lightweight decoder for efficient deployment

Figure 2: Overview of the UniVLA framework. Visual-language features from third-view RGB and the task instruction are tokenized and passed through an auto-regressive transformer, generating latent actions which are decoded into executable actions across heterogeneous robot morphologies.


PiPER in Real-World Experiments

To validate UniVLA’s transferability, the researchers selected the AgileX PiPER robotic arm as the real-world testing platform.

Tasks tested:

  1. Store a screwdriver
  2. Clean a cutting board
  3. Fold a towel twice
  4. Stack the Tower of Hanoi

These tasks evaluate perception, tool use, non-rigid manipulation, and semantic understanding.


Experimental Results

  • Average performance improved by 36.7% over baseline models
  • Up to 86.7% success rate on semantic tasks (e.g., Tower of Hanoi)
  • Fine-tuned with only 20–80 demonstrations per task
  • Evaluated using a step-by-step scoring system


About PiPER

PiPER is a 6-DOF lightweight robotic arm developed by AgileX Robotics. Its compact structure, ROS support, and flexible integration make it ideal for research in manipulation, teleoperation, and multimodal learning.

Learn more: PiPER
Company website: https://global.agilex.ai

Click the link below to watch the experiment video using PiPER:

🚨 Our PiPER robotic arm was featured in cutting-edge robotics research!


Collaborate with Us

At AgileX Robotics, we work closely with universities and labs to support cutting-edge research. If you’re building on topics like transferable policies, manipulation learning, or vision-language robotics, we’re open to collaborations.

Let’s advance embodied intelligence—together.

1 post - 1 participant

Read full topic

by Agilex_Robotics on July 31, 2025 10:26 AM

July 28, 2025
[Demo] Remote Teleoperation with Pika on UR7e and UR12e

Hello ROS developers,

We’re excited to share a new demo featuring Pika, AgileX Robotics’ portable and ergonomic teleoperation gripper system. Pika integrates multiple sensors to enable natural human-to-robot skill transfer and rich multimodal data collection.

Key Features of Pika:

  • Lightweight design (~370g) for comfortable extended handheld use
  • Integrated multimodal sensors including fisheye RGB camera, Intel RealSense depth camera, 6-DoF IMU, and high-precision gripper encoders
  • USB-C plug-and-play connectivity supporting ROS 1 and ROS 2
  • Open-source Python and C++ APIs for easy integration and control
  • Compatible with URDF models, suitable for demonstration-based and teleoperation control

In this demo, the Pika teleoperation system remotely controls two collaborative robot arms — UR7e (7.5 kg payload, 850 mm reach) and UR12e (12 kg payload, 33.5 kg robot weight) — to complete several everyday manipulation tasks:

:wrench: Task Set:

  • Twist open a bottle cap
  • Pick up a dish and place it in a cabinet
  • Grab a toy and put it in a container

:hammer_and_wrench: System Highlights:

  • Precise gripper control with high-resolution encoder feedback
  • 6-DoF IMU for accurate motion tracking
  • Synchronized multimodal data capture (vision, 6D pose, gripper status)
  • Low-latency USB-C connection ensuring real-time responsiveness
  • Ergonomic and lightweight design for comfortable long-duration use

:package: Application Scenarios:

  • Human-in-the-loop teleoperation
  • Learning from Demonstration (LfD) and Imitation Learning (IL)
  • Vision-based dexterous manipulation and robot learning
  • Remote maintenance and industrial collaboration
  • Bimanual coordination and complex task execution

:movie_camera: Watch the demo here: Pika Remote Control Demo
:link: Learn more about Pika: https://global.agilex.ai/products/pika

:speech_balloon: Feel free to contact us for GitHub repositories, integration guides, or collaboration opportunities — we look forward to your feedback!

1 post - 1 participant

Read full topic

by Agilex_Robotics on July 28, 2025 10:23 AM

TecGihan Force Sensor Amplifier for Robot Now Supports ROS 2

I would like to share that Tokyo Opensource Robotics Kyokai Association (TORK) has supported the development and release of the ROS 2 / Linux driver software for the DMA-03 for Robot, a force sensor amplifier manufactured by TecGihan Co., Ltd.

The DMA-03 for Robot is a real-time output version of the DMA-03, a compact 3-channel strain gauge amplifier, adapted for robotic applications.

As of July 2025, tecgihan_driver supports the following Linux / ROS environments:

  • Ubuntu 22.04 + ROS 2 Humble
  • Ubuntu 24.04 + ROS 2 Jazzy

A bilingual (Japanese/English) README with detailed usage instructions is available on the GitHub repository:

If you have any questions or need support, feel free to open an issue on the repository.


Yosuke Yamamoto
Tokyo Opensource Robotics Kyokai Association

1 post - 1 participant

Read full topic

by y-yosuke on July 28, 2025 08:12 AM

July 27, 2025
RobotCAD 9.0.0 (Assembly WB → RobotCAD converter)

Improvements:

  1. Added a converter from FreeCAD Assembly WB (default) to the RobotCAD structure.
  2. Added a tool for changing a Joint Origin without touching the downstream kinematic chain (moves only the target Joint Origin).
  3. Optimized the performance of the Set Placement tools. They no longer require intermediate scene recalculation in the process.
  4. Decreased the size of joint arrows to 150.
  5. Added created collisions to the Collision group (folder). Unified the collision part prefix.
  6. Fixed Set Placement by Orienteer for the root link (aligns it to zero Placement).
  7. Refactored the Set Placement tools.

Fixes:

  1. Fixed an error when creating a collision for an empty part.
  2. Fixed getting the wrapper for an LCS body container. This fixes adding an LCS to some objects.
  3. Changed NotImplementedError (for some joint types’ units) to a warning. Instead of raising an error, it now warns and lets you set values for other joint types.

https://vkvideo.ru/video-219386643_456239081 - Converter Assembly WB → RobotCAD in work


1 post - 1 participant

Read full topic

by fenixionsoul on July 27, 2025 06:47 AM

July 25, 2025
🚀 [New Release] BUNKER PRO 2.0 – Reinforced Tracked Chassis for Extreme Terrain and Developer-Friendly Integration

Hello ROS community,

AgileX Robotics is excited to introduce the BUNKER PRO 2.0, a reinforced tracked chassis designed for demanding off-road conditions and versatile field robotics applications.

Key Features:

  • Christie suspension system + Matilda four-wheel independent balancing suspension provide excellent terrain adaptability and ride stability.
  • Easily crosses 30° slope terrain.
  • Maximum unloaded range: 20 km; maximum loaded range: 15 km.
  • Capable of crossing 40 cm trenches and clearing obstacles up to 180 mm in height.
  • IP67-rated enclosure ensures robust protection against dust, water, and mud.
  • Rated payload capacity: 120 kg, supporting a wide range of sensors, manipulators, and payloads.
  • Maximum speed at full load: 1.5 m/s.
  • Minimum turning radius: 67 cm.
  • Developer-ready interfaces and ROS compatibility.

Intelligent Expansion, Empowering the Future

  • Supports customizable advanced operation modes.
  • Communication via CAN bus protocol.
  • Open-source SDK and ROS packages for easy integration and development.

Typical Use Cases:

  • Outdoor Inspection & Patrol
  • Agricultural Transport
  • Engineering & Construction Operations
  • Specialized Robotics Applications

AgileX Robotics provides full ROS driver support and SDK documentation to accelerate your development process. We welcome collaboration opportunities and field testing partnerships with the community.

For detailed technical specifications or to discuss integration options, please contact us at sales@agilex.ai.

Learn more at https://global.agilex.ai/

4 posts - 2 participants

Read full topic

by Agilex_Robotics on July 25, 2025 10:27 AM

Cloud Robotics WG Meeting 2025-07-28 | Heex Technologies Tryout and Anomaly Detection Discussion

Please come and join us for this coming meeting from Mon, Jul 28, 2025 4:00 PM UTC to 5:00 PM UTC, where we will be trying out Heex Technologies’ service offering from their website and discussing anomaly detection for Logging & Observability.

Last meeting, we heard from Bruno Mendes De Silva, Co-Founder and CEO of Heex Technologies, and Benoit Hozjan, Project Manager in charge of customer experience at Heex Technologies. The two discussed the company and purpose of the service they offer, then demonstrated a showcase workspace for the visualisation and anomaly detection capabilities of the server. If you’d like to see the meeting, it is available on YouTube.

The meeting link for the next meeting is here, and you can sign up to our calendar or our Google Group for meeting notifications or keep an eye on the Cloud Robotics Hub.

Hopefully we will see you there!

2 posts - 2 participants

Read full topic

by mikelikesrobots on July 25, 2025 09:55 AM

July 24, 2025
Sponsoring open source project, what do you think?

Hi,

I just saw this and I was thinking about the ROS community.

We have a large and amazing ecosystem of free software, free as in beer and speech!

That accelerated robotic development and we are all very grateful for it.

But I think that it is also interesting to discuss how to financially support maintainers while keeping the software free for small (pre-revenue) companies, students, and individuals.

Thoughts?

6 posts - 6 participants

Read full topic

by facontidavide on July 24, 2025 01:21 PM

July 22, 2025
Baxter Robot Troubleshooting Tips

Hey everyone,

I’ve been working with the Baxter robot recently and ran into a lot of common issues that come up when dealing with an older platform with limited support. Since official Rethink Robotics docs are gone, I compiled this troubleshooting guide from my experience and archived resources. Hopefully, this saves someone hours of frustration!


Finding Documentation


Startup & Boot Issues

1. Baxter not powering on / unresponsive screen

  • Power cycle at least 3 times, waiting 30 sec each time.
  • If it still doesn’t work, go into the FSM (Field Service Menu):
    Press Alt + F → reboot from there.

2. BIOS password lockout

  • Use BIOS Password Recovery
  • Enter system number shown when opening BIOS.
  • Generated password is admin → confirm with Ctrl+Enter.

3. Real-time clock shows wrong date (e.g., 2016)

  • Sync Baxter’s time with your computer.
  • Set in Baxter FSM or use NTP from your computer via command line.

Networking & Communication

4. IP mismatch between Baxter and workstation

  • Set Baxter to Manual IP in FSM.

5. Static IP configuration on Linux (example: 192.168.42.1)

  • First 3 numbers must match between workstation and Baxter.
  • Ensure Baxter knows your IP in intera.sh.

6. Ping test: can’t reach baxter.local

  • Make sure Baxter’s hostname is set correctly in FSM.
  • Disable firewall on your computer.
  • Try pinging Baxter’s static IP.

7. ROS Master URI not resolving

export ROS_MASTER_URI=http://baxter.local:11311

8. SSH into Baxter fails

  • Verify SSH installed, firewall off, IP correct.

ROS & Intera SDK Issues

9. Wrong catkin workspace sourcing

source ~/ros_ws/devel/setup.bash

10. enable_robot.py or joint_trajectory_action_server.py missing

  • Run catkin_make or catkin build after troubleshooting.

11. intera.sh script error

  • Ensure file is in root of catkin workspace:
    ~/ros_ws/intera.sh

12. MoveIt integration not working

  • Ensure robot is enabled and joint trajectory server is active in a second terminal.

Hardware & Motion Problems

13. Arms not enabled or unresponsive

rosrun baxter_tools enable_robot.py -e

  • Test by gripping the cuffs (zero-g mode should enable).

14. Joint calibration errors

  • Restart the robot. This happens if you hit Ctrl+Z mid-script.

Software/Configuration Mismatches

15. Time sync errors causing ROS disconnect

  • Sync Baxter’s time in FSM or use chrony or ntp.

Testing, Debugging, & Logging

16. Check robot state:

rostopic echo /robot/state

17. Helpful debug commands:

rostopic list
rosnode list
rosservice list

18. Reading logs:

  • Robot: ~/.ros/log/latest/
  • Workstation: /var/log/roslaunch.log

19. Confirm joint angles:

rostopic echo /robot/joint_states

If you have more tips or fixes, add them in the comments. Let’s keep these robots running.

1 post - 1 participant

Read full topic

by Janga786 on July 22, 2025 05:56 PM

Remote (Between Internet Networks) Control of Robot Running Micro-ROS

Hello,
I am looking into solutions for communicating with a robot running Micro-ROS that is not on the same network as the host computer (the computer running ROS 2).
The only solution I have found so far is this blog post by Husarnet. The only problem is that this use-case no longer works, and the Husarnet team does not plan to resolve the issue any time soon.
Does anybody know of a solution for this that works?

1 post - 1 participant

Read full topic

by Amronos on July 22, 2025 08:08 AM

AgileX Robotics at 2025 ROS Summer School: PiPER & LIMO Hands-on Tracks and Schedule

AgileX Robotics at 2025 ROS Summer School

AgileX Robotics is thrilled to announce our participation in the upcoming 2025 ROS Summer School
:date: July 26 – August 1, 2025
:round_pushpin: Zhejiang University International Science and Innovation Center, Hangzhou, China
:globe_with_meridians: Official site: http://www.roseducation.org.cn/ros2025/


Hands-on Tracks

This year, we are bringing two dedicated hands-on tracks designed to empower developers with practical skills in robot navigation and mobile manipulation.


:wrench: PiPER – Mobile Manipulation Track

Our PiPER-based curriculum introduces core concepts in robotic grasping, visual perception, and motion control. Ideal for those exploring real-world robotic manipulation with ROS!

Date Time Session Topic
Day 4 AM Session 1 Introduction to PiPER
Day 4 AM Session 2 Motion analysis
Day 4 PM Session 1 Overview of PiPER-sdk
Day 4 PM Session 2 MoveIt + Gazebo simulation
Day 5 AM Session 1 QR code recognition grasping
Day 5 AM Session 2 Code-level analysis of grasping logic
Day 5 PM Session 1 YOLO-based Object Recognition and Grasping with Code Analysis
Day 5 PM Session 2 Frontier Insights on Embodied Intelligence

:automobile: LIMO – Navigation & AI Track

Focused on the LIMO platform, this track offers structured ROS-based training in navigation, SLAM, perception, and deep learning.

Date Time Session Topic
Day 1 AM Session 1 LIMO basic functions overview
Day 1 AM Session 2 Chassis Kinematics Analysis
Day 1 PM Session 1 ROS communication mechanisms
Day 1 PM Session 2 LiDAR-based Mapping
Day 2 AM Session 1 Path planning
Day 2 AM Session 2 Navigation frameworks
Day 2 PM Session 1 Navigation practice
Day 2 PM Session 2 Visual perception
Day 3 AM Session 1 Intro to deep reinforcement learning
Day 3 AM Session 2 DRL hands-on session
Day 3 PM Session 1 Multi-robot systems intro
Day 3 PM Session 2 Multi-robot simulation practice

We look forward to meeting all ROS developers, enthusiasts, and learners at the event. Come join us for hands-on learning and exciting robotics innovation!

— AgileX Robotics

1 post - 1 participant

Read full topic

by Agilex_Robotics on July 22, 2025 02:07 AM

July 17, 2025
Is DDS suitable for RF datalink communication with intermittent connection?

I’m not using ROS myself, but I understand that ROS 2 relies on DDS as its middleware, so I thought this community might be a good place to ask.

I’m working on a UAV system that includes a secondary datalink between the drone and the ground segment, used for control/status messages. The drone flies up to 35 km away and communicates over an RF-based datalink with an estimated bandwidth of around 2 Mbps, though the link is prone to occasional disconnections and packet loss due to the nature of the environment.

I’m considering whether DDS is a suitable protocol for this kind of scenario, or if its overhead and discovery/heartbeat mechanisms might cause issues in a lossy or intermittent RF link.

Has anyone here tried using DDS over real-world RF communication (not simulated Wi-Fi or Ethernet), and can share experiences or advice?

Thanks in advance!
S.
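One DDS-level knob that directly addresses lossy links is the reliability QoS: with BEST_EFFORT, a dropped packet is simply lost instead of triggering retransmissions. As a sketch only, here is what that might look like as a Fast DDS XML profile (element names follow Fast DDS's XML profile schema and may vary across versions; the profile name is invented):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative profile: best-effort delivery with a shallow history,
     so stale control/status samples are overwritten rather than queued
     for retransmission over the RF link. Verify against your version. -->
<profiles xmlns="http://www.eprosima.com/XMLSchemas/fastRTPS_Profiles">
    <data_writer profile_name="rf_link_writer">
        <qos>
            <reliability>
                <kind>BEST_EFFORT</kind>
            </reliability>
            <durability>
                <kind>VOLATILE</kind>
            </durability>
        </qos>
        <topic>
            <historyQos>
                <kind>KEEP_LAST</kind>
                <depth>1</depth>
            </historyQos>
        </topic>
    </data_writer>
</profiles>
```

Discovery and heartbeat traffic remain a separate concern; tuning their periods (or using static discovery) is usually also needed on narrow links.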

10 posts - 6 participants

Read full topic

by Santana27 on July 17, 2025 10:09 PM

Feature freeze for Gazebo Jetty (x-post from Gazebo Community)

Hello everyone!

The feature freeze period for Gazebo Jetty starts on Fri, Jul 25, 2025 12:00 AM UTC.

During the feature freeze period, we will not accept new features to Gazebo. This includes new features to Jetty as well as to currently stable versions. If you have a new feature you want to contribute, please open a PR before we go into feature freeze, noting that changes can still be made to open PRs during the feature freeze period. This period will close when we go into code freeze on Mon, Aug 25, 2025 12:00 AM UTC.

Bug fixes and documentation changes will still be accepted after the freeze date.

More information on the release timeline can be found here: Release Jetty · Issue #1271 · gazebo-tooling/release-tools · GitHub

The Gazebo Dev Team :gazebo:

1 post - 1 participant

Read full topic

by azeey on July 17, 2025 02:40 AM

July 15, 2025
Donate your rosbag (Cloudini benchmark)

Hi,

As my presentation about Cloudini was accepted at ROSCon 2025, I want to come prepared with an automated benchmarking suite that measures performance over a wide range of datasets.

You can contribute to this donating a rosbag!!!

Thanks for your help. Let’s make point clouds smaller together :pinched_fingers:

How to

Data Donation Disclaimer: Public Availability for CI Benchmarking

By donating your data files, you acknowledge and agree to the following terms regarding their use and public availability:

Purpose: The donated data will be used for research purposes, specifically to perform and validate benchmarking within Continuous Integration (CI) environments.

Public Availability: You understand and agree that the donated data, or subsets thereof, will be made publicly available. This public release is essential for researchers and the wider community to reproduce, verify, and build upon the benchmarking results, fostering transparency and collaborative progress in pointcloud compression.

Anonymization/Pseudonymization: Please ensure that no personally identifiable information is included in the data you submit, as it will be made public as-is.

5 posts - 3 participants

Read full topic

by facontidavide on July 15, 2025 01:35 PM

Everything I Know About ROS Interfaces: Explainer Video

I made a video about everything I’ve learned about ROS Interfaces (messages/services/actions) in my fifteen years of working with ROS.

The ROS Interface Primer

Text Version: ROS Interface Primer - Google Docs (Google Doc)

Featuring:
:information_source: Information about Interfaces, from Super Basic to Complex Design Issues
:microscope: Original Research analyzing all the interfaces in ROS 2 Humble
:magic_wand: Best Practices for designing new interfaces
:supervillain: Hot takes (i.e. the things that I think ROS 2 Interfaces do wrong)
:name_badge: Three different ways to divide information among topics
:waffle: Fun with multidimensional arrays
:nine: Nine different recipes for “optional” components of interfaces
:thought_balloon: Strong opinions that defy the ROS Orthodoxy
:prohibited: Zero content generated by AI/LLM
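As a taste of the “optional components” topic, here are two widely used recipes sketched as an illustrative message definition (the message and field names are invented, not from the video):

```
# Sensor.msg -- illustrative only. ROS 2 interfaces have no native
# optional type, so two common workarounds are:

# Recipe A: a bounded array of length 0 or 1 stands in for an optional value.
float64[<=1] temperature

# Recipe B: a validity flag paired with the value.
bool humidity_valid
float64 humidity
```

Recipe A keeps the wire format self-describing at the cost of array handling; Recipe B is simpler to use but always transmits the value field.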

Making videos is hard, so I’m calling this version 1.0 of the video. Please let me know what I got wrong and what I’m missing, and I may make another version in the future.

In closing: bring back Pose2D you monsters.

3 posts - 2 participants

Read full topic

by DLu on July 15, 2025 11:18 AM

July 14, 2025
ROS and ROS2 Logging Severity Level

Hi All!

I’m working on an application for containerizing ROS (1 & 2) projects.

I’m asking for the help of everyone experienced with ROS loggers.
In particular, I’m looking for a solution to generalize the definition of the minimum severity level for all the nodes running in a project.

This configuration should be possible outside of the node source code, so using parameters, environmental variables, or configuration files.
I know that in ROS 1 (C++-based nodes) it is possible to set the minimum severity level from rosconsole.config. (What about ROS 1 Python nodes? Do they still use rosconsole.config?)

I also have some doubts about how named loggers work: does each node have its own logger? Is it in principle impossible to define the minimum severity level for all the nodes running in a project?

In ROS 2 (C++ and Python nodes) I know that the --log-level argument works to configure the severity when running a node. But again, I’m looking for a global solution…

Anyone with useful resources or insights on this aspect?
As anticipated before, the final goal is to have an environment variable or a configuration file that can be used to set the severity level of all the nodes that will be executed when the project starts (so, for example, multiple nodes run from a launch file).
Moreover, I want it to be independent of the language used to write the node (Python or C++).
I’m not referring to a “global parameters” because I know that ROS 2 is structured such that each node has its parameters.

Thanks to all of you!
(I hope the question is not badly formulated; I’m not very experienced with these aspects and the different ways ROS 1 and ROS 2 manage loggers… So study resources on these topics would also be very helpful for me.)

1 post - 1 participant

Read full topic

by AlePuglisi on July 14, 2025 03:06 PM

Ros2top - top-like utility for ROS2

Hi everyone!

Repo: GitHub - AhmedARadwan/ros2top

I’ve always found it hard to track each node’s resource usage, so I thought it might be a good idea to build a tool that works for ROS 2 and essentially any Python or C++ process to monitor resource usage in real time. The goal? Quickly see which processes are consuming the most resources and gain better visibility into a running system.

This is an initial release: it relies on the node registering itself to become visible and tracked by the ros2top utility.

What it does so far:

  • Shows per-node CPU, RAM, GPU, and GPU memory usage.
  • Detects active nodes based on registration.
  • Offers a simple terminal UI similar to htop to monitor everything in one place.

How it works:

  • A node imports/includes ros2top and registers itself.
  • ros2top then polls resource stats.
  • It displays a list of processes, so you can spot hot resource consumers at a glance.
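The polling step can be sketched with plain /proc parsing on Linux. This is a generic illustration of the approach, not ros2top's actual implementation:

```python
# Generic sketch of per-process memory polling via /proc (Linux only).
# A monitor like this would call it periodically for each registered PID.

def rss_kib(pid):
    """Resident memory of a process in KiB, parsed from /proc/<pid>/status."""
    with open(f"/proc/{pid}/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                return int(line.split()[1])  # value is reported in kB
    return 0
```

CPU usage works the same way from `/proc/<pid>/stat` (utime/stime deltas between polls), which is how tools avoid depending on anything beyond the kernel's procfs.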

Why it might help:

  • Instead of juggling htop, nvtop, ros2 node info, etc., you get everything in one screen.
  • Ideal for multi‑node systems where it’s easy to lose track of who’s using what.

I’d love to hear your thoughts:

  • Does this sound helpful in your debugging or monitoring workflow?
  • Any ideas for features, UI improvements, or integrations?
  • Thoughts on automatic registration vs. manual config?

This is very early-stage, but I hope it can evolve into a valuable tool for the ROS 2 community. Feedback, suggestions, or even contributions are all welcome! :blush:

9 posts - 7 participants

Read full topic

by Ahmed_Ali on July 14, 2025 09:43 AM


Powered by the awesome: Planet