April 21, 2026
How to generate a perfect URDF for a cobot (robotic arm) for ROS 2 using SolidWorks, for gravity-compensation mode with Cyclic Synchronous Torque mode?

We have a cobot. We are making a URDF for ROS 2 and for the real hardware, targeting gravity-compensation mode and CST (Cyclic Synchronous Torque) mode over EtherCAT. The links and joints are arranged such that if Link X has a motor, the motor sits entirely inside that link, and its rotating output side is attached to the flange (half inside Link X and half inside Link Y).

The flange is fixed to Link Y with screws, so it rotates with Link Y; the flange's other face is attached to the output (the rotating part of Link X).

The motor has a strain wave gear (harmonic drive) attached to its rotor.

Problem: I want to know which parts of the motor should be counted with Link X, i.e. included in that link's <inertial> tag, and likewise for Link Y.

Because we are going to model these masses separately in SolidWorks and then generate the URDF from it.
So, which parts belong to which link? For example, do we fix the stator to Link X in SolidWorks so its center of mass is included in that link's <inertial> tag? And how should we assign the following:

  • rotor mass
  • strain wave gear input side mass
  • strain wave gear output side mass
  • stator mass

Could someone guide me on the correct, industrial way to achieve this? Also, how much error in mass is acceptable?
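Whichever split is chosen (e.g. stator and the gear's fixed side with Link X; gear output and flange with Link Y), each link's <inertial> mass and center of mass end up as the mass-weighted combination of everything rigidly attached to it. A minimal sketch, with made-up placeholder numbers:

```python
# Sketch: lumping rigidly attached component masses into one link's
# <inertial> values. All masses and positions below are placeholders,
# not real motor data.

def lump(components):
    """Combine rigidly attached parts into one mass + center of mass.

    components: list of (mass_kg, (x, y, z) COM) in the link frame.
    Returns (total_mass, combined_com).
    """
    total = sum(m for m, _ in components)
    com = tuple(sum(m * c[i] for m, c in components) / total for i in range(3))
    return total, com

# e.g. Link X carries the stator and the gear's fixed side:
mass, com = lump([
    (3.0, (0.0, 0.0, 0.10)),   # link structure (placeholder)
    (1.2, (0.0, 0.0, 0.25)),   # motor stator (placeholder)
    (0.4, (0.0, 0.0, 0.30)),   # strain-wave gear, fixed side (placeholder)
])
```

One common caveat: the rotor's own spin inertia is typically not lumped into either link's <inertial>. Reflected through a gear ratio N it appears at the joint as roughly N²·J_rotor, and is often modeled separately (e.g. as an actuator/transmission inertia parameter) rather than in the URDF.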

2 posts - 2 participants

Read full topic

by Zed on April 21, 2026 04:01 PM

April 20, 2026
:guitar: ROS 2 Lyrical Luth Testing Kicks Off on April 30th


As many of you are already aware, the ROS 2 Lyrical Luth release is just around the corner: Friday, May 22nd, to be exact (World Turtle Day falls on a Saturday this year)! We want this to be our best ROS 2 release yet, and to get there we need to make sure that we thoroughly test Lyrical Luth before it is released to the general public. We also want to make sure that the ROS documentation on docs.ros.org continues to be clear, concise, and correct. That’s where we need your help! We’re looking for community volunteers to join us for our Lyrical Luth Testing and Tutorial Party. If you are looking to start dipping your toes into contributing to the ROS project, this is a great place to start.

So, what is a Testing and Tutorial Party, you may ask? Well, it is a chance for the community to meet with our core team, systematically review all of the current ROS tutorials, and test the latest ROS release. Right now our ROS Boss, @sloretz, is working to generate early release binary and source packages for ROS 2 Lyrical Luth. On April 30th, we’ll release those binaries to the public and start the process of systematically testing them.

During the Testing and Tutorial Party, we’ll provide a GitHub repository with a long list of tests we would like to run on the Lyrical Luth beta. These tests will first ask developers to pick a particular release setup, and then run the test suite and/or one or more of the existing ROS 2 tutorials on docs.ros.org. When we say setup, we mean a specific combination of RMW vendor (Zenoh / FastDDS / Cyclone DDS / Connext DDS), build type (binary / debian / source), host operating system (Ubuntu / RHEL / Windows / MacOS), and chip architecture (amd64 / aarch64). For each setup, we’ll perform a number of tests to validate our tutorials, core ROS functionality, and new features. With dozens of possible setup configurations, testing each and every one internally isn’t feasible, which is why we need your help!
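For a sense of scale, multiplying out the dimensions listed above gives the full matrix (a quick sketch; the real test list excludes combinations that aren't supported):

```python
# Counting the test setups described above: every combination of RMW,
# build type, host OS, and chip architecture.
from itertools import product

rmws = ["Zenoh", "FastDDS", "Cyclone DDS", "Connext DDS"]
builds = ["binary", "debian", "source"]
oses = ["Ubuntu", "RHEL", "Windows", "MacOS"]
arches = ["amd64", "aarch64"]

setups = list(product(rmws, builds, oses, arches))
print(len(setups))  # 96 combinations before excluding unsupported ones
```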

During the tutorial party, participants will be asked to sign up for particular tests and report back the results. If you happen to find an issue or bug while participating, you’ll need to report it to us so it can get corrected before the Lyrical release.

We are planning to kick off the tutorial party with a virtual kickoff meeting on Thu, Apr 30, 2026 4:00 PM UTC. During this kickoff meeting, we’ll explain the whole Testing and Tutorial Party process. We’ll record the meeting and post instructions on Open Robotics Discourse for those who can’t make it. To help motivate participants, we’ll be giving away ROS Lyrical Luth swag to the testers who complete the most tests during the event. The testers with the most closed issues will receive a credit to our Fourth Wall shop to pick out some Lyrical swag.

Here are the key dates you’ll want to remember:

  • Thu, Apr 30, 2026 4:00 PM UTC Tutorial & Testing Party begins
  • Thu, May 14, 2026 7:00 AM UTC Tutorial & Testing Party ends
  • Fri, May 22, 2026 7:00 AM UTC ROS 2 Lyrical Luth released

We’ll add these events to the official ROS events calendar, but the big one that you won’t want to miss is the kickoff event on Thu, Apr 30, 2026 4:00 PM UTC. In the meantime, we would like your help spreading the word about the Testing and Tutorial Party.

Finally, if you can’t make it to the T&T Party but would like to help support the next ROS release, consider making a donation via their DonorBox account or joining the OSRA. Our open source contributors, OSRF donors, and OSRA members are the people making our ROS ecosystem possible! :heart:

2 posts - 1 participant

Read full topic

by Katherine_Scott on April 20, 2026 05:28 PM

I benchmarked my ROS 2 localization filter (FusionCore) against robot_localization on real-world data. Here's what happened

I ran FusionCore head-to-head against robot_localization (the standard ROS sensor fusion package) on the NCLT dataset from the University of Michigan… a real robot driving around a campus for 10 minutes. Mixed urban/suburban environment with tree cover, buildings, and open quads: the kind of GPS conditions where multipath is real, not a lab with clear sky view. Ground truth is RTK GPS, sub-10cm accuracy.

Equal comparison, no tricks: same raw IMU + wheel odometry + GPS fed to every filter simultaneously. No tuning advantage. This is strictly equal-config performance on identical sensor data.

The dashed line is RTK GPS ground truth. That’s where the robot actually was.

Left: robot_localization EKF. Right: FusionCore.

Accuracy over 600s (Absolute Trajectory Error (ATE) RMSE: lower is better):

  • FusionCore: 5.5 m

  • robot_localization EKF: 23.4 m (4.2× worse)
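For reference, ATE RMSE is the root-mean-square of the pointwise distances between the estimated and ground-truth trajectories after time association. A minimal 2-D sketch (the standard definition also rigidly aligns the trajectories first; that step is omitted here):

```python
# Minimal ATE RMSE: root-mean-square Euclidean error between an
# estimated trajectory and time-aligned ground truth (2-D here).
import math

def ate_rmse(estimate, ground_truth):
    """estimate, ground_truth: equal-length lists of (x, y) positions."""
    sq = [
        (ex - gx) ** 2 + (ey - gy) ** 2
        for (ex, ey), (gx, gy) in zip(estimate, ground_truth)
    ]
    return math.sqrt(sum(sq) / len(sq))

# A constant 3 m offset in x gives an ATE RMSE of exactly 3.0:
gt = [(t, 0.0) for t in range(5)]
est = [(t + 3.0, 0.0) for t in range(5)]
print(ate_rmse(est, gt))  # 3.0
```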

The difference comes down to one thing: robot_localization trusts every GPS fix equally and uses fixed noise values you set manually in a config file. FusionCore continuously estimates IMU bias and adapts its noise model in real time… so it knows when a measurement doesn’t fit and how much to trust it.

FusionCore tracks position, velocity, orientation, plus gyro bias and accelerometer bias as live states. RL-EKF has no bias estimation; gyro drift compounds silently into heading error.
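The "adapt how much to trust each fix" idea can be illustrated with a toy one-dimensional filter that inflates the measurement noise whenever the innovation is implausibly large. This is purely illustrative and assumes nothing about FusionCore's actual algorithm:

```python
# Toy 1-D position filter with innovation-based measurement gating,
# sketching the idea of adaptive trust in a GPS fix. Illustrative only.

def kalman_step(x, P, z, R, Q=0.01, gate=9.0):
    """One predict/update cycle for a scalar random-walk state.

    If the normalized innovation squared exceeds `gate` (~3 sigma),
    the measurement noise R is inflated so the outlier is distrusted.
    """
    P = P + Q                    # predict: uncertainty grows
    y = z - x                    # innovation
    S = P + R                    # innovation variance
    nis = y * y / S
    if nis > gate:               # measurement does not fit: trust it less
        R = R * (nis / gate)
        S = P + R
    K = P / S                    # Kalman gain
    x = x + K * y
    P = (1.0 - K) * P
    return x, P

# A 50 m multipath spike barely moves the gated filter; x stays near zero:
x, P = 0.0, 1.0
for z in [0.1, -0.05, 0.0, 50.0, 0.05]:
    x, P = kalman_step(x, P, z, R=1.0)
```

With a fixed noise model (gate disabled), the same spike drags the estimate several metres off; the gated version shrugs it off.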

I also ran robot_localization’s UKF mode. It diverged numerically at t=31 seconds: the covariance matrix hit NaN, and every output was invalid for the remaining 9 minutes. FusionCore ran stably for the full 600 seconds on the same data; it turns out to be numerically stable even at high IMU rates, which is why RL-UKF hit NaN at 100 Hz while FusionCore didn’t.

Dataset: NCLT (University of Michigan).

GitHub repo: https://github.com/manankharwar/fusioncore

ROS Discourse: https://discourse.ros.org/t/fusioncore-which-is-a-ros-2-jazzy-sensor-fusion-package-robot-localization-replacement

Currently testing on physical hardware. If you’d like to try it, the repo is open… raise an issue, open a PR, or just DM me. Happy to answer any questions… I respond to everything within 24 hours. Happy building!

1 post - 1 participant

Read full topic

by manankharwar on April 20, 2026 01:08 PM

Oxide GNSS — a Rust-based ROS 2 driver for u-blox ZED-F9P with NTRIP and integrity monitoring

Hi everyone,

I’d like to announce the first release of “oxide_gnss”, a ROS 2 driver for u-blox ZED-F9P receivers.

Its focus is on providing a clean, simple way to get GNSS position, velocity and optional heading data from an F9P device with minimal effort, while also providing some safety integrity monitoring.

Built on ros2-rust (rclrs). MIT licensed.

Repo: GitHub - greenforge-labs/oxide_gnss: A Rust-based ROS2 GNSS driver for u-blox ZED-F9P devices with integrated NTRIP client · GitHub

Highlights:

  • Mode-based config for standalone / rover (NTRIP or radio) / moving base / moving-base-rover / static base setups — no need to hand-edit UBX CFG-VAL keys.
  • Integrated NTRIP client (including VRS via GGA uplink).
  • Optional safety integrity monitoring: protection levels (NAV-PL), jamming/spoofing detection (SEC-SIG), antenna status, etc., aggregated into a ~/integrity topic and a simple ~/operational go/no-go Bool.
  • CI against Humble / Jazzy / Kilted on amd64 and arm64.
  • Includes a small admin CLI (oxide_gnss_assign_serial) for setting the F9P USB device serial string (useful when operating in moving base + rover mode).

A minimal rover with NTRIP config looks like:

mode: rover_ntrip

features:
  high_precision: true
  integrity: true

device:
  port: "/dev/gnss_f9p_rover"
  baud_rate: 460800
  frame: ENU
  navigation:
    rate_hz: 5

ntrip:
  host: "ntrip.data.gnss.ga.gov.au"
  port: 2101
  mountpoint: "SFLD00AUS0"
  username: "${NTRIP_USERNAME}"
  password: "${NTRIP_PASSWORD}"
  send_gga: true
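The ${NTRIP_USERNAME}-style placeholders look like environment-variable substitution. If you need the same pattern in your own tooling, a generic sketch (an assumption about the mechanism, not oxide_gnss code):

```python
# Generic ${VAR} expansion for a config file, using only the standard
# library. Assumed to mirror what the driver does with its placeholders.
import os
from string import Template

def expand_env(config_text: str) -> str:
    """Replace ${VAR} placeholders with values from the environment,
    leaving unknown placeholders untouched."""
    return Template(config_text).safe_substitute(os.environ)

os.environ["NTRIP_USERNAME"] = "alice"
print(expand_env('username: "${NTRIP_USERNAME}"'))  # username: "alice"
```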

Launch it and you should see startup output in the terminal.

Released as-is from internal automation use — plans for continued feature development are limited, but bug-fix PRs and forks are welcome.

Feedback / issues / PRs: Issues · greenforge-labs/oxide_gnss · GitHub

1 post - 1 participant

Read full topic

by geoffs on April 20, 2026 07:22 AM

Colorful ROS2 Command Line!

Hi everyone,

I am adding color support for ros2cli. Currently it is optional and controlled via ROS_COLORIZED_OUTPUT=1. Do you like it?

GitHub PR: https://github.com/ros2/ros2cli/pull/1223


1 post - 1 participant

Read full topic

by penwang on April 20, 2026 02:24 AM

April 19, 2026
URDF Validator (catches real robot failures, privacy-first, with xacro support)

Hi all,

I built a URDF validator aimed at catching real-world issues in robot descriptions, not just syntax errors.

Why this matters

Many URDF tools will accept files that still fail later in simulation, motion planning, or TF. The goal here is to catch those problems before runtime.

What it does

  • Validates URDF structure and semantics

  • Detects broken links, joints, and invalid references

  • Flags issues seen in real robot models

  • Supports .xacro (with guided upgrade hints)

Proof (real-world failures)

  • Valkyrie: leftover xacro artifacts → correctly flagged

  • Fetch: invalid XML prefix → caught immediately

These are real bugs in widely used robot models — not synthetic test cases.

Quick check (no login required)

Free URDF Validator RoboInfra

API example

curl -X POST "https://roboinfra-api.azurewebsites.net/api/urdf/validate?include_urdf=true" \
  -H "x-api-key: YOUR_KEY" \
  -F "file=@robot.urdf"

Privacy-first

  • Files are not stored

  • No training on user data

  • Stateless validation

Extras

  • Clear upgrade hints for .xacro

  • Human-readable error explanations (not just parser output)

I’d really appreciate feedback, especially edge cases or robot models that break it.

1 post - 1 participant

Read full topic

by Robotic on April 19, 2026 08:30 PM

Upcoming Lyrical Feature Freeze

Hi all,

On Tue, Apr 21, 2026 6:59 AM UTC, we will freeze all core ROS 2 packages to prepare for the upcoming Lyrical Luth release on Fri, May 22, 2026 7:00 AM UTC.

Once this freeze takes effect, we will not accept new features to the core packages until Lyrical branches from ROS Rolling. This restriction applies to the packages and vendor packages appearing in the ros2.repos file: ros2/ros2.repos at rolling · ros2/ros2 · GitHub

We still welcome bug fixes after the freeze date.

Find more information on the Lyrical Luth release timeline here: ROS 2 Lyrical Luth (codename ‘lyrical’; May, 2026).

2 posts - 2 participants

Read full topic

by mjcarroll on April 19, 2026 11:27 AM

April 18, 2026
Unibotics Robot Programming Challenge, April 2026

After gauging interest in a robot programming tournament, we at the JdeRobot org are launching the Unibotics Robot Programming Challenge :pushpin:

  • Online asynchronous competition (from April 15th to April 30th)
  • Python language
  • Robot programming from your web browser
  • Free, just for fun
  • Based on ROS, on Gazebo simulator and Unibotics web platform

The April 2026 challenge is to program a Formula 1 car to follow the red line drawn on the floor along several race circuits :racing_car: . The car is endowed with a front camera, a steering wheel (W) and an accelerator pedal (V). You can use either the available SimpleAPI or the ROS topics directly for camera images and robot control. The car has Ackermann dynamics, and Montmeló is the test circuit, although your solution should also work in other circuits such as Montreal or SimpleCircuit.

[Unibotics] RoboticsAcademy - Follow Line

The “rules” are available here. All the interactions will be held at the Unibotics forum.

Cheers and good luck! :slight_smile:

JoseMaria

1 post - 1 participant

Read full topic

by jmplaza on April 18, 2026 04:51 PM

Eclipse Zenoh 1.9.0 "Longwang" released — Regions, QUIC multistream, Go binding (RMW Zenoh impact inside)

Eclipse Zenoh 1.9.0 “Longwang” was released yesterday. There’s a fair bit
that’s directly relevant to ROS 2 / RMW Zenoh deployments, so I wanted to
flag it here for the community.

Full release notes: Zenoh 1.9.x: Longwang · Zenoh

Regions

Regions extends the old fixed router/peer/client hierarchy with
arbitrarily deep trees of topologies (clique / mesh / star) and
configurable gateway relationships between them.

For RMW Zenoh, this means that you can scale to much larger systems
than ever before, dramatically cutting the compute and network overhead
of multi-site fleet deployments.

Other Features Shipping with 1.9.0

  • QUIC multistream — one QUIC stream per Zenoh priority level,
    eliminating head-of-line blocking in mixed-priority traffic.
  • QUIC mixed reliability — reliable streams and best-effort datagrams
    on a single connection.
  • Reliable UDP — unsecured QUIC for trusted environments where TLS
    overhead matters.
  • Zenoh-Go — official, idiomatic Go binding with full API coverage
    from day one (sponsored by SoftBank Corp.). Useful for fleet-side
    tooling and cloud bridges written in Go.
  • Zenoh-Pico async executor — single-threaded task execution bringing
    advanced pub/sub, connectivity events, auto-reconnection, and
    peer-to-peer mode to microcontroller deployments.
  • Nuze 0.3.0 — native Zenoh message decoding in the Nu-powered CLI.

Resources

– Happy Hacking

1 post - 1 participant

Read full topic

by kydos on April 18, 2026 11:36 AM

April 17, 2026
Free online URDF validator no ROS install, instant 9-check validation (also supports xacro)

Hi ROS community,

I built a free online URDF validator because I kept running into the same
problem: testing URDF files required a full ROS install just to catch basic
structural errors.

Live tool (no signup): RoboInfra Dashboard

What it checks (9 structural checks):

  • Root element must be <robot>
  • At least one <link> exists
  • No duplicate link/joint names
  • All joint parent/child refs valid
  • Valid joint types (revolute, continuous, prismatic, fixed, floating, planar)
  • revolute/prismatic joints include <limit>
  • Exactly one root link (no cycles, no orphans)
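Several of these structural checks are simple enough to sketch with only the Python standard library. This illustrates what the checks mean; it is not the RoboInfra implementation:

```python
# Sketch of a few URDF structural checks using only xml.etree.
import xml.etree.ElementTree as ET

VALID_TYPES = {"revolute", "continuous", "prismatic",
               "fixed", "floating", "planar"}

def check_urdf(text):
    """Return a list of human-readable structural errors (empty if OK)."""
    errors = []
    root = ET.fromstring(text)
    if root.tag != "robot":
        errors.append("root element must be <robot>")
    links = [l.get("name") for l in root.findall("link")]
    if not links:
        errors.append("at least one <link> is required")
    if len(set(links)) != len(links):
        errors.append("duplicate link names")
    children = set()
    for j in root.findall("joint"):
        jtype = j.get("type")
        if jtype not in VALID_TYPES:
            errors.append(f"invalid joint type {jtype!r}")
        if jtype in ("revolute", "prismatic") and j.find("limit") is None:
            errors.append(f"joint {j.get('name')!r} missing <limit>")
        for tag in ("parent", "child"):
            el = j.find(tag)
            if el is None or el.get("link") not in links:
                errors.append(f"joint {j.get('name')!r} has invalid {tag} ref")
            if el is not None and tag == "child":
                children.add(el.get("link"))
    # Exactly one link should have no parent joint (the root link).
    if len([l for l in links if l not in children]) != 1:
        errors.append("expected exactly one root link")
    return errors
```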

Also supports .xacro files (server-side preprocessing via the official
xacro Python package; no ROS install needed on your side).

Why I’m sharing:
I built this as a solo developer and want feedback from actual ROS users.
Is the validation useful? What other checks would help? Does xacro support
cover your real-world files?

Other things available (optional, paid plans):

  • Python SDK: pip install roboinfra-sdk
  • GitHub Action for PR validation: uses: roboinfra/validate-urdf-action@v1
  • Kinematic analysis (DOF, end effectors, chain depth)
  • 3D model conversion (STL/OBJ/FBX/GLB/DAE)

Free tier: 50 validations/month on signup.
Happy to answer any questions.

1 post - 1 participant

Read full topic

by Robotic on April 17, 2026 07:08 PM

Tesseract & ROS-I Developer Monthly Meeting Revisit

ROS-I Developer Meeting

This was the second quarter ROS-I Developer Meeting Americas, led by Matt Robinson, focusing on recent GitHub repository updates and documentation improvements. Matt presented new documentation pages for the Scan and Plan Workshop and Noether repositories, showcasing enhanced architecture diagrams and status information.

Michael discussed updates to Python bindings for Tesseract using NanoBind, noting improvements over the previous Swig implementation and plans for code reorganization. The team also discussed upcoming events including a July training session and Automate 2026 exhibition in Chicago, where they will host an open source meetup and ROS Industrial Consortium gathering.

Matt shared updates on OSRA's technical strategy development and concerns about ROS 2 release processes affecting industrial users, particularly regarding RMW and version compatibility issues. The conversation ended with Michael announcing plans to update all repositories to support Ubuntu 20.04 LTS, including necessary changes for the Qt5 to Qt6 transition.

Tesseract Monthly Check-In

The meeting focused on discussing OMPL 2.0's new VAMP (Vectorized Antipodal Motion Planning) integration and Tesseract's 1.0 release updates.

The team explored how VAMP's SIMD acceleration and parallel collision checking capabilities could be integrated into Tesseract, with Levi and Michael explaining that VAMP uses fine-grained parallelism to process thousands of states simultaneously rather than checking single states sequentially.

Roelof provided an update on the Cole continuous collision checking implementation, reporting significant performance improvements of 20-30% and noting that the implementation now matches Bullet's approach using convex hulls.

The team also discussed ongoing work on replacing string-based data structures with hash-based ones to improve performance, and Levi mentioned plans to implement schema validation tools for easier YAML file management in Tesseract.

Information on ROS-I Developer Meetings may be found here: https://rosindustrial.org/developers-meeting

Info on the Tesseract Monthly Check-In may be found here: https://rosindustrial.org/tesseract-robotics

by Matthew Robinson on April 17, 2026 06:58 PM

How to use fastdds_monitor on ROS2 Humble

I tried following some online tutorials ( 3. Example of usage - 4.0.0 , https://www.youtube.com/watch?v=OYibnUnMIlc, …) but cannot get any statistics. I can see my topics, just not statistics. I made sure to set FASTDDS_STATISTICS and even tried rebuilding my workspace with --cmake-args -DFASTDDS_STATISTICS=ON, but I’m quite sure that did nothing. I then ran the AppImage (~/Apps/eProsima_Fast-DDS-Monitor-v3.2.0-Linux.AppImage) but no luck.

1 post - 1 participant

Read full topic

by PeterMitrano on April 17, 2026 01:20 PM

April 16, 2026
Baxter robot's RSDK GUI not booting

Hi everyone,

I’m currently working with a Baxter robot system and ran into an issue after recovering access to the internal PC. I’d really appreciate any guidance from those who have dealt with similar setups.


Background

  • Platform: Baxter robot

  • Internal PC: Dell OptiPlex 7010

  • OS: Baxter RSDK system (Ubuntu-based), but also has a Gentoo layer

  • ROS: Indigo

The robot had been unused for several years. Initially:

  • BIOS was locked (password protected)

  • Could not access GRUB or boot from USB

  • SSH password was unknown

I managed to:

  • Reset BIOS password (via PSWD jumper)

  • Boot from a Live USB

  • Reset the ruser password via chroot

  • Successfully SSH into the robot


Current Status

  • SSH into Baxter works (ruser login OK)

  • Network connection is working (can ping and communicate)

  • System boots into a Gentoo console login

  • I can log into the Gentoo environment

  • I cannot access or see the RSDK (Ubuntu-based) GUI environment

  • ROS tools are accessible after sourcing environment (in some contexts)


Problem

The RSDK GUI does not start automatically on boot.

Instead of the normal Baxter interface, the system:

  • Boots into a Gentoo console

  • Requires manual login

  • Does not launch the Baxter runtime or GUI

  • Does not appear to transition into the RSDK (Ubuntu) environment


What I’ve tried

  • Logged in via SSH and locally

  • Verified system access through Gentoo console

  • Sourced ROS:

    source /opt/ros/indigo/setup.bash
    
  • Tried enabling the robot manually:

    rosrun baxter_tools enable_robot.py -e
    
  • Attempted:

    rostopic list
    

However:

  • It seems the Baxter runtime is not being launched

  • The system may not be switching from Gentoo → RSDK layer

  • Startup scripts/services may be broken or missing


Questions

  1. What is responsible for transitioning from the Gentoo layer into the RSDK (Ubuntu) environment?

  2. What service or script launches the Baxter GUI on boot?

  3. Is there a manual way to trigger the RSDK environment from the Gentoo console?

  4. Could this be a broken startup script, service, boot configuration, or a corrupted drive?

  5. Is there a known way to restore the original Baxter startup behaviour without reinstalling the system?

  6. If there is no way to restore it, is there an image of the system available? I tried checking with Cothinks Robotics (the company that took over the license and manufacturing from Rethink Robotics), with no response.


Additional Notes

  • I would prefer not to wipe the system, since the original Baxter image is difficult to obtain

  • Hardware appears to be functioning correctly

  • This seems like a boot/runtime configuration issue rather than a hardware failure


Goal

Restore normal behaviour where:

  • Baxter boots into the RSDK GUI

  • The robot runtime starts automatically

  • No manual login or intervention is required


Any help or pointers (especially from others maintaining older Baxter systems) would be greatly appreciated.

Thanks in advance.

1 post - 1 participant

Read full topic

by MinhBao19 on April 16, 2026 06:36 PM

Writing ROS2 nodes using modern python tooling with ros-z

Hi,

I recently found ZettaScaleLabs/ros-z, a work-in-progress Rust reimplementation of ROS 2 by some of the people behind Zenoh. The project is still young and does not seem to have been discussed here yet, but they have already developed a very interesting feature: ros-z provides Python bindings with no dependency on ROS.

Concretely, this means it is possible to create ROS 2 nodes from a pyproject.toml-based Python project. AFAIK, this is not possible with the standard ROS tooling.

I think many people (including myself) avoid using ROS in Python projects (and Python in ROS projects) because modern Python tooling is not supported. Could ros-z be Python’s big comeback in ROS? What do you think?

1 post - 1 participant

Read full topic

by vrichard on April 16, 2026 08:12 AM

April 15, 2026
Real-Time Face Tracking in ROS 2 & OpenCV

Hi everyone,

I recently developed a zero-latency face tracking node using ROS 2 and OpenCV, designed as a foundation for responsive human-machine interaction, and was encouraged to share it with the community here!

The Challenge: Middleware Overhead

During development, I encountered severe frame-rate drops (sub-1 FPS). This was due to the heavy network serialization overhead of translating image matrices across standard ROS middleware.

The Solution: Edge Processing & Optimization

To solve this, I completely re-architected the pipeline:

Bypassing Drivers: By bypassing the standard camera drivers and processing the hardware stream directly at the edge, I eliminated the latency loop entirely.

Algorithm Optimization: The optimized system utilizes Haar cascades paired with dynamic contrast adjustment (CLAHE).

Result: Smooth, real-time bounding box tracking executed entirely on local hardware.

GitHub repository link: GitHub - abinaabey2006/ros2-opencv-face-tracker: A zero-latency, real-time face tracking node for ROS 2 using OpenCV and Haar Cascade · GitHub

LinkedIn post link: #ros2 #ros2 #computervision #opencv #roboticsengineering #python | Abina Abey | 22 comments

1 post - 1 participant

Read full topic

by abinaabey2006 on April 15, 2026 11:19 PM

Session Postponed to 2026-04-20 | Cloud Robotics Working Group

We planned a session for the 13th - more information here:

The session will instead run from Mon, Apr 20, 2026 4:00 PM UTC to 5:00 PM UTC. The meeting link is here, and you can sign up to our calendar or our Google Group for meeting notifications, or keep an eye on the Cloud Robotics Hub.

1 post - 1 participant

Read full topic

by mikelikesrobots on April 15, 2026 03:06 PM

April 14, 2026
ROSCon Global 2026 Talk Proposals Due April 26th

Quick reminder, presentation proposals for ROSCon Global 2026 in Toronto are due by Sun, Apr 26, 2026 12:00 AM UTC. Please submit your proposals via HotCRP.

Additional details are available on the ROSCon Global 2026 website.

3 posts - 1 participant

Read full topic

by Katherine_Scott on April 14, 2026 04:38 PM

ROS Jazzy driver for Lego Mindstorms Robot Inventor

Hello everyone,

I have ported a ROS driver for the Lego Mindstorms Robot Inventor to ROS Jazzy. Here is the link: GitHub - pem120/lego_ri_ros: ROS packages for Lego Mindstorms Robot Inventor · GitHub

1 post - 1 participant

Read full topic

by pem120 on April 14, 2026 03:58 AM

April 13, 2026
What's new in Transitive 2.0: ClickHouse DB storage, Grafana visualizations, Alerting

Transitive 2.0 is here!

We are thrilled to announce a new major version of Transitive, the open-source framework for full-stack robotics. Version 2.0 adds significant new integrations and features: storage of historic and time-series data in ClickHouse, visualization in Grafana, and custom alerting via Alertmanager. Some of our capabilities, like the free Health Monitoring capability, already use these features, providing significant added value to robotics companies with growing fleets.

Fleet Operation at Scale

Until now Transitive has been very much focused on transactional features needed for the operation of robot fleets. This includes our most popular capabilities: WebRTC Video streaming, Remote Teleop, and ROS Tool. These capabilities are particularly empowering to robotics companies that have not yet deployed more than 50 robots. Transitive’s open-source MQTTSync data protocol, its realization of full-stack packages, and the built-in fine-grained authentication and authorization features provided a solid foundation for us to build such transactional capabilities efficiently and reliably.

But as fleets grow, so do the challenges of monitoring and operating them. Companies need tools that go beyond one operator working on one robot at a time and provide both longitudinal and historic views of the fleet. Similarly, passive monitoring and alerting need to gradually replace active monitoring by (remote) operators. Supporting robotics companies in this second chapter of growth was our goal in this new major release, while staying true to our philosophy of embeddability, ease of use, and fine-grained, namespaced access control.

Read more about the added features and how to try them out here:

1 post - 1 participant

Read full topic

by chfritz on April 13, 2026 07:42 PM

RobotCAD 10.5.0 adapted to FreeCAD 1.1 AppImage

Let us introduce the RobotCAD adaptation to the FreeCAD 1.1 AppImage.
Enjoy the new FreeCAD 1.1 functionality.

RobotCAD is a FreeCAD workbench that generates robot description packages for ROS 2 (URDF) with launchers for Gazebo and RViz. It includes controllers based on ros2_controllers, sensors based on Gazebo, an integrated model library, and many other tools. In other words: CAD → ROS 2.

How to run RobotCAD - fast install and run with FreeCAD 1.1 AppImage

I have not posted release info for a long time; there are a lot of bug fixes and much new functionality in the previous versions.

1 post - 1 participant

Read full topic

by fenixionsoul on April 13, 2026 07:40 PM

Fast Lossless Image Compression: interested?

Hi,

so, I couldn’t shut up about this on my social media, and some of you might already be sick and tired of me, but I am sharing it here hoping to understand whether the robotics community may benefit from this work.

By pure chance, I started exploring the topic of Lossless Image Compression, in particular in terms of speed, thinking about real-time streaming and recording.

I got very interesting results that I think may benefit some use cases in robotics.

Before moving forward releasing the code or more details about the algorithm (that is very much still work in progress) I wanted to:

  • share the binaries with the community, to allow people with a healthy dose of skepticism to replicate the results on their own computer.

  • understand what the actual use case for fast, but still better than PNG, lossless compression is.

These are my results: 3 codecs with 3 different tradeoffs (Griffin being the most balanced one across the three dimensions).

I would love to hear the feedback of the community :grin:

LINK: GitHub - AurynRobotics/dvid3-codec · GitHub

Also, if you think that you have a practical application for this, please DM me to discuss this, either here or contacting me on dfaconti@aurynrobotics.com

Davide

18 posts - 6 participants

Read full topic

by facontidavide on April 13, 2026 10:32 AM

🚀 New "ROS Adopters" page is live - ADD YOUR PROJECT

Hi everyone :waving_hand:

We are excited to announce a new ROS Adopters page on the official ROS documentation site! This is a community-maintained, self-reported directory that showcases organizations and projects using ROS in any capacity - whether it’s a commercial product, a research platform, an educational tool, or anything in between.

:link: Browse the current adopters here: ROS 2 Adopters — ROS 2 Documentation: Rolling documentation

The page supports filtering by domain (e.g., Aerial/Drone, Manufacturing, Research, Consumer Robot, etc.) and by country, and includes a search function to help you find projects that interest you.

:thinking: Why add your project? (Main Part of the Post)

  • :globe_showing_europe_africa: Visibility - Let the world know your project runs on ROS.
  • :light_bulb: Inspire others - Seeing real-world deployments motivates new adopters and contributors.
  • :flexed_biceps: Strengthen the ecosystem - A healthy adopter list demonstrates the breadth and maturity of ROS to potential users, sponsors, and decision-makers.

:memo: How to add your project

We’ve made it as easy as possible. There’s an interactive form right on the documentation site:

:link: Add Your Project — ROS 2 Documentation: Rolling documentation

That’s it :white_check_mark: No special tooling required - you can do it entirely from your browser.

:robot: What counts as an “adopter”?

Anything that uses ROS :rocket: Commercial products, open-source projects, university research labs, hobby builds - if ROS is part of your stack, we’d love to see it listed. The directory is self-reported and accepted with minimal scrutiny, so don’t be shy :blush:

:open_book: Background / History

This feature was proposed in ros2/ros2_documentation#6248 and implemented in PR #6309.

Please consider adding your project, share this post with your colleagues, and let’s build a comprehensive picture of what the ROS ecosystem looks like in 2026! :tada:

Looking forward to seeing your PRs! :folded_hands:
Ping fujitatomoya@github once your PR is up! I am happy to review PRs!

Cheers,
Tomoya

4 posts - 4 participants

Read full topic

by tomoyafujita on April 13, 2026 01:09 AM

April 09, 2026
International Conference on Humanoid Robotics, Innovation & Leadership

======================================================================

                       **CALL FOR PAPERS**
                         **HRFEST 2026**

International Conference on Humanoid Robotics, Innovation & Leadership

Date: November 05 - 07, 2026
Location: Universidad Nacional del Callao (UNAC) - Callao, Peru (Hybrid Event)
Website: https://hrfest.org

CONFERENCE HIGHLIGHTS & WHY SUBMIT

* High-Impact Indexing: All accepted and presented papers will be
submitted to the IEEE Xplore digital library, which is typically
indexed by Scopus and Ei Compendex.
* Hybrid Format: Offering both in-person and virtual presentation
options to accommodate global researchers and industry professionals.
* Global Networking: Hosted alongside the IEEE RAS Regional
Manufacturing Workshop, connecting LATAM researchers with global
industry leaders.

ABOUT THE CONFERENCE

The HRFEST 2026: International Conference on Humanoid Robotics, Innovation
& Leadership is the premier Latin American forum that bridges the gap
between advanced robotics research and industrial leadership. Hosted by
the Universidad Nacional del Callao (UNAC) as the official academic and
not-for-profit sponsor, with NFM Robotics acting as an industrial patron
and logistical facilitator, this conference gathers top researchers,
industry leaders, and innovators.

HRFEST 2026 is technically co-sponsored by IEEE. Accepted and presented
papers will be submitted for inclusion into the IEEE Xplore digital
library, subject to meeting IEEE Xplore’s scope and quality requirements.

TECHNICAL TRACKS & TOPICS OF INTEREST

We invite researchers, academics, and professionals to submit original,
unpublished technical papers. Topics of interest include, but are not
limited to:

* Track 1: Robotics & Adv. Manufacturing

  • Humanoid Robotics, Bipedalism & Legged Locomotion
  • Control Systems, Kinematics & Dynamics
  • Mechatronics, Soft Robotics & Smart Materials
  • Industrial Automation, Cobots & Swarm Robotics

* Track 2: AI & Data Science

  • Machine Learning & Deep Learning
  • Generative AI & LLMs
  • Computer Vision, Pattern Recognition & NLP
  • Ethical AI & Explainable AI (XAI)

* Track 3: Engineering Management

  • Tech, Innovation & R&D Management
  • Industry 4.0 & Digital Transformation
  • Agile Project Management
  • Tech Entrepreneurship & Startups

* Track 4: Applied Technologies

  • Internet of Things (IoT) & Smart Cities
  • Biomedical Eng. & Healthcare Systems
  • Financial Engineering & FinTech
  • Renewable Energy Systems

SUBMISSION GUIDELINES

* Review Process: HRFEST 2026 enforces a strict Double-Blind Peer Review.
* Submission Portal: All manuscripts must be submitted electronically
via EasyChair at: https://easychair.org/conferences/?conf=hrfest2026
* Format & Length: All manuscripts must follow the standard double-column
IEEE Conference template and should not exceed six (6) pages in PDF format.
* Originality: Submissions must be original work not currently under
review by any other conference or journal.
* Camera-Ready Submissions: Final versions of accepted papers must be
validated using IEEE PDF eXpress (Conference ID: 71784X). The PDF
eXpress validation site will open on September 15, 2026.

IMPORTANT DEADLINES

* Full Paper Submission Deadline: July 05, 2026
* Notification of Acceptance: September 15, 2026
* Final Camera-Ready Submission: October 15, 2026

For more information regarding submissions, registration, and the
IEEE RAS Regional Manufacturing Workshop, please visit our official
website: https://hrfest.org

We look forward to seeing you in Callao!

1 post - 1 participant

Read full topic

by RoboticsLab on April 09, 2026 11:13 PM

[Virtual Event] The Messy Reality of Field Autonomy: ROS 2 Architectures, Behavior Trees & Sim-to-Real

Hi everyone,

If you have ever lost a week of field data because of a typo in a custom ROS message, or watched a perfectly tuned simulation model immediately fail on physical hardware, this session is for you.

On May 1st, the Canadian Physical AI Institute (CPAI) is hosting a highly technical, virtual deep-dive into the architectural evolution of robotic autonomy and the gritty realities of physical deployment.

We are moving past the theoretical benchmarks to talk about what actually breaks in the wild and how to architect your software to handle it.

Here is what we are covering:

Part 1: Driving into the (Un)Known: Navigation for Field Robots

Alec Krawciw (PhD candidate, UofT Autonomous Space Robotics Lab & Vanier Scholar) will cover the logistical and systemic realities of field deployment, including:

  • Pre-Field Data Strategies: Why post-processing tools must be built before testing, and how simple data-logging errors (like ROS message naming typos) can ruin a deployment.

  • System Failure is Inevitable: The critical difference between fault prevention and fault recovery, and why strictly deterministic approaches break down off-road.
  • Maximizing Field Time: Practical workflows to reduce on-site engineering workload.

Part 2: Beyond Hard-Coded Control: Embodied AI & ROS 2 Architecture

Behnam Moradi (Senior Software Engineer in Robotic Autonomy) will break down the shift from classical state machines to modern autonomy stacks:

  • From Loops to Graphs: Making the architectural leap from linear execution loops to the distributed graph of nodes required in ROS 2 (“What data is available now?”).

  • Behavior Trees & Goal-Seeking: Moving beyond massive if-else chains to priority-driven agents that respect constraints and dynamically replan.

  • The True Role of Simulation: Why tools like PX4 and AirSim aren’t for testing if your software works, but for validating if your simulation was accurate in the first place.
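As a toy illustration of the architectural shift described above, a priority-driven Fallback node can replace a nested if-else chain: children are ticked in priority order, and the first one that does not fail wins the tick. This is a minimal, library-free sketch; the class and status names are generic illustrations, not taken from any particular behavior-tree framework:

```python
from enum import Enum

class Status(Enum):
    SUCCESS = 1
    FAILURE = 2
    RUNNING = 3

class Fallback:
    """Tick children in priority order; return the first non-FAILURE status."""
    def __init__(self, *children):
        self.children = children

    def tick(self):
        for child in self.children:
            status = child.tick()
            if status != Status.FAILURE:
                return status
        return Status.FAILURE

class Condition:
    """Leaf that maps a boolean predicate to SUCCESS/FAILURE."""
    def __init__(self, predicate):
        self.predicate = predicate

    def tick(self):
        return Status.SUCCESS if self.predicate() else Status.FAILURE

class Action:
    """Leaf wrapping an arbitrary behavior that reports its own status."""
    def __init__(self, fn):
        self.fn = fn

    def tick(self):
        return self.fn()

# Priority order: the battery check preempts navigation when it succeeds;
# otherwise the navigation action keeps running.
battery_critical = lambda: False
root = Fallback(
    Condition(battery_critical),        # highest priority: safety check
    Action(lambda: Status.RUNNING),     # otherwise keep navigating
)
print(root.tick())  # Status.RUNNING
```

Because priorities are re-evaluated on every tick, swapping the predicate to return True immediately reroutes control to the higher-priority branch, which is the "dynamic replanning" property that flat if-else chains make hard to maintain.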

Event Details

  • Date: Friday, May 1

  • Time: 6:00 PM - 8:00 PM EDT

  • Location: Google Meet

  • Host: Diana Gomez Galeano (former Director, McGill Robotics)

Whether you are migrating a stack to ROS 2, building out your first Behavior Trees, or gearing up for summer field trials, we would love to have you join the conversation. We will have dedicated time for Q&A to help troubleshoot your specific architecture roadblocks.

Registration & Tickets: We have 10 complimentary tickets for members of the ROS community to join us.

Looking forward to seeing some of you there!

Cheers,

Saeed Sarfarazi
Canadian Physical AI Institute (CPAI)

1 post - 1 participant

Read full topic

by Saeed on April 09, 2026 12:02 AM

April 08, 2026
FusionCore demo: GPS outlier rejection in a ROS 2 filter built to replace robot_localization

Quick demo of outlier rejection working in simulation.

I built a spike injector that publishes fake GPS fixes 500 meters from the robot’s actual position into a running FusionCore filter. The Mahalanobis distance hit 60,505 against a rejection threshold of 16. All three spikes were dropped instantly, and the position estimate didn’t move.
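The gating described above can be sketched in a few lines: a measurement is rejected when the squared Mahalanobis distance of its innovation exceeds a fixed chi-square threshold (16 here, matching the numbers in the post). This is a generic illustration of the technique, not FusionCore’s actual API; the function names, frame, and covariance values are made up:

```python
import numpy as np

def mahalanobis_sq(z, z_pred, S):
    """Squared Mahalanobis distance of the innovation z - z_pred
    under innovation covariance S."""
    nu = z - z_pred
    return float(nu @ np.linalg.solve(S, nu))

def accept_measurement(z, z_pred, S, threshold=16.0):
    """Gate: keep the measurement only if it is statistically plausible."""
    return mahalanobis_sq(z, z_pred, S) <= threshold

# Predicted GPS position (meters, local frame) and its innovation covariance.
z_pred = np.array([10.0, 5.0])
S = np.diag([0.5, 0.5])

good_fix = np.array([10.3, 4.8])            # near the prediction
spike = z_pred + np.array([500.0, 0.0])     # 500 m spike, as in the demo

print(accept_measurement(good_fix, z_pred, S))   # True
print(accept_measurement(spike, z_pred, S))      # False
```

With a tight innovation covariance, a 500 m spike produces a distance orders of magnitude above the threshold, which is why rejection is instantaneous and the filter state never moves.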

The video is 30 seconds: robot driving in Gazebo, FusionCore GCS dashboard showing the Mahalanobis waveform, rejection log, and spike counter updating in real time.

GitHub

For anyone who missed the original announcement: FusionCore is a ROS 2 Jazzy sensor fusion package intended to replace the deprecated robot_localization. It fuses IMU, wheel encoders, and GPS via a UKF at 100 Hz. Apache 2.0 licensed.

GitHub: https://github.com/manankharwar/fusioncore

1 post - 1 participant

Read full topic

by manankharwar on April 08, 2026 04:23 PM


Powered by the awesome: Planet