<?xml version="1.0"?>
<rss version="2.0">

<channel>
	<title>Planet ROS</title>
	<link>http://planet.ros.org</link>
	<language>en</language>
	<description>Planet ROS - http://planet.ros.org</description>

<item>
	<title>ROS Discourse General: International Conference on Humanoid Robotics, Innovation &amp; Leadership</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53922</guid>
	<link>https://discourse.openrobotics.org/t/international-conference-on-humanoid-robotics-innovation-leadership/53922</link>
	<description>
&lt;pre&gt;&lt;code&gt;                       **CALL FOR PAPERS**
                         **HRFEST 2026**
&lt;/code&gt;&lt;/pre&gt;
&lt;h1&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111398-international-conference-on-humanoid-robotics-innovation-leadership-1&quot; name=&quot;p-111398-international-conference-on-humanoid-robotics-innovation-leadership-1&quot;&gt;&lt;/a&gt;&lt;strong&gt;International Conference on Humanoid Robotics, Innovation &amp;amp; Leadership&lt;/strong&gt;&lt;/h1&gt;
&lt;p&gt;Date: November 05 - 07, 2026&lt;br /&gt;
Location: Universidad Nacional del Callao (UNAC) - Callao, Peru (Hybrid Event)&lt;br /&gt;
Website: &lt;a href=&quot;https://hrfest.org&quot; rel=&quot;noopener nofollow ugc&quot;&gt;https://hrfest.org&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111398-conference-highlights-why-submit-2&quot; name=&quot;p-111398-conference-highlights-why-submit-2&quot;&gt;&lt;/a&gt;&lt;strong&gt;CONFERENCE HIGHLIGHTS &amp;amp; WHY SUBMIT&lt;/strong&gt;&lt;/h2&gt;
&lt;p&gt;* High-Impact Indexing: All accepted and presented papers will be&lt;br /&gt;
submitted to the IEEE Xplore digital library, which is typically&lt;br /&gt;
indexed by Scopus and Ei Compendex.&lt;br /&gt;
* Hybrid Format: Offering both in-person and virtual presentation&lt;br /&gt;
options to accommodate global researchers and industry professionals.&lt;br /&gt;
* Global Networking: Hosted alongside the IEEE RAS Regional&lt;br /&gt;
Manufacturing Workshop, connecting LATAM researchers with global&lt;br /&gt;
industry leaders.&lt;/p&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111398-about-the-conference-3&quot; name=&quot;p-111398-about-the-conference-3&quot;&gt;&lt;/a&gt;&lt;strong&gt;ABOUT THE CONFERENCE&lt;/strong&gt;&lt;/h2&gt;
&lt;p&gt;The HRFEST 2026: International Conference on Humanoid Robotics, Innovation&lt;br /&gt;
&amp;amp; Leadership is the premier Latin American forum that bridges the gap&lt;br /&gt;
between advanced robotics research and industrial leadership. Hosted by&lt;br /&gt;
the Universidad Nacional del Callao (UNAC) as the official academic and&lt;br /&gt;
not-for-profit sponsor, with NFM Robotics acting as an industrial patron&lt;br /&gt;
and logistical facilitator, this conference gathers top researchers,&lt;br /&gt;
industry leaders, and innovators.&lt;/p&gt;
&lt;p&gt;HRFEST 2026 is technically co-sponsored by IEEE. Accepted and presented&lt;br /&gt;
papers will be submitted for inclusion into the IEEE Xplore digital&lt;br /&gt;
library, subject to meeting IEEE Xplore’s scope and quality requirements.&lt;/p&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111398-technical-tracks-topics-of-interest-4&quot; name=&quot;p-111398-technical-tracks-topics-of-interest-4&quot;&gt;&lt;/a&gt;&lt;strong&gt;TECHNICAL TRACKS &amp;amp; TOPICS OF INTEREST&lt;/strong&gt;&lt;/h2&gt;
&lt;p&gt;We invite researchers, academics, and professionals to submit original,&lt;br /&gt;
unpublished technical papers. Topics of interest include, but are not&lt;br /&gt;
limited to:&lt;/p&gt;
&lt;p&gt;* Track 1: Robotics &amp;amp; Adv. Manufacturing&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Humanoid Robotics, Bipedalism &amp;amp; Legged Locomotion&lt;/li&gt;
&lt;li&gt;Control Systems, Kinematics &amp;amp; Dynamics&lt;/li&gt;
&lt;li&gt;Mechatronics, Soft Robotics &amp;amp; Smart Materials&lt;/li&gt;
&lt;li&gt;Industrial Automation, Cobots &amp;amp; Swarm Robotics&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;* Track 2: AI &amp;amp; Data Science&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Machine Learning &amp;amp; Deep Learning&lt;/li&gt;
&lt;li&gt;Generative AI &amp;amp; LLMs&lt;/li&gt;
&lt;li&gt;Computer Vision, Pattern Recognition &amp;amp; NLP&lt;/li&gt;
&lt;li&gt;Ethical AI &amp;amp; Explainable AI (XAI)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;* Track 3: Engineering Management&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Tech, Innovation &amp;amp; R&amp;amp;D Management&lt;/li&gt;
&lt;li&gt;Industry 4.0 &amp;amp; Digital Transformation&lt;/li&gt;
&lt;li&gt;Agile Project Management&lt;/li&gt;
&lt;li&gt;Tech Entrepreneurship &amp;amp; Startups&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;* Track 4: Applied Technologies&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Internet of Things (IoT) &amp;amp; Smart Cities&lt;/li&gt;
&lt;li&gt;Biomedical Eng. &amp;amp; Healthcare Systems&lt;/li&gt;
&lt;li&gt;Financial Engineering &amp;amp; FinTech&lt;/li&gt;
&lt;li&gt;Renewable Energy Systems&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111398-submission-guidelines-5&quot; name=&quot;p-111398-submission-guidelines-5&quot;&gt;&lt;/a&gt;&lt;strong&gt;SUBMISSION GUIDELINES&lt;/strong&gt;&lt;/h2&gt;
&lt;p&gt;* Review Process: HRFEST 2026 enforces a strict Double-Blind Peer Review.&lt;br /&gt;
* Submission Portal: All manuscripts must be submitted electronically&lt;br /&gt;
via EasyChair at: &lt;a href=&quot;https://easychair.org/conferences/?conf=hrfest2026&quot; rel=&quot;noopener nofollow ugc&quot;&gt;https://easychair.org/conferences/?conf=hrfest2026&lt;/a&gt;&lt;br /&gt;
* Format &amp;amp; Length: All manuscripts must follow the standard double-column&lt;br /&gt;
IEEE Conference template and should not exceed six (6) pages in PDF format.&lt;br /&gt;
* Originality: Submissions must be original work not currently under&lt;br /&gt;
review by any other conference or journal.&lt;br /&gt;
* Camera-Ready Submissions: Final versions of accepted papers must be&lt;br /&gt;
validated using IEEE PDF eXpress (Conference ID: 71784X). The PDF&lt;br /&gt;
eXpress validation site will open on September 15, 2026.&lt;/p&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111398-important-deadlines-6&quot; name=&quot;p-111398-important-deadlines-6&quot;&gt;&lt;/a&gt;IMPORTANT DEADLINES&lt;/h2&gt;
&lt;p&gt;* Full Paper Submission Deadline:        July 05, 2026&lt;br /&gt;
* Notification of Acceptance:            September 15, 2026&lt;br /&gt;
* Final Camera-Ready Submission:         October 15, 2026&lt;/p&gt;
&lt;p&gt;For more information regarding submissions, registration, and the&lt;br /&gt;
IEEE RAS Regional Manufacturing Workshop, please visit our official&lt;br /&gt;
website: &lt;a href=&quot;https://hrfest.org&quot; rel=&quot;noopener nofollow ugc&quot;&gt;https://hrfest.org&lt;/a&gt;&lt;/p&gt;
&lt;h1&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111398-we-look-forward-to-seeing-you-in-callao-7&quot; name=&quot;p-111398-we-look-forward-to-seeing-you-in-callao-7&quot;&gt;&lt;/a&gt;We look forward to seeing you in Callao!&lt;/h1&gt;
            &lt;p&gt;&lt;small&gt;1 post - 1 participant&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/international-conference-on-humanoid-robotics-innovation-leadership/53922&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Thu, 09 Apr 2026 23:13:24 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: [Virtual Event] The Messy Reality of Field Autonomy: ROS 2 Architectures, Behavior Trees &amp; Sim-to-Real</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53895</guid>
	<link>https://discourse.openrobotics.org/t/virtual-event-the-messy-reality-of-field-autonomy-ros-2-architectures-behavior-trees-sim-to-real/53895</link>
	<description>&lt;p&gt;Hi everyone,&lt;/p&gt;
&lt;p&gt;If you have ever lost a week of field data because of a typo in a custom ROS message, or watched a perfectly tuned simulation model immediately fail on physical hardware, this session is for you.&lt;/p&gt;
&lt;p&gt;On May 1st, the Canadian Physical AI Institute (CPAI) is hosting a highly technical, virtual deep-dive into the architectural evolution of robotic autonomy and the gritty realities of physical deployment.&lt;/p&gt;
&lt;p&gt;We are moving past the theoretical benchmarks to talk about what actually breaks in the wild and how to architect your software to handle it.&lt;/p&gt;
&lt;p&gt;Here is what we are covering:&lt;/p&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111355-part-1-driving-into-the-unknown-navigation-for-field-robots-1&quot; name=&quot;p-111355-part-1-driving-into-the-unknown-navigation-for-field-robots-1&quot;&gt;&lt;/a&gt;Part 1: Driving into the (Un)Known: Navigation for Field Robots&lt;/h3&gt;
&lt;p&gt;&lt;strong&gt;Alec Krawciw&lt;/strong&gt; &lt;em&gt;(PhD candidate, UofT Autonomous Space Robotics Lab &amp;amp; Vanier Scholar)&lt;/em&gt; will cover the logistical and systemic realities of field deployment, including:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Pre-Field Data Strategies:&lt;/strong&gt; Why post-processing tools must be built &lt;em&gt;before&lt;/em&gt; testing, and how simple data-logging errors (like ROS message naming typos) can ruin a deployment.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;System Failure is Inevitable:&lt;/strong&gt; The critical difference between fault prevention and fault recovery, and why strict deterministic approaches shatter off-road.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Maximizing Field Time:&lt;/strong&gt; Practical workflows to reduce on-site engineering workload.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111355-part-2-beyond-hard-coded-control-embodied-ai-ros-2-architecture-2&quot; name=&quot;p-111355-part-2-beyond-hard-coded-control-embodied-ai-ros-2-architecture-2&quot;&gt;&lt;/a&gt;Part 2: Beyond Hard-Coded Control: Embodied AI &amp;amp; ROS 2 Architecture&lt;/h3&gt;
&lt;p&gt;&lt;strong&gt;Behnam Moradi&lt;/strong&gt; &lt;em&gt;(Senior Software Engineer in Robotic Autonomy)&lt;/em&gt; will break down the shift from classical state machines to modern autonomy stacks:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;From Loops to Graphs:&lt;/strong&gt; Making the architectural leap from linear execution loops to the distributed graph of nodes required in ROS 2 (“What data is available now?”).&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Behavior Trees &amp;amp; Goal-Seeking:&lt;/strong&gt; Moving beyond massive if-else chains to priority-driven agents that respect constraints and dynamically replan.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;The True Role of Simulation:&lt;/strong&gt; Why tools like PX4 and AirSim aren’t for testing if your software works, but for validating if your simulation was accurate in the first place.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111355-event-details-3&quot; name=&quot;p-111355-event-details-3&quot;&gt;&lt;/a&gt;Event Details&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt; Friday, May 1&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Time:&lt;/strong&gt; 6:00 PM - 8:00 PM EDT&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Location:&lt;/strong&gt; Google Meet&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Host:&lt;/strong&gt; Diana Gomez Galeano (former Director, McGill Robotics)&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Whether you are migrating a stack to ROS 2, building out your first Behavior Trees, or gearing up for summer field trials, we would love to have you join the conversation. We will have dedicated time for Q&amp;amp;A to help troubleshoot your specific architecture roadblocks.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Registration &amp;amp; Tickets:&lt;/strong&gt; We have 10 complimentary tickets for the ROS community to join us.&lt;/p&gt;
&lt;aside class=&quot;onebox allowlistedgeneric&quot;&gt;
  &lt;header class=&quot;source&quot;&gt;
      &lt;img alt=&quot;&quot; class=&quot;site-icon&quot; height=&quot;64&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/e/3/e3b39e7454658336569f54217647cd20a0b22d17.png&quot; width=&quot;64&quot; /&gt;

      &lt;a href=&quot;https://luma.com/zen4crma?coupon=ROSDISCOURSE&quot; rel=&quot;noopener nofollow ugc&quot; target=&quot;_blank&quot;&gt;luma.com&lt;/a&gt;
  &lt;/header&gt;

  &lt;article class=&quot;onebox-body&quot;&gt;
    &lt;div class=&quot;aspect-image&quot;&gt;&lt;img alt=&quot;&quot; class=&quot;thumbnail&quot; height=&quot;362&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/9/0/90de0551ab816d0d63f6b467a2e1d4e782207f28.jpeg&quot; width=&quot;690&quot; /&gt;&lt;/div&gt;

&lt;h3&gt;&lt;a href=&quot;https://luma.com/zen4crma?coupon=ROSDISCOURSE&quot; rel=&quot;noopener nofollow ugc&quot; target=&quot;_blank&quot;&gt;Physical AI: The Messy Reality of Field Autonomy · Luma&lt;/a&gt;&lt;/h3&gt;

  &lt;p&gt;Are You Building for the Real World, or Just the Benchmark?
While urban driving is a primary focus for both industry and academia, the strict &quot;rules of the…&lt;/p&gt;


  &lt;/article&gt;

  &lt;div class=&quot;onebox-metadata&quot;&gt;
    
    
  &lt;/div&gt;

  &lt;div style=&quot;clear: both;&quot;&gt;&lt;/div&gt;
&lt;/aside&gt;

&lt;p&gt;Looking forward to seeing some of you there!&lt;/p&gt;
&lt;p&gt;Cheers,&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Saeed Sarfarazi&lt;/strong&gt;&lt;br /&gt;
Canadian Physical AI Institute (CPAI)&lt;/p&gt;
            &lt;p&gt;&lt;small&gt;1 post - 1 participant&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/virtual-event-the-messy-reality-of-field-autonomy-ros-2-architectures-behavior-trees-sim-to-real/53895&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Thu, 09 Apr 2026 00:02:38 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: FusionCore demo: GPS outlier rejection in a ROS 2 filter built to replace robot_localization</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53887</guid>
	<link>https://discourse.openrobotics.org/t/fusioncore-demo-gps-outlier-rejection-in-a-ros-2-filter-built-to-replace-robot-localization/53887</link>
	<description>&lt;p&gt;Quick demo of outlier rejection working in simulation.&lt;/p&gt;
&lt;p&gt;I built a spike injector that publishes a fake GPS fix 500 meters from the robot’s actual position into a live running FusionCore filter. The Mahalanobis distance hit 60,505 against a rejection threshold of 16. All three spikes dropped instantly. Position didn’t move.&lt;/p&gt;
&lt;p&gt;The video is 30 seconds: robot driving in Gazebo, FusionCore GCS dashboard showing the Mahalanobis waveform, rejection log, and spike counter updating in real time.&lt;/p&gt;
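The gate behind this demo can be sketched in a few lines. This is a generic chi-square innovation gate, assuming a measurement `z`, predicted measurement `z_pred`, and innovation covariance `S`; it is an illustration of the technique, not FusionCore's actual code:

```python
import numpy as np

def mahalanobis_gate(z, z_pred, S, threshold=16.0):
    """Chi-square gate on the innovation.

    z: measurement vector, z_pred: predicted measurement,
    S: innovation covariance. Returns (accept, d2) where d2 is the
    squared Mahalanobis distance of the innovation.
    """
    nu = z - z_pred                          # innovation
    d2 = float(nu @ np.linalg.solve(S, nu))  # nu^T S^-1 nu
    if d2 > threshold:
        return False, d2  # drop the fix; filter state stays untouched
    return True, d2
```

With a 500 m spike against meter-scale GPS noise, d2 lands around 10^5, orders of magnitude past a threshold of 16, which is why the spikes in the video are dropped instantly.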
&lt;p&gt;&lt;img alt=&quot;GitHub&quot; class=&quot;animated&quot; height=&quot;388&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/5/e/5e5a48f23f57b62e6f5b31a7aff4b31cbdab175a.gif&quot; width=&quot;690&quot; /&gt;&lt;/p&gt;
&lt;p&gt;For anyone who missed the original announcement: FusionCore is a ROS 2 Jazzy sensor fusion package replacing the deprecated robot_localization. IMU, wheel encoders, and GPS are fused via a UKF at 100 Hz. Apache 2.0.&lt;/p&gt;
&lt;p&gt;GitHub: &lt;a href=&quot;https://github.com/manankharwar/fusioncore&quot; rel=&quot;noopener nofollow ugc&quot;&gt;https://github.com/manankharwar/fusioncore&lt;/a&gt;&lt;/p&gt;
            &lt;p&gt;&lt;small&gt;1 post - 1 participant&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/fusioncore-demo-gps-outlier-rejection-in-a-ros-2-filter-built-to-replace-robot-localization/53887&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Wed, 08 Apr 2026 16:23:09 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: Delaying Lyrical RMW and Feature Freezes</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53858</guid>
	<link>https://discourse.openrobotics.org/t/delaying-lyrical-rmw-and-feature-freezes/53858</link>
	<description>&lt;p&gt;Hi all,&lt;/p&gt;
&lt;p&gt;In today’s ROS PMC meeting we decided to delay the RMW freeze and Feature freeze by 1 week each. The purpose of the delay is to give more time to upgrade and stabilize all Tier 1 RMW implementations. The ROS Lyrical Release date has not changed.&lt;/p&gt;
&lt;p&gt;The new timelines are:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;New RMW Freeze: &lt;span class=&quot;discourse-local-date&quot;&gt;Tue, Apr 14, 2026 6:59 AM UTC&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;New Feature freeze: &lt;span class=&quot;discourse-local-date&quot;&gt;Tue, Apr 21, 2026 6:59 AM UTC&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;New Branch from Rolling: &lt;span class=&quot;discourse-local-date&quot;&gt;Wed, Apr 22, 2026 6:59 AM UTC&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Updates here: &lt;a class=&quot;inline-onebox&quot; href=&quot;https://github.com/ros2/ros2_documentation/pull/6350&quot; rel=&quot;noopener nofollow ugc&quot;&gt;Delay Lyrical RMW Freeze; Feature Freeze; Branch by sloretz · Pull Request #6350 · ros2/ros2_documentation · GitHub&lt;/a&gt;&lt;/p&gt;
            &lt;p&gt;&lt;small&gt;1 post - 1 participant&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/delaying-lyrical-rmw-and-feature-freezes/53858&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Wed, 08 Apr 2026 00:37:31 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: Multi-Robot Fleet Management System using ROS2, Nav2, and Gazebo</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53835</guid>
	<link>https://discourse.openrobotics.org/t/multi-robot-fleet-management-system-using-ros2-nav2-and-gazebo/53835</link>
	<description>&lt;p&gt;I am developing a multi-robot fleet management system in a simulated warehouse environment using ROS2 (Humble) and Gazebo. The system is designed to study scalable coordination and task allocation across multiple autonomous mobile robots operating in a structured environment.&lt;/p&gt;
&lt;p&gt;The architecture follows a distributed approach where each robot is implemented as an independent agent node responsible for navigation, execution, and state reporting. A centralized fleet manager node handles global task allocation and coordination. Communication is implemented using ROS2 topics, services, and action interfaces to enable asynchronous and real-time interaction between components.&lt;/p&gt;
&lt;p&gt;Navigation is implemented using the Nav2 stack, integrating localization, global and local path planning, and obstacle avoidance. LiDAR-based perception is used for environmental awareness and safe navigation within the simulated warehouse.&lt;/p&gt;
&lt;p&gt;The system supports dynamic task allocation, where robots receive pick-and-deliver tasks, compute feasible paths, and execute them while continuously publishing execution status. A typical workflow involves a robot navigating to a shelf location, performing a simulated pickup, and delivering the item to a designated drop-off point.&lt;/p&gt;
&lt;p&gt;This project focuses on understanding distributed robotic system design, inter-node communication, and multi-robot coordination challenges such as scalability and synchronization. Future work includes implementing conflict resolution strategies, fleet-level optimization, and extending the system toward real-world deployment.&lt;/p&gt;
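As a rough illustration of the dynamic task allocation described above, a minimal fleet manager might greedily assign each pending pick-and-deliver task to the nearest idle robot. The function below is a hypothetical sketch of that idea, not the project's actual fleet manager node:

```python
import math

def allocate_tasks(robots, tasks):
    """Greedy nearest-robot allocation.

    robots: {name: (x, y)} current positions of idle robots.
    tasks: [(x, y), ...] pickup locations, in arrival order.
    Returns {task_index: robot_name}.
    """
    idle = dict(robots)
    assignment = {}
    for i, task in enumerate(tasks):
        if not idle:
            break  # more tasks than robots; the rest wait for the next cycle
        name = min(idle, key=lambda r: math.dist(idle[r], task))
        assignment[i] = name
        del idle[name]  # robot is now busy until it reports completion
    return assignment
```

A real allocator would also weigh battery level, path feasibility from Nav2, and task priority, but the greedy baseline is a common starting point for studying scalability.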
&lt;div class=&quot;d-image-grid&quot;&gt;
&lt;p&gt;&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/2/8/28800d3297f3e5a85f16cf9e7098c2e8639eaf39.jpeg&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;Screenshot from 2026-04-06 11-28-19&quot;&gt;&lt;img alt=&quot;Screenshot from 2026-04-06 11-28-19&quot; height=&quot;388&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/2/8/28800d3297f3e5a85f16cf9e7098c2e8639eaf39_2_690x388.jpeg&quot; width=&quot;690&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;p&gt;&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/2/9/296143ac9384b7613817cd0cf7f3a4a929c49202.png&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;Screenshot from 2026-04-06 11-32-41&quot;&gt;&lt;img alt=&quot;Screenshot from 2026-04-06 11-32-41&quot; height=&quot;123&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/2/9/296143ac9384b7613817cd0cf7f3a4a929c49202_2_690x123.png&quot; width=&quot;690&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;p&gt;&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/2/4/24141c16d72d44bc034bef7857490c0b16555b65.png&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;Screenshot from 2026-04-06 11-29-29&quot;&gt;&lt;img alt=&quot;Screenshot from 2026-04-06 11-29-29&quot; height=&quot;388&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/2/4/24141c16d72d44bc034bef7857490c0b16555b65_2_690x388.png&quot; width=&quot;690&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;/div&gt;
            &lt;p&gt;&lt;small&gt;4 posts - 4 participants&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/multi-robot-fleet-management-system-using-ros2-nav2-and-gazebo/53835&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Mon, 06 Apr 2026 23:54:50 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: Ros2_medkit + VDA 5050: bridging SOVD diagnostics with fleet management</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53829</guid>
	<link>https://discourse.openrobotics.org/t/ros2-medkit-vda-5050-bridging-sovd-diagnostics-with-fleet-management/53829</link>
	<description>&lt;p&gt;Hey everyone,&lt;/p&gt;
&lt;p&gt;Quick update on &lt;strong&gt;ros2_medkit&lt;/strong&gt;. We’ve been exploring how medkit’s diagnostic data can serve &lt;strong&gt;VDA 5050&lt;/strong&gt; fleet integrations, and put together a working demo.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Context:&lt;/strong&gt; VDA 5050 error reporting is intentionally minimal (errorType, errorLevel, errorDescription). That’s fine for fleet routing decisions, but when an engineer needs to debug a fault, there’s a gap. We wanted to see if medkit’s SOVD layer could fill it without breaking either standard.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;What we did:&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;The new SOVD Service Interface plugin exposes medkit’s entity tree, faults, and capabilities via ROS 2 services (ListEntities, GetEntityFaults, GetCapabilities). This means any ROS 2 node can query diagnostic data (not just SOVD/REST clients).&lt;/p&gt;
&lt;p&gt;We built a VDA 5050 agent as a separate process that:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Handles MQTT communication with a fleet manager (orders, state, instant actions)&lt;/li&gt;
&lt;li&gt;Drives Nav2 for navigation&lt;/li&gt;
&lt;li&gt;Queries medkit’s services to report faults as VDA 5050 errors&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;medkit&lt;/strong&gt; stays completely unaware of &lt;strong&gt;VDA 5050&lt;/strong&gt;. The agent is just another ROS 2 service consumer (same interface a BT.CPP node or PlotJuggler plugin would use).&lt;/p&gt;
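The translation the agent performs can be pictured as a simple mapping from a medkit-style fault record to the minimal VDA 5050 error triple. Only errorType, errorLevel, and errorDescription come from the standard; the field names on the fault side are assumptions for illustration:

```python
# VDA 5050 defines two error levels: WARNING and FATAL.
# The severity names on the left are hypothetical medkit-side values.
SEVERITY_TO_LEVEL = {"ERROR": "FATAL", "WARN": "WARNING", "STALE": "WARNING"}

def fault_to_vda5050_error(fault):
    """Map a fault dict (hypothetical schema) to a VDA 5050 error object."""
    return {
        "errorType": fault.get("code", "unknownFault"),
        "errorLevel": SEVERITY_TO_LEVEL.get(fault.get("severity"), "WARNING"),
        "errorDescription": fault.get("message", ""),
    }
```

The deeper diagnostic context (freeze frames, extended data records, rosbags) stays on the SOVD side, which is exactly the gap-filling split described above.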
&lt;p&gt;&lt;img alt=&quot;vda5050_demo_560_15fps&quot; class=&quot;animated&quot; height=&quot;500&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/d/e/deafdfc70298fd89171d48899c178ad071d9eafc.gif&quot; width=&quot;500&quot; /&gt;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Demo video&lt;/strong&gt;:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;ROSMASTER M3 Pro (Jetson Orin Nano),&lt;/li&gt;
&lt;li&gt;mission dispatched from VDA 5050 Visualizer,&lt;/li&gt;
&lt;li&gt;LiDAR fault injected mid-navigation,&lt;/li&gt;
&lt;li&gt;fault propagated to fleet manager + full SOVD snapshot (freeze frames, extended data records, rosbag) in medkit’s web UI.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;The service interface plugin is useful beyond VDA 5050&lt;/strong&gt; - anything that consumes ROS 2 services can now pull diagnostic data from medkit. Curious if anyone sees other use cases.&lt;/p&gt;
&lt;p&gt;repo: &lt;a class=&quot;inline-onebox&quot; href=&quot;https://github.com/selfpatch/ros2_medkit&quot; rel=&quot;noopener nofollow ugc&quot;&gt;GitHub - selfpatch/ros2_medkit: ros2_medkit - diagnostics gateway for ROS 2 robots. Faults, live data, operations, scripts, locking, triggers, and OTA updates via REST API. No SSH, no custom tooling. · GitHub&lt;/a&gt;&lt;/p&gt;
            &lt;p&gt;&lt;small&gt;1 post - 1 participant&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/ros2-medkit-vda-5050-bridging-sovd-diagnostics-with-fleet-management/53829&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Mon, 06 Apr 2026 14:42:00 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: How to find code “someone already wrote that”? (WaypointFollow Metrics, “Rotate Normal To Wall”)</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53828</guid>
	<link>https://discourse.openrobotics.org/t/how-to-find-code-someone-already-wrote-that-waypointfollow-metrics-rotate-normal-to-wall/53828</link>
	<description>&lt;p&gt;I came to ROS many years ago thinking “someone has probably already coded every basic robotics challenge”.  Indeed, I found lots to use, but still find myself writing basic nodes because I don’t know how to search the “ROS mine” for a particular basic node I need.&lt;/p&gt;
&lt;p&gt;For example:  I’m trying to improve the robustness and reliability of navigation of my TurtleBot4 robot in my home environment.  Nav2 has a million parameters, and I have managed to get a param set for 10 waypoints around my home that succeeds in most tests.  Two desirable waypoints cause a lot of recoveries and occasional goal failures.&lt;/p&gt;
&lt;p&gt;I need a test node that collects recovery metrics and goal success/failure/skipped status during a 10-stop waypoint-following run, to compare robustness and reliability across parameter changes and waypoint tweaks.  Other metrics like navigation time, distance travelled, and delta x, y, heading between goal and result would be nice to have.&lt;/p&gt;
&lt;p&gt;Surely someone has written a Nav2 test node I can use to optimize my Nav2 parameter set?&lt;/p&gt;
&lt;p&gt;P.s.  “Rotate normal to closest wall by /scan” is another basic challenge I would guess was written years ago.&lt;/p&gt;
            &lt;p&gt;&lt;small&gt;1 post - 1 participant&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/how-to-find-code-someone-already-wrote-that-waypointfollow-metrics-rotate-normal-to-wall/53828&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Mon, 06 Apr 2026 13:54:54 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: Introduction: QERRA-v2 — Hybrid Quantum-Ethical Safety Layer for Humanoid Robots</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53812</guid>
	<link>https://discourse.openrobotics.org/t/introduction-qerra-v2-hybrid-quantum-ethical-safety-layer-for-humanoid-robots/53812</link>
	<description>&lt;p&gt;Hello everyone,&lt;/p&gt;
&lt;p&gt;My name is Marussa Metocharaki (@marunigno). I’m the solo founder of &lt;strong&gt;QERRA-v2&lt;/strong&gt;, a hybrid quantum-classical ethical decision engine for safer humanoid robots and high-stakes AI systems.&lt;/p&gt;
&lt;p&gt;The project combines quantum-inspired exploration (I successfully ran a real 8-qubit W-state on IBM quantum hardware) with classical ethical vectors (SEMEV-12), toxicity detection, and a safety kernel. I already have a live public API with a working /analyze endpoint.&lt;/p&gt;
&lt;p&gt;Right now the project is still in an early experimental stage: the classical safety layer works well, while the hybrid quantum part is a small prototype that I am actively improving.&lt;/p&gt;
&lt;p&gt;I’m building this completely alone under significant personal constraints, and I would love to connect with people in the robotics community who care about ethical and safety layers for humanoid robots.&lt;/p&gt;
&lt;p&gt;I just published the full whitepaper, and the code is open source (AGPL-3.0). I would be very grateful for any feedback, ideas, or potential collaboration.&lt;/p&gt;
&lt;p&gt;GitHub: &lt;a href=&quot;https://github.com/marunigno-ship-it/QERRA-v2&quot; rel=&quot;noopener nofollow ugc&quot;&gt;https://github.com/marunigno-ship-it/QERRA-v2&lt;/a&gt;&lt;br /&gt;
Whitepaper: &lt;a href=&quot;https://github.com/marunigno-ship-it/QERRA-v2/blob/main/WHITEPAPER.md&quot; rel=&quot;noopener nofollow ugc&quot;&gt;https://github.com/marunigno-ship-it/QERRA-v2/blob/main/WHITEPAPER.md&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Thank you, and looking forward to learning from this community!&lt;/p&gt;
            &lt;p&gt;&lt;small&gt;1 post - 1 participant&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/introduction-qerra-v2-hybrid-quantum-ethical-safety-layer-for-humanoid-robots/53812&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Sun, 05 Apr 2026 23:24:14 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: Rapid deployment of OpenClaw and GraspGen grasping system</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53764</guid>
	<link>https://discourse.openrobotics.org/t/rapid-deployment-of-openclaw-and-graspgen-crawling-system/53764</link>
	<description>&lt;h1&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111118-openclawpi-agilex-robotics-skill-set-library-1&quot; name=&quot;p-111118-openclawpi-agilex-robotics-skill-set-library-1&quot;&gt;&lt;/a&gt;OpenClawPi: AgileX Robotics Skill Set Library&lt;/h1&gt;
&lt;p&gt;&lt;img alt=&quot;License&quot; height=&quot;20&quot; src=&quot;https://img.shields.io/badge/license-MIT-blue.svg&quot; width=&quot;78&quot; /&gt;&lt;br /&gt;
&lt;img alt=&quot;Language&quot; height=&quot;20&quot; src=&quot;https://img.shields.io/badge/language-Python-blue.svg&quot; width=&quot;110&quot; /&gt;&lt;br /&gt;
&lt;img alt=&quot;Platform&quot; height=&quot;20&quot; src=&quot;https://img.shields.io/badge/platform-Linux%20(Ubuntu)-green.svg&quot; width=&quot;150&quot; /&gt;&lt;/p&gt;
&lt;p&gt;OpenClawPi is a modular skill set repository focused on the rapid integration and reuse of core robot functions. Covering key scenarios such as robotic arm control, grasping, visual perception, and voice interaction, it provides out-of-the-box skill components for secondary robot development and application deployment.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=oL3qP4fgddw&quot;&gt;From Zero to AI Robot Grasping: OpenClaw + GrabGen Full Setup Guide (Step-by-Step)&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111118-i-quick-start-2&quot; name=&quot;p-111118-i-quick-start-2&quot;&gt;&lt;/a&gt;I. Quick Start&lt;/h2&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111118-openclaw-deployment-3&quot; name=&quot;p-111118-openclaw-deployment-3&quot;&gt;&lt;/a&gt;OpenClaw Deployment&lt;/h3&gt;
&lt;p&gt;Visit the OpenClaw official website: &lt;a href=&quot;https://openclaw.ai/&quot; rel=&quot;noopener nofollow ugc&quot;&gt;https://openclaw.ai/&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Execute the one-click installation command:&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;lang-bash&quot;&gt;curl -fsSL https://openclaw.ai/install.sh | bash
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Next, configure OpenClaw:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Select ‘&lt;strong&gt;YES&lt;/strong&gt;’&lt;br /&gt;
&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/3/3/33b2ba808f2fa1ac16761ab8ca1dd05c1cd613fe.jpeg&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;image&quot;&gt;&lt;img alt=&quot;image&quot; height=&quot;500&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/3/3/33b2ba808f2fa1ac16761ab8ca1dd05c1cd613fe_2_685x500.jpeg&quot; width=&quot;685&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Select ‘&lt;strong&gt;QuickStart&lt;/strong&gt;’&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Select ‘&lt;strong&gt;Update values&lt;/strong&gt;’&lt;br /&gt;
&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/e/0/e0a4c38faf9687623f7d62e300ef158d227dd920.png&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;image&quot;&gt;&lt;img alt=&quot;image&quot; height=&quot;500&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/e/0/e0a4c38faf9687623f7d62e300ef158d227dd920_2_685x500.png&quot; width=&quot;685&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;br /&gt;
&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/a/1/a15a393f0379a167e8679fc4f5ea3497a947424d.png&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;image&quot;&gt;&lt;img alt=&quot;image&quot; height=&quot;500&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/a/1/a15a393f0379a167e8679fc4f5ea3497a947424d_2_682x500.png&quot; width=&quot;682&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Select your provider (recommended: free options like &lt;strong&gt;Qwen&lt;/strong&gt;, &lt;strong&gt;OpenRouter&lt;/strong&gt;, or &lt;strong&gt;Ollama&lt;/strong&gt;)&lt;br /&gt;
&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/8/5/8550fb2520b546e230fe633239483cf4ae494d7d.png&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;image&quot;&gt;&lt;img alt=&quot;image&quot; height=&quot;500&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/8/5/8550fb2520b546e230fe633239483cf4ae494d7d_2_685x500.png&quot; width=&quot;685&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Select the model you wish to use from that provider.&lt;br /&gt;
&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/4/e/4e259396b30c3c11abde3b1e2f91659815090caa.png&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;image&quot;&gt;&lt;img alt=&quot;image&quot; height=&quot;500&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/4/e/4e259396b30c3c11abde3b1e2f91659815090caa_2_684x500.png&quot; width=&quot;684&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Select a default model.&lt;br /&gt;
&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/a/9/a91d555d059d64fcdc33404f46b11f89e66437ce.jpeg&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;image&quot;&gt;&lt;img alt=&quot;image&quot; height=&quot;500&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/a/9/a91d555d059d64fcdc33404f46b11f89e66437ce_2_685x500.jpeg&quot; width=&quot;685&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Select the APP you will connect to OpenClaw.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Select a web search provider.&lt;br /&gt;
&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/0/c/0c34d812f6e5e834d6e02223d8561ca9cd05bdb6.png&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;image&quot;&gt;&lt;img alt=&quot;image&quot; height=&quot;500&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/0/c/0c34d812f6e5e834d6e02223d8561ca9cd05bdb6_2_682x500.png&quot; width=&quot;682&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Select skills (not required for now).&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Check all Hook options.&lt;br /&gt;
&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/c/2/c28138e8580a2c53bde5824cdfefce9db8d62291.png&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;image&quot;&gt;&lt;img alt=&quot;image&quot; height=&quot;500&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/c/2/c28138e8580a2c53bde5824cdfefce9db8d62291_2_685x500.png&quot; width=&quot;685&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Select ‘&lt;strong&gt;restart&lt;/strong&gt;’.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Select ‘&lt;strong&gt;Web UI&lt;/strong&gt;’.&lt;br /&gt;
&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/6/e/6efd9ef46269151baaa7704580f3c6de5771d11b.png&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;image&quot;&gt;&lt;img alt=&quot;image&quot; height=&quot;500&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/6/e/6efd9ef46269151baaa7704580f3c6de5771d11b_2_682x500.png&quot; width=&quot;682&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111118-h-1-clone-the-project-4&quot; name=&quot;p-111118-h-1-clone-the-project-4&quot;&gt;&lt;/a&gt;1. Clone the Project&lt;/h3&gt;
&lt;pre&gt;&lt;code class=&quot;lang-bash&quot;&gt;git clone https://github.com/vanstrong12138/OpenClawPi.git
&lt;/code&gt;&lt;/pre&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111118-h-2-prompt-the-agent-to-learn-skills-5&quot; name=&quot;p-111118-h-2-prompt-the-agent-to-learn-skills-5&quot;&gt;&lt;/a&gt;2. Prompt the Agent to Learn Skills&lt;/h3&gt;
&lt;p&gt;Using the vision skill as an example:&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;lang-plaintext&quot;&gt;User: Please learn vl_vision_skill
&lt;/code&gt;&lt;/pre&gt;
&lt;hr /&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111118-skill-modules-overview-6&quot; name=&quot;p-111118-skill-modules-overview-6&quot;&gt;&lt;/a&gt;&lt;img alt=&quot;:package:&quot; class=&quot;emoji&quot; height=&quot;20&quot; src=&quot;https://emoji.discourse-cdn.com/noto/package.png?v=15&quot; title=&quot;:package:&quot; width=&quot;20&quot; /&gt; Skill Modules Overview&lt;/h2&gt;
&lt;div class=&quot;md-table&quot;&gt;
&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th style=&quot;text-align: left;&quot;&gt;Module Name&lt;/th&gt;
&lt;th style=&quot;text-align: left;&quot;&gt;Description&lt;/th&gt;
&lt;th style=&quot;text-align: left;&quot;&gt;Core Dependencies&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td style=&quot;text-align: left;&quot;&gt;&lt;code&gt;agx-arm-codegen&lt;/code&gt;&lt;/td&gt;
&lt;td style=&quot;text-align: left;&quot;&gt;Robotic arm code generation tool; automatically generates trajectory planning and joint control code. Supports custom path templates.&lt;/td&gt;
&lt;td style=&quot;text-align: left;&quot;&gt;pyAgxArm&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style=&quot;text-align: left;&quot;&gt;&lt;code&gt;grab_skill&lt;/code&gt;&lt;/td&gt;
&lt;td style=&quot;text-align: left;&quot;&gt;Robot grasping skill, including gripper control, target pose calibration, and grasping strategies (single-point/adaptive).&lt;/td&gt;
&lt;td style=&quot;text-align: left;&quot;&gt;pyAgxArm&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style=&quot;text-align: left;&quot;&gt;&lt;code&gt;vl_vision_skill&lt;/code&gt;&lt;/td&gt;
&lt;td style=&quot;text-align: left;&quot;&gt;Visual perception skill, supporting object detection, visual positioning, and image segmentation.&lt;/td&gt;
&lt;td style=&quot;text-align: left;&quot;&gt;SAM3, Qwen3-VL&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td style=&quot;text-align: left;&quot;&gt;&lt;code&gt;voice_skill&lt;/code&gt;&lt;/td&gt;
&lt;td style=&quot;text-align: left;&quot;&gt;Voice interaction skill, supporting voice command recognition, voice feedback, and custom command set configuration.&lt;/td&gt;
&lt;td style=&quot;text-align: left;&quot;&gt;cosyvoice&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;/div&gt;&lt;h1&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111118-ii-grabgen-pose-generation-and-grasping-7&quot; name=&quot;p-111118-ii-grabgen-pose-generation-and-grasping-7&quot;&gt;&lt;/a&gt;II. GraspGen - Pose Generation and Grasping&lt;/h1&gt;
&lt;p&gt;This article demonstrates identifying, segmenting, and grasping arbitrary objects, using SAM3 for perception and GraspGen for grasp pose generation.&lt;/p&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111118-repositories-8&quot; name=&quot;p-111118-repositories-8&quot;&gt;&lt;/a&gt;Repositories&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;GraspGen: &lt;a href=&quot;https://github.com/vanstrong12138/GraspGen&quot; rel=&quot;noopener nofollow ugc&quot;&gt;https://github.com/vanstrong12138/GraspGen&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Agilex-College: &lt;a href=&quot;https://github.com/agilexrobotics/Agilex-College/tree/master&quot; rel=&quot;noopener nofollow ugc&quot;&gt;https://github.com/agilexrobotics/Agilex-College/tree/master&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111118-hardware-requirements-9&quot; name=&quot;p-111118-hardware-requirements-9&quot;&gt;&lt;/a&gt;Hardware Requirements&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;x86 Desktop Platform&lt;/li&gt;
&lt;li&gt;NVIDIA GPU with at least 16GB VRAM&lt;/li&gt;
&lt;li&gt;Intel RealSense Camera&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111118-project-deployment-environment-10&quot; name=&quot;p-111118-project-deployment-environment-10&quot;&gt;&lt;/a&gt;Project Deployment Environment&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;OS&lt;/strong&gt;: Ubuntu 24.04&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Middleware&lt;/strong&gt;: ROS Jazzy&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;GPU&lt;/strong&gt;: RTX 5090&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;NVIDIA Driver&lt;/strong&gt;: Version 570.195.03&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;CUDA&lt;/strong&gt;: Version 12.8&lt;/li&gt;
&lt;/ul&gt;

&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Install NVIDIA Graphics Driver&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;pre&gt;&lt;code class=&quot;lang-bash&quot;&gt;sudo apt update
sudo apt upgrade
sudo add-apt-repository ppa:graphics-drivers/ppa
sudo apt update
sudo apt install nvidia-driver-570
# Restart
reboot
&lt;/code&gt;&lt;/pre&gt;
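After the reboot, it may be worth confirming that the driver actually loaded before moving on (a quick check, not part of the original steps):

```shell
# Should report driver version 570.x and list the GPU;
# an error here means the driver is not loaded.
nvidia-smi
```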
&lt;ol start=&quot;2&quot;&gt;
&lt;li&gt;&lt;strong&gt;Install CUDA Toolkit 12.8&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;Go to the &lt;a href=&quot;https://developer.nvidia.com/cuda-12-8-1-download-archive?target_os=Linux&amp;amp;target_arch=x86_64&amp;amp;Distribution=Ubuntu&amp;amp;target_version=24.04&amp;amp;target_type=runfile_local&quot; rel=&quot;noopener nofollow ugc&quot;&gt;NVIDIA Official Website&lt;/a&gt; to download the CUDA runfile.&lt;br /&gt;
&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/e/b/ebb5e02460a7d4a3253d201867e94f0d1e467259.jpeg&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;image&quot;&gt;&lt;img alt=&quot;image&quot; height=&quot;421&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/e/b/ebb5e02460a7d4a3253d201867e94f0d1e467259_2_690x421.jpeg&quot; width=&quot;690&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Execute the installation command:&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;pre&gt;&lt;code class=&quot;lang-bash&quot;&gt;wget https://developer.download.nvidia.com/compute/cuda/12.8.1/local_installers/cuda_12.8.1_570.124.06_linux.run
sudo sh cuda_12.8.1_570.124.06_linux.run
&lt;/code&gt;&lt;/pre&gt;
&lt;ul&gt;
&lt;li&gt;During installation, &lt;strong&gt;uncheck&lt;/strong&gt; the first option (“driver”) since the driver was installed in the previous step.&lt;/li&gt;
&lt;/ul&gt;

&lt;ol start=&quot;3&quot;&gt;
&lt;li&gt;&lt;strong&gt;Add Environment Variables&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;pre&gt;&lt;code class=&quot;lang-bash&quot;&gt;echo 'export PATH=/usr/local/cuda-12.8/bin:$PATH' &amp;gt;&amp;gt; ~/.bashrc
echo 'export LD_LIBRARY_PATH=/usr/local/cuda-12.8/lib64:$LD_LIBRARY_PATH' &amp;gt;&amp;gt; ~/.bashrc
source ~/.bashrc
&lt;/code&gt;&lt;/pre&gt;
&lt;ol start=&quot;4&quot;&gt;
&lt;li&gt;&lt;strong&gt;Verify Installation&lt;/strong&gt;&lt;br /&gt;
Execute &lt;code&gt;nvcc -V&lt;/code&gt; to check CUDA information.&lt;/li&gt;
&lt;/ol&gt;

&lt;pre&gt;&lt;code class=&quot;lang-bash&quot;&gt;nvcc -V
&lt;/code&gt;&lt;/pre&gt;
&lt;ol start=&quot;5&quot;&gt;
&lt;li&gt;&lt;strong&gt;Install cuDNN&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;Download the cuDNN tar file from the &lt;a href=&quot;https://developer.nvidia.com/cudnn-downloads?target_os=Linux&amp;amp;target_arch=x86_64&amp;amp;Distribution=Agnostic&amp;amp;cuda_version=12&amp;amp;Configuration=Full&quot; rel=&quot;noopener nofollow ugc&quot;&gt;NVIDIA Official Website&lt;/a&gt;. After extracting, copy the files.&lt;br /&gt;
&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/f/8/f89857fc1a255f397f1ba28d708af027159e1418.jpeg&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;image&quot;&gt;&lt;img alt=&quot;image&quot; height=&quot;465&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/f/8/f89857fc1a255f397f1ba28d708af027159e1418_2_690x465.jpeg&quot; width=&quot;690&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Execute the following commands to copy cuDNN to the CUDA directory:&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;pre&gt;&lt;code class=&quot;lang-bash&quot;&gt;sudo cp cuda/include/cudnn*.h /usr/local/cuda/include
sudo cp cuda/lib/libcudnn* /usr/local/cuda/lib64
sudo chmod a+r /usr/local/cuda/include/cudnn*.h /usr/local/cuda/lib64/libcudnn*
&lt;/code&gt;&lt;/pre&gt;
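As a quick sanity check (assuming the copy above succeeded), the installed cuDNN version can be read back from its version header:

```shell
# Print the cuDNN version macros from the installed header
grep -A 2 '#define CUDNN_MAJOR' /usr/local/cuda/include/cudnn_version.h
```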
&lt;ol start=&quot;6&quot;&gt;
&lt;li&gt;&lt;strong&gt;Install TensorRT&lt;/strong&gt;&lt;br /&gt;
Download the TensorRT tar file from the &lt;a href=&quot;https://developer.nvidia.com/nvidia-tensorrt-8x-download&quot; rel=&quot;noopener nofollow ugc&quot;&gt;NVIDIA Official Website&lt;/a&gt;.&lt;br /&gt;
&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/5/d/5d03c923f99947a7f7e0653b1bdf6d6335bbf9ab.jpeg&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;image&quot;&gt;&lt;img alt=&quot;image&quot; height=&quot;482&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/5/d/5d03c923f99947a7f7e0653b1bdf6d6335bbf9ab_2_690x482.jpeg&quot; width=&quot;690&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;Extract and move TensorRT to the &lt;code&gt;/usr/local&lt;/code&gt; directory:&lt;/li&gt;
&lt;/ul&gt;

&lt;pre&gt;&lt;code class=&quot;lang-bash&quot;&gt;# Extract (produces a TensorRT-10.16.0.72/ directory)
tar -xvf TensorRT-10.16.0.72.Linux.x86_64-gnu.cuda-12.9.tar.gz

# Move the extracted directory to /usr/local
sudo mv TensorRT-10.16.0.72/ /usr/local/
&lt;/code&gt;&lt;/pre&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Test TensorRT Installation&lt;/strong&gt;:&lt;/li&gt;
&lt;/ul&gt;

&lt;pre&gt;&lt;code class=&quot;lang-bash&quot;&gt;# Enter MNIST sample directory
cd /usr/local/TensorRT-10.16.0.72/samples/sampleOnnxMNIST

# Compile
make

# Run the executable found in bin
cd /usr/local/TensorRT-10.16.0.72/bin
./sample_onnx_mnist
&lt;/code&gt;&lt;/pre&gt;
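If applications later fail to locate the TensorRT shared libraries at runtime, its lib directory likely also needs to be on the library path (a sketch mirroring the CUDA exports earlier in this guide, not part of the original steps):

```shell
# Make the TensorRT shared libraries discoverable at runtime
echo 'export LD_LIBRARY_PATH=/usr/local/TensorRT-10.16.0.72/lib:$LD_LIBRARY_PATH' >> ~/.bashrc
source ~/.bashrc
```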
&lt;hr /&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111118-sam3-deployment-11&quot; name=&quot;p-111118-sam3-deployment-11&quot;&gt;&lt;/a&gt;SAM3 Deployment&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Python&lt;/strong&gt;: 3.12 or higher&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;PyTorch&lt;/strong&gt;: 2.7 or higher&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;CUDA&lt;/strong&gt;: Compatible GPU with CUDA 12.6 or higher&lt;/li&gt;
&lt;/ul&gt;

&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Create Conda Virtual Environment&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;pre&gt;&lt;code class=&quot;lang-bash&quot;&gt;conda create -n sam3 python=3.12
conda activate sam3
&lt;/code&gt;&lt;/pre&gt;
&lt;ol start=&quot;2&quot;&gt;
&lt;li&gt;&lt;strong&gt;Install PyTorch and Dependencies&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;pre&gt;&lt;code class=&quot;lang-bash&quot;&gt;# For 50-series GPUs, CUDA 12.8 and Torch 2.8 are recommended
# Downgrade numpy to &amp;lt;1.23 if necessary
pip install torch==2.8.0 torchvision==0.23.0 torchaudio==2.8.0 --index-url https://download.pytorch.org/whl/cu128

cd sam3
pip install -e .
&lt;/code&gt;&lt;/pre&gt;
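Before downloading models, it can help to confirm that the environment actually sees the GPU (run inside the sam3 conda env; a `+cu128` suffix and `True` indicate a CUDA-enabled build):

```shell
# Check that PyTorch is the CUDA build and can reach the GPU
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```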
&lt;ol start=&quot;3&quot;&gt;
&lt;li&gt;&lt;strong&gt;Model Download&lt;/strong&gt;
&lt;ol&gt;
&lt;li&gt;Submit the form to gain download access on HuggingFace: &lt;a href=&quot;https://huggingface.co/facebook/sam3&quot; rel=&quot;noopener nofollow ugc&quot;&gt;https://huggingface.co/facebook/sam3&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Or search via local mirror sites.&lt;/li&gt;
&lt;/ol&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;hr /&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111118-robotic-arm-driver-deployment-12&quot; name=&quot;p-111118-robotic-arm-driver-deployment-12&quot;&gt;&lt;/a&gt;Robotic Arm Driver Deployment&lt;/h3&gt;
&lt;p&gt;The project outputs &lt;code&gt;target_pose&lt;/code&gt; (end-effector pose), which can be manually adapted for different robotic arms.&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Example: PiPER Robotic Arm&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;pre&gt;&lt;code class=&quot;lang-bash&quot;&gt;pip install python-can

git clone https://github.com/agilexrobotics/pyAgxArm.git

cd pyAgxArm
pip install .
&lt;/code&gt;&lt;/pre&gt;
&lt;hr /&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111118-cloning-13&quot; name=&quot;p-111118-cloning-13&quot;&gt;&lt;/a&gt;Cloning&lt;/h2&gt;
&lt;p&gt;Clone this project to your local machine:&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;lang-bash&quot;&gt;cd YOUR_PATH
git clone -b ros2_jazzy_version https://github.com/AgilexRobotics/GraspGen.git
&lt;/code&gt;&lt;/pre&gt;
&lt;hr /&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111118-running-the-project-14&quot; name=&quot;p-111118-running-the-project-14&quot;&gt;&lt;/a&gt;Running the Project&lt;/h2&gt;
&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Grasping Node&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;pre&gt;&lt;code class=&quot;lang-bash&quot;&gt;python YOUR_PATH/sam3/realsense-sam.py --prompt &quot;Target Object Name in English&quot;
&lt;/code&gt;&lt;/pre&gt;
&lt;ol start=&quot;2&quot;&gt;
&lt;li&gt;&lt;strong&gt;Grasping Task Execution Controls&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;pre&gt;&lt;code class=&quot;lang-plaintext&quot;&gt;A   = Zero-force mode (Master arm)
D   = Normal mode + Record pose
S   = Return to home
X   = Replay pose
Q   = Open gripper
E   = Close gripper
P   = Pointcloud/Grasp
T   = Change prompt
G   = Issue grasp command
Esc = Exit
&lt;/code&gt;&lt;/pre&gt;
&lt;ol start=&quot;3&quot;&gt;
&lt;li&gt;&lt;strong&gt;Automatic Grasping Task&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;pre&gt;&lt;code class=&quot;lang-bash&quot;&gt;python YOUR_PATH/sam3/realsense-sam.py --prompt &quot;Target Object Name&quot; --auto
&lt;/code&gt;&lt;/pre&gt;
            &lt;p&gt;&lt;small&gt;1 post - 1 participant&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/rapid-deployment-of-openclaw-and-graspgen-crawling-system/53764&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Fri, 03 Apr 2026 10:38:18 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: Interactive GUI toolkit for robotics visualization - Python &amp; C++, runs on desktop and web</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53752</guid>
	<link>https://discourse.openrobotics.org/t/interactive-gui-toolkit-for-robotics-visualization-python-c-runs-on-desktop-and-web/53752</link>
	<description>&lt;p&gt;Hi everyone,&lt;/p&gt;
&lt;p&gt;I’d like to share &lt;strong&gt;Dear ImGui Bundle&lt;/strong&gt;, an open-source framework for building interactive GUI applications in Python and C++. It wraps Dear ImGui with 23 integrated libraries (plotting, image inspection, node editors, 3D gizmos, etc.) and runs on desktop, mobile, and web.&lt;/p&gt;
&lt;p&gt;I’m a solo developer and have been working hard on this for 4 years. I am new here, but I thought it might be useful for robotics developers.&lt;/p&gt;
&lt;p&gt;It provides:&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Real-time visualization&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;ImPlot and ImPlot3D for sensor data, trajectories, live plots at 60fps (or even 120fps)&lt;/li&gt;
&lt;li&gt;ImmVision for camera feed inspection with zoom, pan, pixel values, and colormaps&lt;/li&gt;
&lt;li&gt;All GPU-accelerated (OpenGL/Metal/Vulkan)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;Interactive parameter tuning&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Immediate mode means your UI code is just a few lines of Python or C++&lt;/li&gt;
&lt;li&gt;Sliders, knobs, toggles, color pickers - all update in real time&lt;/li&gt;
&lt;li&gt;No callbacks, no widget trees, no framework boilerplate&lt;/li&gt;
&lt;/ul&gt;
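To illustrate the point above, a complete immediate-mode app with a live slider takes only a handful of lines (a minimal sketch, assuming `imgui-bundle` is installed; widget state is just a Python variable):

```python
from imgui_bundle import imgui, immapp

value = 0.5  # widget state lives in plain Python, no framework object needed

def gui():
    global value
    # Called every frame: declare widgets and read back their state
    imgui.text("Hello from Dear ImGui Bundle!")
    _, value = imgui.slider_float("value", value, 0.0, 1.0)

immapp.run(gui, window_title="Hello", window_size=(400, 200))
```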
&lt;p&gt;&lt;strong&gt;Cross-platform deployment&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Same code runs on Linux, macOS, Windows&lt;/li&gt;
&lt;li&gt;Python apps can run in the browser via Pyodide (useful for sharing dashboards without requiring install)&lt;/li&gt;
&lt;li&gt;C++ apps compile to WebAssembly via Emscripten&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;Example: live camera + Laplacian filter with colormaps in 54 lines&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/5/4/54a2c9885fc86edd963d9c0e832746b7b433b9e8.jpeg&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;image&quot;&gt;&lt;img alt=&quot;image&quot; height=&quot;332&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/5/4/54a2c9885fc86edd963d9c0e832746b7b433b9e8_2_690x332.jpeg&quot; width=&quot;690&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;lang-python&quot;&gt;import cv2
import numpy as np
from imgui_bundle import imgui, immvision, immapp


class AppState:
    def __init__(self):
        self.cap = cv2.VideoCapture(0)
        self.image = None
        self.filtered = None
        self.blur_sigma = 2.0
        # ImmVision params
        # For the camera image
        self.params_image = immvision.ImageParams()
        self.params_image.image_display_size = (400, 0)
        self.params_image.zoom_key = &quot;cam&quot;
        # For the filtered image (synced zoom via zoom_key)
        self.params_filter = immvision.ImageParams()
        self.params_filter.image_display_size = (400, 0)
        self.params_filter.zoom_key = &quot;cam&quot;
        self.params_filter.show_options_panel = True


def gui(s: AppState):
    # grab
    has_image, frame = s.cap.read()
    if has_image:
        s.image = cv2.resize(frame, (640, 480))
        gray = cv2.cvtColor(s.image, cv2.COLOR_BGR2GRAY)
        gray_f = gray.astype(np.float64) / 255.0
        blurred = cv2.GaussianBlur(gray_f, (0, 0), s.blur_sigma)
        s.filtered = cv2.Laplacian(blurred, cv2.CV_64F, ksize=5)

    # Refresh images only if needed
    s.params_image.refresh_image = has_image
    s.params_filter.refresh_image = has_image

    if s.image is not None:
        immvision.image(&quot;Camera&quot;, s.image, s.params_image)
        imgui.same_line()
        immvision.image(&quot;Filtered&quot;, s.filtered, s.params_filter)

    # Controls
    _, s.blur_sigma = imgui.slider_float(&quot;Blur&quot;, s.blur_sigma, 0.5, 10.0)


state = AppState()
immvision.use_bgr_color_order()
immapp.run(lambda: gui(state), window_size=(1200, 550), window_title=&quot;Camera Filter&quot;, fps_idle=0)
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;The filtered image is float64 - click “Options” to try different colormaps (Heat, Jet, Viridis…). Both views are zoom-linked: pan one, the other follows.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Try it:&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;&lt;a href=&quot;https://traineq.org/imgui_bundle_online/projects/imgui_bundle_playground/&quot; rel=&quot;noopener nofollow ugc&quot;&gt;Online Python Playground&lt;/a&gt;&lt;/strong&gt; - edit and run Python GUI apps in your browser, no install needed&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;&lt;a href=&quot;https://traineq.org/imgui_bundle_explorer/&quot; rel=&quot;noopener nofollow ugc&quot;&gt;Interactive Explorer&lt;/a&gt;&lt;/strong&gt; - browse all 23 libraries with live demos and source code&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;&lt;a href=&quot;https://github.com/pthom/imgui_bundle&quot; rel=&quot;noopener nofollow ugc&quot;&gt;GitHub&lt;/a&gt;&lt;/strong&gt; (1100+ stars)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;&lt;a href=&quot;https://pthom.github.io/imgui_bundle/&quot; rel=&quot;noopener nofollow ugc&quot;&gt;Documentation&lt;/a&gt;&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;&lt;a href=&quot;https://discord.gg/xkzpKMeYN3&quot; rel=&quot;noopener nofollow ugc&quot;&gt;Discord&lt;/a&gt;&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Install: &lt;code&gt;pip install imgui-bundle&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Adoption:&lt;/strong&gt;&lt;br /&gt;
The framework is used in several research projects, including CVPR 2024 papers (4K4D), Newton Physics, and moderngl. The Python bindings are auto-generated with &lt;a href=&quot;https://github.com/pthom/litgen&quot; rel=&quot;noopener nofollow ugc&quot;&gt;litgen&lt;/a&gt;, so they stay in sync with upstream Dear ImGui.&lt;/p&gt;
&lt;p&gt;Happy to answer any questions or discuss how it could fit into ROS workflows.&lt;/p&gt;
&lt;p&gt;Best,&lt;br /&gt;
Pascal&lt;/p&gt;
            &lt;p&gt;&lt;small&gt;2 posts - 2 participants&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/interactive-gui-toolkit-for-robotics-visualization-python-c-runs-on-desktop-and-web/53752&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Thu, 02 Apr 2026 18:10:24 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: On message standardization (and a call for participation)</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53744</guid>
	<link>https://discourse.openrobotics.org/t/on-message-standardization-and-a-call-for-participation/53744</link>
	<description>&lt;p&gt;Hi folks!&lt;/p&gt;
&lt;p&gt;I presume at least some of you are aware of the OSRA efforts towards better supporting Physical AI applications. Some of those efforts revolve around messaging and interfaces, and in that context, a few gaps in standard sensing messages have been identified. In a way, this is orthogonal to Physical AI, yet still we may as well seize the opportunity to improve the state of things.&lt;/p&gt;
&lt;p&gt;To that end, the Standardized Interfaces &amp;amp; Messages Working Group will be hosting public sessions to discuss, review, and craft &lt;em&gt;proposals&lt;/em&gt; to address those gaps, either through implementation or through recommendation where the community has already organically developed a solution. Academic researchers and industry practitioners are more than welcome to join. If you design or manufacture sensor hardware, even better.&lt;/p&gt;
&lt;p&gt;Our friends at Ouster already took the lead and posted a proposal for a new &lt;a href=&quot;https://discourse.openrobotics.org/t/a-proposal-for-a-lidarscan-sensor-message/53225&quot; title=&quot;https://discourse.openrobotics.org/t/a-proposal-for-a-lidarscan-sensor-message/53225&quot;&gt;3D LiDAR message&lt;/a&gt;, so our focus during the first couple of sessions will likely be on LiDAR technology. Tactile is a close second. We’ve heard complaints about the IMU message structure too. Feel free to propose more (and challenge others too).&lt;/p&gt;
&lt;p&gt;We’ll meet on Mondays, biweekly, starting &lt;span class=&quot;discourse-local-date&quot;&gt;Mon, Apr 6, 2026 3:00 PM UTC&lt;/span&gt;. Fill &lt;a href=&quot;https://forms.gle/1eJo1zKYnArBbeWk9&quot; rel=&quot;noopener nofollow ugc&quot;&gt;this form&lt;/a&gt; to join the meetings. Hope to see you there!&lt;/p&gt;
            &lt;p&gt;&lt;small&gt;1 post - 1 participant&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/on-message-standardization-and-a-call-for-participation/53744&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Thu, 02 Apr 2026 15:17:32 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: Announcing MoveIt Pro 9 with ROS 2 Jazzy Support</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53723</guid>
	<link>https://discourse.openrobotics.org/t/announcing-moveit-pro-9-with-ros-2-jazzy-support/53723</link>
	<description>&lt;p&gt;Hi ROS Community!&lt;/p&gt;
&lt;p&gt;It’s been a while, but we’re excited to announce &lt;a href=&quot;https://picknik.ai/pro/&quot; rel=&quot;noopener nofollow ugc&quot;&gt;MoveIt Pro 9.0&lt;/a&gt;, the latest major release of PickNik’s manipulation developer platform built on ROS 2. MoveIt Pro includes comprehensive support for AI model training &amp;amp; execution, Behavior Trees, MuJoCo simulation, and all the classic capabilities you expect like motion planning, collision avoidance, inverse kinematics, and real-time control.&lt;/p&gt;
&lt;p&gt;This release adds support for ROS 2 Jazzy LTS (while still supporting ROS Humble), along with significant improvements to teleoperation, motion planning, developer tooling, and robot application workflows. MoveIt Pro now includes new joint-space and Cartesian-space motion planners that outperform previous implementations in cycle time, robustness, and industry-required reliability. See the &lt;a href=&quot;https://picknik.ai/moveit-2-vs-moveit-pro/&quot; rel=&quot;noopener nofollow ugc&quot;&gt;full benchmarking comparison&lt;/a&gt; for details.&lt;/p&gt;
&lt;p&gt;MoveIt Pro is developed by the team behind MoveIt 2, and our goal is to make it easier for robotics teams to build and deploy real-world manipulation systems using ROS. Many organizations in manufacturing, aerospace, logistics, agriculture, industrial cleaning, and research use MoveIt Pro to accelerate development without needing to build large amounts of infrastructure from scratch.&lt;/p&gt;
&lt;p&gt;&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/f/8/f8f56e0108433113153bc8c88351271b2d5ee3ce.jpeg&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;image&quot;&gt;&lt;img alt=&quot;image&quot; height=&quot;388&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/f/8/f8f56e0108433113153bc8c88351271b2d5ee3ce_2_690x388.jpeg&quot; width=&quot;690&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111056-whats-new-1&quot; name=&quot;p-111056-whats-new-1&quot;&gt;&lt;/a&gt;&lt;strong&gt;What’s new&lt;/strong&gt;&lt;/h2&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111056-improved-real-time-control-and-teleoperation-with-joint-jog-2&quot; name=&quot;p-111056-improved-real-time-control-and-teleoperation-with-joint-jog-2&quot;&gt;&lt;/a&gt;&lt;strong&gt;Improved real-time control and teleoperation with Joint Jog&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;MoveIt Pro now includes a new “Joint Jog” teleoperation mode for controlling robots directly from the web UI. This replaces the previous MoveIt Servo-based teleoperation implementation and introduces continuous collision checking, configurable safety factors, and optional link padding for safer manual control during debugging or demonstrations.&lt;/p&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111056-scan-and-plan-workflows-3&quot; name=&quot;p-111056-scan-and-plan-workflows-3&quot;&gt;&lt;/a&gt;&lt;strong&gt;Scan-and-plan workflows&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;New scan-and-plan capabilities allow robots to scan surfaces with a sensor and automatically generate tool paths for tasks like spraying, sanding, washing, or grinding. These workflows make it easier to build surface-processing applications.&lt;/p&gt;
&lt;p&gt;&lt;img alt=&quot;scan-and-plan-capabilities-for-spraying-f1782ba23bff8f3dbedf9550a8dd3403&quot; class=&quot;animated&quot; height=&quot;419&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/a/c/ac369b12d4346b8c6d49490f2d09dd6b736d4277.gif&quot; width=&quot;690&quot; /&gt;&lt;/p&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111056-new-python-apis-for-moveit-pro-core-4&quot; name=&quot;p-111056-new-python-apis-for-moveit-pro-core-4&quot;&gt;&lt;/a&gt;&lt;strong&gt;New Python APIs for MoveIt Pro Core&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;New low-level Python APIs expose the core planners, solvers, and controllers directly, enabling developers to build custom applications outside of the Behavior Tree framework. These APIs provide fine-grained control over motion planning and kinematics, including advanced features like customizable nullspace optimization and path constraints.&lt;/p&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111056-improved-motion-planning-apis-5&quot; name=&quot;p-111056-improved-motion-planning-apis-5&quot;&gt;&lt;/a&gt;&lt;strong&gt;Improved motion planning APIs&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Several updates improve flexibility for motion generation, including: improved path inverse kinematics, orientation tracking as a nullspace cost, customizable nullspace behavior, tunable path deviation tolerances.&lt;/p&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111056-developer-productivity-improvements-6&quot; name=&quot;p-111056-developer-productivity-improvements-6&quot;&gt;&lt;/a&gt;&lt;strong&gt;Developer productivity improvements&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;The MoveIt Pro UI and Behavior Tree tooling received a number of improvements to make debugging and application development faster, including a redesigned UI layout and improved editing workflows, Behavior Tree editor improvements such as search and node snapping, and better debugging tools including TF visualization and alert history.&lt;/p&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111056-expanded-library-of-reusable-manipulation-skills-7&quot; name=&quot;p-111056-expanded-library-of-reusable-manipulation-skills-7&quot;&gt;&lt;/a&gt;&lt;strong&gt;Expanded Library of Reusable Manipulation Skills&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;MoveIt Pro also includes a large library of reusable robot capabilities implemented as thread-safe Behavior Tree nodes, allowing developers to compose complex manipulation applications from modular building blocks instead of writing large amounts of robotics infrastructure from scratch. See our &lt;a href=&quot;https://picknik.ai/behaviors/&quot; rel=&quot;noopener nofollow ugc&quot;&gt;Behaviors Hub&lt;/a&gt; to explore the 200+ available Behaviors.&lt;/p&gt;
&lt;p&gt;&lt;img alt=&quot;enhanced-ai-processing-of-point-clouds-4ec0f48f9070435cd417ab4915e90bed&quot; class=&quot;animated&quot; height=&quot;443&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/7/a/7a66dc8b95edef5258966f4a3ba04acf79870cdb.gif&quot; width=&quot;690&quot; /&gt;&lt;/p&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111056-built-for-the-ros-ecosystem-8&quot; name=&quot;p-111056-built-for-the-ros-ecosystem-8&quot;&gt;&lt;/a&gt;&lt;strong&gt;Built for the ROS ecosystem&lt;/strong&gt;&lt;/h2&gt;
&lt;p&gt;MoveIt Pro integrates with the broader ROS ecosystem, including standard ROS drivers and packages. PickNik has been deeply involved in the MoveIt project since its early development, and we continue investing heavily in open-source robotics such as &lt;a href=&quot;https://picknik.ai/hardware-ecosystem/&quot; rel=&quot;noopener nofollow ugc&quot;&gt;developing many ROS drivers for major vendors&lt;/a&gt;.&lt;/p&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111056-learn-more-9&quot; name=&quot;p-111056-learn-more-9&quot;&gt;&lt;/a&gt;&lt;strong&gt;Learn more&lt;/strong&gt;&lt;/h2&gt;
&lt;p&gt;Full release notes:&lt;br /&gt;
&lt;a href=&quot;https://docs.picknik.ai/release-notes/&quot; rel=&quot;noopener nofollow ugc&quot;&gt;https://docs.picknik.ai/release-notes/&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;We’d love feedback from the ROS community, and we’re excited to see what developers build with these new capabilities. &lt;a href=&quot;https://picknik.ai/connect/&quot; rel=&quot;noopener nofollow ugc&quot;&gt;Contact us&lt;/a&gt; to learn more.&lt;/p&gt;
            &lt;p&gt;&lt;small&gt;4 posts - 3 participants&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/announcing-moveit-pro-9-with-ros-2-jazzy-support/53723&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Wed, 01 Apr 2026 16:45:04 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: [Policy Change] Detailed Standards for REP-2026-04 (Lyrical Enforcement)</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53717</guid>
	<link>https://discourse.openrobotics.org/t/policy-change-detailed-standards-for-rep-2026-04-lyrical-enforcement/53717</link>
	<description>&lt;p&gt;Hi everyone,&lt;/p&gt;
&lt;p&gt;Following up on the recent announcement regarding the &lt;strong&gt;Lyrical Luth&lt;/strong&gt; release requirements, the PMC has finalized the automated enforcement protocols. To ensure our May release remains on schedule, we are providing expanded guidelines and examples for the new &lt;code&gt;rhyme-lint&lt;/code&gt; and &lt;code&gt;README.shanty&lt;/code&gt; checks.&lt;/p&gt;
&lt;p&gt;Effective immediately, all pull requests targeting the &lt;code&gt;rolling&lt;/code&gt; or &lt;code&gt;lyrical&lt;/code&gt; branches must pass these poetic audits.&lt;/p&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111042-h-1-the-rhyme-lint-mandatory-ci-check-1&quot; name=&quot;p-111042-h-1-the-rhyme-lint-mandatory-ci-check-1&quot;&gt;&lt;/a&gt;1. The &lt;code&gt;rhyme-lint&lt;/code&gt; Mandatory CI Check&lt;/h3&gt;
&lt;p&gt;All pull requests will now trigger a &lt;code&gt;rhyme-lint&lt;/code&gt; action. If your commit message lacks proper meter or rhyme, the build will fail with a &lt;code&gt;403: UNPOETIC_CONTRIBUTION&lt;/code&gt; error.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Accepted Commit Styles:&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;The Heroic Couplet (for Security/Bug Fixes):&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code class=&quot;lang-auto&quot;&gt;fix: A buffer overflow was found in C,
We've locked the heap to keep the memory free.
&lt;/code&gt;&lt;/pre&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Iambic Pentameter (for Feature Additions):&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code class=&quot;lang-auto&quot;&gt;feat: The twenty-standard now we must embrace,
To bring C++20 speed to every space.
&lt;/code&gt;&lt;/pre&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;The Middleware Haiku (for RMW Updates):&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code class=&quot;lang-auto&quot;&gt;Packets drift like leaves,
The middleware finds the path,
Silence in the logs.
&lt;/code&gt;&lt;/pre&gt;
&lt;hr /&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111042-h-2-the-readmeshanty-documentation-standard-2&quot; name=&quot;p-111042-h-2-the-readmeshanty-documentation-standard-2&quot;&gt;&lt;/a&gt;2. The &lt;code&gt;README.shanty&lt;/code&gt; Documentation Standard&lt;/h3&gt;
&lt;p&gt;Any new package added to the core must include a &lt;code&gt;README.shanty&lt;/code&gt; file. This ensures our documentation can be easily memorized and sung during long deployment cycles or deep-sea robotics missions.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;em&gt;Note: Harmonies are optional but encouraged for Tier-1 platforms.&lt;/em&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;Example: &lt;code&gt;README.shanty&lt;/code&gt; for &lt;code&gt;rcl::Buffer&lt;/code&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;(To the tune of “The Wellerman”)&lt;/em&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;There once was a node that sent a frame,&lt;br /&gt;
Without a copy or a name,&lt;br /&gt;
The CPU was much to blame,&lt;br /&gt;
For latency so high! (HUH!)&lt;/p&gt;
&lt;p&gt;Soon may the Zero-Copy come,&lt;br /&gt;
To bring us throughput, megabytes, and fun,&lt;br /&gt;
When the data transfer’s done,&lt;br /&gt;
We’ll take our leave and go!&lt;br /&gt;
We used the vendor’s memory backend,&lt;br /&gt;
A pointer sent to every friend,&lt;br /&gt;
The bandwidth limit met its end,&lt;br /&gt;
Beneath the Lyrical sky!&lt;/p&gt;
&lt;/blockquote&gt;
&lt;hr /&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111042-h-3-the-lyrical-luth-rhyming-dictionary-3&quot; name=&quot;p-111042-h-3-the-lyrical-luth-rhyming-dictionary-3&quot;&gt;&lt;/a&gt;3. The Lyrical Luth Rhyming Dictionary&lt;/h3&gt;
&lt;p&gt;We recognize that many maintainers may find this transition challenging. To assist, the PMC has curated an initial dictionary of “Technical Rhymes” to help you pass CI.&lt;/p&gt;
&lt;div class=&quot;md-table&quot;&gt;
&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;&lt;strong&gt;ROS Term&lt;/strong&gt;&lt;/th&gt;
&lt;th&gt;&lt;strong&gt;Approved Rhymes&lt;/strong&gt;&lt;/th&gt;
&lt;th&gt;&lt;strong&gt;Example&lt;/strong&gt;&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Node&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Code, Mode, Load, Road&lt;/td&gt;
&lt;td&gt;“A lonely node / with heavy load.”&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;DDS&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Success, Progress, Finesse&lt;/td&gt;
&lt;td&gt;“Tune the DDS / with pure finesse.”&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Topic&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Myopic, Tropic, Microscopic&lt;/td&gt;
&lt;td&gt;“A hidden topic / so microscopic.”&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;RMW&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Now, How, Allow, Brow&lt;/td&gt;
&lt;td&gt;“The RMW / we fix it now.”&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Linter&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Splinter, Winter, Printer&lt;/td&gt;
&lt;td&gt;“The static linter / cold as winter.”&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Pointer&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Anointer, Appointer&lt;/td&gt;
&lt;td&gt;“The null pointer / a soul-disappointer.”&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Humble&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Rumble, Stumble, Grumble&lt;/td&gt;
&lt;td&gt;“Backported from Humble / without a stumble.”&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;/div&gt;&lt;hr /&gt;
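&lt;p&gt;For maintainers who want to pre-check their commit messages locally, a toy validator against the table above might look like the sketch below. This is purely illustrative: the actual &lt;code&gt;rhyme-lint&lt;/code&gt; action has not been published, and the function names are invented here.&lt;/p&gt;

```python
# Toy "rhyme-lint" pass using the approved-rhymes table above.
# Entirely hypothetical: the real CI action is not published, and these
# pairings are just the PMC's starter dictionary.
APPROVED_RHYMES = {
    "node": {"code", "mode", "load", "road"},
    "dds": {"success", "progress", "finesse"},
    "topic": {"myopic", "tropic", "microscopic"},
    "rmw": {"now", "how", "allow", "brow"},
    "linter": {"splinter", "winter", "printer"},
    "pointer": {"anointer", "appointer"},
    "humble": {"rumble", "stumble", "grumble"},
}

def last_word(line: str) -> str:
    """Return the final word of a line, lowercased, punctuation stripped."""
    return line.split()[-1].strip(".,!?;:\"'").lower()

def couplet_passes(line_a: str, line_b: str) -> bool:
    """A couplet passes if its two closing words form an approved pairing."""
    a, b = last_word(line_a), last_word(line_b)
    return b in APPROVED_RHYMES.get(a, set()) or a in APPROVED_RHYMES.get(b, set())
```

&lt;p&gt;For example, &lt;code&gt;couplet_passes(&quot;A lonely node&quot;, &quot;with heavy load&quot;)&lt;/code&gt; passes, while pairing &lt;em&gt;node&lt;/em&gt; with &lt;em&gt;finesse&lt;/em&gt; fails the audit.&lt;/p&gt;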
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-111042-compliance-and-ros-ffice-hours-4&quot; name=&quot;p-111042-compliance-and-ros-ffice-hours-4&quot;&gt;&lt;/a&gt;Compliance and “ROS-ffice Hours”&lt;/h3&gt;
&lt;p&gt;We understand this is a significant shift in our development workflow, but we believe it is necessary to harmonize our ecosystem. To help with the transition, our upcoming &lt;strong&gt;“ROS-ffice Hours”&lt;/strong&gt; sessions will be dedicated to bardic troubleshooting.&lt;/p&gt;
&lt;p&gt;Let’s make this May the most harmonious release in robotics history.&lt;/p&gt;
            &lt;p&gt;&lt;small&gt;5 posts - 4 participants&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/policy-change-detailed-standards-for-rep-2026-04-lyrical-enforcement/53717&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Wed, 01 Apr 2026 13:00:26 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: Custom Capabilities in Transitive Robotics | Cloud Robotics WG Meeting 2026-04-13</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53709</guid>
	<link>https://discourse.openrobotics.org/t/custom-capabilities-in-transitive-robotics-cloud-robotics-wg-meeting-2026-04-13/53709</link>
	<description>&lt;p&gt;Please come and join us for this coming meeting at &lt;span class=&quot;discourse-local-date&quot;&gt;Mon, Apr 13, 2026 4:00 PM UTC&lt;/span&gt;→&lt;span class=&quot;discourse-local-date&quot;&gt;Mon, Apr 13, 2026 5:00 PM UTC&lt;/span&gt;, where we plan to continue our Transitive Robotics tryout by trying one of the more advanced features: writing and deploying a custom capability. This feature allows customers to write their own custom code and deploy it to their robots alongside the features available directly from Transitive Robotics.&lt;/p&gt;
&lt;p&gt;Last session, we tried running Transitive Robotics on a Turtlebot. We managed to remotely operate the robot and set up the Maps capability, which unfortunately didn’t work due to an incompatibility with ROS 2 Jazzy (support for Jazzy has since been added). If you’re interested in watching the meeting, it is available &lt;a href=&quot;https://youtu.be/GVUuPl1C4vE?si=ieqfLQZamg_EBDV0&quot; rel=&quot;noopener nofollow ugc&quot;&gt;on YouTube&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The &lt;strong&gt;meeting link for next meeting is &lt;a href=&quot;https://meet.google.com/xox-nshv-uvm&quot; rel=&quot;noopener nofollow ugc&quot;&gt;here&lt;/a&gt;&lt;/strong&gt;, and you can sign up to &lt;a href=&quot;https://calendar.google.com/calendar/u/0/embed?src=c_3fc5c4d6ece9d80d49f136c1dcd54d7f44e1acefdbe87228c92ff268e85e2ea0@group.calendar.google.com&amp;amp;ctz=UTC&quot; rel=&quot;noopener nofollow ugc&quot;&gt;our calendar&lt;/a&gt; or our &lt;a href=&quot;https://groups.google.com/g/cloud-robotics-working-group-invites&quot; rel=&quot;noopener nofollow ugc&quot;&gt;Google Group&lt;/a&gt; for meeting notifications or keep an eye on the &lt;a href=&quot;https://cloudroboticshub.github.io/&quot; rel=&quot;noopener nofollow ugc&quot;&gt;Cloud Robotics Hub&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Hopefully we will see you there!&lt;/p&gt;
            &lt;p&gt;&lt;small&gt;1 post - 1 participant&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/custom-capabilities-in-transitive-robotics-cloud-robotics-wg-meeting-2026-04-13/53709&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Wed, 01 Apr 2026 08:52:45 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: Upcoming RMW Feature Freeze - April 6th, 2026 - ROS Lyrical</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53696</guid>
	<link>https://discourse.openrobotics.org/t/upcomming-rmw-feature-freeze-april-6th-2026-ros-lyrical/53696</link>
	<description>&lt;p&gt;Hi all,&lt;/p&gt;
&lt;p&gt;On &lt;span class=&quot;discourse-local-date&quot;&gt;Tue, Apr 7, 2026 6:59 AM UTC&lt;/span&gt;, we will freeze all RMW-related packages to prepare for the upcoming &lt;code&gt;Lyrical Luth&lt;/code&gt; release on &lt;span class=&quot;discourse-local-date&quot;&gt;Fri, May 22, 2026 7:00 AM UTC&lt;/span&gt;.&lt;/p&gt;
&lt;p&gt;Once this freeze takes effect, we will not accept new features to the RMW packages until Lyrical branches from ROS Rolling. This restriction applies to the following packages and vendor packages:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://github.com/ros2/rmw_fastrtps.git&quot; rel=&quot;noopener nofollow ugc&quot;&gt;rmw_fastrtps&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://github.com/ros2/rmw_cyclonedds.git&quot; rel=&quot;noopener nofollow ugc&quot;&gt;rmw_cyclonedds&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://github.com/ros2/rmw_connextdds.git&quot; rel=&quot;noopener nofollow ugc&quot;&gt;rmw_connextdds&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://github.com/ros2/rmw_zenoh&quot; rel=&quot;noopener nofollow ugc&quot;&gt;rmw_zenoh&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://github.com/eProsima/Fast-DDS&quot; rel=&quot;noopener nofollow ugc&quot;&gt;Fast-DDS&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://github.com/eProsima/Fast-CDR&quot; rel=&quot;noopener nofollow ugc&quot;&gt;Fast-CDR&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://github.com/eclipse-cyclonedds/cyclonedds&quot; rel=&quot;noopener nofollow ugc&quot;&gt;cyclonedds&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://github.com/eclipse/iceoryx&quot; rel=&quot;noopener nofollow ugc&quot;&gt;iceoryx&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://github.com/eclipse-zenoh/zenoh-cpp&quot; rel=&quot;noopener nofollow ugc&quot;&gt;zenoh_cpp&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;We still welcome bug fixes after the freeze date.&lt;/p&gt;
&lt;p&gt;Find more information on the Lyrical Luth release timeline here: &lt;a href=&quot;https://docs.ros.org/en/rolling/Releases/Release-Lyrical-Luth.html#release-timeline&quot;&gt;ROS 2 Lyrical Luth (codename ‘lyrical’; May, 2026)&lt;/a&gt;.&lt;/p&gt;
            &lt;p&gt;&lt;small&gt;5 posts - 2 participants&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/upcomming-rmw-feature-freeze-april-6th-2026-ros-lyrical/53696&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Tue, 31 Mar 2026 15:29:08 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: ROS2 Launch File Validation</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53686</guid>
	<link>https://discourse.openrobotics.org/t/ros2-launch-file-validation/53686</link>
	<description>&lt;h1&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110980-introducing-an-xml-launch-file-scheme-1&quot; name=&quot;p-110980-introducing-an-xml-launch-file-scheme-1&quot;&gt;&lt;/a&gt;Introducing an XML launch file scheme&lt;/h1&gt;
&lt;p&gt;XSD schema for validating ROS2 XML launch files.&lt;br /&gt;
Catch syntax errors before runtime and get IDE support.&lt;/p&gt;
&lt;aside class=&quot;onebox githubrepo&quot;&gt;
  &lt;header class=&quot;source&quot;&gt;

      &lt;a href=&quot;https://github.com/nobleo/ros2_launch_validation&quot; rel=&quot;noopener nofollow ugc&quot; target=&quot;_blank&quot;&gt;github.com&lt;/a&gt;
  &lt;/header&gt;

  &lt;article class=&quot;onebox-body&quot;&gt;
    &lt;div class=&quot;github-row&quot;&gt;
  &lt;img class=&quot;thumbnail&quot; height=&quot;344&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/5/b/5b71fdc2783c6d19cebc7162744c231ddde0cf37_2_690x344.png&quot; width=&quot;690&quot; /&gt;

  &lt;h3&gt;&lt;a href=&quot;https://github.com/nobleo/ros2_launch_validation&quot; rel=&quot;noopener nofollow ugc&quot; target=&quot;_blank&quot;&gt;GitHub - nobleo/ros2_launch_validation: Validate your ros2 launchfiles using an XMLSchema&lt;/a&gt;&lt;/h3&gt;

    &lt;p&gt;&lt;span class=&quot;github-repo-description&quot;&gt;Validate your ros2 launchfiles using an XMLSchema&lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;

  &lt;/article&gt;

  &lt;div class=&quot;onebox-metadata&quot;&gt;
    
    
  &lt;/div&gt;

  &lt;div style=&quot;clear: both;&quot;&gt;&lt;/div&gt;
&lt;/aside&gt;

&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110980-why-2&quot; name=&quot;p-110980-why-2&quot;&gt;&lt;/a&gt;Why&lt;/h2&gt;
&lt;p&gt;For &lt;code&gt;package.xml&lt;/code&gt; we have had one for years: a &lt;a href=&quot;http://download.ros.org/schema/package_format3.xsd&quot;&gt;schema&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;But we found our muscle memory often typing &lt;code&gt;type=&lt;/code&gt; instead of &lt;code&gt;exec=&lt;/code&gt;,&lt;br /&gt;
or &lt;code&gt;$(find my_pkg)&lt;/code&gt; instead of &lt;code&gt;$(find-pkg-share my_pkg)&lt;/code&gt;.&lt;/p&gt;
&lt;p&gt;And no matter how much we unit-tested the nodes, these errors only popped up in integration tests or even on the robot itself.&lt;br /&gt;
Would it not be nice if your editor already warned you about them?&lt;/p&gt;
&lt;p&gt;&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/4/d/4dbf848c58a3058d0a2c778f586338416446b8a6.png&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;image&quot;&gt;&lt;img alt=&quot;image&quot; height=&quot;128&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/4/d/4dbf848c58a3058d0a2c778f586338416446b8a6_2_690x128.png&quot; width=&quot;690&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110980-how-3&quot; name=&quot;p-110980-how-3&quot;&gt;&lt;/a&gt;How&lt;/h2&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110980-embed-in-launch-file-4&quot; name=&quot;p-110980-embed-in-launch-file-4&quot;&gt;&lt;/a&gt;Embed in launch file&lt;/h3&gt;
&lt;p&gt;Start your launchfile like this:&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;lang-xml&quot;&gt;&amp;lt;?xml version=&quot;1.0&quot;?&amp;gt;
&amp;lt;?xml-model href=&quot;https://nobleo.github.io/ros2_launch_validation/ros2_launch.xsd&quot; schematypens=&quot;http://www.w3.org/2001/XMLSchema&quot;?&amp;gt;

&amp;lt;launch&amp;gt;
&lt;/code&gt;&lt;/pre&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110980-command-line-validation-5&quot; name=&quot;p-110980-command-line-validation-5&quot;&gt;&lt;/a&gt;Command-line validation&lt;/h3&gt;
&lt;p&gt;Quickstart! Validate all the launch XML files in your workspace right now!&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;lang-bash&quot;&gt;xmllint --noout --schema &amp;lt;(curl -s https://nobleo.github.io/ros2_launch_validation/ros2_launch.xsd) **/*.launch.xml
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;We verified this internally and on some larger public repositories like Autoware. It even found an &lt;a href=&quot;https://github.com/autowarefoundation/autoware_universe/pull/11019#discussion_r2412363847&quot; rel=&quot;noopener nofollow ugc&quot;&gt;issue&lt;/a&gt; &lt;img alt=&quot;:slight_smile:&quot; class=&quot;emoji&quot; height=&quot;20&quot; src=&quot;https://emoji.discourse-cdn.com/noto/slight_smile.png?v=15&quot; title=&quot;:slight_smile:&quot; width=&quot;20&quot; /&gt;&lt;/p&gt;
            &lt;p&gt;&lt;small&gt;6 posts - 2 participants&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/ros2-launch-file-validation/53686&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Tue, 31 Mar 2026 07:09:55 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: RFC: Open standard for robot-to-human light signaling — looking for technical feedback from ROS 2 developers</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53656</guid>
	<link>https://discourse.openrobotics.org/t/rfc-open-standard-for-robot-to-human-light-signaling-looking-for-technical-feedback-from-ros-2-developers/53656</link>
	<description>&lt;p&gt;Hi everyone,&lt;/p&gt;
&lt;p&gt;I’m working on an open standard called LSEP (Luminae Signal Expression Protocol) — a state machine specification for how robots communicate intent, awareness, and safety states to humans through light signals.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;The problem it solves&lt;/strong&gt;:&lt;/p&gt;
&lt;p&gt;Most robotic platforms implement ad-hoc LED patterns with no shared semantics. Robot A blinks blue for “idle,” Robot B blinks blue for “navigating.” There’s no interoperability, and no way for a human in a shared workspace to learn one signal language that transfers across platforms.&lt;/p&gt;
&lt;p&gt;LSEP defines a modular 9-state architecture: 6 Core states (IDLE, AWARENESS, INTENT, CARE, CRITICAL, THREAT) and 3 Extended states (MED_CONF, LOW_CONF, INTEGRITY) with deterministic mappings from sensor inputs like Time-to-Collision (TTC) to signal outputs. The full spec is open: &lt;a href=&quot;https://lsep.org&quot; rel=&quot;noopener nofollow ugc&quot;&gt;https://lsep.org&lt;/a&gt;&lt;/p&gt;
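&lt;p&gt;To make the deterministic-mapping idea concrete, here is a minimal sketch of a TTC-to-Core-state function. The thresholds and the moving/idle rule are illustrative placeholders chosen for this example, not values from the LSEP spec:&lt;/p&gt;

```python
# Sketch of a deterministic TTC-to-state mapping in the spirit of LSEP's
# six Core states. Thresholds (in seconds) are illustrative placeholders,
# not values taken from the specification.
IDLE, AWARENESS, INTENT, CARE, CRITICAL, THREAT = (
    "IDLE", "AWARENESS", "INTENT", "CARE", "CRITICAL", "THREAT"
)

def core_state(ttc_s: float, moving: bool) -> str:
    """Map time-to-collision (seconds) to one of the six Core states."""
    if not moving:
        return IDLE
    if ttc_s > 10.0:
        return AWARENESS   # human present, no interaction expected yet
    if ttc_s > 5.0:
        return INTENT      # robot signals its planned motion
    if ttc_s > 2.0:
        return CARE        # proximity requires caution
    if ttc_s > 0.5:
        return CRITICAL    # imminent-contact warning
    return THREAT          # emergency signaling
```

&lt;p&gt;A real integration would add hysteresis around each threshold so the signal does not flicker when TTC hovers near a boundary, which is one of the topics the Beta program covers.&lt;/p&gt;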
&lt;p&gt;&lt;strong&gt;Where ROS 2 comes in&lt;/strong&gt;:&lt;/p&gt;
&lt;p&gt;We’ve designed LSEP to run as an isolated safety node — it reads from your perception pipeline (TTC, proximity, sensor health) and publishes signal commands. It doesn’t touch your navigation stack. The architecture pattern uses lifecycle nodes to keep the signaling guardrail separate from autonomy logic.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;What I’m looking for&lt;/strong&gt;:&lt;/p&gt;
&lt;p&gt;We’re running a free Beta program for 20 ROS 2 developers who want to stress-test the integration. No cost — your payment is brutal, unfiltered technical feedback and (optionally) a short write-up on how it fits into your stack.&lt;/p&gt;
&lt;p&gt;The program covers:&lt;/p&gt;
&lt;p&gt;- Translating TTC and proximity data to deterministic state machine outputs&lt;/p&gt;
&lt;p&gt;- EU AI Act compliance layers (Art. 9 &amp;amp; 50) for high-risk physical AI transparency&lt;/p&gt;
&lt;p&gt;- LSEP core &amp;amp; extended states: mechanics of the 9-state multimodal standard&lt;/p&gt;
&lt;p&gt;- ROS 2 integration: isolating the LSEP safety node from your navigation stack&lt;/p&gt;
&lt;p&gt;- Sensor fusion resilience: hysteresis and fallback patterns for sensor dropouts&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Not looking for&lt;/strong&gt;:&lt;/p&gt;
&lt;p&gt;This isn’t a pitch. I’m not selling anything here. I’m looking for the people who actually build these systems to tell me where LSEP breaks, what’s missing, and what’s naive. The harshest feedback is the most useful.&lt;/p&gt;
&lt;p&gt;Full spec: &lt;a href=&quot;https://lsep.org&quot; rel=&quot;noopener nofollow ugc&quot;&gt;https://lsep.org&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Beta registration: &lt;a href=&quot;https://www.experiencedesigninstitute.ch&quot; rel=&quot;noopener nofollow ugc&quot;&gt;https://www.experiencedesigninstitute.ch&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;— Nemanja Galić&lt;/p&gt;
            &lt;p&gt;&lt;small&gt;2 posts - 2 participants&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/rfc-open-standard-for-robot-to-human-light-signaling-looking-for-technical-feedback-from-ros-2-developers/53656&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Mon, 30 Mar 2026 16:10:45 +0000</pubDate>
</item>
<item>
	<title>ROS Industrial: PLCnext ROS Bridge: Enabling Hardware Interoperability Between Industrial PLCs and ROS</title>
	<guid isPermaLink="false">51df34b1e4b08840dcfd2841:51df4543e4b0e97ae8d7ea7d:69c6497b32310c659507ac6a</guid>
	<link>https://rosindustrial.org/news/2026/3/27/plcnext-ros-bridge-enabling-hardware-interoperability-between-industrial-plcs-and-ros</link>
	<description>&lt;p&gt;For developers already working with ROS, the integration of industrial fieldbuses, I/Os, and functional safety into robotic applications often introduces unexpected challenges. ROS offers a flexible and modular software framework, although connecting it to industrial automation hardware typically requires additional integration layers and specialized knowledge.&lt;/p&gt;
&lt;p&gt;This led to the idea of creating a solution that allows ROS developers to leverage a PLC where it excels, for example in deterministic control, industrial communication, and safety, while high-performance computation and complex logic remain within ROS.&lt;/p&gt;
&lt;h2&gt;PLCnext Technology Architecture Overview&lt;/h2&gt;
&lt;p&gt;PLCnext Controls run PLCnext Linux, a real-time capable operating system that hosts the PLCnext Runtime. The Runtime manages deterministic process data and stores it in the Global Data Space (GDS). &lt;/p&gt;
&lt;p&gt;&lt;em&gt;Key architectural components&lt;/em&gt;:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;PLCnext Linux: Yocto-based embedded Linux&lt;/li&gt;
&lt;li&gt;PLCnext Runtime (tasks, data handling, Axioline integration): Provides deterministic processing and the Global Data Space&lt;/li&gt;
&lt;li&gt;Global Data Space (GDS): Central storage for process variables accessible from PLC programs and system apps&lt;/li&gt;
&lt;li&gt;PLCnext Apps: Packaged software components that can be installed on the controller&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;PLCnext ROS Bridge&lt;/h2&gt;
&lt;figure class=&quot;sqs-block-image-figure intrinsic&quot;&gt;
  &lt;img alt=&quot;&quot; height=&quot;543&quot; src=&quot;https://images.squarespace-cdn.com/content/v1/51df34b1e4b08840dcfd2841/0faa7d01-afbc-4416-9cef-8a94d3e260aa/Picture1.png?format=1000w&quot; width=&quot;1404&quot; /&gt;
&lt;/figure&gt;
&lt;h4&gt;Concept&lt;/h4&gt;
&lt;p&gt;At its core, the PLCnext ROS Bridge is a custom ROS node with dedicated services running inside a Docker container, packaged as a PLCnext App. It provides a bidirectional communication gateway between the PLCnext Global Data Space (industrial side) and ROS topics (robotics side).&lt;/p&gt;
&lt;p&gt;To illustrate this, consider a motor connected to the PLC via &lt;em&gt;EtherCAT/FSoE&lt;/em&gt; or &lt;em&gt;PROFINET/PROFIsafe&lt;/em&gt;. The motor, along with its associated safety functions, can be managed through simple PLC logic and represented by a set of variables. Depending on the implementation, these variables, such as setpoints, command velocities, etc., can be exposed to ROS. When the navigation stack publishes a command velocity, the ROS Bridge, as a subscriber to this topic, writes the received values to the corresponding variable on the PLC side. Likewise, information such as safety status or system state can be sent from the PLC to ROS and made available through a defined topic.&lt;/p&gt;
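&lt;p&gt;The data flow described above can be sketched as follows. A plain dictionary stands in for the Global Data Space (the real bridge talks to the GDS over gRPC), and the instance paths, topic semantics, and function names are illustrative only:&lt;/p&gt;

```python
# Minimal sketch of the bridge's bidirectional data flow. A dict stands in
# for the PLCnext Global Data Space; the real bridge uses a gRPC client.
# Instance paths and names below are illustrative, not taken from the App.
class GdsStub:
    """Stand-in for the gRPC client to the PLCnext Global Data Space."""
    def __init__(self):
        self._vars = {}

    def write(self, instance_path: str, value):
        self._vars[instance_path] = value

    def read(self, instance_path: str):
        return self._vars.get(instance_path)

def on_cmd_vel(gds: GdsStub, linear_x: float, angular_z: float) -> None:
    """ROS-to-PLC direction: a subscriber callback forwards the received
    command velocity to the corresponding PLC variables."""
    gds.write("Arp.Plc.Eclr/MainInstance.SetpointLinX", linear_x)
    gds.write("Arp.Plc.Eclr/MainInstance.SetpointAngZ", angular_z)

def poll_safety_status(gds: GdsStub) -> bool:
    """PLC-to-ROS direction: read the safety state for republication
    on a ROS topic."""
    return bool(gds.read("Arp.Plc.Eclr/MainInstance.SafeStop"))
```

&lt;p&gt;In the deployed bridge, &lt;code&gt;on_cmd_vel&lt;/code&gt; would be registered as the callback of a ROS subscriber, and the safety status would be published periodically on a defined topic.&lt;/p&gt;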
&lt;h4&gt;Commissioning Workflow&lt;/h4&gt;
&lt;p&gt;The ROS Bridge Node is generated through an automated code-generation process. This process is driven by the Interface Description File (IDF), which defines the PLC instance paths (variables) that should be exposed to ROS.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;A typical build process performs the following steps&lt;/em&gt;:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Building the ROS Packages&lt;ul&gt;
&lt;li&gt;Parse the IDF and generate the source code for the topic, publisher and subscribers&lt;/li&gt;
&lt;li&gt;Build the ROS Node&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Place the resulting binaries and gRPC dependencies into a Docker image with a minimal ros-core installation.&lt;/li&gt;
&lt;li&gt;Package the Docker image, together with required metadata, into a read-only PLCnext App.&lt;/li&gt;
&lt;/ol&gt;
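The IDF-driven generation step above can be sketched as follows. The real Interface Description File format is defined in the PLCnext-ROS-bridge repository; the JSON used here is a hypothetical simplification, intended only to show the idea that each entry maps a PLC instance path to a ROS topic and a direction.

```python
# Illustrative sketch of the code-generation input step: split IDF entries
# into the ROS publishers and subscribers that would be generated.
# The JSON schema below is an assumed stand-in, not the real IDF format.
import json

IDF_EXAMPLE = json.dumps([
    {"path": "MainInstance.TargetLinearX", "topic": "/cmd_vel_x",
     "direction": "ros_to_plc"},
    {"path": "MainInstance.SafetyState", "topic": "/safety_state",
     "direction": "plc_to_ros"},
])

def parse_idf(text):
    """Return (publishers, subscribers) to generate from an IDF document."""
    publishers, subscribers = [], []
    for var in json.loads(text):
        entry = (var["path"], var["topic"])
        if var["direction"] == "plc_to_ros":
            publishers.append(entry)   # PLC value exposed on a ROS topic
        else:
            subscribers.append(entry)  # ROS topic written into the GDS
    return publishers, subscribers

pubs, subs = parse_idf(IDF_EXAMPLE)
print(pubs)  # [('MainInstance.SafetyState', '/safety_state')]
print(subs)  # [('MainInstance.TargetLinearX', '/cmd_vel_x')]
```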
&lt;p&gt;The resulting App can be deployed to a PLCnext Controller using the Web-Based Management (WBM) interface. While it is possible to build everything in a local environment, the project is designed to be built via CI/CD. An example pipeline can also be found in the GitHub repository. &lt;/p&gt;
&lt;h4&gt;Runtime Behaviour&lt;/h4&gt;
&lt;p&gt;After installation, the App starts the container defined via the compose file. Inside this container, the generated ROS Node connects to the Global Data Space using the built gRPC client and then exposes the selected PLC variables via ROS publishers and subscribers. This enables ROS developers to integrate automation components, such as sensors, actuators, I/O modules, and fieldbus devices, into a ROS-based architecture through the GDS. Moreover, the Bridge provides a set of services that let users read and write information at runtime.&lt;/p&gt;
&lt;h2&gt;Further Reading&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;A Demo including a PLCnext Project can be downloaded from the PLCnext Store: &lt;a href=&quot;https://www.plcnextstore.com/permalinks/apps/latest/60002172000613&quot;&gt;https://www.plcnextstore.com/permalinks/apps/latest/60002172000613&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;The Source Code can be found in the PLCnext GitHub Repository: &lt;a href=&quot;https://github.com/PLCnext/PLCnext-ROS-bridge&quot;&gt;https://github.com/PLCnext/PLCnext-ROS-bridge&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;em&gt;More Information about the PLCnext Technology&lt;/em&gt;: &lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://www.plcnext.help&quot;&gt;https://www.plcnext.help&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.phoenixcontact.com/agv&quot;&gt;https://www.phoenixcontact.com/agv&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;</description>
	<pubDate>Sat, 28 Mar 2026 05:00:00 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: Questions on Zero-Copy for Variable-Size Messages (PointCloud2) with Iceoryx in ROS 2</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53585</guid>
	<link>https://discourse.openrobotics.org/t/questions-on-zero-copy-for-variable-size-messages-pointcloud2-with-iceoryx-in-ros-2/53585</link>
	<description>&lt;p&gt;Hi everyone,&lt;/p&gt;
&lt;p&gt;I am currently working on optimizing high-bandwidth sensor data transmission (specifically LiDAR point clouds) using &lt;strong&gt;ROS 2 and Iceoryx&lt;/strong&gt; for zero-copy communication.&lt;/p&gt;
&lt;p&gt;I have successfully set up the Iceoryx environment and confirmed zero-copy works for fixed-size types. However, I am facing challenges when applying this to &lt;strong&gt;variable-size messages&lt;/strong&gt;, such as &lt;code&gt;sensor_msgs/msg/PointCloud2&lt;/code&gt;.&lt;/p&gt;
&lt;p&gt;As I understand it, Iceoryx typically requires pre-allocated memory pools with fixed chunks. In the case of &lt;code&gt;PointCloud2&lt;/code&gt;, the data size can vary depending on the LiDAR’s points (in my case, around 5.2MB per message).&lt;/p&gt;
&lt;p&gt;I have two specific questions:&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;1. Best practices for variable-size data like PointCloud2&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;How should we handle messages where the size is not strictly fixed at compile-time while still maintaining zero-copy benefits? Should we always pre-allocate the “worst-case” maximum size for the underlying buffers? If anyone has implemented this for &lt;code&gt;sensor_msgs/msg/PointCloud2&lt;/code&gt; or similar dynamic types, I would appreciate any advice or examples.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;2. Tuning RouDi Configuration (&lt;code&gt;size&lt;/code&gt; and &lt;code&gt;count&lt;/code&gt;)&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Regarding the &lt;code&gt;roudi_config.toml&lt;/code&gt; (or the RouDi memory pool setup), what is the general rule of thumb for determining the optimal &lt;code&gt;size&lt;/code&gt; and &lt;code&gt;count&lt;/code&gt;?&lt;/p&gt;
&lt;p&gt;For high-resolution LiDAR data:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;How do you balance between the number of chunks (&lt;code&gt;count&lt;/code&gt;) and the buffer &lt;code&gt;size&lt;/code&gt; for each chunk to avoid memory exhaustion without being overly wasteful?&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Are there any common pitfalls when setting these values for a system with multiple subscribers?&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
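As a rough sketch of the worst-case sizing arithmetic behind these questions (the overhead figure and the count heuristic are assumptions for illustration, not an authoritative iceoryx rule):

```python
# Back-of-the-envelope sizing for a fixed-chunk mempool holding worst-case
# PointCloud2 samples. Chunk overhead and the count heuristic are assumed;
# the exact config keys belong in the RouDi TOML and should be checked
# against the iceoryx documentation.

def chunk_size_bytes(max_points, point_step, header_overhead=512):
    """Worst-case payload: fields/header metadata plus max_points * point_step."""
    return header_overhead + max_points * point_step

def pool_bytes(chunk_size, subscribers, queue_depth, publisher_history=1):
    # A common rule of thumb: enough chunks for every subscriber queue,
    # plus what the publisher may hold in flight, plus one spare.
    count = subscribers * queue_depth + publisher_history + 1
    return count, count * chunk_size

# ~5.2 MB messages: e.g. 325,000 points at 16 bytes per point
size = chunk_size_bytes(max_points=325_000, point_step=16)
count, total = pool_bytes(size, subscribers=3, queue_depth=2)
print(size, count, total)  # 5200512 8 41604096
```

The general trade-off: pre-allocating the worst case wastes memory when clouds are small, while undersized chunks force a fallback to copying; multiple subscribers multiply the required chunk count, not the chunk size.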
&lt;p&gt;I’ve already got Iceoryx installed and basic IPC working, but I want to ensure my configuration is production-ready for large-scale sensor data.&lt;/p&gt;
&lt;p&gt;Thank you in advance for your insights!&lt;/p&gt;
            &lt;p&gt;&lt;small&gt;4 posts - 3 participants&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/questions-on-zero-copy-for-variable-size-messages-pointcloud2-with-iceoryx-in-ros-2/53585&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Fri, 27 Mar 2026 16:14:52 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: WEBINAR: Accelerating Robotics Development with Qt Robotics Framework</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53584</guid>
	<link>https://discourse.openrobotics.org/t/webinar-accelerating-robotics-development-with-qt-robotics-framework/53584</link>
	<description>&lt;p&gt;Join Qt Group Webinar&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://www.qt.io/events/accelerating-robotics-development-with-qt-robotics-framework&quot; rel=&quot;noopener nofollow ugc&quot;&gt;Accelerating Robotics Development with Qt Robotics Framework&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/6/8/689a260c54b0defc3c94ee0ba49de3e7a476819c.jpeg&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;image&quot;&gt;&lt;img alt=&quot;image&quot; height=&quot;500&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/6/8/689a260c54b0defc3c94ee0ba49de3e7a476819c_2_500x500.jpeg&quot; width=&quot;500&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;p&gt;Qt Robotics Framework (QRF) introduces a fast, reliable way to connect Qt‑based applications (QML and C++) with ROS2 middleware. By automatically generating strongly‑typed Qt/QML bindings from ROS2 interface definitions, QRF enables robotics teams to integrate control, visualization, and simulation capabilities with minimal boilerplate and maximum safety.&lt;/p&gt;
&lt;p&gt;In this webinar, Qt Group’s engineers and industry experts demonstrate how QRF simplifies prototyping, reduces integration complexity, and helps teams move rapidly from concept to production.&lt;/p&gt;
&lt;p&gt;Whether you’re building robot controllers, diagnostics dashboards, or simulation environments, Qt Robotics Framework reduces the development cycle and improves reliability across your robotics stack.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Speakers&lt;/strong&gt;:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;Michele Rossi, Director, Industry, Qt Group&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Przemysław Nogaj, Head of HMI Technology, Spyrosoft&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Tommi Mänttäri, Senior Manager, R&amp;amp;D, Qt Group&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;a href=&quot;https://www.qt.io/events/accelerating-robotics-development-with-qt-robotics-framework&quot; rel=&quot;noopener nofollow ugc&quot;&gt;Accelerating Robotics Development with Qt Robotics Framework&lt;/a&gt;&lt;/p&gt;
            &lt;p&gt;&lt;small&gt;1 post - 1 participant&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/webinar-accelerating-robotics-development-with-qt-robotics-framework/53584&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Fri, 27 Mar 2026 16:12:57 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: ROS2 Studio — GUI tool for performance monitoring, bag operations and system dashboard</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53554</guid>
	<link>https://discourse.openrobotics.org/t/ros2-studio-gui-tool-for-performance-monitoring-bag-operations-and-system-dashboard/53554</link>
	<description>&lt;p&gt;Hi ROS community! &lt;img alt=&quot;:waving_hand:&quot; class=&quot;emoji&quot; height=&quot;20&quot; src=&quot;https://emoji.discourse-cdn.com/noto/waving_hand.png?v=15&quot; title=&quot;:waving_hand:&quot; width=&quot;20&quot; /&gt;&lt;/p&gt;
&lt;p&gt;I’d like to share a tool I built — &lt;strong&gt;ROS2 Studio&lt;/strong&gt;, a single GUI that brings together the most common ROS2 monitoring and bag operations in one place.&lt;/p&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110782-what-is-ros2-studio-1&quot; name=&quot;p-110782-what-is-ros2-studio-1&quot;&gt;&lt;/a&gt;What is ROS2 Studio?&lt;/h2&gt;
&lt;p&gt;ROS2 Studio is a PyQt5-based desktop GUI that runs as a native ROS2 CLI extension (&lt;code&gt;ros2 studio&lt;/code&gt;). Instead of juggling multiple terminal windows, everything is accessible from one interface.&lt;/p&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110782-features-2&quot; name=&quot;p-110782-features-2&quot;&gt;&lt;/a&gt;Features&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;img alt=&quot;:bar_chart:&quot; class=&quot;emoji&quot; height=&quot;20&quot; src=&quot;https://emoji.discourse-cdn.com/noto/bar_chart.png?v=15&quot; title=&quot;:bar_chart:&quot; width=&quot;20&quot; /&gt; &lt;strong&gt;Performance Monitor&lt;/strong&gt; — real-time CPU, memory, and frequency graphs for any topic or node&lt;/li&gt;
&lt;li&gt;&lt;img alt=&quot;:red_circle:&quot; class=&quot;emoji&quot; height=&quot;20&quot; src=&quot;https://emoji.discourse-cdn.com/noto/red_circle.png?v=15&quot; title=&quot;:red_circle:&quot; width=&quot;20&quot; /&gt; &lt;strong&gt;Bag Recorder&lt;/strong&gt; — multi-topic selection with custom save location&lt;/li&gt;
&lt;li&gt;&lt;img alt=&quot;:play_button:&quot; class=&quot;emoji&quot; height=&quot;20&quot; src=&quot;https://emoji.discourse-cdn.com/noto/play_button.png?v=15&quot; title=&quot;:play_button:&quot; width=&quot;20&quot; /&gt; &lt;strong&gt;Bag Player&lt;/strong&gt; — playback with adjustable rate (0.1x–10x) and loop controls&lt;/li&gt;
&lt;li&gt;&lt;img alt=&quot;:counterclockwise_arrows_button:&quot; class=&quot;emoji&quot; height=&quot;20&quot; src=&quot;https://emoji.discourse-cdn.com/noto/counterclockwise_arrows_button.png?v=15&quot; title=&quot;:counterclockwise_arrows_button:&quot; width=&quot;20&quot; /&gt; &lt;strong&gt;Bag to CSV Converter&lt;/strong&gt; — full message deserialization via &lt;code&gt;rosbag2_py&lt;/code&gt; to CSV&lt;/li&gt;
&lt;li&gt;&lt;img alt=&quot;:control_knobs:&quot; class=&quot;emoji&quot; height=&quot;20&quot; src=&quot;https://emoji.discourse-cdn.com/noto/control_knobs.png?v=15&quot; title=&quot;:control_knobs:&quot; width=&quot;20&quot; /&gt; &lt;strong&gt;System Dashboard&lt;/strong&gt; — CPU, memory, disk, network stats, ROS2 entities, and process monitor&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110782-installation-3&quot; name=&quot;p-110782-installation-3&quot;&gt;&lt;/a&gt;Installation&lt;/h2&gt;
&lt;pre&gt;&lt;code class=&quot;lang-bash&quot;&gt;cd ~/ros2_ws/src
git clone https://github.com/Sourav0607/ROS2-STUDIO
cd ~/ros2_ws
colcon build --packages-select ros2_studio
source install/setup.bash
ros2 studio
&lt;/code&gt;&lt;/pre&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110782-compatibility-4&quot; name=&quot;p-110782-compatibility-4&quot;&gt;&lt;/a&gt;Compatibility&lt;/h2&gt;
&lt;p&gt;Tested on ROS2 Humble and Jazzy on Ubuntu 22.04.&lt;/p&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110782-links-5&quot; name=&quot;p-110782-links-5&quot;&gt;&lt;/a&gt;Links&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;GitHub: &lt;a class=&quot;inline-onebox&quot; href=&quot;https://github.com/Sourav0607/ROS2-STUDIO&quot; rel=&quot;noopener nofollow ugc&quot;&gt;GitHub - Sourav0607/ROS2-STUDIO: A comprehensive ROS2 monitoring and management tool with an intuitive GUI for performance monitoring, bag recording, and bag playback. · GitHub&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Feedback, issues, and contributions are very welcome! I’m actively maintaining this and plan to add more features based on community input.&lt;/p&gt;
&lt;p&gt;— Sourav&lt;/p&gt;
&lt;p&gt;&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/1/f/1f41e8e001a180c2e4751d1c6dee3bca212d46ae.png&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;systm dashboard&quot;&gt;&lt;img alt=&quot;systm dashboard&quot; height=&quot;462&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/1/f/1f41e8e001a180c2e4751d1c6dee3bca212d46ae_2_690x462.png&quot; width=&quot;690&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;p&gt;&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/6/5/65a8a36c1e874dbb0f6bb234874b91624c34f0f9.png&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;bag play&quot;&gt;&lt;img alt=&quot;bag play&quot; height=&quot;462&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/6/5/65a8a36c1e874dbb0f6bb234874b91624c34f0f9_2_690x462.png&quot; width=&quot;690&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;p&gt;&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/8/a/8a46e499ef9944bf83612a1bcc04bf08443f09fc.png&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;bag recorder&quot;&gt;&lt;img alt=&quot;bag recorder&quot; height=&quot;462&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/8/a/8a46e499ef9944bf83612a1bcc04bf08443f09fc_2_690x462.png&quot; width=&quot;690&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;p&gt;&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/e/1/e10c4d5458ed8e07982d867c85dbc8f2775aa132.png&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;convert to csv&quot;&gt;&lt;img alt=&quot;convert to csv&quot; height=&quot;462&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/e/1/e10c4d5458ed8e07982d867c85dbc8f2775aa132_2_690x462.png&quot; width=&quot;690&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;p&gt;&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/d/e/de170a78a7065e137f664fca6310d1b77fa289d1.png&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;performanc metrics&quot;&gt;&lt;img alt=&quot;performanc metrics&quot; height=&quot;462&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/d/e/de170a78a7065e137f664fca6310d1b77fa289d1_2_690x462.png&quot; width=&quot;690&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
            &lt;p&gt;&lt;small&gt;1 post - 1 participant&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/ros2-studio-gui-tool-for-performance-monitoring-bag-operations-and-system-dashboard/53554&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Thu, 26 Mar 2026 14:58:28 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: Remote Control of Robotic Arms – Using a Standard Gamepad</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53550</guid>
	<link>https://discourse.openrobotics.org/t/remote-control-of-robotic-arms-using-a-standard-gamepad/53550</link>
	<description>&lt;h1&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110774-gamepad-control-for-piper-manipulator-1&quot; name=&quot;p-110774-gamepad-control-for-piper-manipulator-1&quot;&gt;&lt;/a&gt;Gamepad Control for PiPER Manipulator&lt;/h1&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110774-h-1-abstract-2&quot; name=&quot;p-110774-h-1-abstract-2&quot;&gt;&lt;/a&gt;1. Abstract&lt;/h2&gt;
&lt;p&gt;This document describes how to control the PiPER robotic arm with a standard gamepad: using a common gamepad, you can operate the manipulator in a visualized environment with a precise, intuitive control experience.&lt;/p&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110774-tags-3&quot; name=&quot;p-110774-tags-3&quot;&gt;&lt;/a&gt;Tags&lt;/h3&gt;
&lt;p&gt;PiPER Manipulator, Gamepad Teleoperation, Joint Control, Pose Control, Gripper Control, Forward &amp;amp; Inverse Kinematics&lt;/p&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110774-h-2-repositories-4&quot; name=&quot;p-110774-h-2-repositories-4&quot;&gt;&lt;/a&gt;2. Repositories&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Navigation Repository: &lt;a class=&quot;inline-onebox&quot; href=&quot;https://github.com/agilexrobotics/Agilex-College&quot; rel=&quot;noopener nofollow ugc&quot;&gt;GitHub - agilexrobotics/Agilex-College: Agilex College · GitHub&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Project Repository: &lt;a class=&quot;inline-onebox&quot; href=&quot;https://github.com/kehuanjack/Gamepad_PiPER&quot; rel=&quot;noopener nofollow ugc&quot;&gt;GitHub - kehuanjack/Gamepad_PiPER: This project implements the functionality of teleoperating a PiPER robotic arm using a gamepad. · GitHub&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110774-h-3-function-demo-5&quot; name=&quot;p-110774-h-3-function-demo-5&quot;&gt;&lt;/a&gt;3. Function Demo&lt;/h2&gt;
&lt;p&gt;&lt;img alt=&quot;20260326-173204&quot; class=&quot;animated&quot; height=&quot;268&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/8/3/83802d8fe7d2cf90b060545fcb1c883aadac6237.gif&quot; width=&quot;567&quot; /&gt;&lt;/p&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110774-h-4-environment-setup-6&quot; name=&quot;p-110774-h-4-environment-setup-6&quot;&gt;&lt;/a&gt;4. Environment Setup&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;OS: Ubuntu 20.04 or later&lt;/li&gt;
&lt;li&gt;Python Environment: Python 3.9 or later. Anaconda or Miniconda is recommended&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Clone the project and enter the root directory:&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;lang-auto&quot;&gt;git clone https://github.com/kehuanjack/Gamepad_PiPER.git
cd Gamepad_PiPER
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Install common dependencies and kinematics libraries (choose &lt;strong&gt;one&lt;/strong&gt; option; pytracik is recommended):&lt;/p&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110774-option-1-based-on-pinocchiohttpsgithubcomstack-of-taskspinocchio-7&quot; name=&quot;p-110774-option-1-based-on-pinocchiohttpsgithubcomstack-of-taskspinocchio-7&quot;&gt;&lt;/a&gt;Option 1: Based on &lt;a href=&quot;https://github.com/stack-of-tasks/pinocchio&quot; rel=&quot;noopener nofollow ugc&quot;&gt;pinocchio&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;(Python == 3.9; requires &lt;a href=&quot;https://github.com/agilexrobotics/piper_ros&quot; rel=&quot;noopener nofollow ugc&quot;&gt;piper_ros&lt;/a&gt; and sourcing the ROS workspace, otherwise meshes will not be found)&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;lang-auto&quot;&gt;conda create -n test_pinocchio python=3.9.* -y
conda activate test_pinocchio
pip3 install -r requirements_common.txt --upgrade
conda install pinocchio=3.6.0 -c conda-forge
pip install meshcat
pip install casadi
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;In &lt;code&gt;main.py&lt;/code&gt; and &lt;code&gt;main_virtual.py&lt;/code&gt;, select: &lt;code&gt;from src.gamepad_pin import RoboticArmController&lt;/code&gt;&lt;/p&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110774-option-2-based-on-pyrokihttpsgithubcomchungmin99pyroki-8&quot; name=&quot;p-110774-option-2-based-on-pyrokihttpsgithubcomchungmin99pyroki-8&quot;&gt;&lt;/a&gt;Option 2: Based on &lt;a href=&quot;https://github.com/chungmin99/pyroki&quot; rel=&quot;noopener nofollow ugc&quot;&gt;PyRoKi&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;(Python &amp;gt;= 3.10)&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;lang-auto&quot;&gt;conda create -n test_pyroki python=3.10.* -y
conda activate test_pyroki
pip3 install -r requirements_common.txt --upgrade
pip3 install pyroki@git+https://github.com/chungmin99/pyroki.git@f234516
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;In &lt;code&gt;main.py&lt;/code&gt; and &lt;code&gt;main_virtual.py&lt;/code&gt;, select: &lt;code&gt;from src.gamepad_limit import RoboticArmController&lt;/code&gt; or &lt;code&gt;from src.gamepad_no_limit import RoboticArmController&lt;/code&gt;&lt;/p&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110774-option-3-based-on-curobohttpsgithubcomnvlabscurobo-9&quot; name=&quot;p-110774-option-3-based-on-curobohttpsgithubcomnvlabscurobo-9&quot;&gt;&lt;/a&gt;Option 3: Based on &lt;a href=&quot;https://github.com/NVlabs/curobo&quot; rel=&quot;noopener nofollow ugc&quot;&gt;cuRobo&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;(Python &amp;gt;= 3.8; CUDA 11.8 recommended)&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;lang-auto&quot;&gt;conda create -n test_curobo python=3.10.* -y
conda activate test_curobo
pip3 install -r requirements_common.txt --upgrade
sudo apt install git-lfs &amp;amp;&amp;amp; cd ../
git clone https://github.com/NVlabs/curobo.git &amp;amp;&amp;amp; cd curobo
pip3 install &quot;numpy&amp;lt;2.0&quot; &quot;torch==2.0.0&quot; pytest lark
pip3 install -e . --no-build-isolation
python3 -m pytest .
cd ../Gamepad_PiPER
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;In &lt;code&gt;main.py&lt;/code&gt; and &lt;code&gt;main_virtual.py&lt;/code&gt;, select: &lt;code&gt;from src.gamepad_curobo import RoboticArmController&lt;/code&gt;&lt;/p&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110774-option-4-based-on-pytracikhttpsgithubcomchenhaoxpytracik-10&quot; name=&quot;p-110774-option-4-based-on-pytracikhttpsgithubcomchenhaoxpytracik-10&quot;&gt;&lt;/a&gt;Option 4: Based on &lt;a href=&quot;https://github.com/chenhaox/pytracik&quot; rel=&quot;noopener nofollow ugc&quot;&gt;pytracik&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;(Python &amp;gt;= 3.10)&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;lang-auto&quot;&gt;conda create -n test_tracik python=3.10.* -y
conda activate test_tracik
pip3 install -r requirements_common.txt --upgrade
git clone https://github.com/chenhaox/pytracik.git
cd pytracik
pip install -r requirements.txt
sudo apt install g++ libboost-all-dev libeigen3-dev liborocos-kdl-dev libnlopt-dev libnlopt-cxx-dev
python setup_linux.py install --user
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;In &lt;code&gt;main.py&lt;/code&gt; and &lt;code&gt;main_virtual.py&lt;/code&gt;, select: &lt;code&gt;from src.gamepad_trac_ik import RoboticArmController&lt;/code&gt;&lt;/p&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110774-h-5-execution-steps-11&quot; name=&quot;p-110774-h-5-execution-steps-11&quot;&gt;&lt;/a&gt;5. Execution Steps&lt;/h2&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Connect manipulator and activate CAN interface&lt;/strong&gt;: &lt;code&gt;sudo ip link set can0 up type can bitrate 1000000&lt;/code&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Connect gamepad&lt;/strong&gt;: Connect the gamepad to the PC via USB or Bluetooth.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Launch control script&lt;/strong&gt;: Run &lt;code&gt;python3 main.py&lt;/code&gt; or &lt;code&gt;python3 main_virtual.py&lt;/code&gt; in the project directory. It is recommended to test with &lt;code&gt;main_virtual.py&lt;/code&gt; first in simulation mode.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Verify gamepad connection&lt;/strong&gt;: Check console output to confirm the gamepad is recognized.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Web visualization&lt;/strong&gt;: Open a browser and go to &lt;code&gt;http://localhost:8080&lt;/code&gt; to view the manipulator status.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Start control&lt;/strong&gt;: Operate the manipulator according to the gamepad mapping.&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110774-h-6-gamepad-control-instructions-12&quot; name=&quot;p-110774-h-6-gamepad-control-instructions-12&quot;&gt;&lt;/a&gt;6. Gamepad Control Instructions&lt;/h2&gt;
&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110774-h-61-button-mapping-13&quot; name=&quot;p-110774-h-61-button-mapping-13&quot;&gt;&lt;/a&gt;6.1 Button Mapping&lt;/h3&gt;
&lt;div class=&quot;md-table&quot;&gt;
&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Button&lt;/th&gt;
&lt;th&gt;Short Press Function&lt;/th&gt;
&lt;th&gt;Long Press Function&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;HOME&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Connect / Disconnect manipulator&lt;/td&gt;
&lt;td&gt;None&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;START&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Switch high-level control mode (Joint / Pose)&lt;/td&gt;
&lt;td&gt;Switch low-level control mode (Joint / Pose)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;BACK&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Switch low-level command mode (Position-Velocity 0x00 / Fast Response 0xAD)&lt;/td&gt;
&lt;td&gt;None&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Y&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Go to home position&lt;/td&gt;
&lt;td&gt;None&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;A&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Save current position&lt;/td&gt;
&lt;td&gt;Clear current saved position&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;B&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Restore previous saved position&lt;/td&gt;
&lt;td&gt;None&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;X&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Switch playback order&lt;/td&gt;
&lt;td&gt;Clear all saved positions&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;LB&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Increase speed factor (high-level)&lt;/td&gt;
&lt;td&gt;Decrease speed factor (high-level)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;RB&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Increase movement speed (low-level)&lt;/td&gt;
&lt;td&gt;Decrease movement speed (low-level)&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;/div&gt;&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110774-h-62-joystick-trigger-functions-14&quot; name=&quot;p-110774-h-62-joystick-trigger-functions-14&quot;&gt;&lt;/a&gt;6.2 Joystick &amp;amp; Trigger Functions&lt;/h3&gt;
&lt;div class=&quot;md-table&quot;&gt;
&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Control&lt;/th&gt;
&lt;th&gt;Joint Mode&lt;/th&gt;
&lt;th&gt;Pose Mode&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Left Joystick&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;J1 (Base rotation): Left / Right&lt;br /&gt;J2 (Shoulder): Up / Down&lt;/td&gt;
&lt;td&gt;End-effector X / Y translation&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Right Joystick&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;J3 (Elbow): Up / Down&lt;br /&gt;J6 (Wrist rotation): Left / Right&lt;/td&gt;
&lt;td&gt;End-effector Z translation &amp;amp; Z-axis rotation&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;D-Pad&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;J4 (Wrist yaw): Left / Right&lt;br /&gt;J5 (Wrist pitch): Up / Down&lt;/td&gt;
&lt;td&gt;End-effector X / Y-axis rotation&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Left Trigger (LT)&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Close gripper&lt;/td&gt;
&lt;td&gt;Close gripper&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Right Trigger (RT)&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Open gripper&lt;/td&gt;
&lt;td&gt;Open gripper&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;/div&gt;&lt;h3&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110774-h-63-special-functions-15&quot; name=&quot;p-110774-h-63-special-functions-15&quot;&gt;&lt;/a&gt;6.3 Special Functions&lt;/h3&gt;
&lt;h4&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110774-h-631-gripper-control-16&quot; name=&quot;p-110774-h-631-gripper-control-16&quot;&gt;&lt;/a&gt;6.3.1 Gripper Control&lt;/h4&gt;
&lt;ul&gt;
&lt;li&gt;Gripper opening range: 0–100%&lt;/li&gt;
&lt;li&gt;Quick toggle: When fully open (100%) or fully closed (0%), a quick press and release of the trigger toggles the state.&lt;/li&gt;
&lt;/ul&gt;
&lt;h4&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110774-h-632-speed-control-17&quot; name=&quot;p-110774-h-632-speed-control-17&quot;&gt;&lt;/a&gt;6.3.2 Speed Control&lt;/h4&gt;
&lt;ul&gt;
&lt;li&gt;Speed factor: 0.25x, 0.5x, 1.0x, 2.0x, 3.0x, 4.0x, 5.0x (adjust with LB)&lt;/li&gt;
&lt;li&gt;Movement speed: 10%–100% (adjust with RB)&lt;/li&gt;
&lt;/ul&gt;
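The LB speed-factor stepping described above can be sketched as a small handler. The function name and the clamping behaviour at the ends of the range are assumptions for illustration, not the project's actual code:

```python
# Hypothetical sketch: short press steps up through the listed factors,
# long press steps down; both clamp at the ends of the range.
SPEED_FACTORS = [0.25, 0.5, 1.0, 2.0, 3.0, 4.0, 5.0]

def next_speed_factor(current, long_press=False):
    i = SPEED_FACTORS.index(current)
    if long_press:
        i = max(i - 1, 0)                       # decrease, clamp at 0.25x
    else:
        i = min(i + 1, len(SPEED_FACTORS) - 1)  # increase, clamp at 5.0x
    return SPEED_FACTORS[i]

print(next_speed_factor(1.0))                    # 2.0
print(next_speed_factor(0.25, long_press=True))  # 0.25
```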
&lt;h4&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110774-h-633-position-memory-18&quot; name=&quot;p-110774-h-633-position-memory-18&quot;&gt;&lt;/a&gt;6.3.3 Position Memory&lt;/h4&gt;
&lt;ul&gt;
&lt;li&gt;Supports saving multiple waypoints&lt;/li&gt;
&lt;li&gt;Supports forward and reverse playback&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110774-notes-19&quot; name=&quot;p-110774-notes-19&quot;&gt;&lt;/a&gt;Notes&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;You may run &lt;code&gt;main_virtual.py&lt;/code&gt; first to test in simulation.&lt;/li&gt;
&lt;li&gt;For first-time use, start with low speed and increase gradually after familiarization.&lt;/li&gt;
&lt;li&gt;Keep a safe distance during operation. Do not approach the moving manipulator.&lt;/li&gt;
&lt;li&gt;Numerical IK solutions may cause large joint jumps near singularities, so maintain a safe distance.&lt;/li&gt;
&lt;li&gt;Fast response mode (0xAD) is dangerous. Use with extreme caution and keep clear.&lt;/li&gt;
&lt;li&gt;If using pinocchio, source the ROS workspace of the manipulator in advance, otherwise meshes will not be detected.&lt;/li&gt;
&lt;/ul&gt;
            &lt;p&gt;&lt;small&gt;1 post - 1 participant&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/remote-control-of-robotic-arms-using-a-standard-gamepad/53550&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Thu, 26 Mar 2026 09:51:15 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: FusionCore, which is a ROS 2 Jazzy sensor fusion package (robot_localization replacement)</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53502</guid>
	<link>https://discourse.openrobotics.org/t/fusioncore-which-is-a-ros-2-jazzy-sensor-fusion-package-robot-localization-replacement/53502</link>
	<description>&lt;p&gt;Hey everyone,&lt;br /&gt;
I’ve been working on FusionCore for the last few months… it’s a ROS 2 Jazzy sensor fusion package that aims to bridge the gap left by the deprecation of robot_localization.&lt;/p&gt;
&lt;p&gt;There wasn’t anything user-friendly available for ROS 2 Jazzy. It merges IMU, wheel encoders, and GPS/GNSS into a single, reliable position estimate at 100Hz. No need for manual covariance matrices…. just one YAML config file.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;It uses an Unscented Kalman Filter (UKF) with a complete 3D state, built natively for ROS 2 Jazzy rather than ported from robot_localization.&lt;/li&gt;
&lt;li&gt;It features native GNSS fusion in ECEF coordinates, so you won’t run into UTM zone issues.&lt;/li&gt;
&lt;li&gt;It supports dual-antenna heading out of the box.&lt;/li&gt;
&lt;li&gt;It automatically estimates IMU gyroscope and accelerometer biases.&lt;/li&gt;
&lt;li&gt;It includes HDOP/VDOP quality-aware noise scaling, which means bad GPS fixes are automatically down-weighted.&lt;/li&gt;
&lt;li&gt;It’s under the Apache 2.0 license, making it commercially safe.&lt;/li&gt;
&lt;/ul&gt;
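The DOP-aware down-weighting in the list above can be illustrated with a toy measurement-noise scaling. This is my own sketch of the general idea, not FusionCore’s actual code; the function name and the quadratic scaling law (standard deviation growing linearly with DOP) are assumptions.

```python
# Toy illustration of HDOP/VDOP quality-aware noise scaling: inflate
# GPS measurement variances when dilution of precision is poor, so a
# filter trusts bad fixes less. Hypothetical helper, not FusionCore's API.

def scale_gps_variances(var_enu, hdop, vdop, nominal_dop=1.0):
    """var_enu: (east, north, up) measurement variances.

    Variances are inflated by (DOP / nominal_dop)**2 and never
    shrunk below their base values.
    """
    dops = (hdop, hdop, vdop)
    return tuple(v * max(d / nominal_dop, 1.0) ** 2
                 for v, d in zip(var_enu, dops))
```

For example, a fix with HDOP 2.0 and VDOP 3.0 has its horizontal variances inflated 4x and its vertical variance 9x, while a fix at or below the nominal DOP passes through unchanged.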
&lt;p&gt;&lt;strong&gt;GitHub:&lt;/strong&gt; &lt;a href=&quot;https://github.com/manankharwar/fusioncore&quot; rel=&quot;noopener nofollow ugc&quot;&gt;https://github.com/manankharwar/fusioncore&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/0/4/047cee4fbac7c674ca62269b38098fba6261bd51.png&quot; rel=&quot;noopener nofollow ugc&quot; title=&quot;image&quot;&gt;&lt;img alt=&quot;image&quot; height=&quot;478&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/0/4/047cee4fbac7c674ca62269b38098fba6261bd51_2_690x478.png&quot; width=&quot;690&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;p&gt;I respond to issues within 24 hours. If you’re working on a wheeled robot with GPS on ROS 2 Jazzy and hit problems, open an issue or reply here.&lt;/p&gt;
            &lt;p&gt;&lt;small&gt;6 posts - 3 participants&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/fusioncore-which-is-a-ros-2-jazzy-sensor-fusion-package-robot-localization-replacement/53502&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Tue, 24 Mar 2026 23:17:59 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: ROSCon Global 2026: Call for Sponsors</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53498</guid>
	<link>https://discourse.openrobotics.org/t/roscon-global-2026-call-for-sponsors/53498</link>
	<description>&lt;p&gt;&lt;/p&gt;&lt;div class=&quot;lightbox-wrapper&quot;&gt;&lt;a class=&quot;lightbox&quot; href=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/original/3X/e/a/eaad788c3e306801044828cd49a691386bf177da.png&quot; title=&quot;ROSCon_2026_Horizontal_FINAL&quot;&gt;&lt;img alt=&quot;ROSCon_2026_Horizontal_FINAL&quot; height=&quot;237&quot; src=&quot;https://us1.discourse-cdn.com/flex022/uploads/ros/optimized/3X/e/a/eaad788c3e306801044828cd49a691386bf177da_2_517x237.png&quot; width=&quot;517&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;p&gt;&lt;/p&gt;
&lt;h1&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110699-roscon-global-2026-call-for-sponsors-1&quot; name=&quot;p-110699-roscon-global-2026-call-for-sponsors-1&quot;&gt;&lt;/a&gt;ROSCon Global 2026: Call for Sponsors&lt;/h1&gt;
&lt;p&gt;Hi Everyone,&lt;/p&gt;
&lt;p&gt;The ROSCon executive committee is happy to announce &lt;a href=&quot;https://roscon.ros.org/2026/img/ROSCon2026SponsorProspectus.pdf&quot;&gt;that sponsorship opportunities&lt;/a&gt; are now available for &lt;a href=&quot;https://roscon.ros.org/2026/&quot;&gt;ROSCon Global 2026 in Toronto&lt;/a&gt; (September 22-24)!&lt;/p&gt;
&lt;p&gt;If you would like to get your product or service in front of over a thousand robot application developers, decision makers, and students, ROSCon Global is the place to be!&lt;/p&gt;
&lt;p&gt;This year we are aiming for over 1,000 attendees, and if this event is anything like ROSCon 2025, our attendees will represent:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;350+ companies in the field of robotics&lt;/li&gt;
&lt;li&gt;50+ countries&lt;/li&gt;
&lt;li&gt;60+ universities&lt;/li&gt;
&lt;li&gt;80% filling roles as engineers or executive management&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;This year we will be offering our largest number of sponsorship opportunities yet, including the chance to:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Host a booth in our amazing ROSCon Global Expo hall. Booth locations are first come, first served, so do not delay.&lt;/li&gt;
&lt;li&gt;Demonstrate your robot or device in our robot demo area.&lt;/li&gt;
&lt;li&gt;Support our worldwide community with our free live stream and video archive, reaching thousands of viewers.&lt;/li&gt;
&lt;li&gt;Include your stickers, one-sheet, or giveaway in our swag bag.&lt;/li&gt;
&lt;li&gt;Support ROSCon attendees in their native language with our live captioning and translation service.&lt;/li&gt;
&lt;li&gt;Be the life of the party by hosting our ROSCon Global reception and gala.&lt;/li&gt;
&lt;li&gt;Feed and recharge our amazing ROSCon attendees by becoming a lunch or refreshment sponsor.&lt;/li&gt;
&lt;li&gt;Elevate your startup’s visibility by joining our amazing ROSCon startup alley.&lt;/li&gt;
&lt;li&gt;Connect with ROSCon attendees by supporting our award-winning and &lt;strong&gt;surprisingly good&lt;/strong&gt; Whova app.&lt;/li&gt;
&lt;li&gt;Show your support for underrepresented groups in robotics by sponsoring our inspiring ROSCon Diversity Scholars.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;a href=&quot;https://roscon.ros.org/2026/img/ROSCon2026SponsorProspectus.pdf&quot;&gt;Our full ROSCon Global 2026 sponsorship prospectus&lt;/a&gt; is now available on the ROSCon website, and you can start your ROSCon journey by emailing &lt;a href=&quot;mailto:roscon-2026-ec@roscon.org&quot;&gt;roscon-2026-ec@roscon.org&lt;/a&gt;. We recommend you start your sponsorship conversation as soon as possible, as ROSCon booths and sponsorship opportunities tend to sell out quickly!&lt;/p&gt;
            &lt;p&gt;&lt;small&gt;2 posts - 1 participant&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/roscon-global-2026-call-for-sponsors/53498&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Tue, 24 Mar 2026 19:12:28 +0000</pubDate>
</item>
<item>
	<title>ROS Discourse General: iRoboCity2030 Summer School 2026: ROS 2, AI and Field Robotics</title>
	<guid isPermaLink="false">discourse.openrobotics.org-topic-53487</guid>
	<link>https://discourse.openrobotics.org/t/irobocity2030-summer-school-2026-ros-2-ai-and-field-robotics/53487</link>
	<description>&lt;h1&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110683-international-irobocity2030-summer-school-2026-ros-2-ai-and-field-robotics-1&quot; name=&quot;p-110683-international-irobocity2030-summer-school-2026-ros-2-ai-and-field-robotics-1&quot;&gt;&lt;/a&gt;International iRoboCity2030 Summer School 2026: ROS 2, AI and Field Robotics&lt;/h1&gt;
&lt;p&gt;&lt;strong&gt;Madrid, Spain, 22–26 June 2026&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Web: &lt;a class=&quot;inline-onebox&quot; href=&quot;https://intelligentroboticslabs.github.io/RoboCity_ROS2_Summer_School_2026/&quot; rel=&quot;noopener nofollow ugc&quot;&gt;iRoboCity2030 Summer School 2026 – ROS 2, AI and Field Robotics&lt;/a&gt;&lt;br /&gt;
Email: &lt;a href=&quot;mailto:irobocity2030@gmail.com&quot;&gt;irobocity2030@gmail.com&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Registration deadlines:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Early until 30 April 2026&lt;/li&gt;
&lt;li&gt;Normal until 31 May 2026&lt;/li&gt;
&lt;li&gt;Late until the event.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110683-motivation-and-description-2&quot; name=&quot;p-110683-motivation-and-description-2&quot;&gt;&lt;/a&gt;MOTIVATION AND DESCRIPTION&lt;/h2&gt;
&lt;p&gt;The iRoboCity2030 Summer School 2026, entitled “ROS 2: AI and Field Robotics”, offers undergraduate and graduate students from all over the world an intensive one-week experience focused on the technologies driving the new generation of autonomous and intelligent robots. The program combines theoretical and practical training in ROS 2 (Robot Operating System 2), Artificial Intelligence, and Field Robotics, guided by researchers from leading universities and technological centers in Madrid. Over five days, participants will advance both theoretical knowledge and practical skills, from the fundamentals of ROS 2 to the application of AI techniques in field robotics domains such as autonomous driving, quadrupedal robots, agricultural robotics, and aerial robotics.&lt;/p&gt;
&lt;p&gt;In addition to the academic program, the summer school will feature two plenary lectures delivered by internationally recognized leaders in the ROS 2 ecosystem. The first will be given by &lt;strong&gt;Steve Macenski&lt;/strong&gt; (OpenNavigation), lead developer of the Nav2 system, widely regarded as the reference standard for autonomous robot navigation in ROS 2. The second will be delivered by &lt;strong&gt;Davide Faconti&lt;/strong&gt;, creator of BehaviorTrees.CPP and Groot, tools that are extensively used for developing robotics applications based on Behavior Trees.&lt;/p&gt;
&lt;p&gt;The school’s pedagogical approach is strongly practical and collaborative: participants will learn by doing, combining knowledge of artificial intelligence, control, and perception with their direct application in ROS 2, both in simulation environments and on real robotic platforms. Beyond its technical dimension, the school promotes intercultural collaboration and international teamwork, creating a dynamic environment for learning and experimentation.&lt;br /&gt;
This summer school is part of the iRoboCity2030 initiative, the robotics innovation network of the Community of Madrid, and represents a joint effort by the region’s leading universities and research centers to promote advanced training and knowledge transfer in robotics and artificial intelligence.&lt;/p&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110683-list-of-speakers-and-instructors-3&quot; name=&quot;p-110683-list-of-speakers-and-instructors-3&quot;&gt;&lt;/a&gt;LIST OF SPEAKERS AND INSTRUCTORS&lt;/h2&gt;
&lt;p&gt;Steve Macenski (OpenNavigation) — “Nav2 &amp;amp; ROS 2 Overview: Techniques &amp;amp; Applications Powering an Industry”&lt;br /&gt;
Davide Faconti (BehaviorTrees.CPP / Groot) — “Being a roboticist in the era of AI: what changed and what didn’t”&lt;/p&gt;
&lt;p&gt;Carlos Balaguer, UC3M&lt;br /&gt;
Francisco Martín Rico, URJC&lt;br /&gt;
José M. Cañas, URJC&lt;br /&gt;
Luis Miguel Bergasa, UAH&lt;br /&gt;
Fabio Sánchez, UAH&lt;br /&gt;
Miguel Antunes, UAH&lt;br /&gt;
Santiago Montiel, UAH&lt;br /&gt;
Rodrigo Gutiérrez, UAH&lt;br /&gt;
Christyan Cruz, UPM&lt;br /&gt;
Roemi Fernández, CSIC&lt;br /&gt;
Raúl Fernández, UCM&lt;br /&gt;
…&lt;/p&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110683-organization-4&quot; name=&quot;p-110683-organization-4&quot;&gt;&lt;/a&gt;ORGANIZATION&lt;/h2&gt;
&lt;p&gt;This summer school is part of the iRoboCity2030 initiative, the Robotics Innovation Network of the Madrid Region. It represents a joint effort by leading universities and research institutions to promote advanced training and knowledge transfer in robotics and artificial intelligence.&lt;/p&gt;
&lt;h2&gt;&lt;a class=&quot;anchor&quot; href=&quot;https://discourse.openrobotics.org#p-110683-social-experience-5&quot; name=&quot;p-110683-social-experience-5&quot;&gt;&lt;/a&gt;SOCIAL EXPERIENCE&lt;/h2&gt;
&lt;p&gt;The Summer School will take place in the city centre of Madrid, which is well connected by public transport. The city is famous for its lively atmosphere, outdoor cafés, cultural events, and late-evening social life, providing countless opportunities to meet people and enjoy experiences beyond the classroom. With its warm climate, rich culture, excellent food, and safe, walkable neighborhoods, Madrid combines academic learning with an unforgettable social experience.&lt;/p&gt;
            &lt;p&gt;&lt;small&gt;1 post - 1 participant&lt;/small&gt;&lt;/p&gt;
            &lt;p&gt;&lt;a href=&quot;https://discourse.openrobotics.org/t/irobocity2030-summer-school-2026-ros-2-ai-and-field-robotics/53487&quot;&gt;Read full topic&lt;/a&gt;&lt;/p&gt;</description>
	<pubDate>Tue, 24 Mar 2026 11:00:28 +0000</pubDate>
</item>

</channel>
</rss>
