If you are a ROS developer/user and you blog about it, ROS wants those contributions on this page! All you need for that to happen is:
have an RSS/Atom blog (no Twitter/Facebook/Google+ posts)
open a pull request on the planet.ros tracker indicating your name and your RSS/Atom feed URL. (You can just edit the file and click "Propose File Change" to open a pull request.)
tag your ROS-related posts with any of the following categories: "ROS", "R.O.S.", "ros", "r.o.s."
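Before opening the pull request, you can sanity-check that your feed entries actually carry one of the accepted categories. Here is a small sketch using only the Python standard library (the feed contents below are made up for the example):

```python
import xml.etree.ElementTree as ET

ATOM_NS = "{http://www.w3.org/2005/Atom}"
ACCEPTED = {"ros", "r.o.s."}  # matching is done case-insensitively here

def ros_tagged_entries(atom_xml: str):
    """Return titles of feed entries carrying an accepted ROS category."""
    root = ET.fromstring(atom_xml)
    titles = []
    for entry in root.iter(f"{ATOM_NS}entry"):
        terms = {c.get("term", "").lower() for c in entry.iter(f"{ATOM_NS}category")}
        if terms & ACCEPTED:
            titles.append(entry.findtext(f"{ATOM_NS}title"))
    return titles

feed = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>My Robotics Blog</title>
  <entry><title>Hello ROS</title><category term="ROS"/></entry>
  <entry><title>Holiday photos</title><category term="travel"/></entry>
</feed>"""

# Only the ROS-tagged post should be picked up by the aggregator.
print(ros_tagged_entries(feed))
```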
Warnings
For security reasons, HTML iframe, embed, object, and JavaScript elements will be stripped out. Only YouTube videos in object and embed tags will be kept.
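For the curious, this kind of tag stripping can be sketched with Python's standard html.parser. This is an illustrative toy, not Planet ROS's actual sanitizer, and it deliberately omits the YouTube exception:

```python
from html.parser import HTMLParser

STRIPPED = {"iframe", "embed", "object", "script"}

class FeedSanitizer(HTMLParser):
    """Drop disallowed tags and their contents; pass everything else through."""
    def __init__(self):
        super().__init__()
        self.out = []
        self.depth = 0  # >0 while inside a stripped element

    def handle_starttag(self, tag, attrs):
        if tag in STRIPPED:
            self.depth += 1
        elif self.depth == 0:
            self.out.append(self.get_starttag_text())

    def handle_endtag(self, tag):
        if tag in STRIPPED:
            self.depth = max(0, self.depth - 1)
        elif self.depth == 0:
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        if self.depth == 0:
            self.out.append(data)

def sanitize(html: str) -> str:
    p = FeedSanitizer()
    p.feed(html)
    return "".join(p.out)

# Scripts and iframes disappear; ordinary markup survives.
print(sanitize('<p>demo</p><script>alert(1)</script><iframe src="x"></iframe>'))
```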
Guidelines
Planet ROS is one of the public faces of ROS and is read by users and potential contributors. The content remains the opinion of the bloggers but Planet ROS reserves the right to remove offensive posts.
Blogs should be related to ROS, but that does not mean they should be devoid of personal subjects and opinions: those are encouraged, since Planet ROS is a chance to learn more about ROS developers.
Posts can be positive and promote ROS, or constructive and describe issues, but should not contain useless flaming opinions. We want to keep ROS welcoming :)
ROS covers a wide variety of people and cultures. Profanities, prejudice, lewd comments and content likely to offend are to be avoided. Do not make personal attacks or attacks against other projects on your blog.
Suggestions?
If you find any bug or have any suggestion, please file a bug on the planet.ros tracker.
We developed an open-source system that simplifies the training of robots to perform tasks through imitation learning. Our setup allows you to collect data, control robots, and train models in both real and simulated environments.
The system is built on the Diffusion Policy model and the ROS2 framework. To help you get started, we provide a pre-trained model and datasets, but you can also collect your own data for customized training.
We believe that imitation learning can be useful in dynamic environments where object positions can change, and our project aims to simplify this imitation learning process.
We invite you to explore its capabilities and contribute to its growth! If you’re interested in training robots to perform tasks using imitation learning, check it out and let us know what you think!
I have read the (now closed) topic: Call For Testing: Standards-based Python packaging with colcon. But it seems not much has happened in the linked repo since then. I have some co-workers who are asking why they need to use setup.py to use ROS, so I’m investigating for them. Should I try using this colcon-python-project Colcon extension? Or was the effort abandoned? Should I be using something else?
I also found this Poetry Colcon extension. I feel indifferent about Poetry, but could try it out. I would appreciate any sharing of thoughts about it.
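For reference, the standards-based approach replaces setup.py with a PEP 621 pyproject.toml along these lines. The package name and layout below are hypothetical, and the exact metadata that colcon-python-project expects may differ from this sketch:

```toml
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "my_ros_py_pkg"        # hypothetical package name
version = "0.1.0"
dependencies = ["rclpy"]

# ament discovery still needs the resource-index marker and package.xml
# installed; with setuptools these can be expressed as data files.
[tool.setuptools.data-files]
"share/ament_index/resource_index/packages" = ["resource/my_ros_py_pkg"]
"share/my_ros_py_pkg" = ["package.xml"]
```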
The ROS-Industrial Consortium Asia Pacific Annual Summit 2024, themed "Robotics in the Age of AI," concluded successfully, marking a significant milestone for the robotics and automation sector in Asia Pacific. Hosted by the ROS-Industrial Consortium Asia Pacific (RIC Asia Pacific) and managed by the Advanced Remanufacturing and Technology Center (ARTC), the summit brought together over 150 international participants, including industry leaders, researchers, and innovators, who gathered to explore the impact of AI-powered robotics on industries.
This year’s summit featured an impressive agenda packed with expert talks, hands-on masterclasses, and an engaging tech marketplace, all designed to highlight how AI is transforming robotics across industries, especially in manufacturing, logistics, and physical security.
Day 1 Highlights
The first day of the summit opened with a series of insightful presentations by leaders from major organizations in the AI and robotics space.
Amazon Web Services (AWS) set the tone with a talk titled "AI and Robotics-Driven Innovations for Manufacturing Excellence." Presented by Mirela Juravle, Data and AI Specialist Lead at AWS, the session showcased how robotics and AI are being leveraged to support digital transformation efforts toward achieving "lights-out" manufacturing, a concept where factories operate autonomously without human intervention. AWS shared insights on the benefits of cloud services in scaling robotic operations and optimizing workflows for increased efficiency in manufacturing.
Next, Mitsuharu Sonehara, Manager at IHI Corporation’s Technology Planning Department, presented on IHI's vision of advancing robotics and automation within logistics. Titled "From Deep Sea to Deep Space," his talk explored IHI's work in using robotics for high-stakes applications, from underwater operations to outer space logistics. He detailed the challenges and opportunities in these extreme environments and highlighted how AI and robotics can revolutionize logistics operations of the future.
Swarooph Nirmal Seshadri, Chief Technology Officer at Kabam Robotics, shared insights on the transformation of physical security through robotics, intelligence, and connectivity. His session explored how AI-driven robots are becoming crucial in the security industry, enabling smarter monitoring, data gathering, and response systems that are safer and more efficient.
Dr. Dikai Liu, a solution architect from NVIDIA, followed with an exciting presentation on NVIDIA’s suite of services, which facilitates the acceleration of robotics and AI from simulation to real-world deployment. NVIDIA’s tools and platforms empower developers to simulate complex environments and rapidly prototype AI algorithms for robotics, ultimately shortening the timeline from concept to deployment.
An important announcement on ROS2 compatibility came from Mr. Steven Chong, Senior Business Development Manager at Mitsubishi Electric Asia. He announced the release of a ROS2 driver for the MELFA robot, enabling broader integration with the ROS ecosystem. This advancement allows Mitsubishi Electric’s industrial robots to seamlessly integrate with ROS2, opening up new possibilities for automation in various industries. More details can be found on the ROS-Industrial blog here.
Wrapping up the first day’s talks, Dr. Yang Geng, Assistant Director at IMDA, presented on "Embodied AI as the Next Leap in GenAI." He described how embodied AI, which focuses on giving AI systems a physical presence, can revolutionize industries, from customer service robots to healthcare assistants, by enhancing interactions and adaptability through AI.
Day 2 Highlights
The second day of the summit was equally informative, beginning with a presentation by Matt Robinson, Consortium Manager at ROS-Industrial Consortium America. He discussed the collective global effort to standardize ROS2 drivers, aiming to establish ROS2 as the default industrial standard for robotics software. Robinson emphasized the benefits of this standardization for interoperability and efficiency in automation.
Following Robinson, Vishnuprasad Prachandabhanu, Consortium Manager of ROS-Industrial Consortium Europe, shared ongoing efforts to implement ROS2 across various applications. He highlighted a substantial EU-backed initiative with a €60 million funding commitment toward developing AI for robotics across Europe, signifying a significant investment in the advancement of open-source robotics.
Mr. Eugene Goh of JM Vistec presented next, offering insights on the integration of vision systems in robotics. His talk emphasized how JM Vistec enables robots to "see," enhancing precision and capability in industrial tasks, from quality inspection to object recognition.
Concluding the speaker sessions, Dr. Kenneth Kwok from IHPC shared cutting-edge research on enabling human-robot collaboration, powered by AI. His session emphasized the importance of human-centered AI in creating safe, efficient, and collaborative environments where robots can work alongside humans in factories, warehouses, and more.
Masterclass Lineup
Participants then moved to the masterclass sessions, which provided hands-on learning experiences across various aspects of AI and robotics. Each session was designed to deepen the practical knowledge and technical skills required for integrating AI with robotics.
Empowering Innovations with MELSOFT: Led by Mr. Liu Muyao, Software Application Engineer from Mitsubishi Electric Asia, this session focused on MELSOFT, Mitsubishi's integrated software environment, which enhances the control and flexibility of industrial robots.
Introduction to Reinforcement Learning for Robot Arm Manipulation: Hosted by Mr. Shalman Khan, Mr. Santosh Balaji, and Ms. Mercedes Ramesh from ROS-Industrial Consortium Asia Pacific, this session introduced reinforcement learning principles, showing participants how to apply these techniques to control robotic arms more effectively.
Introduction to Deep Learning with YOLO and ROS: Dr. Carlos Acosta, Robotics Specialist at Singapore Polytechnic, led a session on utilizing YOLO (You Only Look Once) with ROS for object detection. The masterclass offered participants a foundation in integrating deep learning algorithms with ROS to enhance robotic vision applications.
Introduction to Fleet Management with Open-RMF: This session, led by Dr. Ricardo Tellez, CEO of The Construct, demonstrated Open-RMF (Open Robotics Middleware Framework) for multi-robot fleet management. Participants learned how to manage multiple robots collaboratively, a critical capability for applications in large facilities like hospitals and factories.
Tech Marketplace Highlights
The tech marketplace featured a diverse array of participants, including Megazo, Kabam Robotics, IHPC, Parasoft, and Pepperl+Fuchs. Each company showcased their latest innovations, giving attendees a firsthand look at cutting-edge robotics solutions and AI-driven technologies designed to tackle challenges in industries like manufacturing, logistics, and safety. The marketplace provided a vibrant space for networking, collaboration, and discovering new tools that could redefine industrial automation.
The RIC-AP Annual Summit 2024 also announced an exciting event for 2025: the largest ROS conference, ROSCon 2025, is coming to Singapore. This will be the first time ROSCon is hosted in Singapore.
Finally, on behalf of everyone at ROS-Industrial Consortium Asia Pacific, we would like to thank all participants and delegates for their enthusiasm, and we look forward to the RIC-AP Annual Summit 2025.
Mitsubishi Electric aims to integrate their MELFA robots into the ROS2 ecosystem, allowing robotics developers and integrators to utilize their industry proven platform seamlessly in ROS-based applications.
By developing MELFA ROS2 packages, Mitsubishi Electric seeks to enable developers to leverage the flexibility, modularity, and extensive community support of ROS2, coupled with proven global hardware support.
MELFA ROS2 Driver is a collaborative effort between ROS-I Consortium Asia Pacific and Mitsubishi Electric Asia. MELFA ROS2 Driver consists of modular components that allow users to interface with the robot’s motion control, state monitoring, and digital/analog I/O operations within the ros2_control framework. This development bridges the gap between Mitsubishi Electric automation hardware and ROS2, providing developers with the tools needed to build, deploy, and manage robotic applications on an industrial platform effectively.
MELFA ROS2 Driver I/O controllers enable cyclic communication between ROS2 and MELFA. Developers can leverage the iQ Platform through MELFA ROS2 Driver to access other Mitsubishi Electric automation products (such as PLCs, HMIs, motor drives, and NC machines), utilize industrial networks (such as CC-Link, PROFINET, EtherCAT, EtherNet/IP, and DeviceNet), and explore time-sensitive networks (such as CC-Link IE TSN).
MELFA ROS2 Driver is designed for flexibility, supporting various ROS2 packages such as MoveIt2 for motion planning and custom nodes for specialized tasks.
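As a rough illustration of how such a driver plugs into the ros2_control stack, a controller configuration might look like the following. The controller names and the GPIO controller choice here are generic examples, not the driver's actual shipped configuration:

```yaml
controller_manager:
  ros__parameters:
    update_rate: 100  # Hz

    joint_trajectory_controller:
      type: joint_trajectory_controller/JointTrajectoryController

    gpio_controller:   # cyclic digital I/O; name and type are illustrative
      type: gpio_controllers/GpioCommandController

joint_trajectory_controller:
  ros__parameters:
    joints: [joint1, joint2, joint3, joint4, joint5, joint6]
    command_interfaces: [position]
    state_interfaces: [position, velocity]
```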
The MELFA ROS2 driver will officially support 9 models in the first batch, with the aim of supporting more than 20 models in the near future.
Space Station OS (SSOS) is an open-source development platform for space stations, built on ROS 2 to support interoperability, modularity, and scalability across various space station projects.
By unifying core functions like thermal control, power, and life support into reusable modules, Space Station OS provides a universal environment that allows engineers globally to develop and operate space stations collaboratively. This enables rapid innovation and cross-mission compatibility, lowering development costs and enhancing sustainable space station operations.
Space Station OS represents a global effort to democratize space station technology, welcoming contributions from the international aerospace and robotics communities.
ros2_controllers is now integrated into RobotCAD 4.0.0, letting you make a controllable diff-drive car, manipulator, etc. directly from the RobotCAD GUI without programming.
The PX4 autopilot was also integrated some time ago. Together, these make it possible to build aerial and land-riding robots with a manipulator in about an hour. See the functionality demonstration.
Moreover, you can extend RobotCAD's controllers by adding your own generic controller to the ros2_controllers repository via pull request (or locally on your computer). After that, you will be able to construct your specific controllable tool via the RobotCAD GUI and automatically generate the code of a ROS2 package and a Docker container for it.
ROSCon 2024 is in the bag and most of the team is finally home! The videos should be up on the web soon (please don’t ask when, they’ll be up as soon as we can get them edited and uploaded).
@Anis_Koubaa has put together a comprehensive survey on ROS 2. The survey includes an amazing webpage that lets you search through ROS papers and is worth bookmarking.
Our friends at ROS-Industrial had their annual European and Asian consortium meetings over the past two weeks. One of the big takeaways is the new Mitsubishi MELFA ROS 2 driver. You can check out the source code here.
Autonomous Mobile Robot (AMR) that assists with picking through collaboration with humans, aiming to reduce labor. AMRs, positioned as the next generation of Automatic Guided Vehicles (AGVs), can travel without guides and move autonomously while avoiding people and obstacles.
These robotics packages are already running in the market and in actual workplaces in Japan!
System Overview
The system consists of two components: a Fleet Management System and a Robot Navigation System.
All of these proprietary applications are built on top of ROS 2 Humble and Fast-DDS.
The system handles many tasks, such as map creation, autonomous navigation, and device management. Drawing on our experience, especially with edge IoT and embedded devices, we have developed a stable and robust system for the robots.
ROS Open Source Eco-System in Sony
Sony has been contributing to the ROS mainline based on requirements from its business applications, which in turn lets us use ROS as a user in those applications. We really appreciate the ROS community and ROS open source, and we will keep it up!
As some of you may know I’ve been on the board of the Open Source Hardware Association for some time now. The OSHWA team is really busy with an exciting new NSF project at the moment so I am stepping up to help with organizing this year’s Open Hardware Summit. I presented a Lightning Talk at ROSCon about this year’s summit and the response was so positive I figured I should also make an announcement on ROS Discourse. I might also be in the process of planning a ROS meetup in Edinburgh. If that’s something you would be interested in helping with please reach out to me directly via DM.
Open Hardware Summit 2025
This year’s Open Hardware Summit will be held in Edinburgh, Scotland on 2025-05-29T23:00:00Z UTC→2025-05-30T23:00:00Z UTC and tickets just went on sale. If you’ve never been to Open Hardware Summit you can get a taste of the event by watching our 2024 Summit Live Stream. Our keynote this year was from Danielle Boyer, an Ojibwe roboticist who builds open hardware educational robots that teach students their indigenous languages.
OSHWA currently has an open call for proposals for this year’s Open Hardware Summit. If you have an open source hardware robot, or an interesting open hardware project that you would like to share, please consider submitting a talk or workshop proposal! Applications are due by 2024-12-22T08:00:00Z UTC.
The OSHWA certification website is a gold mine of close to 3000 open hardware projects that you are free to study and use as part of your robotics project. The certification website currently includes 19 different motor drivers (like this stepper driver and this Grove Motor Driver from TU Delft ), 229 different robots (such as the NASA JPL Rover), and 312 different sensors (like this line sensor, and this pneumatics controller). I recommend you bookmark the certification website for easy reference!
Credits go to Abdulrahman S. Al-Batati for his great efforts in gathering this volume of related works and in building the first repository of ROS/ROS2 publications, available at:
This study stands as the most comprehensive survey to date on the transition from ROS 1 to ROS 2, offering a deep dive into the enhancements, challenges, and future directions for ROS 2.
Our analysis covers:
Real-time capabilities
Enhanced modularity
Security improvements
Middleware and distributed systems
Multi-robot system applications
We carefully analyzed 7,498 ROS-related articles, with a focused review of 431 ROS 2-specific publications, making this a key resource for researchers, developers, and enthusiasts in the ROS community.
Our goal is to provide a cohesive synthesis that helps deepen the understanding of ROS 2’s contributions and guides future research in robotic systems design.
Join us in exploring the potential of ROS 2 and shaping the future of robotics!
Our company GEEAR (full announcement soon) has been developing FIREBRINGER - a generalised autonomous navigation algorithm implemented on ROS2 for field-robotic applications.
The algorithm is meant to support a variety of missions from simple point-to-point travelling and trajectory tracking, to more advanced interaction with third-party vehicles and our favourite - real-world dense robotic swarms (in the full academic sense of emergent collaborative behaviour). We will continue working on populating our library of mission types/functionalities so you can expect more in the future.
So far we have validated FIREBRINGER on real-world boats and ground vehicles, and on copters and fixed-wing aircraft in Gazebo (real-world experiments coming soon). You can find our recent video unveiling our autonomous vessel prototype here.
The algorithm is meant to be easy to use and tune, only requiring basic physical characteristics of the robot (maximum linear/angular velocity and maximum acceleration for each degree of freedom of the robot), and it offers plug-n-play functionality when combined with an Ardupilot autopilot through MAVROS.
It is based on a lightweight, robust NMPC-variant integrated with artificial potential field theory. It can typically run at 100Hz on a normal PC and 25Hz on a RPi4B for all vehicle types. So far, it can receive topics regarding the location of other vehicles (third-party and collaborative), trajectory (e.g., NAV2 global path), destination (complete pose), and various other mission specific info. We are currently working on incorporating point-cloud/costmap information for surrounding obstacle avoidance. Our aim is to allow integration with any ROS2 perception package that may be used in field robotics. The algorithm outputs an optimal velocity for each degree of freedom so it needs to be combined with appropriate low-level actuation control (or connected to a well-tuned Ardupilot autopilot which will do the work).
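To give a flavor of the potential-field half of that combination, here is a minimal, self-contained 2D sketch. The gains, structure, and names are illustrative assumptions, not FIREBRINGER's actual implementation:

```python
import math

def apf_velocity(pos, goal, obstacles, v_max=1.0,
                 k_att=1.0, k_rep=0.5, influence=2.0):
    """Toy artificial-potential-field velocity command in 2D.

    Attractive pull toward the goal plus a repulsive push away from any
    obstacle closer than `influence`; the result is clipped to v_max.
    """
    vx = k_att * (goal[0] - pos[0])
    vy = k_att * (goal[1] - pos[1])
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 1e-9 < d < influence:
            push = k_rep * (1.0 / d - 1.0 / influence) / d**2
            vx += push * dx / d
            vy += push * dy / d
    speed = math.hypot(vx, vy)
    if speed > v_max:
        vx, vy = vx * v_max / speed, vy * v_max / speed
    return vx, vy

# With no obstacles the command points straight at the goal, capped at v_max.
print(apf_velocity((0.0, 0.0), (10.0, 0.0), []))
```

A real NMPC layer would optimize over a horizon instead of taking a single gradient step, but the input/output shape (robot state in, per-DOF velocity out) is the same.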
We are currently considering creating and publishing a complete ROS2 package for the algorithm and we may propose some type of fusion with NAV2 in the future, but we wanted to gauge interest first. Obviously we are open to ideas and recommendations (and some criticism)!
The Infrastructure Project Management Committee meeting has been on the OSRF Official Events calendar since we made the switch from the earlier Infrastructure Project Committee to the OSRA-chartered Infrastructure PMC.
When I created the event, I accidentally did so using my local timezone instead of pinning it to UTC; as a result, when the US clocks changed, the meeting time was shown an hour later than in previous weeks. Another PMC member noticed and reported this, and I corrected the event.
My apologies to observers who joined an hour later, when the meeting time was originally posted.
Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:
That which we call an API by any other name would serve just as well. But when there are many diverging opinions on exactly what names to use or exactly how to structure things, we may end up scattering our efforts instead of working together.
The OSRA TGC opened a technical committee to examine the ROS Enhancement Proposal (REP) process to refine it and apply it to all of the projects that are governed by the OSRA, including Open-RMF. The PMC of Open-RMF sees this as an opportunity to formalize something similar to a Request For Comments (RFC) process for Open-RMF where we can define interfaces for our next generation development and receive comments from the public, especially from stakeholders, to make sure that we’ll be developing the most useful tools possible for everyone. This would also define a formal way for anyone from the general public to propose new interfaces for Open-RMF to support.
At this session of the special interest group, Grey will present a working draft of what kind of process we are considering for Open-RMF.
We are eager to get feedback from the Open-RMF community and beyond about how to make this process as effective as possible. Effectiveness would be measured both in terms of making sure we are producing the highest quality stable interfaces possible and also getting contributions (proposals, feedback, and implementation) from as many community stakeholders as possible.
After presenting our current ideas, we want to open up the session to discussion and feedback, so please come ready to share your own thoughts. We will especially appreciate input from anyone that has experience (both positive and negative experiences) working with standardization bodies.
Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:
Today I would like to show you a package that we have prepared to be able to install and use TurtleBot2 Kobuki in a more comfortable way, either with a real robot or using the new gazebo simulator.
This package directly uses the kobuki_ros packages, which have been tested and prepared so that they can be used with the Jazzy and Rolling versions. In addition, I have retouched the URDF to my liking, so that you can more comfortably add/remove the camera and lidar sensors and the structure, or indicate whether or not you are going to use the Gazebo plugin.
This package is being used by students from the Rey Juan Carlos University, and I hope it will be useful to all of you who have a Kobuki and don’t know what to do with it. And of course, we are open to any PR to make everything work better, to fix any problem you find, or to clarify the steps further.
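For readers curious how sensors can be made toggleable in a URDF, a common xacro argument pattern is sketched below. The file and package names here are hypothetical, not necessarily what this package uses:

```xml
<!-- Hypothetical xacro arguments for optional sensors -->
<xacro:arg name="use_camera" default="true"/>
<xacro:arg name="use_lidar"  default="true"/>

<xacro:if value="$(arg use_camera)">
  <xacro:include filename="$(find my_kobuki_description)/urdf/camera.urdf.xacro"/>
</xacro:if>
<xacro:if value="$(arg use_lidar)">
  <xacro:include filename="$(find my_kobuki_description)/urdf/lidar.urdf.xacro"/>
</xacro:if>
```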
Thank you very much for reading the post and I hope it helps many of you.
ROS 2 Iron Irwini, which was released on 23rd May 2023, is a non-LTS release and is due to be marked end-of-life (EOL) by the end of November 2024.
If you are using Iron, it is strongly recommended to switch to Jazzy Jalisco which will be supported until May 2029.
Once EOL, we will no longer release updates to core and community managed packages through the ROS Buildfarm. We will also not backport any changes to the iron branches of core repositories. The distribution will be frozen forever.
The plan for the next month is as follows:
We will perform a regular sync on 1st Nov as announced here.
Between 1st Nov and 15th Nov, we will aim to merge all open backport PRs in the core repositories and release binaries for these packages.
On 15th Nov, we will perform the final Patch release of all core packages and sync community packages one last time.
Between 15th Nov and 23rd Nov, we will update infrastructure and documentation to mark Iron as EOL. During this time we will also address any issues from the final sync.
If you’re a maintainer of a package released on Iron, this is your call to get the final versions of your packages released on the Buildfarm over the next couple of weeks. Remember that Iron will go into sync hold a few days prior to 15th Nov, so please plan ahead.
We’re excited to announce the first Robotics and Simulation devroom at FOSDEM 2025, scheduled for Sunday, February 2nd in Brussels!
The Robotics and Simulation developer room at FOSDEM will focus on core robotics libraries, frameworks, simulation tools, and open-source platforms. The selected talks will cover key tools like ROS and Gazebo, showcasing advancements and projects in the field. The half-day event will feature four presentations and a lightning talk session, offering a platform for developers to share work and foster collaboration.
Scope of Talks
Because of the varied and interdisciplinary nature of robotics as a field, plenty of topics could fit a ‘Robotics and Simulation’ devroom. To help the speakers target their presentations to a specific audience, and to minimize the overlap with other established FOSDEM devrooms, we defined the scope of this devroom by listing key areas of interest:
Core robotics libraries and applications. Mapping, planning, localization, perception, and control solutions would fit this category. For example, Cartographer, OMPL, grid_map, Octomap, PCL, OpEn, OpenRMF
Frameworks used when building robotics applications, whether robotics-specific tools or generic tools used for robotics applications. Examples include ROS, Dora-RS, OpenRR, YARP, Zenoh, and Eclipse iceoryx. Of course, this is the ROS Discourse, but FOSDEM invites all types of robotics frameworks, so please see this as an opportunity to learn from each other.
Specific robotic simulation software like Gazebo, Coppelia, or simulations using non robotic specific tools, like Bevy, Godot.
Robotic-specific devops and related tooling, like webviz, MCAP
Robotic products implemented using open source software. This can include examples of AI methods used to teach robots to perform certain tasks, or controlling robots using LLMs. For example, dora-rs, ros2ai, robo-gym
The examples mentioned are meant to be descriptive, not restrictive. Talks could cover improvements to any of the mentioned packages, interesting uses of them, alternatives with different degrees of maturity, or topics totally unrelated to them.
Submission instructions
Talks proposal submission:
Deadline: 1st December 2024
Accepted talks notification: 6th December 2024
Format (indicate that in the proposal notes)
20 mins + 5 mins for Q&A
5 min lightning talks
Use this link: FOSDEM 2025 :: pretalx and please choose “Robotics and Simulation” as the “Track” when submitting your talk.
ROS2 diagnostics messages are part of jros2client starting from version 10.0.
We wrote a short article covering how users can consume ROS2 diagnostics by subscribing to them with jros2client directly from Java. All received ROS diagnostics messages are converted to OpenTelemetry metrics and sent to OpenTelemetry for further processing.
In our example, we use a Jetson Orin Nano as the source of all diagnostics. All diagnostics from the Jetson are gathered and published to ROS by isaac_ros_jetson_stats.
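The mapping the article describes can be sketched language-agnostically; here is a toy Python version. The metric naming scheme below is an assumption for illustration, not the article's actual Java code:

```python
def diagnostics_to_metrics(status):
    """Flatten one DiagnosticStatus-like dict into metric name/value pairs.

    Each numeric KeyValue becomes a gauge named after the component;
    non-numeric values are skipped here (a real bridge might keep them
    as attributes instead).
    """
    prefix = status["name"].replace(" ", "_").lower()
    metrics = {}
    for kv in status["values"]:
        try:
            metrics[f"{prefix}.{kv['key']}"] = float(kv["value"])
        except ValueError:
            pass
    return metrics

jetson_status = {  # shaped like diagnostic_msgs/DiagnosticStatus
    "name": "jetson_stats board temp",
    "values": [{"key": "cpu", "value": "48.5"}, {"key": "mode", "value": "MAXN"}],
}
print(diagnostics_to_metrics(jetson_status))
```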
You know, I’ve suddenly realized it’s been over a year since my last post on this topic, and there have been a lot of additions, so I figured I ought to compile a short rundown of the more notable bits.
For those new to it, Vizanti is a community maintained open source control and visualization command center for ROS 1/2 in the web browser, targeting primarily outdoor use cases like marine, field, and aerial robotics on mobile and desktop devices.
Jazzy Release
This mainly involved waiting for rosbridge and rws to update, since there was little in the way of breaking changes. So far the ros2 branch can stay common to both Humble and Jazzy, which reduces overhead for backporting changes. Stability and usability are now more or less on par with Noetic, which will continue to get equal feature updates at least until the end of 2025 as well.
Well I wrote “release” there, but there isn’t exactly an apt release for the package yet, for various popen, rws, and other reasons. TBD.
RWS - ROS Websocket Server
Since web browsers are mostly restricted to TCP and as such can’t really connect to DDS (or even ros_comm, for that matter), an intermediary server is needed to pack traffic up into WebSockets.
Originally this was handled by rosbridge_suite on ROS 1, but for a multitude of reasons it’s not nearly as performant on ROS 2. That’s why I’ve been working with v-kiniv to integrate his drop-in replacement server, RWS.
There were a few things to fix up and a feature or two to add, but after nearly a year I think we’re at a point where it’s basically there now.
While a bit more of a hassle to set up, it uses significantly fewer resources while providing much higher throughput, enabling use on low-end platforms like the Pi 4 and 5, where the ROS 2 version of rosbridge isn’t efficient enough to function under typical load.
Right now it’s the recommended backend to use with Vizanti on Humble and Jazzy. And unless you’re using client side services, probably for your roslibjs project as well.
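For context, rosbridge and RWS speak the same JSON-over-WebSocket protocol, which is what makes RWS a drop-in backend for clients like roslibjs. A client subscribes with a small JSON message like the one below (built here in Python purely for illustration):

```python
import json

def subscribe_msg(topic: str, msg_type: str, throttle_ms: int = 0) -> str:
    """Build a rosbridge-protocol subscribe request as a JSON string."""
    op = {"op": "subscribe", "topic": topic, "type": msg_type}
    if throttle_ms:
        op["throttle_rate"] = throttle_ms  # minimum ms between messages
    return json.dumps(op)

# This is the kind of frame a browser client sends over the websocket.
wire = subscribe_msg("/odom", "nav_msgs/msg/Odometry", throttle_ms=100)
print(wire)
```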
Updates
There are a handful of new widgets, and a few existing ones have been expanded to support multiple topics for better compatibility.
Grid Cells
This one is somewhat underused in Rviz in my experience, but extremely efficient for sparse grid data, such as the occasional obstacle in an open field. Comes in any 24-bit colour.
Altimeter
Yes, finally a way to see the Z axis. But it’s not just for observation: tapping the meter sends a Float32 target with the clicked value for direct depth/altitude control.
What it tracks exactly is the position of a chosen TF link relative to the fixed frame. Comes in depth (blue) and altitude (green) versions with adjustable units, but doesn’t do both at the same time. I have to apologise if any of you happen to be working on aerial submarines, but those simply won’t fly.
Folder
Is your tiny screen getting filled up with rows and rows of widgets? Well, now they can be grouped together, just like in rviz.
There are some limitations, mainly that the widget has to be created in the folder and can’t be moved around. I’m still figuring out a way to even conceptually do that UX-wise now that both the short and long click are taken, and mobile has no right click…
Speedometer
Okay this one’s a bit funky, it lets you measure the relative speed of any two tf links in various scientific (and unscientific) units. A handy thing to check if your nav stack is set to drive at the correct velocity and/or if your boat props have weeds in them.
In the gif above, the first few dials track turtle1 relative to world in various units, the next one turtle1 relative to turtle2, final one tracks turtle2 to world.
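Conceptually, such a widget boils down to a finite difference over two TF lookups. A toy sketch follows; the constants and call shape are illustrative assumptions, not Vizanti's actual code:

```python
import math

UNITS = {"m/s": 1.0, "km/h": 3.6, "knots": 1.943844}

def relative_speed(p1_then, p1_now, p2_then, p2_now, dt, unit="m/s"):
    """Finite-difference speed of link 1 relative to link 2.

    Takes two position samples of each link, dt seconds apart, and
    reports how fast their relative position changed, in chosen units.
    """
    rel_then = [a - b for a, b in zip(p1_then, p2_then)]
    rel_now = [a - b for a, b in zip(p1_now, p2_now)]
    dist = math.dist(rel_now, rel_then)
    return dist / dt * UNITS[unit]

# Two frames closing at 2 m/s along x, reported in km/h (~7.2).
print(relative_speed((0, 0, 0), (0.2, 0, 0), (5, 0, 0), (5, 0, 0), dt=0.1, unit="km/h"))
```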
General improvements
Covariance rendering is correct now:
(what do you mean it was wrong before , aaah you saw nothing)
The icons will reflect the visualizer color by recoloring the svg, so it’s easier to find which icon corresponds to what’s being rendered:
Map rendering has code-exact rendering parity with rviz in terms of colour mapping (as well as the raw display mode), except for a slightly more transparent tinge for unknown pixels, which imo looks slightly better:
TF frames now get grouped together by prefix, for clarity:
Community Contributions
A notable change to the satellite tile renderer was started by kosmonauta144 and darkhannibal, which enabled the use of other tile zoom levels, letting you see your robot from orbit:
The tiles themselves are now exportable and importable, so it’s possible to download an entire area beforehand and load it offline on another device.
Thanks to samkys, all versions now also support building and running a Docker container, for those of you trying to run the wrong versions of ROS on the wrong OS.
I also have to thank tony2guo for adding base url support, which essentially lets you namespace the browser path, and of course v-kiniv for his continuing help in integrating RWS.
Etc.
And there’s of course a lot of gradual improvements in general stability, rendering accuracy, speed, etc. There is also a github wiki now that contains everything from introductory to more esoteric info.
In Conclusion?
Well, there’s still a lot left to do: features and visualizers to add, widgets to rework, performance to improve, bugs to fix, Firefox rendering to speed up. If anyone feels like helping out or has a good idea, I’m always up for a code review.
Sorry that this is short notice, but I’m giving a presentation about ROS to a club that meets online monthly to discuss laboratory automation. I’ll warn you in advance that I am NOT an expert in ROS, but I know enough that I wanted to talk about it. If anyone is available, you can find information here: Robot Operating System: October 28 with Andy Gross
And the link is here:
Feel free to stop by and possibly correct me on anything I miscommunicate!
Please come and join us for this coming meeting at 1700-1800 UTC on Monday 4th November 2024, where Julien Enoch will be talking about Eclipse Zenoh and its offering for Cloud Connectivity.
Julien is a Senior Solutions Architect at ZettaScale Technology and an Eclipse Zenoh committer. He will talk about the Zenoh protocol, which can run everywhere, from micro-controllers to the Cloud, over TCP, UDP, QUIC, and WebSockets. Zenoh provides a software router that can be deployed in any Cloud instance, such as AWS EC2, and that can route the protocol between different subsystems. This talk will cover how this is particularly helpful for connectivity in Cloud Robotics.
Last meeting, we had a general catch-up and a discussion about the current progress of the working group. If you’re interested in seeing our discussion, please check the meeting recording.
If you are willing and able to give a talk on cloud robotics in future meetings, we would be happy to host you - please reply here, message me directly, or sign up using the Guest Speaker Signup Sheet. We will record your talk and host it on YouTube with our other meeting recordings too!