If you are a ROS developer/user and you blog about it, ROS wants those contributions on this page! All you need for that to happen is:
have an RSS/Atom blog (no Twitter/Facebook/Google+ posts)
open a pull request on the planet.ros tracker indicating your name and your RSS/Atom feed URL. (You can just edit the file and click "Propose File Change" to open a pull request.)
tag your ROS-related posts with any of the following categories: "ROS", "R.O.S.", "ros", "r.o.s."
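For reference, a post category is expressed with a `category` element in both Atom and RSS 2.0 feeds. A minimal sketch (titles are placeholders):

```xml
<!-- Atom: tag the entry so Planet ROS picks it up -->
<entry>
  <title>My ROS project update</title>
  <category term="ROS"/>
</entry>

<!-- RSS 2.0 equivalent -->
<item>
  <title>My ROS project update</title>
  <category>ROS</category>
</item>
```

Most blogging platforms generate this automatically when you assign a tag or category to a post, so you usually only need to add the tag in your blog's editor.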
Warnings
For security reasons, HTML iframe, embed, object, and JavaScript elements will be stripped out. Only YouTube videos in object and embed tags will be kept.
Guidelines
Planet ROS is one of the public faces of ROS and is read by users and potential contributors. The content remains the opinion of the bloggers but Planet ROS reserves the right to remove offensive posts.
Blogs should be related to ROS, but that does not mean they should be devoid of personal subjects and opinions: those are encouraged, since Planet ROS is a chance to learn more about ROS developers.
Posts can be positive and promote ROS, or constructive and describe issues, but should not contain useless flaming. We want to keep ROS welcoming :)
ROS covers a wide variety of people and cultures. Profanities, prejudice, lewd comments and content likely to offend are to be avoided. Do not make personal attacks or attacks against other projects on your blog.
Suggestions?
If you find any bug or have any suggestion, please file a bug on the planet.ros tracker.
I’m conducting a study on skill development in autonomous robotics, specifically focusing on what students (bachelor’s, master’s, or PhD level) learn through hands-on project work. This study centers on practical experiences in projects like final assignments, student groups, and research initiatives—not typical classroom content.
If you’re currently a student involved in UAV or drone projects as part of your studies, I’d be grateful for your insights. This survey aims to uncover the unique skills and knowledge that students develop through direct, project-based work in aerial and autonomous systems.
One interesting thing about this application, and the others in the repo, is that they’re built using NVIDIA Isaac ROS Dev Containers. Which are totally awesome.
I know it’s an old platform these days, but one of my workhorses is a Kobuki/Turtlebot 2. I’d like to build a couple more charging bases for it, since they can’t be bought any more. Does anyone know of a schematic for it, specifically the charging circuits?
The videos from ROSCon 2024 in Odense are now available on the ROSCon Website (see the program), this Vimeo showcase, and in the ROS documentation. The ROSCon website also includes the slides from all the talks at ROSCon. I have also included a list of all the videos below.
I want to thank AMD for being our 2024 ROSCon video sponsor, their generous support makes the ROSCon live stream and videos possible.
We would like to present our newly open-sourced ROS 2 Package Generator, simply called ros2-pkg-create. It supports C++ and Python nodes as well as advanced features such as C++ components or lifecycle nodes.
ros2-pkg-create is an interactive CLI tool for quickly generating ROS 2 packages, from basic pub/sub nodes to complex lifecycle components. It is meant to replace the official ros2 pkg create command.
You can either control all options directly through command-line arguments or walk through the interactive questionnaire, so there is no need to memorize the available options.
ros2-pkg-create can generate ROS 2 C++ Packages, Python Packages, and Interfaces Packages. The supported features include:
C++ Package: publisher, subscriber, parameter loading, launch file, service server, action server, timer callback, component, lifecycle node, docker-ros
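As a sketch of how such a generator is typically invoked (the installation method and flag names below are assumptions for illustration — check the project's README for the real interface):

```shell
# Assumed to be installable from PyPI
pip install ros2-pkg-create

# Non-interactive: pass options as flags (names illustrative)
ros2-pkg-create --template ros2_cpp_pkg --package-name my_driver ./src

# Interactive: omit the flags to walk through the questionnaire
ros2-pkg-create ./src
```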
Please come and join us for this coming meeting at 1700-1800 UTC on Monday 21st November 2024, which will be a general catch-up! If you’re passionate about robotics or the cloud, come and say hello. We plan to discuss the latest news from the group and from the world of Cloud Robotics.
Last meeting, we had a guest talk from Julien Enoch on Eclipse Zenoh. If you’re interested in seeing the talk, we have published it on YouTube.
If you are willing and able to give a talk on cloud robotics in future meetings, we would be happy to host you - please reply here, message me directly, or sign up using the Guest Speaker Signup Sheet. We will record your talk and host it on YouTube with our other meeting recordings too!
My apologies for the late notice on this particular meeting. I have been off sick, and just now returned on the day of the meeting. Future meetings will have more notice, and the Meetings page on the Cloud Robotics Hub will always be the first to be updated with future meetings.
Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:
Here’s your regular reminder that Gazebo Classic goes end of life in JANUARY and ALL OF ROS 1, INCLUDING NOETIC, GOES END OF LIFE NEXT MAY. We’re working on a few surprises internally to motivate the last stragglers to move over to ROS 2 and modern Gazebo.
This week some of our colleagues in Japan released Space Station OS based on ROS. The idea here is to create a standard ROS interface to space stations to enable robotic space station tending.
We developed an open-source system that simplifies the training of robots to perform tasks through imitation learning. Our setup allows you to collect data, control robots, and train models in both real and simulated environments.
The system is built on the Diffusion Policy model and the ROS2 framework. To help you get started, we provide a pre-trained model and datasets, but you can also collect your own data for customized training.
We believe that imitation learning can be useful in dynamic environments where object positions can change, and our project aims to simplify this imitation learning process.
We invite you to explore its capabilities and contribute to its growth! If you’re interested in training robots to perform tasks using imitation learning, check it out and let us know what you think!
I have read the (now closed) topic: Call For Testing: Standards-based Python packaging with colcon. But it seems not much has happened in the linked repo since then. I have some co-workers who are asking why they need to use setup.py to use ROS, so I’m investigating for them. Should I try using this colcon-python-project Colcon extension? Or was the effort abandoned? Should I be using something else?
I also found this Poetry Colcon extension. I feel indifferent about Poetry, but could try it out. I would appreciate any thoughts about it.
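For context, moving off setup.py generally means a PEP 621-style pyproject.toml. The sketch below shows what that might look like for a ROS 2 Python package — the package name, entry point, and whether a given colcon extension fully supports this layout are assumptions to verify against the colcon-python-project README:

```toml
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"

[project]
name = "my_ros_package"          # hypothetical package name
version = "0.1.0"
description = "Example ROS 2 Python package using pyproject.toml"
dependencies = []                # ROS deps usually come via rosdep, not pip

[project.scripts]
my_node = "my_ros_package.my_node:main"  # hypothetical console entry point
```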
The ROS-Industrial Consortium Asia Pacific Annual Summit 2024, themed "Robotics in the Age of AI," concluded successfully, marking a significant milestone for the robotics and automation sector in Asia Pacific. Hosted by the ROS-Industrial Consortium Asia Pacific (RIC Asia Pacific) and managed by the Advanced Remanufacturing and Technology Center (ARTC), the summit brought together over 150 international participants, including industry leaders, researchers, and innovators, who gathered to explore the impact of AI-powered robotics on industries.
This year’s summit featured an impressive agenda packed with expert talks, hands-on masterclasses, and an engaging tech marketplace, all designed to highlight how AI is transforming robotics across industries, especially in manufacturing, logistics, and physical security.
Day 1 Highlights
The first day of the summit opened with a series of insightful presentations by leaders from major organizations in the AI and robotics space.
Amazon Web Services (AWS) set the tone with a talk titled "AI and Robotics-Driven Innovations for Manufacturing Excellence." Presented by Mirela Juravle, Data and AI Specialist Lead at AWS, the session showcased how robotics and AI are being leveraged to support digital transformation efforts toward achieving "lights-out" manufacturing, a concept where factories operate autonomously without human intervention. AWS shared insights on the benefits of cloud services in scaling robotic operations and optimizing workflows for increased efficiency in manufacturing.
Next, Mitsuharu Sonehara, Manager at IHI Corporation’s Technology Planning Department, presented on IHI's vision of advancing robotics and automation within logistics. Titled "From Deep Sea to Deep Space," his talk explored IHI's work in using robotics for high-stakes applications, from underwater operations to outer space logistics. He detailed the challenges and opportunities in these extreme environments and highlighted how AI and robotics can revolutionize logistics operations of the future.
Swarooph Nirmal Seshadri, Chief Technology Officer at Kabam Robotics, shared insights on the transformation of physical security through robotics, intelligence, and connectivity. His session explored how AI-driven robots are becoming crucial in the security industry, enabling smarter monitoring, data gathering, and response systems that are safer and more efficient.
Dr. Dikai Liu, a solution architect from NVIDIA, followed with an exciting presentation on NVIDIA’s suite of services, which facilitates the acceleration of robotics and AI from simulation to real-world deployment. NVIDIA’s tools and platforms empower developers to simulate complex environments and rapidly prototype AI algorithms for robotics, ultimately shortening the timeline from concept to deployment.
An important announcement on ROS2 compatibility came from Mr. Steven Chong, Senior Business Development Manager at Mitsubishi Electric Asia. He announced the release of a ROS2 driver for the MELFA robot, enabling broader integration with the ROS ecosystem. This advancement allows Mitsubishi Electric’s industrial robots to seamlessly integrate with ROS2, opening up new possibilities for automation in various industries. More details can be found on the ROS-Industrial blog here.
Wrapping up the first day’s talks, Dr. Yang Geng, Assistant Director at IMDA, presented on "Embodied AI as the Next Leap in GenAI." He described how embodied AI, which focuses on giving AI systems a physical presence, can revolutionize industries, from customer service robots to healthcare assistants, by enhancing interactions and adaptability through AI.
Day 2 Highlights
The second day of the summit was equally informative, beginning with a presentation by Matt Robinson, Consortium Manager at ROS-Industrial Consortium America. He discussed the collective global effort to standardize ROS2 drivers, aiming to establish ROS2 as the default industrial standard for robotics software. Robinson emphasized the benefits of this standardization for interoperability and efficiency in automation.
Following Robinson, Vishnuprasad Prachandabhanu, Consortium Manager of ROS-Industrial Consortium Europe, shared ongoing efforts to implement ROS2 across various applications. He highlighted a substantial EU-backed initiative with a €60 million funding commitment toward developing AI for robotics across Europe, signifying a significant investment in the advancement of open-source robotics.
Mr. Eugene Goh of JM Vistec presented next, offering insights on the integration of vision systems in robotics. His talk emphasized how JM Vistec enables robots to "see," enhancing precision and capability in industrial tasks, from quality inspection to object recognition.
Concluding the speaker sessions, Dr. Kenneth Kwok from IHPC shared cutting-edge research on enabling human-robot collaboration, powered by AI. His session emphasized the importance of human-centered AI in creating safe, efficient, and collaborative environments where robots can work alongside humans in factories, warehouses, and more.
Masterclass Lineup
Participants then moved to the masterclass sessions, which provided hands-on learning experiences across various aspects of AI and robotics. Each session was designed to deepen the practical knowledge and technical skills required for integrating AI with robotics.
Empowering Innovations with MELSOFT: Led by Mr. Liu Muyao, Software Application Engineer from Mitsubishi Electric Asia, this session focused on MELSOFT, Mitsubishi's integrated software environment, which enhances the control and flexibility of industrial robots.
Introduction to Reinforcement Learning for Robot Arm Manipulation: Hosted by Mr. Shalman Khan, Mr. Santosh Balaji, and Ms. Mercedes Ramesh from ROS-Industrial Consortium Asia Pacific, this session introduced reinforcement learning principles, showing participants how to apply these techniques to control robotic arms more effectively.
Introduction to Deep Learning with YOLO and ROS: Dr. Carlos Acosta, Robotics Specialist at Singapore Polytechnic, led a session on utilizing YOLO (You Only Look Once) with ROS for object detection. The masterclass offered participants a foundation in integrating deep learning algorithms with ROS to enhance robotic vision applications.
Introduction to Fleet Management with Open-RMF: This session, led by Dr. Ricardo Tellez, CEO of The Construct, demonstrated Open-RMF (Open Robotics Middleware Framework) for multi-robot fleet management. Participants learned how to manage multiple robots collaboratively, a critical capability for applications in large facilities like hospitals and factories.
Tech Marketplace Highlights
The tech marketplace featured a diverse array of participants, including Megazo, Kabam Robotics, IHPC, Parasoft, and Pepperl+Fuchs. Each company showcased their latest innovations, giving attendees a firsthand look at cutting-edge robotics solutions and AI-driven technologies designed to tackle challenges in industries like manufacturing, logistics, and safety. The marketplace provided a vibrant space for networking, collaboration, and discovering new tools that could redefine industrial automation.
The RIC-AP Annual Summit 2024 also announced an exciting event for 2025: Singapore will welcome ROSCon 2025, the largest ROS conference. This will be the first time ROSCon is hosted in Singapore.
Finally, on behalf of everyone at ROS-Industrial Consortium Asia Pacific, we would like to thank all participants and delegates for their enthusiasm, and we look forward to the RIC-AP Annual Summit 2025.
Mitsubishi Electric aims to integrate their MELFA robots into the ROS2 ecosystem, allowing robotics developers and integrators to utilize their industry proven platform seamlessly in ROS-based applications.
By developing MELFA ROS2 packages, Mitsubishi Electric seeks to enable developers to leverage the flexibility, modularity, and extensive support of the ROS 2 community, coupled with proven global hardware support.
MELFA ROS2 Driver is a collaborative effort between ROS-I Consortium Asia Pacific and Mitsubishi Electric Asia. It consists of modular components that allow users to interface with the robot’s motion control, state monitoring, and digital/analog I/O operations within the ros2_control framework. This development bridges the gap between Mitsubishi Electric automation hardware and ROS 2, providing developers with the tools needed to build, deploy, and manage robotic applications effectively on an industrial platform.
MELFA ROS2 Driver I/O controllers enable cyclic communication between ROS 2 and MELFA. Developers can leverage the iQ Platform through MELFA ROS2 Driver to access other Mitsubishi Electric automation products (such as PLCs, HMIs, motor drives, and NC machines), utilize industrial networks (such as CC-Link, PROFINET, EtherCAT, EtherNet/IP, DeviceNet, etc.), and explore time-sensitive networking (such as CC-Link IE TSN).
MELFA ROS2 Driver is designed for flexibility, supporting various ROS2 packages such as MoveIt2 for motion planning and custom nodes for specialized tasks.
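As an illustration of what a ros2_control-based driver setup typically involves (the controller names, joint names, and parameters below are generic placeholders, not taken from the MELFA driver's actual configuration):

```yaml
# Generic ros2_control controller configuration (illustrative only)
controller_manager:
  ros__parameters:
    update_rate: 100  # Hz

    joint_trajectory_controller:
      type: joint_trajectory_controller/JointTrajectoryController

    joint_state_broadcaster:
      type: joint_state_broadcaster/JointStateBroadcaster

joint_trajectory_controller:
  ros__parameters:
    joints: [joint_1, joint_2, joint_3, joint_4, joint_5, joint_6]
    command_interfaces: [position]
    state_interfaces: [position, velocity]
```

A configuration along these lines is what lets MoveIt 2 send planned trajectories to the hardware through the standard controller interfaces.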
MELFA ROS2 Driver will officially support 9 models in the first batch, with the aim of supporting more than 20 models in the near future.
Space Station OS (SSOS) is an open-source development platform for space stations, built on ROS 2 to support interoperability, modularity, and scalability across various space station projects.
By unifying core functions like thermal control, power, and life support into reusable modules, Space Station OS provides a universal environment that allows engineers globally to develop and operate space stations collaboratively. This enables rapid innovation and cross-mission compatibility, lowering development costs and enhancing sustainable space station operations.
Space Station OS represents a global effort to democratize space station technology, welcoming contributions from the international aerospace and robotics communities.
ros2_controllers are integrated into RobotCAD 4.0.0, letting you make a controllable diff-drive car, manipulator, etc. right from the RobotCAD GUI without programming.
The PX4 autopilot was also integrated some time ago. Together, these make it possible to build an aerial/land-riding robot with a manipulator in one hour. See the functionality demonstration.
Moreover, you can extend the RobotCAD controllers by adding your own generic controller to the ros2_controllers repository via pull request (or locally on your computer). After that, you will be able to construct your specific controllable tool via the RobotCAD GUI and automatically generate the ROS 2 package code and a Docker setup for it.
ROSCon 2024 is in the bag and most of the team is finally home! The videos should be up on the web soon (please don’t ask when, they’ll be up as soon as we can get them edited and uploaded).
@Anis_Koubaa has put together a comprehensive survey on ROS 2. The survey includes an amazing webpage that allows you to search through ROS papers and is worth bookmarking.
Our friends at ROS Industrial had their annual European and Asian consortium meetings over the past two weeks. One of the big takeaways is the new Mitsubishi MELFA ROS 2 driver. You can check out the source code here.
This is an Autonomous Mobile Robot (AMR) that assists with picking through collaboration with humans, aiming to reduce labor. AMRs, positioned as the next generation of Automatic Guided Vehicles (AGVs), can travel without guides and move autonomously while avoiding people and obstacles.
These robotics packages are already running in the market, in actual workplaces in Japan!
System Overview
The system is constructed from two components: a Fleet Management System and a Robot Navigation System.
All of these proprietary applications are built on top of ROS 2 Humble and Fast DDS.
The system handles many tasks, such as map creation, autonomous navigation, and device management. Based on our experience, especially with edge IoT and embedded devices, we have developed a stable and robust system for the robots.
ROS Open Source Eco-System in Sony
Sony has been contributing to the ROS mainline based on requirements from its business applications, which in turn lets us use ROS as a user in those applications. We really appreciate the ROS community and ROS open source, and we will keep it up.
As some of you may know I’ve been on the board of the Open Source Hardware Association for some time now. The OSHWA team is really busy with an exciting new NSF project at the moment so I am stepping up to help with organizing this year’s Open Hardware Summit. I presented a Lightning Talk at ROSCon about this year’s summit and the response was so positive I figured I should also make an announcement on ROS Discourse. I might also be in the process of planning a ROS meetup in Edinburgh. If that’s something you would be interested in helping with please reach out to me directly via DM.
Open Hardware Summit 2025
This year’s Open Hardware Summit will be held in Edinburgh, Scotland, from 2025-05-29T23:00:00Z to 2025-05-30T23:00:00Z, and tickets just went on sale. If you’ve never been to Open Hardware Summit, you can get a taste of the event by watching our 2024 Summit Live Stream. Our keynote this year was from Danielle Boyer, an Ojibwe roboticist who builds open hardware educational robots that teach students their indigenous languages.
OSHWA currently has an open call for proposals for this year’s Open Hardware Summit. If you have an open source hardware robot, or an interesting open hardware project that you would like to share, please consider submitting a talk or workshop proposal! Applications are due by 2024-12-22T08:00:00Z UTC.
The OSHWA certification website is a gold mine of close to 3000 open hardware projects that you are free to study and use as part of your robotics project. The certification website currently includes 19 different motor drivers (like this stepper driver and this Grove Motor Driver from TU Delft), 229 different robots (such as the NASA JPL Rover), and 312 different sensors (like this line sensor, and this pneumatics controller). I recommend you bookmark the certification website for easy reference!
Credits go to Abdulrahman S. Al-Batati for the great efforts in gathering this volume of related works and also in building the first repository of ROS/ROS2 publications available at:
This study stands as the most comprehensive survey to date on the transition from ROS 1 to ROS 2, offering a deep dive into the enhancements, challenges, and future directions for ROS 2.
Our analysis covers:
Real-time capabilities
Enhanced modularity
Security improvements
Middleware and distributed systems
Multi-robot system applications
We carefully analyzed 7,498 ROS-related articles, with a focused review of 431 ROS 2-specific publications, making this a key resource for researchers, developers, and enthusiasts in the ROS community.
Our goal is to provide a cohesive synthesis that helps deepen the understanding of ROS 2’s contributions and guides future research in robotic systems design.
Join us in exploring the potential of ROS 2 and shaping the future of robotics!
Our company GEEAR (full announcement soon) has been developing FIREBRINGER - a generalised autonomous navigation algorithm implemented on ROS2 for field-robotic applications.
The algorithm is meant to support a variety of missions from simple point-to-point travelling and trajectory tracking, to more advanced interaction with third-party vehicles and our favourite - real-world dense robotic swarms (in the full academic sense of emergent collaborative behaviour). We will continue working on populating our library of mission types/functionalities so you can expect more in the future.
So far we have validated FIREBRINGER on real-world boats and ground vehicles, and on copters and fixed-wing aircraft in Gazebo (real-world experiments coming soon). You can find our recent video unveiling our autonomous vessel prototype here.
The algorithm is meant to be easy to use and tune, only requiring basic physical characteristics of the robot (maximum linear/angular velocity and maximum acceleration for each degree of freedom of the robot), and it offers plug-n-play functionality when combined with an Ardupilot autopilot through MAVROS.
It is based on a lightweight, robust NMPC-variant integrated with artificial potential field theory. It can typically run at 100Hz on a normal PC and 25Hz on a RPi4B for all vehicle types. So far, it can receive topics regarding the location of other vehicles (third-party and collaborative), trajectory (e.g., NAV2 global path), destination (complete pose), and various other mission specific info. We are currently working on incorporating point-cloud/costmap information for surrounding obstacle avoidance. Our aim is to allow integration with any ROS2 perception package that may be used in field robotics. The algorithm outputs an optimal velocity for each degree of freedom so it needs to be combined with appropriate low-level actuation control (or connected to a well-tuned Ardupilot autopilot which will do the work).
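The artificial-potential-field half of such an approach can be illustrated in isolation. The sketch below is a generic 2D potential-field velocity law, not GEEAR's actual implementation; the function name, gains, and saturation scheme are invented for illustration:

```python
import math

def potential_field_velocity(pos, goal, obstacles,
                             k_att=1.0, k_rep=0.5,
                             influence=2.0, v_max=1.0):
    """Compute a 2D velocity command from an artificial potential field.

    pos, goal: (x, y) tuples; obstacles: list of (x, y) tuples.
    The attractive term pulls toward the goal; each obstacle within
    `influence` metres adds a repulsive term. The result is saturated
    to `v_max`, mirroring the "maximum velocity per degree of freedom"
    tuning described above.
    """
    # Attractive component: proportional to the vector toward the goal.
    vx = k_att * (goal[0] - pos[0])
    vy = k_att * (goal[1] - pos[1])

    # Repulsive components: push away from nearby obstacles.
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 1e-6 < d < influence:
            # Gain grows as the obstacle gets closer.
            gain = k_rep * (1.0 / d - 1.0 / influence) / d**2
            vx += gain * dx / d
            vy += gain * dy / d

    # Saturate to the robot's maximum linear velocity.
    speed = math.hypot(vx, vy)
    if speed > v_max:
        vx, vy = vx * v_max / speed, vy * v_max / speed
    return vx, vy
```

In an NMPC-based scheme like the one described, a term of this kind would typically appear inside the optimizer's cost function rather than being applied directly, but the sketch shows the basic attraction/repulsion trade-off.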
We are currently considering creating and publishing a complete ROS2 package for the algorithm and we may propose some type of fusion with NAV2 in the future, but we wanted to gauge interest first. Obviously we are open to ideas and recommendations (and some criticism)!
The Infrastructure Project Management Committee meeting has been on the OSRF Official Events calendar since we made the switch from the earlier Infrastructure Project Committee to the OSRA-chartered Infrastructure PMC.
When I created the event, I accidentally did so using my local timezone instead of pinning it to UTC. As a result, the meeting time was shown an hour later than in previous weeks once the US clocks changed. Another PMC member noticed and reported this, and I corrected the event.
My apologies to observers who joined an hour later, when the meeting time was originally posted.
That which we call an API by any other name would serve just as well. But when there are many diverging opinions on exactly what names to use or exactly how to structure things, we may end up scattering our efforts instead of working together.
The OSRA TGC opened a technical committee to examine the ROS Enhancement Proposal (REP) process to refine it and apply it to all of the projects that are governed by the OSRA, including Open-RMF. The PMC of Open-RMF sees this as an opportunity to formalize something similar to a Request For Comments (RFC) process for Open-RMF where we can define interfaces for our next generation development and receive comments from the public, especially from stakeholders, to make sure that we’ll be developing the most useful tools possible for everyone. This would also define a formal way for anyone from the general public to propose new interfaces for Open-RMF to support.
At this session of the special interest group, Grey will present a working draft of what kind of process we are considering for Open-RMF.
We are eager to get feedback from the Open-RMF community and beyond about how to make this process as effective as possible. Effectiveness would be measured both in terms of making sure we are producing the highest quality stable interfaces possible and also getting contributions (proposals, feedback, and implementation) from as many community stakeholders as possible.
After presenting our current ideas, we want to open up the session to discussion and feedback, so please come ready to share your own thoughts. We will especially appreciate input from anyone who has experience, positive or negative, working with standardization bodies.