November 20, 2024
Seeking Feedback from Students with Autonomous Robotics Project Experience

Hello everyone,

I’m conducting a study on skill development in autonomous robotics, specifically focusing on what students (bachelor’s, master’s, or PhD level) learn through hands-on project work. This study centers on practical experience gained in settings such as final-year assignments, student teams, and research initiatives, rather than typical classroom content.

If you’re currently a student involved in UAV or drone projects as part of your studies, I’d be grateful for your insights. This survey aims to uncover the unique skills and knowledge that students develop through direct, project-based work in aerial and autonomous systems.

Survey Link: https://forms.gle/ncuaduBtkcMWGfGs7

Thank you for your time and for sharing this with anyone who may be interested!

1 post - 1 participant

Read full topic

by Steghide on November 20, 2024 05:46 PM

Another boring SMACC demo in IsaacSim

Hi Everyone,
Long time no see.

We released another demo video in Isaac Sim back in September on LinkedIn. Hope you like it.

This is what an autonomous application demo should look like.

As always, here’s the source code:

One interesting thing about this application, and the others in the repo, is that they’re built using NVIDIA Isaac ROS Dev Containers, which are totally awesome.

https://nvidia-isaac-ros.github.io/getting_started/dev_env_setup.html

The readme for this application lays out the steps for using them.

I also gave a talk about the application at the NVIDIA Jetson AI Research Group last week…

JETSON AI LAB | SMACC State Machines in ROS2 & Kaya sim2real workflow (11/12/2024)

I’ve been really impressed with the Jetson AI Research Group and some of the really awesome people involved.

Cheers,

1 post - 1 participant

Read full topic

by brettpac on November 20, 2024 04:23 AM

November 19, 2024
Next Client Library WG Meeting: Friday 22nd November 2024

Hi,
The next meeting of the Client Library Working Group will be this Friday, 22nd November 2024 at 8 AM Pacific Time.

The agenda for now includes:

Everyone is welcome to join.
If you have topics you want to discuss, feel free to post them ahead of time in this thread.

3 posts - 2 participants

Read full topic

by alsora on November 19, 2024 04:54 PM

Kobuki Charging Station Schematic (TurtleBot 2)?

Hi,

I know it’s an old platform these days, but one of my workhorses is a Kobuki/TurtleBot 2. I’d like to build a couple more charging bases for it, since they can’t be bought any more. Does anyone know of a schematic for it, the charging circuit specifically?

Thanks

Mark

1 post - 1 participant

Read full topic

by CycleMark on November 19, 2024 04:09 PM

November 18, 2024
ROSCon 2024 Videos are Now Available

ROSCon 2024 Videos are Now Available

Hi Everyone,

The videos from ROSCon 2024 in Odense are now available on the ROSCon Website (see the program), this Vimeo showcase, and in the ROS documentation. The ROSCon website also includes the slides from all the talks at ROSCon. I have also included a list of all the videos below.

I want to thank AMD for being our 2024 ROSCon video sponsor; their generous support makes the ROSCon live stream and videos possible.

https://vimeo.com/1024971401?app_id=122963

2024 ROSCon Talks

3 posts - 2 participants

Read full topic

by Katherine_Scott on November 18, 2024 05:49 PM

Introducing ros2-pkg-create - New Powerful ROS 2 Package Generator

We would like to present our newly open-sourced ROS 2 Package Generator, simply called ros2-pkg-create. It supports C++ and Python nodes as well as advanced features such as C++ components or lifecycle nodes.

Instantly try it out by running:

pip install ros2-pkg-create
ros2-pkg-create --template ros2_cpp_pkg .

Given the recent interest in Turtle Nest, we felt it’s about time to open-source our own take on a ROS 2 package generator, which we use for all of our new packages at the Institute for Automotive Engineering (ika) at RWTH Aachen University.

ros2-pkg-create is an interactive CLI tool for quickly generating ROS 2 packages from basic pub/sub nodes to complex lifecycle components. It is meant to replace the official ros2 pkg create command.

You can either control all options directly through command-line arguments or walk through the interactive questionnaire, so there is no need to memorize the available options.

ros2-pkg-create can generate ROS 2 C++ Packages, Python Packages, and Interfaces Packages. The supported features include:

  • C++ Package: publisher, subscriber, parameter loading, launch file, service server, action server, timer callback, component, lifecycle node, docker-ros
  • Python Package: publisher, subscriber, parameter loading, launch file, service server, action server, timer callback, docker-ros
  • Interfaces Package: message, service, action

Under the hood, the templates are implemented using the Jinja templating engine, which allows for easy customizability.
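
As a generic illustration of how Jinja rendering works (the template string and variable names below are made up and are not ros2-pkg-create’s actual templates):

# Generic sketch of Jinja-based file generation; the template text and the
# variables class_name/node_name are illustrative placeholders only.
from jinja2 import Template

node_template = Template(
    "#include <rclcpp/rclcpp.hpp>\n"
    "class {{ class_name }} : public rclcpp::Node {\n"
    "public:\n"
    "  {{ class_name }}() : Node(\"{{ node_name }}\") {}\n"
    "};\n"
)

# Render the template with concrete values to produce a source file's contents.
print(node_template.render(class_name="MyNode", node_name="my_node"))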

We are very much looking forward to your feedback!

Best,
Lennart from Aachen

6 posts - 3 participants

Read full topic

by lreiher on November 18, 2024 10:12 AM

Cloud Robotics WG Meeting 2024-11-18

Please come and join us for this coming meeting at 1700-1800 UTC on Monday 18th November 2024, which will be a general catch-up! If you’re passionate about robotics or the cloud, come and say hello. We plan to discuss the latest news from the group and from the world of cloud robotics.

Last meeting, we had a guest talk from Julien Enoch on Eclipse Zenoh. If you’re interested in seeing the talk, we have published it on YouTube.

If you are willing and able to give a talk on cloud robotics in future meetings, we would be happy to host you - please reply here, message me directly, or sign up using the Guest Speaker Signup Sheet. We will record your talk and host it on YouTube with our other meeting recordings too!

The meeting link is here, and you can sign up to our calendar or our Google Group for meeting notifications.

My apologies for the late notice on this particular meeting. I have been off sick, and just now returned on the day of the meeting. Future meetings will have more notice, and the Meetings page on the Cloud Robotics Hub will always be the first to be updated with future meetings.

Hopefully we will see you there!

1 post - 1 participant

Read full topic

by mikelikesrobots on November 18, 2024 09:14 AM

November 16, 2024
New Packages for Noetic 2024-11-15

We’re happy to announce 0 new packages and 40 updates are now available in ROS Noetic. This sync was tagged as noetic/2024-11-15.

Thank you to every maintainer and contributor who made these updates available!

Package Updates for ROS Noetic

Added Packages [0]:

Updated Packages [40]:

Removed Packages [0]:

Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:

  • Analog Devices
  • Atsushi Watanabe
  • Jose-Luis Blanco-Claraco
  • Martin Pecka
  • Rob Fisher
  • Stefan Laible

1 post - 1 participant

Read full topic

by sloretz on November 16, 2024 01:34 AM

November 15, 2024
ROS News for the Week for November 11th, 2024

ROS News for the Week for November 11th, 2024



We’re preparing for the final Iron Irwini patch and sync. If you are still using Iron, now is the time to upgrade to Jazzy.

:skull: Here’s your regular reminder that Gazebo Classic goes end of life in JANUARY and ALL OF ROS 1, INCLUDING NOETIC, GOES END OF LIFE NEXT MAY. We’re working on a few surprises internally to motivate the last stragglers to move over to ROS 2 and modern Gazebo.


Check out this amazing pick and place demo using ROS 2 on Hackster. The project makes use of an Arduino Braccio++ Robotic Arm, a Luxonis depth camera, ROS 2, microROS, and Edge Impulse.



This week some of our colleagues in Japan released Space Station OS based on ROS. The idea here is to create a standard ROS interface to space stations to enable robotic space station tending.



Clearpath’s amazing "Demystifying ROS 2 Networking" workshop at ROSCon includes a network issue debugging flowchart that is worth sharing.

ROSCon 2024 videos and slides will be released Monday. :wink:

Events

News

ROS

Got a Minute?

Answering just one question a week on Robotics Stack Exchange would really help out the ROS community!

3 posts - 3 participants

Read full topic

by Katherine_Scott on November 15, 2024 09:10 PM

LBR-Stack paper just published in Journal of Open Source Software (JOSS)

I noticed this paper come through the JOSS Mastodon feed:

LBR-Stack: ROS 2 and Python Integration of KUKA FRI for Med and IIWA Robots

The authors submitted the paper back in 2023, but it was finally published today. Congrats to them for getting it over the line!

1 post - 1 participant

Read full topic

by vmb on November 15, 2024 01:06 PM

November 14, 2024
Teaching Robots to Perform Tasks: Our Open-Source Project for Imitation Learning!


We developed an open-source system that simplifies the training of robots to perform tasks through imitation learning. Our setup allows you to collect data, control robots, and train models in both real and simulated environments.

The system is built on the Diffusion Policy model and the ROS2 framework. To help you get started, we provide a pre-trained model and datasets, but you can also collect your own data for customized training.
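
As a minimal sketch of what recording demonstration data with ROS 2 can look like (the topic name, message type, and in-memory buffer below are assumptions for illustration; the project’s own data-collection tooling may differ):

# Hypothetical demonstration recorder: subscribes to joint states and buffers
# timestamped positions that could later form an imitation-learning dataset.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import JointState


class DemoRecorder(Node):
    def __init__(self):
        super().__init__('demo_recorder')
        self.samples = []  # list of (timestamp [s], joint positions)
        self.create_subscription(JointState, '/joint_states', self.on_joints, 10)

    def on_joints(self, msg):
        t = self.get_clock().now().nanoseconds * 1e-9
        self.samples.append((t, list(msg.position)))


def main():
    rclpy.init()
    node = DemoRecorder()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == '__main__':
    main()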

We believe that imitation learning can be useful in dynamic environments where object positions can change, and our project aims to simplify the imitation learning process.
We invite you to explore its capabilities and contribute to its growth! If you’re interested in training robots to perform tasks using imitation learning, check it out and let us know what you think!

1 post - 1 participant

Read full topic

by Marija_Golubovic on November 14, 2024 07:01 PM

November 12, 2024
Status of Colcon building "standards-based" Python packages?

I have read the (now closed) topic: Call For Testing: Standards-based Python packaging with colcon. But it seems not much has happened in the linked repo since then. I have some co-workers who are asking why they need to use setup.py to use ROS, so I’m investigating for them. Should I try using this colcon-python-project Colcon extension? Or was the effort abandoned? Should I be using something else?
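
For context, the setup.py boilerplate in question for a typical ament_python package looks roughly like the following (the package and node names are placeholders):

# Typical ament_python boilerplate that a standards-based approach would
# replace; 'my_pkg' and 'my_node' are hypothetical names.
from setuptools import setup

package_name = 'my_pkg'

setup(
    name=package_name,
    version='0.0.1',
    packages=[package_name],
    data_files=[
        ('share/ament_index/resource_index/packages', ['resource/' + package_name]),
        ('share/' + package_name, ['package.xml']),
    ],
    install_requires=['setuptools'],
    entry_points={
        'console_scripts': ['my_node = my_pkg.my_node:main'],
    },
)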

I also found this Poetry Colcon extension. I feel indifferent about Poetry, but could try it out. I would appreciate any sharing of thoughts about it.

12 posts - 9 participants

Read full topic

by taughz on November 12, 2024 08:03 PM

ROS-Industrial Consortium Asia Pacific Annual Summit 2024

The ROS-Industrial Consortium Asia Pacific Annual Summit 2024, themed "Robotics in the Age of AI," concluded successfully, marking a significant milestone for the robotics and automation sector in Asia Pacific. Hosted by the ROS-Industrial Consortium Asia Pacific (RIC Asia Pacific) and managed by the Advanced Remanufacturing and Technology Center (ARTC), the summit brought together over 150 international participants, including industry leaders, researchers, and innovators, who gathered to explore the impact of AI-powered robotics on industries.

This year’s summit featured an impressive agenda packed with expert talks, hands-on masterclasses, and an engaging tech marketplace, all designed to highlight how AI is transforming robotics across industries, especially in manufacturing, logistics, and physical security.

Day 1 Highlights

The first day of the summit opened with a series of insightful presentations by leaders from major organizations in the AI and robotics space.

Amazon Web Services (AWS) set the tone with a talk titled "AI and Robotics-Driven Innovations for Manufacturing Excellence." Presented by Mirela Juravle, Data and AI Specialist Lead at AWS, the session showcased how robotics and AI are being leveraged to support digital transformation efforts toward achieving "lights-out" manufacturing, a concept where factories operate autonomously without human intervention. AWS shared insights on the benefits of cloud services in scaling robotic operations and optimizing workflows for increased efficiency in manufacturing.

Next, Mitsuharu Sonehara, Manager at IHI Corporation’s Technology Planning Department, presented on IHI's vision of advancing robotics and automation within logistics. Titled "From Deep Sea to Deep Space," his talk explored IHI's work in using robotics for high-stakes applications, from underwater operations to outer space logistics. He detailed the challenges and opportunities in these extreme environments and highlighted how AI and robotics can revolutionize logistics operations of the future.

Swarooph Nirmal Seshadri, Chief Technology Officer at Kabam Robotics, shared insights on the transformation of physical security through robotics, intelligence, and connectivity. His session explored how AI-driven robots are becoming crucial in the security industry, enabling smarter monitoring, data gathering, and response systems that are safer and more efficient.

Dr. Dikai Liu, a solution architect from NVIDIA, followed with an exciting presentation on NVIDIA’s suite of services, which facilitates the acceleration of robotics and AI from simulation to real-world deployment. NVIDIA’s tools and platforms empower developers to simulate complex environments and rapidly prototype AI algorithms for robotics, ultimately shortening the timeline from concept to deployment.

An important announcement on ROS2 compatibility came from Mr. Steven Chong, Senior Business Development Manager at Mitsubishi Electric Asia. He announced the release of a ROS2 driver for the MELFA robot, enabling broader integration with the ROS ecosystem. This advancement allows Mitsubishi Electric’s industrial robots to seamlessly integrate with ROS2, opening up new possibilities for automation in various industries. More details can be found on the ROS-Industrial blog here.

Wrapping up the first day’s talks, Dr. Yang Geng, Assistant Director at IMDA, presented on "Embodied AI as the Next Leap in GenAI." He described how embodied AI, which focuses on giving AI systems a physical presence, can revolutionize industries, from customer service robots to healthcare assistants, by enhancing interactions and adaptability through AI.

Day 2 Highlights

The second day of the summit was equally informative, beginning with a presentation by Matt Robinson, Consortium Manager at ROS-Industrial Consortium America. He discussed the collective global effort to standardize ROS2 drivers, aiming to establish ROS2 as the default industrial standard for robotics software. Robinson emphasized the benefits of this standardization for interoperability and efficiency in automation.

Following Robinson, Vishnuprasad Prachandabhanu, Consortium Manager of ROS-Industrial Consortium Europe, shared ongoing efforts to implement ROS2 across various applications. He highlighted a substantial EU-backed initiative with a €60 million funding commitment toward developing AI for robotics across Europe, signifying a significant investment in the advancement of open-source robotics.

Mr. Eugene Goh of JM Vistec presented next, offering insights on the integration of vision systems in robotics. His talk emphasized how JM Vistec enables robots to "see," enhancing precision and capability in industrial tasks, from quality inspection to object recognition.

Concluding the speaker sessions, Dr. Kenneth Kwok from IHPC shared cutting-edge research on enabling human-robot collaboration, powered by AI. His session emphasized the importance of human-centered AI in creating safe, efficient, and collaborative environments where robots can work alongside humans in factories, warehouses, and more.

Masterclass Lineup

Participants then moved to the masterclass sessions, which provided hands-on learning experiences across various aspects of AI and robotics. Each session was designed to deepen the practical knowledge and technical skills required for integrating AI with robotics.

Empowering Innovations with MELSOFT: Led by Mr. Liu Muyao, Software Application Engineer from Mitsubishi Electric Asia, this session focused on MELSOFT, Mitsubishi's integrated software environment, which enhances the control and flexibility of industrial robots.

Introduction to Reinforcement Learning for Robot Arm Manipulation: Hosted by Mr. Shalman Khan, Mr. Santosh Balaji, and Ms. Mercedes Ramesh from ROS-Industrial Consortium Asia Pacific, this session introduced reinforcement learning principles, showing participants how to apply these techniques to control robotic arms more effectively.

Introduction to Deep Learning with YOLO and ROS: Dr. Carlos Acosta, Robotics Specialist at Singapore Polytechnic, led a session on utilizing YOLO (You Only Look Once) with ROS for object detection. The masterclass offered participants a foundation in integrating deep learning algorithms with ROS to enhance robotic vision applications.

Introduction to Fleet Management with Open-RMF: This session, led by Dr. Ricardo Tellez, CEO of The Construct, demonstrated Open-RMF (the Open Robotics Middleware Framework) for multi-robot fleet management. Participants learned how to manage multiple robots collaboratively, a critical capability for applications in large facilities like hospitals and factories.

Tech Marketplace Highlights

The tech marketplace featured a diverse array of participants, including Megazo, Kabam Robotics, IHPC, Parasoft, and Pepperl+Fuchs. Each company showcased their latest innovations, giving attendees a firsthand look at cutting-edge robotics solutions and AI-driven technologies designed to tackle challenges in industries like manufacturing, logistics, and safety. The marketplace provided a vibrant space for networking, collaboration, and discovering new tools that could redefine industrial automation.

The RIC-AP Annual Summit 2024 also announced an exciting event for 2025: ROSCon 2025, the largest ROS conference, is coming to Singapore. This will be the first time ROSCon is hosted in Singapore.

Finally, on behalf of everyone at ROS-Industrial Consortium Asia Pacific, we would like to thank all participants and delegates for their enthusiasm, and we look forward to the RIC-AP Annual Summit 2025.

To sign up for our upcoming events, register your interest via this link: [https://form.gov.sg/672480d8116743c2ed31c690]

by ROSIndustrial AP on November 12, 2024 12:54 AM

Official ROS2 Driver Release for Mitsubishi Electric Industrial Robot MELFA: MELFA ROS2 Driver

Mitsubishi Electric aims to integrate their MELFA robots into the ROS2 ecosystem, allowing robotics developers and integrators to seamlessly utilize their industry-proven platform in ROS-based applications.

By developing the MELFA ROS2 packages, Mitsubishi Electric seeks to enable developers to leverage the flexibility, modularity, and extensive community support of ROS2, coupled with proven global hardware support.

The MELFA ROS2 Driver is a collaborative effort between the ROS-I Consortium Asia Pacific and Mitsubishi Electric Asia. It consists of modular components that allow users to interface with the robot’s motion control, state monitoring, and digital/analog I/O operations through the ROS2 control framework. This development bridges the gap between Mitsubishi Electric automation hardware and ROS2, providing developers with the tools needed to build, deploy, and manage robotic applications effectively on an industrial platform.

The MELFA ROS2 Driver I/O controllers enable cyclic communication between ROS2 and MELFA. Through the MELFA ROS2 Driver, developers can leverage the IQ platform to access other Mitsubishi Electric automation products (such as PLCs, HMIs, motor drives, and NC machines), utilize industrial networks (such as CC-Link, PROFINET, EtherCAT, EtherNet/IP, DeviceNet, etc.), and explore time-sensitive networks (such as CC-Link IE TSN).

MELFA ROS2 Driver is designed for flexibility, supporting various ROS2 packages such as MoveIt2 for motion planning and custom nodes for specialized tasks.
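
As a rough sketch of what commanding a ros2_control-based driver from ROS2 can look like (the controller name, action name, joint names, and target positions below are assumptions for illustration and are not taken from the MELFA documentation):

# Hypothetical example of sending a trajectory goal to a ros2_control joint
# trajectory controller; adapt the action name and joint names to the
# controllers the driver actually exposes.
import rclpy
from rclpy.action import ActionClient
from rclpy.node import Node
from control_msgs.action import FollowJointTrajectory
from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint
from builtin_interfaces.msg import Duration


def main():
    rclpy.init()
    node = Node('melfa_trajectory_example')
    client = ActionClient(node, FollowJointTrajectory,
                          '/joint_trajectory_controller/follow_joint_trajectory')
    client.wait_for_server()

    goal = FollowJointTrajectory.Goal()
    goal.trajectory = JointTrajectory()
    goal.trajectory.joint_names = ['joint_1', 'joint_2', 'joint_3',
                                   'joint_4', 'joint_5', 'joint_6']
    point = JointTrajectoryPoint()
    point.positions = [0.0, 0.0, 1.57, 0.0, 1.57, 0.0]  # placeholder pose [rad]
    point.time_from_start = Duration(sec=3)
    goal.trajectory.points.append(point)

    future = client.send_goal_async(goal)
    rclpy.spin_until_future_complete(node, future)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()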

The MELFA ROS2 driver will officially support 9 models in the first batch, with the aim of supporting more than 20 models in the near future:

  • RV-2FR
  • RV-4FR
  • RV-4FRL
  • RV-7RL
  • RV-13FRL
  • RV-8CRL
  • RV-5AS
  • RH-6FRH5520
  • RH-6CRH6020

Users can access detailed documentation and installation instructions in the official repository [https://github.com/Mitsubishi-Electric-Asia/melfa_ros2_driver] to get started, or talk to the developers from Mitsubishi Electric at [https://github.com/orgs/Mitsubishi-Electric-Asia/discussions/1].

by ROSIndustrial AP on November 12, 2024 12:07 AM

November 11, 2024
Space Station OS released! - a future where anyone can develop space stations.

Space Station OS (SSOS) is an open-source development platform for space stations, built on ROS 2 to support interoperability, modularity, and scalability across various space station projects.

By unifying core functions like thermal control, power, and life support into reusable modules, Space Station OS provides a universal environment that allows engineers globally to develop and operate space stations collaboratively. This enables rapid innovation and cross-mission compatibility, lowering development costs and enhancing sustainable space station operations.
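
As a conceptual sketch of the "reusable module" idea (the topic name, message choice, and values below are illustrative assumptions, not Space Station OS’s actual interfaces):

# Hypothetical subsystem module as a ROS 2 node: a power module periodically
# publishing battery telemetry that other modules could subscribe to.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import BatteryState


class PowerModule(Node):
    def __init__(self):
        super().__init__('power_module')
        self.pub = self.create_publisher(BatteryState, 'power/battery_state', 10)
        self.create_timer(1.0, self.publish_state)

    def publish_state(self):
        msg = BatteryState()
        msg.voltage = 120.0     # placeholder bus voltage [V]
        msg.percentage = 0.87   # placeholder state of charge
        self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(PowerModule())
    rclpy.shutdown()


if __name__ == '__main__':
    main()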

Space Station OS represents a global effort to democratize space station technology, welcoming contributions from the international aerospace and robotics communities.

Visit us:

6 posts - 4 participants

Read full topic

by yuyuqq on November 11, 2024 05:07 PM

RobotCAD 4.0.0 released! Lets you make controllable robots from the GUI

ros2_controllers is integrated into RobotCAD 4.0.0, letting you make a controllable diff-drive car, manipulator, etc., just from the RobotCAD GUI without programming.

The PX4 autopilot was also integrated some time ago. The combination makes it possible to build an aerial/land-riding robot with a manipulator in one hour. See the functionality demonstration.

RobotCAD 4.0.0 functionality demo

Moreover, you can extend RobotCAD’s controllers by adding your own generic controller to the ros2_controllers repository via pull request (or locally on your computer). After that, you will be able to construct your specific controllable tool via the RobotCAD GUI and automatically generate the ROS2 package code and a Docker setup for it.

RobotCAD repository

1 post - 1 participant

Read full topic

by fenixionsoul on November 11, 2024 12:03 PM

November 08, 2024
ROS News for the Week of November 4th, 2024

ROS News for the Week of November 4th, 2024



ROSCon 2024 is in the bag and most of the team is finally home! The videos should be up on the web soon (please don’t ask when, they’ll be up as soon as we can get them edited and uploaded).

In the meantime there are some great resources that came out of the event. Sebastian Castro put together a fantastic summary of the event. You can also check out @mjcarroll and @jennuine’s r2s: Text User Interface (TUI) for ROS 2, this Zenoh + ROS 2 ROSCon workshop, this demystifying ROS 2 networking workshop, and my intro to ROS 2 workshop.



@Anis_Koubaa has put together a comprehensive survey on ROS 2. The survey includes an amazing webpage for searching through ROS papers that is worth bookmarking.



ROS By-The-Bay with Red Rabbit Robotics and @methyldragon is scheduled for next Thursday, November 14th in Mountain View. There are maybe ten RSVPs left.


MELFA ROS2 Driver Demo

Our friends at ROS Industrial held their annual European and Asian consortium meetings over the past two weeks. One of the big takeaways is the new Mitsubishi MELFA ROS 2 driver. You can check out the source code here.



:rocket: Space ROS Humble 2024.10.0 has been released! I suggest you take a look at the Space ROS Demo repository’s pull requests. There is some really cool stuff in there!

Along those lines, check out ROSA being demonstrated for Neil deGrasse Tyson at JPL.

Events

News

ROS

Got a minute? :mantelpiece_clock:

We desperately need more contributors to the ROS Documentation. If you learned something new this week, why not share it with the community?

3 posts - 2 participants

Read full topic

by Katherine_Scott on November 08, 2024 08:36 PM

[Announcement] Sony Robotics Solution (AMR/AGV) with ROS 2

Hi ROS users,

We at Sony would like to make a quick announcement that we have released our robotics solution for AMRs.

AutonMate

AutonMate is an Autonomous Mobile Robot (AMR) that assists with picking through collaboration with humans, aiming to reduce labor. AMRs, positioned as the next generation of Automatic Guided Vehicles (AGVs), can travel without guides and move autonomously while avoiding people and obstacles.

These robotics packages are already running in the market, in actual workspaces in Japan!

System Overview

The system consists of two components: a Fleet Management System and a Robot Navigation System.
All of these proprietary applications are built on top of ROS 2 Humble and Fast DDS.

The system handles many tasks, such as map creation, autonomous navigation, and device management. Based on our experience, especially with edge IoT and embedded devices, we have developed a stable and robust system for the robots.

ROS Open Source Eco-System in Sony

Sony has been making many contributions to the ROS mainline based on requirements from our business applications, which in turn lets us use ROS as a user in those applications. We really appreciate the ROS community and ROS open source, and we will keep it up :rocket:

If you are interested, please contact us.

thanks,
Tomoya

5 posts - 3 participants

Read full topic

by tomoyafujita on November 08, 2024 06:33 PM

Open Hardware Summit 2025 CFP

Hi Everyone,

I want to take off my ROS hat :billed_cap: and put on my Open Source Hardware Association (OSHWA) hat :cowboy_hat_face: for a second.

As some of you may know, I’ve been on the board of the Open Source Hardware Association for some time now. The OSHWA team is really busy with an exciting new NSF project at the moment, so I am stepping up to help organize this year’s Open Hardware Summit. I presented a lightning talk at ROSCon about this year’s summit, and the response was so positive that I figured I should also make an announcement on ROS Discourse. I might also be in the process of planning a ROS meetup in Edinburgh; if that’s something you would be interested in helping with, please reach out to me directly via DM.

Open Hardware Summit 2025

This year’s Open Hardware Summit will be held in Edinburgh, Scotland from 2025-05-29T23:00:00Z to 2025-05-30T23:00:00Z (UTC), and tickets just went on sale. If you’ve never been to the Open Hardware Summit, you can get a taste of the event by watching our 2024 Summit live stream. Our keynote this year was from Danielle Boyer, an Ojibwe roboticist who builds open-hardware educational robots that teach students their Indigenous languages.

OSHWA currently has an open call for proposals for this year’s Open Hardware Summit. If you have an open source hardware robot, or an interesting open hardware project that you would like to share, please consider submitting a talk or workshop proposal! Applications are due by 2024-12-22T08:00:00Z UTC.

While I have your attention…

I want to remind the ROS community that they should consider certifying their open hardware designs! Certification ensures that your project meets the minimum documentation requirements to be considered open source hardware, lists your project on our certification website, and provides you with a slick badge that you can include on your designs.

The OSHWA certification website is a gold mine of close to 3000 open hardware projects that you are free to study and use as part of your robotics project. The certification website currently includes 19 different motor drivers (like this stepper driver and this Grove Motor Driver from TU Delft), 229 different robots (such as the NASA JPL Rover), and 312 different sensors (like this line sensor and this pneumatics controller). I recommend you bookmark the certification website for easy reference!

2 posts - 2 participants

Read full topic

by Katherine_Scott on November 08, 2024 05:17 PM

Comprehensive Survey on ROS 2

:rocket: The Most Comprehensive Survey on ROS 2 :rocket:

I am excited to announce that our latest survey paper,

��� � �� � ��������: � ������

co-authored with Abdulrahman S. Al-Batati and Dr. Mohamed AbdelKader, is now available on Preprints.org! :tada:

Credits go to Abdulrahman S. Al-Batati for the great efforts in gathering this volume of related works and also in building the first repository of ROS/ROS2 publications available at:

:open_book: ROS/ROS2 Repository: https://ros.riotu-lab.org/
:open_book: ���� �����: LinkedIn

This study stands as the most comprehensive survey to date on the transition from ROS 1 to ROS 2, offering a deep dive into the enhancements, challenges, and future directions for ROS 2.

Our analysis covers:
:small_blue_diamond: Real-time capabilities
:small_blue_diamond: Enhanced modularity
:small_blue_diamond: Security improvements
:small_blue_diamond: Middleware and distributed systems
:small_blue_diamond: Multi-robot system applications

We carefully analyzed �,��� ROS-related articles, with a focused review of ��� ROS 2-specific publications, making this a key resource for researchers, developers, and enthusiasts in the ROS community.

Our goal is to provide a cohesive synthesis that helps deepen the understanding of ROS 2’s contributions and guides future research in robotic systems design.

Join us in exploring the potential of ROS 2 and shaping the future of robotics! :bulb::robot:

#ROS2 #Robotics #ROS #Research #AI #MachineLearning #RoboticsCommunity #ROSCommunity #OpenSource #Survey

7 posts - 5 participants

Read full topic

by Anis_Koubaa on November 08, 2024 07:39 AM

November 06, 2024
Next Client Library WG Meeting: Friday 8th November 2024

Hi,
After a break, I would like to restart the recurring meetings of the Client Library Working Group.

The next meeting will be this Friday, 8th November 2024 at 8 AM Pacific Time.

The agenda for now includes:

Everyone is welcome to join.
If you have topics you want to discuss, feel free to post them ahead of time in this thread.

2 posts - 1 participant

Read full topic

by alsora on November 06, 2024 04:01 PM

November 05, 2024
Introducing FIREBRINGER - A generalised autonomous navigation algorithm

Hey all,

Our company GEEAR (full announcement soon) has been developing FIREBRINGER - a generalised autonomous navigation algorithm implemented in ROS2 for field-robotic applications.

The algorithm is meant to support a variety of missions from simple point-to-point travelling and trajectory tracking, to more advanced interaction with third-party vehicles and our favourite - real-world dense robotic swarms (in the full academic sense of emergent collaborative behaviour). We will continue working on populating our library of mission types/functionalities so you can expect more in the future.

So far we have validated FIREBRINGER on real-world boats and ground vehicles, and on copters and fixed-wing aircraft in Gazebo (real-world experiments coming soon). You can find our recent video unveiling our autonomous vessel prototype here.

The algorithm is meant to be easy to use and tune, only requiring basic physical characteristics of the robot (maximum linear/angular velocity and maximum acceleration for each degree of freedom), and it offers plug-and-play functionality when combined with an Ardupilot autopilot through MAVROS.

It is based on a lightweight, robust NMPC variant integrated with artificial potential field theory. It can typically run at 100 Hz on a normal PC and 25 Hz on a Raspberry Pi 4B for all vehicle types. So far, it can receive topics regarding the location of other vehicles (third-party and collaborative), a trajectory (e.g., a Nav2 global path), a destination (complete pose), and various other mission-specific info. We are currently working on incorporating point-cloud/costmap information for avoiding surrounding obstacles. Our aim is to allow integration with any ROS2 perception package that may be used in field robotics. The algorithm outputs an optimal velocity for each degree of freedom, so it needs to be combined with appropriate low-level actuation control (or connected to a well-tuned Ardupilot autopilot, which will do the work).
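
As a rough illustration of the artificial potential field idea (not FIREBRINGER’s actual NMPC-based implementation; the gains, limits, and positions below are made-up values):

# Conceptual artificial-potential-field sketch: an attractive pull toward the
# goal plus repulsive pushes away from nearby obstacles, clipped to a maximum
# speed. Purely illustrative of the concept named above.
import numpy as np


def apf_velocity(position, goal, obstacles,
                 k_att=1.0, k_rep=0.5, influence=2.0, v_max=1.0):
    """Return a 2D velocity command from a simple potential field."""
    v = k_att * (goal - position)  # attractive term toward the goal
    for obs in obstacles:
        diff = position - obs
        d = np.linalg.norm(diff)
        if 1e-6 < d < influence:
            # repulsive term grows as the obstacle gets closer
            v += k_rep * (1.0 / d - 1.0 / influence) * diff / d**3
    speed = np.linalg.norm(v)
    if speed > v_max:
        v *= v_max / speed  # respect the maximum linear velocity
    return v


# Example: head toward (5, 0) while avoiding an obstacle at (2.5, 0.2).
cmd = apf_velocity(np.array([0.0, 0.0]), np.array([5.0, 0.0]),
                   [np.array([2.5, 0.2])])
print(cmd)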

We are currently considering creating and publishing a complete ROS2 package for the algorithm and we may propose some type of fusion with NAV2 in the future, but we wanted to gauge interest first. Obviously we are open to ideas and recommendations (and some criticism)!

4 posts - 3 participants

Read full topic

by George_Rossides on November 05, 2024 04:25 PM

November 04, 2024
Clarification of Infrastructure PMC meeting time

tl;dr:

Infrastructure PMC meetings are held online on Mondays at 16:00 UTC.

The event is open to public observers, who can join via the information on the OSRF Official Events calendar.


Hello everyone,

The Infrastructure Project Management Committee meeting has been on the OSRF Official Events calendar since we made the switch from the earlier Infrastructure Project Committee to the OSRA-chartered Infrastructure PMC.

When I created the event, I accidentally did so using my local timezone instead of pinning it to UTC. As a result, the meeting time was shown an hour later than in previous weeks once the US clocks changed. Another PMC member noticed and reported this, and I corrected the event.

My apologies to observers who joined an hour later than the originally posted meeting time.

3 posts - 2 participants

Read full topic

by nuclearsandwich on November 04, 2024 06:03 PM

New packages for Humble Hawksbill 2024-11-04

Package Updates for Humble

Added Packages [9]:

  • ros-humble-etsi-its-vam-ts-coding: 2.3.0-1
  • ros-humble-etsi-its-vam-ts-coding-dbgsym: 2.3.0-1
  • ros-humble-etsi-its-vam-ts-conversion: 2.3.0-1
  • ros-humble-etsi-its-vam-ts-msgs: 2.3.0-1
  • ros-humble-etsi-its-vam-ts-msgs-dbgsym: 2.3.0-1
  • ros-humble-sbg-driver: 3.2.0-1
  • ros-humble-sbg-driver-dbgsym: 3.2.0-1
  • ros-humble-web-video-server: 2.0.1-1
  • ros-humble-web-video-server-dbgsym: 2.0.1-1

Updated Packages [140]:

  • ros-humble-clearpath-msgs: 0.3.0-2 → 1.0.0-1
  • ros-humble-clearpath-platform-msgs: 0.3.0-2 → 1.0.0-1
  • ros-humble-clearpath-platform-msgs-dbgsym: 0.3.0-2 → 1.0.0-1
  • ros-humble-control-toolbox: 3.2.0-1 → 3.3.0-1
  • ros-humble-control-toolbox-dbgsym: 3.2.0-1 → 3.3.0-1
  • ros-humble-etsi-its-cam-coding: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-cam-coding-dbgsym: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-cam-conversion: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-cam-msgs: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-cam-msgs-dbgsym: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-cam-ts-coding: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-cam-ts-coding-dbgsym: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-cam-ts-conversion: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-cam-ts-msgs: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-cam-ts-msgs-dbgsym: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-coding: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-conversion: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-conversion-dbgsym: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-cpm-ts-coding: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-cpm-ts-coding-dbgsym: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-cpm-ts-conversion: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-cpm-ts-msgs: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-cpm-ts-msgs-dbgsym: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-denm-coding: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-denm-coding-dbgsym: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-denm-conversion: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-denm-msgs: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-denm-msgs-dbgsym: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-messages: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-msgs: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-msgs-utils: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-primitives-conversion: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-rviz-plugins: 2.2.0-1 → 2.3.0-1
  • ros-humble-etsi-its-rviz-plugins-dbgsym: 2.2.0-1 → 2.3.0-1
  • ros-humble-generate-parameter-library: 0.3.8-3 → 0.3.9-1
  • ros-humble-generate-parameter-library-example: 0.3.8-3 → 0.3.9-1
  • ros-humble-generate-parameter-library-example-dbgsym: 0.3.8-3 → 0.3.9-1
  • ros-humble-generate-parameter-library-py: 0.3.8-3 → 0.3.9-1
  • ros-humble-generate-parameter-module-example: 0.3.8-3 → 0.3.9-1
  • ros-humble-lanelet2: 1.2.1-1 → 1.2.2-1
  • ros-humble-lanelet2-core: 1.2.1-1 → 1.2.2-1
  • ros-humble-lanelet2-core-dbgsym: 1.2.1-1 → 1.2.2-1
  • ros-humble-lanelet2-examples: 1.2.1-1 → 1.2.2-1
  • ros-humble-lanelet2-examples-dbgsym: 1.2.1-1 → 1.2.2-1
  • ros-humble-lanelet2-io: 1.2.1-1 → 1.2.2-1
  • ros-humble-lanelet2-io-dbgsym: 1.2.1-1 → 1.2.2-1
  • ros-humble-lanelet2-maps: 1.2.1-1 → 1.2.2-1
  • ros-humble-lanelet2-matching: 1.2.1-1 → 1.2.2-1
  • ros-humble-lanelet2-matching-dbgsym: 1.2.1-1 → 1.2.2-1
  • ros-humble-lanelet2-projection: 1.2.1-1 → 1.2.2-1
  • ros-humble-lanelet2-projection-dbgsym: 1.2.1-1 → 1.2.2-1
  • ros-humble-lanelet2-python: 1.2.1-1 → 1.2.2-1
  • ros-humble-lanelet2-python-dbgsym: 1.2.1-1 → 1.2.2-1
  • ros-humble-lanelet2-routing: 1.2.1-1 → 1.2.2-1
  • ros-humble-lanelet2-routing-dbgsym: 1.2.1-1 → 1.2.2-1
  • ros-humble-lanelet2-traffic-rules: 1.2.1-1 → 1.2.2-1
  • ros-humble-lanelet2-traffic-rules-dbgsym: 1.2.1-1 → 1.2.2-1
  • ros-humble-lanelet2-validation: 1.2.1-1 → 1.2.2-1
  • ros-humble-lanelet2-validation-dbgsym: 1.2.1-1 → 1.2.2-1
  • ros-humble-launch-pal: 0.4.0-1 → 0.7.0-1
  • ros-humble-mrpt-apps: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-apps-dbgsym: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libapps: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libapps-dbgsym: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libbase: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libbase-dbgsym: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libgui: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libgui-dbgsym: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libhwdrivers: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libhwdrivers-dbgsym: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libmaps: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libmaps-dbgsym: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libmath: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libmath-dbgsym: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libnav: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libnav-dbgsym: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libobs: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libobs-dbgsym: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libopengl: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libopengl-dbgsym: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libposes: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libposes-dbgsym: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libros-bridge: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libros-bridge-dbgsym: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libslam: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libslam-dbgsym: 2.14.3-1 → 2.14.4-1
  • ros-humble-mrpt-libtclap: 2.14.3-1 → 2.14.4-1
  • ros-humble-mvsim: 0.11.0-1 → 0.11.1-1
  • ros-humble-mvsim-dbgsym: 0.11.0-1 → 0.11.1-1
  • ros-humble-octomap-rviz-plugins: 2.1.0-1 → 2.1.1-1
  • ros-humble-octomap-rviz-plugins-dbgsym: 2.1.0-1 → 2.1.1-1
  • ros-humble-omni-base-2dnav: 2.3.0-1 → 2.4.0-1
  • ros-humble-omni-base-bringup: 2.4.0-1 → 2.4.1-1
  • ros-humble-omni-base-controller-configuration: 2.4.0-1 → 2.4.1-1
  • ros-humble-omni-base-description: 2.4.0-1 → 2.4.1-1
  • ros-humble-omni-base-gazebo: 2.1.0-1 → 2.2.0-1
  • ros-humble-omni-base-laser-sensors: 2.3.0-1 → 2.4.0-1
  • ros-humble-omni-base-navigation: 2.3.0-1 → 2.4.0-1
  • ros-humble-omni-base-rgbd-sensors: 2.3.0-1 → 2.4.0-1
  • ros-humble-omni-base-robot: 2.4.0-1 → 2.4.1-1
  • ros-humble-omni-base-simulation: 2.1.0-1 → 2.2.0-1
  • ros-humble-parameter-traits: 0.3.8-3 → 0.3.9-1
  • ros-humble-pmb2-bringup: 5.3.0-1 → 5.3.1-1
  • ros-humble-pmb2-controller-configuration: 5.3.0-1 → 5.3.1-1
  • ros-humble-pmb2-description: 5.3.0-1 → 5.3.1-1
  • ros-humble-pmb2-robot: 5.3.0-1 → 5.3.1-1
  • ros-humble-pose-cov-ops: 0.3.12-1 → 0.3.13-1
  • ros-humble-pose-cov-ops-dbgsym: 0.3.12-1 → 0.3.13-1
  • ros-humble-python-mrpt: 2.14.3-1 → 2.14.4-1
  • ros-humble-realtime-tools: 2.6.0-1 → 2.7.0-1
  • ros-humble-realtime-tools-dbgsym: 2.6.0-1 → 2.7.0-1
  • ros-humble-sick-scan-xd: 3.5.0-1 → 3.6.0-1
  • ros-humble-sick-scan-xd-dbgsym: 3.5.0-1 → 3.6.0-1
  • ros-humble-tiago-bringup: 4.5.0-1 → 4.6.0-1
  • ros-humble-tiago-controller-configuration: 4.5.0-1 → 4.6.0-1
  • ros-humble-tiago-description: 4.5.0-1 → 4.6.0-1
  • ros-humble-tiago-moveit-config: 3.0.18-1 → 3.1.0-1
  • ros-humble-tiago-robot: 4.5.0-1 → 4.6.0-1
  • ros-humble-ur: 2.2.15-1 → 2.2.16-5
  • ros-humble-ur-bringup: 2.2.15-1 → 2.2.16-5
  • ros-humble-ur-calibration: 2.2.15-1 → 2.2.16-5
  • ros-humble-ur-calibration-dbgsym: 2.2.15-1 → 2.2.16-5
  • ros-humble-ur-controllers: 2.2.15-1 → 2.2.16-5
  • ros-humble-ur-controllers-dbgsym: 2.2.15-1 → 2.2.16-5
  • ros-humble-ur-dashboard-msgs: 2.2.15-1 → 2.2.16-5
  • ros-humble-ur-dashboard-msgs-dbgsym: 2.2.15-1 → 2.2.16-5
  • ros-humble-ur-description: 2.1.7-1 → 2.1.8-2
  • ros-humble-ur-moveit-config: 2.2.15-1 → 2.2.16-5
  • ros-humble-ur-robot-driver: 2.2.15-1 → 2.2.16-5
  • ros-humble-ur-robot-driver-dbgsym: 2.2.15-1 → 2.2.16-5
  • ros-humble-urdf-test: 2.0.3-1 → 2.1.0-1
  • ros-humble-velodyne: 2.4.0-1 → 2.5.1-1
  • ros-humble-velodyne-driver: 2.4.0-1 → 2.5.1-1
  • ros-humble-velodyne-driver-dbgsym: 2.4.0-1 → 2.5.1-1
  • ros-humble-velodyne-laserscan: 2.4.0-1 → 2.5.1-1
  • ros-humble-velodyne-laserscan-dbgsym: 2.4.0-1 → 2.5.1-1
  • ros-humble-velodyne-msgs: 2.4.0-1 → 2.5.1-1
  • ros-humble-velodyne-msgs-dbgsym: 2.4.0-1 → 2.5.1-1
  • ros-humble-velodyne-pointcloud: 2.4.0-1 → 2.5.1-1
  • ros-humble-velodyne-pointcloud-dbgsym: 2.4.0-1 → 2.5.1-1

Removed Packages [0]:

Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:

  • Andrea Capodacqua
  • Armin Hornung
  • Bence Magyar
  • Błażej Sowa
  • Denis Stogl
  • Fabian Immel
  • Felix Exner
  • Jan-Hendrik Pauls
  • Jean-Pierre Busch
  • Jordan Palacios
  • Jordi Pages
  • Jose-Luis Blanco-Claraco
  • Josh Whitley
  • Paul Gesel
  • Roni Kreinin
  • SBG Systems
  • TIAGo PAL support team
  • Tyler Weaver
  • Yue Erro
  • paul
  • rostest

1 post - 1 participant

Read full topic

by audrow on November 04, 2024 04:17 PM

November 02, 2024
Interoperability Interest Group November 07, 2024: What's in an interface?

Community Page

Meeting Link

Calendar Link

2024-11-07T15:00:00Z UTC

That which we call an API by any other name would serve just as well. But when there are many diverging opinions on exactly what names to use or exactly how to structure things, we may end up scattering our efforts instead of working together.

The OSRA TGC opened a technical committee to examine the ROS Enhancement Proposal (REP) process, refine it, and apply it to all of the projects governed by the OSRA, including Open-RMF. The Open-RMF PMC sees this as an opportunity to formalize something similar to a Request For Comments (RFC) process for Open-RMF, where we can define interfaces for our next-generation development and receive comments from the public, especially from stakeholders, to make sure that we will be developing the most useful tools possible for everyone. This would also define a formal way for anyone from the general public to propose new interfaces for Open-RMF to support.

At this session of the special interest group, Grey will present a working draft of what kind of process we are considering for Open-RMF.

We are eager to get feedback from the Open-RMF community and beyond about how to make this process as effective as possible. Effectiveness would be measured both in terms of making sure we are producing the highest quality stable interfaces possible and also getting contributions (proposals, feedback, and implementation) from as many community stakeholders as possible.

After presenting our current ideas, we want to open up the session to discussion and feedback, so please come ready to share your own thoughts. We will especially appreciate input from anyone who has experience, positive or negative, working with standardization bodies.

1 post - 1 participant

Read full topic

by grey on November 02, 2024 09:05 AM


Powered by the awesome: Planet