If you are a ROS developer/user and you blog about it, ROS wants those contributions on this page! All you need for that to happen is:
have an RSS/Atom blog (no Twitter/Facebook/Google+ posts)
open a pull request on the planet.ros tracker indicating your name and your RSS/Atom feed URL. (You can just edit the file and click "Propose File Change" to open a pull request.)
tag your ROS-related posts with any of the following categories: "ROS", "R.O.S.", "ros", "r.o.s."
Warnings
For security reasons, HTML iframes, embeds, objects, and JavaScript will be stripped out. Only YouTube videos in object and embed tags will be kept.
Guidelines
Planet ROS is one of the public faces of ROS and is read by users and potential contributors. The content remains the opinion of the bloggers but Planet ROS reserves the right to remove offensive posts.
Blogs should be related to ROS, but that does not mean they should be devoid of personal subjects and opinions: those are encouraged, since Planet ROS is a chance to learn more about ROS developers.
Posts can be positive and promote ROS, or constructive and describe issues, but should not contain flaming. We want to keep ROS welcoming :)
ROS covers a wide variety of people and cultures. Profanities, prejudice, lewd comments and content likely to offend are to be avoided. Do not make personal attacks or attacks against other projects on your blog.
Suggestions?
If you find a bug or have a suggestion, please file an issue on the planet.ros tracker.
Takes image input from a rosbag, Isaac Sim or live camera stream.
Takes input queries from the user as a list of objects via command line or Foxglove publish panel. This query can be changed anytime while the node is running to detect different objects.
Visualizes results as bounding boxes and labels on RViz or Foxglove.
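As a sketch of the command-line query workflow described above, changing the detection query while the node is running could look like publishing a new string to the node's query topic. The topic name and message type here are assumptions for illustration, not taken from the package documentation:

```shell
# Hypothetical topic name and message type -- check the ROS2 NanoOWL
# README for the actual interface on your setup.
ros2 topic pub --once /query std_msgs/msg/String \
  '{data: "a person, a forklift, a pallet"}'
```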
Check out our Isaac Sim Jetson HIL Course Doc to learn how to use ROS2 NanoOWL with Isaac Sim. The course covers setting up Isaac Sim, running Isaac ROS on NVIDIA Jetson, and testing a simulated robot with hardware-in-the-loop.
Sign up to attend our tutorial at ICRA 2024 on May 17: Cloud and Fog Robotics: A Hands-on Tutorial with ROS2 and FogROS2!
This tutorial will be led by academic and industry leaders.
Please sign up and see more information on our website here.
Abstract
Cloud and fog robotics empower resource-limited robots to execute computationally intensive tasks like deep learning. This tutorial covers cutting-edge methods and applications, incorporating insights from academic researchers and industry experts. It offers a full hands-on experience both for those who want to develop their first cloud robotics application and for experienced practitioners interested in discussing the usability challenges of the field.
Embark on a comprehensive journey into Robot Operating System (ROS) through our immersive 5-day workshop. Whether you’re new to ROS or looking to deepen your understanding, this workshop will cover everything from the basics to advanced concepts, ensuring you’re equipped to navigate ROS effectively.
Workshop Agenda:
Day 1: Introduction to ROS:
Overview of ROS and its importance in robotics.
Understanding ROS architecture and core components.
Exploring the ROS file system and ecosystem.
Installation and setup of ROS on different platforms.
Day 2: Basic ROS Commands and Tools:
Getting hands-on with fundamental ROS commands.
Utilizing ROS tools for package management and navigation.
Exploring ROS workspaces and package creation.
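The workshop agenda doesn't pin down a ROS version; assuming ROS 2 with colcon, the workspace and package creation steps from Day 2 might look like this (the package name is invented for the example):

```shell
# Create a workspace, add a Python package, build, and source it.
mkdir -p ~/ros2_ws/src
cd ~/ros2_ws/src
ros2 pkg create --build-type ament_python my_robot_pkg
cd ~/ros2_ws
colcon build
source install/setup.bash
```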
Day 3: ROS Messages and Topics:
Understanding ROS messages and message types.
Creating custom messages for your robotic applications.
Working with ROS topics and implementing publishers/subscribers.
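As an illustration of the custom-message step above (the package layout and field names are invented for the example), a message definition is a plain text `.msg` file in the package's `msg/` directory, registered in the package's build files:

```
# msg/WheelSpeeds.msg -- hypothetical example message
float64 left_rad_per_sec
float64 right_rad_per_sec
```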
Day 4: ROS Services and Actions:
Exploring ROS services and different service types.
Creating custom ROS services to enable specific functionalities.
Implementing ROS actions and understanding action servers/clients.
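Similarly, for the custom-service step (again with invented names), a `.srv` file pairs a request and a response, separated by `---`:

```
# srv/SetGripper.srv -- hypothetical example service
float64 position    # request: target gripper position
---
bool success        # response: whether the move succeeded
```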
Day 5: ROS Launch Files and Bags:
Utilizing ROS launch files for streamlined application launches.
Managing launch parameters and configurations.
Exploring ROS bags for data recording, playback, and analysis.
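As a sketch of the Day 5 launch-file material (node and package names invented, assuming the ROS 2 XML launch frontend), a minimal launch file that starts two nodes and sets a parameter:

```xml
<launch>
  <!-- Hypothetical nodes; replace with your own packages. -->
  <node pkg="my_robot_pkg" exec="driver_node" name="driver">
    <param name="wheel_radius" value="0.05"/>
  </node>
  <node pkg="my_robot_pkg" exec="odometry_node" name="odometry"/>
</launch>
```

Recording and playing back data with bags then uses `ros2 bag record -a` and `ros2 bag play <bagfile>`.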
Join us on this educational journey as we dive deep into ROS and empower you to leverage its capabilities for your robotics projects!
Stay updated by following our page, commenting below, and subscribing to our YouTube channel.
The release date for ROS 2 Jazzy Jalisco is now just one month away. As we traditionally announce the name of the following distribution when a new distribution is released, it’s time to begin namestorming the name for the next release after Jazzy Jalisco.
Following our ancient traditions, the next ROS 2 release name will be an adjective starting with K, followed by a turtle-related word or name also starting with K.
Here are the existing ROS 2 names and code names.
Ardent Apalone - ardent
Bouncy Bolson - bouncy
Crystal Clemmys - crystal
Dashing Diademata - dashing
Eloquent Elusor - eloquent
Foxy Fitzroy - foxy
Galactic Geochelone - galactic
Humble Hawksbill - humble
Iron Irwini - iron
Jazzy Jalisco - jazzy
And here are the ROS 1 names and code names.
Boxturtle - boxturtle
C Turtle - cturtle
Diamondback - diamondback
Electric Emys - electric
Fuerte - fuerte
Groovy Galapagos - groovy
Hydro Medusa - hydro
Indigo Igloo - indigo
Jade Turtle - jade
Kinetic Kame - kinetic
Lunar Loggerhead - lunar
Melodic Morenia - melodic
Noetic Ninjemys - noetic
To get your namestorming going, here are some useful lists.
Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:
Since then, the group has met every week.
The focus of the group has been on PR reviews to prepare for the ROS 2 Jazzy feature freeze.
I really want to thank everyone who joined and helped us for their contributions, in particular @JM_ROS for his huge work.
Now, I would like to keep the effort going.
I plan to host the meeting every two weeks.
The purpose of meetings can vary: if you have questions, doubts or proposals concerning the ROS 2 client libraries (e.g. rclcpp, rcl, rclpy, etc) feel free to join and discuss it with the group!
We’re happy to announce another sync into Rolling. This is the last Rolling sync before we branch off for Jazzy. This sync has been tagged as rolling/2024-04-17.
You can use this tutorial to display new types of ROS messages in RViz, which I've used in the polygon_ros repo, or for useless things like the snowbot_operating_system.
SAVE THE DATE: Jazzy Jalisco Testing and Tutorial Party, 2024-05-01T16:00:00Z UTC
As many of you are already aware, the ROS Jazzy Jalisco release is just around the corner: 2024-05-23T18:00:00Z UTC, to be exact! We want this to be our best ROS 2 release yet, and to get there we need to make sure that we thoroughly test Jazzy before it is released to the general public. We also want to make sure that the ROS documentation on docs.ros.org continues to be clear, concise, and correct. That's where we need your help! We're looking for community volunteers to join us for our annual ROS Testing and Tutorial Party. If you are looking to dip your toes into ROS development, this is a great place to start.
Following the Gazebo team's successful history of tutorial parties, and the results from last year's Iron Irwini Tutorial Party, we're bringing the tutorial party back for ROS 2 Jazzy Jalisco, and we want to replicate the success we had with Iron. So, what is a testing and tutorial party, you may ask? Well, it is a chance for the community to meet and systematically review all of the current ROS tutorials while also testing the latest ROS release. What we attempt to do at the party is to test every ROS tutorial for every "flavor" of ROS user, or in other words, every combination of operating system (Ubuntu Linux, RedHat Linux, Windows, etc.), RMW implementation (FastDDS, Cyclone DDS, etc.), installation method, and CPU architecture (amd64, aarch64, etc.).
Now that we’ve frozen the Jazzy release, the core developer team is working on generating a set of “early release” binary and source packages for Jazzy Jalisco. We plan to make these pre-release binaries publicly available before the Testing and Tutorial Party. During the party we’ll release our testing repository with a long list of tests we would like to run on the testing version of Jazzy Jalisco. These tests will first ask contributors to pick a particular release “setup”, and then run either the test suite or work through one or more of the existing ROS 2 tutorials on docs.ros.org. Each “setup” is a specific combination of RMW vendor (FastDDS/Cyclone DDS/Connext DDS), build type (binary tarball/packages/source), host operating system (Ubuntu Noble/RHEL-9/Windows), and architecture (amd64/aarch64). For each setup, we would like to perform a number of tests that validate our tutorials, core ROS functionality, and important features. With dozens of possible setup configurations, testing each and every one internally just isn’t feasible. To ensure reproducibility of the tests it’s also valuable to get multiple evaluations of each test, which is why we need your help!
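To see where "dozens of possible setup configurations" comes from, the setup axes named above can simply be multiplied out (these are the axes as listed in the post; the actual test matrix may differ):

```python
from itertools import product

# Setup axes as described in the announcement.
rmw = ["FastDDS", "Cyclone DDS", "Connext DDS"]
build = ["binary tarball", "packages", "source"]
host_os = ["Ubuntu Noble", "RHEL-9", "Windows"]
arch = ["amd64", "aarch64"]

# Every combination is one candidate "setup" to test.
setups = list(product(rmw, build, host_os, arch))
print(len(setups))  # 3 * 3 * 3 * 2 = 54
```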
Tutorial party participants are asked to perform the tests they sign up for and report back the results. If you happen to find an issue or bug while participating in the tutorial party you’ll need to report it to us so it can get corrected before the Jazzy release.
We are planning to kick off the tutorial party at 2024-05-01T16:00:00Z UTC with a meeting where we explain the whole Testing and Tutorial Party process. That meeting can be found on the OSRF Official Events Calendar. We’ll record the meeting and post instructions on Discourse for those who can’t make it. To help motivate participants, we will be giving away ROS Jazzy Jalisco swag to the testers who complete the most tests during the tutorial party. Every time you complete a test or tutorial you will need to close the GitHub issue and then fill out a short Google form. The participants with the most completed tests during the tutorial party will receive a credit to Zazzle to pick out some Jazzy swag. We’ll post more details about the tutorial party in about two weeks when we kick things off. For now, we’re outlining a rough sequence of events so everyone can set aside some time to participate. Here are the key dates you’ll want to remember:
2024-05-01T07:00:00Z UTC Jazzy Jalisco Tutorial & Testing Party begins
2024-05-01T16:00:00Z UTC Tutorial & Testing Party Hangout + Q&A
2024-05-08T07:00:00Z UTC ROS 2 Jazzy Jalisco logo and swag sale begins (tentative)
2024-05-15T07:00:00Z UTC Tutorial and Testing Party ends
2024-05-23T07:00:00Z UTC ROS 2 Jazzy Jalisco released
We have added these events to the OSRF Official Events Calendar, but the big one that you won’t want to miss is the kick-off event on 2024-05-01T16:00:00Z UTC. At this tutorial party kickoff meeting we’ll walk everyone through the steps involved in participation. In the meantime we would like your help spreading the word about the Testing and Tutorial Party. We hope to see you there on May 1st!
As we reach the end of today, 15 April 2024 (2359 PST), all rolling branches on all ROS 2 base packages will be frozen. This is so we can branch jazzy off of rolling in preparation for the upcoming release on 23 May.
We will begin branching into jazzy over the course of this week to help ensure that the jazzy pre-release packages are available for testing during the Tutorial Party (as we did for the Iron Tutorial Party), this time happening on the 1st of May. On that note, we hope to get even more community members to partake in the tutorial party and support the testing efforts!
A kind request to all base package maintainers: enforce the freeze after today by no longer merging any changes into rolling until further notice. Once the branching is complete, we will unfreeze rolling so that feature and API changes can be merged in once again. Stay tuned for that announcement on Discourse.
Hi everyone, I recently have been working on flywheel (https://app.flywheelrobotics.com), which lets you visualize and control your robots remotely over the internet.
It is free to use and can be installed on any robot, either using ROS 1 or ROS 2.
Apart from techniques like the UMI gripper and kinesthetic teaching, I think teleoperation can be a good way to collect datasets and incorporate edge-case handling into autonomy, and that's where flywheel can be quite useful.
At the beginning of this year I started this fun project: a global traversability mapping library which generates 3D traversability maps from 3D point clouds. It can be seamlessly integrated with any SLAM (visual or LiDAR based) to generate a global map in real time, simply by providing a constant stream of point clouds along with the keyframe poses.
The library also seamlessly handles loop closing and optimization of keyframe poses from the SLAM. I started writing this with ORB-SLAM3 as the backend, so as a plus, it can also merge multiple traversability maps into one in case tracking is lost and a new local map is created.
I’m so pleased to release this as an open-source project. Hope this is useful to you all
Here is the link for the code and instructions to use: https://github.com/suchetanrs/traversability_mapping
Among them, aloha_mujoco is the implementation under MuJoCo simulation. For details, please refer to the README in the aloha_mujoco folder.
Start Gazebo
Start the Gazebo simulation by running the command:
roslaunch arx5_moveit_config demo_gazebo.launch
You will see a mobile aloha model.
Gazebo introduction
After starting the Gazebo simulation there are two windows: the Gazebo physics simulation window and the RViz window. In the RViz window, the MoveIt component can be used to plan the robotic arm's motion.
In the Gazebo simulation window, the right side of the screen displays the real-time physics simulation environment. The position and kinematic simulation information of the robotic arm is displayed here. Gazebo also feeds back the simulated arm's status and executes the control angles sent by the planner.
In the RViz window, the UI of the MoveIt component is displayed on the lower left side. Here you can select different planning groups to control the different robotic arms and grippers. The right-hand window displays the real-time arm position, which is provided by the Gazebo simulation.
This is to start the mobile base control node. You can control the mobile base movement by sending the velocity.
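Assuming the base follows the common ROS 1 convention of subscribing to a `geometry_msgs/Twist` velocity topic (the `/cmd_vel` topic name here is an assumption; verify it with `rostopic list`), a quick command-line test might be:

```shell
# Drive forward at 0.2 m/s while turning at 0.1 rad/s, published at 10 Hz.
# /cmd_vel is the conventional topic name; check the package docs.
rostopic pub -r 10 /cmd_vel geometry_msgs/Twist \
  '{linear: {x: 0.2}, angular: {z: 0.1}}'
```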
Move the robotic arm
Drag the ‘teaching ball’ in the RViz interface and operate as shown in the figure. The arm will calculate the joint angles and arm trajectory from the target end-gripper position.
Note: after clicking Plan to start planning, the system takes time to calculate. When the Execute button changes from gray to black, click it to execute the trajectory you just planned.
Rviz simulation
The only difference between the RViz simulation and the Gazebo simulation is that the Gazebo physics engine is not started; only the RViz data visualization platform is.
roslaunch arx5_moveit_config demo.launch
After startup it will look the same as the RViz interface in the Gazebo simulation; just follow the steps above.
The ROS 2 Jazzy Jalisco feature freeze is next Monday, 2024-04-15T07:00:00Z UTC. If you have a pull request you want merged it is now or never. We’ll have full details about the release process, including our annual tutorial party, next week.
(I have succumbed to using AI images; this one was from Bard, or Gemini, or whatever they call it now. The real Jazzy Jalisco distro graphic will come out in a couple of weeks.)
A Focused Technical Project (FTP) within the ROS-Industrial Consortium (RIC), championed by the Steel Founders’ Society of America (SFSA) and sponsored by DLA-Troop Support, Philadelphia, PA, and the Defense Logistics Agency Information Operations, J68, Research & Development, Ft. Belvoir, VA, has recently completed Robotic Blending Milestone 5. The team included RIC members Yaskawa America, PushCorp, and Southwest Research Institute, along with SFSA member university Iowa State University. This work culminated in a system deployed at an SFSA member foundry site.
The project built on the prior Robotic Blending Milestone 4, which demonstrated high-mix material surface finishing and edge processing of arbitrarily shaped and contoured parts, largely targeting piece-parts to be welded. This project sought to extend that effort, adding new features and incorporating additional SFSA-funded work to realize human-in-the-loop high-mix casting finishing for foundry operations.
During the development process, we have contributed the following improvements to the foundational Scan-N-Plan framework that served as a starting point for the FTP. This framework is maintained as a workshop repository within the RIC GitHub repository and is used in whole or in parts for developer instruction in ROS 2 for industrial robotics.
Scan-N-Plan Updates
ROS 2 Control:
We have added ros2_control code to the Scan-N-Plan example implementations to preview the robot’s motion during a simulated motion execution.
Constant TCP Velocity Time Parameterization:
We have added a time parameterization algorithm for maintaining constant TCP speed during motion planning. The approach works by first calculating forward kinematics at each trajectory waypoint to get the pose of the TCP. Then, we create a line path between adjacent TCP poses and parameterize the line with a trapezoidal velocity profile. The joint velocity and acceleration at each waypoint are computed given the TCP Cartesian velocity and acceleration. These obtained velocities and accelerations are then validated to be within the limits of the robot.
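A simplified sketch of the trapezoidal profiling step described above (pure timing geometry, ignoring the forward-kinematics lookup; the limits and segment lengths below are invented for the example): the profile ramps up to the target TCP speed, cruises, and ramps down, degenerating to a triangular profile on segments too short to reach full speed:

```python
import math

def trapezoidal_profile_time(length, v_max, a_max):
    """Time to traverse a straight TCP segment of `length` meters with a
    trapezoidal velocity profile (zero velocity at both endpoints)."""
    # Distance consumed accelerating to v_max and decelerating back to zero.
    d_ramp = v_max * v_max / a_max
    if length >= d_ramp:
        # Full trapezoid: two ramps plus a constant-velocity cruise.
        t_ramp = 2 * v_max / a_max
        t_cruise = (length - d_ramp) / v_max
        return t_ramp + t_cruise
    # Short segment: triangular profile that never reaches v_max.
    v_peak = math.sqrt(a_max * length)
    return 2 * v_peak / a_max

# A 1 m segment at 0.25 m/s max TCP speed, 0.5 m/s^2 acceleration.
print(trapezoidal_profile_time(1.0, 0.25, 0.5))  # 4.5 seconds
```

In the actual pipeline, the resulting per-waypoint TCP velocities are then mapped back through the robot's kinematics to joint velocities and accelerations and checked against the robot's limits.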
Docker Images:
To make the deployment and development of Scan-N-Plan more efficient and standardized, we have created Docker images for ROS 2 Foxy, Humble, and Rolling that can be found here.
Selectable Representation for Collision Object:
In addition to the default behavior of converting the scan mesh into a convex hull, the collision object can now be represented as a detailed mesh or as an octree.
Default behavior: Converts the scan mesh into a convex hull. This generally results in the fastest motion plans, especially with TrajOpt, but can be too conservative and may cause motion planning failures if the scan object is not actually convex or nearly convex.
Mesh: Represents the collision object as the exact “detailed” mesh represented by the mesh file. With the contact test type CLOSEST for TrajOpt, this representation results in a somewhat slower planning time than convex_mesh, but not significantly longer.
Octree: Represents the collision object as an octree comprised of spheres with a diameter specified by the octree_resolution parameter. With the contact test type CLOSEST for TrajOpt, this representation results in slightly faster planning times than mesh but slower than convex_mesh.
Simplified Raster Planner
The parameters exposed in the raster planner GUI widget were reduced to the commonly tuned raster-planning parameters: rotation offset (degrees), point spacing, line spacing, and minimum segment length. This declutters the raster planner and improves usability.
Python Scanning and Execution Nodes
For ease of development and debugging during deployment efforts, we have converted the mesh reconstruction and motion execution simulator nodes from C++ to Python.
Service for Generating Scan Motion Plans
A separate service for generating scan motion plans has been made as an intermediate step to support creating dynamic scan trajectories in the future. This service will be called to create scan trajectory patterns that originate from a specified starting location.
Behavior Tree and Reactive GUI
The custom application back-end logic has been replaced with a behavior tree to improve workflow modularity and customizability. The behavior tree can change the GUI appearance based on the current behavior executed. Widgets are exposed and hidden during the workflow to guide the user’s focus to actions and settings relevant to the process at hand. A progress bar and preview motion bar have been made more accessible to inform the user where they are in the process.
Noether Updates
Tool path and Mesh Visualization Tool
When developing and debugging tool path and mesh modifiers, it is helpful to visualize the original tool path, the modified tool path, the original mesh, and the modified mesh. Previously, there was no way to view the unmodified sub-mesh after applying a mesh modifier.
We have added a widget in the Noether application to toggle the view of the mesh and tool path VTK objects. We also replaced the concatenated mesh viewer with a vtkAssembly and switched from raw pointers to vtkSmartPointers for internal objects.
Approach and Departure Tool Paths
To better stay within selected regions during processing, we added linear approach and departure tool paths (see below for the linear approach tool path). A modified version of these tool paths (circular approach/departure) creates an approach/departure curve of specified radius.
Plane Projection Mesh Modifier
We added a mesh modifier that fits a plane to the input mesh using random sample consensus and projects the inlier points onto the plane. This modifier prevents tool paths from being generated on an unintended inner or outer surface near the edge of a complex part, where the desired processing surface should be flat.
We hope developers and those interested in experimenting with ROS 2 for their industrial robotics application development have found this resource helpful. If you have questions, comments, or have observed an issue, please do not hesitate to either engage with the ROS-Industrial community or leave an issue over at the GitHub repository.
Thanks to Michael Ripperger, ROS-I Consortium Americas Tech Lead, SwRI Sr. Research Engineer, for his contributions to the Scan-N-Plan Workshop and the FTP program and this blog post.
The ROS Awards 2024 voting is now open. Cast your votes for the best projects and contributors in the ROS community. Visit Vote Now – ROS Awards 2024 and make your voice count!
About ROS Awards
The ROS Awards aim to be the Oscars of the ROS world. We intend to recognize contributions to the ROS community and the development of the ROS-based robot industry, and to help them gain awareness.
ROS Awards is a yearly award voted by the ROS community. Vote for the ROS contributions that inspired you and are shaping the future of robotics.
Voting policy
Everyone in the ROS community can vote.
Voting can only be done once per device and IP address.
We have provided a few options for reference, but voting in each category is open: you can nominate whoever you think is best (we ask that you give as much detail as possible when voting, to avoid confusion with other votes).
Since The Construct organizes the awards, none of its products or developers can be nominated.
On June 28, 2024, one week before the ROSDevDay (July 5, 2024), voting will close and three finalists in each category will be announced.
Winners will be announced at the ROS Developers Day 2024, and all voting results will be published to the public on July 30, 2024.
The ROS Awards create awareness for new contributors from the ROS community. Therefore, last year’s winners will not be eligible for nomination this year. However, The Construct will announce the number of votes past winners received this year, along with this year’s winners, on July 30, 2024.
We hope you can cast a vote for good contributions. The Construct Team
When translating a robot trajectory from a motion plan in a ROS system to an actual executed motion, there is an inherent loss of precision. In general, trajectories are sent to a controller that adheres to the exact position and timing constraints to the best of its ability, but compromises must be made to execute the trajectory. Additionally, robots cannot achieve infinite precision in positional accuracy because the kinematics of the system cannot be known with infinite precision. We have participated in work to maximize position and velocity accuracy, but not every process requires this level of precision. The process planning component should match the level of precision required by the application, but, traditionally, motion planners work assuming infinite precision. That is why we have now introduced tolerance into our motion planning pipeline when using Cartesian waypoints in TrajOpt.
TrajOpt is an optimization-based motion planner that uses a seed trajectory along with costs and constraints to refine towards a better trajectory. This process typically involves avoiding collisions and smoothing the motion. Previously, whenever we specified a Cartesian waypoint, the tool frame needed to adhere to the desired waypoint exactly. The total error across all six degrees of freedom (6DOF) could deviate by 1e-4 meters/radians, or it would be considered a constraint violation. In applications where we allowed free rotation or motion about an axis, the coefficient associated with that constraint could be set to zero, but users were not able to set bounds on the motion. For example, a common ROS-I application would allow free rotation about the Z-axis, so we would set the last coefficient in our waypoint coefficient vector to zero. An example of this process can be seen in the Scan N Plan Workshop.
Both the extremely high precision requirements and all-or-nothing approach to 6DOF Cartesian freedom of motion do not fit most applications. Additionally, these tight constraints can often cause unnecessary motion planning failures when a small freedom of motion would enable success. With the new tolerance parameters available in TrajOpt, we can more closely match the requirements of the system while reducing the number of motion planning failures. Now available in the trajopt_default_plan_profile are various settings to tailor the various waypoint requirements to the system’s needs.
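Conceptually, the change moves each Cartesian degree of freedom from an exact equality constraint to a bounded one. A toy sketch of the bounds check (the parameter names and structure below are illustrative, not the actual trajopt_default_plan_profile fields):

```python
def within_tolerance(error, lower, upper, coeffs):
    """Check a 6DOF waypoint error [x, y, z, rx, ry, rz] against
    per-axis lower/upper bounds. A zero coefficient means the axis
    is unconstrained (free motion), matching the old behavior."""
    return all(
        c == 0 or lo <= e <= hi
        for e, lo, hi, c in zip(error, lower, upper, coeffs)
    )

# 15 mm in X/Y, 1.5 mm in Z, 0.01 rad about X/Y, free rotation about Z.
lower = [-0.015, -0.015, -0.0015, -0.01, -0.01, 0.0]
upper = [0.015, 0.015, 0.0015, 0.01, 0.01, 0.0]
coeffs = [1, 1, 1, 1, 1, 0]
print(within_tolerance([0.01, -0.01, 0.001, 0.0, 0.005, 2.5],
                       lower, upper, coeffs))  # True
```

Previously, the effective bounds were essentially zero-width on every constrained axis; widening them per axis lets the planner trade small, process-acceptable deviations for planning success.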
Below are two example GIFs. The first shows a failed plan when no tolerance was allowed (except for free rotation about the z-axis), and the second illustrates a successful plan when tolerance was enabled: 15 mm in X and Y, 1.5 mm in Z, and 0.01 radians in rotation about X and Y (again, free rotation about Z was allowed). Visually, these two motions look almost identical, and infinite precision is not required for this buffing process. Using tolerance in this application lets the robot system match the process requirements and improves the overall success of incorporating robotics into various processes.