August 03, 2015
Simulation available for PMB-2 - TIAGo's mobile base
From Judith Viladomat

The ROS simulation of PAL Robotics mobile base, named PMB-2, is available now and ready to download! You will find all the steps in the ROS wiki.

This mobile base is the one used in TIAGo, the mobile manipulator. Now it is also available independently, and shipping of the first units starts in a few months - in October! PMB-2 is 100% ROS based and can be fully customized.

PMB-2 has a maximum speed of 1 m/s and can traverse steps and gaps of up to 1.5 cm. The mobile base can carry anything on top, with a payload of 50 kg. PMB-2 features our custom suspension system, which lets it handle larger floor irregularities.

You can watch the first PMB-2 prototype in action in this video, where more features are explained.

by Tully Foote on August 03, 2015 08:18 AM

August 01, 2015
ROS Jade for Gentoo
From Hunter Allen via ros-users@

I have been successful in installing ROS Jade on Gentoo! I have been
working on an install guide, which you can find here: .

What's working:
 * core utilities, minus rostopic list (some issue with roslz4 and
 * desktop
 * desktop_full, minus gazebo (there's a conflict building Gazebo 5.1.0 right now - more on this later)

Just figured I'd let people know! Please email me if you have any questions.

by Tully Foote on August 01, 2015 12:14 AM

July 31, 2015
Free 1/2 Day "Getting Started with ROS" Workshop in Santa Clara on Thursday, September 17

From Kelly Kane

So you want to build a ROS robot? Sign up for this free ½ day workshop at the Hilton in Santa Clara.

Shaun Edwards - the Co-Founder of ROS-Industrial - will present:

· ROS Capabilities

  • Overview

  • Mobile Manipulation

  • Robotic Blending

· ROS Basics

  • Intro to ROS

  • MoveIt

  • Intro to ROS-Industrial

· Next Steps: Getting Started and Finding Help

Seats are limited. Register today here.

This workshop is hosted by EandM - your local SICK distributor providing sensor intelligence for industrial and professional robots.

by Tully Foote on July 31, 2015 08:30 PM

Openings at Cruise, self-driving car company
From Richard Ni via ros-users@

Come work with a team of robotics experts on technically challenging problems, building products that improve lives and prevent car accidents. 

Our team is small, but we move quickly. In less than a year, we built prototype vehicles that have logged over 10,000 autonomous miles on California highways. We're looking for smart, ambitious people to help develop our next generation products, ensure they're reliable and safe, and deploy them at scale.

In particular, we're looking for perception engineers to make sure our cars can accurately identify objects and predict where they'll move. Apply at

For a complete list of our openings, see

by Tully Foote on July 31, 2015 04:44 PM

TIAGo, the best robotic partner for research

TIAGo is a mobile manipulator that comes ready to serve at any research institution or lab. TIAGo has evolved capabilities in manipulation, navigation, perception and interaction, and can be customized for any particular research need. TIAGo’s features make it the ideal robot for research, especially in ambient assisted living or light industry. TIAGo is open source: its software is completely ROS enabled, and its simulation is available on its ROS Wiki webpage. In the following video, TIAGo’s prototype shows some of its abilities, like grasping objects or lifting them from the floor.

Watch TIAGo’s prototype in action in this video!

A robot that adapts to your research needs, not the other way around

TIAGo is a totally configurable robot for research, unlike others of its kind. “TIAGo’s purpose is to be custom-fitted to any specific need,” says Product Manager Jordi Pagès. That is why three versions are available – Iron, Steel and Titanium. The Iron version can be upgraded to Steel or Titanium, and all versions are modular and customizable, adapting TIAGo to all budgets. The robot is affordable for institutions and labs and can now be pre-ordered. The first pre-ordered units are now under construction.

The technology used in TIAGo is backed by PAL Robotics’ extensive experience with humanoid robots since 2004. The team is known for developing the REEM family of humanoids, with REEM-C and REEM standing out. PAL Robotics now also works on other platforms that answer specific needs for which its technology is useful, such as TIAGo for research environments.

TIAGo’s main features that make it a great robot for research

TIAGo is a mobile research platform equipped with a sensorized pan-tilt head, a lifting torso and a 7-DOF arm, which ensures a large manipulation workspace. Its end-effector is plug-and-play, and can be a parallel gripper or a five-fingered humanoid hand. It is able to grasp and manipulate objects with a payload of 2 kg. The hand is underactuated, soft, resistant and versatile, suitable for manipulation and human-robot interaction tasks. Another option is a force-torque sensor at its wrist.

TIAGo runs autonomously on the PMB2 mobile base, creating a map of the environment with a 2D laser. Its sensors provide visual perception, enabling it to detect objects, people, obstacles and anything you implement.

TIAGo versions

The post TIAGo, the best robotic partner for research appeared first on PAL Robotics Blog.

by Judith Viladomat on July 31, 2015 09:20 AM

July 30, 2015
Open Position at Clearpath Robotics: PR2 Support Technologist
From Ryan Gariepy via ros-users@

Are you more of a hardware hacker than an algorithm developer, but still want to get involved with the core of ROS and the ROS community? We're continuing to expand - this job may be for you!


Position:        PR2 Support Technologist 
Location:        Kitchener, Ontario
Experience:    1+ Years hardware prototyping experience
Education:      Undergraduate degree or college diploma in related field

About Us

Clearpath Robotics designs and builds service robots to automate the world's dullest, dirtiest and deadliest jobs.  Our clients range from small local businesses to some of the biggest, best known companies on the planet. We built Clearpath by offering hardware and services to support advanced robotics R&D and are now expanding into commercial and industrial service robot deployments.

We employ a diverse and highly talented team who live and breathe robotics.  We believe that work must have a high "cool" factor and every day should bring new knowledge. We need more passionate people on our team who are willing and able to push the boundaries of robotics into focused and practical applications. 

Clearpath is automating the world and we need your help.  Got what it takes?

About the Job

The PR2 is the most complex and capable ROS robot out there, and we have to keep it running. Over 50 institutions use the PR2 for cutting edge robotics research. The PR2 support infrastructure consists of a server backend, test benches, and knowledgeable teammates. Due to the continued worldwide use of the PR2 platform, we need another PR2 doctor.

As a PR2 Support Technologist your job is to diagnose misbehaving PR2 hardware. You will respond directly to customer support tickets in a timely, courteous, fashion. You will certify the functionality of replacement hardware before it is sent to our valued customers. You will receive assistance from the rest of the Clearpath production, engineering, and operations teams to ensure that your work is as streamlined as possible. You must be a master problem solver, able to track down complex problems and solve them.

Your primary responsibilities will be:

* System level diagnosis of PR2 issues by examining client provided data
* Technical communication with PR2 users on forums, on our support network, and in person at ROS events
* Advanced hardware re-work with PCB design, assembly and testing
* Network troubleshooting, on the PR2 itself and on the test benches

About You

You want to work for a fast-paced growing company that thinks big and dreams huge. You are driven, view work as more than just a job, and are never satisfied with less than 100% effort. You want to be surrounded by people like you; creative, fun-loving, and passionate about their work. You are motivated by making an impact on your workplace and you thrive on challenging and rewarding problems.

You know how to build a robot, full stop...even if you've never done it before. You are able to take a scope and run with it, seeking help and feedback when necessary. You aren't afraid of getting your hands dirty in the shop, soldering a surface-mount component, or compiling code from a command line. You may not care for theory because you've already started building the hardware.

Required Experience/Skills

* Problem Solving and Debug skills are paramount
* Strong Technical Communicator (written and verbal)
* Well organized and responsive, with basic negotiation skills
* Design, prototyping, soldering, and debugging of basic circuits & PCBs
* Comfortable working in a Linux environment
* Strong networking knowledge, programming knowledge in C++ and Python
* Design and drafting of basic mechanical assemblies

Bonus points for

* Analysis of large tables of data 
* Worked with relevant sensors and actuators (LIDAR, cameras, motor controllers, EtherCat, etc)

What Now?

Apply through our online job portal using this link. Please submit a cover letter along with your resume. Instructions for sending supporting documentation, including testimonials as well as references, pictures, web links, drawings, code samples, or other indications of exceptional past work will be provided in the confirmation email sent by our system upon receiving your application. Please include "PR2 Support Technologist" in the subject of any further communications.

No recruiters or form cover letters, please. They do not please our mechanical masters.

by Tully Foote on July 30, 2015 11:29 PM

Software and Robots having the Biggest Impact in Robotics Manipulation Research
The number of mentions for open-source software packages, frameworks and libraries at the 2015 IEEE International Conference on Robotics and Automation (ICRA 2015). This list is limited to software related to manipulation.
When I was at Willow Garage, we would often look to measure the impact that ROS and the PR2 were having. The primary target for ROS, at that point, was the robotics academic community. An obvious way to measure impact was to look at metrics like the number of packages released, the number of robots running ROS, etc. Another way to measure impact was to look at citations and mentions in robotics conferences. I ran some numbers using data from the recent IEEE International Conference on Robotics and Automation in Seattle to measure the impact, not only of ROS and the PR2 but also of other open-source software for Robotics and other manipulation robots.

In generating this data, I chose all the open-source frameworks, packages and libraries I know of. I left out proprietary software, like MATLAB (which gained 173 mentions). The names I searched for include those of component libraries, like FCL, simulation libraries (like Gazebo), several planning libraries (OMPL, SBPL, CHOMP, etc.) and complete open-source frameworks, like MoveIt!, OpenRAVE and ROS. The ROS-Industrial consortium is also included in the list. The list is, admittedly, biased towards the ROS ecosystem. The numbers were generated by searching for the number of papers where the names appear in the proceedings of the conference. I counted each paper where a name appears as a single mention even if the name was mentioned multiple times in the same paper. The search was performed using the search functionality in the proceedings so it is likely that some mentions may not be accurate.
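The counting procedure described above is easy to sketch in a few lines of Python. The paper texts and name list below are illustrative placeholders, not the actual ICRA proceedings, and the naive substring search shares the same caveat the author notes: some matches may not be accurate.

```python
# Count, for each software name, the number of papers that mention it at
# least once (multiple mentions within one paper count as a single mention).
# The papers and names below are made-up placeholders for illustration.
papers = {
    "paper_001": "We use ROS and MoveIt! for planning; ROS topics carry sensor data.",
    "paper_002": "Our pipeline builds on OpenCV and PCL for perception.",
    "paper_003": "Simulation in Gazebo, planning via OMPL, all integrated with ROS.",
}
names = ["ROS", "MoveIt!", "OpenCV", "PCL", "Gazebo", "OMPL", "OpenRAVE"]

mentions = {
    name: sum(1 for text in papers.values() if name.lower() in text.lower())
    for name in names
}

# Rank by number of papers mentioning each name
for name, count in sorted(mentions.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {count}")
```

A real run would replace `papers` with text extracted from the proceedings; the per-paper deduplication is what turns raw hits into the "papers mentioning X" counts shown in the graphs.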

The first graph shows the number of mentions for each software package. It is clear that ROS has had a huge impact, gaining mentions in 100 papers (of a total of about 930 accepted papers). Vision frameworks like OpenCV and PCL also garner a large number of mentions. Gazebo seems to be the most popular open-source simulation framework while the ROS-based planning frameworks and packages, like MoveIt!, OMPL, etc. also gain more than 10 mentions. Users seem to be about equally split between MoveIt! and OpenRAVE. Overall, it is clear that open-source software is having a big impact on robotics research in manipulation.

I also looked at which robots garnered the most mentions in manipulation research. In most cases, I looked for the names of the robots themselves, e.g. the PR2. In other cases, I chose to look for the names of vendors, e.g. KUKA. The list includes robot arms, hands, humanoids and mobile manipulation systems. Note that I am only counting the mention of a particular robot - this does not necessarily mean that the robot was used in the research.

The number of mentions for different manipulation robots at the 2015 IEEE International Conference on Robotics and Automation (ICRA 2015). This list includes robot arms, hands, humanoids and mobile manipulation systems. It is not exhaustive so feel free to email me robots you want to see included in this list at robot DOT moveit AT gmail DOT com
The PR2 finds 46 mentions at ICRA 2015. Five years after the platform was released, the PR2 is still a very popular manipulation research platform. Being the base platform on which ROS was first developed and deployed is most certainly a significant factor in its #1 ranking. KUKA robots come in a close second - possibly representing the popularity of the vendor in Europe. Shadow robot hands also find a lot of mentions, indicating the high degree of interest in multi-fingered manipulation.  

Other industrial robot platforms, in general, do not rank high on this list. It is interesting to ask why. Is it safety and setup issues? The lack of a common, open API (an issue only now being addressed through the efforts of the open-source community)? The difficulty of having to integrate multiple components (grippers, sensors and more) vs. fully integrated systems like the PR2? Or is there a need for better collaboration between academia and industrial robotics? It is clear that we have reached a turning point in robotic manipulation, with the promise of new features, new capabilities and new robots. What seems to be missing is an overall software framework that brings all this together - the pieces of the puzzle are there; somebody just needs to put them together the right way.

July 30, 2015 05:31 AM

July 28, 2015
Proposed: CAD to ROS Focused Technical Project

Cross posted from ROS-I

The ROS-Industrial Consortium is tackling a topic that is of interest to the whole ROS community: conversion of CAD data to ROS-interpretable file types (e.g. URDF, SRDF). This work will be conducted over the next three years by the TU Delft Robotics Institute. To help us make ROS even more convenient to use:

by Tully Foote on July 28, 2015 11:00 PM

Announcing agile_grasp Package for Localizing Grasps in Point Clouds
From Andreas ten Pas via ros-users@

Despite being available for quite a while, I wanted to officially announce our ROS Hydro/Indigo package for localizing grasps in 3D point clouds:

Here's a demo of Rethink's Baxter robot localizing and executing grasps in a densely cluttered scene.  

Instructions for using our package are available at the ROS wiki page given above.

If you find any problems, please report them at:

by Tully Foote on July 28, 2015 08:03 PM

Company spotlight: HumaRobotics

If you follow ROS-related news you probably noticed that packages were contributed back in May to interface ROS systems to the Cognex In-Sight camera and to Siemens S7 PLCs via Modbus TCP communication. Generation Robots, the company behind these contributions, is in fact not new to ROS development, as its CEO Jérôme Laplace told us.
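For context, the Modbus TCP framing such interfaces speak is simple: a 7-byte MBAP header (transaction ID, protocol ID, length, unit ID) followed by the protocol data unit. The sketch below builds a Read Holding Registers request in plain Python. The framing follows the Modbus specification, but the IDs and register address are arbitrary example values; this is not the API of the contributed ROS packages.

```python
import struct

def read_holding_registers_request(transaction_id, unit_id, start_addr, count):
    """Build a Modbus TCP 'Read Holding Registers' (function 0x03) request frame."""
    # PDU: function code (1 byte) + starting address (2 bytes) + register count (2 bytes)
    pdu = struct.pack(">BHH", 0x03, start_addr, count)
    # MBAP header: transaction id, protocol id (0 = Modbus), remaining length
    # (unit id + PDU, in bytes), unit id
    mbap = struct.pack(">HHHB", transaction_id, 0x0000, 1 + len(pdu), unit_id)
    return mbap + pdu

# Example: ask unit 0x11 for 3 registers starting at address 0x006B
frame = read_holding_registers_request(transaction_id=1, unit_id=0x11,
                                       start_addr=0x006B, count=3)
print(frame.hex())  # 12 bytes: 7-byte MBAP header + 5-byte PDU
```

In practice this frame would be written to a TCP socket on port 502 of the PLC; a ROS node wrapping it would publish the returned register values as topics.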

Generation Robots' R&D branch HumaRobotics worked over the years with ROS on platforms such as NAO from Aldebaran, Baxter from Rethink Robotics, from Thecorpora and DARwIn-OP / DARwIn-Mini from Robotis. Their staff of cognitive science PhDs and robotics engineers provides ROS-based solutions both on real robots and during simulation. For example, the CEA (French Alternative Energies and Atomic Energy Commission) has sought their expertise on the DARwIn-OP and PhantomX robots for usage in inspection, radioactive material handling and disaster relief scenarios. An outcome of this collaboration has been to provide the community with simulation packages for both robots in the Gazebo simulator, user friendly ROS APIs and custom walking algorithms.

DARwIn-OP Gazebo model

PhantomX Gazebo model

HumaRobotics helps industrial collaborators by sharing their expertise in human-robot interaction to bring collaborative capabilities to the ROS-enabled Baxter: for instance, by enabling it with speech recognition and synthesis, adaptive dialog abilities, human posture detection and natural face-to-face interaction with an operator. They also make use of advanced machine learning techniques to provide fast and natural inverse kinematics for physical interaction between the robot and the human (tool passing, third-hand).

Baxter performing an inspection task with an arm-mounted Cognex camera

Safe human-robot interaction with the Baxter collaborative robot

"Industrial scenarios often involve integrating robots with standard industrial devices and protocols. ROS-enabled robots do not always have such capabilities by default, but one of the strengths of ROS is how easily it can be extended with new functionalities," Laplace said. "Due to its community-driven nature and the wide range of existing functionality, ROS is really enabling fast development of advanced robotic systems."

To join HumaRobotics in the fast-growing community of ROS(-Industrial) adopters and speed up the prototyping and development of industrial robot applications, download the code. Contact ROS-I (Americas, Europe) to better understand what the ROS-Industrial Consortia can do for you!

by Mirko Bordignon on July 28, 2015 12:28 PM

July 27, 2015
New job openings in Autonomous Driving at Bosch Palo Alto
From Elmar Mair via ros-users@

We have the following new open positions for the autonomous driving team in our research center in Palo Alto, CA, USA.
Autonomous Driving Research and Development:
- Autonomous Driving Camera and Computer Vision Research Engineer
- Autonomous Driving Localization/Mapping Research Engineer
- Autonomous Driving Perception Research Engineer
- Autonomous Driving Motion Planning Research Engineer
- Autonomous Driving Planning and Decision Making Research Engineer
Autonomous Driving Software Engineering:
- Autonomous Driving Senior Software Developer/Designer
- Autonomous Driving Software Developer
- Autonomous Driving Data Management Engineer
- Autonomous Driving Data Visualization Engineer
- Autonomous Driving GUI/Web Development Engineer
- Autonomous Driving Software Testing Engineer
Please apply through the webpage.
We are looking forward to your application!

by Tully Foote on July 27, 2015 04:06 PM

July 21, 2015
Proposed: CAD to ROS Focused Technical Project

The ROS-Industrial Consortium is tackling a topic that is of interest to the whole ROS community: conversion of CAD data to ROS-interpretable file types (e.g. URDF, SRDF). This work will be conducted over the next three years by the TU Delft Robotics Institute. To help us make ROS even more convenient to use:

Example CAD data (left image) is converted to a URDF, which is shown in RViz (right image)

Example CAD data (left image) is converted to a URDF, which is shown in RViz (right image)

by Paul Hvass on July 21, 2015 07:15 PM


July 19, 2015
Real-Time ROS for Embedded Systems
From Yigit Gunay via ros-users@

We are developing a lightweight implementation of the ROS middleware on STM32F4Discovery for interfacing embedded and general-purpose software. Currently, we can run multiple ROS nodes concurrently on STM32, and we can send ROS messages between a PC and STM32 over Ethernet (only UDPROS).
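Conceptually, UDPROS sends serialized ROS messages as UDP datagrams. The following plain-Python sketch only illustrates that kind of length-prefixed message exchange over UDP on localhost; it does not reproduce the actual UDPROS header layout, and the port number and payload are made up for the example.

```python
import socket
import struct

# A toy exchange: one side sends a length-prefixed serialized payload over
# UDP, the other side receives and decodes it. UDPROS uses its own header
# format; this only illustrates datagram-based message passing.
PORT = 17777  # arbitrary port chosen for this example

receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", PORT))
receiver.settimeout(2.0)  # avoid blocking forever if the send fails

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
payload = b"hello from the PC side"
# 4-byte little-endian length prefix, as ROS serialization does for fields
datagram = struct.pack("<I", len(payload)) + payload
sender.sendto(datagram, ("127.0.0.1", PORT))

data, _addr = receiver.recvfrom(1024)
(length,) = struct.unpack("<I", data[:4])
message = data[4:4 + length]
print(message.decode())  # prints the payload that was sent

sender.close()
receiver.close()
```

On a microcontroller, the receive side would run in the embedded node's loop; the appeal of UDP here is that datagrams map cleanly onto small, fixed buffers without TCP's connection state.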

Please take a look at our repository on Github if you are interested in our real-time ROS development:

I would appreciate your comments. Thanks for your attention!

by Tully Foote on July 19, 2015 01:07 PM

Playing with Fetch and Freight robots (from Fetch Robotics) on Gazebo
It's been a while since Fetch Robotics, a promising (as everyone in the robotics industry hopes) startup in San Jose, made some of their controller software public, even before they started selling their robots. Apparently a group of folks took that as a "use it, test it, give us feedback!" message, which I just joined lately.

So here's an incomplete report. Anyone who Googles for information about a particular piece of software appreciates it more when that information is gathered in fewer places, so for ROS software I usually try to write any info on as much as possible (that's one excuse for why this blog has been stale). Because Fetch Robotics is a commercial company, however, I am writing this here as my personal record instead. Of course, if there is something that would be better shared in their official docs, I'll open a pull request on their doc repository.

Let's start on their tutorial "Tutorial: Gazebo Simulation"

Do the installation. I'm on Ubuntu Trusty 64bit, i7 quad core notebook.

Run what's already prepared

Since I'm interested in using the full-fledged simulation right away, let's run from "Running the Mobile Manipulation Demo":

    $ roslaunch fetch_gazebo playground.launch

This fires up Fetch in Gazebo in a nicely elaborated lab setting (with some good-ol' nostalgic wall paint... Looks like they knew that C-Turtle is my favorite ROS logo).

Then the next launch file kicks off the demo program.

    $ roslaunch fetch_gazebo_demo demo.launch

Through a course of actions, the robot arrives at the table in one of the rooms, releases an object on top of it, and rests. Very nicely done, Fetch! The demo is also very well made, demonstrating many capabilities at once.

Now, let's look into a little bit of how the software is coordinated.

With rqt_graph, you can get an overview of the running nodes and the topics exchanged between them.

A closer look around the move_group node shows that it uses the common ROS Actions interface.

A closer look at the move_base node (I forgot to take a snapshot) doesn't show anything Fetch-specific, which indicates that ROS' navigation interface is versatile enough to be applied to various robots without customization.

On the left is a screenshot using rqt_tf_tree, which I recommend to existing/new ROS users in place of rosrun tf view_frames as a quick and dynamic introspection tool (view_frames still has a purpose; it lets you share the tree view with others via email, etc.).

Looking at the base, I noticed that only 2 (left, right) of the 6 wheels/casters I see in Gazebo are actuated.

You can see how the robot perceives the world, via RViz.

    $ rviz -d `rospack find fetch_navigation`/config/navigation.rviz

LEFT: RViz; RIGHT: Gazebo

As you've seen in the node graph above, MoveIt! processes are already running with playground.launch.

So here let's open the MoveIt! plugin to operate the robot in RViz. You have to uncheck the `RobotModel` display in RViz so that the MoveIt! plugin can load the robot model.

Also, I normally disable Query Start State in the MoveIt! plugin, since I barely use it.
Open a PointCloud2 display so that we can see the table and the object on it that Fetch just picked up and placed.

Now, let's pick up the object on the table. Plan a trajectory using the MoveIt! plugin, which can be done by simply placing the robot's end-effector at the pose you want. This is done by dragging and dropping ROS' Interactive Marker with your mouse. After clicking the Plan and Execute button, the robot computes and executes the trajectory.

It looks like MoveIt! is having a hard time figuring out a valid trajectory from the arm's current pose to the object on the table. So I move the hand above the table first instead.

Here I see in RViz a "ghost" of the hand, which I've never seen in all my time with MoveIt!. Also, the pose of this ghost is slightly off from the Interactive Marker. It turned out the ghost is the view from the point cloud. I don't know why the pose was off (debugging is not the first objective of this blog; having fun is), but it aligned fine after re-enabling the MotionPlanning plugin in RViz.

At some point, while I was still trying to figure out the pose to pick up the object, the hand blocked the camera view, so the object became invisible in RViz, making it almost impossible for me to keep searching for the goal pose.
So I rotated the robot's base by sending the `/cmd_vel` topic (using the rqt_robot_steering GUI).

Unfortunately, after an approach trajectory execution, RViz had to be restarted, and the goal I gave might not have been right, so the hand hit the object from the side.

With RViz restarted, I now see the table as a MoveIt! scene object (which went away again after re-enabling the MotionPlanning plugin).
Now the hand is almost there.

Looks pretty good, huh? This time I don't move the base any more, although I lost the object in my view again. It has already taken half an hour to reach this point, so I'm getting reluctant to do my very best.

Now the only thing left to do might be to close the gripper, which doesn't seem possible via Interactive Markers in RViz. I assume there's a Python I/F. But my time ran out before going to the gym. I'll resume later and update this blog.

After all, I found it hard to do pick-and-place tasks via the MoveIt! I/F in RViz for many reasons; among other things, not seeing the block in RViz makes the teaching much harder. And for the purpose of extending the existing demo, it looks far easier to utilize the demo program that's based on vision.

Here with this small change to the original fetch_gazebo package, I wrote a simple Python script mimicking the original demo.

    #!/usr/bin/env python

    # Modified from

    import rospy

    from fetch_gazebo_demo.democlients import *

    if __name__ == "__main__":
        # Create a node
        rospy.init_node("demo2")

        # Make sure sim time is working
        while not rospy.Time.now():
            pass

        # Setup clients
        move_base = MoveBaseClient()
        torso_action = FollowTrajectoryClient("torso_controller", ["torso_lift_joint"])
        head_action = PointHeadClient()
        grasping_client = GraspingClient()

        # Move the base to be in front of the table
        # Demonstrates the use of the navigation stack

        # Raise the torso using just a controller
        rospy.loginfo("Raising torso...")
        torso_action.move_to([0.4, ])

        # Look below to find the block
        head_action.look_at(0.4, -0.2, 0.5, "base_link")

        # Get block to pick
        while not rospy.is_shutdown():
            rospy.loginfo("Picking object...")
            grasping_client.updateScene()
            cube, grasps = grasping_client.getGraspableCube()
            if cube is None:
                rospy.logwarn("Perception failed.")
                continue

            # Pick the block
            if grasping_client.pick(cube, grasps):
                break
            rospy.logwarn("Grasping failed.")

        # Tuck the arm
        grasping_client.tuck()

What this script does is simply extend the torso up and look for graspable blocks (I assume a process that detects graspable objects is running internally as an ActionServer somewhere; also, the shape is pre-registered in the referenced Python module). If a graspable block is found, it gets the pose of the block and moves the hand toward it.

Hard to see but the arm is moving.

See the block is in her/his hand.

Also the sister (bro?) robot Freight seems fully ROS compatible.

    terminal-1$ roslaunch fetch_gazebo playground.launch robot:=freight
    terminal-2$ roslaunch fetch_gazebo_demo freight_nav.launch
    terminal-3$ rviz -d `rospack find fetch_navigation`/config/navigation.rviz

Freight on autonomous navigation. It moves amazingly fast and smoothly in simulation (I guess the real robot does as well, at least at the command level).

Now I'm tempted to spawn 2 robots and have them collaborate, which I have no idea how to do in ROS today.

A little post mortem: though I could use only a small part of all the functionality of Fetch and Freight today, I should say I'm delighted to see this happening! Well made as a ROS robot, easy to operate, and all the features I wanted to try are implemented already (even before it goes on sale). Looks like the 2 parents of TurtleBot rock again. I just hope that no matter how huge a success they achieve, their software remains open source, so that it takes over the role of the modern reference ROS robot that the PR2 has been serving for so long. Viva open-source robotics!

by Isaac Saito on July 19, 2015 02:16 AM

July 17, 2015
ROSCon 2015 Talks Announced


We're excited to announce a great collection of presentations that
will appear in the main track of ROSCon 2015 (also appended below):

If you like what you see there, register for ROSCon today!

Thanks again to our Platinum Sponsors: Canonical / Ubuntu and Fetch Robotics! And our Gold Sponsors: 3D Robotics, Bosch, CoroWare, GaiTech, Qualcomm, Rethink Robotics, Robotnik, and Shadow Robot!

Long presentations

  • "MoveIt! Strengths, Weaknesses, and Developer Insights" - Dave Coleman (University of Colorado Boulder)
  • "State of ROS 2 - demos and the technology behind" - Dirk Thomas (OSRF), Esteve Fernandez (OSRF), William Woodall (OSRF)
  • "Real-time Performance in ROS 2.0" - Jackie Kay (OSRF), Adolfo Rodríguez Tsouroukdissian (PAL Robotics)
  • "Bringing ROS to the factory floor: a status report on the ROS-Industrial initiative" - Mirko Bordignon (Fraunhofer IPA), Shaun Edwards (SwRI), Clay Flannigan (SwRI), Paul Hvass (SwRI), Ulrich Reiser (Fraunhofer IPA) Florian Weisshardt (Fraunhofer IPA)
  • "Commercial models for the robot generation" - Mark Shuttleworth (Canonical)
  • "An Introduction to Team ViGIR's Open Source Software and DRC Post Mortem" - Stefan Kohlbrecher (Technische Universitat Darmstadt)

Short presentations

  • "Automated Driving with ROS at BMW" - Michael Aeberhard (BMW Group Research and Technology), Thomas Kühbeck (BMW Group Research and Technology), Bernhard Seidl (BMW Group Research and Technology), Martin Friedl (BMW Group Research and Technology), Julian Thomas (BMW Group Research and Technology), Oliver Scheickl (BMW ConnectedDrive Lab, China)
  • "Working with the robot_localization Package" - Tom Moore (Charles River Analytics)
  • "ROS android_ndk: What? Why? How?" - Gary Servin (Creativa77)
  • "Accelerating Your Robotics Startup with ROS" - Michael Ferguson (Fetch Robotics)
  • "The Descartes Planning Library for Semi-Constrained Cartesian Trajectories" - Shaun Edwards (SwRI), Jorge Nicho (SwRI), Jonathan Meyer (SwRI)
  • "Phobos - Robot Model Development on Steroids" - Kai von Szadkowski (University of Bremen)
  • "ROS on DroneCode Systems" - Lorenz Meier (ETH Zurich and PX4), Roman Bapst (ETH Zurich and PX4)
  • "Introducing ROS-RealSense: 3D empowered Robotics Innovation Platform" - Amit Moran (Intel), Gila Kamhi (Intel)
  • "ROS-driven user applications in idempotent environments" - Matt Vollrath (End Point), Wojciech Ziniewicz (End Point)
  • "ROS2 on "small" embedded systems" - Morgan Quigley (OSRF)
  • "ROS + Docker: Enabling Repeatable, Reproducible, and Deployable robotic software via Linux Containers" - Ruffin White (Institute for Robotics & Intelligent Machines at Georgia Tech)
  • "ROS for education and applied research: practical experiences" - Ralph Seulin (CNRS - Univ. Bourgogne Franche-Comte), Raphael Duverne (CNRS - Univ. Bourgogne Franche-Comte), Olivier Morel (CNRS - Univ. Bourgogne Franche-Comte), Cansen Jiang (CNRS - Univ. Bourgogne Franche-Comte), Jeremie Deray (PAL Robotics), Jordi Pages (PAL Robotics), Lee Kian Seng (Universiti Teknologi Petronas), Remi Groslevin (CNRS - Univ. Bourgogne Franche-Comte), Cedric Demonceaux (CNRS - Univ. Bourgogne Franche-Comte), David Fofi (CNRS - Univ. Bourgogne Franche-Comte), Yohan Fougerolle (CNRS - Univ. Bourgogne Franche-Comte)
  • "Maru and Toru: Item-specific logistics solutions based on ROS" - Moritz Tenorth (Magazino GmbH), Ulrich Klank (Magazino GmbH), Nikolas Engelhard (Magazino GmbH)
  • "Mapviz: An Extensible 2D Visualization Tool for Automated Vehicles" - Jerry Towler (SwRI), Marc Alban (SwRI)
  • "Docker-based ROS Build Farm" - Tully Foote (OSRF), Dirk Thomas (OSRF), Dejan Pangercic (Robert Bosch), Daniel Di Marco (Robert Bosch), Arne Hamann (Robert Bosch)

by Tully Foote on July 17, 2015 02:41 PM

July 15, 2015
ROS simulation available for PMB-2 – TIAGo’s mobile base

The ROS simulation of PAL Robotics' mobile base, PMB-2, is now available and ready to download! You will find all the steps in the ROS wiki; follow them and start simulating with the mobile base!

The PMB-2 mobile base is the one used in TIAGo, the mobile manipulator. It is now also available independently, with the first units shipping in a few months, in October! PMB-2 is 100% ROS based and can be fully customized to any particular need. It is designed to work in indoor environments with a small footprint, and avoids obstacles thanks to its sensors.

PMB-2 Mobile base in Gazebo

PMB-2 has a maximum speed of 1 m/s and can drive over potholes or bumps of up to 1.5 cm. The mobile base can carry anything on top, with a payload of 50 kg. PMB-2 features our custom suspension system, allowing it to handle larger floor irregularities. Its abilities can be extended with optional add-ons. For more information about our mobile base, have a look here! You can also watch the PMB-2's first prototype in action in this video, which explains more of its features.
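On the software side, a base's rated limits are typically enforced by whatever node publishes velocity commands. A minimal plain-Python sketch of that clamping step (the 1 m/s linear limit comes from the specs above; the angular limit is a made-up illustrative value, not a PMB-2 specification):

```python
# Clamp a commanded velocity to the base's rated limits before it is
# sent to the controller. The 1 m/s linear limit is from the specs
# above; the angular limit is a hypothetical illustrative value.
MAX_LINEAR = 1.0   # m/s, PMB-2 rated maximum
MAX_ANGULAR = 2.0  # rad/s, hypothetical

def clamp(value, limit):
    """Restrict value to the symmetric range [-limit, +limit]."""
    return max(-limit, min(limit, value))

def safe_cmd_vel(linear_x, angular_z):
    """Return a (linear, angular) command that respects the base limits."""
    return clamp(linear_x, MAX_LINEAR), clamp(angular_z, MAX_ANGULAR)

print(safe_cmd_vel(1.5, -3.0))  # → (1.0, -2.0)
```

A real node would apply this to the fields of the velocity message just before publishing it.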


by Judith Viladomat on July 15, 2015 10:43 AM

July 14, 2015
ROS Driver for the IFM Efector O3D303
Cross posted from: Ros-Industrial

Love Park Robotics has announced a ROS driver for the new IFM Efector O3D303 3D camera system. The sensor was officially released in Germany on April 13, 2015. The O3D303 is a time-of-flight sensor specifically designed for use in industrial environments and automation applications. Its 176x132-element detector features a relative accuracy of +/- 4 mm. In addition to its robust design, it can operate in illumination conditions ranging from complete darkness to full sunlight. It is also affordable, at a per-unit cost of $1250 USD. A picture of the O3D303 is shown below, along with a point cloud of an imaged pallet (taken in an office environment) that highlights the quality of the sensor data.
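As background, time-of-flight sensors such as the O3D303 estimate range from the round-trip travel time of modulated light. A toy illustration of the underlying relation d = c·t/2 (just the physics, not the vendor's API):

```python
# Range from round-trip time of light: the sensor emits modulated
# light and measures how long it takes to return; distance is half
# the round trip at the speed of light.
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_seconds):
    """Distance in metres for a measured round-trip time."""
    return C * round_trip_seconds / 2.0

# A 20 ns round trip corresponds to roughly 3 m of range.
print(round(tof_distance(20e-9), 3))  # → 2.998
```

In practice such sensors measure the phase shift of the modulated signal rather than timing individual pulses, but the distance relation is the same.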


As part of our beta test period, Love Park Robotics developed a software interface to the O3D303 that allows us to utilize the sensor within software frameworks such as PCL, OpenCV, and ROS. This code has been made available as open source on GitHub in the following repositories: libo3d3xx and o3d3xx-ros. Additionally, we are working with the ROS-Industrial community to make binary Debian packages available as part of the core ROS and ROS-I distributions.

For more information see the ROS Industrial Blog Post

by Tully Foote on July 14, 2015 04:46 PM

July 13, 2015
ROS Summer School at East China Normal University

ROS Summer School 2015

July 23-26

East China Normal University, Shanghai, China

Since 2012, many college students, researchers, and engineers have been learning ROS (Robot Operating System) for their robotics projects as the robustness and maintainability of ROS keep improving. Since 2013, many robotics companies in China have realized the importance of ROS and have started recruiting ROS developers and integrating ROS into their robotic products. However, learning the ROS framework and its associated components involves a very wide range of knowledge, which requires developers not only to master software development skills, but also to be familiar with robot hardware and even the background of specific industrial applications.

For most people, learning and using ROS is a slow and painful process. Our ROS Summer School 2015 (organized by the Intelligent Robot Motion and Vision Laboratory, East China Normal University, Shanghai, China) provides a quick and in-depth learning opportunity for both ROS beginners and advanced ROS users.

On the first day, invited robotics companies present their profiles, how they use ROS in their products, and recent developments in the robot industry in China. On the second day, we start with introductory ROS courses for beginners. On the third day, we tackle the main tasks of integrating ROS with autonomous mobile robots, i.e. perception, localization, and navigation. On the fourth day, we continue with advanced topics and skills of interest to many advanced ROS users.

This ROS Summer School also includes some leisure activities, such as sharing start-up experiences and discussions between companies and job seekers. Every day, attendees have a chance to win prizes, including ROVIO robots, an iRobot Create, an Asus Xtion Pro Live RGB-D camera, and an iRobot Roomba vacuum cleaning robot. For how to win a prize and for registration, please visit our official website for details.

by Tully Foote on July 13, 2015 04:01 PM

July 09, 2015
NIST/SwRI Collaborate on Open Source Software for Robotic Assembly

Over the past 6 months, the SwRI ROS-Industrial team has been executing a Cooperative Research program with the National Institute of Standards and Technology (NIST). From a manufacturing perspective, NIST’s impact is quite diverse: it spans general process improvement, specific manufacturing processes like nano-manufacturing, and of course robotics.

A core theme of the NIST-supported ROS-Industrial program is agility: the ability of manufacturing systems to perform a diverse set of tasks, with the built-in intelligence to re-task on the fly. Agility is perhaps the greatest unrealized promise of robotics. With the support of NIST, it is this valuable and critical aspect of robotics that ROS-Industrial aims to enable. The research effort is broken down into several sub-tasks, outlined below. The tasks vary, some with immediate impact and others with more long-term goals. However, they all share the common theme of enabling robotic agility.

Robot Testing and Evaluation

Testing and evaluation (T&E) are very important for both measuring and comparing the performance of complex systems. Prior collaborative work focused on test methods for response robots (think robots climbing around piles of rubble). Through these efforts a standard test-suite for response robots was developed, one that demonstrably pushed the state of the art. With the goal of measuring and pushing the state of the art in robotic agility, SwRI is developing test methods for evaluating robots on complex tasks, such as assembly.

A peg-in-hole assembly test fixture is used to evaluate the ability of a complete robot system to perform this operation. Metrics, including success rate and speed of insertion, are captured in order to perform meaningful comparisons between systems.
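Metrics of this kind aggregate straightforwardly; a small sketch over hypothetical trial records (illustrative data, not the actual NIST test-suite code):

```python
# Aggregate success rate and insertion speed over peg-in-hole trials.
# Each trial records whether the insertion succeeded and how long it took.
trials = [
    {"success": True,  "insertion_time_s": 4.2},
    {"success": True,  "insertion_time_s": 3.8},
    {"success": False, "insertion_time_s": None},  # failed attempt
    {"success": True,  "insertion_time_s": 5.0},
]

def success_rate(trials):
    """Fraction of trials in which the insertion succeeded."""
    return sum(t["success"] for t in trials) / len(trials)

def mean_insertion_time(trials):
    """Average insertion time over successful trials only."""
    times = [t["insertion_time_s"] for t in trials if t["success"]]
    return sum(times) / len(times)

print(success_rate(trials))  # → 0.75
```

Comparing two systems then reduces to comparing these summary numbers over matched test conditions.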

Dual Arm Manipulator Development

Dual arm manipulation is an exciting area of research. Such systems mimic human operations, giving robotic systems the ability to hold an object with one arm while performing an operation with the other. With NIST support, SwRI researchers have developed ROS-Industrial Hilgendorf support software for the robot configuration shown below. The support software was open sourced to jump-start dual arm manipulation research on similar setups. The Hilgendorf system configuration can be easily assembled from off-the-shelf components. ROS-I researchers will utilize Hilgendorf for developing dual arm applications.

A dual arm manipulator was built from two UR5s and two Robotiq grippers. The system allows researchers to experiment with various assembly T&E tasks.

Calibration Library Improvements

The ROS-Industrial Calibration Library is a powerful tool for calibrating frame transformations between multiple robots and sensors. Improvements have been made to this library to make the data collection and calibration steps more streamlined. An additional goal of this effort was to evaluate the accuracy of a system calibrated with our library. System evaluation is a key part of the NIST mission. An example system with a network consisting of 6 cameras was calibrated with the ROS-Industrial Calibration Library using a target held by a UR10 robot. The system demonstrated pose variance for each camera better than 1/4 mm and 1/10th degree.
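A quoted spread like 1/4 mm can be checked by re-estimating a camera's pose repeatedly and computing the standard deviation of the estimates. A simplified sketch along one position axis, with illustrative numbers rather than measured data:

```python
import statistics

# Repeated position estimates (in mm) for one camera along one axis,
# e.g. from re-running the calibration with resampled observations.
# Values are illustrative, not measured data.
x_estimates_mm = [100.02, 99.95, 100.08, 99.98, 100.01]

# Sample standard deviation of the estimates: the "pose variance"
# figure quoted in the text is a spread of this kind.
spread_mm = statistics.stdev(x_estimates_mm)

# The post reports a spread better than 1/4 mm per camera.
print(spread_mm < 0.25)  # → True
```

The same computation applies per axis and, for orientation, per rotation angle (against the 1/10 degree figure).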

The observations and camera pose geometry were used to predict the localization accuracy of the camera network.

The accuracy map is a slice of the working volume 0.4 meters above the table.

Ontologies for Agile Planning in Manufacturing

Past (and present) robotic automation is primarily used in high volume/low variation production applications, with the exceptions being driven by safety and environmental conditions. The obstacle that steers automation away from low volume/high variation production applications is the effort associated with teaching each part. The interest is in developing an ontology structure that represents the assembly process in a way that allows automated planning and assembly tasks to be executed. Current literature approaches the problem in a way similar to how a child learns to perform a new task: there is a low-level skill set (refer to the figure below) that needs to be taught, which can then be used to complete complicated tasks. The challenge is to formulate the skill primitives in such a way that they are robot independent and can store all the information necessary for the robot to execute them efficiently. In the long term, such an ontology could enable highly dynamic and generic functionality within ROS-Industrial.
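One common way to encode such robot-independent primitives is as parameterized records that a planner can sequence; a hypothetical sketch (all names invented for illustration, not the ontology under development):

```python
from dataclasses import dataclass, field

@dataclass
class SkillPrimitive:
    """A robot-independent primitive a planner can sequence.

    The parameters dict carries whatever the executing robot needs
    (targets, forces, distances), keeping the primitive itself generic.
    """
    name: str
    parameters: dict = field(default_factory=dict)

def plan_assembly(primitives):
    """Return the ordered list of primitive names to execute."""
    return [p.name for p in primitives]

# A pick operation composed from lower-level skills.
pick = [
    SkillPrimitive("move_to", {"target": "above_part"}),
    SkillPrimitive("open_gripper"),
    SkillPrimitive("move_to", {"target": "grasp_pose"}),
    SkillPrimitive("close_gripper", {"force_n": 20.0}),
    SkillPrimitive("retract", {"distance_m": 0.1}),
]
print(plan_assembly(pick))
```

Each robot then only needs an executor that maps these generic primitives onto its own controllers.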

Skill primitive library

Descartes Joint Trajectory Planner for Semi-Constrained Cartesian Paths

This grant also supported development of the Descartes Path Planner. Please refer to our previous post for a description and video of Descartes.


This work was conducted under NIST contract #70NANB14H226.

by Paul Hvass on July 09, 2015 10:55 AM

July 02, 2015
MoveIt! goes underwater!
from MoveIt!

Dina Youakim at the Universitat de Girona used MoveIt! on an underwater Girona500 AUV robot and 4-DOF arm for autonomous underwater manipulation. This research was part of her MSc thesis work under the guidance of Pere Ridao and Narcis Palomeras at the Girona Underwater Vision and Robotics Lab within the Computer Vision and Robotics Research Institute. Watch the movies below to see how MoveIt! was used in this exciting new project:

Free-Floating Autonomous Underwater Manipulation: Connector Plug/Unplug

Free-Floating Autonomous Valve Turning in Presence of Virtual Obstacles

More information about the Girona500 AUV robot can be found in this publication:

D. Ribas, N. Palomeras, P. Ridao, M. Carreras and A. Mallios. Girona 500 AUV, from survey to intervention. IEEE/ASME Transactions on Mechatronics, 17(1):46–53, February 2012.


by Sachin Chitta on July 02, 2015 06:15 PM

Reminder: ROSCon CFP closes on July 7th

This is a friendly reminder that the ROSCon call for proposals is open until July 7th. ROSCon talks are a great opportunity to share your work with the community. Submit your proposals at:

Full text of the CFP is below.

If you're planning to attend, registration is also open: And if you're in the US, a few people have pointed out that United is having a sale this week on flights to Europe in the fall. Details are at:

ROSCon 2015 Call for Proposals

Presentations and tutorials on all topics related to ROS are invited. Examples include introducing attendees to a ROS package or library, exploring how to use tools, manipulating sensor data, and applications for robots.

Proposals will be reviewed by a program committee that will evaluate fit, impact, and balance.

We cannot offer sessions that are not proposed! If there is a topic on which you would like to present, please propose it. If you have an idea for an important topic that you do not want to present yourself, please post it to

Topic areas

All ROS-related work is invited. Topics of interest include:

  • Best practices
  • Useful packages and stacks
  • Robot-specific development
  • ROS Enhancement Proposals (REPs)
  • Safety and security
  • ROS in embedded systems
  • Product development & commercialization
  • Research and education
  • Enterprise deployment
  • Community organization and direction
  • Testing, quality, and documentation
  • Robotics competitions and collaborations

Proposal submission

A session proposal should include:

  • Title
  • Recommended duration: Short (~20 minutes) or Long (~45 minutes)
  • Summary, 100 word max (to be used in advertising the session)
  • Description (for review purposes): outline, goals (what will the audience learn?), pointers to packages to be discussed (500 Words Maximum)

by Tully Foote on July 02, 2015 01:00 AM

June 29, 2015
Descartes Joint Trajectory Planner for Semi-Constrained Cartesian Paths

Current MoveIt!/ROS path planners are focused on collision-free pick and place applications. In the typical pick and place application, the starting and goal positions and collision models are the only inputs to the planner. By contrast, many industrial applications must follow a pre-defined Cartesian path, where the path in between matters as well. Some common examples are blending, painting, machining, sanding, sealing, and welding. Unfortunately, solving the Cartesian path planning problem by simply applying an inverse kinematics solution results in an artificially limited solution set that doesn't take advantage of the process flexibility/tolerance allowances. In reality, Cartesian paths are typically semi-constrained. For example, in a machining application a five degree-of-freedom (DOF) path is required, where the sixth DOF, the orientation about the tool, is left undefined (it doesn't matter). Joint trajectory planners that fail to take advantage of these open constraints, such as inverse kinematics (IK) based planners, limit the likelihood of finding a valid solution, even though one could exist in the semi-constrained space. The Descartes planner library was initiated in Summer 2014 with NIST and ROS-Industrial Consortium Americas support to address semi-constrained Cartesian industrial processes. Descartes has already been demonstrated in robotic routing and blending/sanding applications. Key capabilities of Descartes include path optimization, collision avoidance, near-instantaneous re-planning, and a plug-in architecture.
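The idea behind such a planner can be illustrated in miniature: discretize the free DOF at each waypoint into candidate joint solutions, connect consecutive waypoints into a graph, and search for the path with minimum total joint motion. A toy dynamic-programming sketch with single-joint candidates (an illustration of the concept only, not the Descartes API):

```python
# Each waypoint has several candidate joint solutions (here a single
# joint value per candidate); sampling the free tool DOF is what
# creates the choices. Pick the sequence minimizing total joint motion.
def plan_min_motion(waypoints):
    """Return the joint sequence with minimum total joint motion.

    waypoints: list of lists; waypoints[i] holds the candidate joint
    values (one joint, for simplicity) satisfying waypoint i.
    """
    costs = [0.0] * len(waypoints[0])
    back = []  # back[i][j]: best predecessor index at waypoint i+1
    for prev, cur in zip(waypoints, waypoints[1:]):
        new_costs, pointers = [], []
        for q in cur:
            best = min(range(len(prev)), key=lambda k: costs[k] + abs(q - prev[k]))
            new_costs.append(costs[best] + abs(q - prev[best]))
            pointers.append(best)
        costs, back = new_costs, back + [pointers]
    # Recover the cheapest path by walking the back-pointers.
    j = min(range(len(costs)), key=costs.__getitem__)
    path = [j]
    for pointers in reversed(back):
        j = pointers[j]
        path.append(j)
    path.reverse()
    return [wp[i] for wp, i in zip(waypoints, path)]

# Three waypoints, each with candidate joint values from sampling the
# free tool DOF; the planner picks the smooth sequence 0.1 -> 0.2 -> 0.3.
print(plan_min_motion([[0.1, 2.0], [1.5, 0.2], [0.3, 3.0]]))
```

An IK-based planner that committed to one solution per waypoint could easily pick the jerky alternatives; searching over all candidates is what exploits the open constraint.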

The Descartes library saw its first use in early 2015 and was alpha-released at the ROS-Industrial Community Meeting at ICRA 2015 on May 26, 2015. The focus of recent development has been on making the library more user-friendly, better able to capture process requirements, and more computationally efficient. A recent addition with a strong impact on all of these areas is consideration of process velocity. Descartes can use this extra knowledge to improve its search for the optimal process path.

At the time of the ROS-Industrial Community meeting in January, a 6-DOF robot following a semi-constrained (5-DOF) Cartesian path of approximately 800 points took 30 seconds to plan. Today that same path can be solved in a fraction of a second. A specific implementation for the robot blending application has seen speed increases of a factor of 1000 compared to testing in January. Looking toward the future, Descartes will continue to see improvements to its usability and performance. Active areas of research and development include high degree-of-freedom (>7 DOF) planning for both single and dual arm configurations, and hybrid planning, where free-space motions (such as those found in a pick and place application) are combined with well-defined process paths.

Descartes documentation can be found at the ROS wiki. For working examples, please refer to the Descartes tutorials and ROS-I Training Session 4.

by Paul Hvass on June 29, 2015 04:09 PM

June 24, 2015
ROS Answers Cleanup Week
From David Lu!! via ros-users@

I am unofficially declaring this week ROS Answers Cleanup Week!

I encourage everyone to do the following between now and the end of June:
1) Log into
2) Click your screenname at the top of the screen to view your profile.
3) Examine the list of questions you've asked to find questions that
don't have an accepted answer or are still open.
4a) Close the questions that are outdated.
4b) Accept an answer if a decent answer exists.
4c) As a last resort, if you're still desperate for answers, update the question with relevant information that might help it get answered.
5) Profit! (and/or the satisfaction of making this central tool to our
community a little cleaner)

And if you don't have any unanswered questions please take a few minutes to answer a few for others. 

by Tully Foote on June 24, 2015 09:48 PM

Learning ROS for drones, episodes 2 and 3
From Victor Mayoral Vilches via ros-users@

As announced a few days ago, we published episodes 2 and 3 of our Learning ROS series. Over these two new episodes we explain how to create a ROS package that allows a drone to autonomously take off, do stuff (or idle, as in our case) and land (source code). Watch the last part of the video (minute 8:58), where we show a live demo of the code developed during the session.
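The takeoff/idle/land sequence amounts to a small state machine; a pure-Python sketch of that control flow (no ROS dependencies, and all names here are illustrative, not the episode's actual code):

```python
# Minimal takeoff -> idle -> land mission sequencer. A real node would
# publish the matching flight commands and check vehicle feedback
# instead of counting ticks.
class Mission:
    def __init__(self, idle_ticks=3):
        self.state = "TAKEOFF"
        self.idle_ticks = idle_ticks  # how long to "do stuff" (idle)
        self.log = []

    def step(self):
        """Advance the state machine by one control tick."""
        self.log.append(self.state)
        if self.state == "TAKEOFF":
            self.state = "IDLE"        # assume takeoff completed
        elif self.state == "IDLE":
            self.idle_ticks -= 1
            if self.idle_ticks == 0:
                self.state = "LAND"
        elif self.state == "LAND":
            self.state = "DONE"

    def run(self):
        while self.state != "DONE":
            self.step()
        return self.log

print(Mission().run())
```

In a ROS node, each `step` would run from a timer callback, with state transitions gated on the autopilot's reported status rather than on the tick counter.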

We've had good feedback so far, with several comments pointing out issues with the audio quality. We'll try to fix this in future videos. Feel free to throw out ideas on what kind of content you'd like to see explained that could be helpful for your research or classes.

by Tully Foote on June 24, 2015 05:46 PM
