April 16, 2014
ROS Drinks - London
From Dan Greenwald

Calling all London ROS Users! You are warmly invited
to the first London Robotics Network "ROS Drinks",
on the evening of 23/4/14. Put another way, come and
meet up for beer, food, and talk about robots.

We thought we should have a regular meetup to talk about
ROS and robots in general. This month we'll be at the
Craft Beer Co in Angel (55 White Lion Street N1 9PP)
from about 6pm in the back room.

The pub does food as well as really excellent beers.

Here is the flyer for the event on the LRN group.

Feel free to contact Dan Greenwald (dg@shadowrobot.com)
for more info/any questions.

Hope to see you there.

by Ugo Cupcic on April 16, 2014 04:23 PM

April 10, 2014
PhaROS-Based Tracker Robot
Xuan Sang Le is a PhD student who joined our team in mid-February. His work, co-supervised by École des Mines and ENSTA, is about speed optimization of Smalltalk robotic software by means of FPGAs. The first step is to develop an application fully in Smalltalk and our PhaROS robotics framework that will serve as… Continue reading

by noury on April 10, 2014 01:31 PM

New package: Frontier Exploration
From Paul Bovbel via ros-users@

Hello ros-users,

This package implements frontier exploration using an action server (explore_server), that can be controlled from rviz via explore_client, or directly from other nodes.

When starting out with ROS, I was frustrated that there was no (maintained) exploration package that worked solely using the core ROS APIs (i.e. navigation).

Internally, this package contains a custom costmap_2d layer plugin that could be adapted for more complex exploration strategies.
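For readers unfamiliar with the technique, a frontier is a free cell that borders unknown space; the explorer repeatedly drives toward frontiers until none remain. Here is a toy sketch of frontier detection on an occupancy grid (an illustration of the idea only, not the package's code; cell values follow the nav_msgs/OccupancyGrid convention of -1 unknown, 0 free, 100 occupied):

```python
# Toy frontier detection on an occupancy grid. Cell convention follows
# nav_msgs/OccupancyGrid: -1 = unknown, 0 = free, 100 = occupied.
# Illustrative sketch only, not frontier_exploration's implementation.

def find_frontiers(grid):
    """Return (row, col) of free cells that touch unknown space."""
    rows, cols = len(grid), len(grid[0])
    frontiers = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != 0:          # only free cells can be frontiers
                continue
            neighbors = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            if any(0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == -1
                   for nr, nc in neighbors):
                frontiers.append((r, c))
    return frontiers

grid = [
    [0,   0, -1],
    [0, 100, -1],
    [0,   0,  0],
]
print(find_frontiers(grid))  # → [(0, 1), (2, 2)]
```

A real exploration node would then pick a frontier (e.g. the nearest or the largest cluster) and send it to the navigation stack as a goal.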

Please email or post any feedback, comments or concerns!

by Tully Foote on April 10, 2014 01:25 AM

April 08, 2014
New version of STDR Simulator
From Manos Tsardoulias via ros-users@

The current version of STDR Simulator is 0.1.3! The changes compared to v0.1.0 follow:
  • Full support of robots with polygonal footprint
  • Zoom in STDR GUI is also performed with the mouse wheel
  • Fixed saving and loading robots and sensors from the Robot Creator in GUI
  • Added odometry publisher
  • Added robot-to-obstacles collision check
Special thanks to trainman419 for contributions to:
  • the polygonal robot support and the odometry publisher
  • GUI makefiles
  • writing a tutorial on robot teleoperation with STDR using teleop_twist_keyboard.

The next version (v0.1.4) will include full support of RFID tags and RFID reader sensors.
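As an aside for the curious, a collision check against a polygonal footprint like the one mentioned above typically reduces to point-in-polygon tests. A toy ray-casting sketch (illustrative only, not STDR's actual implementation):

```python
# Toy 2-D point-in-polygon test of the kind a robot-to-obstacle
# collision check performs against a polygonal footprint.
# Illustrative sketch only; STDR's implementation may differ.

def point_in_polygon(px, py, polygon):
    """Ray casting: is (px, py) inside the polygon (list of (x, y))?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge cross the horizontal ray going right from the point?
        if (y1 > py) != (y2 > py):
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

footprint = [(0, 0), (2, 0), (2, 1), (0, 1)]   # rectangular robot footprint
print(point_in_polygon(1.0, 0.5, footprint))   # → True  (obstacle inside)
print(point_in_polygon(3.0, 0.5, footprint))   # → False (obstacle clear)
```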

by Tully Foote on April 08, 2014 06:16 PM

April 07, 2014
NooTriX posted Hydro VM Image
From Nootrix via ros-users:

We finally managed to make a virtual machine with Hydro. As usual, it's freely available for download at:

NooTriX Team

by Tully Foote on April 07, 2014 09:16 PM

April 02, 2014
ROS user survey: the results are in

The results are in from the January 2014 ROS user survey. Thanks to everyone who participated!

We had a total of 336 responses. We'll walk through the questions, one at a time:

In general, for what do you use ROS?


Not surprisingly, the lion's share of ROS users consider themselves to be doing research. That's where we started, and we expect to continue to see high participation in the research community. But we also see about 1/3 of respondents classifying themselves in education and 1/3 in product development, with a smaller share of self-identified hobbyists. Those are all areas for future growth in ROS usage.

What about ROS convinced you to use it?


Interestingly, the top response here is the communications system. When we set out to build ROS, we started with the communications system, because we believe that robotics problems are most naturally solved by developing distributed systems, and further that developing those systems is hard, requiring solid, easy to use tools. It looks like our users appreciate the effort that's been put into ROS middleware.

Also near the top are what we can call the "healthy open source project" benefits: friendly licensing, helpful community, and playing nicely with related open source projects.

How do you primarily use ROS?


Most users are working with a single robot, but a substantial number of people are working with multiple robots, which was outside the initial design of ROS. Multi-robot support definitely needs improvement, but clearly people are already getting something out of ROS in multi-robot environments.

With what type(s) of hardware do you use ROS?


At least in part because most robots in the world (or at least in research labs) are basically cameras and/or lasers on wheels, we see most of our users working on those platforms. But we also see a fair number of people working with arms and hands, and we expect that the number of legged systems will grow in the future.

Have you shared and/or released your own ROS packages?


Here we see a familiar pattern in open source development: most users don't share their code with the community. That's OK with us, because we know that not everybody is in a position to share their code (for example, commercial users who are building ROS-based products). But if you can share code, please do!

Which ROS packages are most important to you?


Here, we have some clear winners. Visualization is important: rviz is a critical piece of infrastructure in our community, and the rqt library of visualization components is also heavily used. Also highly ranked are planning libraries (navigation and MoveIt!), perception libraries (PCL and OpenCV), coordinate transform management (tf), and simulation (Gazebo). Interestingly, we see the OpenNI driver in the top ten, perhaps reflecting the long-standing connection between ROS and Kinect-like devices, dating back to the ROS 3D Contest.

Where should future ROS development focus?


Less clarity here; basically we should do more of everything.

What is your top priority for future ROS development?

The free-form answers we received in response to this question are challenging to quantify. At a high-level, here's a qualitative distillation of common themes, in no particular order:

  • more / better documentation
  • more / better / more up-to-date tutorials
  • improved usability
  • greater stability, less frequent releases
  • better multi-master / multi-robot support
  • consolidation of related parts into coherent wholes
  • better / more mature middleware
  • better / more attentive maintenance of core libraries and tools
  • add features and fix bugs in rqt
  • get to "production quality"
  • IDE support
  • real time support

Would you be willing to anonymously report usage statistics?


About 1/2 of respondents are willing to install a plugin to roscore that would track and anonymously report usage statistics, which would let us automatically collect data on which packages, nodes, launch files, etc. are most heavily used. Any volunteers to write that plugin?

by Brian Gerkey on April 02, 2014 07:20 PM

April 01, 2014
A Universal Pendant Could Elucidate the Interface to Industrial Robot Manipulators

A guest post by Dr. Mitch Pryor, UT Austin Nuclear and Applied Robotics Group

The ROS-Industrial Consortium Americas held its 2014 members meeting at SwRI in San Antonio, Texas on March 6th. One of the primary activities of the Consortium is to establish the technical vision and requirements for ROS-Industrial. This is done through a series of requirements gathering and analysis activities known as roadmapping. This blog provides a useful forum for sharing ideas on the proposed ROS-I roadmap and gives members a chance to succinctly present thoughts on particular topics and receive feedback from all stakeholders via constructive comments. The roadmap development approach presented by Clay Flannigan (and Steve Jobs) starts with the end-user's needs (i.e. applications). Once identified, as many were at the ROS-I spring meeting, the roadmap then pinpoints the technical gaps and puts forward an implementation plan to develop the envisioned technologies.

I want to start a discussion on what "commands" hardware must reliably execute to follow the desired trajectories and/or apply prescribed forces for a given application.  In the traditional paradigm, such commands are communicated via a teach pendant or offline programming.

A teach pendant is a handheld controller that provides the primary conduit for moving the robot and recording positions. Traditionally, it is used to sequentially teach the end-effector (EEF) locations associated with a given task. This instruction method is insufficient for ROS-I to extend the advanced (i.e. advancing the autonomy or flexibility of a system) capabilities of ROS to new industrial applications. Offline programming offers more flexibility, but there is no standard language or set of capabilities offered among hardware vendors. What is needed is a universal Application Program Interface (or universal API) with as much of the functionality as possible accessible via a Universal Pendant.

Traditional Dedicated Industrial Robot Teach Pendant. Source: SwRI 2011


Mobile HMI: Notional Universal (i.e. interoperable) Pendant. Source link.


The notion of a universal pendant is not new. Toyota developed an internal unified teach pendant in 2000. Its development did more than reduce the training time for Toyota operators; it helped Toyota define the underlying capabilities that robotic vendors must provide. The Toyota unified pendant currently does not provide access to all of the capabilities envisioned by ROS-I members. If the ROS-I Consortium were to develop a similar, but more advanced device, it would help clarify and illustrate many of the API capabilities that are needed by industry.  Its development would help clarify API ambiguities and hopefully reduce the barrier to entry to much of the API functionality in an industrial setting.

What would such a teach pendant look like? What core functionality should it have? As developers, the second question is more important to answer. It certainly must provide access to the internal state of the robot (e.g. tool location, current joint positions, motor currents, operational status). It should be possible to modify individual joint positions as well as command joint velocities. Many advanced technologies would require access to joint torques and/or joint currents. Another useful feature would be to directly prompt a given robotic system for its mass, inertial, and/or compliance parameters, which are necessary in many advanced control algorithms. Remote systems should provide battery life information, which is necessary to plan extended tasks. Another interesting option would be access to any internal, extensible wiring harness. One could even envision a universal messaging service for commanding hardware via existing proprietary languages. As the ROS-I Consortium develops new capabilities, such a service may become obsolete, but the universal API should not negate existing system capabilities.
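To make the wish list above concrete, the capabilities could be collected into an interface definition that vendors implement to whatever depth they support. The following Python sketch is purely hypothetical; every name in it is invented for illustration and nothing here is a proposed standard:

```python
# Hypothetical sketch of a slice of the "universal API" discussed above.
# All class and method names are illustrative, not a proposed standard.
from abc import ABC, abstractmethod

class UniversalRobotAPI(ABC):
    """Minimal interface a vendor might expose to a universal pendant."""

    @abstractmethod
    def joint_positions(self):
        """Current joint positions, in radians."""

    @abstractmethod
    def command_joint_velocities(self, velocities):
        """Command per-joint velocities; vendors may reject unsupported calls."""

    @abstractmethod
    def operational_status(self):
        """e.g. 'idle', 'moving', 'e-stopped'."""

    def battery_level(self):
        """Optional capability: remote systems report battery state."""
        return None  # 'unknown' for robots without a battery

# A vendor may implement the interface only partially:
class DemoArm(UniversalRobotAPI):
    def __init__(self):
        self._q = [0.0, 0.0]           # two-joint toy arm

    def joint_positions(self):
        return list(self._q)

    def command_joint_velocities(self, velocities):
        # Integrate velocities over one 10 ms control tick.
        self._q = [q + v * 0.01 for q, v in zip(self._q, velocities)]

    def operational_status(self):
        return "idle"

arm = DemoArm()
arm.command_joint_velocities([1.0, -1.0])
print(arm.joint_positions())  # → [0.01, -0.01]
```

The point of such a definition is exactly what the post argues: it makes "partial implementation" explicit (here, `battery_level` has a default), so ambiguities surface as concrete method signatures rather than prose.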

Once the API is defined, it may not be possible to expose all functionality in a traditional pendant. Innovative ideas may be necessary if the full API is to be exposed. Even then, certain functionality may still require writing code. The definition and scope of such an API is not trivial.  All parties (end-users, integrators, vendors, researchers, etc.) need to assist in its creation. Once developed, hardware vendors must have the right to implement the proposed functionality only partially. But our goal must be to develop an API that enables all the technologies proposed in the roadmap and to make as much of the API as possible accessible to traditional (i.e. no command line!) end-users. A universal pendant would help address this and provide a mechanism for precisely illustrating and resolving ambiguities in the proposed API.

by Paul Hvass on April 01, 2014 11:36 PM

Call for Participation: ROS Kong 2014

We're pleased to announce that registration is now open for ROS Kong 2014, an international ROS users group meeting, to be held on June 6th at Hong Kong University, immediately following ICRA:



This one-day event, our first official ROS meeting in Asia, will complement ROSCon 2014, which will happen later this year (stay tuned for updates on that event).

Register for ROS Kong 2014 today: https://events.osrfoundation.org/ros-kong-2014/#registration
Early registration ends April 30, 2014.

ROS Kong 2014 will feature:
  • Invited speakers: Learn about the latest improvements to and applications of ROS software from some of the luminaries in our community.
  • Lightning talks: One of our most popular events, lightning talks are back-to-back 3-minute presentations that are scheduled on-site. Bring your project pitch and share it with the world!
  • Birds-of-a-Feather (BoF) meetings: Get together with folks who share your specific interest, whether it's ROS in embedded systems, ROS in space, ROS for products, or anything else that will draw a crowd.

To keep us all together, coffee breaks and lunch will be catered on-site. There will also be a hosted reception (with food and drink) at a classic Hong Kong venue at the end of the day. Throughout the day, there will be lots of time to meet other ROS users both from Asia and around the world.

If you have any questions or are interested in sponsoring the event please contact us at roskong-2014-oc@osrfoundation.org.

Sincerely,
Your ROS Kong 2014 Organizing Committee: Tully Foote, Brian Gerkey, Wyatt Newman, Daniel Stonier

by Tully Foote on April 01, 2014 02:20 AM

March 27, 2014
Intermodalics Uses ROS-I for Palletizing Application

From Bert Willaert of Intermodalics.

Intermodalics is currently developing a depalletizing application for a client. The goal is to move an average of 2,000 crates per hour from standard pallets to a conveyor belt. Additional challenges: more than 10 different crate types occur in varying colors, the crates are not necessarily empty, and they are randomly stacked.

The application consists of a UR10 robot from Universal Robots, a 3D camera, an Intermodalics Intelligent Controller (IIC) and an active pallet lift. The software for the application running on the IIC extensively uses ROS and the OROCOS toolchain. OROCOS is a software framework for real-time, distributed robot and machine control which is seamlessly integrated with ROS and has both industrial and academic users worldwide.

For finding the crates' position and orientation, Intermodalics developed a crate localizer that builds upon the PCL library as well as on a set of in-house developed point-cloud processing algorithms. The ROS visualization tool RViz proved absolutely invaluable during the realization of this product locator.

The use of the ROS-Industrial package for the UR robot allows both the motions and the application state machine to be simulated. This significantly facilitates the implementation of the whole application.

The integration of the UR controller and the IIC does not affect the inherent safety feature of the UR robot which makes the robot stop if it encounters excessive forces. If such a stop occurs, the application can be easily restarted by a simple human operator intervention.

by Tully Foote on March 27, 2014 10:50 PM

March 26, 2014
Announcing a ROS Japan Users Group Meetup
From Daiki via ros-users@

Content: Explanation of the concept of ROS
Organizer: ROS Japan User's Group and Mamezou Inc.
Number of participants: 30
Venue: 2-7-1 Nishi Shinjuku, Shinjuku City, Tokyo
Date: April 12, 2014, 13:30-18:30
Twitter hashtag: #rosjp

Scheduled to be held every month.

by Tully Foote on March 26, 2014 05:24 PM

March 25, 2014
ROS Indigo Igloo buildfarm running
We're pleased to announce that the ROS build farm for Indigo Igloo is now available.  It already includes over 180 packages for Ubuntu 13.10 (Saucy Salamander) and Ubuntu 14.04 (Trusty Tahr). We expect that number to continue to grow rapidly. Installation instructions already exist for Ubuntu using debians or for compiling from source [1], and you can see the status of Indigo packages on this page:

If you are a maintainer please look at what packages have been released and consider releasing yours as soon as your upstream dependencies have been satisfied.  If you are blocked on another package being released please contact the maintainer.  And if you cannot reach the maintainer please email ros-release@lists.ros.org (join if you aren't a member already).

If you are planning to release into Indigo please read the information provided in the migration guide [2] and refer to the bloom tutorials [3] for doing the release. Please also contribute to the migration guide for updates relating to your package.

After releasing your packages, the build farm will keep you notified of the status of the individual jobs. Please pay attention to the automated emails from the buildfarm; failing jobs block downstream packages from releasing and waste our build resources.

by Tully Foote on March 25, 2014 08:44 PM

Job Opening at Clearpath Robotics
From Ryan Gariepy via ros-users@

Position:               Multi-Robot Autonomy Engineer
Location:              Kitchener, Ontario
Experience:          1-5 Years Relevant Work Experience
Education:            Graduate Degree in Related Field

About Us

Clearpath Robotics Inc. specializes in the design and manufacture of
unmanned vehicle systems, software, and components for academic and
industrial research and development.  Our clients range from small
local businesses to some of the best known technical institutions on
the planet.  Based in Kitchener-Waterloo, Clearpath Robotics employs
highly talented people who live and breathe robotics.  We believe that
work must have a high "cool" factor, and we're looking for people who
share in our passion to create remarkable products and change the world.

About the Job

We require robust implementations of the latest multi-agent control
and planning algorithms that can function within the constraints of an
unstructured environment, real-world motion dynamics and sensing
constraints. We've been building robots for a while and our clients
are now asking for more than one of our robots to work together.

You will stay on top of recent developments in multi-agent control and
planning. You will continually evaluate how these algorithms will
benefit our current customers and product offering. Additionally, you
will have to figure out methods to organically incorporate multi-agent
autonomy into the autonomy features currently offered on our robots.
This includes appropriately interfacing with advanced control and
perception algorithms. Finally, you will be field testing these
algorithms to ensure robustness on the field and in real applications.
You will be spending warm summer days driving robots around outside
(cold winters too; this is Canada after all).

Your primary responsibilities will be:
*    Multi-agent controller design and optimization for autonomous
vehicles with varying dynamics
*    Multi-agent simulation development
*    Algorithm prototyping and implementation

Additional tasks may include:
*    Developing & carrying out system test plans
*    General software development & testing
*    Mentoring and assisting with supervision of interns
*    Explaining our newest shiny toys to the sales & marketing team

About You

You want to work for a small company that thinks big and dreams huge.
You are driven, view work as more than just a job, and are never
satisfied with a project left half-done.  You want to be surrounded by
people like you; creative, fun-loving, and passionate about their
work.  You are motivated by making an impact on your workplace and you
thrive on challenging and rewarding problems.   Oh, and you have some
form of higher education with the common sense to back it up.

Required Technical Skills:
*    Graduate degree in engineering or a related field, with
applicable background
*    Practical knowledge of multi-agent planning and control based in a
(primarily) centralized framework
*    Working knowledge of decentralized decision making and/or swarm
*    Strong software development skills (C, C++, Python preferred)
*    Proficiency with Linux
*    Hands-on experience with autonomous systems

Desired Soft Skills:

*    Ability to efficiently and clearly communicate ideas, including
to those who may have a limited theoretical background in the area
*    Comfortable with abrupt changes to project deadlines, job
responsibilities and the local gravity field

Bonus points for:

*    ROS, MATLAB, LabVIEW, Gazebo, or Player experience
*    Multi-agent networking or mesh network experience
*    Understanding of sensors and their error models, particularly
laser rangefinders, GPS systems, and vision systems
*    Experience with the control of skid-steer and differential drive
ground vehicles
*    Ability to perform general hands-on troubleshooting of
electromechanical systems
*    Exposure to SLAM and vehicle control methodologies
*    Ability to diagnose broken robots by their sounds and smells

What Now?

Apply through our online job portal using this link:
http://jobsco.re/1eu0CGl. Please submit a cover letter along with your
resume. Instructions for sending supporting documentation, including
testimonials as well as conference papers, journal articles, source
code, portfolio media, references, or other indications of exceptional
past work will be provided in the confirmation email sent by our
system upon receiving your application. Please include "Multi-Robot
Autonomy Engineer" in the subject of any further communications. If
your skills don't fit this job description, but you're still
interested in working with us please apply to our "General Robotics
Enthusiast" position. No recruiters or form cover letters, please.
They do not please our mechanical masters.

by Tully Foote on March 25, 2014 07:56 PM

March 24, 2014
New Package nav2d
From Sebastian Kasperski via ros-users@

Hello ROS users,
I would like to share a set of ROS packages that provide nodes for autonomous exploration and map building for mobile robots moving in planar environments. More information and some help can be found in the ROS-Wiki:
The source is available via Github:
It contains ROS nodes for obstacle avoidance, basic path planning, and graph-based multi-robot mapping using the OpenKarto library. Autonomous exploration is done via plugins that implement different cooperation strategies; additional strategies should be implementable with only a little overhead.
These nodes have been used on a team of Pioneer robots, but other platforms should work as well. A set of ROS launch files is included to test the nodes in a simulation with Stage. Please feel free to try it and post issues on GitHub.

by Tully Foote on March 24, 2014 09:53 PM

March 20, 2014
Software Engineer at Exciting 3D Mapping Startup

From Ryan Thompson via ros-users@

Quanergy is a Silicon-Valley-based startup developing smart sensing solutions for real-time 3D mapping and object detection, tracking, and classification. We're a small company run by engineers, dedicated to building next-generation LiDAR technology for autonomous vehicles and advanced driver assistance systems. By joining our team at this point, you'll play a key role in the development of our company, not just our software. We're looking for someone extremely bright, driven, a great communicator and explainer, and just as passionate about the future of transportation and perception as we are!

Job Description:

The Software Engineer at Quanergy will be responsible for designing, developing, and maintaining our map data structure and access system, and for parallelizing localization on a GPU, all based on point cloud data generated by our next-generation LiDAR sensors. They will work closely with co-workers to test and optimize code for real-time application on the embedded CPU and GPU, keep current with the latest research and advances in the field, help shape the direction of the software side of the company, and contribute to the sensor integration, mapping, and perception efforts of the software team.

Requirements:
  • B.S., M.S., or Ph.D. in Computer Science, Electrical Engineering, or a related field

  • Fluency in C++ and Linux

  • CUDA (or OpenGL) expertise

  • Strong mathematical foundation

  • Willingness and ability to tackle problems outside his/her areas of expertise

  • Academic or professional experience in at least one of: Robotics, Parallel Programming, Real-Time Embedded Systems, Game Development

Preferred Qualifications:
  • ROS and/or PCL familiarity

  • Experience with optimization for real-time computing

  • Able to start immediately


Quanergy offers very competitive Silicon Valley salaries and equity.


Email ryan.thompson@quanergy.com for more information. To apply, email a résumé and cover letter, or apply on Stack Overflow: http://careers.stackoverflow.com/jobs/51242/software-engineer-at-exciting-3d-mapping-startup-quanergy-systems-inc

by Ugo Cupcic on March 20, 2014 06:42 PM

March 18, 2014
HERE mapping cars run ROS

As reported at HERE Three Sixty, their global fleet of hundreds of mapping cars is running ROS!

HERE car

They carry laser range-finders, cameras, and GPS that are used to estimate the vehicle's position and gather 3-D pictures of the surrounding environment. That data gets shipped back to their headquarters for processing.

As HERE's Michael Prados put it, "The system of sensors and computers means the software that's needed is very like that which is used to create robots." So they decided to build their cars' software on ROS. The software runs on a headless server in the car's interior, with the driver interacting via a mobile application on a tablet that he or she can operate easily from the seat.

HERE car interior

"We chose the open source ROS because it was the best solution, hands-down," Michael concludes. "And now we're looking into the ways that we might give back to OSRF, and help its future success."

Read the whole story at HERE Three Sixty.

by Tully Foote on March 18, 2014 04:26 PM

"Mirror" of mirror sites of ros.org
Copied from http://www.ros.org/wiki/Mirrors on Dec 25, 2012 (merry Christmas!)

  • Europe:
  • North America:


by Isaac Saito (noreply@blogger.com) on March 18, 2014 01:06 AM

March 17, 2014
New Package: catkin_lint
From Timo Röhling via ros-users@

I have created a tool to check catkin packages for common build
configuration errors. I announced it to the ROS Buildsystem SIG a while
ago, and I think it is ready for public scrutiny:

Source: https://github.com/fkie/catkin_lint
PyPI Package: https://pypi.python.org/pypi/catkin_lint
Ubuntu PPA: https://launchpad.net/~roehling/+archive/latest

It runs a static analysis with a simplified CMake parser. Among the
checks are order constraints of macros, missing dependencies, missing
files, installation of targets and headers, and a few other things. The
checks are inspired by the catkin manual and issues I encountered in my
daily work routine.
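To give a flavor of what a macro order constraint looks like, here is a toy sketch of one such check: flagging a CMakeLists.txt that calls catkin_package() before find_package(catkin ...). The function name and the regex approach are purely illustrative; catkin_lint's real parser is far more thorough:

```python
import re

# Toy ordering check in the spirit of catkin_lint: catkin_package()
# must come after find_package(catkin ...). Illustrative sketch only;
# the real tool uses a proper simplified CMake parser, not regexes.

def check_macro_order(cmakelists_text):
    errors = []
    find_pos = None
    for i, line in enumerate(cmakelists_text.splitlines(), start=1):
        if re.match(r"\s*find_package\s*\(\s*catkin\b", line):
            find_pos = i
        if re.match(r"\s*catkin_package\s*\(", line) and find_pos is None:
            errors.append(
                f"line {i}: catkin_package() before find_package(catkin)")
    return errors

bad = "catkin_package()\nfind_package(catkin REQUIRED)\n"
good = "find_package(catkin REQUIRED)\ncatkin_package()\n"
print(check_macro_order(bad))   # → ['line 1: catkin_package() before find_package(catkin)']
print(check_macro_order(good))  # → []
```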

Give it a try and feel free to post any issues on Github.

by Tully Foote on March 17, 2014 05:38 PM

March 14, 2014
New Package: ROS Glass Tools
From Adam Taylor via ros-users@

We would like to announce ros_glass_tools, an open source project that aims to provide easy voice control, topic monitoring, and background alerts for robot systems running ROS using Google Glass. It communicates with ROS using the rosbridge_suite.
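For context, rosbridge exchanges JSON messages over a WebSocket, so a non-ROS client such as a Glass app subscribes to a topic by sending a small JSON request (an "op": "subscribe" message in the rosbridge v2 protocol). A minimal sketch of constructing such a request; the topic and message type below are made-up examples:

```python
import json

# rosbridge v2 speaks JSON over a WebSocket. A client subscribes to a
# topic with an "op": "subscribe" request. The topic name and message
# type here are illustrative examples, not ros_glass_tools's own.

def make_subscribe(topic, msg_type):
    return json.dumps({"op": "subscribe", "topic": topic, "type": msg_type})

msg = make_subscribe("/battery_state", "std_msgs/Float32")
print(msg)  # → {"op": "subscribe", "topic": "/battery_state", "type": "std_msgs/Float32"}
```

Sending that string over the rosbridge WebSocket would cause the server to start forwarding messages published on the topic back to the client as JSON.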

More information about the tools can be found at the following links.

by Tully Foote on March 14, 2014 10:15 PM

March 13, 2014
Crossposted from www.osrfoundation.org

Albert II is famous for being the first monkey in space, in June 1949. Laika is equally renowned for being the first animal to orbit the Earth, in 1957. On Sunday, March 16th, at 4:41am (unless inclement weather intervenes), ROS will celebrate its own celestial milestone when it is launched into space aboard a SpaceX rocket as part of a resupply mission to the International Space Station (ISS).

Albert II

In conjunction with NASA's Robot Rocket Rally March 14-16 at the Kennedy Space Center in Florida, SpaceX's third mission will include a set of robotic legs for the Robonaut 2 (R2) humanoid torso that is currently aboard the ISS. Once those legs are attached to R2, ROS will officially be running in space.

For the last few years, the NASA/GM team at the Johnson Space Center has been using ROS for R2 development here on Earth. We first heard about that at ROSCon 2012 in Stephen Hart's keynote presentation, where he described how they combine ROS and OROCOS RTT to achieve flexible, real-time control of R2. Following the launch this weekend, that open source software will be running on the R2 that's on the ISS.

Robonaut 2 legs
Robonaut 2 simulation

The R2 team also uses the open source Gazebo simulator to simulate R2 when they're doing development and testing. They've released their models of R2 and the ISS as open source for the community to work with. We recently integrated those models into an immersive teleoperation Gazebo demonstration that we'll be running at the Robot Rocket Rally this weekend. Drop by our booth and find out what it's like to "be" Robonaut 2!

ROS has already powered robots in the air, on the ground, on and under the water, and on every continent, but we at OSRF couldn't be more excited about ROS journeying to outer space.

by Tully Foote on March 13, 2014 06:34 PM

March 11, 2014
An Immersive Virtual Robotics Environment based on ROS

From Kel Guerin at Johns Hopkins University

At the Laboratory for Computational Sensing and Robotics at Johns Hopkins University, we have utilized the extensive visualization tools available in ROS to create an immersive virtual reality environment for interacting with robots. The versatile plug-in system for the RViz visualization package has allowed us to create virtual user interfaces, information displays, and interactive objects that co-exist with other resources in the RViz environment. Additionally, the excellent Oculus Rift RViz plugin gave us the perfect starting point for using RViz as a VR environment. This provides us an excellent test-bed for virtually teleoperating and teleprogramming our robots. Finally, the flexibility of ROS lets us deploy IVRE on several robots in our lab, including industrial systems and surgical robots. For more information on the tools we used, check out the Oculus Rift RViz plugin and the RViz plugin API.

by Tully Foote on March 11, 2014 06:24 PM

    March 10, 2014
    New Package: Announcing ROS/DDS proxies
    From Ronny Hartanto of DFKI GmbH via ros-users@

    Hi Everyone,

    We are happy to announce the ros_dds_proxies:

    There has recently been some discussion about using DDS as a communication layer in ROS. This package contains our implementation of a DDS-based middleware layer for multi-robot systems. We have been successfully using this implementation in our project (IMPERA). In our experiments, all messages were successfully delivered to all robots, even with communication outages of about 15 minutes.
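
    The outage-tolerant delivery described above can be illustrated with a small store-and-forward sketch. This is not the ros_dds_proxies implementation (whose internals aren't shown here) and all names are hypothetical; it just shows the general idea of queueing outbound messages while the inter-robot link is down and flushing them, in order, once connectivity returns.

```python
# Hypothetical sketch of store-and-forward delivery across a link outage.
# Not the ros_dds_proxies API; names are illustrative only.
from collections import deque


class StoreAndForwardProxy:
    def __init__(self, transport):
        self.transport = transport  # callable that delivers one message
        self.pending = deque()      # messages awaiting delivery, in order
        self.link_up = True

    def publish(self, msg):
        """Queue a message; deliver immediately if the link is up."""
        self.pending.append(msg)
        if self.link_up:
            self.flush()

    def set_link(self, up):
        """Simulate the link going down or coming back up."""
        self.link_up = up
        if up:
            self.flush()

    def flush(self):
        """Deliver all queued messages in FIFO order."""
        while self.pending:
            self.transport(self.pending.popleft())


delivered = []
proxy = StoreAndForwardProxy(delivered.append)
proxy.publish("pose_1")        # link up: delivered immediately
proxy.set_link(False)          # simulate a communication outage
proxy.publish("pose_2")        # queued
proxy.publish("pose_3")        # queued
proxy.set_link(True)           # link restored: queue is flushed in order
print(delivered)               # → ['pose_1', 'pose_2', 'pose_3']
```

    The key property, matching the IMPERA experiments, is that nothing published during the outage is lost: it all arrives, in order, once the link recovers.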

    Any comments or improvements are welcome.

    by Tully Foote on March 10, 2014 10:45 PM

    March 08, 2014
    Middlesex University is running an Intro to ROS summer school June 14th to 18th
    From Nick Weldin at Middlesex University

    Middlesex University London is running an Introduction to ROS summer school in London, June 14th-18th. It will be a practical, hands-on class with 10 TurtleBot 2 robots and a Baxter Research Robot. More details are available at http://www.mdx.ac.uk/courses/short/summer-school/courses/an-introduction-to-robot-operating-system.aspx

    by Tully Foote on March 08, 2014 03:05 AM

    March 06, 2014
    New Package: sentis_tof_m100 for Groovy and Hydro
    From Angel Merino Sastre & Simon Vogl via ros-users@

    Hi all,

    We are happy to announce the sentis_tof_m100 ROS package:


    This package provides support for the Bluetechnix Sentis ToF M100 camera,
    based on the software API provided with the camera, along with
    a detailed installation how-to and a ready-to-use launch file with a
    visualization example based on RViz.

    Any comments/suggestions are welcome.

    by Tully Foote on March 06, 2014 08:04 PM

    Robotics in Concert - 2014
    It has been a little quiet on this front for the past few months but certainly not for a lack of activity so it's probably high time for a 2014 update on what we're up to. Some of the more interesting directions the various groups are working on:

    Piyush at the University of Texas at Austin has been playing around with a multi-master Gazebo setup so we can develop in simulation before deploying. He has also found a bunch of interesting and zany computer scientists who are getting curious about robots but don't have the time or resources to commit to actually getting their experiments onto real robots in the real world (which is where we come in).

    Meanwhile, William at OSRF has been putting together a beta version of capabilities. These have been designed to provide some much-needed structure to the internals of a ROS robot. William has already expended many words on that wiki page - go there so we don't have to repeat them here :) Even though still in beta, capabilities are already finding their way into robots at Yujin, Unbounded Robotics and Clearpath, which is great - they were designed exactly to scratch an itch for products, and those groups are exactly the early adopters we hoped for.

    On other fronts, tackling orchestration of services at a higher level stonewalled us for a while - what is the right solution for robotics? The answer to that exact question turned out to be 'the wrong solution', and once we realised this, we were able to move forward with designing a framework. To be slightly less enigmatic: between ourselves, a group at Soongsil University in Korea and the computer scientists at Austin, there was no consensus on what to do in this space, so whatever we chose was very likely going to be that wrong solution. This should have been immediately apparent given that there are so many PhD theses on the topic (even ROS' papa smurf Brian Gerkey has been there, done that!), but hindsight is a wonderful thing. What was obvious, though, was the need for a platform that would handle all the details and let people experiment with multi-robot services and solutions on top. Enter the OPP, rocon's Orchestration Platform Prototype. Our goal is to enable people to write and experiment with their own service workflows (be they ROS, link-graph, Java-agent or business-process styles) while the platform handles all of the surrounding details, so that people who write services can focus on doing exactly that.

    So what's our first tangible going to look like? As a first version, we'd like to deliver a system that can assist you in building maps, administering the system, teleoperating in emergencies, and managing robot and software resources, while providing a final slot in which you can write your own custom service. A 'navigation stack' for multi-master, if you will.

    Interesting how all those discussions from varying groups resulted in something that rather resembles technology from the '70s - what we have here basically looks like a server computer, complete with hardware resources that need to be managed, human interface devices, software support, and services running in parallel on top providing the real-world operations...

    We're currently in the middle of this marathon; the milestones page has more details.

    Jack O'Quin is working on a flexible and customisable resource requester/scheduler. Austin's computer scientists are plugging away at a new method for orchestrating services and handling wireless. Soongsil is working on a business-process-style service, and Jorge in Spain is about to tackle maps and annotations, while we work on the framework, doing our damnedest to adhere to the KISS principle, which repeatedly threatens to disappear over the horizon.
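
    To give a flavour of what a resource requester/scheduler does, here is a minimal pure-Python sketch. This is not the rocon scheduler's actual API (all class, method and robot names here are made up for illustration); it just shows the core idea: services request robot resources by type and priority, requests queue when no matching robot is free, and released robots are handed to the highest-priority waiting request.

```python
# Illustrative priority-based resource scheduler. Hypothetical names
# throughout - this is not the rocon scheduler interface.
import heapq


class Scheduler:
    def __init__(self, robots):
        self.free = set(robots)   # idle robots, e.g. {"turtlebot_a", ...}
        self.queue = []           # heap of (-priority, order, requester, type)
        self.counter = 0          # tie-breaker to keep FIFO order per priority
        self.granted = {}         # requester -> robot currently allocated

    def request(self, requester, robot_type, priority=0):
        """Queue a request for a robot of the given type, then dispatch."""
        heapq.heappush(self.queue,
                       (-priority, self.counter, requester, robot_type))
        self.counter += 1
        self._dispatch()

    def release(self, requester):
        """Return a requester's robot to the free pool and re-dispatch."""
        self.free.add(self.granted.pop(requester))
        self._dispatch()

    def _dispatch(self):
        """Grant queued requests, highest priority first, if robots match."""
        deferred = []
        while self.queue:
            item = heapq.heappop(self.queue)
            _, _, requester, robot_type = item
            match = next((r for r in self.free if r.startswith(robot_type)),
                         None)
            if match is None:
                deferred.append(item)   # no matching robot free: keep waiting
            else:
                self.free.remove(match)
                self.granted[requester] = match
        for item in deferred:
            heapq.heappush(self.queue, item)


sched = Scheduler({"turtlebot_a", "turtlebot_b"})
sched.request("map_service", "turtlebot", priority=1)   # granted
sched.request("teleop_service", "turtlebot")            # granted
sched.request("delivery_service", "turtlebot")          # queued: none free
sched.release("map_service")        # freed robot goes to delivery_service
```

    The appeal of centralising this logic is exactly the point made above: individual services stay simple because contention over shared robots is resolved in one place.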

    Still much to do, but the road ahead is now, finally, a road. If you'd like to discuss, talk to us on the ROS multimaster SIG. In the meantime, we hope to spin off a few useful ROS single-master tools we've developed sometime in the next couple of weeks, but more about them when they come out...

    Happy 2014 from the ROCON team.

    by Daniel Stonier (noreply@blogger.com) on March 06, 2014 02:35 PM

    Powered by the awesome: Planet