December 17, 2014
Support OSRF!
Cross posted from osrfoundation.org

When we started the ROS project back in 2007, our goal was to build an open robotics software platform for students, engineers, entrepreneurs, and anyone else to freely use and modify. In 2012, we took the next step by founding OSRF as an independent non-profit organization to pursue that mission, with responsibility for both ROS and Gazebo. Today, we see these tools used worldwide to teach concepts, solve problems, and build products in ways that we couldn't have imagined at the beginning.

We couldn't be happier with the size and breadth of the collaborative community that we've built together, and we're grateful to everyone in the community for the roles that you've played.

You won't be surprised to hear that it costs money to run OSRF. We employ a small team of amazing individuals, we operate an office in the Bay Area, and we run a suite of online services on which the community depends.

Since our founding, OSRF has enjoyed generous financial support from government agencies and private industry, for which we're very grateful. We hope and anticipate that that support will continue in the future. But now, as we approach the end of OSRF's third year, we're trying something new: asking you, our users, for support.

If you rely on ROS and/or Gazebo in your lab, your startup company, your weekend projects, or elsewhere, please consider donating to OSRF. Your donation will support our people and infrastructure so that we can spend (even) more time developing and maintaining the software and services on which you depend.

As one example, if everyone who visits the ROS wiki between now and the end of the year donates just $2, we'll have our costs covered for next year to manage, update, and host all of our online services, including the wiki. Donations in any amount are welcome. Give more, and we can do more.

Donate to OSRF today.

Thank you for your support.

Contributions to the Open Source Robotics Foundation, a 501(c)(3) non-profit organization, will be used at its discretion for its charitable purposes. Such donations are tax-deductible in the U.S. to the extent permitted by law.

by Tully Foote on December 17, 2014 11:58 PM

Springer Book on ROS: Call for Chapters
From Anis Koubaa of Springer Publishing via ros-users@

Hello,

With respect to our last interaction about editing a complete reference book on ROS, I am happy to inform you that the official call for chapters for the Springer Book on Robot Operating Systems is now open.

The book will be published by Springer under the book series "Studies in Systems, Decision and Control". 

We look forward to receiving your contributions to make this book successful and useful for the ROS community. The call for chapters website (see above) presents in detail the scope of the book, the different categories of chapters, topics of interest, and the submission procedure.
In a nutshell, abstracts must be submitted by January 05, 2015 to register chapters and to identify in advance any possible similarities in chapter content. Full chapter submissions are due on March 01, 2015.
Submissions and the review process will be handled through EasyChair.

Each chapter will be reviewed by at least three expert reviewers, at least one of whom should be a ROS user and/or developer.
We welcome the collaboration of ROS community users to provide reviews and feedback on the proposals and chapters submitted for the book. If you are interested in participating in the review process, please consider filling in the following reviewer interest form.

We look forward to receiving your contribution for a successful ROS reference!

Update:
Just to emphasise that full chapters are due on March 01 and abstracts on January 05.
If some authors would like to submit more than one chapter and would like to have an extension, please let me know in advance.

by Tully Foote on December 17, 2014 09:19 PM

Robox and IT+Robotics present a fully ROS-compliant motion controller
Robox and IT+Robotics are excited to announce the first fully ROS-compliant Robox motion controller, the uRmc2, and the cROS library.


The uRmc2 is one of Robox's flagship products, designed to make the best use of distributed electronics such as drivers and remote I/O. This motion controller can be employed in many motion control applications, from the simplest (one or two controlled axes) to the most sophisticated (dozens of controlled axes), and even with a high number of axes it can sustain a data exchange rate of 500 Hz (2 ms).


In collaboration with Robox, IT+Robotics developed cROS, a lightweight, single-threaded, pure ANSI C ROS client library that enables C programmers to quickly interface with ROS Topics, Services, and Parameters.
cROS has been developed with robustness and efficiency in mind. It has been designed to run on any embedded system that provides a basic C compiler and, being single-threaded, it is also well suited to systems that come without any operating system. IT+Robotics plans to make an open source version of the cROS library available in Q1 2015.

The Robox controller enhanced with the cROS library system offers the following features:

  • Topic subscriber and publisher capabilities
  • Service provider capabilities
  • Implements the full Master, Slave, and Parameter Server ROS APIs (only some minor functions are missing)
  • Support for the logging features exposed by the ROS tools
  • Support for almost any type of ROS message, including complete support for the ROS-Industrial messages

The last feature means that the system can communicate with any other ROS node without having the message packages installed.
The messages are generated at runtime: only the *.msg and/or *.srv files are required.
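The idea behind runtime message generation can be sketched roughly as follows (an illustrative Python sketch of the concept, not cROS's actual C implementation): parse the .msg text into typed fields, from which (de)serializers can then be built on the fly.

```python
# Illustrative sketch only: parse a ROS .msg definition at runtime so
# messages can be handled without pre-generated message packages.

def parse_msg_definition(text):
    """Return a list of (type, name) fields from a .msg definition."""
    fields = []
    for line in text.splitlines():
        line = line.split('#', 1)[0].strip()   # drop comments
        if not line:
            continue
        if '=' in line:                        # skip constant definitions
            continue
        type_name, field_name = line.split()[:2]
        fields.append((type_name, field_name))
    return fields

# Example: the standard geometry_msgs/Point definition
point_msg = """
# A point in free space
float64 x
float64 y
float64 z
"""
print(parse_msg_definition(point_msg))
# → [('float64', 'x'), ('float64', 'y'), ('float64', 'z')]
```

A real implementation would additionally resolve nested message types and map each field type to its wire-level serialization.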

The enhanced Robox controller was used for the first time in an industrial project where a team of robot manipulators handled metal parts to be welded.

Contacts:

- uRmc2 ROS compliant motion controllers: sales@robox.it
- cROS library: info@it-robotics.it

by Tully Foote on December 17, 2014 07:43 AM

December 16, 2014
2nd Factory-in-a-Day Newsletter!

We are happy to report that the second Factory-in-a-Day newsletter is posted. Topics covered in this newsletter include:

  • Letter from the project Coordinator, Dr. Martijn Wisse, TU Delft
  • Workshop on Philips use case
  • Spotlight on: Philips
  • Do robots kill jobs?

The Factory-in-a-Day project leverages the ROS-I repository, and is supporting improvements and maintenance of the Universal Robots package. As new general factory automation capabilities are created through this project, we anticipate that they will be added to the ROS-I repository.

by Paul Hvass on December 16, 2014 11:39 PM

ARDrone autonomous indoor navigation by 3D SLAM
From Jim Lin

Hello, everyone

I just want to demonstrate our lab's recent result (also my bachelor thesis). 
It combines several ROS packages, for example "ardrone_autonomy", "tum_ardrone", "MoveIt", and "tum-vision/lsd_slam", to make the ARDrone 2.0 autonomously explore unknown areas and avoid obstacles using an OctoMap created from lsd_slam.

We used matrix operations to re-scale the lsd_slam output to real-world scale.
The purpose of the project is to show that the whole motion can be achieved by a low-cost vehicle with a monocular camera.

The experiment video is as follows.

by Tully Foote on December 16, 2014 05:40 PM

December 15, 2014
Edmonton Users Group
From Craig Sherstan via ros-users@

I'd like to announce a new ROS Users Group in Edmonton, Alberta, Canada.

Our focus is on teaching and exchanging knowledge. We're open to academics, industry, and hobbyists. If you're in the area, sign up for our mailing list at http://rosedmonton.org

Our first meetup is Jan. 15, 2015 at 6:30 PM at Startup Edmonton (http://startupedmonton.com/)

by Tully Foote on December 15, 2014 07:15 PM

New SIG: Perception
From Vincent Rabaud via ros-users@

Hi all,

as discussed with some of you at ROSCon, there is interest in a vision SIG.

There are also quite a few packages dealing with audio (speech to text, text to speech, source separation ...) and there is definitely interest for touch, heat, emotion recognition ...

That's why we've renamed the old SIG for perception pipelines (which never got any posts at https://groups.google.com/forum/#!forum/ros-sig-perpip) to https://groups.google.com/forum/#!forum/ros-sig-perception.

It is now a SIG where people can discuss anything a robot can sense and how to interpret it. Enjoy!

by Tully Foote on December 15, 2014 05:34 PM

December 11, 2014
New Package: retalis package for sensory data processing and management
From Pouyan Ziafati via ros-users@

Dear All,

I am happy to announce the release of the retalis package for ROS. The Retalis language supports high-level and efficient implementation of a wide variety of robotic sensory data processing and management functionalities.

Please see the description, tutorial and performance evaluation at  http://wiki.ros.org/retalis

Best regards,

by Ugo Cupcic on December 11, 2014 06:50 PM

December 10, 2014
Multiple developer positions open at Clearpath Robotics!
from Ryan Gariepy via ros-users@

Clearpath Robotics Inc. specializes in the design and manufacture of
unmanned vehicle solutions for everyone from small local businesses to
some of the best known companies on the planet. We began by offering
platforms and services to support robotics R&D, and have now expanded
beyond the world of R&D into commercial and industrial unmanned
solutions.

We employ a multidisciplinary group of highly talented people who live
and breathe robotics.  We believe that work must have a high "cool"
factor and every day should bring new knowledge. We need more people
on our team who are willing and able to bring the state of the art to
practical applications. Clearpath, our partner companies, and our
clients are making tremendous advances in automating the world, and we
want you to be a part of it!

All positions are located in Kitchener-Waterloo, Ontario, Canada. 1-5
years relevant work experience and a related graduate degree are
recommended, but exceptions can be made. Due to the volume of
applicants we receive, providing reference letters and portfolio work
with your application is *highly* recommended.

Individual job details, requirements, and application instructions can
be found at the following links:

Perception/SLAM (http://jobsco.re/12lTlWM)
Controls/Planning (http://jobsco.re/1yb6s53)
Multi-Robot Systems (http://jobsco.re/12lRHV1)

by Ugo Cupcic on December 10, 2014 08:10 PM

Demonstration of the Fraunhofer IPA ROS-I Driver for Yaskawa Motoman Dual Arm Robots

Submitted by: Thiago de Freitas and Ulrich Reiser, Fraunhofer IPA

Under the cooperation between Fraunhofer IPA, Yaskawa Smart Robotics Center in Japan and Yaskawa Motoman Robotics, a ROS-I driver to support multi-groups control for the Motoman Robots was developed.

The first industrial dual-arm manipulator to run the driver was the Motoman SDA10F with an FS100 controller. The driver provides the capability to generate synchronous and asynchronous movements on the ROS side that can be sent to the FS100 controller and then executed by the real robot groups (left arm, right arm, and torso). Support packages were also set up, including driver configuration files, URDF, and MoveIt! configuration files.

The driver was also demonstrated during ROSCon 2014, using a Motoman BMDA3 robot. [Remarkably, the driver worked with the hardware despite the SwRI software team never having had access to the hardware prior to the event.] The demo was organized by Yaskawa Motoman and SwRI.

Erik Nieves (Yaskawa Motoman USA) Grooves with the BMDA3 at ROSCon 2014 in Chicago

Paul Hvass (SWRI/ ROS-Industrial Consortium PM) "running" with the BMDA3

Some tutorials are recommended for getting a better overview of the driver usage and system configuration:

Some additional information:


by Paul Hvass on December 10, 2014 03:54 PM

December 09, 2014
Updated package: razor_imu_9dof
From Kristof Robot via ros-users: 

I am happy to announce Hydro and Indigo versions of razor_imu_9dof, a
package that provides a ROS driver for the Sparkfun Razor IMU 9DOF
(http://wiki.ros.org/razor_imu_9dof).
It allows assembling a low cost Attitude and Heading Reference System
(AHRS) which publishes ROS Imu messages for consumption by packages
like robot_pose_ekf.

Major updates (see Changelog [1] for details):
- catkinized
- upgraded to be fully compatible with the ROS navigation stack (and in
particular robot_pose_ekf)
- major upgrade of the wiki documentation (http://wiki.ros.org/razor_imu_9dof)

Video demonstrating the use of razor_imu_9dof with robot_pose_ekf to
improve odometry.

For more information, and detailed instructions, see
http://wiki.ros.org/razor_imu_9dof.

I'd like to thank Tang Tiong Yew for the good work on the previous
Fuerte and Groovy versions, and Peter Bartz for the excellent
firmware.
Last but not least, a big thanks to Paul Bouchier, who triggered this
upgrade, and was a major contributor overall.

Enjoy!

Kristof Robot

[1] http://docs.ros.org/indigo/changelogs/razor_imu_9dof/changelog.html

by Tully Foote on December 09, 2014 09:15 AM

December 03, 2014
Scientist position at Honda Research Institute
From Alper Ayvaci via ros-users@

Scientist (Job Number: P13F01) 

HRI in Mountain View, California, has an opening for a Scientist conducting research in the area of computer-vision-based sensor fusion for mapping, localization, and related problems. Algorithms will be implemented online to process input from cameras and other sensors, including IMUs, GPS, and sensor data transmitted over the automotive CAN bus. The successful applicant will be part of a research team developing and implementing a real-time robotic perception platform supporting research on advanced driver assistance systems and autonomous driving.

His/her responsibilities include: 
* Research on sensor fusion for localization and mapping 
* Develop software implementing proposed algorithms 
* Employ sensor fusion techniques with multi-modal data 
* Set up and run module regression tests using large collections of sensor data 
* Benchmark results against ground truth data 

Qualifications:
* PhD degree in Computer Science, EE or related field 
* Strong experience in an area such as SLAM, filtering, sensor fusion 
* Broad knowledge of computer vision, robotics and machine learning 
* Experience with online sensor processing of cameras, lidar, GPS, CAN bus 
* Excellent programming skills in C++ and Linux 

Desirable: 
* GPGPU programming, runtime optimization 
* Experience programming in robotics application frameworks such as ROS (Robot Operating System) 
* Working knowledge of popular libraries such as OpenCV, PCL and Eigen 
* Experience working with GIS data and coordinate systems 

To apply, please send a cover letter and your resume to 


with the subject line clearly indicating the job number you are applying for. 
Name your attachments as "FirstName.LastName.OrganizationName.cv.pdf" 

by Tully Foote on December 03, 2014 11:00 PM

December 01, 2014
Demonstrating the Integration of ROS with Siemens’ Process Simulate

Siemens PLM Software is a leading global provider of product lifecycle management (PLM) software. These PLM solutions can help companies make smarter decisions that lead to better products.

As always, we at Siemens PLM Software are looking for new areas that will allow us to understand the future of robotics in the industrial sector. After we came across the Robot Operating System (ROS) and ROS-Industrial, I was sent to take a closer look. In June, I participated in the “ROS Industrial Basic Developers Training Class” held at Southwest Research Institute (SwRI) to understand more about the ROS ecosystem and tools. Since then, I have been experimenting with ROS libraries and tools and thinking about a connection between ROS-Industrial and our own Process Simulate software.

From what I learned, ROS Industrial has interesting potential in the area of industrial robotics by providing the following:

  • Standardization for robotic languages.
  • Real time path planning and collision avoidance.
  • A huge library of open source components.

For the above reasons, we believe that ROS-I has the potential to play a significant role in the industry and we should be a part of it.

After a few experiments, I compiled the demo (refer to video above) which shows how Process Simulate can provide a full simulation environment for a ROS controlled robot (R2-D2 believe it or not). In the demo, you’ll see that R2-D2 has three proximity sensors which are mounted on the right, front and left (their signal values can be seen in real time on the top left of the screen). R2-D2’s objective is to leave the maze using the simple algorithm of “always try to turn to the right”. But to make its life a little more interesting, we added red barriers which are removed manually at the start of the simulation to create a more dynamic environment.

As you can see from the rqt_graph above, the sensory information received from Process Simulate is processed on the ROS side and information about the robot’s new location is sent back to Process Simulate (the objects in red are Process Simulate components).

Along the way, I overcame a couple of interesting challenges such as:

  • Having to write a robotic program using third party tools.
  • Connecting between Windows-based Process Simulate and Linux-based ROS.

After discovering some real added value of linking ROS Industrial with Process Simulate, I’m going to explore further capabilities, like Vision, and working with more complicated environments using advanced Process Simulate and ROS Industrial software packages (PLC and welding in Process Simulate and OpenCV and MoveIt in ROS-I).

If you are looking to share interesting viewpoints, use cases, and environment challenges related to ROS-I, contact me at: moshe.schwimmer (at sign) siemens.com

Additional references:

http://www.plm.automation.siemens.com/en_us/products/tecnomatix

https://www.facebook.com/Tecnomatix.NXforManufacturing

by Paul Hvass on December 01, 2014 10:21 PM

November 21, 2014
ROS Korea users meetup, seminar, and tutorial
From Jihoon via ros-users@

Hello everyone,

There will be a ROS Korea users seminar & meetup in Seoul, Korea on December 21, hosted by OROCA, one of the largest Korean robotics communities.

The seminar will briefly cover an overview for beginners, navigation, MoveIt, UAVs, and the community, with speakers from various groups in Korea.

I also hope to hang out with other users after the seminar, since it is the first ROS event in Korea. So please drop by in the evening. :)

For more info (in Korean): http://cafe.naver.com/openrt/7283
For English info: email me (jihoonlee.in@gmail.com)

In addition to the seminar there will be a ROS tutorial. Here are some details about the tutorial.


Tentative Date : 2014/12/22
Time : 13:00 ~ 18:00
Location : TBD
Expected # of participants : 30~40
Organiser : Yoonseok Pyo(passionvirus@gmail.com), Jihoon Lee(jihoonlee.in@gmail.com)
Registration : e-mail to passionvirus@gmail.com
Contents:
  * ROS environment setup
  * ROS commands, pub/sub/srv
  * How to integrate sensors
  * SLAM and navigation using kobuki
  * How to write turtlebot rapps

Fully in Korean
Free of charge

by Tully Foote on November 21, 2014 07:16 PM

November 19, 2014
Yaskawa Motoman Offers Robots to Amazon Picking Challenge Teams

Yaskawa Motoman Robotics is pleased to announce sponsorship of the Amazon Picking Challenge to be held in conjunction with ICRA 2015. This open competition will further the development of robot skills required for e-commerce and other material handling. Yaskawa Motoman is offering consignment robots to select teams entering the challenge. Selected teams will receive their choice of robot model in January 2015 and may keep it through June, subject to model availability. To apply for a Motoman consignment robot, please submit the following items via this form by midnight PST, December 17, 2014.

  • Video of a simulated robot executing a picking task
  • Link to your team/organization website
  • Completed application describing your Motoman hardware utilization plan

Yaskawa Motoman will provide robots, software (including our MotoROS driver), and onsite technical support both at the team’s location and the event.

To support your development efforts, the ROS-Industrial Consortium will be updating its pick and place tutorial to include the Motoman MH5 II model. The tutorial will also be updated to ROS Indigo. Additional resource links:

by Paul Hvass on November 19, 2014 11:30 PM

ROS Japan Users Group + Kawada host a successful ROS Meetup #4
From Daiki Maekawa via ros-users@

Hi Everyone,

We held a ROS Japan Users Group meetup, the NEXTAGE Hackathon, at Kawada Robotics Inc. on November 8, 2014.

We developed obstacle avoidance behavior, stereo camera support, and sample code for the NEXTAGE.

Demonstration movie



Please see the photos of this meetup

See the links below for the presentation slides of this meetup.

by Tully Foote on November 19, 2014 09:58 PM

November 17, 2014
A rosbag implementation in Java
From Aaron Schiffman via ros-users@

Dear ROS-Users,

I've created a rosbag writer implementation for Java and posted it to a new Bitbucket repository. It should be capable of writing a format 2.0 rosbag with no compression from Android or Java ROS implementations (client or server). If you're interested, the source repository is located at:

aaron_sims / jrosbag on Bitbucket
 
How to use the Bag class:
  1. Initialize the org.happy.artist.rmdmia.utilities.ros.bag.Bag class.
  2. Call bag.start(OutputStream os, Bag.CHUNK_COMPRESSION_NONE); // where os is the OutputStream to which you intend to write the file. Examples could be a FileOutputStream, or a network output stream that writes the file to Google Drive or Dropbox (just examples).
  3. Call bag.addConnectionHeader(char[] topic, int conn, char[] connection_header_hex); for each new connection header on connection handshake. int conn is a unique int connection id chosen for the connection (it might be a good idea to iterate through topic ids to create an int array, or use another mechanism to choose a unique int). connection_header_hex is the ROS serialized message in the connection header.
  4. Call bag.addMessage(long time, int conn, char[] message_data_hex); Pass in the long time, the associated connection header int conn id, and the ROS serialized message to add to the rosbag file.
This Java code is poorly documented; however, I wanted to share it with the ROS community for Java/Android ROS clients that want to record rosbag files. Good luck using it. It is released under the Apache 2.0 license.
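For readers unfamiliar with what such a writer produces, the record layout of the rosbag 2.0 format can be sketched as follows (a generic Python illustration based on the documented format, not code from jrosbag; the real bag-header record also carries index/count fields and padding, omitted here):

```python
import struct

# Sketch of the low-level rosbag format 2.0 record layout.

def encode_header(fields):
    """Encode header fields as 4-byte-length-prefixed name=value pairs."""
    out = b''
    for name, value in fields.items():
        pair = name.encode() + b'=' + value
        out += struct.pack('<I', len(pair)) + pair    # little-endian length
    return out

def encode_record(header_fields, data):
    """A record is: header_len, header, data_len, data."""
    header = encode_header(header_fields)
    return (struct.pack('<I', len(header)) + header +
            struct.pack('<I', len(data)) + data)

# A bag file begins with a version line, followed by records, e.g. the
# bag header record (op=0x03), chunk records, connection records, etc.
bag = b'#ROSBAG V2.0\n' + encode_record({'op': b'\x03'}, b'')
```

The Bag class described above handles exactly this bookkeeping (plus connection and chunk records) on the caller's behalf.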

I wish I had more time to clean up the code better, and if you have questions or want to contribute send me a message.

Thanks,

Aaron 

by Tully Foote on November 17, 2014 07:38 PM

November 14, 2014
New Tutorials: Adding a global path planner as plugin in ROS
From Anis Koubaa via ros-users@

We have updated the tutorial on adding a global path planner as a plugin in ROS. We have added testing using RViz.
The general tutorial is available on the ROS wiki at this link

A more specific tutorial that shows how to add a real genetic algorithm planner is presented on this tutorial page

It is possible to work with other path planning algorithms. We implemented the iPath C++ library, which provides implementations of several path planners, including A*, GA, local search, and relaxed versions of A* and Dijkstra (much faster than A* and Dijkstra). More will be added soon on Google Code.

The iPath library is available as open source on Google Code under the GNU GPL v3 license. The library was extensively tested on different maps, including those provided in this benchmark and other randomly generated maps.
A tutorial on how to use the iPath simulator is available at this link
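To illustrate the kind of grid search such planners perform, here is a minimal Dijkstra sketch on a 4-connected occupancy grid (a generic example for readers, not iPath's actual C++ code):

```python
import heapq

def dijkstra(grid, start, goal):
    """Shortest 4-connected path cost on a grid; 1 = obstacle, 0 = free."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float('inf')):
            continue                     # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1
                if nd < dist.get((nr, nc), float('inf')):
                    dist[(nr, nc)] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return None                          # no path exists

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(dijkstra(grid, (0, 0), (2, 0)))    # → 6
```

A* is the same loop with the priority augmented by a heuristic (e.g. Manhattan distance to the goal); the "relaxed" variants mentioned above trade optimality guarantees for speed.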

Credit goes particularly to Maram Al-Ajlan (Master's student at Al-Imam Mohamed bin Saud University, Saudi Arabia) and Imen Chaari (PhD student at Manouba University, Tunisia) for their efforts in implementing the algorithms and integrating them with ROS. 

If you have any suggestions or questions about tools or tutorials, please contact me. 

Anis

by Tully Foote on November 14, 2014 08:58 PM

November 13, 2014
Software Engineer/PhD opening at Fraunhofer IPA

From Florian Weißhardt via ros-users@

Position: Software Engineer, possibility to obtain PhD degree

Location: Fraunhofer IPA, Stuttgart, Germany

Experience: Strong skills in software design and C/C++ development and rich experience in ROS development

Finding solutions to organizational and technological challenges, particularly within the production environment of industrial enterprises. That, in a nutshell, is the key focus of the research and development work carried out at the Fraunhofer Institute for Manufacturing Engineering and Automation IPA. With 14 individual departments engaged in the fields of Corporate Organization, Automation and Surface Engineering, our R&D projects aim to enhance production processes and make products more cost-effective and environmentally friendly by identifying and exploiting the potential for automation and streamlining at clients' companies.

The Fraunhofer IPA department for robot and assistive systems develops service robots for various application fields (e.g. domestic, inspection, logistics, production assistance, manufacturing, etc.) with the goal of reliable, robust and safe service of these robots in everyday environments. These applications require complex software systems including navigation, planning, perception and manipulation for dynamic and changing environments and intuitive human-robot interaction.

The position covers the development of concepts and tools to reduce integration effort and simplify application development for these complex robotic systems within publicly funded national and EU projects. Transferring the results to industry by organizing workshops, publishing articles in relevant magazines, and exhibiting demonstrators at fairs and conferences is part of the job description as well.

You have completed your Master's or Diploma degree with excellent results and are interested in interdisciplinary research with high-tech robots like Care-O-bot or the KUKA iiwa. You have already gathered experience in scientific work and, ideally, have already presented your first results at an international conference. You are confident with software architectures and software engineering and have practical experience developing robot applications in ROS.

We offer you a highly interesting and diverse work environment with contact to both top robotics researchers and industry. In addition to the opportunity to obtain a PhD degree, the position encompasses early transfer of project and staff responsibility. For the implementation and validation of your ideas, we offer exceptionally well-equipped laboratories and test environments.

Qualifications/Requirements:

  • Rich experience in ROS development
  • A Master/Diploma degree from a top university in computer science, robotics or software engineering
  • Strong skills in software design and C/C++ development
  • Proficient oral and written English skills

Advantageous are:

  • Experience with model-driven engineering approaches
  • Oral and written German skills

Please include the following documents in your application:

  • Cover letter that expresses your motivation and (research) goals
  • CV
  • Transcripts of all obtained degrees (including school education)
  • References and certificates relevant to the position

Please send your application to martina.goetzner@ipa.fraunhofer.de referring to position IPA-2014-109.

by Paul Hvass on November 13, 2014 12:52 AM

November 12, 2014
Toyota HSR Hackathon assisted by TORK
Toyota Motor Corporation (TMC) hosted a 2-day hackathon on their HSR
(Human Support Robot) on October 23rd and 24th in Tokyo. TORK assisted
with the event and the software.

HSR has been developed as an elderly care for domestic situations as
well as in medical facilities. This time TMC invited researchers and
students from institutes in Japan. 15 participants got a hands-on
experience, tried out making small tasks to let the mobile-manipulator
robot interact with human, and had fun.

All of the higher-level functionalities such as self-localization,
vision-based collision avoidance, motion planning, and so on are
available via the de facto standard open-source robotics framework ROS
(Robot Operating System) (as previously announced). It also
comes with a scripting-language interface, so developers don't need
to be well acquainted with ROS.

With the feedback from the attendees, TMC expects to boost
development in the future.

A write up of the event with photos can be found here

Reference: Announcement on TMC Facebook page (in Japanese)

by Tully Foote on November 12, 2014 10:18 PM

November 10, 2014
ROS MAV SIG Call for Participation

ROS Aerial Vehicle Users,


We'd like to invite you to participate in an effort to develop a standard set of messages for communication between robotics components on Micro Air Vehicles (MAVs). At the IROS workshop on MAVs (proceedings) this fall, it was identified that the MAV community has many different implementations of the same capabilities. They are often closely related and almost compatible, but it is rarely easy to switch between different implementations or to use different implementations together. From that discussion, it was proposed to work toward a common way to communicate and enable the MAV community to collaborate more effectively.

To make this happen we have set up a mailing list and wiki pages as a place to coordinate this effort (MAV SIG, mailing list). If you are interested in this topic, we ask that you join, listen, and participate so that we can get as broad a spectrum of voices as possible.


We have chosen the ROS SIG format as it has proven effective at developing standard messages that are used by many people every day. ROS SIGs are relatively unstructured and can adapt to differences in each community and process.


We plan to use the ROS .msg format as a way to formalize the messages, since it is a relatively compact way to express messages and has representations in many languages. The most important part of the process will not be the actual .msg files that come out, but the datatypes, which people can rely on being isomorphic when transitioning between systems.
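As a purely hypothetical illustration of what such a .msg definition might look like (an invented example for this post, not a SIG proposal or an existing message), an attitude-setpoint message could be written as:

```
# Hypothetical example only -- not an agreed-upon message definition.
# AttitudeSetpoint.msg
std_msgs/Header header                 # timestamp and coordinate frame
geometry_msgs/Quaternion orientation   # desired vehicle attitude
float64 thrust                         # normalized collective thrust, 0..1
```

The SIG discussion will determine which fields, frames, and units the community actually standardizes on.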


Having common datatypes will allow us to have better modularity and interoperability. As an example from the ROS ecosystem, there are 10+ different laser scanner drivers and 18+ different camera drivers (ROS sensors). Because these drivers all use a standard set of messages, a user of those sensors can switch which sensor they are using on their system, or deploy systems with different sensors, and the rest of the system will continue to operate without modification. There are more complicated examples, such as the navigation stack, which has a standard set of messages for sending commands and providing feedback. This same interface has been used for differential-drive, holonomic, free-flying, and even walking robots.


There are already dozens of MAV-related ROS packages released, and we hope that developing these standard messages will help coordinate the efforts of the many contributors already working on aerial vehicles in ROS.


If you would like to know more, please check out the SIG (LINK). If you're at all interested, please join the process. We've started a thread here to kick off the process.


Tully Foote (OSRF), Lorenz Meier (ETHZ / CVG, PX4), Markus Achtelik (ETHZ / ASL).

by Tully Foote on November 10, 2014 06:13 PM

Software Engineer/PhD opening at Fraunhofer IPA

From Florian Weißhardt via ros-users@

Position: Software Engineer, possibility to obtain PhD degree

Location: Fraunhofer IPA, Stuttgart, Germany

Experience: Strong skills in software design and C/C++ development and rich experience in ROS development

Finding solutions to organizational and technological challenges, particularly within the production environment of industrial enterprises. That, in a nutshell, is the key focus of the research and development work carried out at the Fraunhofer Institute for Manufacturing Engineering and Automation IPA. With 14 individual departments engaged in the fields of Corporate Organization, Automation and Surface Engineering, our R&D projects aim to enhance production processes and make products more cost-effective and environmentally friendly by identifying and exploiting the potential for automation and streamlining at clients' companies.

The Fraunhofer IPA department for robot and assistive systems develops service robots for various application fields (e.g. domestic, inspection, logistics, production assistance, manufacturing, etc.) with the goal of reliable, robust and safe service of these robots in everyday environments. These applications require complex software systems including navigation, planning, perception and manipulation for dynamic and changing environments and intuitive human-robot interaction.

The position covers the development of concepts and tools to reduce integration effort and simplify application development for these complex robotic systems within publicly funded national and EU projects. Transferring the results to industry by organizing workshops, publishing articles in relevant magazines, and exhibiting demonstrators at fairs and conferences is part of the job description as well.

You have completed your master or diploma degree with excellent results and are interested in interdisciplinary research with high-tech robots like Care-O-bot or KUKA iiwa. You have already gained experience in scientific work and have ideally already presented your first results at an international conference. You are confident with software architectures and software engineering and have practical experience developing robot applications in ROS.

We offer you a highly interesting and diverse work environment with contact to both top robotics researchers and industry. In addition to obtaining a PhD degree, the position encompasses early transfer of project and staff responsibility. For the implementation and validation of your ideas, we offer exceptionally well-equipped laboratories and test environments.

Qualifications/Requirements:

  • Rich experience in ROS development
  • A Master/Diploma degree from a top university in computer science, robotics or software engineering
  • Strong skills in software design and C/C++ development
  • Proficient oral and written English skills

The following are advantageous:

  • Experience with model-driven engineering approaches
  • Oral and written German skills

Please include the following documents in your application:

  • Cover letter that expresses your motivation and (research) goals
  • CV
  • Transcripts of all obtained degrees (including school education)
  • References and certificates relevant to the position

Please send your application to martina.goetzner@ipa.fraunhofer.de referring to position IPA-2014-109.

by Tully Foote on November 10, 2014 02:00 AM

November 03, 2014
Delivering ROS in a Box
The Inno Team here at Yujin Robot are very excited to finally be able to talk about our latest project. So without further ado, we'd like to introduce GoCart™!



gopher_design_a0small2.jpg

GoCart is the first iteration of a family of robots we are calling Gopher. They're designed to be small, agile, team oriented, and able to do anything from specialised gofer'ing services through to lightweight logistics. And while GoCart is a little different from what we initially envisaged, it has quickly become a natural first step, having evolved through business networks and partners in relevant markets.

In many ways this is a break from the past and a shiny new business direction for Yujin Robot, but in many others it is a return to our roots (navigation and manufacturing). The last four years have been an exploratory period in which Yujin has incrementally developed an international outlook. Working with foreign companies to improve manufacturing processes, introducing new work styles, mixed Korean and international teams, adoption of external ideas and of course, ROS. ROS in particular we've used heavily in an experimental way via government projects, TurtleBot, and as a tool for internal development. There actually hasn't been much we haven't tried - right from low level embedded development through to navigation, manipulation and user interfaces via web and Android. Right now though, GoCart is a chance to refocus on our core strengths and transfer everything new that we've learned into a real opportunity.

So what is this opportunity? GoCart is our solution to lower costs and improve the quality of care in health and elderly care facilities. With it, we will automate the time-consuming task of transporting meals within facilities. This will free up time for staff, allowing them to spend more time caring for patients and residents. GoCart will be available as a “robot as a service” (RaaS) at an affordable monthly rate per unit. This will include the robot, on-site set-up, and 24/7 support and maintenance. With GoCart, health and elderly care facilities will reduce their overall operational costs.

We're also thrilled to work with Synapticon on this project to build an affordable motor-sensor network for service robotics rather than duplicating everything in-house. GoCart’s low-level control system consists in part of SOMANET hardware and software modules, whose job is to handle sensor data processing, power management, and motor control.

GoCart’s list of sensors includes PSDs, ultrasonic range finders, a gyroscope, a 3D sensor, and Yujin Robot’s new d-SLAM™ system using multiple stereo-camera boards. d-SLAM handles mapping and localisation, while all the other sensors focus specifically on obstacle avoidance. d-SLAM also enables us to drop overpriced laser range finders and provides tremendous advantages in dynamic, cluttered environments.

WiFi is used for communication between GoCarts and with the admin station to enable teamwork, monitoring, and remote control. For developing and implementing GoCart’s meal-transport service, we will leverage various tools from ROCON: we use capabilities and the rapp platform to manage robot applications (tasks), the concert to orchestrate multiple robots and smart device clients, and service tools to facilitate service development.

GoCart is fully ROS powered. Apart from ROS’ core modules and the ROCON stacks, we also use popular software packages such as the navigation framework, openni_launch and ecto amongst others. We also extend ROS with our own modules where needed - some open, some closed. The single most important priority driving development is to get GoCart out there doing road miles as quickly and as robustly as possible. Consequently, this often fuels our choice to use and contribute to the ROS world when and where it isn’t a differentiator for us.

All in all, we’re excited to be finally putting ROS into what will be a core product for Yujin Robot, and at the same time stretching ROS in directions in which it hadn’t anticipated being (ab)used.

Right now, GoCart is just emerging - its first iteration is a design concept and technology proof - but we will be very quickly moving on-site for testing. It’s shaping up to be a very busy, but amazing couple of years!

by Daniel Stonier (noreply@blogger.com) on November 03, 2014 04:58 AM

October 29, 2014
New Package: mongodb_store

From Nick Hawes via ros-users@

I would like to announce the release of a new suite of tools to enable the persistent storage, analysis and retrieval of ROS messages in a MongoDB database.

The mongodb_store package:

http://wiki.ros.org/mongodb_store

... provides nodes to store arbitrary ROS messages in a MongoDB database, query the database and retrieve messages, with helper classes in C++ and Python. Nodes are also available to provide rosbag-like functionality using the [same db format](http://wiki.ros.org/mongodb_store#Logging_of_Topics:_mongodb_log) and also [parameter persistence across system runs](http://wiki.ros.org/mongodb_store#Parameter_persistence:_config_manager.py).
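To illustrate the store-by-name and query-by-name workflow that such a message store offers, here is a toy, in-memory stand-in written in plain Python. The class and method names below are hypothetical and chosen only to mirror the concept; the real mongodb_store package persists to MongoDB and works with actual ROS message types:

```python
import json

class ToyMessageStore:
    """In-memory stand-in for a message store: messages are serialized
    into documents, saved under a name, and queried back by that name."""

    def __init__(self):
        self._docs = []

    def insert_named(self, name, msg):
        # Store a deep (serialized) copy plus metadata, like a MongoDB document.
        self._docs.append({"name": name, "msg": json.loads(json.dumps(msg))})

    def query_named(self, name):
        # Return the first message stored under the given name, or None.
        for doc in self._docs:
            if doc["name"] == name:
                return doc["msg"]
        return None

store = ToyMessageStore()
store.insert_named("goal_pose", {"x": 1.0, "y": 2.0, "theta": 0.5})
print(store.query_named("goal_pose")["x"])  # → 1.0
```

The point of the name-plus-metadata layout is that the same documents can later be queried by arbitrary fields, which is what makes a database-backed store more flexible than a flat bag file.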

Packages are available on Ubuntu for Indigo and Hydro, e.g.

  • ros-indigo-mongodb-log - The mongodb_log package
  • ros-indigo-mongodb-store - A package to support MongoDB-based storage and analysis for data from a ROS system, e.g. saved messages, configurations, etc.
  • ros-indigo-mongodb-store-msgs - The mongodb_store_msgs package

These tools were developed by the STRANDS project to support the development, debugging and runtime introspection of long-term autonomous mobile robots, but we hope they will be useful to the ROS community more generally.

In the near future we plan to release tools for serving maps from mongodb_store and for logging streams of RGB-D data in a compressed format.

Note that there is an overlap in functionality between these tools and warehouse-ros. We developed our own solution as the existing packages appeared to be unsupported and special-purpose, but as this appears to be changing, we may want to look at combining these two packages.

For feedback, pull requests, feature requests and bug reports please go to: https://github.com/strands-project/mongodb_store/issues

by Tully Foote on October 29, 2014 10:47 PM

October 24, 2014
Voxel Layer Visualisation
If you get into 3D obstacle detection with ROS, as we are with our new GoCart robot, you are bound to use voxel layers. The problem is that they cannot be visualised directly but have to be converted first.

Luckily, costmap_2d already comes with a node to do this:

<launch>
  <node name="voxel_grid_2_point_cloud" pkg="costmap_2d" type="costmap_2d_cloud">
    <remap from="voxel_grid" to="/move_base/local_costmap/obstacle_layer/voxel_grid"/>
    <remap from="marked_cloud" to="/move_base/local_costmap/obstacle_layer/marked_cloud"/>
    <remap from="unknown_cloud" to="/move_base/local_costmap/obstacle_layer/unknown_cloud"/>
  </node>
</launch>
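The same node can also be started directly with rosrun, passing the remappings as command-line arguments. This assumes move_base is running with the default local costmap and obstacle layer names shown above:

```
rosrun costmap_2d costmap_2d_cloud \
  voxel_grid:=/move_base/local_costmap/obstacle_layer/voxel_grid \
  marked_cloud:=/move_base/local_costmap/obstacle_layer/marked_cloud \
  unknown_cloud:=/move_base/local_costmap/obstacle_layer/unknown_cloud
```

With the node running, add point cloud displays for the marked_cloud and unknown_cloud topics in rviz to see the voxels.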



This finally gets the obstacle map, with our plushie and TurtleBot, into rviz:

by Alexander Reimann (noreply@blogger.com) on October 24, 2014 09:16 AM

