paper: Team 900 Presents— Zebravision 5.0: ROS for FRC

Thread created automatically to discuss a document in CD-Media.

Team 900 Presents— Zebravision 5.0: ROS for FRC
by: Adithya Balaji

ROS, or Robot Operating System, is a software architecture paradigm for robotics in general, and it has finally made its entrance onto the FIRST scene. This framework allows for fluid integration of co-processors and for the possibility of more advanced robot code.

Zebravision 5.0 is a radical departure from previous paradigms of robot software architecture and completes the computer vision team’s takeover of Team 900 on the whole :slightly_smiling_face: The work centered around the implementation of ROS, or Robot Operating System, into the team’s overall software framework. The main goal was to make Jetson-to-RoboRIO communication easier, but ROS represents much more than that. This leap in software interfacing not only allows the two systems to communicate in a dynamic manner, but also lays the foundation for sophisticated control paradigms built upon the open-source ROS framework. This distributed computation model will allow advanced work on robot sensor processing, motion planning, environment perception, localization, and mapping.

Here is the link to the source code: GitHub - FRC900/2017VisionCode: 2017 Competition Code

zebravision5Ros.pdf (2.27 MB)

Here is the link to the source code: https://github.com/FRC900/2017VisionCode

We will of course be answering any questions anyone has, but I want to say how ridiculously proud of these students I am. They’ve done some truly remarkable work with this paper and I can’t praise them enough for it. Also, now that this paper is done, we can finally kick them off the team’s Slack*. :wink:

*They have graduated and need to get back to college. Don’t worry, we plan on keeping up this work in their absence though.

Nice. We’re using ROS as well, and have had some good experiences with it.

Smart idea, interfacing the NavX with the Jetson. That would solve some pain points we’ve been having with our implementation, but we’re wary concerning some major issues we’ve had with its precision and accuracy (yes, it was calibrated correctly :wink: ). I wasn’t aware you could simultaneously interface with it over both SPI and USB, though; that’s good to know.

Why the NavX over a sensor with more precision and accuracy, though? The BNO055 and the FXOS8700/FXAS21002 with some home-rolled EKFs are impressively solid, the latter of which I believe Adafruit has a nice breakout for.

I believe I caught a glimpse of a LIDAR on your bot. Did anything come from that? Full field localization is proving a very promising endeavor for us, although I’d advise you that SLAM is proving to be the wrong place to look, at least for us.

Are you planning on moving to a 971-in-2014-style system, with a Jetson doing the entirety of heavy lifting concerning teleop and autonomous logic? It might be worth thinking of, but our experiences have shown that with the RIO’s power, it’s probably more trouble than it’s worth.

Awesome, the more the merrier!

To answer your questions:

Why the NavX over a sensor with more precision and accuracy, though? The BNO055 and the FXOS8700/FXAS21002 with some home-rolled EKFs are impressively solid, the latter of which I believe Adafruit has a nice breakout for.

  1. The NavX provides very accurate fused heading information, which we used for our field-centric drive. We also began testing other accelerometers, but we did not get around to implementing them. USB interfaces are also much more practical to work with on the Jetson compute system. And there is no need to home-roll EKFs (extended Kalman filters): one of the major benefits of ROS is its extensive package suite, which includes robot_localization and rtabmap, both of which are extensively tested and work great for those purposes. We began experimenting with these tools as well.
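For anyone curious what consuming that fused heading looks like on the ROS side, here is a minimal sketch (not our actual competition code). It assumes some driver node is already publishing the NavX data as a standard sensor_msgs/Imu message; the "/navx/imu" topic name is just a placeholder and depends on whichever driver you run.

```cpp
// Sketch: subscribe to a fused IMU topic and pull out the yaw (heading)
// that a field-centric drive would use to rotate joystick commands.
// The topic name "/navx/imu" is an assumption, not a real driver's name.
#include <ros/ros.h>
#include <sensor_msgs/Imu.h>
#include <tf2/LinearMath/Quaternion.h>
#include <tf2/LinearMath/Matrix3x3.h>

void imuCallback(const sensor_msgs::Imu::ConstPtr &msg)
{
  // Convert the fused orientation quaternion into roll/pitch/yaw.
  tf2::Quaternion q(msg->orientation.x, msg->orientation.y,
                    msg->orientation.z, msg->orientation.w);
  double roll, pitch, yaw;
  tf2::Matrix3x3(q).getRPY(roll, pitch, yaw);

  // Yaw is the robot's heading; field-centric drive rotates the driver's
  // translation command by -yaw before it reaches the drivetrain.
  ROS_INFO_THROTTLE(1.0, "Fused heading: %.2f rad", yaw);
}

int main(int argc, char **argv)
{
  ros::init(argc, argv, "navx_heading_listener");
  ros::NodeHandle nh;
  ros::Subscriber sub = nh.subscribe("/navx/imu", 10, imuCallback);
  ros::spin();
  return 0;
}
```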

I believe I caught a glimpse of a LIDAR on your bot. Did anything come from that? Full field localization is proving a very promising endeavor for us, although I’d advise you that SLAM is proving to be the wrong place to look, at least for us.

  1. Full field localization is also looking very promising for Team 900. I have since graduated from the team, but I’m sure they are building upon the impressive preliminary results that we hinted at earlier in the season.

Are you planning on moving to a 971-in-2014-style system, with a Jetson doing the entirety of heavy lifting concerning teleop and autonomous logic? It might be worth thinking of, but our experiences have shown that with the RIO’s power, it’s probably more trouble than it’s worth.

  1. I hope that 900 will be moving to a 900-in-2018 strategy for robot control, though I am not too familiar with 971’s implementation in 2014. :slight_smile:

Lastly, I’m actually very curious about your implementation. Is your code open source anywhere? Upon a cursory search, I could not find it anywhere.

This is me showing that I spend too much time on CD/I’ve spent too much time in FIRST

In 2014, 971 ran a BeagleBone Black w/ a custom “cape” which, among other things, had an additional microcontroller. You can read about it here, as my attempts to describe it probably won’t do the system justice.

I’ve heard this complaint from a few people and I’m kind of baffled by it. We’ve been using the NavX since it first came out and haven’t had any real issues with accuracy. Not only that, but the InvenSense chip it’s based on is used all over the place - I’m always shocked to find it in quadrotors and various IMU+MCU boards that claim “best in class accuracy” and the like. In fact, it’s used in the Pigeon IMU from CTRE as well.

At any rate - we’re always looking at other IMUs, and one of the great things about ROS is that we’ll be able to fuse multiple IMUs together with minimal effort. Many of these have ROS drivers of some form, and for those that don’t, drivers can be added; we’ve now got some experience with that.
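For what it’s worth, “adding a driver” usually amounts to something like the sketch below: read the device however its vendor API works and republish the data as a standard sensor_msgs/Imu message, at which point robot_localization (or anything else) can fuse it. readMyImu(), the topic name, and the frame name are hypothetical placeholders, not a real driver.

```cpp
// Minimal sketch of a home-rolled IMU driver node: poll the sensor and
// publish sensor_msgs/Imu so downstream fusion nodes can consume it.
#include <ros/ros.h>
#include <sensor_msgs/Imu.h>

struct RawImuSample { double qx, qy, qz, qw, gx, gy, gz, ax, ay, az; };

// Placeholder: a real driver would talk to the hardware here (SPI, I2C, USB...).
RawImuSample readMyImu()
{
  return RawImuSample{0, 0, 0, 1, 0, 0, 0, 0, 0, 9.81};
}

int main(int argc, char **argv)
{
  ros::init(argc, argv, "my_imu_driver");
  ros::NodeHandle nh;
  ros::Publisher pub = nh.advertise<sensor_msgs::Imu>("imu/data", 10);

  ros::Rate rate(100);  // publish at 100 Hz
  while (ros::ok())
  {
    const RawImuSample s = readMyImu();

    sensor_msgs::Imu msg;
    msg.header.stamp = ros::Time::now();
    msg.header.frame_id = "imu_link";  // must match a frame in the robot's tf tree
    msg.orientation.x = s.qx; msg.orientation.y = s.qy;
    msg.orientation.z = s.qz; msg.orientation.w = s.qw;
    msg.angular_velocity.x = s.gx; msg.angular_velocity.y = s.gy;
    msg.angular_velocity.z = s.gz;
    msg.linear_acceleration.x = s.ax; msg.linear_acceleration.y = s.ay;
    msg.linear_acceleration.z = s.az;

    pub.publish(msg);
    rate.sleep();
  }
  return 0;
}
```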

Synchronizing time across machines, I’m guessing? We’re hoping some of the RTC implementations pan out this year.

I believe I caught a glimpse of a LIDAR on your bot. Did anything come from that? Full field localization is proving a very promising endeavor for us, although I’d advise you that SLAM is proving to be the wrong place to look, at least for us.

We’ve had OK initial luck with slam_gmapping, at least in the lab: https://youtu.be/zn2RehpMaaQ That’s using a LIDAR for mapping and a ZED camera for visual odometry. It should improve with encoder data, I’d assume. No idea how that’ll translate to field-like conditions, though.

Are you planning on moving to a 971-in-2014-style system, with a Jetson doing the entirety of heavy lifting concerning teleop and autonomous logic? It might be worth thinking of, but our experiences have shown that with the RIO’s power, it’s probably more trouble than it’s worth.

We’re not sold on anything in particular right now. The hope is that if we break up the code into discrete ROS nodes, moving things between the RIO and the Jetson (and the driver station :wink: ) won’t be that big of a deal aside from changing a launch file or two. Obviously there will be hardware limitations - e.g. CAN motor controllers are way easier to drive from the RIO than from the Jetson, and you want to avoid passing camera data over the network. But if the nodes are modular enough, we can easily decide to start them on one device or the other depending on what we find during testing.
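To make the “just change a launch file” point concrete, here’s a rough sketch of the kind of node we mean. It has no hardware dependence, so it can run on the RIO, the Jetson, or a driver station laptop unchanged; ROS topics are network-transparent, so the only thing that changes is which machine the launch file starts it on. The topic names and the simple proportional logic are illustrative placeholders, not our actual code.

```cpp
// Sketch: a hardware-agnostic node that turns a goal angle into a twist
// command. Nothing here cares which machine it runs on.
#include <ros/ros.h>
#include <std_msgs/Float64.h>
#include <geometry_msgs/Twist.h>

ros::Publisher cmd_pub;

void goalCallback(const std_msgs::Float64::ConstPtr &goal)
{
  // Trivial proportional turn-toward-goal command; the logic is identical
  // whether this node lives on the RIO, the Jetson, or a laptop.
  geometry_msgs::Twist cmd;
  cmd.angular.z = 0.5 * goal->data;
  cmd_pub.publish(cmd);
}

int main(int argc, char **argv)
{
  ros::init(argc, argv, "turn_to_goal");
  ros::NodeHandle nh;
  cmd_pub = nh.advertise<geometry_msgs::Twist>("cmd_vel", 10);
  ros::Subscriber sub = nh.subscribe("goal_angle", 10, goalCallback);
  ros::spin();
  return 0;
}
```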

As to the RIO’s power, agreed, it’s nice, but there’s always room for more (e.g. we have this kinda working on our bot and it is CPU-hungry: https://www.youtube.com/watch?v=e1Bw6JOgHME)

How do you plan on using the mapping/localization on the field, given that the see-through walls won’t be detected very well and the surrounding areas around the field will change from competition to competition? Or do you plan to remap the field at each competition?

Also, was the main purpose of using ROS to implement transform frames, so that you can account for the location of the camera with respect to the rest of the robot? The only other use of ROS I can see is for some sort of localization using a LIDAR or stereo camera.

Examine the field perimeter closely.

The transformation of coordinate systems is just one of many reasons we took on this transition. There are a lot of uses for ROS - for instance, it reduces the amount of work it takes for us to add additional advanced sensors to the robot. It also opens up access to a lot of sensors that would previously have required custom drivers and software.

Mapping results don’t necessarily need to look like human-readable maps to work for localization. Granted, there would be problems if the robot tried to drive through a field perimeter it can’t see, but it isn’t a deal-breaker.
And yeah, because of that we might end up remapping for each new field (or even each new match, depending on how things work out). Right now we’re just barely scratching the surface of it.

Also, was the main purpose of using ROS to implement transform frames, so that you can account for the location of the camera with respect to the rest of the robot? The only other use of ROS I can see is for some sort of localization using a LIDAR or stereo camera.

These are two pretty important reasons, yes. Some of the ROS object detection code also uses it to identify where objects are with respect to the robot. That can be tied into localization.

But the main reason to use it is because pretty much everything in ROS is set up to use it already. No point in reinventing the wheel. We’re going through the pain of working with ROS’s framework to pick up all the cool tools that others have developed for it. Using the tf stuff is yet another way to get closer to that goal.
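As a rough illustration of what “using the tf stuff” buys you, here’s a sketch of transforming a detected object from the camera frame into the robot’s base frame with tf2. The frame names (“zed_camera_frame”, “base_link”) and the fake 2 m detection are assumptions for the example; they depend on how the robot’s tf tree is actually set up.

```cpp
// Sketch: ask tf2 where a camera-frame detection is relative to the robot base.
// tf2 chains camera -> ... -> base_link for us, so we never hand-write the math.
#include <ros/ros.h>
#include <geometry_msgs/PointStamped.h>
#include <tf2_ros/transform_listener.h>
#include <tf2_geometry_msgs/tf2_geometry_msgs.h>

int main(int argc, char **argv)
{
  ros::init(argc, argv, "object_to_base_frame");
  ros::NodeHandle nh;

  tf2_ros::Buffer tf_buffer;
  tf2_ros::TransformListener tf_listener(tf_buffer);

  ros::Rate rate(10);
  while (ros::ok())
  {
    // Pretend detection: an object 2 m in front of the camera.
    geometry_msgs::PointStamped in_camera;
    in_camera.header.frame_id = "zed_camera_frame";
    in_camera.header.stamp = ros::Time(0);  // "latest available transform"
    in_camera.point.x = 2.0;

    try
    {
      geometry_msgs::PointStamped in_base =
          tf_buffer.transform(in_camera, "base_link", ros::Duration(0.1));
      ROS_INFO("Object at (%.2f, %.2f) in base_link",
               in_base.point.x, in_base.point.y);
    }
    catch (const tf2::TransformException &ex)
    {
      ROS_WARN_THROTTLE(1.0, "tf lookup failed: %s", ex.what());
    }
    rate.sleep();
  }
  return 0;
}
```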

Thanks, Team 900, for introducing ROS and writing it up so well!

I’m a new mentor for team 4638 and a professional software engineer using ROS at work. So I was excited to read that ROS has been proven out in FRC.

Unfortunately, I haven’t been able to make it work myself, though. I have the RoboRIO trying to connect to a Jetson TX2. The Jetson is the ROS master, running Kinetic. And the RoboRIO is using ROS for LabVIEW, installed from source from the Tufts GitHub site. When the RIO tries to register a node, it fails to send any traffic to the master. (Earlier in the process, the RIO is able to open and close a connection to the ROS core, confirming that it’s available.) And we end up with an error 403302, as described in a couple of the unresolved issues on the GitHub site.

So I’m trying to figure out how to navigate this minefield. Are the ROS users here running the master on the RIO or on an external processor, like the Jetson? What version of the ROS for LabVIEW code are you using (source or VIPM installation)? Did you have to modify it at all to run on the RIO? Is there some trick to deploying the ROS for LabVIEW library to the RIO? In one of the issues on GitHub was the comment, “there’s something wrong with where ROSDefinitions gets stored… The Global variables aren’t being set correctly.” This seems to be our problem. I’m hoping that if you all share some more details of how you’re employing ROS, I can work through this.

Thanks,

Dave Wheeler