IMUs

Hello! My team is fairly new to sensor-based driving, and we would like to consider using an IMU to help us out with positioning during autonomous. I’d like the robot to traverse a defense and correct its position and angle for a clear shot based on IMU or other sensor data.

At first I had hoped to use the new Analog Devices ADIS16448, but we used all of our FIRST Choice credits. So I’m looking for something else. I’ve found a whole bunch on SparkFun, but I have absolutely no experience with wiring such things, and even after reading through some datasheets I’m not sure if they are compatible with the RoboRio.

Does anyone know if the FRC Gyro & Accelerometer Board that is coming free with every order is capable of delivering what I need?

Here’s the link to the SparkFun ones. Any of these compatible and/or recommended?

The navX is one more that I found which seems to be compatible, but it’s $100. Anybody have experience with this model in particular?

Thanks,

Benjamin

The NavX really is your best bet. It was developed specifically for FRC and has great documentation and code examples. Many teams used them successfully last year.

Yeah, it was looking to me like the best option. This post makes me doubt that a little though…it talks about ~1 meter drift per 15 seconds. I just wonder if that’s too much. I’ll still look into it. Thanks for your response. I’d appreciate any input from anyone else with experience!

Don’t worry about the NavX displacement having drift. It produces very good angle values, which is what everyone is actually using. Displacement is incredibly difficult to calculate accurately from an accelerometer: first you have to integrate the acceleration to compute velocity, then you have to integrate velocity to get position/displacement, so any small error in the acceleration piles up twice.
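
To see why, here’s a toy sketch (the bias number is made up, but realistic for a cheap MEMS part) of what that double integration does to a robot that is actually sitting still:

    // Toy sketch: why double-integrating accelerometer data drifts.
    // The 0.05 m/s^2 bias is an assumption, not a measured value.
    public class DriftDemo {
        public static void main(String[] args) {
            double dt = 0.02;      // 20 ms loop, like a typical robot control loop
            double bias = 0.05;    // constant accelerometer bias in m/s^2 (assumed)
            double velocity = 0.0; // integrated once
            double position = 0.0; // integrated twice

            for (double t = 0.0; t <= 15.0; t += dt) {
                double accel = 0.0 + bias;  // robot is motionless; the sensor reads only its bias
                velocity += accel * dt;     // first integration: m/s
                position += velocity * dt;  // second integration: m
            }
            // A 0.05 m/s^2 bias becomes roughly 5-6 m of phantom "displacement" in 15 seconds.
            System.out.printf("Position error after 15 s: %.2f m%n", position);
        }
    }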

Okay. So I can do what I want without displacement? I can correct the robot’s angle and its position after it crosses to the courtyard?

We used the NavX last year just as a gyro, and it worked fairly well. The firmware has been updated significantly since then, so we’re liking it MUCH more now!

It really depends on what you want your sensor data to be telling you. You say you want to correct for angle and position. From that I take it you mean you want to drive over the defenses and have the robot head to a specific coordinate location on the field, with the right heading (presumably facing the tower). The problem with this plan of attack is that there aren’t many sensors which can provide a field-centric position (what you were asking for).

Sensors which can give you this information typically accumulate error over time, so you end up losing track of your coordinate position pretty quickly.

If you can remove the need for the sensor to provide you an absolute (field-centric) coordinate position, the door opens wide for sensors that can give you accurate data over the duration of the match.

Sensors that can provide you heading:

  • ADXL345 - this has been in the KOP for a number of years. It’s your standard rate gyro. In my experience you’re looking at about a degree of drift every few seconds. The drift makes using this sensor difficult.
  • ADXRS453 Gyro ($76 eval board from Digi-Key). We used this last year with great success. I believe a number of other teams have used this sensor as well. Pretty sure it’s what’s on the Spartan MXP board that came out this year. We have observed about 1-2 degrees of drift over the duration of the match. Our code to interface with it is linked here. This is a single-axis gyro, so it’s only going to be able to provide heading. And like other rate gyros, it requires calibration to take place while the robot is sitting still. We’ve written code to detect when the calibration is inaccurate and force a re-cal. If you go this route, pay attention to the method supporting calibration inside of the Robot class.
  • Invensense MPU - and many other IMUs - these usually provide a gyro, accelerometer, and magnetometer/compass. You’re going to be able to find some of these IMU boards pretty cheap; the problem is they only give you raw sensor data. You need the code to be able to turn that data into Euler angles. If you haven’t spent time working on these kinds of systems before, I wouldn’t suggest starting in season.
  • NavX - As previously mentioned in this thread, this is a product specifically developed for FIRST and used by many teams with success. You’re likely going to be able to get support if you use this device. They couple one of the Invensense IMUs with a processor. All the code to turn the raw sensor data into useable position data is on the board, and libraries to interface to it are already developed/tested for your use.
  • Bosch BNO055 - A new sensor that recently came out. It’s just like other IMUs in that it has an onboard suite of sensors (accelerometer, gyro, magnetometer), but what makes this one special is that it has on-board algorithms to fuse the sensor data and provide clean Euler angles or quaternions. Also, it’s really cheap (~$35) - you can get it for free with the Digi-Key PDV. I released code to interface with this sensor before the season started. There’s more documentation and info linked from that repository. So far it’s been looking very promising. I’ve run comparisons against the ADXRS453, and over the duration of a match they are within 1-2 degrees of one another. So it looks to provide very accurate heading, essentially zero drift, and fast calibration.

We will likely be using the BNO055 in season this year; if we find a problem with it, we will fall back to the ADXRS453.

For reference, the way we approach this problem on our team is through a combination of a camera and gyro.

  • The drive team (or auto code) gets the robot to the approximate location we want to shoot from on the field.
  • vision code identifies where we are relative to the goal (distance and angular displacement)
  • a gyro is used to correct for angular displacement by rotating the chassis to center our shot on the goal
  • in the past we have corrected our distance error by adjusting the RPM of our shooter, not by adjusting the position of our drivetrain on the field

In short, we use the visual targets on the goal to localize our robot, instead of requiring a sensor to keep track of our absolute field position over the duration of the match.
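
If it helps, here’s a rough Java sketch of that gyro-correction step. The gain, tolerance, and motor channels are placeholders (not our actual code), the vision error is assumed to come from wherever your vision code reports it, and I’m using the WPILib ADXRS450_Gyro class only because it’s convenient - any gyro that gives you an angle works:

    import edu.wpi.first.wpilibj.ADXRS450_Gyro;
    import edu.wpi.first.wpilibj.RobotDrive;
    import edu.wpi.first.wpilibj.Talon;

    // Sketch: rotate the chassis until the angular error reported by vision is (near) zero,
    // using the gyro as feedback so the turn settles instead of oscillating.
    public class AimHelper {
        private final ADXRS450_Gyro gyro = new ADXRS450_Gyro();
        private final RobotDrive drive = new RobotDrive(new Talon(0), new Talon(1)); // placeholder channels

        private static final double kP = 0.03;           // proportional gain (tune on the robot)
        private static final double kToleranceDeg = 1.0; // "close enough" to take the shot

        private double targetAngle;

        /** Call once when vision reports the angular error to the goal (degrees). */
        public void setTargetFromVision(double visionErrorDegrees) {
            targetAngle = gyro.getAngle() + visionErrorDegrees;
        }

        /** Call every loop iteration until it returns true. */
        public boolean aim() {
            double error = targetAngle - gyro.getAngle();
            if (Math.abs(error) < kToleranceDeg) {
                drive.arcadeDrive(0.0, 0.0);
                return true;                                         // squared up on the goal
            }
            double turn = Math.max(-0.5, Math.min(0.5, kP * error)); // clamp the rotation output
            drive.arcadeDrive(0.0, turn);                            // rotate in place
            return false;
        }
    }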

Don’t forget you have a bunch of PDVs in your kit of parts that can be used to buy sensors. The one from Digi-Key is good for $35.

Not only does the Bosch BNO055 look like a great option, but your methodology for autonomous is very helpful! Thank you so much for the great response!

Concur, the Bosch BNO055 Absolute Orientation IMU is a good choice for roll, pitch, and yaw (what is called an Attitude and Heading Reference System, or AHRS). It is cheap (well, actually free) and high performing (an advanced fusion algorithm is already integrated).

If you use LabVIEW, see this post; the RoboBees released code for this sensor before kickoff. It gives guidance on calibrating it and will store the calibration for you so you can use it in a competition.

As with any advanced sensor, sufficient time and experimentation is required to integrate into both the robot design and control system. You’ll want to ensure that magnetic fields generated by the robot don’t overly degrade the performance of any IMU.

Every team is getting an ADXRS450 as part of the second round of FIRST Choice. See http://firstchoicebyandymark.com/fc16-000. FIRST also added libraries to WPILib.

We’ve also used this process. The year it was most useful was 2012, going for the center bridge first, then shooting. See an example of our autonomous here: Quals 33 - Newton Division 2012 - The Blue Alliance

I’d suggest running the BNO055 in IMU mode (just gyro and accel) to remove any interference from motor EMI. This means the heading will be relative to the orientation the robot is in at the start of the match (instead of to Earth’s magnetic field), but I find that more useful personally.
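
For reference, switching modes is a single register write. Here’s a bare-bones Java sketch over I2C (register addresses are from the Bosch datasheet; the existing BNO055 libraries already handle this and much more, so treat it purely as illustration):

    import edu.wpi.first.wpilibj.I2C;
    import edu.wpi.first.wpilibj.Timer;

    // Bare-bones sketch: put a BNO055 into IMU fusion mode (gyro + accel, no magnetometer)
    // and read the fused Euler heading over I2C.
    public class Bno055Sketch {
        private static final int BNO055_ADDRESS  = 0x28; // default I2C address
        private static final int OPR_MODE_REG    = 0x3D; // operation mode register
        private static final int OPR_MODE_IMU    = 0x08; // fusion mode without the magnetometer
        private static final int EUL_HEADING_LSB = 0x1A; // Euler heading register, 16 LSB per degree

        private final I2C i2c = new I2C(I2C.Port.kOnboard, BNO055_ADDRESS);

        public Bno055Sketch() {
            i2c.write(OPR_MODE_REG, OPR_MODE_IMU); // heading is now relative to power-on orientation
            Timer.delay(0.03);                     // give the chip a moment to switch modes
        }

        /** Heading in degrees, relative to the orientation at startup. */
        public double getHeading() {
            byte[] buffer = new byte[2];
            i2c.read(EUL_HEADING_LSB, 2, buffer);
            int raw = ((buffer[1] & 0xFF) << 8) | (buffer[0] & 0xFF); // little-endian 16-bit value
            return raw / 16.0;
        }
    }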

Every team is getting an ADXRS450 as part of the second round of FIRST Choice.

Joe, thanks for pointing that out. I hadn’t looked at the specs on this board yet. Looks like the difference between the 450 and the 453 is a drift rating of 25 deg/hr vs. 16 deg/hr, respectively.

So if you just need heading (yaw), not roll or tilt, this looks like a great option. Especially because it’s free! The Java code I linked to above should work with the 450 as well. It was used all last season without error. I obviously haven’t tested it yet with the 450, since nobody has these sensors in hand yet. Also, I wouldn’t be surprised if a future WPILib update included a class for this sensor, since all teams are receiving them.

Forgot to mention that in my first post. It’s in the library now. ADXRS450_Gyro
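
Basic usage looks something like this (a minimal sketch; the no-argument constructor assumes the gyro board is on the onboard SPI port, and the method names come straight from the WPILib class):

    import edu.wpi.first.wpilibj.ADXRS450_Gyro;

    // Minimal use of the WPILib ADXRS450_Gyro class for the FIRST Choice gyro board.
    public class GyroExample {
        private final ADXRS450_Gyro gyro = new ADXRS450_Gyro(); // defaults to onboard SPI CS0

        public void robotInit() {
            gyro.calibrate(); // keep the robot perfectly still while this runs
        }

        public void autonomousInit() {
            gyro.reset();     // make the current heading read as zero
        }

        public double getHeading() {
            return gyro.getAngle(); // accumulated heading in degrees (can exceed 360)
        }
    }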

So, to poll the opinions of all of you who’ve responded (thanks very much btw!), I propose this question:

Should I use the BNO055 or the ADXRS450?

This is the situation as I see it:

They are both available to me for free (with the voucher, the BNO055 won’t cost me anything). They both have LabVIEW examples available. They are both probably above my learning curve, so I’ll be posting here for help (I’m a second-year programmer, and my team has never touched sensors).

I’d like to use one of them to help me build an autonomous that can traverse a defense and score a high goal. I want the sensor to help me get to an approximated shooting location in the courtyard. Vision code will identify our location relative to the goal, and then the sensor will correct the angular displacement and square us to the goal.

So, with the above in mind, and the goal that I have, BNO055 or ADXRS450?

Thank you all again so much for your help!

Hello!

Robotics engineer specializing in inertial nav here. Pretty much all inexpensive chip-scale MEMS IMUs are going to perform fairly similarly in terms of noise and stability over time, which is to say, fairly terribly when trying to do positioning. There are many barriers that make this difficult, but the way to think about it is that you’re integrating acceleration twice to find position, and the sensor is rarely, if ever, calibrated such that sitting motionless reads exactly zero acceleration - something called a sensor bias. In practice, what this means is that if you just sit there integrating, you’ll slowly drift off into outer space, and with these sensors the drift will be meters per minute (and the error grows quadratically with time, since the bias gets integrated twice). Some sensors and software packages are capable of detecting zero-motion conditions (literally, when the acceleration looks so small that the robot assumes there is no motion) and take that opportunity to get a better estimate of their biases (known in the lingo as a zero-velocity update, or ZUPT), but with how rarely a FIRST robot is stationary, I doubt it’s going to get a solid estimate during the game.
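
If you want to see what a ZUPT boils down to, here’s a toy sketch (thresholds and window length are made up; a real implementation works per-axis and feeds a proper filter):

    // Toy sketch of a zero-velocity update (ZUPT): when the acceleration has looked
    // motionless for a while, re-estimate the bias and reset the integrated velocity.
    public class ZuptSketch {
        private static final double STILL_THRESHOLD = 0.05; // m/s^2 that "looks motionless" (assumed)
        private static final int    STILL_SAMPLES   = 50;   // ~1 s of samples at 50 Hz (assumed)

        private double biasEstimate = 0.0;
        private double velocity = 0.0;
        private double stillSum = 0.0;
        private int stillCount = 0;

        /** Feed one raw accelerometer sample (m/s^2) each loop; dt is the loop period in seconds. */
        public void update(double rawAccel, double dt) {
            double accel = rawAccel - biasEstimate;

            if (Math.abs(accel) < STILL_THRESHOLD) {
                stillSum += rawAccel;
                stillCount++;
                if (stillCount >= STILL_SAMPLES) {
                    biasEstimate = stillSum / stillCount; // re-estimate the bias while motionless
                    velocity = 0.0;                       // the ZUPT itself: we know we aren't moving
                    stillSum = 0.0;
                    stillCount = 0;
                }
            } else {
                stillSum = 0.0;  // any real motion resets the "standing still" counter
                stillCount = 0;
            }

            velocity += accel * dt; // normal integration continues between updates
        }

        public double getVelocity() {
            return velocity;
        }
    }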

The answer to many of life’s difficulties in finding position with an IMU is to have an external aid of some sort - usually an encoder is preferred - and then fuse the IMU data and encoder data (encoders can be used as velocities or as incremental position estimates; we’ve had better luck in incremental position mode, simply because differentiation is usually noisy). That allows you to take different measurements, even ones that seem to disagree slightly, and put them together in a way that trusts particular sensors in different scenarios. The typical way to do that is a Kalman filter, but that may be beyond the scope of all but the most dedicated FIRST teams.
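
If a full Kalman filter is too much, you can still get a taste of the "trust different sensors in different scenarios" idea with a one-line complementary filter. This sketch (the blend factor and track width are assumptions, not a recommendation) fuses a gyro rate with a heading derived from left/right encoder deltas on a differential drive:

    // Sketch of a complementary filter for heading: mostly trust the gyro, but blend in a
    // little of the encoder-derived heading so slow gyro drift gets pulled back over time.
    public class HeadingFuser {
        private static final double ALPHA = 0.98; // how much to trust the gyro each loop (assumed)

        private final double trackWidthMeters;    // distance between left and right wheels (assumed input)
        private double fusedHeadingRad = 0.0;
        private double encoderHeadingRad = 0.0;

        public HeadingFuser(double trackWidthMeters) {
            this.trackWidthMeters = trackWidthMeters;
        }

        /**
         * @param gyroRateRadPerSec angular rate from the gyro this loop
         * @param dLeftMeters       change in left encoder distance this loop
         * @param dRightMeters      change in right encoder distance this loop
         * @param dt                loop period in seconds
         */
        public double update(double gyroRateRadPerSec, double dLeftMeters, double dRightMeters, double dt) {
            double gyroHeading = fusedHeadingRad + gyroRateRadPerSec * dt;        // integrate the gyro
            encoderHeadingRad += (dRightMeters - dLeftMeters) / trackWidthMeters; // differential-drive kinematics
            fusedHeadingRad = ALPHA * gyroHeading + (1.0 - ALPHA) * encoderHeadingRad;
            return fusedHeadingRad;
        }
    }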

Instead, what is usually done when you must have displacement but don’t have the time and/or resources to do a full inertial nav system is to use an IMU in AHRS mode and then use the encoders to get incremental position. Use the AHRS to find the direction of your motion and the encoders to get the magnitude of the motion, and hey presto, a not-terrible position estimate. Note that it will still degrade pretty rapidly, and spinning your wheels without moving (like getting in a shoving match) will destroy the position estimate, but it circumvents a lot of the typical issues of an inertial-only estimate - the error grows roughly linearly with distance traveled instead of quadratically with time, and it is guaranteed to stop drifting when motionless.
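
In Java-ish pseudocode that looks something like the sketch below; the heading comes from the AHRS, the distances come from your drive encoders, and every name here is a placeholder:

    // Sketch of heading-plus-encoder dead reckoning: the AHRS gives the direction of travel,
    // the encoders give the magnitude, and we accumulate an (x, y) estimate relative to the start.
    public class DeadReckoning {
        private double x = 0.0; // meters, relative to the starting pose
        private double y = 0.0;
        private double lastLeft = 0.0;
        private double lastRight = 0.0;

        /** Call every loop with the current encoder distances (m) and AHRS heading (deg). */
        public void update(double leftMeters, double rightMeters, double headingDegrees) {
            // Average of the two sides = how far the robot center moved since the last loop.
            double delta = ((leftMeters - lastLeft) + (rightMeters - lastRight)) / 2.0;
            lastLeft = leftMeters;
            lastRight = rightMeters;

            double headingRad = Math.toRadians(headingDegrees);
            x += delta * Math.cos(headingRad); // direction from the AHRS, magnitude from the encoders
            y += delta * Math.sin(headingRad);
        }

        public double getX() { return x; }
        public double getY() { return y; }
    }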

Using external aids like the vision system and field markers is the hands-down best way to ensure good field position estimates, but is often difficult and/or computationally expensive. FIRST does a great job of making some of the more advanced sensors easy to use, but it still takes an awful lot of finagling to get everything working right. In short, if you’re going to do something clever like that, leave plenty of time for tweaking.

Sparks

The ADXL345 is actually an accelerometer, not a gyro!

Everyone will be receiving an ADXRS450 in their FIRST Choice shipment this year! This sensor won’t give you position information, but it should be helpful for crossing defenses in autonomous. Code was also added to the official WPI libraries, so getting started should be easy!

The big difference between the 450 and the 453 is an additional calibration step at temperature.

I agree! Consumer-grade sensors don’t usually include calibration to remove misalignments, offset, etc. from the sensor outputs. That’s why I encourage teams to use the ADIS16448! The sensor doesn’t have a built-in AHRS mode, but I’ve put together an AHRS library which calculates Euler angles and allows your robot to use them for navigation in LabVIEW!

So, I went ahead and ordered a BNO055 using our voucher. So we will have both that and an ADXRS450 to play with.

Also, I didn’t know, but it seems our coach purchased a bunch of Talon SRXs. I’m just now starting to read up on them, but don’t they have some sort of encoder built in? I could try to merge that data with the BNO055 or the ADXRS450.

juchong, I’m a little unsure, are you saying the ADXRS450 won’t be suitable for what I’d like to do?

Hi bts! The Talon SRXs have a connector on them to allow you to wire encoders directly. You can then configure the motor controllers to use the encoder information in a PID loop (among other things).
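
As a rough sketch with the 2016-era CTRE library (exact class and method names vary between library versions, and the device ID, counts-per-rev, and gains are placeholders):

    import edu.wpi.first.wpilibj.CANTalon;

    // Rough sketch: tell a Talon SRX that a quadrature encoder is plugged into its data port,
    // then run a position PID loop on the Talon itself.
    public class TalonEncoderSketch {
        private final CANTalon driveTalon = new CANTalon(1); // CAN device ID 1 (placeholder)

        public TalonEncoderSketch() {
            driveTalon.setFeedbackDevice(CANTalon.FeedbackDevice.QuadEncoder);
            driveTalon.configEncoderCodesPerRev(360);         // e.g. a 360-count E4P (check your encoder)
            driveTalon.changeControlMode(CANTalon.TalonControlMode.Position);
            driveTalon.setPID(0.5, 0.0, 0.0);                 // placeholder gains; tune on the robot
        }

        /** Command the Talon to drive its encoder to a setpoint, in rotations. */
        public void goTo(double rotations) {
            driveTalon.set(rotations);
        }
    }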

The ADXRS450 will be suitable for getting your robot across obstacles and maintaining heading, but it will not provide you with a position in 3D space. Since the sensor only measures rotation about one axis, there isn’t enough information to calculate position.

Okay! Forgive me, it’s our first year using sensors - do you recommend any encoders? I’ve heard of something called an E4P, and we actually have a few of those.

Also, I’d be fairly happy with just getting across the defense and maintaining heading, but is there something more I could do with the BNO055?

Thanks,

Benjamin

We’ve used the E4P encoders for many years with great success, but they’re tricky to install. It all depends on which wheelbox you’re planning on using.

In theory you could use the BNO055 to calculate position in space, but as others pointed out before, you’ll have to overcome some issues to get that working!

Okay. I’ll get with our drivetrain guys to see if we could fit an E4P in. Otherwise, do you have suggestions for something easier to install and use?

Also, we’re using a 6-CIM drivetrain with the Talon SRXs. We’ll probably need to use encoders on our feeder/shooter motors as well. Do we need an encoder for each of these? 6+ encoders?

Thanks,

Benjamin