Does anyone have a base Java code for this sensor because I can not even find anything on it or the repository for it.
The driver for that sensor is now in WPILib itself. There is an ADIS16470_IMU class.
Does this IMU have support for getting pitch? From what I see in the docs, there doesn't appear to be a method for that.
You can pass a different IMUAxis upon construction to change which axis is used, but you can still only get 1 axis per sensor.
That’s interesting. So you wouldn’t be able to control the yaw for trajectory tracking if you used the IMU for pitch control?
Correct. The device only supports integrating a single axis on the device.
Actually, it turns out I was wrong, and you can get those values. They just had an odd name. getXComplementaryAngle and getYComplementaryAngle are the functions to get those values. They’re named a bit weirdly because which one to use changes based on which orientation the gyro is positioned in.
@Thad_House, can you elaborate on the ComplementaryAngle functions? We’ve been looking for documentation on those and haven’t found any.
Looking at the ADIS16470_IMU class, it appears that indeed only one gyro axis is integrated. I’m not clear on where the Complementary angles come from, but it looks like they’re based on accels rather than gyros. In which case, it seems like they won’t be accurate unless the sensor is not under any acceleration other than gravity, right?
Also, can you comment on the orientation of the X and Y complementary angles relative to the sensor and/or the yaw axis?
@juchong can you shed any light on the above?
From my chat with the person who wrote the code: because gravity is working for you in those two axes, you don’t actually need to do full rate integration; you can just fuse the accelerometer and gyro readings with a complementary filter, which is what is done to get those values.
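A complementary filter of this kind is simple: trust the integrated gyro rate at high frequency and the accelerometer-derived tilt at low frequency. Here is a minimal sketch of the standard form of such a filter (this is my own illustration, not the vendor's actual code; the class and field names are made up):

```java
public class ComplementaryFilter {
    private final double alpha;   // weight on the integrated gyro term, e.g. 0.98
    private double angleDeg;      // fused angle estimate, in degrees

    public ComplementaryFilter(double alpha) {
        this.alpha = alpha;
    }

    /**
     * @param gyroRateDegPerSec angular rate from the gyro, deg/s
     * @param accelAngleDeg     tilt angle derived from the accelerometers, deg
     * @param dtSec             time since the last update, seconds
     * @return the fused angle estimate, degrees
     */
    public double update(double gyroRateDegPerSec, double accelAngleDeg, double dtSec) {
        // High-pass the gyro (integrate rate), low-pass the accel tilt.
        angleDeg = alpha * (angleDeg + gyroRateDegPerSec * dtSec)
                 + (1.0 - alpha) * accelAngleDeg;
        return angleDeg;
    }

    public double getAngle() {
        return angleDeg;
    }
}
```

The filter slowly converges to the accelerometer's tilt when the robot is stationary, while short-term motion is tracked by the gyro term, which is exactly why the result degrades when the accelerometers see something other than gravity.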
As for the orientation, I think if you’re looking at the sensor from the top, mounted to the roboRIO with the roboRIO pointing forward, X is left and right, and Y is forward and back.
Thanks! We’ll give that a shot and see if we can get it to work the way you described. (We’d do that in any case, but it’s good to have some confirmation of our assumptions.)
We were finally able to test the complementary angle methods. Their angles did seem to track reasonably well, although we were seeing quite significant errors in reported angles when the sensor was under acceleration. Is there something I’m missing, or is this expected?
this happens with our robot as well.
The way the ADIS16470 library was written, the class only pulls one gyro axis at a time from the IMU. The ComplementaryAngle functions are used to get the downward direction as read by the accelerometers. They do process based on the rotational velocity in X or Y, but if your robot is accelerating, this reading will be wrong.
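The accelerometer-only part of that estimate is just geometry on the measured specific force, which is why any linear acceleration shows up directly as tilt error. A hedged sketch of the idea (not the library's internal math; names are illustrative):

```java
public final class AccelTilt {
    private AccelTilt() {}

    /**
     * Tilt about one axis, in degrees, derived from the measured specific
     * force. This is only a valid tilt angle when the sole acceleration
     * acting on the sensor is gravity.
     *
     * @param accelAlongAxisG measured acceleration along the horizontal axis, in g
     * @param accelVerticalG  measured acceleration along the vertical axis, in g
     */
    public static double tiltDeg(double accelAlongAxisG, double accelVerticalG) {
        return Math.toDegrees(Math.atan2(accelAlongAxisG, accelVerticalG));
    }
}
```

At rest on a level surface, `tiltDeg(0.0, 1.0)` is 0°. But a level robot accelerating forward at 0.2 g reads `tiltDeg(0.2, 1.0)` ≈ 11.3°, which is pure error: the sensor cannot distinguish that acceleration from a real tilt.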
This is not to say that the ADIS16470 is incapable of reading all three gyro axes. I am not sure why, when the library was first written, the SPI packet was restricted to a single gyro axis of your choosing. It could be the processing limitations of the roboRIO, or maybe the buffer overflows if you pull all three gyro axes. I do not see anything in the datasheet that would prevent pulling all three, though I may have missed it.
340 has been rewriting the library, but I feel we are short on testing. I will post our copy later, once it is cleaned up, but this will be in Java. Here is an older version that has the key feature changes.
I am sorry to C++ and LabVIEW teams as I’m not sure we have time to write this.
What needs to happen, if you should need to rewrite the library yourself, is a change to the SPI packet that the ADIS16470 sends to the roboRIO, and parsing it differently. These are the steps we took, with links to clarify:
(1) Use a different packet that requests the X, Y, and Z gyros in addition to the other data.
(2) Create members to store the accumulated gyro data for all three axes.
(3) Set the autoSPI to use the aforementioned packet.
(4) Alter the total count of data points per packet; 8 have been added.
(5) Change how the packet is read: first the X gyro delta is read, then the Y, then the Z. Note that all the index variables must increase by one at each use in this section.
(6) If this is the first run, set all accumulated member angles to zero; otherwise, add the gyro deltas to the accumulated member angles.
(7) Create accessors to pull the accumulated member angles for each axis.
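The accumulation part of the steps above (2, 6, and 7) can be sketched in plain Java. This is only an illustration of the bookkeeping, with made-up names, not the actual code from the modified driver:

```java
public class ThreeAxisAccumulator {
    // (2) Accumulated angle for each gyro axis, in degrees.
    private double accumX, accumY, accumZ;
    private boolean firstRun = true;

    /**
     * (6) Called once per decoded packet with the per-axis angle deltas.
     * The first packet only initializes the accumulators; subsequent
     * packets add their deltas.
     */
    public void addDeltas(double dxDeg, double dyDeg, double dzDeg) {
        if (firstRun) {
            accumX = accumY = accumZ = 0.0;
            firstRun = false;
        } else {
            accumX += dxDeg;
            accumY += dyDeg;
            accumZ += dzDeg;
        }
    }

    // (7) Accessors for the accumulated angle on each axis.
    public double getAngleX() { return accumX; }
    public double getAngleY() { return accumY; }
    public double getAngleZ() { return accumZ; }
}
```

In the real driver this would run inside the loop that drains the autoSPI buffer, once per received packet.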
The most recent diff against MutliChannelADIS.java might also be helpful.
Rob, that’s pretty much what I expected to hear. I agree with everything you said.
Thanks for taking on the driver update, and for listing the details. I was thinking about doing that myself, but especially with the unknowns you listed, I didn’t want to start down that path half way through build season. We’ll give your older version a try. If you want someone to review and test your latest code, we’d be happy to help.
Even if it doesn’t happen during the season, we would be happy to integrate improvements into the WPILib one. We simply imported the last version written by the vendor with minimal changes, just to keep it working for teams.
The latest version of the ADIS16470 class is HERE.
The way it creates and processes the packet is the same as in the other version I posted. However, the documentation is improved, as are the methods and constructors. The naming is also clearer.
I have looked at the other ADIS IMU in WPILib, and it uses the same number of data points per packet as the modified one I am posting (27, shown HERE). This makes me think that my concerns about the buffer size are unfounded; similarly, the amount of math being done is only marginally more than with the ADIS16448.
When testing, it should be noted that the mount for the ADIS16470 could use some shims to make sure that the axes are true to the robot. This sensor does still drift a bit, and I would like to test it on more than a test bench RIO. But I am erring on the side of sharing what we have so far.
Thanks, Rob. We tested the previous version (with some modifications to the access methods - but not the acquisition or internal math), and found the X and Y angles to work great. Looking forward to testing your latest version.
One issue we had, as you noted, is the alignment of the sensor to the robot’s axes. Even though the sensor appears to be aligned with the robot, I know it’s not perfect. If my math is right, even a 2° misalignment can cause an error (“drift”) in the reported measurements of around 3%. This is most apparent in the X and Y angles, where we saw about 10° of error per Z rotation of the robot. To be clear, this is not a problem with the driver, but just a system error that we’ll need to account for.
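The misalignment arithmetic is easy to check: a tilt of θ between the sensor's Z axis and the robot's yaw axis couples roughly sin θ of any Z rotation into the X/Y angles. A quick back-of-the-envelope check in Java (my own illustration, not anything from the driver):

```java
public final class MisalignmentError {
    private MisalignmentError() {}

    /**
     * Approximate X/Y angle error, in degrees, coupled in by a mounting
     * misalignment of misalignDeg during a rotation of zRotationDeg about
     * the robot's Z axis (small-angle cross-coupling approximation).
     */
    public static double coupledErrorDeg(double misalignDeg, double zRotationDeg) {
        return Math.sin(Math.toRadians(misalignDeg)) * zRotationDeg;
    }
}
```

With a 2° misalignment, `coupledErrorDeg(2.0, 360.0)` ≈ 12.6°, i.e. about 3.5% of the rotation, which is in the same ballpark as the ~10° of error per full Z rotation observed above.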
Came to this thread too late, seeing it the day before competition, but glad to learn more. We decided not to test a custom driver on the day of, even though I think it would have resolved our autonomous issues. I hope to test more and then maybe get it into WPILib in the future, which could help many teams with this gyro. Thanks for putting this info here; I’ll keep up with anything I learn testing it now that we aren’t concerned about how it could impact teleop driving.
Another issue my team found with this gyro was unreliability. Before competition, in the pits, it was working fine. Then when we got to matches, it failed, reading a crazy tilt angle of 3475.472 and a turn angle of about 70000.00, so we had neither our auto-balance auto nor our two-piece auto. Some matches it would work and some it would not. Here is a video of the first match where it stopped working: