#1
Using navX MXP to get robot velocity
From this thread:
Quote:
#2
Re: Using navX MXP to get robot velocity
Now that you bring it up, I'm really excited to give it a go before the season starts. Too bad I'm a college student and don't have access to the hardware for this endeavor.
The way I see it, though, the noise of the accelerometer will result in an error many orders of magnitude greater than the error in robot velocity from drivetrain encoders. While we're on the subject of sensors for obtaining robot velocity, has anyone tried estimating velocity from multi-point sonar (or another range finder)? Not that it would be in any way practical, just a thought. I don't mean to hijack this thread; sorry if I do.
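As a rough illustration of the rangefinder-velocity idea, here is a minimal sketch (not any team's actual code; the sample rate and smoothing window are assumptions). Velocity toward or away from a fixed surface can be estimated by finite-differencing successive range readings, with a short moving average to tame sonar noise:

```python
# Hypothetical sketch: estimating robot velocity along one axis from
# successive rangefinder readings (e.g. sonar aimed at a wall).
# A raw finite difference is noisy, so a small moving average is applied.
def velocity_from_ranges(ranges_m, dt_s):
    """Finite-difference velocity estimates between consecutive range samples."""
    return [(r1 - r0) / dt_s for r0, r1 in zip(ranges_m, ranges_m[1:])]

def moving_average(values, window=3):
    """Trailing moving average; early entries use however many samples exist."""
    out = []
    for i in range(len(values)):
        lo = max(0, i - window + 1)
        out.append(sum(values[lo:i + 1]) / (i - lo + 1))
    return out

# Robot closing on a wall at 0.5 m/s, sampled at 20 Hz (dt = 0.05 s):
ranges = [3.0 - 0.5 * 0.05 * k for k in range(6)]
v = velocity_from_ranges(ranges, 0.05)   # each entry is about -0.5 m/s
v_smooth = moving_average(v)             # negative: the range is shrinking
```

The sign convention is that a shrinking range reads as negative velocity; in practice the differencing amplifies sonar quantization noise, which is one reason this is more of a thought experiment than a practical sensor.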
|
#3
Re: Using navX MXP to get robot velocity
Quote:
The navX-MXP can track the rotation and remove the gravity component, since it fuses the gyroscope data with the accelerometer data. The navX-MXP v. 2.0 firmware has been modified to calculate velocity and displacement onboard in real time. However... after analyzing the resulting displacement data, my determination is that it's accurate to about 1 meter over a 15-second autonomous period, which is not accurate enough for the types of things we are doing.

There are a few reasons for this level of accuracy. The primary one is that the noise in the current generation of MEMS accelerometers, such as those in the navX-MXP, is high enough that when the data is double-integrated, a large error accumulates (for a constant bias, the displacement error grows with the square of the elapsed time). That is what leads to the inaccuracy. Lots of work went into filtering the data, and also into disabling integration when no motion is detected, but the results still aren't near what we need for use in autonomous.

Based on this investigation, my current estimate is that we need noise levels about 1/100th (2 orders of magnitude) lower than current technology to get to an error of about a centimeter or so over the autonomous period. When technology reaches that point, we'll build a board that makes this feature available for use in FIRST FRC and FTC robots. Until then, it's a waiting game. STMicro is making good strides toward lower-noise MEMS accelerometers (25 ug/sqrt(Hz) is their best noise spec); those accelerometers are more expensive, and they're still not at the level we'll need (~2 ug/sqrt(Hz)). Looking into my crystal ball, I think we're roughly 5 years away from something viable.

If you do want to play around with what the navX-MXP calculates, there's a new "Experimental" button in the navXUI that, along with the new v. 2.0 firmware, will display integrated velocity and displacement in the x and y axes. It's marked "experimental" for a reason...
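To see why double integration is so unforgiving, here is a minimal sketch (not the navX firmware; the 0.01 m/s^2 bias and 100 Hz rate are illustrative assumptions). A small constant accelerometer bias, double-integrated, produces a displacement error of roughly 0.5 * b * t^2, which for these numbers lands right around the ~1 m over 15 s figure quoted above:

```python
# Illustrative sketch (not the navX firmware): double-integrating a small
# constant accelerometer bias shows why displacement drifts quadratically.
def integrate_displacement(accel_samples, dt):
    """Simple Euler double integration of acceleration to displacement."""
    v = x = 0.0
    for a in accel_samples:
        v += a * dt   # acceleration -> velocity
        x += v * dt   # velocity -> displacement
    return x

bias = 0.01           # m/s^2: an assumed low-frequency accelerometer bias (~1 mg)
dt = 0.01             # 100 Hz sample rate (assumed)
t = 15.0              # a 15-second autonomous period
n = int(t / dt)
drift = integrate_displacement([bias] * n, dt)
# Closed form for Euler integration: 0.5 * bias * t * (t + dt) ~= 1.13 m,
# in line with the ~1 m over 15 s reported above. Cutting the bias by two
# orders of magnitude scales the drift to ~1 cm, matching the estimate.
```

The quadratic growth is the key point: halving the noise only halves the drift, but halving the *time* quarters it, which is why short integration windows (or frequent zero-velocity resets) help so much.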
#4
Re: Using navX MXP to get robot velocity
Initially, I thought the Bosch integrated IMU was better than the navX-MXP for this. However, there's a difference between on the kitchen table and on a robot. On a real robot the error is about the same as Silbert noted. We also looked at several gaming mice back in 2013. The problem with mice is the carpet; if we played on smooth flooring they could be a solution. They also lose their accuracy at higher velocities. If someone wanted to put the effort into it, a camera several inches off the carpet with proper illumination, coupled with advanced optical-flow algorithms, might work for FRC bots.
For our team, machine-vision solutions have been beyond our reach with the resources we have. We are a swerve team. To navigate the 3-tote auto this year we used encoders on all 4 wheels, a navX-MXP, and a sonar sensor referencing the distance to the player-station wall. Fusing these inputs, we were able to perform the several motions needed to accomplish the 3-tote auto reliably. Maybe some day autonomous navigation will be plug and play, but for now it's hard. I commend the teams that did the 3-tote auto this year.
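The encoder-plus-sonar fusion described above can be sketched as a simple complementary filter (a hypothetical illustration, not the team's actual code; the gain, rates, and error figures are assumptions): encoders provide smooth short-term velocity, and the sonar's absolute wall distance gently corrects the accumulated drift.

```python
# Hypothetical sketch of the fusion idea above: dead-reckon position from
# encoder velocity, then nudge the estimate toward the sonar's absolute
# wall-referenced distance. Gain and rates are illustrative assumptions.
def fuse_step(x_est, encoder_v, sonar_x, dt, alpha=0.05):
    predicted = x_est + encoder_v * dt                 # dead-reckon from encoders
    return (1 - alpha) * predicted + alpha * sonar_x   # blend in sonar fix

dt, v_true, enc_v = 0.01, 1.0, 1.05   # encoders with an assumed 5% scale error
fused = dead_reckoned = true_x = 0.0
for _ in range(200):                  # 2 seconds at 100 Hz
    true_x += v_true * dt
    dead_reckoned += enc_v * dt
    fused = fuse_step(fused, enc_v, true_x, dt)  # sonar assumed accurate here
# dead reckoning alone drifts ~10 cm in 2 s; the fused estimate settles
# to a steady-state error of roughly 1 cm.
```

The appeal of this structure is that each sensor covers the other's weakness: the sonar bounds the encoders' scale and slip error, while the encoders smooth over individual bad sonar pings.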
#5
Re: Using navX MXP to get robot velocity
Interesting to note: Kauailabs is working on a prototype of this (a PX4Flow optical-flow sensor, a navX-based sensor, LEDs, a proximity sensor, and sensor fusion). Illumination and optical focus are tricky issues, especially at higher speeds (which require higher frame rates), and an autofocus mechanism may be required to make it easy to configure. As you say, it's not trivial; our goal is to determine its accuracy and repeatability, and whether it can be manufactured as a plug-and-play product at a reasonable cost. If enough progress is made, we hope to present some preliminary results next year. Fusing optical flow with the IMU, with wheel encoders as a fallback when the optical-flow SNR is too low, may be the way to go.
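The fallback strategy mentioned at the end can be sketched very simply (a hypothetical illustration; the threshold value and function names are assumptions, though the PX4FLOW does report a 0-255 flow-quality value):

```python
# Hypothetical sketch of the fallback strategy above: trust the optical-flow
# velocity when its quality metric is high, otherwise fall back to wheel
# encoders. The threshold is an assumption, not a tuned value.
def select_velocity(flow_v, flow_quality, encoder_v, min_quality=100):
    """flow_quality: e.g. the PX4FLOW-style 0-255 quality value."""
    if flow_quality >= min_quality:
        return flow_v      # flow is tracking carpet texture well
    return encoder_v       # low SNR (blur, poor light): use encoders

v = select_velocity(flow_v=1.2, flow_quality=200, encoder_v=1.0)  # uses flow
```

A real implementation would likely blend the two sources rather than switch hard, and hysteresis on the threshold would avoid chattering between them near the cutoff.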
#6
Re: Using navX MXP to get robot velocity
We tested using optical tracking of robot position a decade ago, using an optical mouse with the equivalent of a telephoto lens. When the distance from the sensor to the carpet is perfectly consistent, the results are great. But any variation in height changes the field of view, and that changes the relationship between motion of the image and physical motion of the robot. We couldn't eliminate vibration and bounce from the system, and eventually abandoned the project as unworkable on a real-world FRC robot. (It might have been worth trying for Lunacy, with the super-smooth floor and very low accelerations, but the trailer still caused some variations in tilt that would have to be compensated for somehow.)
If the carpet had a pattern on it that could be detected and measured to keep the horizontal distances calibrated, I think it would be worth revisiting the concept.
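The height sensitivity described above follows directly from the pinhole-camera model: image motion scales as 1/height, so any mismatch between the actual and calibrated height scales the recovered displacement. A minimal worked example (all numbers illustrative):

```python
# Pinhole-model sketch of the height sensitivity described above: observed
# pixel motion = f * displacement / height, so converting pixels back to
# distance with the wrong (calibrated) height scales the result.
def recovered_displacement(true_disp_m, actual_h_m, calibrated_h_m):
    # pixels = f * true_disp / actual_h; dividing back out with calibrated_h,
    # the focal length f cancels:
    return true_disp_m * calibrated_h_m / actual_h_m

# Assumed numbers: 100 mm calibrated standoff, robot bounces 5 mm upward.
err = recovered_displacement(1.0, 0.105, 0.100) - 1.0
# About a -4.8% distance error from a 5 mm height change: bounce of a few
# millimeters is enough to swamp encoder-level accuracy.
```

This is also why the "telephoto" approach made things worse in one respect: a longer standoff reduces the *relative* height variation, but vibration-induced tilt then sweeps the small field of view across the carpet.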
#7
Re: Using navX MXP to get robot velocity
Quote:
The vibration part would seem to be manageable with a very high frame rate in the sensor; our previous experiments worked with the ADNS3080 optical mouse sensor, which had a 6400 Hz frame rate if I recall correctly. What frame rate were you using, and what was the general frequency of the vibration? The resolution of the ADNS3080 was very low, which is why our hope is that the much higher resolution of the PX4FLOW sensor will let it see more structure in the carpet than the ADNS3080 could. Getting enough CPU and illumination to use the higher resolution at a high frame rate is the current challenge.
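A back-of-envelope sketch of the frame-rate tradeoff (the field-of-view figure is an assumption for illustration; the ADNS3080's 30x30 pixel array and the PX4FLOW-class 64x64 flow window at roughly 400 Hz are approximate): how far the carpet moves between frames, and how many pixels that is, determines how hard the flow matching problem gets.

```python
# Back-of-envelope sketch: per-frame carpet motion at a given robot speed,
# and how many pixels that motion spans. Field of view (fov_mm) is an
# assumed number for illustration, not a measured value.
def motion_per_frame_mm(speed_mps, framerate_hz):
    return speed_mps / framerate_hz * 1000.0

def pixels_per_frame(speed_mps, framerate_hz, fov_mm, resolution_px):
    return motion_per_frame_mm(speed_mps, framerate_hz) * resolution_px / fov_mm

# At 5 m/s over an assumed 20 mm field of view:
adns = pixels_per_frame(5.0, 6400.0, 20.0, 30)  # ~1.2 px/frame: easy to track
px4 = pixels_per_frame(5.0, 400.0, 20.0, 64)    # ~40 px/frame: large search
```

The low-res sensor's huge frame rate keeps inter-frame motion around a pixel, which is why it tolerates vibration; the higher-res, lower-rate sensor must search a much larger displacement per frame, which is exactly the CPU-and-illumination problem described above.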
#8
Re: Using navX MXP to get robot velocity
Quote:
The vibration on our test rig was induced by a 2004-vintage Thomas air compressor (we used a pneumatic linear actuator to move a strip of carpet a known distance), at, I'd guess, a few dozen hertz. When we tried to use the sensors on an actual robot, the roughness of the wheels and the compliance of the carpet induced enough bounce and other vertical "noise" to make the readings nearly useless.
#9
Re: Using navX MXP to get robot velocity
Quote:
If we back up to the OP's question about velocity for motion profiling (rather than calculating displacement), do you think your test rig showed promise for velocity-vector estimation that would rival or exceed encoders?
#10
Re: Using navX MXP to get robot velocity
Quote:
When we gave up on the "telephoto mouse" project, we went to a trio of encoders (something like a US Digital S1, 64 counts per revolution) mounted on custom-fabricated follower wheels built something like the old "trick wheel" omnis. On carpet, they worked very well (though we soon discovered a hardware limitation in the IFI controllers' interrupt support, which we overcame with a simple TTL "quadrature hysteresis" circuit between the encoder and the digital input pins).
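For readers unfamiliar with the signals that circuit conditions, here is a sketch of standard 4x quadrature decoding (a generic illustration, not the IFI-era code): the two encoder channels form a 2-bit Gray code, and each valid transition steps the count by one in the direction of travel, while illegal transitions (bounce, missed edges) are ignored.

```python
# Sketch of standard 2-bit quadrature decoding. Each (A, B) pair is read as
# a 2-bit state; valid Gray-code transitions step the count, invalid ones
# (contact bounce, skipped states) contribute nothing.
STEP = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

def decode(samples):
    """Count quadrature edges from a sequence of (A, B) bit pairs."""
    count = 0
    prev = samples[0][0] << 1 | samples[0][1]
    for a, b in samples[1:]:
        cur = a << 1 | b
        count += STEP.get((prev, cur), 0)  # unknown jumps are dropped
        prev = cur
    return count

fwd = [(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)]  # one full cycle forward
# One cycle yields 4 counts (4x decoding), hence 64 CPR -> 256 counts/rev.
```

The hysteresis circuit mentioned above effectively enforces this same "ignore invalid transitions" rule in hardware, so the controller's limited interrupt capacity isn't wasted servicing bounce.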