Re: Inertial Measurement vs. Vision Tracking
Several things: First, it sounds like you and the other programmer are competing with each other. That isn't necessarily a bad thing, but if you're working on incompatible systems in different languages, it's going to make it hard to leverage each other's work when you have to produce a single system. Even if you're having a friendly competition to see which system will work better, you'll eventually want to be able to combine insights and code from both systems into your final robot. I don't know your backgrounds or team dynamics, but this might be a chance for you to learn C++ or for the other programmer to learn LabVIEW.
Now, as others have said, this isn't an either/or kind of problem. The IMU provides high-rate, continuous measurements, but it's subject to drift and integration error. A camera provides absolute, ground-truth fixes, but they arrive at a lower rate, with more latency, and with the occasional outlier from a bad target detection. The right solution is to use both, combining the strengths of each to produce a reliable position estimate.
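To make the "use both" idea concrete, here's a minimal sketch of a complementary filter for heading: dead-reckon from the gyro at the IMU rate, then nudge the estimate toward the camera's absolute heading whenever a fix arrives. All names, rates, and the blend constant are illustrative, not from any particular FRC library:

```cpp
#include <cassert>
#include <cmath>

// Complementary filter: integrate the fast gyro rate continuously, then
// blend toward the slow-but-absolute camera heading to cancel drift.
class HeadingFilter {
public:
    // cameraWeight in (0, 1]: how strongly to trust each camera fix.
    explicit HeadingFilter(double cameraWeight) : k_(cameraWeight) {}

    // Called at the IMU rate (e.g. 100 Hz): dead-reckon from the gyro.
    void onGyro(double rateDegPerSec, double dtSec) {
        headingDeg_ += rateDegPerSec * dtSec;
    }

    // Called whenever a camera fix is available (e.g. 10 Hz):
    // pull the estimate partway toward the absolute measurement.
    void onCameraFix(double absoluteDeg) {
        headingDeg_ += k_ * (absoluteDeg - headingDeg_);
    }

    double heading() const { return headingDeg_; }

private:
    double k_;
    double headingDeg_ = 0.0;
};
```

With a 1 deg/s gyro bias and a camera fix every 10th IMU sample, the heading error settles near a small bound instead of growing without limit; pure integration over the same second would already read a full degree of error.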
Gyro drift can be a problem, but it's not going to doom you in 15 seconds. Calibrate it as well as you can, reset the angle when you enter autonomous mode, and you should be fine. We made this stuff work ten years ago, and the IMUs today are significantly better than the ones we had in FRC back then. Accelerometer drift is a bigger problem than gyro drift, because getting position requires double-integrating the acceleration, so small bias errors grow quickly; I'm not qualified to say how well accelerometer-based position estimation works on a robot.
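The calibrate-then-reset routine above can be sketched like this: average the gyro output while the robot sits still to estimate its bias, subtract that bias from every reading, and zero the heading when autonomous begins. This is a generic sketch, not any vendor's gyro API; the class and method names are made up for illustration:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Gyro wrapper with bias calibration and a heading reset, the two steps
// that keep drift manageable over a short autonomous period.
class CalibratedGyro {
public:
    // Average readings taken while the robot is stationary; any nonzero
    // mean is bias, since the true rate is zero.
    void calibrate(const std::vector<double>& stationaryRatesDegPerSec) {
        double sum = 0.0;
        for (double r : stationaryRatesDegPerSec) sum += r;
        biasDegPerSec_ =
            stationaryRatesDegPerSec.empty()
                ? 0.0
                : sum / static_cast<double>(stationaryRatesDegPerSec.size());
    }

    // Call at the start of autonomous so heading is relative to the
    // robot's starting pose.
    void resetHeading() { headingDeg_ = 0.0; }

    // Integrate the bias-corrected rate at the sensor update rate.
    void update(double rawRateDegPerSec, double dtSec) {
        headingDeg_ += (rawRateDegPerSec - biasDegPerSec_) * dtSec;
    }

    double heading() const { return headingDeg_; }

private:
    double biasDegPerSec_ = 0.0;
    double headingDeg_ = 0.0;
};
```

With the bias removed, a stationary robot reads essentially zero drift over a 15-second autonomous period; without calibration, even a small constant bias integrates into several degrees of heading error.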
__________________
Need a physics refresher? Want to know if that motor is big enough for your arm? A FIRST Encounter with Physics
2005-2007: Student | Team #1519, Mechanical Mayhem | Milford, NH
2008-2011: Mentor | Team #2359, RoboLobos | Edmond, OK
2014-??: Mentor | Looking for a team...