Quote:
Originally Posted by Ross3098
I've been thinking about setting up a localization/field coordinate system for this game. The only problem I can see is the bump causing some of the wheels to be off the ground and the encoders giving inaccurate measurements.
I'm trying very hard to figure out a way around this issue because I have absolutely no idea how to do vision tracking in C++ this year.
In the "real world" of mobile robotics, you would typically deal with this problem by doing some sort of sensor fusion.
Basically, if you look at each of the sensors available to us, they are all useful for localization, but none of them is perfect:
* Gyros/Accelerometers: Very fast response and good accuracy initially, but because you are integrating angular rates and accelerations over time, error accumulates and they drift.
* Encoders: Very precise distance/speed measurements... as long as your wheels don't slip and you know the precise diameter of your wheels (see the dead-reckoning sketch after this list).
* Vision system: Seeing a known "landmark" like the goal tells you a great deal about your absolute position on the field, but you can't always see the goal, and you will sometimes get false alarms depending on the environment.
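To make the encoder point concrete, here is a minimal dead-reckoning sketch in plain C++ (no WPILib calls; the encoder deltas and gyro heading are just function parameters, so substitute whatever your sensor API actually returns). It shows how each loop's wheel travel gets integrated into a field pose, and therefore how a single slip going over the bump corrupts the estimate permanently:

```cpp
#include <cmath>
#include <cstdio>

// Pose in the field coordinate frame.
struct Pose {
    double x = 0.0;      // meters
    double y = 0.0;      // meters
    double theta = 0.0;  // radians
};

// Dead-reckoning update from one loop iteration's encoder deltas and the
// current gyro heading. Any slip (e.g., a wheel unloaded on the bump)
// corrupts the deltas, and that error is integrated into x/y forever --
// this is why encoder-only localization drifts.
void updateOdometry(Pose& pose, double dLeftMeters, double dRightMeters,
                    double gyroHeadingRadians) {
    const double dCenter = 0.5 * (dLeftMeters + dRightMeters);
    pose.theta = gyroHeadingRadians;  // trust the gyro for heading
    pose.x += dCenter * std::cos(pose.theta);
    pose.y += dCenter * std::sin(pose.theta);
}

int main() {
    Pose pose;
    // Fake data: drive straight 0.02 m per 20 ms loop for 1 second.
    for (int i = 0; i < 50; ++i) {
        updateOdometry(pose, 0.02, 0.02, 0.0);
    }
    std::printf("x=%.2f m, y=%.2f m\n", pose.x, pose.y);  // x=1.00 m
    return 0;
}
```

Note the asymmetry: a heading error stays bounded as long as the gyro is decent, but any error folded into x/y never goes away on its own, which is exactly the failure mode the original question worries about.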
In mobile robotics, you often find robots carrying exactly this arrangement of sensors. By fusing their outputs together, you can get a system that compensates for the individual failings of each sensor. For example, you might use your gyro for most of your heading measurements, but when you get a good shot of the goal, you "reset" your gyro to reduce or eliminate drift (a minimal sketch of this follows below).

Common fusion techniques include Extended and Unscented Kalman Filters. Unfortunately, getting these systems working is a Master's/PhD-level challenge, and it would be difficult for anyone to get them working well in 6 weeks (especially since you won't have a testable robot for much of that time).
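Here is a sketch of that "reset the gyro" idea in C++. This is not an EKF, just the simplest possible fusion, and VisionFix plus the update calls are hypothetical placeholders for whatever your camera and gyro code actually produce:

```cpp
#include <cstdio>

// Hypothetical vision result: an absolute field heading computed from a
// sighting of the goal. Your real vision code would produce something
// equivalent.
struct VisionFix {
    double headingRadians;
};

// Fuses a fast-but-drifting gyro with slow-but-absolute vision fixes by
// maintaining an offset that is recomputed whenever vision is confident.
class FusedHeading {
public:
    // Call every control loop with the raw gyro reading.
    void updateFromGyro(double rawGyroRadians) { lastRawGyro_ = rawGyroRadians; }

    // Call only when the vision system has a high-confidence fix on the
    // goal. This "resets" the gyro: all drift accumulated so far is
    // absorbed into the offset.
    void correctFromVision(const VisionFix& fix) {
        offset_ = fix.headingRadians - lastRawGyro_;
    }

    // Fused estimate: gyro-rate responsiveness, vision-anchored accuracy.
    double headingRadians() const { return lastRawGyro_ + offset_; }

private:
    double lastRawGyro_ = 0.0;
    double offset_ = 0.0;
};

int main() {
    FusedHeading heading;
    heading.updateFromGyro(0.10);                // gyro has drifted a bit
    heading.correctFromVision(VisionFix{0.00});  // goal says we're at zero
    heading.updateFromGyro(0.15);                // robot actually turns
    std::printf("fused heading = %.2f rad\n", heading.headingRadians());  // 0.05
    return 0;
}
```

A Kalman filter generalizes this: instead of trusting the vision fix completely, each measurement is weighted by its uncertainty, so a marginal goal sighting nudges the estimate rather than overwriting it.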
That said, I am hoping that at least a few ultra-high-end teams take on this challenge (I'm looking at you, 254).
Jared