My off-season focus is computer vision, and one topic I hope to get to is localization, i.e. figuring out where the robot is based on cues in the environment. One tool that could be used as part of the solution is stereo vision. There's only one problem. (Well, there might be more than one, but one at a time…) Somehow, I have to get two time-synchronized images from two different cameras.
That could be an issue. If the robot is moving, two images captured "at the same time" won't really be simultaneous, and I suspect even a small timing offset will be enough to completely throw off the distance measurements.
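Here's a quick back-of-envelope sketch of why I'm worried. All the numbers (focal length, baseline, robot speed, capture skew) are guesses, and sideways motion during the skew is crudely modeled as extra baseline, but it gives a feel for the size of the error:

```python
# Back-of-envelope: depth error from unsynchronized stereo capture.
# Every number here is a hypothetical assumption, not a measurement.

FOCAL_PX = 700.0    # focal length in pixels (assumed)
BASELINE_M = 0.10   # distance between the cameras in meters (assumed)
ROBOT_SPEED = 3.0   # robot speed in m/s (assumed)
SKEW_S = 0.010      # capture-time offset between cameras, 10 ms (assumed)
TRUE_DEPTH = 4.0    # actual distance to the target in meters (assumed)

# Stereo triangulation: depth Z = f * B / disparity, so disparity d = f * B / Z.
true_disparity = FOCAL_PX * BASELINE_M / TRUE_DEPTH

# If the robot slides sideways by v * dt between the two exposures, that
# motion looks like extra (or missing) baseline, which shifts the disparity.
motion_m = ROBOT_SPEED * SKEW_S  # 0.03 m of unwanted motion
skewed_disparity = FOCAL_PX * (BASELINE_M + motion_m) / TRUE_DEPTH

# Depth recovered from the skewed disparity using the nominal baseline.
measured_depth = FOCAL_PX * BASELINE_M / skewed_disparity
print(f"true depth: {TRUE_DEPTH:.2f} m, measured: {measured_depth:.2f} m")
```

In this toy model, a 10 ms offset at 3 m/s turns a 4 m target into roughly a 3.1 m reading, which is way too much error to be useful.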
One option I did see from a thread here was the Stereolabs ZED camera, but it's a bit pricey for FIRST. It barely fits into the newly revised cost limits, and it looks like it needs some pricey support hardware to go with it. So, I would rather find some other solution.
And so, before I go too far down a particular road, I’ll start by asking if there are any giants whose shoulders I can stand on. If you have already done it, I would be very happy to try and imitate your solutions.