We want to backtrack to where our vision routine took the robot from: store the initial encoder ticks, run the vision-based task (which includes driving), then drive backwards manually until the encoders are close to where they started.
Is there anything in WPILib that supports that sort of functionality? If not, we'll implement our own, but we didn't want to reinvent the wheel!
We actually have the Trajectory and Odometry classes implemented and use them to drive paths. The only problem is that when we go ‘off script’ with our vision, we can’t really use a preplanned path.
Care to elaborate on how we should use it in this case?
1. Grab the starting pose from odometry and store it.
2. Let vision do its thing.
3. Generate a path with TrajectoryGenerator from your current (post-vision) pose back to the stored starting pose.
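A minimal sketch of that flow, using a stand-in Pose record instead of WPILib's Pose2d so it runs on its own (the poses and distances here are made-up numbers; the real WPILib calls are noted in the comments):

```java
import java.util.List;

public class BacktrackSketch {
    // Minimal stand-in for WPILib's Pose2d (x, y in meters).
    record Pose(double x, double y) {}

    // Straight-line distance between two poses, used only to sanity-check
    // how far off-script the vision routine took us.
    static double distance(Pose a, Pose b) {
        return Math.hypot(b.x() - a.x(), b.y() - a.y());
    }

    public static void main(String[] args) {
        // 1. Before vision: store the current pose.
        //    Real WPILib: Pose2d start = odometry.getPoseMeters();
        Pose start = new Pose(1.0, 2.0);

        // 2. Vision routine drives the robot somewhere off-script.
        Pose afterVision = new Pose(3.5, 2.5);

        // 3. Real WPILib: generate the return trajectory with
        //    TrajectoryGenerator.generateTrajectory(afterVision, List.of(),
        //        start, config.setReversed(true));
        //    Here we just report how far there is to drive back.
        System.out.printf("Drive back %.2f m%n", distance(afterVision, start));
    }
}
```

With `setReversed(true)` on the TrajectoryConfig, the generated trajectory is followed driving backwards, which matches the "back out the way we came" intent.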