We want to backtrack to where our vision system brought the robot: store the initial encoder ticks, run our vision-based task (which includes driving), and then drive backwards manually until the encoders are close to the values they started at.
Is there anything in WPILib that supports that sort of functionality? If not, we will implement our own! But we didn’t want to reinvent the wheel!
This would be a perfect use for the WPILib Trajectory, Odometry, and path-tracking features, but the initial setup is pretty involved.
You could also just use a PID controller on the drive wheels and use that to drive back to the previous encoder position, if the path back is fairly straight.
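For the simple straight-line case, the idea above can be sketched without any library at all: store the encoder ticks, then run a proportional controller on the error until you're back within tolerance. This is a minimal sketch, not WPILib code; the names (`kP`, `TOLERANCE`) and gain values are assumptions you'd tune on the real robot.

```java
// Minimal sketch: drive back to a stored encoder position with a
// proportional controller. kP and TOLERANCE are made-up values; tune on-robot.
public class DriveBackSketch {
    static final double kP = 0.5;        // proportional gain (assumed)
    static final double TOLERANCE = 0.5; // acceptable error, in encoder ticks

    /** One controller step: output is proportional to the remaining error. */
    static double calculate(double currentTicks, double targetTicks) {
        return kP * (targetTicks - currentTicks);
    }

    public static void main(String[] args) {
        double startTicks = 0.0;     // stored before the vision task
        double currentTicks = 120.0; // wherever vision left the robot
        // Simulated drive loop: each iteration moves the robot by the output.
        // On a real robot this output would be a motor command instead.
        for (int i = 0; i < 50 && Math.abs(currentTicks - startTicks) > TOLERANCE; i++) {
            currentTicks += calculate(currentTicks, startTicks);
        }
        System.out.println(Math.abs(currentTicks - startTicks) <= TOLERANCE);
    }
}
```

On a real drivetrain you would feed `calculate(...)` into the motor output each loop iteration instead of mutating a simulated position, and you'd likely want separate controllers per side to stay straight.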
We actually have the Trajectory and Odometry stuff implemented and use it to drive paths… the only problem is when we go ‘off script’ with our vision, we can’t really use a preplanned path.
Care to elaborate on how we should be using it in this case?
The way I would do it:
Grab starting pose from odometry and store it.
Let vision do its thing
Then, you can generate a path using TrajectoryGenerator from your current (post-vision) pose to your original pose.
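The three steps above might look roughly like this in WPILib Java. This is a sketch, not drop-in code: it assumes an odometry object named `m_odometry`, and the max velocity/acceleration numbers in the `TrajectoryConfig` are placeholders you'd replace with your drivetrain's characterized limits.

```java
// Sketch of the "return to start" idea using WPILib's trajectory API.
// Assumes m_odometry (e.g. a DifferentialDriveOdometry) already exists.

// 1. Grab the starting pose from odometry and store it.
Pose2d startPose = m_odometry.getPoseMeters();

// 2. ... vision routine runs and drives the robot off-script ...

// 3. Generate a path from the current (post-vision) pose back to the start.
Pose2d currentPose = m_odometry.getPoseMeters();
TrajectoryConfig config = new TrajectoryConfig(2.0, 1.0); // max vel (m/s), max accel (m/s^2): assumed values

Trajectory returnPath = TrajectoryGenerator.generateTrajectory(
        currentPose,
        List.of(),   // no interior waypoints needed
        startPose,
        config);
```

You'd then follow `returnPath` with whatever tracking command you already use for preplanned paths (e.g. a Ramsete-based follower). If you want the robot to drive the return leg in reverse, `TrajectoryConfig.setReversed(true)` is worth a look.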
Oh my gosh, yes! That’s so much easier. Thank you!
Am I missing something in the docs? It looks like I have to pass in interior waypoints… maybe that’s somehow available in the odometry class?
You can pass an empty list and it’ll make a trajectory without interior points (we call those “knot points”, for future reference).
Ah cool! Nice and simple!