Is it possible to create a trajectory in autonomous from a 3D pose estimate produced by a camera running solvePnP? If so, is it viable, and is there anything we should know before trying this?
It’s possible. We attempted it in 2019.
The basic cycle was:
- Driver gets robot about 15 feet away from target, roughly pointed at it
- Driver hits “align” button
- solvePnP results are transferred from the camera/coprocessor to the RIO
- The solvePnP result is used to generate a path which drives the robot from its current position to a fixed position/orientation relative to the target
- The robot executes the path once generation completes
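The "fixed position/orientation relative to the target" step above can be sketched in a few lines. This is a hedged illustration, not our actual robot code: it assumes the solvePnP result has already been converted into a 2D target position and facing direction in the robot frame (x forward, y left, inches, radians), and that the goal is a fixed standoff straight out from the target face.

```python
import math

def goal_pose(target_x, target_y, target_normal, standoff):
    """Compute the path endpoint from a vision target pose.

    target_x, target_y: target position in the robot frame (inches).
    target_normal: heading of the target's outward face in the robot
                   frame (radians).
    standoff: how far in front of the target the robot should stop.

    Returns (x, y, heading): the goal sits 'standoff' inches out along
    the target's normal, with the robot facing back toward the target.
    """
    gx = target_x + standoff * math.cos(target_normal)
    gy = target_y + standoff * math.sin(target_normal)
    heading = target_normal + math.pi  # face the target
    return gx, gy, heading
```

For example, a target 100 inches straight ahead whose face points back at the robot (normal = pi) with a 24-inch standoff yields a goal at (76, 0). That goal pose and the robot's current pose become the two endpoints handed to the path generator.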
The biggest challenge we had was tuning our drivetrain path follower to work well in all cases. Our robot required +/- 1 inch precision to work well, but we could only get the camera/drivetrain combination within 2 or 3 inches.
We also didn’t spend a ton of time on it, so I’m fairly certain there’s a bug or two in our coordinate transforms. Using the newly-built-in WPILib coordinate transform functions would be advisable.
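WPILib's geometry classes (Pose2d, Transform2d) do this chaining for you in Java/C++. As a sketch of the underlying math they handle, here is a minimal pure-Python stand-in for pose composition; the class name mirrors WPILib's but this is not its actual API:

```python
import math

class Pose2d:
    """Minimal stand-in for a 2D pose: position (x, y) plus heading
    theta in radians. Only implements the frame-chaining composition."""
    def __init__(self, x, y, theta):
        self.x, self.y, self.theta = x, y, theta

    def transform_by(self, other):
        """Compose with 'other', a pose expressed in this pose's frame,
        returning the result in this pose's parent frame."""
        c, s = math.cos(self.theta), math.sin(self.theta)
        return Pose2d(self.x + c * other.x - s * other.y,
                      self.y + s * other.x + c * other.y,
                      self.theta + other.theta)

# Chain robot->camera and camera->target to get robot->target
# (numbers are made up for illustration):
robot_to_camera = Pose2d(10.0, 0.0, 0.0)     # camera 10 in ahead of robot center
camera_to_target = Pose2d(170.0, 12.0, 0.0)  # from solvePnP, flattened to 2D
robot_to_target = robot_to_camera.transform_by(camera_to_target)
```

Getting the rotation applied in the right order (and keeping camera-frame vs. robot-frame axis conventions straight) is exactly where hand-rolled transforms tend to pick up bugs, which is why the built-in library versions are worth using.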
Ok, thank you. Did you find that the provided solvePnP results were accurate enough for the trajectory, or were they a major contributing factor in the lack of precision?
With high resolution, low framerate, and sub-pixel feature identification, from about 15 feet away, we were seeing ~1 inch accuracy in solvePnP itself. The bigger errors came from flex in the camera mount and from the pneumatic rubber wheels.
Alright thank you! Do you remember what resolution you were using?
The max we could push the camera to.
In 2019, I believe that was 1280x960.
In 2020 we ran a similar pipeline at 1920x1080 - in this case, the high resolution was mostly to get accurate angles from 3/4 of the way down the field.
That being said, it really depends on your hardware and software assumptions. For these applications, I believe the answer is to use higher resolution: slower processing, but higher spatial accuracy in the results.
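To put rough numbers on the resolution/accuracy tradeoff, you can estimate how much lateral distance one pixel covers at a given range. The ~60 degree horizontal field of view below is an assumption for illustration; check your camera's actual FOV:

```python
import math

def inches_per_pixel(distance_in, h_res, hfov_deg=60.0):
    """Approximate lateral inches spanned by one pixel at a given range.

    distance_in: range to the target in inches.
    h_res: horizontal resolution in pixels.
    hfov_deg: horizontal field of view (assumed; camera-dependent).
    """
    deg_per_px = hfov_deg / h_res
    return distance_in * math.tan(math.radians(deg_per_px))

# At 15 ft (180 in), with a 60 degree HFOV:
#   640 px wide  -> roughly 0.29 in per pixel
#   1280 px wide -> roughly 0.15 in per pixel
```

So at 15 feet, doubling horizontal resolution roughly halves the per-pixel spatial error, and sub-pixel corner refinement pushes it lower still; this is consistent with seeing ~1 inch solvePnP accuracy at that range.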
This isn’t the only method, of course; a faster pipeline could be used to drive a more continuous-update algorithm (as opposed to ours, which was very “one-shot” in its nature).