solvePnP 2020

I am implementing solvePnP for our 2020 vision system.

I am using the provided images from the WPILib release here.

With BlueGoal-330in-ProtectedZone.jpg, I get the following output:

Rotation Vector:
 [ 0.80314211]
Translation Vector:
 [[ 92.18972654]
 [131.7391002 ]

This just doesn’t seem like it’s saying the target is 330 inches away. Can I please get some guidance?

How did you determine the necessary camera matrix? The distances will depend entirely on getting that right.
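For context, the camera matrix encodes the focal length and optical center in pixels. Here is a sketch of building one from a known field of view; the resolution and FOV below are illustrative placeholders, not measured values for any particular camera:

```python
import math
import numpy as np

# Illustrative values only: a 640x480 image and a 60-degree horizontal FOV.
width, height = 640, 480
hfov_deg = 60.0

# Focal length in pixels, derived from the horizontal field of view.
fx = (width / 2) / math.tan(math.radians(hfov_deg) / 2)
fy = fx                          # square pixels assumed
cx, cy = width / 2, height / 2   # optical center assumed at image center

camera_matrix = np.array([[fx, 0.0, cx],
                          [0.0, fy, cy],
                          [0.0, 0.0, 1.0]])
print(camera_matrix)
```

A proper calibration (e.g. with a chessboard) gives more accurate values than the FOV formula, along with the distortion coefficients.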


We would most likely need more information on your camera: the calculated camera matrix, distortion coefficients, etc. Other teams have tried 3D pose estimation, and I believe Limelight did some testing with it. If you want to try it out, by all means do so, but it seems to be complicating a problem that can be solved quite easily with simpler algorithms. There were also a lot of problematic results caused by symmetrical targets. Overall it seems to cause more trouble than it’s worth, although it is fun to play around with (my opinion).

To counter that, we have used solvePnP() successfully for the last few years, and it does work. My point is that you are using a canned image for which you don’t know the camera parameters, so you can’t really expect the distances to be correct. (Unless I am missing something…)


I am basing my code on this tutorial

I’m now trying to calibrate my camera, but it’s challenging.

You’re right about that. I’ve been in contact with the person who took the pictures, though (hi @AustinShalit)
