Quote:
Originally Posted by Doron_Sivan
xmaams - even though you don't use it in the matches - did your formula for finding distance work? And can you post an explanation / the RoboRealm code for that (if you want to..)
The RR code is in a Visual Basic script with some trigonometry. After we find the target rectangle, we find the angle the target takes up in the camera's view by multiplying the camera's field of view (in radians) by the width of the rectangle (in pixels) divided by the width of the image (in pixels). We use half of that result to make the next step simpler.
The distance (in inches) is then the real-life width of the target (in inches) divided by two (to match the half angle from above), divided by the tangent of that half angle. This works because the half-width of the target, the distance to it, and the half angle form a right triangle: tan(halfAngle) = (targetWidth / 2) / distance, so distance = (targetWidth / 2) / tan(halfAngle).
If you have any questions about how the math works, I can draw you a picture with my amazing paint skillz.
The code ends up looking something like this:
' half of the angle the target takes up in the image, in radians
halfTargetRad = fovRad * (widthPx / imageWidth) / 2
' tan(halfTargetRad) = (widthTargetInch / 2) / distance, solved for distance
distance = (widthTargetInch / 2) / Tan(halfTargetRad)
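To give a little more context, here is roughly how that sits inside RoboRealm's VBScript module. This is just a sketch, not our exact script: "TARGET_WIDTH" is a stand-in for whatever variable your own pipeline puts the rectangle's pixel width into, the FOV and target-width constants are placeholders you would swap for your own numbers, and it assumes a rectangle was actually found this frame. IMAGE_WIDTH is one of RoboRealm's built-in variables.

' --- constants you set for your setup ---
pi = 3.14159265358979
fovRad = 47 * pi / 180        ' your camera's horizontal FOV in radians (47 deg is only a placeholder)
widthTargetInch = 24          ' real-world width of the target in inches (placeholder)

' --- values from earlier modules in the pipeline ---
widthPx = CDbl(GetVariable("TARGET_WIDTH"))     ' pixel width of the detected rectangle
imageWidth = CDbl(GetVariable("IMAGE_WIDTH"))   ' built-in RoboRealm variable

' half of the angle the target takes up in the image
halfTargetRad = fovRad * (widthPx / imageWidth) / 2
' solve tan(halfTargetRad) = (widthTargetInch / 2) / distance
distance = (widthTargetInch / 2) / Tan(halfTargetRad)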
This method works better if you correct for the distortion in the Axis camera and if you use a higher image resolution. For us, this was accurate to within a foot on a 1/5 scale target 12 feet away (which scales to almost full court for a regular target, since a 1/5 scale target at 12 feet subtends the same angle as a full-size one about five times farther away).
Quote:
Originally Posted by Doron_Sivan
Also, since we also used RR - have you experienced lag on the robot when running code + RR at the same time?
RR runs on our DS laptop, which is not the Classmate. We connect to the camera directly, so there is no load on the cRIO, and we send the data back to the robot with NetworkTables. (This is about when the vision project got put on hold, but up to that point we noticed no lag.) If you are using the Classmate you might not have enough power to do a lot of vision and communication at the same time. Try to limit your FPS, and remember that grayscale images process more quickly.
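If it helps, the hand-off to the robot is just RoboRealm variables. Tacking onto the sketch above, something like this at the end of the script makes the results visible to the Network Tables module, which you then point at those variables in its GUI; the variable names here are placeholders, not necessarily what we used.

' publish results as RoboRealm variables; the Network Tables module
' is configured separately to send these to the robot
SetVariable "DISTANCE", distance
SetVariable "TARGET_FOUND", 1

' a simple counter so the robot code can tell fresh data from stale data
frameCount = GetVariable("FRAME_COUNT")
If Not IsNumeric(frameCount) Then frameCount = 0
SetVariable "FRAME_COUNT", frameCount + 1

The counter is just so the robot code can tell whether it is looking at a new measurement or the last value from before the target disappeared.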