First of all, forgive us for our ignorance, but our team has a question about aiming with a camera. We managed to mount a camera to assist the driver's vision, and we wonder whether it is possible to move the robot to align with a specific image, such as a QR code, during the autonomous period. We looked at the vision processing examples in the installed LabVIEW tutorials, but we still don't understand how to drive a motor based on the image.
You could connect the camera to a Raspberry Pi running PhotonVision. That lets you detect AprilTags and doesn't require a whole lot of code. Alternatively, if you have a Limelight 2 or 3, they have pretty good built-in AprilTag detection. Forgive me, I'm not too familiar with LabVIEW; are you able to access NetworkTables? If so, you can read the camera and tag data through them, and then align to a tag using a PID controller.
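To make the "align using a PID controller" idea concrete, here's a minimal sketch in Java of the simplest version (proportional-only control). On a real robot, the yaw angle would come from your vision pipeline over NetworkTables; here it's just passed in as a number so the math stands alone. The gain `kP` and the output clamp are placeholder values I made up for illustration, not tuned constants.

```java
// Minimal proportional-alignment sketch: turn the robot so the camera's yaw
// to the target goes to zero. The yaw would normally be read from the vision
// system over NetworkTables; kP and MAX_TURN are hypothetical tuning values.
public class AlignSketch {
    static final double kP = 0.02;       // proportional gain (tune on the robot)
    static final double MAX_TURN = 0.4;  // cap motor output for safety

    // yawDegrees: horizontal angle from camera center to the target.
    // Returns a turn command in [-MAX_TURN, MAX_TURN]; positive turns one way,
    // negative the other, and zero means the target is centered.
    static double turnCommand(double yawDegrees) {
        double output = kP * yawDegrees;
        return Math.max(-MAX_TURN, Math.min(MAX_TURN, output));
    }

    public static void main(String[] args) {
        System.out.println(turnCommand(10.0)); // small correction toward target
        System.out.println(turnCommand(50.0)); // large error gets clamped
        System.out.println(turnCommand(0.0));  // centered: no turn
    }
}
```

You'd feed `turnCommand` into your drivetrain as the rotation input each loop iteration; as the robot turns, the reported yaw shrinks and the output tapers off. A full PID controller adds integral and derivative terms on top of this, but proportional-only is usually the right place to start.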
Here's some more info on what AprilTags are:
Let me know if you have any questions,
-Aiden
We are not very familiar with other languages such as Java and C. However, we will investigate. Thanks for the tips!
The photonvision labview implementation contains a zip file with examples. The examples include controlling a robot to move towards a target.
I think there may also be an earlier Chief Delphi post (maybe a year or two ago) that has some sample code.
Sir, could this be the post you are talking about? We will investigate. We are beginners with this kind of thing, as we only have one programmer on the team, who is not very experienced. We will come back to this topic if we run into problems.
Yes. There is also a newer post with a similar but more detailed answer; that one is for following paths. If you just want to point at or move toward a target, that is a little different. The PhotonVision LabVIEW examples cover that too.
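For the simpler "move toward a target" case mentioned above (as opposed to path following), a common pattern is two proportional loops running together: one on distance for forward speed and one on yaw for turning. Here's a hedged sketch of that idea; the gains, the standoff distance, and the clamp are all placeholder values, and on a real robot the yaw and distance would come from your vision data.

```java
// Sketch of driving toward a target: forward speed from distance error,
// turn rate from yaw error. All constants are hypothetical tuning values;
// yaw and distance would normally come from the vision pipeline.
public class DriveToTargetSketch {
    static final double kPTurn = 0.02;         // turn gain per degree of yaw
    static final double kPForward = 0.5;       // forward gain per meter of error
    static final double STOP_DISTANCE_M = 1.0; // desired standoff from target

    // Clamp any command to a safe range.
    static double clamp(double v) {
        return Math.max(-0.5, Math.min(0.5, v));
    }

    // Returns {forward, turn}: forward shrinks to zero as the robot reaches
    // the standoff distance, turn shrinks to zero as the target centers.
    static double[] drive(double yawDegrees, double distanceMeters) {
        double forward = kPForward * (distanceMeters - STOP_DISTANCE_M);
        double turn = kPTurn * yawDegrees;
        return new double[] { clamp(forward), clamp(turn) };
    }

    public static void main(String[] args) {
        double[] cmd = drive(5.0, 2.0); // 5 degrees off, 2 m away
        System.out.println("forward=" + cmd[0] + " turn=" + cmd[1]);
    }
}
```

Each loop iteration you'd pass the two outputs to an arcade-style drive call (forward speed plus rotation). The robot naturally slows and straightens as both errors approach zero, which is essentially what the PhotonVision LabVIEW example does.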