Distinguish between vision targets

This year our team decided to use computer vision as the primary method of aligning the robot, and so far we've gotten pretty decent results tracking single objects with a USB cam and OpenCV. But we haven't figured out a way to distinguish between multiple sets of targets yet (like on the side of the ship).

One approach I've thought of is to measure whether a target is facing left or right and use that info to track, but I don't know how to do that. Is that possible? Is there any other approach to solving this problem?

Thanks in advance.

Well, since you'd most likely use vision for lining up, there's likely one set of tape that you're most centered on. So I'd just grab the two contours closest to the center of the image.
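Something like this rough Python/OpenCV sketch would do it (assuming you already have your filtered contours and know the frame width):

```python
import cv2

def closest_two_to_center(contours, frame_width):
    """Return the two contours whose centroids are closest to the image's
    horizontal center. Assumes `contours` already passed your color/area filters."""
    center_x = frame_width / 2.0

    def centroid_x(contour):
        m = cv2.moments(contour)
        if m["m00"] == 0:          # degenerate contour, push it to the back
            return float("inf")
        return m["m10"] / m["m00"]

    ranked = sorted(contours, key=lambda c: abs(centroid_x(c) - center_x))
    return ranked[:2]
```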

Well, there's the situation where it might register the rightmost target of one set and the leftmost target of the next set, and end up lining up with the space between the two sets.

Of course we could tell the driver to move the bot a bit, but that might be time-consuming.


I would recommend doing something similar to what the Limelight does, specifically its intersection filter: extend lines out along each target's angle and see whether they ever intersect. If they meet above the targets' center point, use that pair; otherwise, don't.
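Here's a rough sketch of that idea in Python/OpenCV, assuming the two candidate targets are already wrapped in minAreaRect results (rect_a and rect_b are my names, not Limelight's):

```python
import cv2
import numpy as np

def long_axis(rect):
    """Unit vector along the long side of a cv2.minAreaRect result,
    flipped so it points 'up' in image coordinates (negative y)."""
    pts = cv2.boxPoints(rect)
    e1 = pts[1] - pts[0]
    e2 = pts[2] - pts[1]
    d = e1 if np.linalg.norm(e1) > np.linalg.norm(e2) else e2
    d = d / np.linalg.norm(d)
    return -d if d[1] > 0 else d

def intersects_above(rect_a, rect_b):
    """Intersection-filter style test: extend each target's long axis upward
    and check whether the lines meet above the targets' centers."""
    p1, p2 = np.array(rect_a[0]), np.array(rect_b[0])
    d1, d2 = long_axis(rect_a), long_axis(rect_b)
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-6:
        return False  # parallel lines never intersect
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    intersection_y = p1[1] + t * d1[1]
    # "Above" means a smaller y value, since image y grows downward.
    return intersection_y < min(p1[1], p2[1])
```

Getting the long-axis direction from boxPoints, rather than from the returned angle, sidesteps the differences in minAreaRect's angle convention between OpenCV versions.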

But the targets are intentionally angled (top edge tipped toward the opening) so you can avoid that. Just compare your shapes to make sure they're in an "A" orientation, not a "V" orientation.
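One simple way to read the lean without touching any angle math is to compare each contour's topmost point against its centroid. A rough sketch, assuming the two contours are already ordered left to right and have nonzero area:

```python
import cv2

def lean_direction(contour):
    """Return 'right' if the target's top tips to the right of its centroid,
    'left' otherwise. Assumes the contour has nonzero area."""
    m = cv2.moments(contour)
    cx = m["m10"] / m["m00"]
    top_point = min(contour.reshape(-1, 2), key=lambda p: p[1])  # smallest y = topmost
    return "right" if top_point[0] > cx else "left"

def is_valid_pair(left_contour, right_contour):
    """An 'A' pair tips toward each other: the left target leans right and the
    right target leans left. A 'V' (the gap between two sets) does the opposite."""
    return (lean_direction(left_contour) == "right"
            and lean_direction(right_contour) == "left")
```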


That's what I want to do. Are there any methods in OpenCV that can do that?

No, OpenCV doesn't have anything built in to do it, but it should be able to get you to the individual angled pieces; the rest is manual code to look at those pieces and figure out how to group them.
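For example, a grouping pass could look something like this (just a sketch; `rects` is assumed to be the list of minAreaRect results for every tape strip in view):

```python
import cv2

def leans_right(rect):
    """True if this strip's topmost corner sits to the right of its center,
    i.e. the top of the tape tips to the right."""
    pts = cv2.boxPoints(rect)
    top_corner = min(pts, key=lambda p: p[1])   # smallest y = topmost in image coords
    return top_corner[0] > rect[0][0]

def pair_targets(rects):
    """Sort all detected strips left to right, then pair each right-leaning
    strip with the next strip if that one leans left (an 'A' pair)."""
    rects = sorted(rects, key=lambda r: r[0][0])  # sort by center x
    pairs = []
    i = 0
    while i < len(rects) - 1:
        if leans_right(rects[i]) and not leans_right(rects[i + 1]):
            pairs.append((rects[i], rects[i + 1]))
            i += 2
        else:
            i += 1
    return pairs
```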


If you use contours to detect the vision targets, then you can fit a line to each contour.
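Something along these lines: cv2.fitLine gives you a unit direction vector you can read the tilt from (sketch only, the function name is mine):

```python
import cv2

def contour_direction(contour):
    """Fit a line to the contour points and return its unit direction (vx, vy).
    The sign of vx relative to vy tells you which way the target is tilted."""
    vx, vy, x0, y0 = cv2.fitLine(contour, cv2.DIST_L2, 0, 0.01, 0.01)
    return float(vx), float(vy)
```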


Edit: Forgot the link. From the OpenCV docs:

You can just check the rotation by calling the minAreaRect() method and looking at the Box2D structure it returns.
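For example (just a sketch; `contour` would be one of your filtered contours):

```python
import cv2

def rect_rotation(contour):
    """Unpack the Box2D structure that minAreaRect returns:
    ((center_x, center_y), (width, height), angle)."""
    (cx, cy), (w, h), angle = cv2.minAreaRect(contour)
    # Careful: the angle convention has changed between OpenCV releases
    # (older versions report roughly [-90, 0)), so sanity-check the sign
    # against a known target before using it to tell left-tilt from right-tilt.
    return angle
```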

Went with rotatedRect and got really good results. Thanks for all the help, everyone!