Does anyone have any ideas as to how I could differentiate between a target being skewed to the left (facing left) and a target skewed to the right (facing right)? This is with the limelight by the way.

Not familiar with limelight, but in general, you will need to access the non-rectangular statistics of the shapes you are looking for. If you are looking for something which is **actually** the same height on the left and on the right, then the side which appears shorter is farther away from the camera, meaning that the camera is on the side which appears taller.
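Assuming you already have the target's four corner points, the comparison above can be sketched like this (the Point type, corner naming, and pixel tolerance are all illustrative, not any particular library's API):

```java
// Sketch of the idea above: compare the apparent height of the target's
// left and right edges. The taller edge is nearer the camera, so the
// camera sits on that side. Corner ordering is an assumption.
public class EdgeCompare {
    public static final class Point {
        public final double x, y;
        public Point(double x, double y) { this.x = x; this.y = y; }
    }

    /** Returns +1 if the camera appears to be right of the target,
     *  -1 if left, 0 if roughly head-on (within tolerance). */
    public static int cameraSide(Point topLeft, Point bottomLeft,
                                 Point topRight, Point bottomRight,
                                 double tolerancePx) {
        double leftHeight = Math.abs(bottomLeft.y - topLeft.y);
        double rightHeight = Math.abs(bottomRight.y - topRight.y);
        double diff = rightHeight - leftHeight;
        if (Math.abs(diff) <= tolerancePx) return 0;  // roughly head-on
        return diff > 0 ? 1 : -1;  // taller right edge -> camera on the right
    }
}
```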

I don't know the exact directions, but I do know one side is positive and one side is negative, so you should be able to figure it out pretty easily.

We had usable results by comparing width and height to find skew. To figure out which direction we were skewed, we compared the left and right sides of the target to see which was larger. We had to apply a huge rolling average to make these values usable, but it's a decent “budget” option if you're looking to stay away from solvePnP or more complicated methods.
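A rolling average like the one described might look like the sketch below (the window size is a made-up placeholder to tune):

```java
// Simple fixed-window rolling average to smooth a noisy per-frame
// skew measurement, as described above. Window size is an assumption.
import java.util.ArrayDeque;

public class RollingAverage {
    private final ArrayDeque<Double> window = new ArrayDeque<>();
    private final int size;
    private double sum = 0.0;

    public RollingAverage(int size) { this.size = size; }

    /** Adds one sample and returns the current average over the window. */
    public double add(double sample) {
        window.addLast(sample);
        sum += sample;
        if (window.size() > size) {
            sum -= window.removeFirst();  // drop the oldest sample
        }
        return sum / window.size();
    }
}
```

Feeding it the per-frame difference between the left and right side lengths and acting on the sign of the smoothed value is one way to use it.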

How did you guys figure out which side was shorter? I know you can see it from the web dashboard lol, but as far as I could tell from the documentation tshort and tlong just return the length of the shortest side/longest side, not which side is longest/shortest.

You can subtract the corners to find the length of each side. I think we had to do some weird sorting to find which corners are which.
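One possible version of that subtract-and-sort approach, assuming you can get the four corners as {x, y} pixel pairs (the corner layout here is an assumption; Limelight does not guarantee corner ordering):

```java
import java.util.Arrays;
import java.util.Comparator;

// Sketch: identify which corner is which by sorting on x, then measure
// each side by subtracting corners, as described above.
public class CornerSort {
    /** corners: four {x, y} pairs in any order.
     *  Returns {leftSideLength, rightSideLength} in pixels. */
    public static double[] sideLengths(double[][] corners) {
        double[][] sorted = corners.clone();
        Arrays.sort(sorted, Comparator.comparingDouble(c -> c[0]));  // by x
        // Two leftmost corners form the left side, two rightmost the right.
        double left = dist(sorted[0], sorted[1]);
        double right = dist(sorted[2], sorted[3]);
        return new double[] { left, right };
    }

    private static double dist(double[] a, double[] b) {
        return Math.hypot(a[0] - b[0], a[1] - b[1]);
    }
}
```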

We also use a similar sort of method for calculating distance so that it doesn't change with skew. We find the middle height by averaging the top two corners, averaging the bottom two corners, and subtracting the results. A lookup table can be used to map this length to distance.
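A lookup table with linear interpolation between calibration points might look like this sketch (any calibration numbers you load into it need to come from your own measurements):

```java
import java.util.Map;
import java.util.TreeMap;

// Sketch of a lookup table mapping target pixel height to distance,
// linearly interpolating between calibration points and clamping at
// the ends. Calibrate the points on your own field setup.
public class DistanceTable {
    private final TreeMap<Double, Double> table = new TreeMap<>();

    public void put(double pixelHeight, double distance) {
        table.put(pixelHeight, distance);
    }

    public double lookup(double pixelHeight) {
        Map.Entry<Double, Double> lo = table.floorEntry(pixelHeight);
        Map.Entry<Double, Double> hi = table.ceilingEntry(pixelHeight);
        if (lo == null) return hi.getValue();  // below table: clamp
        if (hi == null) return lo.getValue();  // above table: clamp
        if (lo.getKey().equals(hi.getKey())) return lo.getValue();
        double t = (pixelHeight - lo.getKey()) / (hi.getKey() - lo.getKey());
        return lo.getValue() + t * (hi.getValue() - lo.getValue());
    }
}
```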

I'm not certain how to do this on Limelight. I would like to know if there is a way.

I did manage to get this to work properly with a JeVois. I used the OpenCV function **cv2.minAreaRect()**. It returns an angle, and in our code it is always negative, but in one direction the angle is near -15 and in the other it is near -75, so we just check whether it is larger or smaller than -45 and sort accordingly.
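The -45 threshold sort can be reduced to a tiny helper. Note that cv2.minAreaRect's angle convention has changed across OpenCV versions, and which lean corresponds to which angle band is specific to the setup, so treat the labels below as assumptions to verify on your own camera:

```java
// Sketch of the -45 degree sort described above. Older OpenCV versions
// report minAreaRect angles in (-90, 0], so a target leaning one way
// reads near -15 and the other near -75. Which lean maps to which band
// is a per-setup assumption; check it against your own camera.
public class SkewSort {
    public enum Lean { LEFT, RIGHT }

    public static Lean classify(double minAreaRectAngle) {
        return minAreaRectAngle > -45.0 ? Lean.LEFT : Lean.RIGHT;
    }
}
```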

I just wrote this for our limelight. I started by looking at the skew (rotation) of the image. Since our targets this year are stationary, we can assume any skew is due to perspective (assuming your robot isn't rotated either!)

The skew can tell you if you're right or left of the target. Viewing from above the target, if the target is rotated clockwise, I found that the skew starts at -90 and its magnitude drops to about -60 to -65 degrees before it gets too noisy. For counter-clockwise rotation, the skew starts at -0.01 and goes to about -30 to -35 before it gets noisy. So I set those values as min/max values to work with and I wrote the functions below:

```java
public boolean isHeadOnTarget() {
    return getTS() == 0.0;
}

public boolean isRightOfTarget() {
    double ts = getTS();
    return ts <= RobotMap.LIMELIGHT_SKEW_CLOCKWISE_MAX
        && ts >= RobotMap.LIMELIGHT_SKEW_CLOCKWISE_MIN;
}

public boolean isLeftOfTarget() {
    double ts = getTS();
    return ts >= RobotMap.LIMELIGHT_SKEW_COUNTERCLOCKWISE_MAX
        && ts <= RobotMap.LIMELIGHT_SKEW_COUNTERCLOCKWISE_MIN;
}
```

Kind of a goofy way to do it, but we actually mounted our camera at a 14.5 degree angle and offset it to one side to be in line with the right target. The right target appears as a vertical target and the left one is much wider, so we can take the ratio of the height and width to make sure we're locking on to the right target.
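That ratio check might be as simple as the sketch below (the 1.0 threshold is a placeholder to tune for your own camera geometry):

```java
// Sketch of the aspect-ratio check described above: the target in line
// with the camera appears tall and narrow, the offset one appears much
// wider. The 1.0 threshold is a placeholder to tune for your setup.
public class TargetFilter {
    public static boolean looksLikeAlignedTarget(double widthPx, double heightPx) {
        return heightPx / widthPx > 1.0;  // taller than wide -> in-line target
    }
}
```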

@coderkevin why does the isLeftOfTarget function use >= max and <= min when the isRightOfTarget function uses the opposite? I'm assuming you used the values you listed (-90, -60, -35, -0.01) for clockwise min & max and counter-clockwise min & max respectively, is that correct? Do you never see values between -60 and -35? I'm trying to solve the left-vs-right issue, myself, and this would be a huge help as I hadn't understood how skew could be used. Thanks!

You can also calculate this from the camera transform data from the limelight if you are running the 3d transform.
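If the 3D solver is on, the camera transform gives you an angle directly. A sketch, assuming the transform arrives as a 6-element array laid out as x, y, z, pitch, yaw, roll (verify the element order and the sign conventions against the Limelight docs for your firmware):

```java
// Sketch: derive left/right from the Limelight camera-transform array
// when the 3D solver is enabled. The element order (x, y, z, pitch,
// yaw, roll) and the sign convention are assumptions to verify against
// the docs for your firmware version.
public class CamTran {
    /** Returns +1 for one side, -1 for the other, 0 near head-on. */
    public static int sideFromCamtran(double[] camtran, double deadbandDeg) {
        double yaw = camtran[4];  // assumed yaw slot
        if (Math.abs(yaw) <= deadbandDeg) return 0;
        return yaw > 0 ? 1 : -1;
    }
}
```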

In the code above I'm just checking whether the skew is in one of the acceptable ranges. Skews in the middle values can occur, but I only observed them at a very oblique angle. (This assumes your vision target doesn't rotate; if your field elements or your robot are rotating, you've got bigger problems!)

How I tested this: we made a vision target on a small wooden board and hung it on a bracket on a plastic sawhorse to be the right height for a low hatch goal/loading station. I then moved the sawhorse around in front of the practice robot with our limelight in place.
