Is image processing done on the driver station allowed if relayed to the robot over normal means?

We’re trying out some fairly advanced software, and we have a system in place that would let a stereoscopic camera or even a lidar look out from the driver station and pinpoint robots and game pieces the robot itself might not be able to see. We would not communicate through any unusual channel; we would most likely use NetworkTables to stay FIRST-legal. Is this kind of sensor usage allowed by FIRST? I’ve never heard of teams putting sensors in the driver station.
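For the stereoscopic-camera part of the idea, the core computation on the driver-station side would be converting pixel disparity between the two camera views into a distance. A minimal sketch of that relation, assuming a calibrated stereo pair (the focal length, baseline, and disparity values below are purely illustrative, not from any real team's setup):

```python
def disparity_to_depth(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Classic pinhole stereo relation: Z = f * b / d.

    disparity_px -- horizontal pixel offset of a feature between left and right images
    focal_px     -- camera focal length expressed in pixels
    baseline_m   -- distance between the two camera centers, in meters
    Returns the estimated depth to the feature, in meters.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px


# Hypothetical calibration: f = 700 px, baseline = 0.12 m.
# A game piece seen with 14 px of disparity would be about 6 m away.
depth_m = disparity_to_depth(14.0, 700.0, 0.12)
```

The resulting coordinates could then be published as NetworkTables entries for the robot code to read, which keeps the data on the normal FMS-permitted link.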

I believe 254 did it in 2018; they used it to sense the height of the scale so the robot would lift the power cubes higher if they weren’t the first to place one.


Ah, excellent! I think that answers my question. 🙂

Rules change each year; Section 10.10 of the 2019 rules is where to look. Keep in mind that your camera will be looking through the polycarbonate driver-station window, which sometimes has marks or scratches on it.

In 2019 you could not look over the driver station wall with a camera; in some years that has been allowed.

Yes, for a while this was the best way to do it, since there wasn’t really anything that could be easily mounted on a robot and still be powerful enough for vision processing.
