Vision Tracking using the Axis camera?

I know this has probably been discussed before, but has anyone done it with the Axis camera, and does anyone have examples or working code?

This is my team's first year using vision tracking.

We previously used the Axis camera for vision tracking in 2016. I'd advise against it: we saw a considerable amount of latency (up to a couple of seconds). Look into the Limelight instead, or, if the Limelight is out of your budget, check out the recent FRCVision thread on using a USB webcam with a Raspberry Pi.


@Nevada_Reno you can also check the tutorials included in LabVIEW:

  1. Setting up the AXIS Camera: Walks you through using the AXIS camera configuration utility to detect and set up the camera.
  2. Integrating Vision into Robot Code: Points you to an example built for LabVIEW FRC that uses either a USB or IP (AXIS) camera to detect targets.

Finally, I can recommend the following online tutorials: Image Processing in LabVIEW for FRC, as well as the FUNdamentals of LabVIEW for FRC: Episode 2 - Vision and Control.

If you are using LabVIEW, look at the FRC examples that ship with it; they already include working vision code.

The Axis camera has no worse lag than any other camera, and certainly not a couple of seconds.

Of course, a dedicated off-board processor like the Limelight, a Raspberry Pi, or another coprocessor will be faster than processing on the roboRIO, but you can get very acceptable results with the roboRIO and an Axis camera. We've used them for a long time.
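As a rough illustration of the kind of processing being discussed (not any team's actual code), here is a minimal pure-Python sketch that thresholds bright-green pixels, the way a retroreflective target lit by a green LED ring shows up, and computes the blob's centroid. A real pipeline would use NI Vision or OpenCV on actual camera frames, and the threshold values here are made-up placeholders:

```python
# Sketch of retroreflective-target detection, assuming frames arrive as
# rows of (r, g, b) tuples. Threshold values are illustrative, not tuned.

def is_target_pixel(r, g, b, g_min=200, rb_max=100):
    """Crude 'bright green' test, standing in for an HSV threshold."""
    return g >= g_min and r <= rb_max and b <= rb_max

def find_target_centroid(image):
    """Return the (x, y) centroid of thresholded pixels, or None."""
    xs, ys = [], []
    for y, row in enumerate(image):
        for x, (r, g, b) in enumerate(row):
            if is_target_pixel(r, g, b):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Tiny synthetic 4x4 "frame": a bright-green 2x2 blob in the top-left corner.
frame = [
    [(0, 255, 0), (0, 255, 0), (10, 10, 10), (10, 10, 10)],
    [(0, 255, 0), (0, 255, 0), (10, 10, 10), (10, 10, 10)],
    [(10, 10, 10)] * 4,
    [(10, 10, 10)] * 4,
]
print(find_target_centroid(frame))  # centroid of the 2x2 blob: (0.5, 0.5)
```

The centroid's horizontal offset from the image center is what you would feed into a turn-to-target control loop.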

@wsh32 The latency depends heavily on your setup: for example, the CPU/graphics processing power of the computer doing the image processing. And if you send the image back to the driver station computer, you also incur network latency.
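To make that concrete, end-to-end latency is roughly the sum of the pipeline stages, and each stage varies with the setup. The numbers below are hypothetical placeholders to show how the budget adds up, not measurements of any camera:

```python
# Back-of-the-envelope latency budget for a vision pipeline (milliseconds).
# All stage values below are illustrative guesses, not measured figures.

def total_latency_ms(capture, encode, network, processing, display=0.0):
    """End-to-end latency is approximately the sum of each pipeline stage."""
    return capture + encode + network + processing + display

# Hypothetical pipeline processed entirely on the robot: no image is sent
# back to the driver station, so network cost is just the local stream.
on_robot = total_latency_ms(capture=33.0, encode=10.0, network=5.0,
                            processing=50.0)

# Hypothetical pipeline that also streams the image to the driver station
# for viewing, adding a larger network hop plus a display refresh.
with_ds_view = total_latency_ms(capture=33.0, encode=10.0, network=40.0,
                                processing=50.0, display=17.0)

print(on_robot)      # 98.0
print(with_ds_view)  # 150.0
```

The point is that tenths of a second come from the sum of many small stages; multi-second lag usually indicates a bottleneck (e.g. an overloaded processor or saturated network) rather than the camera itself.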