Edit: The issue seems to be related to camera motion and auto-exposure slowing the camera down; you can watch the frame rate change just by moving the camera. Even turning off auto-exposure didn’t fix it entirely, just improved it. I think it’s just a cheap webcam’s processor dropping frames when it falls behind.
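(For anyone who wants to try the same workaround, here is a minimal sketch of locking the exposure with cscore. The camera name, device path, and exposure value of 10 are illustrative assumptions; tune them for your setup.)

from cscore import CameraServer, UsbCamera

camera = UsbCamera("LifeCam", "/dev/video0")  # illustrative name/device path
CameraServer.getInstance().startAutomaticCapture(camera=camera)
camera.setResolution(640, 480)
camera.setFPS(30)
camera.setExposureManual(10)        # fixed exposure; 10 is a guess, tune for your lighting
camera.setWhiteBalanceManual(4500)  # optionally lock white balance too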
Original Post:
Hello all, I didn’t know whether I should make this a GitHub issue or a topic here, so I started here.
As with many others, I have been playing with AprilTags, and I think I’ve hit a roadblock that should be looked at. I think the CameraServer CvSink.grabFrame() function has an issue, and I think it affects anything using CameraServer (both WPILib and PhotonVision).
I got suspicious because, while we are limited to ~7 FPS when reading AprilTags, we were also only using ~40% CPU, and I couldn’t see why we couldn’t make the data processing more parallel.
So, I made a VERY simple vision loop in Python (posted below): it just grabs a frame and does nothing else. I then took the time between loop iterations and logged it. A plot of the resulting FPS clusters badly at 30, 15, and 7.5 FPS.
# some WPILib vision initialization before
from cscore import CameraServer
from networktables import NetworkTables
import numpy as np
import time

csInst = CameraServer.getInstance()
vs = NetworkTables.getTable('Vision')  # Get the Vision NetworkTable

# start cameras
for config in cameraConfigs:
    cam = startCamera(config)
    cvSink = csInst.getVideo(camera=cam)

# Begin image processing loop
frame = np.zeros(shape=(480, 640, 3), dtype=np.uint8)  # OpenCV frames are (height, width, channels)
t0 = time.perf_counter()
while True:
    t1 = time.perf_counter()  # Used for testing performance
    FPS = 1 / (t1 - t0)       # Time between loop iterations, used for testing performance
    vs.putNumber("FPS", FPS)
    ret, frame = cvSink.grabFrame(frame)  # ret is the frame timestamp (0 on error)
    t0 = t1
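To separate how long grabFrame() itself blocks from the rest of the loop, here is a sketch of a variation on the same loop (grabMs is just an illustrative metric name; the setup is the same as above):

t0 = time.perf_counter()
while True:
    t1 = time.perf_counter()
    ret, frame = cvSink.grabFrame(frame)  # blocks until the sink hands back a frame
    t2 = time.perf_counter()
    vs.putNumber("grabMs", (t2 - t1) * 1000)  # time spent waiting inside grabFrame
    vs.putNumber("FPS", 1 / (t2 - t0))        # overall loop rate
    t0 = t2

If grabMs accounts for nearly the whole loop period, the clustering at 30/15/7.5 FPS is happening inside the frame delivery, not in the Python code.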
I’m definitely not CPU bound. I’m running the WPILibPi v2022.1.1 image on a Pi 4B with 2 GB of RAM and a LifeCam HD-3000, configured in WPILibPi as a 640x480, 30 FPS camera. No other streams were running on the Pi (verified by watching the Pi’s network traffic). Data was collected on my laptop using the Robot Simulation (used only to run a NetworkTables server), logged with WPILib logging, and plotted with AdvantageScope.
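In case the dashboard configuration isn’t sticking, here is a sketch of pinning the video mode in code right after startCamera() (assuming cam is the cscore UsbCamera it returns; the pixel format choice is worth checking because it affects the achievable FPS):

from cscore import VideoMode

cam.setPixelFormat(VideoMode.PixelFormat.kYUYV)  # or kMJPEG; format affects achievable FPS
cam.setResolution(640, 480)
cam.setFPS(30)
mode = cam.getVideoMode()
print("actual mode:", mode.width, mode.height, mode.fps)  # confirm what the driver accepted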