Camera Lag

Hey,

I am practicing with the camera, using last year's game for practice. I finished programming a camera tracking system that finds the colors on the target, but there is a big lag when I get the values. Is there any way I can reduce this lag?

thanks,

Most likely. What is big, and how are you measuring it?

Last year, I measured the lag for various elements of the vision system, and with very simple processing it came to a little under 60 ms. It goes up for bigger images. Also, the delta for more processing is easy to measure.

Greg McKaskle

When I say big lag, I mean about 1.5 seconds. In my program, I filter out the background and everything becomes red, except for the green half of the target, which becomes black. I am using the IMAQ histogram to get values from the new image and see how much is black. One of my test programs has our robot go forward until it gets about 2 feet from the target. However, it only works if the robot is moving slowly. Otherwise, I have 120 lb of robot flying at me.
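The "how much of the histogram is black" check could be sketched like this in Python with NumPy (the original is LabVIEW IMAQ, so the function name and the black threshold here are illustrative assumptions, not the actual code):

```python
import numpy as np

def black_fraction(gray):
    """Fraction of 'black' pixels in a U8 grayscale image.

    Mirrors the histogram step described above: after filtering,
    the green half of the target shows up as black pixels.
    The 0-9 'black' bin range is an assumption for illustration."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    black = hist[:10].sum()
    return black / gray.size

# Tiny synthetic test image: left half black, right half bright.
img = np.full((240, 320), 200, dtype=np.uint8)
img[:, :160] = 0
print(black_fraction(img))  # → 0.5
```

A drive loop could then compare that fraction against a setpoint to decide when the robot is close enough to stop.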

Thanks for the change idea, I'll use that and see how it goes.

That is way more than it should be. With a bit of work, you should be able to get it to 100ms or so.

Please give more info on the image size, type of processing, etc. Also, what language are you using?

Right away, one thing I'll mention is that if you use the LV libraries and do not set up your camera with the FRC account on it, you pay a time penalty for each image, because LV has to go down an account list to find one that works. This affects LV more than C++ because LV redoes the connection for each image.

Greg McKaskle

Also, if you are using LabVIEW and have concurrent loops, make sure you add a small delay to each loop so that the processor can be shared effectively.
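The same idea, sketched in Python threads rather than LabVIEW loops (loop names and periods are made up for illustration): each loop yields the processor briefly every iteration so no one loop starves the others.

```python
import threading
import time

def loop(period_s, stop):
    """One concurrent loop. The small sleep each iteration is the
    'small delay' that lets the processor be shared effectively."""
    while not stop.is_set():
        # ... this loop's work would go here ...
        time.sleep(period_s)  # e.g. 10 ms; without this, the loop spins at 100% CPU

stop = threading.Event()
threads = [threading.Thread(target=loop, args=(0.01, stop)) for _ in ("vision", "drive")]
for t in threads:
    t.start()
time.sleep(0.1)   # let both loops run for a moment
stop.set()
for t in threads:
    t.join()
```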

It probably is about 100 ms, maybe a little more. I was just guesstimating. When I have the chance, I will try to get an exact measurement. I am using LabVIEW FRC, and I am processing what I believe is a U8 image. All I am doing is putting it through an IMAQ histogram. The image is 319 x 240.

Let me go over a few techniques for measuring things, because the probes and panels in RT are great for debugging, but not for conveying lag. In reality, they are recompressing and transmitting to the PC, and then doing the display work all over again.

To time a portion of a diagram, you note the time, run the code, take the time again, subtract, and display the difference. Probably the easiest and most foolproof approach is to drop a sequence structure around the code you want to measure, add a frame to the beginning, and drop in a millisecond timer function. Add a frame after your code and drop in another millisecond timer. Outside the sequence, drop a subtract and wire it up. Note that if you subtract backwards, you'll get values of ~4 billion; you want new minus old. Anyway, right-click and create an indicator. You may sometimes want to make a chart instead of an indicator so that you can see how consistent the measurement is and see what data or parameter causes it to rise and fall.
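The sequence-structure pattern above, translated into a Python sketch for readers not following the LabVIEW wiring (the helper name and workload are made up):

```python
import time

def timed(work, *args):
    """Time a piece of code the way the sequence structure does:
    note the time, run the code, note the time again, subtract."""
    t0 = time.monotonic()             # timer frame before the code
    result = work(*args)              # the code being measured
    t1 = time.monotonic()             # timer frame after the code
    elapsed_ms = (t1 - t0) * 1000.0   # new minus old, so no wraparound
    return result, elapsed_ms

# Example: time a dummy workload.
total, ms = timed(sum, range(1_000_000))
print(f"{ms:.1f} ms")
```

Logging `elapsed_ms` over many iterations (the chart suggestion above) shows how consistent the measurement is.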

To time the camera lag, I needed an external trigger, so I used the LED on the cRIO. You can use one on another piece of HW if that is easier. I aimed the camera at the cRIO LED, up real close. The LED is off, and in my code I take the time, turn the LED on, and loop getting and processing images until the camera reports that the LED is on. Note the time, then turn things off for a while and repeat to get a statistical measure of the lag.

To determine if the LED is on in the image, I was simply sampling the green portion of a single pixel. About as simple as I could get. I also assumed that the lag for RT to turn the LED on was minimal. That meant most of what I was measuring was base camera-related lag: exposing the sensor, compressing the image, transmitting the image, the cRIO reading the TCP, the cRIO decoding the JPEG into an IMAQ image, and finally the cRIO processing. I don't have all of the numbers in front of me, but the small and medium images were both about 60 ms. Not bad considering that at 30 fps, the camera only sends an image every 33 ms. So depending on when the LED goes on relative to the exposure, it could easily be 33 ms on some samples because the LED was just barely missed by one frame, and another isn't due for 33 ms. In others, you get lucky and the LED goes on just before a frame exposure begins.
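The LED-trigger measurement loop could be sketched like this in Python. The `set_led`, `grab_frame`, and `led_lit` hooks are hypothetical stand-ins for your hardware and vision code (the real test ran in LabVIEW on the cRIO):

```python
import time

def measure_camera_lag(set_led, grab_frame, led_lit, trials=10):
    """LED-trigger lag test (sketch).

    For each trial: turn the LED on, then loop grabbing and checking
    frames until the camera sees it. led_lit() might, for example,
    sample the green channel of a single pixel, as described above."""
    lags_ms = []
    for _ in range(trials):
        set_led(False)
        time.sleep(0.5)                    # let things settle between trials
        t0 = time.monotonic()
        set_led(True)
        while not led_lit(grab_frame()):   # keep grabbing until the LED shows up
            pass
        lags_ms.append((time.monotonic() - t0) * 1000.0)
    return lags_ms
```

Averaging over many trials smooths out the ±33 ms variation from where the LED change lands relative to the frame exposure.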

Anyway, the other stuff that is pretty easy to measure within the lag I'm discussing is the cRIO portion. The TCP reads are small, and the cRIO decode is around 7 or 8 ms for the small size, around 22 ms for medium, and about 100 ms for large.

Feel free to come up with other ways of measuring the lag of other parts of the process and please share. Maybe we can do something about some of them.

Greg McKaskle