2010 FRC camera vision tracking delay, image processing lag
We are running modified code based on the 2010 FRC Vision Sample Code (released 2/3/2010). Our live video feed is almost real-time as displayed on the Classmate. To achieve this, we are using the latest driver station console but have removed the linear sliders located above the gyroscope gauge. This solved the initial lag problems associated with the camera stream.
Now in our code, we notice the targeting function takes 167 ms to execute: approximately 50 ms is used converting an HSL image to grayscale, and the remaining 117 ms is used for “target image processing,” where colored arcs are drawn around potential targets. Have other teams noticed this “target image processing” lag? When called, the targeting function blocks execution: the processor is busy comparing pixels to find the target, and code in the main OperatorControl loop is halted. Since code in the main loop executes only every 167 ms (5-6 times per second), the delayed reading of joystick values causes jerky motion and the robot feels unresponsive. This is not acceptable, since we would like the targeting function to process data continuously during the match.
Our solution: we want the targeting function to run outside the OperatorControl loop. We are considering creating a new “task” so the vision code can run independently in its own “thread,” similar to the compressor class. However, since multi-threaded applications are complicated and hard to debug, we wanted to post this on the forum first. Could Brad Miller comment on a good solution to the problem? (Is there something obvious we are overlooking?) Have any other teams implemented a tracking "thread" for independent vision tracking? Will you post sample code on this forum?
Re: 2010 FRC camera vision tracking delay, image processing lag
Our experience with the camera code is similar to yours. After playing around with the default tracking code, I realized that it wasn't cut out for driving with simultaneous tracking because of the lag you mentioned. Our solution was as you suggested, to separate the vision processing into its own task. This new task gets a new image from the camera, runs the (slightly modified) ellipse detection code, and writes the results to a global structure. The main robot task reads the results from the global structure as part of its normal loop. It's not really that difficult as long as you remember to protect accesses to the shared memory with a semaphore.
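Roughly, the structure looks like this. This is a minimal sketch from memory, assuming the 2010 WPILib Task class and the VxWorks semLib calls; FindEllipseTargets() is a hypothetical stand-in for the modified ellipse detection code, so adapt it to your own:

#include <semLib.h>   // VxWorks mutex semaphores (semMCreate/semTake/semGive)
#include "Task.h"     // WPILib task wrapper

// Results shared between the vision task and the main robot loop.
struct VisionResults
{
    bool   targetFound;
    double targetX;   // normalized horizontal target position
};

static VisionResults g_results;
static SEM_ID g_resultsSem =
    semMCreate(SEM_Q_PRIORITY | SEM_DELETE_SAFE | SEM_INVERSION_SAFE);

// Hypothetical wrapper around the (slightly modified) ellipse detection.
VisionResults FindEllipseTargets();

// Task entry point; a free function so it can be passed as a FUNCPTR.
static int VisionTaskFunc()
{
    while (true)
    {
        // The slow part (~167 ms) now happens in this task, not in
        // the OperatorControl loop.
        VisionResults latest = FindEllipseTargets();

        semTake(g_resultsSem, WAIT_FOREVER);   // lock only to copy
        g_results = latest;
        semGive(g_resultsSem);
    }
    return 0;
}

static Task g_visionTask("2010VisionTask", (FUNCPTR)VisionTaskFunc);
// Call g_visionTask.Start() once, e.g. from the robot constructor.

// Called from OperatorControl(); returns immediately with the newest data.
VisionResults GetLatestResults()
{
    semTake(g_resultsSem, WAIT_FOREVER);
    VisionResults copy = g_results;
    semGive(g_resultsSem);
    return copy;
}

The key point is that the semaphore is only held long enough to copy the struct, so the joystick loop never waits on image processing.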
Re: 2010 FRC camera vision tracking delay, image processing lag
Mike, can you post the important pieces of code that relate to threading and semaphores?
Re: 2010 FRC camera vision tracking delay, image processing lag
I have done it, and it worked out OK. The only problem is that the other code ran a little too fast for the vision task to keep up (it was using the Synchronized object and took over the variables without ever releasing them), so you still have to be careful about data access. I also have a couple of the really ugly "Fatal Task Level Exceptions" that I have yet to track down and kill with fire. I will see if I can get code up later on. Please note that the documentation isn't exactly clear on this: the function that you want to make into a task must NOT be part of a class, or at least in my experience it has gotten rather mad when it is. Also, when you give it the function, it is withOUT ()'s.
Here is the code for spawning a new task:
Task <name>("taskname", (FUNCPTR)(functionname));
Don't do this:
Task <name>("taskname", (FUNCPTR)(functionname()));
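If the loop you want to run lives in a class, one workaround (sketched from memory, so check the signatures against your copy of Task.h) is a static trampoline function that receives the object pointer as the task argument:

// Task::Start() passes its UINT32 arguments through to the task
// function, so a static member can forward to the real member function.
class VisionTracker
{
public:
    void Run();   // the actual tracking loop

    static int TaskEntry(UINT32 self)
    {
        ((VisionTracker *)self)->Run();
        return 0;
    }
};

VisionTracker tracker;
Task visionTask("vision", (FUNCPTR)VisionTracker::TaskEntry);
visionTask.Start((UINT32)&tracker);   // note: TaskEntry, no ()'s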
Re: 2010 FRC camera vision tracking delay, image processing lag
As for semaphores, etc., use a Synchronized object; it makes it rather easy to protect critical areas.
You do still have to declare a semaphore first, though: Synchronized sync(<semaphore name>);
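Something like this (the semaphore itself comes from the VxWorks semLib, which is how WPILib creates its own; this is a sketch only):

#include <semLib.h>
#include "Synchronized.h"

// Create the mutex once, e.g. in the robot class constructor.
SEM_ID visionDataSem =
    semMCreate(SEM_Q_PRIORITY | SEM_DELETE_SAFE | SEM_INVERSION_SAFE);

void WriteSharedResults()
{
    // The constructor takes the semaphore and the destructor gives it
    // back, so it is released automatically even on an early return.
    Synchronized sync(visionDataSem);
    // ... read/write the shared vision variables here ...
}

The advantage over calling semTake/semGive by hand is that you can't forget the release.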
Re: 2010 FRC camera vision tracking delay, image processing lag
I can't speak to the C/C++ image processing, but the LV code runs in an independent loop (its version of a task), and that seems to be the better way to go. No matter what you do, you will not get the image processing to run at 50 Hz.
As for performance in general, a white paper was posted on ni.com that discusses some tuning and performance issues. By the way, I suspect the arcs are drawn around the targets on the dashboard, not on the cRIO; at least that is the way it is done for LV. If you have questions the white paper doesn't answer, fire away.
Greg McKaskle
Re: 2010 FRC camera vision tracking delay, image processing lag
That is a correct assumption: the dashboard draws the arcs.
I have a random question: is there a maximum number of tasks I can start on the cRIO before it freaks out?
Re: 2010 FRC camera vision tracking delay, image processing lag
I've read that the maximum number of tasks is limited only by the amount of memory installed on the cRIO. Since the cRIO is running an embedded operating system, my guess is that very bad things happen once the memory is full.
Re: 2010 FRC camera vision tracking delay, image processing lag
That would depend on the task being created. Each task allocates a stack and some other memory. My guess is that a hundred is fine and should be enough.
I'm pretty sure that LV allocates dozens. If you have other needs, please post.
Greg McKaskle
Re: 2010 FRC camera vision tracking delay, image processing lag
On the topic of multithreading, I've done lots of work with multithreaded programs, both on Windows and on VxWorks (our robot last year ran on five different threads). It's not that hard once you get a good grip on how to use semaphores. Send me a PM and I'd be more than happy to help you.
Re: 2010 FRC camera vision tracking delay, image processing lag
Use the sendVisionData function in the DashboardDataSender.h file that comes with the 2010 vision demo:
void sendVisionData(double joyStickX, double gyroAngle, double gyroRate, double targetX, vector<Target> targets);
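For example, called from the main loop (the surrounding variable names are illustrative, not from the demo):

// dds is the DashboardDataSender instance, stick and gyro are the
// usual WPILib objects, and targetX/targets are the latest results
// from the vision code.
dds->sendVisionData(stick->GetX(), gyro->GetAngle(), gyro->GetRate(),
                    targetX, targets);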
Re: 2010 FRC camera vision tracking delay, image processing lag
I was referring to the LabVIEW dashboard. The robot code needs to populate the packets to match what the dashboard side is expecting; otherwise you get the dreaded data type mismatch error when running the dashboard.
If you open up the default robot-side dashboard demo, you'll see where the vision data gets packed up (we use C++, but the LabVIEW side does the same thing). If you compare that to the high-priority packet format on the dashboard side, you'll see how it matches. You can then trace the wires up and to the right to get to the vision block to see how the overlay is done.
Re: 2010 FRC camera vision tracking delay, image processing lag
You may already have what you need, but I was pretty sure the sample code for C++ and Java sent the data to the dashboard.
Greg McKaskle