Chief Delphi

2012 Beta Testers (General Forum)

JamesTerm 06-09-2012 11:17

Re: 2012 Beta Testers
 
Quote:

Originally Posted by Greg McKaskle (Post 1184141)
If you want to do this with the smart dashboard, it would involve writing some Windows-specific code using gcc or Visual Studio that would read the UDP packet from the DS, parse the info, and display it.

Thanks for sharing this workflow; we may try to pursue that... One thing I'd like to know is whether the SmartDashboard can be developed offline. I know I've seen a testing harness available in the past, but that one is outdated. It would be great to be able to interface with the SmartDashboard without needing a cRIO present.

Also, while I'm here... I'm looking forward to seeing if the Joystick protocol is going to change to use the full HID info. I was thinking of creating an article/feature request about this in terms of being able to use the POV controls on a Logitech gamepad. I wanted to use that control for this year's game but could not.
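
A minimal sketch of the "read the UDP packet from the DS" step quoted above, assuming POSIX sockets and assuming the DS sends its dashboard data to UDP port 1165 (both assumptions; on Windows this would be Winsock code, as Greg suggests, with WSAStartup and closesocket):

Code:

// Minimal sketch: listen for the DS dashboard packet over UDP.
// Assumptions (not from the thread): POSIX sockets, and port 1165
// as the DS -> dashboard port. Parsing the payload is left open.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>

int main() {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    sockaddr_in local{};
    local.sin_family      = AF_INET;
    local.sin_addr.s_addr = htonl(INADDR_ANY);
    local.sin_port        = htons(1165);   // assumed DS -> dashboard port
    if (bind(sock, (sockaddr*)&local, sizeof local) < 0) {
        perror("bind");
        return 1;
    }
    char packet[2048];
    for (;;) {
        // Each datagram is one DS status packet; parsing the fields and
        // displaying them is the dashboard's job from here on.
        ssize_t n = recv(sock, packet, sizeof packet, 0);
        if (n < 0) break;
        printf("got %zd-byte packet from the DS\n", n);
    }
    close(sock);
    return 0;
}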

Tom Line 06-09-2012 14:15

Re: 2012 Beta Testers
 
Quote:

Originally Posted by Greg McKaskle (Post 1184141)
Reading between the lines, I suspect that 118 is doing an HTTP GET to init the mjpeg stream. This is inherently done over TCP. I suspect they sent data back to the cRIO using UDP. Of course the computer doing this work was mounted on their robot, not on the DS.

If you want to do this with the smart dashboard, it would involve writing some Windows-specific code using gcc or Visual Studio that would read the UDP packet from the DS, parse the info, and display it. It would do the HTTP GET to get the mjpeg stream opened, and some code to decode or display it. It could also recompile or implement the network table protocol for C++ and use that to do reads/writes to the cRIO for sharing data. No need to interoperate with Java unless you decide to.

Similarly, the LV dashboard does the UDP and TCP work. No network table implementation was made publicly available last year.

Greg McKaskle

Greg, when I thought about this I assumed that the LabVIEW dashboard live stream is already an mjpeg stream and that LabVIEW users can pull frames from it and use the LabVIEW example vision code to process them. Is that the case? Is there a reason not to do it that way?
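
A minimal C++ sketch of the "HTTP GET to init the mjpeg stream" step from the quote above. The camera address, port 80, and the /mjpg/video.mjpg path are assumptions based on typical Axis camera defaults; the loop only scans the multipart stream for JPEG start/end markers and leaves decoding and display to the reader:

Code:

// Minimal sketch: open an mjpeg stream with an HTTP GET over TCP and
// carve the byte stream into JPEG frames. Camera IP, port 80, and the
// URL path are illustrative assumptions, not values from the thread.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>
#include <cstring>
#include <vector>

int main() {
    int sock = socket(AF_INET, SOCK_STREAM, 0);
    sockaddr_in cam{};
    cam.sin_family = AF_INET;
    cam.sin_port   = htons(80);                        // assumed camera port
    inet_pton(AF_INET, "10.33.81.11", &cam.sin_addr);  // assumed camera IP
    if (connect(sock, (sockaddr*)&cam, sizeof cam) < 0) {
        perror("connect");
        return 1;
    }

    // The HTTP GET that starts the multipart (mjpeg) response.
    const char* req = "GET /mjpg/video.mjpg HTTP/1.0\r\n\r\n";
    send(sock, req, strlen(req), 0);

    // Scan for JPEG start (FF D8) and end (FF D9) markers; everything in
    // between is one frame that could be handed to a decoder.
    std::vector<unsigned char> frame;
    bool inFrame = false;
    unsigned char prev = 0, byte = 0;
    while (recv(sock, &byte, 1, 0) == 1) {
        if (!inFrame && prev == 0xFF && byte == 0xD8) {
            inFrame = true;
            frame.clear();
            frame.push_back(0xFF);
            frame.push_back(0xD8);
        } else if (inFrame) {
            frame.push_back(byte);
            if (prev == 0xFF && byte == 0xD9) {        // end of this frame
                printf("got a JPEG frame, %zu bytes\n", frame.size());
                inFrame = false;                       // decode/display here
            }
        }
        prev = byte;
    }
    close(sock);
    return 0;
}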

Greg McKaskle 06-09-2012 20:46

Re: 2012 Beta Testers
 
Quote:

Greg, when I thought about this I assumed that the LabVIEW dashboard live stream is already an mjpeg stream and that LabVIEW users can pull frames from it and use the LabVIEW example vision code to process them. Is that the case? Is there a reason not to do it that way?

Last year, the default dashboard changed from reading individual JPEGs to doing an mjpeg stream. As you mention, it has always been possible to branch the image wire and connect it to the vision VIs for processing. Getting the processing back to the robot would involve UDP or TCP done on the open ports, or possibly using the beta-quality Network Tables.

The example code for vision and the tutorial that went with it supported both laptop and cRIO. It didn't integrate into the dashboard, but you pretty much just needed to copy and paste the loop and connect it to the mjpeg wire.

So yeah, no reason not to. If the processing is done only when needed or on low-resolution images, the cRIO should have plenty of juice to process the images. But the added power of the laptop makes it far easier to get a working solution with less optimization. For reference, the cRIO is about 800 MIPS. Image processing is almost entirely integer, so that is a pretty good metric to use. The Atom in the Classmates is around 3000 MIPS.

Greg McKaskle
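
And a matching sketch of one way to get processing results from the laptop back to the robot over UDP, as Greg's reply suggests. The packet layout, the 10.TE.AM.2-style robot address, and the port number are all illustrative assumptions; check which ports are actually open on the field before relying on one:

Code:

// Minimal sketch: push vision results from the dashboard laptop back to
// the cRIO over UDP. Packet layout, robot IP, and port are assumptions
// shared with (hypothetical) matching code on the robot side.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>

struct TargetPacket {        // assumed layout, mirrored in the robot code
    int32_t centerX;         // target center in image coordinates
    int32_t centerY;
    int32_t distanceInches;  // estimated range to the target
};

int main() {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    sockaddr_in robot{};
    robot.sin_family = AF_INET;
    robot.sin_port   = htons(1130);                    // illustrative port
    inet_pton(AF_INET, "10.33.81.2", &robot.sin_addr); // illustrative IP

    TargetPacket pkt{320, 240, 144};                   // example values
    // Serialize to network byte order so both ends agree.
    pkt.centerX        = htonl(pkt.centerX);
    pkt.centerY        = htonl(pkt.centerY);
    pkt.distanceInches = htonl(pkt.distanceInches);

    sendto(sock, &pkt, sizeof pkt, 0, (sockaddr*)&robot, sizeof robot);
    close(sock);
    return 0;
}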

JamesTerm 07-09-2012 08:48

Re: 2012 Beta Testers
 
Quote:

Originally Posted by Greg McKaskle (Post 1184471)
So yeah, no reason not to. If the processing is done only when needed or on low-resolution images, the cRIO should have plenty of juice to process the images. But the added power of the laptop makes it far easier to get a working solution with less optimization. For reference, the cRIO is about 800 MIPS. Image processing is almost entirely integer, so that is a pretty good metric to use. The Atom in the Classmates is around 3000 MIPS.
Greg McKaskle

Thanks for the benchmarks... I presume the 800 MIPS figure refers to the cRIO-II, right? I tested the example Rebound Rumble vision processing code on the regular cRIO and found it took 80-110 ms per frame to process. I did not test this on the cRIO-II, though. If I understand correctly, image processing *on the cRIO* is almost entirely integer, while processing using OpenCV on a Classmate would not need this kind of optimization.
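
For what it's worth, a tiny sketch of how a per-frame figure like 80-110 ms can be measured; processFrame() here is a hypothetical stand-in for the vision pipeline under test:

Code:

// Measure wall-clock time per frame for a (hypothetical) vision pipeline.
#include <chrono>
#include <cstdio>

void processFrame() { /* decode, threshold, measure particles ... */ }

int main() {
    for (int i = 0; i < 100; ++i) {
        auto t0 = std::chrono::steady_clock::now();
        processFrame();
        auto t1 = std::chrono::steady_clock::now();
        long long ms = std::chrono::duration_cast<
            std::chrono::milliseconds>(t1 - t0).count();
        std::printf("frame %d: %lld ms\n", i, ms);
    }
    return 0;
}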

Joe Ross 09-09-2012 22:07

Re: 2012 Beta Testers
 
Quote:

Originally Posted by Tom Line (Post 1184430)
Greg, when I thought about this I assumed that the LabVIEW dashboard live stream is already an mjpeg stream and that LabVIEW users can pull frames from it and use the LabVIEW example vision code to process them. Is that the case? Is there a reason not to do it that way?

That's basically what we did this year. I posted a simplified example here: http://forums.usfirst.org/showthread...9750#post59750

Tom Line 10-09-2012 01:51

Re: 2012 Beta Testers
 
Quote:

Originally Posted by Joe Ross (Post 1184943)
That's basically what we did this year. I posted a simplified example here: http://forums.usfirst.org/showthread...9750#post59750

Thanks for that link, Joe. Figuring out UDP communication was on my to-do list before next season for exactly this reason. It will be a big help in understanding how it works.

Greg McKaskle 10-09-2012 07:08

Re: 2012 Beta Testers
 
Quote:

Originally Posted by JamesTerm
I presume the 800 MIPS ...

The processors are very similar. The data sheet for the MPC5125, the chip in the cRIO-II, stated the 800 MIPS as being accomplished for less than one watt. The cRIO uses the MPC5100 and lists the MIPS as 760.

I quoted integer benchmarks because many image operations are integer based. The sequence of operations for FRC in 2012 was to decode the JPG, color threshold, convex hull, make particle measurements, and compare the particle scores to target scores. I believe all of those operations are integer based, many of them being applied to every pixel, so lots of integer ops. I'm sure there are some floats used too, but way more integers.
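
To make the "applied to every pixel" point concrete, here is a sketch of a color threshold. The RGB buffer layout and the limits are illustrative (the actual NI Vision threshold typically works on HSL/HSV planes), but the shape of the work is the same: integer compares, loads, and stores in a tight per-pixel loop:

Code:

// Sketch of the all-integer, per-pixel work a color threshold does.
#include <cstddef>
#include <cstdint>

// Mark each pixel 1 if all three channels fall inside [lo, hi], else 0.
void colorThreshold(const uint8_t* rgb, uint8_t* mask, size_t pixelCount,
                    uint8_t rLo, uint8_t rHi,
                    uint8_t gLo, uint8_t gHi,
                    uint8_t bLo, uint8_t bHi) {
    for (size_t i = 0; i < pixelCount; ++i) {
        const uint8_t r = rgb[3 * i + 0];
        const uint8_t g = rgb[3 * i + 1];
        const uint8_t b = rgb[3 * i + 2];
        // Comparisons, loads, and stores only -- no floating point,
        // which is why integer MIPS is the figure that matters here.
        mask[i] = (r >= rLo && r <= rHi &&
                   g >= gLo && g <= gHi &&
                   b >= bLo && b <= bHi) ? 1 : 0;
    }
}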

A few years ago, the target was the ellipse/circle, and a Hough transform was used for the shape matching. At least the current geometric shape library is entirely Hough-based; I assume it was then, too. This will have a bigger mix of float operations. Since coordinates in the image are integers and there are so many of them, there will at least be lots of int loads and stores. And in general, image processing libraries tend to be optimized. Since ints are still somewhat faster than floats, even with SSE, they will use the fastest approach that gets the right answer.

Greg McKaskle
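
To ground the float/integer mix described above, a rough sketch of a fixed-radius circle Hough transform. This is not the NI geometric-matching implementation, just an illustration of where the trig (float) and the accumulator traffic (integer loads and stores) come from:

Code:

// Rough sketch: every edge pixel votes into an integer accumulator for
// each circle center that could have produced it. Fixed, known radius
// for simplicity; real shape matching is considerably more involved.
#include <cmath>
#include <cstdint>
#include <vector>

// edge: width*height mask, nonzero where an edge pixel was found.
// The brightest accumulator cell is the likeliest circle center.
std::vector<int> houghCircle(const uint8_t* edge, int width, int height,
                             int radius) {
    const double kPi = 3.14159265358979323846;
    std::vector<int> acc(width * height, 0);
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            if (!edge[y * width + x]) continue;
            for (int deg = 0; deg < 360; ++deg) {
                double th = deg * kPi / 180.0;   // the float part
                int cx = x - (int)std::lround(radius * std::cos(th));
                int cy = y - (int)std::lround(radius * std::sin(th));
                if (cx >= 0 && cx < width && cy >= 0 && cy < height)
                    ++acc[cy * width + cx];      // integer load/store
            }
        }
    }
    return acc;
}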

