Chief Delphi

Chief Delphi (http://www.chiefdelphi.com/forums/index.php)
-   Programming (http://www.chiefdelphi.com/forums/forumdisplay.php?f=51)
-   -   2013 Lessons Learned for a Vision Co-Processor (http://www.chiefdelphi.com/forums/showthread.php?t=123917)

virtuald 16-01-2014 11:32

Re: 2013 Lessons Learned for a Vision Co-Processor
 
Quote:

Originally Posted by Jerry Ballard (Post 1320041)
Thanks Joe for the formatting tip.

9) Diagnostic photos are good: During the matches, we captured and saved frames from the camera once a second (sometimes less) to help us determine how the targeting system was doing.

We did this as well. It was great for tuning purposes. :)
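In case it helps anyone, here's a minimal sketch of the once-a-second throttle around the save call (Python; the FrameSaver name and the 1-second default are just illustrative — in the real camera loop you'd hand the frame to something like cv2.imwrite when it fires):

```python
import time

class FrameSaver:
    """Rate-limits diagnostic frame saves to at most one per interval."""

    def __init__(self, interval=1.0):
        self.interval = interval   # seconds between saved frames
        self.last_save = None      # timestamp of the last saved frame

    def should_save(self, now=None):
        """Return True if enough time has passed to save another frame."""
        if now is None:
            now = time.monotonic()
        if self.last_save is None or now - self.last_save >= self.interval:
            self.last_save = now
            return True
        return False
```

In the capture loop you check every frame against should_save() and only write it to disk (with a timestamped filename) when it returns True, so a match-length capture stays small.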

sparkytwd 16-01-2014 13:13

Re: 2013 Lessons Learned for a Vision Co-Processor
 
Quote:

Originally Posted by Dr.Bot (Post 1327950)
Is it legal to use a Beaglebone as a co-processor? I have had success running ROS and a Kinect off both a BBB and a Pi, but the Pi is underpowered for onboard vision processing. I don't have any Panda board experience.

Chapter and Verse:

Quote:

R54
ROBOTS must be controlled via one (1) programmable National Instruments cRIO (P/N: cRIO-FRC or cRIO-FRCII), with image version FRC_2014_v52.


There are no rules that prohibit co-processors, provided commands originate from the cRIO to configure, enable, and specify all operating points for all power regulating devices. This includes Jaguar motor controllers legally wired to the CAN-bus.
You could even go so far as to have a shell command program on the cRIO that took commands from the co-processor.

As for ROS, I haven't played with it too much, but our current vision processor, the ODROID-U2, does have a ROS port: http://forum.odroid.com/viewtopic.php?f=8&t=2096

faust1706 16-01-2014 15:33

Re: 2013 Lessons Learned for a Vision Co-Processor
 
Quote:

Originally Posted by virtuald (Post 1328034)
We did this as well. It was great for tuning purposes. :)

Also, save off all the variables you need to a logfile. I've done this the past 2 years. You can write an Octave script that makes graphs out of the solutions. You can also REPRINT your solutions onto the corresponding image. This was really cool. We went out with a cart with a computer and camera during calibration, saved images 10 times a second, and then created a "video" that we could show in the pits to curious students and mentors, and show the judges it working on the field.
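For what it's worth, the logfile side of this can be as simple as one CSV row per processed frame. A rough sketch (Python; field names like distance and azimuth are just examples of what a team might log, not anything standard):

```python
import csv

class SolutionLogger:
    """Appends one row of vision-solution variables per processed frame;
    the resulting CSV loads straight into Octave for graphing."""

    FIELDS = ["timestamp", "frame", "target_found", "distance", "azimuth"]

    def __init__(self, path):
        self.f = open(path, "a", newline="")
        self.writer = csv.DictWriter(self.f, fieldnames=self.FIELDS)
        if self.f.tell() == 0:      # new file: write the header once
            self.writer.writeheader()

    def log(self, timestamp, frame, target_found, distance, azimuth):
        self.writer.writerow({
            "timestamp": timestamp,
            "frame": frame,
            "target_found": int(target_found),
            "distance": distance,
            "azimuth": azimuth,
        })
        self.f.flush()   # don't lose rows if the robot browns out mid-match

    def close(self):
        self.f.close()
```

Because the frame number is logged alongside each solution, an Octave script can later pair every row back up with the saved image and draw the solution on top of it.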

Also, if you can, go to another regional you are not competing at and take pictures so you can get an idea of what to expect at your regional. We have also done this for the past 2 years.

Greg McKaskle 16-01-2014 16:04

Re: 2013 Lessons Learned for a Vision Co-Processor
 
Since we are talking about logging data for review, has anyone played with the dashboard template that includes the ability to record and play back a match? It logs an AVI and another file that includes the contents of the network table, joystick data, I/O values, etc. It would also work for testing changes to dashboard-based vision processing: your code under development would process the AVI instead of the live feed.
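The nice part of that design is that the processing code doesn't care where the frames come from. A rough sketch of the idea (Python; avi_frames uses OpenCV's VideoCapture and is only one possible frame source):

```python
def run_pipeline(frames, process):
    """Feed any iterable of frames -- live camera grabs or frames decoded
    from a recorded AVI -- through the same processing function."""
    return [process(frame) for frame in frames]

def avi_frames(path):
    """Yield frames from a recorded AVI (requires OpenCV at runtime)."""
    import cv2
    cap = cv2.VideoCapture(path)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:          # end of file (or an unreadable frame)
                break
            yield frame
    finally:
        cap.release()
```

During development you'd call run_pipeline(avi_frames("match.avi"), process); on the robot, the only change is swapping in the live frame source.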

Greg McKaskle

virtuald 17-01-2014 19:49

Re: 2013 Lessons Learned for a Vision Co-Processor
 
Quote:

Originally Posted by Greg McKaskle (Post 1328173)
Since we are talking about logging data for review, has anyone played with the dashboard template that includes the ability to record and play back a match? It logs an AVI and another file that includes the contents of the network table, joystick data, I/O values, etc. It would also work for testing changes to dashboard-based vision processing: your code under development would process the AVI instead of the live feed.

Greg McKaskle

That sounds like a nice capability. However, it's not useful for anyone not using the LabVIEW dashboard.

I have been thinking about logging all of the data and doing something with it for a while now, but I've never actually gotten around to doing it.

Greg McKaskle 18-01-2014 06:39

Re: 2013 Lessons Learned for a Vision Co-Processor
 
Quote:

However, it's not useful for anyone not using the LabVIEW dashboard.
My understanding is that the 2.0 version of SD also has this feature.

So the teams who use the LV dashboard have this capability, the teams using SD 2.0 have it, and teams that do something entirely different can look at those implementations if they would like to have it.

My point is -- we got around to it. I hope teams know about the feature and use it to improve their robot, their gameplay, and help with debugging and diagnosis.

Greg McKaskle

