#1 | 03-01-2014, 09:38
Jerry Ballard | FRC #0456 (Siege Robotics) | Mentor
2013 Lessons Learned for a Vision Co-Processor

On the eve of Kickoff, we thought it would be a good time to post our lessons learned from last year's competition.

Team 456 (Siege Robotics)

Hardware: PandaBoard running Ubuntu Linux, Logitech C110 webcam
Software: OpenCV, yavta, iniparser, mongoose
Code: C

1) Control your exposure: For consistent results and to avoid oversaturating bright targets, set and control the exposure (integration time) on the webcam. We found that an 8 ms integration time worked best in most settings. Camera exposure was set using yavta (https://github.com/fastr/yavta). Remember, many webcams default to auto-exposure, which won't let you consistently identify targets in changing lighting conditions (see #2).
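
For anyone who would rather set this from inside their own program than shell out to yavta, it comes down to two V4L2 controls. Here is a minimal sketch (not our actual code; the device path and values are placeholders, and some webcams only accept "aperture priority" instead of full manual):

Code:
/*
**  Disable auto-exposure and set a fixed integration time through
**  V4L2 (this is what yavta does under the hood).  Exposure units
**  are 100 us, so 8 ms == 80.
*/
#include <fcntl.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/videodev2.h>

static int set_control( int fd, unsigned int id, int value )
{
   struct v4l2_control ctrl;
   ctrl.id    = id;
   ctrl.value = value;
   return ioctl( fd, VIDIOC_S_CTRL, &ctrl );
}

int main(void)
{
   int fd = open( "/dev/video0", O_RDWR );   /* placeholder device */
   if ( fd < 0 )
      return 1;

   set_control( fd, V4L2_CID_EXPOSURE_AUTO, V4L2_EXPOSURE_MANUAL );
   set_control( fd, V4L2_CID_EXPOSURE_ABSOLUTE, 80 );  /* 80 x 100us = 8 ms */

   close( fd );
   return 0;
}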

2) Do calibrate at competition: Each competition arena will have different lighting. Take advantage of the time provided before competition begins and calibrate the camera to those conditions. We set up a simple script that recorded movies at different camera exposures, which we then ran through our target tracking system off the field. I've attached frame grabs of the same field at different camera exposures. The corresponding movies are available; let me know if you would like a copy.

3) Use a configuration file:
We found that using a configuration file allowed us to modify target tracking settings during competition without having to recompile code. Recompiling code during competition is rarely a good thing.
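
For reference, this is roughly what the iniparser side of it looks like (the file name, section, and key names below are made-up examples, not our real config):

Code:
/*
**  Read tracking thresholds from an .ini file at startup so they can
**  be tuned at competition without recompiling.  Example vision.ini:
**
**    [tracking]
**    val_thresh     = 230
**    hue_mid_thresh = 50
**    hue_mid_span   = 15
*/
#include <stdio.h>
#include "iniparser.h"

int main(void)
{
   dictionary *ini = iniparser_load( "vision.ini" );
   if ( ini == NULL ) {
      fprintf( stderr, "cannot load vision.ini, using defaults\n" );
      return 1;
   }

   int val_thresh     = iniparser_getint( ini, "tracking:val_thresh", 230 );
   int hue_mid_thresh = iniparser_getint( ini, "tracking:hue_mid_thresh", 50 );
   int hue_mid_span   = iniparser_getint( ini, "tracking:hue_mid_span", 15 );

   printf( "thresholds: %d %d %d\n", val_thresh, hue_mid_thresh, hue_mid_span );

   iniparser_freedict( ini );
   return 0;
}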

4) OpenCV isn't always optimized for FRC competition: In the pursuit of faster frame processing, we found the traditional OpenCV RGB->HSV->Threshold(Value)&Threshold(Hue) pipeline to be a major time-eater. We gained significant performance (from 15 fps to 22 fps) by doing the thresholding on Value and Hue while converting from RGB, which reduced the computation load by eliminating the Saturation calculation and skipping obvious non-target pixels. If requested, we can post the algorithm here; it is also available on our GitHub site. (GitHub is awesome!)

5) Avoid double precision:
With a 640x480 input image, most if not all calculations can be done with integer and single-precision arithmetic. Using doubles adds processing time, which means fewer frames per second.
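
As a small illustration (generic numbers, not from our code): converting a target's pixel column to an angle never needs doubles, as long as you remember the 'f' suffixes so the compiler doesn't silently promote the math to double:

Code:
#define IMG_WIDTH     640
#define HFOV_DEGREES  47.0f      /* placeholder horizontal field of view */

/*
**  Angular offset (degrees) of a pixel column from the image centre.
**  All-float arithmetic: without the 'f' suffixes the expression
**  would be evaluated in double precision.
*/
static float pixel_to_degrees( int pixel_x )
{
   float centre = (IMG_WIDTH - 1) / 2.0f;
   return ((float)pixel_x - centre) * (HFOV_DEGREES / (float)IMG_WIDTH);
}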

6) Data transfer via HTTP was questionable: We used the Mongoose webserver code to transfer target angular coordinates to the control system (https://github.com/cesanta/mongoose). Coding-wise, Mongoose was easy to compile, simple to integrate, and multi-threaded (that is another discussion). During competition, it sometimes appeared that we would not get target lock as quickly as in testing, and we suspect this was related to the HTTP communication between the PandaBoard and the cRIO. We are still trying to decide the best way to communicate between the vision co-processor and the cRIO (looking at UDP next).
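
For anyone heading the same way, the sending side of UDP is only a few lines; a rough sketch of what we are considering (the IP address, port, and payload layout here are placeholders, and the cRIO end would need a matching receive):

Code:
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdint.h>
#include <string.h>
#include <sys/socket.h>
#include <sys/types.h>
#include <unistd.h>

/*
**  Send one target solution (azimuth/elevation in hundredths of a
**  degree) to the cRIO as a single fixed-format UDP datagram.
*/
int send_target_udp( const char *crio_ip, int port,
                     int azimuth_cdeg, int elevation_cdeg )
{
   int sock = socket( AF_INET, SOCK_DGRAM, 0 );
   if ( sock < 0 )
      return -1;

   struct sockaddr_in dest;
   memset( &dest, 0, sizeof(dest) );
   dest.sin_family = AF_INET;
   dest.sin_port   = htons( port );
   inet_pton( AF_INET, crio_ip, &dest.sin_addr );

   /* fixed order, network byte order: two 4-byte values */
   uint32_t payload[2];
   payload[0] = htonl( (uint32_t)azimuth_cdeg );
   payload[1] = htonl( (uint32_t)elevation_cdeg );

   ssize_t n = sendto( sock, payload, sizeof(payload), 0,
                       (struct sockaddr *)&dest, sizeof(dest) );
   close( sock );
   return ( n == (ssize_t)sizeof(payload) ) ? 0 : -1;
}
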
Attached Thumbnails: framegrab_2ms.jpg, framegrab_4ms.jpg, framegrab_6ms.jpg, framegrab_8ms.jpg, framegrab_16ms.jpg


Last edited by Jerry Ballard : 03-01-2014 at 12:29. Reason: Corrected minor typo in #3
#2 | 03-01-2014, 10:17
billylo | FRC #0610 (Coyotes) | Mentor
Re: 2013 Lessons Learned for a Vision Co-Processor

thanks for sharing... #priceless
#3 | 03-01-2014, 12:09
mechanical_robot | no team | Driver
Re: 2013 Lessons Learned for a Vision Co-Processor

Thank you for sharing this. Very interesting, as I am considering doing an OpenCV project with my new Raspberry Pi.
#4 | 03-01-2014, 15:33
yash101 | no team
Re: 2013 Lessons Learned for a Vision Co-Processor

Thanks. Can you post a link to the algorithm you used? Did you use the web server to communicate with the cRIO or just the DS?
#5 | 03-01-2014, 18:29
JesseK | FRC #1885 (ILITE) | Mentor
Re: 2013 Lessons Learned for a Vision Co-Processor

UDP would work fine. We use raw network programming like that to send a complex data structure between the robot and the driver station. We could use something fancy like Protobufs, CORBA, or other WSDL-type middleware, but hardcoding the encode/decode order of 4-byte values also works and is what we do.
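
For the curious, "hardcoding the order" really is just a pair of helpers along these lines (a generic sketch, not our team's exact code); as long as both ends agree on the field order and byte order, there is nothing else to it:

Code:
#include <arpa/inet.h>
#include <stdint.h>
#include <string.h>

/*
**  Append one 4-byte value to the buffer in network byte order,
**  returning the new offset.  The decoder calls the matching
**  function in exactly the same field order.
*/
size_t encode_u32( uint8_t *buf, size_t off, uint32_t v )
{
   uint32_t be = htonl( v );
   memcpy( buf + off, &be, 4 );
   return off + 4;
}

size_t decode_u32( const uint8_t *buf, size_t off, uint32_t *v )
{
   uint32_t be;
   memcpy( &be, buf + off, 4 );
   *v = ntohl( be );
   return off + 4;
}

/* floats ride along as their 4-byte bit pattern */
size_t encode_f32( uint8_t *buf, size_t off, float f )
{
   uint32_t bits;
   memcpy( &bits, &f, 4 );
   return encode_u32( buf, off, bits );
}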

As for vision itself, I have no comments.
__________________

Drive Coach, 1885 (2007-present)
CAD Library Updated 5/1/16 - 2016 Curie/Carver Industrial Design Winner
GitHub
#6 | 03-01-2014, 19:24
Jerry Ballard | FRC #0456 (Siege Robotics) | Mentor
Re: 2013 Lessons Learned for a Vision Co-Processor

Quote:
Originally Posted by yash101 View Post
Thanks. Can you post a link to the algorithm you used? Did you use the web server to communicate with the cRIO or just the DS?
Communication was just with the cRIO LabVIEW controller code.

Here's the algorithm we used for the color conversion:

Code:
/*
**  Local MACRO defines
*/
#define MIN3(x,y,z)  ((y) <= (z) ? \
                         ((x) <= (y) ? (x) : (y)) \
                     : \
                         ((x) <= (z) ? (x) : (z)))

#define MAX3(x,y,z)  ((y) >= (z) ? \
                         ((x) >= (y) ? (x) : (y)) \
                     : \
                         ((x) >= (z) ? (x) : (z)))

/* 
** prototypes of functions
*/

void T456_change_RGB_to_binary( IplImage *, CvMat *, int, int, int);
void T456_filter_image( unsigned char , unsigned char , unsigned char , 
                 unsigned char *, int, int, int);

/*
**  Filter RGB image, HSV convert, threshold, etc...
*/

void T456_change_RGB_to_binary( IplImage *rgb, CvMat *binary,
                                int val_thresh, int hue_mid_thresh, 
                                int hue_mid_span )
{
  register int y;
  register unsigned char r,g,b;
  register char *data;
  register uchar *bin_data;

  register int total_vals;

  /* 
  **  Point the data pointer into the beginning of the image data
  */
  data = (char*)rgb->imageData;

  /*
  **  Point the output binary image pointer to the beginning of the image
  */
  bin_data = (uchar*)binary->data.ptr;
  
  total_vals = rgb->height * rgb->width;

  for ( y = 0; y < total_vals; y++ )  /* every pixel */
  {
     /* grab the bgr values */
     b = data[0];
     g = data[1];
     r = data[2];
     data += 3;

     T456_filter_image( r,g,b, bin_data, val_thresh, 
                        hue_mid_thresh, hue_mid_span );

     /* increment output pointer */
     bin_data++;
  }

}


void T456_filter_image( unsigned char r, unsigned char g, unsigned char b, 
                 unsigned char *binary ,
                 int val_thresh, int hue_mid_thresh, int hue_mid_span)
{
   unsigned char rgb_min, rgb_max, rgb_diff;
   unsigned char hue = 0; 
   unsigned char val = 0; 

   /*
   **  set the default return value to zero 
   */
   *binary = 0;

   /*
   **  get the min and max values of the RGB
   **   pixel
   */
   rgb_min = MIN3( r, g, b );
   rgb_max = MAX3( r, g, b );

   rgb_diff = rgb_max - rgb_min;

   val = rgb_max;

   /* 
   **  This is the trivial case:
   **    zero pixels or value is less than VAL_THRESH
   */
   if ( (val == 0) || (val < val_thresh) ) {
      return;   /* binary = 0 */
   }

   /*
   **  Zero out white pixels 
   **   WARNING (use only if camera is not oversaturated)
   */
   if ( (val >= val_thresh) && (rgb_diff == 0 ) ) 
   {
      *binary = 0;
      return;
   }

   /* 
   ** Compute hue 
   */
   if (rgb_max == r) {
       hue = 0 + 43 * (g - b)/(rgb_diff);
   } else if (rgb_max == g) {
       hue = 85 + 43*(b - r)/(rgb_diff);
   } else /* rgb_max == b */ {
       hue = 171 + 43*(r - g)/(rgb_diff);
   }

   /* 
   **  to get to this point, val > val_thresh
   */
   if (    (hue >= ( hue_mid_thresh - hue_mid_span)) 
       && ( hue <= ( hue_mid_thresh + hue_mid_span) ) )
   {
       *binary = 255;
   }
 
   return;
}

Last edited by Jerry Ballard : 03-01-2014 at 22:46. Reason: reformatted code section
#7 | 03-01-2014, 20:41
Joe Ross | FRC #0330 (Beachbots) | Engineer
Re: 2013 Lessons Learned for a Vision Co-Processor

Your tips are useful for vision processing in general. Do you have any tips specific to using a co-processor, i.e. how to power it, how to shut it down cleanly, how to get a development environment up and running quickly, etc.?

You can enclose your code in [ code] tags to keep it formatted.
#8 | 03-01-2014, 23:41
Jerry Ballard | FRC #0456 (Siege Robotics) | Mentor
Re: 2013 Lessons Learned for a Vision Co-Processor

Thanks Joe for the formatting tip.

Here are a few more lessons learned related to your questions:

7) Stable and reliable power is required: During competition, the system voltage can vary significantly (dropping below 10 V) due to the varying demands of the other powered components. When the input voltage dropped that far, the PandaBoard would lose power and reboot (not good). We used the Mini-Box DC-DC converter (http://www.mini-box.com/DCDC-USB?sc=8&category=981) to solve this problem. The only power issue we believe we had in competition was probably a loose power connection into the PandaBoard.

8) Hard Stop and Rebuild during competition: During the build season, we made the decision not to implement a graceful shutdown for competition. Instead, we let the system hard stop when power was cut, then rebooted and fsck'ed (file-system checked) it between matches. We also maintained multiple copies of the complete OS on memory cards and often swapped cards in the short time between matches. Swapping OS memory cards allowed for simple diagnostics of target tracking and provided redundancy in case of a severe system crash.

9) Diagnostic photos are good: During the matches, we captured and saved frames from the camera once a second (sometimes less) to help us determine how the targeting system was doing. It turned out that these images were very useful to the pilot/copilot of the robot for quickly replaying the previous match. Visual cueing from these images helped the students better recall what happened during the match (from the robot's point of view). I've attached an example image below. The blue circle is the aim point of the shooter, the cross-hairs identify targets in range, the red dot is the predicted frisbee hit, and the green circle is the selected target.
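
The capture itself is nothing fancy; roughly this, using the old OpenCV C API we were already using elsewhere (the path and naming scheme are just examples, and the third cvSaveImage argument is the OpenCV 2.x encoding-params array, 0 for defaults):

Code:
#include <stdio.h>
#include <time.h>
#include <opencv/cv.h>
#include <opencv/highgui.h>

/*
**  Save at most one diagnostic frame per second, named by frame count
**  and wall-clock time so it can be matched up with match video later.
*/
void maybe_save_diag_frame( IplImage *frame, int frame_number )
{
   static time_t last_save = 0;
   time_t now = time(NULL);
   struct tm *t;
   char filename[256];

   if ( now == last_save )
      return;                     /* already saved one this second */
   last_save = now;

   t = localtime( &now );
   snprintf( filename, sizeof(filename),
             "/media/sd/diag/frame_%05d_%02d%02d%02d.jpg",
             frame_number, t->tm_hour, t->tm_min, t->tm_sec );

   cvSaveImage( filename, frame, 0 );   /* 0 = default JPEG params */
}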

10) Raspberry Pi didn't work for us:
We spent a lot of time trying to get the RPi to run the targeting code. But as written, our code wasn't able to achieve the fps speed needed. We had set a goal of 15 fps as a minimum. The most we could squeeze out of the RPi was 10 fps. We all love the RPi but in this case we weren't able to get the required speed.
Attached Thumbnails: frame_00913_003825.jpg
#9 | 04-01-2014, 03:38
SoftwareBug2.0 | FRC #1425 (Error Code Xero) | Mentor
Re: 2013 Lessons Learned for a Vision Co-Processor

Quote:
Originally Posted by Jerry Ballard View Post

10) Raspberry Pi didn't work for us:
We spent a lot of time trying to get the RPi to run the targeting code. But as written, our code wasn't able to achieve the fps speed needed. We had set a goal of 15 fps as a minimum. The most we could squeeze out of the RPi was 10 fps. We all love the RPi but in this case we weren't able to get the required speed.
How did you determine what speed you needed? Was 10 Hz too slow just because it wasn't meeting your goal of 15 or did you try it and not like the results?
#10 | 04-01-2014, 07:58
MikeE | no team (Volunteer) | Engineer
Re: 2013 Lessons Learned for a Vision Co-Processor

Quote:
Originally Posted by Jerry Ballard View Post
8) Hard Stop and Rebuild during competition: During the build season, we made the decision not to implement a graceful shutdown for competition. Instead, we let the system hard stop when power was cut, then rebooted and fsck'ed (file-system checked) it between matches. We also maintained multiple copies of the complete OS on memory cards and often swapped cards in the short time between matches. Swapping OS memory cards allowed for simple diagnostics of target tracking and provided redundancy in case of a severe system crash.

9) Diagnostic photos are good: During the matches, we captured and saved frames from the camera once a second (sometimes less) to help us determine how the targeting system was doing. It turned out that these images were very useful to the pilot/copilot of the robot for quickly replaying the previous match. Visual cueing from these images helped the students better recall what happened during the match (from the robot's point of view). I've attached an example image below. The blue circle is the aim point of the shooter, the cross-hairs identify targets in range, the red dot is the predicted frisbee hit, and the green circle is the selected target.
Thanks for sharing your experiences.
Your success with hard stop is particularly helpful as clean shutdown was a significant concern. I like the pragmatic approach of having spare memory cards to swap between matches.

Great tip about the diagnostic images.
#11 | 16-01-2014, 11:32
virtuald (Dustin Spicuzza) | FRC #1418, #1973, #4796, #6367 | Mentor
Re: 2013 Lessons Learned for a Vision Co-Processor

Quote:
Originally Posted by Jerry Ballard View Post
Thanks Joe for the formatting tip.

9) Diagnostic photos are good: During the matches, we captured and saved frames from the camera once a second (sometimes less) to help us determine how the targeting system was doing.
We did this as well. It was great for tuning purposes.
__________________
Maintainer of RobotPy - Python for FRC
Creator of pyfrc (Robot Simulator + utilities for Python) and pynetworktables/pynetworktables2js (NetworkTables for Python & Javascript)

2017 Season: Teams #1973, #4796, #6369
Team #1418 (remote mentor): Newton Quarterfinalists, 2016 Chesapeake District Champion, 2x Innovation in Control award, 2x district event winner
Team #1418: 2015 DC Regional Innovation In Control Award, #2 seed; 2014 VA Industrial Design Award; 2014 Finalists in DC & VA
Team #2423: 2012 & 2013 Boston Regional Innovation in Control Award


Resources: FIRSTWiki (relaunched!) | My Software Stuff
#12 | 16-01-2014, 15:33
faust1706 | FRC #1706 (Ratchet Rockers) | College Student
Re: 2013 Lessons Learned for a Vision Co-Processor

Quote:
Originally Posted by virtuald View Post
We did this as well. It was great for tuning purposes.
Also, save off all the variables you need to a logfile. I've done this for the past 2 years. You can write an Octave script that will make graphs out of the solutions. You can also reprint your solutions onto the corresponding image, which was really cool. During calibration, we went out with a cart carrying a computer and camera, saved images 10 times a second, and then created a "video" that we could show in the pits to curious students and mentors, and show the judges it working on the field.
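
The logging side can be as simple as one CSV line per frame (the column names here are just an example, not my actual format); Octave's dlmread() will happily graph it afterwards:

Code:
#include <stdio.h>

/*
**  Append one line per processed frame; load it into Octave later
**  with dlmread() to graph the solution over the whole match.
*/
void log_solution( FILE *logfile, int frame, double t_sec,
                   float azimuth_deg, float elevation_deg, float range_in )
{
   fprintf( logfile, "%d,%.3f,%.2f,%.2f,%.1f\n",
            frame, t_sec, azimuth_deg, elevation_deg, range_in );
   fflush( logfile );   /* so the data survives a hard power-off */
}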

Also, if you can, go to another regional you are not competing at and take pictures so you can get an idea of what to expect at your regional. We have also done this for the past 2 years.
__________________
"You're a gentleman," they used to say to him. "You shouldn't have gone murdering people with a hatchet; that's no occupation for a gentleman."

Last edited by faust1706 : 16-01-2014 at 15:49.
#13 | 16-01-2014, 16:04
Greg McKaskle | FRC #2468 (Team NI & Appreciate)
Re: 2013 Lessons Learned for a Vision Co-Processor

Since we are talking about logging data for review, has anyone played with the dashboard template that includes the ability to record and playback a match? It logs an avi and another file that includes the contents of the network table, joystick data, I/O values, etc. It would also work to test out changes to dashboard-based vision processing. Your code under development would be processing the AVI instead of the live version.

Greg McKaskle
#14 | 17-01-2014, 19:49
virtuald (Dustin Spicuzza) | FRC #1418, #1973, #4796, #6367 | Mentor
Re: 2013 Lessons Learned for a Vision Co-Processor

Quote:
Originally Posted by Greg McKaskle View Post
Since we are talking about logging data for review, has anyone played with the dashboard template that includes the ability to record and playback a match? It logs an avi and another file that includes the contents of the network table, joystick data, I/O values, etc. It would also work to test out changes to dashboard-based vision processing. Your code under development would be processing the AVI instead of the live version.

Greg McKaskle
That sounds like a nice capability. However, it's not useful for anyone who isn't using the LabVIEW dashboard.

I have been thinking about logging all of the data and doing something with it for a while now, but I've never actually gotten around to doing it.
#15 | 04-01-2014, 08:55
billbo911 | FRC #2073 (EagleForce) | Mentor
Re: 2013 Lessons Learned for a Vision Co-Processor

Quote:
Originally Posted by Joe Ross View Post
Your tips are useful for vision processing in general. Do you have any tips specific to using a Co-Processor, ie how to power it, how to shut it down cleanly....
Our approach to powering the co-processor was to use the same +5 VDC source on the PDB that is provided for powering the Axis camera. Since we used a USB webcam, that output was free.
To power down gracefully, we created a VI that ran only in Disabled mode and was activated by pressing a designated button on one of the joysticks. The VI initiated a socket connection to our co-processor. On the co-processor, a routine ran constantly, listening for a socket request; when it received one, it ran a "shutdown" command. This was done at the end of every match.
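
In rough C terms, the co-processor side is just a blocking accept() loop; a sketch (the port number is arbitrary, not what we used, and the routine needs permission to run shutdown):

Code:
#include <netinet/in.h>
#include <stdlib.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

/*
**  Block until anything connects on the agreed port (triggered from
**  the Disabled-mode VI), then power the board down cleanly.
*/
int main(void)
{
   int listener = socket( AF_INET, SOCK_STREAM, 0 );
   struct sockaddr_in addr;
   int conn;

   memset( &addr, 0, sizeof(addr) );
   addr.sin_family      = AF_INET;
   addr.sin_addr.s_addr = htonl( INADDR_ANY );
   addr.sin_port        = htons( 5810 );       /* placeholder port */

   bind( listener, (struct sockaddr *)&addr, sizeof(addr) );
   listen( listener, 1 );

   conn = accept( listener, NULL, NULL );
   close( conn );
   close( listener );

   return system( "shutdown -h now" );
}
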
__________________
CalGames 2009 Autonomous Champion Award winner
Sacramento 2010 Creativity in Design winner, Sacramento 2010 Quarter finalist
2011 Sacramento Finalist, 2011 Madtown Engineering Inspiration Award.
2012 Sacramento Semi-Finals, 2012 Sacramento Innovation in Control Award, 2012 SVR Judges Award.
2012 CalGames Autonomous Challenge Award winner ($$$).
2014 2X Rockwell Automation: Innovation in Control Award (CVR and SAC). Curie Division Gracious Professionalism Award.
2014 Capital City Classic Winner AND Runner Up. Madtown Throwdown: Runner up.
2015 Innovation in Control Award, Sacramento.
2016 Chezy Champs Finalist, 2016 MTTD Finalist