Poll: What did you use for vision tracking? (145 voters)

GRIP on roboRIO - IP camera: 3 (2.07%)
GRIP on roboRIO - USB camera: 9 (6.21%)
GRIP on laptop - IP camera: 19 (13.10%)
GRIP on laptop - USB camera: 6 (4.14%)
GRIP on Raspberry Pi - IP camera: 5 (3.45%)
GRIP on Raspberry Pi - USB camera: 13 (8.97%)
RoboRealm - IP camera: 6 (4.14%)
RoboRealm - USB camera: 7 (4.83%)
Other - please elaborate with a response: 77 (53.10%)

 
 
#13 | 02-05-2016, 20:00
nighterfighter
1771 Alum, 1771 Mentor
AKA: Matt B
FRC #1771 (1771)
Team Role: Mentor
 
Join Date: Sep 2009
Rookie Year: 2007
Location: Suwanee/Kennesaw, GA
Posts: 835
Re: What Did you use for Vision Tracking?

Quote: Originally Posted by MamaSpoldi
Team 230 also used a PixyCam... with awesome results, and we integrated it in one day. The PixyCam does the image processing on-board, so there is no need to transfer images. You can quickly train it to look for a specific color and report when it sees it. We selected the simplest interface option the Pixy provides: a single digital output (indicating "I see a target") and a single analog output (which reports where within the frame the target is located). This let us build a driver interface (and autonomous code) that uses the digital signal to tell us when the target is in view, then uses the analog value to drive the robot's rotation to center the goal.
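A minimal sketch of the single-digital, single-analog approach the quote describes, using WPILib's DigitalInput and AnalogInput classes. The port numbers, the 2.5 V "centered" assumption, and the gain are placeholders, not Team 230's actual values.

Code:
import edu.wpi.first.wpilibj.AnalogInput;
import edu.wpi.first.wpilibj.DigitalInput;

public class PixyAnalogTracker {
    // Pixy digital output: high when the trained signature is in view (DIO port assumed).
    private final DigitalInput targetSeen = new DigitalInput(0);
    // Pixy analog output: 0-5 V roughly mapped to the target's horizontal position (port assumed).
    private final AnalogInput targetX = new AnalogInput(0);

    private static final double CENTER_VOLTS = 2.5; // assumed mid-frame voltage
    private static final double RANGE_VOLTS = 5.0;
    private static final double kP = 0.8;           // proportional gain, tune on the robot

    /** Returns a rotation command in roughly [-1, 1]; 0 when no target is visible. */
    public double getTurnCommand() {
        if (!targetSeen.get()) {
            return 0.0; // don't chase the camera's "no target" default value
        }
        double error = (targetX.getVoltage() - CENTER_VOLTS) / RANGE_VOLTS;
        return kP * error;
    }
}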

We already had our tracking code written for the Axis camera, so our P loop only needed a tiny adjustment (the input range became 5 volts instead of 320 pixels), and the PixyCam swap required almost zero code change.
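For concreteness, the kind of scaling change this implies might look like the sketch below. The 160/320 pixel numbers come from the Axis image width mentioned above; the 2.5 V center is an assumption about the Pixy's analog output, and the gain is a placeholder.

Code:
public class TurnError {
    static final double kP = 0.8; // placeholder gain

    // Axis camera version: error normalized by the 320-pixel frame width.
    static double turnFromPixels(double targetXPixels) {
        return kP * (targetXPixels - 160.0) / 320.0;
    }

    // PixyCam version: the same P loop, but the range is 5 volts instead of 320 pixels.
    static double turnFromVolts(double targetXVolts) {
        return kP * (targetXVolts - 2.5) / 5.0;
    }
}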

We got our PixyCam hooked up and running in a few hours. We only used the analog output and didn't have time to get the digital output working. So if it never saw a target (an output value of around 0.43 volts, I believe), the robot would "track" to the right constantly. But that is easy enough to fix in code... (if the "center" position doesn't update, you aren't actually tracking).
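One way to implement that fix is to treat readings parked near the no-target voltage, or readings that stop changing, as "not tracking". The 0.43 V figure comes from the post above; the tolerance, stale-cycle count, and analog port below are made-up placeholders.

Code:
import edu.wpi.first.wpilibj.AnalogInput;

public class PixyTargetValidator {
    private static final double NO_TARGET_VOLTS = 0.43; // value reported when nothing is in view
    private static final double TOLERANCE_VOLTS = 0.05; // placeholder
    private static final int STALE_CYCLES = 25;         // ~0.5 s at a 50 Hz loop, placeholder

    private final AnalogInput targetX = new AnalogInput(0); // analog port assumed
    private double lastVolts = Double.NaN;
    private int unchangedCycles = 0;

    /** Call once per loop; returns true only when the reading looks like a live target. */
    public boolean hasTarget() {
        double volts = targetX.getVoltage();

        // Parked at the no-target default: nothing in view, so don't "track" to the right.
        if (Math.abs(volts - NO_TARGET_VOLTS) < TOLERANCE_VOLTS) {
            unchangedCycles = 0;
            lastVolts = volts;
            return false;
        }

        // A center position that never updates usually means we aren't actually tracking.
        if (Math.abs(volts - lastVolts) < 0.001) {
            unchangedCycles++;
        } else {
            unchangedCycles = 0;
        }
        lastVolts = volts;
        return unchangedCycles < STALE_CYCLES;
    }
}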

If we had more time we probably would have used I2C or SPI to interface with the camera, in order to get more data.



I know of at least two other teams from Georgia who used the PixyCam as well, adding it during or after the DCMP.
__________________
1771- Programmer, Captain, Drive Team (2009-2012)
4509- Mentor (2013-2015)
1771- Mentor (2015)