Chief Delphi

Chief Delphi (http://www.chiefdelphi.com/forums/index.php)
-   Programming (http://www.chiefdelphi.com/forums/forumdisplay.php?f=51)
-   -   Team 254 Presents: CheesyVision (http://www.chiefdelphi.com/forums/showthread.php?t=128639)

Jared Russell 04-08-2014 11:45 PM

Team 254 Presents: CheesyVision
 
Like many teams this season, Team 254 was surprised when we got to our first competition and found out that the Hot Goal vision targets were not triggering right at the start of autonomous mode. There seem to have been some improvements over the weeks, but there is still anywhere from 0.5 to 1.5 seconds of delay.

We had originally planned on using a sensor on board the robot - an infrared photosensor from Banner - but our problem was that (a) you can't move the robot until the hot goal triggers or you'll miss the target, and (b) it meant our drive team spent a lot of time lining up the sensor to be juuuust right (as Karthik and Paul often pointed out at Waterloo). Onboard cameras may be more tolerant of movement, but they introduce new hardware and wiring onto the robot.

We were intrigued by the Kinect, but thought: Why use the Kinect when our Driver Station already has a built-in webcam?

Introducing CheesyVision, our new laptop-based webcam system for simple gesture control of our robot. 254 ran this software at SVR and drove to the correct goal every single time. In eliminations, we installed it on 971 and it worked perfectly as well. We wanted to share it with all of FRC prior to the Championship, because the field timing issue will probably never be perfect this season, and nobody should have to suffer for it.

CheesyVision is a Python program that runs on your Driver Station and uses OpenCV to process a video stream from your webcam.

There are three boxes on top of the webcam image:

- A calibration box (top center)
- Two boxes for your hands (left and right)

Basically, if the left and right boxes are similar in color to the calibration box, we assume your hand is not there. Before each match, our operator puts his hands in the left and right boxes, and then drops the one that corresponds to the goal that turns hot. The result is sent over a TCP socket to the cRIO - example Java code for a TCP server and a robot that uses this data in autonomous mode is provided, and doing the same thing in C++ or LabVIEW should be easy. (If you implement one of these, please share it with the community!)
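
For illustration, here is a minimal sketch of the idea in Python with OpenCV. This is NOT the actual CheesyVision source: the box positions, difference threshold, byte encoding, IP, and port below are all placeholders, so grab the real script from GitHub for the working values.

Code:

# Minimal sketch of the box-comparison idea (placeholders throughout; see
# the real CheesyVision script for working values).
import socket

import cv2
import numpy as np

CRIO_IP = "10.2.54.2"  # 10.TE.AM.2 convention; substitute your team number
PORT = 1180            # placeholder; match the port in the real script

def mean_color(img, x, y, w, h):
    # Average BGR color over a rectangular region of the frame.
    return img[y:y + h, x:x + w].mean(axis=(0, 1))

cap = cv2.VideoCapture(0)
sock = socket.create_connection((CRIO_IP, PORT))
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cal = mean_color(frame, 280, 20, 80, 80)     # calibration box, top center
    left = mean_color(frame, 60, 200, 80, 80)    # left hand box
    right = mean_color(frame, 500, 200, 80, 80)  # right hand box
    # A hand is "in" a box when that box's color differs enough from the
    # calibration box (your hand covers the background behind it).
    left_in = np.linalg.norm(left - cal) > 50.0   # threshold is a guess
    right_in = np.linalg.norm(right - cal) > 50.0
    # Pack both flags into a single byte: bit 0 = left, bit 1 = right.
    sock.send(bytes([int(left_in) | (int(right_in) << 1)]))
    cv2.imshow("CheesyVision sketch", frame)
    if cv2.waitKey(30) == 27:  # Esc quits
        break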

There are tuning instructions in the source code, but we have found the default settings work pretty reliably under most lighting conditions (the algorithm is self-calibrating), as long as the colors of your shirt and your skin are different enough. Of course, you could use virtually anything else besides skin and clothing, as long as the colors are different.

Here are some screenshots:

[screenshots attached in the original post]

To download and install the software, visit:
https://github.com/Team254/CheesyVision

Good luck!

DampRobot 04-08-2014 11:49 PM

Re: Team 254 Presents: CheesyVision
 
I saw this in person at SVR, and it is very cool. Great job 254, and thanks for sharing!

Now if only someone would use this same technology to block their 3 ball auto...

Yipyapper 04-08-2014 11:51 PM

Re: Team 254 Presents: CheesyVision
 
This is absolutely phenomenal, and since 781 had to remove their camera for weight, I needed a new method for hot goal detection. I have not been this happy about programming for a while--whether this works for us or not, I am incredibly grateful.

Too bad I can't give rep more than once.

Thad House 04-08-2014 11:53 PM

Re: Team 254 Presents: CheesyVision
 
This really is cool. I like the method. The only problem is that Wildstang couldn't use it :D

In all seriousness, I think this is an excellent way of detecting hot goals. Very simple, and most laptops have a camera on them nowadays. I'll keep it in mind for championships this weekend.

instantcake 04-08-2014 11:54 PM

Re: Team 254 Presents: CheesyVision
 
Thank you so much, we were just looking at how to implement our hot goal detection for champs, and this is an amazing solution. We also plan on extending it to tell the robot where to go while blocking during autonomous. Thank you so much for sharing this with the FIRST community!

JohnFogarty 04-08-2014 11:54 PM

Re: Team 254 Presents: CheesyVision
 
We currently use the Kinect method, but I might be inclined to implement this instead. I didn't develop something like this because the Kinect Java classes already existed and were fairly easy to use. I do like how this required some work, though.

Nice work.

PayneTrain 04-08-2014 11:58 PM

Re: Team 254 Presents: CheesyVision
 
I can't wait to tell the beleaguered crew working on Kinect programming there may be another way!

It is a real shame 254 isn't using the Kinect after its rousing success with it in 2012.

akoscielski3 04-08-2014 11:59 PM

Re: Team 254 Presents: CheesyVision
 
This weekend at the Windsor-Essex Great Lakes Regional, I heard of 1559 using a very similar program for their Hot Goal detection. Instead, they used cards with symbols on them, and I believe they had this all season long, though I cannot confirm. Because of this, they won the Innovation in Control Award.

It's pretty cool seeing that another team came up with a very similar way to detect the Hot Goal.

Good luck at Champs Poofs!

alex.lew 04-09-2014 12:05 AM

Re: Team 254 Presents: CheesyVision
 
2468 would have appreciated a system like this at Bayou last week. This never occurred to us - it's so simple and elegant. This will be pretty cool to show kids at demos.

RyanCahoon 04-09-2014 12:34 AM

Re: Team 254 Presents: CheesyVision
 
1 Attachment(s)
Quote:

Originally Posted by akoscielski3 (Post 1371512)
This weekend at the Windsor-Essex Great Lakes Regional, I heard of 1559 using a very similar program for their Hot Goal detection. Instead, they used cards with symbols on them, and I believe they had this all season long, though I cannot confirm. Because of this, they won the Innovation in Control Award.

We (1708) used a similar method at both NYTV (we got it working about halfway through the competition) and Pittsburgh (where we won Innovation in Control as well). We used the Fiducial module built into RoboRealm.

I've attached our RoboRealm script file for anyone who's curious. To use it, first double-click the Fiducial line in the script, then click the Train button, then click Start. You may need to change the path to the directory the fiducials are stored in if you're not on 64-bit Windows or you installed to a non-default directory. You'll also have to modify the Network Tables configuration to match your team number.

If we can get a more comprehensive paper written on it, I'll post it on CD.

Nice work, Poofs and Devil-Tech (and others). Cool to see other teams using this method as well.

billbo911 04-09-2014 01:01 AM

Re: Team 254 Presents: CheesyVision
 
1 Attachment(s)
I LOVE IT!!

This year 2073 used a USB webcam on our bot to track the balls. It was implemented to assist the driver with alignment to balls when they were obstructed from his view or just too far away to easily line up on.
We won the Innovation in Control Award at both Regionals we attended because of it. Since 254 has shared their code, we can share the LabVIEW receiver we used, to help any team that can take advantage of it.
Set the IP of the receiver to that of your DS, the port number to the one set on line 72 of the code 254 provided, and the number of bytes to read to whatever you are sending. In the case of 254's code, that should be 1.

A quick look at the block diagram will make it obvious what to do.
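
For teams that want to see the receiving side outside of LabVIEW, here is a rough one-byte listener sketch in Python. This is illustration only: the real receiver belongs on the cRIO in Java/C++/LabVIEW, and the port and bit layout here are the same assumptions as in the sender sketch earlier in the thread, not values from 254's code.

Code:

# Rough sketch of reading the one-byte updates (illustration only; port
# and bit layout are assumptions, not values from 254's code).
import socket

PORT = 1180  # must match whatever port the CheesyVision script uses

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(("", PORT))
server.listen(1)
conn, addr = server.accept()
print("CheesyVision connected from", addr)
while True:
    data = conn.recv(1)  # one byte per update
    if not data:
        break  # sender disconnected
    flags = data[0]
    left_in = bool(flags & 1)
    right_in = bool(flags & 2)
    print("left_in=%s right_in=%s" % (left_in, right_in))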

Please ask any questions here so I can publicly answer them.

Thad House 04-09-2014 01:17 AM

Re: Team 254 Presents: CheesyVision
 
Something that might be helpful to add would be SmartDashboard compatibility. That might make it a lot more accessible to teams, because the result can then be read as just a variable on the dashboard. You can get Python bindings for SmartDashboard here:
http://firstforge.wpi.edu/sf/frs/do/...ktables.2014_4

I don't have a cRIO on me, but I attached a version that uses the exact same method of communicating as we were using earlier in the season, so it should work. It just has two bool variables (right_in and left_in); the robot side can read them with the standard SmartDashboard VIs or functions, and it should be compatible with all versions.

EDIT: Attaching the file wouldn't work for some reason, so here is a SkyDrive link:
https://onedrive.live.com/redir?resi...nt=file%2c.zip
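
For teams curious what the publishing side might look like, here is a rough sketch using pynetworktables. The import path and method casing differ between pynetworktables releases, so treat every name below as an assumption and check it against the binding you downloaded.

Code:

# Rough sketch of publishing the two flags over NetworkTables instead of a
# raw socket. Names below are assumptions; they vary by pynetworktables
# release, so verify against your installed binding.
from networktables import NetworkTable

NetworkTable.setIPAddress("10.2.54.2")  # 10.TE.AM.2 for your team
NetworkTable.setClientMode()
NetworkTable.initialize()
sd = NetworkTable.getTable("SmartDashboard")

def publish(left_in, right_in):
    # The robot program reads these with its normal SmartDashboard calls.
    sd.putBoolean("left_in", left_in)
    sd.putBoolean("right_in", right_in)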

seg9585 04-09-2014 01:22 AM

Re: Team 254 Presents: CheesyVision
 
Wouldn't it be easier just to hold up a light or a large board of a specific color to discern between the two?
Basically, we are just answering a boolean question here: "Is the left goal hot?" If no, don't hold up the board/light, and assume the right goal is hot. If yes, hold up your indicator.

seg9585 04-09-2014 01:26 AM

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by billbo911 (Post 1371529)
I LOVE IT!!

This year 2073 used a USB webcam on our bot to track the balls. It was implemented to assist the driver with alignment to balls when they were obstructed from his view or just too far away to easily line up on.
We won the Innovation in Control Award at both Regionals we attended because of it.

Cool -- we actually did the same thing, and fed back to the driver station which balls were detected in the field of view, along with their distance and offset angle from our collector. It seemed to work pretty well, but we didn't use it for autonomous control out on the field, just because of the nature of defensive, high-speed gameplay. Unfortunately, we didn't win a controls award at either of our regionals. Would love to compare code, though!

MrTechCenter 04-09-2014 01:33 AM

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by seg9585 (Post 1371535)
Wouldn't it be easier just to hold up a light or large board of a specific color to discern between the two?
Basically we are just concerned about answering a boolean question here. As in: "Is the left goal hot?" If no, don't hold up the board/light and assume the right goal is hot. If yes, hold up your indicator.

This is kind of like what we did for our hot goal tracking. We lined our robot up with the middle of the high goal so that it could only see the targets for the side that we were on. The camera looked at the targets for the first second and a half of autonomous. If our side was hot first, the robot quickly drove forward and shot. If the other side was hot first, the robot moved forward very slowly, so that by the time it was in position to shoot, the goals had flipped.
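
To make that strategy concrete, here is a purely hypothetical sketch of the decision logic in Python. The helpers (side_is_hot, drive_forward, stop, shoot) are made-up placeholders, not a real robot API, and the timing constants are only approximate.

Code:

# Hypothetical illustration of the timing trick described above; helper
# functions are placeholders, not a real robot API.
import time

HOT_PERIOD = 5.0  # in 2014 each goal stays hot for about 5 seconds

def autonomous():
    start = time.time()
    our_side_hot = side_is_hot()  # camera watches our targets for ~1.5 s
    if our_side_hot:
        drive_forward(speed=1.0)  # hurry and shoot while our goal is hot
    else:
        drive_forward(speed=0.3)  # creep so the goals flip before we arrive
        while time.time() - start < HOT_PERIOD:
            time.sleep(0.02)      # wait out the remainder of the hot cycle
    stop()
    shoot()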

