Vision tracking: Did they get it right?

During the kickoff this morning when Dave Lavery spoke about the vision tracking, he seemed optimistic. He said that he thinks “they finally got it right this time.” It’s obviously too early to tell, but what do you think? Will the vision tracking aspect of Breakaway be a disappointment (like past games), or did they get it right this time?

No way to tell until we get our hands on the system. I assure you I will be begging to try it out on Monday :stuck_out_tongue:

There have been at least two other times when someone has sworn that this is the year for vision tracking…

Honestly, I don’t see much of an advantage to it for auton, unless you can get a ball, line up on the target, and make the shot 100% of the time.

It’s going to be essential for teleop, though. There is no way a driver can manually make a shot without a camera feed on their netbook/driver station. That’s where vision will be awesome!

While there is a use for it in autonomous, it seems as if you can place the balls where you want, which allows the use of encoders and some dead reckoning to put a shot on target (you know your position, the ball’s position, and the goal’s position; it’s all a matter of math, distances, and timing from there). With regard to teleop… yes, having a live video feed is useful, but that’s not necessarily vision tracking, just a camera feed. And since your goal is on your side of the field, homing in on the target is not that big of a deal in my opinion. It might be more useful if you’re looking to score in your opponent’s goal…
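To put some numbers on that (a rough sketch with made-up coordinates, wheel size, and encoder counts, not anything out of the manual), the whole auton calculation is a couple of lines of trigonometry plus a feet-to-ticks conversion:

[code]
// Back-of-the-envelope dead-reckoning sketch. The positions, wheel diameter,
// and ticks-per-revolution below are made-up example numbers, not field specs.
#include <cmath>
#include <cstdio>

const double kPi = 3.14159265358979;

struct Point { double x; double y; };  // field position, in feet

// Heading (degrees) from 'from' toward 'to', measured from the +x axis.
double HeadingTo(const Point& from, const Point& to) {
    return std::atan2(to.y - from.y, to.x - from.x) * 180.0 / kPi;
}

// Straight-line distance between two field points, in feet.
double DistanceTo(const Point& from, const Point& to) {
    const double dx = to.x - from.x;
    const double dy = to.y - from.y;
    return std::sqrt(dx * dx + dy * dy);
}

// Convert a distance in feet into encoder ticks for a given wheel.
int FeetToTicks(double feet, double wheelDiameterFt, int ticksPerRev) {
    const double circumference = kPi * wheelDiameterFt;
    return static_cast<int>(feet / circumference * ticksPerRev + 0.5);
}

int main() {
    // Hypothetical starting layout: robot, nearest ball, and your goal.
    Point robot = {2.0, 10.0};
    Point ball  = {6.0, 12.0};
    Point goal  = {0.0, 13.5};

    // Leg 1: turn toward the ball and drive to it by encoder ticks.
    const double driveFt = DistanceTo(robot, ball);
    std::printf("Turn to %.0f deg, drive %d ticks (%.1f ft) to the ball\n",
                HeadingTo(robot, ball),
                FeetToTicks(driveFt, 0.5, 360),  // 6" wheel, 360 ticks/rev assumed
                driveFt);

    // Leg 2: turn toward the goal and take the shot.
    std::printf("Turn to %.0f deg and shoot from %.1f ft\n",
                HeadingTo(ball, goal), DistanceTo(ball, goal));
    return 0;
}
[/code]

Turn, drive the computed ticks, turn again, shoot; no camera required.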

The reason auton scoring is beneficial, in my opinion, is that the goals will be completely undefended. You might not even need camera tracking to score in them.

Can’t say if they got it right, but can say we want to use it this year, IF we can get it to work.

Is there going to be some kind of prepared program we can download?
If so, can someone please post a link?

Not necessarily. A defensive bot’s autonomous mode may be to block a goal.

Once they have proven that they can lock onto the target, and they are giving us some of the program, I would say yes, they have without question gotten it a lot better than before.

It’s probably inside the WindRiver update (for the C++ people). Excuse me while I go check for zip files :stuck_out_tongue:

<G28> says otherwise:

Can’t cross the centerline, so you can’t block your opponents’ shots.

Highly unlikely, per <G28>:

[quote=Game Manual Section 7, <G28>]AUTONOMOUS PERIOD ROBOT Movement - During the AUTONOMOUS PERIOD, a ROBOT cannot completely cross the CENTER LINE. Violation: Two PENALTIES; plus two PENALTIES and a YELLOW CARD if a BALL or ROBOT is contacted after completely crossing the CENTER LINE, and two additional PENALTIES for each additional BALL or ROBOT contacted.[/quote]

Um, there’s been some kind of vision demo nearly every year, and they’ve given us enough code to get started each time. I particularly remember impressive-looking demos in both 2005 and 2006. 2007 featured examples of tracking two lights (2008 dumped the camera in favor of the “Robocoach”), and if I remember correctly, 2009 had a demo of the new Axis camera tracking trailers.

I may be wrong, but I think 9 out of 10 goals will be shot from within 10 feet of the goal. Given that range and the fact that the goal is right next to the drivers, camera-based tracking isn’t going to help much.

What I meant was that it’s pointless to use the camera to score in auton, not that it’s pointless altogether.

I totally agree; you could use encoders to score really effectively.

Found the code for targeting. According to the comments, lighting should have no effect on it. For you tech-savvy people (isn’t that everyone here?), download the WindRiver updater, rename it to a .zip, and unzip it. In that folder, go to vxworks-6.3\target\src\demo\2010ImageDemo. Target.cpp is the source code for the targeting system.

Looks like they just went with converting the color image to monochrome and looking for changes there. They even made detecting the circle an API call.
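For anyone curious, here’s the rough shape of that pipeline. This is a sketch of the idea, not the actual contents of Target.cpp; DetectCircles() is just a stand-in name for whatever library call the demo uses.

[code]
// Rough outline of the approach described above, not the actual Target.cpp
// source. DetectCircles() is a stub standing in for the vision library call;
// its name and signature are assumptions for illustration only.
#include <vector>

struct Circle { double centerX; double centerY; double radius; };

// Stand-in for the library's circle detector; in the demo this is a single
// API call, and the real work happens inside the library.
std::vector<Circle> DetectCircles(const std::vector<unsigned char>& /*mono*/,
                                  int /*width*/, int /*height*/) {
    return std::vector<Circle>();
}

// Collapse an interleaved 8-bit RGB frame to a single luminance channel.
// Only relative brightness matters from here on, not any particular color,
// which is presumably why lighting has little effect.
std::vector<unsigned char> ToMonochrome(const std::vector<unsigned char>& rgb,
                                        int width, int height) {
    std::vector<unsigned char> mono(width * height);
    for (int i = 0; i < width * height; ++i) {
        const double r = rgb[3 * i], g = rgb[3 * i + 1], b = rgb[3 * i + 2];
        mono[i] = static_cast<unsigned char>(0.299 * r + 0.587 * g + 0.114 * b);
    }
    return mono;
}

// The whole "targeting" step then reduces to two calls.
std::vector<Circle> FindTargets(const std::vector<unsigned char>& rgbFrame,
                                int width, int height) {
    return DetectCircles(ToMonochrome(rgbFrame, width, height), width, height);
}
[/code]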

Oh, and indubitably you can find the LabVIEW update here.

thank you

Any way I can find the Java camera code? I saw it once somewhere, but can’t remember where. Thanks in advance!

http://first.wpi.edu/FRC/frcjava.html

Nevermind.

Nope. Penalty for crossing the white line and penalty for each game piece the robot touched.

Actually, it’s a double penalty for crossing, double penalty + yellow for the first ball/robot hit after crossing, and double penalty per contact after that. That was a nice understatement, Chris.

But there is a goal on your side of the line that you could block. :cool:

Is there any starter camera tracking code for LabVIEW yet? I looked through the examples, but those are only for color recognition, not for shapes.