Chief Delphi

Chief Delphi (http://www.chiefdelphi.com/forums/index.php)
-   General Forum (http://www.chiefdelphi.com/forums/forumdisplay.php?f=16)
-   -   Update #19 (http://www.chiefdelphi.com/forums/showthread.php?t=56031)

Billfred 23-03-2007 10:34

Re: Update #19
 
Quote:

Originally Posted by FreedomForce (Post 603270)
if you want us to work on our autonomous, why not just let us stack? (reference back to Update 18)

Perhaps I'm missing something here--what does stacking have to do with autonomous? Unless, of course, you're thinking about having stacked robots function as one unit in autonomous...but I would classify such a strategy as highly unlikely.

As I see it, it's an hour. Furthermore, it's an hour that you can't use for working on the robot in other areas--like everything else, it's a tradeoff. It won't in and of itself get any teams scoring ringers, unless they've got some magic fingers on the keyboard.

Taylor 23-03-2007 11:02

Re: Update #19
 
For those arguing (mostly satirically) that week 4, 5, and 6 regionals have an advantage over weeks 1, 2, and 3, remember this: all robots in weeks 1-3 had the same (dis)advantage of not having that hour for calibration, and all robots in weeks 4-6 have the same (dis)advantage of having it. Each regional was fair in and of itself; the winning alliances earned their wins. No robot in a regional had an advantage over another robot in that same regional due to the hour of calibration.

Likewise, no 4,5,6 team will have an advantage over a 1,2,3 team because the Atlanta field lighting will be different from all regionals. Teams will still have to calibrate to Atlanta conditions, no matter what or how many regionals they attended.
Similarly, I don't think rules imposed by the head ref(s) at any particular regional had an adverse effect on the outcome of the event. All teams in a regional had to play by the same rules (misguided though they may be). Only when one team experiences intentional favoritism over another due to intentional inconsistency of judging calls does one team profit while another suffers.

[for what it's worth] I think FIRST has done a great job of showing they acknowledge they can make mistakes, and that they are willing to analyze and, where possible, fix them. Let's just hope we don't get Update #26 during the championships.

marccenter 23-03-2007 12:39

Re: Update #19
 
I just wish this rule had been in effect at the Detroit Regional. I specifically went up to the head judge and asked if I could get on the practice field after Thursday's 5 pm matches were completed to calibrate the camera, and he said no, because of FIRST rules. I am glad that FIRST has allowed this change for all of us working hard on trying to make the camera work. Unfortunately, that was my last tournament to work on the camera. Maybe next year! :D

JohnBoucher 28-03-2007 14:22

Re: Update #19
 
Did this testing period help anyone?

Who went onto the field and tested values and how different were they from what you had?

Any idea how this will play out in Atlanta?

Qbranch 28-03-2007 15:20

Re: Update #19
 
Yeah, cameras, sure, but what about other sensor calibration?

I'm using ultrasonic radar, and it depends on the distance between the upright poles on the rack, which varies a little from rack to rack.... I usually have to guess, but being able to actually measure would be very helpful.

If FIRST won't let me calibrate my sensors.... then I guess they only allow one way of doing things and aren't open to creative guidance systems....

Aaah, I'm sure they'll let me; they're all for new stuff anyhow.

-q

Alekat 28-03-2007 17:12

Re: Update #19
 
This seems like a good effort to help figure out the mysterious soul that has been the camera.

It's too bad this calibration can't take place before the practice matches; that would make more sense, but I'm sure that's simply due to where the time constraints are.

heydowns 29-03-2007 10:09

Re: Update #19
 
Quote:

Originally Posted by JohnBoucher (Post 606676)
Did this testing period help anyone?

Who went onto the field and tested values and how different were they from what you had?

Any idea how this will play out in Atlanta?

We had no luck locking onto the vision target on Thursday of Boston regional. We were using the default values from Kevin Watson's camera code, which had worked fine for us last year and during build season this year.

Within a few minutes of being on the field, we realized the source of the problem: the venue for the Boston regional is an ice hockey arena, with electronic banners around the arena. The banners are at a height such that when you project the camera's line of sight beyond the green vision target, the banners are in direct view.

They were running the banners with FIRST-related messages, most of which had a bright white background. This background had the same green component as the vision target (much like you get with the fluorescent lights commonly found in, say, high school hallways :))

Most of the time this resulted in really big bounding areas for the "tracked" object but low pixel counts, and therefore very low confidence values. This caused the camera servo-handling code to keep scanning the venue for a better match, which it never found, since it was bombarded with large-area matches.

One match, though, we actually locked completely on the banner and the robot drove down along the side of the rack, trying to get "close enough" to score :)

We corrected the problem by clamping down the tolerance of our YCbCr values (R, G, B min/max) for the green light to +/- 10, I believe. This resulted in fewer pixel matches over the actual vision target, but we had far fewer false positives than before. Once this was in place, the servo tracking code had high enough confidence to follow the actual green vision target.

This was possible only because we had the on-field time.

After getting it working, we helped out 2 other teams having similar problems. AFAIK, lowering the tolerance worked for them, too. We saw at least 4 other teams also taking advantage of this on-field opportunity.

For those who have trouble using the LabVIEW app for camera calibration, or simply don't have LabVIEW available, we used the CMUCam2GUI application from the creators of the CMUCam (Google it) with good results.

Mike Betts 29-03-2007 22:35

Re: Update #19
 
Quote:

Originally Posted by heydowns (Post 607172)
...This was possible only because we had the on-field time...

Please be sure that you feed this info back to FIRST via your Regional Committee Chair. FIRST needs to know that it was useful...

Regards,

Mike

Adam Y. 30-03-2007 07:19

Re: Update #19
 
Quote:

Originally Posted by Qbranch (Post 606709)
Yeah cameras sure but what about other sensor calibration?

I'm using ultrasonic radar and depends on the distance between the upright poles on the rack which varies a little rack to rack.... usually have to guess but being able to actually measure would be very helpful.

If FIRST won't let me calibrate my sensors.... then i guess they only allow one way of doing things and aren't open to creative guidance systems....

aaah i'm sure they'll let me, theyre all for new stuff anyhow.

-q

You can't successfully calibrate ultrasonic sensors. In fact, they tend to exhibit failure modes that can't be fixed.

MikeDubreuil 30-03-2007 09:15

Re: Update #19
 
Quote:

Originally Posted by heydowns (Post 607172)
This was possible only because we had the on-field time.

I'm on the Boston Committee and this information is very helpful, thank you. We never thought the sponsor banners would interfere with the vision system. I will make a note of your experience. If the cameras are used next year I will see what I can do about getting the teams on the field earlier to calibrate their cameras.

