#32
Re: Team 254 Presents: CheesyVision
I absolutely love the simplicity and out-of-the-box thinking in this hot goal tracking system. I was thinking about it, though, and wondered to myself how it's legal. I checked the rules, and according to G16, it is.

Good job, 254 does it again.
#33
Re: Team 254 Presents: CheesyVision
I think the rules themselves do not disallow it, but Q431 and Q446 further clarify the use of non-contact communication in the driver station during autonomous mode.
#34
Re: Team 254 Presents: CheesyVision
Completely missed out on the chance to call it "Hot or Not". Just saying.
#35
Re: Team 254 Presents: CheesyVision
I was surprised to find how much bandwidth can accrue using UDP and the DO_NOT_WAIT option in a similar test of sending two doubles every 33 ms. In short, I took out the DO_NOT_WAIT and the bandwidth went down significantly.
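For anyone who wants to reproduce that test, here's a rough sketch in Python (plain sockets, not the original setup; the destination address, port, and 10-second window are made up). The arithmetic is the interesting part: two doubles are only 16 bytes of payload, but each datagram also carries roughly 46 bytes of UDP/IP/Ethernet headers, so packet rate matters more than payload size.

Code:
# Rough sketch of the test described above: two doubles per UDP
# packet, one packet every 33 ms. Address, port, and duration are
# made up for illustration.
import socket
import struct
import time

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
dest = ("127.0.0.1", 1180)              # hypothetical destination

payload_sent = 0
start = time.time()
while time.time() - start < 10.0:       # run the test for 10 seconds
    packet = struct.pack(">dd", 1.0, 2.0)   # two doubles = 16 bytes
    sock.sendto(packet, dest)
    payload_sent += len(packet)
    time.sleep(0.033)                   # ~30 packets per second

# ~480 B/s of payload, but each packet also carries ~46 bytes of
# UDP/IP/Ethernet headers, so the on-wire rate is several times that.
print("payload bytes/sec: %.0f" % (payload_sent / 10.0))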
|
#36
Re: Team 254 Presents: CheesyVision
There are three states to track:

Neither goal hot
Left goal hot
Right goal hot

The "neither" state is useful because you can watch for the transition from neither to one of the other states to indicate that the goal has flipped. This requires 2 bits of information to discern, hence the separate left and right boxes. Other use cases may not need the third state and could use only one detection area.
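A sketch of that state logic in plain Python (my own framing and names, not 254's actual code):

Code:
# One bit per detection box; both boxes covered at once is ambiguous
# and treated as "NEITHER" here (that choice is an assumption).
def classify(left_covered, right_covered):
    if left_covered and not right_covered:
        return "LEFT"
    if right_covered and not left_covered:
        return "RIGHT"
    return "NEITHER"

def goal_flipped(prev_state, new_state):
    # The useful edge: leaving "NEITHER" marks the moment a goal
    # becomes hot.
    return prev_state == "NEITHER" and new_state != "NEITHER"

# Example: the left box becomes covered, so the left goal just went hot.
prev = classify(False, False)           # "NEITHER"
curr = classify(True, False)            # "LEFT"
assert goal_flipped(prev, curr)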
#37
Re: Team 254 Presents: CheesyVision
There are also a few Q&A answers confirming it's legal.
#39
Re: Team 254 Presents: CheesyVision
And the final wow goes to the rules breakdown of what we *can* do... just think of the possibilities... heck, why not voice commands (tell your alliance mates to be quiet, hehe).
#40
Re: Team 254 Presents: CheesyVision
Yes and no. We never feed video back to the driver. We just used the "x center" value of the ball to steer the robot whenever the driver needed assistance. One button on the steering wheel overrode the wheel position and replaced it with ((image x center - ball x center) * k), where "k" was a gain value used to bring the error to a useful level for steering the robot.

All image acquisition and processing were done on a PCDuino on board the robot. None of that network traffic crossed the WiFi network; it all stayed local to the robot.
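That override is effectively a one-term proportional controller; a minimal sketch in Python (the gain and image width are invented numbers, not the team's values):

Code:
# Minimal sketch of the proportional override described above.
K = 0.005               # gain scaling pixel error to steering output
IMAGE_X_CENTER = 160    # half of an assumed 320-pixel-wide image

def steering(wheel_position, ball_x_center, override_pressed):
    if not override_pressed or ball_x_center is None:
        return wheel_position                  # normal driver control
    error = IMAGE_X_CENTER - ball_x_center     # pixels off-center
    return K * error                           # P-term replaces the wheel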
#41
Re: Team 254 Presents: CheesyVision
Team 329 used a barcode scanner: decoding a barcode populated a field on the SmartDashboard that told the robot to shoot immediately (the goal it was looking at was now hot), or to delay 5 seconds if no barcode was scanned.

No additional bandwidth, no camera, no additional processing; simple and effective.
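As a sketch of what the dashboard side of that could look like in Python with pynetworktables (the key name, server address, and default behavior are my assumptions, not 329's actual code):

Code:
# Sketch of the dashboard-side idea; key name and address are assumed.
import sys
from networktables import NetworkTables

NetworkTables.initialize(server="10.3.29.2")   # assumed robot address
sd = NetworkTables.getTable("SmartDashboard")

# Default: no barcode seen, so the robot takes the delayed (5 s) shot.
sd.putBoolean("hotGoal", False)

# A USB barcode scanner typically acts as a keyboard, so each scan
# arrives as a line on stdin; any successful scan flags the goal hot.
for line in sys.stdin:
    if line.strip():
        sd.putBoolean("hotGoal", True)
        break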
#42
Re: Team 254 Presents: CheesyVision
I think this would be a great thread. It would be cool to know if any other teams tried it and are willing to share how they did it.
#43
Re: Team 254 Presents: CheesyVision
#pewpew
#44
Re: Team 254 Presents: CheesyVision
Thanks for sharing!
#veryvision #muchGP #wow
#45
Re: Team 254 Presents: CheesyVision
Our version was developed by our students with contributions from Greg McKaskle. Written in LabVIEW using the vision libraries. Instead of recognizing hand position, it uses a sign our drivers carry with them. The sign is initially held at an angle (neutral position) and turned to a horizontal position to shoot.
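Their implementation is LabVIEW, but the same sign-angle idea can be sketched in Python with OpenCV; the color range and angle thresholds below are guesses at one way to do it, not their values.

Code:
# Rough Python/OpenCV analogue of the sign-angle idea described above.
import cv2

def sign_says_shoot(frame_bgr):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Assume a brightly colored (here: green) sign for easy masking.
    mask = cv2.inRange(hsv, (40, 80, 80), (80, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return False
    sign = max(contours, key=cv2.contourArea)
    (_, _), (w, h), angle = cv2.minAreaRect(sign)
    if w < h:             # make "angle" the tilt of the sign's long edge
        angle += 90.0
    angle %= 180.0
    # Near horizontal means "shoot"; a clear tilt (the neutral
    # position, roughly 45 degrees) means "hold".
    return angle < 15.0 or angle > 165.0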