#46
Re: Team 254 Presents: CheesyVision
In elims at SVR, Brian from 971 and I coded up their robot to use this app (we just let them use our backup driver station laptop to make the install process easier).
They only needed one bit of data: whether or not the goal directly in front of them was hot. They used this bit to decide whether to wait 3 or 7 seconds after auton started before shooting. We used the driver's right hand to signal this bit; the left side was a don't-care.
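For anyone wanting to do something similar, here is a minimal sketch of that one-bit decision on the robot side, in Python to match the rest of this thread. The robot object and its get_right_hand() and shoot() helpers are placeholders for however your code exposes the CheesyVision bit and your shooter; only the 3/7 second values come from the post above.

Code:
import time

SHOT_DELAY_HOT = 3.0   # goal in front of us starts hot: shoot early
SHOT_DELAY_COLD = 7.0  # goal goes hot after the switch: wait it out

def autonomous(robot):
    # Sample the CheesyVision bit once at the start of auton.
    delay = SHOT_DELAY_HOT if robot.get_right_hand() else SHOT_DELAY_COLD
    start = time.time()
    while time.time() - start < delay:
        time.sleep(0.01)  # wait until the chosen window opens
    robot.shoot()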
#48
Re: Team 254 Presents: CheesyVision
Yes. In QF1-1 you can clearly see both our robots double clutch in auto.
#49
Re: Team 254 Presents: CheesyVision
This is really cool. We were planning on using the Kinect, but we haven't had spectacular results in testing when we try it with people walking around in the background.
After playing around with it, I found it really useful to be able to lock the calibration color value: I can hold a green index card in front of the calibration square, save that calibration value, then use both hands to hold up two cards in the boxes, and drop one hand out of the way to signal. To add the lock, first initialize the flag above the while loop:

Code:
locked = 0

Then, inside the loop, right after cal is sampled from the calibration square:

Code:
if locked == 1:
    cal = lastCal   # reuse the saved calibration value while locked
lastCal = cal       # otherwise remember the most recent sample

And add two cases to the existing key-handling chain:

Code:
elif key == ord('l'):   # press 'l' to lock the current calibration
    locked = 1
elif key == ord('u'):   # press 'u' to unlock and recalibrate live
    locked = 0
#51
Re: Team 254 Presents: CheesyVision
I imagine changes in lighting would create a problem as well. The LED strip right in front of the DS glows green prior to a match, then turns off during autonomous. That alone would change the calibration, so a real-time cal is very important.
#52
Re: Team 254 Presents: CheesyVision
We found the same thing in our test tonight, but we saw a lot of improvement with a really bright green index card. We're now just comparing the amount of green in the two hand squares and ignoring the calibration square entirely. The side with the most green becomes the hot side, and the other is the cold one. This seems to work best for us.
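For reference, a minimal sketch of that comparison, assuming the two hand boxes have already been cropped out of the webcam frame as BGR numpy arrays (the function and variable names here are just for illustration):

Code:
import numpy as np

def greenness(roi):
    # Score a region by green channel minus the average of blue and red,
    # so bright white light doesn't read as green.
    roi = roi.astype(np.float32)
    b, g, r = roi[:, :, 0], roi[:, :, 1], roi[:, :, 2]
    return float(np.mean(g - (b + r) / 2.0))

def pick_hot_side(left_roi, right_roi):
    # Whichever box holds the bright green card is the hot side.
    return 'left' if greenness(left_roi) > greenness(right_roi) else 'right'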
#53
Re: Team 254 Presents: CheesyVision
I haven't seen any talk of a C++ port, so I started a thread in the C++ subforum to avoid derailing this one:
http://www.chiefdelphi.com/forums/sh...26#post1372026

My prototype code is linked in that thread. It is completely untested, but any contributions are welcome. Thanks Poofs, very awesome implementation; looking forward to trying this out.
#54
Re: Team 254 Presents: CheesyVision
It wouldn't let me edit my original post, but I did some testing today and got a version of this working that uses NetworkTables. It worked on my simulator setup and should work exactly the same on a real robot. It just uses two booleans, one for each hand. I attached the file to this post, and my post on page 1 has the link to pynetworktables for Windows.
I plan on bringing this to the regional championship in case anybody needs help with the hot goal. I really like this approach, and if we hadn't already coded the Kinect we would most likely use it.

Last edited by Thad House : 10-04-2014 at 01:41.
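For anyone who hasn't used pynetworktables before, the client side is only a few lines. This is a rough sketch against a later pynetworktables API, not the attached file; the table name, key names, and robot address are all made up for illustration:

Code:
from networktables import NetworkTables

# Point the client at the robot (use your own team's address).
NetworkTables.initialize(server='10.2.54.2')
table = NetworkTables.getTable('CheesyVision')

def send_hands(left_up, right_up):
    # One boolean per hand, read by the robot program each auton loop.
    table.putBoolean('left_hand', bool(left_up))
    table.putBoolean('right_hand', bool(right_up))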
#55
Re: Team 254 Presents: CheesyVision
Quote:
I plan on bringing this to the regional championship in case anybody needs help with the hot goal. I really like this approach, and if we hadn't already coded the Kinect we would most likely use it.

In case anybody needs a helping hand with pynetworktables, I believe this is the dependency you need: https://github.com/robotpy/pynetworktables
Is this correct, Thad?
#56
Re: Team 254 Presents: CheesyVision
Quote:
In case anybody needs a helping hand with pynetworktables, I believe this is the dependency you need: https://github.com/robotpy/pynetworktables
Is this correct, Thad?

http://firstforge.wpi.edu/sf/frs/do/...ktables.2014_4
#57
Re: Team 254 Presents: CheesyVision
I just wanted to post back and say I got the C++ port up and running (with minimal changes):
https://github.com/FirstTeamExcel/Ro...sionServer.cpp

Feel free to shamelessly steal the code, but I'd love to hear if it helps anyone out.
#58
Re: Team 254 Presents: CheesyVision
Wanted to say thank you. I helped our team get this working with LabVIEW yesterday at the Michigan Championship. I made a few minor changes to the Python script: the Classmate laptop already flipped the image, so I removed the flip logic and fixed left/right; I switched to UDP; and I slowed down the send frequency. UDP made the reconnect logic unnecessary and simplified the LabVIEW interface as well.
While there I also helped 107 with a copy of the code, and while I did not touch base to see if they got everything working, I know in testing they also had it working in auton (controlling the wheels for easy testing). The whole team got a real kick out of playing with the code. Thanks again for an elegant and cheesy solution.
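For anyone making the same swap, here is a minimal sketch of the UDP sender side; the robot address, port, and packet format are assumptions for illustration, not what was actually run at MSC:

Code:
import socket

ROBOT = ('10.2.54.2', 1180)  # placeholder address and port
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_bits(left_up, right_up):
    # One tiny ASCII datagram per update; fire-and-forget, so there is
    # no connection to re-establish if the robot reboots mid-match.
    packet = ('%d%d' % (left_up, right_up)).encode('ascii')
    sock.sendto(packet, ROBOT)

# In the main loop, call send_bits(...) at a reduced rate,
# e.g. with time.sleep(0.1) for roughly 10 Hz.

The trade-off is that lost packets are simply dropped, which is fine here since the same two bits are re-sent continuously.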
#59
Re: Team 254 Presents: CheesyVision
Yeah, UDP is definitely a more straightforward way to do this, but we had already implemented a TCP server on our robot and decided to repurpose it.