Chief Delphi

Chief Delphi (http://www.chiefdelphi.com/forums/index.php)
-   Programming (http://www.chiefdelphi.com/forums/forumdisplay.php?f=51)
-   -   Team 254 Presents: CheesyVision (http://www.chiefdelphi.com/forums/showthread.php?t=128639)

Tom Bottiglieri 09-04-2014 15:38

Re: Team 254 Presents: CheesyVision
 
In elims at SVR, Brian from 971 and I coded up their robot to use this app (we just let them use our backup driver station laptop to make the install process easier).

They only needed one bit of data: whether or not the goal directly in front of them was hot. They used this bit to decide whether to wait 3 or 7 seconds after auton started before shooting. We used the driver's right hand to signal this bit; the left side was a don't-care.
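The one-bit decision described above can be sketched as a pure function; the names and structure here are hypothetical, not 971's actual code:

```python
# Sketch of the one-bit hot-goal decision. Function names and the
# delay constants' placement are illustrative, not 971's real code.

def shot_delay_seconds(near_goal_is_hot):
    """Return how long to wait from the start of auton before shooting.

    If the goal straight ahead is hot, shoot early (3 s); otherwise
    wait for the goals to switch (7 s).
    """
    return 3.0 if near_goal_is_hot else 7.0

def decode_signal(left_hand_up, right_hand_up):
    """Only the driver's right hand carries information; the left is a don't-care."""
    return bool(right_hand_up)
```

The robot-side auton loop would read the bit once at the start of the period and pick its delay from it.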

PayneTrain 09-04-2014 15:51

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by Tom Bottiglieri (Post 1371742)
In elims at SVR, Brian from 971 and I coded up their robot to use this app (we just let them use our backup driver station laptop to make the install process easier).

They only needed one bit of data: whether or not the goal directly in front of them was hot. They used this bit to decide whether to wait 3 or 7 seconds after auton started before shooting. We used the driver's right hand to signal this bit; the left side was a don't-care.

If I recall correctly, it only ever missed once, and that was due to the spectacular new and exciting failure of FMS switching between teleop and auto at random, correct? This is super neat stuff.

Tom Bottiglieri 09-04-2014 15:57

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by PayneTrain (Post 1371747)
If I recall correctly, it only ever missed once, and that was due to the spectacular new and exciting failure of FMS switching between teleop and auto at random, correct? This is super neat stuff.

Yes. In QF1-1 you can clearly see both our robots double clutch in auto.

Jared 09-04-2014 17:22

Re: Team 254 Presents: CheesyVision
 
This is really cool. We were planning on using the Kinect, but we haven't had spectacular results in testing when people walk around in the background.

After playing around with it, I found it really useful to be able to lock the calibration color value: hold a green index card in front of the calibration square, save that calibration value, then hold up two cards in the boxes with both hands, so that I can drop one hand out of the way to signal.

To add the lock:

Above the while loop:
Code:

locked = 0

After the statement in the while loop beginning with cal, left, right:
Code:

if locked == 1:
    cal = lastCal
lastCal = cal

At the bottom where the keys are checked:
Code:

elif key == ord('l'):
    locked = 1
elif key == ord('u'):
    locked = 0

Pressing l locks the current calibration value, and u resets it to normal.

Jared Russell 09-04-2014 19:03

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by Jared (Post 1371808)
This is really cool. We were planning on using the Kinect, but we haven't had spectacular results in testing when people walk around in the background.

After playing around with it, I found it really useful to be able to lock the calibration color value: hold a green index card in front of the calibration square, save that calibration value, then hold up two cards in the boxes with both hands, so that I can drop one hand out of the way to signal.

To add the lock:

Above the while loop:
Code:

locked = 0

After the statement in the while loop beginning with cal, left, right:
Code:

if locked == 1:
    cal = lastCal
lastCal = cal

At the bottom where the keys are checked:
Code:

elif key == ord('l'):
    locked = 1
elif key == ord('u'):
    locked = 0

Pressing l locks the current calibration value, and u resets it to normal.

This is in fact how our original prototype worked, but we switched to continuous on-line calibration because we found that bumping the laptop could throw things off.
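The continuous on-line calibration described above can be sketched with plain NumPy (an OpenCV frame is just an H×W×3 array); the box coordinates and match tolerance below are invented for illustration, not CheesyVision's real values:

```python
import numpy as np

# Sketch of continuous (per-frame) calibration: the mean color of the
# calibration box is resampled every frame, so bumping the laptop or a
# lighting change shifts the reference along with the hand boxes.
# Box coordinates and the tolerance are invented for this example.

CAL_BOX = (0, 20, 0, 20)     # y0, y1, x0, x1 of the calibration square
LEFT_BOX = (0, 20, 40, 60)   # one of the hand boxes
TOLERANCE = 30.0             # max per-channel distance to count as a match

def mean_color(frame, box):
    y0, y1, x0, x1 = box
    return frame[y0:y1, x0:x1].astype(float).mean(axis=(0, 1))

def hand_matches(frame, box, cal_box=CAL_BOX, tol=TOLERANCE):
    """True if the box's mean color is close to this frame's calibration color."""
    cal = mean_color(frame, cal_box)  # recalibrated every frame, not saved
    return bool(np.abs(mean_color(frame, box) - cal).max() < tol)
```

A locked calibration would simply compare against a saved `cal` instead of resampling it each frame, which is the trade-off discussed above.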

billbo911 09-04-2014 19:30

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by Jared Russell (Post 1371841)
This is in fact how our original prototype worked, but we switched to continuous on-line calibration because we found that bumping the laptop could throw things off.

I imagine changes in lighting would create a problem as well. The LED strip right in front of the DS is green prior to a match, then off during autonomous. That alone would change the calibration, so a real-time cal is very important.

Jared 09-04-2014 21:46

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by Jared Russell (Post 1371841)
This is in fact how our original prototype worked, but we switched to continuous on-line calibration because we found that bumping the laptop could throw things off.

We found the same thing with our test tonight, but we saw a lot of improvement with a really bright green index card. We're now just comparing the amount of green in the two squares and ignoring the calibration one. The side with the most green in it becomes the hot side, and the other is the cold one. This seems to work the best for us.
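The relative comparison described above (ignore the calibration square, just see which box has more green) can be sketched like this; the coordinates and the green test are illustrative, not 341's actual thresholds:

```python
import numpy as np

# Sketch of the "most green wins" comparison: count green-ish pixels in
# each hand box and declare the greener side hot. Box coordinates and
# the channel thresholds are invented for this example.

LEFT_BOX = (0, 20, 0, 20)     # y0, y1, x0, x1
RIGHT_BOX = (0, 20, 40, 60)

def green_count(frame, box):
    y0, y1, x0, x1 = box
    roi = frame[y0:y1, x0:x1].astype(int)
    b, g, r = roi[..., 0], roi[..., 1], roi[..., 2]  # OpenCV frames are BGR
    return int(((g > 150) & (g > b + 50) & (g > r + 50)).sum())

def hot_side(frame):
    """Return 'left' or 'right' for whichever box holds more green."""
    left = green_count(frame, LEFT_BOX)
    right = green_count(frame, RIGHT_BOX)
    return 'left' if left >= right else 'right'
```

Because only the difference between the two boxes matters, this variant is insensitive to global lighting shifts that affect both boxes equally.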

DjScribbles 10-04-2014 01:15

Re: Team 254 Presents: CheesyVision
 
I haven't seen any talk of a C++ port, so I started a thread in the C++ sub forum here to avoid derailing this thread:

http://www.chiefdelphi.com/forums/sh...26#post1372026

My prototype code is linked in the thread, it is completely untested, but any contributions are welcome.

Thanks Poofs, very awesome implementation; looking forward to trying this out.

Thad House 10-04-2014 01:19

Re: Team 254 Presents: CheesyVision
 
1 Attachment(s)
It wouldn't let me edit my original post, but I did some testing today and got a version of this working with NetworkTables. It worked on my simulator setup and should work exactly the same on a real robot. It just uses two booleans, one for each hand. I attached the file to this post, and my post on page 1 has the link to pynetworktables for Windows.

I plan on bringing this to the regional championship in case anybody needs help with the hot goal. I really like this approach, and if we hadn't already coded the Kinect we would most likely use it.
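In the same spirit as Thad's attachment, the publishing side can be sketched like this; the key names and the table's `PutBoolean` method are assumptions modeled on the 2014-era pynetworktables API, so check his attached file for the real calls:

```python
# Sketch of pushing the two hand bits to the robot over NetworkTables.
# The key names and the PutBoolean(key, value) call are assumptions
# modeled on the 2014-era pynetworktables API, not Thad's actual code.

def hand_bits(left_up, right_up):
    """Pack the two hand states as the pair of booleans the robot reads."""
    return bool(left_up), bool(right_up)

def publish(table, left_up, right_up):
    """Write both bits to a NetworkTables-style table object."""
    left, right = hand_bits(left_up, right_up)
    table.PutBoolean("leftHandUp", left)
    table.PutBoolean("rightHandUp", right)
```

The robot side would read the same keys with the matching GetBoolean calls at the start of auton.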

DjScribbles 10-04-2014 07:09

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by Thad House
I plan on bringing this to the regional championship in case anybody needs help with the hot goal. I really like this approach, and if we hadn't already coded the Kinect we would most likely use it.

In case anybody needs a helping hand with pynetworktables, I believe this would be the dependency you need: https://github.com/robotpy/pynetworktables

Is this correct Thad?

Thad House 10-04-2014 09:30

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by DjScribbles (Post 1372056)
In case anybody needs a helping hand with pynetworktables, I believe this would be the dependency you need: https://github.com/robotpy/pynetworktables

Is this correct Thad?

That's if you want to build it from source. The link below is a Windows installer, so you don't need to install any of the build tools.
http://firstforge.wpi.edu/sf/frs/do/...ktables.2014_4

DjScribbles 10-04-2014 19:57

Re: Team 254 Presents: CheesyVision
 
I just wanted to post back and say I got the C++ port up and running (with minimal changes).

https://github.com/FirstTeamExcel/Ro...sionServer.cpp

Feel free to shamelessly steal the code, but I'd love to hear if it helps anyone out.

phurley67 11-04-2014 09:23

Re: Team 254 Presents: CheesyVision
 
Wanted to say thank you. I helped our team get it working with LabVIEW yesterday at the Michigan Championship. I made a couple of minor changes to the Python script: the Classmate laptop already flipped the image, so I removed the flip logic and fixed left/right; I also switched to UDP and slowed down the send frequency. UDP made the reconnect logic unnecessary and simplified the LabVIEW interface as well.

While there I also set 107 up with a copy of the code. I didn't touch base to see if they got everything working in a match, but I know in testing they had it working in auton (controlling the wheels for easy verification).

The whole team got a real kick out of playing with the code. Thanks again for an elegant and cheesy solution.
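The UDP switch phurley67 describes can be sketched in a few lines; the two-byte packet layout and the address handling are invented for illustration, not his actual changes:

```python
import socket
import struct

# Fire-and-forget UDP send of the two hand bits. Each datagram is
# self-contained, so a dropped packet is simply superseded by the next
# send and no reconnect logic is needed. The two-byte packet format is
# invented for this sketch; a real DS would target the robot's address.

def open_sender():
    return socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_hands(sock, left_up, right_up, addr):
    """Send one datagram: one byte per hand, 1 = hand up."""
    sock.sendto(struct.pack("BB", int(left_up), int(right_up)), addr)
```

Slowing the send frequency, as described above, just means calling `send_hands` on a longer timer instead of every frame.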

Jared Russell 11-04-2014 10:23

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by phurley67 (Post 1372507)
UDP made the reconnect logic unnecessary and simplified the LabVIEW interface as well.

Yeah, UDP is definitely a more straightforward way to do this, but we had already implemented a TCP server on our robot and decided to repurpose it.
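For comparison, the repurposed-TCP-server path can be sketched as a client that streams the same bits over a connection; the port handling and payload here are invented, not 254's actual wire protocol:

```python
import socket
import struct

# Minimal TCP client in the spirit of the repurposed server described
# above: connect to a listening server on the robot and send the two
# hand bits. The two-byte payload is invented; 254's actual wire
# format may differ.

def send_once(host, port, left_up, right_up):
    """Open a connection, send both bits, and close."""
    with socket.create_connection((host, port), timeout=2.0) as s:
        s.sendall(struct.pack("BB", int(left_up), int(right_up)))
```

Unlike the UDP variant, a dropped TCP connection has to be detected and re-established, which is exactly the reconnect logic the UDP approach avoids.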

Tom Bottiglieri 11-04-2014 11:27

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by Jared Russell (Post 1372530)
Yeah, UDP is definitely a more straightforward way to do this, but we had already implemented a TCP server on our robot and decided to repurpose it.

Also, the Squawk JVM that runs on the cRIO doesn't support UDP listen sockets. :confused:

