Chief Delphi

Chief Delphi (http://www.chiefdelphi.com/forums/index.php)
-   Programming (http://www.chiefdelphi.com/forums/forumdisplay.php?f=51)
-   -   Team 254 Presents: CheesyVision (http://www.chiefdelphi.com/forums/showthread.php?t=128639)

alexander.h 21-04-2014 19:47

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by billbo911 (Post 1377477)
LabView is not involved except for driving the robot.
We used a USB webcam attached to a PCDuino. It would track the balls based on color and shape. We also had a switch on the DS that allowed us to select whether to track Blue or Red.
We only used the "x axis" center of the ball to assist the driver with aligning to the ball. We never used the distance to the ball.

We feed the "x" value to LabView, which uses it to help the driver align.

OK ... so for Cheesy Vision, there is almost nothing going on in LabView. As for the ball tracker, couldn't I just get the alliance colour and send that as the colour to recognize, instead of using a switch on the Driver Station?
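For illustration, here is a minimal sketch (not from any of the posts above) of what that could look like on the coprocessor side, assuming the robot or DS forwards the alliance colour as a plain "red"/"blue" string over TCP; the port number and HSV ranges are placeholder assumptions that would need tuning:

Code:

import socket
import cv2
import numpy as np

# Placeholder HSV ranges; real values must be tuned for the game balls.
HSV_RANGES = {
    "red":  (np.array([0, 120, 70]),   np.array([10, 255, 255])),
    "blue": (np.array([100, 120, 70]), np.array([130, 255, 255])),
}

def wait_for_alliance(port=1182):
    """Block until the robot/DS sends 'red' or 'blue' as a plain string."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("", port))
    server.listen(1)
    conn, _ = server.accept()
    alliance = conn.recv(16).decode().strip().lower()
    conn.close()
    server.close()
    return alliance if alliance in HSV_RANGES else "red"

def track_ball(alliance):
    """Report the ball's x-axis centre, as described in the quoted post."""
    lo, hi = HSV_RANGES[alliance]
    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            continue
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, lo, hi)
        m = cv2.moments(mask)
        if m["m00"] > 0:
            x_center = m["m10"] / m["m00"]   # the "x axis" centre of the ball
            print(x_center)                  # would be forwarded on to LabView

if __name__ == "__main__":
    track_ball(wait_for_alliance())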

JamesTerm 22-04-2014 12:51

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by alexander.h (Post 1377485)
As for the ball tracker, couldn't I just get the alliance colour and send that as the colour to recognize instead of using a switch on the Driver Station?

There has been some talk about that... on this thread.


BTW... I like the .h in your name... I thought you were a C++ programmer. :)

alexander.h 22-04-2014 18:13

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by JamesTerm (Post 1377758)
There has been some talk about that... on this thread.

Thanks for the link!

Quote:

Originally Posted by JamesTerm (Post 1377758)
BTW... I like the .h in your name... I thought you were a C++ programmer. :)

Ha ha ha ... No, it just symbolizes the initial of my last name: Hassler.

Skragnoth 23-04-2014 21:34

Re: Team 254 Presents: CheesyVision
 
Has anyone had any luck using CheesyVision on the playing fields at champs? It works perfectly for us in the pit, but it is not able to connect to the robot on the Newton playing field or the Newton practice field. We submitted a question to the FTA asking whether they have port 1180 blocked, but they haven't gotten back to us yet.
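As an aside, one quick way to check that from the DS laptop is simply to attempt a TCP connection to the robot on port 1180. A minimal sketch follows; the robot address is a placeholder for your own 10.TE.AM.2 address, and a successful connection only proves the port is reachable, not that the robot-side receiver is behaving:

Code:

import socket

ROBOT_IP = "10.2.54.2"   # placeholder: your robot's 10.TE.AM.2 address
PORT = 1180

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.settimeout(3.0)
try:
    s.connect((ROBOT_IP, PORT))
    print("Port %d reachable - CheesyVision should be able to connect" % PORT)
except socket.timeout:
    print("Timed out - port %d may be blocked between the DS and the robot" % PORT)
except socket.error as err:
    print("Connection failed (%s) - is the robot-side listener running?" % err)
finally:
    s.close()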

Gregor 23-04-2014 23:33

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by Skragnoth (Post 1378130)
Has anyone had any luck using CheesyVision on the playing fields at champs? It works perfectly for us in the pit, but it is not able to connect to the robot on the Newton playing field or the Newton practice field. We submitted a question to the FTA asking whether they have port 1180 blocked, but they haven't gotten back to us yet.

We ran it fine on our practice match on Galileo field.

DjScribbles 24-04-2014 12:56

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by Skragnoth (Post 1378130)
Has anyone had any luck using CheesyVision on the playing fields at champs? It works perfectly for us in the pit, but it is not able to connect to the robot on the Newton playing field or the Newton practice field. We submitted a question to the FTA asking whether they have port 1180 blocked, but they haven't gotten back to us yet.

If you (or anyone else) are using my original C++ port, there were a few issues in the original code that could be causing your problems. See this post for the details: http://www.chiefdelphi.com/forums/sh...71&postcount=5

Skragnoth 26-04-2014 06:14

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by DjScribbles (Post 1378260)
If you (or anyone else) are using my original C++ port, there were a few issues in the original code that could be causing your problems. See this post for the details: http://www.chiefdelphi.com/forums/sh...71&postcount=5

It turns out that port 1180 was being blocked on the Newton field. They unblocked it, and CheesyVision worked all day Thursday and the first half of Friday. Then, after lunch on Friday, it was unable to connect for 2 of our matches. We asked the FTA about it and they wouldn't admit it was their fault, but it magically worked in our next match. :)

JamesTerm 26-04-2014 21:15

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by JamesTerm (Post 1371645)
kudos to you guys, and to quote one of our engineers... "That is a great idea and they are real champs for sharing".

Haha Robert... they are real Champs. :)

billbo911 19-08-2014 15:18

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by billbo911 (Post 1376918)
OK, Here is a LabView TCP Receiver.

I can't believe how easy it was!

All my struggles were because I had a minor misunderstanding of how my IDE (Notepad++) was interacting with the CheesyVision code, and because the security settings in Win 8 were preventing me from testing this receiver. The CheesyVision code is solid. Now this vi works just as reliably with it.

After several days of testing, we found that even this code would not work reliably. The cRio would max out its CPU utilization and crash immediately after it started receiving data from CheesyVision. No matter how we modified the vi, the cRio just couldn't handle the stream of data being sent to it.
Previously, all our testing and development had been done between two quad-core PCs, so horsepower was not an issue. Once we deployed to the cRio, we found it simply could not keep up with the amount of data being streamed to it.

So, back to the drawing board.

The approach we took then was to rewrite CheesyVision so that it would keep sampling the image data continuously, but not send the current status to the cRio until the cRio requested it. BTW, this is exactly the same approach we used with the 3X award-winning DoubleVision.
This has been tested (THANK YOU 3250!!) and proved to be stable on the cRio when running under LabView.

https://github.com/EagleForce/Cheesy-Eagle-Vision

Please follow the instructions and link on this page to install CheesyVision. Then once you are able to run CheesyVision on your DS, download the modified version of CheesyEagleVision from our GitHub to your DS. Simply run it instead of CheesyVision. No modifications to CheesyEagleVision are needed.
Also download the CheesyEagle Receiver.vi. Place the vi into your code wherever you would like it to execute from. The only modifications needed to the vi are to set the IP of your DS in it and determine how you would like to use the data output from it.

This has not been tested on the new RoboRio yet. So, if one of the Beta teams can do that, it would be greatly appreciated!
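For readers who don't use LabView, here is a minimal Python sketch of the "send only when the cRio asks" pattern described above. It is not the actual CheesyEagleVision code (see the GitHub link for that), and the port, one-byte-per-hand message format, and sampling stub are illustrative assumptions:

Code:

import socket
import threading
import time

latest_status = b"LR"               # placeholder status, e.g. 'L'/'l' and 'R'/'r' per hand
lock = threading.Lock()

def sample_webcam():
    """Stand-in for the real CheesyVision image loop (webcam + colour boxes)."""
    global latest_status
    while True:
        new_status = b"LR"          # ...replace with the real left/right hand check...
        with lock:
            latest_status = new_status
        time.sleep(0.05)            # keep sampling continuously

def serve_requests(port=1180):
    """Only reply with the current status when the cRio asks for it."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("", port))
    server.listen(1)
    while True:
        conn, _ = server.accept()
        try:
            while conn.recv(1):     # any byte from the cRio means "send the status now"
                with lock:
                    conn.sendall(latest_status)
        except socket.error:
            pass
        finally:
            conn.close()

sampler = threading.Thread(target=sample_webcam)
sampler.daemon = True
sampler.start()
serve_requests()

Polling this way flips the flow control: the cRio decides how often data arrives, so a slow controller can never be flooded by a fast PC.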

thinker&planner 19-08-2014 21:16

Re: Team 254 Presents: CheesyVision
 
Don't get me wrong, I think this is an amazing "thinking out of the robot" solution, but there is a part of me that just doesn't think this is really in the spirit of the whole "Auto" period. You don't even need vision tracking, one of the things that can distinguish an amazing robot from a better-than-average robot. (Yes, vision tracking usually makes robots amazing in teleop too, but it (was) almost the only way to distinguish "hot" targets in auto)
As far as next year, I would imagine that the rules will either have a "hybrid" period or not allow this type of thing.
Once again, great job! I can't say enough how I admire solutions (and loopholes) like this.

MrTechCenter 19-08-2014 22:37

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by thinker&planner (Post 1397038)
Don't get me wrong, I think this is an amazing "thinking out of the robot" solution, but there is a part of me that just doesn't think this is really in the spirit of the whole "Auto" period. You don't even need vision tracking, one of the things that can distinguish an amazing robot from a better-than-average robot. (Yes, vision tracking usually makes robots amazing in teleop too, but it (was) almost the only way to distinguish "hot" targets in auto)
As far as next year, I would imagine that the rules will either have a "hybrid" period or not allow this type of thing.
Once again, great job! I can't say enough how I admire solutions (and loopholes) like this.

They did have hybrid mode back in 2012, and hardly anybody used it. Also, it's worth noting that one of the main reasons CheesyVision was created was that the hot goal lights and retroreflective targets were not properly synched with the field timer, which caused hot-goal detection using sensors or cameras to fail (this was the case for both us and 254 at CVR). FIRST said that a software fix would be implemented after week two to correct this issue, although I don't believe it was ever truly fixed.

AdamHeard 20-08-2014 01:27

Re: Team 254 Presents: CheesyVision
 
It's not a loophole... FIRST clarified in the Q&A that such things were legal. They were legal in 2013 as well.


Quote:

Originally Posted by thinker&planner (Post 1397038)
Don't get me wrong, I think this is an amazing "thinking out of the robot" solution, but there is a part of me that just doesn't think this is really in the spirit of the whole "Auto" period. You don't even need vision tracking, one of the things that can distinguish an amazing robot from a better-than-average robot. (Yes, vision tracking usually makes robots amazing in teleop too, but it (was) almost the only way to distinguish "hot" targets in auto)
As far as next year, I would imagine that the rules will either have a "hybrid" period or not allow this type of thing.
Once again, great job! I can't say enough how I admire solutions (and loopholes) like this.


JamesTerm 20-08-2014 09:48

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by thinker&planner (Post 1397038)
Don't get me wrong, I think this is an amazing "thinking out of the robot" solution, but there is a part of me that just doesn't think this is really in the spirit of the whole "Auto" period. You don't even need vision tracking, one of the things that can distinguish an amazing robot from a better-than-average robot. (Yes, vision tracking usually makes robots amazing in teleop too, but it (was) almost the only way to distinguish "hot" targets in auto)
As far as next year, I would imagine that the rules will either have a "hybrid" period or not allow this type of thing.
Once again, great job! I can't say enough how I admire solutions (and loopholes) like this.


I have to agree with thinker and planner as I felt the spirit of the game was to encourage and promote vision processing (the way it was intended with the reflectors) to teach students this awesome technology. I should clarify "spirit of game" should not be associated with terms like loophole or whether or not it is legal. As for the issue with the delay... I believe there are at least a few teams that would account for this as it is trivial in code to solve... even if it meant less time to finish. After all... the art of writing a control system is error management and if you write one... you know what I'm talking about.

I really hope future games will indeed target the need for vision processing the way it has been laid out (i.e. have a good ROI in points to use it)... Vision processing is an arduous task and I'd love to see more teams master it!

billbo911 20-08-2014 10:31

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by thinker&planner (Post 1397038)
Don't get me wrong, I think this is an amazing "thinking out of the robot" solution, but there is a part of me that just doesn't think this is really in the spirit of the whole "Auto" period. You don't even need vision tracking, one of the things that can distinguish an amazing robot from a better-than-average robot. (Yes, vision tracking usually makes robots amazing in teleop too, but it (was) almost the only way to distinguish "hot" targets in auto)
As far as next year, I would imagine that the rules will either have a "hybrid" period or not allow this type of thing.
Once again, great job! I can't say enough how I admire solutions (and loopholes) like this.

Quote:

Originally Posted by JamesTerm (Post 1397099)
I have to agree with thinker and planner as I felt the spirit of the game was to encourage and promote vision processing (the way it was intended with the reflectors) to teach students this awesome technology. I should clarify "spirit of game" should not be associated with terms like loophole or whether or not it is legal. As for the issue with the delay... I believe there are at least a few teams that would account for this as it is trivial in code to solve... even if it meant less time to finish. After all... the art of writing a control system is error management and if you write one... you know what I'm talking about.

Thank you both for sharing your opinions on this. While I may not agree 100%, I truly see your point and respect your position.
All I did with this modification was to close the loop on its use under LabView.

Quote:

Originally Posted by JamesTerm (Post 1397099)
I really hope future games will indeed target the need for vision processing the way it has been laid out (i.e. have a good ROI in points to use it)... Vision processing is an arduous task and I'd love to see more teams master it!

James,
THIS!!
I could not have said it any better. If games could be made that have a large ROI in points for use of Vision, I would be thrilled!

Cory 20-08-2014 11:10

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by JamesTerm (Post 1397099)
As for the issue with the delay... I believe there are at least a few teams that would account for this as it is trivial in code to solve... even if it meant less time to finish. After all... the art of writing a control system is error management and if you write one... you know what I'm talking about.

We did exactly what you said. The delay was unacceptable and was something FIRST was either unwilling or unable to fix. It was impossible for a 3 ball auto to work consistently under those conditions. That is the only reason we made Cheesy Vision.

JamesTerm 20-08-2014 13:05

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by Cory (Post 1397118)
We did exactly what you said. The delay was unacceptable and was something FIRST was either unwilling or unable to fix. It was impossible for a 3 ball auto to work consistently under those conditions. That is the only reason we made Cheesy Vision.


Ah, that is good to know... I'm glad you guys went down this path first! We had a 3-ball auton in code, but could not fit it within 10 seconds... so kudos for being able to do that. (Our winch took too long to load.)

FWIW: They could easily have added more time to account for the delay, as that would have been a software solution... but yeah... probably easier said than done... there probably would have been consequences if they had gone down that path. I do wonder, though, how they came up with 10 seconds in the first place.

JamesTerm 21-08-2014 00:31

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by Cory (Post 1397118)
We did exactly what you said. The delay was unacceptable and was something FIRST was either unwilling or unable to fix. It was impossible for a 3 ball auto to work consistently under those conditions. That is the only reason we made Cheesy Vision.

OK, I got to thinking about this and something is not adding up to me; perhaps you can explain. How can Cheesy Vision make up time for the delay? Grant Imahara once said that it takes an average person 200 ms to react to a sudden change. So to me, anything a human could do *in this context*, vision processing could do as well, and even better, since it has a faster reaction time.

Kevin Sheridan 21-08-2014 01:18

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by JamesTerm (Post 1397309)
OK, I got to thinking about this and something is not adding up to me; perhaps you can explain. How can Cheesy Vision make up time for the delay? Grant Imahara once said that it takes an average person 200 ms to react to a sudden change. So to me, anything a human could do *in this context*, vision processing could do as well, and even better, since it has a faster reaction time.

So for our three-ball auto, we determine which goal is hot, drive to the opposite goal, and wait for the goals to switch before we start shooting. With our old detection system, we added a 1-second delay to see which goal was hot before we started driving to the opposite goal. Since the goals were extremely inconsistent in switching (sometimes the lights/targets were delayed by up to 1.5 seconds), we implemented CheesyVision. Using CheesyVision, we can determine the hot goal as we drive forward because we are no longer reliant on sensors looking at the retroreflective targets. I believe we determined which goal was hot around the 2.5-3 second mark, giving our operator a large window to tell the robot which goal is hot.
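To make that timeline concrete, here is a rough sketch of the routine described above. It is not 3205's (or 254's) actual code: the helper functions are hypothetical stand-ins, the ~2.5-3 s decision point comes from the post, and the 5 s mark is the 2014 hot-goal switch time.

Code:

import time

# Hypothetical stand-ins so the sketch runs on its own; real code would drive
# the robot, read the CheesyVision status from the DS, and fire the shooter.
def drive_toward(target): print("driving toward " + target)
def line_up_on(side): print("lining up on " + side)
def shoot(): print("shoot")
def get_cheesyvision_hot_side(): return "left"

def three_ball_auto():
    auto_start = time.time()
    drive_toward("goals")                        # start moving right away
    hot_side = "left"
    # By roughly 2.5-3 s the operator's hands have told the robot which side started hot.
    while time.time() - auto_start < 3.0:
        hot_side = get_cheesyvision_hot_side()   # 'left' or 'right'
        time.sleep(0.02)
    # Aim at the side that is NOT hot yet; the goals switch at about 5 s,
    # so that side will be hot by the time the balls are fired.
    target_side = "right" if hot_side == "left" else "left"
    line_up_on(target_side)
    while time.time() - auto_start < 5.0:        # wait out the switch
        time.sleep(0.02)
    for _ in range(3):
        shoot()

if __name__ == "__main__":
    three_ball_auto()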

JamesTerm 21-08-2014 08:01

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by Kevin Sheridan (Post 1397317)
Using CheesyVision, we can determine the hot goal as we drive forward because we are no longer reliant on sensors looking at the retroreflective targets. I believe we determined which goal was hot around the 2.5-3 second mark, giving our operator a large window to tell the robot which goal is hot.


Ah, I get it... as you drive forward your FOV is too narrow to view both reflectors, and yeah, I wouldn't want to use a fish-eye lens and mess with the geometry of recognizing rectangles either. Thanks for the explanation. :)

I do want to throw out, for the good of the group, the idea of streaming 2 video feeds at around 3 Mbps using H.264... of course the 3 Mbps is only necessary if one is doing vision processing on the PC driver station... I'm looking forward to seeing what kind of processing power the roboRIO will offer. We were planning on streaming 2 video feeds, but found that the human ability to pick up balls did not need a rear-view camera.
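As a much simpler stand-in for that idea, here is a sketch that streams one webcam to the DS as length-prefixed JPEG frames over TCP (MJPEG-style rather than H.264, which would normally go through something like GStreamer or FFmpeg instead). The DS address, port, and JPEG quality are placeholder assumptions; you would run one instance per camera for two feeds:

Code:

import socket
import struct

import cv2

DS_IP = "10.2.54.5"      # placeholder Driver Station address
PORT = 1181              # placeholder port

cap = cv2.VideoCapture(0)
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect((DS_IP, PORT))
while True:
    ok, frame = cap.read()
    if not ok:
        continue
    ok, jpg = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, 40])
    if not ok:
        continue
    data = jpg.tobytes()
    # length-prefix each frame so the receiver knows where one image ends
    sock.sendall(struct.pack(">I", len(data)) + data)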

Abhishek R 11-02-2015 11:08

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by JesseK (Post 1441762)
Here I thought this thread was revived to discuss possible merits of Cheesy Vision for this year's game. There seem to be a couple...

I thought there was a rule specifically outlawing the use of webcams and Kinects to communicate with the robot during autonomous?

Andrew Schreiber 11-02-2015 11:15

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by Abhishek R (Post 1441768)
I thought there was a rule specifically outlawing the use of webcams and Kinects to communicate with the robot during autonomous?

There is.

Caleb Sykes 11-02-2015 11:27

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by JesseK (Post 1441762)
Here I thought this thread was revived to discuss possible merits of Cheesy Vision for this year's game. There seem to be a couple...

It might be feasible to put a laptop with a webcam right on the robot, and then have the HP use CheesyVision to line up the robot with the chutes.

JesseK 11-02-2015 11:54

Re: Team 254 Presents: CheesyVision
 
Quote:

Originally Posted by Abhishek R (Post 1441768)
I thought there was a rule specifically outlawing the use of webcams and Kinects to communicate with the robot during autonomous?

It's 100% illegal, but the thought exercise opens up some creativity in robot control / silly antics.

Karthik 11-02-2015 14:01

Re: Team 254 Presents: CheesyVision
 
I've moved all the "mentor vs student" posts that weren't discussing CheesyVision into their own thread. Please continue the discussion over there:

http://www.chiefdelphi.com/forums/sh...d.php?t=134357

