Team 254 Presents: CheesyVision
Like many teams this season, Team 254 was surprised when we got to our first competition and found out that the Hot Goal vision targets were not triggering right at the start of autonomous mode. There seem to have been some improvements over the weeks, but there is still anywhere from 0.5 to 1.5 seconds of delay.
We had originally planned on using a sensor on board the robot - an infrared photosensor from Banner - but our problem was that (a) you can't move the robot until the hot goal triggers or you'll miss the target, and (b) it meant our drive team spent a lot of time lining up the sensor to be juuuust right (as Karthik and Paul often pointed out at Waterloo). Onboard cameras may be more tolerant of movement, but introduce new hardware and wiring onto the robot. We were intrigued by the Kinect, but thought: why use the Kinect when our Driver Station already has a built-in webcam?

Introducing CheesyVision, our new laptop-based webcam system for simple gesture control of our robot. 254 ran this software at SVR and drove to the correct goal every single time. In eliminations, we installed it on 971 and it worked perfectly as well. We wanted to share it with all of FRC prior to the Championship, because the field timing issue will probably never be perfect this season, and nobody should have to suffer for it.

CheesyVision is a Python program that runs on your Driver Station and uses OpenCV to process a video stream from your webcam. There are three boxes on top of the webcam image:

- A calibration box (top center)
- Two boxes for your hands (left and right)

Basically, if the left and right boxes are similar in color to the calibration box, we assume your hand is not there. Before each match, our operator puts his hands in the left and right boxes, and then drops the one that corresponds with the goal that turns hot. The result is sent over a TCP socket to the cRIO. Example Java code for a TCP server and a robot that uses this data in autonomous mode is provided, and doing the same thing in C++ or LabView should be easy (if you implement one of these, please share it with the community!).
There are tuning instructions in the source code, but we have found the default settings work pretty reliably under most lighting conditions as long as your shirt and your skin tone are different enough (the algorithm is self-calibrating). Of course, you could use virtually anything else besides skin and clothing if the colors are different. To download and install the software, visit: https://github.com/Team254/CheesyVision Good luck! |
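The core test described above - "if the left and right boxes are similar in color to the calibration box, we assume your hand is not there" - boils down to a color-distance threshold. A minimal sketch in plain Python (the function names, pixel values, and threshold are illustrative, not taken from the actual CheesyVision source):

```python
def mean_color(pixels):
    """Average (R, G, B) over a list of pixel tuples (stand-in for averaging a webcam ROI)."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def color_distance(c1, c2):
    """Euclidean distance between two RGB colors."""
    return sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5

def hand_present(box_pixels, cal_pixels, threshold=50.0):
    """A hand is 'present' if the box's average color differs enough from the calibration box."""
    return color_distance(mean_color(box_pixels), mean_color(cal_pixels)) > threshold

# Calibration box sees a blue shirt; the left box is covered by a hand (skin tone).
shirt = [(30, 60, 150)] * 4
skin = [(210, 160, 130)] * 4
print(hand_present(skin, shirt))   # hand over the box -> True
print(hand_present(shirt, shirt))  # box matches calibration -> False
```

In the real program the pixel lists would come from OpenCV regions of the webcam frame; the decision logic is the same idea.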
Re: Team 254 Presents: CheesyVision
I saw this in person at SVR, and it is very cool. Great job 254, and thanks for sharing!
Now if only someone would use this same technology to block their 3 ball auto... |
Re: Team 254 Presents: CheesyVision
This is absolutely phenomenal, and since 781 had to remove their camera for weight, I needed a new method for hot goal detection. I have not been this happy about programming for a while--whether this works for us or not, I am incredibly grateful.
Too bad I can't give rep more than once. |
Re: Team 254 Presents: CheesyVision
This really is cool. I like the method. The only problem is that Wildstang couldn't use it :D
In all seriousness, I think this is an excellent way of detecting hot goals. Very simple, and most laptops have a camera on them nowadays. I'll keep it in mind for championships this weekend. |
Re: Team 254 Presents: CheesyVision
Thank you so much, we were just looking at how to implement our hot goal detection for champs, and this is an amazing solution. We also plan on extending it to tell the robot where to go while blocking during autonomous. Thank you so much for sharing this with the FIRST community!
|
Re: Team 254 Presents: CheesyVision
We currently use the Kinect method, but I might be inclined to implement this instead. I didn't develop something like this because the Kinect Java classes already existed and were fairly easy to use. I do like how this required some work, though.
Nice work. |
Re: Team 254 Presents: CheesyVision
I can't wait to tell the beleaguered crew working on Kinect programming there may be another way!
It is a real shame 254 isn't using the Kinect after its rousing success with it in 2012. |
Re: Team 254 Presents: CheesyVision
This weekend at the Windsor-Essex Great Lakes Regional I heard of 1559 using a very similar program for their Hot Goal detection. Instead they used cards that had symbols on them, and I believe they had this all season long, though I cannot confirm that. Because of this they won the Innovation in Control Award.
It's pretty cool seeing that another team came up with a very similar way to detect the Hot Goal. Good luck at Champs Poofs! |
Re: Team 254 Presents: CheesyVision
Team 2468 used a system like this at Bayou last week. This never occurred to us - it's so simple and elegant. This will be pretty cool to show kids at demos.
|
Re: Team 254 Presents: CheesyVision
1 Attachment(s)
Quote:
I've attached our RoboRealm script file for anyone who's curious. To use, first double click on the Fiducial line in the script, then click the Train button, then click Start. You may need to change the path to the directory that the fiducials are stored in if you're not on 64-bit Windows or you installed in a non-default directory. You'll also have to modify the Network Tables configuration to match your team number. If we can get a more comprehensive paper written on it, I'll post it on CD. Nice work, Poofs and Devil-Tech (and others). Cool to see other teams using this method as well. |
Re: Team 254 Presents: CheesyVision
1 Attachment(s)
I LOVE IT!!
This year 2073 used a USB webcam on our bot to track the balls. It was implemented to assist the driver with alignment to balls when they were obstructed from his view or just too far away to easily line up. We won the Innovation in Control Award at both Regionals we attended because of it. Since 254 shared their code, we can share the LabView receiver we used, to help any team that can take advantage of it. Set the IP of the receiver to that of your DS, the port number to the one set on line 72 of the code 254 provided, and set the number of bytes to read to whatever you are sending. In the case of 254's code, that should be 1. A quick look at the block diagram will make it obvious what to do. Please ask any questions here so I can answer them publicly. |
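For teams not on LabView, the receiving side described here (single status bytes arriving on a TCP port) can be sketched in Python. This is a loopback demo, not the actual 254 server code; it binds an arbitrary free port and reads one byte, where the real setup keeps the connection open and reads continuously:

```python
import socket
import threading

HOST = "127.0.0.1"   # on the real robot this would be the cRIO's address
received = []

# Server standing in for the robot side: accept one connection and
# read a single-byte update (CheesyVision sends one byte per update).
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind((HOST, 0))           # port 0 = pick any free port for this demo
server.listen(1)
port = server.getsockname()[1]

def serve():
    conn, _ = server.accept()
    received.append(conn.recv(1))
    conn.close()

t = threading.Thread(target=serve)
t.start()

# Laptop side: connect and send one status byte.
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect((HOST, port))
client.send(b"\x01")             # e.g. "right hand is down"
client.close()
t.join()
server.close()
print(received[0])               # -> b'\x01'
```

On the robot you would loop on recv each cycle and keep the last good byte, rather than closing after one read.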
Re: Team 254 Presents: CheesyVision
So something that might be helpful to add would be making it SmartDashboard compatible. That might make it a lot more accessible to teams, because it can then be added as just a variable on the dashboard. You can get Python bindings for SmartDashboard here:
http://firstforge.wpi.edu/sf/frs/do/...ktables.2014_4 I don't have a cRIO on me, but I attached a version that uses the exact same method of communicating as we were doing earlier in the season, so it should work. It just has two bool variables (right_in and left_in) and should work with the standard SmartDashboard VIs or functions and be compatible with all versions. EDIT: Attaching the file wouldn't work for some reason, so here is a SkyDrive link: https://onedrive.live.com/redir?resi...nt=file%2c.zip |
Re: Team 254 Presents: CheesyVision
Wouldn't it be easier just to hold up a light or large board of a specific color to discern between the two?
Basically we are just concerned about answering a boolean question here. As in: "Is the left goal hot?" If no, don't hold up the board/light and assume the right goal is hot. If yes, hold up your indicator. |
Re: Team 254 Presents: CheesyVision
Quote:
The way 254 has done it, if both hands are in their respective boxes, the robot knows that neither hot goal has lit up, and therefore won't start its autonomous routine until it receives data from the laptop saying that there is a hot goal to shoot balls into. |
Re: Team 254 Presents: CheesyVision
This is easily the simplest, most innovative control method this season.
Kudos to whoever came up with the idea and I can see this becoming the standard in subsequent seasons. |
Re: Team 254 Presents: CheesyVision
Team 3211 The Y Team from Israel did the same thing at the Israeli regional; it worked in 100% of our matches.
We however used facial recognition libraries, so when the camera recognizes a face, it knows there's a hot goal in front of it. We later tried printing pictures of Dean and Woodie to use, but they turned out not 3-D enough for the face recognition... We weren't sure if it's 'legal', so we asked the head ref, who approved the use of it. The only relevant Q&A states that a Kinect may be used; we didn't know if a webcam is OK too... I'll talk to our programmers and try to get the code up here later on; we used LabView on the robot and Python with OpenCV for the image recognition. Also, we were told by the FTA that he noticed us sending a lot of info through the field's bandwidth, and that it might cause problems. We decided to have the drivers shut down the image recognition at the beginning of teleop, to avoid any possible problems or delays (which we didn't have, but just to be sure). Thanks Poofs! It's an honor seeing that our idea is used by you guys too =] |
Re: Team 254 Presents: CheesyVision
This is awesome. We currently have the Kinect set up, but this wouldn't require all of the extra equipment.
|
Re: Team 254 Presents: CheesyVision
We actually just put on a Banner sensor at the last competition, but made it slightly rotatable so we could just move the sensor. Our drive team was able to consistently find the target in about 10 seconds. Maybe it was your placement/mount of the sensor that made it take you guys a while?
|
Quote:
No idea. I'm more of a mechanics guy, but the FTA came to us saying that he noticed it, and said that if it disrupts the field somehow he will shut us down. Never happened. |
Re: Team 254 Presents: CheesyVision
How many times can the Poofs blow our minds in one season?
Thank you for sharing this with teams--I bet it's going to see a lot of play at Championship. Maybe even district championships too, for teams on the stick. |
Re: Team 254 Presents: CheesyVision
Big thanks to 254. You've given us Cheesy Drive code (which we're running an implementation of right now) and now this little gem.
Thanks for everything you guys do to build teams up. |
Re: Team 254 Presents: CheesyVision
Quote:
We even turned the RPi off, so there would be no problems there. I too don't see how it should pass that many packets through the field, but I do remember the FTA coming to talk to us about it. There were no problems, but I just wanted to give teams a heads-up that, if not implemented correctly, this code might be problematic. That said, it's not that hard to implement. On another note, maybe the teams who have tried this method can run some kind of 'help all teams' stand at the championship, where we could put together premade code for C, Java, and LabView, and just go to teams and help them get those 5 more points. Sounds easy enough - most laptops already have webcams - so why not? It's kind of like what 254 did for 971 at SVR, isn't it? Ideas? |
Re: Team 254 Presents: CheesyVision
Wow! Thank you so much for sharing. This is a wonderful means for hot goal detection that would be absolutely wonderful to use. Impressive as usual Cheesy Poofs!
|
Re: Team 254 Presents: CheesyVision
Quote:
It only sends 1 byte every 25 ms. The flow is FROM the DS to the cRIO. If the processing were done on the cRIO, then the FTA would have a point, but it is not. All processing is done on the DS and only one byte is sent. How the cRIO uses that byte is up to the team. |
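To put that in perspective, a rough back-of-the-envelope calculation (the per-packet header overhead and the ~7 Mbit/s field cap are assumptions for illustration, not numbers from this thread):

```python
# One status byte every 25 ms.
updates_per_sec = 1000 / 25
payload_bytes_per_sec = 1 * updates_per_sec           # 40 bytes/s of payload
overhead_per_packet = 54                              # assumed TCP/IP + Ethernet header bytes
total_bits_per_sec = (1 + overhead_per_packet) * updates_per_sec * 8
field_cap_bits_per_sec = 7_000_000                    # assumed ~7 Mbit/s field limit
print(payload_bytes_per_sec)                          # 40.0
print(round(100 * total_bits_per_sec / field_cap_bits_per_sec, 2))  # 0.25 (% of the cap)
```

Even counting headers, it works out to a fraction of a percent of the assumed cap.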
Re: Team 254 Presents: CheesyVision
Wow, awesome work here Jared and 254! Thank you for sharing the work and looking to improve the FRC community.
|
Re: Team 254 Presents: CheesyVision
I absolutely love the simplicity and out-of-the-box thinking in this hot goal tracking system. I was thinking about it, though, and wondered to myself how it's legal. I checked the rules, and according to G16, it's legal:
Quote:
Quote:
Again, good job, 254 does it again. |
Re: Team 254 Presents: CheesyVision
Completely missed out on the chance to call it "Hot or Not". Just saying :D
|
Re: Team 254 Presents: CheesyVision
Quote:
I was surprised to find how much bandwidth can accrue using UDP and the DO_NOT_WAIT option with a similar test of sending two doubles every 33 ms. In short, I took out the DO_NOT_WAIT and the bandwidth went down significantly. |
Re: Team 254 Presents: CheesyVision
Quote:
There are three states:

- Neither goal hot
- Left goal hot
- Right goal hot

The "neither" state is useful because you can watch for the transition from neither to one of the other states to indicate that the goal has flipped. This requires 2 bits of information to discern, hence separate left and right boxes. Other use cases may not need the third state and could use only one detection area. |
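Those three states fit in two bits, one per hand box. A sketch of that encoding plus the neither-to-hot transition watch (the names and bit layout are illustrative, not from the released source):

```python
# Each hand box contributes one bit: True = that hand has been dropped (goal is hot).
def encode(left_down, right_down):
    """Pack the two hand bits into one status byte."""
    return (left_down << 1) | right_down

def decode(byte):
    """Return one of 'neither', 'left', 'right', or 'both'."""
    return {0b00: "neither", 0b10: "left", 0b01: "right", 0b11: "both"}[byte & 0b11]

# Watching for the neither -> hot transition tells you the goal just flipped.
stream = [0b00, 0b00, 0b10, 0b10]
flips = [decode(b) for prev, b in zip(stream, stream[1:])
         if decode(prev) == "neither" and decode(b) != "neither"]
print(flips)  # ['left']
```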
Re: Team 254 Presents: CheesyVision
Quote:
There are also a few Q&A making it legal. |
Re: Team 254 Presents: CheesyVision
Quote:
Quote:
|
Re: Team 254 Presents: CheesyVision
Quote:
And the final wow goes to all the rules breakdown of what we *can* do... just think of the possibilities... heck, why not voice commands (tell alliance mates to be quiet, hehe). :) |
Re: Team 254 Presents: CheesyVision
Quote:
Yes and no. We never feed video back to the driver. We just used the value of the "x center" of the ball to steer the robot whenever the driver needed assistance. One button on the steering wheel overrode the wheel position and replaced it with "((image x center - ball x center) * k)". "k" was a gain value used to bring the error value to a useful level to steer the robot. All image acquisition and processing were done on a PCDuino on board the robot. None of the network traffic for this crossed the WiFi network; it all stayed local to the robot. |
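The correction described above is a plain proportional controller on the horizontal pixel error. A sketch (the gain value is made up for illustration, and the sign convention depends on your drive code):

```python
def steer_correction(image_x_center, ball_x_center, k=0.005):
    """Proportional steering: pixel error between image center and ball, scaled by gain k."""
    return (image_x_center - ball_x_center) * k

# 640-px-wide image: ball at x = 400 is right of the image center (320).
print(steer_correction(320, 400))  # -0.4
print(steer_correction(320, 320))  # 0.0 (ball centered, no correction)
```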
Re: Team 254 Presents: CheesyVision
Team 329 used a barcode scanner to decode a barcode, which populated a field on the SmartDashboard indicating that we should shoot immediately (the goal we were looking at was hot) or delay for 5 seconds if no barcode was scanned.
No additional bandwidth, no camera, no additional processing, simple and effective. |
Re: Team 254 Presents: CheesyVision
Quote:
|
Re: Team 254 Presents: CheesyVision
#pewpew
|
Re: Team 254 Presents: CheesyVision
Thanks for sharing!
#veryvision #muchGP #wow |
Re: Team 254 Presents: CheesyVision
Quote:
|
Re: Team 254 Presents: CheesyVision
In elims at SVR, Brian from 971 and I coded up their robot to use this app (we just let them use our backup driver station laptop to make the install process easier).
They only needed 1 bit of data: whether or not the goal directly in front of them was hot. They used this bit to determine whether to wait 3 or 7 seconds from the start of auton before shooting. We used the driver's right hand to signal this bit; the left side was a don't-care. |
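That one-bit decision reduces to a couple of lines on the robot side. A sketch using the 3 s / 7 s values from the post above (the function name is illustrative):

```python
def auton_wait_seconds(front_goal_hot):
    """Shoot early if the goal in front is already hot; otherwise wait for the flip."""
    return 3 if front_goal_hot else 7

print(auton_wait_seconds(True))   # 3
print(auton_wait_seconds(False))  # 7
```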
Re: Team 254 Presents: CheesyVision
This is really cool. We were planning on using the kinect, but we haven't had spectacular results in testing when we try it with people walking around in the background.
After playing around with it, I found it really useful to be able to lock the calibration color value, so that I could hold a green index card in front of the calibration square, save that calibration value, then use both hands to hold up two cards in the boxes so that I can drop one hand out of the way to signal. To add the lock - above the while loop:

Code:
locked = 0

then inside the loop, guard the recalibration with:

Code:
if locked == 1:
    ...

and in the key handling, add:

Code:
elif key == ord('l'):
    ...
|
Re: Team 254 Presents: CheesyVision
I haven't seen any talk of a C++ port, so I started a thread in the C++ sub forum here to avoid derailing this thread:
http://www.chiefdelphi.com/forums/sh...26#post1372026 My prototype code is linked in the thread, it is completely untested, but any contributions are welcome. Thanks Poofs, very awesome implementation; looking forward to trying this out. |
Re: Team 254 Presents: CheesyVision
1 Attachment(s)
It wouldn't let me edit my original post, but I did some testing today and got a version of this that uses NetworkTables to work. It worked on my simulator setup and should work exactly the same on a real robot. It just uses two bools, one for each hand. I attached the file to this post, and my post on page 1 has the link to pynetworktables for Windows.
I plan on bringing this to the regional championship in case anybody needs help with the hot goal. I really like this method, and if we hadn't already coded the Kinect we would most likely use it. |
Re: Team 254 Presents: CheesyVision
Quote:
I plan bringing this to the regional championship in case anybody needs help with the hot goal. I really like this way, and if we hadn't already coded the kinect we would most likely use it.
In case anybody needs a helping hand with pynetworktables, I believe this is the dependency you need: https://github.com/robotpy/pynetworktables Is this correct, Thad? |
Re: Team 254 Presents: CheesyVision
Quote:
http://firstforge.wpi.edu/sf/frs/do/...ktables.2014_4 |
Re: Team 254 Presents: CheesyVision
I just wanted to post back and say I got the C++ port up and running (with minimal changes).
https://github.com/FirstTeamExcel/Ro...sionServer.cpp Feel free to shamelessly steal the code, but I'd love to hear if it helps anyone out. |
Re: Team 254 Presents: CheesyVision
Wanted to say thank you. I helped our team get it working with LabView yesterday at the Michigan Championship. I made a couple of minor changes to the Python script: the Classmate laptop already flipped the image, so I removed the flip logic and fixed left/right; switched to using UDP; and slowed down the send frequency. UDP made the reconnect stuff unnecessary and simplified the LabView interface as well.
While there I also helped 107 with a copy of the code and while I did not touch base to see if they got everything working, I know in testing they also had it working in auton (controlling wheels for easy testing). The whole team got a real kick out of playing with the code. Thanks again for an elegant and cheesy solution. |
Re: Team 254 Presents: CheesyVision
Quote:
:confused: |
Re: Team 254 Presents: CheesyVision
Quote:
Also, it doesn't look like you have a runner thread within the object. Are you running it externally? If so, could you post that code as well? |
Re: Team 254 Presents: CheesyVision
Thank you very much for posting this. Within an hour of showing this to our programmer, we had it fully operational with our 1 ball.
|
Re: Team 254 Presents: CheesyVision
Quote:
The object implements a thread for reading from the IO stream by inheriting from jankyTask, the Run method is the wrapped threading function. |
Re: Team 254 Presents: CheesyVision
I think it is very kind of your team to post this publicly. The use of a built-in camera on the driver station is a very good choice for getting a human in the loop and you've sparked some thinking for Chief Delphi that will last for seasons to come.
After looking through the posted code repository, I have to ask: what is Team 254's philosophy on student involvement? The two contributors on github appear to be your mentors and the level of programming skill is also not commonly found in high school students. Have I missed the student involvement in this? I'm not making any sort of accusation that Team 254 has done something wrong or is not following rules. I am just surprised that for a high school competition the high-visibility work from your team seems to be mentor-only. I believe the announcer at Silicon Valley said that Team 254 won the regional for 15 of the last 16 years. This is impressive and clearly your team is doing something that ensures a solid victory record. |
Re: Team 254 Presents: CheesyVision
This was forked from our team's FRC 2014 repository just for public release. It is different from what we competed with last weekend (it removed some team-specific features, streamlined some quick hacks, and added a ton of comments). Mentors went over the code with a fine-toothed comb before making it public. This was deliberate.
While our students are intricately involved in our team's software (more on this below), we are talking about releasing code to the entire FIRST community DURING the competition season. A fairly high bar is required for teams to be able to understand, use, and trust the code in time for their next competition - we certainly don't want to be breaking other teams' robots. I personally made (and stand behind) the decision to go mentor-heavy on this particular project for this reason. (To be clear, I fully believe that our students could have made just as polished a product, but I thought that an expedient release would ultimately be more important.) It might be software, but this is just another COTS module that you can choose to use (or ignore). Like an AM Shifter or a VEXpro VersaPlanetary, I believe that putting a high quality component in the hands of a student is a vehicle for inspiration. Quote:
First, never judge a book by its cover. Every year I am amazed at what students are capable of. This year, there are some very gifted programmers on 254. They wrote a RESTful webserver on our cRIO (that ultimately provided the TCP server part of CheesyVision). One of them - and this still absolutely blows my mind to think about - designed and implemented a quintic spline trajectory planner for our autonomous driving routine. I explained the basic concept, then sat back as he did the math, derived the differential equations, and gave me working code. Just awesome.

Second, an anecdote. One of my earliest posts on Chief Delphi was in this thread. It was 2003, and WildStang had just posted about StangPS, a really sophisticated navigation system that I was sure had to be engineer-built (just look at my posts!). I was a senior in high school at the time. I thought my gyro-based autonomous mode was pretty nifty, but was blown away by StangPS. I watched their video dozens of times, enthusiastically emailed it to my programming mentor at the time, and was just totally fascinated by it. I ended up reading about odometry and dead reckoning, using interrupts to read optical encoders, Kalman filters, and all sorts of other concepts that I didn't fully understand as a high schooler, but found really, really cool. While at the time I was a little peeved that here I was, a high school student writing all of 341's code, while these other teams had teams of engineers, in hindsight I cannot thank 111 enough for raising the bar and for sharing what they did. I was inspired, and in some permanent and positive way, my life was shaped by it.

While a little Python script for processing a webcam image is by no means as impressive as a complete robot navigation system, my hope is that at least a few students will give it a look, see something they think is cool, and want to learn more about it later. |
Re: Team 254 Presents: CheesyVision
Quote:
I had expected that your project was forked and that was why I asked for clarification as to what the students did. Instead, your answer wasn't completely clear to me as to exactly what the students did for CheesyVision. I do understand that it was "mentor heavy." Though you couldn't tell the differences between student and mentor effort when you were in high school, I trust my judgment because I have done programming for 18 years and know the subtle differences in programming skills at all levels. I do think very highly of the work you released to all teams. I am sure students also do. Quote:
I am eminently fortunate to have always mentored teams that were student-run, and each team has students just as impressive as the ones you described. From what I have learned today, I think the difference between your team and my teams is that other mentors keep it students vs. students. I do not intend for any of my posts to put you on the defensive nor to diminish your students' hard work. I am trained to speak my mind, and your reply has been informative. Thank you for answering. |
Re: Team 254 Presents: CheesyVision
Quote:
If yes, mission accomplished. It doesn't matter who builds the robot. What matters is what the students get out of it. You don't have to turn a wrench or write software to be inspired to do so. -Nick |
Re: Team 254 Presents: CheesyVision
Let's get this thread back on track everyone...
Cheesyvision is really innovative! Way to think outside the box! |
Re: Team 254 Presents: CheesyVision
Team 254, thank you!
We ran CheesyVision any time we were doing a 1 ball auto at MSC and it worked perfectly. You guys are awesome. |
Re: Team 254 Presents: CheesyVision
Thanks to 254 for helping to patch the (still broken) field/FMS. We're still running a 1-second delay at the start of auton to avoid the timing issues, which was still not enough in at least one of our qualification matches at MAR champs.
|
Re: Team 254 Presents: CheesyVision
Thanks to 254 for giving us more awesome stuff to look through and use.
Our competition season was over before this release but I think we will be trying to implement this for any offseasons we go to. |
Re: Team 254 Presents: CheesyVision
Huge thanks to 254 for releasing this. We showed it to our programmers on Wednesday and had a working hot goal auton before lunch Thursday. You guys saved us a huge amount of time and finally let us get rid of the annoying green LED ring on our robot. Now if only our shirts weren't tie-dye...
|
Re: Team 254 Presents: CheesyVision
Quote:
Quote:
Thanks! |
Re: Team 254 Presents: CheesyVision
We added it this weekend to our Prac bot. Looks like we will have two ball hot goal detect at Champs.
Thanks so much guys. |
Re: Team 254 Presents: CheesyVision
I heard a lot of teams at MSC were using CheesyVision this weekend with great success. Kudos to releasing such a nice product in season.
Class Act. CheesyVision is much better than Teh CheeZViSHUN (which apparently just tinted all camera inputs blue). |
Re: Team 254 Presents: CheesyVision
As a heads up, I found that our netbook's webcam would cause an exception when the CheesyVision script started. To resolve this, I added a delay between initiating the camera connection and grabbing the first image.
I don't have the script handy to share, but as someone who's never used python before, I'm confident that someone else could improve on the implementation anyway. I also saw some weird connection issues between the Client and Server on Saturday on the field (all other systems were normal, just lack of cheesy-vision). I left more details on the issue here. |
Re: Team 254 Presents: CheesyVision
I learned to write software for FRC bots by reading 254's 2010 code. It's great that releases are still coming out each year.
|
Re: Team 254 Presents: CheesyVision
Quote:
You were around in the days before off-the-shelf shifters. You should recall that before AM started selling their shifters, there were few teams that could reliably shift. This is EXACTLY the same scenario - only instead of only rich teams having access to it (a set of shifters will run you what, $700?), anyone with an internet connection can use this. Comments like this make me question whether I should open source anything, lest I be accused of 'cheating'. |
Re: Team 254 Presents: CheesyVision
Quote:
If we give in to comments like this and good code stops being shared, everyone will lose. Keep the good code flowing! Exposure to well-written code can inspire students. Now that I think about it, well-written code still inspires me! ;) |
Re: Team 254 Presents: CheesyVision
Quote:
Quote:
Quote:
Quote:
Also, thanks for CheesyVision! 955 used it with 1- and 2-ball hot autos at PNWCMP last weekend :) Ryan |
Re: Team 254 Presents: CheesyVision
2363 worked on integrating this into our system last night in preparation for the Championship. We were able to get it working with our one ball and will be working on integrating it into our 2 ball tonight.
Thank you for this innovative out of the box system. It is amazing the simple things teams come up with each and every year and how much you can learn by just looking at what other teams have done. |
Re: Team 254 Presents: CheesyVision
Quote:
Kylar worked with Greg McKaskle from NI on this implementation. Kylar will be at Championships if you have any questions for him regarding this programming technique. |
Re: Team 254 Presents: CheesyVision
Quote:
We copied your loop into Periodic Tasks and changed the IP and port. We installed the three routines and changed the IP in CheesyVision. We can't get a connection between the ports. CheesyVision says no connection; LabView says error 63 or 65 - connection refused. Any ideas as to what we are doing wrong? Thanks |
Re: Team 254 Presents: CheesyVision
A big MAHALO (Hawaiian thank you)!!!
Our team does not have any programming mentors and is entirely student coded. They have been working on camera tracking of the hot goal with some success. Our students do not use Python, but were very intrigued by this and had it implemented into their C++ code in a very short time. In fact, our programming cadre has now become INSPIRED to learn more of the language. THIS IS AWESOME. Without your gracious sharing of your code, I doubt the students would have looked at another programming language, especially this late in the season. Without a doubt, this is what I really love about FIRST: causing inspiration across the world by sharing. Keep it up! Good luck to all teams attending the championship! We will be rocking CheesyVision at St. Louis. See many of you there. Aloha! |
Re: Team 254 Presents: CheesyVision
Hi, I have been trying to use CheesyVision with LabView, but I can't find the correct functions for it. Is it possible? If so, how can I do it?
Thank you! |
Re: Team 254 Presents: CheesyVision
Quote:
The solution I posted here will not work! What I posted is a Socket Requester. What is needed is a Socket Receiver. I have tried dozens of variations based on tutorials, and on-line NI help, but so far have not been able to find anything that will work with CheesyVision and LabView. I know it "should" be simple, but so far I have not found anything that will work. That said, it could quite easily be my setup. I do not have access to a cRio, so I am using one laptop to run CheesyVision and another running the "receiver" vi in a standalone configuration. If anyone has any insight in how to resolve this, PLEASE SPEAK UP!! |
Re: Team 254 Presents: CheesyVision
Quote:
http://www.chiefdelphi.com/forums/sh...9&postcount=54 That's the link to my post with the download, and pynetworktables can be found here: http://firstforge.wpi.edu/sf/frs/do/...ktables.2014_4 The variables can just be read using the SmartDashboard ReadBoolean VIs. |
Re: Team 254 Presents: CheesyVision
We got it to work in LV by switching to a UDP socket instead of a TCP socket (on port 1130).
CheesyVision side, we removed the retry and connection code and use sendto instead of send to send UDP packets. A quick Google search (on my phone while at MSC) helped with this. On the LV side, we used a UDP Open and UDP Listen with a timeout of 0 in a while loop. When UDP Listen returns an error (timed out), we have some logic to use the last good byte received as the CheesyVision byte, timestamp it, then calculate its age (dt of the timestamp), and report the byte and age to our code. I don't have the exact code; I'll see if I can get it. Total coding time was under 10 minutes in the pits - this was after an hour or so of fooling around with TCP. |
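For reference, the Python-side change amounts to swapping the connected TCP socket for a datagram socket and calling sendto for each update. A loopback sketch (the receiver here stands in for the LabView UDP Listen side; the address is a placeholder, and 1130 is the port mentioned above):

```python
import socket

PORT = 1130            # UDP port from the post above
ROBOT = "127.0.0.1"    # placeholder; on the field this would be the robot's address

# Receiver standing in for the LabView "UDP Open / UDP Listen" side.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", PORT))
rx.settimeout(1.0)

# Sender side: no connect/retry logic needed - just sendto for each update.
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.sendto(b"\x02", (ROBOT, PORT))   # one status byte per update

data, _ = rx.recvfrom(1)
print(data)    # -> b'\x02'
tx.close()
rx.close()
```

Because UDP is connectionless, a dropped packet just means the robot keeps using the last good byte, which is exactly the age-tracking logic described above.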
Re: Team 254 Presents: CheesyVision
Quote:
Excellent! I will try to replicate this approach this morning. Please post your code when you can! It will help tremendously if we can't get it dialed in. |
Re: Team 254 Presents: CheesyVision
1 Attachment(s)
OK, Here is a LabView TCP Receiver.
I can't believe how easy it was! All my struggles were because I had a minor misunderstanding of how my editor (Notepad++) was interacting with the CheesyVision code, and because the security settings in Win 8 were preventing me from testing this receiver. The CheesyVision code is solid. Now this VI works just as reliably with it. |
Re: Team 254 Presents: CheesyVision
Quote:
We used a USB webcam attached to a PCDuino. It would track the balls based on color and shape. We also had a switch on the DS that allowed us to select whether to track blue or red. We only used the x-axis center of the ball to assist the driver with aligning to the ball; we never used the distance to the ball. We fed the x value to LabView to be used to help the driver align. |