Quote:
Originally Posted by Caboose
I would think a better plan would be to do the vision processing on the robot, using cheaper and better USB cameras with low image-capture lag and a light but powerful laptop, and send only the relevant data to the Driver Station/robot. But alas, there is a dreaded $400 limit on ALL parts; if only laptops that go on the robot could be a little more expensive, the FMS would not need to worry about large images clogging the field network...
|
Team 11 found and used an AMD dual-core netbook (it had a bigger screen than what most might consider a netbook) on our robot for Rebound Rumble. It came with an SSD, we removed the screen, and we kept the original battery (we had also considered ITX boards, PC/104 boards, BeagleBones...we didn't want to fight with power supply issues). It passed inspection at the 3 competitions it was used in. Later in the season it was removed (it worked fine; it was removed to adjust for driving styles). It was *just* under the $400 limit.
They had 2 USB cameras connected to it: one high resolution (1080p) but low speed (measured 5+ frames a second), and one high speed (measured 30+ frames a second...this was fun to watch and could swamp a single core) but low resolution (640x480). It ran Linux with custom Java software written by the students to process video and send control signals to the cRIO over the robot Ethernet. We tried quite a few USB cameras (I've got a 1-cubic-foot box full of them now). Some had terrible white balance. Some didn't work well in Video4Linux but were a little better in Windows (well, it was a Microsoft camera LOL). Some had terrible or unexpectedly variable frame rates. Oddly, we found that several of the very cheap webcams on Amazon worked great ($5 webcam versus $125 webcam, and the $5 webcam works better for this...go figure). (I didn't mention exactly which cameras because I don't want to take all the challenge out of this.)
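The whole point of this approach is that the robot network only ever carries a few numbers per frame instead of the frames themselves. Our code was Java, but the idea fits in a few lines; here is a minimal sketch in Python (the cRIO address follows the usual 10.TE.AM.2 scheme, and the UDP port and packet layout are placeholders for illustration, not our actual protocol):
[code]
# Hedged sketch: send vision *results* (not video) to the cRIO as a tiny UDP packet.
# The port number and packet layout are placeholders to illustrate the idea.
import socket
import struct

CRIO_ADDR = ("10.0.11.2", 1180)   # 10.TE.AM.2 for team 11; the port is arbitrary here

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_target(offset_x, offset_y, distance, locked):
    # Four values packed into 13 bytes per frame, versus tens of kilobytes
    # per frame of compressed video going to the driver's station.
    packet = struct.pack(">fffB", offset_x, offset_y, distance, 1 if locked else 0)
    sock.sendto(packet, CRIO_ADDR)

# Example: target centroid 12 pixels left of center, 3 above, roughly 8.5 ft away.
send_target(-12.0, 3.0, 8.5, True)
[/code]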
One of the original concerns that prompted this design, which has now spanned 2 years of competition (we actually thought about it the year before but had no weight to spare for it, though our soon-to-be programming captain ran some very impressive tests), was the bandwidth used by sending video to the driver's station. We had a great deal of trouble locating clear, working samples of Java code for the cRIO that could process video, so this seemed like an idea worth testing (mind you, I know the cRIO can do this; we just couldn't get the samples to work or to function in a way we preferred).
Though we didn't use it, OpenCV is an extremely functional and professional vision library you can call from many languages. Our students talked to Video4Linux (V4L) directly, which OpenCV uses as well (though it can use other back ends to get its video sources).
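For anyone who does want to start from OpenCV, the core of a target finder is only a handful of calls. This is just a sketch using the Python bindings; the camera index, HSV thresholds, and minimum blob size are all guesses you would tune for your own camera and target:
[code]
# Hedged sketch: threshold a frame for a bright colored target and find its centroid.
# The HSV range and minimum area are placeholders to tune per camera and lighting.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                      # /dev/video0 via Video4Linux

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([40, 80, 80]), np.array([90, 255, 255]))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        biggest = max(contours, key=cv2.contourArea)
        if cv2.contourArea(biggest) > 200:      # ignore tiny noise blobs
            m = cv2.moments(biggest)
            cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
            # (cx, cy) is the target centroid in pixels; from here you would
            # turn it into an aiming offset and ship it to the robot controller.
[/code]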
Our team uses a lot of Linux. The programmers who worked on this part were quite comfortable with it, and to my knowledge no mentor provided technical support because none was needed. The netbook came with Windows 7, which we removed. I'm quite sure from my own professional work that you could use Windows, Linux, BSD, or Mac OS X and get workable results even with a single-core Atom CPU (we originally tested with a Dell Mini 9, which is precisely that; at the time it was running Ubuntu 9). My advice (take it or leave it): don't assume you need to process every frame, or every pixel of every frame.
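To make that last piece of advice concrete: shrinking the image before you analyze it and only analyzing every few frames cuts the CPU cost enormously, and on a single-core Atom that can be the difference between keeping up and falling behind. The scale factor and skip interval in this sketch are arbitrary starting points:
[code]
# Hedged sketch: reduce CPU load by downscaling and analyzing only every Nth frame.
import cv2

cap = cv2.VideoCapture(0)
SKIP = 3            # analyze 1 out of every 3 frames
SCALE = 0.5         # half the width and height = a quarter of the pixels

count = 0
while True:
    ok, frame = cap.read()      # keep reading so the driver's buffer stays fresh
    if not ok:
        break
    count += 1
    if count % SKIP:
        continue
    small = cv2.resize(frame, None, fx=SCALE, fy=SCALE)
    # ... run the actual target detection on `small`, not the full-size frame ...
[/code]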
Though we used Java (more precisely, OpenJDK), I personally tested Pygame and it worked just fine standalone.
If someone else is interested in trying it, this shows you most everything you need to know:
http://www.pygame.org/docs/tut/camera/CameraIntro.html
I had that interfaced with an NXT controller for an experiment, and that was also controlled with Python code.
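The heart of that tutorial is only a few calls. As a quick taste (this assumes a V4L-visible webcam; the device it picks and the resolution are arbitrary):
[code]
# Hedged sketch of the Pygame camera basics from the tutorial linked above.
import pygame
import pygame.camera

pygame.camera.init()

cams = pygame.camera.list_cameras()           # e.g. ['/dev/video0']
cam = pygame.camera.Camera(cams[0], (640, 480))
cam.start()

for _ in range(30):
    img = cam.get_image()                     # a pygame Surface
    # img.get_at((x, y)) gives per-pixel access; pygame.transform.scale() can
    # shrink the Surface first so you aren't touching every pixel of every frame.

cam.stop()
[/code]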
Quote:
Originally Posted by RyanCahoon
If we ignore all the above and suppose they did allow teams to use their own laptops, there's also the point of maintainability at the competitions. The ability of the FTAs and CSAs to help troubleshoot problems becomes greatly reduced when you open up such a critical part of the control system. FIRST is having a difficult enough time keeping the current system running, as evidenced by the communication problems, etc., even when there aren't malicious parties involved. I'm not trying to insult FIRST at all - just saying the job is a difficult one already. It would become increasingly unclear whether the problem was in the field system or whether the team had messed something up themselves.
Perhaps the argument here comes down to the fact that the cRIOs are bulkier than they need to be. And I would agree with you. I doubt FIRST needs controllers that are certified for 50G shock loads, etc. See above points on logistics, though. It might have been interesting (political issues notwithstanding) if we had kept the old IFI controllers but made it easier to interface them with a laptop.
Despite my arguments to the contrary, I think it would be a great opportunity if FIRST did move to a laptop-based system. I guess the last point is that I am encouraged by FIRST opening up the driver station in the last couple of years. Perhaps this is a sign of things to come (I hope).
|
I'm confused by this (not to appear too argumentative).
A few people warned us this year about the netbook we used, but with proper mounting there are plenty of examples of our robot smashing over the bump in the center of the field at full throttle. We did that in practice on our own field and on the real field well over 150 times. No issues. Of course, we did have an SSD in it.
Also, doesn't FIRST allow you to use other laptops for the driver's station, and doesn't that create the same support issue to some extent? I grant that the DS is basically Windows software, so that did reduce the variability somewhat. However, there's nothing at all stopping FIRST from producing a Linux distro of their very own. This would give them control over the boot times, the drivers, the interfaces, and the protocol stacks. It's really much the same problem FIRST faces if they put DD-WRT or OpenWRT on the robot APs. I assure everyone that a laptop for processing video on the robot, and even entirely in lieu of the cRIO (with a replacement for the digital sidecar), can be done, and I have no problem proving it.