View Poll Results: What did you use for Vision Processing

| Option                  | Votes | Percentage |
|-------------------------|-------|------------|
| Driver Station          | 28    | 60.87%     |
| Laptop/Netbook on Robot | 3     | 6.52%      |
| Other                   | 2     | 4.35%      |
| Did not use/cRIO        | 13    | 28.26%     |

Voters: 46. You may not vote on this poll.
#1
I am just wondering who was using their Driver Station or a non-cRIO computer for vision processing in Rebound Rumble, and what you are going to do next year with the new field data limits.
On a side note, given the $400 cost limit on individual parts on the robot and the new allowance for laptops on robots, would people like to see that price limit raised for laptops/alternative computing devices that could go on the robot?
#2
Re: Who used Driver Station for Vision?
Where did you hear about the new field data limits?
#3
Page 23 of the Einstein Investigation Report: <http://www3.usfirst.org/node/2426>
Quote:
#4
Re: Who used Driver Station for Vision?
Quote:
A single frame of video at 320x240 resolution with 24-bit color is 320*240*24 ≈ 1.8 Mb. Note that the Axis cameras use MJPEG compression as well, so this is a gross overestimate. For targeting purposes, given the network lag that's going to be inherent in the system, you shouldn't need more than 10 fps, maybe 15 if you want smoother-looking video to display to your drivers. Even with uncompressed video, that's still under 30 Mbps (about 28 Mbps at 15 fps).
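To make that arithmetic easy to play with, here is a quick back-of-the-envelope sketch in Python. These are raw, uncompressed figures only; the MJPEG stream will come in well under them.

```python
# Raw (uncompressed) video bandwidth estimate; MJPEG compression reduces
# the real number substantially, so treat these as ceilings.

def raw_bandwidth_mbps(width, height, bits_per_pixel, fps):
    """Raw video bandwidth in megabits per second (1 Mb = 1e6 bits)."""
    bits_per_frame = width * height * bits_per_pixel
    return bits_per_frame * fps / 1e6

for fps in (10, 15, 30):
    mbps = raw_bandwidth_mbps(320, 240, 24, fps)
    print(f"320x240 @ {fps:2d} fps: {mbps:5.1f} Mbps")
# 320x240 @ 10 fps:  18.4 Mbps
# 320x240 @ 15 fps:  27.6 Mbps
# 320x240 @ 30 fps:  55.3 Mbps
```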
#5
Re: Who used Driver Station for Vision?
Quote:
#6
Re: Who used Driver Station for Vision?
Here is what the Q&A said for last year:
Quote:
#7
Re: Who used Driver Station for Vision?
Quote:
Did you do any testing to compare your accuracy against a lower resolution?
#8
Re: Who used Driver Station for Vision?
Quote:
They were kind enough to put their code up on Delphi, so take a look if you're interested. -RC
#9
Re: Who used Driver Station for Vision?
Quote:
*Edited the top question.*
#10
Re: Who used Driver Station for Vision?
That's what I had assumed, but I'm curious if you noted how much larger the errors were. I'd also be interested in the errors of the stereo distance measurement vs the perspective distance measurement as a function of distance, if you have them.
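For the perspective side, here is a minimal pinhole-model sketch in Python showing why a single pixel of measurement noise costs more the farther away you are. The target width and focal length below are assumed round numbers for illustration, not anyone's actual calibration:

```python
# Pinhole-model distance from the apparent target width, plus the error
# caused by one pixel of measurement noise. All constants are illustrative
# assumptions, not measured values.

TARGET_WIDTH_IN = 24.0    # assumed physical width of the vision target (inches)
FOCAL_LENGTH_PX = 390.0   # assumed focal length in pixels at 320x240

def distance_from_width(pixel_width):
    """Perspective distance estimate: d = W * f / w_pixels (inches)."""
    return TARGET_WIDTH_IN * FOCAL_LENGTH_PX / pixel_width

for px in (120, 60, 30, 20):
    d = distance_from_width(px)
    one_px_error = distance_from_width(px - 1) - d
    print(f"{px:3d} px wide -> {d / 12:5.1f} ft, +{one_px_error / 12:4.2f} ft per pixel of error")
```

(For a fixed baseline, stereo depth error also grows roughly with the square of the distance, so both curves bend the same way; the question is which constant is smaller for a given setup.)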
#11
Re: Who used Driver Station for Vision?
Due to the events on Einstein, some of the rules regarding networks have changed. One of them is that there will be a bandwidth limit for each robot, plus QoS to ensure the driver commands can get through with as little latency as possible.
#12
Re: Who used Driver Station for Vision?
We ran our vision on the Dashboard.
I set the frame-rate limit to 20 fps. Because of the design of the control algorithm we used, more frames are better, so we aim for 20. We get images at 320x240 (we determined this was the smallest size that gave adequate resolution with the fish-eye lens we used). If we run into a bandwidth limit in the future, we'll either run smaller or slower. Running 320x240 at 10 fps is still far better than what the cRIO on its own is capable of (I don't think we could get more than 6 fps at any resolution, given how much of the processor the non-vision code used), and a solution that is weight-neutral on the robot is the one we will almost always choose.
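For anyone curious what that kind of dashboard-side processing can look like, here is a minimal Python + OpenCV sketch (not this team's actual code): the camera address, the Axis stream parameters, and the HSV thresholds are placeholders to tune for your own setup. The useful property is that resolution and frame rate are capped at the camera itself, so the network cost is fixed before the stream ever leaves the robot.

```python
# Minimal sketch of dashboard-side vision: pull a rate-limited MJPEG stream
# from the camera and threshold for the retroreflective target.
# The URL uses the usual Axis MJPEG CGI endpoint; the IP address and HSV
# thresholds are placeholders, not a specific team's values.
import cv2

CAMERA_URL = "http://10.0.0.11/axis-cgi/mjpg/video.cgi?resolution=320x240&fps=15"

cap = cv2.VideoCapture(CAMERA_URL)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Threshold for bright green (ring-light illuminated tape); tune on the field.
    mask = cv2.inRange(hsv, (50, 100, 100), (90, 255, 255))
    # [-2] picks the contour list across OpenCV 3/4 return conventions.
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    if contours:
        x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
    cv2.imshow("targeting", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break
cap.release()
```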
#13
Quote:
#14
Re: Who used Driver Station for Vision?
If you're worried about the price limit and weight, you shouldn't even be looking at laptops. Build your own Mini-ITX system. You can buy/bill the components separately, to exactly the specifications you need, and that way you don't have to carry around a screen. There are power supplies on that site designed for car computers that are tolerant of voltage drop-outs as well, so you can even power the computer off the robot's battery, saving you more weight.
#15
Re: Who used Driver Station for Vision?
For me the bigger question is: why doesn't FIRST just ditch the cRIO and go with laptop-controlled robots?
All it would take is a USB motor controller + I/O board (and since FIRST-specific electronics already ship with the cRIO, I don't see this as a huge issue) plus almost any laptop. You would have a control system that can be easily upgraded/updated, has an independent backup battery (no worries about power loss on your controller), can be programmed in virtually any language, can run vision processing without eating network bandwidth, and, depending on what you buy, can be cheaper and have more features than the cRIO.