View Poll Results: What did you use for vision tracking?

| Option | Votes | % |
|---|---|---|
| Grip on RoboRio - IP Camera | 3 | 2.07% |
| Grip on RoboRio - USB Camera | 9 | 6.21% |
| Grip on Laptop - IP Camera | 19 | 13.10% |
| Grip on Laptop - USB Camera | 6 | 4.14% |
| Grip on Raspberry Pi - IP Camera | 5 | 3.45% |
| Grip on Raspberry Pi - USB Camera | 13 | 8.97% |
| RoboRealm - IP Camera | 6 | 4.14% |
| RoboRealm - USB Camera | 7 | 4.83% |
| Other - Please Elaborate with a Response | 77 | 53.10% |

Voters: 145.
#46
Re: What Did you use for Vision Tracking?
Team 107 used the NI Vision software with an IP or USB camera, with the vision processing running on the driver station. The field imposes bandwidth limits, but by running smaller images with some compression we were able to work within them.
We also had an on-board Kangaroo computer running NI Vision, using NetworkTables to communicate. That setup allows for faster video processing in the future if we write our own camera drivers or find existing ones, and it supports larger images, giving us more resolution and accuracy. It would also let a second camera run through the driver station so the drivers could see where they were going without taking up targeting bandwidth. Both solutions were capable of running 30 fps, so for weight reasons we went with the driver-station version of the software. It is fun to play with the Kangaroo and a target.
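To make the "smaller pictures and some compression" trade-off concrete, here is a back-of-the-envelope bandwidth estimate. Every number in it (bits per pixel, the assumed JPEG compression ratio, the resolutions) is an illustrative assumption, not a figure from the post:

```python
# Rough MJPEG bandwidth estimate for a camera stream.
# All parameters below are illustrative assumptions, not values from the post.

def mjpeg_bandwidth_mbps(width, height, fps, bits_per_pixel=24, jpeg_ratio=20):
    """Approximate stream bandwidth in megabits/second.

    jpeg_ratio: assumed JPEG compression ratio (raw size / compressed size).
    """
    raw_bits_per_frame = width * height * bits_per_pixel
    compressed_bits = raw_bits_per_frame / jpeg_ratio
    return compressed_bits * fps / 1e6

# A 640x480 stream at 30 fps with an assumed ~20:1 JPEG compression,
# versus the same stream at half resolution in each dimension:
full = mjpeg_bandwidth_mbps(640, 480, 30)
small = mjpeg_bandwidth_mbps(320, 240, 30)
print(f"640x480: {full:.1f} Mbps, 320x240: {small:.1f} Mbps")
# → 640x480: 11.1 Mbps, 320x240: 2.8 Mbps
```

The point of the arithmetic is simply that halving both image dimensions quarters the bandwidth, which is why shrinking the picture (plus compression) keeps a 30 fps stream inside the field's limits.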
#47
Re: What Did you use for Vision Tracking?
We used the LabVIEW vision example, integrated into our dashboard. I took a quick look at the teams calibrating their vision on the Einstein Mass field, and over half of them used LabVIEW or the NI Vision assistant.
#48
Re: What Did you use for Vision Tracking?
Did using the LabVIEW vision example running on the roboRIO result in a significant performance decrease?
Did robot controls become more sluggish or laggy because of the extra CPU usage of the LabVIEW vision processing?
#49
Re: What Did you use for Vision Tracking?
We began the season with GRIP running on the roboRIO with an IP camera. For some reason, it would never work on the official field FMS (it worked fine in our shop and on the practice fields).
We decided to throw all of that out the window and eventually rolled a Python script using OpenCV on a Raspberry Pi 3 with the accompanying camera module. The Pi communicated with the roboRIO through a USB-to-Ethernet adapter (the Ethernet end went into the Pi, the USB end into the roboRIO's USB host port), creating a separate network between the RIO and the Pi. Finally, we used NetworkTables to actually transport data over that connection. By creating our own direct connection between the RIO and the Pi (bypassing the radio altogether), we hoped to avoid any meddling the FMS might cause, and this seemed to do the trick for us.
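As a minimal sketch of the kind of Pi-side detection a script like this might do (this is not the team's actual code; the green threshold, frame size, and field of view are assumptions, and a real pipeline would more likely use OpenCV's `cv2.inRange`/`cv2.findContours` and then publish the result over NetworkTables, e.g. with pynetworktables):

```python
import numpy as np

HFOV_DEG = 60.0  # assumed horizontal field of view of the camera

def find_target_yaw(rgb):
    """Return the yaw offset (degrees) to the green blob, or None if absent."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    # Retroreflective tape lit by a green LED ring reads as "green much
    # brighter than red and blue"; these thresholds are assumptions.
    mask = (g > 150) & (g > r + 50) & (g > b + 50)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    cx = xs.mean()                 # centroid column of the blob
    width = rgb.shape[1]
    # Map the pixel offset from image center to an angle via the FOV.
    return (cx - width / 2) / width * HFOV_DEG

# Synthetic 160x120 test frame with a green "target" on the right side:
frame = np.zeros((120, 160, 3), dtype=np.uint8)
frame[40:60, 120:140, 1] = 255
print(round(find_target_yaw(frame), 1))  # prints 18.6
```

On the real robot, the yaw number would then be pushed into a NetworkTables entry each frame for the roboRIO to read.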
#50
Re: What Did you use for Vision Tracking?
Quote:
#51
Re: What Did you use for Vision Tracking?
You reference "The latency correction discussed in the presentation at worlds" - is that presentation available?
Thanks,
#52
Re: What Did you use for Vision Tracking?
#53
Re: What Did you use for Vision Tracking?
Here's an interesting question... did any teams run live tracking like 254? In other words, were any other teams able to get their turrets to track the goal as they drove towards it, dynamically following it?
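For context on what "live tracking" means here, the control loop can be sketched as a toy proportional turret controller fed by per-frame vision yaw measurements. This is purely illustrative (the gain, frame cadence, and goal trajectory are made-up numbers); real implementations such as 254's also compensate for camera latency:

```python
# Toy simulation of a turret "live tracking" a goal: each vision frame
# reports the yaw error to the target, and a proportional controller
# turns the turret to null it out. Gain and trajectory are illustrative.

def track(turret_deg, goal_deg_sequence, kp=0.5):
    """Step the turret once per vision frame toward the (moving) goal."""
    history = []
    for goal in goal_deg_sequence:
        error = goal - turret_deg      # what the camera would report
        turret_deg += kp * error       # proportional correction
        history.append(turret_deg)
    return history

# The goal appears to sweep from 20 deg toward 0 as the robot drives at it:
goals = [20, 16, 12, 8, 4, 0, 0, 0, 0, 0]
positions = track(0.0, goals)
print(round(positions[-1], 2))  # prints 0.22 - turret converges on the goal
```

Because each measurement already reflects the robot's motion, the turret keeps converging even while the apparent goal angle changes frame to frame; latency compensation matters because without it the turret is always chasing where the goal *was*.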
#54
Re: What Did you use for Vision Tracking?
We ended up using OpenCV on a Raspberry Pi 2, then used NetworkTables to transmit the data to the roboRIO.