Appreciate's bird's eye (Human vision assist in autonomous)
Howdy,
We are Team 2468, Appreciate, from Austin, Texas. Like other teams, we have used vision triggering in our autonomous program to shoot into the hot goal. Our approach differs from CheesyVision in that our robot always drives forward immediately, but it only shoots when we tell it to. We use LabVIEW along with our driver station laptop's webcam (the "bird's eye") to achieve this.

Our LabVIEW program uses the NI Vision Acquisition Software to get images from the driver station's webcam and process them in code. Using Vision Assistant, we generated LabVIEW geometric pattern matching code to recognize our sign. The dashboard tracks the recognized sign in the webcam feed and surrounds it with a red box; when the sign is held within ten degrees of horizontal, the box turns green and we fire. The VI outputs a Boolean: true if the box is green, false if it is red. That Boolean is how we tell the robot when to fire during autonomous.

Tracking: ![]()

Fire!: ![]()

Here is how to implement it:

1. Download the NI Vision Acquisition Software: http://www.ni.com/download/ni-vision...13.09/4409/en/
2. Download our software: https://github.com/Casolt/2468-Sign-Autonomous
3. Create a sign before using this code. The sign needs high contrast between the symbol and the background, and it should be matte to avoid reflections. We use black poster board with white tape spelling the word "Fire".
4. Supply a template image of your sign. We have included the image we use; yours should be a clear, undistorted picture of the sign you are using. Wait Until Sees Fire.vi references this picture by the file path you give it.
5. Use the IMAQdx Open Camera and IMAQdx Configure Grab VIs to begin grabbing images from the webcam.
6. Place the Wait Until Sees Fire VI inside the Default case of the Dashboard loop, and wire the Image out of Configure Grab to the Image in of Wait Until Sees Fire.
7. Use SmartDashboard to write the Boolean to the robot, and in the robot's Autonomous Independent, wait until that Boolean is true before launching your ball (rough sketches of both sides are at the end of this post).

How the VI is implemented in our Dashboard VI: ![]()

How the robot responds to SmartDashboard: ![]()

We appreciate the help from Greg McKaskle during the development of this system. He dedicated his own time to teach several students about the NI vision tools.
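For teams not on LabVIEW, here is a minimal sketch of the same dashboard-side idea using Python with OpenCV and pynetworktables instead of NI Vision. This is not our actual code: the threshold-and-contour detection is a crude stand-in for geometric pattern matching, and the camera index, robot address, area cutoff, and the "FireSign" key name are all assumptions for illustration.

```python
# Dashboard-side sketch: spot a high-contrast sign in the webcam feed and
# publish a Boolean over NetworkTables when it is held near horizontal.
# Assumes OpenCV 4 and pynetworktables; NOT the team's LabVIEW code.
import cv2
from networktables import NetworkTables

NetworkTables.initialize(server="10.24.68.2")   # assumed robot address (team 2468)
table = NetworkTables.getTable("SmartDashboard")

cap = cv2.VideoCapture(0)                        # assumed: laptop webcam is device 0

while True:
    ok, frame = cap.read()
    if not ok:
        continue

    # Isolate the bright white tape on the dark poster board.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    fire = False
    if contours:
        sign = max(contours, key=cv2.contourArea)
        if cv2.contourArea(sign) > 2000:         # ignore small specks (tuned by eye)
            rect = cv2.minAreaRect(sign)
            angle = rect[2]
            # OpenCV's rotated-rect angle convention varies by version;
            # fold it into (-45, 45] so 0 means "level".
            if angle < -45:
                angle += 90
            elif angle > 45:
                angle -= 90
            fire = abs(angle) < 10.0             # fire when within ten degrees of horizontal
            box = cv2.boxPoints(rect).astype("int32")
            color = (0, 255, 0) if fire else (0, 0, 255)   # green = fire, red = tracking
            cv2.polylines(frame, [box], True, color, 2)

    table.putBoolean("FireSign", fire)           # assumed key name
    cv2.imshow("bird's eye", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

The real LabVIEW geometric pattern match is considerably more robust to scale and lighting than a bare threshold, which is why we use it; the sketch is only meant to show the overall flow of detect, check angle, publish Boolean.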
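On the robot side the logic is simply: drive forward right away, then hold the shot until the dashboard's Boolean goes true. Our robot code is LabVIEW, so the following is only a hedged RobotPy-style sketch of that autonomous flow; the motor ports, timings, shooter handling, and the "FireSign" key are placeholders, not our real setup.

```python
# Robot-side sketch: drive forward immediately, then wait for the dashboard
# Boolean before firing. Placeholder ports, keys, and timings throughout.
import wpilib
import wpilib.drive


class SignAutoRobot(wpilib.TimedRobot):
    def robotInit(self):
        # Assumed PWM ports for a simple two-motor drivetrain plus a shooter.
        self.left = wpilib.Talon(0)
        self.right = wpilib.Talon(1)
        self.drive = wpilib.drive.DifferentialDrive(self.left, self.right)
        self.shooter = wpilib.Talon(2)
        self.timer = wpilib.Timer()

    def autonomousInit(self):
        self.timer.reset()
        self.timer.start()
        self.has_fired = False

    def autonomousPeriodic(self):
        # Always drive forward for the first 2 seconds (placeholder timing),
        # no matter what the human spotter is doing with the sign.
        if self.timer.get() < 2.0:
            self.drive.arcadeDrive(0.6, 0.0)
            return
        self.drive.arcadeDrive(0.0, 0.0)

        # Hold the ball until the dashboard says the sign is level.
        fire = wpilib.SmartDashboard.getBoolean("FireSign", False)  # assumed key
        if fire and not self.has_fired:
            self.shooter.set(1.0)    # stand-in for the real launch sequence
            self.has_fired = True
        elif self.has_fired and self.timer.get() > 4.0:
            self.shooter.set(0.0)    # spin down after the shot


if __name__ == "__main__":
    wpilib.run(SignAutoRobot)
```

On a LabVIEW robot the equivalent is just a wait-until loop in Autonomous Independent.vi that reads the SmartDashboard Boolean each iteration and proceeds to the launch sequence once it reads true, which is exactly what step 7 above describes.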