2014 Vision Processing Example

My team and I used the provided tutorial to implement the vision targeting example code, after first verifying that the example code worked with the camera. When running the example code, we got the black-and-red image showing the targets. But when I followed the steps to add vision targeting to our dashboard, I get a regular image and no distance information.

I attached snapshots of the LabVIEW code that I added to the basic dashboard program.

1.PNG





A couple of things.

The Score and Rank VI has an input for annotating. If left unwired, it defaults to False. If you wish to see the rectangles outlining the targets and their scores, wire a True to this input.

Your new code also looks like it defaults to a teal-colored target. I'm not sure that is calibrated the same as the example. Since you copied in the calibration code, you should be able to draw a line across your lit target to update the search colors. You may also want to widen the Value range.
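For readers less familiar with what the calibration is doing: the color search boils down to an HSV range threshold, and widening the Value (brightness) range lets dimmer pixels of the lit target pass. Here is a conceptual Python sketch of that idea (not the LabVIEW VI itself; all the specific ranges and pixel values below are illustrative guesses):

```python
# Conceptual sketch of HSV-range thresholding, as used in color calibration.
# Widening the Value (brightness) range lets dimmer target pixels pass.

def in_range(hsv, lo, hi):
    """Return True if an (h, s, v) pixel falls inside the [lo, hi] box."""
    return all(lo[i] <= hsv[i] <= hi[i] for i in range(3))

def threshold(image, lo, hi):
    """Turn an HSV image (rows of (h, s, v) tuples) into a binary mask:
    1 where the pixel matches the target color range, else 0."""
    return [[1 if in_range(px, lo, hi) else 0 for px in row] for row in image]

# A tiny 1x3 "image": a bright teal pixel, a dim teal pixel, a red pixel.
image = [[(90, 200, 230), (90, 200, 80), (0, 200, 230)]]

narrow = threshold(image, lo=(80, 100, 200), hi=(100, 255, 255))
wide   = threshold(image, lo=(80, 100, 50),  hi=(100, 255, 255))

print(narrow)  # [[1, 0, 0]] -- the dim target pixel is missed
print(wide)    # [[1, 1, 0]] -- the widened Value range catches it
```

Drawing a line across the lit target in the calibration tool effectively samples pixels like these and adjusts the lo/hi box to cover them.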

If you run the dashboard directly, rather than building a dashboard executable, you will be able to debug and decide whether to change constants, etc. Remember that if you probe a binary image, you will want to right-click and set the Palette to Binary; otherwise, the 0 and 1 pixel values will both look black. The Binary palette assigns unique colors (black and red) to the 0 and 1 pixel values.
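To see why the Binary palette matters, consider how a display palette maps pixel values to colors. In a default 0-255 grayscale mapping, the values 0 and 1 are both nearly black, so the mask looks empty even when targets were found. A quick Python sketch of the two mappings (the RGB values are illustrative, not the exact LabVIEW palette):

```python
# Why a probed binary image looks all-black without the Binary palette:
# the pixel values are only 0 and 1, and on a 0-255 grayscale display
# both render as near-black. A binary palette gives each value its own
# distinct color instead.

def grayscale(value):
    """Default palette: pixel value is the brightness (0-255 scale)."""
    return (value, value, value)

def binary_palette(value):
    """Binary palette: distinct colors for 0 and 1 (black and red)."""
    return {0: (0, 0, 0), 1: (255, 0, 0)}[value]

mask = [0, 1, 1, 0]
print([grayscale(v) for v in mask])       # (0,0,0) vs (1,1,1): both look black
print([binary_palette(v) for v in mask])  # black vs. red: targets stand out
```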

Greg McKaskle

Here are two images of code that will need to be added. The first image writes data back to Autonomous.vi, and the second shows wiring a True to the Score and Rank VI, as Greg pointed out.

Dashboard1.PNG

Dashboard2.PNG.PNG
