All the hurdles to implementing machine vision kept us out of it for years. We are like a lot of small teams: we do not have time to chase machine vision when the drive motors are not running. All the chatter on Chief Delphi about using NetworkTables, ancillary processors, or loading OpenCV libraries reinforced our reluctance. In addition, two years of issues with the HD3000 USB camera did not help.
This was particularly painful for me, since a big part of my engineering and programming career was machine vision.
This year we implemented a quick-and-dirty tower tracker for our regional event. It worked amazingly well, but came a little too late to get us to St. Louis.
I will post some screen shots and some code when we finish getting unpacked. Here are the highlights:
- C++ code
- Runs on the roboRIO using a minimal set of nivision.h functions
- Runs at frame rate (appeared to be about 15 fps)
- Annotated image shows detection and tracking
- CPU load went up less than 10% when tracking
- Logitech 9xxx? USB camera
Summary: We were already using the IMAQ functions (nivision.h) to switch between a front-viewing and a rear-viewing camera. When tracking, we copied each scan frame into a two-dimensional array (something I am comfortable with) using imaqImageToArray. Then we used some fairly simple techniques to detect bright vertical and horizontal lines, and finally a little magic to home in on the bottom horizontal strip of reflective tape. We then copied our annotated image data back into the normal scan frame using imaqArrayToImage.
Once we could track the tower, we struggled to make minimal angle corrections with a skid-steer robot, and finally ran out of time.
We did manage one 20 point autonomous, so we think we are cool.