Quote:
Originally Posted by Greg McKaskle
I'm not sure if more of the IMAQ functions were wrapped this year. Being open source, if one team does it and submits it, all teams could use it, so I'd think it would have happened by now. Anyway, for the vision I did this year, I only used one function not in the list. I used Color Threshold as well.
Greg McKaskle
I think you're making some incorrect presumptions here.
For the first week of build season, I spent a lot of time searching the WPILib docs for image processing functions. All I found were some thresholding functions, particle analysis reports, and an ellipse-detection routine off to the side. I was looking for a real image processing library, like OpenCV. We wanted to explore techniques like flickering the lights on and off to identify the targets, and we were using Java. I spent literally days trying to figure out how to achieve what I wanted. Finally, after going through the C++ source code of WPILib, I found functions whose names started with "imaq", and then found nivision.h. Only by reading the source did I learn that the undocumented Image pointers returned by WPILib functions were meant to be passed to these functions. Even then, I wasn't sure they would work; they could have been completely unrelated. I found this completely by chance.
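For what it's worth, the "flickering lights" idea can be prototyped without any of the IMAQ wrappers at all. This is a minimal sketch of frame differencing: grab one frame with the target lights on and one with them off, then keep only the pixels whose brightness changed by more than a threshold. The class name, method, and raw `int[][]` grayscale representation are all hypothetical for illustration; none of this is WPILib API.

```java
// Hypothetical frame-differencing sketch for the light-flicker idea.
// Frames are plain grayscale arrays (0-255), not WPILib Image objects.
public class FlickerDiff {
    /**
     * Returns a mask that is true wherever the absolute brightness
     * difference between the lit and unlit frames exceeds the threshold.
     * Pixels belonging to the flickered target should light up in the mask;
     * static background should not.
     */
    public static boolean[][] diffThreshold(int[][] on, int[][] off, int threshold) {
        int rows = on.length;
        int cols = on[0].length;
        boolean[][] mask = new boolean[rows][cols];
        for (int y = 0; y < rows; y++) {
            for (int x = 0; x < cols; x++) {
                mask[y][x] = Math.abs(on[y][x] - off[y][x]) > threshold;
            }
        }
        return mask;
    }
}
```

In a real robot loop you would feed this two camera frames captured in sync with the light toggling; the mask could then go into the existing particle-analysis routines to locate the target blob.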
Teams should not be expected to go through this, and most wouldn't be able to. This is not a standard that teams who are just learning to program can realistically reach. If we want teams to do some real programming, instead of having fun calling a few cute functions to see what happens, then the people who actually control how this library is made need to recognize that the current state is simply inadequate.