#1
Re: 30fps Vision Tracking on the RoboRIO without Coprocessor
Given that HSV thresholding requires a bunch of conditional code, it's going to be tough to vectorize. You could give our approach from last year a try:
Code:
// Isolate the green channel: green - (scaled blue + scaled red)
vector<Mat> splitImage;
Mat bluePlusRed;
split(imageIn, splitImage);                       // BGR -> separate planes
addWeighted(splitImage[0], _blue_scale / 100.0,   // blue * weight
            splitImage[2], _red_scale / 100.0,    // red * weight
            0.0, bluePlusRed);
subtract(splitImage[1], bluePlusRed, imageOut);   // saturating on CV_8U
After that we did a threshold on the newly created single-channel image. We used Otsu thresholding to handle different lighting conditions, but you might get away with a fixed threshold as in your previous code.

To make this fast you'd probably want to invert the red_scale and blue_scale multipliers so you could do an integer divide rather than converting to float and back - but you'd have to test which is quicker. You should be able to vdup them into all the uint8 lanes of a q register at the start of the loop and just reuse them. And be sure to do this in saturating math, because overflow/underflow would ruin the result.

Oh, and I had some luck getting the compiler to vectorize your C code if it was rewritten to match the ASM code. That is, set a mask to either 0 or 0xff, then AND the mask with the source. Be sure to mark the function args as __restrict__ to get this to work. The generated code was using d and q regs but seemed a bit sketchy otherwise; still, it might be fast enough that you could avoid coding in ASM.

Last edited by KJaget : 16-11-2016 at 08:27.
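For reference, the 0/0xff mask trick might look something like this in plain C++ (the function name and signature here are my own illustration, not the actual team code). With both pointers marked __restrict__, GCC can usually auto-vectorize the loop into NEON code on the RoboRIO's Cortex-A9:

```cpp
#include <cstddef>
#include <cstdint>

// Branch-free threshold using the mask trick described above:
// build a per-pixel mask of 0x00 or 0xFF, then AND it with the source.
// __restrict__ promises src and dst don't alias, which is what lets
// the compiler auto-vectorize the loop (NEON d/q registers on ARM).
void maskThreshold(const uint8_t* __restrict__ src,
                   uint8_t* __restrict__ dst,
                   size_t n, uint8_t thresh)
{
    for (size_t i = 0; i < n; ++i) {
        // (src[i] > thresh) is 0 or 1; negating and narrowing to
        // uint8_t gives 0x00 or 0xFF.
        uint8_t mask = static_cast<uint8_t>(-(src[i] > thresh));
        dst[i] = src[i] & mask;  // pixel kept or zeroed, no branch
    }
}
```

Whether this beats hand-written ASM you'd have to check with `-S` output or a benchmark, but it keeps the code portable.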
#2
Re: 30fps Vision Tracking on the RoboRIO without Coprocessor
No fair heading down so close to bare metal
#3
Re: 30fps Vision Tracking on the RoboRIO without Coprocessor
No one knows what vision processing will be needed in the future. For this year we found that feeding the results of processing directly into a control loop did not work well. Instead, we take a picture and calculate the degrees of offset from the target, then use this offset and the IMU to rotate the robot. We take another frame and check that we are on target; if not, rotate and check again. If on target, shoot. We did not need a high frame rate and it worked very well. I'll note that our biggest problem was not the vision but the control loop to rotate the bot - there was a thread on this earlier.

We hosted MAR Vision Day this past weekend, and it has become very apparent that most teams are struggling with vision. While it's nice to see work like this, I would like to see more of an effort to bring vision to the masses. GRIP helped a lot this year.
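As a sketch of the aim-then-verify approach above: the only math inside the loop is turning the camera's measured offset into a new heading setpoint for the IMU-based rotation. Something like this (the function name and the wrap-to-[-180, 180) convention are my own assumptions, not the team's actual code):

```cpp
// Turn the camera's measured offset into an absolute heading setpoint
// for the IMU-based rotation, wrapped into [-180, 180) degrees.
// (Hypothetical helper; the team's real code is not shown in the post.)
double headingSetpoint(double imuHeadingDeg, double cameraOffsetDeg)
{
    double h = imuHeadingDeg + cameraOffsetDeg;
    // Wrap so the rotate controller always takes the short way around.
    while (h >= 180.0)  h -= 360.0;
    while (h <  -180.0) h += 360.0;
    return h;
}
```

The outer loop is then just: capture a frame, compute the offset, rotate to this setpoint, capture again, and shoot once the offset is within tolerance.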
#4
Re: 30fps Vision Tracking on the RoboRIO without Coprocessor
It's only unfair if I say I've done it and then leave the whole thing closed source.
#6
Re: 30fps Vision Tracking on the RoboRIO without Coprocessor
WPILib did a poor job of wrapping NIVision (work NOT done by NI, by the way). The history is that a few folks tried to make a dumbed-down version for the first year, and it was a dud. Then some students hacked together a small class library of wrappers, but the hack showed through. That doesn't mean NIVision, the real product, is undocumented or trying to be sneaky.

NI publishes three language wrappers for NIVision (.NET, C, and LV). The documentation for NIVision is located here -- C:\Program Files (x86)\National Instruments\Vision\Documentation. And one level up are lots of samples, help files, utilities, etc. If the same people had done the wrappers on top of OpenCV, it would have been just as smelly. Luckily, good people are involved in doing this newer version of vision for WPILib. But I see no reason to make NIVision the bad guy here.

If you choose to ignore the WPILib vision stuff and code straight to the NIVision libraries from NI, I think you'll find that it is a much better experience. That is what LV teams do, by the way: LV-WPILib has wrappers for the camera but none for image processing, so they just use NIVision directly.

If my time machine batteries were charged up, I guess it would be worth trying to fix the timeline. But I'm still worried about the kids, Marty.

Greg McKaskle