#11   11-14-2016, 12:14 PM
Jaci
Registered User
AKA: Jaci R Brunning
FRC #5333 (Can't C# | OpenRIO)
Team Role: Mentor
 
Join Date: Jan 2015
Rookie Year: 2015
Location: Perth, Western Australia
Posts: 251
Re: 30fps Vision Tracking on the RoboRIO without Coprocessor

Quote:
Originally Posted by Jared Russell View Post
This is very cool, though I'm not (yet) convinced that you can get 30fps @ 640x480 with an RGB USB camera using a "conventional" FRC vision algorithm. But now you have me thinking...

Why I think we're still a ways off from RGB webcam-based 30fps @ 640x480: Your Kinect is doing several of the most expensive image processing steps for you in hardware.

With a USB webcam, you need to:

1. Possibly decode the image into a pixel array (many webcams encode their images in formats that aren't friendly to processing).
This is certainly true. As I mentioned, I don't have a benchmark gathering data from a conventional USB webcam, so I can't really speak to this part with any numbers.
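For what it's worth, a common case is a webcam delivering YUYV, where just peeling out the luma plane is cheap with the same kind of de-interleaving load I describe below. A rough sketch in C with NEON intrinsics (extract_luma and everything in it is illustrative, not from any benchmark, and it assumes npixels is a multiple of 16):

Code:
#include <arm_neon.h>
#include <stdint.h>

/* Illustrative sketch: YUYV packs pixels as Y0 U Y1 V, so a two-way
 * de-interleaving load splits luma from chroma for free. */
void extract_luma(const uint8_t *yuyv, uint8_t *y_out, int npixels) {
    for (int i = 0; i < npixels; i += 16) {
        /* 32 bytes in -> val[0] = 16 Y samples, val[1] = alternating U/V */
        uint8x16x2_t px = vld2q_u8(yuyv + 2 * i);
        vst1q_u8(y_out + i, px.val[0]);  /* keep luma, drop chroma */
    }
}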

Quote:
Originally Posted by Jared Russell View Post
2. Convert the pixel array into a color space that is favorable for background-lighting-agnostic thresholding (HSV/HSL). This is done once per pixel per channel (3*640*480), and each op is the evaluation of a floating point (or fixed point) linear function, and usually also involves evaluating a decision tree for numerical reasons.

3. Do inRange thresholding on each channel separately (3x as many operations as in your example) and then AND together the outputs into a binary image.
These can actually both be collapsed into one set of instructions if your use case is target-finding.
Most robots use some sort of light source to find the retro-reflective target, most typically a green ring light (for our Kinect, it's the IR projector). If your image is already in RGB form, you can isolate the Green channel outright, which SIMD makes trivial with the de-interleaving load vld3.8, and proceed from there. Leaving the R and B channels in D registers without ever writing them back to RAM saves a lot of time, and the thresholding function then only has to consume one channel's worth of data.
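To make that concrete, here is roughly what it looks like with NEON intrinsics (the compiler emits vld3.8 for vld3q_u8). threshold_green, the 200 cutoff, and the assumption that npixels is a multiple of 16 are all mine:

Code:
#include <arm_neon.h>
#include <stdint.h>

void threshold_green(const uint8_t *rgb, uint8_t *bin, int npixels) {
    const uint8x16_t cutoff = vdupq_n_u8(200);  /* example threshold */
    for (int i = 0; i < npixels; i += 16) {
        /* vld3 de-interleaves 16 RGB pixels: val[0]=R, val[1]=G, val[2]=B.
         * R and B stay in registers and are never written back to RAM. */
        uint8x16x3_t px = vld3q_u8(rgb + 3 * i);
        /* 0xFF where G >= cutoff, 0x00 elsewhere: the binary image */
        vst1q_u8(bin + i, vcgeq_u8(px.val[1], cutoff));
    }
}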

Something similar can be done with HSV/HSL, though it requires a bit more math on the assembly side to isolate the Lightness for a specific hue or saturation. Nonetheless, it's still faster than thresholding all 3 channels.
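For a taste of that extra math: the V in HSV is just max(R,G,B), which NEON handles in two instructions per register; it's the hue term that costs you. A sketch only, with the same illustrative-naming caveats as above:

Code:
#include <arm_neon.h>
#include <stdint.h>

void value_channel(const uint8_t *rgb, uint8_t *v_out, int npixels) {
    for (int i = 0; i < npixels; i += 16) {
        uint8x16x3_t px = vld3q_u8(rgb + 3 * i);
        /* V = max(R, G, B); HSL's L = (max+min)/2 is barely more work
         * (vminq_u8 plus a rounding halving add, vrhaddq_u8). */
        uint8x16_t v = vmaxq_u8(px.val[0], vmaxq_u8(px.val[1], px.val[2]));
        vst1q_u8(v_out + i, v);
    }
}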

Quote:
Originally Posted by Jared Russell View Post
However...

Great idea hacking the ASM to use SIMD for inRange. I wonder if you could also write an ASM function to do color space conversion, thresholding, and ANDing in a single function that only touches registers (may require fixed point arithmetic; I'm not sure what the RoboRIO register set looks like). This would add several more ops to your program, and have 3x as many memory reads, but would have the same number of memory writes.
I believe RGB-to-HS{L,V} color space conversion would be possible with SIMD if you're willing to take on the challenge. I may give it a try when I have some time to burn.
Putting it all into one set of instructions that touches only the NEON registers is entirely possible; in fact, the thresholding and ANDing are already grouped together and operate on the Q registers. I can also confirm that the ARM NEON instruction set includes fixed-point arithmetic, although moving between fixed-point and floating-point goes through the vcvt instruction, which likewise executes on the NEON unit.
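As a sketch of what the threshold-and-AND half of that combined function might look like (the channel bounds are made-up example values, and the HSV conversion itself is the part left as an exercise):

Code:
#include <arm_neon.h>
#include <stdint.h>

void inrange3_and(const uint8_t *hsv, uint8_t *bin, int npixels) {
    /* Example bounds for a green target; tune for your lighting */
    const uint8x16_t h_lo = vdupq_n_u8(40),  h_hi = vdupq_n_u8(80);
    const uint8x16_t s_lo = vdupq_n_u8(100), s_hi = vdupq_n_u8(255);
    const uint8x16_t v_lo = vdupq_n_u8(60),  v_hi = vdupq_n_u8(255);
    for (int i = 0; i < npixels; i += 16) {
        uint8x16x3_t px = vld3q_u8(hsv + 3 * i);
        /* inRange per channel: (x >= lo) AND (x <= hi), all in Q registers */
        uint8x16_t h = vandq_u8(vcgeq_u8(px.val[0], h_lo), vcleq_u8(px.val[0], h_hi));
        uint8x16_t s = vandq_u8(vcgeq_u8(px.val[1], s_lo), vcleq_u8(px.val[1], s_hi));
        uint8x16_t v = vandq_u8(vcgeq_u8(px.val[2], v_lo), vcleq_u8(px.val[2], v_hi));
        /* AND the three masks together: one memory write per 16 pixels */
        vst1q_u8(bin + i, vandq_u8(h, vandq_u8(s, v)));
    }
}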
__________________
Jacinta R

Curtin FRC (5333+5663) : Mentor
5333 : Former [Captain | Programmer | Driver], Now Mentor
OpenRIO : Owner

Website | Twitter | Github
jaci.brunning@gmail.com