09-11-2016, 11:59
Greg McKaskle
Registered User
FRC #2468 (Team NI & Appreciate)
 
Join Date: Apr 2008
Rookie Year: 2008
Location: Austin, TX
Posts: 4,748
Re: Vision Tracking?

There are many ways to make an omelette, so I'll give my version of this.

1) Acquire Image
2) Process Image to emphasize what you care about over what you don't
3) Make measurements to grade the potential targets
4) Pick a target that you are going to track
5) Make 2D measurements that will help you determine 3D location
6) Adjust your robot

1. Acquire Image
This is actually where lots of teams have trouble. Images can be too bright, too dark, blurry, blocked by external or internal mechanisms, etc. It is always good to log some images to test your code against, and to use the images provided at kickoff. Being able to view and calibrate your acquisition to adjust to field conditions is very important. The FRC Vision white paper contains a number of helpful techniques for image acquisition.
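As a rough illustration of checking acquisition quality, here is a minimal sketch that grades a grayscale frame as too dark, too bright, or usable, so you know which frames are worth logging and when exposure needs recalibrating. The frame format (list of rows of 0-255 values) and the thresholds are assumptions, not anything from NI Vision.

```python
# Sketch: flag frames whose exposure is off, so they can be logged and the
# camera recalibrated. Thresholds are made-up starting points to tune.

def mean_brightness(frame):
    """Average pixel value of a grayscale frame (rows of 0-255 values)."""
    total = sum(sum(row) for row in frame)
    count = sum(len(row) for row in frame)
    return total / count

def grade_exposure(frame, dark=40, bright=215):
    """Coarse verdict on whether a frame is usable for vision processing."""
    avg = mean_brightness(frame)
    if avg < dark:
        return "too dark"
    if avg > bright:
        return "too bright"
    return "ok"
```

In practice you would run something like this on logged frames and alongside your calibration screen, rather than trusting the camera's auto-exposure on the field.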

2. Process Image
This is commonly a threshold filter, but it can be an edge detector or any processing that declutters the image, making it more efficient to process and easier to test. The HSV, HSI, or HSL color spaces are pretty accurate ways to do this, but it can be done in RGB or even just using intensity on a grayscale image. You can also use strobes, IR lighting, polarized light, and other tricks to do some of the processing in the analog world instead of the digital one.
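To make the HSV threshold idea concrete, here is a small sketch using Python's standard-library colorsys module. It assumes 8-bit RGB pixels and a green-lit retroreflective target; the hue, saturation, and value bounds are assumptions you would tune against your logged images, not canonical numbers.

```python
import colorsys

# Sketch of an HSV threshold. colorsys uses 0-1 for all channels, and hue
# as a fraction of the color wheel (green is near 1/3).

GREEN_HUE = (0.25, 0.45)  # assumed hue band for a green LED ring
MIN_SAT = 0.5             # reject washed-out pixels
MIN_VAL = 0.5             # reject dim pixels

def in_range(rgb):
    """True if an (r, g, b) pixel (0-255 each) passes the HSV threshold."""
    h, s, v = colorsys.rgb_to_hsv(rgb[0] / 255, rgb[1] / 255, rgb[2] / 255)
    return GREEN_HUE[0] <= h <= GREEN_HUE[1] and s >= MIN_SAT and v >= MIN_VAL

def binarize(image):
    """Turn an RGB image (rows of pixels) into a 0/1 mask of candidate pixels."""
    return [[1 if in_range(px) else 0 for px in row] for row in image]
```

A real implementation would do this with a vision library's in-range operation rather than per-pixel Python, but the logic is the same.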

3. Make Measurements
For NI Vision, this is generally a particle report. It can give size, location, perimeter, roundness, angles, aspect ratio, etc., for each particle. Pick the measures that can help to qualify or disqualify things in the image: that particle is too small, that one is too large, that one is just right. Rather than a Boolean pass/fail, I find it useful to give each measure a score (0 to 100%, for example), and then combine those strategically in the next step. This is where the folder of sample images pays off: you get to tweak and converge so that, given the expected images, the code has a reasonably predictable success rate.
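One common way to turn a measure into a 0-to-100 score is a ratio score: 100 when the measured value equals the ideal, falling off toward 0 in either direction. The sketch below is an assumption-laden example (the ideal aspect ratio and area fraction are invented, and the particle is a plain dict rather than an NI Vision particle report).

```python
# Sketch of ratio-style scoring: each measure returns 0-100, where 100
# means "exactly what we expect". Ideal values here are made-up examples.

def ratio_score(measured, ideal):
    """100 when measured == ideal, approaching 0 the further off it is."""
    if measured <= 0 or ideal <= 0:
        return 0.0
    r = measured / ideal
    return 100.0 * min(r, 1.0 / r)

def grade_particle(p, ideal_aspect=2.5, ideal_area_frac=0.02):
    """Average the per-measure scores into one grade for the particle."""
    scores = [
        ratio_score(p["width"] / p["height"], ideal_aspect),
        ratio_score(p["area_frac"], ideal_area_frac),
    ]
    return sum(scores) / len(scores)
```

Averaging is only one way to combine scores; multiplying them, or requiring every score to clear a floor, penalizes a single bad measure more heavily.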

4. Pick a Target
Rank and select the element that your code considers the best candidate.
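Selection can be as simple as taking the highest-scoring candidate, with a minimum-score cutoff so that on a frame full of noise the code reports "no target" instead of tracking garbage. The cutoff value here is an assumption.

```python
# Sketch: pick the best-scoring candidate, but only if it clears a minimum
# score, so stray reflections don't get tracked. min_score is a tuning knob.

def pick_target(candidates, min_score=60.0):
    """candidates: list of (score, particle) pairs. Returns particle or None."""
    best = max(candidates, key=lambda c: c[0], default=None)
    if best is None or best[0] < min_score:
        return None
    return best[1]
```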

5. Determine 3D Location
The location of an edge, or the center of the particle, can sometimes be enough to correlate to distance and location. Area is another decent approximation of distance. And of course, if you want, you can identify corners and use the distortion of the known shape to solve for location.

6. Adjust your Robot
Use the 3D info to adjust your robot's orientation, location, flywheel, or whatever makes sense to act on a target at that location in 3D space relative to your robot. Often this simplifies to: turn so the target is in the center of the camera image, drive forward to a known distance, or adjust the shooter for the estimated distance.
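The "turn until the target is centered" case is often written as a proportional controller on the target's horizontal pixel error. This is a sketch only; the gain and deadband are assumptions to tune on the actual robot, and a real loop would also handle stale or missing camera frames.

```python
# Sketch of a proportional "center the target" controller. KP and DEADBAND
# are made-up tuning values; start small and tune on the robot.

IMAGE_WIDTH_PX = 320
KP = 0.8          # proportional gain
DEADBAND = 0.03   # close enough to centered; stop turning

def turn_command(target_x_px):
    """Motor turn command in [-1, 1] from the target's x pixel position."""
    # Normalized error: -1 at the left edge, 0 centered, +1 at the right edge.
    error = (target_x_px - IMAGE_WIDTH_PX / 2) / (IMAGE_WIDTH_PX / 2)
    if abs(error) < DEADBAND:
        return 0.0
    return max(-1.0, min(1.0, KP * error))
```

Because camera frames arrive slowly, many teams blend this with a faster sensor (a gyro, for instance): use the camera to pick the heading, then close the turn loop on the gyro.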

From my experience, #1 is hard due to changing lighting and environment, inefficient or unsuccessful calibration procedures, and a lack of data to adjust the camera well. Steps #2 through #5 have lots of example code and tools to help process reasonably good images. #6 can be hard as well and really depends on the robot's construction and sensors. Closing the loop with only a camera is tricky because cameras are slow and often noisy due to mounting and measurement conditions.

So yes, vision is not an easy problem, but if you control a few key factors, it can be solved pretty well by the typical FRC team, and there are many workable solutions. That makes it a pretty good challenge for FRC, IMO.

Greg McKaskle