Chief Delphi

Chief Delphi (http://www.chiefdelphi.com/forums/index.php)
-   General Forum (http://www.chiefdelphi.com/forums/forumdisplay.php?f=16)
-   -   Optical Flow Tracking, Help! (http://www.chiefdelphi.com/forums/showthread.php?t=118510)

techhelpbb 17-08-2013 21:09

Re: Optical Flow Tracking, Help!
 
Quote:

Originally Posted by falconmaster (Post 1287760)
Thanks, I removed them. I think people don't prank call as much now that cell phones show the caller's number, so it's harder to hide. Anyway, I took them off. Thanks!

So now we finally have the FIRST water game we've all joked about for so long.

:yikes:

I see from the link you are using Guppy firewire cameras.

falconmaster 17-08-2013 21:46

Re: Optical Flow Tracking, Help!
 
Quote:

Originally Posted by techhelpbb (Post 1287762)
So now we finally have the FIRST water game we've all joked about for so long.

:yikes:

I see from the link you are using Guppy firewire cameras.

Hey, we are ready! Bring on the water!

falconmaster 17-08-2013 21:51

Re: Optical Flow Tracking, Help!
 
Yup, Guppy FireWire cameras. We have been using HSL values to do object tracking, and we also use flood fill for the forward cameras when the background is uniform. It works great and auto-calibrates with lighting conditions. If we can somehow do the same thing for the bottom camera, then the light from a sunny or cloudy day will not matter. Right now you calibrate for one lighting condition, and then a cloud comes and you are blind! Automatic flood fill for the bottom needs to first get rid of all the white patches on the floor of the pool by taking the white pixels and making them dark like the dark pixels on the floor, and then do the flood fill. We are working on that too.
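[For readers following along: the uniform-background flood fill described above can be sketched in a few lines of plain Python. This is not the team's code; the tolerance value and 4-connectivity are assumptions.]

```python
from collections import deque

def flood_fill(img, seed, tol):
    """BFS flood fill: mark every pixel reachable from `seed` whose
    value is within `tol` of the seed value (4-connected).
    `img` is a list of rows of grayscale values; returns a boolean mask."""
    h, w = len(img), len(img[0])
    sy, sx = seed
    ref = img[sy][sx]
    mask = [[False] * w for _ in range(h)]
    q = deque([(sy, sx)])
    mask[sy][sx] = True
    while q:
        y, x = q.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and not mask[ny][nx] \
                    and abs(img[ny][nx] - ref) <= tol:
                mask[ny][nx] = True
                q.append((ny, nx))
    return mask
```

Seeding from a corner of a frame with a uniform background segments the background; everything left unmarked is the object. The "whiten-then-darken" preprocessing the post mentions would run before this step.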

Greg McKaskle 17-08-2013 22:26

Re: Optical Flow Tracking, Help!
 
If you have a variety of images, cloudy and clear, it would be easy to try an adaptive threshold and see whether it is as effective as, or I suspect more effective than, the flood fill.
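[For reference, a mean-based adaptive threshold can be sketched in NumPy using an integral image, which is one common way to make the local mean cheap. This is an illustration of the general technique, not Greg's code; the window radius and offset `C` are arbitrary choices.]

```python
import numpy as np

def adaptive_threshold(img, radius=7, C=5.0):
    """A pixel is foreground if it exceeds the local mean of its
    (2*radius+1)^2 neighborhood by more than C. Local means come from
    an integral image, so cost is independent of the window size."""
    img = img.astype(np.float64)
    h, w = img.shape
    ii = np.zeros((h + 1, w + 1))          # integral image, zero border
    ii[1:, 1:] = img.cumsum(0).cumsum(1)
    ys, xs = np.mgrid[0:h, 0:w]
    y0 = np.clip(ys - radius, 0, h); y1 = np.clip(ys + radius + 1, 0, h)
    x0 = np.clip(xs - radius, 0, w); x1 = np.clip(xs + radius + 1, 0, w)
    area = (y1 - y0) * (x1 - x0)           # shrinks near the borders
    local_sum = ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]
    return img > local_sum / area + C
```

Because the threshold follows the local brightness, a slow illumination gradient (sun versus cloud, caustics averaged over time) drops out, and only features brighter than their immediate surroundings survive.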

Greg McKaskle

falconmaster 17-08-2013 22:31

Re: Optical Flow Tracking, Help!
 
You posted

"This should help you quite a bit and quite directly:
http://robots.stanford.edu/cs223b05/...tical_flow.pdf"

Good stuff in here; I will show my programmers. Thanks!

Gdeaver 20-08-2013 09:14

Re: Optical Flow Tracking, Help!
 
I've looked at the team website and the competition website. What exactly are you having a problem with? From the call for mouse odometry, I would infer that you are having problems navigating to the various stations. Is that it? Do you have a well-stabilized platform? Can you navigate a straight line using the PNI compass and gyro board? What sensor fusion algorithm are you using? Can you maintain constant depth with the pressure sensor? With 10 DOF from your sensors, you should have a good AHRS solution, so you are looking to supplement it with an X and Y distance measurement.

Be aware that with only one mouse-odometry input there are multiple solutions: one with no body rotation (a straight line) and one with body rotation, in 3D of course. Hence the need to fuse with the AHRS. The curved bottom of the pool and variations in depth and platform orientation from level will hurt accuracy.

Looking at the diagram of the pool, a first-generation cruise missile comes to mind: dead reckon to a waypoint, detect the waypoint object and its orientation with vision, rotate, then dead reckon the heading to the next waypoint, and so on. So what have you achieved, and what needs improvement?
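[The dead-reckon-to-a-waypoint leg described above amounts to integrating a heading and a speed over time. Below is a deliberately minimal planar sketch, not anyone's actual code; a real system would fuse the AHRS heading and pressure-sensor depth instead of assuming them constant.]

```python
import math

def dead_reckon(x, y, heading_deg, speed, dt, n_steps):
    """Integrate a constant heading (degrees, 0 = +x axis) and a
    constant water speed into an (x, y) position estimate.
    Error grows without bound, which is why each waypoint should be
    re-acquired visually before the next leg."""
    hx = math.cos(math.radians(heading_deg))
    hy = math.sin(math.radians(heading_deg))
    for _ in range(n_steps):
        x += speed * hx * dt
        y += speed * hy * dt
    return x, y
```

In practice heading and speed would be re-sampled every step from the sensors; the loop structure stays the same.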

Jared Russell 20-08-2013 12:23

Re: Optical Flow Tracking, Help!
 
I have had to solve analogous problems many times in my academic and professional lives. The first problem is getting reliable and robust features to track between images (for a sparse optical flow solution). I do not have much experience using RGB cameras under water, but I have looked at the bottom of a pool plenty of times and would expect the dancing lights and shadows to make this a real challenge. This makes getting robust sparse features difficult, and using dense optical flow (where every pixel is a feature) basically impossible.
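[To make concrete what sparse optical flow computes per feature, here is a single-window Lucas-Kanade step in NumPy. This is a textbook sketch, not Jared's code, and it omits the pyramids and iteration a real tracker uses.]

```python
import numpy as np

def lucas_kanade_window(I1, I2):
    """Estimate the (dx, dy) motion of the scene between two grayscale
    windows I1 -> I2 by least-squares on the brightness-constancy
    constraint Ix*dx + Iy*dy + It = 0."""
    Iy, Ix = np.gradient(I1.astype(np.float64))
    It = I2.astype(np.float64) - I1.astype(np.float64)
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    # Textureless or edge-only windows make A^T A near-singular; real
    # trackers check its smaller eigenvalue before trusting the flow.
    d, *_ = np.linalg.lstsq(A, b, rcond=None)
    return d  # (dx, dy)
```

The near-singular case is exactly the "dancing lights and shadows" problem: the equations only constrain motion along directions where the window has stable gradient structure.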

Do you have representative imagery of what the cameras would be capturing? If so I might be able to help identify feature extraction algorithms that would work. Terms to Google would be "SIFT", "SURF", and "Harris corners". OpenCV-based implementations of all of these are easy to find. Optical mice "cheat" and use a light source at an oblique angle to cause minor imperfections in a surface to turn into reliable features. You most likely will not have that option unless you are hugging the pool bottom.
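[To make "Harris corners" concrete, the corner response can be computed directly from the gradient structure tensor. This NumPy sketch is for illustration; in practice one would call OpenCV's `cv2.cornerHarris`.]

```python
import numpy as np

def box_smooth(a, r=2):
    """Sum over a (2r+1)^2 window (edges wrap; fine away from borders)."""
    out = np.zeros_like(a)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out += np.roll(np.roll(a, dy, axis=0), dx, axis=1)
    return out

def harris_response(img, k=0.05):
    """R = det(M) - k*trace(M)^2, where M is the gradient structure
    tensor summed over a local window. R > 0 at corners, R < 0 along
    edges, R ~ 0 in flat regions."""
    Iy, Ix = np.gradient(img.astype(np.float64))
    Sxx = box_smooth(Ix * Ix)
    Syy = box_smooth(Iy * Iy)
    Sxy = box_smooth(Ix * Iy)
    return (Sxx * Syy - Sxy ** 2) - k * (Sxx + Syy) ** 2
```

The sign pattern is why corners make good trackable features: only where the gradient varies in two directions is the motion fully constrained, which connects directly to the aperture problem mentioned above.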

Once you obtain features that you can track between multiple images, estimating the robot's motion is fairly straightforward. There are a few different techniques you can use, but in general the more assumptions you can make (e.g. I am at constant depth with a zero pitch and roll), the fewer degrees of freedom you need to worry about, and therefore the fewer feature correspondences you need to accurately measure your position. But step one is finding the right features.
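[The constant-depth, zero-pitch/roll case mentioned above reduces motion estimation to fitting a 2-D rotation plus translation to the matched points, which has a closed-form least-squares solution (the Kabsch/Procrustes method). A generic sketch, not from the thread:]

```python
import numpy as np

def fit_rigid_2d(p, q):
    """Least-squares rotation R and translation t with q ~= p @ R.T + t,
    given matched point sets p, q of shape (N, 2). Needs at least two
    non-coincident, non-collinear correspondences."""
    pc, qc = p.mean(axis=0), q.mean(axis=0)
    H = (p - pc).T @ (q - qc)               # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = qc - R @ pc
    return R, t
```

With noisy real matches this would be wrapped in RANSAC to reject outlier correspondences, but the core solve is this small.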

As you've already identified, a DVL is by far the best way to do this underwater (currently). Failing that, if there is no water current (or a known water current), you could buy a water speed sensor and combine that with compass/gyros to do dead reckoning.

