#16
Re: Optical Flow Tracking, Help!
I see from the link you are using Guppy FireWire cameras.

Last edited by techhelpbb : 17-08-2013 at 21:25.
#17
Re: Optical Flow Tracking, Help!
Hey, we are ready! Bring on the water!
#18
Re: Optical Flow Tracking, Help!
Yup, Guppy FireWire cameras. We have been using HSL values to do object tracking, and we also use flood fill for the forward cameras when the background is uniform. It works great and auto-calibrates with lighting conditions. If we can somehow do the same thing for the bottom, then the light from a sunny or cloudy day will not matter. Right now you calibrate for one lighting condition, and then a cloud comes and you are blind! Auto fill for the bottom needs to first get rid of all the white patches on the floor of the pool by taking white pixels and making them dark like the dark pixels on the floor, then doing the auto fill. We are working on that too.....
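The "make the white patches dark before flood filling" preprocessing step described above can be sketched with plain NumPy. This is a minimal illustration, not the team's actual code: the function name, the robust-statistics approach (median plus MAD), and the sensitivity knob `z` are all assumptions for the sketch.

```python
import numpy as np

def suppress_bright_patches(gray, z=2.0):
    """Replace unusually bright pixels (e.g. sunlight glare on the pool
    floor) with the median floor value, so a subsequent flood fill sees a
    roughly uniform dark background. `z` is an illustrative sensitivity
    knob; median/MAD are used as robust estimates of the floor brightness."""
    med = np.median(gray)
    mad = np.median(np.abs(gray.astype(np.float64) - med)) + 1e-9  # robust spread
    out = gray.astype(np.float64).copy()
    out[(out - med) / mad > z] = med      # darken bright outliers to floor level
    return out.astype(gray.dtype)

# Toy example: a dark pool floor with one bright glare patch.
floor = np.full((8, 8), 40, dtype=np.uint8)
floor[2:4, 2:4] = 250                     # simulated sunlight patch
cleaned = suppress_bright_patches(floor)  # glare pixels pulled down to 40
```

After this step every pixel sits near the floor brightness, so a flood fill seeded anywhere on the floor should cover it in one pass regardless of glare.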
#19
Re: Optical Flow Tracking, Help!
If you have a variety of images, cloudy and clear, it would be easy to try an adaptive threshold and see whether it is as effective as, or as I suspect more effective than, the flood fill.
Greg McKaskle

Last edited by Greg McKaskle : 17-08-2013 at 22:58. Reason: one more word would be better
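To illustrate the adaptive-threshold suggestion: the idea is to compare each pixel to the mean of its local neighborhood rather than to one global value, so slow illumination changes (cloud vs. sun) cancel out. The sketch below is a pure-NumPy stand-in for what OpenCV's `adaptiveThreshold` with `ADAPTIVE_THRESH_MEAN_C` does; the block size and offset values are arbitrary illustrative choices.

```python
import numpy as np

def adaptive_threshold(gray, block=5, c=5):
    """Mark a pixel as foreground (255) when it exceeds the mean of its
    local block-by-block neighborhood minus an offset `c`. The local
    comparison tolerates gradual lighting changes that defeat a single
    global threshold."""
    pad = block // 2
    padded = np.pad(gray.astype(np.float64), pad, mode="edge")
    h, w = gray.shape
    local_mean = np.zeros((h, w), dtype=np.float64)
    # Sliding-window sum; fine for small images, use a box filter for speed.
    for dy in range(block):
        for dx in range(block):
            local_mean += padded[dy:dy + h, dx:dx + w]
    local_mean /= block * block
    return (gray > local_mean - c).astype(np.uint8) * 255

# A horizontal brightness gradient with a dark target: a global threshold
# struggles, but the local comparison still isolates the dark object.
img = np.tile(np.linspace(60, 200, 16), (16, 1)).astype(np.uint8)
img[6:10, 6:10] = 0                    # dark object
mask = adaptive_threshold(img)         # object -> 0, lit background -> 255
```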
#20
Re: Optical Flow Tracking, Help!
You posted:
"This should help you quite a bit and quite directly: http://robots.stanford.edu/cs223b05/...tical_flow.pdf"

Good stuff in here, will show my programmers! Thanks!
#21
Re: Optical Flow Tracking, Help!
I've looked at the team website and the competition website. What exactly are you having a problem with? From the call for mouse odometry, I would infer that you are having problems navigating to the various stations. Is that it? Do you have a well-stabilized platform? Can you navigate a straight line using the PNI compass and gyro board? What sensor fusion algorithm are you using? Can you maintain constant depth with the pressure sensor? With 10 DOF from your sensors you should have a good AHRS solution.

So you are looking to supplement this with an X and Y distance measurement. Be aware that with only one mouse-odometry input there are multiple solutions: one with no body rotation (a straight line) and one with body rotation. In 3D, of course. Hence the need to fuse with the AHRS. The curved bottom of the pool and variations in depth and platform orientation from level will also hurt accuracy.

Looking at the diagram of the pool, a first-generation cruise missile comes to mind: dead reckon to a waypoint, detect the waypoint object and its orientation with vision, rotate, and dead reckon on a heading to the next waypoint, and so on. So what have you achieved and what needs improvement?
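The dead-reckoning loop described above is simple to state in code. This is a hedged sketch only: the heading convention (0 degrees = north, measured clockwise), the sample format, and the time step are all assumptions, and a real implementation would fuse the AHRS and odometry in a filter rather than integrate raw samples.

```python
import math

def dead_reckon(start, samples, dt=0.1):
    """Integrate (heading_deg, speed_m_s) samples into an (x, y) position.
    Heading would come from the compass/gyro AHRS; speed from mouse
    odometry or a water-speed sensor. x is east, y is north, heading 0 is
    due north and increases clockwise (illustrative convention)."""
    x, y = start
    for heading_deg, speed in samples:
        h = math.radians(heading_deg)
        x += speed * math.sin(h) * dt   # east component
        y += speed * math.cos(h) * dt   # north component
    return x, y

# 10 s at 1 m/s due east (heading 90 deg): expect roughly (10, 0).
track = [(90.0, 1.0)] * 100
pos = dead_reckon((0.0, 0.0), track)
```

Per the ambiguity noted above, a single speed measurement cannot distinguish a straight track from an arc with body rotation; the AHRS heading is what resolves it, which is why it appears as an input here.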
#22
Re: Optical Flow Tracking, Help!
I have had to solve analogous problems many times in my academic and professional lives. The first problem is getting reliable and robust features to track between images (for a sparse optical flow solution). I do not have much experience using RGB cameras under water, but I have looked at the bottom of a pool plenty of times and would expect the dancing lights and shadows to make this a real challenge. This makes getting robust sparse features difficult, and using dense optical flow (where every pixel is a feature) basically impossible.
Do you have representative imagery of what the cameras would be capturing? If so I might be able to help identify feature extraction algorithms that would work. Terms to Google would be "SIFT", "SURF", and "Harris corners". OpenCV-based implementations of all of these are easy to find.

Optical mice "cheat" and use a light source at an oblique angle so that minor imperfections in a surface turn into reliable features. You most likely will not have that option unless you are hugging the pool bottom.

Once you obtain features that you can track between multiple images, estimating the robot's motion is fairly straightforward. There are a few different techniques you can use, but in general the more assumptions you can make (e.g., I am at constant depth with zero pitch and roll), the fewer degrees of freedom you need to worry about, and therefore the fewer feature correspondences you need to accurately measure your position. But step one is finding the right features.

As you've already identified, a DVL is by far the best way to do this underwater (currently). Failing that, if there is no water current (or a known water current), you could buy a water speed sensor and combine that with compass/gyros to do dead reckoning.
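To make the "more assumptions, fewer degrees of freedom" point concrete: under the constant-depth, zero pitch/roll assumption named above, the bottom camera's motion reduces to a 2D rotation plus translation, which a least-squares (Kabsch/Procrustes) fit recovers from a handful of tracked feature correspondences. This is a generic sketch of that standard technique, not anyone's specific implementation.

```python
import numpy as np

def estimate_rigid_2d(p, q):
    """Least-squares 2D rotation R and translation t mapping point set p
    onto q (Kabsch algorithm). p and q are (N, 2) arrays of matched
    feature positions from consecutive frames."""
    pc, qc = p.mean(axis=0), q.mean(axis=0)
    H = (p - pc).T @ (q - qc)                  # 2x2 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = qc - R @ pc
    return R, t

# Synthetic check: rotate tracked features by 10 degrees and shift them.
rng = np.random.default_rng(0)
p = rng.uniform(-1.0, 1.0, size=(8, 2))
theta = np.radians(10.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
q = p @ R_true.T + np.array([0.3, -0.2])
R, t = estimate_rigid_2d(p, q)               # recovers R_true and (0.3, -0.2)
```

With noisy real correspondences you would wrap this in RANSAC to reject bad matches; with pitch/roll or depth changes you would need a homography or full epipolar estimate instead, which demands more correspondences, exactly as the paragraph above argues.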