#26, 06-01-2012, 08:48
Joe Johnson
Unsung FIRST Hero
Engineer at Medrobotics
AKA: Dr. Joe
FRC #0088 (TJ2)
Team Role: Engineer
 
Join Date: May 2001
Rookie Year: 1996
Location: Raynham, MA
Posts: 2,648
Re: Running the Kinect on the Robot.

Quote:
Originally Posted by Jared341
<snip>

As long as Y = "a distinct color not found/illegal on robots", you could probably do this pretty well without even using the Kinect's depth image. (OpenCV has built-in Hough circle routines, for example: http://www.youtube.com/watch?v=IeLeMBU4yJk).

For added robustness, you could use the Kinect depth image simply to help select the range of radii to look for. I think you'd get equivalent performance - and much more efficient computation - using this method than with 3D point cloud fitting.
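(For concreteness, the suggested approach might look roughly like this in OpenCV's Python bindings; the ball diameter, focal length, and Hough parameters below are placeholder assumptions on my part, not values from the quoted post.)

Code:
# Rough sketch of the quoted idea (Python + OpenCV assumed).
# BALL_DIAMETER_M, FOCAL_LENGTH_PX, and the Hough parameters are placeholders.
import cv2
import numpy as np

BALL_DIAMETER_M = 0.20    # assumed game-piece diameter, not an official number
FOCAL_LENGTH_PX = 525.0   # ballpark Kinect RGB focal length in pixels

def find_ball(bgr_frame, depth_m=None):
    """Find circular blobs; optionally bound the radius search with a depth estimate."""
    gray = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)

    min_r, max_r = 10, 120                      # wide-open search by default
    if depth_m is not None and depth_m > 0:
        # Expected pixel radius = (real radius * focal length) / distance
        r_exp = (BALL_DIAMETER_M / 2.0) * FOCAL_LENGTH_PX / depth_m
        min_r, max_r = int(0.8 * r_exp), int(1.2 * r_exp)

    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.5, minDist=50,
                               param1=100, param2=40,
                               minRadius=min_r, maxRadius=max_r)
    return None if circles is None else np.round(circles[0]).astype(int)
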
First, regarding "As long as Y = 'a distinct color not found/illegal on robots'": that is a pretty significant "as long as."

Second, regarding standard image processing: my experience with machine vision is that with controlled lighting, life is good; without it, life can be pretty crummy.

An FRC field is a pretty lousy lighting environment -- it may be bright, it may be dim, there may be spotlights, there may be colored lighting, ...

There were teams in the Georgia Dome whose image processing algorithms ran fine during the day but had fits after dark (and vice versa). Are you willing to live with the possibility that your algorithm runs fine on your division field but goes wacky on Einstein? Maybe, but maybe not...

So... I think the 3D points from the PrimeSense distance data are going to be more robust to ambient lighting conditions.
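A minimal sketch of leaning on the depth image directly might look like the following (assuming a single-channel depth frame in millimeters from the Kinect/PrimeSense drivers; the distance band and blob-size threshold are purely illustrative):

Code:
# Minimal sketch of segmenting by depth instead of color (Python + OpenCV assumed).
# Distance band and blob-size threshold are illustrative, not tuned values.
import cv2
import numpy as np

def find_blobs_by_depth(depth_mm, near_mm=500, far_mm=3000, min_area_px=200):
    """Return contours of blobs that sit inside the given distance band."""
    # Keep only pixels whose depth falls in [near_mm, far_mm]
    band = np.logical_and(depth_mm > near_mm, depth_mm < far_mm)
    mask = band.astype(np.uint8) * 255

    # Knock down speckle noise from the depth sensor
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # OpenCV 4.x return signature assumed here
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) > min_area_px]

Because the segmentation keys off distance rather than color or brightness, the arena lighting largely drops out of the problem.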

Joe J.
__________________
Joseph M. Johnson, Ph.D., P.E.
Mentor
Team #88, TJ2