#1 | 10-01-2012, 15:50
ianonavy
Programming Mentor/Alumnus
AKA: Ian Adam Naval
FRC #3120 (RoboKnights)
Team Role: Mentor
Join Date: Dec 2010 | Rookie Year: 2008 | Location: Sherman Oaks | Posts: 32
Depth Perception

We have decided on a robot design that requires sensing how far the basketball hoops are from our robot, but we are not sure how to go about it. We were thinking about mounting the Kinect on our robot, but that proved to be a lot more complicated than we expected. I've read some posts here about using lasers to measure how far away the hoops are. How would our team go about that? What is the simplest/easiest way to do this?

tl;dr: What's the easiest way to sense how far away the basketball hoops are from the robot?

Thanks for your input, and good luck with your build season!
#2 | 10-01-2012, 15:57
andreboos
Registered User
FRC #3021 (The Agency)
Team Role: Programmer
Join Date: Dec 2009 | Rookie Year: 2010 | Location: San Diego | Posts: 132
Re: Depth Perception

You could use the ultrasonic range sensor in the Kit of Parts.
#3 | 10-01-2012, 16:00
DoctorWhom93
I prefer "Sparky"
AKA: Tristan
FRC #1625 (Winnovation)
Team Role: Programmer
Join Date: Jan 2012 | Rookie Year: 2009 | Location: Winnebago | Posts: 2
Re: Depth Perception

If you open up the Video Processing LabVIEW example, it has a function defined in there called "Distance". You can open that up and see what FIRST put there for computing distance using the camera.
#4 | 10-01-2012, 16:04
DuaneB
Registered User
FRC #1731
Team Role: Mentor
Join Date: Jan 2010 | Rookie Year: 2010 | Location: Virginia | Posts: 20
Re: Depth Perception

Note that the maximum range of the MaxBotix LV-MaxSonar®-EZ1 sonar rangefinder is 254 inches (about 21 ft / 6.5 m).
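If you go the ultrasonic route in C++ or Java, reading the sensor is only a few lines. Here is a minimal Java sketch, assuming the EZ1's analog output is wired to analog channel 1 and that it scales at roughly Vcc/512 volts per inch (per the MaxBotix datasheet; check your wiring and module); the channel number, supply voltage, and class name are illustrative assumptions, not something from this thread.
Code:
import edu.wpi.first.wpilibj.AnalogChannel;

// Minimal sketch: convert the LV-MaxSonar-EZ1's analog output to inches.
// Assumes the sensor is on analog channel 1 and powered from 5 V; both
// are placeholders for illustration.
public class SonarRangeFinder {
    private static final double SUPPLY_VOLTS = 5.0;                     // sensor Vcc (assumption)
    private static final double VOLTS_PER_INCH = SUPPLY_VOLTS / 512.0;  // datasheet scaling

    private final AnalogChannel sonar = new AnalogChannel(1);           // hypothetical channel

    // Returns the measured range in inches (useful roughly 6 to 254 in).
    public double getRangeInches() {
        return sonar.getAverageVoltage() / VOLTS_PER_INCH;
    }
}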
#5 | 10-01-2012, 16:04
plnyyanks
Data wins arguments.
AKA: Phil Lopreiato
FRC #1124 (The ÜberBots), FRC #2900 (The Mighty Penguins)
Team Role: College Student
Join Date: Apr 2010 | Rookie Year: 2010 | Location: NYC/Washington, DC | Posts: 1,114
Re: Depth Perception

You could use the Axis camera instead of the Kinect and process the image to determine the distance. If you have LabVIEW, one of the examples shows how to do this. Otherwise, these threads might be useful, along with this whitepaper.
__________________
Phil Lopreiato - "It's a hardware problem"
Team 1124 (2010 - 2013), Team 1418 (2014), Team 2900 (2016)
FRC Notebook The Blue Alliance for Android
#6 | 10-01-2012, 16:40
ianonavy
Programming Mentor/Alumnus
AKA: Ian Adam Naval
FRC #3120 (RoboKnights)
Team Role: Mentor
Join Date: Dec 2010 | Rookie Year: 2008 | Location: Sherman Oaks | Posts: 32
Re: Depth Perception

Well, we're going to be programming in either C++ or Java, so LabVIEW is out of the question. I don't understand how you could determine depth from a single image. Wouldn't it be more accurate to use two cameras for stereo vision?
#7 | 10-01-2012, 17:12
plnyyanks
Data wins arguments.
AKA: Phil Lopreiato
FRC #1124 (The ÜberBots), FRC #2900 (The Mighty Penguins)
Team Role: College Student
Join Date: Apr 2010 | Rookie Year: 2010 | Location: NYC/Washington, DC | Posts: 1,114
Re: Depth Perception

Attached is a screenshot of the LV example VI. If you don't use LV, then just pay attention to the triangles drawn out on the bottom.

We can calculate the distance from the target using a couple of known values and some trigonometry. We know the camera's resolution, its field of view (the full viewing angle, or 2Θ), the real-world width of the target, and the target's position in the camera image. Here's a comment from the VI that goes over the math:
Quote:
Since we know that the target width is 2', we can use its pixel width to determine the width
of the camera field of view in ft at that working distance from the camera. W is half of that.
Divide by the tangent of theta (half the view angle), to determine d.
So we take the width of the target box in pixels and use it to find the width of the whole image in feet at that working distance; half of that is W = (2 / target pixel width) * (x resolution / 2). Then we divide W by the tangent of 0.5Θ (where Θ is the full view angle from the Axis camera datasheet, about 47° for the M1011 and 54° for the 206) to get the distance in feet.

[Attachment: target_distance.png, a screenshot of the LabVIEW distance example VI]

Also, all of this is explained in NI's whitepaper on the subject.
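For teams skipping LabVIEW, here is a minimal Java sketch of the same triangle math, assuming your vision code already gives you the target's bounding-box width in pixels. The 2 ft target width and 47° view angle are the values mentioned in this thread; the 320-pixel x resolution is an assumption, and the class and method names are purely illustrative.
Code:
// Sketch of the distance calculation described above.
public class TargetDistance {
    private static final double TARGET_WIDTH_FT = 2.0;   // vision target is 2 ft wide
    private static final double IMAGE_WIDTH_PX  = 320.0; // camera x resolution (assumed 320x240)
    private static final double VIEW_ANGLE_DEG  = 47.0;  // full horizontal view angle, Axis M1011

    // Distance in feet, given the target's apparent width in pixels.
    public static double distanceFt(double targetWidthPx) {
        // Half the image width, in feet, at the target's working distance.
        double w = (TARGET_WIDTH_FT / targetWidthPx) * (IMAGE_WIDTH_PX / 2.0);
        // d = W / tan(theta), where theta is half the view angle.
        double theta = Math.toRadians(VIEW_ANGLE_DEG / 2.0);
        return w / Math.tan(theta);
    }
}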
__________________
Phil Lopreiato - "It's a hardware problem"
Team 1124 (2010 - 2013), Team 1418 (2014), Team 2900 (2016)
FRC Notebook The Blue Alliance for Android

Last edited by plnyyanks : 10-01-2012 at 17:15. Reason: added link to whitepaper
#8 | 13-01-2012, 00:48
rich2202
Registered User
FRC #2202 (BEAST Robotics)
Team Role: Mentor
Join Date: Jan 2012 | Rookie Year: 2012 | Location: Wisconsin | Posts: 1,230
Re: Depth Perception

It gets a little more complicated when you are looking at the basket from an angle. You have to examine the shape of the target rectangle to estimate the viewing angle, then use that angle to correct the apparent width of the box before calculating the distance.
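One rough way to do that correction, as a hedged Java sketch: assuming the target rectangle is 24 in wide by 18 in tall (check the game manual) and that viewing it off-axis shrinks its apparent width by roughly the cosine of the viewing angle while leaving its height mostly unchanged, you can back the angle out of the observed aspect ratio. This is a first-order approximation, not an exact perspective model, and the names here are illustrative.
Code:
public class SkewCorrection {
    private static final double IDEAL_ASPECT = 24.0 / 18.0; // width/height seen head-on (assumed dimensions)

    // Estimated horizontal viewing angle in radians, from the observed box.
    public static double viewingAngle(double widthPx, double heightPx) {
        double observedAspect = widthPx / heightPx;
        // cos(angle) is roughly observed/ideal; clamp to avoid blow-ups at extreme angles.
        double c = Math.max(0.1, Math.min(1.0, observedAspect / IDEAL_ASPECT));
        return Math.acos(c);
    }

    // Corrected (head-on) width in pixels, for use in the distance formula above.
    public static double correctedWidthPx(double widthPx, double heightPx) {
        return widthPx / Math.cos(viewingAngle(widthPx, heightPx));
    }
}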
#9 | 14-01-2012, 22:58
windtakers
Registered User
AKA: Blake Dansfield
FRC #3620 (Average Joes)
Team Role: Programmer
Join Date: Jan 2011 | Rookie Year: 2011 | Location: Michigan | Posts: 33
Re: Depth Perception

Quote:
Originally Posted by plnyyanks View Post
Attached is a screenshot of the LV example VI.
Where is this example located in LabVIEW?
#11 | 04-02-2012, 11:23
Tylernol
Registered User
FRC #2993 (MegaBots)
Team Role: Marketing
Join Date: Jan 2012 | Rookie Year: 2009 | Location: Logan, UT | Posts: 13
Re: Depth Perception

We're currently using the Axis camera to find the rectangle of the reflective tape. We'll analyze the rectangle's aspect ratio, then count the pixels to find our distance.

...or at least that's what our programmers told me. I'm not one myself, and most of it is very confusing to me.

Edit: Oh, and we're using Java. I could probably ask our programmers for the code.
#12 | 04-02-2012, 12:15
nssheepster
Da' Rule Man
AKA: Nik Shepherd
FRC #0174 (Arctic Warriors)
Join Date: Nov 2011 | Rookie Year: 2008 | Location: Liverpool, NY | Posts: 107
Re: Depth Perception

Our team wants to do that, but upon reflection and research, we feel that the camera and "laser" sensing methods are not consistently reliable enough. We thought of a filtered gyroscope/accelerometer system, but we don't have the time or manpower to finish it before our first regional, so we are doing without. Reliability is going to be a problem no matter what almost-incomprehensible algorithms you use.
__________________
In theory, this should work.
In practice, not so much.
F.I.R.S.T. = For Inspiration and Recognition of Science and Technology
So really, it's F.I.A.R.O.S.A.T.?
Nah, that doesn't sound as good.
#13 | 04-02-2012, 12:36
Dale
Head Coach & Mentor
AKA: Dale Yocum
FRC #1540 (Flaming Chickens)
Team Role: Coach
Join Date: Feb 2005 | Rookie Year: 2005 | Location: Portland, OR | Posts: 504
Re: Depth Perception

Be sure to test how well the ultrasonic sensors work in a noisy environment. What happens when a robot with a chain-driven wheel turning at 5000 rpm is next to yours, or when your own shooter is running? It may work fine, or it may not! We're going to test that today.
__________________
2016 PNW Championship Chairman's; 2016 Winner Oregon City District, 2015 PNW Championship Chairman's; 2015 PNW District Engineering Inspiration; 2015 PNW District Finalist; 2014 PNW Championship Chairman's; 2014 Championship Innovation in Controls; 2013 Chairman's (Oregon); 2013 Finalist (OKC); 2012 Winner (OKC); 2012 Chairman's (OKC); 2012 Woody Flowers (Oregon); 2011 Volunteer of the Year (Oregon); 2011 Finalist & Captain (San Diego); 2011 Innovation in Control (San Diego); 2010 & 2007 Chairman's (Oregon); 2010 Regional Champions (Colorado); 2010 Innovation in Control (Colorado); 2009 & 2008 Engineering Inspiration (Oregon); 2008 Regional Champions (Oregon); 2007 Regional Finalist (Oregon); 2005 Rookie Inspiration (PNW)
#14 | 04-02-2012, 19:04
DonRotolo
Back to humble
FRC #0832
Team Role: Mentor
Join Date: Jan 2005 | Rookie Year: 2005 | Location: Atlanta, GA | Posts: 7,011
Re: Depth Perception

Quote:
Originally Posted by Dale View Post
Be sure to test how well the ultrasonic sensors work in a noisy environment.
Excellent idea. Other robots may have similar sensors pinging as well.

In any case, it is possible to reduce much of the potential/actual interference using standard audio techniques.
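The "standard audio techniques" above are presumably analog/hardware filtering, but a software-side complement that helps with occasional bad pings (from other robots or your own shooter) is a simple median filter over the last few readings. A minimal Java sketch follows; the window size and usage are assumptions for illustration, not something from this thread.
Code:
import java.util.Arrays;

// Median filter that rejects occasional spurious ultrasonic readings.
public class MedianRangeFilter {
    private final double[] window;
    private int index = 0;
    private int count = 0;

    public MedianRangeFilter(int size) {
        window = new double[size];
    }

    // Add a raw reading and return the median of the recent window.
    public double filter(double rangeInches) {
        window[index] = rangeInches;
        index = (index + 1) % window.length;
        count = Math.min(count + 1, window.length);

        double[] sorted = Arrays.copyOf(window, count);
        Arrays.sort(sorted);
        return sorted[count / 2];
    }
}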
__________________

I am N2IRZ - What's your callsign?
#15 | 09-02-2012, 19:49
Wolfgang
Registered User
AKA: Mehmed
FRC #1245 (Shazbots)
Team Role: Programmer
Join Date: Oct 2009 | Rookie Year: 2010 | Location: Denver | Posts: 47
Re: Depth Perception

The ultrasonic sensor included in the Kit of Parts has proven to be harder than expected to use effectively, at least ours has. It appears to return noisy and distorted data beyond about 8 ft. This happens even when it is quiet around the sensor, so we believe it would be tough to use unless you were shooting from very close range.