Go Back   Chief Delphi > Technical > Programming > NI LabVIEW
#1
01-02-2016, 20:38
bacarpenter
Lead programmer (2 years)
AKA: Baylee Carpenter
FRC #2197 (Las Pumas)
Team Role: Programmer
Join Date: Mar 2015
Rookie Year: 2014
Location: Indiana
Posts: 19
Tracking low goal in autonomous

I'm the head programmer for team 2197, and my team has given me the task of getting over a defense, moving the robot in front of the low goal, and putting the ball in the goal in auto. I was thinking about using a camera, but I'm not sure how I'd implement that or if there's a better way. I also have access to a gyro and an ultrasonic sensor, so I can use those in my code too. If anybody has any suggestions on how I should write this code, it'd help a lot.
#2
01-02-2016, 20:47
rich2202
Registered User
FRC #2202 (BEAST Robotics)
Team Role: Mentor
Join Date: Jan 2012
Rookie Year: 2012
Location: Wisconsin
Posts: 1,137
Re: Tracking low goal in autonomous

Use the gyro to keep the robot pointed in the right direction while moving.
Use encoders on the wheel motors to estimate distance traveled.
Use the gyro to point toward the castle wall.
Use the sonar to get to the wall.
Use the gyro to turn toward the low goal.
Use the sonar to get to the low goal.

At various points, you can use the camera to fine tune your position and direction.
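The steps above are language-independent even though this is the LabVIEW forum. As a rough sketch of the same primitives in Python text code (every constant, gain, sensor reading, and function name here is a hypothetical stand-in, not a real WPILib API):

```python
import math

def gyro_hold_correction(heading_deg, target_deg=0.0, kp=0.03):
    """P-only steering correction: keeps the robot pointed at target_deg
    while driving (the 'use the gyro to stay straight' step)."""
    error = target_deg - heading_deg
    return max(-1.0, min(1.0, kp * error))

def encoder_distance_m(ticks, ticks_per_rev=360, wheel_diameter_m=0.1524):
    """Convert encoder ticks to distance traveled, for the 'estimate
    distance with wheel encoders' step. Assumes a 6 in (0.1524 m) wheel."""
    return ticks / ticks_per_rev * math.pi * wheel_diameter_m

def sonar_approach_speed(range_m, stop_m=0.30, kp=0.8, max_speed=0.5):
    """Forward speed that slows as the sonar range closes, for the
    'use sonar to get to the wall / goal' steps."""
    error = range_m - stop_m
    return 0.0 if error <= 0 else min(max_speed, kp * error)
```

In LabVIEW the same logic would be a few arithmetic nodes feeding the drive VI; the point is that each step is a simple proportional correction on one sensor.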
#3
01-02-2016, 21:02
bacarpenter
Lead programmer (2 years)
AKA: Baylee Carpenter
FRC #2197 (Las Pumas)
Team Role: Programmer
Join Date: Mar 2015
Rookie Year: 2014
Location: Indiana
Posts: 19
Re: Tracking low goal in autonomous

Quote:
Originally Posted by rich2202 View Post
At various points, you can use the camera to fine tune your position and direction.
How exactly would I use the camera to fine-tune my position? I've never used vision processing before, so I have no idea where to start.
#4
01-02-2016, 21:46
rich2202
Registered User
FRC #2202 (BEAST Robotics)
Team Role: Mentor
Join Date: Jan 2012
Rookie Year: 2012
Location: Wisconsin
Posts: 1,137
Re: Tracking low goal in autonomous

There is a tutorial on using vision to identify the target. Sorry I can't look it up for you; I'm on a limited tablet. Try searching for "FRC vision".

Once you have identified the target, you can use its relative size to estimate distance, and the position of the target in the frame to estimate how much you have to turn to line up with the target.
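To make that concrete, a simple pinhole-camera model turns the target's apparent width into a range estimate and its horizontal offset into a turn angle. The image width, field of view, and target width below are illustrative assumptions, not values from the post:

```python
import math

# Assumed camera/target parameters (not from the post):
IMAGE_WIDTH_PX = 640     # horizontal resolution
HFOV_DEG = 60.0          # horizontal field of view
TARGET_WIDTH_M = 0.51    # real-world target width

def _focal_px():
    """Pinhole focal length in pixels: half the image spans half the FOV."""
    return (IMAGE_WIDTH_PX / 2) / math.tan(math.radians(HFOV_DEG / 2))

def distance_from_width(target_width_px):
    """Estimate range from apparent width: a closer target looks wider,
    so distance is inversely proportional to measured pixel width."""
    return TARGET_WIDTH_M * _focal_px() / target_width_px

def turn_from_center(target_center_x_px):
    """Estimate degrees to turn from the target's horizontal offset
    relative to the image center."""
    offset_px = target_center_x_px - IMAGE_WIDTH_PX / 2
    return math.degrees(math.atan2(offset_px, _focal_px()))
```

A target centered in the frame needs no turn; one at the image edge needs a turn of half the field of view.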
#5
02-02-2016, 06:52
rich2202
Registered User
FRC #2202 (BEAST Robotics)
Team Role: Mentor
Join Date: Jan 2012
Rookie Year: 2012
Location: Wisconsin
Posts: 1,137
Re: Tracking low goal in autonomous

Tutorial on vision processing
https://wpilib.screenstepslive.com/s/4485/m/24194
#6
02-02-2016, 07:00
Greg McKaskle
Registered User
FRC #2468 (Team NI & Appreciate)
Join Date: Apr 2008
Rookie Year: 2008
Location: Austin, TX
Posts: 4,748
Re: Tracking low goal in autonomous

There is a vision example that implements the content in the white paper. It has code that will perform the vision processing on the DS laptop, which can be integrated into your dashboard, and it has code that will perform the processing on the robot. I think tutorial 8 steps you through it. Tutorials are found on the Getting Started window.

Greg McKaskle
#7
02-02-2016, 19:40
bacarpenter
Lead programmer (2 years)
AKA: Baylee Carpenter
FRC #2197 (Las Pumas)
Team Role: Programmer
Join Date: Mar 2015
Rookie Year: 2014
Location: Indiana
Posts: 19
Re: Tracking low goal in autonomous

I've looked through those tutorials, and I understand for the most part how to process the image and make it detect the objects/colors I want. The problem I'm having is using that information to move the robot. How would I do that?
#8
02-02-2016, 21:31
Greg McKaskle
Registered User
FRC #2468 (Team NI & Appreciate)
Join Date: Apr 2008
Rookie Year: 2008
Location: Austin, TX
Posts: 4,748
Re: Tracking low goal in autonomous

One of the last steps is to put the target into a coordinate system that goes from -1 to 1, much the way that the joystick does. So the math needed to turn the target results into an input to RobotDrive is relatively simple. It may need to be reversed or scaled, but that is why it is in that mapping.
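A sketch of that mapping: the gain and invert flag below are the "scaled or reversed" adjustments mentioned above, and RobotDrive itself is not called here, just a plain function producing a value in its -1 to 1 range:

```python
def turn_command(target_x_normalized, gain=0.5, invert=False):
    """Map a target x already normalized to [-1, 1] (joystick-style)
    into a RobotDrive-style turn input. gain (assumed value) scales it
    down so the robot doesn't overshoot; invert flips the sign if the
    drivetrain turns the wrong way."""
    cmd = gain * target_x_normalized
    if invert:
        cmd = -cmd
    return max(-1.0, min(1.0, cmd))
```

The result can be wired straight into the turn axis of an arcade-drive call, exactly where a joystick value would normally go.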

Greg McKaskle
#9
02-02-2016, 21:38
Ben Wolsieffer
Dartmouth 2020
AKA: lopsided98
FRC #2084 (Robots by the C)
Team Role: Alumni
Join Date: Jan 2011
Rookie Year: 2011
Location: Manchester, MA (Hanover, NH)
Posts: 520
Re: Tracking low goal in autonomous

Quote:
Originally Posted by Greg McKaskle View Post
One of the last steps is to put the target into a coordinate system that goes from -1 to 1, much the way that the joystick does. So the math needed to turn the target results into an input to RobotDrive is relatively simple. It may need to be reversed or scaled, but that is why it is in that mapping.

Greg McKaskle
I thought this was not an effective method to do tracking, because the framerate of the vision algorithm is usually too slow to be a direct input to a control loop.
__________________



2016 North Shore District - Semifinalists and Excellence in Engineering Award
2015 Northeastern University District - Semifinalists and Creativity Award
2014 Granite State District - Semifinalists and Innovation in Control Award
2012 Boston Regional - Finalists
#10
03-02-2016, 00:56
GeeTwo
Technical Director
AKA: Gus Michel II
FRC #3946 (Tiger Robotics)
Team Role: Mentor
Join Date: Jan 2014
Rookie Year: 2013
Location: Slidell, LA
Posts: 3,558
Re: Tracking low goal in autonomous

One of the big problems with visually tracking the low goals is that there does not seem to be a clear description of the details of what is behind the low goal. While it may sound rather roundabout at first, you may want to consider having a camera near the back of your robot pointed nearly vertically but forward with an LED ring to pick up on the high goal reflective tape.

Another tactic for lining up on the low goal may be to put an angled plate outside the frame perimeter, down low, in such a location that the castle wall and the partition help drive you into the goal. You'd need one for each side, as far as I can tell from a casual look.

Caveat: my team is not planning to work the low goal, so neither of these ideas has been tested at all.
__________________

If you can't find time to do it right, how are you going to find time to do it over?
If you don't pass it on, it never happened.
Robots are great, but inspiration is the reason we're here.
Friends don't let friends use master links.
#11
03-02-2016, 04:43
rich2202
Registered User
FRC #2202 (BEAST Robotics)
Team Role: Mentor
Join Date: Jan 2012
Rookie Year: 2012
Location: Wisconsin
Posts: 1,137
Re: Tracking low goal in autonomous

Quote:
Originally Posted by lopsided98 View Post
I thought this was not an effective method to do tracking, because the framerate of the vision algorithm is usually too slow to be a direct input to a control loop.
You can use the picture to calculate the angle and distance. You then need to tell the drive system how to accomplish that move. Once you have done that, you take another picture to confirm and get a new angle/distance. Rinse and repeat.
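That rinse-and-repeat pattern sidesteps the framerate problem: vision only supplies a setpoint between moves, never a direct feedback signal. A sketch, where the snapshot/turn/drive callbacks are hypothetical stand-ins for the robot's real camera and drive code:

```python
def vision_correction_loop(take_snapshot, turn_by, drive_by,
                           angle_tol_deg=2.0, dist_tol_m=0.1, max_iters=5):
    """Snapshot, compute the remaining angle/distance, command the
    drivetrain, then take a new picture to confirm. turn_by/drive_by are
    assumed to be blocking moves closed on gyro/encoders, not vision.
    Returns True once lined up within tolerance."""
    for _ in range(max_iters):
        angle_deg, dist_m = take_snapshot()
        if abs(angle_deg) <= angle_tol_deg and dist_m <= dist_tol_m:
            return True          # lined up and close enough
        turn_by(angle_deg)       # correct heading from this one frame
        drive_by(dist_m - dist_tol_m)  # close most of the distance
    return False
```

Each iteration only needs one frame, so even a slow camera converges in a handful of pictures.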
#12
03-02-2016, 07:11
Greg McKaskle
Registered User
FRC #2468 (Team NI & Appreciate)
Join Date: Apr 2008
Rookie Year: 2008
Location: Austin, TX
Posts: 4,748
Re: Tracking low goal in autonomous

Using the camera to close the loop is not as predictable as calculating an amount to turn and using a gyro to close the loop. But if your framerate is reasonable and your robot is moving or otherwise can turn pretty well, this also works. In 2008, we programmed and demo'd the Toro robot over and over at champs. It stayed a fixed distance from and followed a colored piece of paper or a T-shirt. And yes, it was running on the 8-slot cRIO using an Axis camera. No coprocessor required.

Greg McKaskle
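A sketch of the more predictable pattern Greg contrasts with camera-in-the-loop tracking: read the target bearing from one camera frame, then let the fast gyro close the turn loop. The callbacks, gain, and tolerances are hypothetical:

```python
def gyro_turn_to_target(camera_angle_deg, read_gyro, set_turn_rate,
                        kp=0.02, tol_deg=1.0, max_steps=200):
    """Take the bearing to the target from ONE vision frame, convert it
    to an absolute gyro heading, then run a fast P loop on the gyro
    alone. read_gyro/set_turn_rate are stand-ins for real robot code."""
    target_heading = read_gyro() + camera_angle_deg
    for _ in range(max_steps):
        error = target_heading - read_gyro()
        if abs(error) <= tol_deg:
            set_turn_rate(0.0)   # done: stop turning
            return True
        set_turn_rate(max(-1.0, min(1.0, kp * error)))
    return False
```

Because the loop runs on the gyro, it can execute at the robot loop rate regardless of how slowly the vision pipeline delivers frames.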
The Chief Delphi Forums are sponsored by Innovation First International, Inc.

