#1   08-04-2010, 23:07
JBotAlan (Jacob Rau), Mentor, FRC #5263
Demystifying autonomous...

I just had a bit of a brain blast. It probably won't be so great grammatically or organizationally because I'm not feeling so hot right now, but I wanted to write this down.

It is so difficult to explain to people uninitiated in programming why I can't "just make it go straight for 3 feet, then turn left." What they don't seem to get is how the robot actually sees the world.

So...

How about having a session before the build season in which we explore what life is like for a robot? Take each of the kids, put a blindfold on, and have them navigate through an obstacle course. Ask them what they did to keep on course, and then draw the parallel to different sensors on the robot.

Then you could proceed to write up steps of how to get through said course, after outlining which sensors would be necessary.

Maybe even give a good heads-up display of sensor values (SANS CAMERA) from the robot sitting around the corner. Give the controls over to a student and watch them figure out the shape of the course based on a few microswitches, an ultrasonic, and a gyro.

It would require a bit of elbow grease on the programmer's part, but I think the effort would pay off.

Input? Anyone done something like this before?
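
For the heads-up-display part, here's a minimal sketch of what the robot-side half could look like. This is modern WPILib Java rather than the 2010 LabVIEW framework, and the channel numbers, dashboard keys, and ultrasonic scale factor are all made up for illustration; the camera is deliberately left out, as above.

Code:
import edu.wpi.first.wpilibj.AnalogGyro;
import edu.wpi.first.wpilibj.AnalogInput;
import edu.wpi.first.wpilibj.DigitalInput;
import edu.wpi.first.wpilibj.TimedRobot;
import edu.wpi.first.wpilibj.smartdashboard.SmartDashboard;

public class BlindDriveRobot extends TimedRobot {
    // Hypothetical wiring: adjust the channels to match the actual robot.
    private final AnalogGyro gyro = new AnalogGyro(0);            // heading
    private final AnalogInput rangefinder = new AnalogInput(1);   // analog ultrasonic
    private final DigitalInput leftBump = new DigitalInput(0);    // microswitches
    private final DigitalInput rightBump = new DigitalInput(1);

    // Placeholder scale factor; depends entirely on which ultrasonic is used.
    private static final double INCHES_PER_VOLT = 100.0;

    @Override
    public void robotPeriodic() {
        // Publish only the raw numbers the "blindfolded" driver is allowed to see.
        SmartDashboard.putNumber("Heading (deg)", gyro.getAngle());
        SmartDashboard.putNumber("Range (in)", rangefinder.getVoltage() * INCHES_PER_VOLT);
        SmartDashboard.putBoolean("Left bumper hit", !leftBump.get());   // assumes normally-open switches
        SmartDashboard.putBoolean("Right bumper hit", !rightBump.get());
    }
}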
__________________
Aren't signatures a bit outdated?

Last edited by JBotAlan : 08-04-2010 at 23:10. Reason: fixed typo
#2   08-04-2010, 23:20
Andrew Schreiber, Data Nerd, FRC #0079
Re: Demystifying autonomous...

Actually, sorta. I was working at an FLL camp where we were teaching programming concepts. We blindfolded one student and had another give them instructions. It really made them realize how blind their robots are.
#3   08-04-2010, 23:20
Mark McLeod, Engineer, FRC #0358 (Robotic Eagles)
Re: Demystifying autonomous...

I use pin-the-tail-on-the-donkey as a practical example.
Blindfolded, spun around, don't touch anything - now find the target in one shot.
__________________
"Rationality is our distinguishing characteristic - it's what sets us apart from the beasts." - Aristotle
#4   09-04-2010, 09:55
gvarndell, Parent, FRC #1629 (GaCo)
Re: Demystifying autonomous...

Quote:
Originally Posted by JBotAlan View Post
How about having a session before the build season in which we explore what life is like for a robot? Take each of the kids, put a blindfold on, and have them navigate through an obstacle course. Ask them what they did to keep on course, and then draw the parallel to different sensors on the robot.
It's analytical, I love it.
__________________
Robots never, ever, ever, ever break -- The Robot Repairman (Backyardigans)
#5   09-04-2010, 10:29
Dkt01 (David), Programming Mentor, FRC #1756 (Argos)
Re: Demystifying autonomous...

This would be hilarious to watch, but it would also be a pretty good way to show what autonomous really is. The closest thing we did was at the start of the season: we set up a field and walked around it as robots, blindfolded to simulate autonomous. Needless to say, everyone struggled with the "autonomous period," but your idea sounds like one of the best ways to simulate what the robot is thinking.
#6   09-04-2010, 12:55
kamocat (Marshal Horn), Mentor, FRC #3213 (Thunder Tech)
Re: Demystifying autonomous...

What I did to demonstrate the (un)usefulness of the camera was to have them close one eye and cup their hands into a tube around the other eye, to give themselves that 30-degree view angle (+/-15 degrees).
What I didn't have them do is chop it down to 10 frames per second (blinking continuously might work).

Anyway, it was effective at eliminating the wish to get the camera image sent back to the dashboard, where it would only be updated once a second.
__________________
-- Marshal Horn
#7   09-04-2010, 13:15
Ether, systems engineer (retired), no team
Re: Demystifying autonomous...

Quote:
Originally Posted by kamocat View Post
Anyways, it was effective at eliminating the wish to get [the camera video] sent back to the dashboard, where it would only be updated once a second.
I know that the other dashboard data gets updated only once per second, but is this true for the camera image also? I thought the camera image was not part of that 50-element array.


~
#8   09-04-2010, 13:21
mcb (Katie), College Student, FRC #1732 (Hilltoppers)
Re: Demystifying autonomous...

My team did this activity in the fall with the FLL team that we mentored. Not only did it help to explain programming but it also served as a team-building exercise. One kid, the "robot," put the blindfold on and another would direct them where they needed to go, acting as the code. It worked really well!
#9   09-04-2010, 13:51
kamocat (Marshal Horn), Mentor, FRC #3213 (Thunder Tech)
Re: Demystifying autonomous...

Quote:
Originally Posted by Ether View Post
I know that the other dashboard data gets updated only once per second, but is this true for the camera image also? I thought the camera image was not part of that 50-element array.


~
The dashboard uses "Get camera image on PC".
I can't find anywhere in the code that *says* it takes 1000 ms, but I tested it, and I think that's about how long it turned out to be. This may be on purpose, seeing as they were worried last year about the bandwidth it would incur.
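
If anyone wants to measure that for themselves, one way is to time how often new frames actually arrive. This is just a sketch of the measurement idea using the modern WPILib/OpenCV camera classes, not the 2010 "Get camera image on PC" VI, and it times frames on the robot rather than on the dashboard PC.

Code:
import edu.wpi.first.cameraserver.CameraServer;
import edu.wpi.first.cscore.CvSink;
import org.opencv.core.Mat;

public class FrameRateProbe {
    public static void run() {
        CameraServer.startAutomaticCapture();      // start streaming the USB camera
        CvSink sink = CameraServer.getVideo();     // pull frames from that stream
        Mat frame = new Mat();
        long last = System.currentTimeMillis();

        for (int i = 0; i < 100; i++) {
            if (sink.grabFrame(frame) != 0) {      // grabFrame returns 0 on timeout/error
                long now = System.currentTimeMillis();
                System.out.println("Frame interval: " + (now - last) + " ms");
                last = now;
            }
        }
    }
}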
__________________
-- Marshal Horn
#10   09-04-2010, 14:03
AdamHeard, Lead Mentor, FRC #0973 (Greybots)
Re: Demystifying autonomous...

Quote:
Originally Posted by kamocat View Post
The dashboard uses "Get camera image on PC".
I can't find anywhere in the code that it *says* it takes 1000ms to happen, but I tested it, and I think that's how long it turned out to be. This may be on purpose, seeing as they were worried last year about the bandwidth it would incur.
I know teams were able to get the camera image to be nearly real time.
#11   09-04-2010, 14:17
Ether, systems engineer (retired), no team
Re: Demystifying autonomous...

Quote:
Originally Posted by kamocat View Post
The dashboard uses "Get camera image on PC".
I can't find anywhere in the code that it *says* it takes 1000ms to happen, but I tested it, and I think that's how long it turned out to be. This may be on purpose, seeing as they were worried last year about the bandwidth it would incur.
I don't know for sure, but a few weeks back, when a LabVIEW programmer was showing me the cRIO code that packages the dashboard data, it looked to me like the camera image was not being packed into the 50-element array but rather was being updated separately at a higher rate. Second-hand info: I asked a team member who had seen the dashboard camera video, and he said it looked to be updating a lot faster than once per second. This was with the unmodified (default) 2010 FRC LabVIEW framework code.

~
#12   09-04-2010, 18:30
Joe Ross, Engineer, FRC #0330 (Beachbots)
Re: Demystifying autonomous...

Quote:
Originally Posted by Ether View Post
I don't know for sure, but a few weeks back when a LabVIEW programmer was showing me the cRIO code that packages the Dashboard data it looked to me like the camera image was not being packed into the 50-element array but rather was being updated separately at a higher rate. Second hand info: I asked a team member who had seen the dashboard camera video and he said it looked to be updating a lot faster than once per second. This was with unmodified (default) 2010 FRC LabVIEW framework code.

~
You are correct that the camera image is handled completely separately from the dashboard data.

The default LabVIEW framework code updates the high-priority dashboard data in Robot Main (50 Hz). The low-priority dashboard data is updated at 2 Hz (IIRC). For an example of the high-priority data updating fast, look at the camera tracking information in the lower right corner of the dashboard.
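
For anyone curious what that two-rate split looks like outside of LabVIEW, here is a rough text-language equivalent of the pattern, sketched in modern WPILib Java (not the actual framework code): publish the "fast" values every loop and the "slow" values every 25th loop.

Code:
import edu.wpi.first.wpilibj.RobotController;
import edu.wpi.first.wpilibj.TimedRobot;
import edu.wpi.first.wpilibj.smartdashboard.SmartDashboard;

public class TwoRateDashboard extends TimedRobot {
    private int loopCount = 0;

    @Override
    public void robotPeriodic() {                  // TimedRobot calls this at 50 Hz by default
        // High-priority data: sent every 20 ms loop, like the camera tracking info.
        SmartDashboard.putNumber("fast/heading", readHeading());

        // Low-priority data: sent every 25th loop, i.e. about 2 Hz.
        if (loopCount % 25 == 0) {
            SmartDashboard.putNumber("slow/batteryVoltage", RobotController.getBatteryVoltage());
        }
        loopCount++;
    }

    // Stand-in for a real sensor read; hypothetical helper for this sketch only.
    private double readHeading() {
        return 0.0;
    }
}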

Last edited by Joe Ross : 09-04-2010 at 19:05.
#13   09-04-2010, 18:40
ideasrule, Programmer, FRC #0610 (Coyotes)
Re: Demystifying autonomous...

Team 610 is another team that, after a lot of effort, got a real-time (as far as our eyes could tell) feed at 320x240 resolution. The drivers depended on it heavily at the beginning, not so much after they got more experience in aligning the robot with the goal.
#14   09-04-2010, 19:01
apalrd (Andrew Palardy), College Student, VRC #3333
Re: Demystifying autonomous...

We did some camera-driving on our practice bot. We found that, while the image gets a good framerate, it is not close to real time. It appeared to be around 10 Hz video, but it was around 1 s behind reality, which fooled our driver into thinking it was actually updating at the speed it appeared to be (btw, I noticed this same thing on the Dashboard I wrote - the data, not the camera - no graphs). After the driver (and I, since I was the only other one there and was having fun) got used to the delay, we were able to drive the practice bot through a door (without bumpers), even with the camera mounted far to the side of the bot and partially obstructed by the claw. We also noticed a lot of lag with the controls when using vision processing (find ellipse), but with just the camera feed going to the dashboard it was fine. Using only the camera, we were able to keep the robot aligned with a line on the floor (the edge of the road inside CTC) at full speed in high gear (around 12 feet per second), then shift to low gear and navigate a narrower hallway into a room, starting from a different room. It works quite well.
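
The steering logic for that kind of line alignment can be tiny; the hard part is the latency. Here's a rough sketch of the idea in modern WPILib Java rather than the 2010 code; getLineOffsetPixels(), the motor channels, and the gain are all hypothetical stand-ins for whatever the real vision pipeline and drivetrain provide.

Code:
import edu.wpi.first.wpilibj.TimedRobot;
import edu.wpi.first.wpilibj.drive.DifferentialDrive;
import edu.wpi.first.wpilibj.motorcontrol.PWMSparkMax;

public class CameraLineFollow extends TimedRobot {
    private final DifferentialDrive drive =
            new DifferentialDrive(new PWMSparkMax(0), new PWMSparkMax(1));

    // Steering gain; with ~1 s of camera lag this has to stay small or the robot oscillates.
    private static final double kP = 0.005;

    @Override
    public void teleopPeriodic() {
        double offsetPixels = getLineOffsetPixels();   // + means the line is right of image center
        drive.arcadeDrive(0.5, kP * offsetPixels);     // gentle forward speed, proportional steering
    }

    // Hypothetical vision result; replace with the real pipeline's output.
    private double getLineOffsetPixels() {
        return 0.0;
    }
}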

As to the original intent of this thread: I once taught a programming class at an FLL camp, and we played a game where two students sat back-to-back with identical bags of LEGO pieces, and one student built something and verbally described to the other how to build it. This taught them how important good instructions are for good execution.
__________________
Kettering University - Computer Engineering
Kettering Motorsports
Williams International - Commercial Engines - Controls and Accessories
FRC 33 - The Killer Bees - 2009-2012 Student, 2013-2014 Advisor
VEX IQ 3333 - The Bumble Bees - 2014+ Mentor

"Sometimes, the elegant implementation is a function. Not a method. Not a class. Not a framework. Just a function." ~ John Carmack
#15   09-04-2010, 18:50
Radical Pi (Ian Thompson), Programmer, FRC #0639 (Code Red Robotics)
Re: Demystifying autonomous...

Quote:
Originally Posted by Joe Ross View Post
You are correct that the camera is handled completely separately from the dashboard data.

The default LabVIEW framework code updates the high priority dashboard data in Robot Main (50hz). The Low Priority dashboard data is updated at 2hz (IIRC). For an example of the high priority data updating fast, look at the camera tracking information in the lower right corner of the dashboard.
Actually, that is only the tracking data for the camera. The actual images are sent independently, in a separate stream (UDP, I believe) that is started when the camera is first called in code. The dashboard packers have nothing to do with this stream.
__________________

"To have no errors would be life without meaning. No strugle, no joy"
"A network is only as strong as it's weakest linksys"