#16 | 09-04-2010, 18:30
Joe Ross, FRC #0330 (Beachbots), Engineer
Re: Demystifying autonomous...

Quote:
Originally Posted by Ether
I don't know for sure, but a few weeks back when a LabVIEW programmer was showing me the cRIO code that packages the Dashboard data it looked to me like the camera image was not being packed into the 50-element array but rather was being updated separately at a higher rate. Second hand info: I asked a team member who had seen the dashboard camera video and he said it looked to be updating a lot faster than once per second. This was with unmodified (default) 2010 FRC LabVIEW framework code.

~
You are correct that the camera image is handled completely separately from the dashboard data.

The default LabVIEW framework code updates the high priority dashboard data in Robot Main (50 Hz). The low priority dashboard data is updated at 2 Hz (IIRC). For an example of the high priority data updating quickly, look at the camera tracking information in the lower right corner of the dashboard.
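
(For anyone who wants to picture those two rates outside of LabVIEW, here is a minimal sketch of the idea in Python. It is not the actual framework code, and the pack_* names are made up.)

Code:
import time

def pack_high_priority():
    # placeholder for the fast data, e.g. the camera tracking info
    return b"HI"

def pack_low_priority():
    # placeholder for the slow data, e.g. battery voltage
    return b"LO"

def dashboard_loop(send, duration_s=2.0):
    next_low = time.monotonic()
    start = time.monotonic()
    while time.monotonic() - start < duration_s:
        send(pack_high_priority())        # every 20 ms -> 50 Hz
        if time.monotonic() >= next_low:  # every 500 ms -> 2 Hz
            send(pack_low_priority())
            next_low += 0.5
        time.sleep(0.02)

if __name__ == "__main__":
    dashboard_loop(lambda pkt: None)      # swap the lambda for a real send to try it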

Last edited by Joe Ross : 09-04-2010 at 19:05.
#17 | 09-04-2010, 18:40
ideasrule, FRC #0610 (Coyotes), Programmer
Re: Demystifying autonomous...

Team 610 is another team that, after a lot of effort, got a real-time (as far as our eyes could tell) feed at 320x240 resolution. The drivers depended on it heavily at the beginning, but less so after they got more experience aligning the robot with the goal.
#18 | 09-04-2010, 18:50
Radical Pi (Ian Thompson), FRC #0639 (Code Red Robotics), Programmer
Re: Demystifying autonomous...

Quote:
Originally Posted by Joe Ross
You are correct that the camera image is handled completely separately from the dashboard data.

The default LabVIEW framework code updates the high priority dashboard data in Robot Main (50 Hz). The low priority dashboard data is updated at 2 Hz (IIRC). For an example of the high priority data updating quickly, look at the camera tracking information in the lower right corner of the dashboard.
Actually, that is only the tracking data for the camera. The actual images are sent independently in a separate stream (UDP, I believe) that is started when the camera is first used in code. The dashboard packers have nothing to do with this stream.
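
(As a generic illustration of images traveling on their own connection, separate from the dashboard packets, here is a rough Python sketch of a client pulling length-prefixed JPEG frames over TCP. This is not the actual FRC protocol; the framing and the port number are invented for the example.)

Code:
import socket
import struct

def recv_exact(sock, n):
    # read exactly n bytes or raise if the stream closes early
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("image stream closed")
        buf += chunk
    return buf

def read_frames(host, port=1180):            # port number is a placeholder
    with socket.create_connection((host, port)) as sock:
        while True:
            (size,) = struct.unpack(">I", recv_exact(sock, 4))
            yield recv_exact(sock, size)     # one JPEG frame, ready for a decoder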
__________________

"To have no errors would be life without meaning. No strugle, no joy"
"A network is only as strong as it's weakest linksys"
#19 | 09-04-2010, 19:01
apalrd (Andrew Palardy), VRC #3333, College Student
Re: Demystifying autonomous...

We did some camera-driving on our practice bot. We found that, while the image gets a good frame rate, it is not close to real time. It appeared to be around 10 Hz video, but it was about 1 s behind reality. This fooled our driver into thinking it was responding as quickly as it appeared to, which it was not (by the way, I noticed the same thing on the dashboard I wrote: the data, not the camera, and no graphs). After the driver (and I, since I was the only other one there and was having fun) got used to the delay, we were able to drive the practice bot through a door (without bumpers), even with the camera mounted far to the side of the bot and partially obstructed by the claw.

We also noticed a lot of lag with the controls when using vision processing (find ellipse), but with just the camera streaming to the dashboard it was fine. We were able to keep the robot aligned with a line on the floor (the edge of the road inside CTC) at full speed in high gear (around 12 ft/s) using only the camera, then shift to low gear and navigate a narrower hallway into a room, starting from a different room. It works quite well.
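
(A quick way to see why a good frame rate can still hide a big delay: the rate comes from the spacing between frames, the lag from capture-to-display time, and the two are independent. A small sketch in Python, assuming each frame carries a capture timestamp taken from the same clock the viewer uses, which is only true with synchronized clocks.)

Code:
import time

def report(frames):
    # frames: iterable of (capture_time, jpeg_bytes) pairs; capture_time must
    # come from the same clock as time.monotonic() on this machine
    previous = None
    for capture_time, _jpeg in frames:
        arrival = time.monotonic()
        lag_ms = (arrival - capture_time) * 1000.0
        if previous is not None:
            fps = 1.0 / (arrival - previous)
            print(f"~{fps:.1f} frames/s, ~{lag_ms:.0f} ms behind reality")
        previous = arrival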

As to the original intent of this thread: I once taught a programming class at an FLL camp, and we played a game where two students sat back-to-back with identical bags of Legos; one student built something and verbally described to the other how to build it. This taught them how important good instructions are for good execution.
__________________
Kettering University - Computer Engineering
Kettering Motorsports
Williams International - Commercial Engines - Controls and Accessories
FRC 33 - The Killer Bees - 2009-2012 Student, 2013-2014 Advisor
VEX IQ 3333 - The Bumble Bees - 2014+ Mentor

"Sometimes, the elegant implementation is a function. Not a method. Not a class. Not a framework. Just a function." ~ John Carmack
#20 | 09-04-2010, 19:05
Chris Fultz, FRC #0234 (Cyber Blue), Engineer
Re: Demystifying autonomous...

Getting back to the OP, that is a great idea.

We did something close to that and played a human "robots" game of Breakaway, where each student was a robot with certain skills. It made everyone realize the value of different trades and also how small the field became with 2 or 3 robots in one zone.

I like the idea of blindfolds and then someone giving instructions to move the student around the field.

This is actually how I learned FORTRAN in one of my first programming classes. The professor decided to make a peanut butter cracker and eat it. We had to give him verbal instructions on what to do, and he did exactly what we said. Not what we meant, but what we actually said. I still remember the class. It made a good impression!
__________________
Chris Fultz
Cyber Blue - Team 234
2016 IRI Planning Committee
2016 IndyRAGE Planning Committee
2010 - Woodie Flowers Award - Championship
#21 | 10-04-2010, 15:05
Greg McKaskle, FRC #2468 (Team NI & Appreciate)
Re: Demystifying autonomous...

I'm a firm believer that programmers need to learn how to anthropomorphize the computer/robot. Don't go through life that way, but think of it like a pair of glasses or a hat you can put on when you want to see and think knowing only what the function or algorithm knows, or when you need to identify exactly what it will need in order to function.

As for the camera and dashboard discussion: the upper rate on the dashboard data is 50 Hz at about 1 KB per packet. The default framework doesn't read sensors and transmit them at that rate because it would load up the CPU reading a bunch of unallocated channels. I suspect the framework will start to take advantage of the Open list of I/O to do a better job of this in the future.
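
(For scale, and this is just my arithmetic rather than anything from the framework: 50 packets per second at about 1 KB each is a small slice of the link.)

Code:
packets_per_second = 50            # stated upper rate
bytes_per_packet = 1024            # "about 1 KB per packet"
mbit_per_second = packets_per_second * bytes_per_packet * 8 / 1e6
print(f"{mbit_per_second:.2f} Mbit/s")   # roughly 0.41 Mbit/s for dashboard data alone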

Video back to the PC involves lots of elements, meaning that each must perform well or the frame rate will drop and/or lag will be introduced. Thinking it through: the camera doesn't introduce much lag, but be sure that it is told to acquire and send at a fast rate. The images are delivered on port two over TCP, then sent out over port one over TCP with a small header added for versioning. The issue I've seen with the cRIO is with the memory manager; big image buffers can be pretty slow to allocate, and keeping the image buffer below 16 KB gets rid of this bottleneck. Next in the chain is the bridge, then the router. I haven't seen issues with these elements, as they are special purpose and that is all they do.

Next is the dashboard computer. If the CPU gets loaded, the images will sit in the IP stack and be lagged by up to five seconds. The default dashboard unfortunately had two elements which were both invalidating the screen and causing drawing cost; the easiest fix is to hide the image info. I believe I've also seen lag introduced when lots of errors are being sent to the DS. With a faster computer, this wouldn't matter as much either.
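
(Here is a rough Python sketch of the relay step in that chain: take one JPEG off the camera-side connection, prepend a small version header, and push it to the dashboard-side connection. It is not the cRIO code, and the header layout and framing are invented; it just shows where the advice to keep images under 16 KB fits in.)

Code:
import struct

PROTOCOL_VERSION = 1                 # hypothetical; the real header differs

def recv_exact(sock, n):
    # read exactly n bytes or raise if the stream closes early
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("camera stream closed")
        buf += chunk
    return buf

def relay_frame(camera_sock, dashboard_sock):
    # read one length-prefixed JPEG from the camera-side connection...
    (size,) = struct.unpack(">I", recv_exact(camera_sock, 4))
    jpeg = recv_exact(camera_sock, size)
    # ...then prepend a small version header and forward it to the dashboard.
    # Keeping each JPEG under ~16 KB (lower resolution or higher compression
    # at the camera) avoids the slow large-buffer allocations mentioned above.
    dashboard_sock.sendall(struct.pack(">BI", PROTOCOL_VERSION, size) + jpeg)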

As I mentioned, an issue at any link in the chain can drop the fps and introduce lag. If each of these is handled well, I believe you can get lag down to about 100 ms and frame rate above 25.

Greg McKaskle
#22 | 11-04-2010, 13:20
gvarndell, FRC #1629 (GaCo), Parent
Re: Demystifying autonomous...

Quote:
Originally Posted by kamocat
What I did to demonstrate the (un)usefulness of the camera is to close one eye and put your hands in a tube around the other eye, to give yourself that 30 degree view angle (+-15 degrees).
What I didn't have them do is chop it to 10 frames per second. (blinking continuously might work)
Certainly not if autistic children are around, but how about adding a strobe light in a dark room?
Wear your grandmother's glasses, put a patch over one eye, restrict the FOV of the other eye, and have a strobe going.
I would hazard a guess that most teens could fairly consistently catch a randomly tossed (soccer) ball under such conditions -- even if the strobe was off more than on.
I guess I would even predict that 6 kids could split into 2 alliances and play some decent soccer under these conditions.
They probably ought to wear helmets, though.

Pondering how humans can perform such a feat might foster some appreciation for the fact that robot autonomy cannot be dependent upon _knowing_ everything all the time.
A robot that could, even very poorly, approximate our powers of prediction and our ability to fill in the blanks wrt our sensory inputs would be truly amazing.
__________________
Robots never, ever, ever, ever break -- The Robot Repairman (Backyardigans)
#23 | 27-04-2010, 16:49
JBotAlan (Jacob Rau), FRC #5263, Mentor
Re: Demystifying autonomous...

Quote:
Originally Posted by gvarndell
A robot that could, even very poorly, approximate our powers of prediction and our ability to fill in the blanks wrt our sensory inputs would be truly amazing.
This is exactly the kind of thing that is so hard to explain to non-believers. What they don't realize is that the robot is more literal than their 7-year-old boy cousin, and completely unable to do anything beyond some quick math.

I'm glad to see the response to this thread. If I put anything together, I'll pass it along on CD.

I am taking at least a year off of FIRST, though, so it may not be for the next little while.
__________________
Aren't signatures a bit outdated?
#24 | 28-04-2010, 10:17
slavik262 (Matt Kline), FRC #0537 (Charger Robotics), Alumni
Re: Demystifying autonomous...

Quote:
Originally Posted by Greg McKaskle
If the CPU gets loaded, the images will sit in the IP stack and be lagged by up to five seconds. The default dashboard unfortunately had two elements which are both invalidating the screen and causing drawing cost. The easiest fix is to hide the image info. I believe I've also seen lag introduced when lots of errors are being sent to the DS. With a faster computer, this wouldn't matter as much either.
It was great talking to you in Atlanta about this. Does National Instruments have any thoughts on possibly using DirectX or OpenGL to render the video? Using the video dashboard I wrote (which copies the incoming frames directly onto a DirectX texture instead of rendering with GDI, and uses a separate thread for receiving images via Winsock), we were consistently getting 25+ frames per second on the field in Atlanta. I also distributed it to a few other teams, including Team 175, who were finalists in the Curie division and used it in all of their matches. Granted, I wasn't rendering anything else but video on the dashboard, but with the combination of hardware-accelerated rendering and blocking networking I/O, I got CPU usage down to about 15-20% (as opposed to the default dashboard pegging the CPU at 100%).
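
(The structure described above, a blocking receive thread plus a render loop that always draws the newest frame, sketched in Python. This is not the actual DirectX/Winsock code, just the threading pattern; a slow renderer drops stale frames instead of letting them queue up.)

Code:
import threading

class LatestFrame:
    # single shared slot: the receiver overwrites it, the renderer reads it,
    # so a slow renderer skips stale frames instead of queueing them
    def __init__(self):
        self._lock = threading.Lock()
        self._frame = None

    def put(self, frame):
        with self._lock:
            self._frame = frame

    def get(self):
        with self._lock:
            return self._frame

def receive_loop(read_frame, slot, stop):
    while not stop.is_set():
        slot.put(read_frame())       # blocking read keeps CPU use low

def render_loop(draw, slot, stop):
    while not stop.is_set():
        frame = slot.get()
        if frame is not None:
            draw(frame)              # e.g. copy onto a texture and present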
__________________

Last edited by slavik262 : 28-04-2010 at 10:21.
#25 | 29-04-2010, 20:34
Greg McKaskle, FRC #2468 (Team NI & Appreciate)
Re: Demystifying autonomous...

It was good seeing what you've developed as well. I can't guarantee what is happening in the IMAQ group, but NI has had a series of vision displays since 1994. Periodically, they look at the options for doing the display of the different image types. The display shows 8 bit mono with color table, 16 bit and floating point monochrome, and true color, and perhaps others I can't remember.

With most of my time being spent on the DS, I didn't pay enough attention to the default DB; because of two indicators that were invalidating opposite sides of the screen, most of the screen was being redrawn for each new image and chart update. I didn't have a Classmate at the time, but fixing either the chart overlap or hiding the Image Information display definitely dropped the CPU load. I can't tell you what rates the IMAQ display is capable of on the Classmate, but my assumption is that it is similar in speed to or faster than DirectX; otherwise that is what they would already be using. If you are able to make a direct comparison and publish the data, I'll report the results to the developer in IMAQ.

Meanwhile, I'm glad you were able to make so much progress on your DB. It was impressive and I hope your team can take it even farther.

Greg McKaskle
#26 | 29-04-2010, 20:49
byteit101 (Patrick Plenefisch), formerly FRC #0451 (The Cat Attack), Programmer
Re: Demystifying autonomous...

I know my dashboard (ZomB) is similar in speed to what you were getting. Although I did not have a CPU or FPS indicator, I had about 5 other controls on the dashboard, and at one point I looked down at the image, realized that our camera was pointed at us, waved, and watched my hand move in real time. (We got tipped onto our side; video here: http://thecatattack.org/media/view/2596 (I wave at 1:25 and at the end).)
I had actually been noticing an interesting delay that built up between reboots: after 6 hours of restarting the DS and DB (clearing FMS Locked), the UI of both would take about 3-4 seconds to respond to mouse events, yet the video was surprisingly still not laggy.
I would think the difference between DirectX, IMAQ, GDI/GDI+, and WPF is negligible unless some other process is hogging the CPU (like many charts).
__________________
Bubble Wrap: programmers rewards
Watchdog.Kill();
printf("Watchdog is Dead, Celebrate!");
How to make a self aware robot: while (∞) cout<<(sqrt(-∞)/-0);
Previously FRC 451 (The Cat Attack)
Now part of the class of 2016 at WPI & helping on WPILib