Go Back   Chief Delphi > Technical > Programming
#1 | 02-06-2013, 20:40
bijan311
Registered User
FRC #1714
Team Role: Alumni
Join Date: May 2013
Rookie Year: 2013
Location: Wisconsin
Posts: 11
Using a Raspberry Pi for camera tracking

I was thinking of using a Raspberry Pi as an on-board computer for camera tracking, because it is cheap and lightweight. Has anyone tried this, and is it a good idea or not?
#2 | 02-06-2013, 21:20
gluxon
AKA: Brandon Cheng
FRC #0178 (The 2nd Law Enforcers)
Team Role: Leadership
Join Date: Apr 2012
Rookie Year: 2011
Location: Connecticut
Posts: 65
Re: Using a Raspberry Pi for camera tracking

It's a good idea and has been a success.

https://github.com/team178/oculus.js
#3 | 02-06-2013, 21:22
Mike Bortfeldt
Registered User
FRC #1126 (& 1511)
Team Role: Mentor
Join Date: Oct 2004
Rookie Year: 2004
Location: Rochester, NY
Posts: 119
Re: Using a Raspberry Pi for camera tracking

I believe the folks on team 340 (GRR) did camera vision tracking on the Raspberry Pi this year. If you look in the "Summer of FIRST Project" thread, they mention the project.

Mike
#4 | 03-06-2013, 10:01
Aaron.Graeve
Registered User
FRC #1477 (Texas Torque)
Team Role: Alumni
Join Date: Jan 2012
Rookie Year: 2012
Location: College Station, Texas
Posts: 103
Re: Using a Raspberry Pi for camera tracking

I know it is possible and a good idea. Don't quote me on it, but I believe 118 used a BeagleBone, a board comparable to the Raspberry Pi but with more I/O, for the vision tracking on their 2012 robot.
__________________

2016:
Alamo, Bayou, and Lone Star Regional FTAA
2015:
Dallas, Alamo, Bayou, and Lone Star Regional FTAA
2014:
Alamo, Dallas, and Lone Star Regional FTAA
Alamo Regional Robot Inspector
2013:
Einstein Champion and 2013 World Champion (Thanks 1241 & 610), Galileo Division Champion, Razorback Regional Winner, Alamo Regional Semifinalist, Bayou Regional Semifinalist, Lone Star Regional Quarterfinalist
2012:
Curie Division Semifinalist, Lone Star Regional Finalist, Bayou Regional Winner, Alamo Regional Winner
#5 | 03-06-2013, 10:08
cmrnpizzo14
Registered User
AKA: Cam Pizzo
FRC #3173 (IgKNIGHTers)
Team Role: Mentor
Join Date: Jan 2011
Rookie Year: 2006
Location: Boston
Posts: 522
Re: Using a Raspberry Pi for camera tracking

3173 used the Pi and had vision tracking software available, but due to lack of test time it never made it onto our competition bot. Since the season ended, I believe our programmers have made it work quite successfully. I will try to get one of our programmers to post something about this.

The most impressive vision tracking I have seen in person was Aperture's (3142, I believe). I know they have a white paper up on it; I will post a link when I find it. They used the Kinect with a Pi, I believe.

EDIT: Found the paper http://www.chiefdelphi.com/media/papers/2692
__________________
FIRST Team 3173 The IgKNIGHTers

"Where should we put the battery?"

Last edited by cmrnpizzo14 : 03-06-2013 at 10:11. Reason: Found it!
#6 | 03-06-2013, 19:06
bijan311
Registered User
FRC #1714
Team Role: Alumni
Join Date: May 2013
Rookie Year: 2013
Location: Wisconsin
Posts: 11
Re: Using a Raspberry Pi for camera tracking

Thank you all, this was very helpful.
#7 | 04-06-2013, 18:19
JamesTerm
Terminator
AKA: James Killian
FRC #3481 (Bronc Botz)
Team Role: Engineer
Join Date: May 2011
Rookie Year: 2010
Location: San Antonio, Texas
Posts: 298
Re: Using a Raspberry Pi for camera tracking

Quote:
Originally Posted by Aaron.Graeve
I know it is possible and a good idea. Don't quote me on it but I believe 118 used the Raspberry Pi's enhanced I/O equivalent, a BeagleBone, for the vision tracking on their 2012 robot.
This is what I know, as I spoke with their developer about it:

They used a BeagleBoard (http://beagleboard.org/) running embedded Linux, with Ethernet and USB interfaces.

They made HTTP GET calls via the libcurl library to grab frames, processed them with OpenCV, and finally sent UDP packets across to the cRIO as input for the control loops.

One thing we didn't discuss, which I may want to talk about at some point, is the danger of sending UDP packets when the robot is not listening for them. This can flood the buffers and corrupt TCP/IP, causing the driver station to lose its connection. The solution we tried for this issue is to open the listener immediately, on its own thread (task) that starts on power-up. This should work because the time it takes for the camera to power on (about 30 seconds) is much longer than the time it takes the cRIO to power up and start listening.

Oh yes, and we both use Wind River C++.
#8 | 04-06-2013, 18:43
Hjelstrom
Mentor
FRC #0987 (High Rollers)
Team Role: Mentor
Join Date: Mar 2008
Rookie Year: 2005
Location: Las Vegas
Posts: 148
Re: Using a Raspberry Pi for camera tracking

Quote:
Originally Posted by cmrnpizzo14
3173 used the pi and had vision tracking software available but due to lack of test time it never made our competition bot. I believe that in the time since the season has ended our programmers have made it work quite successfully. I will try to get one of our programmers to post something about this.

The most impressive vision tracking I have seen in person was Aperture's (3142 I believe). I know they have a white paper up on it, I will post a link when I find it. They used the Kinect with a Pi I believe.

EDIT: Found the paper http://www.chiefdelphi.com/media/papers/2692
You can also check out this paper: http://www.chiefdelphi.com/media/papers/2698?
I think if you're going to use the Kinect, you should use its depth sensor rather than just using it as an IR camera. The depth sensing it does is incredibly powerful (though it has some quirks too).
#9 | 04-06-2013, 18:47
ohrly?
Griffin Alum
AKA: Colin Poler
FRC #1884 (The Griffins)
Team Role: Alumni
Join Date: Jan 2013
Rookie Year: 2011
Location: London
Posts: 58
Re: Using a Raspberry Pi for camera tracking

We were going to use a Raspberry Pi for vision tracking this year, but our robot couldn't really aim, so it was pointless for us. We did still get it working before ship, though.

On the hardware level, we powered it by splicing the USB cable it comes with into a spare 12V-to-5V power converter we had. We installed the Arch Linux ARM distribution and sent the data back over FRC's own NetworkTables.

Basically, we used the Python bindings for OpenCV and loaded the MJPEG stream with cv2.VideoCapture("http://10.18.84.11/mjpg/video.mjpg"). Then we did some math: we found our position by using the angles of elevation to all the goals we could see to get distances, and then "triangulating" ourselves.
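The "angles of elevation to distances" step reduces to basic trigonometry. A sketch of the idea (variable names and units are mine, not 1884's actual code; it assumes known camera and target heights and a calibrated camera pitch):

```python
import math

def distance_from_elevation(target_height_m, camera_height_m,
                            camera_pitch_deg, pixel_angle_deg):
    """Floor distance to a goal whose center appears at a known angle
    of elevation. pixel_angle_deg is the target's vertical offset from
    the camera axis, derived from its pixel row and the camera FOV."""
    elevation = math.radians(camera_pitch_deg + pixel_angle_deg)
    return (target_height_m - camera_height_m) / math.tan(elevation)
```

With distances to two or more goals of known field positions, the robot's own position follows by intersection, which is the "triangulating" mentioned above.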

Lessons learned:
- Keep a spare SD card with everything you need installed. Our Raspberry Pi inexplicably stopped working at some point and needed a reinstall, which we wouldn't have been able to do at a regional.
- Do everything in one language; don't mix and match Python and Java.
- Figure out if we actually need vision tracking before we build it (hopefully not a problem for you).

All in all, we won't be using it next year. I think we'll put the Classmate on the robot instead, since it has its own battery.
#10 | 05-06-2013, 12:14
JamesTerm
Terminator
AKA: James Killian
FRC #3481 (Bronc Botz)
Team Role: Engineer
Join Date: May 2011
Rookie Year: 2010
Location: San Antonio, Texas
Posts: 298
Re: Using a Raspberry Pi for camera tracking

Quote:
Originally Posted by Hjelstrom
You can also check out this paper: http://www.chiefdelphi.com/media/papers/2698?
I think if you're going to use the Kinect, you should use its depth sensor rather than just using it as an IR camera. The depth sensing it does is incredibly powerful (though it has some quirks too).
This is a great link! I'd like to highlight something you said in it:

"Your team's vision system really inspired us to take another look at vision too though. Using the dashboard to do the processing helps in so many ways. The biggest I think is that you can "see" what the algorithm is doing at all times. When we wanted to see what our Kinect code is doing, we had to drag a monitor, keyboard, mouse, power inverter all onto the field. It was kind of a nightmare."

From our experience, seeing the algorithm is so important, for example when tuning the thresholds dynamically. We also wanted to capture some raw video and do offline testing and tweaking of the footage to fix bugs in the algorithm code (and to improve it by eliminating more false positives).

I think the ability to see the algorithm is one valid answer to the question "I was wondering if anyone has tried doing that or if it's a good idea or not."

The only drawback with dashboard processing is bandwidth when using MJPEG. At 640x480 resolution with default settings it costs about 11-13 Mbps. This season we are capped at 7 Mbps, and anything above 5 Mbps starts to introduce lag (as noted in the FMS white paper). We are looking into an H.264 solution that gives 1.2 Mbps in good lighting to 5 Mbps in poor lighting at full 640x480 quality. That should yield roughly 5 ms of latency, which should be plenty fast for closed-loop processing. If more teams start to use vision next season, we should encourage everyone to use lower bandwidth so that controls stay responsive (i.e., everybody wins).
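The 11-13 Mbps figure is easy to sanity-check, since MJPEG bandwidth is just frame size times frame rate (the 50 KB average frame below is an assumed value, roughly what a 640x480 JPEG at default compression comes out to):

```python
def mjpeg_bandwidth_mbps(avg_frame_kb, fps):
    """Rough MJPEG bandwidth: bytes per frame * 8 bits * frames per second."""
    return avg_frame_kb * 1024 * 8 * fps / 1e6

# ~50 KB frames at 30 fps gives ~12.3 Mbps, inside the 11-13 Mbps range
# quoted above; halving either frame size or frame rate halves the cost.
```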

Last edited by JamesTerm : 06-06-2013 at 10:09.
#11 | 05-06-2013, 13:33
Joe Ross
Unsung FIRST Hero
FRC #0330 (Beachbots)
Team Role: Engineer
Join Date: Jun 2001
Rookie Year: 1997
Location: Los Angeles, CA
Posts: 8,600
Re: Using a Raspberry Pi for camera tracking

Quote:
Originally Posted by JamesTerm
The only drawback with dashboard processing is bandwidth using mjpeg. If you want 640x480 resolution it costs about 11-13mbps.
That's not necessarily true. Take a look at the images in 341's whitepaper. It was quite an ah-ha moment for us when we realized that their 640x480 images were the same size as our 320x240 images from that year.

I think you alluded to it when you said "poor lighting", but that's not how I'd look at it. It's lighting optimized for the task required. There's been a lot of discussion about using the hold exposure setting to get the correct conditions.

Last edited by Joe Ross : 05-06-2013 at 13:42.
#12 | 05-06-2013, 15:05
billbo911
I prefer you give a perfect effort.
AKA: That's "Mr. Bill"
FRC #2073 (EagleForce)
Team Role: Mentor
Join Date: Mar 2005
Rookie Year: 2005
Location: Elk Grove, Ca.
Posts: 2,384
Re: Using a Raspberry Pi for camera tracking

Quote:
Originally Posted by Joe Ross
....
I think you alluded to it when you said "poor lighting", but that's not how I'd look at it. It's lighting optimized for the task required. There's been a lot of discussion about using the hold exposure setting to get the correct conditions.
One technique that can be used to "optimise the lighting conditions" and help minimise bandwidth is as follows:

Throw as much light at the retroreflective tape as is reasonably possible. This can be done with multiple concentric LED rings (we are using 3).
This will make the reflected light substantially brighter than the surrounding area. In fact, and hopefully so, it will saturate the camera's detector in the reflected region.
Now reduce the exposure time, and lock it, to the minimum that still gives a useful, but not quite saturated, image of the target. Doing this also reduces the signal coming from anything that is not the target, practically eliminating those parts of the image. What remains is not much more than the target itself.
When the camera compresses that image, the amount of data sent is minimal, which reduces the bandwidth required to send it across the network.
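With the exposure locked down like this, target extraction barely needs OpenCV at all. A NumPy-only sketch of the idea (the threshold value is illustrative, and real code would still want contour filtering for stray reflections):

```python
import numpy as np

def target_bounding_box(gray_frame, thresh=200):
    """After a short locked exposure, almost everything except the lit
    tape is near black, so a plain brightness threshold recovers the
    target region. Returns (x, y, w, h) or None if nothing is bright."""
    ys, xs = np.nonzero(gray_frame >= thresh)
    if xs.size == 0:
        return None
    return (int(xs.min()), int(ys.min()),
            int(xs.max() - xs.min() + 1), int(ys.max() - ys.min() + 1))
```

The same mostly-black frames are what make the JPEG compression so effective: uniform dark regions compress to almost nothing.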
__________________
CalGames 2009 Autonomous Champion Award winner
Sacramento 2010 Creativity in Design winner, Sacramento 2010 Quarter finalist
2011 Sacramento Finalist, 2011 Madtown Engineering Inspiration Award.
2012 Sacramento Semi-Finals, 2012 Sacramento Innovation in Control Award, 2012 SVR Judges Award.
2012 CalGames Autonomous Challenge Award winner ($$$).
2014 2X Rockwell Automation: Innovation in Control Award (CVR and SAC). Curie Division Gracious Professionalism Award.
2014 Capital City Classic Winner AND Runner Up. Madtown Throwdown: Runner up.
2015 Innovation in Control Award, Sacramento.
2016 Chezy Champs Finalist, 2016 MTTD Finalist
#13 | 05-06-2013, 15:08
JamesTerm
Terminator
AKA: James Killian
FRC #3481 (Bronc Botz)
Team Role: Engineer
Join Date: May 2011
Rookie Year: 2010
Location: San Antonio, Texas
Posts: 298
Re: Using a Raspberry Pi for camera tracking

Quote:
Originally Posted by Joe Ross
That's not necessarily true. Take a look at the images in 341's whitepaper. It was quite an ah-ha moment for us when we realized that their 640x480 images were the same size as our 320x240 images from that year.
Ah-ha... I think I get what you are saying: use a fast exposure to get better compression on darker images that are optimal for target processing. That makes sense, but it would limit the camera to targeting only. The context I posted was a dual use-case: viewing images with balanced video levels while also obtaining targeting information from them.

Thanks for this link... I was actually looking for it again. There is something in there I'd like to quote here, because it is really great advice.

From Jared341:

Changing the default camera settings is the most important thing you can do in order to obtain reliable tracking and stay underneath the bandwidth cap.

In particular, there are six settings to pay attention to:

1) Resolution. The smaller you go, the less bandwidth you use but the fewer pixels you will have on the target. If you make all of the other changes here, you should be able to stay at 640x480.

2) Frames per second. "Unlimited" results in a 25 to 30 fps rate under ideal circumstances. Depending on how you use the camera in a control loop, this may be overkill. Experiment with different caps.

3) White balance. You do NOT want automatic white balance enabled! Leaving it on automatic makes your code more susceptible to being thrown off by background lighting in the arena. All of our Axis cameras have a white balance "hold" setting - use it.

4) Exposure time/priority. You want a very dark image, except for the illuminated regions of the reflective tape. Set the exposure time to something very short. Put the camera in a bright scene (e.g. hold up a white frisbee a foot or two in front of the lens) and then do a "hold" on exposure priority. Experiment with different settings. You want virtually all black except for a very bright reflection off of the tape. This is for two purposes: 1) it makes vision processing much easier (fewer false detections), 2) it conserves bandwidth, since dark areas of the image are very compact after JPEG compression. The camera doesn't know what you are looking for, so it will try to send you the entire scene as well as it can. But if it can't see the "background" very well, you are "tricking" the camera into only giving you the part you need!

5) Compression. As the WPI whitepaper says, this makes a huge difference in bandwidth. Use a minimum of 30, but you may be able to get away with more (we are using 50 this year). Experiment with it.

6) Brightness. You can do a lot of fine tuning of the darkness of the image with the brightness slider.
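Settings 1, 2, and 5 above can also be requested per-stream in the URL, if I recall the Axis VAPIX HTTP API correctly (parameter names should be verified against your camera's firmware docs; white balance and exposure holds are set in the web UI instead):

```python
def axis_mjpeg_url(host, resolution="640x480", fps=15, compression=50):
    """Build an MJPEG stream URL with capped fps and raised compression,
    per the recommendations above. Parameter names follow the Axis VAPIX
    API as commonly documented; treat them as an assumption to verify."""
    return (f"http://{host}/axis-cgi/mjpg/video.cgi"
            f"?resolution={resolution}&fps={fps}&compression={compression}")
```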
#14 | 06-06-2013, 01:19
faust1706
Registered User
FRC #1706 (Ratchet Rockers)
Team Role: College Student
Join Date: Apr 2012
Rookie Year: 2011
Location: St Louis
Posts: 498
Re: Using a Raspberry Pi for camera tracking

I would say use an ODROID device, such as the X2 or U2. 1706 used one and ran the vision code at >25 fps. It is much more powerful than the Raspberry Pi and the same size. We powered it through the 5V port on the Power Distribution Board. It needed a heat sink, so we 3D-printed a case for it that let us mount a fan, and wired that into the power board too.
__________________
"You're a gentleman," they used to say to him. "You shouldn't have gone murdering people with a hatchet; that's no occupation for a gentleman."
#15 | 29-06-2013, 00:14
gixxy
Programming and Arduino Mentor
AKA: Gustave Michel III
FRC #3946 (Tiger Robotics)
Team Role: Mentor
Join Date: Nov 2011
Rookie Year: 2012
Location: Ruston, LA
Posts: 207
Re: Using a Raspberry Pi for camera tracking

Team 3946 successfully used a Raspberry Pi to get distance and angle-from-center data from the camera over a TCP socket connection, in its own thread, for use in an (almost successful) auto-aim system.

Our code is available on GitHub:

Robot side (Java): This class establishes a TCP socket client in a new thread and attempts to connect to the Raspberry Pi; from there you can design a subsystem and/or command to make the calls for new data: https://github.com/frc3946/UltimateA...hreadedPi.java
Raspberry Pi side (Python): https://github.com/frc3946/PyGoalFin...leProcessor.py
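For anyone sketching a similar link, the wire format over that socket can be as simple as one comma-separated line per processed frame (this layout is an assumption for illustration, not 3946's actual protocol):

```python
def pack_target(distance_m, angle_deg):
    """Pi side: encode one newline-terminated record per frame."""
    return f"{distance_m:.2f},{angle_deg:.2f}\n".encode()

def parse_target(line):
    """Robot side: split a record back into (distance, angle) floats."""
    d, a = line.decode().strip().split(",")
    return float(d), float(a)
```

A newline-delimited text protocol keeps the robot-side reader trivial (read until "\n", split on ",") and is easy to eyeball in a terminal while debugging.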
__________________
Programmer - A creature known for converting Caffeine into Code.
Studying Computer Science @ Louisiana Tech University
Associate Consultant @ Fenway Group

2012-13: 3946 - Head of Programming, Electrical and Web
2014 - 3468 - Programming Mentor
2015 - Present - Lurker

The Chief Delphi Forums are sponsored by Innovation First International, Inc.


Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2017, Jelsoft Enterprises Ltd.
Copyright © Chief Delphi