What information can we access from the camera?
Total Meltdown
07-01-2006, 11:32
What information do we have access to regarding the camera? I heard about height of the target, the angle from the robot to said target, etc. What other information is returned?
What information do we have access to regarding the camera? I heard about height of the target, the angle from the robot to said target, etc. What other information is returned?
You can get all that along with motion tracking using the IFI CMUcam, not the Google digital camera. The Google camera is meant only to take pictures of your robot and upload them as a "photo essay".
Elgin Clock
07-01-2006, 11:50
I think you are a bit confused.
Google was announced as a sponsor of FIRST last year, and so the camera in the kit is a digital camera for taking pictures donated by Google for all teams to document their build/team progress.
The camera which is going to be on robots, is the return of 2005's CMU Cam system.
This is used for defining the green color, and going to be used for the purposes you are thinking of.
You can get all that along with motion tracking using the IFI CMUcam, not the Google digital camera.
The CMU Cam was made by Carnegie-Mellon University (CMU) and is not an IFI part. I think your confusion is because IFI handles the ordering of the CMU cameras. (If I'm not mistaken)
Ken Leung
07-01-2006, 11:52
I just edited the thread title, correcting the confusion between the Google digital camera and the CMU camera. Hopefully that will clear things up and provide the right answer for the person who asked the question.
Elgin Clock
07-01-2006, 11:55
I just edited the thread title, correcting the confusion between the Google digital camera and the CMU camera. Hopefully that will clear things up and provide the right answer for the person who asked the question.
I was wondering who did that. Thanks Ken.
The CMU Cam was made by Carnegie-Mellon University (CMU) and is not an IFI part. I think your confusion is because IFI handles the ordering of the CMU cameras. (If I'm not mistaken)
The camera we get in the kit is an IFI-manufactured part. They licensed it from Carnegie-Mellon.
Total Meltdown
07-01-2006, 12:18
Yes, sorry, I meant the IFI camera, not the Google one.
We're developing a system for autonomously tracking the top goal, so that our "gun" will always be pointed at the correct angle to fire the ball into the goal, using some basic laws of projectile motion. I'll make a thread about it if the team decides they like it and it goes anywhere.
Kevin Watson
07-01-2006, 13:00
We're developing a system for autonomously tracking the top goal, so that our "gun" will always be pointed at the correct angle to fire the ball into the goal, using some basic laws of projectile motion.
This is precisely what I (and others) hope teams will do this year. I'm writing a one-page white paper on computing range, which I'll post soon. Basically, range = (green light height - camera height) / tan(tilt angle), where green light height equals 10' 10", camera height is the distance the camera is mounted above the floor, and tilt angle is the calculated tilt angle derived from the tilt PWM value when the tracking software has a good solution. I've posted the camera software Dave Lavery mentioned here: http://kevin.org/frc
-Kevin
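Translated directly into C, that range equation looks something like the sketch below. The constant names, the mounting height, and the function name are illustrative (not from the kit code), and the tilt angle is assumed to have already been derived from the tilt PWM value, as shown in terminal.c:

#include <math.h>

#define GREEN_LIGHT_HEIGHT 130.0 /* 10' 10" above the floor, in inches */
#define CAMERA_HEIGHT 24.0       /* example mounting height, in inches */
#define PI 3.14159265

/* Returns the horizontal range to the goal, in inches, given the
   camera's tilt angle above the horizontal, in degrees. */
double Get_Range(double tilt_angle_deg)
{
    return (GREEN_LIGHT_HEIGHT - CAMERA_HEIGHT) / tan(tilt_angle_deg * PI / 180.0);
}

With the camera two feet off the floor and a tilt angle of 45 degrees, this works out to 130 - 24 = 106 inches to the goal.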
Total Meltdown
07-01-2006, 13:19
Yeah, I could compute the distance if I need to, but I thought I heard them say that the distance was already returned right in the camera code itself.
...and would I be safe in assuming that the robot controller can do the basic trig functions? Sine, cosine, tangent, and the like?
Kevin Watson
07-01-2006, 13:26
Yeah, I could compute the distance if I need to, but I thought I heard them say that the distance was already returned right in the camera code itself.
...and would I be safe in assuming that the robot controller can do the basic trig functions? Sine, cosine, tangent, and the like?
The current revision of the code does not give you range (I'll add this in the next rev.), but if you look at the code in terminal.c, you'll see how to calculate the tilt angle, which gets plugged into the equation mentioned above. Yes, the compiler has math.h support.
-Kevin
Total Meltdown
07-01-2006, 13:54
Balls are scooped up between the front wheels and shuffled toward
the "Wheel", which works like an automated baseball pitcher.
Front                                  Back
            \
             \
              \  \   <-- Turret swivels 360
               \  \      degrees, and tips from
                \  \     straight up to about 30 degrees
 |-------------------------------------|
 | Wheel -->  ()          /  <-- Ball  |
 | (not to scale)        /   scooper brings
 | shoots balls up      /    balls to turret
 | the barrel          /     base      |
 |-------------------------------------|
...yes, I just made an ASCII diagram of my robot design.
What I was thinking is that the program constantly calculates the launch vector based on the speed at which the mechanism launches balls, and rotates and tilts the turret accordingly. I'm taking physics now and we've already covered projectile motion, so it shouldn't be too hard to figure out the angle at which the turret has to be positioned based on how far away you are from the goal.
Note that this design requires that the ball be fired at the same speed every time, or that there be some way to determine how fast the ball will be fired.
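The projectile-motion step works out to something like the following sketch. Everything here is illustrative (the names, the units, gravity expressed in inches), and it assumes the repeatable launch speed mentioned above:

#include <math.h>

#define GRAVITY 386.1 /* in/s^2, to match distances measured in inches */
#define PI 3.14159265

/* Returns the turret elevation angle, in degrees, needed to hit a
   target 'distance' inches away horizontally and 'height_diff' inches
   above the launch point, at launch speed 'speed' in inches/second.
   Returns -1.0 if the target is out of reach at that speed. */
double Launch_Angle(double speed, double distance, double height_diff)
{
    double v2 = speed * speed;
    double disc = v2 * v2 - GRAVITY * (GRAVITY * distance * distance + 2.0 * height_diff * v2);

    if (disc < 0.0)
    {
        return -1.0; /* launch speed too low for this range */
    }
    /* Subtracting the square root picks the flatter of the two possible arcs. */
    return atan2(v2 - sqrt(disc), GRAVITY * distance) * 180.0 / PI;
}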
devicenull
07-01-2006, 18:16
So far, I intend to mount the camera next to the launcher. This way, as long as I mirror the up/down motion of the camera, the launcher should always be aimed correctly.
Does anyone know if we got the mount for the camera they used at the kickoff, or if the plans are available for it? It's a lot nicer than the one we used last year.
I find it nice we have a physics teacher as one of our mentors :)
seanwitte
08-01-2006, 12:08
Attached is a Windows app for version 2.0 of the .NET framework that I used when I reviewed the camera code last month. The app will display what the robot thinks it's seeing and the values for the pan and tilt servos. Directions for sending the data are included in the zip file. Extract the exe and the dll to the same folder and run it.
Justin Stiltner
08-01-2006, 13:50
devicenull,
Yes, we received the pan/tilt unit that was shown at kickoff; it should be in your electronics box in a bag marked frc-pantilt-01. I have not yet found directions on how to assemble the camera, but I will be making some and posting them if no one beats me to it.
googlecamera
11-01-2006, 17:23
Hi,
Download the PDF file:
http://kevin.org/frc/CMUcam2_mount_assembly.pdf
Google camera
seanwitte
18-01-2006, 09:26
Attached is a Windows app for version 2.0 of the .NET framework that I used when I reviewed the camera code last month. The app will display what the robot thinks it's seeing and the values for the pan and tilt servos. Directions for sending the data are included in the zip file. Extract the exe and the dll to the same folder and run it.
It looks like the x and y bounds in the application are wrong and should be reversed. It will still track and display fine, but the x axis is too long and the y axis is too short. I will make the correction and re-post if anyone is actually using it.
Kevin Watson
18-01-2006, 11:51
It looks like the x and y bounds in the application are wrong and should be reversed. It will still track and display fine, but the x axis is too long and the y axis is too short. I will make the correction and re-post if anyone is actually using it.
You might consider adding a colored rectangle around the center pixel that represents the maximum allowable error as defined in tracking.h. The tracking software won't move the servos if the centroid pixel is within this rectangle.
During development of the camera software I used one of these standalone LCD graphics displays (http://www.ezlcd.com/) to display in real-time what the software is doing. It's very similar to your software and kinda fun to watch.
If there is demand for it, I'll release a revised version of terminal.c that will send output to: 1) IFI terminal, 2) your software, or 3) to the LCD that I mentioned above.
-Kevin
You might consider adding a colored rectangle around the center pixel that represents the maximum allowable error as defined in tracking.h. The tracking software won't move the servos if the centroid pixel is within this rectangle.
During development of the camera software I used one of these standalone LCD graphics displays (http://www.ezlcd.com/) to display in real-time what the software is doing. It's very similar to your software and kinda fun to watch.
If there is demand for it, I'll release a revised version of terminal.c that will send output to: 1) IFI terminal, 2) your software, or 3) to the LCD that I mentioned above.
-Kevin
Kevin, I got our camera working about an hour ago and I would be interested in this code. Also, I looked at the different displays on the linked page, but which one did you use for your display? When I was watching the output on the terminal it seemed to go by quickly; does your LCD display also do that?
-Mike
Team 1654
Uberbots
18-01-2006, 23:27
In IFI's documentation of the camera, there is a section that describes the possible input and output values for the camera.
What to look for? There are a page or two full of tables with the specific codes needed for the CMUcam to recognize and access its values. These go anywhere from inputting the color values to... (I don't really know where it ends; to tell you the truth, I'm still reading it myself :D )
--ss32
Kevin Watson
19-01-2006, 00:13
Kevin, I got our camera working about an hour ago and I would be interested in this code. Also, I looked at the different displays on the linked page, but which one did you use for your display? When I was watching the output on the terminal it seemed to go by quickly; does your LCD display also do that?
-Mike
Team 1654
Here's an image (http://kevin.org/frc/lcd.gif) of the display, and a .avi movie (http://kevin.org/frc/lcd.zip), which shows the display in action. The movie is around 22MB.
Edit: The red box in the middle represents the target area defined by the two ALLOWABLE_ERROR #defines in tracking.h. The green box that's moving around is the green target and the single green pixel in the middle of the green blob is the centroid of the blob. The tracking software is designed to keep the centroid pixel within the red box. If the centroid pixel moves outside of the box, the software will move the servos in an attempt to get the centroid pixel back into the box. It's kind of fun to watch.
-Kevin
Here's an image (http://kevin.org/frc/lcd.gif) of the display, and a .avi movie (http://kevin.org/frc/lcd.zip), which shows the display in action. The movie is around 22MB.
Edit: The red box in the middle represents the target area defined by the two ALLOWABLE_ERROR #defines in tracking.h. The green box that's moving around is the green target and the single green pixel in the middle of the green blob is the centroid of the blob. The tracking software is designed to keep the centroid pixel within the red box. If the centroid pixel moves outside of the box, the software will move the servos in an attempt to get the centroid pixel back into the box. It's kind of fun to watch.
-Kevin
Cool, this is quite neat. Did you hook it up on the RC or OI somehow? Which model # was that LCD screen?
-Mike
Kevin Watson
19-01-2006, 01:28
Cool, this is quite neat. Did you hook it up on the RC or OI somehow? Which model # was that LCD screen?
-Mike
It's the ezlcd-001 and it's wired to serial port one/programming port.
-Kevin
Joe Hershberger
20-01-2006, 03:22
It's the ezlcd-001 and it's wired to serial port one/programming port.
-Kevin
Kevin,
That display looks very nice. A bit expensive for just playing around, but very cool. Is the interface to it pretty straightforward? Does it have primitive graphics functions included, such as draw circle, draw rect, and draw text? What kind of update rate can you get on it over the serial port?
Very impressive.
-Joe
varcsscotty
20-01-2006, 03:29
Is there a way to use this in the dashboard? Is it allowed?
Kevin Watson
20-01-2006, 11:46
Does it have primitive graphics functions included, such as draw circle, draw rect, and draw text?
Yes, it does. You can download the documentation from the website.
What kind of update rate can you get on it over the serial port?
I don't know what the upper bound is. I have the display update with each new t-packet, and those come in at around twelve hertz.
-Kevin
viewtyjoe
21-01-2006, 10:35
When using the code provided by Kevin Watson, I've gotten some weird calculations for the distance to the target (we're talking 300 inches in error). I believe this is because the camera is calculating the wrong tilt angle, and hopefully, I'll have the values calculated before today's over. If anyone has a tilt angle calculation that works, post a link or post it here, please.
Joe Hershberger
21-01-2006, 12:03
When using the code provided by Kevin Watson, I've gotten some weird calculations for the distance to the target (we're talking 300 inches in error). I believe this is because the camera is calculating the wrong tilt angle, and hopefully, I'll have the values calculated before today's over. If anyone has a tilt angle calculation that works, post a link or post it here, please.
What code of Kevin's is supposed to give you the distance? I didn't even realize that it computed the angle at this point.
-Joe
Kevin Watson
21-01-2006, 13:53
What code of Kevin's is supposed to give you the distance? I didn't even realize that it computed the angle at this point.
-Joe
range = (green light height - camera height) / tan(tilt angle)
-Kevin
maniac_2040
22-01-2006, 01:02
Originally Posted by viewtyjoe
When using the code provided by Kevin Watson, I've gotten some weird calculations for the distance to the target (we're talking 300 inches in error). I believe this is because the camera is calculating the wrong tilt angle, and hopefully, I'll have the values calculated before today's over. If anyone has a tilt angle calculation that works, post a link or post it here, please.
Our team got the distance calculation to come within 50 millimeters (~2 inches). This was around an angle of 30 degrees and using a height much lower than the actual height of the vision target. Using the actual height of the target, and at greater distances, our calculations came within 500 millimeters (~20 inches). To calculate tilt angle, our team placed the vision target at a known height, then, using special right triangles, placed our camera at a distance that should give us a known angle. Doing this, and recording the servo position at the time, we discovered that an angle of 30 degrees (above the horizontal) is a servo position of approximately 186, and an angle of 45 degrees is a servo position of around 214. You can use these values to convert between servo position and angle.
One reason there is error in the angle/distance calculation is that there's no telling which part of the vision target the camera will start tracking. It depends on how the search algorithm catches it. It could start tracking the bottom, top, side... anywhere. It won't automatically aim for the center. Since the target is 200 mm tall, this can result in significant error. For us, with the camera at the same position, the angle sometimes varied by a degree or two, which, believe me, can equal a significant amount of distance. There are many ways to approximate and get more accurate results, but it'll never be dead on. Anyways, has anyone else gotten better results?
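Those two measured points are enough for a simple linear conversion between servo position and tilt angle. Here's a sketch (the names are made up, and any other camera mount will need its own calibration numbers):

/* Calibration points measured above: servo position 186 at 30 degrees
   above the horizontal, 214 at 45 degrees. That slope works out to
   roughly 0.54 degrees per servo step. */
#define CAL_POS_LOW 186.0
#define CAL_ANGLE_LOW 30.0
#define CAL_POS_HIGH 214.0
#define CAL_ANGLE_HIGH 45.0

/* Converts a tilt servo position to degrees above the horizontal by
   linear interpolation between the two calibration points. */
double Tilt_Angle(unsigned char tilt_servo_position)
{
    return CAL_ANGLE_LOW + ((double)tilt_servo_position - CAL_POS_LOW) *
        (CAL_ANGLE_HIGH - CAL_ANGLE_LOW) / (CAL_POS_HIGH - CAL_POS_LOW);
}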
Joe Hershberger
23-01-2006, 02:47
Our team got the distance calculation to come within 50 millimeters (~2 inches). This was around an angle of 30 degrees and using a height much lower than the actual height of the vision target. Using the actual height of the target, and at greater distances, our calculations came within 500 millimeters (~20 inches). To calculate tilt angle, our team placed the vision target at a known height, then, using special right triangles, placed our camera at a distance that should give us a known angle. Doing this, and recording the servo position at the time, we discovered that an angle of 30 degrees (above the horizontal) is a servo position of approximately 186, and an angle of 45 degrees is a servo position of around 214. You can use these values to convert between servo position and angle.
One reason there is error in the angle/distance calculation is that there's no telling which part of the vision target the camera will start tracking. It depends on how the search algorithm catches it. It could start tracking the bottom, top, side... anywhere. It won't automatically aim for the center. Since the target is 200 mm tall, this can result in significant error. For us, with the camera at the same position, the angle sometimes varied by a degree or two, which, believe me, can equal a significant amount of distance. There are many ways to approximate and get more accurate results, but it'll never be dead on. Anyways, has anyone else gotten better results?
Have you tried using the other tracking data that you get to refine your angles? Just look at the tracking centroid that's returned and adjust from it; it will tell you where in the image the center of the target is.
Cheers!
-Joe
viewtyjoe
24-01-2006, 15:41
We've calibrated our camera tilt angle based on where it is parallel to the frame, and we've been seeing fairly accurate results (less than 5 inches of error) with the target mounted a little less than 20 inches below where it will be on the field. The problem was that the servo's neutral position was slightly below the horizontal.
So just make sure that the values you use in calculating tilt angle are valid.
Graham Donaldson
31-01-2006, 20:14
Does anyone know the degree of error the camera's program/servo will accept? We were testing pan today, and we determined that (from 14 feet) it had a margin of error of approximately 2.5 degrees. Can anyone confirm this or contradict it, and do you know if the degree of error is the same for tilt? Thanks.
Kevin Watson
31-01-2006, 22:58
Does anyone know the degree of error the camera's program/servo will accept? We were testing pan today, and we determined that (from 14 feet) it had a margin of error of approximately 2.5 degrees. Can anyone confirm this or contradict it, and do you know if the degree of error is the same for tilt? Thanks.
Is this a fixed error, meaning it is always off by 2.5 degrees across the pan range? Or do you mean you took a bunch of samples and found the error to be +/- 2.5 degrees? In general, your maximum error should be around one half of a PWM step, and given that one PWM step swings the camera ~1/2 of a degree, I'd expect that you'd be able to do better than one degree. This was also discussed in this thread (http://www.chiefdelphi.com/forums/showthread.php?t=41928).
-Kevin
Steve Orr
31-01-2006, 23:06
Is this a fixed error, meaning it is always off by 2.5 degrees across the pan range? Or do you mean you took a bunch of samples and found the error to be +/- 2.5 degrees? In general, your maximum error should be around one half of a PWM step, and given that one PWM step swings the camera ~1/2 of a degree, I'd expect that you'd be able to do better than one degree. This was also discussed in this thread (http://www.chiefdelphi.com/forums/showthread.php?t=41928).
-Kevin
If you are using the common six-wheel drive, then the robot will tilt differently depending on whether it is on the back four wheels or the front four wheels (given that the center wheels are slightly lowered). Could this be providing extra error on the tilt?
Something to think about, but it shouldn't affect the pan angle...
Alan Anderson
01-02-2006, 07:27
Does anyone know the degree of error the camera's program/servo will accept? We were testing pan today, and we determined that (from 14 feet) it had a margin of error of approximately 2.5 degrees. Can anyone confirm this or contradict it, and do you know if the degree of error is the same for tilt? Thanks.
Once the camera has centered itself on the target, it won't move until the target gets sufficiently far from the center of its view. I think the default number of pixels in the "deadband" ends up equivalent to a little more than one degree either side of center. I don't have a copy of the camera code handy, so I can't quote the actual #define for you.
You might try reducing the size of the "deadband", or you could read the mx and my values from the camera to tell you how far off-center the target actually is.
Once the camera has centered itself on the target, it won't move until the target gets sufficiently far from the center of its view. I think the default number of pixels in the "deadband" ends up equivalent to a little more than one degree either side of center. I don't have a copy of the camera code handy, so I can't quote the actual #define for you.
You might try reducing the size of the "deadband", or you could read the mx and my values from the camera to tell you how far off-center the target actually is.
Does anyone know how many pixels the camera code uses when calculating the pixel error? I believe the image is downsampled, but I can't find by how much.
Thanks.
Kevin Watson
01-02-2006, 10:46
Once the camera has centered itself on the target, it won't move until the target gets sufficiently far from the center of its view. I think the default number of pixels in the "deadband" ends up equivalent to a little more than one degree either side of center. I don't have a copy of the camera code handy, so I can't quote the actual #define for you.
Here's the relevant code from tracking.h:
// These values define how much error, in pixels, is
// allowable when trying to keep the center of the tracked
// object on the center pixel of the camera's imager. Too
// high a value and your pointing accuracy will suffer, too
// low and your camera may oscillate because the servos
// don't have enough pointing resolution to get the center
// of the tracked object into the square/rectangle defined
// by these values
#define PAN_ALLOWABLE_ERROR_DEFAULT 6
#define TILT_ALLOWABLE_ERROR_DEFAULT 6
If you're using the bells and whistles version, the values can also be changed via the tracking menu.
You might try reducing the size of the "deadband", or you could read the mx and my values from the camera to tell you how far off-center the target actually is.
There is example code in terminal.c that shows how to calculate the pointing error. If you're using the bells and whistles code, use this instead:
int Pan_Error;
int Tilt_Error;
Pan_Error = (int)T_Packet_Data.mx - (int)Tracking_Config_Data.Pan_Target_Pixel;
Tilt_Error = (int)T_Packet_Data.my - (int)Tracking_Config_Data.Tilt_Target_Pixel;
-Kevin
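One sketch of how those error values and the allowable-error deadband might fit together, for example to decide when a turret has a good firing solution (the function name is made up, and this assumes the two ALLOWABLE_ERROR defaults from tracking.h above are still in effect):

#include <stdlib.h> /* for abs() */

/* Returns 1 when the target centroid sits inside the same deadband
   the tracking code uses, i.e. the servos are holding still on target. */
unsigned char On_Target(int pan_error, int tilt_error)
{
    return (abs(pan_error) <= PAN_ALLOWABLE_ERROR_DEFAULT) &&
        (abs(tilt_error) <= TILT_ALLOWABLE_ERROR_DEFAULT);
}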
Graham Donaldson
01-02-2006, 11:24
Once the camera has centered itself on the target, it won't move until the target gets sufficiently far from the center of its view. I think the default number of pixels in the "deadband" ends up equivalent to a little more than one degree either side of center. I don't have a copy of the camera code handy, so I can't quote the actual #define for you.
You might try reducing the size of the "deadband", or you could read the mx and my values from the camera to tell you how far off-center the target actually is.
What we were doing is we had the camera set up on a protractor. Whenever we turned it, the camera would adjust, but you had to turn it more than 2.5 degrees; we found that to be the minimum it could turn without the camera adjusting. I think what you said, Alan, is true. I'm not on programming (I'm on I&T), but I'll talk to our programmers to see if they can reduce the size of the "deadband".
Also, if you (theoretically, as I'm not on programming and not sure how it works) make the size of the space that the program looks to fill bigger, then would it be more accurate? So it has to fill more and therefore moves more often to keep it full? I don't know if this will work; again, I have no idea how the program operates. That's why I'm on I&T: I tell them what to do, and they do it. WAY too confusing for me.
Anyways, thanks for your help with this.