
View Full Version : Two Color Tracking Demo


Bharat Nain
13-01-2009, 22:05
Here is a version that tracks the gimbal type setup provided by FIRST. It works in autonomous mode. WPI has finally released it.

http://first.wpi.edu/FRC/frcupdates.html

Booksy
13-01-2009, 22:10
Anyone find the labview version yet?

EDIT: usfirst has (or will have) a link to it on their 2009 control system page. I guess I'll just wait for that to be up.

R.C.
13-01-2009, 23:50
Thanks for the post and letting us know.

Greg McKaskle
14-01-2009, 09:18
The LV version is at http://joule.ni.com/nidu/cds/view/p/lang/en/id/1215

Greg McKaskle

Mike Mahar
14-01-2009, 09:45
I can't compile this code. The module Target.cpp calls a function "InArea" that is defined in the WPILib module TrackAPI.cpp. TrackAPI.h, however, does not declare this function. I could easily add this declaration to Target.cpp but I wonder if this was written for a WPILib that hasn't been released yet. Some other posts have indicated that there are bugs in the vision system and this example may not address them. Has there been any WPILib update since 1/1/09 that I don't have?

foemill
14-01-2009, 10:01
The function "InArea" can be found in the source code for build 1562, found here (http://first.wpi.edu/Images/CMS/First/WorkbenchUpdateV1.2.1562Source.zip). The source has changed since the kickoff, because the code I have from just before the kickoff doesn't contain the InArea function either.

Try re-installing the Workbench update (http://first.wpi.edu/Images/CMS/First/WorkbenchUpdateV1.2.1562.exe). It looks like they may have added it to this install. Otherwise you'll have to re-build the source and link in the WPIlib.a file generated by the make process. The default location for the WPILib source tree ended up in C:\WinDriver\vxworks-6.3\WPILib.



Mike Mahar
14-01-2009, 11:40
I just downloaded build 1562 from the WPI site and it is identical to the one that I downloaded on Jan 2. My issue isn't that the function InArea is missing from the library, but that the declaration of it is missing from the header TrackAPI.h. The latest version does not fix this. If I put the declaration for InArea in Target.cpp, then the demo compiles and links.

I suspect that they have added the declaration of InArea to their version of the library but have not yet released it. The demo, however, seems to have been built with a newer library than the one we have.
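The interim fix is just a matter of making a declaration visible before the call. A minimal, self-contained illustration of the pattern (the prototype below is hypothetical; copy the real one from the InArea definition in TrackAPI.cpp — the stand-in definition exists only so this sketch compiles on its own):

```cpp
#include <cassert>

// Illustration of the fix described above. A function defined in one
// translation unit (TrackAPI.cpp in the real project) can only be
// called from another file if a matching declaration is visible there.
// NOTE: this prototype is HYPOTHETICAL; copy the real one from the
// InArea definition in TrackAPI.cpp.
bool InArea(int x, int y, int left, int top, int right, int bottom);

// Stand-in definition so this sketch is self-contained. In the real
// project the definition already lives in TrackAPI.cpp; you only add
// the declaration above to Target.cpp (or, better, to TrackAPI.h).
bool InArea(int x, int y, int left, int top, int right, int bottom) {
    return x >= left && x <= right && y >= top && y <= bottom;
}
```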



fabalafae
14-01-2009, 12:17
Thanks for the LV link.

PhilBot
14-01-2009, 15:27
FYI.

It's possible to build the camera gimbal back to front.

Looking at the gimbal from the front of the robot, the servo needs to be on the left hand side. It will go together just fine with the servo on the right, but the net effect is that up and down will be reversed on the Y servo.

It's easy to spot this (as in my case) because when the camera sees the target, it sweeps up or down away from it.

I tried fixing it in software but I couldn't quite get it working, so I opted for fixing the camera.... It just meant detaching the servo horn from the tilt servo, and flipping the camera plate by 180 degrees. (Of course you have to invert the camera and rotate the base by 180 on the robot as well).

It's a great example VI (for LabVIEW), able to be almost dropped right into either of the robot templates.

RyanW
14-01-2009, 17:22
Is there anything special that needs to be done to use the LabVIEW version of this code? We tried running the code on the cRIO using the run button, and it refused to communicate. It was communicating just fine with other code we'd been working with (using the Basic Robot Main VI), so I'm inclined to think that it's some sort of missing software update, but I'm not sure what that could be. Our team has update 2.0 of the FIRST robotics update (found using the about menu item).

We have gotten the camera to work before, connected to the cRIO and directly to our programming laptop.

Please help!

PhilBot
14-01-2009, 19:39
Hi

Yes..... Most "sample" programs come configured with a generic team number (read: IP address) or none at all.

Once you open the project, you will see the IP address as part of the realtime program node name (cRIO Controller 10.0.0.2).

Right Click on the Realtime node and select Properties.

From there you will see a place to change the program's name and IP address.

Set it based on your team # 10.xx.yy.2

Now you should be able to connect using the run button.
Make sure you RUN the main vi (eg: Two Color Servo Camera Example.vi)

Have Fun.

RyanW
16-01-2009, 14:06
Yes, that was the problem. Thank you, PhilBot.

dani190
17-01-2009, 12:51
Someone stated that the code must be run in autonomous mode...

Is this true? That might be why I've had such a hard time all along.

Dr Nick
17-01-2009, 13:28
@dani - The tracking is only active in autonomous mode, the tele-op is just the standard SimpleRobot tele-op.

Does anyone know if there are step-by-step instructions for assembling the camera gimbal? I found the drawing on the FIRST site, but the only part I can't figure out is how to attach the "Axis Case" part to the camera. Other than that I could probably figure the rest out by looking at the drawings.

dani190
17-01-2009, 13:30
Sorry, which part? It is relatively easy; just clarify which part and I can help you out.

RKElectricalman
17-01-2009, 13:56
Can anyone explain to me how to open up the project in Wind River?

Also, when I was working with LabVIEW, I couldn't get it to track. I figured out how to power the servos with the included jumpers, but I couldn't get the camera to actually move autonomously and track the colors, even though it did see them in the VI. Has anyone had issues similar to this?

pollyproof12
17-01-2009, 15:59
Anyone have pictures of how they set up their camera?

nickmagus
17-01-2009, 20:14
nvm

laultima
17-01-2009, 20:26
The color values from this work beautifully. We've got our camera tracking, both pan and tilt, from code we wrote using these colors in WindRiver. I'll be happy to post the code here if anyone wants to look it over.

btgdaniel
18-01-2009, 05:00
Yes, please post your code -- it would be nice to see another example.

Greg McKaskle
18-01-2009, 09:48
A few clarifications. The example for two color tracking does need to have the DS connected and enabled, but it doesn't care if it is autonomous or tele. You can incorporate it into either or both later.

If you are having problems with the LV example, I'd be glad to help. The first thing I'd advise is setting the radio button near the image display to turn off the gimbal, so the camera will hold still and you will see what the camera sees. Make sure it is focused, make sure your target is in the scene. And hopefully the mask display below will show you the results of the red/green thresholds.

Lighting has a huge impact on vision, and even though this app was written to be really tolerant of lighting, it can still mess with you. So I usually tilt the target forward and back a bit to see how that will affect things. Move it around to see how overhead lights affect things. Splotchy lighting is some of the worst.

If you are having difficulties at this point, it will be necessary to adjust the camera settings; there's no use trying to track yet.

Greg McKaskle

Dr Nick
18-01-2009, 19:44
I've got the camera mounted and scanning for the target, but it won't lock on to it (using C++). I got an image from the camera and set the threshold values for my lighting conditions, so I have no idea why it's not working. To make matters worse, there is a problem with DPRINTF (I think). When it got to the DPRINTF command, the console reported an error, saying "malformed calling sequence". Is anyone else having this problem?

Mikesername
19-01-2009, 13:44
So how do you go about hooking it up so the motors attached to the wheels move to follow the target?

Ted Weisse
19-01-2009, 14:16
If you are having problems with the LV example, I'd be glad to help. The first thing I'd advise is setting the radio button near the image display to turn off the gimbal, so the camera will hold still and you will see what the camera sees. Make sure it is focused, make sure your target is in the scene. And hopefully the mask display below will show you the results of the red/green thresholds.

Greg McKaskle


We can track a single color, and in camera mode the example sees both red and green thresholds. When we enable the gimbal and the DS, the pan/tilt just searches. On the screen we see the green lagging the red target as it goes across the screen.

What are we doing wrong?

Thanks....

laultima
19-01-2009, 17:47
Here's the code we're using for target tracking, using the color values from the TwoColorTrackDemo. All variables are doubles declared before the while loop, and pan and tilt are Servos.

if (FindColor(IMAQ_HSL, &redData.hue, &redData.saturation,
              &redData.luminance, &aRed) &&
    FindColor(IMAQ_HSL, &greenData.hue, &greenData.saturation,
              &greenData.luminance, &aGreen)) {
    redX = aRed.center_mass_x_normalized;
    redY = aRed.center_mass_y_normalized;
    greenX = aGreen.center_mass_x_normalized;
    greenY = aGreen.center_mass_y_normalized;
    centerX = (redX + greenX) / 2;
    centerY = (redY + greenY) / 2;
    panMove = tan((PI / 4) * centerX) / 8;
    tiltMove = tan((PI / 4) * centerY) / 8;
    pan->Set(pan->Get() + panMove);
    tilt->Set(tilt->Get() - tiltMove);
}

Greg McKaskle
19-01-2009, 21:38
On the screen we see the green lagging the red target as it goes across the screen.

What are we doing wrong?


Are you holding the target so that it looks like a foe? The panel lets you describe which alliance you are on, and by default it looks for and tracks the biggest foe.

Greg McKaskle

Ted Weisse
19-01-2009, 21:57
Yes, as foe. The servos just hunt, never locking on to the target.

Greg McKaskle
19-01-2009, 22:28
If the red and green masks look good, and you're holding the target in the correct orientation, the next thing I'd do is watch the Tracking State indicator. When the camera doesn't see the target, it should search. When it acquires some pink and green that make up a foe, it uses the proportional constants to move the camera so that the target is closer to center. You may want to check the sign of the constants to see if the servos are moving in the wrong direction.

If you don't stumble on the issue soon, one thing that will help with debugging will be to put the target in frame, but off center. Position the servos with the FP knob and slider if you need to. Then put a break point after the FindTwoColor and follow it into the Servo Track VI to determine which of these is the issue.

Greg McKaskle

rcoren22
20-01-2009, 19:13
We are able to get colors in the mask, with tracking indicators showing position and target numbers, but we cannot get any output to the servo motors.
We have connected the servos to PWM 9 and 10.

Are the target numbers what should be output to servos on PWM 9 & 10?

Booksy
20-01-2009, 19:38
We are able to get colors in the mask, with tracking indicators showing position and target numbers, but we cannot get any output to the servo motors.
We have connected the servos to PWM 9 and 10.

Are the target numbers what should be output to servos on PWM 9 & 10?

Forgive me if this has been mentioned before: does the servo updating loop have a watchdog feeder? This has been a problem for me numerous times. It might already have it, I forget, but I know the first one (single color) didn't and I had to add it.

PhilBot
20-01-2009, 19:48
Also.... Don't forget to ADD the servo power jumpers next to the two servo plugs you are using on the Digital Sidecar.

btgdaniel
20-01-2009, 20:13
Here's the code we're using for target tracking, using the color values from the TwoColorTrackDemo. All variables are doubles declared before the while loop, and pan and tilt are Servos.

if (FindColor(IMAQ_HSL, &redData.hue, &redData.saturation,
              &redData.luminance, &aRed) &&
    FindColor(IMAQ_HSL, &greenData.hue, &greenData.saturation,
              &greenData.luminance, &aGreen)) {
    redX = aRed.center_mass_x_normalized;
    redY = aRed.center_mass_y_normalized;
    greenX = aGreen.center_mass_x_normalized;
    greenY = aGreen.center_mass_y_normalized;
    centerX = (redX + greenX) / 2;
    centerY = (redY + greenY) / 2;
    panMove = tan((PI / 4) * centerX) / 8;
    tiltMove = tan((PI / 4) * centerY) / 8;
    pan->Set(pan->Get() + panMove);
    tilt->Set(tilt->Get() - tiltMove);
}

Much appreciated, although in all honesty I have an admission to make: If all else fails, read the directions. "This demo assumes that the camera is set up as follows: White balance: Fluorescent 1, Auto Exposure on, Brightness: 40". Ouch. ==> Now it works as advertised. :)

rcoren22
20-01-2009, 20:24
Thank you PhilBot....the jumper cable was the problem. (As for now...)

kyungjin
20-01-2009, 23:57
To the people that have this working in WindRiver...

We've been having some problems trying to get this set up as a workable project. How do you do it? Do you just drag the extracted folder into the workspace?

I've tried creating a new project from a workspace in an external location and linking it, but I get errors finding "AxisCamera.h". When that's removed or moved somewhere else, the next header file in the #include statements suffers the same problem...

Can anyone give step-by-step instructions for getting this to work? Any help would be much appreciated.

Obscuriti
21-01-2009, 22:37
Hey all,

We have the camera tracking both tilt and pan, and we re-engineered it to work in our Robot Main VI. What I would like to do now is convert the % Area output to a distance measurement. Has anyone successfully done this, or have any ideas on how?

Thanks Much,

RyanW
22-01-2009, 00:31
what I would like to do is convert the % Area output to a distance measurement. Has anyone successfully done this or have any ideas on how...

I would suggest simply creating a look-up table of a few values (get the %area output at 1 foot, 2 feet, 3 feet, or something like that). Depending on the resolution you want (to the nearest foot? 6 inches? inch?), you could either just find the closest table value to what you are seeing, or else try to create a distance formula based on your data points.

Alternatively, since you know exactly how high off the ground the targets will be, and you presumably know the position of the camera on your robot, you could try measuring the tilt angle and using trigonometry to find the distance as follows:
tan(tilt) = height of target above camera / horizontal distance from camera to object, where tilt is the angle above or below the horizontal plane. This would require finding another kind of data, namely the relationship between servo outputs and degree or radian values.
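The lookup-table idea can be sketched in a few lines of C++; every calibration number below is invented for illustration and would need to be replaced with %area readings measured at known distances:

```cpp
#include <cassert>
#include <utility>

// Hypothetical calibration table of {percent area, distance in feet},
// measured by holding the target at known distances. All of these
// numbers are made up for illustration; substitute your own readings.
static const std::pair<double, double> kCalibration[] = {
    {25.0, 1.0},   // e.g. target fills 25% of the frame at 1 ft
    {6.5,  2.0},
    {2.9,  3.0},
    {1.6,  4.0},
};
static const int kNumPoints = sizeof(kCalibration) / sizeof(kCalibration[0]);

// Linearly interpolate a distance from the table. Entries are sorted
// by descending %area, since closer targets look bigger.
double AreaToFeet(double percentArea) {
    if (percentArea >= kCalibration[0].first) return kCalibration[0].second;
    for (int i = 1; i < kNumPoints; i++) {
        if (percentArea >= kCalibration[i].first) {
            double a0 = kCalibration[i - 1].first, d0 = kCalibration[i - 1].second;
            double a1 = kCalibration[i].first,     d1 = kCalibration[i].second;
            double t = (percentArea - a0) / (a1 - a0);
            return d0 + t * (d1 - d0);
        }
    }
    return kCalibration[kNumPoints - 1].second;  // farther than the last point
}
```

With four or five points this already gives roughly foot-level resolution; add more points (or switch to the trig approach) if you need finer granularity.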

Obscuriti
22-01-2009, 01:29
Thanks for the quick response,

I've been working on a distance formula, and I must be a bit rusty on this seemingly simple math.

I sampled a number of data points and fitted a curve to them that should be close enough for any practical purpose.

I measured feet vs. % Area, and when graphed and fitted, the equation was roughly y = 80,000x^-2.0, with y being pixels and x being feet, assuming that at a resolution of 320x240 there are 76,800 pixels and the percent area is the share of pixels that the mask is tracking as the object.


I get 2 equations as follows... they work independently but not when set equal to each other.
pixels = 76800(.01(% Area))
pixels = 80,000(feet^-2.0)

so 76800(.01(% Area)) = 80,000(feet^-2.0), right?
simplifying to feet^-2 = .0096(% Area)

When solved for x I end up with a messy

feet = e^((ln(.0096(% Area)))/-2)

I don't know, maybe it does work; I'd still like a check and some input, please.

Thanks Again
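For what it's worth, the expression checks out: e^(ln(x)/-2) is just x^(-1/2), so the whole thing collapses to feet = 1/sqrt(.0096 × %Area). A small C++ sketch (the .0096 constant comes from the 80,000-pixel curve fit above, which is specific to that camera setup):

```cpp
#include <cassert>
#include <cmath>

// Closed form of the curve fit above: pixels = 76800 * (%Area / 100)
// and pixels = 80000 * feet^-2 combine to feet^-2 = 0.0096 * %Area,
// hence feet = 1 / sqrt(0.0096 * %Area). The 80000-pixel constant is
// the poster's fit and will differ for other cameras and targets.
double AreaToFeet(double percentArea) {
    return 1.0 / std::sqrt(0.0096 * percentArea);
}
```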

dani190
22-01-2009, 22:45
Well, in physics we have a range formula that splits what is basically a 3rd-year university problem into 2 parts that a grade 12 physics student can do. It essentially uses orthogonal components...
Maybe that would also work?

Also, guys, if I take the code and transplant it into my existing robot code, what needs to be modified so that the camera will start tracking while it's driving?

LinuxMercedes
22-01-2009, 23:36
Ok...I feel really dumb. Especially after reading how you all have the code working...

I can't get it to compile. I've reinstalled all the WindRiver updates, but when I compile WPI's sample code, I get 100+ errors in nivision.h. Do I have to grab the latest WPILib source code and compile that to make this work, or am I missing something uber-obvious?

Thanks so much =]

warpling
23-01-2009, 21:15
@Obscuriti
That is exactly what I have been trying to do for the past 3 hours. I'm a rookie programmer, and LabVIEW has a nice steep learning curve. I have our test robot's camera gimbal tracking a board with the colors, but I can't figure out how to get outputs and change them into something that can drive our two Jaguar motors.
Also, that trig idea is amazing; I can't wait to try it out if things get working! :yikes:

XXShadowXX
23-01-2009, 21:50
We had an issue with the LabVIEW two-color tracking demo from usfirst.org giving us incorrect servo values in the Y axis. When the target moved vertically, the tracking would move in the opposite direction...

If this happens to anyone else, here's the solution:
|170 - Y| = output
Use that formula. You don't really need the absolute value, but that should fix the problem.
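In C++ terms (the post above is about the LabVIEW demo, so this is only an illustrative sketch), the fix amounts to reflecting the commanded tilt angle across the servo's travel:

```cpp
#include <cassert>

// Reflect a tilt command across the servo's full travel so a gimbal
// assembled mirror-image still moves the camera the right way. The
// 170-degree range matches the |170 - Y| formula above; verify it
// against your own servo's actual range.
double InvertTiltAngle(double angleDegrees) {
    const double kServoRangeDegrees = 170.0;
    return kServoRangeDegrees - angleDegrees;
}
```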

Mikesername
23-01-2009, 22:13
@Obscuriti
That is exactly what I have been trying to do for the past 3 hours. I'm a rookie programmer, and LabVIEW has a nice steep learning curve. I have our test robot's camera gimbal tracking a board with the colors, but I can't figure out how to get outputs and change them into something that can drive our two Jaguar motors.
Also, that trig idea is amazing; I can't wait to try it out if things get working! :yikes:

Yeah, we were trying to do that, and one of our mentors said that it's pretty complicated... a little over the heads of a rookie team.

:\

XXShadowXX
23-01-2009, 22:23
Thanks for the quick response,

I've been working on a distance formula, and I must be a bit rusty on this seemingly simple math.

I sampled a number of data points and fitted a curve to them that should be close enough for any practical purpose.

I measured feet vs. % Area, and when graphed and fitted, the equation was roughly y = 80,000x^-2.0, with y being pixels and x being feet, assuming that at a resolution of 320x240 there are 76,800 pixels and the percent area is the share of pixels that the mask is tracking as the object.


I get 2 equations as follows... they work independently but not when set equal to each other.
pixels = 76800(.01(% Area))
pixels = 80,000(feet^-2.0)

so 76800(.01(% Area)) = 80,000(feet^-2.0), right?
simplifying to feet^-2 = .0096(% Area)

When solved for x I end up with a messy

feet = e^((ln(.0096(% Area)))/-2)

I don't know, maybe it does work; I'd still like a check and some input, please.

Thanks Again

(height of the divide in the trailer - height of the camera) * cot(input angle) = distance

or (72 - camera height) * cot(y) = horizontal distance from you to the trailer.

Our distance is currently accurate to about an inch, using pretty inaccurate numbers for camera height and trailer-divide height, since the trailer isn't mounted onto the robot; it kind of leans on the hitch on the ground...

y is what the angle of the vertical servo is called in the tracking program released by FIRST.
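A sketch of that calculation in C++ (the 72-inch trailer-divide height is from the post; the camera height is whatever you measure on your own robot):

```cpp
#include <cassert>
#include <cmath>

// Trig range estimate from the post above: given the tilt servo angle
// y in degrees above horizontal, and the height of the trailer divide
// (72 in. per the post) minus the measured camera height, the
// horizontal distance is (height difference) * cot(y).
double DistanceInches(double tiltDegrees, double cameraHeightInches) {
    const double kTargetHeightInches = 72.0;  // trailer-divide height
    const double kPi = 3.14159265358979323846;
    double radians = tiltDegrees * kPi / 180.0;
    return (kTargetHeightInches - cameraHeightInches) / std::tan(radians);  // cot = 1/tan
}
```

Note that the estimate blows up as y approaches 0 (target at camera height), so in practice you'd clamp or reject very small tilt angles.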

ikhana870
25-01-2009, 18:58
Can anyone explain to me how to open up the project in Wind River?

Also, when I was working with LabVIEW, I couldn't get it to track. I figured out how to power the servos with the included jumpers, but I couldn't get the camera to actually move autonomously and track the colors, even though it did see them in the VI. Has anyone had issues similar to this?

To the people that have this working in WindRiver...

We've been having some problems trying to get this set up as a workable project. How do you do it? Do you just drag the extracted folder into the workspace?

I've tried creating a new project from a workspace in an external location and linking it, but I get errors finding "AxisCamera.h". When that's removed or moved somewhere else, the next header file in the #include statements suffers the same problem...

Can anyone give step-by-step instructions for getting this to work? Any help would be much appreciated.

Our team is also having some issues with this... Anyone get it working yet?