How far have you gotten on coding the camera tracking?
Our team is mostly finished. We just need to debug and make it more efficient.
Re: How far have you gotten on coding the camera tracking?
We are still trying to improve the response delay; the robot sometimes takes half a second to react. The camera's FPS is almost never above 10, but I don't know whether that is an issue on the robot or just in the laptop communication.
We also need to make it work in different lighting conditions.
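One way to tell whether a 10 FPS ceiling comes from the camera itself or from the processing on the laptop is to time the two stages separately. A rough Python/OpenCV sketch of that kind of measurement (the stream URL is a placeholder, not this team's setup; a LabVIEW dashboard would need the equivalent timing probes):

```python
import time
import cv2

# Hypothetical stream address; substitute the real camera URL or a device index.
cap = cv2.VideoCapture("http://10.0.0.11/mjpg/video.mjpg")

grab_times, proc_times = [], []
for _ in range(100):
    t0 = time.time()
    ok, frame = cap.read()            # time spent waiting on the camera/network
    t1 = time.time()
    if not ok:
        break
    _ = cv2.cvtColor(frame, cv2.COLOR_BGR2HLS)   # stand-in for the tracking step
    t2 = time.time()
    grab_times.append(t1 - t0)
    proc_times.append(t2 - t1)

if grab_times:
    print("avg grab %.1f ms" % (1000 * sum(grab_times) / len(grab_times)))
    print("avg proc %.1f ms" % (1000 * sum(proc_times) / len(proc_times)))
```

If the grab time dominates, the limit is the camera or the network; if the processing time dominates, the vision code is the bottleneck.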
Re: How far have you gotten on coding the camera tracking?
To update, I'm on the same team as Sean.
We can now successfully find a flag, check whether it's friend or foe, and track the motion of that flag if it is a foe. We are going to work on code to decide which flag to use if more than one is present in the field of view. Hopefully we'll get a good start tonight.
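On picking among multiple flags, one straightforward rule is to take the largest foe blob in view, which is roughly the closest one. A small sketch of that selection, assuming each detection already carries an area and a friend/foe flag (the field names are made up for illustration):

```python
def pick_target(detections):
    """detections: list of dicts like {"x": ..., "area": ..., "is_foe": ...}.
    Returns the foe with the largest apparent area, or None if no foe is seen."""
    foes = [d for d in detections if d["is_foe"]]
    if not foes:
        return None
    return max(foes, key=lambda d: d["area"])

# Example with made-up detections:
print(pick_target([
    {"x": 120, "area": 400,  "is_foe": True},
    {"x": 200, "area": 900,  "is_foe": True},
    {"x": 60,  "area": 1500, "is_foe": False},
]))
```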
Re: How far have you gotten on coding the camera tracking?
We've gotten our camera to successfully find the pink/green flag, determine which alliance it's on and track appropriately. Our problem now is with getting the robot to follow it - when we try to have the robot turn and drive towards the target, only one side of the drive system responds properly (the other just sort of twitches back and forth).
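One thing worth checking in this situation is how the steering correction is mixed into the two sides of the drive; if the turn term only reaches one output, one side will sit there twitching. A rough Python sketch of the usual arcade-style mix (the gain and error values are made up for illustration):

```python
def arcade_mix(forward, turn):
    """Combine a forward command and a turn command into left/right
    motor outputs, each clipped to the -1..1 range."""
    left = max(-1.0, min(1.0, forward + turn))
    right = max(-1.0, min(1.0, forward - turn))
    return left, right

# Example: target is 40 pixels right of image center, 160-pixel half-width.
KP_TURN = 0.8           # illustrative proportional gain
error = 40 / 160.0      # normalized horizontal error
left, right = arcade_mix(0.4, KP_TURN * error)
print(left, right)      # both sides should receive a command, not just one
```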
Re: How far have you gotten on coding the camera tracking?
Hi everyone,
I'm posting two videos of an older robot with the new control system tracking the vision target.

Video 1 - http://www.youtube.com/watch?v=9l64T6wc1Lg
Video 2 (LabVIEW screen view) - http://www.youtube.com/watch?v=HdyzfuzJs-w

We used the Two Color servo tracking VI from NI and dropped it into the Robot Main VI to control the drive motors of the robot. Please contact me if you have questions on the implementation. The program is set to steer towards the target and maintain a certain distance, which will be adjusted so that our manipulator can drop the moon rocks into the trailer. I'm not thrilled with the program's ability to track the colors, particularly the green, but we will continue to test and adjust as time goes on. You can see the intermittent tracking of green in the second video. We will be transplanting the control board to the new robot very soon and tweaking it for the new wheels. I expect to be able to take the outputs of this current program and feed them into a traction control algorithm.

On another note, I want to say how enjoyable it has been teaching my students to program the robot using LabVIEW. Most of the programming team has almost no programming experience, but something about LabVIEW has made it less intimidating. The students really seem to enjoy looking through the various functions to find what they want to use, while finding others along the way that might be useful later. Even more interesting, this year's programming team is almost exclusively female! If LabVIEW was the missing factor in getting more girls interested in programming, I will gladly add it to the list of reasons I'm glad we decided to use it this year.

One more comment - has anyone found that the camera resets itself when the motors change rapidly? The only cause that makes sense is some sort of noise getting into the system. We changed our tracking to include the PID.vi from the control toolbox, and the system was resetting any time there was integral gain. The camera would reset, and the cRIO would keep doing whatever it was doing just before the camera went out; disabling and re-enabling at the driver station would stop it from moving completely. Please let me know if you have had similar experiences.

Evan
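On the integral-gain symptom: one thing that sometimes helps is clamping both the output and the accumulated integral, so a noisy or saturated loop cannot wind up and slam the motors back and forth. This is a small generic PID sketch with that clamping, shown in Python purely to illustrate the idea; it is not the NI PID.vi, and the gains are placeholders:

```python
class PID:
    def __init__(self, kp, ki, kd, out_limit=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        # Accumulate the integral, but clamp it so it cannot wind up
        # while the output is saturated.
        self.integral += error * dt
        self.integral = max(-self.out_limit, min(self.out_limit, self.integral))
        derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
        self.prev_error = error
        out = (self.kp * error
               + self.ki * self.integral
               + self.kd * derivative)
        return max(-self.out_limit, min(self.out_limit, out))
```

If the camera resets really are caused by electrical noise or voltage dips from sudden motor reversals, limiting how hard the loop can drive the output at least reduces that stress, though that is speculation rather than a diagnosis.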
Re: How far have you gotten on coding the camera tracking?
Our programmers are having great success with the camera. We asked them to write a program called "Kill the Driver Mode." They have been spending their time tracking whatever color shirts they happen to be wearing lately, so they can be chased around for the last 20 minutes of the meeting. =)
Re: How far have you gotten on coding the camera tracking?
Thanks for sharing this.
Re: How far have you gotten on coding the camera tracking?
Quote:
The green color of that fabric is indeed less predictable than the pink, which is why pink is the primary color by default. For debugging, you may find it useful to switch the first color to green so you can see the entire green mask. Then tilt the target towards and away from the camera to see whether the issue is the green getting too bright, too dark, or something else. Usually it is either a bright streak or a dark streak.

Once you know how to make it fail, and while it is still running, open up the Find VI, click on the HSL debug switch, and hover the mouse over the green portion of the image to see what the pixel values are in HSL. This will give you an idea of how the different orientations differ and how much you'll need to lower the saturation or brightness bounds on green to cover them. You may also decide that you don't want to change anything if the failure is due to a tilt you don't expect in a game.

The debug HSL button is not something I'd leave on for real usage, by the way. It works identically, but slower, because of the display and the explicit HSL conversion of every pixel. For normal operation turn the switch off; the threshold will still be done in HSL, but with only enough math to perform the threshold.

Greg McKaskle
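For anyone doing the same kind of debugging outside LabVIEW, the equivalent experiment in Python/OpenCV looks roughly like this; the threshold numbers are placeholders to be tuned, not the values from the NI example, and OpenCV calls the color space HLS:

```python
import cv2
import numpy as np

# Placeholder green bounds in OpenCV's HLS ranges (H: 0-179, L: 0-255, S: 0-255).
GREEN_LO = np.array([40,  40,  60])
GREEN_HI = np.array([90, 220, 255])

def show_hls(event, x, y, flags, param):
    # Hover-to-inspect, similar to the HSL debug switch described above.
    if event == cv2.EVENT_MOUSEMOVE:
        print("HLS at (%d,%d): %s" % (x, y, param[y, x]))

cap = cv2.VideoCapture(0)           # stand-in for the robot camera stream
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hls = cv2.cvtColor(frame, cv2.COLOR_BGR2HLS)
    mask = cv2.inRange(hls, GREEN_LO, GREEN_HI)
    cv2.imshow("mask", mask)
    cv2.setMouseCallback("mask", show_hls, hls)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
```

Tilting the target while watching where the mask drops out shows whether the failure comes from the green getting too bright or too dark, just as described above.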
Re: How far have you gotten on coding the camera tracking?
We've gotten the camera tracking a pink and green target, identifying which team it is, where it is, and how far away it is, and it's all in a modular code block. We modified the Single Color Tracking Example and then added servos and position data.
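For the "how far away" part, a common approach is to use the apparent size of the target in pixels with a pinhole-camera model; whether or not that is what this team did, a sketch of the idea looks like this (all constants are made-up examples to be calibrated):

```python
import math

# Made-up example values; calibrate against your own camera and target.
TARGET_HEIGHT_M = 0.18        # real height of the flag's colored band, meters
IMAGE_HEIGHT_PX = 240         # camera image height in pixels
VERTICAL_FOV_DEG = 37.0       # camera's vertical field of view

# Focal length in pixels derived from the field of view.
FOCAL_PX = (IMAGE_HEIGHT_PX / 2) / math.tan(math.radians(VERTICAL_FOV_DEG / 2))

def estimate_distance(target_height_px):
    """Estimate distance to the target from its apparent height in the image."""
    return TARGET_HEIGHT_M * FOCAL_PX / target_height_px

print(estimate_distance(30))   # e.g. a blob 30 pixels tall
```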
Re: How far have you gotten on coding the camera tracking?
What does HSL stand for?
Re: How far have you gotten on coding the camera tracking?
Hue/Saturation/Luminosity. It is just another way of representing colors, as an alternative to RGB.
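For a concrete feel of the difference, Python's standard library can do the conversion (it calls the space HLS and uses 0-1 values):

```python
import colorsys

# A saturated pink-ish color in RGB, with channels scaled to 0-1.
r, g, b = 1.0, 0.3, 0.7
h, l, s = colorsys.rgb_to_hls(r, g, b)
print("hue %.2f  lightness %.2f  saturation %.2f" % (h, l, s))

# Dimming the color changes lightness a lot but leaves hue essentially alone,
# which is why thresholding on hue is more robust to lighting changes.
h2, l2, s2 = colorsys.rgb_to_hls(r * 0.5, g * 0.5, b * 0.5)
print("hue %.2f  lightness %.2f  saturation %.2f" % (h2, l2, s2))
```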
Re: How far have you gotten on coding the camera tracking?
As the most active of the three programmers on my team, I have made no progress at all, especially since this is my first time ever using LabVIEW and I'm entirely self-taught.
Any help would be greatly, hugely appreciated.
Re: How far have you gotten on coding the camera tracking?
Quote:
If you're having trouble making it track with the standard gimbal and two color tracking demo, try finding somewhere with more light - opening the windows where we are made a huge difference. It also took a lot of tweaking of the HSL values. By the way, see you at the regional :) -jonathan
Re: How far have you gotten on coding the camera tracking?
We're still having fun trying to get the response time faster, but I believe we've reached the highest efficiency we can get before ship. It tracks like no tomorrow, but when the target is moving we're having trouble leading it. We messed around with adding a PID lead/lag loop, but we're only having moderate to no success with that. Has anyone else gotten it to successfully lead a moving target? Any thoughts on that?
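On leading a moving target: one simple approach, not necessarily what the lead/lag VI does, is to estimate the target's velocity from successive frames and aim ahead of it by a tunable lead time. A rough Python sketch with made-up parameters:

```python
class TargetLead:
    """Estimate target velocity across frames and compute a led aim point."""

    def __init__(self, lead_time=0.3, smoothing=0.7):
        self.lead_time = lead_time      # seconds to aim ahead; tune on the field
        self.smoothing = smoothing      # low-pass factor for the velocity estimate
        self.prev_pos = None
        self.velocity = 0.0

    def update(self, pos, dt):
        if self.prev_pos is not None and dt > 0:
            raw_vel = (pos - self.prev_pos) / dt
            # Smooth the velocity so frame-to-frame jitter doesn't whip the aim.
            self.velocity = (self.smoothing * self.velocity
                             + (1 - self.smoothing) * raw_vel)
        self.prev_pos = pos
        return pos + self.velocity * self.lead_time   # predicted future position
```

At roughly 10 FPS the velocity estimate will be noisy, so the smoothing factor matters as much as the lead time.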
By the way, I'm very impressed with the support the community has provided for the camera this year! Here's a shout-out to all the incredibly helpful NI folks who have given their time and resources to ensure this season's success with the new system!
Re: How far have you gotten on coding the camera tracking?
Quote:
And yes, see you there. :D