Vision tracking: Did they get it right?
During the kickoff this morning, when Dave Lavery spoke about vision tracking, he seemed optimistic. He said he thinks "they finally got it right this time." It's obviously too early to tell, but what do you think? Will the vision tracking aspect of Breakaway be a disappointment (like in past games), or did they get it right this time?
|
Re: Vision tracking: Did they get it right?
No way to tell until we get our hands on the system. I assure you I will be begging to try it out on Monday :P
|
Re: Vision tracking: Did they get it right?
Honestly, I don't see much of an advantage to it for auton, unless you can get a ball, line up on the target, and make the shot 100% of the time.
It's going to be essential for teleop, though. There is no way a driver can manually make a shot without a camera feed on their netbook/driver station. That's where vision will be awesome! |
Re: Vision tracking: Did they get it right?
While there is a use in autonomous, it seems as if you can place the balls where you want, allowing the use of encoders and some dead reckoning to make a shot on target (you know your position, the ball's position, and the goal's position; from there it's all a matter of math, distances, and timing). With regard to teleop... yes, having a live video feed is useful, but that's not necessarily vision tracking, just a camera feed. And since your goal is on your side of the field, homing in on the target is not that big a deal in my opinion. It might be more useful if you're looking to score in your opponent's goal...
|
Re: Vision tracking: Did they get it right?
Can't say if they got it right, but I can say we want to use it this year, IF we can get it to work.
|
Re: Vision tracking: Did they get it right?
Is there going to be some kind of prepared program we can download?
If so, can someone please post a link? |
Re: Vision tracking: Did they get it right?
When they have proven that they can lock onto the target, and given that they are handing us some of the code, I would say yes, they have gotten it a lot better than before.
|
Re: Vision tracking: Did they get it right?
Quote:
I may be wrong, but I think 9 out of 10 goals will be shot from within 10 feet of the goal. Given that range and the fact that the goal is right next to the drivers, camera-based tracking isn't going to help much. |
Re: Vision tracking: Did they get it right?
Quote:
I totally agree, you could use encoders to score really effectively. |
Re: Vision tracking: Did they get it right?
Found the code for targeting. According to the comments, lighting should have no effect on it. For you tech-savvy people (isn't that everyone here?): download the WindRiver updater, rename it to a .zip, and unzip it. In the folder, go to vxworks-6.3\target\src\demo\2010ImageDemo. Target.cpp is the source code for the targeting system.
Looks like they just went with converting the color image to monochrome and looking for the changes there. They even made detecting the circle an API call. Oh, and indubitably you can find the LabVIEW update here. |
Re: Vision tracking: Did they get it right?
thank you
|
Re: Vision tracking: Did they get it right?
Any way I can find the Java camera code? I saw it once somewhere, but can't remember where. Thanks in advance!
|
Re: Vision tracking: Did they get it right?
Quote:
But there is a goal on your side of the line that you could block. :cool: |
Re: Vision tracking: Did they get it right?
Is there any starter camera-tracking code for LabVIEW yet? I looked through the examples, but those are only for color recognition, not for shapes.
|
Re: Vision tracking: Did they get it right?
Perhaps. It doesn't sound so important or controllable this year. If we'd had this camera and these cool little laptops last year, yeah, it would have made things a whole lot better.
|
Re: Vision tracking: Did they get it right?
Quote:
I think that, because the targets are stationary rather than moving like last year, there is definitely an opportunity to score in autonomous using the camera to track. Long-range shooting was impossible last year because of the moving targets and the unpredictable floor, but this year I can see it making a comeback (although it might have been nice to see a point increase for a long-range goal... sort of like a three-pointer in basketball. Oh well...) |
Re: Vision tracking: Did they get it right?
I've only looked at the C++ vision code so far, but I would imagine the labview code to be similar.
I am going to test it tomorrow with the old robot, but there are a few things that worry me:
- While the ellipse detection only uses luminance data for detecting edges, it does so by allocating memory, mixing the image down, processing it, then freeing the memory. I don't know how efficient vxworks' malloc is, but this seems like a rather bad idea.
- From what I can tell, the ellipse detection uses the edges of ellipses, meaning that it will detect two ellipses: one around the inner edge and one around the outer edge of the black circle. While this is perfectly acceptable when one bases navigation off the centers of the circles, it has the potential to throw a wrench into distance algorithms (e.g. inverse perspective transform). Some sort of algorithm will be needed to pick one of the edges (preferably the outer one).
- The tracking algorithm only samples the image once, then bases all further turning on the gyro without sampling any more images. There are problems both with this approach and with the implementation. I won't elaborate on this point, as it probably deserves its own separate thread.
I'm impressed that they created a decently working camera example for teams to start with, though it definitely is not a perfect solution. I have to wonder if they did this on purpose; after all, it would be no fun if everyone's robot ran the same code. |
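The edge-picking step suggested above might look something like this (Ellipse here is a hypothetical result struct and the center tolerance is a made-up value; this is not the actual WPILib type or API):

```cpp
#include <cmath>
#include <vector>

// Hypothetical detection result: center and major-axis length in pixels.
struct Ellipse { double cx, cy, major; };

// Collapse inner/outer edge pairs: ellipses whose centers nearly coincide
// are assumed to be the two edges of the same black circle, and only the
// larger (outer) one is kept for distance estimation.
std::vector<Ellipse> keepOuterEdges(const std::vector<Ellipse>& found,
                                    double centerTol = 5.0) {
    std::vector<Ellipse> out;
    for (const Ellipse& e : found) {
        bool merged = false;
        for (Ellipse& o : out) {
            if (std::hypot(e.cx - o.cx, e.cy - o.cy) < centerTol) {
                if (e.major > o.major) o = e;  // prefer the outer edge
                merged = true;
                break;
            }
        }
        if (!merged) out.push_back(e);
    }
    return out;
}
```

Using only the outer edge gives distance math one consistent contour to work from instead of an ambiguous pair.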
Re: Vision tracking: Did they get it right?
Quote:
Quote:
This method is more reliable and typically faster when homing in on stationary targets. If you'd like to discuss it further, feel free to start another thread. -Joe |
Re: Vision tracking: Did they get it right?
I'm really excited for this year. Live video streams to the drivers will be amazing, and the targeting is built into the API (through their ellipse-finding code). Now teams will have consistent, (hopefully) reliable, and contained targeting code that's standard across the board.
|
Re: Vision tracking: Did they get it right?
Quote:
|
Re: Vision tracking: Did they get it right?
It uses the RobotDrive class, which does not support crab drive.
|
Re: Vision tracking: Did they get it right?
Since it is not possible to support all types of robot bases with the current WPILib, it controls what is probably the most common. At one level it computes an angle to rotate, then uses RobotDrive to rotate. It should be pretty easy to map to alternate drive bases. Of course, you will likely want it to move forward, kick, line up with another, etc.
Greg McKaskle |
Re: Vision tracking: Did they get it right?
Is vision needed in this game?
Is the goal stationary? YES
Do you know the location of the balls prior to Auton? YES
Do you know your robot location prior to Auton? YES
Is there potential defense in auton? NO
Can a human score without the camera? YES
Is vision needed in this game? NO
We are all for using the camera, when the effort is worth it. We used it in '06 because it was worth it. We will not use it this year because it is not worth it. |
Re: Vision tracking: Did they get it right?
I think the camera will be worth using for aiming the kicker.
|
Re: Vision tracking: Did they get it right?
Quote:
Vision tracking is definitely beneficial in any game, especially with this new control system allowing you to feed it straight to the classmate. The window of use in autonomous is generally smaller this year because unless you have a strong kicker there's not a lot of opportunity. (Well, I can think of more for defensive bots but I'll keep them to myself ;) ) |
Re: Vision tracking: Did they get it right?
Quote:
It would be easy enough to create your own CrabDrive class which extends the RobotDrive class. This would allow you to pass a CrabDrive to the vision code and have it work properly. Code:
class CrabDrive : public RobotDrive { |
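A self-contained sketch of that idea (RobotDriveStub below stands in for the real WPILib RobotDrive, whose actual interface differs; the steering gain is an arbitrary illustrative value):

```cpp
#include <cmath>

// Stand-in for the WPILib RobotDrive interface, for illustration only;
// the real class has a different (and larger) API.
class RobotDriveStub {
public:
    virtual ~RobotDriveStub() = default;
    // Tracking code calls something of this shape with a turn command.
    virtual void Drive(double magnitude, double curve) = 0;
};

// Crab (swerve) adaptation: translate the curve command into a shared
// module steering angle instead of differential wheel speeds.
class CrabDrive : public RobotDriveStub {
public:
    double speed = 0.0;
    double steerAngle = 0.0;  // radians, applied to every module

    void Drive(double magnitude, double curve) override {
        speed = magnitude;
        steerAngle = curve * (M_PI / 4.0);  // arbitrary gain for the sketch
    }
};
```

Because the vision code only sees the base-class interface, it never needs to know the drivetrain underneath is crab rather than tank.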
Re: Vision tracking: Did they get it right?
Our drivers were very impressed by a demo of the system. They feel that they can drive and play with the camera alone!
This should be an interesting game! |
Re: Vision tracking: Did they get it right?
We just tried it yesterday, and I agree that it works great this year.
Last year, I couldn't get it to work correctly, but this time I tried the example code with a basic setup, and it worked right out of the box! It's quicker and more accurate than any human driver, unless the driver wants to aim with the camera (waiting for the updates) or is standing right behind the robot. Hopefully, we'll have a good autonomous program; if our kicker isn't accurate enough to score from the far zone (we'll try to be in the far zone in most matches), then I'll aim toward the center (so I don't kick the ball out of the arena). |
Re: Vision tracking: Did they get it right?
Team 619 finds that the camera code works great this year! It truly was a turnkey operation. Knowing what a pain the camera has been in previous years, I assigned three programmers the task of making it work. They finished in about two days. :D
CircleTrackerDemo contains a function called "getHorizontalAngle()". You call it and it gives you the horizontal angle to the target. It's that simple: no lighting constants to tune, no color constraints to check, etc. They even went ahead and showed you how to set up a PID loop and gyro. We don't use those high-level features of WPILib (that's a little too easy, IMO), but one can literally download the sample program to the robot and have a working vision tracking system. |
Copyright © Chief Delphi