#1
Re: FRC 2011 Vision Tracking
I had a lot of success with just using a standard light bulb and then changing the lighting environment. Luckily the reflective tape reflects back to the light source, so when I used an HSL filter (just like the '09 code) it worked in several lighting environments. The only light that would affect you is one directly behind you at the level of your robot, so I think competition lighting won't be as much of an issue with this system.
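The HSL filtering described above can be sketched without any vision library at all. Below is a minimal, self-contained example; the RGB-to-HSL conversion is the standard formula, and the threshold window values are made up for illustration, not taken from the actual '09 code:

```java
// Minimal sketch of an HSL threshold filter like the one described above.
// The window values used in main() are illustrative only.
public class HslFilter {
    // Convert an RGB pixel (0-255 each) to H (0-360), S (0-1), L (0-1).
    static double[] rgbToHsl(int r, int g, int b) {
        double rf = r / 255.0, gf = g / 255.0, bf = b / 255.0;
        double max = Math.max(rf, Math.max(gf, bf));
        double min = Math.min(rf, Math.min(gf, bf));
        double l = (max + min) / 2.0;
        if (max == min) return new double[] {0.0, 0.0, l}; // achromatic pixel
        double d = max - min;
        double s = l > 0.5 ? d / (2.0 - max - min) : d / (max + min);
        double h;
        if (max == rf)      h = ((gf - bf) / d + (gf < bf ? 6 : 0)) * 60;
        else if (max == gf) h = ((bf - rf) / d + 2) * 60;
        else                h = ((rf - gf) / d + 4) * 60;
        return new double[] {h, s, l};
    }

    // Keep only pixels whose hue, saturation, and luminance fall in the window.
    static boolean inWindow(int r, int g, int b,
                            double hMin, double hMax,
                            double sMin, double lMin) {
        double[] hsl = rgbToHsl(r, g, b);
        return hsl[0] >= hMin && hsl[0] <= hMax
            && hsl[1] >= sMin && hsl[2] >= lMin;
    }

    public static void main(String[] args) {
        // A bright, saturated green (tape lit by a green source) passes;
        // a dim gray background pixel does not.
        System.out.println(inWindow(40, 230, 60, 90, 150, 0.4, 0.3));
        System.out.println(inWindow(90, 90, 90, 90, 150, 0.4, 0.3));
    }
}
```

Because the tape returns mostly the color of your own light source, a fairly narrow hue window plus minimum saturation and luminance bounds is usually enough to isolate it.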
#2
Re: FRC 2011 Vision Tracking
The ambient lighting shouldn't be that much of an issue.
That is based on the fact that these markers are in roughly the same place and at roughly the same orientation as the circles of last year, and that part of the field doesn't catch much glare. It is bright, but the lights come from the long sides of the field. The second reason is that the material specs claim 600X brightness for narrow angles, meaning it reflects 600 times as much light back to the source as a white painted surface would. I'm sure the measurement specifics are much more technical than that, but it is very bright. Meanwhile, the ambient light or other spotlights will be returned to their own sources. If they aren't right behind you, you should get a pretty pure reflection of your source. The lights used in the tutorial were cheap Christmas lights, not too bright, and they were acceptable even with windows behind the targets.

Depending on your light source color, I would experiment with the white balance and try to set the image exposure similar to how it was for '06. In other words, darken the image to where not much more than the reflected lights and active lights show up. You may be able to do this with the auto exposure and brightness combo, but I believe it will be best to use a custom exposure that results in markers being high intensity and high saturation. This will also speed up the threshold processing, since the luminance and saturation will then exclude more pixels before the hue is even calculated.

As for using IR or a single color plane, I'm interested to hear how it works. The sensors should be sensitive there. You may need to replace or modify the lens, and if I remember correctly, the IR source will show up more as a white light than a colored light.

Greg McKaskle
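The ordering point above (reject on luminance and saturation before ever computing hue) can be sketched as follows. The bounds and pixel values are illustrative, not from any actual FRC code; the counter just makes the early-exit behavior visible:

```java
// Sketch of threshold ordering: test the cheap luminance and saturation
// bounds first, and only compute hue for the few pixels that survive.
public class OrderedThreshold {
    static int huesComputed = 0; // counts how often the hue step actually runs

    static boolean passes(int r, int g, int b) {
        double rf = r / 255.0, gf = g / 255.0, bf = b / 255.0;
        double max = Math.max(rf, Math.max(gf, bf));
        double min = Math.min(rf, Math.min(gf, bf));
        double l = (max + min) / 2.0;
        if (l < 0.4) return false;               // too dark: reject early
        double d = max - min;
        if (d == 0) return false;                // achromatic: no hue to test
        double s = l > 0.5 ? d / (2.0 - max - min) : d / (max + min);
        if (s < 0.5) return false;               // washed out: reject early
        huesComputed++;                          // only now do the hue math
        double h;
        if (max == rf)      h = ((gf - bf) / d + (gf < bf ? 6 : 0)) * 60;
        else if (max == gf) h = ((bf - rf) / d + 2) * 60;
        else                h = ((rf - gf) / d + 4) * 60;
        return h >= 90 && h <= 150;
    }

    public static void main(String[] args) {
        // A darkened image: mostly dim background pixels plus one bright
        // green "marker" pixel. Hue is computed only for the marker.
        int[][] pixels = { {20, 20, 20}, {30, 35, 30}, {10, 10, 15}, {40, 230, 60} };
        int hits = 0;
        for (int[] p : pixels) if (passes(p[0], p[1], p[2])) hits++;
        System.out.println(hits + " hit(s), hue computed " + huesComputed + " time(s)");
    }
}
```

With a deliberately darkened exposure, most of the frame fails the luminance test immediately, which is exactly why the custom-exposure advice also speeds up thresholding.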
#3
Re: FRC 2011 Vision Tracking
What I have found to be the best way to develop a camera algorithm is to use the NI Vision Assistant, which can be found in the LabVIEW install included on the FRC LabVIEW DVD. The Vision Assistant lets you tweak your parameters and see what effect the tweaks have on the result. Once you have the image processed to an acceptable level, you can export the Vision Assistant parameters directly to LabVIEW code. For help with this process, check out decibel.ni.com.
#4
I am trying to help Team 2751 do vision tracking for the first time this year. We program our robot in Java, and I am curious how much of the NI image processing library is really available to us. A few questions for anyone kind enough to answer:

1. Other than getting the settings from the color thresholding operation, how much of the NI Vision Assistant algorithm prototyping can be transferred to Java? For example, I did some template matching in the Vision Assistant that works well. I can, of course, save the script for inclusion in a VI. Is it possible to access a VI from Java?

2. In the Javadoc reference, the package that wraps NI's vision library is edu.wpi.first.wpilibj.image. It documents only the following classes: BinaryImage, ColorImage, CurveOptions, EllipseDescriptor, EllipseMatch, HSLImage, Image, MonoImage, ParticleAnalysisReport, RegionOfInterest, RGBImage, and ShapeDetectionOptions. These classes appear to be there to support the Java sample machine vision projects from last year. Are other NI image processing functions available through wpilibj, or does it come down to people writing wrapper classes for the functions they need?

3. The WPI Robotics Library Users Guide (which is ostensibly for both Java and C++) references the FRC Vision API Specification document, which gets installed with WindRiver. Is it only available in WindRiver? Do I really have to install the C++ IDE that I don't need, since I am using Java in NetBeans? The snippet in the Library Users Guide says the FRC Vision Interface includes high-level calls for color tracking through TrackingAPI.cpp, and that programmers may call into the low-level library by using nivision.h. Are the TrackingAPI and/or the low-level calls available to Java? The Java VM we have doesn't support JNI, which is the typical way to make calls to C libraries.

In summary, it looks like LabVIEW has the most image processing support, followed by WindRiver C++, with Java bringing up the rear. From reading this forum, however, I see that several of you are using Java. Are the issues I raise here really not that big of a deal? How did you overcome them?

Any answers and guidance on getting started would be most appreciated. Thanks for your help and good luck during the build season.

Barry Sudduth
Team 2751
#5
Re: FRC 2011 Vision Tracking
Hi, we are also looking for any sample Java code that demonstrates how to grab images from the Axis camera and process them for tracking. I see references to prior years' demo code. Where can we find that? Thanks!
#6
Re: FRC 2011 Vision Tracking
Anyone looking for Java vision tracking code, shoot me an email at mwtidd@gmail.com. I will add you to my Dropbox and share my working camera code. I also created a video tutorial that explains how I accomplished the task.