Brightness on 2016 vision samples
All of the images we were given to test vision processing with look like they were taken in a room with the lights off. Was this done with some setting on the camera? I've personally never been to an event that had the lights off, except maybe on Einstein...
|
Re: Brightness on 2016 vision samples
My team was also wondering this. The image recognition works really well with the low lighting, but in normal lighting the recognition seems to take longer and is less reliable.
|
Re: Brightness on 2016 vision samples
I would assume that they changed the camera settings to filter out certain levels of light. There is a screensteps tutorial here.
https://wpilib.screenstepslive.com/s...amera-settings |
Re: Brightness on 2016 vision samples
I believe the images were taken with a low exposure setting.
Here is the Wikipedia article about exposure in photography and an article talking about exposure with examples of overexposed and underexposed images. |
Re: Brightness on 2016 vision samples
I would agree. There's also a setting on the camera webpage to adjust the exposure level; the screensteps talk about that too.
|
Re: Brightness on 2016 vision samples
There are many ways to set up the camera and get images, but I'll list the elements using the WPILib terminology.
The retroreflective tape is such a strong reflector that you can think of it as an amplifier of the ring light. The material will return either 300 or 600 times as much light as bright white paint. I no longer remember the spec for the material being used. It is so bright that it can overwhelm the camera's sensor and auto settings and you will actually get an image with a white target and an LED colored fringe. This is called sensor bloom. Fancier camera sensors will postpone the blooming, but sufficiently bright light is a challenge. The good news is that you can use this to your advantage. If you lower the exposure time and/or lower the brightness of the image, the background will darken and the tape will turn from white to the LED color. This also helps with processing performance, by the way, because many of the color conversion and processing algorithms will then be dealing with lots of dark pixels which they can quickly dismiss. So FIRST did this on purpose. You can too by setting the brightness and/or the exposure settings. You may also want to turn off the auto white balance and choose something that will stay predictable. Greg McKaskle |
Re: Brightness on 2016 vision samples
We were wondering this same thing. We looked into it, and from what we can tell you can change the settings using a program; however, I do not think you can save the settings. Has anyone solved this issue?
|
Re: Brightness on 2016 vision samples
The FRC update for Labview has VIs for setting these. They aren't as nice as I'd like, so I've been working on reading the limits directly and developing my own.
|
Re: Brightness on 2016 vision samples
The WPILib Camera VIs were originally written for the Axis VAPIX API. When USB cameras were added, they were added via the NI IMAQdx libraries. About half of the properties return a "not supported" error, but some others were extended to have a custom setting.
Lately, I've been using the Vision Acquisition express block configuration. It leads through five wizard screens with a test mode to view the changes as you experiment. It then generates code for IMAQdx. Once done, I will generally right-click and Open the Front Panel, which converts it to a VI. This gives a good starting point for more advanced configuration.

To darken, you can make adjustments to exposure, gain, and brightness.

Greg McKaskle |
Re: Brightness on 2016 vision samples
The camera API to an Axis camera can be remote, since it is a conversation between the laptop and the camera's web server. But the USB camera needs to have the call made on the robot, where the session was opened.
Greg McKaskle |
Re: Brightness on 2016 vision samples
The Screensteps are only for the Web interface of the Axis Cam. It would be nice if there were some examples or documentation on doing these basic settings from Java or C++. |
Re: Brightness on 2016 vision samples
I apologize for not doing my homework.
The question is: Can the EXPOSURE of the USB camera be held or set manually? If it can be held will it hold through an on/off/on power cycle? TNX |
Re: Brightness on 2016 vision samples
Jonboy, In my thread, http://www.chiefdelphi.com/forums/sh...d.php?t=142633, that is exactly what I am trying to do - the code is there, the methods are there, but it doesn't seem to be working.
According to the USBCamera class, there are setExposureManual(int exp) and setBrightness(int bright) methods, which is what I'm using. Whether they get saved across power cycles or not, I'm not sure, but I don't care: I've connected it to the SmartDashboard, and using the Preferences class, it will update to whatever values I save in the preferences file. |
Re: Brightness on 2016 vision samples
https://www.microsoft.com/hardware/e...ifecam-hd-3000 |
Re: Brightness on 2016 vision samples
My team has been using NI Vision Assistant and we now know how to track objects on the screen; however, we were wondering how to translate the tracking into motor movement. We want our robot to find the target and then auto-adjust to score. If anyone has any code, websites, or tips for us, that would be great. Thank you.
|
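One common way to turn tracking into motor movement is a proportional controller on the target's horizontal offset. The Java sketch below is illustrative, not official WPILib code; the image width and gain (kP) are made-up values you would tune for your own camera and drivetrain:

```java
public class AimHelper {
    // Proportional steering: returns a turn command in [-1, 1] from the
    // target's pixel x-position. targetX comes from your vision pipeline;
    // imageWidth and kP are assumptions to tune on the real robot.
    public static double turnCommand(double targetX, double imageWidth, double kP) {
        double center = imageWidth / 2.0;
        double error = (targetX - center) / center; // normalized to [-1, 1]
        double command = kP * error;
        // Clamp so we never exceed full motor output.
        return Math.max(-1.0, Math.min(1.0, command));
    }
}
```

Each loop iteration you would feed this command into the turn input of your drive code until the error is within a small deadband, then shoot.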
Re: Brightness on 2016 vision samples
Edit: sorry, didn't see that this was already answered. |
Re: Brightness on 2016 vision samples
I've been setting exposure on the Lifecam from programs with v4l-utils. You can play with exposure settings using QV4L2 on your development computer, then in your code include a call to v4l2-ctl to set your exposure:
Code:
v4l2-ctl --set-ctrl=exposure_absolute=9 --device=1 |
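If your robot code is Java, one way to make that call is to shell out to v4l2-ctl with ProcessBuilder. This is a sketch, assuming v4l-utils is installed on the target; building the argument list in a separate method keeps it easy to inspect. The exposure_auto=1 control (manual mode on UVC cameras) is included because some drivers ignore exposure_absolute while auto-exposure is active:

```java
import java.io.IOException;
import java.util.Arrays;
import java.util.List;

public class V4L2Exposure {
    // Assemble the v4l2-ctl argument list for a manual absolute-exposure value.
    public static List<String> buildCommand(int device, int exposure) {
        return Arrays.asList(
            "v4l2-ctl",
            "--device=" + device,
            "--set-ctrl=exposure_auto=1",              // 1 = manual mode on UVC cams
            "--set-ctrl=exposure_absolute=" + exposure);
    }

    // Run the command on the robot; throws if v4l2-ctl is not installed.
    public static void apply(int device, int exposure)
            throws IOException, InterruptedException {
        new ProcessBuilder(buildCommand(device, exposure)).inheritIO().start().waitFor();
    }
}
```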
Re: Brightness on 2016 vision samples
Has ANYONE been able to get DARK images out of the life cam, using ANY method? Images that come anywhere close to being as dark as the sample images for the FRC vision processing? I'm beginning to think that the LifeCam is simply not adjustable to that low of an exposure. I can see changes when I adjust the brightness, but exposure doesn't seem to be changing anything and even with both exposure and brightness set to minimums, the image still looks "normal" (a decent image for the average person to look at) not DARK like is needed for the image processing.
|
Re: Brightness on 2016 vision samples
I just confirmed it is possible. I'm running on a Mac, and I just installed "Webcam Settings Panel" from the App Store. It allowed me to adjust the exposure down to a BLACK image. It also seems to confirm that it is not doing any of this in software, but is actually querying the camera for its capabilities. The setting options are different between the LifeCam, my laptop iSight, and my monitor's iSight; for example, the LifeCam has a backlight compensation slider, while the iSights show up as simply on/off options.
I also confirmed that the settings are saved in the camera (or at least in the driver): I quit the settings panel (so I know it wasn't restoring the values), unplugged the cam, plugged it back in, and the image was the same as when I had unplugged it. This is all good news. Now, if I can just figure out how to use the USBCamera, NIVision.IMAQ, or something on the RoboRIO to make these same adjustments, I would be happy. |
Re: Brightness on 2016 vision samples
CORRECTION - I THOUGHT I had closed "Webcam Settings" - I hadn't. It was storing and restoring the settings. I made sure I quit it and tested again. When I replugged the camera, it reverted to auto-exposure.
|
Re: Brightness on 2016 vision samples
One of the questions my team had is, can we change the camera settings for autonomous, and have them go back to normal for teleop? We'd like to have dark picture for vision analysis, but standard colors for real time feed for complicated driving. Is this possible?
|
Re: Brightness on 2016 vision samples
It is possible. You just need a routine to reset the camera settings again and have it run at the beginning of teleop.
|
Re: Brightness on 2016 vision samples
You don't mention what language you are using, but the camera settings are adjusted using the WPILib camera property sets. The initialization code often has similar set calls.
Greg McKaskle |
Re: Brightness on 2016 vision samples
In LabVIEW, we were able to use the stock VIs for camera control to adjust the settings of the LifeCam. In particular, the settings that produced images where the target is brightly lit but all surroundings are dark were:
Exposure: 0
Brightness: 20-30
We left the white balance on auto. |
Re: Brightness on 2016 vision samples
Where are the Vision Samples? I'd be interested to see what we should be shooting for.
|
Re: Brightness on 2016 vision samples
The LV examples include about 100 images. I didn't find them immediately, but I'm pretty sure they are on FIRSTForge or WPI or another public site.
Greg McKaskle |
Re: Brightness on 2016 vision samples
I was finally able to get the camera to work the way I wanted it to. There appears to be some issue related to calling the camera settings functions too often or too close together (in time). I haven't spent the time tracking down the issue, but I do have a good workaround. Previously I had, in my loop, a check that if a setting changed, it would call the appropriate set function, then immediately call the getImage() function. This would get the image, but I could never get the exposure to work. The change/workaround that makes it work is (this is all Java, but applies equally to C++):
- I created a separate function that, when called, updates the specified setting (brightness or exposure). It retrieves the value from a Preferences object on the SmartDashboard.
- I added a button on the SmartDashboard to trigger this function call.
- I also added a button (and function) for setAuto and setHoldCurrentExposure (which switches to manual exposure).
- My loop now does just:
- USBCamera.getImage()
- CameraServer.setImage()
Now when I click one of the buttons, I get exactly what I expect: the exposure or the brightness changes according to the value in the Preferences table. I can get a VERY dark image, a VERY bright image, or anything in between. So for those asking about dark for autonomous and normal for teleop: you can explicitly call the settings and get the exposure/brightness you need. I think the important thing is to call them each only once in a given time period (not sure what the time period is; hundreds of milliseconds or seconds, I would guess) and not to call getImage() immediately after the settings change. I'm honestly not sure why this approach works and the other doesn't; someone who knows the internals of the NIVision.IMAQ class would probably have to chime in. I don't have the time to dig into it. If someone wants the code example, let me know and I'll post it somewhere. |
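The "don't call the settings too often" part of that workaround can be captured in a small rate limiter. This is a sketch, not the poster's actual code: the 500 ms window is a guess (the post itself says the safe period is unknown), and the timestamp is passed in so the logic is testable without a robot:

```java
public class SettingsRateLimiter {
    private final long minIntervalMs;
    private boolean applied = false;
    private long lastApplyMs;

    public SettingsRateLimiter(long minIntervalMs) {
        this.minIntervalMs = minIntervalMs; // e.g. 500 ms -- an assumption, tune on hardware
    }

    // Returns true if enough time has passed to safely issue another
    // camera-settings call; when it returns false, skip the set call
    // this loop iteration and try again later.
    public boolean tryApply(long nowMs) {
        if (applied && nowMs - lastApplyMs < minIntervalMs) {
            return false;
        }
        applied = true;
        lastApplyMs = nowMs;
        return true;
    }
}
```

In the robot loop you would call tryApply(System.currentTimeMillis()) and only invoke setExposureManual()/setBrightness() when it returns true, then avoid calling getImage() in the same iteration as a settings change.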
Re: Brightness on 2016 vision samples
They are located in (from memory, so I might be a little off):
C:\Program Files (x86)\National Instruments\LabVIEW 2015\Examples\FRC\Vision\roboRIO\2016 Vision Code\Sample Images\
I may have some of those swapped, but that should let you find it, provided you have the FRC update installed. |
Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2017, Jelsoft Enterprises Ltd.
Copyright © Chief Delphi