#1
Brightness on 2016 vision samples
All of the images we were given to test vision processing look like they were taken in a room with the lights off. Was this done with some setting on the camera? I've personally never been to an event that had the lights off, except maybe on Einstein...
#2
Re: Brightness on 2016 vision samples
My team was also wondering this. The image recognition works really well with the low lighting, but in normal lighting the recognition seems to take longer and not work as well.
#3
Re: Brightness on 2016 vision samples
I would assume that they changed the camera settings to filter out certain levels of light. There is a screensteps tutorial here.
https://wpilib.screenstepslive.com/s...amera-settings
#4
Re: Brightness on 2016 vision samples
I believe the images were taken with a low exposure setting.
Here is the Wikipedia article about exposure in photography and an article talking about exposure with examples of overexposed and underexposed images.
#5
Re: Brightness on 2016 vision samples
I would agree. There's also a setting on the camera webpage to adjust the exposure level; the screensteps talk about that too.
#6
Re: Brightness on 2016 vision samples
There are many ways to set up the camera and get images, but I'll list the elements using the WPILib terminology.
The retroreflective tape is such a strong reflector that you can think of it as an amplifier of the ring light. The material will return either 300 or 600 times as much light as bright white paint. I no longer remember the spec for the material being used. It is so bright that it can overwhelm the camera's sensor and auto settings, and you will actually get an image with a white target and an LED-colored fringe. This is called sensor bloom. Fancier camera sensors will postpone the blooming, but sufficiently bright light is a challenge.

The good news is that you can use this to your advantage. If you lower the exposure time and/or lower the brightness of the image, the background will darken and the tape will turn from white to the LED color. This also helps with processing performance, by the way, because many of the color conversion and processing algorithms will then be dealing with lots of dark pixels which they can quickly dismiss.

So FIRST did this on purpose. You can too by setting the brightness and/or the exposure settings. You may also want to turn off the auto white balance and choose something that will stay predictable.

Greg McKaskle
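For anyone who wants to do the same thing from Java, here is a minimal sketch using the 2016 WPILib USBCamera class. The method names are as I remember them from the javadoc (double-check them there), the class name DarkCameraSetup is made up, and the brightness, exposure, and white-balance values are just guesses you will want to tune by eye:

```java
import edu.wpi.first.wpilibj.vision.USBCamera;

public class DarkCameraSetup {
    // Open the first USB camera and darken the image so the retroreflective
    // tape shows up in the LED color instead of blowing out to white.
    // The numeric values here are only guesses to tune on your own camera.
    public static USBCamera openDarkCamera() {
        USBCamera camera = new USBCamera("cam0"); // default name of the first USB camera
        camera.openCamera();
        camera.setBrightness(10);           // low brightness darkens the background
        camera.setExposureManual(10);       // short exposure keeps the tape from blooming
        camera.setWhiteBalanceManual(4000); // fixed white balance so colors stay predictable
        camera.updateSettings();            // push the new settings to the camera
        camera.startCapture();
        return camera;
    }
}
```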
#8
Re: Brightness on 2016 vision samples
We were wondering this same thing. We looked into it, and from what we can tell you can change the settings using a program; however, I do not think you can save the settings. Has anyone solved this issue?
#9
Re: Brightness on 2016 vision samples
The FRC update for LabVIEW has VIs for setting these. They aren't as nice as I'd like, so I've been working on reading the limits directly and developing my own.
#10
Re: Brightness on 2016 vision samples
The WPILib Camera VIs were originally written for the Axis VAPIX API. When USB cameras were added, they were added via the NI IMAQdx libraries. About half of the properties return a "not supported" error, but some others were extended to have a custom setting.
Lately, I've been using the Vision Acquisition express block configuration. It leads through five wizard screens with a test mode to view the changes as you experiment. It then generates code for IMAQdx. Once done, I will generally right-click and Open the Front Panel, which will convert it to a VI. This gives a good starting point for more advanced configuration. To darken, you can make adjustments to exposure, gain, and brightness.

Greg McKaskle
#12
Re: Brightness on 2016 vision samples
The camera API for an Axis camera can be used remotely, since it is a conversation between the laptop and the camera's web server. But calls to the USB camera need to be made on the robot, where the session was opened.

Greg McKaskle
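To make the Axis side concrete: since it is just HTTP, something like the sketch below can run on the driver station laptop. The param.cgi endpoint is part of Axis's VAPIX interface, but the exact parameter name and camera address below are assumptions, and real cameras will normally also require HTTP authentication, which is omitted here:

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class AxisBrightness {
    public static void main(String[] args) throws Exception {
        // Hypothetical camera address and parameter name; check the VAPIX docs
        // (and note that the camera will usually require authentication).
        String cameraAddress = "axis-camera.local";
        URL url = new URL("http://" + cameraAddress
                + "/axis-cgi/param.cgi?action=update&ImageSource.I0.Sensor.Brightness=10");

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        System.out.println("Camera responded with HTTP " + conn.getResponseCode());
        conn.disconnect();
    }
}
```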
#13
Re: Brightness on 2016 vision samples
Ron, check the thread I just started (LifeCam USBCamera changing settings from java) - it's not about this connection, but it does contain sample code showing how to do it. Basically you create a USBCamera object, open the connection, start the capture, then you loop getting an image and passing that image to the CameraServer object. It's actually pretty simple - now if I can figure out the rest, which I thought was going to be simple, but so far has not been.
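For reference, here is a rough sketch of that loop, assuming the 2016 WPILib Java classes (USBCamera, CameraServer) and the NIVision image type; the class name CameraFeed is made up, so compare it against the sample code in that thread:

```java
import com.ni.vision.NIVision;
import com.ni.vision.NIVision.Image;
import edu.wpi.first.wpilibj.CameraServer;
import edu.wpi.first.wpilibj.vision.USBCamera;

public class CameraFeed implements Runnable {
    @Override
    public void run() {
        USBCamera camera = new USBCamera("cam0"); // create the camera object
        camera.openCamera();                      // open the connection
        camera.startCapture();                    // start acquiring frames

        // Allocate one image buffer up front and reuse it every iteration
        Image frame = NIVision.imaqCreateImage(NIVision.ImageType.IMAGE_RGB, 0);

        while (!Thread.interrupted()) {
            camera.getImage(frame);                     // grab the latest frame
            CameraServer.getInstance().setImage(frame); // pass it to the dashboard feed
        }
    }
}
```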
#14
Re: Brightness on 2016 vision samples
The ScreenSteps are only for the web interface of the Axis camera. It would be nice if there were some examples or documentation on doing these basic settings from Java or C++.