Anyone seeing odd results from Rev ColorSensorV3?

We’ve been experimenting with the FRC KOP color sensor and we’re seeing some very strange results: our green channel reads far higher than I would expect. The data sheet for the Broadcom sensor seems to indicate that the resolution of each color channel is the same (13 bits read in a 20-bit field), but it does state that it “approximates human eye response with green channel”, which makes me wonder if green is actually higher resolution (given that human eyes are more sensitive to green than to red or blue).

Experimenting with various light sources (the LED on the sensor, fluorescent overheads in our build space, transmitted light from a laptop display), various sensing distances, and various angles, we’re seeing pretty consistent results. The green looks almost as if it’s double what we expect, but merely halving the green reading doesn’t always let us read all primary and secondary colors accurately. I’m also not keen to hack in a fudge factor without understanding why we need it.

Is anyone else seeing this and can anyone shed any light on what’s up (and point us at whatever math we need to adjust our readings)?

Here are a few examples of raw readings for various color targets generated on a laptop screen:

White: R 302 G 591 B 360 (expect R=G=B approximately)
Red: R 206 G 146 B 33 (expect R=high, G=B=low)
Green: R 72 G 344 B 66 (expect G=high, R=B=low so this is good :slight_smile: )
Blue: R 25 G 107 B 273 (expect R=G=low, B=high)
Yellow: R 277 G 482 B 89 (expect R=G=high, B=low)
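For anyone who wants to play with these numbers: a quick sketch of per-channel white balancing, using the White reading above as the reference (the class and method names here are just for illustration, not any library API). Scaling each channel so white reads equally brings the Red reading down to roughly (206, 75, 28), which looks much more like pure red:

```java
// Sketch: white-balance raw channel readings against a known-white sample.
// The reference values are the White and Red readings from the post above.
public class WhiteBalance {
    // Scale each channel so the white reference would read the same on all three.
    // All channels are scaled relative to the white reading's red channel.
    public static double[] calibrate(double[] raw, double[] white) {
        double[] out = new double[3];
        for (int i = 0; i < 3; i++) {
            out[i] = raw[i] * (white[0] / white[i]);
        }
        return out;
    }

    public static void main(String[] args) {
        double[] white = {302, 591, 360}; // White target reading from above
        double[] red   = {206, 146,  33}; // Red target reading from above
        double[] balanced = calibrate(red, white);
        // Comes out to roughly R 206, G 75, B 28
        System.out.printf("R %.1f  G %.1f  B %.1f%n",
                balanced[0], balanced[1], balanced[2]);
    }
}
```

This only corrects for the sensor's relative channel sensitivity under one light source, so it would need re-calibrating when the lighting changes.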

Have you tried taking a look at the example code on REV’s website?

I program in LabVIEW, and they have example code for all languages; it gives all the RGB values and made it really easy.

Yes, we’ve looked at the source of the supplied class library and it doesn’t seem to handle the green channel any differently from the other two. We also see similarly odd results when calling the getColor() method and the getRawColor() method which we used to read the data in my original post.

I just checked their Git repo again and see a new color matching example. Interestingly, their approach seems to be to hardcode sampled values of the various target colors, and the component values in those hardcoded colors look similarly strange to ours, with higher green than I would expect. That’s interesting…

We messed with the values a little bit, but only to see if it could read the colors from a farther distance. Unfortunately we were out of luck, so we’re working on getting the sensor as close as possible instead.

But that’s very strange. I haven’t had that problem with ours.

A couple of observations about the color sensor and the color spectrum of the actual color swatches provided in the kit.

The color sensor itself reads the raw values from each color channel. The values read here are highly dependent on lighting conditions, surface texture, etc. With the LED on you get a better signal (more light reaching the sensor), but depending on the surface texture/reflection you may end up with more reflection, i.e. a lighter color. Without the LED you end up with less signal.

Looking at the response of the sensor itself from the datasheet:

[image: datasheet spectral response curves for the red, green, and blue channels]

You can see the overall response for the blue channel is lower than that of the green channel. There is also a bit of overlap between each channel.

Now looking at the color spectrum:

[image: visible light spectrum]
The blue and the green colors are actually pretty close to each other. So it doesn’t come as too much of a surprise that the values for the corresponding channels come out pretty close. So how do we tell them apart?

This is why our example takes into account all three channels for determining the actual color. When you look at all three channels you still end up with distinct RGB values for each of the four colors. This is also why we mention calibrating the sensor to the particular conditions that the sensor will actually be used in. In fact, calibrating during ROTATION CONTROL could be an interesting way to make sure you have good values per-match.
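To make the “use all three channels” idea concrete, here is a minimal nearest-color sketch (not REV’s actual ColorMatch class, just the same idea): normalize a reading so the channels sum to 1, then pick the calibrated reference with the smallest Euclidean distance. The reference values are the raw readings from the first post, so treat them as placeholders for your own calibration.

```java
// Sketch of nearest-color matching over all three channels.
// Reference colors are normalized raw readings from earlier in this thread;
// calibrate your own under the conditions the sensor will actually see.
public class ColorMatcher {
    static final String[] NAMES = {"Red", "Green", "Blue", "Yellow"};
    static final double[][] REFS = normalizeAll(new double[][] {
        {206, 146,  33},  // Red target
        { 72, 344,  66},  // Green target
        { 25, 107, 273},  // Blue target
        {277, 482,  89},  // Yellow target
    });

    // Scale so r + g + b == 1, mirroring the library's getColor() normalization.
    static double[] normalize(double[] c) {
        double mag = c[0] + c[1] + c[2];
        return new double[] {c[0] / mag, c[1] / mag, c[2] / mag};
    }

    static double[][] normalizeAll(double[][] cs) {
        double[][] out = new double[cs.length][];
        for (int i = 0; i < cs.length; i++) out[i] = normalize(cs[i]);
        return out;
    }

    // Return the name of the reference color closest (squared Euclidean
    // distance in normalized RGB) to the given raw reading.
    public static String match(double[] raw) {
        double[] c = normalize(raw);
        int best = 0;
        double bestDist = Double.MAX_VALUE;
        for (int i = 0; i < REFS.length; i++) {
            double dr = c[0] - REFS[i][0];
            double dg = c[1] - REFS[i][1];
            double db = c[2] - REFS[i][2];
            double d = dr * dr + dg * dg + db * db;
            if (d < bestDist) { bestDist = d; best = i; }
        }
        return NAMES[best];
    }

    public static void main(String[] args) {
        // A reading near the calibrated green target still matches "Green"
        System.out.println(match(new double[] {70, 340, 70}));
    }
}
```

Because the comparison happens in normalized space, a dimmer or brighter version of the same color still lands near its reference, which is what makes this more robust than thresholding single channels.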

The topic of color, and how to measure and quantify the difference between colors, is interesting. This paper is an easy-to-understand, high-level overview of the topic.


This is great information and explains what’s going on. Thanks!

This actually matches other people’s code. There is an Arduino sketch to read the sensor here:


If you look at the code, they calibrated against white and also see the green channel at ~2x the red and blue.

Also, if you look at the Rev color match example, the calibration values for the colors have green higher than one would expect.

I guess that sensor just works better in green.

The color normalization looks wrong:

public Color getColor() {
    double r = (double)getRed();
    double g = (double)getGreen();
    double b = (double)getBlue();
    // Normalize by the channel sum, so the components always add up to 1.0
    double mag = r + g + b;
    return new ColorShim(r / mag, g / mag, b / mag);
}

It’s mapping the RGB values to the ratio of each channel to the total “value”, so a pure red 255,0,0 is mapped to 1,0,0; but it will also map a dark red 100,0,0 to 1,0,0.

Admittedly the documentation only says that this “gets the most likely color”, but it seems susceptible to lighting changes (e.g. from someone’s garage to a real field). Dividing by the maximum value might be better, but that would be fairly complicated since the datasheet gives different max values depending on gain and measurement time.
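A tiny standalone demonstration of the concern (plain Java, using the same sum-normalization as the snippet above): a bright red and a dim red normalize to the identical color, so all brightness information is discarded.

```java
import java.util.Arrays;

// Demonstrates that dividing by the channel sum discards overall brightness:
// any two readings with the same channel ratios normalize identically.
public class SumNormalization {
    public static double[] normalize(double r, double g, double b) {
        double mag = r + g + b;
        return new double[] {r / mag, g / mag, b / mag};
    }

    public static void main(String[] args) {
        double[] bright = normalize(255, 0, 0); // pure red
        double[] dim    = normalize(100, 0, 0); // dark red
        // Both come out as [1.0, 0.0, 0.0]
        System.out.println(Arrays.equals(bright, dim)); // prints true
    }
}
```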

After our build time today, I’m delighted to report that we’re reliably detecting the important colors under a variety of distance and lighting conditions (though we still have to lug the system into the competition gym to see how it reacts to those horrid mercury vapor lamps). The central problem was my assumption that the sensor channels wouldn’t overlap, so that we could expect pretty much pure RGB results when sensing primary and secondary colors. Once Will pointed out the overlap issue and we stopped using “clean” target colors (red as (1.0, 0.0, 0.0), for example), all was well.

Yes, this would be the correct way to normalize, and you could internally store the gain and resolution. However, actually getting anywhere near saturation is not practical, and you end up with very low values for the three colors, which I believe would be more confusing. Instead, we could internally ‘pick’ a reasonable max for each setting, or allow the user to calibrate one themselves, but then picking the right range becomes more challenging. That said, it may give better results.

Getting values that actually fill the [0, 1] range also allows for use in other APIs that use frc::Color. If this is not important, teams can use GetRawColor().

Either way will be susceptible to lighting conditions. We had started experimenting with conversion to CIE XYZ color space and color distance which can help with lighting variation (see the link I posted above), but we have been getting good results with the method as it is. Since it is four distinct colors in the same condition per event I expect this to work well for teams.
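For anyone curious about the XYZ route mentioned above: here is a minimal sketch of the standard linear-sRGB to CIE XYZ conversion (D65 white point). The sensor’s channels are not actually sRGB primaries, so a proper conversion would need a sensor-specific matrix; this just shows the shape of the math.

```java
// Sketch: linear RGB -> CIE XYZ using the standard sRGB (D65) matrix.
// The ColorSensorV3 channels are not true sRGB primaries, so real use would
// need a calibration matrix measured for this sensor.
public class RgbToXyz {
    // Rows of the standard sRGB (D65) RGB-to-XYZ matrix.
    static final double[][] M = {
        {0.4124564, 0.3575761, 0.1804375},
        {0.2126729, 0.7151522, 0.0721750},
        {0.0193339, 0.1191920, 0.9503041},
    };

    public static double[] toXyz(double r, double g, double b) {
        double[] xyz = new double[3];
        for (int i = 0; i < 3; i++) {
            xyz[i] = M[i][0] * r + M[i][1] * g + M[i][2] * b;
        }
        return xyz;
    }

    public static void main(String[] args) {
        // Linear white (1,1,1) lands near the D65 white point (0.9505, 1.0, 1.0890)
        double[] white = toXyz(1.0, 1.0, 1.0);
        System.out.printf("X %.4f  Y %.4f  Z %.4f%n", white[0], white[1], white[2]);
    }
}
```

Distances computed in XYZ (or, better, in a perceptual space derived from it) separate lighting intensity from chromaticity, which is what helps with lighting variation.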