How to prevent Limelight from detecting lights

What should I do to make sure that only the upper hub target is detected by Limelight? My team adjusts the Limelight angle so that we can detect the upper hub target from the edge of the field to near the hub. However, I am worried that the lights on the gym’s ceiling will interfere with the Limelight. Is there any effective way to avoid this interference? I remember that in Limelight’s control panel there is a value I can tune to avoid it, but it’s really hard to tune. Does anyone have any better solutions?

Lower exposure so the only things visible are the lights and the target. Then, use an inRange color map to pick out your target’s color. Those steps alone should be enough.
Additionally, you can mess with color shifting.
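If it helps to see the mechanics, here is a rough Python sketch of what an inRange-style threshold does under the hood. The function name and the HSV bounds are made up for illustration; they are not Limelight's actual numbers.

```python
import colorsys

def in_range(rgb, lo, hi):
    """Return True if the pixel's HSV value falls inside [lo, hi] on every
    channel. rgb is (r, g, b) in 0-255; lo/hi are (h, s, v) bounds using
    OpenCV-style scaling (hue 0-179, saturation/value 0-255)."""
    r, g, b = (c / 255.0 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)   # each in 0-1
    hsv = (h * 179, s * 255, v * 255)        # rescale to OpenCV-style ranges
    return all(l <= c <= u for c, l, u in zip(hsv, lo, hi))

# Illustrative bounds for a bright green retroreflective target.
LO, HI = (50, 100, 100), (90, 255, 255)

print(in_range((40, 255, 60), LO, HI))    # saturated green → True
print(in_range((255, 255, 255), LO, HI))  # white ceiling light → False
```

The white pixel fails because its saturation is zero, which is exactly why a saturation lower bound is so effective against white lights.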


Is that a function in code? If so, would you mind providing some related code?

It should be native to the Limelight config. The “hue slider”?


Thanks! Can you post a screenshot of that, since I don’t have access to my robot right now? :joy:

Can you further explain how to lower the exposure and how to use the hue slider to pick out the target’s color? I’m still confused about how to put this into practice. Thanks!

Take a look at this for the exposure:

You can see when the exposure is high, lots of the image appears to be near white and washed out. When you lower the exposure, only really bright things are even visible. When they are the only things visible, you can tell their color better because they aren’t washed out.
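A crude way to see that washout effect in numbers: model exposure as a simple gain with clipping at 255. The function and the pixel values below are illustrative, not how the camera actually works internally.

```python
def expose(rgb, gain):
    """Crude exposure model: scale each channel by a gain and clip at 255."""
    return tuple(min(255, int(c * gain)) for c in rgb)

green_goal = (10, 120, 30)     # dim green target as seen by the sensor
white_light = (110, 120, 115)  # ceiling light

# High exposure: both clip toward white and become hard to tell apart.
print(expose(green_goal, 20), expose(white_light, 20))   # (200, 255, 255) (255, 255, 255)

# Low exposure: the color difference survives and is easy to threshold.
print(expose(green_goal, 2), expose(white_light, 2))     # (20, 240, 60) (220, 240, 230)
```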

Here is a description of hue:

And here is where you add a thresholding wand as an “HSV” (Hue, Saturation, and Value) filter:

You can use the HSV filter to remove the lights as they will appear white/yellow and the goal will be green.

Let us know if you need some more details on any of this.

Edit: added for clarity.


Thanks for the reply!

After reading the docs, I wonder whether it’s the case that the lower the exposure, the better the Limelight will perform?

After reading the documentation about hue and adding a thresholding wand, I have some questions. First, how should I tune Hue, Saturation, and Value? If there is a light nearby, which parameter should be tuned first? And do all three parameters need to be tuned?

Is the HSV filter the same as the Thresholding Wands?


Probably, to a point, yes. It’s most likely that the brightest things in your image will be the green goal and the white lights. To tell these apart you need a very low exposure; otherwise they will both just appear white. You can also go TOO low: then, when you are far away and the goal is a little dimmer, you won’t be able to filter for it. Keep that in mind and make sure you test at the extremes.

Trial and error :slight_smile: I’m sure there are some standard ways to do this, but I have always just guessed and checked. Try starting with H and tightening the threshold until it’s close, then do the same for the other two. Having a good understanding of the HSV colorspace is probably important here too, so maybe watch a YouTube video.
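If you want to be slightly more systematic than pure guess-and-check, one common approach is to sample HSV readings off the target at a few distances and pad the observed range on each channel. A rough sketch; the helper name, margin, and sample numbers are all made up:

```python
def hsv_bounds(samples, margin=10):
    """Derive loose HSV bounds from hand-picked target pixels: take the
    min/max seen on each channel and pad by a margin. samples is a list of
    (h, s, v) tuples in OpenCV-style ranges (hue 0-179, sat/val 0-255)."""
    lo, hi = [], []
    for ch in range(3):
        vals = [p[ch] for p in samples]
        lo.append(max(0, min(vals) - margin))
        hi.append(min(255 if ch else 179, max(vals) + margin))  # hue caps at 179
    return tuple(lo), tuple(hi)

# HSV readings sampled off the goal at a few distances (made-up numbers).
samples = [(62, 210, 250), (65, 190, 230), (60, 225, 255)]
print(hsv_bounds(samples))   # ((50, 180, 220), (75, 235, 255))
```

You would then type the resulting lower/upper bounds into the threshold sliders and verify against a frame that has a light in it.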

I don’t think the light should change your tuning, but it’s a good idea to make sure there is a light in frame while you tune, so you can confirm you are filtering it out.

This I don’t know. Anyone else?


The things that are really challenging to get rid of are LED pixel screens and LED TVs. Those have actual Limelight-green content…
It bit us at the Texas Cup two years ago. We fixed it with tuning, once we could see it going “squirrel!” and grabbing the TVs every couple of seconds.
*** Pro tip: ALWAYS use an Ethernet switch with your Limelight. If you don’t, you end up unplugging it to tether and you can’t actually troubleshoot your Limelight. ***


First lower exposure, then try increasing red balance and lowering blue balance. I believe we ended up somewhere around 2000 and 500 respectively but this is off the top of my head. This will help make the white/yellow lights appear more red while the pure green will stay pretty green, the settings we used were probably not perfect but did work quite well. I will try to remember to check them next meeting we have and share what we used.
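To see why shifting the color balance helps, here is a rough Python model. Note this is only an illustration: the gains and pixel values are made up, and the real Limelight balance settings (the 2000/500-style numbers) are not simple channel multipliers.

```python
import colorsys

def apply_balance(rgb, red_gain, blue_gain):
    """Crude white-balance model: scale the red and blue channels, clip at 255."""
    r, g, b = rgb
    return (min(255, int(r * red_gain)), g, min(255, int(b * blue_gain)))

def hue_deg(rgb):
    """Hue of an RGB pixel in degrees (0-360)."""
    h, _, _ = colorsys.rgb_to_hsv(*(c / 255.0 for c in rgb))
    return round(h * 360)

white = (200, 200, 200)   # ceiling light
green = (30, 220, 50)     # goal

# Boost red, cut blue: the "white" light picks up a strong red/orange cast...
print(hue_deg(apply_balance(white, 1.5, 0.5)))   # 39 (orange)

# ...while the saturated green target stays solidly green.
print(hue_deg(apply_balance(green, 1.5, 0.5)))   # 114 (still green)
```

Once the lights are pushed away from green in hue, the HSV threshold has much more room to breathe.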


Are these values Hue or Saturation?

Would you mind providing a link?

They are on the “Input” tab, under Exposure and Black Level.


The way we did it was setting red balance to 2500 and increasing blue a bit to 1600. This made everything except the target appear purple.


So you and @swurl don’t change the HSV values? Besides changing the red and blue balance in the Input tab under Exposure, what else do you change to prevent the Limelight from detecting lights?

We do. The combination of red/blue balance, exposure, and well tuned HSV is what makes Limelight ignore lights.


We do tune the HSV; changing the light balance makes it a lot easier to tune so that you never pick up white lights but always pick up the targets.


General HSV:

Some specifics on color filtering. He uses Python, but the OpenCV terms/filtering still apply:


At one event, the bright white from the lights and especially the windows was actually a little greenish during part of the day when we analyzed the RGB values. All three channels were above 248 or so, but green had the highest value.
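If you ever want to check for this off a saved snapshot, a tiny sketch of the idea. The function name, the 248 floor, and the pixel values are all illustrative:

```python
def dominant_channel(rgb, clip_floor=248):
    """For near-saturated pixels, report which channel is largest; even an
    almost-white light can lean slightly green. Returns None for pixels
    that are not near-white."""
    if min(rgb) < clip_floor:
        return None                       # not a near-white pixel
    names = ("red", "green", "blue")
    return names[rgb.index(max(rgb))]

print(dominant_channel((250, 254, 249)))  # "green" (slightly greenish window)
print(dominant_channel((120, 250, 130)))  # None (not near-white)
```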

A possibly effective solution to that problem is to REDUCE your green LED brightness (highly adjustable on the latest LL) to as low as possible while still seeing the target at your max distance (and min distance, if that makes the target very small and dim), and adjust the LL to target less bright objects. It worked for us.

Sometimes we tried to position the robot so as not to point at an offending light or window.

Tuning was tricky and not ideal this past season, as described in other threads, because of the placement of the hub target, and because each arena is lit differently, with different colors, light placement, and windows that change drastically throughout the day.


I know this is a Limelight thread, but I wonder if you could run the HSV / inRange / threshold step and then run an ML model over the result.

The shape of the actual target shouldn’t change too much (I’d use multiple positions in the training data).

Just a thought.


What’s better than trial and error? Stealing pipelines, like 1678’s 2022 pipeline. This is just one example; many teams have their pipelines public.

Since many teams have already optimized their pipelines to near perfection, you can use theirs as a starting point or just use them as is.