Limelight For FRC: Major software update 2018.0, Videos, Batch 2, and Shipping

Hi Everyone,

For those who haven’t been introduced to Limelight, it is the easiest way to add high-speed vision to any FRC robot. It is the first smart camera/vision sensor that combines tracking software, networking, video streaming, an easy web interface for tuning, and 400 lumens of light output in an easy-to-mount box. Our beta testers have added vision tracking to their robots in (usually) under an hour.

Initial launch video:


Website:

Docs:
http://docs.limelightvision.io/en/latest/

Limelight also enables tracking without gyroscopes or encoders, as seen here:

987’s software student and a 2017 mechanical grad took 987’s 2015 robot from zero to sensorless tracking in two hours with a very small amount of code. This included flashing the roboRIO, updating toolchains, etc. The documentation now has multiple case studies with code to help add tracking to any robot.

Here’s another example. Tracking and range finding were tuned entirely with Limelight’s crosshair calibration feature. The only other sensor used in this video is an encoder on the flywheel:

The full source code for this test is available in the docs. We were actually very surprised by how little code we needed to make a capable STEAMWORKS shooter with Limelight.
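For a sense of how little code is involved, a gyro-free aiming loop like the ones in the docs can be sketched as a simple proportional controller on Limelight’s horizontal offset. The gain, deadband, and minimum-command numbers below are made-up illustrations, not values from the actual example program; real constants must be tuned per robot.

```python
# Hypothetical sketch of sensorless aiming with Limelight's "tx" value
# (horizontal offset from crosshair to target, in degrees) and "tv"
# (1.0 when a valid target is visible). Constants are illustrative.

KP_AIM = 0.03    # proportional gain (assumed, tune per robot)
MIN_CMD = 0.05   # minimum command to overcome drivetrain friction (assumed)

def aim_command(tx, tv):
    """Return a left/right steering adjustment from Limelight data."""
    if tv < 1.0:
        return 0.0                       # no valid target: do not steer
    steer = KP_AIM * tx
    # Apply a minimum command so the robot keeps turning when close
    if abs(steer) < MIN_CMD and abs(tx) > 0.5:
        steer = MIN_CMD if tx > 0 else -MIN_CMD
    return steer
```

On a robot, the returned value would be added to one side of the drivetrain and subtracted from the other each loop iteration.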

Limelight works with any programming language, and there is no complicated setup involved. No libraries, toolchains, soldering, crimping, networking code, or Linux required. The latency is very low, so you can completely forget about latency compensation as well. We want to level the playing field when it comes to vision, specifically for teams that lack software mentors or an abundance of software students.

We have a lot to announce today.

First, batch 2 will go on sale on Saturday at 4:00 PM PST and start shipping late in week 1.
Second, batch 1 will start shipping tomorrow. Many thanks to those who pre-ordered.

Also, after more testing, we are now recommending the addition of a small network switch to your robots if you wish to use vision while tethered at an event.

Software update 2018.0 (1/3/18)
On top of a ton of new case studies, more detailed documentation, and a full example program for an autonomous STEAMWORKS shooter, the software has received a major upgrade.

FEATURES

New Vision Pipeline interface:

  • https://thumbs.gfycat.com/UnfitLankyHadrosaurus-max-14mb.gif

  • Add up to 10 unique vision pipelines, each with custom crosshairs, thresholding options, exposure, filtering options, etc.

  • Name each vision pipeline.

  • Mark any pipeline as the “default” pipeline.

  • Instantly switch between pipelines during a match with the new “pipeline” NetworkTables value. This is useful for games that have multiple vision targets (e.g. the gear peg and boiler from 2017). This is also useful for teams that need to use slightly different crosshair options per robot, field, alliance, etc.

  • Download vision pipelines from Limelight to backup or share with other teams.

  • Upload vision pipelines to any “slot” to use downloaded pipelines.
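The pipeline-switching logic boils down to writing a slot index to the “pipeline” NetworkTables value. In the runnable sketch below a plain dict stands in for the NetworkTables table so the logic works anywhere; the slot assignments are hypothetical and must match whatever you saved in the web interface.

```python
# Runnable sketch of pipeline switching. On a robot you would write the
# "pipeline" entry of the Limelight's NetworkTables table; here a dict
# stands in for that table. Slot numbers below are assumptions.

GEAR_PEG_PIPELINE = 0   # assumed slot: gear-peg tracking
BOILER_PIPELINE = 1     # assumed slot: boiler tracking

def select_pipeline(table, index):
    if not 0 <= index <= 9:   # Limelight holds up to 10 pipelines
        raise ValueError("pipeline index must be 0-9")
    table["pipeline"] = index

table = {}
select_pipeline(table, BOILER_PIPELINE)
```

Robot code would call this when the driver or autonomous routine changes targets mid-match.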

Target “Grouping” option:

New Crosshair Calibration interface:

  • “Single” and “Dual” crosshair modes.
  • “Single” mode is what Limelight utilized prior to this update. Teams align their robots manually, and “calibrate” to re-zero targeting values about the crosshair.
  • “Dual” mode is an advanced feature for robots that need a dynamic crosshair that automatically adjusts as a target’s area / distance to target changes. We’ve used this feature on some of our shooting robots, as some of them shot with a slight curve. This feature will also be useful for robots with uncentered and/or misaligned Limelight mounts.
  • Separate X and Y calibration.
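Conceptually, a dynamic crosshair of this kind can be modeled as interpolating between two calibrated crosshair positions as target area changes. This is only a guess at the idea to make it concrete, not Limelight’s actual implementation; the function and its arguments are hypothetical.

```python
def dynamic_crosshair_x(area, area_a, x_a, area_b, x_b):
    """Interpolate a crosshair x-position between two calibrations.

    (area_a, x_a) and (area_b, x_b) are two calibrated points in
    normalized screen space; the result is clamped to that range.
    """
    if area_b == area_a:
        return x_a                        # degenerate calibration
    t = (area - area_a) / (area_b - area_a)
    t = max(0.0, min(1.0, t))             # clamp to calibrated range
    return x_a + t * (x_b - x_a)
```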

Misc:

  • Add Valid Target “tv” key to Network Tables.

  • Add Targeting Latency “tl” key to Network Tables. “tl” measures the vision pipeline execution time. Add at least 11 ms for capture time.

  • Draw additional rectangle to help explain aspect ratio calculation.

  • Remove throttling feature, and lock Limelight to 90 fps.

  • Disable focusing on most web interface buttons. Fixes workflow problem reported by teams who would calibrate their crosshairs, then press “enter” to enable their robots.

  • Post three “raw” contours and both crosshairs to Network Tables.

  • Access a raw contour with tx0, tx1, ta0, ta1, etc.

  • Access both raw crosshairs with cx0, cy0, cx1, cy1.

  • All x/y values are in normalized screen space (-1.0 to 1.0)

  • Add “suffix” option to web interface. Allows users to add a suffix to their Limelights’ hostnames and NetworkTables (e.g. limelight-boiler). This feature should only be utilized if teams intend to use multiple Limelights on a single robot.

  • Display image version on web interface
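Consuming the new keys is straightforward: gate on “tv” before acting on targeting data, and add at least 11 ms of capture time to “tl” to estimate total latency, as the notes above describe. The dict below stands in for a NetworkTables snapshot so the sketch runs anywhere; key names match the update notes, everything else is illustrative.

```python
# Sketch of consuming the new "tv" (valid target) and "tl" (pipeline
# latency) keys. A plain dict stands in for NetworkTables here.

CAPTURE_MS = 11.0   # minimum capture time stated in the release notes

def total_latency_ms(tl_ms, capture_ms=CAPTURE_MS):
    """Estimate photons-to-data latency: pipeline time plus capture time."""
    return tl_ms + capture_ms

def target_offsets(nt):
    """Return (tx, ty) only when a valid target is present, else None."""
    if nt.get("tv", 0.0) < 1.0:
        return None
    return nt.get("tx", 0.0), nt.get("ty", 0.0)

# Example snapshot of Limelight values (illustrative numbers)
snapshot = {"tv": 1.0, "tl": 5.0, "tx": -2.5, "ty": 4.0}
```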

Optimizations

  • Decrease networking-related latency to ~0.2 ms from ~10 ms (Thanks Thad House)
  • Move stream encoding and jpg compression to third core, eliminating 10 ms hitch (25 - 30 ms hitch with two cameras) seen every six frames.
  • Drop steady-state pipeline execution time to 5 ms with SIMD optimizations.
  • http://docs.limelightvision.io/en/latest/_images/20180_latency.png
  • New latency testing shows ~22 ms total latency from photons to targeting information.
  • Upgrade Network Tables to v4 (Thanks Thad House)
  • Optimize contour filtering step. Latency no longer spikes when many contours exist.
  • Much improved hysteresis tuning.
  • Significantly improve responsiveness of web interface <-> Limelight actions.

Bugfixes

  • Fix minor area value inaccuracy which prevented value from reaching 100% (maxed ~99%).
  • Fix half-pixel offset in all targeting calculations
  • Fix camera stream info not populating for NT servers started after Limelight’s boot sequence. Regularly refresh camera stream info.
  • Fix bug which caused aspect ratio to “flip” occasionally.
  • Force standard stream output (rather than thresholded output) in driver mode.
  • Fix bug which prevented LEDs from blinking after resetting networking information

Finally, we want to thank our testers for helping us get to this point. We chose a great mix of experienced and inexperienced teams, and their feedback directly translated to changes throughout development:

Alpha testing + **invaluable** electronics help:
Kiet Chau and Team 1538
Alpha testing:
Norman Morgan, Greg McKaskle and Team 2468

Beta testing:
Corey Applegate and Team 3244
Marshall Massengill and Team 900
Brian Selle and Team 3310
Kevin Hjelstrom and Team 2605
Adam Heard and Team 973
Solomon Greenburg and Team 2898

Best,
The Limelight Team

You guys rock. That’s all I got to say about this.

:yikes:

**It just keeps getting better!**

We will be installing this update tonight!

Looking forward to using the Limelight this year! Thanks for the updates 987, this update adds several awesome upgrades!

This is awesome! This figure (and your demos, of course) just confirm what I had suspected: Most teams will be very happy with using the Limelight as the only feedback device in a vision-based control loop.

“Single”, “Dual”, and “Tri” grouping modes.

What do you know that we will find out in 2 days?

This is so awesome, can’t wait to hopefully get out during week 2 to use it on our 2018 robot!

Yeah we knew it could work on turrets since we’ve been doing that for a while, but we had to try it on a drivetrain. Joe saw us testing aiming with the drivetrain and he liked it so much that now he’s threatening to veto any turret talk this year! :smiley:

Honest, we know nothing! We asked Frank to just tell us whether vision would matter this year but he wouldn’t even give us that.

I sure am hoping for a shooting game lol! We’ll have more robots shooting than ever! (psst Frank, if it’s not a shooting game, there is still time to change your mind!)

It’s not a threat…:wink:

Any idea when more will become available to purchase? Please?!

They go back in stock at 4 PM PST Saturday.

Putting alarms in calendar! Thank you!

Who are you kidding… You veto turrets every year.:slight_smile:

make turrets great again!

Wow. Equally amazing in person.

The Limelight update: this converted my view from “a good vision solution that many could reproduce using a few parts and some time coding” to “nearly a must-have” for quickly getting your robot great vision and moving on to the other important hardware and software development.

Hello, my team is wondering if Limelight is able to track colors like blue or red.
We know Limelight is able to track reflective tape, but does it track other things as well?

Sweet! We were able to order some. :slight_smile:

The device is a fully capable vision system. The current pipeline, yes, is optimized for reflective targets. The developers told me updates should include game-specific tools. In the next few days I would hope for a POWER UP update.

I just started reading the manual word for word.


3.5
ALLIANCE color of the PLATES is provided to the Driver Station software by the Field Management System. More details are in Section 3.10, The Field Management System.

You shouldn’t need to see this. It’s provided to everyone.

Thanks, we are new to vision sensors and wondered how well it works before we order and use it on this year’s playing field. One more question: can you put a map of the playing field into Limelight so it won’t run into objects and can find the target in autonomous?