Photonvision vs Limelight Software Performance on AprilTags

With the release of Limelight software supporting AprilTags (fiducial markers) in the latest OS release, I wanted to create a thread to track software performance comparisons between it and the third-party Photonvision.

My team is waiting for our LL2+ to come in to do side-by-side comparisons, but with our regular LL2 we can get tolerable performance when using low resolution (at the cost of detection distance) on Photonvision. I assume the first-party Limelight software would be slightly better on the old hardware, since the Limelight devs work only with their own hardware. I do think this is an important discussion, as many teams currently own older Limelights and do not want to upgrade to an OrangePi setup or LL3.


For sure! Anand’s awesome work in his coprocessor roundup is, to my knowledge, the most comprehensive benchmark collection for Photon at least. Now that the new Limelight firmware is out, hopefully teams can contribute their benchmarks too! (I would, but as a broke college student, I don’t own a Limelight.)

Fwiw, from my understanding the LL2 and LL2+ both use a Pi Compute Module 3 (B+?) internally and should perform identically.


Oh you can bet I’ll be adding Limelight performance numbers to my document. Unfortunately because they aren’t publishing basic performance numbers, it could be weeks before we know what the LL3 is capable of.

That said, the LL2 is a Pi3, and the LL3 is a Pi4, so if you want something quick you can use the numbers for those platforms for comparison.


Ok thanks! Your first document was a big help. At this point our budget has been set for the season, so my team is really focused on squeezing the best performance out of our LL2+, which for now probably means using Photonvision until the Limelight software matures.


This spreadsheet contains comparison testing done by @bankst to compare performance between PhotonVision and the Limelight 2+.

  • The following settings were used:
    • PhotonVision Settings (2023.1.2, 2D Mode)
      • Input
        • Exposure - 1.5
        • Brightness - 52
        • Camera Gain - 20
        • Red/Blue balance - 18/24
        • Default stream resolution for each camera resolution
      • AprilTag
        • Family - 16h5
        • Decimate - (1, 2, 3)
        • Blur - 0
        • Threads - 3
        • Refine Edges - True
        • Decision Margin Cutoff - 10
        • Pose Est Iters - 50 (though unused in 2D mode)
    • Limelight Settings (2023.0.1, 2D Mode)
      • NOTE: This performance can be improved/changed by using cropping or increasing the black level offset, as shared here (Limelight 2023.1 - MegaTag, Performance Boost, My Mistake). These settings were picked based on recommendations in the documentation, as we didn’t know these details at the time of testing.
      • Input
        • Exposure - 1.2 (low as possible, as recommended by documentation)
        • Camera Gain - 20 (as recommended by documentation)
        • Red/Blue Balance - default
        • Default stream resolution for each camera resolution
        • Black Level Offset - 0 (as recommended by documentation)
      • AprilTag
        • Family - 16h5
        • Decimate - (1, 2, 3)
        • No ID Filtering
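For intuition on why the decimate setting shows up in both configurations: AprilTag quad detection runs on a downsampled copy of the frame, so a decimate of N cuts the pixels the quad detector has to scan by roughly N². A minimal sketch of that arithmetic (the function name is just for illustration):

```python
def decimated_pixels(width: int, height: int, decimate: int) -> int:
    """Approximate pixel count the quad detector scans after decimation."""
    return (width // decimate) * (height // decimate)

# at 640x480, decimate=2 leaves a quarter of the pixels to scan,
# and decimate=3 roughly a ninth
for d in (1, 2, 3):
    print(d, decimated_pixels(640, 480, d))
```

Note that tag decoding still happens at full resolution around candidate quads, so the real-world speedup is less than N², and higher decimate values also reduce maximum detection distance.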

Due to the obvious potential for bias in these results (bankst is a PhotonVision developer), we encourage users to test performance for themselves to find what works best for their team. For users doing testing, we have some tips for getting consistent results:

  • Use the exact same hardware.
  • Use similar settings for each vision solution.
  • Don’t move the camera or the tag during testing; both should stay in the same position and environment.
  • Test in a well-lit environment at the same time of day (this is important if you’re testing in a room with windows, as lighting changes over time).
  • Keep in mind that teams can get very different results by changing various settings (exposure, black level offset, etc.). Keep these as consistent as possible.

It is important to follow these standards and methods (as one would when conducting a science experiment) so that your team has reliable information to base comparisons on.
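If you do run your own benchmarks, a simple timing harness also helps keep measurements comparable between runs. A rough sketch, assuming you can feed frames to your detector as a callable (`process_frame` stands in for whatever pipeline you are testing, and the warmup count is an arbitrary choice):

```python
import time

def measure_fps(process_frame, frames, warmup=10):
    """Average frames-per-second of process_frame over a fixed frame set."""
    for frame in frames[:warmup]:
        process_frame(frame)  # let caches and buffers settle before timing
    start = time.perf_counter()
    for frame in frames:
        process_frame(frame)
    elapsed = time.perf_counter() - start
    return len(frames) / elapsed

# usage with a stand-in "detector" that does trivial work:
fps = measure_fps(lambda f: sum(f), [list(range(100))] * 50)
print(f"{fps:.0f} fps")
```

Averaging over a fixed set of frames (rather than eyeballing a live FPS counter) smooths out per-frame jitter and makes numbers from different runs directly comparable.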
