Home Brew Limelight: How does your team emulate the Limelight experience?

I want to know if any teams have been able to recreate the Limelight experience without needing the Limelight.

Our team used the WPILib Raspberry Pi image this year for our vision and was generally able to make it work. I am wondering if anyone has been able to recreate a Limelight-type experience with a Raspberry Pi. This is something of a summer project for me, a CAD/design person, to get into the software side of robotics. It is also so our team can have reliable vision without burning two weeks of the season just to make vision work.

Preferable aspects would be:

  • use of the Raspberry Pi as a co-processor
  • use of a Raspberry Pi camera instead of a USB camera
  • use of an LED light ring for vision lighting (our current test has a 12 V ring powered from the PDP on a 20 A fuse; we are looking to switch to a PCM for control of the light, as in the sketch after this list).
    We don't mind running a separate cable to power the light; I would rather not design a custom regulation circuit and power the whole shooting match. Less headache during inspection.
  • all programming baked into a custom image that can be swapped out easily at competition as needed
  • all components easily packed into a mountable module (that's more CAD work, easy enough for me to do)
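For the PCM-controlled light in particular, a minimal sketch of the robot-side switching, assuming RobotPy (the Python port of WPILib) and using a made-up solenoid channel and joystick button, might look like this:

```python
# Illustrative only: toggle an LED ring wired to a PCM solenoid channel.
import wpilib

class Robot(wpilib.TimedRobot):
    def robotInit(self):
        self.ledRing = wpilib.Solenoid(0)   # hypothetical PCM channel 0
        self.stick = wpilib.Joystick(0)

    def teleopPeriodic(self):
        # Light the ring only while the driver holds the (assumed) vision
        # button, so the LEDs stay off when we are not tracking.
        self.ledRing.set(self.stick.getRawButton(1))

if __name__ == "__main__":
    wpilib.run(Robot)
```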

Like I said before, I am not a programmer. I know Python syntax, and I plan on learning Java this summer to work on this project. How would you all go about doing something like this, and what have your teams done in the past?

Our team did pretty much exactly this. I'm not a programmer, so I don't know the code for it, but I could probably find our team's GitHub stuff for it if you're interested. I don't know much more than that we did it.

I would love to see the GitHub for this; could you PM it to me?


I did something similar to what you're describing, but I didn't get it completely functional.

I created an (almost) plug-and-play script that runs on the WPILib RPi image; it tracks vision tape and cargo and finds the horizontal angle from the camera: https://github.com/team3997/ChickenVision
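The core of the "horizontal angle" step is just a little pinhole-camera trigonometry. A minimal sketch, using an assumed 640-pixel image width and an illustrative 60-degree horizontal FOV (not ChickenVision's actual constants):

```python
# Convert a target's pixel x-coordinate into a horizontal angle off the
# camera centerline. Width and FOV below are placeholder values.
import math

IMAGE_WIDTH = 640        # pixels (assumed)
HORIZONTAL_FOV = 60.0    # degrees (assumed)

# Pinhole-model focal length in pixels, derived from the horizontal FOV.
FOCAL_PX = (IMAGE_WIDTH / 2) / math.tan(math.radians(HORIZONTAL_FOV / 2))

def horizontal_angle(target_x: float) -> float:
    """Signed angle in degrees; positive means the target is right of center."""
    return math.degrees(math.atan((target_x - IMAGE_WIDTH / 2) / FOCAL_PX))

print(horizontal_angle(480))  # a target right of center gives a positive angle
```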

Hi, I'm the project manager of the Homebrew Limelight project, which coincidentally shares a name with your thread. We plan to create basically exactly what you're talking about software-wise, and all of the software is open source. Currently the project isn't very active, since a lot of us have just finished up school/finals, but we plan on working on it again soon. If you or anyone else would like to join, we would love to have some more developers! Here is a link to our Discord, our main method of communication.


@AJohnson342


Ditto on the GitHub.

Team 2073 has attempted to create a Limelight-like experience with the JeVois.
Our code's features were quite similar. We could not publish to NetworkTables from the JeVois, so we used serial over USB to send tracking values to the RIO. It works, but honestly, it is no Limelight!
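For anyone curious, the RIO side of a serial hand-off like that can be quite small. Here is a hedged sketch, assuming RobotPy's wpilib.SerialPort and an invented "ANGLE,&lt;degrees&gt;" line protocol (not 2073's actual message format):

```python
# Illustrative only: parse tracking values a JeVois sends over USB serial.
# The baud rate and the "ANGLE,<degrees>" protocol are assumptions.
import wpilib

class Robot(wpilib.TimedRobot):
    def robotInit(self):
        # The JeVois enumerates as a USB serial device on the roboRIO.
        self.jevois = wpilib.SerialPort(115200, wpilib.SerialPort.Port.kUSB)
        self.targetAngle = 0.0

    def teleopPeriodic(self):
        # Read whatever arrived since the last loop; keep the newest angle.
        for line in self.jevois.readString().splitlines():
            if line.startswith("ANGLE,"):
                try:
                    self.targetAngle = float(line.split(",")[1])
                except ValueError:
                    pass  # ignore partial or malformed lines

if __name__ == "__main__":
    wpilib.run(Robot)
```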

Raw performance (FPS) and near-negligible lag are the features we never could replicate. Closing the control loop with vision just did not produce favorable results. That is where the Limelight really shines!

Could you not just flash the official Limelight image to a Pi and recreate the hardware? It's not locked down.

It depends a lot on what hardware is actually inside the box. These small embedded Linux boards usually come with a lot of configuration, settings, and drivers that are specific to the hardware on the board. That said, I don't know what's actually in the Limelight box.

I'm pretty sure it's a Raspberry Pi. I think the most recent version is a Pi Zero.

5026 had a Jetson TK1 on the robot this year for streaming multiple cameras and for image processing. This came about for a handful of reasons, like what was on hand and the priority we gave to multiple low-latency images during Sandstorm. Streaming used GStreamer H.264 with the buffer and bandwidth settings tamped down for latency. Amusingly, we've recently pushed out a very similar setup in an industry setting for almost the same reasons.
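To give a flavor of what "tamped down for latency" means, here is a rough sketch of such a pipeline launched from Python. The device path, IP, port, and bitrate are placeholders, and x264enc stands in for whatever encoder the board actually provides (a Jetson would normally use its hardware encoder); this is not our actual pipeline:

```python
# Illustrative low-latency H.264 stream, not our actual pipeline.
import subprocess

pipeline = (
    "v4l2src device=/dev/video0 "
    "! video/x-raw,width=320,height=240,framerate=30/1 "
    "! videoconvert "
    # zerolatency/ultrafast and a small bitrate trade quality for latency.
    "! x264enc tune=zerolatency speed-preset=ultrafast bitrate=1000 key-int-max=30 "
    "! rtph264pay config-interval=1 "
    # sync=false keeps udpsink from buffering frames to match a clock.
    "! udpsink host=10.50.26.5 port=5800 sync=false"
)
# gst-launch-1.0 takes the pipeline as whitespace-separated arguments.
subprocess.run(["gst-launch-1.0", *pipeline.split()])
```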

Streaming was with us the whole season (or rather, its absence was noticeable whenever it went down), but the image processing setup was never run in competition. The students labeled and trained YOLOv3 nets on supervise.ly and set up GStreamer to split the stream between streaming and inference, with the inference results sent over NetworkTables back to the RIO. At some point all of those individual pieces were working, but they were never really integrated.
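The NetworkTables leg of that is pleasantly small with pynetworktables. A sketch, with made-up table and key names (10.50.26.2 follows the usual 10.TE.AM.2 roboRIO address convention for team 5026):

```python
# Illustrative only: push inference results from a co-processor to the RIO.
from networktables import NetworkTables

NetworkTables.initialize(server="10.50.26.2")  # roboRIO address (assumed)
table = NetworkTables.getTable("Vision")       # table name is made up

def publish_detection(label: str, x_center: float, confidence: float):
    # Robot code on the RIO reads these keys each loop.
    table.putString("label", label)
    table.putNumber("xCenter", x_center)
    table.putNumber("confidence", confidence)

publish_detection("cargo", 0.42, 0.91)  # example values
```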

I'm not sure which of these parts the students will carry over into next year, but the toolbox has some neat stuff in it. There's been some talk about picking up the Limelight, or about substituting GRIP/OpenCV for YOLOv3 to do more traditional image processing steps. This year we had a few seniors who would write OpenCV processors for fun, but that might not be in the toolbox next year. The TK1s were important enough to the experience this year that we have invested in some Jetson Nanos to run a similar setup in the future without some of the… electromechanical challenges.
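For reference, the "more traditional" GRIP/OpenCV route mostly boils down to a threshold-and-contour pipeline. A generic sketch (the HSV bounds are common starting values for a green LED ring, not anyone's tuned numbers):

```python
# Illustrative only: find the retroreflective target in a BGR frame.
import cv2
import numpy as np

def find_target(frame):
    # Threshold for the green glow of LED-lit retroreflective tape.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([50, 100, 100]), np.array([90, 255, 255]))
    # [-2] keeps this working on both OpenCV 3 and 4 return signatures.
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_SIMPLE)[-2]
    if not contours:
        return None
    # Assume the largest contour is the target.
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    return (x + w / 2, y + h / 2)  # target center in pixels
```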

Honestly, we've tried so many different options at this point, and nothing even comes close to the ease of use and accessibility of the Limelight. It's just an excellent platform overall and is definitely worth a shot. We've tried the Pi, the Jetson, the JeVois, and the Pixy, and only the JeVois has given results close to the Limelight's.


Our team disassembled a Limelight this past season. It is a Pi (it has the Pi logo), though it looks as though it's been repackaged to fit the Limelight form factor better; I didn't look too closely, but it looked like a Pi Zero. Also, FYI, our Limelight targeting lights got toasted after one season of use, which is why we disassembled it. The lights are on a nice separate header board, making them very easy to remove and replace/repair.


Huh. The Pi CM3 has two hardware camera ports… Stereo Limelight when?


It's actually the Pi Compute Module 3 board, but that is basically the same thing that goes into an RPi 3B+. They use a custom breakout board for it, which is how they get that insanely small form factor.