I want to know if any teams have been able to recreate the Limelight experience without needing the Limelight.
Our team used the WPILib Raspberry Pi image this year for our vision and was generally able to make it work. I am wondering if anyone has been able to recreate a Limelight-type experience with a Raspberry Pi (I've sketched what I'm picturing below the list). This is kind of a summer project for me, a CAD/design person, to get into the software side of robotics. It's also so our team can have reliable vision without spending two weeks of the build season getting vision to work.
Preferred aspects would be:
- use of the Raspberry Pi as a co-processor
- use of a Raspberry Pi camera instead of a USB camera
- use of an LED light ring for vision lighting (current tests have a 12V version powered via the PDP on a 20 amp fuse; I'm looking to switch to a PCM for control of the light). We don't mind using a separate cable to power the light; I would rather not design a custom regulation circuit and power the whole shooting match, since that's less of a headache during inspection
- all programming made into a custom image that can be swapped out easily at competition as needed
- all components easily packed into a mountable module (that's more CAD work, easy enough for me to do)
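For the co-processor side, here is roughly the kind of pipeline I'm picturing, based on the Python example that ships with the WPILib Pi image: grab frames with cscore, threshold for the light ring's reflection with OpenCV, and publish a Limelight-style target offset over NetworkTables for the roboRIO to read. The table name, keys, HSV bounds, and roboRIO address are placeholders I made up, and I haven't actually run this on a Pi yet, so treat it as a sketch rather than working code:

```python
# Rough Limelight-style pipeline sketch for the WPILib Raspberry Pi image.
# Assumes robotpy-cscore, pynetworktables, OpenCV, and numpy are available
# (the stock Python example on the image uses the first two). Table/key names,
# HSV bounds, and the roboRIO address below are placeholders, not anything official.
import cv2
import numpy as np
from cscore import CameraServer
from networktables import NetworkTables

NetworkTables.initialize(server="10.TE.AM.2")  # replace TE.AM with your team number
table = NetworkTables.getTable("PiVision")

cs = CameraServer.getInstance()
camera = cs.startAutomaticCapture()  # should pick up the Pi/USB camera as /dev/video0
camera.setResolution(320, 240)

sink = cs.getVideo()                          # CvSink for grabbing frames
output = cs.putVideo("Processed", 320, 240)   # stream the mask back for tuning
frame = np.zeros((240, 320, 3), dtype=np.uint8)

while True:
    t, frame = sink.grabFrame(frame)
    if t == 0:
        continue  # frame grab failed, skip this iteration

    # Threshold for the green LED-ring reflection (HSV bounds need tuning)
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (60, 100, 100), (90, 255, 255))

    # [-2] works with both the OpenCV 3 and OpenCV 4 return signatures
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    if contours:
        target = max(contours, key=cv2.contourArea)
        x, y, w, h = cv2.boundingRect(target)
        # Publish a Limelight-style horizontal offset from image center (in pixels)
        table.putNumber("targetX", (x + w / 2) - 160)
        table.putBoolean("hasTarget", True)
    else:
        table.putBoolean("hasTarget", False)

    output.putFrame(mask)
```

The idea is that the robot code only ever reads hasTarget/targetX, the same way it would read tv/tx from a Limelight, so switching between the two later shouldn't mean rewriting robot code. I think the Pi camera shows up as a normal V4L2 device on the image so cscore treats it like a USB cam, but that's part of what I'd be testing.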
Like I said before, I am not a programmer. I know Python syntax, and I plan on learning Java this summer to work on this project. How would you guys go about doing something like this, and what have your teams done in the past?