When Limelight was introduced last season, I was a bit…disappointed by the price tag. I didn’t dwell on it too much, because I don’t know how much the device costs to manufacture, or how much time was spent developing it, but regardless, I felt it was too expensive to be accessible to many teams. Of course, many teams don’t have the mentorship to guide eager programmers through the process of vision, or they may not have time to dedicate to vision, and Limelight is an amazing solution that benefited teams that chose to go that way.
But I knew there must be another solution. Specifically, a cheaper one. A programmer at heart, I decided to create yet another vision package. I wanted to build it for a device that many FRC students might already have, such as a laptop, but one small enough to fit on a robot, like a Raspberry Pi. It needed a good camera and fast internals so as not to bottleneck performance. So I reached into my pocket, pulled out my iPhone, and created Frosted Glass.
**What is Frosted Glass?**
Frosted Glass is an out-of-the-box vision processing app designed for iOS. It functions similarly to a Raspberry Pi set up for FRC, but without the hassle of setup. It’s currently in the App Store at version 0.2, and will successfully detect retroreflective targets. It runs at ~80fps at 640x480 resolution and will use NetworkTables to communicate with the roboRIO.
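Once target data is flowing over NetworkTables, the robot-side math is mostly about turning a pixel offset into a steering angle. Here’s a minimal sketch of that conversion, assuming the 640x480 resolution mentioned above; the ~60° horizontal field of view is a placeholder for illustration, not a measured iPhone value:

```python
import math

def pixel_to_yaw(target_x_px, image_width_px=640, horizontal_fov_deg=60.0):
    """Convert a target's x pixel coordinate to a yaw angle in degrees.

    Positive means the target is to the right of the camera's center.
    Uses the pinhole-camera model instead of linear interpolation,
    which matters near the edges of a wide-FOV lens.
    """
    # Focal length in pixels, derived from the horizontal field of view.
    focal_px = (image_width_px / 2) / math.tan(math.radians(horizontal_fov_deg / 2))
    offset_px = target_x_px - image_width_px / 2
    return math.degrees(math.atan2(offset_px, focal_px))

# A centered target yields zero yaw; one at the right edge yields
# half the field of view.
print(pixel_to_yaw(320))  # 0.0
print(pixel_to_yaw(640))  # 30.0
```

That yaw value is exactly what you’d feed a turn-to-target routine, whatever key names the final NetworkTables layout ends up using.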
**Why would I put my iPhone on a robot?**
To be honest…you probably won’t. At least not in competition. Not many people have spare iPhones lying around waiting for use. But my ideal use of Frosted Glass isn’t to completely replace a hand-built vision system. That would take away too much of the learning experience of FRC. Instead, I envision teams using it in the first couple days or weeks of build season, when testing chassis, shooters, or autonomous routines. When they don’t want to dedicate time to vision *just yet*. Then, once they have the kinks worked out, they can come up with a more permanent solution, build their own improved processing pipeline, and learn what it takes to set up an offboard (or onboard) system. An iPhone is a perfect tool for testing because you can pick it up and move it around, simulating robot movement without having a built robot, but no one wants to put theirs on a robot during an intense match.
It’s also great for off-season events when you aren’t in the grind of robotics yet, but maybe want to experiment with target tracking before January.
Whatever the reason may be, I hope teams will find a use for it.
**What about Android?**
I’ve been asked this a bit since I started discussing Frosted Glass within the FRC community. The short answer is: I have an iPhone, so that’s what I developed for. The long answer is: the Cheezy Poofs have their CheezDroid vision app from 2016 that teams can use, and if someone would like to help turn this into an Android app as well, I’ve heard lots of good arguments for Android over iPhone ($$$) and would be willing to venture down that path.
**How do I get Frosted Glass?**
Like I said before, it’s in the App Store, but it’s also on my GitHub. You’ll need a Mac with the latest version of Xcode to build it yourself, but if you would like to customize it for your needs, go ahead!
**What can I do to help?**
Right now, Frosted Glass is still early in development. It has no working NetworkTables interaction, and virtually no customization. I’m looking for iOS developers (specifically Swift developers) who know what they’re doing and would like to contribute to an open-source FRC project to help me continue developing. I expect to have a finalized version before 2019, and a working pipeline within 24 hours of kickoff, so teams can start practicing almost immediately.