Background: there was a competition sponsored by Intel, the OpenCV Spatial AI Competition. Its deadline has passed, but the competition hardware is now available for backing on Kickstarter. For $129 USD ($79 USD for the lower-spec version with no 3D vision) you can get a device with two stereoscopic sensors, for dramatically better 3D vision support than two $399-each Limelights. It also has an embedded 4K 30 fps RGB camera for full OpenCV processing, and vision and AI processing run onboard, so nothing has to run on the roboRIO besides robot movement. When I get a hold of one I will attempt a pull request to push its values to NetworkTables, so there should be super-low-cost 3D AI available for 2021 or 2022.

Oh… Did I mention its $10,000 minimum Kickstarter funding goal was broken in record time (roughly 200% within 20 minutes of launch) and is now at over $200k USD?

Recommendation: share this with every team, beginning or established. Something better, faster, and cheaper than a Limelight is soon to be available.
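For anyone curious what that integration would look like: publishing results from a coprocessor back to the roboRIO over NetworkTables only takes a few lines. Here's a rough sketch (the table and entry names are placeholders I made up, and it uses the older `startClientTeam` client API, so adjust for your WPILib version):

```java
import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableInstance;

// Sketch only: a coprocessor with the depth camera attached connects to the
// roboRIO's NetworkTables server and publishes detections for robot code to read.
// Table and entry names below are placeholders, not an existing standard.
public class DepthCameraPublisher {
  public static void main(String[] args) throws InterruptedException {
    NetworkTableInstance inst = NetworkTableInstance.getDefault();
    inst.startClientTeam(1234); // connect to the robot by team number

    NetworkTable table = inst.getTable("DepthCamera");

    while (true) {
      // Replace these with real values from the camera's detection pipeline.
      table.getEntry("hasTarget").setBoolean(false);
      table.getEntry("targetYawDeg").setDouble(0.0);
      table.getEntry("targetDistanceMeters").setDouble(0.0);
      Thread.sleep(20); // ~50 Hz update rate
    }
  }
}
```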
This is pretty cool. I don’t think it’s super appropriate for FRC yet, but it’s definitely a cool machine learning tool.
Probably worth noting that $129 is a special price for Kickstarter backers - the retail price is $299. Still beats a Limelight on price, but not Gloworm. While neither of those is capable of true 3D tracking, the standard single camera plus green LED ring has proved to be sufficient.
It looks very interesting, but we should wait to see what usage in an FRC setting looks like before we jump on it.
Oh… Sorry, my goal isn’t to get more backers for the product; they have PLENTY. I ultimately want to get the word out about a new option: a product that’s cheaper at retail, with higher resolution, onboard computing, more accurate stereoscopic 3D than two $399 USD Limelights, and no need for the power of a Jetson TX2 to run…
Thanks for sharing this, looks really neat.
I’m personally much too risk-averse to go down this path, especially with significantly less time-intensive options such as the Limelight available, until 1) it’s proven on the field to be a significant competitive advantage, or 2) the time investment to develop a system around this is low enough that it’s a no-brainer for the relatively small improvement expected.
(LL does vision well enough, quick enough, that we have other areas where improvement can have greater effect)
Yeah I’m 100% in agreement - a limelight is purpose built for our little space so it’s always easier for folks to use than to adapt something else.
Edit - that being said I backed it at the depth camera level for unrelated projects. Whatever. Still cheaper than the last Kickstarter I backed…
Probably hard to justify “Better” without the same ease of setup and config for vision tracking, case, power supply, etc. But, super promising for folks who want to do stereo themselves.
I’ve got zero experience doing SLAM, but I have to wonder if a handful of these could start to provide enough data to do it reliably.
This isn’t operating in the same space as the LL or Gloworm. It looks like a powerful toy to mess with, but I wager any team that buys one will ultimately have worse vision integration than someone with a Gloworm or LL. FRC vision is its own territory.
Not sure I’d make that bet. It’s def true for the average team but there are likely a few teams that have the ability to leverage this tool.
A cheap depth camera seems like it could be useful for certain things (like tracking the inner goal). I pledged for one.
I will point out that the programmer behind the vision system is much more powerful than the camera itself.
That being said, I haven’t compared the two cameras yet, nor have I done too much vision work at this point in time (though that is one of my projects right now).
Tracking the inner goal is actually quite simple if you have PnP tracking on the outer goal. It’s a single transform of the outer target to get to the inner target. See the WPILib docs for more on that.
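To illustrate, here's a rough sketch of that single transform using WPILib's geometry classes (package paths shown are the current `edu.wpi.first.math` ones; it assumes the outer target's pose has +X pointing into the goal, and uses roughly 29.25 in for the outer-to-inner depth, which you should verify against the official field drawings):

```java
import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.math.geometry.Rotation2d;
import edu.wpi.first.math.geometry.Transform2d;
import edu.wpi.first.math.geometry.Translation2d;
import edu.wpi.first.math.util.Units;

// Sketch only: given a camera-relative pose of the OUTER port from solvePnP,
// the INNER port is a fixed offset straight back along the target's normal.
// Assumes the target pose's +X axis points into the goal; the ~29.25 in depth
// should be checked against the field drawings for your game.
public class InnerPortEstimator {
  private static final double OUTER_TO_INNER_METERS = Units.inchesToMeters(29.25);

  /** Camera-relative pose of the inner port, given the outer port's pose. */
  public static Pose2d innerFromOuter(Pose2d cameraToOuter) {
    Transform2d outerToInner =
        new Transform2d(new Translation2d(OUTER_TO_INNER_METERS, 0.0), new Rotation2d());
    return cameraToOuter.transformBy(outerToInner);
  }
}
```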
PhotonVision has PnP capability and can be quite accurate given a good calibration is done by the user.
Here’s a peek at what that looks like.
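On the code side, pulling that PnP result into robot code with PhotonLib looks roughly like this (a sketch, not official example code; `"gloworm"` is a placeholder camera name that must match the one set in the PhotonVision UI, the pipeline needs 3D mode enabled, and the exact method for the camera-to-target transform has changed across PhotonLib versions):

```java
import edu.wpi.first.math.geometry.Transform3d;
import org.photonvision.PhotonCamera;
import org.photonvision.targeting.PhotonPipelineResult;
import org.photonvision.targeting.PhotonTrackedTarget;

// Sketch of reading the solvePnP result out of PhotonLib. "gloworm" is a
// placeholder; it must match the camera name configured in the PhotonVision web UI.
public class VisionReader {
  private final PhotonCamera camera = new PhotonCamera("gloworm");

  public void readOnce() {
    PhotonPipelineResult result = camera.getLatestResult();
    if (result.hasTargets()) {
      PhotonTrackedTarget target = result.getBestTarget();
      // Camera-to-target transform from the PnP solve (recent PhotonLib versions).
      Transform3d cameraToTarget = target.getBestCameraToTarget();
      System.out.printf("Target at x=%.2f m, y=%.2f m%n",
          cameraToTarget.getX(), cameraToTarget.getY());
    }
  }
}
```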