Our team was wondering what the limitations are on camera/vision systems, or on what types of cameras we are allowed to use. We are currently working with a Pixy, but we also want an alternative sight system for the Sandstorm period of Deep Space. Any help is much appreciated.
Unless it is disallowed somewhere by the rules, it's legal. Key rules to pay attention to for camera systems are:

- the 4 Mbps bandwidth limitation,
- the no-wireless restriction (all data to the DS must go through the robot radio, so no separate wireless connections),
- the battery/power limitations (integrated batteries are allowed if the device is COTS, and some USB battery packs are, but all voltages on the robot must be below 24V),
- the $500 individual component cost limit, and
- (potentially) the Class 1 laser restriction.
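On the bandwidth point: if you end up streaming a USB camera through the roboRIO, you can cap its resolution and frame rate in code so the stream stays well under 4 Mbps. A rough sketch, assuming 2019-era WPILib Java (import paths and the `getInstance()` call differ in newer WPILib versions):

```java
import edu.wpi.first.cameraserver.CameraServer;
import edu.wpi.first.cscore.UsbCamera;
import edu.wpi.first.wpilibj.TimedRobot;

public class Robot extends TimedRobot {
  @Override
  public void robotInit() {
    // Stream a roboRIO-connected USB camera to the dashboard.
    UsbCamera driverCam = CameraServer.getInstance().startAutomaticCapture();

    // Modest resolution and frame rate keep the MJPEG stream
    // comfortably under the 4 Mbps bandwidth cap.
    driverCam.setResolution(320, 240);
    driverCam.setFPS(15);
  }
}
```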
Thanks, this was very helpful. Good luck!
Hey, just wondering, since we are a team trying vision processing for the first time: why would you need an alternative sight system? Is it just to get a different angle, or something else? Would you use something simple like a Microsoft USB cam for that?
My team uses the Limelight for vision. We use the Limelight's internal camera to do vision processing, and a secondary camera to give the driver an additional view. It may be possible to do vision processing from multiple cameras, but it would be harder to implement and use effectively.
Our secondary camera is a Microsoft USB cam.
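In case it helps anyone starting out: the Limelight publishes its targeting results over NetworkTables, so reading them on the roboRIO is just a few lines. A minimal sketch in WPILib Java (the class name here is just an example; `tv` and `tx` are the documented Limelight keys):

```java
import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableInstance;

public class LimelightVision {
  private final NetworkTable table =
      NetworkTableInstance.getDefault().getTable("limelight");

  /** Whether the Limelight pipeline currently sees a target ("tv" is 0 or 1). */
  public boolean hasTarget() {
    return table.getEntry("tv").getDouble(0.0) >= 1.0;
  }

  /** Horizontal offset from the crosshair to the target, in degrees ("tx"). */
  public double getTargetX() {
    return table.getEntry("tx").getDouble(0.0);
  }
}
```

You can feed `getTargetX()` into a simple proportional turn during Sandstorm to line up on a target.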
Yeah, we were thinking that during Sandstorm, as much vision as possible would be very helpful.
I noticed that when a USB camera is plugged into the Limelight, its image is stitched to the Limelight's image in the port 5800 stream. Can you train the Limelight to do vision processing on the USB camera stream too?
I believe the answer is no. You can change how the Limelight shows its image, using a picture-in-picture mode rather than side by side, but that doesn't change much.
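For reference, the layout is controlled by the Limelight's `stream` NetworkTables entry. A quick sketch of switching modes from robot code (WPILib Java; the helper name here is just an example, and the mode meanings are as I recall them from the Limelight docs):

```java
import edu.wpi.first.networktables.NetworkTableInstance;

public class LimelightStream {
  /**
   * Set the Limelight's streaming mode:
   *   0 = Standard (side-by-side when a USB webcam is attached),
   *   1 = PiP Main (USB camera in the corner of the Limelight image),
   *   2 = PiP Secondary (Limelight image in the corner of the USB camera).
   */
  public static void setStreamMode(int mode) {
    NetworkTableInstance.getDefault()
        .getTable("limelight")
        .getEntry("stream")
        .setNumber(mode);
  }
}
```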