What drive cameras are people using to stay below the 4 Mbps limit? Most of the MJPEG USB cameras I’ve found don’t have any supported modes that use < 5 Mbps.
We’re using a Logitech C920 with a Raspberry Pi image. We have it streaming at 320x180 at 20 fps with a compression level of 50 set in Shuffleboard. It usually uses 0.6 Mbps and at its worst uses 1.9 Mbps.
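As a sanity check on those numbers, here’s a rough back-of-the-envelope calculation of what those measured bitrates imply per frame (the 0.6 and 1.9 Mbps figures are from the post above; the math is just a sketch):

```python
# What per-frame JPEG size do the measured bitrates imply
# for a 320x180 MJPEG stream at 20 fps?

def implied_frame_bytes(bitrate_mbps: float, fps: int) -> float:
    """Average compressed frame size, in bytes, implied by a measured bitrate."""
    return bitrate_mbps * 1_000_000 / 8 / fps

avg = implied_frame_bytes(0.6, 20)   # typical usage from the post
peak = implied_frame_bytes(1.9, 20)  # worst-case usage from the post

print(round(avg))   # 3750 bytes per frame on average
print(round(peak))  # 11875 bytes per frame at worst
```

So even the worst case works out to only ~12 KB per frame, which is why such a small, heavily compressed stream stays comfortably under the cap.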
If you’re having trouble keeping the bandwidth down, my suggestion is to turn down white balancing and brightness. The more color it has to process, the more bandwidth you’ll use.
I’m not sure I know what Shuffleboard is. We are using the Pi and opening the camera streams in SmartDashboard. What is Shuffleboard, and is it different?
Note that setting the compression in Shuffleboard will cause the software to recompress the image, increasing latency. You should see what the bandwidth usage is with compression set to -1, which passes through whatever compression the camera itself produces.
Here are the camera settings we’ve used in the past:
We use a couple of good old Microsoft HD3000s.
Shuffleboard is documented on ScreenSteps.
SmartDashboard and Shuffleboard are very similar programs. Shuffleboard is simply newer and will be replacing SmartDashboard next year. I would recommend you use it this year to become familiar with it, and I think you’ll find it a lot easier to use.
You should have also installed Shuffleboard when you installed SmartDashboard with the WPILib tools.
The actual camera is not really important. What is important is the image size, frames per second, and the compression setting. Most USB cameras will be roughly equivalent. You should be OK running at 320x240 (or something like 420x240 if it is widescreen), 30 fps, and compression down around 25. Play with the settings to find something the driver is OK with. Note that a higher FPS is better since it reduces latency.
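To see how those three settings trade off against the limit, here’s a rough estimator. The bytes-per-pixel figure is an assumption on my part (real JPEG frame sizes vary with scene complexity and the quality setting), so treat this as a sketch, not a guarantee:

```python
# Sketch: estimate MJPEG bandwidth from image size, fps, and an assumed
# compression ratio. The bytes-per-pixel value is a rough assumption --
# actual JPEG sizes depend on scene complexity and the quality setting.

def mjpeg_mbps(width: int, height: int, fps: int, bytes_per_pixel: float) -> float:
    """Approximate MJPEG stream bandwidth in Mbps."""
    frame_bytes = width * height * bytes_per_pixel
    return frame_bytes * fps * 8 / 1_000_000

# 320x240 @ 30 fps, assuming ~0.15 bytes/pixel at heavy compression:
print(round(mjpeg_mbps(320, 240, 30, 0.15), 2))  # 2.76 Mbps, under the 4 Mbps cap
```

Doubling the resolution to 640x480 quadruples the pixel count, which is why dropping image size is the quickest way to claw back bandwidth.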
For the record, 2877 is using Logitech C930e cameras because they have a wider field of view than most other USB cameras (74 degree horizontal).
We are using a Logitech C920 and two LifeCams running various resolutions. The issue is MJPEG, which is going to use a lot of bandwidth. We are using a Jetson as a co-processor and then using GStreamer to split the video for vision tracking and for streaming to the driver station with h.264 encoding. A Raspberry Pi can also be used to do something similar; the key is to stream using something other than MJPEG and keep your frame rates and bitrates sane to avoid too much bandwidth utilization.
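As a concrete illustration of that split, a tee’d GStreamer pipeline along these lines does the job. This is only a sketch, not our actual configuration: the device path, resolution, bitrate, and destination address are all placeholder values you’d tune for your setup.

```shell
# Sketch of a tee'd GStreamer pipeline: one v4l2 camera feeds both a local
# appsink (for the vision code to consume) and a low-bitrate H.264 RTP
# stream to the driver station. /dev/video0, 640x360, the 1000 kbps
# bitrate, and the host/port are placeholder values.
gst-launch-1.0 v4l2src device=/dev/video0 \
    ! video/x-raw,width=640,height=360,framerate=30/1 \
    ! tee name=t \
    t. ! queue ! videoconvert ! appsink name=vision \
    t. ! queue ! videoconvert \
       ! x264enc tune=zerolatency bitrate=1000 \
       ! rtph264pay ! udpsink host=10.0.0.5 port=5800
```

Note that x264enc’s `bitrate` property is in kbit/s, so 1000 here caps the video at roughly 1 Mbps regardless of resolution, which is the big win of H.264 over MJPEG for this use.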