I am having an issue with the OpenSight vision system. I am not sure how to access the video stream from the CameraServer node. I am using a Raspberry Pi 3 with a Microsoft LifeCam HD-3000 USB camera. I have attached a screenshot of my nodetree setup.
How are you attempting to view the stream? The H.264 backend(s) need to be viewed with either the Shuffleboard plugin or something similar (ffmpeg, GStreamer, etc.).
Alternatively, you can set the backend to MJPEG, which will let you view the stream from the hooks menu on the left side of the screen (the puzzle-piece icon); it should appear under opsi.videoio.
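For reference, here is a hedged sketch of viewing an H.264 RTSP stream from the command line with ffmpeg or GStreamer. The address below is a placeholder; substitute whatever URI your CameraServer node actually publishes:

```shell
# Placeholder address -- use the URI from your own setup,
# e.g. something like rtsp://10.TE.AM.XX:1181/camera.
STREAM="rtsp://opensight.local:554/camera"

# Option 1: ffplay (ships with ffmpeg); -fflags nobuffer cuts display latency.
ffplay -fflags nobuffer "$STREAM"

# Option 2: a minimal GStreamer pipeline for an H.264 RTSP source.
gst-launch-1.0 rtspsrc location="$STREAM" latency=0 \
  ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink
```

Both commands only depend on the stream being reachable over the network, so they are a quick way to rule the Shuffleboard plugin in or out.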
I am trying to use the H.264 Shuffleboard plugin. Is there anything I need to do within the robot code (such as setting up CameraServer) for this to work?
No, you shouldn’t need to make any robot code changes.
Check the properties of the widget in shuffleboard (Right click → Edit properties). What’s set as the URI property? For your pipeline, it should be something like rtsp://10.te.am.XX:1181/camera.
The URI property is currently set to rtsp://localhost. Under the NetworkTables menu in Shuffleboard, though, the stream is listed as rtsp://opensight.local:554/camera. When I put that URI into the URI property of the GStreamer widget, I still can’t see the camera stream.
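When a viewer shows nothing at all, it helps to first confirm the RTSP endpoint is even reachable, separating a networking problem (connection refused, timeout) from a decode problem (server answers but no video renders). A minimal Python probe sends an RTSP OPTIONS request and returns the status line; the hostname and port in the example are the ones mentioned in this thread and may differ on your network:

```python
import socket

def rtsp_options(host: str, port: int = 554, path: str = "/camera",
                 timeout: float = 3.0) -> str:
    """Send a bare RTSP OPTIONS request and return the first response line.

    A healthy server replies with something like "RTSP/1.0 200 OK".
    A connection error means the stream address/port is wrong or blocked.
    """
    request = (
        f"OPTIONS rtsp://{host}:{port}{path} RTSP/1.0\r\n"
        "CSeq: 1\r\n"
        "\r\n"
    )
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(request.encode("ascii"))
        reply = sock.recv(1024).decode("ascii", errors="replace")
    return reply.splitlines()[0] if reply else ""

# Example usage (hostnames from this thread; adjust for your setup):
# print(rtsp_options("opensight.local"))          # default RTSP port 554
# print(rtsp_options("10.te.am.11", port=1181))   # if the stream uses 1181
```

If this returns a 200 status but the GStreamer widget still shows nothing, the problem is on the decode side (codec support, plugin configuration) rather than the URI.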
Sorry about the late response. There isn’t an issue with the plugin; I was just using a camera that didn’t support hardware H.264 encoding, so the error was on my part.
The camera I was using (Microsoft LifeCam HD-3000) does not support hardware H.264 encoding, which I believe is why I was unable to view the stream. In a few days I’m going to try a camera that does support hardware H.264 encoding, as well as a different video player for viewing the stream. Will update with my results.
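For anyone who hits the same wall: on the Raspberry Pi you can check whether a USB camera has an onboard H.264 encoder before wiring up a pipeline. This sketch assumes the camera enumerates as /dev/video0 (check `ls /dev/video*` if unsure):

```shell
# List the pixel formats the camera advertises over V4L2.
# A camera with hardware H.264 encoding will include "H264" in the list;
# the LifeCam HD-3000 only offers raw YUYV and compressed MJPG.
v4l2-ctl --device=/dev/video0 --list-formats
```

If H264 is missing from the output, an H.264 backend would have to software-encode, and an MJPEG backend is the more practical choice for that camera.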