Sending OpenCV Output back to the Driver Laptop?

Hi,

We have a mini-PC on our testing robot and we are testing some different vision code. One problem is that in our OpenCV Python code, we simply cannot figure out how to send the final image back to us. We have looked all over the internet to no avail.

We would like to be able to see the output, not only for testing, but eventually we might make a driver-assist computer vision program that points out game pieces and such.

Ideally, we would like to be able to send it to an HTTP page, but we are open to other options. Low latency is also a big requirement for any solution.

Thanks a bunch in advance!

Have you looked at mjpg-streamer?

You could reimplement the WPILib CameraServer protocol (which is pretty easy if you ignore the client-to-server side). Basically, over a TCP connection on port 1180, repeatedly send the following in order:

  • the bytes 1, 0, 0, 0
  • a network-order 32-bit integer with the length in bytes of the frame data (JPEG format)
  • the JPEG frame data

Then you can view the stream on SmartDashboard with the USB Camera widget.
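
For illustration, a rough Python sketch of that server side might look like the following. This is untested, it skips the initial settings ints the client sends (per the "ignore the client-to-server side" suggestion above), and the port and framing follow the list above:

    import socket
    import struct

    import cv2

    MAGIC = bytes([1, 0, 0, 0])  # the 1, 0, 0, 0 bytes sent before every frame
    PORT = 1180

    def serve():
        server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind(("", PORT))
        server.listen(1)

        cap = cv2.VideoCapture(0)
        while True:
            conn, _ = server.accept()
            try:
                while True:
                    ok, frame = cap.read()
                    if not ok:
                        break
                    # ... run your vision processing on `frame` here ...
                    ok, jpeg = cv2.imencode(".jpg", frame)
                    if not ok:
                        continue
                    data = jpeg.tobytes()
                    conn.sendall(MAGIC)
                    conn.sendall(struct.pack(">i", len(data)))  # network-order 32-bit length
                    conn.sendall(data)
            except (BrokenPipeError, ConnectionResetError):
                pass  # dashboard disconnected; wait for the next client
            finally:
                conn.close()

    if __name__ == "__main__":
        serve()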

How could I feed the output of OpenCV into mjpg-streamer?

There’s an OpenCV input plugin that you can use. mjpg-streamer/mjpg-streamer-experimental/plugins/input_opencv at master · robotpy/mjpg-streamer · GitHub

And it supports a Python filter plugin. mjpg-streamer/mjpg-streamer-experimental/plugins/input_opencv/filters/cvfilter_py/README.md at master · robotpy/mjpg-streamer · GitHub
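
As I recall from that README, you drop in a Python module that defines an init_filter() function returning a callable which takes a frame and returns the processed frame. Roughly like this (the exact convention is in the README, so check it against your mjpg-streamer version):

    # my_filter.py -- sketch of a cvfilter_py module for mjpg-streamer
    import cv2

    def filter_frame(img):
        # `img` is a BGR numpy array from the camera; whatever you return
        # is what mjpg-streamer serves as the MJPEG stream
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 50, 150)
        return cv2.cvtColor(edges, cv2.COLOR_GRAY2BGR)

    def init_filter():
        # mjpg-streamer calls this once at startup to get the filter callable
        return filter_frame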

This is exactly what I was looking for! However, I have no idea how to add this plugin to my mjpg-streamer installation on my Linux PC.

If you have the right prerequisites installed (python-devel, numpy, and opencv-devel 3.1), then when you build mjpg-streamer using CMake it should automatically detect the dependencies and build the plugin.

While I personally have used mjpg-streamer for some other projects, I've never used it to serve images from OpenCV. My preferred method has been this, copy-pasted and cut up to be used as its own module. It might be interesting to see if there are any performance differences between doing it yourself and going through mjpg-streamer.
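
For reference, a roll-your-own MJPEG-over-HTTP server in Python looks roughly like this. It's a sketch in the same spirit as the linked code, not that code itself, and it assumes Python 3.7+ for ThreadingHTTPServer:

    from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

    import cv2

    cap = cv2.VideoCapture(0)

    class MJPEGHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # multipart/x-mixed-replace is the standard MJPEG-over-HTTP trick;
            # browsers and most dashboards understand it
            self.send_response(200)
            self.send_header(
                "Content-Type", "multipart/x-mixed-replace; boundary=frame")
            self.end_headers()
            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                ok, jpeg = cv2.imencode(".jpg", frame)
                if not ok:
                    continue
                data = jpeg.tobytes()
                self.wfile.write(b"--frame\r\n")
                self.wfile.write(b"Content-Type: image/jpeg\r\n")
                self.wfile.write(b"Content-Length: %d\r\n\r\n" % len(data))
                self.wfile.write(data)
                self.wfile.write(b"\r\n")

    if __name__ == "__main__":
        # view at http://<robot-ip>:8080/ in a browser
        ThreadingHTTPServer(("", 8080), MJPEGHandler).serve_forever()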

The code you linked to is fairly inefficient in a couple of different ways (in particular, it doesn't reuse image buffers), so I'm sure that out of the box mjpg-streamer would beat the pants off it, particularly if you ran it on a RoboRIO.

After optimization, I suspect the mjpg-streamer version would still end up winning.