Streaming Video To Dashboard From Raspberry Pi

Hello, so I am trying to set up a Raspberry Pi 3 Model B to use as a vision coprocessor this season. I installed Raspbian and followed the steps here to set up the vision project for a LifeCam 3000 plugged into the Pi via USB. I am able to successfully run the code generated by GRIP and send the values to NetworkTables; however, I haven’t been able to get the camera stream to work. The example code from the WPILib vision library supposedly enables this: it creates a CvSource and feeds that into an MjpegServer, which, from what I’ve read, should let you drop frames into the CvSource and have them sent to the SmartDashboard. At first I thought the server port might be blocked, but I tried using a different radio without the FMS firewall enabled and it still won’t work. Is there a step I’m missing in the code, or is there something I have to configure on the Pi to enable it to host a stream?

The only reason I really need the stream is for tuning the GRIP project, so if there were a way to view the output of the script on a monitor connected to the Pi, that would work too.

Here’s the code that’s running on the Pi:

import java.util.ArrayList;

import edu.wpi.first.wpilibj.networktables.*;
import edu.wpi.first.wpilibj.tables.*;
import edu.wpi.cscore.*;
import org.opencv.core.Mat;
import org.opencv.imgproc.Imgproc;
import org.opencv.core.*;
import org.opencv.core.Core.*;
import org.opencv.features2d.FeatureDetector;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.imgproc.*;
import org.opencv.objdetect.*;

public class Main {
  public static void main(String[] args) {
    // Loads our OpenCV library. This MUST be included
    System.loadLibrary("opencv_java310");

    // Connect NetworkTables, and get access to the publishing table
    NetworkTable.setClientMode();
    // Set your team number here
    NetworkTable.setTeam(5102);

    NetworkTable.initialize();


    // This is the network port you want to stream the raw received image to
    // By rules, this has to be between 1180 and 1190, so 1185 is a good choice
    int streamPort = 1185;

    // This stores our reference to our mjpeg server for streaming the input image
    MjpegServer inputStream = new MjpegServer("MJPEG Server", streamPort);

    // Selecting a Camera
    // Uncomment one of the 2 following camera options
    // The top one receives a stream from another device, and performs operations based on that
    // On Windows, this one must be used since USB is not supported
    // The bottom one opens a USB camera, and performs operations on that, along with streaming
    // the input image so other devices can see it.

    // HTTP Camera
    /*
    // This is our camera name from the robot. This can be set in your robot code with the following command
    // CameraServer.getInstance().startAutomaticCapture("YourCameraNameHere");
    // "USB Camera 0" is the default if no string is specified
    String cameraName = "USB Camera 0";
    HttpCamera camera = setHttpCamera(cameraName, inputStream);
    // It is possible for the camera to be null. If it is, that means no camera could
    // be found via NetworkTables. In that case, create an HttpCamera with an explicit stream URL
    // Note if this happens, no restream will be created
    if (camera == null) {
      camera = new HttpCamera("CoprocessorCamera", "YourURLHere");
      inputStream.setSource(camera);
    }
    */

    /***********************************************/

    // USB Camera
    
    // This gets the image from a USB camera 
    // Usually this will be on device 0, but there are other overloads
    // that can be used
    UsbCamera camera = setUsbCamera(0, inputStream);
    // Set the resolution for our camera, since this is over USB
    camera.setResolution(640,480);
    camera.setExposureManual(0);

    GripPipeline pipeline = new GripPipeline();

    // This creates a CvSink for us to use. This grabs images from our selected camera, 
    // and will allow us to use those images in opencv
    CvSink imageSink = new CvSink("CV Image Grabber");
    imageSink.setSource(camera);

    // This creates a CvSource to use. This will take in a Mat image that has had
    // OpenCV operations applied to it
    CvSource imageSource = new CvSource("CV Image Source", VideoMode.PixelFormat.kMJPEG, 640, 480, 30);
    MjpegServer cvStream = new MjpegServer("CV Image Stream", 1186);
    cvStream.setSource(imageSource);

    // All Mats and Lists should be stored outside the loop to avoid allocations
    // as they are expensive to create
    Mat inputImage = new Mat();
    Mat hsv = new Mat();

    NetworkTable targets = NetworkTable.getTable("GRIP/targets");

    // Infinitely process image
    while (true) {
      // Grab a frame. If it has a frame time of 0, there was an error.
      // Just skip and continue
      long frameTime = imageSink.grabFrame(inputImage);
      if (frameTime == 0) continue;

      /*
      // Below is where you would do your OpenCV operations on the provided image
      // The sample below just converts the image to the HSV color space
      Imgproc.cvtColor(inputImage, hsv, Imgproc.COLOR_BGR2HSV);
      */

      // Here is where you would write a processed image that you want to restream
      // This will most likely be a marked-up image of what the camera sees
      // For now, we are just streaming the unprocessed input image

      /*
      pipeline.process(inputImage);

      String point = pipeline.findContoursOutput().toString();

      targets.putString("test", point);
      */

      imageSource.putFrame(inputImage);
    }
  }

  private static HttpCamera setHttpCamera(String cameraName, MjpegServer server) {
    // Start by grabbing the camera from NetworkTables
    NetworkTable publishingTable = NetworkTable.getTable("CameraPublisher");
    // Wait for robot to connect. Allow this to be attempted indefinitely
    while (true) {
      try {
        if (publishingTable.getSubTables().size() > 0) {
          break;
        }
        Thread.sleep(500);
      } catch (Exception e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
      }
    }


    HttpCamera camera = null;
    if (!publishingTable.containsSubTable(cameraName)) {
      return null;
    }
    ITable cameraTable = publishingTable.getSubTable(cameraName);
    String[] urls = cameraTable.getStringArray("streams", null);
    if (urls == null) {
      return null;
    }
    ArrayList<String> fixedUrls = new ArrayList<String>();
    for (String url : urls) {
      if (url.startsWith("mjpg")) {
        fixedUrls.add(url.split(":", 2)[1]);
      }
    }
    camera = new HttpCamera("CoprocessorCamera", fixedUrls.toArray(new String[0]));
    server.setSource(camera);
    return camera;
  }

  private static UsbCamera setUsbCamera(int cameraId, MjpegServer server) {
    // This gets the image from a USB camera 
    // Usually this will be on device 0, but there are other overloads
    // that can be used
    UsbCamera camera = new UsbCamera("CoprocessorCamera", cameraId);
    server.setSource(camera);
    return camera;
  }
}

I’m not seeing anything obviously wrong here (it’s pretty much a more complex Java version of https://github.com/wpilibsuite/cscore/blob/master/examples/usbcvstream/usbcvstream.cpp). Are you getting any errors running this code? When you say you can’t get the stream to work, do you mean you can’t connect to port 1185 with a web browser, or that there are no images when you do?

One thing you could try is enabling logging:


CameraServerJNI.setLogger((level, file, line, msg) -> System.out.println("CS:" + file + ":" + line + ": " + msg), 0);
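
Since ICMP (ping) and TCP are often filtered differently, it’s also worth checking whether a plain TCP connection to the stream ports actually goes through. Here’s a quick standalone probe you could run from the DS laptop — no WPILib needed; the hostname is a placeholder for your Pi’s actual address:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortProbe {
  // Returns true if a TCP connection to host:port succeeds within timeoutMs.
  // Any failure (refused, timed out, unresolvable host) returns false.
  public static boolean canConnect(String host, int port, int timeoutMs) {
    try (Socket s = new Socket()) {
      s.connect(new InetSocketAddress(host, port), timeoutMs);
      return true;
    } catch (IOException e) {
      return false;
    }
  }

  public static void main(String[] args) {
    // 1185 is the raw restream port and 1186 the CV restream port
    // from the code above; "raspberrypi.local" is a placeholder
    for (int port : new int[] {1185, 1186}) {
      System.out.println("port " + port + " reachable: "
          + canConnect("raspberrypi.local", port, 2000));
    }
  }
}
```

If this prints true but the dashboard still shows nothing, the problem is in the stream itself rather than the network; if it prints false, I’d look at the Pi’s network configuration or firewall first.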

No, I don’t get any errors. I haven’t tried connecting straight from a web browser, but I have put the IP and port into the SmartDashboard CameraServer widget and haven’t gotten anything. I tried pinging the Pi’s IP from the DS laptop (and also my coding laptop); it resolves the IP, but then the ping requests time out, so I wonder if it has something to do with the network configuration. I can ping the roboRIO just fine, though. It’s like the Pi can push data to NetworkTables without any problems, but other devices aren’t able to initiate a connection with it.
I’ll try enabling logging next time I’m at our build room and see if that gives any hints about what’s happening.
Thanks for taking the time to help me out!