Setting up NetworkTables on an RPi

I’m trying to get VisionBuildSamples working on the Pi. I’m able to use the Pi camera with GRIP running on my laptop, build the generated code with gradlew build, and run it on the Pi.

However, Generate Code does NOT generate anything for NetworkTables. Main has some NT initialization code, so I assume I need to add code to Main that takes, say, filterContoursOutput and pushes its values into NT entries.

However, the NT code in Main doesn’t make sense to me, and when I try to splice in the client example from Creating a client-side program I get errors.

So, if someone has this working or has a solution, how about some help!

Thanks.

You are correct that the GRIP pipeline does not contain NT code. The pipeline only processes the image; it is up to you to use the data the pipeline outputs. If you post your code it will be a lot easier to help you.
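For example, here is a minimal sketch of pushing contour data to NT with the 2017-era NetworkTable API the sample uses (untested; the table name GRIP/myContoursReport is a placeholder, and it assumes the pipeline has a filterContoursOutput() step):

import java.util.ArrayList;

import edu.wpi.first.wpilibj.networktables.NetworkTable;
import org.opencv.core.MatOfPoint;
import org.opencv.core.Rect;
import org.opencv.imgproc.Imgproc;

// Fetch the table once, outside the processing loop
NetworkTable table = NetworkTable.getTable("GRIP/myContoursReport"); // placeholder name

// Then, each frame, after pipeline.process(inputImage):
ArrayList<MatOfPoint> contours = pipeline.filterContoursOutput();
double[] centerX = new double[contours.size()];
double[] centerY = new double[contours.size()];
for (int i = 0; i < contours.size(); i++) {
  Rect r = Imgproc.boundingRect(contours.get(i)); // bounding box of each contour
  centerX[i] = r.x + r.width / 2.0;
  centerY[i] = r.y + r.height / 2.0;
}
table.putNumberArray("centerX", centerX);
table.putNumberArray("centerY", centerY);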

Could you describe the specific problem? An output, or specific error messages? Have you installed pynetworktables via pip, or did you compile it manually? What version of Python are you running? Does the code work locally (without pynetworktables)?

I really have just made a few changes to Main.java to get the pipeline running and to check it.

import java.util.ArrayList;

import edu.wpi.first.wpilibj.networktables.*;
import edu.wpi.first.wpilibj.tables.*;
import edu.wpi.cscore.*;
import org.opencv.core.Mat;
import org.opencv.imgproc.Imgproc;

public class Main {
	// -jch- Instantiate the Grip Pipeline as gpl
	static GripPipeline gpl = new GripPipeline();

  public static void main(String[] args) {

    // Loads our OpenCV library. This MUST be included
    System.loadLibrary("opencv_java310");

    // Connect NetworkTables, and get access to the publishing table
    NetworkTable.setClientMode();
    // Set your team number here
    NetworkTable.setTeam(86);

    NetworkTable.initialize();


    // This is the network port you want to stream the raw received image to
    // By rules, this has to be between 1180 and 1190, so 1185 is a good choice
    int streamPort = 1185;

    // This stores our reference to our mjpeg server for streaming the input image
    MjpegServer inputStream = new MjpegServer("MJPEG Server", streamPort);

    // Selecting a Camera
    // Uncomment one of the 2 following camera options
    // The top one receives a stream from another device, and performs operations based on that
    // On windows, this one must be used since USB is not supported
    // The bottom one opens a USB camera, and performs operations on that, along with streaming
    // the input image so other devices can see it.

    // HTTP Camera
	/*
    // This is our camera name from the robot. This can be set in your robot code with the following command:
    // CameraServer.getInstance().startAutomaticCapture("YourCameraNameHere");
    // "USB Camera 0" is the default if no string is specified
    String cameraName = "USB Camera 0";
    HttpCamera camera = setHttpCamera(cameraName, inputStream);
    // It is possible for the camera to be null. If it is, that means no camera could
    // be found using NetworkTables to connect to. Create an HttpCamera by giving a specified stream
    // Note if this happens, no restream will be created
    if (camera == null) {
      camera = new HttpCamera("CoprocessorCamera", "YourURLHere");
      inputStream.setSource(camera);
    }
    */
    
      
    /***********************************************/

    // USB Camera
    
    // This gets the image from a USB camera 
    // Usually this will be on device 0, but there are other overloads
    // that can be used
    UsbCamera camera = setUsbCamera(0, inputStream);
    // Set the resolution for our camera, since this is over USB
    camera.setResolution(320,240);  //was 640,480
    

    // This creates a CvSink for us to use. This grabs images from our selected camera, 
    // and will allow us to use those images in opencv
    CvSink imageSink = new CvSink("CV Image Grabber");
    imageSink.setSource(camera);

    // This creates a CvSource to use. This will take in a Mat image that has had OpenCV
    // operations applied to it
    CvSource imageSource = new CvSource("CV Image Source", VideoMode.PixelFormat.kMJPEG, 320, 240, 30);
    MjpegServer cvStream = new MjpegServer("CV Image Stream", 1186);
    cvStream.setSource(imageSource);

    // All Mats and Lists should be stored outside the loop to avoid allocations
    // as they are expensive to create
    Mat inputImage = new Mat();
    Mat hsv = new Mat();

    // Infinitely process image
    while (true) {
      // Grab a frame. If it has a frame time of 0, there was an error.
      // Just skip and continue
      long frameTime = imageSink.grabFrame(inputImage);
      if (frameTime == 0) continue;

      // Below is where you would do your OpenCV operations on the provided image
      // The sample below just changes color source to HSV
	  // -org- Imgproc.cvtColor(inputImage, hsv, Imgproc.COLOR_BGR2HSV);

	  // -jch- Call to the Grip Pipeline.  
	  // Stores results in Global vars, hsvThresholdOutput, ...
	  gpl.process(inputImage);
	
      // Here is where you would write a processed image that you want to restream
      // This will most likely be a marked up image of what the camera sees
      // For now, we are just going to stream the HSV image
	  // -org- imageSource.putFrame(hsv);

	  // -jch- Call to the Grip output.
	  imageSource.putFrame(gpl.hsvThresholdOutput());
	//  imageSource.putFrame(gpl.hslThresholdOutput());
	//  imageSource.putFrame(gpl.rgbThresholdOutput());
	//  imageSource.putFrame(gpl.tstInputOutput());
    }

  }  //End Main Method  -----------

  private static HttpCamera setHttpCamera(String cameraName, MjpegServer server) {
	// Start by grabbing the camera from NetworkTables
	NetworkTable publishingTable = NetworkTable.getTable("CameraPublisher");
	// Wait for robot to connect. Allow this to be attempted indefinitely
	while (true) {
	  try {
		if (publishingTable.getSubTables().size() > 0) {
		  break;
		}
		Thread.sleep(500);
	  } catch (Exception e) {
		// Keep waiting; just log the interruption
		e.printStackTrace();
	  }
	}


	HttpCamera camera = null;
	if (!publishingTable.containsSubTable(cameraName)) {
	  return null;
	}
	ITable cameraTable = publishingTable.getSubTable(cameraName);
	String[] urls = cameraTable.getStringArray("streams", null);
	if (urls == null) {
	  return null;
	}
	ArrayList<String> fixedUrls = new ArrayList<String>();
	for (String url : urls) {
	  if (url.startsWith("mjpg")) {
		fixedUrls.add(url.split(":", 2)[1]);
	  }
	}
	camera = new HttpCamera("CoprocessorCamera", fixedUrls.toArray(new String[0]));
	server.setSource(camera);
	return camera;
  }

  private static UsbCamera setUsbCamera(int cameraId, MjpegServer server) {
	// This gets the image from a USB camera 
	// Usually this will be on device 0, but there are other overloads
	// that can be used
	UsbCamera camera = new UsbCamera("CoprocessorCamera", cameraId);
	server.setSource(camera);
	return camera;
  }
}
//--------------------------------------------  Main  ---------------------------------------------------------

The first confusion is near the top: NetworkTable.xyz is a class(?) reference to a method? It compiles so I guess it’s legit, but what object is it referencing? How do I add tables and entries? I figure I’ll need to get gpl.filterContoursOutput and store some values to the table entries, but how?

I tried following the Creating a client-side program example (https://wpilib.screenstepslive.com/s/currentCS/m/75361/l/851714-creating-a-client-side-program) but got errors.

Finally, I don’t have a roboRIO. The idea is to create, test, and develop without one, then attach to the bot with NetworkTables. How do I confirm the NetworkTables are working with, say, OutlineViewer?

Still trying stuff.

If you don’t have a roboRIO, your client-side NetworkTables program can’t work.
Try creating a server-side program and then running OutlineViewer against the RPi’s IP address.
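With the newer API that is only a couple of lines (a rough sketch; the table and entry names are placeholders):

import edu.wpi.first.networktables.NetworkTableInstance;

NetworkTableInstance inst = NetworkTableInstance.getDefault();
inst.startServer(); // the RPi acts as the NT server in place of a roboRIO
inst.getTable("vision").getEntry("centerX").setDouble(160.0); // placeholder value

Then run OutlineViewer on the laptop in client mode, pointed at the Pi.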

Before I get into the weeds here: you should not be using the deprecated NetworkTable class; it only exists for backwards compatibility with old programs. Use the NetworkTableInstance class instead.

Those are static methods that belong to the NetworkTable class, not an object.

How do I add tables and entries? I figure I’ll need to get gpl.filterContoursOutput and store some values to the table entries, but how?

Use the NetworkTableInstance class to get the tables and entries you want. The screensteps page you linked is pretty clear on how to use it.
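Roughly, with the newer API (a sketch; the "vision" and "centerX" names are placeholders for whatever you want to publish from filterContoursOutput):

import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableEntry;
import edu.wpi.first.networktables.NetworkTableInstance;

NetworkTableInstance inst = NetworkTableInstance.getDefault();
inst.startClientTeam(86); // replaces NetworkTable.setClientMode()/setTeam()/initialize()
NetworkTable table = inst.getTable("vision"); // placeholder table name
NetworkTableEntry centerX = table.getEntry("centerX"); // placeholder entry name
centerX.setDoubleArray(new double[] { 160.0, 42.0 }); // e.g. values computed from the contours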

I tried following the Creating a client-side program example but got errors.

Without knowing the errors, I can’t help with this.

Finally, I don’t have a roboRIO. The idea is to create, test, and develop without one, then attach to the bot with NetworkTables. How do I confirm the NetworkTables are working with, say, OutlineViewer?

Run OutlineViewer in server mode and have the client-side program connect to the computer running OutlineViewer.
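In code that looks something like this (10.0.0.5 is a stand-in for your laptop’s address):

import edu.wpi.first.networktables.NetworkTableInstance;

NetworkTableInstance inst = NetworkTableInstance.getDefault();
inst.startClient("10.0.0.5"); // IP of the laptop running OutlineViewer in server mode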


See above.

The roboRIO is always the server. It’s better to spoof the server with OutlineViewer and have the client application connect to it.

Agreed, but that’s what’s in the FIRST example.

Hmm, OK. Static: one occurrence, multiple references. I’ve just never seen it used that way before.

Now we’re in the weeds. The NetworkTables imports in the Vision example don’t match the imports for the client example, so I assume the dependencies.gradle for ntcore needs to be modified as well. Here are my efforts so far:

ext.ntcoreDep = { ->
    def classifier
    if (buildType == "windows") {
      classifier = "windows2015"
    } else if (buildType == "linux") {
      classifier = "desktop"
    } else {
      classifier = buildType
    }
    return "edu.wpi.first.wpilib.networktables.java:NetworkTables:+:$classifier"	// -org-
//  return "edu.wpi.first.networktables.java:NetworkTable:+:$classifier"
//  return "edu.wpi.first.networktables.NetworkTable.java:$classifier"
}

The errors are usually “could not resolve dependencies” or “cannot find symbol”, depending on the changes.

Any help would be appreciated. Thanks.
This is soooo much fun … NOT! :yikes:

What FIRST example? The one you linked earlier is the correct one, but the code snippets you’re posting are clearly following the old library examples. I suspect you’re getting confused with the old 2014 screensteps archived examples (which only exist as a reference for developing for the old cRIO control system).

You are also using the old, deprecated version of NetworkTables in your buildscript. Obviously, that will prevent you from using features and examples from the past few years.
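For reference, the newer coordinates look something like this (a sketch; check the WPILib Maven repository for the exact artifact name and version):

ext.ntcoreDep = { ->
    // 2018-era NetworkTables artifact; the Java jar does not need a platform classifier
    return "edu.wpi.first.ntcore:ntcore-java:+"
}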

I am confused … understatement for 2018. Hopefully it will be a HAPPY NEW YEAR! All the code I am referencing comes from the current screensteps VisionBuildSamples page, https://wpilib.screenstepslive.com/s/currentCS/m/vision/l/682949-vision-processing-on-an-arm-coprocessor, which links to Release 2 of the Java Build System at wpilibsuite/VisionBuildSamples on GitHub, where I downloaded the Sample Code (zip). However (full disclosure), there is a note: “:exclamation: Note: This currently does not work as of 10/9/17. We will try and push a new release as soon as we can. For now, instead just clone the newest version of master.” But this is the sample.

So, is there another Sample?

In the meantime:
After a million or so various combinations (even a blind dog finds a bone once in a while) I got a build. I changed this in dependencies.gradle:
// return "edu.wpi.first.wpilib.networktables.java:NetworkTables:+:$classifier"	// -org-
return "edu.wpi.first.ntcore:ntcore-java:+:"	// $classifier

I stated I didn’t know Gradle :blush:. I also changed the imports in Main, copied the NetworkTablesDesktopClient example into a file named NetworkTablesDesktopClient.java, then added a line to Main: static NetworkTablesDesktopClient ntdtClient = new NetworkTablesDesktopClient();
and it compiles now! I still don’t know how to tie the GRIP output to the network table, but I am getting closer (I think).

HOWEVER, if there is a better sample to follow, I am all :eyes:.

Thanks

There’s a new Raspberry Pi 3 image built by WPILib that has networktables, cscore, and cameraserver installed. Check out the thread by @Peter_Johnson here: Kickoff Release of WPILib FRCVision Raspberry Pi image

It also contains an example Java program that you can download via the webdash - just connect to the Pi in a browser and you’ll be able to download the Gradle project and dependencies. Just generate the GRIP pipeline file into the src/main/java directory and update the Main class to run the pipeline.
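Wiring the generated pipeline in looks roughly like this (a sketch based on the example’s use of VisionThread; the camera variable and the NetworkTables names are placeholders, and it assumes GRIP generated the pipeline as a VisionPipeline):

import edu.wpi.first.networktables.NetworkTableInstance;
import edu.wpi.first.vision.VisionThread;

// camera is the VideoSource the example already sets up
VisionThread visionThread = new VisionThread(camera, new GripPipeline(), pipeline -> {
  // Called after each frame is processed; publish whatever the pipeline found
  NetworkTableInstance.getDefault()
      .getTable("vision") // placeholder table name
      .getEntry("contourCount")
      .setDouble(pipeline.filterContoursOutput().size());
});
visionThread.start();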

:laughing: It is already a Happy New Year!

I look forward to getting the new code working. However, I will miss the calluses from beating my head against the wall trying to get the old one working. :smile: :smile: :smile:

I’ll just consider it a learning opportunity.

Thanks.