  #1
05-02-2017, 18:11
thecoopster20
4th Year Programmer - Java
FRC #3602 (Robomos)
Team Role: Programmer
 
Join Date: Mar 2016
Rookie Year: 2014
Location: Escanaba, MI
Posts: 33
Help Importing GRIP Pipeline Into Gradle

So I have my GRIP pipeline code generated, and I have the basic Gradle file set up to grab the stream of the camera I want from the RIO. How do I go about combining the GRIP code with the Gradle project and then outputting everything to NetworkTables? This is where the ScreenSteps docs don't really have any info. Does someone have some basic example code I can look at?
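
Just to show the NetworkTables half of what I'm picturing on the coprocessor side, something like this (untested sketch; the team number, table name, and value are just placeholders):

Code:
import edu.wpi.first.wpilibj.networktables.NetworkTable;

public class NtSketch {
  public static void main(String[] args) {
    // Run as a NetworkTables client and locate the robot by team number
    NetworkTable.setClientMode();
    NetworkTable.setTeam(3602);
    NetworkTable.initialize();

    // Publish one pipeline result (placeholder value here) for the robot code to read
    NetworkTable table = NetworkTable.getTable("GRIP");
    table.putNumber("centerX", 123.0);
  }
}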

Last edited by thecoopster20 : 05-02-2017 at 18:14.
  #2
06-02-2017, 21:21
thecoopster20
Re: Help Importing GRIP Pipeline Into Gradle

Here's what I have so far. Would this work?

Code:
import java.util.ArrayList;
import java.util.List;

import edu.wpi.first.wpilibj.networktables.*;
import edu.wpi.first.wpilibj.tables.*;
import edu.wpi.cscore.*;

import org.opencv.core.Core;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.MatOfInt;
import org.opencv.core.MatOfPoint;
import org.opencv.core.MatOfPoint2f;
import org.opencv.core.Rect;
import org.opencv.core.Scalar;
import org.opencv.imgproc.Imgproc;

public class Main {
  public static void main(String[] args) {
    // Loads our OpenCV library. This MUST be included
    System.loadLibrary("opencv_java310");

    // Connect NetworkTables, and get access to the publishing table
    NetworkTable.setClientMode();
    // Set your team number here
    NetworkTable.setTeam(3602);

    NetworkTable.initialize();


    // This is the network port you want to stream the raw received image to
    // By rule, this has to be between 1180 and 1190; 1183 is used here
    int streamPort = 1183;

    // This stores our reference to our mjpeg server for streaming the input image
    MjpegServer inputStream = new MjpegServer("MJPEG Server", streamPort);

    // Selecting a Camera
    // Uncomment one of the 2 following camera options
    // The top one receives a stream from another device and performs operations based on that
    // On Windows, this one must be used since USB is not supported
    // The bottom one opens a USB camera, and performs operations on that, along with streaming
    // the input image so other devices can see it.

    // HTTP Camera
    
    // This is our camera name from the robot. This can be set in your robot code with the following command:
    // CameraServer.getInstance().startAutomaticCapture("YourCameraNameHere");
    // "USB Camera 0" is the default if no string is specified
    String cameraName = "Switcher";
    HttpCamera camera = setHttpCamera(cameraName, inputStream);
    // It is possible for the camera to be null. If it is, no camera could be found
    // over NetworkTables to connect to, so fall back to creating an HttpCamera from a
    // specified stream URL and restream that instead
    if (camera == null) {
      camera = new HttpCamera("CoprocessorCamera", "YourURLHere");
      inputStream.setSource(camera);
    }
    
    
      

    /***********************************************/

    // USB Camera
    /*
    // This gets the image from a USB camera 
    // Usually this will be on device 0, but there are other overloads
    // that can be used
    UsbCamera camera = setUsbCamera(0, inputStream);
    // Set the resolution for our camera, since this is over USB
    camera.setResolution(640,480);
    */

    // This creates a CvSink for us to use. It grabs images from our selected camera
    // and allows us to use those images in OpenCV
    CvSink imageSink = new CvSink("CV Image Grabber");
    imageSink.setSource(camera);

    // This creates a CvSource to use. It takes in a Mat image that has had OpenCV
    // operations performed on it and serves it on the MJPEG stream below
    CvSource imageSource = new CvSource("CV Image Source", VideoMode.PixelFormat.kMJPEG, 640, 480, 30);
    MjpegServer cvStream = new MjpegServer("CV Image Stream", 1186);
    cvStream.setSource(imageSource);

    // All Mats and Lists should be stored outside the loop to avoid allocations
    // as they are expensive to create
    Mat inputImage = new Mat();
    Mat outputImage = new Mat();

    // Infinitely process images
    while (true) {
      // Grab a frame. If it has a frame time of 0, there was an error.
      // Just skip and continue
      long frameTime = imageSink.grabFrame(inputImage);
      if (frameTime == 0) continue;

      // Run the GRIP-generated pipeline on the frame. This also publishes the
      // centerX of the first convex hull to the "GRIP" table (see process() below)
      process(inputImage);

      // Restream the thresholded output so it can be viewed for debugging. It is
      // converted to BGR first since the threshold output is a single-channel binary image
      Imgproc.cvtColor(rgbThresholdOutput(), outputImage, Imgproc.COLOR_GRAY2BGR);
      imageSource.putFrame(outputImage);
    }
  }

  private static HttpCamera setHttpCamera(String cameraName, MjpegServer server) {
    // Start by grabbing the camera from NetworkTables
    NetworkTable publishingTable = NetworkTable.getTable("CameraPublisher");
    // Wait for robot to connect. Allow this to be attempted indefinitely
    while (true) {
      try {
        if (publishingTable.getSubTables().size() > 0) {
          break;
        }
        Thread.sleep(500);
      } catch (Exception e) {
        // Keep waiting; just log any interruption
        e.printStackTrace();
      }
    }


    HttpCamera camera = null;
    if (!publishingTable.containsSubTable(cameraName)) {
      return null;
    }
    ITable cameraTable = publishingTable.getSubTable(cameraName);
    String[] urls = cameraTable.getStringArray("streams", null);
    if (urls == null) {
      return null;
    }
    ArrayList<String> fixedUrls = new ArrayList<String>();
    for (String url : urls) {
      if (url.startsWith("mjpg")) {
        fixedUrls.add(url.split(":", 2)[1]);
      }
    }
    camera = new HttpCamera("CoprocessorCamera", fixedUrls.toArray(new String[0]));
    server.setSource(camera);
    return camera;
  }

  private static UsbCamera setUsbCamera(int cameraId, MjpegServer server) {
    // This gets the image from a USB camera 
    // Usually this will be on device 0, but there are other overloads
    // that can be used
    UsbCamera camera = new UsbCamera("CoprocessorCamera", cameraId);
    server.setSource(camera);
    return camera;
  }


  private static double centerX = 0.0;

//Outputs
  private static Mat rgbThresholdOutput = new Mat();
  private static ArrayList<MatOfPoint> findContoursOutput = new ArrayList<MatOfPoint>();
  private static ArrayList<MatOfPoint> filterContoursOutput = new ArrayList<MatOfPoint>();
  private static ArrayList<MatOfPoint> convexHullsOutput = new ArrayList<MatOfPoint>();


  /**
   * This is the primary method that runs the entire pipeline and updates the outputs.
   */
  public static void process(Mat source0) {
	// Step RGB_Threshold0:
	Mat rgbThresholdInput = source0;
	double[] rgbThresholdRed = {213.70056497175142, 255.0};
	double[] rgbThresholdGreen = {216.10169491525423, 255.0};
	double[] rgbThresholdBlue = {213.7005649717514, 255.0};
	rgbThreshold(rgbThresholdInput, rgbThresholdRed, rgbThresholdGreen, rgbThresholdBlue, rgbThresholdOutput);

	// Step Find_Contours0:
	Mat findContoursInput = rgbThresholdOutput;
	boolean findContoursExternalOnly = true;
	findContours(findContoursInput, findContoursExternalOnly, findContoursOutput);

	// Step Filter_Contours0:
	ArrayList<MatOfPoint> filterContoursContours = findContoursOutput;
	double filterContoursMinArea = 50.0;
	double filterContoursMinPerimeter = 30.0;
	double filterContoursMinWidth = 0;
	double filterContoursMaxWidth = 30.0;
	double filterContoursMinHeight = 0;
	double filterContoursMaxHeight = 50.0;
	double[] filterContoursSolidity = {86.64103200198461, 100.0};
	double filterContoursMaxVertices = 1000000;
	double filterContoursMinVertices = 0;
	double filterContoursMinRatio = 0.0;
	double filterContoursMaxRatio = 1.0;
	filterContours(filterContoursContours, filterContoursMinArea, filterContoursMinPerimeter, filterContoursMinWidth, filterContoursMaxWidth, filterContoursMinHeight, filterContoursMaxHeight, filterContoursSolidity, filterContoursMaxVertices, filterContoursMinVertices, filterContoursMinRatio, filterContoursMaxRatio, filterContoursOutput);

	// Step Convex_Hulls0:
	ArrayList<MatOfPoint> convexHullsContours = filterContoursOutput;
	convexHulls(convexHullsContours, convexHullsOutput);
	
	// Publish the horizontal center of the first convex hull to NetworkTables
	if (!convexHullsOutput.isEmpty()) {
		Rect r = Imgproc.boundingRect(convexHullsOutput.get(0));
		centerX = r.x + (r.width / 2.0);
		NetworkTable.getTable("GRIP").putNumber("centerX", centerX);
	}

  }

  /**
   * This method is a generated getter for the output of a RGB_Threshold.
   * @return Mat output from RGB_Threshold.
   */
  public static Mat rgbThresholdOutput() {
	return rgbThresholdOutput;
  }

  /**
   * This method is a generated getter for the output of a Find_Contours.
   * @return ArrayList<MatOfPoint> output from Find_Contours.
   */
  public static ArrayList<MatOfPoint> findContoursOutput() {
	return findContoursOutput;
  }

/**
 * This method is a generated getter for the output of a Filter_Contours.
 * @return ArrayList<MatOfPoint> output from Filter_Contours.
 */
public static ArrayList<MatOfPoint> filterContoursOutput() {
	return filterContoursOutput;
}

/**
 * This method is a generated getter for the output of a Convex_Hulls.
 * @return ArrayList<MatOfPoint> output from Convex_Hulls.
 */
public static ArrayList<MatOfPoint> convexHullsOutput() {
	return convexHullsOutput;
}


/**
 * Segment an image based on color ranges.
 * @param input The image on which to perform the RGB threshold.
 * @param red The min and max red.
 * @param green The min and max green.
 * @param blue The min and max blue.
 * @param output The image in which to store the output.
 */
private static void rgbThreshold(Mat input, double[] red, double[] green, double[] blue,
	Mat out) {
	Imgproc.cvtColor(input, out, Imgproc.COLOR_BGR2RGB);
	Core.inRange(out, new Scalar(red[0], green[0], blue[0]),
		new Scalar(red[1], green[1], blue[1]), out);
}

/**
 * Finds contours in a binary image.
 * @param input The binary image in which to find contours.
 * @param externalOnly Whether to find only the outermost contours.
 * @param contours The list in which to store the found contours.
 */
private static void findContours(Mat input, boolean externalOnly,
	List<MatOfPoint> contours) {
	Mat hierarchy = new Mat();
	contours.clear();
	int mode;
	if (externalOnly) {
		mode = Imgproc.RETR_EXTERNAL;
	}
	else {
		mode = Imgproc.RETR_LIST;
	}
	int method = Imgproc.CHAIN_APPROX_SIMPLE;
	Imgproc.findContours(input, contours, hierarchy, mode, method);
}


/**
 * Filters out contours that do not meet certain criteria.
 * @param inputContours is the input list of contours
 * @param output is the output list of contours
 * @param minArea is the minimum area of a contour that will be kept
 * @param minPerimeter is the minimum perimeter of a contour that will be kept
 * @param minWidth minimum width of a contour
 * @param maxWidth maximum width
 * @param minHeight minimum height
 * @param maxHeight maximum height
 * @param solidity the minimum and maximum solidity of a contour
 * @param minVertexCount minimum vertex Count of the contours
 * @param maxVertexCount maximum vertex Count
 * @param minRatio minimum ratio of width to height
 * @param maxRatio maximum ratio of width to height
 */
private static void filterContours(List<MatOfPoint> inputContours, double minArea,
	double minPerimeter, double minWidth, double maxWidth, double minHeight, double
	maxHeight, double[] solidity, double maxVertexCount, double minVertexCount, double
	minRatio, double maxRatio, List<MatOfPoint> output) {
	final MatOfInt hull = new MatOfInt();
	output.clear();
	//operation
	for (int i = 0; i < inputContours.size(); i++) {
		final MatOfPoint contour = inputContours.get(i);
		final Rect bb = Imgproc.boundingRect(contour);
		if (bb.width < minWidth || bb.width > maxWidth) continue;
		if (bb.height < minHeight || bb.height > maxHeight) continue;
		final double area = Imgproc.contourArea(contour);
		if (area < minArea) continue;
		if (Imgproc.arcLength(new MatOfPoint2f(contour.toArray()), true) < minPerimeter) continue;
		Imgproc.convexHull(contour, hull);
		MatOfPoint mopHull = new MatOfPoint();
		mopHull.create((int) hull.size().height, 1, CvType.CV_32SC2);
		for (int j = 0; j < hull.size().height; j++) {
			int index = (int)hull.get(j, 0)[0];
			double[] point = new double[] { contour.get(index, 0)[0], contour.get(index, 0)[1]};
			mopHull.put(j, 0, point);
		}
		final double solid = 100 * area / Imgproc.contourArea(mopHull);
		if (solid < solidity[0] || solid > solidity[1]) continue;
		if (contour.rows() < minVertexCount || contour.rows() > maxVertexCount)	continue;
		final double ratio = bb.width / (double)bb.height;
		if (ratio < minRatio || ratio > maxRatio) continue;
		output.add(contour);
	}
}

/**
 * Compute the convex hulls of contours.
 * @param inputContours The contours on which to perform the operation.
 * @param outputContours The contours where the output will be stored.
 */
private static void convexHulls(List<MatOfPoint> inputContours,
	ArrayList<MatOfPoint> outputContours) {
	final MatOfInt hull = new MatOfInt();
	outputContours.clear();
	for (int i = 0; i < inputContours.size(); i++) {
		final MatOfPoint contour = inputContours.get(i);
		final MatOfPoint mopHull = new MatOfPoint();
		Imgproc.convexHull(contour, hull);
		mopHull.create((int) hull.size().height, 1, CvType.CV_32SC2);
		for (int j = 0; j < hull.size().height; j++) {
			int index = (int) hull.get(j, 0)[0];
			double[] point = new double[] {contour.get(index, 0)[0], contour.get(index, 0)[1]};
			mopHull.put(j, 0, point);
		}
		outputContours.add(mopHull);
	}
}
  #3
08-02-2017, 13:02
dvanvoorst
Registered User
FRC #2771 (Code Red)
Team Role: Mentor
Join Date: Jan 2012
Rookie Year: 2012
Location: Grand Rapids, MI
Posts: 75
Re: Help Importing GRIP Pipeline Into Gradle

Maybe you're trying to do something fancier than what I'm understanding, but I think you're making this way harder than it has to be.
The GRIP pipeline code can go right into your robot program, and there is no need to use NetworkTables to communicate the results, other than possibly for debugging. If you haven't already seen it, this ScreenSteps document shows how to integrate the GRIP pipeline into your program (a rough sketch of that approach follows the link):
Using Generated Code in a Robot Program
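
Roughly, it ends up looking like this (a sketch only, assuming your generated class is named GripPipeline and was exported with GRIP's option to implement WPILib's VisionPipeline interface so it works with VisionThread; the camera setup and resolution are placeholders):

Code:
import edu.wpi.cscore.UsbCamera;
import edu.wpi.first.wpilibj.CameraServer;
import edu.wpi.first.wpilibj.IterativeRobot;
import edu.wpi.first.wpilibj.vision.VisionThread;

import org.opencv.core.Rect;
import org.opencv.imgproc.Imgproc;

public class Robot extends IterativeRobot {
  private VisionThread visionThread;
  private double centerX = 0.0;
  private final Object imgLock = new Object();

  @Override
  public void robotInit() {
    UsbCamera camera = CameraServer.getInstance().startAutomaticCapture();
    camera.setResolution(640, 480);

    // Run the generated pipeline (GripPipeline is whatever name you exported) in its
    // own thread and copy out the one number the rest of the robot code cares about
    visionThread = new VisionThread(camera, new GripPipeline(), pipeline -> {
      if (!pipeline.filterContoursOutput().isEmpty()) {
        Rect r = Imgproc.boundingRect(pipeline.filterContoursOutput().get(0));
        synchronized (imgLock) {
          centerX = r.x + (r.width / 2.0);
        }
      }
    });
    visionThread.start();
  }

  @Override
  public void autonomousPeriodic() {
    double center;
    synchronized (imgLock) {
      center = centerX;
    }
    // Use 'center' to steer toward the target here
  }
}

The pipeline then runs on the roboRIO in its own thread, so there's no coprocessor, no MJPEG restream, and no NetworkTables round trip needed just to get the contour data into your drive code.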