How do I put the CvSource video in the smart dashboard?


We want to test our vision processing by viewing the final Mat in the SmartDashboard, but that Mat is not what we see when I add a CameraServer Stream Viewer (we checked by having the process() function draw a rectangle over the image). So how do we put this CvSource in the SmartDashboard?


The code for anyone who asks:

package org.usfirst.frc.team5951.robot;

import org.opencv.core.Mat;

import edu.wpi.cscore.CvSink;
import edu.wpi.cscore.CvSource;
import edu.wpi.cscore.UsbCamera;
import edu.wpi.first.wpilibj.CameraServer;
import edu.wpi.first.wpilibj.SampleRobot;
import edu.wpi.first.wpilibj.smartdashboard.SmartDashboard;
import edu.wpi.first.wpilibj.vision.VisionThread;

/**
 * This is a demo program showing the use of the RobotDrive class. The
 * SampleRobot class is the base of a robot application that will automatically
 * call your Autonomous and OperatorControl methods at the right time as
 * controlled by the switches on the driver station or the field controls.
 *
 * The VM is configured to automatically run this class, and to call the
 * functions corresponding to each mode, as described in the SampleRobot
 * documentation. If you change the name of this class or the package after
 * creating this project, you must also update the manifest file in the resource
 * directory.
 *
 * WARNING: While it may look like a good choice to use for your code if you're
 * inexperienced, don't. Unless you know what you are doing, complex code will
 * be much more difficult under this system. Use IterativeRobot or Command-Based
 * instead if you're new.
 */
public class Robot extends SampleRobot {

	VisionThread visionThread;
	UsbCamera source;
	CvSink sink;
	CvSource output;
	Mat mat;

	/*
	 * UsbCamera camera = CameraServer.getInstance().startAutomaticCapture();
	 * camera.setResolution(640, 480);
	 * CvSink cvSink = CameraServer.getInstance().getVideo();
	 * CvSource outputStream = CameraServer.getInstance().putVideo("Blur", 640, 480);
	 * Mat source = new Mat();
	 * Mat output = new Mat();
	 * while (!Thread.interrupted()) {
	 *     cvSink.grabFrame(source);
	 *     Imgproc.cvtColor(source, output, Imgproc.COLOR_BGR2GRAY);
	 *     outputStream.putFrame(output);
	 * }
	 */
	public Robot() {
		source = CameraServer.getInstance().startAutomaticCapture();
		source.setResolution(640, 480);
		sink = CameraServer.getInstance().getVideo();
		output = CameraServer.getInstance().putVideo("Processed: ", 640, 480);
		// The listener body was cut off when pasting; presumably it pushes the
		// pipeline's final Mat (imageToShow()) to the output stream:
		visionThread = new VisionThread(source, new VisionImplementation(), pipeline -> {
			output.putFrame(pipeline.imageToShow());
		});
	}

	public void robotInit() {
	}

	/**
	 * This autonomous (along with the chooser code above) shows how to select
	 * between different autonomous modes using the dashboard. The sendable
	 * chooser code works with the Java SmartDashboard. If you prefer the
	 * LabVIEW Dashboard, remove all of the chooser code and uncomment the
	 * getString line to get the auto name from the text box below the Gyro.
	 *
	 * You can add additional auto modes by adding additional comparisons to the
	 * if-else structure below with additional strings. If using the
	 * SendableChooser make sure to add them to the chooser code above as well.
	 */
	public void autonomous() {
//		visionThread.start();
	}
	/**
	 * Runs the motors with arcade steering.
	 */
	public void operatorControl() {
	}

	/**
	 * Runs during test mode.
	 */
	public void test() {
	}
}
VisionImplementation is a class we’ve created; in this example all it does is normalize the picture, median-blur it, and draw a rectangle over it.

imageToShow() just returns a Mat of the final “product”.
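For reference, a CvSource published through CameraServer.getInstance().putVideo(name, width, height) is served as its own MJPEG stream alongside the raw camera, and (if I remember right) the classic SmartDashboard’s CameraServer Stream Viewer widget has a camera-choice property (right-click → Properties) where you pick which published stream to display, so the viewer may simply be pointed at the raw camera instead of the processed source. A minimal sketch of the publishing side (the class and method names here are illustrative, not from the thread’s code):

```java
import org.opencv.core.Mat;

import edu.wpi.cscore.CvSource;
import edu.wpi.first.wpilibj.CameraServer;

public class ProcessedStream {
	// "Processed" is an example name; the Stream Viewer widget lists
	// published sources by this name, so select it there to view the output.
	private final CvSource outputStream =
			CameraServer.getInstance().putVideo("Processed", 640, 480);

	// Call once per processed frame (e.g. from the VisionThread listener).
	public void publish(Mat processed) {
		outputStream.putFrame(processed);
	}
}
```

The key point is that the name given to putVideo() is the one to select in the widget’s properties.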


Have you looked at the intermediate vision example?


Yeah, the code is pretty much the same, apart from the fact that they’re using a normal thread instead of the VisionThread.


I think this is your issue:


Also, doesn’t the VisionThread get created in robotInit() instead of the constructor?


Oh whoops, that was just a typo, but it’s not the problem, because the same typo is in both.

Is there a difference depending on where it’s initialized? It only runs while teleop is running.
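For what it’s worth, where the thread is constructed shouldn’t change what the stream shows, but only starting it in operatorControl() means nothing is pushed to the output until teleop begins. One common arrangement is to create and start everything once in robotInit() so the processed stream updates in every mode. A sketch, assuming VisionImplementation implements VisionPipeline and imageToShow() returns the final Mat as described above:

```java
package org.usfirst.frc.team5951.robot;

import edu.wpi.cscore.CvSource;
import edu.wpi.cscore.UsbCamera;
import edu.wpi.first.wpilibj.CameraServer;
import edu.wpi.first.wpilibj.SampleRobot;
import edu.wpi.first.wpilibj.vision.VisionThread;

public class Robot extends SampleRobot {
	private VisionThread visionThread;
	private CvSource output;

	@Override
	public void robotInit() {
		UsbCamera camera = CameraServer.getInstance().startAutomaticCapture();
		camera.setResolution(640, 480);

		// Published under "Processed"; select this stream in the viewer widget.
		output = CameraServer.getInstance().putVideo("Processed", 640, 480);

		// The listener fires after every pipeline run, in every robot mode.
		visionThread = new VisionThread(camera, new VisionImplementation(), pipeline -> {
			output.putFrame(pipeline.imageToShow());
		});
		visionThread.start();
	}
}
```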