Camera Tracking Problems...

With this code we get an image from the camera, but we're having trouble getting the camera tracking to run and print out its results. Any ideas? Thanks.

package edu.wpi.first.wpilibj.templates;


import edu.wpi.first.wpilibj.DriverStationLCD;
import edu.wpi.first.wpilibj.IterativeRobot;
import edu.wpi.first.wpilibj.Timer;
import edu.wpi.first.wpilibj.camera.AxisCamera;
import edu.wpi.first.wpilibj.command.Scheduler;
import edu.wpi.first.wpilibj.image.*;
import edu.wpi.first.wpilibj.templates.commands.CommandBase;

/**
 * The VM is configured to automatically run this class, and to call the
 * functions corresponding to each mode, as described in the IterativeRobot
 * documentation. If you change the name of this class or the package after
 * creating this project, you must also update the manifest file in the resource
 * directory.
 */
public class RobotTemplate extends IterativeRobot {

    
    AxisCamera camera;
    BinaryImage sensorimage;
    CriteriaCollection cc;
    
    public void robotInit() {
        String camip = "10.29.77.11";
        camera = AxisCamera.getInstance(camip);
        camera.writeResolution(AxisCamera.ResolutionT.k640x480);
        camera.writeRotation(AxisCamera.RotationT.k180);
        cc = new CriteriaCollection();      // create the criteria for the particle filter
        cc.addCriteria(NIVision.MeasurementType.IMAQ_MT_BOUNDING_RECT_WIDTH, 30, 400, false);
        cc.addCriteria(NIVision.MeasurementType.IMAQ_MT_BOUNDING_RECT_HEIGHT, 40, 400, false);
        
        while (true) {
           
           
            try {
                ColorImage image;                           // next 2 lines read image from flash on cRIO
                image =  new RGBImage("/10ft2.jpg");
                BinaryImage thresholdImage = image.thresholdRGB(0, 0, 0, 0, 0, 0);   // keep only black objects
                BinaryImage bigObjectsImage = thresholdImage.removeSmallObjects(false, 2);  // remove small artifacts
                BinaryImage convexHullImage = bigObjectsImage.convexHull(false);          // fill in occluded rectangles
                BinaryImage filteredImage = convexHullImage.particleFilter(cc);           // find filled in rectangles
                
                ParticleAnalysisReport[] reports = filteredImage.getOrderedParticleAnalysisReports();  // get list of results
                for (int i = 0; i < reports.length; i++) {                                // print results
                    ParticleAnalysisReport r = reports[i];
                    System.out.println("Particle: " + i + ":  Center of mass x: " + r.center_mass_x);

                    DriverStationLCD.getInstance().println(DriverStationLCD.Line.kMain6, 1, "Particle: " + i + ":  Center of mass x: " + r.center_mass_x);
                    DriverStationLCD.getInstance().updateLCD();
                }
                System.out.println(filteredImage.getNumberParticles() + "  " + Timer.getFPGATimestamp());

                DriverStationLCD.getInstance().println(DriverStationLCD.Line.kMain6, 1, filteredImage.getNumberParticles() + "  " + Timer.getFPGATimestamp());
                DriverStationLCD.getInstance().updateLCD();
                
                /**
                 * all images in Java must be freed after they are used since they are allocated out
                 * of C data structures. Not calling free() will cause the memory to accumulate over
                 * each pass of this loop.
                 */
                filteredImage.free();
                convexHullImage.free();
                bigObjectsImage.free();
                thresholdImage.free();
                image.free();

                
            } catch (NIVisionException ex) {
                ex.printStackTrace();
            }
        }

        //sensorimage = camera.getImage();

        CommandBase.init();
    }
    
    public void autonomousInit() {
        // schedule the autonomous command (example)
        //autonomousCommand.start();
    }

    /**
     * This function is called periodically during autonomous
     */
    public void autonomousPeriodic() {
        Scheduler.getInstance().run();
    }

    public void teleopInit() {
		// This makes sure that the autonomous stops running when
		// teleop starts running. If you want the autonomous to 
		// continue until interrupted by another command, remove
		// this line or comment it out.
		//autonomousCommand.cancel();
    }

    /**
     * This function is called periodically during operator control
     */
    public void teleopPeriodic() {
        Scheduler.getInstance().run();
    }
}

I think the problem with this code is the threshold. If you have the NI Vision Assistant, open the image in that program and experiment with threshold values. Once you have values that isolate the target, plug those numbers back into the program and that should do it :smiley:
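
For example, the six numbers from the Assistant drop straight into the thresholdRGB() call in place of the zeros. A rough sketch, with placeholder ranges rather than calibrated values:

// Hypothetical ranges taken from NI Vision Assistant; replace the six
// numbers with whatever ranges actually isolate your target.
BinaryImage thresholdImage = image.thresholdRGB(0, 60,    // red low/high
                                                0, 60,    // green low/high
                                                0, 60);   // blue low/high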

Is this copied directly from the demo program with a few modifications? Also what errors are you getting and where?

ColorImage image; // next 2 lines read image from flash on cRIO
image = new RGBImage("/10ft2.jpg");

You are reading the image from the cRIO, so you probably didn't upload the image to it. In the demo there were two lines after this code that read the image from the camera instead. Try setting the image to the one read from the camera and see what happens.
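
Something along these lines, assuming the camera field is already initialized (this mirrors the camera demo rather than your exact project):

// Grab a fresh frame from the Axis camera instead of loading a saved file from
// flash; note that getImage() also throws AxisCameraException, so the catch
// block needs to handle that as well as NIVisionException.
ColorImage image = camera.getImage();
BinaryImage thresholdImage = image.thresholdRGB(0, 0, 0, 0, 0, 0);
// ...rest of the pipeline unchanged...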

Thanks for the heads up, we should have seen this. We just changed the code to read from the camera, like you told us to. We're going to go test it as soon as we are able.

@severhale Yes, we did copy this from the demo program and made some modifications, and we are not getting any errors from it.
@xmendude217 We did check the thresholds, and they are correct for what we need.

Our camera is not responding to the code. We get an unfiltered image on the driver station, and the camera itself doesn't seem to have any problems; the image looks correct, but the camera isn't responding to the resolution or rotation calls, or anything of the sort.

The dashboard image is not affected by certain things like resolution changes. And you will not see the filtered images unless you save them to the cRIO.
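
For example (a sketch only; the file names and formats are arbitrary), something like this after the particle filter, then pull the files off the cRIO to look at them:

// Write the intermediate images to the cRIO's flash so they can be
// retrieved and inspected later (for example over FTP).
thresholdImage.write("/threshold.bmp");
convexHullImage.write("/convexHull.bmp");
filteredImage.write("/filtered.bmp");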

I would be inclined to agree with xmendude that your problem most likely lies in your thresholding. Setting 0s for all values would, yes, detect black objects, but only perfectly black objects with absolutely no color at all. My bet is as soon as you have any light at all in the image the values would no longer be 0 for R, G, and B.

Out of curiosity, what’s your setup that you find it viable to look for only perfectly black objects?

We have a procedure for determining the distortion for your camera, using JavaCV. See this thread http://chiefdelphi.com/forums/showthread.php?t=101753

Our answer was xc = x*(1 - 0.055*r^2) and yc = y*(1 - 0.055*r^2), where r is the distance from the center of the image and (x, y) is the pixel location measured from the center of the image. This number is for the Axis 1011 camera.

You may apply this correction either during image processing or after you extract your target points, depending on the image-processing load you can support.

Other calibration techniques exist; this is just the one we used.
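
A minimal sketch of applying that correction to one point, assuming the coordinates are measured from the image center and r is in normalized units (a pixel-valued r would make the correction term blow up); the 0.055 coefficient is the value quoted above and would need recalibrating for a different camera:

// Radial distortion correction: scale a point toward the image center by
// (1 - k*r^2), with r the normalized distance from the center.
public static double[] undistort(double x, double y) {
    final double k = 0.055;           // coefficient quoted above for the Axis camera
    double rSquared = x * x + y * y;  // x and y assumed normalized, centered on the image
    double scale = 1.0 - k * rSquared;
    return new double[] {x * scale, y * scale};
}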

Alright, how do we get the image saved to our cRIO, and then display the image on our dashboard?

@jesusrambo We are set up to detect perfectly black objects because the tape around our targets is black. If we were to compensate for luminosity, where would we set that?

@jviolette123 We will take that into account.

Thank you for your time; we will post again if we have more problems.

Yes, the tape is black, but with ambient light and the quality of these cameras I'd be somewhat surprised if it were black enough to come through as all 0s.

A better solution might be to check out HSL for your thresholding.
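
Something like the following, where the hue and saturation bands are left wide and the luminance band is what actually picks out the dark tape; the numbers are placeholders to tune, not known-good values:

// HSL threshold: accept any hue/saturation, keep only low-luminance (dark)
// pixels. Tune the luminance band for your lighting conditions.
BinaryImage thresholdImage = image.thresholdHSL(0, 255,   // hue
                                                0, 255,   // saturation
                                                0, 60);   // luminance (dark only)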

I’d actually be surprised if the sensor returns zeros even in the dark. Sensors are noisy.

Greg McKaskle

We are still having issues with the code. We have set a larger threshold to account for luminosity and color gradient. Can you give us a direct way to save the image to the cRIO and then export it to the dashboard so we can see it? Thanks.

Someone else already answered that :slight_smile: :

We have set our code to write out the images the camera has retrieved and filtered. After the camera's fresh image, every image in the pipeline comes out black. Any ideas what's causing this? Thanks.

Can we see your code?

/*
 * To change this template, choose Tools | Templates
 * and open the template in the editor.
 */
package edu.wpi.first.wpilibj.templates.commands;



import edu.wpi.first.wpilibj.Timer;
import edu.wpi.first.wpilibj.camera.AxisCamera;
import edu.wpi.first.wpilibj.camera.AxisCameraException;
import edu.wpi.first.wpilibj.image.*;

public class CameraSeek extends CommandBase {
  
    AxisCamera camera;
    CriteriaCollection cc;    
     
     public CameraSeek() {
        
        // Use requires() here to declare subsystem dependencies
        // eg. requires(chassis);
    }

    // Called just before this Command runs the first time
    protected void initialize() {
        String camip = "10.29.77.11";
        camera = AxisCamera.getInstance(camip);
        cc = new CriteriaCollection();      // create the criteria for the particle filter
        cc.addCriteria(NIVision.MeasurementType.IMAQ_MT_BOUNDING_RECT_WIDTH, 20, 400, false);
        cc.addCriteria(NIVision.MeasurementType.IMAQ_MT_BOUNDING_RECT_HEIGHT, 20, 300, false); 
        try {
                ColorImage picture = camera.getImage();   //should read image from camera and then save it to a jpg file   
                picture.write("runthis.jpg");
                
                ColorImage trypicture = new RGBImage("runthis.jpg");    //workaround that should allow us to process the image
                
                //BinaryImage thresholdImage = trypicture.thresholdRGB(205, 255, 205, 255, 205, 255);   // RGB threshold (keeps bright pixels)
                BinaryImage thresholdImage = trypicture.thresholdHSL(126, 255, 6, 255, 201, 255);     // Hue/Saturation/Luminance threshold
                BinaryImage bigObjectsImage = thresholdImage.removeSmallObjects(false, 2);  // remove small artifacts
                BinaryImage convexHullImage = bigObjectsImage.convexHull(false);          // fill in occluded rectangles
                BinaryImage filteredImage = convexHullImage.particleFilter(cc);           // find filled in rectangles
                ParticleAnalysisReport[] reports = filteredImage.getOrderedParticleAnalysisReports();  // get list of results
                
                //System.out.println("Step 2 Achieved-Threshold set, image filtered");
              
        
               
                for (int i = 0; i < reports.length; ++i) {                  // print results
                    ParticleAnalysisReport r = reports[i];
                    System.out.println("Particle: " + i + ":  Center of mass x: " + r.center_mass_x_normalized);
                    System.out.println("Particle: " + i + ": Center of mass y: " + r.center_mass_y_normalized);

                    if (r.center_mass_x_normalized < 0.7) {
                        System.out.println("Found it, its to the left");
                        shootrun.Pan(r.center_mass_x_normalized);

                        if (r.center_mass_x_normalized < 0.0) {
                            Timer.delay(-1.0 * r.center_mass_x_normalized);
                        } else if (r.center_mass_x_normalized > 0.0) {
                            Timer.delay(r.center_mass_x_normalized);
                        }
                        shootrun.Pan(0);
                    } else if (r.center_mass_x_normalized < -0.9) {
                        System.out.println("Found it, its to the right");
                        shootrun.Pan(r.center_mass_x_normalized);

                        if (r.center_mass_x_normalized < 0.0) {
                            Timer.delay(-1.0 * r.center_mass_x_normalized);
                        } else if (r.center_mass_x_normalized > 0.0) {
                            Timer.delay(r.center_mass_x_normalized);
                        }
                        shootrun.Pan(0);
                    } else {
                        shootrun.Pan(0);
                        System.out.println("It be trippen, your right on top of it.");
                    }
                }
           shootrun.Pan(0);
            filteredImage.write("A-5-filteredpic.jpg");
            bigObjectsImage.write("A-3-bigimage.jpg");
            convexHullImage.write("A-4-convex.jpg");
            thresholdImage.write("A-2-threshold.jpg");
            picture.write("A-1-Fresh.jpg");
            trypicture.write("A-6-riu.jpg");
            //thresholdImagebw.write("A-7-Secondarythreshold");
            
                System.out.println(filteredImage.getNumberParticles() + "  " + Timer.getFPGATimestamp()); 
                filteredImage.free();
                convexHullImage.free();
                bigObjectsImage.free();
                thresholdImage.free();
                picture.free();
                trypicture.free();
                //thresholdImagebw.free();
                
                //System.out.println("Step 3 Achieved-Images Freed");   
                  
            } catch (AxisCameraException ex) {
                ex.printStackTrace();
            } catch (NIVisionException ex) {
                ex.printStackTrace();
            }
        
       
    }

    // Called repeatedly when this Command is scheduled to run
    protected void execute() {
        
    
    }

    // Make this return true when this Command no longer needs to run execute()
    protected boolean isFinished() {
        return false;
    }

    // Called once after isFinished returns true
    protected void end() {
    }

    // Called when another command which requires one or more of the same
    // subsystems is scheduled to run
    protected void interrupted() {
    }
}

Desperation bump