Chief Delphi

Chief Delphi (http://www.chiefdelphi.com/forums/index.php)
-   Java (http://www.chiefdelphi.com/forums/forumdisplay.php?f=184)
-   -   Did Vision Tracking Work? (http://www.chiefdelphi.com/forums/showthread.php?t=127218)

Kingland093 26-02-2014 10:52

Did Vision Tracking Work?
 
Did anyone get the sample vision tracking project provided in NetBeans to actually work? If so, what modifications did you make, or did you even need any?
My team never got it to work, so I was wondering if anyone else succeeded.

notmattlythgoe 26-02-2014 10:56

Re: Did Vision Tracking Work?
 
Quote:

Originally Posted by Kingland093 (Post 1349885)
Did anyone get the sample vision tracking project provided in NetBeans to actually work? If so, what modifications did you make, or did you even need any?
My team never got it to work, so I was wondering if anyone else succeeded.

We were able to get it to work and plan on using it for autonomous this year. There is an error that needs to be fixed, where the camera is never instantiated, and we modified the color threshold values to ones that fit our needs. Other than that it works fine. We still need to make sure the targets register when they are off to the side, but it should be workable.

mwtidd 26-02-2014 11:20

Re: Did Vision Tracking Work?
 
We are using a RobotBuilder-based project, so we created a subsystem called HotCamera. We changed the filter from HSV to RGB, look for a width-to-height particle ratio greater than 3:1, and look for a particle wider than a defined size.

Code:

// RobotBuilder Version: 1.0
//
// This file was generated by RobotBuilder. It contains sections of
// code that are automatically generated and assigned by robotbuilder.
// These sections will be updated in the future when you export to
// Java from RobotBuilder. Do not put any code or make any change in
// the blocks indicating autogenerated code or it will be lost on an
// update. Deleting the comments indicating the section will prevent
// it from being updated in the future.
package org.usfirst.frc319.AerialAssistComp.subsystems;
import edu.wpi.first.wpilibj.camera.AxisCamera;
import edu.wpi.first.wpilibj.camera.AxisCameraException;
import org.usfirst.frc319.AerialAssistComp.RobotMap;
import org.usfirst.frc319.AerialAssistComp.commands.*;
import edu.wpi.first.wpilibj.command.Subsystem;
import edu.wpi.first.wpilibj.image.BinaryImage;
import edu.wpi.first.wpilibj.image.ColorImage;
import edu.wpi.first.wpilibj.image.CriteriaCollection;
import edu.wpi.first.wpilibj.image.NIVision;
import edu.wpi.first.wpilibj.image.NIVisionException;
import edu.wpi.first.wpilibj.image.ParticleAnalysisReport;
/**
 *
 */
public class HotCamera extends Subsystem {
    // BEGIN AUTOGENERATED CODE, SOURCE=ROBOTBUILDER ID=DECLARATIONS
    // END AUTOGENERATED CODE, SOURCE=ROBOTBUILDER ID=DECLARATIONS
   
    // Put methods for controlling this subsystem
    // here. Call these from Commands.
    //Camera constants used for distance calculation
    final int Y_IMAGE_RES = 480;                //Y image resolution in pixels, should be 120, 240 or 480
    final double VIEW_ANGLE = 49;                //Axis M1013
    //final double VIEW_ANGLE = 41.7;                //Axis 206 camera
    //final double VIEW_ANGLE = 37.4;  //Axis M1011 camera
    final double PI = 3.141592653;
    //Score limits used for target identification
    final int  RECTANGULARITY_LIMIT = 40;
    final int ASPECT_RATIO_LIMIT = 55;
    //Score limits used for hot target determination
    final int TAPE_WIDTH_LIMIT = 50;
    final int  VERTICAL_SCORE_LIMIT = 50;
    final int LR_SCORE_LIMIT = 50;
    //Minimum area of particles to be considered
    final int AREA_MINIMUM = 200;
    //Maximum number of particles to process
    final int MAX_PARTICLES = 5;
   
    AxisCamera camera;          // the axis camera object (connected to the switch)
    CriteriaCollection cc;      // the criteria for doing the particle filter operation
   
    public void initDefaultCommand() {
        // BEGIN AUTOGENERATED CODE, SOURCE=ROBOTBUILDER ID=DEFAULT_COMMAND
        setDefaultCommand(new CameraSleep());
    // END AUTOGENERATED CODE, SOURCE=ROBOTBUILDER ID=DEFAULT_COMMAND
       
        // Set the default command for a subsystem here.
        //setDefaultCommand(new MySpecialCommand());
    }
    public void initialize() {
        camera = AxisCamera.getInstance();  // get an instance of the camera
        cc = new CriteriaCollection();      // create the criteria for the particle filter
        //cc.addCriteria(NIVision.MeasurementType.IMAQ_MT_AREA, AREA_MINIMUM, 65535, false);
        cc.addCriteria(NIVision.MeasurementType.IMAQ_MT_BOUNDING_RECT_WIDTH, 8, 65535, false);
   
    }
     
    public boolean isHot(){
        //initialize();
        camera.freshImage();
       
       
        boolean hot = false;
        try {
                /**
                * Capture a fresh image from the camera and apply the threshold
                * and particle-filter algorithm described above.
                */
                ColorImage image;
               
                image = camera.getImage();
               
                int redLow = 0;//0
                int redHigh = 10;//100;
                int greenLow = 0;//110
                int greenHigh = 10;//255;
                int blueLow = 50;//235
                int blueHigh = 110;//255
               
                BinaryImage thresholdImage = image.thresholdRGB(redLow,redHigh,greenLow,greenHigh,blueLow,blueHigh);
                BinaryImage filteredImage = thresholdImage.particleFilter(cc);
               
                System.out.println("particle count: " + filteredImage.getNumberParticles());
               
                if(filteredImage.getNumberParticles() > 0)
                {
                    int targetWidth = 0;
                   
                    for (int i = 0; i < MAX_PARTICLES && i < filteredImage.getNumberParticles(); i++) {
                        ParticleAnalysisReport report = filteredImage.getParticleAnalysisReport(i);
                        // cast to double first: int / int would truncate the ratio
                        double widthRatio = (double) report.boundingRectWidth / report.boundingRectHeight;
                        if(widthRatio > 3 && widthRatio < 12){
                            targetWidth = report.boundingRectWidth;
                        }
                        System.out.println("particle height: " + report.boundingRectHeight);
                        System.out.println("particle width: " + report.boundingRectWidth);
                        System.out.println("particle size: " + report.particleArea);
                    }
                    int minWidth = 30;
                   
                    hot = targetWidth > minWidth;
                   
                }
                /**
                * all images in Java must be freed after they are used since they are allocated out
                * of C data structures. Not calling free() will cause the memory to accumulate over
                * each pass of this loop.
                */
                filteredImage.free();
                thresholdImage.free();
                image.free();
           
                return hot;
            } catch (NIVisionException ex) {
                System.out.println(ex.getMessage());
                ex.printStackTrace();
            }
            catch (AxisCameraException ex) {
                System.out.println(ex.getMessage());
                ex.printStackTrace();
            }
        return false;
    }
   
}
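
The core particle test in isHot() above can be reduced to a small standalone predicate. This is just a sketch in plain Java: the class and constant names are illustrative, and plain ints stand in for the WPILib ParticleAnalysisReport fields. Note the cast to double, since dividing two ints would silently truncate the ratio before the 3:1 comparison:

```java
// Standalone sketch of the particle test used in isHot(); plain ints stand in
// for WPILib's ParticleAnalysisReport fields, and all names are illustrative.
public class ParticleCheck {
    static final double MIN_RATIO = 3.0;  // width:height must exceed 3:1
    static final double MAX_RATIO = 12.0; // reject absurdly wide particles
    static final int MIN_WIDTH = 30;      // target must be wider than this, in pixels

    // Returns true when a bounding rect looks like the horizontal hot-target tape.
    static boolean looksHot(int boundingRectWidth, int boundingRectHeight) {
        // Cast before dividing: int / int would truncate the ratio to 0, 1, 2, ...
        double widthRatio = (double) boundingRectWidth / boundingRectHeight;
        return widthRatio > MIN_RATIO && widthRatio < MAX_RATIO
                && boundingRectWidth > MIN_WIDTH;
    }

    public static void main(String[] args) {
        System.out.println(looksHot(120, 30)); // ratio 4.0, wide enough: true
        System.out.println(looksHot(40, 30));  // ratio ~1.3: false
    }
}
```

In the posted subsystem the equivalent division is done on two int report fields, so without the cast the 3:1 test compares a truncated integer quotient.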

We then created a WaitForHot command. We also only call the isHot function once every 10 loops, which helps avoid any "Robot Drive not updated often enough" warnings.

Code:

// RobotBuilder Version: 1.0
//
// This file was generated by RobotBuilder. It contains sections of
// code that are automatically generated and assigned by robotbuilder.
// These sections will be updated in the future when you export to
// Java from RobotBuilder. Do not put any code or make any change in
// the blocks indicating autogenerated code or it will be lost on an
// update. Deleting the comments indicating the section will prevent
// it from being updated in the future.
package org.usfirst.frc319.AerialAssistComp.commands;
import edu.wpi.first.wpilibj.command.Command;
import org.usfirst.frc319.AerialAssistComp.Robot;
/**
 * @success cameraDrive
 * @fail stopDrive
 */
public class WaitForHot extends ConditionalCommand {
    public static final String name = "WAIT_FOR_HOT";
    private static final int MIN_POSITIVE_SAMPLES = 2;
    private long startTime = 0;
   
    public WaitForHot() {
        super(name,CameraDrive.name,StopDrive.name);
            // Use requires() here to declare subsystem dependencies
        // eg. requires(chassis);
       
        // BEGIN AUTOGENERATED CODE, SOURCE=ROBOTBUILDER ID=REQUIRES
        requires(Robot.hotCamera);
    // END AUTOGENERATED CODE, SOURCE=ROBOTBUILDER ID=REQUIRES
       
    }
   
    // Called just before this Command runs the first time
    protected void initialize() {
        startTime = System.currentTimeMillis();
    }
    // Called repeatedly when this Command is scheduled to run
    private int hotCounter = 0;
    private int loopCounter = 0;
    protected void execute() {
        //for the first half second, wait for the targets to flip
        if(System.currentTimeMillis()- startTime < 500){
            return;
        }
        // only run vision code once every 10 iterations
        loopCounter++;
        if(loopCounter > 10){
           
            //if camera sees hot, increment the hot camera
            if(Robot.hotCamera.isHot()){
                hotCounter++;
            }
           
            loopCounter = 0;
        }
    }
    // Make this return true when this Command no longer needs to run execute()
    protected boolean isFinished() {
        // finish if we time out after 5 seconds...
        if(System.currentTimeMillis()- startTime > 5000){
            return true;
        }
        // ...or once the camera has reported hot more than MIN_POSITIVE_SAMPLES times
        return hotCounter > MIN_POSITIVE_SAMPLES;
    }
    // Called once after isFinished returns true
    protected void end() {
    }
    // Called when another command which requires one or more of the same
    // subsystems is scheduled to run
    protected void interrupted() {
    }
}
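
The sampling scheme in WaitForHot (wait half a second for the targets to flip, run vision only every 10th loop, finish on enough hot readings or a 5 s timeout) can be sketched as a plain counter class. Everything here is illustrative, not WPILib API; the clock is passed in explicitly so the logic is easy to test on its own:

```java
// Standalone sketch of the WaitForHot sampling logic, minus WPILib.
// Elapsed time is passed in as a parameter instead of read from the system clock.
public class HotSampler {
    static final int LOOPS_PER_SAMPLE = 10;     // only run vision every 10th loop
    static final int MIN_POSITIVE_SAMPLES = 2;  // finish after more than 2 hot samples
    static final long SETTLE_MS = 500;          // wait for the targets to flip
    static final long TIMEOUT_MS = 5000;        // give up after 5 seconds

    private int loopCounter = 0;
    private int hotCounter = 0;

    // Called once per scheduler loop; returns true only on the loops that
    // should actually run the (expensive) vision code.
    boolean shouldSample(long elapsedMs) {
        if (elapsedMs < SETTLE_MS) {
            return false;  // targets may still be flipping
        }
        loopCounter++;
        if (loopCounter > LOOPS_PER_SAMPLE) {
            loopCounter = 0;
            return true;
        }
        return false;
    }

    // Record the result of one vision sample.
    void recordSample(boolean hot) {
        if (hot) {
            hotCounter++;
        }
    }

    // Mirrors isFinished(): done on timeout or on enough hot samples.
    boolean isFinished(long elapsedMs) {
        return elapsedMs > TIMEOUT_MS || hotCounter > MIN_POSITIVE_SAMPLES;
    }
}
```

Decoupling the counters from the Command lifecycle like this also makes it easy to tune the loop divider and sample threshold without redeploying to test on the robot.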

We are also using a blue LED ring light, and the exposure on the camera is turned way down.

You can see it working here: http://www.youtube.com/watch?v=AUCYI...ature=youtu.be

Kingland093 02-03-2014 11:03

Re: Did Vision Tracking Work?
 
Quote:

Originally Posted by notmattlythgoe (Post 1349886)
We were able to get it to work and plan on using it for autonomous this year. There is an error that needs fixed with the camera not being instantiated, and we modified the color values to use ones that would fit our needs. Other than that it works fine. We still need to make sure the targets will work when they are off to the side, but it should be workable.

How do you fix that camera error?



Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2017, Jelsoft Enterprises Ltd.
Copyright © Chief Delphi