Chief Delphi > Technical > Programming > Java
#1  26-02-2014, 10:52
Kingland093
Registered User
FRC #4215 (Trinity Tritons) | Team Role: Alumni
Join Date: Jan 2014 | Rookie Year: 2013 | Location: Minnesota | Posts: 107
Did Vision Tracking Work?

Did anyone get the sample vision tracking project provided in NetBeans to actually work? If so, what modifications did you make, or did you need any at all?
My team never got it working, so I'm wondering whether anyone else succeeded.
#2  26-02-2014, 10:56
notmattlythgoe (AKA: Matthew Lythgoe)
Flywheel Police
FRC #2363 (Triple Helix) | Team Role: Mentor
Join Date: Feb 2010 | Rookie Year: 2009 | Location: Newport News, VA | Posts: 1,717
Re: Did Vision Tracking Work?

Quote:
Originally Posted by Kingland093
Did anyone get the sample vision tracking project provided in NetBeans to actually work? If so, what modifications did you make, or did you need any at all?
My team never got it working, so I'm wondering whether anyone else succeeded.
We got it working and plan to use it for autonomous this year. There is an error that needs to be fixed, where the camera is never instantiated, and we modified the color values to ones that fit our needs. Other than that it works fine. We still need to confirm the targets are detected when they are off to the side, but it should be workable.
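On the color values: the sample's threshold step is conceptually just a per-channel range check on each pixel, so retuning it means choosing new low/high bounds per channel. A standalone sketch of that check, with illustrative bounds rather than anyone's tuned values:

```java
// Illustrates what a thresholdRGB/thresholdHSV call does per pixel: keep the
// pixel only if every channel falls inside its [low, high] band. The bounds
// below are made up for illustration; real values depend on your lighting,
// ring-light color, and camera exposure.
public class ChannelThreshold {

    static boolean inBand(int value, int low, int high) {
        return value >= low && value <= high;
    }

    /** true if a pixel (c0, c1, c2) passes all three channel bands. */
    public static boolean passes(int c0, int c1, int c2, int[] lows, int[] highs) {
        return inBand(c0, lows[0], highs[0])
            && inBand(c1, lows[1], highs[1])
            && inBand(c2, lows[2], highs[2]);
    }

    public static void main(String[] args) {
        int[] lows  = {0, 0, 50};      // example: bands favoring strong blue
        int[] highs = {10, 10, 110};
        System.out.println(passes(5, 5, 80, lows, highs));    // true  (blue-ish pixel)
        System.out.println(passes(200, 5, 80, lows, highs));  // false (too much red)
    }
}
```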
#3  26-02-2014, 11:20
mwtidd (AKA: mike)
Registered User
FRC #0319 (Big Bad Bob) | Team Role: Mentor
Join Date: Feb 2005 | Rookie Year: 2003 | Location: Boston, MA | Posts: 714
Re: Did Vision Tracking Work?

We are using a RobotBuilder-based project, so we created a subsystem called HotCamera. We changed the filter from HSV to RGB, look for a particle with a width-to-height ratio greater than 3:1, and require the particle to be wider than a defined minimum size.

Code:
// RobotBuilder Version: 1.0
//
// This file was generated by RobotBuilder. It contains sections of
// code that are automatically generated and assigned by robotbuilder.
// These sections will be updated in the future when you export to
// Java from RobotBuilder. Do not put any code or make any change in
// the blocks indicating autogenerated code or it will be lost on an
// update. Deleting the comments indicating the section will prevent
// it from being updated in the future.
package org.usfirst.frc319.AerialAssistComp.subsystems;
import edu.wpi.first.wpilibj.camera.AxisCamera;
import edu.wpi.first.wpilibj.camera.AxisCameraException;
import org.usfirst.frc319.AerialAssistComp.RobotMap;
import org.usfirst.frc319.AerialAssistComp.commands.*;
import edu.wpi.first.wpilibj.command.Subsystem;
import edu.wpi.first.wpilibj.image.BinaryImage;
import edu.wpi.first.wpilibj.image.ColorImage;
import edu.wpi.first.wpilibj.image.CriteriaCollection;
import edu.wpi.first.wpilibj.image.NIVision;
import edu.wpi.first.wpilibj.image.NIVisionException;
import edu.wpi.first.wpilibj.image.ParticleAnalysisReport;
/**
 *
 */
public class HotCamera extends Subsystem {
    // BEGIN AUTOGENERATED CODE, SOURCE=ROBOTBUILDER ID=DECLARATIONS
    // END AUTOGENERATED CODE, SOURCE=ROBOTBUILDER ID=DECLARATIONS
    
    // Put methods for controlling this subsystem
    // here. Call these from Commands.
    //Camera constants used for distance calculation
    final int Y_IMAGE_RES = 480;		//Y image resolution in pixels; should be 120, 240 or 480
    final double VIEW_ANGLE = 49;		//Axis M1013
    //final double VIEW_ANGLE = 41.7;		//Axis 206 camera
    //final double VIEW_ANGLE = 37.4;  //Axis M1011 camera
    final double PI = 3.141592653;
    //Score limits used for target identification
    final int  RECTANGULARITY_LIMIT = 40;
    final int ASPECT_RATIO_LIMIT = 55;
    //Score limits used for hot target determination
    final int TAPE_WIDTH_LIMIT = 50;
    final int  VERTICAL_SCORE_LIMIT = 50;
    final int LR_SCORE_LIMIT = 50;
    //Minimum area of particles to be considered
    final int AREA_MINIMUM = 200;
    //Maximum number of particles to process
    final int MAX_PARTICLES = 5;
    
    AxisCamera camera;          // the axis camera object (connected to the switch)
    CriteriaCollection cc;      // the criteria for doing the particle filter operation
    
    public void initDefaultCommand() {
        // BEGIN AUTOGENERATED CODE, SOURCE=ROBOTBUILDER ID=DEFAULT_COMMAND
        setDefaultCommand(new CameraSleep());
    // END AUTOGENERATED CODE, SOURCE=ROBOTBUILDER ID=DEFAULT_COMMAND
	
        // Set the default command for a subsystem here.
        //setDefaultCommand(new MySpecialCommand());
    }
    public void initialize() {
        camera = AxisCamera.getInstance();  // get an instance of the camera
        cc = new CriteriaCollection();      // create the criteria for the particle filter
        //cc.addCriteria(NIVision.MeasurementType.IMAQ_MT_AREA, AREA_MINIMUM, 65535, false);
        cc.addCriteria(NIVision.MeasurementType.IMAQ_MT_BOUNDING_RECT_WIDTH, 8, 65535, false);
    
    }
       
    public boolean isHot(){
        //initialize();
        camera.freshImage();
        
        
        boolean hot = false;
        try {
                /**
                 * Capture an image from the Axis camera and apply the threshold
                 * and particle-filter steps described above.
                 */
                ColorImage image = camera.getImage();   // grab the current frame from the camera
                
                int redLow = 0;//0
                int redHigh = 10;//100;
                int greenLow = 0;//110
                int greenHigh = 10;//255;
                int blueLow = 50;//235
                int blueHigh = 110;//255
                
                BinaryImage thresholdImage = image.thresholdRGB(redLow,redHigh,greenLow,greenHigh,blueLow,blueHigh);
                BinaryImage filteredImage = thresholdImage.particleFilter(cc); 
                
                System.out.println("particle count: " + filteredImage.getNumberParticles());
                
                if(filteredImage.getNumberParticles() > 0)
                {
                    int targetWidth = 0;
                    
                    for (int i = 0; i < MAX_PARTICLES && i < filteredImage.getNumberParticles(); i++) {
                        ParticleAnalysisReport report = filteredImage.getParticleAnalysisReport(i);
                        // cast to double: both fields are ints, so plain division would truncate the ratio
                        double widthRatio = (double) report.boundingRectWidth / report.boundingRectHeight;
                        if(widthRatio > 3 && widthRatio < 12){
                            targetWidth = report.boundingRectWidth;
                        }
                        System.out.println("particle height: " + report.boundingRectHeight);
                        System.out.println("particle width: " + report.boundingRectWidth);
                        System.out.println("particle size: " + report.particleArea);
                    }
                    int minWidth = 30;
                    
                    hot = targetWidth > minWidth;
                     
                }
                /**
                 * all images in Java must be freed after they are used since they are allocated out
                 * of C data structures. Not calling free() will cause the memory to accumulate over
                 * each pass of this loop.
                 */
                filteredImage.free();
                thresholdImage.free();
                image.free();
             
                return hot;
            } catch (NIVisionException ex) {
                System.out.println(ex.getMessage());
                ex.printStackTrace();
            }
            catch (AxisCameraException ex) {
                System.out.println(ex.getMessage());
                ex.printStackTrace();
            }
        return false;
    }
    
}
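The decision logic inside isHot() is plain arithmetic, so it can be unit-tested off the robot. A standalone sketch of the same check (names and thresholds mirror the post; note the double cast, since dividing the two int bounding-rect fields directly would truncate the ratio):

```java
// Standalone version of the hot-target decision in isHot(), suitable for a
// desktop unit test. Thresholds mirror the post; the (double) cast matters
// because the bounding-rect fields are ints and plain int division would
// truncate a ratio like 3.9 down to 3.
public class HotTargetCheck {
    static final int MIN_WIDTH = 30;   // minimum bounding-rect width in pixels

    /** Each row of particles is {boundingRectWidth, boundingRectHeight}. */
    public static boolean isHot(int[][] particles) {
        int targetWidth = 0;
        for (int[] p : particles) {
            double widthRatio = (double) p[0] / p[1];   // avoid integer division
            if (widthRatio > 3 && widthRatio < 12) {    // wide, tape-like particle
                targetWidth = p[0];
            }
        }
        return targetWidth > MIN_WIDTH;
    }

    public static void main(String[] args) {
        System.out.println(isHot(new int[][] {{100, 20}}));  // true:  ratio 5, width 100
        System.out.println(isHot(new int[][] {{20, 20}}));   // false: ratio 1
        System.out.println(isHot(new int[][] {{25, 5}}));    // false: ratio 5, but too narrow
    }
}
```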
We then created a WaitForHot command. We also call the isHot function only once every 10 loops, which helps avoid any "Robot Drive Not Updated Enough" issues.

Code:
// RobotBuilder Version: 1.0
//
// This file was generated by RobotBuilder. It contains sections of
// code that are automatically generated and assigned by robotbuilder.
// These sections will be updated in the future when you export to
// Java from RobotBuilder. Do not put any code or make any change in
// the blocks indicating autogenerated code or it will be lost on an
// update. Deleting the comments indicating the section will prevent
// it from being updated in the future.
package org.usfirst.frc319.AerialAssistComp.commands;
// ConditionalCommand, CameraDrive and StopDrive are presumably team-defined
// classes in this commands package, so they need no import here.
import org.usfirst.frc319.AerialAssistComp.Robot;
/**
 * @success cameraDrive
 * @fail stopDrive
 */
public class  WaitForHot extends ConditionalCommand {
    public static final String name = "WAIT_FOR_HOT";
    private static final int MIN_POSITIVE_SAMPLES = 2;
    private long startTime = 0;
    
    public WaitForHot() {
        super(name,CameraDrive.name,StopDrive.name);
            // Use requires() here to declare subsystem dependencies
        // eg. requires(chassis);
	
        // BEGIN AUTOGENERATED CODE, SOURCE=ROBOTBUILDER ID=REQUIRES
        requires(Robot.hotCamera);
    // END AUTOGENERATED CODE, SOURCE=ROBOTBUILDER ID=REQUIRES
        
    }
    
    // Called just before this Command runs the first time
    protected void initialize() {
        startTime = System.currentTimeMillis();
    }
    // Called repeatedly when this Command is scheduled to run
    private int hotCounter = 0;
    private int loopCounter = 0;
    protected void execute() {
        //for the first half second, wait for the targets to flip
        if(System.currentTimeMillis()- startTime < 500){
            return;
        }
        // only run vision code once every ~10 iterations
        loopCounter++;
        if(loopCounter > 10){
            
            //if camera sees hot, increment the hot camera
            if(Robot.hotCamera.isHot()){
                hotCounter++;
            }
            
            loopCounter = 0;
        }
    }
    // Make this return true when this Command no longer needs to run execute()
    protected boolean isFinished() {
        // give up after 5 seconds
        if(System.currentTimeMillis() - startTime > 5000){
            return true;
        }
        // finish once hot has been seen on more than MIN_POSITIVE_SAMPLES samples
        return hotCounter > MIN_POSITIVE_SAMPLES;
    }
    // Called once after isFinished returns true
    protected void end() {
    }
    // Called when another command which requires one or more of the same
    // subsystems is scheduled to run
    protected void interrupted() {
    }
}
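The sampling behavior of WaitForHot (one camera sample roughly every 10 scheduler loops, finishing after more than MIN_POSITIVE_SAMPLES positives) can likewise be simulated off-robot. A sketch with the camera replaced by canned readings; the 500 ms startup wait and the 5 s timeout are omitted for brevity:

```java
// Simulates WaitForHot's sampling: execute() is called every scheduler loop,
// but the "camera" is only consulted once every 11th call (loopCounter > 10),
// and the command finishes after more than MIN_POSITIVE_SAMPLES positives.
public class HotSampler {
    static final int MIN_POSITIVE_SAMPLES = 2;
    private int hotCounter = 0;
    private int loopCounter = 0;
    private int sampleIndex = 0;

    /** One scheduler iteration; readings stands in for live camera samples. */
    public void execute(boolean[] readings) {
        loopCounter++;
        if (loopCounter > 10) {
            if (sampleIndex < readings.length && readings[sampleIndex++]) {
                hotCounter++;   // camera saw a hot target on this sample
            }
            loopCounter = 0;
        }
    }

    public boolean isFinished() {
        return hotCounter > MIN_POSITIVE_SAMPLES;
    }

    public static void main(String[] args) {
        HotSampler sampler = new HotSampler();
        boolean[] readings = {true, true, true};   // three positive camera samples
        int iterations = 0;
        while (!sampler.isFinished() && iterations < 100) {
            sampler.execute(readings);
            iterations++;
        }
        System.out.println("finished after " + iterations + " iterations");  // 33
    }
}
```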
We are also using a blue LED ring light and the exposure is turned way down on the camera.

You can see it working here: http://www.youtube.com/watch?v=AUCYI...ature=youtu.be
__________________
"Never let your schooling interfere with your education" -Mark Twain
#4  02-03-2014, 11:03
Kingland093
Registered User
FRC #4215 (Trinity Tritons) | Team Role: Alumni
Join Date: Jan 2014 | Rookie Year: 2013 | Location: Minnesota | Posts: 107
Re: Did Vision Tracking Work?

Quote:
Originally Posted by notmattlythgoe
We got it working and plan to use it for autonomous this year. There is an error that needs to be fixed, where the camera is never instantiated, and we modified the color values to ones that fit our needs. Other than that it works fine. We still need to confirm the targets are detected when they are off to the side, but it should be workable.
How do you fix that camera error?