#1
14-01-2017, 10:35
graytonio
Registered User
FRC #1523
Join Date: Apr 2016
Location: Jupiter, FL
Posts: 3
NI Vision Code Example Not Working

I was attempting to use the example vision code from the newest NI Vision library, but it produces an error. Has anyone else run into this, or have I done something wrong?
#2
14-01-2017, 10:37
stundt1
Steve
FRC #4930 (Electric Mayhem)
Team Role: Programmer
Join Date: Dec 2010
Rookie Year: 2009
Location: Buffalo, NY
Posts: 364
Re: NI Vision Code Example Not Working

Please post the example and the error so we can help diagnose the issue.
__________________
Steve
- Programming Mentor
- Team 578 Alumni
#3
14-01-2017, 15:27
graytonio
Registered User
FRC #1523
Join Date: Apr 2016
Location: Jupiter, FL
Posts: 3
Re: NI Vision Code Example Not Working

Code:
package $package;

import java.lang.Math;
import java.util.Comparator;
import java.util.Vector;

import com.ni.vision.NIVision;
import com.ni.vision.NIVision.Image;
import com.ni.vision.NIVision.ImageType;

import edu.wpi.first.wpilibj.CameraServer;
import edu.wpi.first.wpilibj.SampleRobot;
import edu.wpi.first.wpilibj.Timer;
import edu.wpi.first.wpilibj.smartdashboard.SmartDashboard;

/**
 * Example of finding yellow totes based on color.
 * This example uses an image file, which you need to copy to the roboRIO.
 * To use a camera you will have to integrate the appropriate camera details with this example.
 * To use a USB camera instead, see the SimpleVision and AdvancedVision examples for details
 * on using the USB camera. To use an Axis Camera, see the AxisCamera example for details on
 * using an Axis Camera.
 *
 * Sample images can be found here: http://wp.wpi.edu/wpilib/2015/01/16/sample-images-for-vision-projects/
 */
public class Robot extends SampleRobot {
        //A structure to hold measurements of a particle
        public class ParticleReport implements Comparator<ParticleReport>, Comparable<ParticleReport>{
            double PercentAreaToImageArea;
            double Area;
            double ConvexHullArea;
            double BoundingRectLeft;
            double BoundingRectTop;
            double BoundingRectRight;
            double BoundingRectBottom;

            public int compareTo(ParticleReport r)
            {
                return (int)(r.Area - this.Area);
            }

            public int compare(ParticleReport r1, ParticleReport r2)
            {
                return (int)(r1.Area - r2.Area);
            }
        };

        //Structure to represent the scores for the various tests used for target identification
        public class Scores {
            double Trapezoid;
            double LongAspect;
            double ShortAspect;
            double AreaToConvexHullArea;
        };

        //Images
        Image frame;
        Image binaryFrame;
        int imaqError;

        //Constants
        NIVision.Range TOTE_HUE_RANGE = new NIVision.Range(24, 49);    //Default hue range for yellow tote
        NIVision.Range TOTE_SAT_RANGE = new NIVision.Range(67, 255);    //Default saturation range for yellow tote
        NIVision.Range TOTE_VAL_RANGE = new NIVision.Range(49, 255);    //Default value range for yellow tote
        double AREA_MINIMUM = 0.5; //Default Area minimum for particle as a percentage of total image area
        double LONG_RATIO = 2.22; //Tote long side = 26.9 / Tote height = 12.1 = 2.22
        double SHORT_RATIO = 1.4; //Tote short side = 16.9 / Tote height = 12.1 = 1.4
        double SCORE_MIN = 75.0;  //Minimum score to be considered a tote
        double VIEW_ANGLE = 49.4; //View angle of camera, set to Axis m1011 by default, 64 for m1013, 51.7 for 206, 52 for HD3000 square, 60 for HD3000 640x480
        NIVision.ParticleFilterCriteria2 criteria[] = new NIVision.ParticleFilterCriteria2[1];
        NIVision.ParticleFilterOptions2 filterOptions = new NIVision.ParticleFilterOptions2(0,0,1,1);
        Scores scores = new Scores();

        public void robotInit() {
            // create images
            frame = NIVision.imaqCreateImage(ImageType.IMAGE_RGB, 0);
            binaryFrame = NIVision.imaqCreateImage(ImageType.IMAGE_U8, 0);
            criteria[0] = new NIVision.ParticleFilterCriteria2(NIVision.MeasurementType.MT_AREA_BY_IMAGE_AREA, AREA_MINIMUM, 100.0, 0, 0);

            //Put default values to SmartDashboard so fields will appear
            SmartDashboard.putNumber("Tote hue min", TOTE_HUE_RANGE.minValue);
            SmartDashboard.putNumber("Tote hue max", TOTE_HUE_RANGE.maxValue);
            SmartDashboard.putNumber("Tote sat min", TOTE_SAT_RANGE.minValue);
            SmartDashboard.putNumber("Tote sat max", TOTE_SAT_RANGE.maxValue);
            SmartDashboard.putNumber("Tote val min", TOTE_VAL_RANGE.minValue);
            SmartDashboard.putNumber("Tote val max", TOTE_VAL_RANGE.maxValue);
            SmartDashboard.putNumber("Area min %", AREA_MINIMUM);
        }

        public void autonomous() {
            while (isAutonomous() && isEnabled())
            {
                //read file in from disk. For this example to run you need to copy image20.jpg from the SampleImages folder to the
                //directory shown below using FTP or SFTP: http://wpilib.screenstepslive.com/s/4485/m/24166/l/282299-roborio-ftp
                NIVision.imaqReadFile(frame, "/home/lvuser/SampleImages/image20.jpg");

                //Update threshold values from SmartDashboard. For performance reasons it is recommended to remove this after calibration is finished.
                TOTE_HUE_RANGE.minValue = (int)SmartDashboard.getNumber("Tote hue min", TOTE_HUE_RANGE.minValue);
                TOTE_HUE_RANGE.maxValue = (int)SmartDashboard.getNumber("Tote hue max", TOTE_HUE_RANGE.maxValue);
                TOTE_SAT_RANGE.minValue = (int)SmartDashboard.getNumber("Tote sat min", TOTE_SAT_RANGE.minValue);
                TOTE_SAT_RANGE.maxValue = (int)SmartDashboard.getNumber("Tote sat max", TOTE_SAT_RANGE.maxValue);
                TOTE_VAL_RANGE.minValue = (int)SmartDashboard.getNumber("Tote val min", TOTE_VAL_RANGE.minValue);
                TOTE_VAL_RANGE.maxValue = (int)SmartDashboard.getNumber("Tote val max", TOTE_VAL_RANGE.maxValue);

                //Threshold the image looking for yellow (tote color)
                NIVision.imaqColorThreshold(binaryFrame, frame, 255, NIVision.ColorMode.HSV, TOTE_HUE_RANGE, TOTE_SAT_RANGE, TOTE_VAL_RANGE);

                //Send particle count to dashboard
                int numParticles = NIVision.imaqCountParticles(binaryFrame, 1);
                SmartDashboard.putNumber("Masked particles", numParticles);

                //Send masked image to dashboard to assist in tweaking mask.
                CameraServer.getInstance().setImage(binaryFrame);

                //filter out small particles
                float areaMin = (float)SmartDashboard.getNumber("Area min %", AREA_MINIMUM);
                criteria[0].lower = areaMin;
                imaqError = NIVision.imaqParticleFilter4(binaryFrame, binaryFrame, criteria, filterOptions, null);

                //Send particle count after filtering to dashboard
                numParticles = NIVision.imaqCountParticles(binaryFrame, 1);
                SmartDashboard.putNumber("Filtered particles", numParticles);

                if(numParticles > 0)
                {
                    //Measure particles and sort by particle size
                    Vector<ParticleReport> particles = new Vector<ParticleReport>();
                    for(int particleIndex = 0; particleIndex < numParticles; particleIndex++)
                    {
                        ParticleReport par = new ParticleReport();
                        par.PercentAreaToImageArea = NIVision.imaqMeasureParticle(binaryFrame, particleIndex, 0, NIVision.MeasurementType.MT_AREA_BY_IMAGE_AREA);
                        par.Area = NIVision.imaqMeasureParticle(binaryFrame, particleIndex, 0, NIVision.MeasurementType.MT_AREA);
                        par.ConvexHullArea = NIVision.imaqMeasureParticle(binaryFrame, particleIndex, 0, NIVision.MeasurementType.MT_CONVEX_HULL_AREA);
                        par.BoundingRectTop = NIVision.imaqMeasureParticle(binaryFrame, particleIndex, 0, NIVision.MeasurementType.MT_BOUNDING_RECT_TOP);
                        par.BoundingRectLeft = NIVision.imaqMeasureParticle(binaryFrame, particleIndex, 0, NIVision.MeasurementType.MT_BOUNDING_RECT_LEFT);
                        par.BoundingRectBottom = NIVision.imaqMeasureParticle(binaryFrame, particleIndex, 0, NIVision.MeasurementType.MT_BOUNDING_RECT_BOTTOM);
                        par.BoundingRectRight = NIVision.imaqMeasureParticle(binaryFrame, particleIndex, 0, NIVision.MeasurementType.MT_BOUNDING_RECT_RIGHT);
                        particles.add(par);
                    }
                    particles.sort(null);

                    //This example only scores the largest particle. Extending to score all particles and choosing the desired one is left as an exercise
                    //for the reader. Note that the long and short side scores expect a single tote and will not work for a stack of 2 or more totes.
                    //Modification of the code to accommodate 2 or more stacked totes is left as an exercise for the reader.
                    scores.Trapezoid = TrapezoidScore(particles.elementAt(0));
                    SmartDashboard.putNumber("Trapezoid", scores.Trapezoid);
                    scores.LongAspect = LongSideScore(particles.elementAt(0));
                    SmartDashboard.putNumber("Long Aspect", scores.LongAspect);
                    scores.ShortAspect = ShortSideScore(particles.elementAt(0));
                    SmartDashboard.putNumber("Short Aspect", scores.ShortAspect);
                    scores.AreaToConvexHullArea = ConvexHullAreaScore(particles.elementAt(0));
                    SmartDashboard.putNumber("Convex Hull Area", scores.AreaToConvexHullArea);
                    boolean isTote = scores.Trapezoid > SCORE_MIN && (scores.LongAspect > SCORE_MIN || scores.ShortAspect > SCORE_MIN) && scores.AreaToConvexHullArea > SCORE_MIN;
                    boolean isLong = scores.LongAspect > scores.ShortAspect;

                    //Send distance and tote status to dashboard. The bounding rect, particularly the horizontal center (left - right) may be useful for rotating/driving towards a tote
                    SmartDashboard.putBoolean("IsTote", isTote);
                    SmartDashboard.putNumber("Distance", computeDistance(binaryFrame, particles.elementAt(0), isLong));
                } else {
                    SmartDashboard.putBoolean("IsTote", false);
                }

                Timer.delay(0.005);                // wait for a motor update time
            }
        }

        public void operatorControl() {
            while(isOperatorControl() && isEnabled()) {
                Timer.delay(0.005);                // wait for a motor update time
            }
        }

        //Comparator function for sorting particles. Returns true if particle 1 is larger
        static boolean CompareParticleSizes(ParticleReport particle1, ParticleReport particle2)
        {
            //we want descending sort order
            return particle1.PercentAreaToImageArea > particle2.PercentAreaToImageArea;
        }

        /**
         * Converts a ratio with ideal value of 1 to a score. The resulting function is piecewise
         * linear going from (0,0) to (1,100) to (2,0) and is 0 for all inputs outside the range 0-2
         */
        double ratioToScore(double ratio)
        {
            return (Math.max(0, Math.min(100*(1-Math.abs(1-ratio)), 100)));
        }

        /**
         * Method to score convex hull area. This scores how "complete" the particle is. Particles with large holes will score worse than a filled in shape
         */
        double ConvexHullAreaScore(ParticleReport report)
        {
            return ratioToScore((report.Area/report.ConvexHullArea)*1.18);
        }

        /**
         * Method to score if the particle appears to be a trapezoid. Compares the convex hull (filled in) area to the area of the bounding box.
         * The expectation is that the convex hull area is about 95.4% of the bounding box area for an ideal tote.
         */
        double TrapezoidScore(ParticleReport report)
        {
            return ratioToScore(report.ConvexHullArea/((report.BoundingRectRight-report.BoundingRectLeft)*(report.BoundingRectBottom-report.BoundingRectTop)*.954));
        }

        /**
         * Method to score if the aspect ratio of the particle appears to match the long side of a tote.
         */
        double LongSideScore(ParticleReport report)
        {
            return ratioToScore(((report.BoundingRectRight-report.BoundingRectLeft)/(report.BoundingRectBottom-report.BoundingRectTop))/LONG_RATIO);
        }

        /**
         * Method to score if the aspect ratio of the particle appears to match the short side of a tote.
         */
        double ShortSideScore(ParticleReport report){
            return ratioToScore(((report.BoundingRectRight-report.BoundingRectLeft)/(report.BoundingRectBottom-report.BoundingRectTop))/SHORT_RATIO);
        }

        /**
         * Computes the estimated distance to a target using the width of the particle in the image. For more information and graphics
         * showing the math behind this approach see the Vision Processing section of the ScreenStepsLive documentation.
         *
         * @param image The image to use for measuring the particle estimated rectangle
         * @param report The Particle Analysis Report for the particle
         * @param isLong Boolean indicating if the target is believed to be the long side of a tote
         * @return The estimated distance to the target in feet.
         */
        double computeDistance (Image image, ParticleReport report, boolean isLong) {
            double normalizedWidth, targetWidth;
            NIVision.GetImageSizeResult size;

            size = NIVision.imaqGetImageSize(image);
            normalizedWidth = 2*(report.BoundingRectRight - report.BoundingRectLeft)/size.width;
            targetWidth = isLong ? 26.0 : 16.9;

            return  targetWidth/(normalizedWidth*12*Math.tan(VIEW_ANGLE*Math.PI/(180*2)));
        }
}
This is the example that came with the NI Vision library, but the line
Code:
CameraServer.getInstance().setImage(binaryFrame);
gives an error saying that the setImage() method does not exist.
#4
14-01-2017, 17:48
Thad House
Volunteer, WPILib Contributor
no team (Waiting for 2021)
Team Role: Mentor
Join Date: Feb 2011
Rookie Year: 2010
Location: Thousand Oaks, California
Posts: 1,106
Re: NI Vision Code Example Not Working

Quote:
Originally Posted by graytonio
This is the example that came with the NI Vision library, but the line
Code:
CameraServer.getInstance().setImage(binaryFrame);
gives an error saying that the setImage() method does not exist.
Remove the

import edu.wpi.first.wpilibj.CameraServer;

line, and replace that with

import edu.wpi.first.wpilibj.vision.CameraServer;
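
With that one change, the import block at the top of the example should look something like this (everything else stays the same):
Code:
import java.lang.Math;
import java.util.Comparator;
import java.util.Vector;

import com.ni.vision.NIVision;
import com.ni.vision.NIVision.Image;
import com.ni.vision.NIVision.ImageType;

// NIVision-based CameraServer, which still provides setImage()
import edu.wpi.first.wpilibj.vision.CameraServer;
import edu.wpi.first.wpilibj.SampleRobot;
import edu.wpi.first.wpilibj.Timer;
import edu.wpi.first.wpilibj.smartdashboard.SmartDashboard;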

I'll fix the examples. The issue is that the example is picking up the new CameraServer class, which no longer has a setImage() method. However, we do recommend switching to the new CameraServer and OpenCV, as the LabVIEW and SmartDashboard camera viewers will not connect to the NIVision camera server by default.
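
For anyone going that route, here is a rough sketch of the equivalent processing loop using the 2017 CameraServer/cscore classes and OpenCV instead of NIVision. It streams a USB camera, thresholds for yellow in HSV, and publishes the binary mask to the dashboard; the "Binary" stream name, the resolution, and the HSV bounds below are illustrative placeholders, not calibrated values from the example above.
Code:
import edu.wpi.cscore.CvSink;
import edu.wpi.cscore.CvSource;
import edu.wpi.cscore.UsbCamera;
import edu.wpi.first.wpilibj.CameraServer;
import edu.wpi.first.wpilibj.IterativeRobot;

import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.Scalar;
import org.opencv.imgproc.Imgproc;

public class Robot extends IterativeRobot {
    @Override
    public void robotInit() {
        // Run vision on its own thread so it does not block the main robot loop
        new Thread(() -> {
            // Start streaming the USB camera plugged into the roboRIO
            UsbCamera camera = CameraServer.getInstance().startAutomaticCapture();
            camera.setResolution(320, 240);

            // Sink to grab frames from the camera, source to publish processed frames
            CvSink cvSink = CameraServer.getInstance().getVideo();
            CvSource maskStream = CameraServer.getInstance().putVideo("Binary", 320, 240);

            Mat frame = new Mat();
            Mat hsv = new Mat();
            Mat mask = new Mat();

            while (!Thread.interrupted()) {
                // grabFrame returns 0 on timeout/error; skip that iteration
                if (cvSink.grabFrame(frame) == 0) {
                    continue;
                }

                // Threshold for yellow in HSV (bounds are placeholders, tune on real images)
                Imgproc.cvtColor(frame, hsv, Imgproc.COLOR_BGR2HSV);
                Core.inRange(hsv, new Scalar(20, 60, 50), new Scalar(35, 255, 255), mask);

                // Equivalent of the old CameraServer setImage(binaryFrame) call
                maskStream.putFrame(mask);
            }
        }).start();
    }
}
A dashboard that connects to the new CameraServer should then be able to select the "Binary" stream alongside the raw camera feed, so you can tune the threshold the same way the NIVision example intended.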
__________________
All statements made are my own and not the feelings of any of my affiliated teams.
Teams 1510 and 2898 - Student 2010-2012
Team 4488 - Mentor 2013-2016
Co-developer of RobotDotNet, a .NET port of the WPILib.
#5
18-01-2017, 13:16
graytonio
Registered User
FRC #1523
Join Date: Apr 2016
Location: Jupiter, FL
Posts: 3
Re: NI Vision Code Example Not Working

Thank you so much