Tracking Multiple Targets

This year our team was finally able to track shapes with the vision tape. During the build season we were able to track one target perfectly, but when we added the other three targets the robot freaked out and wouldn't track right: it would track the two side targets, switching between them. We've looked at the code multiple times and couldn't figure out why it wasn't able to lock onto the center. Would someone be able to look at our code and lead us in the right direction on why we aren't able to track multiple targets at once? Your help is greatly appreciated. Thanks in advance.

This is part of our teleop mode: when you press a button it calls a method to find the targets and tells the drive team when it's ready to shoot.


else if (stickShooter.getRawButton(2)) {
                readyToShoot = "Not ready to shoot";
                
                //Initialize to shoot
                if (centered < 15) //Change #
                {
                    direction = cameraDirection(isBlueLED);
                }
                readyToShoot = "Ready to Shoot";
                printUserMessage(output, 1, 1, dsBlank);
                printUserMessage(output, 2, 1, dsBlank);
                printUserMessage(output, 3, 1, dsBlank);
                printUserMessage(output, 4, 1, dsBlank);
                printUserMessage(output, 5, 1, dsBlank);
                printUserMessage(output, 6, 1, dsBlank);
                printUserMessage(output, 1, 1, mode);
                printUserMessage(output, 2, 1, readyToShoot);
                output.updateLCD();

                /* Was using tankDrive, but it gave errors when we ran the code. Also tried myDrive, but the robot wouldn't move. We have the Watchdog. Might we need to disable it when it tracks?
                 * if (direction == 1) { // myDrive.drive(0.50, 1.0);
                 *
                 * turretJag.set(direction * .65); //BEFORE USING TURRET } else
                 * if (direction == -1) { myDrive.drive(0.50, -1.0); } else {
                 * //myDrive.drive(0.0, 0.0); turretJag.set(0); }
                 */

                turretJag.set(direction * .65); //BEFORE USING TURRET
                direction = 0;
            } else {
                direction = 0;
                turretJag.set(0.0);
            }

// Camera
    public int cameraDirection(boolean ledIsBlue) {
        //printAll();
        output.updateLCD();
        BinaryImage bImg;
        int lowestSquare = 0;
        int lowSquareIndex = 0;
        int cameraDirection = 0; //is it right
        System.out.println("It Worked!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!");
        try {
            ColorImage img = camera.getImage();
            //System.out.println("Got IMAGE!!!!");
            if (ledIsBlue) {
                bImg = img.thresholdHSL(126, 155, 0, 255, 150, 255);
                //BLUE
                System.out.println("IS BLUE");
            } else {
                bImg = img.thresholdHSL(85, 128, 0, 255, 150, 255);
                //GREEN
            }

            ParticleAnalysisReport[] reports = bImg.getOrderedParticleAnalysisReports(4); //Was 1

            if (reports.length > 0) {
                //System.out.println("REPORTS > 0");
                for (int i = 0; i < reports.length; i++) {
                    if (reports[i].imageHeight > minSquareHeight && reports[i].center_mass_y > lowestSquare) {
                        lowestSquare = reports[i].center_mass_y;
                        lowSquareIndex = i;
                        System.out.println("Target: " + i + "  " + reports[i].center_mass_x + "," + reports[i].center_mass_y);
                    }
                }
            }

            if (reports[lowSquareIndex].center_mass_x < 0.9 * (WIDTH / 2)) {
                //Was reports[lowSquareIndex]. instead of 0
                System.out.println("LEFT");
                centered = 0;
                direction = -1;
            } else if (reports[lowSquareIndex].center_mass_x > 1.1 * (WIDTH / 2)) {
                //.center_mass_y before
                //Was reports[lowSquareIndex]. instead of 0
                System.out.println("RIGHT");
                centered = 0;
                direction = 1;
            } else if ((reports[lowSquareIndex].center_mass_x > 0.9 * (WIDTH / 2)) && (reports[lowSquareIndex].center_mass_x < 1.1 * (WIDTH / 2))) {
                //Was reports[lowSquareIndex]. instead of 0
                System.out.println("BATMAN");
                centered += 1;
                direction = 0;
            } else {
                System.out.println("NO TARGET");
                centered = 0;
                direction = 0;
            }
            img.free();
            bImg.free();
            System.out.println("FREED IMAGES");
        } catch (AxisCameraException ex) {
            System.out.println("Axis Camera Exception");
        } catch (NIVisionException ex) {
            System.out.println("NI Vision Exception");
        }
        output.updateLCD();
        return direction;
    }


Well, first of all, that code doesn’t track multiple targets… It only tracks the lowest one. The most important part of the code for this topic is:

                for (int i = 0; i < reports.length; i++) {
                    if (reports[i].imageHeight > minSquareHeight && reports[i].center_mass_y > lowestSquare) {
                        lowestSquare = reports[i].center_mass_y;
                        lowSquareIndex = i;
                        System.out.println("Target: " + i + "  " + reports[i].center_mass_x + "," + reports[i].center_mass_y);
                    }
                }

Here you’re going through all the particles (aka visible targets) and selecting the lowest one (the one with the greatest y value, since images have an inverted y axis). You’re completely ignoring the rest.
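To make the selection concrete, the pick-the-lowest logic amounts to the following, stripped down to plain Java (the `Target` class and `pickLowest` method here are stand-ins for `ParticleAnalysisReport` and your loop, not WPILib code):

```java
// Stand-in for WPILib's ParticleAnalysisReport: only the fields used here.
class Target {
    final double centerX, centerY;
    Target(double x, double y) { centerX = x; centerY = y; }
}

class LowestTargetPicker {
    // Returns the index of the lowest target on screen. In image
    // coordinates, y grows downward, so "lowest" means greatest y.
    static int pickLowest(Target[] reports) {
        int lowIndex = 0;
        double lowestY = 0;
        for (int i = 0; i < reports.length; i++) {
            if (reports[i].centerY > lowestY) {
                lowestY = reports[i].centerY;
                lowIndex = i;
            }
        }
        return lowIndex;
    }
}
```

Whatever the other particles contain, only the one winning this comparison ever influences the direction output.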

On a slightly different topic, what are you trying to do with this?

reports[i].imageHeight > minSquareHeight

If you’re trying to create a minimum threshold for the particle’s height, this is not what it’s doing. ParticleAnalysisReport.imageHeight is the height of the entire image, not the particle. For the height of the particle, you want ParticleAnalysisReport.boundingRectHeight. Although, if you want to create a size threshold, you’re much better off using NIVision.particleFilter() (which is wrapped by BinaryImage.particleFilter()).

Yeah. We decided to track the bottom target, but when we added multiple targets the camera would track the left and right targets, not knowing which target to lock onto. With reports[i].imageHeight > minSquareHeight we were trying to find where the lowest target was. If I were to change reports[i].imageHeight to boundingRectHeight, would the camera be able to find the middle of the bottom target? Where would I get BinaryImage.particleFilter()? I don’t see it under the BinaryImage class. How does it work, and where would I put it in the code?
How would this work?


for (int i = 0; i < reports.length; i++) {
                    if (reports[i].boundingRectHeight > minSquareHeight && reports[i].center_mass_y > lowestSquare) {
                        lowestSquare = reports[i].center_mass_y;
                        lowSquareIndex = i;
                        System.out.println("Target: " + i + "  " + reports[i].center_mass_x + "," + reports[i].center_mass_y);
                    }
                }

Thanks again for your help.

Would it be better to change all of the reports[i].center_mass_y to boundingRectHeight?

To understand boundingRectHeight and why you want it instead of imageHeight, you have to first understand what the bounding box is.
Now, particles can be any shape whatsoever, but National Instruments’ IMAQ library (the library powering the FIRST image processing stuff) only worries about a particle’s bounding rect. The image looks something like this:

The red circle is your particle, the red box is your bounding box, and the black box is your image. imageHeight is the height of the black box, which is always the same because you’re getting it from the camera. That’s why you should care about boundingRectHeight instead; it’s the height of the red box.
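As a rough illustration in plain Java (no WPILib types; representing a particle as a list of pixel coordinates is just for this sketch), the bounding rect height is the particle's own vertical extent, while imageHeight would always be the fixed frame height:

```java
import java.util.List;

class BoundingRectDemo {
    // A particle here is just a collection of (x, y) pixel coordinates.
    // Its bounding rect height is the vertical span of the particle,
    // independent of how tall the image itself is.
    static int boundingRectHeight(List<int[]> particlePixels) {
        int minY = Integer.MAX_VALUE;
        int maxY = Integer.MIN_VALUE;
        for (int[] p : particlePixels) {
            minY = Math.min(minY, p[1]);
            maxY = Math.max(maxY, p[1]);
        }
        return maxY - minY + 1; // inclusive pixel span
    }
}
```

A particle spanning rows 100 through 140 of a 640x480 frame has a bounding rect height of 41, while imageHeight would report 480 no matter what the particle looks like.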

To use particleFilter, you need a CriteriaCollection. I don’t have the code we used on me, but it would go something like this for a height filter:

CriteriaCollection cc = new CriteriaCollection();
cc.addCriteria(NIVision.MeasurementType.IMAQ_MT_BOUNDING_RECT_HEIGHT,min,max,outsideRange);
BinaryImage filteredImage = image.particleFilter(cc);

min and max are the minimum and maximum values for the range (floats), and outsideRange should be true if you want particles outside that range to pass, false if you want particles inside the range to pass.
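Just to illustrate those semantics, here is a plain-Java mock of the behavior described above (this mirrors the description, not WPILib's actual implementation):

```java
import java.util.ArrayList;
import java.util.List;

class HeightFilterDemo {
    // Mimics the min/max/outsideRange criteria semantics:
    // outsideRange == false keeps particles whose height lies inside
    // [min, max]; outsideRange == true keeps those outside it.
    static List<Double> filterByHeight(List<Double> heights, double min, double max, boolean outsideRange) {
        List<Double> kept = new ArrayList<>();
        for (double h : heights) {
            boolean inside = (h >= min && h <= max);
            if (inside != outsideRange) {
                kept.add(h);
            }
        }
        return kept;
    }
}
```

With min = 40, max = 400, and outsideRange = false, a 10-pixel speck and a 500-pixel blob are dropped while a 60-pixel target survives.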

You say that you changed the code to work with multiple targets. Can you provide this code?

And no, changing center_mass_y to boundingRectHeight would not be helpful. Height is independent of location, and you care about location, so height should not be what you’re looking for.

Ok, that makes sense. However, the CriteriaCollection and particleFilter classes don’t show up. I’ve looked through the NIVision class along with the BinaryImage classes and can’t find them. I know that CriteriaCollection is part of the image package, but when I try to type it in it doesn’t show up. I have the most recent API for this year, but it doesn’t show up. I did see that the NIVision class has many IMAQ methods. Would I use one of them instead of the CriteriaCollection or particleFilter class?

The code that was able to track multiple targets is the same code as below but without the for loop. Whenever we added a target it would track the most recent target or the closest one (the one it saw first).

I was looking around and I did find the CriteriaCollection and particleFilter.

Here is the code below. Do you think this will be able to track the targets? Do you think that I should rearrange anything to make it work better? Thanks again for your help.


    // Camera
    public int cameraDirection(boolean ledIsBlue) {
        //printAll();
        output.updateLCD();
        BinaryImage bImg;
        int lowestSquare = 0;
        int lowSquareIndex = 0;
        int cameraDirection = 0; //is it right
	cc = new CriteriaCollection();
	cc.addCriteria(NIVision.MeasurementType.IMAQ_MT_BOUNDING_RECT_WIDTH, 30, 400,false);
	cc.addCriteria(NIVision.MeasurementType.IMAQ_MT_BOUNDING_RECT_HEIGHT,40,400,false);
	
        //System.out.println("It Worked!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!");
        try {
            ColorImage img = camera.getImage();
	    
            //System.out.println("Got IMAGE!!!!");
            if (ledIsBlue) {
                bImg = img.thresholdHSL(126, 155, 0, 255, 150, 255);
                //BLUE
                System.out.println("IS BLUE");
            } else {
                bImg = img.thresholdHSL(85, 128, 0, 255, 150, 255);
                //GREEN
            }
	    
	    bImg.removeLargeObjects(false, 2);
	    bImg.convexHull(false);
	    BinaryImage filteredImage = bImg.particleFilter(cc);
	    
            ParticleAnalysisReport[] reports = bImg.getOrderedParticleAnalysisReports(4); //Was 1

            if (reports.length > 0) {
                //System.out.println("REPORTS > 0");
                for (int i = 0; i < reports.length; i++) {
                    if (reports[i].imageHeight > minSquareHeight && reports[i].center_mass_y > lowestSquare) {
                        lowestSquare = reports[i].center_mass_y;
                        lowSquareIndex = i;
                        //System.out.println("Target: " + i + "  " + reports[i].center_mass_x + "," + reports[i].center_mass_y);
                    }
                }
            }
	 
            if (reports[lowSquareIndex].center_mass_x < 0.9 * (WIDTH / 2)) {
                //Was reports[lowSquareIndex]. instead of 0
                System.out.println("LEFT");
                centered = 0;
                direction = -1;
            } else if (reports[lowSquareIndex].center_mass_x > 1.1 * (WIDTH / 2)) {
                //.center_mass_y before
                //Was reports[lowSquareIndex]. instead of 0
                System.out.println("RIGHT");
                centered  = 0;
                direction = 1;
            } else if ((reports[lowSquareIndex].center_mass_x > 0.9 * (WIDTH / 2)) && (reports[lowSquareIndex].center_mass_x < 1.1 * (WIDTH / 2))) {
                //Was reports[lowSquareIndex]. instead of 0
                System.out.println("CENTERED");
                centered += 1;
                direction = 0;
            } else {
                System.out.println("NO TARGET");
                centered = 0;
                direction = 0;
            }
	    
            img.free();
            bImg.free();
            //System.out.println("FREED IMAGES");
	    
        } catch (AxisCameraException ex) {
            System.out.println("Axis Camera Exception");
        } catch (NIVisionException ex) {
            System.out.println("NI Vision Exception");
        }
        //printAll();
        output.updateLCD();
	
        return direction;
    }


I see quite a bit wrong with that code… so I’ll break it down one piece at a time:

  • Remove large objects - You don’t catch its return (which is a BinaryImage), so you have a memory leak. Also, I’m a little confused about why you want to remove large objects. I’d assume that the targets would come out as fairly large objects, so you’re probably removing the very things you’re trying to track.
  • Convex hull - You don’t catch this return either, so more memory leaked. This operation, though, is worthwhile; if you’re going to do, say, a small object removal, you probably want the convex hull first.
  • filteredImage - You never use this (and never free it, so leak there).
  • Operations are cumulative, but provide you with new images - You repeatedly perform operations on bImg. None of these operations actually affect bImg, but instead return a new image, which contains the transformed bImg. To do, say, a convex hull, without allocating a new BinaryImage, you would have to do NIVision.convexHull(bImg.image,bImg.image,0);
  • You’re still only tracking the bottom target.

So in short, no, I don’t think it can track multiple targets. It’s a step in the right direction, but you definitely made some mistakes.

Ok, I think I fixed the code. The code will declare the binary images and such, then add values for the rectangles. After that it will try to get a color image from the camera, set the binary image to the image that the camera got, and look for the color specified. It will then fill in the boxes and remove the small boxes. After that it will filter the particles based on the criteria collection and then order the 4 biggest particles by size (I think smallest to biggest). Then, if reports is bigger than 0, it will go through and find the lowest square on the y axis (the bottom target). Then it’ll find the center of that target. Does that seem about right? If I were to take out the for loop, would it then be able to track all 4 targets, because I would then not be restricting it to just look at the bottom half of the image?

    
// Camera
    public int cameraDirection(boolean ledIsBlue) {
        //printAll();
        output.updateLCD();
        BinaryImage bImg;
        int lowestSquare = 0;
        int lowSquareIndex = 0;
        int cameraDirection = 0; //is it right
	cc = new CriteriaCollection();
	cc.addCriteria(NIVision.MeasurementType.IMAQ_MT_BOUNDING_RECT_WIDTH, 30, 400,false);
	cc.addCriteria(NIVision.MeasurementType.IMAQ_MT_BOUNDING_RECT_HEIGHT,40,400,false);
	
        //System.out.println("It Worked!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!");
        try {
            ColorImage img = camera.getImage();
	    
            //System.out.println("Got IMAGE!!!!");
            if (ledIsBlue) {
                bImg = img.thresholdHSL(126, 155, 0, 255, 150, 255);
                //BLUE
                System.out.println("IS BLUE");
            } else {
                bImg = img.thresholdHSL(85, 128, 0, 255, 150, 255);
                //GREEN
            }
	    
	    NIVision.convexHull(bImg.image, bImg.image, 0);
	    bImg.removeSmallObjects(false, 2);
	   
	    BinaryImage filteredImage = bImg.particleFilter(cc);
	    
            ParticleAnalysisReport[] reports = filteredImage.getOrderedParticleAnalysisReports(4); //Was 1

            if (reports.length > 0) {
                //System.out.println("REPORTS > 0");
                for (int i = 0; i < reports.length; i++) {
                    if (reports[i].imageHeight > minSquareHeight && reports[i].center_mass_y > lowestSquare) {
                        lowestSquare = reports[i].center_mass_y;
                        lowSquareIndex = i;
                        //System.out.println("Target: " + i + "  " + reports[i].center_mass_x + "," + reports[i].center_mass_y);
                    }
                }
            }
	 
            if (reports[lowSquareIndex].center_mass_x < 0.9 * (WIDTH / 2)) {
                //Was reports[lowSquareIndex]. instead of 0
                System.out.println("LEFT");
                centered = 0;
                direction = -1;
            } else if (reports[lowSquareIndex].center_mass_x > 1.1 * (WIDTH / 2)) {
                //.center_mass_y before
                //Was reports[lowSquareIndex]. instead of 0
                System.out.println("RIGHT");
                centered  = 0;
                direction = 1;
            } else if ((reports[lowSquareIndex].center_mass_x > 0.9 * (WIDTH / 2)) && (reports[lowSquareIndex].center_mass_x < 1.1 * (WIDTH / 2))) {
                //Was reports[lowSquareIndex]. instead of 0
                System.out.println("CENTERED");
                centered += 1;
                direction = 0;
            } else {
                System.out.println("NO TARGET");
                centered = 0;
                direction = 0;
            }
	    
            img.free();
            bImg.free();
            //System.out.println("FREED IMAGES");
	    
        } catch (AxisCameraException ex) {
            System.out.println("Axis Camera Exception");
        } catch (NIVisionException ex) {
            System.out.println("NI Vision Exception");
        }
        //printAll();
        output.updateLCD();
	
        return direction;
    }


You still have not caught the memory leak created by removeSmallObjects(), or freed filteredImage. Also, be careful about using NIVision, because BinaryImage hides certain functionality from you, and using NIVision along with BinaryImage can mess up particle counting. Until you have a working product and you know it’s too slow, I’d recommend against using NIVision’s methods directly.

Removing the for loop would not allow multiple target tracking on its own. However, if you make it go through each particle, and output the data for all of them, then it would be tracking multiple targets.
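A sketch of what "go through each particle" might look like, using stand-in types rather than WPILib's (the 10% dead band mirrors the thresholds in your code, and `Report` is a hypothetical stand-in for `ParticleAnalysisReport`):

```java
class MultiTargetDemo {
    // Stand-in for ParticleAnalysisReport with only the fields used here.
    static class Report {
        final double centerX, centerY;
        Report(double x, double y) { centerX = x; centerY = y; }
    }

    // Instead of keeping only the lowest particle, compute a direction
    // (-1 left, 0 centered, 1 right) for every particle, using the same
    // 10% dead band around the image center as the original code.
    static int[] directionsForAll(Report[] reports, int width) {
        int[] dirs = new int[reports.length];
        for (int i = 0; i < reports.length; i++) {
            if (reports[i].centerX < 0.9 * (width / 2.0)) {
                dirs[i] = -1;       // target is left of center
            } else if (reports[i].centerX > 1.1 * (width / 2.0)) {
                dirs[i] = 1;        // target is right of center
            } else {
                dirs[i] = 0;        // target is centered
            }
        }
        return dirs;
    }
}
```

The point is that every particle produces an output, so no target is silently discarded; what you then do with the per-target results is up to your turret logic.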

Sorry, but I don’t understand how I haven’t caught the memory leak created by removeSmallObjects(), or freed filteredImage. Isn’t freeing the images what releases the memory for more pictures?

Here is updated code. I changed the way I used NIVision methods. I believe I freed all the images that I needed to. I haven’t changed it to look at all the particles yet.


   // Camera
    public int cameraDirection(boolean ledIsBlue) {
        //printAll();
        output.updateLCD();
        BinaryImage bImgThreshold;
        int lowestSquare = 0;
        int lowSquareIndex = 0;
        int cameraDirection = 0; //is it right
	
	cc = new CriteriaCollection();
	cc.addCriteria(NIVision.MeasurementType.IMAQ_MT_BOUNDING_RECT_WIDTH, 30, 400,false);
	cc.addCriteria(NIVision.MeasurementType.IMAQ_MT_BOUNDING_RECT_HEIGHT,40,400,false);
	
        System.out.println("It Worked!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!");
        try {
            ColorImage img = camera.getImage();
	    
            //System.out.println("Got IMAGE!!!!");
            if (ledIsBlue) {
                bImgThreshold = img.thresholdHSL(126, 155, 0, 255, 150, 255);
                //BLUE
                System.out.println("IS BLUE");
            } else {
                bImgThreshold = img.thresholdHSL(85, 128, 0, 255, 150, 255);
                //GREEN
            }
	    BinaryImage bigObjectsImage = bImgThreshold.removeSmallObjects(false, 2); //Remove small artifacts
	    BinaryImage convexHullImage = bigObjectsImage.convexHull(false); //Fill in occluded rectangles
	    BinaryImage filteredImage = convexHullImage.particleFilter(cc); //Find filled in rectangles
	    
            ParticleAnalysisReport[] reports = filteredImage.getOrderedParticleAnalysisReports(); //Gets a list of results

            if (reports.length > 0) {
                //System.out.println("REPORTS > 0");
                for (int i = 0; i < reports.length; i++) {
                    if (reports[i].imageHeight > minSquareHeight && reports[i].center_mass_y > lowestSquare) {
                        lowestSquare = reports[i].center_mass_y;
                        lowSquareIndex = i;
                        //System.out.println("Target: " + i + "  " + reports[i].center_mass_x + "," + reports[i].center_mass_y);
                    }
                }
            }
	 
            if (reports[lowSquareIndex].center_mass_x < 0.9 * (WIDTH / 2)) {
                System.out.println("LEFT");
                centered = 0;
                direction = -1;
            } else if (reports[lowSquareIndex].center_mass_x > 1.1 * (WIDTH / 2)) {
                System.out.println("RIGHT");
                centered = 0;
                direction = 1;
            } else if ((reports[lowSquareIndex].center_mass_x > 0.9 * (WIDTH / 2)) && (reports[lowSquareIndex].center_mass_x < 1.1 * (WIDTH / 2))) {
                //Was reports[lowSquareIndex]. instead of 0
                System.out.println("CENTERED");
                centered += 1;
                direction = 0;
            } else {
                System.out.println("NO TARGET");
                centered = 0;
                direction = 0;
            }
	    
            img.free();
            bImgThreshold.free();
	    convexHullImage.free();
	    bigObjectsImage.free();
	    filteredImage.free();
            //System.out.println("FREED IMAGES");
	    
        } catch (AxisCameraException ex) {
            System.out.println("Axis Camera Exception");
        } catch (NIVisionException ex) {
            System.out.println("NI Vision Exception");
        }
        //printAll();
        output.updateLCD();
	
        return direction;
    }


It looks like it should work :wink: If you have any other issues, just ask!

Ok thanks for all your help. Greatly appreciated. :slight_smile:

We tested the code to see if the robot would track, and it froze and we got a "robot drive not updated often enough" error. We had the same code as a few posts ago. Do you think it could be caught in an if statement somewhere?

The operations performed in image processing are expensive, and become even more so when allocating 300 KB (640x480 at 1 byte per pixel; a BinaryImage). Is it actually outputting processed data? If so, it may just be that the processing is taking too much time. There are a few ways to optimize the image processing, and you can do such things as setting up the processing to run in a separate thread and putting the drive code in continuous to make sure the robot drive gets updated often enough.
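A minimal sketch of the threading idea (plain Java; `computeDirection` is a placeholder for your `cameraDirection` call, and the 100 ms period is an arbitrary example):

```java
import java.util.concurrent.atomic.AtomicInteger;

class VisionThreadDemo {
    // Latest direction computed by the vision thread. The drive loop only
    // reads this value, so it never blocks on image processing.
    // AtomicInteger gives safe cross-thread visibility.
    static final AtomicInteger latestDirection = new AtomicInteger(0);

    static void startVisionThread() {
        Thread t = new Thread(() -> {
            while (!Thread.currentThread().isInterrupted()) {
                // In the real robot, this is where cameraDirection()
                // would run; here we just store a placeholder value.
                latestDirection.set(computeDirection());
                try {
                    Thread.sleep(100); // pace the loop; tune as needed
                } catch (InterruptedException e) {
                    return;
                }
            }
        });
        t.setDaemon(true); // don't keep the JVM alive for this thread
        t.start();
    }

    static int computeDirection() {
        return 1; // placeholder for real image processing
    }
}
```

The drive code then just reads `latestDirection.get()` each iteration and stays fast, no matter how long a single frame takes to process.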

But before any of this, there’s one thing I’d like to see: the output, in its entirety. If an exception happened, the type of exception and the call stack can provide a lot of information. If there’s no exception, it just means you need to find some creative ways to optimize.

I didn’t copy the output, but I don’t think there were any exceptions. What cRIO did you guys use this year? We used the cRIO I.

We found that similar code took about a half second to process an image, which is too long; the robot gets cranky. So we do the image processing in a separate thread so the robot can keep processing other things while the image is being manipulated.