Chief Delphi

Chief Delphi (http://www.chiefdelphi.com/forums/index.php)
-   C/C++ (http://www.chiefdelphi.com/forums/forumdisplay.php?f=183)
-   -   Vision tracking and related questions (http://www.chiefdelphi.com/forums/showthread.php?t=141644)

FRC2501 01-13-2016 04:27 PM

Vision tracking and related questions
 
I am wondering what vision tracking options there are and the relative difficulties of each.

Our team programs in C++, and we've never done any vision tracking before.

Then onto our other questions:
1. Can you use multiple cameras on a single robot?

Fauge7 01-13-2016 06:05 PM

Re: Vision tracking and related questions
 
Vision tracking options: GRIP (good GUI, easy to use, still in alpha) and OpenCV (well documented, more complicated, and you have to develop your own algorithm).

You can have more than one camera on the robot; the problem you will run into is the allowed bandwidth limit of 7 Mbps.

viggy96 01-15-2016 06:05 AM

Quote:

Originally Posted by FRC2501 (Post 1522963)
I am wondering what vision tracking options there are and the relative difficulties of each.

Our team programs in C++, and we've never done any vision tracking before.

Then onto our other questions:
1. Can you use multiple cameras on a single robot?

If you want to process multiple camera streams, it would be best to use an onboard co-processor, like a Raspberry Pi or an NVIDIA Jetson TK1. That way you can guarantee resources and not have to worry about the FMS bandwidth limit.

Also, I like OpenCV. GRIP is good too, and it generates OpenCV code when used, but I don't know if it can handle multiple cameras.

I was thinking of doing stereo vision myself...

dubiousSwain 01-15-2016 07:24 AM

Re: Vision tracking and related questions
 
Quote:

Originally Posted by Fauge7 (Post 1523020)
Vision tracking options: GRIP (good GUI, easy to use, still in alpha) and OpenCV (well documented, more complicated, and you have to develop your own algorithm).

You can have more than one camera on the robot; the problem you will run into is the allowed bandwidth limit of 7 Mbps.

Our team is using RoboRealm and LabVIEW, both of which are good options. We are dropping RoboRealm for the sake of simplifying the stack, but it's a good program for learning the concepts.

jfitz0807 01-15-2016 11:27 PM

Re: Vision tracking and related questions
 
We too program in C++. We have some new programming students this year and are very interested in vision tracking.

Since we're just starting out, would you recommend RoboRealm or GRIP to play with the images?

We think that "all we need to do" is find the target in an image and report how far left or right of it we are. Ideally, we could tell how far away from the goal we are too.

Do you think it's feasible to run the vision processing on the roboRIO, or do you think we need a separate platform?

Our first choice would be to run it on the driver station rather than a co-processor, for simplicity. Unfortunately, we think we'll need a second camera, since our ball pick-up is opposite our shooter.

I've been thinking about vision processing for years now, but this is the first time we have enough programming students to make it feasible. It seems that with the cRIO, off-board vision processing was a must. Has anyone done any throughput studies to see just how much computing resource a typical vision processing algorithm takes on the roboRIO?

Thanks.

kmckay 01-21-2016 10:40 PM

Re: Vision tracking and related questions
 
Do you have to use GRIP or OpenCV? The ScreenSteps documentation seems to suggest you can just code it with WPILib.
I know if I can get the target coordinates, I can work out the algorithm.
Also, the sample with the 2016 libraries uses SimpleRobot; does anyone have a command-based example?

pnitin 01-24-2016 10:57 PM

Re: Vision tracking and related questions
 
Look here for tracking the Stronghold goal:
http://www.mindsensors.com/blog/how-...our-frc-robot-

FRC2501 02-08-2016 05:33 PM

Re: Vision tracking and related questions
 
Thanks for all the suggestions! Our team has started to play around with GRIP, but we are wondering whether it's best to run it from the driver station laptop (i5-4210U, integrated graphics, 6 GB RAM), from a Raspberry Pi 2, from the roboRIO, or to buy some sort of co-processor like the Kangaroo.

We plan to use a USB camera (Logitech) plugged into whatever we end up using for vision processing, which would then publish the results to the roboRIO.

Anyone have suggestions?

I would also like some examples/tutorials on how to read the contours report from C++ robot code.

MaikeruKonare 02-08-2016 07:47 PM

Re: Vision tracking and related questions
 
Your main software options are GRIP and RoboRealm. GRIP feels unfinished and doesn't yet function well, as it is a work in progress. I greatly prefer RoboRealm.

I got RoboRealm vision processing working this week, and I can now adjust the robot's position to within 15 pixels of the center of the vision target.

This is my RoboRealm pipeline:

The Axis Camera module connects to an IP camera. RoboRealm can only use a USB camera if RoboRealm is running on the same device (i.e., to use a USB camera on the robot you would need a Windows machine on the robot, such as a Kangaroo).

The Adaptive Threshold module grayscales the image and filters it so that only intensities of ~190-210 pass, which is about the intensity of the reflective tape when an LED ring is shone on it.

Convex Hull fills in the target's U shape and makes it a rectangle.

My Blob Filter removes every blob that has made it this far except the largest one. (If you want multiple targets to come through, filter blobs by area instead of keeping only the largest.)

Center of Gravity gives the X coordinate of the center of the target in pixels.

Network Tables publishes the center of gravity information and image dimensions to the network tables, in order to be read in by your program.

The following is my C++ code that works with the above RoboRealm pipeline; look mainly at the Test() function:
Code:

/*
 * Robot.cpp
 *
 *  Created on: Feb 7, 2016
 *      Author: Michael Conard
 */

#include "WPILib.h"
#include "Shooter.h"
#include "Constants.h"

class Robot: public SampleRobot
{
private:
        RobotDrive* mRobot;
        BuiltInAccelerometer* mAcc; //RoboRio accelerometer

        Joystick* mStick;                //Xbox controller

        CANTalon* mFrontLeft;
        CANTalon* mFrontRight;
        CANTalon* mRearLeft;
        CANTalon* mRearRight;

        Shooter* mShooter; //Shooter class object

        std::shared_ptr<NetworkTable> roboRealm; //Network table object, communicate with RoboRealm
public:
        Robot();                                //Constructor
        ~Robot();                                //Destructor
        void OperatorControl();
        void Autonomous();
        void Test();
        void Disabled();
};

Robot::Robot():roboRealm(NetworkTable::GetTable("RoboRealm")) //Construct network object within the scope of Robot
{
        printf("File %18s Date %s Time %s Object %p\n",__FILE__,__DATE__, __TIME__, this);

        mStick = new Joystick(XBOX::DRIVER_PORT);
        mAcc = new BuiltInAccelerometer();
        mShooter = new Shooter(VIRTUAL_PORT::SHOOTER_LEFT_WHEEL, VIRTUAL_PORT::SHOOTER_RIGHT_WHEEL,
                        VIRTUAL_PORT::SHOOTER_ANGLE, PWM_PORT::PUSH_SERVO_PORT, mStick);

        mFrontLeft = new CANTalon(VIRTUAL_PORT::LEFT_FRONT_DRIVE);
        mFrontRight = new CANTalon(VIRTUAL_PORT::RIGHT_FRONT_DRIVE);
        mRearLeft = new CANTalon(VIRTUAL_PORT::LEFT_REAR_DRIVE);
        mRearRight = new CANTalon(VIRTUAL_PORT::RIGHT_REAR_DRIVE);

        mRobot = new RobotDrive(mFrontLeft, mRearLeft, mFrontRight, mRearRight);
        mRobot->SetSafetyEnabled(false);
}

Robot::~Robot()
{
        delete mRobot;
        delete mAcc;
        delete mStick;
        delete mFrontRight;
        delete mFrontLeft;
        delete mRearRight;
        delete mRearLeft;
        delete mShooter;
}

void Robot::Disabled()
{
        printf("\nDisabled\n");
}

void Robot::OperatorControl() //standard driving and shooter control
{
        while(IsOperatorControl() && IsEnabled())
        {
                //printf("X Acc: %f, Y Acc: %f. Z Acc. %f\n", mAcc->GetX(), mAcc->GetY(), mAcc->GetZ());

                mRobot->ArcadeDrive(-mStick->GetRawAxis(XBOX::LEFT_Y_AXIS), -mStick->GetRawAxis(XBOX::LEFT_X_AXIS)); //standard drive

                if(mStick->GetRawAxis(XBOX::LEFT_TRIGGER_AXIS) >= 0.5) //spin wheels to suck ball in
                {
                        mShooter->ActivateMotors(-.8);
                }
                else if(mStick->GetRawAxis(XBOX::RIGHT_TRIGGER_AXIS) >= 0.5) //activate wheels, wait for full speed, shoot ball
                {
                        mShooter->ShootBall();
                }
                else
                {
                        mShooter->DisableMotors();
                }

                Wait(0.02);
        }
}

void Robot::Test() //tests aligning with vision target
{
        double imageWidth = roboRealm->GetNumber("IMAGE_WIDTH", 320); //get image width
        double xPosition;
        double distFromCenter;
        while(IsTest() && IsEnabled())
        {
                xPosition = roboRealm->GetNumber("COG_X", -1.0); //get center of gravity of vision target
                distFromCenter = imageWidth/2.0 - xPosition; //positive means object too far right, negative means too far left
                printf("xPosition: %f, distFromCenter: %f\n", xPosition, distFromCenter);

                if(xPosition != -1) //if set to default value, network table not found
                {
                        if(distFromCenter > 15) //vision target too far right, spin right
                                mRobot->ArcadeDrive(0.0,0.40);
                        else if(distFromCenter < -15) //vision target too far left, spin left
                                mRobot->ArcadeDrive(0.0,-0.40);
                        else
                        {
                                mRobot->ArcadeDrive(0.0,0.0); //stop, centered within 15 pixels
                                printf("Centered!  "); //lines up with xPosition print above
                        }
                }
                else
                {
                        printf("Network table error!!!\n");
                }

                Wait(0.02); //brief delay, matching the other mode loops
        }
}

void Robot::Autonomous() //aligns with vision target then shoots
{
        double imageWidth = roboRealm->GetNumber("IMAGE_WIDTH", 320); //get image width
        double xPosition;
        double distFromCenter;
        while(IsAutonomous() && IsEnabled())
        {
                xPosition = roboRealm->GetNumber("COG_X", -1.0); //get center of gravity of vision target
                distFromCenter = imageWidth/2.0 - xPosition; //positive means object too far right, negative means too far left

                if(xPosition != -1) //if set to default value, network table not found
                {
                        if(distFromCenter > 15) //vision target too far right, spin right
                                mRobot->ArcadeDrive(0.0,0.2);
                        else if(distFromCenter < -15) //vision target too far left, spin left
                                mRobot->ArcadeDrive(0.0,-0.2);
                        else
                        {
                                Wait(0.5);
                                mShooter->ShootBall(); //centered within 15 pixels, shoot ball
                        }
                }
                else
                {
                        printf("Network table error!!!\n");
                }

                Wait(0.02);
        }
}

START_ROBOT_CLASS(Robot)

Let me know if you need any clarifications or additional information.

FRC2501 02-09-2016 05:13 PM

Re: Vision tracking and related questions
 
Quote:

Originally Posted by MaikeruKonare (Post 1537246)
Your main software options are GRIP and RoboRealm. GRIP feels unfinished and doesn't yet function well, as it is a work in progress. I greatly prefer RoboRealm.

Sadly, our school blocks access to RoboRealm on the basis of "Website contains prohibited Forums content", so I have been "forced" into using GRIP, which, while it seems unfinished, seems good enough for our usage at the moment.

MaikeruKonare 02-09-2016 06:15 PM

Re: Vision tracking and related questions
 
Quote:

Originally Posted by FRC2501 (Post 1537687)
Sadly, our school blocks access to RoboRealm on the basis of "Website contains prohibited Forums content", so I have been "forced" into using GRIP, which, while it seems unfinished, seems good enough for our usage at the moment.

You could try downloading it at home and bringing it in on a zip drive. You could also try running the RoboRealm website through Google Translate from French to English; the translated link brings up the site in a way that avoids the school's block.

