S.W.A.T 1806 is proud to announce the first Lift Tracking software for the 2017 game, FIRST Steamworks. No longer will mentors and coaches have to yell at the programmers to get vision tracking working: it’s here. The software calculates both the distance to the target and the angle to the target. It will also run on different processing computers, like the Pi and Kangaroo, with relative ease. You can easily modify this software by using the included GRIP file and generating the code you like; no more messing with pesky HSV!
How to install:
Install opencv 3.X on whatever computer it’s running on from here
Download NetworkTables 3.0 (inside repo) and make sure it’s in the build path
Download the repo
Run GRIP with the included project file, tune your values to your liking, then export the code and overwrite everything in LiftTracker.java. BE SURE NOT TO DELETE LINES 303-310; KEEP THEM SOMEWHERE IN THAT CODE
Export the project as a runnable jar
Run it using command line or a batch file (or .sh file if you are on linux)
Before you run it, though, you need to calculate the distance constant. This is a pretty easy task and should take under 10 minutes. Choose 5 distances for the robot to sit at (12, 24, 48, 60, 72 in). Move your robot to each of these distances and record the variable lengthBetweenContours at each one. Multiply each distance by its lengthBetweenContours reading and write down the result. After you do that for all of the values, average everything; that average is the distance constant. There is a variable in the code named DISTANCE_CONSTANT so you can easily change it
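The calibration arithmetic above can be sketched in plain Java. The pixel readings below are made-up placeholders; you would substitute the lengthBetweenContours values you actually recorded:

```java
public class DistanceCalibration {
    public static void main(String[] args) {
        // Distances (inches) the robot was placed at, and the
        // lengthBetweenContours reading recorded at each one.
        // These pixel values are illustrative, not real measurements.
        double[] distances = {12, 24, 48, 60, 72};
        double[] lengthBetweenContours = {380.0, 190.0, 95.0, 76.0, 63.3};

        // DISTANCE_CONSTANT is the average of distance * pixel length.
        double sum = 0;
        for (int i = 0; i < distances.length; i++) {
            sum += distances[i] * lengthBetweenContours[i];
        }
        double distanceConstant = sum / distances.length;
        System.out.println("DISTANCE_CONSTANT = " + distanceConstant);

        // At runtime the distance estimate is then simply:
        // distance = DISTANCE_CONSTANT / lengthBetweenContours
    }
}
```

The idea is that distance and apparent target size are inversely proportional, so their product should be roughly constant; averaging over five positions smooths out measurement noise.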
Any chance I could get a little more guidance/advice for a complete noob?
I’m a very good programmer but have not yet worked in the environments used for robot programming.
Just learning the LabView stuff for robot control.
The instructions say to “install” opencv, but when I go and download it from the site I just get a folder of libraries. There’s nothing to install. So, are you really just saying “go download the libraries”?
Why do I have to download Network Tables 3.0 if it is already in the repo?
I did a git clone of the repo so am I good with Network Tables?
(Side note - I understand the concept of Network Tables but haven’t used them yet - I’ll be Googling that to try and figure it out.)
It says make sure it is in the “build path”. Build Path for what toolchain?
I downloaded and installed GRIP for Windows. I tried to open the .grip file and it crashed GRIP and wouldn’t open. Is that maybe because I don’t have the opencv files in the “correct” place, or the Network Tables file in the right spot?
Export the project as a runnable JAR. Using what tool?
If you respond, it doesn’t have to be in too much detail. Broad strokes are good; I can Google and follow up on any unresolved issues after I work on it some more.
Installing opencv is just downloading the library; it should come with a file named opencv_3.XX.jar or something like that. You put that file in your build path.
I included NetworkTables 3.0 in the repo so you can easily add it to the Eclipse build path. This lets you use the classes and methods from inside that jar in your project. You are good with the git clone once you get it into the build path.
I’m not really sure what is going on with that, I just installed GRIP on a fresh machine and opened up the GRIP file and it worked fine. Send me over a log file and I can check it out for you.
To export a jar, go to File -> Export -> Runnable JAR in Eclipse and it’ll be good to run
I hope this helps! PM me if you need any more help
Hello, we’re new to vision this year, and we came across your GitHub repository. We have used GRIP before to generate our code, and we have attempted to get the contours from the robot’s camera, but with no success.
We were wondering how we would incorporate your Process.java into our Robot.java class. How would we be able to run it without exporting it as a runnable jar file?
Awesome code. I’m looking to run this on a coprocessor (Jetson TX1). Would both liftTracker and processing.java run on the coprocessor, or liftTracker on the Jetson and processing.java on the roboRIO? In processing.java you set the IP for NetworkTables. Is this where you’re getting the network tables from, and would that IP need to change for the Jetson? Also, you’re pulling a table called LiftTracker, but in GRIP the network table published is still called myContours. And for setting up the camera stream (I am using a USB camera), I am assuming I would have to change the IP to the Jetson as well (probably would have to create an MJPEG server?) and then also try to send it to the Driver Station.
That is the IP where I’m putting all of the output values (angle, distance, etc). So, you would want to change 1806 over to your team number.
Also, you’re pulling a table called LiftTracker, but in GRIP the network table published is still called myContours.
LiftTracker is the table where I am *putting* all of my values, not getting them. You aren’t going to get any values from myContours from GRIP. You aren’t actually relying on the GRIP GUI; it’s the code generated from it, placed in LiftTracker.java
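As a rough sketch of the publishing side described above, using the 2017-era NetworkTables 3.0 Java API (the key names, placeholder values, and roboRIO hostname with team number 1806 are illustrative; the real values come from the vision pipeline):

```java
import edu.wpi.first.wpilibj.networktables.NetworkTable;

public class LiftTrackerPublisher {
    public static void main(String[] args) {
        // Run as a NetworkTables client pointed at the roboRIO;
        // swap 1806 for your own team number.
        NetworkTable.setClientMode();
        NetworkTable.setIPAddress("roborio-1806-frc.local");
        NetworkTable table = NetworkTable.getTable("LiftTracker");

        // The vision code *puts* its results into the table;
        // robot code reads them back with table.getNumber(...).
        double angle = 0.0;    // placeholder for the computed angle
        double distance = 0.0; // placeholder for the computed distance
        table.putNumber("angle", angle);
        table.putNumber("distance", distance);
    }
}
```

The robot-side code would then call `table.getNumber("angle", 0.0)` on the same table name to consume the values.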
And for setting up the camera stream (I am using a USB camera), I am assuming I would have to change the IP to the Jetson as well (probably would have to create an MJPEG server?) and then also try to send it to the Driver Station.
What I would do is replace the .open with the URL to:
This makes it so it’ll open the webcam plugged into the USB port. Then you might want to start an MJPG server. I’m not sure how well these work hand in hand, so you may have to experiment.
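A minimal sketch of opening a USB camera with OpenCV 3’s Java bindings instead of an IP stream (device index 0 and the class name are assumptions; your camera may enumerate at a different index):

```java
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.videoio.VideoCapture;

public class UsbCameraGrab {
    public static void main(String[] args) {
        // Load the native OpenCV library; the .dll/.so must be on
        // java.library.path for this to succeed.
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);

        // Index 0 is usually the first USB camera plugged in.
        VideoCapture capture = new VideoCapture();
        capture.open(0);
        if (!capture.isOpened()) {
            System.err.println("Could not open USB camera 0");
            return;
        }

        Mat frame = new Mat();
        while (capture.read(frame)) {
            // Hand each frame to the GRIP-generated pipeline here,
            // e.g. tracker.process(frame);
        }
        capture.release();
    }
}
```

Passing an integer to `open` selects a local device, while passing a URL string opens a network stream, which is the swap being described above.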
So I’m having a bit of trouble: probably because I did something wrong, but still trouble.
I’ve created a new project to test this with that essentially prints the angle and distance to the target to the driver station console so I know it works. This is where the problems start. getAngle() throws a null pointer exception at line 143, and I’m not sure why.
Any help would be very much appreciated! I have copied this down exactly, changed things to my team number, and implemented it in the main robot class without any obvious errors.
Looking through the commit history, it looks like this bug was fixed. Re-pull and then try it again.
The bug occurred when tracker.filterContoursOutput was empty, meaning the processing routines didn’t find any contours, so there was no angle to compute. An appropriate if statement was added to ensure a contour was found before computing the angle.
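The guard described above looks roughly like this. To keep the sketch self-contained, the contour list element type is simplified to Object (in the real code it is OpenCV’s MatOfPoint, returned by the GRIP-generated filterContoursOutput()), and the angle math is a placeholder:

```java
import java.util.ArrayList;
import java.util.List;

public class ContourGuard {
    // Stand-in for tracker.filterContoursOutput(); in the real code this
    // is the List<MatOfPoint> produced by the GRIP pipeline.
    static List<Object> filterContoursOutput = new ArrayList<>();

    static double getAngle() {
        // The fix: bail out before touching contours that don't exist,
        // instead of throwing a NullPointerException.
        if (filterContoursOutput.isEmpty()) {
            return 0.0; // sentinel: no target visible this frame
        }
        // ...compute the real angle from the contours here...
        return 1.0; // placeholder for the computed angle
    }

    public static void main(String[] args) {
        System.out.println(getAngle());
    }
}
```

The robot code that consumes getAngle() should treat the no-target case explicitly (skip the frame or hold the last good value) rather than assuming a contour is always present.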