Moved to JAVA - help with vision tracking
Hey CD!
I tried to find good documentation or tutorials but failed, so I'm asking here. We recently decided to move from LabVIEW to Java for numerous reasons. We have already programmed our 2016 robot in Java to perform as we want in all but one area: image processing. We want to learn how to do two things: 1. Processing on the roboRIO itself, using Java libraries and a regular webcam (say, the kit one). 2. Using the NVIDIA Jetson TK1 board as a coprocessor. I saw the Zebracorns whitepaper, but I really don't know where to start there. I know we need to use Linux and OpenCV for that (Python?), but we are missing some basic guides on how to set it all up, get it communicating, etc. If anyone could direct us to some kind of material for both of these, we will know how to keep going. Thanks
Re: Moved to JAVA - help with vision tracking
I would advise against doing vision tracking with a USB webcam on the roboRIO, mainly because the software/hardware for the RIO's USB ports isn't very optimized; even having a camera plugged in significantly increases CPU usage. If you want to run on-robot vision, I'd suggest an IP camera instead, and it would likely be best to use it with GRIP. 1058 uses GRIP on the Driver Station to process our images for vision tracking.
There are some guides for using GRIP on the GitHub wiki as well as on ScreenSteps.
Re: Moved to JAVA - help with vision tracking
I wrote the code for my team's (2084) vision system last year, and we ran it in Java on a Jetson TK1, using the official OpenCV Java wrappers for most of our vision code. You can use any language supported by OpenCV without much performance difference, because the computationally expensive algorithms run as native code.
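Since the heavy lifting happens inside OpenCV's native code, the Java side is mostly glue: threshold the image, find contours, then filter the resulting bounding boxes. As an illustrative sketch (not 2084's actual code), here is the kind of pure-Java filtering you might run on the width/height boxes that OpenCV's `findContours`/`boundingRect` would give you. The class name, aspect ratio, and thresholds below are all made-up example values you would tune for your own camera and target:

```java
// Hypothetical filter for candidate vision targets, applied to the
// bounding boxes produced by OpenCV's findContours/boundingRect.
public class TargetFilter {
    // The 2016 goal's retroreflective tape is wider than it is tall;
    // 1.6 is an illustrative aspect ratio, not a measured value.
    static final double IDEAL_ASPECT = 1.6;
    static final double ASPECT_TOLERANCE = 0.5;
    static final double MIN_AREA_PX = 200.0; // reject tiny noise blobs

    /** Returns true if a bounding box plausibly matches the target. */
    public static boolean isCandidate(double widthPx, double heightPx) {
        double area = widthPx * heightPx;
        if (area < MIN_AREA_PX) {
            return false; // too small to be the goal
        }
        double aspect = widthPx / heightPx;
        return Math.abs(aspect - IDEAL_ASPECT) <= ASPECT_TOLERANCE;
    }
}
```

In practice you would run this over every contour in the frame and keep the best-scoring match, rather than the first one that passes.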
I did implement some parts of the algorithm in C, called from Java through JNI. This allowed us to use the OpenCV CUDA (GPU) libraries, which are not available in the Java wrapper. We used NetworkTables to communicate the distance and heading of the goal to the robot (we interfaced our navX directly with the Jetson). If you want to use the Jetson, it isn't that hard. It comes with Ubuntu preinstalled, and you can connect it to a monitor, mouse, and keyboard and use it like a normal computer if you want. You will want to get familiar with the command line and SSH, because this will make things much easier once you mount the board on the robot. We mounted ours in a 3D-printed case and powered it with a DC-DC converter from Pololu (I don't have a link at the moment).
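The distance and heading numbers themselves come from straightforward camera geometry, computed on the coprocessor before being published to the robot (e.g. over NetworkTables, as described above). Here is a sketch of that math; every constant is an example value you would replace with your own camera calibration, and the heading uses a simple linear-per-pixel approximation (atan of the offset over the focal length is more exact):

```java
// Illustrative camera-geometry math for a vision coprocessor. The computed
// heading and distance would then be published to the robot, e.g. via
// NetworkTables. All constants are example values needing calibration.
public class TargetMath {
    static final double IMAGE_WIDTH_PX = 640.0;
    static final double HORIZONTAL_FOV_DEG = 60.0; // example webcam FOV
    static final double TARGET_WIDTH_M = 0.5;      // real target width

    /** Heading offset in degrees from image center to the target centroid;
     *  positive means the target is right of center. Linear approximation. */
    public static double headingDegrees(double targetCenterX) {
        double offsetPx = targetCenterX - IMAGE_WIDTH_PX / 2.0;
        return offsetPx * (HORIZONTAL_FOV_DEG / IMAGE_WIDTH_PX);
    }

    /** Rough pinhole-model distance in meters from the target's pixel width. */
    public static double distanceMeters(double targetWidthPx) {
        // Focal length in pixels from the FOV: f = (W / 2) / tan(FOV / 2)
        double focalPx = (IMAGE_WIDTH_PX / 2.0)
                / Math.tan(Math.toRadians(HORIZONTAL_FOV_DEG / 2.0));
        return TARGET_WIDTH_M * focalPx / targetWidthPx;
    }
}
```

On the robot side you would read these values each loop and feed the heading into a turn controller.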
Re: Moved to JAVA - help with vision tracking
Thanks!
We'll start looking into it and post questions if we have any.