Hey,
Does anyone have recommendations, tips and more about programming the Jetson with Labview?
Any help would be appreciated.
Well, I know Team 900 has made a guide on how to use the Jetson TX1. Perhaps you can check that out and see if they’ve included LabVIEW, but it’s doubtful.
https://team900.org/labs
I’m not aware of anyone running LabVIEW on the Jetson itself. We do use LabVIEW on the RoboRIO, and it communicates with the Jetson, which runs code written in C++, Python, and so on. See the linked white papers for more info, and please ask questions if we’ve made them hard to understand.
Sadly you can’t program the Jetson itself using LabVIEW, but you can get the Jetson talking to a robot programmed in LabVIEW.
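For what it’s worth, one simple way to do that (a sketch, assuming a plain UDP link; the address, port, and message format here are made up for illustration, not an FRC standard) is to have the Jetson push its vision results to the robot as small datagrams:

```python
# Sketch: Jetson-side sender pushing vision results to a LabVIEW robot
# program over UDP. The address, port, and message format are arbitrary
# examples for illustration.
import socket

ROBORIO_ADDR = ("10.9.0.2", 5800)  # hypothetical RoboRIO address and port

def encode_target(angle_deg, distance_in):
    """Encode a vision result as a small ASCII datagram payload."""
    return "angle={:.2f},dist={:.1f}".format(angle_deg, distance_in)

def send_target(sock, addr, angle_deg, distance_in):
    """Send one encoded vision result to the robot; returns the payload."""
    payload = encode_target(angle_deg, distance_in)
    sock.sendto(payload.encode("ascii"), addr)
    return payload
```

On the LabVIEW side, the built-in UDP Open and UDP Read functions can receive those datagrams in a loop and parse the fields out of the string.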
We’re happy to help with specific questions. Just email: [email protected]
Also, we wrote this guide that might help you get started: The Incomplete Guide to Using the TX1 for FRC - Google Docs
Beat me to it by like a second…
Thanks for the help, guys, I appreciate it.
I have to search my archived links, but I found a LabVIEW enthusiast community project that enabled programming of a BeagleBone Black or a Raspberry Pi with LabVIEW. Both are ARM-based like the Jetson (and RoboRIO).
We program our robot in LabVIEW, so this looked like a potentially attractive alternative. I had fully intended to look into whether that could be used on the Jetson last fall. We shifted to just using Python with nVidia-optimized OpenCV libraries on our Jetson TK1, because we knew we could get that working.
Overall, Python + OpenCV worked for us, but there were a lot of things that lacked sufficient (correct) documentation. The nVidia libraries were fast, but limited and old. Keep in mind this was for the TK1, which meant we had OpenCV 2.4.9, requiring Python 2.7. The TX1 libraries are more current and allow Python 3. The Python version wasn’t the issue: the almost completely changed API for the Python bindings between OpenCV 2.4 and 3.0, along with nVidia’s selected subset of supported functionality, made it very difficult to find documentation of what would work as expected.

We learned a lot about what wouldn’t work, and several times contemplated compiling a newer version of OpenCV to use. We kept the nVidia libraries out of fear that our own build might not be fast enough without the special tweaks that nVidia uses. They know their chips far better than we do, and I am convinced that some of the limitations we chafed against were largely responsible for the speedy performance (we had 40 fps processing @ 640x480, and streamed MJPEG at 25 fps @ 320x240). I should mention that Python + OpenCV only uses the ARM CPU, not the GPU. The TK1 CPU is about 3x faster than the RoboRIO’s, and has hardware floating point.
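As one concrete example of that 2.4-to-3.0 API churn: the return value of cv2.findContours changed from (contours, hierarchy) in OpenCV 2.4 to (image, contours, hierarchy) in 3.x. A common workaround is a tiny shim that grabs the contour list regardless of version. This sketch (the helper name is mine, and it deliberately avoids importing cv2 so the idea stands on its own) shows the trick:

```python
# Sketch of a version-tolerant unwrapping helper for cv2.findContours,
# whose return value changed shape between OpenCV 2.4 (contours, hierarchy)
# and 3.x (image, contours, hierarchy). The helper name is hypothetical.

def contours_from(find_contours_result):
    """Return the contour list from either the 2-tuple or the 3-tuple
    form of cv2.findContours' return value."""
    # In both shapes, the contour list is the second-to-last element.
    return find_contours_result[-2]
```

In real code it would be called as contours = contours_from(cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)), which then runs unchanged on either OpenCV version.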
We will be moving up to the TX1/2 next year!
I found my link!
The LINX 3.0 project is a mostly open-source effort to bring the LabVIEW runtime to the Raspberry Pi and BeagleBone Black.
This may lead someday to being able to program the Jetsons with LabVIEW.
The problem there would be porting from 32-bit ARM to 64-bit. I’m not sure how big a hurdle that would be, but they are different ISAs (AArch32 vs. AArch64).
The older Jetson TK1 is 32-bit ARM (with hardware floating point). That’s what my team used this year for vision.
You are correct that the TX1/2 are 64-bit ARM, but that might not be a deal breaker. LabVIEW uses LLVM to compile code, and LLVM can produce 64-bit ARM. I haven’t poked around enough to see whether LabVIEW can be configured to let LLVM generate 64-bit ARM code.