How do people both stream a camera feed to their driver station and use a camera feed for vision processing on the TX1?
We tried taking the cv::Mat that we get from our camera (the same one we use for vision), encoding it as a .jpg on the CPU with cv::imencode, sending it over a socket to the driver station, and displaying it there.
This is much too slow, but OpenCV doesn't seem to offer JPEG encoding on the GPU. We're also worried that if we use OpenCV to grab frames for vision and a separate library/program to grab frames from the same camera for streaming, we'll run into problems (e.g., two processes contending for the same camera device).
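One direction worth trying is pushing frames into a GStreamer pipeline instead of calling cv::imencode, so the Jetson's hardware JPEG encoder does the work. A rough sketch, assuming OpenCV was built with GStreamer support and NVIDIA's L4T GStreamer plugins (the nvjpegenc element) are installed; element names and capabilities can differ between L4T releases, so treat this as a starting point, not a drop-in solution:

```cpp
// Sketch only: hand vision frames to a GStreamer pipeline that encodes
// JPEG in hardware (nvjpegenc) and overwrites a single file in /dev/shm
// that a streamer process can then serve.
#include <opencv2/opencv.hpp>

int main() {
    cv::VideoCapture cap(0);                 // same camera used for vision
    if (!cap.isOpened()) return 1;

    // appsrc receives BGR frames from OpenCV; videoconvert fixes up the
    // format; nvjpegenc encodes off the CPU; multifilesink rewrites one file.
    // The frame size passed here must match what the camera delivers.
    cv::VideoWriter writer(
        "appsrc ! videoconvert ! nvjpegenc ! "
        "multifilesink location=/dev/shm/frame.jpg",
        0,                                   // fourcc 0 => interpret as a GStreamer pipeline
        30.0,                                // nominal frame rate
        cv::Size(640, 480),
        true);
    if (!writer.isOpened()) return 1;

    cv::Mat frame;
    while (cap.read(frame)) {
        // run vision processing on `frame` here, then feed the same Mat
        // to the encoder pipeline -- no second capture of the camera needed
        writer.write(frame);
    }
    return 0;
}
```

Because vision and streaming share one cv::VideoCapture, this also sidesteps the two-programs-on-one-camera concern.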
Is the bottleneck the image encoding or the process of sending it over the socket? Writing JPEGs to /dev/shm and displaying them with mjpg-streamer does not pose much of a performance issue on our TK1.
Just to clarify: you encode them with OpenCV on the Jetson, write them to "disk" at /dev/shm/ on the Jetson, and then stream them with mjpg-streamer to the driver station?
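For anyone following along, the /dev/shm hand-off described above could look something like this. The filenames are illustrative; the write-then-rename step is there so a reader such as mjpg-streamer's input_file plugin never picks up a half-written JPEG (rename is atomic within a filesystem):

```cpp
// Sketch: encode a frame with OpenCV, write it to a temp file in the
// /dev/shm ramdisk, then rename() it into place so the streamer process
// only ever sees complete JPEGs.
#include <opencv2/opencv.hpp>
#include <cstdio>

void publish_frame(const cv::Mat& frame) {
    const char* tmp_path   = "/dev/shm/frame.jpg.tmp";
    const char* final_path = "/dev/shm/frame.jpg";
    if (cv::imwrite(tmp_path, frame)) {
        std::rename(tmp_path, final_path);  // atomic on the same filesystem
    }
}
```

mjpg-streamer would then be pointed at that directory with its input_file plugin (check the plugin's README for the exact flags on your version).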
We have been unable to compile the cscore library on the Jetson TX1. When we run "./gradlew :arm:build -PcompilerPrefix=", we get this error:
ubuntu@tegra-ubuntu:~/cscore-1.0.0$ ./gradlew :arm:build -PcompilerPrefix=
No .git was found in /home/ubuntu/cscore-1.0.0, or any parent directories of that directory.
No version number generated.
:outputVersions UP-TO-DATE
:arm:compileJava UP-TO-DATE
:arm:processResources UP-TO-DATE
:arm:classes UP-TO-DATE
:arm:jniHeadersCscore UP-TO-DATE
:downloadOpenCvHeaders UP-TO-DATE
:arm:unzipOpenCvHeaders UP-TO-DATE
:arm:downloadOpenCvJni_linux-arm UP-TO-DATE
:arm:unzipOpenCvJni_linux-arm UP-TO-DATE
:arm:downloadOpenCvNatives_linux-arm UP-TO-DATE
:arm:unzipOpenCvNatives_linux-arm UP-TO-DATE
:arm:downloadWpiUtil UP-TO-DATE
:arm:unzipWpiUtil UP-TO-DATE
:arm:compileCscoreSharedLibraryCscoreCpp
:arm:linkCscoreSharedLibrary/home/ubuntu/cscore-1.0.0/arm/build/wpiutil/Linux/arm/libwpiutil.a: error adding symbols: File in wrong format
collect2: error: ld returned 1 exit status
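"File in wrong format" from ld almost always means an architecture mismatch: with -PcompilerPrefix= left empty, the build likely used the TX1's native toolchain (aarch64), while the downloaded libwpiutil.a "linux-arm" artifact is presumably built for 32-bit ARM. A quick way to confirm (path taken from the log above; exact output wording varies by binutils version):

```shell
# readelf prints the target machine of each object inside the static
# archive; compare it against what the native compiler on the TX1 targets.
readelf -h /home/ubuntu/cscore-1.0.0/arm/build/wpiutil/Linux/arm/libwpiutil.a | grep -m1 'Machine'
gcc -dumpmachine   # a stock TX1 reports an aarch64 target here
```

If the archive reports ARM while gcc reports aarch64, you would need either a wpiutil build for aarch64 or a 32-bit ARM cross-toolchain matching the prebuilt artifacts.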