LyonVision - easily test and deploy C++ vision programs

Hi CD, today I want to present LyonVision to you. It consists of a custom Raspberry Pi image and a GitHub template repository. With the template repo, you can create a C++ vision program using OpenCV 3.4.7 and WPILib in very little time, then test it on your PC (Windows/Linux/macOS) and deploy it to your Raspberry Pi running the custom image.

Why another vision project?

This summer I started looking for a simple way to program, compile, test, and deploy C++ vision programs on a Raspberry Pi with the OpenCV and WPILib libraries. After extensive research, I found an interesting project from CJBuchel: 2020-Vision-Tracking-Template. However, with it you still had to configure the Pi over SSH with an internet connection, and it lacked some features I wanted. So I decided to create a new project based on CJBuchel's work: LyonVision.

LyonVision-Template

LyonVision-Template is a GitHub template repository. You can either clone it or create a new repository based on it to start programming.

  • WPILib: for the template to work, you just have to install WPILib with the installer, a compiler for your PC (MSVS, g++, clang), and Java (for Gradle)
  • VS Code integration: the template uses GradleRIO, so it benefits from the VS Code integration, and it has custom launch configurations that allow debugging on your PC
  • Deploy: with a single command you can deploy and launch the program on the Raspberry Pi
  • Data directory: similar to the deploy folder in a robot project, all the content of this folder is deployed to the Raspberry Pi and copied next to the executable on your PC
  • Google Test: as in a robot program, you can use the Google Test suite to write unit tests (see the sketch after this list)
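To give an idea of what such a unit test can look like, here is a minimal Google Test sketch (assuming the template's test runner provides main, as gtest_main does). The countTargets helper, its thresholds, and the image sizes are hypothetical and are not part of LyonVision-Template:

```cpp
// Hypothetical helper under test: counts contours larger than a minimum area
// in a binary mask. Shown only as an example of a testable vision function.
#include <gtest/gtest.h>
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <vector>

int countTargets(const cv::Mat& mask, double minArea) {
  cv::Mat work = mask.clone();  // keep the caller's mask untouched
  std::vector<std::vector<cv::Point>> contours;
  cv::findContours(work, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
  int count = 0;
  for (const auto& c : contours) {
    if (cv::contourArea(c) >= minArea) ++count;
  }
  return count;
}

TEST(VisionTest, CountsSingleTarget) {
  // Synthetic binary image with one white rectangle.
  cv::Mat mask = cv::Mat::zeros(120, 160, CV_8UC1);
  cv::rectangle(mask, {40, 40}, {80, 80}, cv::Scalar(255), cv::FILLED);
  EXPECT_EQ(countTargets(mask, 100.0), 1);
}

TEST(VisionTest, IgnoresTinyBlobs) {
  cv::Mat mask = cv::Mat::zeros(120, 160, CV_8UC1);
  cv::circle(mask, {20, 20}, 2, cv::Scalar(255), cv::FILLED);
  EXPECT_EQ(countTargets(mask, 100.0), 0);
}
```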

LyonVision-pi-gen

LyonVision-pi-gen is the repo used to build the LyonVision Raspberry Pi image. You can download the latest image here.

  • No need to log into the Raspberry Pi: just image an SD card and connect the Raspberry Pi to your PC via Ethernet
  • Automatic start: the latest deployed program is launched automatically when the Raspberry Pi boots
  • Enables the Picam
  • Works with LyonVision-Template for deploying to the Raspberry Pi

LyonVision-Library

LyonVision-Library is a set of classes that can be useful for developing a vision program. It is used as a Git submodule by LyonVision-Template. At the moment, it consists of two classes:

  • Diapo: reads, one by one, each of the pictures in a folder
  • MjpegStream: creates an MJPEG stream and informs the user of the stream URL (a usage sketch follows below)
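To illustrate how these two classes could be used together, here is a rough sketch written against plain OpenCV and WPILib's cscore library. The real Diapo/MjpegStream APIs may differ, and the image path, resolution, and port are placeholders:

```cpp
// Rough equivalent of the two classes in use: read the pictures of a folder
// one by one and serve them over an MJPEG stream, printing where to reach it.
#include <opencv2/core.hpp>
#include <opencv2/imgcodecs.hpp>
#include "cscore_oo.h"
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
  // "Diapo"-style: collect the images of a directory.
  std::vector<cv::String> files;
  cv::glob("data/images/*.png", files);  // hypothetical path inside the data directory

  // "MjpegStream"-style: create an MJPEG server and print the stream URL.
  cs::CvSource source{"diapo", cs::VideoMode::kMJPEG, 640, 480, 30};
  cs::MjpegServer server{"diapo-server", 1181};
  server.SetSource(source);
  std::printf("Stream available at http://<pi-address>:1181/stream.mjpg\n");

  for (const auto& path : files) {
    cv::Mat frame = cv::imread(path);
    if (frame.empty()) continue;
    source.PutFrame(frame);
    std::this_thread::sleep_for(std::chrono::seconds(1));  // one picture per second
  }
}
```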

LyonVision-Calibration

A project based on LyonVision-Template. It is a camera calibration program (the chessboard method) that can be used on a Raspberry Pi without a screen thanks to an MJPEG stream.
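The calibration itself is presumably built on OpenCV's standard chessboard routines. Here is a condensed sketch of that general technique, not LyonVision-Calibration's actual code; the board size, square size, camera index, and number of views are placeholders:

```cpp
// Standard OpenCV chessboard calibration, condensed.
#include <opencv2/calib3d.hpp>
#include <opencv2/imgproc.hpp>
#include <opencv2/videoio.hpp>
#include <cstdio>
#include <vector>

int main() {
  const cv::Size boardSize{9, 6};   // inner corners of the printed chessboard
  const float squareSize = 0.025f;  // square edge length in meters
  std::vector<std::vector<cv::Point3f>> objectPoints;
  std::vector<std::vector<cv::Point2f>> imagePoints;

  // Model of one board: the same 3D corner grid for every accepted view.
  std::vector<cv::Point3f> board;
  for (int y = 0; y < boardSize.height; ++y)
    for (int x = 0; x < boardSize.width; ++x)
      board.emplace_back(x * squareSize, y * squareSize, 0.0f);

  cv::VideoCapture cap{0};
  cv::Mat frame, gray;
  cv::Size imageSize;
  while (imagePoints.size() < 20 && cap.read(frame)) {  // collect ~20 good views
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
    imageSize = gray.size();
    std::vector<cv::Point2f> corners;
    if (cv::findChessboardCorners(gray, boardSize, corners)) {
      cv::cornerSubPix(gray, corners, {11, 11}, {-1, -1},
                       {cv::TermCriteria::EPS + cv::TermCriteria::COUNT, 30, 0.1});
      imagePoints.push_back(corners);
      objectPoints.push_back(board);
    }
    // On a headless Pi, the annotated frame would be pushed to the MJPEG
    // stream here instead of being shown in a window.
  }

  cv::Mat cameraMatrix, distCoeffs;
  std::vector<cv::Mat> rvecs, tvecs;
  double rms = cv::calibrateCamera(objectPoints, imagePoints, imageSize,
                                   cameraMatrix, distCoeffs, rvecs, tvecs);
  std::printf("RMS reprojection error: %f\n", rms);
}
```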

I would appreciate your feedback on this project. Please excuse me in advance if any parts of the project are written or documented in French. As a French speaker, I tried my best to write in English, but there may still be some text in French.

This sounds very similar to the FRCVision image for the Raspberry Pi (also see Kickoff Release of WPILib FRCVision Raspberry Pi image), with slight differences in terms of how the template programs are done and how programs are deployed. The CameraServer library provides camera streaming and publishing to NetworkTables.
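For reference, a minimal coprocessor program using that combination, in the 2020-era C++ API, might look like the sketch below; the team number, camera settings, and the published value are placeholders:

```cpp
// Grab frames from CameraServer, do trivial processing, publish a number to
// NetworkTables, and stream the processed image back out.
#include <cameraserver/CameraServer.h>
#include <networktables/NetworkTable.h>
#include <networktables/NetworkTableInstance.h>
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>

int main() {
  auto ntinst = nt::NetworkTableInstance::GetDefault();
  ntinst.StartClientTeam(1234);  // placeholder team number

  auto camera = frc::CameraServer::GetInstance()->StartAutomaticCapture();
  camera.SetResolution(320, 240);

  cs::CvSink sink = frc::CameraServer::GetInstance()->GetVideo();
  cs::CvSource output =
      frc::CameraServer::GetInstance()->PutVideo("Processed", 320, 240);

  auto table = ntinst.GetTable("vision");
  cv::Mat frame, gray, display;
  while (true) {
    if (sink.GrabFrame(frame) == 0) continue;  // timed out, try again
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
    table->GetEntry("frameMean").SetDouble(cv::mean(gray)[0]);
    cv::cvtColor(gray, display, cv::COLOR_GRAY2BGR);
    output.PutFrame(display);
  }
}
```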

It seems duplicative to have two versions of nearly the same thing; maybe the calibration program and deployment features could be contributed to FRCVision instead of keeping this as a completely separate project?

To me this project is very different from FRCVision. I would describe FRCVision as a Raspberry Pi image that contains the WPILib and OpenCV libraries plus useful features for vision processing (dashboard, read-only mode).

On the other hand, LyonVision is more focused on the PC side: it provides an easy way to develop, compile, test, and deploy your program. The LyonVision Raspberry Pi image is just there for a few minor things (a vision service to run the program at boot, useful packages, the Picam, etc.).

The problem I had with FRCVision is that you still had to compile and cross-compile your program yourself.

AFAIK the latest FRCVision image supports both running as a service and the RasPi Cam (I tested this myself earlier this week). The only thing I noticed in your code that differs from the official FRCVision tooling is the configuration in the template, which I think can be used with FRCVision with little to no change.

Don’t get me wrong, this looks pretty cool, but it seems to be pretty much a duplicate of FRCVision, and it might be worth more to contribute these configurations to FRCVision somehow instead.

How is the vision lib going?
For our team there were a bunch of finicky things we needed to do to get that system moving smoothly, but when it did it was great: easily build/run it locally and deploy to the Pi. However, after WPILib 2.2 and 3.2, Gradle was updated to a newer version and the toolchains for the Pi and Tinker Board became a huge issue to source. The Tinker Board especially, which was our main board in 2019; we gave up on it this year and used the Pi, but next season we’re thinking of going back to it. So I couldn’t really figure out a nice system that was easy to download and use without much prior co-processor or vision knowledge for building and deploying, and without me having to match the pace of WPI’s updates and changes to their toolchains and syntax.

I eventually folded and scrapped the toolchains and Gradle plugins idea, made vision its own entity not relying on WPI as much, and switched to local builds and runs using makefiles (GitHub - wml-frc/CJ-Vision at 2.0), then sending the code over the network and having it built on the co-processor. Essentially, Gradle’s only purpose is deploying the code to the board and running the makefile commands locally and on the co-processor. I started to prefer this over the template system I built. It has the same syntax as the template (buildVision, runVision, deploy, etc.).

But it is just using a makefile in the background to do all the work, and the best part is: no toolchains. In theory you could deploy it to anything Linux-based: Jetsons, Pis, Tinker Boards, and so on. Then I wrapped it up in a pre-threaded library, so all the functions normally used for FRC (colour filtering, erosion, video capture, and the lot) are threaded when called, to speed up the vision tracking. That slims down user programming to 20 lines in some cases. It’s not perfect, but it’s promising, and it just needs the basic installs. I’m still working on it now, getting it ready before the new season. But I’m interested to see whether you can get the toolchain version working more smoothly than I did, because the only real downside to the newer system I’m using is that it doesn’t build locally and send the binaries over; instead it sends the actual source files to be built on the board, which requires the co-processor to be set up with the same libraries as the machine used for programming it. It’s a large bootstrap install for the board; it takes almost an hour to set up a co-processor because of OpenCV.
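For concreteness, that pre-threaded pattern might look roughly like the sketch below. This is simplified and not CJ-Vision's actual API; a capture thread keeps the newest frame available while the processing loop applies colour filtering and erosion, and the names and thresholds are made up:

```cpp
// A capture thread always holds the most recent frame; the processing loop
// (HSV filter + erosion) reads it without blocking on the camera.
#include <opencv2/imgproc.hpp>
#include <opencv2/videoio.hpp>
#include <atomic>
#include <mutex>
#include <thread>

int main() {
  cv::VideoCapture cap{0};
  cv::Mat latest;
  std::mutex frameMutex;
  std::atomic<bool> running{true};

  // Capture thread: always overwrite with the most recent frame.
  std::thread capture([&] {
    cv::Mat frame;
    while (running && cap.read(frame)) {
      std::lock_guard<std::mutex> lock(frameMutex);
      frame.copyTo(latest);
    }
  });

  // Processing loop: colour filter then erode, like a library built-in would.
  cv::Mat frame, hsv, mask;
  const cv::Mat kernel = cv::getStructuringElement(cv::MORPH_RECT, {3, 3});
  for (int i = 0; i < 300; ++i) {  // bounded loop so the sketch terminates
    {
      std::lock_guard<std::mutex> lock(frameMutex);
      if (latest.empty()) continue;
      latest.copyTo(frame);
    }
    cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
    cv::inRange(hsv, cv::Scalar(40, 80, 80), cv::Scalar(80, 255, 255), mask);
    cv::erode(mask, mask, kernel);
    // ...contour finding / target selection would go here...
  }

  running = false;
  capture.join();
}
```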

I think the latest version of the template on GitHub is up to date with WPILib (my last commit was in April). You could try starting a new project with this version.
However, I have to admit that I have not maintained this project very well. Last season we chose to use Chameleon, so we didn’t really use LyonVision. Moreover, this year I will unfortunately no longer be involved in FRC at all, so don’t count on support from our team :frowning:
Thank you for your interest in the project!

Seems last year was hectic all around. Hopefully next season will be better. Sad to hear you’re not going to be involved with it. Good luck with any future projects.
