paper: Using JeVois camera in FRC

Did you use the hardware RS232 port on the RIO, or the USB serial device? We quickly attempted USB serial last night (kUSB), but were getting errors about it not being available. I ran out of time to properly debug, but was curious if anyone else had started down this path yet.

Be very careful. The RS232 port on the roboRIO is not compatible with the TTL hardware serial port on the JeVois without a TTL-to-RS232 converter.
Use either USB or the TTL pins on the MXP port.
Honestly, USB is the easiest method.

We just used USB serial - if you’re getting an “in use” exception I’m wondering if you are already using USB elsewhere in the code (like for a NavX for example). If you have NI MAX installed on the laptop, try using that as a first step to confirm the JeVois is connected and communicating (the JeVois should show up as an additional NI VISA device; select it and open the VISA Test Panel and you can try pinging the JeVois just like you would from a serial terminal). I can post screenshots tonight if helpful…

Also in case it helps here’s some very simple Java code using IterativeRobot that should ping the JeVois and display the response (“ALIVE” and then “OK”) to the console when you enable in teleop. This is a quick cut & paste from our github repo, so apologies in advance if any typos slipped through :eek:

public class Vision extends IterativeRobot {
	static final int BAUD_RATE = 115200;
	SerialPort visionPort = null;
	int loopCount = 0;

	public void robotInit() {
		try {
			System.out.print("Creating JeVois SerialPort...");
			visionPort = new SerialPort(BAUD_RATE, SerialPort.Port.kUSB);
			System.out.println("SUCCESS!");
		} catch (Exception e) {
			System.out.println("FAILED!!  Fix and then restart code...");
			e.printStackTrace();
		}
	}

	public void teleopInit() {
		if (visionPort == null) return;
		System.out.println("pinging JeVois");
		String cmd = "ping\n";
		int bytes = visionPort.writeString(cmd);
		System.out.println("wrote " + bytes + "/" + cmd.length() + " bytes, cmd: " + cmd);
	}

	public void teleopPeriodic() {
		if (visionPort == null) return;
		if (visionPort.getBytesReceived() > 0) {
			System.out.println("Waited: " + loopCount + " loops, Rcv'd: " + visionPort.readString());
			loopCount = 0;
		} else {
			loopCount++;
		}
	}
}
Thanks! Our code was largely doing the same things. I think we were going to use an extra enableTermination() call to automatically handle the newline terminator, but that’s all I’m seeing different now.
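
For anyone curious, WPILib’s SerialPort.enableTermination() makes readString() return complete, newline-terminated responses instead of partial chunks. A pure-Java sketch of that buffering idea (the class and method names here are illustrative, not WPILib’s):

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of newline-terminated buffering, similar in spirit to
// what SerialPort.enableTermination('\n') gives you on the roboRIO: partial
// serial data is held until a complete line has arrived.
public class LineBuffer {
    private final StringBuilder pending = new StringBuilder();

    // Feed raw serial characters in; get back only the complete lines.
    public List<String> feed(String chunk) {
        List<String> lines = new ArrayList<>();
        pending.append(chunk);
        int nl;
        while ((nl = pending.indexOf("\n")) >= 0) {
            lines.add(pending.substring(0, nl));
            pending.delete(0, nl + 1);
        }
        return lines;
    }
}
```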

I’ve used NI MAX for similar stuff in the past, so I’ll poke at it with that as well.

The only thought I had was that we have a thumb drive in the top USB port on the RIO, and the JeVois in the bottom port. From reading the WPILib docs, I’m wondering if we actually need to be using kUSB2? Even though there’s only one device that should present a USB serial port, there are technically two USB devices plugged in…

So our progress from tonight:

Your code was helpful! Not exactly sure what I did differently tonight, but ~30 test cycles later it seems to be connecting to serial without issue.

Next step is getting the MjpegServer to actually broadcast the image. Still not working at the moment. I am curious if the classes are having issues with the fact that the “resolution” passed to the JeVois is not the actual image size that comes back…

JeVois has just released a new image. It is supposed to address the issues with Windows 10!

I have confirmed it does work properly under Windows 10 for me.

I just uploaded the latest, and possibly final, version of the Whitepaper.
Updates included a link to the latest JeVois image, and details on how to receive the targeting data on the roboRIO.

Please feel free to leave feedback. If requests mandate additional information, or corrections, I will update the document.

If I’m understanding your comment above, are you passing serial commands from the roboRIO to the JeVois to initiate the video stream?

Have you tried just using the initscript.cfg to start the stream so that the roboRIO just has to pass along what it receives?
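
For reference, a minimal initscript.cfg for that approach might look like the following (the module name is illustrative; setmapping2 and streamon are standard JeVois commands, but the exact mapping should match your videomappings.cfg):

```
# initscript.cfg -- runs on the JeVois at boot, before any USB host connects.
# Select a processing module by its camera-side format (no USB stream needed):
setmapping2 YUYV 320 240 30.0 JeVois ObjectDetect
# Start grabbing/processing immediately so serial tracking data flows
# even if no host ever opens the video stream:
streamon
```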

We had tried that at one point - based on the response from the “help” command I believe we had selected the proper program. I was thinking initscript.cfg would be the easier way to go, but hadn’t confirmed yet.

Where we were getting stuck was that the MJPEG server & USB camera classes reported that they were connected. Doing a curl of the mjpg stream URL returns text that looks like an mjpg stream (with the boundaries and binary data in between). However, no web browser (Chrome, IE, Edge) seems to be able to display it.

We also, with the same software, plugged in a known-working USB camera, which produced a valid displayed image.

I guess what’s weird here is I think the JeVois is transmitting video data, but the stream isn’t appearing to be valid. We’ll do some more debugging tonight, and I’ll see if I can upload a snippet of the curl results.

Looking at the test code that you linked a few posts up, the issue may be that you need to set the pixel format for the UsbCamera…it defaults to MJPG rather than YUV. So unless you’ve setup a mode on the JeVois to output MJPG, the streamer is probably indeed sending a botched stream.

The following works, at least with the current 2018 beta test release and I think it should with 2017 as well:

visionCam = new UsbCamera("VisionProcCam", 0);
camServer = new MjpegServer("VisionCamServer", MJPG_STREAM_PORT);
camServer.setSource(visionCam);  // tie the camera to the server

Using the above you should see the JeVois DemoSaliency stream unless you also change the resolution (see below as to why that happens even if you’ve configured a different start-up mode in the JeVois)

We’ve found that the CameraServer seems to always make an initial connection to the JeVois at 640x300, so even if you set initscript.cfg on the JeVois to start a different mode, it will kick over to whatever mode is at 640x300 (DemoSaliency with the default JeVois mapping) when CameraServer initially connects. So for now in our testing we’re setting the pixel format, resolution, and fps via the .setVideoMode() method of UsbCamera to make it easier:

visionCam = new UsbCamera("VisionProcCam", 0);
visionCam.setVideoMode(PixelFormat.kYUYV, 320, 252, 30);  // start ObjectDetect
camServer = new MjpegServer("VisionCamServer", MJPG_STREAM_PORT);
camServer.setSource(visionCam);  // tie the camera to the server

Hope that helps!

  • Ron

oooooooo! This might just be the secret sauce we were missing! I don’t know if it’s worth posting or not, but I did get a curl of what the webcam was streaming. It was producing the usual --boundary markers with a bunch of binary data in between that no web browser (or VLC) could display. Something was definitely coming out, but I was scratching my head as to what it was. Tonight or tomorrow, we’ll be giving this a shot.

Also, yes, we noticed the initscript limitation. Most programs even seem to have a default which they force the camera into (much to the befuddlement of our programmers). We were able to debug with serial that we were at least starting our code properly, but as soon as anything connected it would swap to some different program. Amcap and VLC both did this, and it’s good to know UsbCamera will do the same.

Turns out that our videomappings.cfg on the JeVois had the DemoSaliency module mapping at 640x300 marked as the default. According to the JeVois docs it will announce that default to the USB host (i.e. the roboRIO), so CameraServer probably isn’t defaulting to 640x300 itself, but rather just connecting to whatever default that the JeVois advertises.

  • Ron

There are a couple of options here.
Edit the videomappings.cfg to point that resolution to your code, or set the default to your image processing module. Do this by moving the “*” to the mapping you want JeVois to announce as its default. Although, if you were reading the content of the videomappings.cfg file to figure out what was happening, I’m sure you already knew this.
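
As an example, the default marker in videomappings.cfg is just a trailing asterisk on one line (the mappings shown here are the ones mentioned earlier in this thread):

```
# videomappings.cfg excerpt -- the line ending in * is the mode the JeVois
# advertises to the USB host (e.g. the roboRIO) as its default
YUYV 640 300 60.0 YUYV 320 240 60.0 JeVois DemoSaliency
MJPG 320 240 15.0 YUYV 320 240 60.0 JeVois EagleTracker *
```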

Yep, exactly! We were curious why the initial CameraServer connection was always at 640x300 so we dug into it :slight_smile:

I’m hoping we’ll have time to develop a general “execute GRIP pipeline” module and set of instructions as we get ready for build, which may also make sense to add into the whitepaper…

  • Ron

If you provide the details, I will make room in the document for those additions.
Can you say “v1.10”?

Pushed code updates from tonight with some after-hours cleanup.

We can get MJPG streaming to run, and can read serial packets from custom code on the JeVois simultaneously.

RIO processor load seems to be super high. We also need to be sure we’re getting high framerates from the vision data without flooding the USB streaming setup with that same framerate.

Are you running the video stream and the tracking data both over USB?
It would be an interesting exercise to run the video over USB, and the tracking data over the TTL serial link.
With that configuration, it would be easy to verify which of the two data streams is pushing the RIO CPU the hardest. It would also be much easier to determine whether a higher or lower frame rate would be beneficial. For example: acquire your video at 60 fps but only send 15 fps over USB. That way you still get 60 frames per second of tracking data without saturating the USB link with the video stream.

Yup! It’s on the list. First pass was going to be to establish whether the YUV->MJPG conversion is the cause of most of the processor load (my strongest suspicion). After that, yah, playing with framerates, then possibly separating out the serial stream.

The mjpg stream is definitely a “nice-to-have” - I’d really like to have a nice streamlined single-camera system where the JeVois does double duty as driver assist and vision processor. Overall, I would hope the vision processing algorithm runs at its maximum possible rate, and we can tone down the video stream to something much slower. Then again, with the way the python architecture is set up, we’ll still need some investigation to figure out how to do this. To the docs this weekend :slight_smile:

How would it be the driver assist camera once you tune the contrast for the target? Just for aiming, or changing settings at different times?

It’s in another post - basically, send commands to the JeVois to swap between a nice-driver-autoexposure setting, and a vision process setting. When the driver hits the “auto align” button, the camera quickly flips to vision-process exposure and starts searching for targets.

This is as-of-yet untested. Not sure if we can execute the flip fast enough to be useful. But, I still feel it would be nifty if we could get it to work.
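
An untested sketch of what that flip could look like on the roboRIO side: two canned command strings written to the JeVois serial port. The setcam parameter names follow the JeVois docs, but the exact values (and whether autoexp 0/1 maps to auto/manual the way I’d expect) need verifying on real hardware:

```java
// Hypothetical command sets for flipping the JeVois between a driver-friendly
// view and a tuned-for-tracking view. Exposure values are guesses to be tuned;
// send the resulting string with visionPort.writeString(...) on the roboRIO.
public class CamModes {
    static String driverMode() {
        // auto exposure: a bright, human-friendly image
        return join("setcam autoexp 0");
    }

    static String visionMode() {
        // manual, very low exposure: only the lit retroreflective target stays bright
        return join("setcam autoexp 1", "setcam absexp 50");
    }

    private static String join(String... cmds) {
        StringBuilder sb = new StringBuilder();
        for (String c : cmds) sb.append(c).append('\n'); // JeVois commands are newline-terminated
        return sb.toString();
    }
}
```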

You can avoid much of the RIO’s CPU loading by having JeVois do the YUV to MJPG conversion.

The command to launch a script, or, as you suggested, to switch modes, contains all you need to acquire images in YUYV and then stream them over USB as MJPG.

For example:

MJPG 320 240 15.0 YUYV 320 240 60.0 JeVois EagleTracker

This command launches our tracking code using 60fps from the camera and streams it out USB in MJPG at 15 fps.
Now, as Allen mentioned, we will likely not use this mode. To optimize target identification, we tune the camera to the point that the image is basically useless for the driver.

Now, switching modes on the fly, that is where the command line reconfiguration of the JeVois will really make this camera shine!