KauaiLabs 2018 Season Product Announcements

New for the FRC2018 season, Kauai Labs announces VMX:

Vision. Motion. eXtreme. For FIRST FRC & ROS.

VMX-pi transforms your Raspberry Pi 3 or Raspberry Pi Zero W into a reliable, real-time robotics controller or vision/motion processor with an integrated IMU and CAN-bus interface. VMX-pi plus a Raspberry Pi can perform both real-time robotic control and (via ROS) higher-level robot position tracking, drivetrain path planning, and kinematics-based control, remotely accessed via Ethernet, Wi-Fi, or Bluetooth. It goes on sale Christmas Day 2017 at a groundbreaking low price of $159.

For FIRST FRC teams, VMX-pi provides a Raspberry Pi power supply, exposes navX-technology IMU data and timestamps to both the RoboRIO (over USB) and the Raspberry Pi (via C++, Java, C#, and Python libraries), and enables rapid development of vision processing that is directly integrated with the IMU, motion processing, and timestamps, especially when using the VMX Robotics Toolkit (VMX-rtk). VMX-pi can also be used to monitor the robot CAN bus and to interface with external sensors.

Note to FIRST FRC robotics teams: under historical FRC rules, VMX-pi's digital output control functionality is not legal for driving actuators used in competition (e.g., servos, motors, relays, pneumatics), but it's great for building an off-season robot!

Robotics Toolkit for FRC and ROS.

The VMX Robotics Toolkit for Raspberry Pi is a suite of pre-built software tools and libraries for vision/motion processing and sensor fusion, including NetworkTables and OpenCV, as well as the Robot Operating System (Kinetic) and the Raspbian Linux operating system. VMX-rtk comes pre-installed on a high-quality 32 GB Samsung EVO Plus Class 10 micro-SD card, with extra room for storing videos on-board. Designed to save you time and frustration, VMX-rtk goes on sale for $29 at the Kauai Labs Store on January 6, 2018. Game-specific vision processing examples will follow after that.

And these existing products have software updates for the FRC2018 season:

Classic. Must-have. Featured on Championship Robots.

navX-MXP is a mature, must-have navigation sensor and I/O expander for a FIRST FRC RoboRIO-based robot control system. Since 2015, well over a thousand FRC teams have purchased navX-MXP for drive-train navigation, including Einstein champion robots in the FRC 2016 & 2017 World Championship Finals, and the 2017 Festival of Champions.

Combine data from multiple sensors. Now supports VMX-pi.

SF2 is an open-source software framework that makes navX-MXP, navX-Micro, and VMX-pi even more powerful, fusing multiple sensors together to help you build even better robots.

Available in LabVIEW, C++ and Java for the FIRST FRC RoboRIO, the current SF2 release enables Video Processing Latency Correction on FRC robots.
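To illustrate the idea behind latency correction (this is a conceptual sketch, not SF2's actual API): by the time a camera frame has been captured, transferred, and processed, the robot has moved on, so the vision result must be paired with the orientation the robot had at capture time. That amounts to keeping a short history of timestamped orientation samples and interpolating back to the frame's capture timestamp:

```python
import bisect

class OrientationHistory:
    """Buffer of (timestamp, yaw) samples used to look up the robot's
    heading at the moment a camera frame was actually captured."""

    def __init__(self, max_samples=200):
        self.timestamps = []
        self.yaws = []
        self.max_samples = max_samples

    def add(self, timestamp, yaw_degrees):
        self.timestamps.append(timestamp)
        self.yaws.append(yaw_degrees)
        if len(self.timestamps) > self.max_samples:
            self.timestamps.pop(0)
            self.yaws.pop(0)

    def yaw_at(self, timestamp):
        """Linearly interpolate the yaw at an arbitrary past timestamp."""
        i = bisect.bisect_left(self.timestamps, timestamp)
        if i == 0:
            return self.yaws[0]
        if i == len(self.timestamps):
            return self.yaws[-1]
        t0, t1 = self.timestamps[i - 1], self.timestamps[i]
        y0, y1 = self.yaws[i - 1], self.yaws[i]
        frac = (timestamp - t0) / (t1 - t0)
        return y0 + frac * (y1 - y0)

# A vision result computed from a frame captured at t = 110 ms can be
# corrected using the heading the robot had *then*, not now.
history = OrientationHistory()
for t in range(0, 200, 20):          # IMU samples every 20 ms
    history.add(t, t * 0.5)          # robot turning at 0.5 deg/ms
yaw_at_capture = history.yaw_at(110) # -> 55.0 degrees
```

The class name and sampling numbers here are purely illustrative; the point is that synchronized IMU timestamps are what make this lookup possible.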

SF2 was a project I wanted to contribute towards after speaking to a representative at St. Louis champs this year, but I found that the repo wasn’t very contributor friendly (no CONTRIBUTING.md, a single Development branch that’s a few commits behind the master branch, etc.). Do you think it’s possible in the near future to have further documentation for any hopeful open source contributors?

These products look really cool by the way, great work! We love the NavX MXP and have been using it for years now!

I’ve been excited for a lot of announcements but this one has had me in suspense for a while. I can’t wait to start playing with this.

Hi Dan, we’ve made the decision to take a step back and rework SF2 a bit because we saw great value in integrating with and reusing components from the Robot Operating System - this allows SF2 to extend beyond FRC, better leverage ROS developments, and to work better with VMX-pi. The tricky part is how to support both ROS and WPI Library in a way that is approachable, which forces some structural changes. Now that the VMX-pi platform is in place, our plan is to re-focus efforts on VMX-pi software components and SF2 in the coming year. Please feel free to contact me directly ([email protected]) and we can chat more about it.

Man, I can’t believe that I’m going to agree with Marshall, but he’s right! These look wicked.

Is there a CAD model for this guy? I have some students wanting to get started on a cover for this.

We’ll be releasing the second-generation 3D-printable enclosure CAD model (in both SolidWorks and STL formats) next week. It consists of a “base” and a “lid” and is designed to enclose the VMX-pi and a Raspberry Pi 3; it also fits the Raspberry Pi Zero W. For reference, we print the model on an Ultimaker; I’ll work on getting details posted about the printing resolution that gives the best results.

Given that, is the board CAD model still desirable?

Nope, your case will be perfect. Thanks!


“Since the process of building the image can take over 24 hours and requires a constant internet connection…”


Are you talking about building all those components on the Pi itself? Because I have most of those components on my BeagleBone already, and I KNOW I haven’t had that thing connected to the internet for 24 hours; it’s battery powered.

The VMX-rtk image is built on a Raspberry Pi. We’ll make the scripts available soon so you can see for yourself. We’re getting faster at it, but it’s a day-long process to create the image (download, compile, verify).

Raspbian Stretch itself is easily downloaded from the Raspberry Pi foundation, but the image includes ROS Kinetic, OpenCV, Eclipse, Mono, xGalaga, VMX-pi HAL, WPI cscore and ntcore and more. The biggest contributor to the build time is ROS Kinetic.

We’ve used the NavX and NavMXP, and love them both, but what do you envision the usage case for this to be? What FRC robot features have you tested that this will enable?

We currently use the navX-MXP, plus a separate Raspberry Pi/webcam doing all of our vision detection and processing. I’m imagining some benefit from lower-latency coupling of the IMU and vision, and potentially freeing up the MXP port and serial ports by just communicating between the Pi and RoboRIO over Ethernet. The extra motor control outputs can’t be used, as they typically wouldn’t be FRC legal.

I read the email and watched the videos, but I’m missing the perceived benefit. What do you see most teams using this for, and how much of the software is already available in libraries versus what teams will have to custom-develop?

PS: don’t at all take this as negative; navX-MXP is amazing, and Scott is incredibly helpful and supportive with the products and integration… I’d just like to know more about this product.

I think that’s a great question. VMX-pi is “base camp” for a number of things we’ve envisioned moving ahead:

  1. To start, it’s a way to simplify integrating vision/motion processing, like you mentioned. That’s what we’re focusing on for this FRC season, along with the VMX-rtk, to save the precious days students currently have to spend configuring things. There are some little things in the mix there too, like simplifying how power is provided to the Raspberry Pi and having a battery-backed real-time clock on the robot.

  2. Tools to monitor an FRC robot CAN bus and tune drive system calibration parameters.

  3. A $200 robot controller with capabilities similar to RoboRIO + navX-MXP + OpenMesh radio that (once we get the WPI Library running on it) will be inexpensive enough that each programmer on a team can have one to take home. For teams with limited budgets, like those here on Kauai, that’s a big deal.

  4. A platform for building what I like to call a “Localization Processor” (robot position tracking) that performs fusion of IMU/Encoder/Vision processing in an integrated package that can operate in the same FRC robot as the robot application, but decoupled (a “sidecar”, if you will). The focus is sensor fusion and the tight integration required to do it well. I see this as a new class of product and something that will emerge as we continue to develop/integrate SF2 & ROS with VMX-pi.
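At its core, what a "Localization Processor" does can be sketched in a few lines (illustrative names, not an SF2 or VMX-pi API): fuse the gyro heading with encoder distance deltas to dead-reckon the robot's field position, with vision fixes later correcting the accumulated drift:

```python
import math

class PoseTracker:
    """Dead-reckons robot field position from gyro heading and encoder
    distance deltas -- the inner loop a localization processor runs
    before vision corrections are fused in."""

    def __init__(self):
        self.x = 0.0  # field position, same units as encoder distance
        self.y = 0.0

    def update(self, delta_distance, heading_degrees):
        """Advance the pose by the distance travelled since the last
        update, along the current gyro heading (0 deg = +x axis)."""
        heading = math.radians(heading_degrees)
        self.x += delta_distance * math.cos(heading)
        self.y += delta_distance * math.sin(heading)
        return self.x, self.y

tracker = PoseTracker()
tracker.update(10.0, 0.0)    # drive 10 units straight ahead
tracker.update(10.0, 90.0)   # turn left, drive 10 more units
# tracker is now near (10, 10); a vision fix would correct drift here
```

Running this decoupled on a VMX-pi "sidecar" means the main robot program only has to ask for the latest pose rather than do the fusion itself.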

As an aside, we think it’s likely that over the coming years there will be a move towards the Robot Operating System, and we think VMX-pi could be the stepping stone that allows FRC students to get their feet wet w/ROS before they dive into it more deeply at the University level. We’ll see on that one, but we think it’s possible and worth investing in. There are a number of non-FRC folks getting interested in these areas currently.

We thought for 2018 the first two items were significant enough to make it available now. VMX-pi might not be for everyone since it’s not fully turn-key yet, but we’re getting some feedback so far that’s encouraging. Like navX-MXP, we’re committed to adding more features over the next few years as we develop items 2, 3 and 4 above.

  • scott

“Game-specific vision processing examples will follow after that.”

Will these examples also be available on the website for download?


Yes, they’ll be online at vmx-rtk.kauailabs.com. The examples can be used by anyone; VMX-pi is not required unless you are accessing IMU data and timestamps.

The VMX-pi enclosure design files (in SolidWorks STEP format for robot CAD layout, and in STL format for 3D printing) are now available. More info is on the VMX-pi enclosure page, including a link to the Shapeways store, where teams without a 3D printer can order printed enclosures.

The VMX Robotics Toolkit, on a high-quality 32 GB SD card for VMX-pi and Raspberry Pi, is now available.

Using the libraries within the toolkit, a Java vision processing example is available that includes these features:

  • Configuring and Acquiring data from a USB Camera
  • Saving video to Raspberry Pi disk
  • Streaming video from USB Camera to Dashboard (before processing)
  • Streaming opencv-processed video to Dashboard (post-processing)
  • Acquiring VMX-pi IMU and Timestamp Data and sending to RoboRIO via NetworkTables
  • Overlaying IMU and Timestamp Data directly on the Video
  • Easy integration of a GRIP-generated Pipeline
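For teams curious what the processing step of such a pipeline boils down to, here is a dependency-free sketch of the core threshold-and-locate operation (the real examples use OpenCV; the "frame" here is just a list of brightness rows, so the logic is visible without any library):

```python
def find_bright_target(image, threshold):
    """Return the bounding box (x_min, y_min, x_max, y_max) of all
    pixels at or above `threshold`, or None if no pixel qualifies.
    This mirrors what an OpenCV/GRIP pipeline does with inRange()
    and findContours(), minus the library."""
    xs, ys = [], []
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            if value >= threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (min(xs), min(ys), max(xs), max(ys))

# A tiny 4x6 "frame" with a bright retroreflective-style blob:
frame = [
    [0, 0,   0,   0,   0, 0],
    [0, 0, 255, 255,   0, 0],
    [0, 0, 255, 255, 255, 0],
    [0, 0,   0,   0,   0, 0],
]
box = find_bright_target(frame, 200)  # box == (2, 1, 4, 2)
```

Pairing a result like `box` with the frame's capture timestamp and the IMU heading at that instant is exactly what the timestamp-overlay features above are for.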

Over the coming days, C++ and Python versions of the above functionality will be added. Please visit the VMX Robotics Toolkit FRC Examples page for more details.

Example game-specific OpenCV-based vision processing algorithms will also be made available.

If you want to learn to develop vision processing code on a Raspberry Pi, but want to avoid the hassles and get something up and running quickly, we encourage you to check out the VMX Robotics Toolkit.

[And in honor of this year’s “Power Up” game, we’ve included a classic console video game in case you need some stress relief from the pressures of the build season.]

Two updates:

  • 100 VMX-pi units are available on FIRST Choice, Round 2; “cost” is 20 points; one per team.

  • In addition to the Java example, the vmx-rtk-examples project, which shows how to integrate a GRIP pipeline on a Raspberry Pi (including VMX-pi orientation data and timestamps), now includes a C++ example; work on the Python example is underway, and we’ll let you know when it’s available.

What’s the preferred way to power the VMX-pi? Will the 2 Amp port on the VRM do?

The preferred way is to connect VMX-pi directly to an output of the Power Distribution Panel (PDP). There’s now a section on the VMX-pi RoboRIO installation page that describes the power connection, including a photo.

You can also connect the VMX-pi Battery Adapter Cable to a 2 Amp port on the VRM. That’s not the recommended approach, as it won’t give you the full 3 Amps (if that’s not an issue, feel free to power it from the VRM). We also figured teams would like to use their VRM ports for other things.

I can confirm the 2 Amp port works. I was just nervous about plugging the VRM straight into the PDB with a breaker rated much higher than 3 Amps. I hate releasing magic smoke on the first start-up.

For some reason I missed your pictures on the site. Those are very helpful.