Kinect LabVIEW Drivers

Hello all,

Based on the work of the OpenKinect initiative, I’ve begun writing LabVIEW drivers to interface with the Kinect.

For now I’ve got the following implemented:

  • Control LED Color
  • Control Motor Position (between +/-31 degrees)
  • View Servo Position
  • View Servo Speed
  • View Servo Status (Stopped/Reached Limits/Moving)
  • View Accelerometer data (ux,uy,uz)


Still to come:

  • Turn on Depth & RGB Camera
  • Retrieve Depth & RGB Data
  • Retrieve Audio Data
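For anyone curious what sits underneath the motor/LED/accelerometer VIs: the OpenKinect reverse-engineering notes describe them as simple USB control transfers on the “Xbox NUI Motor” device. Here’s a minimal Python sketch of just the value encoding — the request numbers (0x31 tilt, 0x06 LED) and the 819-counts-per-g accelerometer scale are taken from those notes, and the pyusb call in the comment is illustrative only, not part of the LabVIEW driver:

```python
# Value encoding for the Kinect motor interface, per the OpenKinect
# protocol notes. The actual transfer (with pyusb) would look like:
#   dev.ctrl_transfer(0x40, 0x31, tilt_wvalue(-15), 0x0, [])

COUNTS_PER_G = 819  # accelerometer scale from the OpenKinect wiki

def tilt_wvalue(degrees):
    """Encode a tilt angle (-31..+31 deg) as the wValue for request 0x31."""
    if not -31 <= degrees <= 31:
        raise ValueError("tilt out of range")
    return int(degrees * 2) & 0xFFFF  # two's-complement, half-degree units

def accel_to_g(raw_count):
    """Convert a signed 16-bit accelerometer count to g."""
    return raw_count / COUNTS_PER_G
```

The half-degree units are why the tilt range reads as +/-31 degrees: the device accepts wValues of roughly -62 to +62.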

LabVIEW doesn’t support isochronous USB transfers, which will make interfacing with the camera and audio data harder, but not impossible.

I’ll have source code and instructions available as soon as I get the cameras turned on and retrieving data. I’m hoping this can be a learning and research experience for everyone. It would be awesome to see some old FIRST robots lying around from previous competitions become a lot more autonomous with this technology.

I could see robots that drive themselves around objects given a desired route, map an entire room, or use facial recognition and voice commands to carry out tasks given by specific individuals. I’m sure there are lots more useful applications for this type of technology.

All of the information and research that has gone into making this possible and more can be found at

Source code soon!


How are you connecting the Kinect to the cRIO? I was going to work on a similar project but didn’t want to commit funds until I figured out how to connect it.

It’s actually just USB to my laptop at the moment; sorry I didn’t make that clear beforehand. I don’t have a cRIO available to me yet, so I haven’t had a chance to investigate that.

However, the USB interface is seriously complex, especially given that we have to deal with isochronous transfers for video and audio. It might be possible to strip the USB cable and connect the data wires to the DIO board and the power wires to the power distribution board.

Just to note, the newer cRIOs have a USB interface port.

They do? The NI 9074 I have at work (purchased less than 3 months ago) does not…

Check this one out:

Either way, we should look for a solution with the DIO board, since that will probably be the most natural interface for wiring up the Kinect. It would probably be pretty easy to put together a tutorial on making a male USB connector whose individual power and I/O wires can be connected to the Kinect. It could be as simple as buying the Kinect legacy extension cable from Fry’s or the like, cutting off the ends, stripping the wires, and wiring them appropriately.

On the USB subject, the cRIO family has lots of variety, and some models have a USB port. Last I checked, the USB stack exclusively supported the mass-storage protocol; it was used to provide a file-storage interface to a USB stick.

True, drivers for other device types could be written, but doing so for VxWorks while keeping the RT aspects in mind is not so easy, so other device types will not be supported unless there is a good reason for NI to purchase or build the necessary drivers.

I wouldn’t really worry about the USB issue, though. Exploring and learning about the Kinect using a laptop sounds like an incredibly beneficial journey. Please keep it up. And if you are interested in sharing your results, please consider also posting them to NIWeb, as you may get a good number of power users interested.

Greg McKaskle

The problem is not in getting the Kinect physically wired to the cRIO; the problem is that the FRC cRIO does not have a USB host controller.

I could see someone getting a Kinect running on a robot using a mini-ITX motherboard with an SSD. Another option would be to get it running on a microcontroller. I will likely be working on using the Kinect with a Luminary Micro 3000-series chip at work sometime in the next few months, but I’m not certain I will have anything there before kickoff.

Ah, I did not consider that. Yes, it would be difficult to interface with the Kinect without a host USB controller.

I’ve attached my source code here even though it doesn’t have depth or camera data working yet. I’ve certainly learned a lot from this experience; it’s been really rewarding.

I’ll probably end up changing it all to use a dll generated from the library. LabVIEW doesn’t have isochronous USB support, so that will be necessary in the end anyway, but I wanted to see how far I could get with the raw VISA interface.

To set up the Kinect with your PC using the LabVIEW libraries, you’ll need to use the VISA Driver Wizard to create an NI-VISA driver for the Xbox NUI Motor and Xbox NUI Camera. Audio isn’t necessary, since it hasn’t been reverse-engineered yet.

Once that’s done, it’s as simple as opening and running it. If the Kinect is working and the driver was set up right, it will autodetect the USB port it’s running on and initialize the motor, accelerometer, etc.
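As a rough illustration of that autodetect step (sketched in Python rather than LabVIEW): a VISA USB resource string embeds the vendor and product IDs, so the Kinect’s motor interface can be picked out by matching Microsoft’s vendor ID 0x045E and the “Xbox NUI Motor” product ID 0x02B0 (both from the OpenKinect docs). The resource-string layout below is assumed from NI-VISA’s usual USB format, and the pyvisa call in the comment is the Python analogue of the LabVIEW VISA find:

```python
# Picking the Kinect motor out of the available VISA USB resources.
# With pyvisa this filter would be applied like:
#   import pyvisa
#   rm = pyvisa.ResourceManager()
#   motors = [r for r in rm.list_resources("USB?*") if is_kinect_motor(r)]

KINECT_VID = 0x045E  # Microsoft
MOTOR_PID = 0x02B0   # Xbox NUI Motor

def is_kinect_motor(resource):
    """True if a VISA resource string (e.g. 'USB0::0x045E::0x02B0::serial::RAW')
    names the Kinect motor interface."""
    parts = resource.split("::")
    if len(parts) < 3 or not parts[0].upper().startswith("USB"):
        return False
    try:
        vid, pid = int(parts[1], 16), int(parts[2], 16)
    except ValueError:
        return False
    return vid == KINECT_VID and pid == MOTOR_PID
```

The camera would be matched the same way with its own product ID.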

Uploading isn’t working here for some reason.

You could theoretically program the FPGA on the cRIO to act as a USB host, and from there you could control the Kinect. I haven’t actually tried it, but it seems feasible.

The FPGA could theoretically implement the host-controller features, but you would still need a USB PHY (physical-layer interface) to host the USB electrically. Also, the host controller is terribly complex and not really a reasonable engineering approach. You would be far better off using a device that already has a USB host controller (such as the cRIO 9022 mentioned earlier, a laptop, a TI Beagle Board, or even a Luminary (now TI) Stellaris part with USB OTG support).

I’ve got the LabVIEW drivers fully working, though not completely stable yet. The dlls that run behind the scenes still need a lot of polishing, but this is a great start so far. The next big challenge will be to find a reliable, easy way to get the Kinect to run through the cRIO while we wait for the dlls to stabilize.

I’ll hopefully have the development source code up in my GitHub repository within the next few weeks.


Aside from getting the data into the cRIO, there is the raw computational power needed to do anything useful with it. Wouldn’t a better strategy be getting the data back to the driver-station laptop and doing the processing there? A 400 MHz PowerPC vs. a dual-core x86? The cRIO could be bypassed completely: a USB host controller interfaces with the Kinect and then sends the data over LAN through the gaming router back to the laptop. FIRST would have to change the ruling on no lasers, though; I believe the Kinect has a true infrared laser. The Kinect could have a tremendous effect on our robots in the future. You may want to keep an eye on Microsoft’s Robotics Studio; word has it they may be releasing some drivers.

This wasn’t ever meant to be used during competition in FIRST :wink: The laser is a Class 1 laser, and it’s perfectly safe; there are a million safety features that PrimeSense and Microsoft built in to make sure it stays that way.

But the showstopper here is that there is too much interference with multiple Kinects moving around. I’m just building this so I can make a robot do things by itself autonomously. True autonomy is very poorly implemented in FIRST, mainly because of the complexity and the lack of technology to make it more plausible, and I think this would help others interested in exploring truly autonomous robots. Every year that passes leaves more decommissioned pieces of metal that never get put into competition again, and I think this would make for some interesting projects with those old robots. That is certainly what I’d like to do with them in the off-season.

I am EXTREMELY interested in this. Do you plan on releasing it when it’s done?

Hi, this has grabbed my interest. I believe that LabVIEW does support isochronous connections. Although I could only find documentation on FireWire and isochronous data transfer, I have successfully used every USB webcam that I have tried with LabVIEW. The IMAQ USB drivers have to be installed; they are available to download from NI, but I am unsure of any licensing. (I think only the NI Vision module has to be activated, not the drivers.) If the device can be enumerated via USB devices, it should work via IMAQ USB capture.

At this point, I’ve switched to using the OpenKinect project as a dll (based on cross-platform libusb). It’s much easier to work with their framework than to reinvent the wheel :slight_smile: I previously tried using the VISA interface for USB capture, which doesn’t support isochronous transfers.

I certainly am; I should have source code up in my github fork within the next few days.

The main reason for using LabVIEW for the image acquisition is that IMAQ uses the Vision module. It is all reference-based and very well designed; it is fast, and has a ton of primitives for image manipulation and measurement. For automation, this would surely make life easier. The first time I used Vision, I built a flag-follower module that ran on the cRIO in about 2 hours. I suppose the image could be sent to the Vision module, but there may be performance issues that would have to be worked out. I would be more than happy to help out with the LabVIEW side.

I’ll do some timing to see what kind of performance we get out of this; converting to something IMAQ can handle isn’t too hard.
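For reference, the Kinect depth stream carries 11-bit values (0–2047, per the OpenKinect docs), so squeezing a frame into the 8-bit grayscale buffer an IMAQ U8 image expects can be as cheap as a right shift. A minimal Python sketch of the idea — real code would vectorize this over the whole 640x480 frame (e.g. with NumPy) before handing the buffer to the Vision VIs:

```python
# Flatten 11-bit Kinect depth samples into an 8-bit grayscale buffer.
# A plain right shift by 3 maps 0..2047 onto 0..255; it's crude but fast,
# which matters when converting 30 frames per second.

def depth11_to_u8(depth_values):
    """Map 11-bit depth samples (0..2047) to 8-bit grayscale (0..255)."""
    return bytes((d >> 3) & 0xFF for d in depth_values)
```

If the near range matters more than the far range, a lookup table with a nonlinear mapping would preserve more useful contrast at the same cost.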

I’d surely like anyone’s help (who wants to) to improve this :slight_smile:

If you have any modified code from the last time you posted, please re-post, or we can set up some kind of code repository. Also, which version of LabVIEW do you use? I don’t want to up-save and cause trouble when you try to open the code.

I have set up a Google Code repository account/project. This can be changed; I just wanted to try out their repository anyway.

I have not seriously used their repository before, so I am still learning how to configure it.