Hello all,
Based on the work of the OpenKinect initiative, I've begun writing drivers in LabVIEW to interface with the Kinect.
For now I've got the following implemented (there's a rough C sketch of the underlying USB protocol right after this list):
- Control LED Color
- Control Motor Position (between +/- 31 degrees)
- View Servo Position
- View Servo Speed
- View Servo Status (Stopped/Reached Limits/Moving)
- View Accelerometer data (ux,uy,uz)
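For the curious, everything in that list boils down to plain USB control transfers to the Kinect's motor device (VID 0x045E, PID 0x02B0), as documented on the OpenKinect protocol wiki. Here's a rough C/libusb-1.0 sketch of what my LabVIEW VIs are doing under the hood; the request numbers (0x06 for the LED, 0x31 for the tilt, 0x32 for the status report) come from the wiki, and I've stripped the error handling to keep it short:

/* Kinect motor/LED/accelerometer sketch using libusb-1.0.
 * Request numbers per the OpenKinect protocol documentation. */
#include <stdio.h>
#include <stdint.h>
#include <libusb-1.0/libusb.h>

int main(void)
{
    libusb_context *ctx;
    libusb_init(&ctx);

    /* 0x045E = Microsoft, 0x02B0 = Kinect motor device */
    libusb_device_handle *h =
        libusb_open_device_with_vid_pid(ctx, 0x045E, 0x02B0);
    if (!h) { fprintf(stderr, "Kinect motor not found\n"); return 1; }
    libusb_claim_interface(h, 0);

    /* LED: request 0x06, wValue = color (e.g. 1 = green, 2 = red) */
    libusb_control_transfer(h, 0x40, 0x06, 1, 0, NULL, 0, 100);

    /* Tilt: request 0x31, wValue = desired angle in degrees * 2 */
    int8_t angle = 15;
    libusb_control_transfer(h, 0x40, 0x31, (uint16_t)(angle * 2), 0,
                            NULL, 0, 100);

    /* Status: request 0x32 returns a 10-byte report holding the
     * accelerometer counts, servo angle (doubled) and servo status */
    uint8_t buf[10];
    libusb_control_transfer(h, 0xC0, 0x32, 0, 0, buf, 10, 100);
    int16_t ux = (int16_t)((buf[2] << 8) | buf[3]);
    int16_t uy = (int16_t)((buf[4] << 8) | buf[5]);
    int16_t uz = (int16_t)((buf[6] << 8) | buf[7]);
    printf("accel: %d %d %d  angle: %d  status: %d\n",
           ux, uy, uz, (int8_t)buf[8] / 2, buf[9]);

    libusb_release_interface(h, 0);
    libusb_close(h);
    libusb_exit(ctx);
    return 0;
}

The status byte at the end is what maps to the Stopped/Reached Limits/Moving states above, and note the angle is doubled on the wire, which is why the sketch multiplies and divides by 2.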
Todo:
- Turn on Depth & RGB Camera
- Retrieve Depth & RGB Data
- Retrieve Audio Data
LabVIEW doesn't support isochronous USB transfers, which will make interfacing with the camera and audio streams harder, but not impossible.
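One likely route is to let a compiled helper handle the isochronous streaming and hand frames to LabVIEW through a Call Library Function Node. Here's a minimal sketch of what the C side could look like using the OpenKinect library (libfreenect) and its callback API; this is just a sketch under assumptions, not final code, and the mode-setting calls are from the current libfreenect API, so they may differ in older snapshots:

/* Minimal libfreenect depth-streaming sketch. libfreenect does the
 * isochronous USB work; each completed frame fires the callback. */
#include <stdio.h>
#include <stdint.h>
#include "libfreenect.h"

static void depth_cb(freenect_device *dev, void *depth, uint32_t timestamp)
{
    uint16_t *d = (uint16_t *)depth;
    /* 640x480 11-bit depth frame; print the center pixel as a demo */
    printf("t=%u center depth: %u\n",
           (unsigned)timestamp, (unsigned)d[240 * 640 + 320]);
}

int main(void)
{
    freenect_context *ctx;
    freenect_device *dev;
    if (freenect_init(&ctx, NULL) < 0) return 1;
    if (freenect_open_device(ctx, &dev, 0) < 0) return 1;

    freenect_set_depth_mode(dev,
        freenect_find_depth_mode(FREENECT_RESOLUTION_MEDIUM,
                                 FREENECT_DEPTH_11BIT));
    freenect_set_depth_callback(dev, depth_cb);
    freenect_start_depth(dev);

    /* Pump USB events; depth_cb runs once per assembled frame */
    while (freenect_process_events(ctx) >= 0)
        ;

    freenect_stop_depth(dev);
    freenect_close_device(dev);
    freenect_shutdown(ctx);
    return 0;
}

From there, a LabVIEW wrapper would just copy the frame buffer out of the callback into a queue for the block diagram to consume.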
I'll have source code and instructions available as soon as I get the cameras turned on and am retrieving data from them. I'm hoping this can be a learning and research experience for everyone. It would be awesome to see some old FIRST robots lying around from previous competitions become a lot more autonomous with this technology.
I could see robots that drive themselves around obstacles along a desired route, map an entire room, or use facial recognition and voice commands to carry out tasks given by specific individuals. I'm sure there are lots more useful applications for this kind of technology.
All of the information and research that made this possible (and more) can be found at
http://openkinect.org/
Source code soon!
Ryan