Chief Delphi

Chief Delphi (http://www.chiefdelphi.com/forums/index.php)
-   Programming (http://www.chiefdelphi.com/forums/forumdisplay.php?f=51)
-   -   Running the Kinect on the Robot. (http://www.chiefdelphi.com/forums/showthread.php?t=99275)

catacon 23-01-2012 19:29

Re: Running the Kinect on the Robot.
 
Quote:

Originally Posted by RufflesRidge (Post 1112091)
Looks more like libfreenect; my guess would be it's hooked up to OpenCV.

At the time I was using the Code Laboratories NUI drivers with OpenCV. I have since switched to OpenKinect (libfreenect). It seems to be working very well. The Kinect is able to track the target, and I am working on getting the depth (distance) to the target.
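For anyone curious, the skeleton of the tracking part looks roughly like this (just a sketch of the general idea, assuming the libfreenect Python wrapper and OpenCV's Python bindings; the HSV threshold values are placeholders that would need tuning for the lit-up reflective tape):

Code:

import cv2
import freenect
import numpy as np

# Grab one RGB frame and one raw depth frame from the Kinect
# (the sync_* calls come with the libfreenect Python wrapper).
rgb, _ = freenect.sync_get_video()    # 640x480x3, RGB order
depth, _ = freenect.sync_get_depth()  # 640x480, 11-bit raw depth values

# Threshold for the illuminated target. These HSV bounds are placeholders.
hsv = cv2.cvtColor(rgb, cv2.COLOR_RGB2HSV)
mask = cv2.inRange(hsv, np.array([40, 80, 80]), np.array([90, 255, 255]))

# Take the largest blob as the target and read the raw depth at its centroid.
contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                            cv2.CHAIN_APPROX_SIMPLE)[-2]
if contours:
    target = max(contours, key=cv2.contourArea)
    m = cv2.moments(target)
    if m["m00"] > 0:
        cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
        raw = depth[cy, cx]  # raw depth value at the target centroid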

shuhao 23-01-2012 20:03

Re: Running the Kinect on the Robot.
 
How are you doing the tracking/recognition?

What's your algorithm? It seems to be fairly fast..

My knowledge of CV is rather limited (the basics of edge and corner detection, a high-level understanding of stereo vision, sliding windows, machine-learning-based pattern matching, etc.), and I haven't had much experience with OpenCV.

A basic flow of your algorithm would be nice if you're willing to share :D

catacon 24-01-2012 01:39

Re: Running the Kinect on the Robot.
 
Quote:

Originally Posted by shuhao (Post 1112204)
How are you doing the tracking/recognition?

What's your algorithm? It seems to be fairly fast..

My knowledge of CV is rather limited (the basics of edge and corner detection, a high-level understanding of stereo vision, sliding windows, machine-learning-based pattern matching, etc.), and I haven't had much experience with OpenCV.

A basic flow of your algorithm would be nice if you're willing to share :D


I will post a general outline of my algorithm once I get something more solid down. I have improved it greatly since those videos.

Depth measurements and angle-based tracking have been locked in.
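Those two pieces are simple enough to summarize now: the horizontal angle comes from the target centroid's pixel offset and the Kinect's field of view, and the distance comes from converting the raw depth value. Roughly like this (a sketch; the 57° horizontal FOV and the raw-to-meters tangent fit are the commonly quoted OpenKinect numbers, not calibrated values):

Code:

import math

IMAGE_WIDTH = 640
KINECT_HFOV_DEG = 57.0  # commonly quoted Kinect horizontal field of view

def angle_to_target(cx):
    """Horizontal angle (degrees) from the image center to the target centroid."""
    return (cx - IMAGE_WIDTH / 2.0) * KINECT_HFOV_DEG / IMAGE_WIDTH

def raw_depth_to_meters(raw):
    """Approximate meters from the 11-bit raw depth value (OpenKinect wiki fit)."""
    if raw < 2047:  # 2047 means "no reading"
        return 0.1236 * math.tan(raw / 2842.5 + 1.1863)
    return float("nan")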

catacon 24-01-2012 14:48

Re: Running the Kinect on the Robot.
 
Another video. Yaaaaaaay....


http://www.youtube.com/watch?v=6M3MpksczlY

shuhao 28-01-2012 00:13

Re: Running the Kinect on the Robot.
 
Quote:

Originally Posted by catacon (Post 1112695)
Another video. Yaaaaaaay....


http://www.youtube.com/watch?v=6M3MpksczlY


Any explanations soon?

catacon 30-01-2012 16:21

Re: Running the Kinect on the Robot.
 
Yeah...maybe this week. I kind of want to get things perfected first.

I got our Pandaboard out and will be working on getting things running on it this week.

spying189 31-01-2012 11:36

Re: Running the Kinect on the Robot.
 
Quote:

Originally Posted by realslimschadey (Post 1112037)
Are there any drivers that we need to use the Kinect with the Driver Station? It says I need a server. Does the new Driver Station come with it? When I plug the Kinect into the Classmate, it doesn't recognize it. :confused:

Quote:

Originally Posted by RoboMaster (Post 1112114)
realslimschadey, please look at some other threads, resources from FIRST, or start your own question thread. This thread is about using the Kinect on the robot, like a camera.


To use the Kinect on the robot, you would need either a computer ON the robot, or some way to stream the Kinect's USB data wirelessly to the Classmate/FIRST laptop for processing, with the results sent back to the robot for angle determination. FIRST provides all the resources through their "Technical Resources" webpage.
http://www.usfirst.org/roboticsprograms/frc/2012-kit-of-parts-driver-station/ This webpage provides all the links you will need. First, download the NI LabVIEW Update to bring the Classmate up to date, along with the update for the FIRST utilities and the Driver Station Update. These MUST be installed **IN LISTED ORDER** to run the supported version of the Driver Station. To give your Classmate/FIRST laptop the ability to support the Kinect, first make sure the computer meets the following system requirements.
It must have:
Microsoft Windows 7 Starter Edition or later
A 2.0 GHz processor or faster
1 GB of RAM or more
3 GB or more of free hard drive space

To get the Kinect running after the Driver Station is installed, download and install the Microsoft Kinect SDK. After that, download and install the FRC Kinect Server software, which bridges the Kinect SDK and the FIRST software.
Finally, download the Kinect Kiosk software to enable viewing of what the Kinect sees on the Driver Station (skeleton or not). If you would like it to act just as a camera on the robot, then you will have to program that into the FRC Dashboard yourself (editing the Driver Station code isn't allowed).

Hopefully this helps those who needed it!

shuhao 31-01-2012 13:27

Re: Running the Kinect on the Robot.
 
First of all, you don't need Windows for the Kinect on the robot. In fact, it is probably a bad idea, because libfreenect is just better than Microsoft's SDK.

azula369 31-01-2012 17:25

Re: Running the Kinect on the Robot.
 
We're a beginning team considering using a USB wireless extender (http://www.usbfirewire.com/Parts/rr-47-2022.html) to transfer the data collected by the Kinect to the Classmate, process it there, and then transfer it back through the radio. Does this sound feasible, and also legal, to the more experienced teams?

mwtidd 31-01-2012 17:49

Re: Running the Kinect on the Robot.
 
Quote:

Originally Posted by shuhao (Post 1117119)
First of all, you don't need Windows for the Kinect on the robot. In fact, it is probably a bad idea, because libfreenect is just better than Microsoft's SDK.

Your first statement is definitely correct.

However, I'd be curious as to what makes libfreenect better.

For me I think it would save money and weight, but the library itself isn't necessarily better.

The key function I utilize in the MS SDK that makes the vision processing for this game quite a bit easier is the GetColorPixelCoordinatesFromDepthPixel function. This makes finding the intersection of the RGB and depth images much easier.

As far as I know, achieving this in libfreenect takes a good deal of work and calibration. Also, I've found OpenKinect to be a pain to install, and I've watched it kill my machine the last two times I've tried to install it on Windows.

I don't believe one solution is "better" than the other, it all depends on the approach and the application.
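For anyone wondering what the libfreenect-side work looks like, registering a depth pixel into the RGB image by hand boils down to back-projecting it through the depth camera's intrinsics, moving it through the depth-to-RGB extrinsics, and projecting it with the RGB intrinsics. A rough sketch (the camera matrices and R, t below are placeholders you would get from your own calibration):

Code:

import numpy as np

# Placeholder calibration results -- these come from calibrating the depth
# and RGB cameras against each other (e.g. with a checkerboard target).
K_depth = np.array([[594.2, 0.0, 339.5],
                    [0.0, 591.0, 242.7],
                    [0.0, 0.0, 1.0]])
K_rgb = np.array([[529.2, 0.0, 328.6],
                  [0.0, 525.6, 257.5],
                  [0.0, 0.0, 1.0]])
R = np.eye(3)                     # depth-to-RGB rotation (placeholder)
t = np.array([0.025, 0.0, 0.0])   # ~2.5 cm baseline (placeholder)

def depth_pixel_to_color_pixel(u, v, z_meters):
    """Map a depth-image pixel (u, v) at depth z (meters) into the RGB image."""
    # Back-project into a 3D point in the depth camera's frame.
    p = z_meters * np.linalg.inv(K_depth).dot(np.array([u, v, 1.0]))
    # Move into the RGB camera's frame and project.
    q = K_rgb.dot(R.dot(p) + t)
    return q[0] / q[2], q[1] / q[2]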

cgmv123 31-01-2012 18:40

Re: Running the Kinect on the Robot.
 
Quote:

Originally Posted by azula369 (Post 1117269)
We're a beginning team considering using a USB wireless extender (http://www.usbfirewire.com/Parts/rr-47-2022.html) to transfer the data collected by the Kinect to the Classmate, process it there, and then transfer it back through the radio. Does this sound feasible, and also legal, to the more experienced teams?

Feasible depending on range, but not legal. The only wireless communication allowed is the robot radio.

azula369 31-01-2012 19:04

Re: Running the Kinect on the Robot.
 
OK, but buying and connecting a Pandaboard is legal, right? So that's plan B. Are there any legality issues we should be aware of with that?

catacon 31-01-2012 21:18

Re: Running the Kinect on the Robot.
 
Quote:

Originally Posted by lineskier (Post 1117279)
The key function I utilize in the MS SDK that makes the vision processing for this game quite a bit easier is the GetColorPixelCoordinatesFromDepthPixel function. This makes finding the intersection of the RGB and depth images much easier.

As far as I know, achieving this in libfreenect takes a good deal of work and calibration. Also, I've found OpenKinect to be a pain to install, and I've watched it kill my machine the last two times I've tried to install it on Windows.

That sure is a mouthful, haha. This is not that difficult with libfreenect, especially with OpenCV (for me, anyway). Besides, there's no real need for the RGB feed anyway. ;-)

libfreenect is a pain to install on Windows. It's cake on Linux, though.

catacon 31-01-2012 21:58

Re: Running the Kinect on the Robot.
 
Got the Kinect running on our Pandaboard tonight. It's a little slow, but I have a few ideas on how to speed it up.
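One of the ideas, for what it's worth (just a sketch, nothing benchmarked yet), is to do the heavy processing on a downsampled depth frame and only scale coordinates back up at the end:

Code:

import freenect
import numpy as np

depth, _ = freenect.sync_get_depth()  # 640x480 raw depth

# Process at half resolution in each axis (320x240) to cut the per-frame
# work, then multiply any pixel coordinates by 2 when reporting them.
small = depth[::2, ::2]

# Placeholder cutoff: keep everything closer than some raw-depth value.
mask = (small < 900).astype(np.uint8) * 255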

RufflesRidge 31-01-2012 22:16

Re: Running the Kinect on the Robot.
 
Quote:

Originally Posted by catacon (Post 1117455)
Got the Kinect running on our Pandaboard tonight. It's a little slow, but I have a few ideas on how to speed it up.

I'd love to know what framerate you're seeing and what kind of processing you're doing. Are you using both feeds or just opening the depth stream?

