Kinect Questions

First of all, I want to know if we’re allowed to have the Kinect ON our robot. It seems like it was only intended to be used for people’s gestures, but we want to utilize its depth perception on the field. Since it has a wall plug and a USB plug, it doesn’t seem like we’d be able to attach it to the robot without “hacking” it, so I don’t know if it would even be allowed rules-wise. If anyone knows about this, please tell me.

Also, how would we use it in NetBeans? There’s a Kinect class, but I can’t make an instance of it, and only one of its static methods is available to me. Do I need to somehow use C# and Java together to use it? It would be awesome if anyone could answer these questions.

kThxBai();

From what I’ve read, you are allowed to place the Kinect on the robot. I’m not so sure, though.

Please, do not be so dramatic, and be nice to your fellow teammates.

But Gabe is correct. You can use a Kinect on the robot.

Are we allowed to use a webcam on the robot? Since it’s kind of the same thing as the Kinect.

The only reference I find is this one…

[R84]
Other than the system provided by the Arena, no other form of wireless communications shall be used to
communicate to, from or within the Operator Console.
Examples of prohibited wireless systems include, but are not limited to, active wireless
network cards and Bluetooth devices. For the case of FRC, a motion sensing input device
(e.g. Microsoft Kinect) is not considered wireless communication and is allowed.

Please note that this rule is in reference to the Operator Console.

Basically any sensor is legal; there have been multiple official (though not-in-the-rules) posts saying the Kinect is legal. However, you have to interface the USB to something, and you’d have to write all of the software yourself or go through a PC.

Quote, please…

While the focus for Kinect in 2012 is at the operator level, as described above, there are no plans to prohibit teams from implementing the Kinect sensor on the robot.

http://www.usfirst.org/roboticsprograms/frc/kinect

Thank you all so much for helping. Now we just need to figure out how to get the Kinect attached :slight_smile:

I brought this issue up with my team and they suggested using a USB breakout board. Connecting it is probably similar to connecting the different accelerometers and encoders that we get.

More accurately the above link…

Q: What can I do with the Kinect?

The Kinect sensor, in conjunction with the Microsoft Kinect SDK, can interpret human motion and generate a map of 20 skeletal points (joints) each containing X, Y, and Z coordinates and a quality measure. From this data gestures can be analyzed - such as various arm, leg, and other body motions.

Q: How can my team get started learning about using the Kinect before kickoff?

If you have a Kinect sensor, it can be attached to a computer running Microsoft Windows 7. Employing a Microsoft software development tool, such as the free Visual Studio Express C# Edition, and the Kinect SDK available from the Microsoft Research web site will allow you to look at the provided sample programs to get an idea of what the Kinect can do.

FIRST will provide Kinect software tools at Kickoff, but looking at the available tools and sample will help give you an idea of the capabilities of the sensor.

Q: Can I put the Kinect on my robot to detect other robots or field elements?

While the focus for Kinect in 2012 is at the operator level, as described above, there are no plans to prohibit teams from implementing the Kinect sensor on the robot.

Emphasis mine. I read that as “if you plan on using Kinect on the robot, some development will be needed. The SDK tools appear to be written for Windows 7.”
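For the gesture side described in the FAQ above, the skeletal data ultimately boils down to joint coordinates, so a simple gesture check is just a comparison. Here is a minimal sketch in Java; the `Joint` class and the sample coordinates are invented for illustration (the real Microsoft SDK exposes joints through its own C# types):

```java
// Sketch only: the Joint class below is invented for illustration; the real
// Kinect SDK exposes skeletal joints through its own types, in C#.
public class GestureSketch {
    // One skeletal joint: camera-space coordinates in meters, Y pointing up.
    static class Joint {
        final double x, y, z;
        Joint(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
    }

    // A trivial "gesture": is the hand raised above the head?
    static boolean handAboveHead(Joint head, Joint hand) {
        return hand.y > head.y;
    }

    public static void main(String[] args) {
        Joint head = new Joint(0.0, 1.6, 2.0);
        Joint raisedHand = new Joint(0.3, 1.9, 2.0);
        System.out.println(handAboveHead(head, raisedHand)); // true for these sample values
    }
}
```

Real gesture recognition would look at sequences of frames and several joints at once, but it is all this same kind of arithmetic over the 20 joint positions.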

So does that mean you can put a computer running Windows 7 on the robot if you connect it via Ethernet to the router?

Yes, you would be able to put a computer directly on the robot. The batteries in the laptop are allowed so long as they only power the laptop, and seeing as it does not control the motors, etc., directly, it is somewhat analogous to a “custom circuit”.
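One common way to get results off an onboard laptop is to send small packets over the robot’s Ethernet network. The sketch below shows the plumbing with plain Java UDP sockets, sending over loopback so it is self-contained; on a real robot you would send to your robot controller’s address instead, and the message format here ("distance,bearing") is just a made-up example:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

// Sketch: a laptop-side sender and a stand-in receiver, both on loopback.
// On a real robot, the receiver would be your robot-control program and the
// destination would be its IP on the robot network.
public class DepthUdpSketch {
    public static String roundTrip(String message) throws Exception {
        try (DatagramSocket receiver = new DatagramSocket(0); // bind any free port
             DatagramSocket sender = new DatagramSocket()) {
            byte[] out = message.getBytes(StandardCharsets.US_ASCII);
            sender.send(new DatagramPacket(out, out.length,
                    InetAddress.getLoopbackAddress(), receiver.getLocalPort()));
            byte[] in = new byte[64];
            DatagramPacket packet = new DatagramPacket(in, in.length);
            receiver.receive(packet);
            return new String(packet.getData(), 0, packet.getLength(),
                    StandardCharsets.US_ASCII);
        }
    }

    public static void main(String[] args) throws Exception {
        // e.g. "nearest obstacle at 1.42 m, bearing -10 degrees"
        System.out.println(roundTrip("1.42,-10")); // prints 1.42,-10
    }
}
```

The point is that the laptop does the heavy Kinect processing and only a few bytes of results ever need to cross the wire each cycle.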

Others were talking about using a development board like a BeagleBoard or PandaBoard running Linux, in which case you’d have to substitute the libfreenect libraries for the official Microsoft ones (libfreenect does not have the skeletal tracking system, but if it’s on a robot, we’ll assume you didn’t need that).
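One wrinkle with going through libfreenect: it hands you raw 11-bit disparity values, not meters, so you need a calibration step. The function below is a sketch using an empirical fit circulated by the OpenKinect community; treat the constants as approximate and device-dependent, not official numbers:

```java
// Sketch: convert a raw 11-bit Kinect disparity value to an approximate
// distance in meters, using an empirical tangent fit circulated by the
// OpenKinect community. The constants are rough calibration, not gospel.
public class KinectDepth {
    public static double rawDepthToMeters(int raw) {
        if (raw >= 2047) return 0.0; // the raw stream uses 2047 for "no reading"
        return 0.1236 * Math.tan(raw / 2842.5 + 1.1863);
    }

    public static void main(String[] args) {
        System.out.println(rawDepthToMeters(600));  // roughly 0.7 m
        System.out.println(rawDepthToMeters(1000)); // roughly 3.8 m
    }
}
```

Note the mapping is nonlinear: depth resolution is much finer up close than at the far end of the Kinect’s range.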

Such computing devices are custom circuits and there are rules in the Robot Rules section that cover their (custom circuits and computing devices) use on the robot.

Even with a breakout board, you would have to write your own software. The Kinect doesn’t just magically find things; it’s basically a souped-up webcam that comes with a cool SDK.

I agree, but if you could send the data to the computer used for driving, then you could have a program on that computer that does stuff with the Kinect data. Then again, there’s nothing against putting a computer on the robot and connecting the Kinect to it.

But what if you want to use the Kinect as a depth camera?

That’s what we’re thinking about using the Kinect for.
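Once you have a depth frame in meters, “finding things” really is just array processing you write yourself. A minimal sketch: scan the 640x480 depth frame for the nearest valid reading. The frame here is simulated with made-up values; a real one would come from the Kinect driver:

```java
// Sketch: scan a depth frame (meters per pixel) for the nearest valid point.
// The frame below is simulated; a real one would come from the Kinect driver.
public class NearestObstacle {
    static final int WIDTH = 640, HEIGHT = 480; // Kinect depth resolution

    // Returns the smallest positive depth, or Double.MAX_VALUE if none found.
    public static double nearestMeters(double[] frame) {
        double nearest = Double.MAX_VALUE;
        for (double d : frame) {
            if (d > 0 && d < nearest) nearest = d; // 0 means "no reading"
        }
        return nearest;
    }

    public static void main(String[] args) {
        double[] frame = new double[WIDTH * HEIGHT];
        java.util.Arrays.fill(frame, 3.0);        // background wall at 3 m
        frame[WIDTH * 240 + 320] = 1.2;           // something 1.2 m ahead, frame center
        System.out.println(nearestMeters(frame)); // prints 1.2
    }
}
```

Obstacle or field-element detection for driving would add smarts on top of this (ignoring the floor, grouping nearby pixels into blobs), but it is all the same loop over that one array, which is why people keep saying you’ll be writing the software yourself.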