Re: Kinect usage at competitions
We use ours. For the first ~7 seconds, it's preprogrammed shooting. Then control is given to our Kinect driver and our ball collector is turned on. Our Kinect driver drives around for the remainder of hybrid trying to collect balls.
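The timed hand-off described above can be sketched as a simple mode selector keyed on elapsed time. This is a minimal Python illustration; the mode names and the exact split are assumptions, not the team's actual code:

```python
HANDOFF_SECONDS = 7.0   # scripted shooting runs this long
HYBRID_SECONDS = 15.0   # total length of the hybrid period

def hybrid_mode(elapsed):
    """Which controller owns the robot at a given time into hybrid.

    Scripted shooting first, then the Kinect driver (with the ball
    collector running) for the remainder of the period.
    """
    if elapsed >= HYBRID_SECONDS:
        return "disabled"
    if elapsed < HANDOFF_SECONDS:
        return "scripted_shooting"
    return "kinect_driver"
```

The main robot loop would call `hybrid_mode()` each iteration and dispatch to the corresponding routine.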
Re: Kinect usage at competitions
So yes, our Kinect code works perfectly in the shop...
But when we use the exact same DS with the exact same USB hub, we are unable to get the field Kinect to work with the DS at all. This is at GVSU/West Michigan. Has anyone else had similar issues?
Re: Kinect usage at competitions
We're at GVSU this weekend; we had the exact same issue in Traverse City with the USB hub. I would recommend plugging the Kinect directly into the computer.
If you're still experimenting with it, feel free to hunt me down (team 2474 :D ); just ask for the programmer (or Joe). I'd kind of like to see it working in any scenario, and to see whether you get similar results (see my other post in this thread). I'd love to pair up and work on it if you think you'll be spending some time on it tomorrow! :] (as long as I'm not too busy with our robot)
Re: Kinect usage at competitions
We are using a Kinect (IR imaging) with some success. We do not use its auto-ranging feature; instead, we process the image ourselves. We have good accuracy on the practice field, but we are experimenting with software and IR-illumination adjustments to get better consistency on the game field. We suspect that the field lighting and reflections from the backboards, and possibly the floor, other robots, or the big projection screen, are causing the Kinect system to "lose lock" on the targets at times. Our test field used plywood for the backboards. We did not get to calibrate to the lighting in Rochester; hopefully we can in DC.
We are using an ARM/Linux BeagleBoard-xM and a CV image-processing library, in addition to some custom image-processing code developed by the students. It does take a long time to boot... The USB Kinect plugs into the BeagleBoard, and the BeagleBoard is on the local Ethernet.
Code by one of our students does the trig on the vertices of the target trapezoids to calculate the angle and distance to the target, which is passed to the cRIO over a UDP packet link developed by another student. When the "aim" trigger is pulled, the steering loop is closed around the error signal from the Beagle (in discrete slow steps, due to the current 6 frames-per-second data rate) until it reaches the dead-band near zero error.
The students developed a LOT of software over 6 weeks. On the test field we can range and aim at the center target with 0.5-degree accuracy with any 3 of the 4 targets visible, tested out to about 20 feet. We did not have enough time at the Rochester regional to collect distance-calibration data points, so the curve we are currently using, based on only 5 points, is not very good. That was due to time spent fixing mechanical problems. We hope the shooting will work better on the field at the Washington DC regional, and that we'll have enough time to calibrate properly there :).
We also have a device that "squeezes" the ball to identify soft balls on the way in, but we haven't tried it yet. If things go well, we may use a different distance-to-shooter-speed curve for soft balls, or apply some other correction, but that is asking a lot... We'd be happy to talk with anyone using the Kinect, and to offer help if we can. --Dan Sternglass
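For readers wondering what the trig step might look like: below is a minimal Python sketch assuming a pinhole-camera model, a known horizontal field of view, and a target of known physical width. The constants (`H_FOV_DEG`, `IMG_WIDTH`, `TARGET_WIDTH_FT`) and function names are illustrative assumptions, not the team's actual code:

```python
import math

# Assumed Kinect IR-camera parameters; treat these as placeholders.
H_FOV_DEG = 57.0       # horizontal field of view, degrees
IMG_WIDTH = 640        # image width, pixels
TARGET_WIDTH_FT = 2.0  # physical width of the vision-target rectangle

def bearing_deg(center_x):
    """Bearing to the target from its centroid's pixel x-coordinate.

    Pixels left of center give a negative angle, right of center positive.
    """
    offset = center_x - IMG_WIDTH / 2.0
    return offset * (H_FOV_DEG / IMG_WIDTH)

def distance_ft(pixel_width):
    """Range from the apparent pixel width of a target of known size.

    The target's width subtends an angle theta; at range d the geometry
    gives d = (w / 2) / tan(theta / 2).
    """
    theta = math.radians(pixel_width * (H_FOV_DEG / IMG_WIDTH))
    return (TARGET_WIDTH_FT / 2.0) / math.tan(theta / 2.0)
```

Using the trapezoid's top or bottom edge length as `pixel_width` partially compensates for perspective skew when the robot views the backboard at an angle.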
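The Beagle-to-cRIO UDP link could be as small as one datagram of two packed doubles per frame. A sketch under that assumption; the address, port, and packet layout here are invented for illustration:

```python
import socket
import struct

CRIO_ADDR = ("10.0.0.2", 1130)  # placeholder cRIO address and port

def send_aim_packet(sock, angle_deg, distance_ft):
    """Pack angle and distance as two big-endian doubles, send one datagram."""
    payload = struct.pack(">dd", angle_deg, distance_ft)
    sock.sendto(payload, CRIO_ADDR)

def decode_aim_packet(payload):
    """Inverse of the packing above, as the cRIO side would decode it."""
    return struct.unpack(">dd", payload)

# Sender-side usage:
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   send_aim_packet(sock, 1.5, 12.0)
```

UDP suits this job: a lost frame is simply replaced by the next one ~166 ms later, so retransmission would add latency for no benefit.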
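The discrete-step steering with a dead-band might look like the following sketch; the step size and dead-band width are made-up values, chosen only to show the shape of the loop. At 6 fps, each correction must be small enough that the robot does not overshoot before the next vision frame arrives:

```python
DEADBAND_DEG = 0.5  # stop correcting once the error is inside this band
STEP_DEG = 1.0      # small fixed correction applied per vision frame

def steering_step(error_deg):
    """One discrete steering correction per vision frame.

    Returns the turn command for this frame: 0 inside the dead-band,
    otherwise a fixed small step back toward zero error.
    """
    if abs(error_deg) <= DEADBAND_DEG:
        return 0.0
    return -STEP_DEG if error_deg > 0 else STEP_DEG
```

The cRIO would apply `steering_step()` to the most recent error received over UDP, holding the drive still between updates.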
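A distance-to-shooter-speed curve built from a handful of calibration points can be evaluated with piecewise-linear interpolation, which is also why a curve based on only 5 poorly spread points is rough: each new, well-placed point directly improves the fit. The calibration table below is entirely hypothetical:

```python
import bisect

# Hypothetical calibration points: (distance_ft, shooter_rpm), sorted by distance.
CAL = [(8.0, 2400.0), (12.0, 2900.0), (16.0, 3350.0), (20.0, 3900.0)]

def shooter_speed(distance_ft):
    """Piecewise-linear interpolation through the calibration table,
    clamped to the endpoint values outside the measured range."""
    xs = [d for d, _ in CAL]
    if distance_ft <= xs[0]:
        return CAL[0][1]
    if distance_ft >= xs[-1]:
        return CAL[-1][1]
    i = bisect.bisect_right(xs, distance_ft)
    (x0, y0), (x1, y1) = CAL[i - 1], CAL[i]
    return y0 + (y1 - y0) * (distance_ft - x0) / (x1 - x0)
```

A second table with the same lookup code would implement the separate soft-ball curve the post mentions.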
Re: Kinect usage at competitions
987 is using the Kinect's IR imaging on the robot for distance. So far it has worked quite well. We also use the field Kinect as an emergency stop during autonomous.
Re: Kinect usage at competitions
At the NY Regional, I don't remember seeing any teams using the Kinect. One team I talked to said they had used it successfully at another regional (Long Island?), but that it wasn't working on the NY field.
Our team found a use for the Kinect, although it probably isn't what this thread was meant for. We set up a cool demo in our pit where you could manipulate a 3D model of our bot on a projector just using hand gestures, Minority Report-style. Another team was using the Kinect to play a xylophone. If there's any interest in our Kinect-CAD project, I'm sure I could find a video somewhere.
Copyright © Chief Delphi