#1
Re: Kinect usage at competitions
#2
Re: Kinect usage at competitions
When I saw the teams using the Kinect I couldn't tell if they were doing anything effective. You might as well clown around with it.
#3
Re: Kinect usage at competitions
I've only seen a team use it once every 3-5 matches on the webfeeds I've been watching.
#4
Re: Kinect usage at competitions
We gave serious thought to juggling balls, but then figured the balls would be ruled illegal.
#5
Re: Kinect usage at competitions
Yeah, refs are trained not to have a sense of humor about things like that.
#6
Re: Kinect usage at competitions
There is so much strategic potential with the Kinect that isn't really being utilized. A lot of teams either use it as the only method of controlling their robots or don't use it at all, which is a little silly.

HOT's paper explains that they use the Kinect as a safety override, which is a great use. Say you miss shot 1: use the Kinect to adjust for shot 2. Maybe you are trying to go to the middle bridge and someone beat you to it: use the Kinect to cancel and drive to where you want to start teleop instead of breaking your manipulator or whatever.
#7
Re: Kinect usage at competitions
We are using a Kinect (IR imaging) with some success. We do not use its auto-ranging feature, rather, we process the image. We have good accuracy on the practice field, but we are experimenting with software and IR illumination adjustments to get better consistency on the game field. We suspect that the field lighting and reflections from the backboards and possibly the floor, other robots, or the big projection screen are causing the Kinect system to "lose lock" on the targets at times. Our test field used plywood for the backboards. We did not get to calibrate to the lighting in Rochester, hopefully we can in DC.
We are using an ARM/Linux BeagleBoard-xM and a CV image-processing library, in addition to some custom image-processing code developed by the students. It does take a long time to boot... The USB Kinect plugs into the BeagleBoard, and the Beagle is on the local Ethernet.

Code by one of our students does the trig on the vertices of the trapezoids to calculate the angle and distance to the target, which is passed to the cRIO over a UDP packet link developed by another student. When the "aim" trigger is pulled, the steering loop is closed around the error signal from the Beagle (in discrete slow steps, due to the current 6-frames-per-second data rate) until it reaches the dead band near zero error. The students developed a LOT of software over the 6-week build.

On the test field we can range and aim at the center target with 0.5-degree accuracy with any 3 of the 4 targets visible, tested to about 20 feet. We did not have sufficient time at the Rochester regional to collect enough distance calibration data points (time went to fixing mechanical problems), so the curve we are currently using, based on only 5 points, is not so good. We hope the shooting will work better on the field at the Washington DC regional, and that we'll have enough time to properly calibrate there.

We also have a device that "squeezes" the ball to identify soft balls on the way in, but we haven't tried it. If things go well, we may use a different distance-to-shooter-speed curve for soft balls, or apply some other correction, but that is asking a lot...

We'd be happy to talk with anyone using the Kinect, and offer help if we can. --Dan Sternglass
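For anyone curious, the trapezoid-to-bearing trig and the discrete dead-band steering step described above can be sketched roughly like this. This is a minimal illustration, not the team's actual code: the 57-degree field of view, 640-pixel image width, step size, and all function names are assumptions.

```python
import math

FOV_H_DEG = 57.0      # assumed Kinect IR horizontal field of view
IMAGE_WIDTH = 640     # assumed IR image width in pixels
DEAD_BAND_DEG = 0.5   # stop steering once error is within this band
STEP_DEG = 1.0        # assumed discrete step size (slow updates at ~6 fps)

def bearing_to_target(vertices):
    """Angle in degrees from image center to the centroid of a target
    trapezoid given as [(x, y), ...] pixel vertices; positive = right."""
    cx = sum(x for x, _ in vertices) / len(vertices)
    pixels_off = cx - IMAGE_WIDTH / 2.0
    # simple linear pinhole approximation: degrees per pixel
    return pixels_off * (FOV_H_DEG / IMAGE_WIDTH)

def steering_step(error_deg):
    """One discrete correction: turn a fixed step toward zero error,
    or stop inside the dead band."""
    if abs(error_deg) <= DEAD_BAND_DEG:
        return 0.0
    return math.copysign(STEP_DEG, error_deg)

# Example: a trapezoid whose centroid sits 40 px right of image center
verts = [(340, 100), (380, 100), (380, 160), (340, 160)]
err = bearing_to_target(verts)       # about +3.6 degrees
print(err, steering_step(err))
```

On the real robot the error would come over the UDP link each frame and the step would be applied to the drive base until the dead band is reached, as the post describes.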
#8
Re: Kinect usage at competitions
#9
Re: Kinect usage at competitions
We use the Kinect on the robot for its infrared camera.