Is anyone here using the Kinect at competitions? I wasn’t too surprised not to see anyone use it at the week 1 event I worked at, but I figured some teams would have started using it by now. If you are, do you mind sharing how you are using it?
I saw only one team using it at Lake Superior, but then I was only watching (from the floor) during our team’s matches and some of elims.
We used it at the Rutgers MAR event; I don’t know if any other teams did. Didn’t have to debate who got the station, though.
I attempted to help our team set it up to be used at Traverse City, and we were even given some time on the field during lunch to test it. (There were problems the first day because our USB hub apparently wasn’t working with the extra cable length, and we were unable to test it on setup night.) I’d like to thank the field personnel for trying to accommodate us setting it up.
However, I was severely disappointed in how it functioned, and I can’t say for sure whether it was due to the setup at the field or just the Kinect in general, but this is what I was seeing:
Reliability: I was frequently seeing a loss of the control-enable signal (button 9), which indicates that the user’s arms are in a valid position (extended in the same XY plane). I am a bit suspicious that this could be due to cable length.
Sensitivity: The controls I had set up the robot to use (left/right leg forward/backward) seemed extremely sensitive, making the robot essentially impossible to control. Depending on the posture I stood with, I would see the buttons for leg forward/backward activating sporadically.
Speed: Our driver station (not a Classmate, but it had an Atom) was abysmal for controlling with the Kinect; it was processing 4 or 5 FPS at most. We had to set up my programming laptop to get decent results, which left me unable to make programming changes outside of extended periods of downtime, and it drained the battery much more quickly, so we eventually would have needed an external power supply installed at the field.
Height: I’m about 6’ with long arms, and the Kinect was barely able to pick up my elbows while I stood in the box.
The above issues pretty much killed all my excitement about using the Kinect. When you can’t drive the robot correctly because control is constantly cutting out, the robot is picking up different gestures based on slight changes in your stance, and it’s not picking up your full range of motion…
I wanted to try out different gestures for control (legs and head to the sides instead of legs forward/back, along with some logic to continue the last input when the control-enable signal vanishes), but if that doesn’t work, I feel I will need to recompile the KinectServer to make it less sensitive.
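I haven’t written it yet, but the “continue the last input” idea is simple enough to sketch outside of robot code. This is just Python pseudocode of the concept (in real robot code it would sit wherever you read the Kinect buttons/axes, and the grace-period length is a made-up guess, not a tested value):

```python
class HeldInput:
    """Keep using the last good drive command when the control-enable
    signal drops out, but only for a short grace period so a real loss
    of tracking still stops the robot."""

    def __init__(self, hold_cycles=10):  # ~0.2 s at 50 Hz; arbitrary guess
        self.hold_cycles = hold_cycles
        self.last = (0.0, 0.0)           # (left, right) drive values
        self.age = 0                      # cycles since last valid input

    def update(self, enabled, left, right):
        if enabled:
            self.last = (left, right)
            self.age = 0
            return self.last
        self.age += 1
        if self.age <= self.hold_cycles:
            return self.last              # coast through a brief dropout
        return (0.0, 0.0)                 # dropout too long: stop

# Quick check: a short dropout keeps the last command, a long one stops.
h = HeldInput(hold_cycles=3)
print(h.update(True, 0.5, -0.5))   # (0.5, -0.5)
print(h.update(False, 0, 0))       # still (0.5, -0.5), cycle 1 of dropout
print(h.update(False, 0, 0))       # still (0.5, -0.5)
print(h.update(False, 0, 0))       # still (0.5, -0.5)
print(h.update(False, 0, 0))       # (0.0, 0.0), grace period expired
```

The grace period is the important knob: too short and you get the same flakiness I saw on the field, too long and a genuine tracking loss leaves the robot driving blind.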
All of this, combined with the fact that we were doing really well in standard autonomous, kinda made me think it wasn’t really worth fussing with anymore.
I’m hoping to do some off season projects with it once we can permanently unbag the robot, but until then there are lots more things to worry about.
We will be using it at the Los Angeles regional this weekend, mostly to delay/activate our launcher to ensure we deconflict with our teammates’ autonomous modes. I saw too many shots miss in Week 1 because two balls were trying to enter the 3-point target at the same time.
Team 1912 Combustion will be using it at the Bayou Regional this week. Hopefully it works…
A couple of teams at Alamo and a couple of teams at FLR. All it’s really good for is taking up floor space.
It’s a joke.
Team 23 used the kinect at the WPI regional. I believe it was used to tip the bridge.
It’s a novel tool and does have some legitimate uses (see HOT’s recent technical design whitepaper), but using it for primary control in Hybrid mode is hardly better or more reproducible than a pure open-loop implementation in all but the simplest of autonomous routines.
We might play with it a bit in the offseason in case it is legal for use again, perhaps using it in a way similar to HOT.
Only saw one team using it at ON GTR East Regional, and that was only for one round. It kind of didn’t seem to do anything.
I’m thinking someone should just like breakdance in the Kinect Station. You’d get massive street cred.
We joked around about having our human player go in and pretend to shoot baskets while the robot ran autonomous, but decided we didn’t want to accidentally misrepresent ourselves to the judges.
When I saw the teams using the Kinect I couldn’t tell if they were doing anything effective. You might as well clown around with it.
I’ve only seen a team use it once every 3-5 matches on the webfeeds I’ve been watching.
We gave serious thought to juggling balls. But then figured that the balls would be ruled illegal.
Yeah refs are trained to not have a sense of humor about things like that.
This thought actually crossed my mind after Kettering.
We use the Kinect on the robot for its infrared camera.
There is so much strategic potential with the Kinect that isn’t really being utilized. A lot of teams either use it as the only method of controlling their robots or don’t use it at all, which is a little silly.
HOT’s paper explains that they use the Kinect as a safety override. That is a great use.
Say you miss shot 1 - use the Kinect to adjust for shot 2. Maybe you are trying to go to the middle bridge, and someone beat you to it - use the Kinect to cancel and drive to where you want to start teleop instead of breaking your manipulator or whatever.
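That “cancel and reposition” use is really just a per-cycle abort check layered on a scripted routine. A toy sketch of the idea (everything here is hypothetical, Python standing in for robot code; in a real FRC program `abort_pressed` would poll a Kinect gesture/button instead of a lambda):

```python
def run_hybrid(steps, abort_pressed):
    """Run scripted autonomous steps, but bail out early if the human
    player gives the (hypothetical) abort gesture on any cycle."""
    executed = []
    for step in steps:
        if abort_pressed():  # in real code: a Kinect gesture/button check
            executed.append("abort: reposition for teleop")
            break
        executed.append(step)
    return executed

# Normal run completes the script; an abort after two steps cancels the rest.
script = ["drive to bridge", "lower arm", "tip bridge"]
print(run_hybrid(script, lambda: False))
# ['drive to bridge', 'lower arm', 'tip bridge']

flags = iter([False, False, True])
print(run_hybrid(script, lambda: next(flags)))
# ['drive to bridge', 'lower arm', 'abort: reposition for teleop']
```

The nice part is the scripted routine stays exactly what you already trust; the Kinect only ever gets to say “stop,” never to drive.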
Team 23 was the only team to use the Kinect at the WPI regional.
Only one team legitimately used it, and only once; however, 79 Krunch realized about halfway through Orlando that it was a great opportunity to have someone dance through the first 15 seconds for the rest of their matches.