Guiding the robot with hand movements and muscle data

Hello,
I wrote some code to drive robots using the cascade method and muscle sensors. I was inspired by Yoda (Star Wars).
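For anyone curious how such a setup might hang together, here is a minimal sketch. Everything in it is an assumption on my part: a hypothetical `palm.xml` Haar cascade for hand detection and a `read_emg()` stub standing in for the muscle sensor. The actual project may well work differently; this just shows the general idea of mapping hand position to steering and muscle activation to throttle.

```python
import cv2

# Hypothetical Haar cascade for palm detection -- any trained hand
# cascade XML would do; "palm.xml" is a placeholder, not the author's file.
hand_cascade = cv2.CascadeClassifier("palm.xml")
cap = cv2.VideoCapture(0)

def read_emg():
    """Stub for a muscle (EMG) sensor; returns activation in [0, 1]."""
    return 0.5  # replace with a real serial/ADC read

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    hands = hand_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(hands) > 0:
        x, y, w, h = hands[0]
        # Hand position left/right of frame center -> steering in [-1, 1]
        steer = ((x + w / 2) / frame.shape[1]) * 2 - 1
        # Muscle activation -> throttle in [0, 1]
        throttle = read_emg()
        print(f"steer={steer:+.2f} throttle={throttle:.2f}")  # send to robot here

cap.release()
```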

Video of a trial on a real robot: https://youtu.be/eiMoOXukPAs
(I don't know how to embed a video in the topic.)

The source code is available here:


That's an interesting control method, but how did you get a CAD model of a robot simulated on a field?

I found it here: https://synthesis.autodesk.com/robot-library.html


In 2012 (and possibly earlier, before my time) you were allowed to control your robot during autonomous using a Kinect provided to teams. IIRC it provided the robot with a list of coordinates corresponding to the "driver's" joints, which teams could process so the robot could recognize hand signals. It wasn't very popular, but a few teams used it. It also led to some funky hijinks.
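As a rough illustration of what processing those joint coordinates might have looked like (the joint names and the gesture here are mine, not the actual Kinect data format teams received), a "both hands above shoulders" check could be as simple as:

```python
# Hypothetical skeleton data: joint name -> (x, y) in image coordinates,
# with y increasing downward. The real Kinect feed had more joints.
skeleton = {
    "left_hand": (120, 80), "right_hand": (520, 90),
    "left_shoulder": (200, 200), "right_shoulder": (440, 200),
}

def hands_up(joints):
    """True if both hands are above (smaller y than) their shoulders."""
    return (joints["left_hand"][1] < joints["left_shoulder"][1]
            and joints["right_hand"][1] < joints["right_shoulder"][1])

if hands_up(skeleton):
    print("gesture: both hands up -> e.g., run the shooting routine")
```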

In 2014, there was a significant advantage to controlling the robot with simple hand gestures in autonomous: the hot goals. If you shot the ball while the goal was "hot" (i.e., the lights around it were lit) you got a bonus. That led to CheesyVision, which also uses the driver station webcam to send signals to the robot. Unfortunately, this hack was short-lived; the next year the rule language changed from "no touching your controls during auto" to "no communicating with your robot during auto".
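The core trick is simple enough to sketch. This is not CheesyVision's actual code, just an assumed variant of the idea: sample the brightness of two fixed boxes in the driver-station webcam image, and treat "hand covering the box" (a big brightness change from calibration) as a one-bit signal per side.

```python
import cv2
import numpy as np

# Placeholder regions of the webcam frame: (x, y, w, h)
LEFT_BOX = (50, 200, 100, 100)
RIGHT_BOX = (490, 200, 100, 100)

def box_brightness(gray, box):
    x, y, w, h = box
    return float(np.mean(gray[y:y + h, x:x + w]))

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
# Calibrate with the boxes uncovered
base_left = box_brightness(gray, LEFT_BOX)
base_right = box_brightness(gray, RIGHT_BOX)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # A large deviation from the calibrated brightness = "covered"
    left_covered = abs(box_brightness(gray, LEFT_BOX) - base_left) > 40
    right_covered = abs(box_brightness(gray, RIGHT_BOX) - base_right) > 40
    print(f"left={left_covered} right={right_covered}")  # sent to the robot over the network in practice

cap.release()
```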


Wow, this (CheesyVision) is very cool :smiley: