wanted to hear what different teams used to see balls on the opposite alliance side. We tried using a driver cam on the robot, but this ended up confusing the driver: our swerve control was field-oriented while the view from the camera was robot-oriented. This led to the drive team seeing the ball "in front" of them and pushing the stick "forward", which actually drove the robot further away from the ball. Thought CD might have ideas.
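The mismatch described above comes from the rotation that field-oriented swerve applies to stick inputs. A minimal sketch (variable names are illustrative, not from any particular library) shows how a "forward" stick input flips sign in the robot frame when the robot is facing back toward its own driver station:

```python
import math

def field_to_robot(vx_field, vy_field, heading_rad):
    """Rotate field-relative stick inputs into the robot frame.

    vx_field/vy_field: driver stick in field convention (+x = downfield).
    heading_rad: robot heading from the gyro.
    This is the standard rotation used for field-oriented swerve.
    """
    cos_h = math.cos(heading_rad)
    sin_h = math.sin(heading_rad)
    vx_robot = vx_field * cos_h + vy_field * sin_h
    vy_robot = -vx_field * sin_h + vy_field * cos_h
    return vx_robot, vy_robot

# Robot faces back toward its own driver station (heading = 180 deg).
# The driver sees a ball "in front" on the robot-oriented camera and
# pushes the stick "forward" (+x in field convention). Field-oriented
# control turns that into -x in the robot frame, so the robot backs
# away from the ball (vy comes out as ~0).
vx, vy = field_to_robot(1.0, 0.0, math.pi)
print(round(vx, 2))  # -1.0
```

This is why a robot-oriented camera view and field-oriented control fight each other: the camera shows the robot frame, but the sticks command the field frame.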
We’ve had a similar issue. One thought was to have a trigger on the driver’s controller that, while held down, puts the robot into robot-oriented driving.
Haven’t implemented it yet, but that was our first idea.
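A minimal sketch of that idea (function and parameter names are hypothetical, not from any specific library): while the trigger is held, the field-to-robot rotation is skipped entirely and the stick inputs pass straight through to the drivetrain.

```python
import math

def drive_command(vx_stick, vy_stick, heading_rad, robot_oriented_held):
    """Return (vx, vy) in the robot frame for the swerve kinematics.

    robot_oriented_held: True while the driver holds the trigger.
    When held, stick inputs are treated as robot-relative and the
    gyro rotation is skipped.
    """
    if robot_oriented_held:
        # Robot-oriented: stick maps directly to the robot frame.
        return vx_stick, vy_stick
    # Field-oriented: rotate the stick vector by the robot heading.
    cos_h, sin_h = math.cos(heading_rad), math.sin(heading_rad)
    return (vx_stick * cos_h + vy_stick * sin_h,
            -vx_stick * sin_h + vy_stick * cos_h)
```

In WPILib-style Java, this would typically amount to gating the `fieldRelative` boolean passed to the drive method on whether the trigger or button is currently pressed.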
Our team was using a JeVois camera paired with a small Python script to determine the location of cargo. It ended up getting scrapped due to some bugs, and because, aside from cargo against the far fender, the sight lines weren’t too bad for our drivers; they were able to operate pretty confidently on the other side of the field without any vision assistance.
It would be nice if you could mount a 360 degree camera up high and have your driver station screen show a top down field-oriented view of the field with your robot in the middle. Surely someone is doing that currently. Can’t be as hard as making a robot jump.
This may be a bit simple, but you could always have your driver sweep close to the fender and across it to push the balls out of the blind spot. We sent our human player to the far slot, and they used hand signals to tell me (Drive Coach) how many balls were building up and at which fender.
We have the ability to switch between field-oriented and robot-oriented driving for exactly this reason. However, I don’t think our drivers have used it this year.
Since there is a human player located at the far end of the field, you technically have a spotter that can tell you a) that there are hidden balls and b) approximately where they are. If you come up with a set of signals, your human player can signal to your drive coach where they are and your drivers could do a blind sweep through the area and push them out where they can see them to scoop them up.
We have a button that the operator presses and holds that puts them into “fps driving mode”. Works pretty well.
A quick switch to robot-centric is certainly an option. Another option is to find a driver who is great in robot-centric mode all the time. That is our current approach.
If you can add active intake centering to the “driver cam”, it might be helpful in field centric driving by letting the robot do some of the work. We have this capability, but our driver rarely uses it because there is so much potential danger with all the moving robots in teleop. It works great for autonomous, though.
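One simple form of that intake centering is a proportional turn command driven by the camera's horizontal offset to the target, while translation stays field-oriented. A hedged sketch (kP and max_rate are illustrative tuning values, not from any specific robot):

```python
def centering_turn_rate(target_offset_deg, kP=0.02, max_rate=0.5):
    """Proportional turn command that rotates the robot toward a ball.

    target_offset_deg: horizontal angle from the camera crosshair to
    the ball (positive = ball is to the right of center).
    Returns a normalized rotation command clamped to [-max_rate, max_rate].
    """
    rate = kP * target_offset_deg
    return max(-max_rate, min(max_rate, rate))

# Ball 10 degrees to the right -> small clockwise turn command.
print(round(centering_turn_rate(10.0), 3))  # 0.2
```

The rotation output would replace the driver's rotation stick whenever a target is visible and the assist button is held, which is also why it is safer in autonomous: in teleop, handing rotation to the camera while other robots are moving around you is risky, as noted above.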