Autonomous without using a camera

Is it possible to make an autonomous routine without using a camera? If so, how?

1 Like

Yes, you use wheel odometry. WPILib provides kinematics objects for the most common types of drivetrains to help you keep track of your location. That is how most teams did it until AprilTags (maybe even now). Keep in mind your odometry will get worse the longer your bot runs and the more rotations you make, so non-camera autos are more limited in their accuracy and complexity, but they are possible and even common.
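
As a minimal sketch of that idea, assuming a differential (tank) drivetrain with one encoder per side and a gyro, WPILib's DifferentialDriveOdometry can keep a pose estimate for you (everything here other than the WPILib classes is a placeholder you would adapt):

```java
import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.math.geometry.Rotation2d;
import edu.wpi.first.math.kinematics.DifferentialDriveOdometry;

public class DrivePoseTracker {
  private final DifferentialDriveOdometry odometry;

  public DrivePoseTracker(Rotation2d initialGyroAngle) {
    // Starts at the field origin; pass a known starting Pose2d if you have one.
    odometry = new DifferentialDriveOdometry(initialGyroAngle, 0.0, 0.0);
  }

  /** Call every robot loop (~20 ms) with fresh gyro and encoder readings. */
  public Pose2d update(Rotation2d gyroAngle,
                       double leftDistanceMeters,
                       double rightDistanceMeters) {
    return odometry.update(gyroAngle, leftDistanceMeters, rightDistanceMeters);
  }
}
```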

5 Likes

If there is a source about this, can you share it?

What kind of drivetrain are you using?

PathPlanner works off of the position and kinematics you supply, so it can help you plan your paths with or without vision.

Regardless of your drivetrain and how you plan your autos, the WPILib documentation is the place to start. If you configure your drivetrain kinematics properly and update the odometry object with your drive states, you will have some position information. It will get worse and worse as the match goes on, since wheel slip on the carpet and collisions will mess up the pose estimate the odometry object provides. Still, it is a start and will get you through a basic auto.
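
If you happen to be on swerve, the equivalent sketch uses SwerveDriveKinematics plus SwerveDriveOdometry; the module offsets below are made-up example numbers, and the module positions would come from your own encoder reads:

```java
import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.math.geometry.Rotation2d;
import edu.wpi.first.math.geometry.Translation2d;
import edu.wpi.first.math.kinematics.SwerveDriveKinematics;
import edu.wpi.first.math.kinematics.SwerveDriveOdometry;
import edu.wpi.first.math.kinematics.SwerveModulePosition;

public class SwervePoseTracker {
  // Module locations relative to the robot center, in meters (example values).
  private final SwerveDriveKinematics kinematics = new SwerveDriveKinematics(
      new Translation2d(0.3, 0.3),    // front left
      new Translation2d(0.3, -0.3),   // front right
      new Translation2d(-0.3, 0.3),   // back left
      new Translation2d(-0.3, -0.3)); // back right

  private final SwerveDriveOdometry odometry;

  public SwervePoseTracker(Rotation2d gyroAngle, SwerveModulePosition[] positions) {
    odometry = new SwerveDriveOdometry(kinematics, gyroAngle, positions);
  }

  /** Feed fresh gyro and module distances every loop to keep the pose estimate current. */
  public Pose2d update(Rotation2d gyroAngle, SwerveModulePosition[] positions) {
    return odometry.update(gyroAngle, positions);
  }
}
```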

2 Likes

Thank you very much

Before PathPlanner provided an easy way to track your odometry, the simplest effective ‘dead reckoning’ was done by measuring out wheel distances and moving at slowish speeds, with the gyro keeping the robot straight so it would not overshoot a stop point. Then, using the gyro again, you would turn to a precise angle, stop, move in another direction, stop, turn again, and so on. By chaining all of these smaller actions together you could accomplish a lot, but the longer the routine ran, the more likely one position down the line would drift. If one position gets off its spot, all of the following commands will be wrong and out of position.
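
A rough sketch of that chaining with WPILib's command framework (Drivetrain, DriveStraight, and TurnToAngle are hypothetical classes you would write yourself around encoder distance and a gyro-held heading):

```java
import edu.wpi.first.wpilibj2.command.Command;
import edu.wpi.first.wpilibj2.command.Commands;

public class DeadReckoningAuto {
  /** A chain of short, measurable steps; each finishes before the next begins. */
  public static Command build(Drivetrain drive) {
    return Commands.sequence(
        new DriveStraight(drive, 2.0),  // drive 2.0 m, gyro holds the heading
        new TurnToAngle(drive, 90.0),   // turn to 90 degrees using the gyro
        new DriveStraight(drive, 1.0),  // drive another 1.0 m
        new TurnToAngle(drive, 180.0)); // face the next target
  }
}
```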

With PathPlanner you wouldn’t supply the low-level commands anymore, or be limited to just straight drives plus gyro turns. You could do splines (nice curves), complex paths that are difficult to measure out in feet/inches/encoder counts, etc.

You will still run into drift issues over time without a camera or some other sensor that can provide a position on the field to resync your odometry to.

Edit: This drift/localization problem exists even in ROS land, where the robot is using SLAM navigation. The walls in one room can look exactly the same as in another, and with a global map that has lots of similar rooms, the robot cannot decide where it is at first boot, or after it drifts too far without another known-good location input.

It needs either a manual input of ‘you are here’, so it can base everything on that start position, or it needs to read a fiducial like an AprilTag and use that data to know its exact real-world location on the global map.

It would be the same issue on a perfectly symmetrical field where the red and blue alliance sides are the same shape. The SLAM wouldn’t be able to decide which side you were on without some other clue from another sensor or user input. But if you can see one AprilTag and its ID, you know where that is (which side of the field), and now you can figure out the rest programmatically. That key ‘origin’ point matters. Resyncing yourself to another origin point over time, as you drift, then becomes key.
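
In FRC terms, that resync can be as simple as overwriting the drifted odometry pose whenever a trusted field pose comes in; resetPosition is the actual WPILib call, while where the trusted pose comes from (an AprilTag, a wall bump at a known spot, a manual “you are here”) is up to you:

```java
import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.math.geometry.Rotation2d;
import edu.wpi.first.math.kinematics.SwerveDriveOdometry;
import edu.wpi.first.math.kinematics.SwerveModulePosition;

public class OdometryResync {
  /** Snap the pose estimate back to a known-good field position. */
  public static void resync(SwerveDriveOdometry odometry,
                            Rotation2d gyroAngle,
                            SwerveModulePosition[] modulePositions,
                            Pose2d trustedFieldPose) {
    // Keeps using the current gyro/encoder readings, but replaces the accumulated pose.
    odometry.resetPosition(gyroAngle, modulePositions, trustedFieldPose);
  }
}
```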

2 Likes

Our team had no vision in 2023, yet we still had a very consistent balancing autonomous at our competitions.

The other thing to keep in mind… in every game I can recall, at least a few points could be scored just by running the following algorithm:

  1. spin wheels for 3 seconds
  2. stop wheels

And all it needed was correct placement at the start of the match.
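
A minimal sketch of that timed routine with WPILib commands, assuming a Drivetrain subsystem with arcadeDrive(speed, rotation) and stop() methods (those names are placeholders):

```java
import edu.wpi.first.wpilibj2.command.Command;
import edu.wpi.first.wpilibj2.command.Commands;

public class TimedDriveAuto {
  public static Command build(Drivetrain drive) {
    return Commands.run(() -> drive.arcadeDrive(0.4, 0.0), drive) // spin the wheels slowly
        .withTimeout(3.0)                                         // for 3 seconds
        .andThen(Commands.runOnce(drive::stop, drive));           // then stop
  }
}
```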

In some cases there was a starting position where a wall or other barrier existed nearby, which could “catch” the robot. This further reduces the algorithm to

  1. spin wheels

Points are points. Start simple.

5 Likes

Case in point: even if you don’t cross a line, you can earn points by spinning in place! In 2023 we had a preloaded cube on the bot and ran an untested auto. We were supposed to just drive straight and stop across the line. Either the gyro direction call was reversed or the motor direction was swapped; either way, we spun in a rapidly accelerating manner that knocked the cube into the low node, earning points anyway!

2 Likes

Yes, and like others described, we used PathPlanner with WPILib swerve odometry to get three Notes most of the time, and almost a fourth. It took some fussing, especially since we had a narrow intake (it is a bad idea to need that much accuracy to intake).

1 Like

I’m going to add that odometry alone is a good start, but you will eventually need to migrate to sensors/vision for the complicated autos.

For example, if you set up with a 1° heading error at your starting position and travel half a field (about 27 ft), you end up roughly 6” off target. Other errors add up quickly too: worn-down swerve wheels, modules that weren’t aligned the same way at startup, or the carpet weave direction differing between alliance sides. Remember that the walls also move during the competition as robots hit them and the carpet stretches out, so fixed reference points like wall corners can shift.
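
In general, a constant heading error θ over a straight run of length d produces a lateral offset of about d·sin θ:

```latex
\text{offset} = d\,\sin\theta, \qquad 27\,\text{ft} \times \sin(1^\circ) \approx 0.47\,\text{ft} \approx 6\,\text{in}
```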

For short auto paths, just odometry will work, but you will need to migrate to sensors eventually.