On The Fly Auto Path Stuttering

Hello, we are trying to implement an auto-alignment feature for the different nodes. We have it working, but once we add vision measurements through the pose estimator class built into WPILib (using a Limelight), our swerve drive wheels seem to spaz out every control loop. We think this is because of the discontinuity of our position over time and our path trying to compensate for it, since we feed the estimated position into our auto command. Is there a way to reduce this “spazzing”? Should we lower our holonomic XY PID so that it “corrects less”, or should we change the way we are adding vision measurements? Any suggestions are great! Thanks.


How are you adding vision measurements? Are you just setting your pose to that read from vision?

We are using a pose estimator. We originally set it to update whenever we have a target, with the timestamp being FPGATime − (Limelight latency + 11 ms), as we thought we saw in a prior CD thread that 11 ms is the time it takes from Limelight → robot, but we don’t believe this is the case anymore; will test. Any ideas other than that? @Brandon_Hjelstrom ?


To add to this (I’m on the same team as Henry here :) ), the Limelight documentation states:

“tl: The pipeline’s latency contribution (ms) Add at least 11ms for image capture latency.”

We are wondering whether we actually need to add the 11 ms for image capture latency, or if that might be what is causing the stuttering with the on-the-fly auto pathing.
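For what it’s worth, the timestamp math being discussed is just “now minus total latency, converted to seconds.” A minimal sketch in plain Java (the method and constant names here are hypothetical illustrations, not WPILib or Limelight API):

```java
// Sketch of the vision-timestamp calculation: the frame was captured at
// "now" minus the pipeline latency ("tl") plus the fixed image-capture
// latency the Limelight docs say to add.
public class VisionTimestamp {
    static final double IMAGE_CAPTURE_LATENCY_MS = 11.0; // from the Limelight docs

    /** nowSeconds: current FPGA time; pipelineLatencyMs: the "tl" entry. */
    static double measurementTimestamp(double nowSeconds, double pipelineLatencyMs) {
        return nowSeconds - (pipelineLatencyMs + IMAGE_CAPTURE_LATENCY_MS) / 1000.0;
    }

    public static void main(String[] args) {
        // At t = 50.0 s with 40 ms of pipeline latency, the frame was
        // actually captured 51 ms earlier, at t = 49.949 s.
        System.out.println(measurementTimestamp(50.0, 40.0));
    }
}
```

Whether the extra 11 ms matters in practice depends on robot speed: at 4 m/s, 11 ms of unaccounted latency is roughly 4.4 cm of pose error, which the estimator then “corrects” each loop.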


Do you have your code publicly available? A couple of things that I would check:

  • Make sure that your vision readings make sense for the robot.
  • Make sure you are using standard deviations when adding the vision pose (documentation)
    • This lets you determine how much you trust your vision measurements
  • Make sure that you are using the bot pose and not the camera pose, as they are likely in different places in space.
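On the standard-deviations point: WPILib’s pose estimator applies a Kalman-style correction, but the intuition can be shown with a simple inverse-variance weighted average in plain Java. This is an illustration of the concept only, not WPILib’s actual filter math:

```java
// Illustrative only: a one-dimensional, inverse-variance weighted blend of an
// odometry estimate with a vision measurement. The larger the vision standard
// deviation, the less the fused estimate jumps toward the vision reading —
// which is exactly the knob that tames per-loop "spazzing".
public class TrustDemo {
    static double fuse(double odometry, double odomStdDev,
                       double vision, double visionStdDev) {
        double wOdom = 1.0 / (odomStdDev * odomStdDev);
        double wVision = 1.0 / (visionStdDev * visionStdDev);
        return (odometry * wOdom + vision * wVision) / (wOdom + wVision);
    }

    public static void main(String[] args) {
        // Odometry says x = 2.0 m, vision says x = 2.5 m.
        System.out.println(fuse(2.0, 0.1, 2.5, 0.9)); // vision distrusted: ~2.006
        System.out.println(fuse(2.0, 0.1, 2.5, 0.1)); // equal trust: 2.25
    }
}
```

With loose vision standard deviations (0.9 m here), a half-meter disagreement only nudges the estimate by ~6 mm per update instead of yanking it a quarter meter.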

Here is a breakdown of the major parts of the code:

Drive Subsystem:

public class DriveSubsystem extends SubsystemBase {
  SwerveDrivePoseEstimator poseEstimator;

  public DriveSubsystem() {
    poseEstimator = new SwerveDrivePoseEstimator(Kinematics, Rotation2d.fromDegrees(0), getPositions(), InitialPose2d,
        VecBuilder.fill(0.1, 0.1, 0.1),   // state (odometry) standard deviations: x, y, theta
        VecBuilder.fill(0.9, 0.9, 0.1));  // vision measurement standard deviations: x, y, theta
  }

  @Override
  public void periodic() {
    poseEstimator.update(Rotation2d.fromDegrees(getHeading()), getPositions());
  }
}

Vision Subsystem

public class VisionSubsystem extends SubsystemBase {
  @Override
  public void periodic() {
    if (hasTargets()) {
      double[] bot_pose_blue = LimelightTable
          .getEntry("botpose_wpiblue")
          .getDoubleArray(new double[6]);

      double tl = LimelightTable.getEntry("tl").getDouble(40); // + 11;

      // Guard the array length *before* indexing into it
      if (bot_pose_blue.length >= 6) {
        double tx = bot_pose_blue[0];
        double ty = bot_pose_blue[1];
        double tz = bot_pose_blue[2];
        double rx = bot_pose_blue[3];
        double ry = bot_pose_blue[4];
        double rz = (bot_pose_blue[5] + 360) % 360;

        if (tx != 0 && ty != 0) {
          Rotation3d rotation = new Rotation3d(rx, ry, rz);
          Pose3d pose = new Pose3d(tx, ty, tz, rotation);

          SmartDashboard.putNumber("Estimated Rotation", rz);

          if (Timer.getFPGATimestamp() > 0.1) {
            DriveSubsystem.poseEstimator.addVisionMeasurement(
                new Pose2d(new Translation2d(tx, ty), Rotation2d.fromDegrees(rz)),
                Timer.getFPGATimestamp() - Units.millisecondsToSeconds(tl));
          }
        }
      }
    }
  }
}


We were planning on eventually throwing out bad vision readings, but for now, we are moving about a meter away, and our vision measurements seem realistic. We are using bot pose instead of camera pose as well, through the limelight interface. We’ve been messing around with the standard deviations as well, but nothing seems to have fixed it yet.
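If it helps, “throwing out bad vision readings” can be as simple as a distance gate against the current estimate. A minimal sketch in plain Java; the names and the 1.0 m tolerance are hypothetical, not from the posted code:

```java
// Sketch of a simple vision-outlier gate: reject any vision pose that is more
// than a tolerance away from the current pose estimate, so a single bad tag
// solve cannot yank the estimator (and the path follower) sideways.
public class VisionGate {
    static final double MAX_JUMP_METERS = 1.0; // hypothetical tolerance

    static boolean accept(double estX, double estY, double visX, double visY) {
        double dx = visX - estX;
        double dy = visY - estY;
        return Math.hypot(dx, dy) <= MAX_JUMP_METERS;
    }

    public static void main(String[] args) {
        System.out.println(accept(2.0, 3.0, 2.2, 3.1)); // small jump: true
        System.out.println(accept(2.0, 3.0, 5.0, 7.0)); // 5 m jump: false
    }
}
```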


I have some clarifying questions.

  1. I assume you are using AprilTags in 3D mode but would like to confirm.
  2. What is the value in your feedback loop? Are you trying to drive to a specific pose or just trying to get the vision readings to a specific value?
  3. What is this “spazzing” you’re seeing? Is it just a mild shaking back and forth at or near the setpoint? Is the robot moving more than a few inches in oscillation? Or is it not even oscillating and just taking off?

Yes, running in 3D mode.

We are trying to drive to a specific pose using a PPSwerve controller, which works fine running on only odometry, so our PID values should be fine.

The spazzing is like the swerve modules fluctuating ±10 degrees or so; really not that much, but not quite elegant. We think this could be due to minor fluctuations in the vision measurements, but we aren’t very sure.

I would try having a threshold around your goal Pose2D and maybe not sampling vision every loop. I would think that the pose estimator would be able to handle that but maybe I’m wrong. I’d start by just sampling it once when the drive to pose command runs, see how well that does, and if it fixes the issue.
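The “sample it once” idea can be sketched without any WPILib dependencies: latch the vision-corrected pose when the drive-to-pose command initializes, then advance it with odometry deltas only. Everything here is a hypothetical illustration, not the team’s actual command:

```java
// Sketch of the "sample vision once" pattern: the vision pose is latched once
// at command start, so per-frame vision jitter never reaches the path follower
// mid-run; only smooth odometry deltas move the reference afterwards.
public class SampleOnceDemo {
    private double latchedX = Double.NaN;

    /** Called once when the command is scheduled. */
    void initialize(double visionX) {
        latchedX = visionX; // take one vision sample, then ignore vision
    }

    /** Called every loop; only odometry deltas update the pose afterwards. */
    double execute(double odometryDeltaX) {
        latchedX += odometryDeltaX;
        return latchedX;
    }

    public static void main(String[] args) {
        SampleOnceDemo demo = new SampleOnceDemo();
        demo.initialize(3.0);                   // vision says x = 3.0 m at start
        System.out.println(demo.execute(0.02)); // odometry moved 2 cm: about 3.02
        System.out.println(demo.execute(0.02)); // about 3.04
    }
}
```

The trade-off is that any vision error at the moment of sampling is carried for the whole path, so this works best for short drives to a nearby target.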


We’ve had a similar problem. Our issue was that botpose angles are in degrees while Rotation3d expects angles in radians. So a normal stutter of ±1 degree becomes a stutter of ±1 rad, almost 60 degrees!

This topic was automatically closed 365 days after the last reply. New replies are no longer allowed.