Getting Started With Limelight + CTRE Tuner Swerve

I’ve been reading up on Limelight AprilTag tracking with this goal in mind: place an AprilTag within view of the Limelight and have the robot maneuver itself to an arbitrary point in space (at which the AprilTag is still within view). I’m unsure whether I’ll need 2D or 3D tracking for this task, and how to get started with 3D tracking in general (should I create a mock .fmap file with a mock AprilTag setup?)

Even if I did have a .fmap and an AprilTag setup, how should I make use of the vision data in code? My current codebase is only the CTRE Tuner generated swerve code and some boilerplate Limelight code. I’m unsure how to approach updating the pose estimator, and whether this would require 3D tracking with an uploaded .fmap file.

What should I do to get started with tackling this task?

Disclaimer: I only recently started working with the Limelight.

2D tracking is much easier, and I would recommend starting with it and only moving to 3D if needed. The main reason you would use 3D is if the Limelight’s position relative to the robot isn’t fixed (for example, if it were mounted on an elevator).

If the Limelight is fixed and you want to align to an AprilTag (an example from this year would be the AprilTags on the reef), you should only need 2D. An example of this in C++ on a tank-drive bot is included in the docs here. If this is your goal and you aren’t using 3D, you shouldn’t need a .fmap; you only need to specify the AprilTag’s ID in the Limelight pipeline. The main reason is that 2D tracking gives you how far off-center an AprilTag is, in degrees (known in the code as tx and ty), and you adjust the bot from that.
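To make the tx idea concrete, here’s a rough sketch (not tested on a robot) of turning the horizontal offset into a steering command with a proportional gain. The class, kP, and the clamp are made-up values you’d tune for your bot; on a real robot tx would come from the Limelight’s "tx" NetworkTables entry (or a helper like LimelightHelpers.getTX).

```java
// Hypothetical sketch: convert Limelight's tx (degrees off-center) into a
// rotation command with a simple proportional controller. kP and MAX_RATE
// are illustrative starting points, not library values.
public class AimMath {
    static final double kP = 0.035;     // output per degree of error (tune this)
    static final double MAX_RATE = 1.0; // clamp on the command (e.g. rad/s)

    /** Proportional steering: larger |tx| -> stronger turn, clamped. */
    public static double steeringAdjust(double txDegrees) {
        double out = kP * txDegrees;
        return Math.max(-MAX_RATE, Math.min(MAX_RATE, out));
    }
}
```

A tag 10° to the right produces a modest turn command, while a tag at the edge of the frame saturates at the clamp instead of commanding an unsafe rate.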

I haven’t fully tested this yet, but in my team’s CTRE swerve code (I think it was generated by TunerX) there is the following function which could be useful:

public void drive(Translation2d translation, double rotation, boolean fieldRelative, boolean isOpenLoop){

I’m not familiar with the pose estimator, so I can’t provide much help there.

I hope this helps. If you have any questions, feel free to ask and I will try my best to help.


This unfortunately isn’t in my Tuner X generated code. I just read through the SwerveRequest class in the CTRE library and also didn’t find anything about driving the robot to a Pose.

I found it in the swerve subsystem file. If you don’t have it, I might be able to help find another way to control the bot from the Limelight if you post your swerve subsystem code. Based on what I read in the docs, it also appears possible to control the bot with a SwerveRequest, with something like the following:

  public AlignCommand(VisionSubsystem vision, CommandSwerveDrivetrain swerve) {
    m_Vision = vision;
    m_Swerve = swerve;
    m_alignRequest = new SwerveRequest.FieldCentric()
      .withDeadband(0.1)           // m/s
      .withRotationalDeadband(0.1) // rad/s
      .withDriveRequestType(SwerveModule.DriveRequestType.Velocity)
      .withSteerRequestType(SwerveModule.SteerRequestType.MotionMagicExpo);

    addRequirements(m_Vision, m_Swerve);
  }

(Note: the request types live on SwerveModule rather than SwerveRequest, and addRequirements needs the Tuner X generated CommandSwerveDrivetrain subsystem, not the raw SwerveDrivetrain.)

After calculating adjustments:

    m_Swerve.setControl(
      m_alignRequest
        .withVelocityX(distanceAdjust)  // Forward/backward movement
        .withVelocityY(0)               // No lateral movement
        .withRotationalRate(steeringAdjust) // Rotational correction
    );

You’ll likely have to refine the code a bit, as it is something I wrote quickly. In addition, this is more verbose and harder to debug than using the drive command.
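For the distanceAdjust side, one common approach (from the Limelight docs’ ranging case study) is a trig estimate of distance from ty, fed into a simple P controller toward a desired standoff. All the constants below are hypothetical; you’d measure the mount angle and heights on your own robot:

```java
// Sketch of range estimation from ty plus a P controller on distance.
// Every constant here is a placeholder to measure/tune, not a real value.
public class DistanceMath {
    static final double MOUNT_ANGLE_DEG = 25.0; // Limelight tilt above horizontal
    static final double LENS_HEIGHT_M   = 0.5;  // lens height off the floor
    static final double TARGET_HEIGHT_M = 1.2;  // AprilTag center height

    /** Standard fixed-camera distance estimate from the vertical offset ty. */
    public static double distanceMeters(double tyDegrees) {
        double angleRad = Math.toRadians(MOUNT_ANGLE_DEG + tyDegrees);
        return (TARGET_HEIGHT_M - LENS_HEIGHT_M) / Math.tan(angleRad);
    }

    /** P controller: positive output means "drive forward, you're too far". */
    public static double distanceAdjust(double tyDegrees, double desiredM, double kP) {
        return kP * (distanceMeters(tyDegrees) - desiredM);
    }
}
```

The output of distanceAdjust would plug into withVelocityX in the request above; it goes to zero as the bot reaches the desired standoff distance.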

If you do decide to do 3D, I’ve linked the .fmap I’m using, as well as the JSON. They contain the same data in the same format, but the JSON is easier to read through. You can upload the .fmap to the Limelight Map Builder.

To make full use of the 3D tracking and pose data, I would recommend looking for teams using it on GitHub.
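As a starting point on the pose-estimator question, the usual pattern is to take the Limelight’s field-space botpose estimate and pass it to the drivetrain’s addVisionMeasurement, gated so bad frames don’t pollute odometry. Here’s an untested sketch of the gating logic only; the commented-out wiring uses LimelightHelpers names from the Limelight docs, and the thresholds are assumptions to tune:

```java
// Hypothetical gating for vision pose updates. The real wiring would look
// roughly like (names per the Limelight docs, verify against your version):
//   var est = LimelightHelpers.getBotPoseEstimate_wpiBlue("limelight");
//   if (est != null && VisionGate.shouldAccept(est.tagCount, est.avgTagDist))
//       drivetrain.addVisionMeasurement(est.pose, est.timestampSeconds);
public class VisionGate {
    static final double MAX_SINGLE_TAG_DIST_M = 4.0; // far single tags are noisy

    /** Accept multi-tag estimates, or single-tag estimates that are close. */
    public static boolean shouldAccept(int tagCount, double avgTagDistM) {
        if (tagCount < 1) return false; // no tag in view this frame
        if (tagCount == 1 && avgTagDistM > MAX_SINGLE_TAG_DIST_M) return false;
        return true;
    }
}
```

The 3D/.fmap route matters here because botpose is only meaningful when the Limelight knows where each tag sits on the field.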

This is the current code I have. Going into this, I was thinking there was probably a library-provided solution involving a PID control system, as I figured this is a common scenario.

Sorry for the late reply.

This is the code I am currently working with for April Tag alignment.

I wrote something similar and it worked okay. When you use your align command, does it jitter when rotating? The Limelight’s detection of the AprilTag is spotty too, which I think is also causing some of the jitter. Have you had this problem?

Also, I was wondering whether your align command aligns to the tag head-on, or just so that the tag is in the center of the Limelight’s vision.

The jitter can likely be fixed by tuning PID values. I haven’t had any issues with consistent AprilTag detection; you might be able to fix that by changing the Limelight’s angle, or by moving to a lower resolution and increasing FPS (also adjusting the gain, black level, etc. as described in the docs).
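On top of PID tuning, one cheap anti-jitter trick is smoothing tx with an exponential filter plus a deadband, so single noisy frames don’t twitch the bot near the setpoint. This is just an illustrative sketch; ALPHA and the deadband are made-up starting values:

```java
// Exponential smoothing of tx with a deadband. Not robot-tested; tune
// ALPHA (0..1, lower = smoother but laggier) and the deadband on your bot.
public class TxFilter {
    static final double ALPHA = 0.2;        // smoothing factor
    static final double DEADBAND_DEG = 1.0; // ignore errors smaller than this
    private double filtered = 0.0;

    /** Call once per loop with the latest tx; returns the damped error. */
    public double update(double txDegrees) {
        filtered += ALPHA * (txDegrees - filtered);
        return Math.abs(filtered) < DEADBAND_DEG ? 0.0 : filtered;
    }
}
```

Feeding the filtered value (instead of raw tx) into the steering P controller keeps the bot still once it’s roughly centered, at the cost of slightly slower response.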

My command aligns so that the AprilTag is in the center of the Limelight’s vision.