Auto-Alignment using Limelight

Hey everyone! I have a quick question. My team and I are trying to create an autonomous command that moves our robot left or right so that it's centered on an AprilTag. Right now we're having a hard time figuring out how to implement that. The only solution we've come up with so far is using a HashMap of arrays to hold angle information about the AprilTags on the field, but I'm wondering if there's a better way? Any help would be appreciated!

A simple method would be point-of-interest tracking.

It will return tx values relative to that tag, and you can instruct your robot to move left or right based on the value of tx, aiming to get it to 0.
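As a rough illustration, here's a minimal sketch of that loop in Java. It assumes the default "limelight" NetworkTables name; the class name, gain, and deadband are placeholders to adapt and tune on your own robot:

```java
import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableInstance;

public class AlignToTag {
    private static final double kP = 0.03;           // placeholder gain, tune on your robot
    private static final double kDeadbandDeg = 1.0;  // stop correcting inside this tx range

    private final NetworkTable limelight =
        NetworkTableInstance.getDefault().getTable("limelight");

    /** Returns a strafe output in [-1, 1] that drives tx toward 0. */
    public double strafeOutput() {
        boolean hasTarget = limelight.getEntry("tv").getDouble(0.0) >= 1.0;
        double tx = limelight.getEntry("tx").getDouble(0.0); // horizontal offset, degrees

        if (!hasTarget || Math.abs(tx) < kDeadbandDeg) {
            return 0.0; // no tag in view, or already centered
        }
        // Positive tx means the target is right of the crosshair; flip the sign
        // if your drivetrain uses the opposite strafe convention.
        return Math.max(-1.0, Math.min(1.0, -kP * tx));
    }
}
```

You'd feed `strafeOutput()` into your drivetrain's strafe input every loop until it settles at zero.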

The Limelight docs include an example that uses the "ty" value of a retroreflective target to drive the robot to a specific distance. The same concept applies here: instead of using "ty" to go forward/backward, you can use "tx" to go left and right.
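Here's a sketch extending the loop above to both axes at once, under the same assumptions (default "limelight" table, holonomic drivetrain, placeholder gains); `kTySetpointDeg` stands in for whatever ty reads when the robot sits at your desired distance:

```java
import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableInstance;

public class DriveToTarget {
    private static final double kPForward = 0.05;       // placeholder gain
    private static final double kPStrafe = 0.03;        // placeholder gain
    private static final double kTySetpointDeg = -5.0;  // ty reading at the desired distance

    private final NetworkTable limelight =
        NetworkTableInstance.getDefault().getTable("limelight");

    /** Returns {forward, strafe} outputs in [-1, 1] for a holonomic drivetrain. */
    public double[] calculate() {
        double tx = limelight.getEntry("tx").getDouble(0.0);
        double ty = limelight.getEntry("ty").getDouble(0.0);

        // If ty is above the setpoint the target sits "too high" in the image
        // (robot too close), so back up; below the setpoint, drive forward.
        double forward = clamp(kPForward * (kTySetpointDeg - ty));
        double strafe = clamp(-kPStrafe * tx);  // same tx logic as before
        return new double[] {forward, strafe};
    }

    private static double clamp(double v) {
        return Math.max(-1.0, Math.min(1.0, v));
    }
}
```

Each loop you'd pass the two outputs into something like `drivetrain.drive(forward, strafe, 0.0)` (a hypothetical holonomic drive call) until both errors settle.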

You can, after a bit of configuration, get the Limelight to calculate your x, y, z position plus heading and publish it to the robot via NetworkTables. You can then drive the robot from its current pose to the desired pose (or as close as you need to) and take the shot. You can ramp up the sophistication by fusing wheel odometry and a gyro into your pose estimate to increase accuracy, combining the Limelight coordinates with your odometry using something like a Kalman filter.
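For the fusion step, WPILib's pose estimators already run a Kalman-style filter under the hood, so you can feed vision poses straight in. A sketch, assuming the default "limelight" table name and that `botpose_wpiblue` publishes `[x, y, z, roll, pitch, yaw, latency(ms)]` (check the docs for your Limelight version); the estimator would be constructed elsewhere with your kinematics and updated with gyro and module positions every loop:

```java
import edu.wpi.first.math.estimator.SwerveDrivePoseEstimator;
import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.math.geometry.Rotation2d;
import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableInstance;
import edu.wpi.first.wpilibj.Timer;

public class VisionFusion {
    private final NetworkTable limelight =
        NetworkTableInstance.getDefault().getTable("limelight");

    /** Feeds the Limelight's field-space pose into an estimator that is
     *  already being updated with gyro + wheel odometry each loop. */
    public void addVisionMeasurement(SwerveDrivePoseEstimator estimator) {
        boolean hasTarget = limelight.getEntry("tv").getDouble(0.0) >= 1.0;
        double[] botpose =
            limelight.getEntry("botpose_wpiblue").getDoubleArray(new double[0]);

        if (!hasTarget || botpose.length < 7) {
            return; // no tag in view, or the array isn't the layout we assumed
        }

        // Assumed layout: [x, y, z, roll, pitch, yaw, latency(ms)]
        Pose2d visionPose =
            new Pose2d(botpose[0], botpose[1], Rotation2d.fromDegrees(botpose[5]));

        // Timestamp the measurement back by the pipeline latency so the filter
        // fuses it at the moment the image was actually captured.
        double timestampSeconds = Timer.getFPGATimestamp() - botpose[6] / 1000.0;
        estimator.addVisionMeasurement(visionPose, timestampSeconds);
    }
}
```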
