Hi teams, these last few days we have been working on AprilTag detection. So far we have been able to detect them with the Limelight software (we also know how to detect them with PhotonVision), but we are stuck on how to generate movement in a swerve chassis depending on the position of the AprilTag.
Well, it certainly depends on how your code is written, but for instance, you can use a PID controller with the AprilTag yaw as your measurement, set a setpoint based on the Limelight's measurement, and send the output to your forward velocity:
velForward = drivePID.calculate(limelight.getALimelight(), driveOffset);
velStrafe = strafePID.calculate(limelight.getXLimelight(), strafeOffset);
velGiro = rotationPID.calculate(limelight.getYaw(), rotationOffset);
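To turn those outputs into motion, you just hand them to the same drive call you already use for your joysticks. A rough sketch, where limelight, m_drive, and the gains are placeholders for your own code:

import edu.wpi.first.math.controller.PIDController;
import edu.wpi.first.math.kinematics.ChassisSpeeds;

// Sketch only: "limelight" and "m_drive" are placeholder names for your own wrapper and drivetrain.
PIDController drivePID    = new PIDController(0.5, 0.0, 0.0);
PIDController strafePID   = new PIDController(0.5, 0.0, 0.0);
PIDController rotationPID = new PIDController(0.05, 0.0, 0.0);

// Same three values as above...
double velForward = drivePID.calculate(limelight.getALimelight(), driveOffset);
double velStrafe  = strafePID.calculate(limelight.getXLimelight(), strafeOffset);
double velGiro    = rotationPID.calculate(limelight.getYaw(), rotationOffset);

// ...fed into the drive call. Robot-relative speeds make sense here,
// since the error is measured by a robot-mounted camera.
m_drive.drive(new ChassisSpeeds(velForward, velStrafe, velGiro));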
Here’s our full code; you can see how we use the Limelight to align to an AprilTag in robot/commands/limelight/autoAlign.java
Thank you very much, this will help us a lot. Just one question: the offsets, both the strafe and the drive, what do these values indicate?
Have you gotten to the point of being able to obtain the absolute position of your robot using the AprilTags?
If so, you can then produce an on-the-fly path to go to any other point on the field, using PathPlannerLib.
With something like this:
m_path = new PathPlannerPath(
    PathPlannerPath.bezierFromPoses(waypoints),
    pathConstraints,
    new GoalEndState(0.0, waypoints.get(waypoints.size() - 1).getRotation())
);
Where waypoints is a list of Pose2d variables, and the first in that list is the current location of the robot.
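For completeness, here's roughly what that can look like end to end; the target pose, constraints, and AutoBuilder setup below are example values, not from a real robot:

import java.util.List;
import com.pathplanner.lib.auto.AutoBuilder;
import com.pathplanner.lib.path.GoalEndState;
import com.pathplanner.lib.path.PathConstraints;
import com.pathplanner.lib.path.PathPlannerPath;
import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.math.geometry.Rotation2d;
import edu.wpi.first.wpilibj2.command.Command;

// First waypoint is the robot's current pose; for bezierFromPoses, each pose's
// rotation is the direction of travel at that point, not the holonomic heading.
List<Pose2d> waypoints = List.of(
    m_drive.getPose(),
    new Pose2d(1.85, 5.55, Rotation2d.fromDegrees(180)) // example target on the field
);

PathConstraints pathConstraints = new PathConstraints(3.0, 3.0, 2 * Math.PI, 4 * Math.PI);

PathPlannerPath m_path = new PathPlannerPath(
    PathPlannerPath.bezierFromPoses(waypoints),
    pathConstraints,
    new GoalEndState(0.0, waypoints.get(waypoints.size() - 1).getRotation())
);

// Assuming AutoBuilder was configured in your drive subsystem,
// this returns a command that follows the generated path.
Command driveToPoint = AutoBuilder.followPath(m_path);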
So these depend on where your camera is on your robot. Idk if you've worked with PID previously; in case you haven't, these offsets are the setpoints you want your system to move to. For example, if you want to position your chassis centered on the AprilTag and your Limelight is in the center of your robot, your setpoint would be 0, since that's where you want your Limelight value to end up. Idk if I explained myself well, but if you didn't get it, lmk and we can go deeper on this.
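In other words, with the camera centered on the chassis the setpoints are just zero; you would only use a nonzero offset if, for example, the camera is mounted off to one side (the 5-degree number below is made up):

// Camera centered on the robot: we want tx to settle at 0 degrees.
double strafeOffset = 0.0;

// Hypothetical off-center mount: the tag reads about +5 degrees when the chassis
// is actually centered, so that reading becomes the setpoint instead.
double strafeOffsetOffCenter = 5.0;

double velStrafe = strafePID.calculate(limelight.getXLimelight(), strafeOffset);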
At the moment we have not been able to obtain an absolute position of the robot; the only progress we have made is detecting an AprilTag with the Limelight. How would it be possible to obtain the position relative to the AprilTags? The on-the-fly path is quite interesting, though.
I guess a more prudent question would be, what kind of motion are you looking for? Are you wanting to simply “aim” the robot at a tag? Or do you want to drive the robot to an arbitrary position on the field, and simply use the tags for visual odometry?
Ohhh okay okay, I recognize the term now; since I'm used to calling it the setpoint I didn't make the connection. Thank you very much, I will take a closer look at the code :>
Do you have a more complete code implementing this?
Sure, please take a look at GitHub - lasarobotics/PH2024: 2024 Purple Haze FRC
We developed our own wrapper library to help with automatically logging all inputs via Advantagekit: GitHub - lasarobotics/PurpleLib: Custom library for 418 Purple Haze
We’re also using a custom pathfinding script on our coprocessor GitHub - lasarobotics/PurplePath: FRC Path Finding
We can drive to any point of interest on the field (at least in simulation), and can also “point” the robot at any point on the field, while compensating for the robot’s own velocity.
How would you implement code that would “aim” at the tag? Like, when your robot is x distance from the tag, with x being a number, the speed and pivot (or whatever other variables your chosen outtake may have) are set based on that number. I don't really know if that makes any sense, but I'm trying to figure out a general concept of the code for when we decide on an outtake.
Yup, that functionality is in our repo, PH2024. Take a look in the DriveSubsystem.java file at the aimAtPointCommand.
Awesome! Thank you.
Note that our command also compensates for the robot's own velocity, so while the robot is moving perpendicular to the target, it will “lag” the target, so to speak, to try to cancel out the robot's motion on any projectile that is shot at the target.
Hello, now we have a more concrete idea: we want the AprilTags to auto-align the chassis. As an example, let's say the movement in x and y is controlled by the joystick, but the rotation is controlled by the position of the AprilTag. What would be a good place to start?
Yup, again that exact scenario is in the previously mentioned repo.
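If you want a starting point without digging through our repo first, the shape of it is just: joystick for x/y, a PID loop on the tag's horizontal angle for rotation. A rough sketch with made-up names (the Limelight call is from the LimelightHelpers class in the Limelight docs):

import edu.wpi.first.math.controller.PIDController;

// Sketch only: m_drive, m_controller, and MAX_SPEED are placeholders for your own code.
PIDController rotationPID = new PIDController(0.05, 0.0, 0.0);

double xSpeed = -m_controller.getLeftY() * MAX_SPEED; // driver keeps control of translation
double ySpeed = -m_controller.getLeftX() * MAX_SPEED;

double tx = LimelightHelpers.getTX("limelight");   // horizontal angle to the tag, in degrees
double rotSpeed = rotationPID.calculate(tx, 0.0);  // rotate until the tag is centered

m_drive.drive(xSpeed, ySpeed, rotSpeed, true); // field-relative x/y, vision-driven rotation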
Oh, I’m sorry, I didn't read it carefully. I have never worked with simulations, so I really don't understand the code. Could you help me by explaining it?
Of course, no problem.
First, you wanna clone the code and check out the drive branch.
Then, open the command palette in VSCode with Ctrl+Shift+P, type “Simulate Robot Code”, and select Sim GUI.
Connect a controller to your computer.
Start up AdvantageScope, open a 3D Field tab, and connect it to the simulator. Drag the AdvantageKit/RealOutputs/DriveSubsytem/Pose variable to the 2D pose section at the bottom of the AdvantageScope window. Then you can drive the robot around. I believe the left bumper does the auto-aiming at the speaker.
Now with respect to the code, you'll want to look at RobotContainer.java for the aimAtPointCommand, and you can follow that code in DriveSubsystem.java.
You'll see in VisionCamera and VisionSubsystem how we get our current position using AprilTags.
Feel free to ask any more questions.
Basically, we get our absolute position from our pose estimator and take the direction of the vector between that point and the point we want to aim at. Then we plug that info into a profiled PID controller, the robot rotates toward the target, and the driver can still strafe freely around the field.
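In code, the core of that idea (stripped of our logging and wrapper classes, so this is a sketch rather than the actual aimAtPointCommand) looks roughly like:

import edu.wpi.first.math.controller.ProfiledPIDController;
import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.math.geometry.Rotation2d;
import edu.wpi.first.math.geometry.Translation2d;
import edu.wpi.first.math.trajectory.TrapezoidProfile;

// Sketch only: m_poseEstimator, m_drive, and the joystick speeds are placeholders.
Pose2d robotPose = m_poseEstimator.getEstimatedPosition();
Translation2d target = new Translation2d(0.0, 5.55); // example point to aim at

// Direction of the vector from the robot to the target.
Rotation2d desiredHeading = target.minus(robotPose.getTranslation()).getAngle();

// Profiled PID on the heading error produces the rotation rate; translation stays with the driver.
ProfiledPIDController aimController = new ProfiledPIDController(
    5.0, 0.0, 0.0, new TrapezoidProfile.Constraints(2 * Math.PI, 4 * Math.PI));
aimController.enableContinuousInput(-Math.PI, Math.PI);

double rotSpeed = aimController.calculate(
    robotPose.getRotation().getRadians(), desiredHeading.getRadians());

m_drive.drive(xSpeedFromJoystick, ySpeedFromJoystick, rotSpeed, true);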
Is it possible to easily replace the vision subsystem with a Limelight? (If I remember correctly, the README mentions using an x86 mini PC as the coprocessor, a MinisForum UM560XT.)