I am trying to use Limelight 2+ data to simply aim and range to AprilTags on the speaker and amp. We are new to using vision, and for now I'd like to avoid diving into the complexity of 3D poses and full odometry localization (maybe next year!).
Here is a link to the relevant parts of our code: GitHub - nitzky/Crescendo_2024: SwerveDrive Test Project
We are using REV MAXSwerve and a navX gyro. I believe the Limelight.java file I currently have running is probably doing much more than required… As I said, we just want a method in the DriveSubsystem that makes angle and distance adjustments to AprilTags. I will likely use a driver-pressed button in the robot container to call this method when close to the correct position. Thanks!
There is a tutorial on the LL website to do this using a PID loop and the AprilTags. Our team also uses MAXSwerve, a Limelight, and a navX, so I figured I should link the class in our code that handles aiming at AprilTags.
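For reference, the core idea in that tutorial is just a proportional controller on the Limelight's tx value. Very roughly (the gain and the drive call here are placeholders, not your actual code):

```java
import edu.wpi.first.networktables.NetworkTableInstance;

// Treat tx (degrees of horizontal offset to the tag) as the error
// and turn it into a rotation command for the drivetrain.
double tx = NetworkTableInstance.getDefault()
    .getTable("limelight").getEntry("tx").getDouble(0.0);
double kAimP = 0.03;                    // proportional gain, needs tuning on the robot
double rotationCommand = -kAimP * tx;   // flip the sign if it spins the wrong way
// then pass rotationCommand in as the rot argument of your drive() call
```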
Thank you for the code link! …I am relatively new to this level of programming, so I’m struggling to see the connection between the LL tutorial, your code, and my code…if that makes sense. The tutorial makes it appear fairly straightforward, but the complexity of the swerve drive code leaves quite a gap in implementing the example. I will continue looking through your team’s code to try to make sense of the structure and systems. Thanks!
I had nothing to do this afternoon so I coded your robot for you lol. I made a pull request on your GitHub to add my team's AprilTag rotation code. Feel free to use this (or don't) with or without crediting me. Also, if you have any questions, feel free to ask. I tried to make some comments in the code, but not everything is self-explanatory. Essentially, instead of always getting the desired rotation from the driver controller, it has the ability to use a different source (in this case the AprilTag PID loop, but more can be added if that is something you are looking to do) when you press the A button on the driver controller. Once you release the button, it should go back to using the joystick for input. This code should work, but obviously I haven't tested it on your robot.
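For context, here is a rough, untested sketch of that "swap the rotation source" idea in RobotContainer form. The details (AprilTagLock implementing DoubleSupplier, the drive() signature, the deadband value) are assumptions based on the REV MAXSwerve template, not necessarily exactly what is in the pull request:

```java
import java.util.function.DoubleSupplier;

import edu.wpi.first.math.MathUtil;
import edu.wpi.first.wpilibj.XboxController;
import edu.wpi.first.wpilibj2.command.InstantCommand;
import edu.wpi.first.wpilibj2.command.RunCommand;
import edu.wpi.first.wpilibj2.command.button.JoystickButton;

public class RobotContainer {
  private final DriveSubsystem m_robotDrive = new DriveSubsystem();
  private final XboxController m_driverController = new XboxController(0);

  // Whatever this supplier returns each loop is used as the drive rotation.
  private DoubleSupplier m_rotationSource;

  public RobotContainer() {
    DoubleSupplier joystickRotation =
        () -> -MathUtil.applyDeadband(m_driverController.getRightX(), 0.05);
    // Assumes AprilTagLock implements DoubleSupplier (PID on the Limelight's tx).
    DoubleSupplier aprilTagRotation = new AprilTagLock();

    m_rotationSource = joystickRotation;

    // drive(x, y, rot, fieldRelative, rateLimit) as in the REV MAXSwerve template.
    m_robotDrive.setDefaultCommand(
        new RunCommand(
            () -> m_robotDrive.drive(
                -MathUtil.applyDeadband(m_driverController.getLeftY(), 0.05),
                -MathUtil.applyDeadband(m_driverController.getLeftX(), 0.05),
                m_rotationSource.getAsDouble(),
                true, true),
            m_robotDrive));

    // Hold A to aim at the tag, release it to hand rotation back to the driver.
    new JoystickButton(m_driverController, XboxController.Button.kA.value)
        .onTrue(new InstantCommand(() -> m_rotationSource = aprilTagRotation))
        .onFalse(new InstantCommand(() -> m_rotationSource = joystickRotation));
  }
}
```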
I also noticed, while looking at your team's code, a mistake (I think it was a mistake, but I don't know for sure) that I made last year when adding driver and operator controls for (I presume) the same reason you did. When your robot is "settingX" (locking your wheels in an X pattern to brake), that command runs for the entire time the button is pressed (i.e. if you hold down the setX button for one second, the function runs 50 times). For many of the functions you are using (enabling shooters, etc.) you only need to call the function once when the button is pressed, and sometimes a different function when the button is released. This can be done by using a combination of onTrue/onFalse instead of whileTrue/whileFalse and InstantCommand instead of RunCommand. To enable your intake, you currently use whileTrue with a RunCommand (both styles are sketched below). That works, but every 20 ms that the button is not pressed it calls the stopIntake() function again, and the same goes for startIntake() while it is pressed, which I'd assume is harder on the CPU. When I added the code to change rotation sources (manual joystick input vs. the AprilTag PID loop), I used the onTrue/onFalse methods and InstantCommands.

I didn't change this in your code, because I am not your programmer and it isn't my business to change things that I don't know 100% how they work, but I figured I would let you know since this is an issue I ran into last year.
Wow… this was extremely generous of you! I can't thank you enough for taking the time to help us with this… you are truly embracing the spirit of this great community! I will absolutely use the code and give you credit. I'll try these changes ASAP and let you know how it works!
I'm looking over the code now… I had a feeling the whileTrue and RunCommand were not the best way to do that. I can see why your approach works better, and I will make those changes. So yes, it was a mistake in that I copied the function from the setX method because I didn't know better haha. Thanks again for your help… truly… I'll get back to you!
Okay, so I have your code integrated with ours in VS Code, and everything checks out without errors! (I won't have access to our robot until Tuesday to test.) But I did have two questions, if you don't mind indulging me more on this learning curve…
- In your bit of code, the Limelight data is called in the AprilTagLock file, but I don't see the Limelight or LimelightHelpers files imported there. Is that not required, or is it all handled by the NetworkTableInstance class? And as long as I have the Limelight correctly mounted and configured, any needed tx/ty/ta data will be pulled with that class?
- If I also wanted to range distance to the AprilTag, could I use a similar approach, but with 'ty' instead of 'tx' and 'getY' instead of 'getR'?
To answer your first question: I am not using any helper code to get the LL data. While you can do that, I found it easier to just use NetworkTables. The Limelight publishes all of its values to NetworkTables, and as long as it is plugged into Ethernet and accessible on the network, all of the values should be visible to the robot.
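For example (assuming the Limelight's default "limelight" table name), pulling the values in Java looks roughly like this:

```java
import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableInstance;

NetworkTable limelight = NetworkTableInstance.getDefault().getTable("limelight");
double tv = limelight.getEntry("tv").getDouble(0.0); // 1.0 when a target is in view
double tx = limelight.getEntry("tx").getDouble(0.0); // horizontal offset, degrees
double ty = limelight.getEntry("ty").getDouble(0.0); // vertical offset, degrees
double ta = limelight.getEntry("ta").getDouble(0.0); // target area, % of image
```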
Getting distance to AprilTags is a different beast that is also shown in a Limelight tutorial and in my team's code. It requires some basic trigonometry, but nothing more than SOHCAHTOA and a few measurements made on your robot. Again, I would be happy to help you implement this on your robot.
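Just so you can see the shape of it, here is the kind of math involved for a fixed-mount camera (the numbers are placeholders; you'd measure your own mount angle, lens height, and tag height):

```java
import edu.wpi.first.networktables.NetworkTableInstance;

double limelightMountAngleDegrees = 25.0;  // degrees the camera is tilted up from horizontal
double limelightLensHeightMeters  = 0.50;  // lens center height off the floor
double targetHeightMeters         = 1.45;  // height of the AprilTag center (measure yours)

double ty = NetworkTableInstance.getDefault()
    .getTable("limelight").getEntry("ty").getDouble(0.0);

// SOHCAHTOA: tan(angle) = opposite / adjacent, so adjacent = opposite / tan(angle)
double angleToTargetRadians = Math.toRadians(limelightMountAngleDegrees + ty);
double distanceMeters =
    (targetHeightMeters - limelightLensHeightMeters) / Math.tan(angleToTargetRadians);
```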
Another thing to keep in mind is that I don't have access to your robot, so I'm just doing what worked for mine. Possible problems with my code would be the PID not being tuned, and you may need to add a negative sign to the rotation if it is rotating the wrong way.