3015 Ranger Robotics 2023 Code Release

Team 3015 is proud to release the code for our 2023 robot, co-driver dashboard, and pit display!

A few highlights:

  • Inverse-kinematics-based “point to position” control of the turret and arm that places a game piece held in the intake at a 3D position in field coordinates (a rough sketch of the math follows this list)
  • Automated scoring (cones only) using an arm mounted limelight to verify when the game piece is on-target
  • Multi-limelight AprilTag odometry correction
  • USB camera to determine cone position within intake for more accurate cone scoring
  • Object detection limelight pipeline for floor intake driver assistance
  • Automated system checks to verify robot functionality before a match
  • Touchscreen co-driver dashboard to control the robot’s target scoring position + other functionality
  • Pit display that will show current event ranking, schedule, and stream as well as connect to the robot to provide diagnostic data and run our system checks
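
For anyone curious what the “point to position” math boils down to, here’s a rough sketch of solving turret yaw, arm pitch, and extension from a 3D field-coordinate target. This isn’t our actual code; the names, frame conventions, and the assumption that the turret pivot sits at the robot center are simplifications:

```java
import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.math.geometry.Translation2d;

public class PointToPositionSketch {
  /**
   * Solve turret yaw (rad), arm pitch (rad), and arm extension (m) so the end of the arm
   * reaches a target given in field coordinates. Assumes the turret pivot is at the robot's
   * center at pivotHeightMeters; camera/end-effector offsets are omitted for brevity.
   */
  public static double[] solve(
      Pose2d robotPose, double targetX, double targetY, double targetZ,
      double pivotHeightMeters, double retractedArmLengthMeters) {
    // Target relative to the robot, rotated into the robot frame
    Translation2d toTarget = new Translation2d(targetX, targetY)
        .minus(robotPose.getTranslation())
        .rotateBy(robotPose.getRotation().unaryMinus());

    double turretYaw = Math.atan2(toTarget.getY(), toTarget.getX());
    double horizontal = toTarget.getNorm();
    double vertical = targetZ - pivotHeightMeters;

    double armPitch = Math.atan2(vertical, horizontal);
    double armExtension = Math.hypot(horizontal, vertical) - retractedArmLengthMeters;

    return new double[] {turretYaw, armPitch, armExtension};
  }
}
```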
36 Likes

Did you guys think of using point-of-interest AprilTag tracking to auto-score cubes as well?

Our team planned to do that, but in the end we didn’t use the Limelight.

Nope, everything is odometry based. We just use the AprilTags for correction (it’s slightly different in auto, where we don’t actually correct odometry but still use the estimated botpose). This way we can still aim relatively accurately even if an AprilTag isn’t currently visible, or we lose our limelights.
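
For reference, the correction step is basically the standard WPILib pose-estimator pattern. A rough sketch for one camera (the names and latency handling here are assumptions, not our exact code) looks like:

```java
import edu.wpi.first.math.estimator.SwerveDrivePoseEstimator;
import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.math.geometry.Rotation2d;
import edu.wpi.first.networktables.NetworkTableInstance;
import edu.wpi.first.wpilibj.Timer;

public class LimelightCorrectionSketch {
  /** Fuse one Limelight's field-space botpose into the pose estimator; call once per camera. */
  public static void addMeasurement(String limelightName, SwerveDrivePoseEstimator estimator) {
    var table = NetworkTableInstance.getDefault().getTable(limelightName);

    // Only trust the pose when the camera actually sees a tag
    if (table.getEntry("tv").getDouble(0) != 1) {
      return;
    }

    double[] botpose = table.getEntry("botpose_wpiblue").getDoubleArray(new double[6]);
    Pose2d visionPose = new Pose2d(botpose[0], botpose[1], Rotation2d.fromDegrees(botpose[5]));

    // Latency entries are reported in milliseconds (exact keys depend on LL firmware version)
    double latencySec =
        (table.getEntry("tl").getDouble(0) + table.getEntry("cl").getDouble(0)) / 1000.0;

    estimator.addVisionMeasurement(visionPose, Timer.getFPGATimestamp() - latencySec);
  }
}
```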

The auto scoring doesn’t really work for cubes since there isn’t the verification from two different vision targets like there is with cones. We could do it based on just the AprilTags, but the risk of possibly missing more cubes wasn’t worth the small benefit in cycle time for us. It’s a lot easier for the driver to see when a cube is on-target than when a cone is, which is why the benefit of doing it for cones was worth it.

I fully understand the accuracy concern, but the cube placements had such a high error tolerance that trusting AprilTags, especially on a swerve drive that doesn’t slip excessively, shouldn’t have been a high risk. Did relying on AprilTags alone actually have an impact on accuracy?

We probably could have done it without much issue; it just didn’t seem worth the risk with the time we had left. The auto-scoring for cones literally came together 2 days before packing up for champs.

There were definitely situations we noticed in practice where our cubes could miss without driver intervention even if the arm was “on target”. So, leaving the driver as the second source of validation for cubes made sense for us.

1 Like

Ah, our auto scoring came together in an hour using tables to fake nodes and cones, but we had spent months building it in simulation. Still, with the turret system, your team’s auto scoring of cones was incredible. Also love the auto pit check systems.

2 Likes

Do you use PathPlanner to generate paths in teleop to take you to the scoring/subsystem positions?

Nope. We left driving almost entirely up to the driver. In retrospect, doing some trajectory generation to get to the scoring position could have made some cycle time improvements and prevented some other issues we had, but we decided fairly early on not to go down the route of on-the-fly generation.
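
For anyone wondering what that route would have looked like, on-the-fly generation with the 2023 PathPlannerLib API is roughly the snippet below. We never shipped this; the waypoints and constraints are placeholders:

```java
import com.pathplanner.lib.PathConstraints;
import com.pathplanner.lib.PathPlanner;
import com.pathplanner.lib.PathPlannerTrajectory;
import com.pathplanner.lib.PathPoint;
import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.math.geometry.Rotation2d;
import edu.wpi.first.math.geometry.Translation2d;

public class OnTheFlyPathSketch {
  /** Generate a one-off path from the robot's current pose to a fixed scoring pose. */
  public static PathPlannerTrajectory toScoringSpot(Pose2d currentPose, Rotation2d headingToGoal) {
    return PathPlanner.generatePath(
        new PathConstraints(3.0, 2.5), // max velocity (m/s), max acceleration (m/s^2)
        new PathPoint(currentPose.getTranslation(), headingToGoal, currentPose.getRotation()),
        new PathPoint(
            new Translation2d(1.8, 4.4),   // placeholder scoring position
            Rotation2d.fromDegrees(180),   // heading of travel at the end point
            Rotation2d.fromDegrees(180))); // holonomic rotation at the end point
  }
}
```

The resulting trajectory would then get handed to whatever holonomic path-following command the robot already uses.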

2 Likes

Wait did your auto scoring not use pathfinding or even a point controller?!

Not sure what you mean exactly by “point controller”, but I’m assuming you mean moving the drive base to a point on the field. The answer for both is no. The driver gets the robot close to the desired scoring position and holds an “aim” button; the robot then puts the cone on the target pole, and when it’s verified as “on-target” by the arm Limelight, the intake shoots it out. Our driver was really good at driving, so we just let him cook instead of taking over control of the drivetrain.
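
In command-based terms the flow is roughly the sketch below; the subsystem and method names are made up for illustration, not our actual classes:

```java
// Inside RobotContainer.configureBindings(), using WPILib command factories.
// While the driver holds "aim": keep the turret/arm tracking the selected node,
// and once the arm Limelight confirms the cone is on target, eject it.
aimButton.whileTrue(
    Commands.run(() -> turretArm.aimAt(selectedNodePose), turretArm)
        .alongWith(
            Commands.waitUntil(() -> turretArm.onTarget() && armLimelight.coneOnTarget())
                .andThen(intake.ejectCone())));
```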

5 Likes

Interesting. Also, a point controller is just navigating a holonomic drive base to a point (x, y, theta), but I’m impressed a driver could do that so quickly. We tuned ours to the bot’s physical acceleration limits without it tipping.
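
For anyone who hasn’t built one, a bare-bones point controller in WPILib terms looks something like this (the gains and constraints here are placeholders; ours were tuned to the real acceleration limits):

```java
import edu.wpi.first.math.controller.PIDController;
import edu.wpi.first.math.controller.ProfiledPIDController;
import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.math.kinematics.ChassisSpeeds;
import edu.wpi.first.math.trajectory.TrapezoidProfile;

/** Bare-bones holonomic point controller: three PID loops chasing a goal pose. */
public class PointControllerSketch {
  private final PIDController xController = new PIDController(3.0, 0.0, 0.0);
  private final PIDController yController = new PIDController(3.0, 0.0, 0.0);
  private final ProfiledPIDController thetaController = new ProfiledPIDController(
      4.0, 0.0, 0.0, new TrapezoidProfile.Constraints(Math.PI, Math.PI));

  public PointControllerSketch() {
    thetaController.enableContinuousInput(-Math.PI, Math.PI);
  }

  /** Call periodically with the latest pose estimate; feed the result to the drivetrain. */
  public ChassisSpeeds calculate(Pose2d currentPose, Pose2d goalPose) {
    return ChassisSpeeds.fromFieldRelativeSpeeds(
        xController.calculate(currentPose.getX(), goalPose.getX()),
        yController.calculate(currentPose.getY(), goalPose.getY()),
        thetaController.calculate(
            currentPose.getRotation().getRadians(), goalPose.getRotation().getRadians()),
        currentPose.getRotation());
  }
}
```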

I’m honestly amazed by the driver skill, and I guess a point controller isn’t as necessary when you have a turret.

Yeah. The turret was really the main reason we felt like we didn’t need to do any automated driving in teleop. There is a really wide margin for error since our turret is limited to +/- 60 degrees and the arm can extend almost 1 meter, so the robot can be pretty far away from where it wants to aim and still make it. I can’t recall exact match numbers, but we had a couple of situations where another robot was in the way of where we wanted to score and we could still aim for it from off to the side. Using a trajectory here would have failed to reach the position we needed to get to, which was another reason we avoided doing that.

Really nice, elegant robot code, and nice robot overall. Honestly quite shocked that your alliance didn’t make at least division finals.

One question-- looking at this bit of code in your library, it looks like you’ve used a Jetson for AprilTag detection (but then cut it).

We have an unused Jetson Nano sitting in our inventory (I imagine it’s lonely), and I would really love to use it instead of a multi-Pi or mini-PC setup for vision coprocessing. However, 1) OpenCV ArUco and the base AprilTag libraries run on the CPU, 2) the Jetson is kinda terrible for CPU tasks (or at least worse than the alternatives), and 3) Nvidia’s GPU-accelerated AprilTag libraries are closed-source and ROS-only.

I’d really love to learn more about your setup, as 997 is spending time this offseason moving away from PhotonVision to our own vision system.

The apriltag setup on the jetson was only really a thing before the LL3 became available and we were able to play around with their apriltag detection, so that was changed out fairly early. The limelight just provided a better, more well rounded solution than what we currently had with the jetson. It basically just used the apriltag python package and opencv + some hardware accelerated video decoding. Was looking into the Nvidia apriltag ROS package at the time but never progressed with it.

I don’t think the jetsons are necessarily bad at cpu tasks. We have the Xavier NX boards which could do 720p 3d apriltags at about 90fps, or 60fps with some heavy ML stuff going on in the background. This performance would be even better with the new Orin nano/nx modules that are starting to become available.

We will likely stick with the limelights for apriltags in the future, but may take another look at using the jetson for it. It was just really easy for us to add another limelight when we found out we needed more apriltag cameras than it would be to either add another camera to the jetson and slow down the processing of the original one or add a whole other jetson system to the robot.

1 Like

2 questions relating to the pit display, since I was very fascinated by it when I saw it at the championships:

  1. How exactly do you run it? I see different entry points in different files

  2. Is there a way other teams can customize it so it can work with their specific systems?

  1. It’s just a Flutter app, so once it gets built there will be an exe that you can run. You can look up how to install Flutter and build an app; they have some pretty good documentation.

  2. It could be customized fairly easily if you are able to figure out how to use the NetworkTables stuff in there. It’s a bit messy and missing some functionality since it was purpose-built for just running the pit display and dashboard, but there are a bunch of examples for all of our subsystems and whatnot. The biggest thing would just be learning how to use Flutter, and then it’s as customizable as any other Flutter project.

Are there any plans to make it a more documented project/library that any team can use and customize? I’ll try to figure it out but it’s a bit difficult to trace through it all and see which parts I would have to change for our robot.

1 Like

Probably not. The event page is already general enough that it can handle any team at any event but all the stuff for the system checks is so specific to our team/code and how we do things that it would be a very significant overhaul to make it customizable enough for general use. Not saying never but unlikely. It would be a lot of bandwidth to dedicate to a project that would probably be pretty niche.

Not full system checks, but I liked the self-check system for hardware, so I turned it into a proper package that I’m testing. So far I’ve added support for the PCM, PDB/PDH, NavX, & Pigeon IMU, written full documentation for it, and changed the callback system into a consumer so that it can be used as a package. If you’re free and would be interested in looking over the system and giving any advice, that would be fantastic.
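
To give an idea of the shape of it, each device check just takes a Consumer for reporting faults, roughly along these lines (a simplified sketch with a PDH example, not the actual package code; the CAN ID and voltage threshold are placeholders):

```java
import java.util.function.Consumer;

import edu.wpi.first.wpilibj.PowerDistribution;
import edu.wpi.first.wpilibj.PowerDistribution.ModuleType;

/** Hardware self-check that reports faults through a Consumer instead of a
 *  hard-coded callback, so the package carries no team-specific hooks. */
public class PDHSelfCheck {
  private final PowerDistribution pdh = new PowerDistribution(1, ModuleType.kRev);

  public void run(Consumer<String> reportFault) {
    var faults = pdh.getFaults();
    if (faults.Brownout) {
      reportFault.accept("PDH: brownout fault active");
    }
    if (faults.CanWarning) {
      reportFault.accept("PDH: CAN warning");
    }
    if (faults.HardwareFault) {
      reportFault.accept("PDH: hardware fault");
    }
    if (pdh.getVoltage() < 11.5) {
      reportFault.accept("PDH: battery voltage low (" + pdh.getVoltage() + " V)");
    }
  }
}
```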

If you want access to the early testing of it I’d be happy to share if you could report any issues you find.

If you send it over, sure.

1 Like