Code Orange: 3476 - Code Release

The Link!

High Level Overview

  • We’re using AdvantageKit and AdvantageScope to log everything in our robot code. We’re utilizing an IO layer to abstract the hardware from the control logic and to record all the inputs to our subsystems.
    • We have over 450 fields being logged!
  • The AutoBuilder is used for creating all of our autonomous routines
  • Controls
    • Normal Usage
      • The Operator selects what scoring location we want in a grid using a button panel arranged in a 3x3 grid.
      • The driver uses an Xbox Controller
      • Automations
        • A beam break sensor is used to automatically close the grabber in certain mechanism states
        • The mechanism is automatically closed in certain states once the grabber is closed
        • The grabber is always automatically closed when retracting
    • Manual Controls
      • Our stick’s Y axis can be used to rotate the pivot of our grabber mechanism up and down
      • A joystick on our button panel can be used to move the grabber up and down
      • Extra buttons on the joystick can be used to explicitly set the mechanism to different states
        • Would be required if our pose estimator completely failed (which has never happened at comp)
        • Mainly used for systems testing in the pit
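As a rough sketch of the IO-layer pattern described above (all class and field names here are illustrative assumptions, not the team’s actual code), the control logic only ever reads a logged inputs object, while the real hardware or a simulation sits behind an interface:

```java
// Minimal sketch of an AdvantageKit-style IO layer. Names are illustrative.
public class IoLayerSketch {
    /** Inputs that get logged every loop and can be replayed later. */
    public static class GrabberInputs {
        public boolean beamBreakTripped;
        public double pivotPositionRad;
    }

    /** Abstraction over the real hardware. */
    public interface GrabberIO {
        void updateInputs(GrabberInputs inputs);
        void setGrabberClosed(boolean closed);
    }

    /** A stand-in "hardware" implementation for illustration. */
    public static class SimGrabberIO implements GrabberIO {
        boolean closed = false;
        boolean beamBreak = false;
        public void updateInputs(GrabberInputs inputs) {
            inputs.beamBreakTripped = beamBreak;
        }
        public void setGrabberClosed(boolean c) { closed = c; }
    }

    /** Control logic never touches hardware directly: close when the beam break trips. */
    public static boolean shouldClose(GrabberInputs inputs) {
        return inputs.beamBreakTripped;
    }
}
```

Because the decision (`shouldClose`) depends only on the logged `GrabberInputs`, the same logic runs identically on the real robot, in simulation, and in log replay.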

Vision

Our vision system utilizes 3 cameras (2 Limelights and an Intel Realsense D455 connected to a Beelink MiniPC).

Intel Realsense D455

The camera is mounted on top of our elevator, facing forward on our robot.

The pose for each tag is sent over NT to the roboRIO. Each tag pose is then transformed into field space on the rio and sent to the pose estimator individually.

When calculating the translation of the robot from each tag, we don’t use the tag’s orientation. We instead use the expected orientation from our gyro, which helps us get useful poses from anywhere on the field.
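A minimal sketch of that idea, assuming a tag at a known field position and a camera-measured translation in the robot frame (the names, frames, and signs here are illustrative assumptions, not the actual implementation): the gyro heading, rather than the tag’s measured orientation, rotates the measurement into field space.

```java
// Sketch: robot field pose from one tag detection plus the gyro heading.
public class TagToFieldPose {
    /**
     * @param tagFieldX, tagFieldY     known field position of the tag (m)
     * @param tagInRobotX, tagInRobotY tag translation measured in the robot frame (m)
     * @param gyroYawRad               trusted heading from the gyro (rad)
     * @return {robotX, robotY} on the field
     */
    public static double[] robotPose(double tagFieldX, double tagFieldY,
                                     double tagInRobotX, double tagInRobotY,
                                     double gyroYawRad) {
        // Rotate the robot-frame translation into the field frame using the gyro,
        // then subtract it from the tag's known field position.
        double cos = Math.cos(gyroYawRad), sin = Math.sin(gyroYawRad);
        double dxField = cos * tagInRobotX - sin * tagInRobotY;
        double dyField = sin * tagInRobotX + cos * tagInRobotY;
        return new double[] { tagFieldX - dxField, tagFieldY - dyField };
    }
}
```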

Limelights

After running into occlusion issues with only having our main camera at our first comp we added two limelights that point to the left & right at an ~15-degree angle. They ensure that we always have a tag visible while we’re scoring and picking up game pieces.

We’re simply using the botpose the limelights output and applying it directly to our pose estimator. We found that the pose estimate degraded quickly once we moved too far away, so we throw out any poses when we’re > 4 meters away from a tag. It’s possible that the new camera calibration features would help significantly with this.
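The distance gate could look something like this minimal sketch (the 4 m threshold comes from the text above; the class and method names are made up for illustration):

```java
// Sketch of the distance gate: drop any Limelight botpose whose nearest
// visible tag is more than 4 meters away.
public class VisionFilter {
    static final double MAX_TAG_DISTANCE_METERS = 4.0;

    /** Accept a vision pose only when the closest tag is near enough. */
    public static boolean acceptPose(double distanceToNearestTagMeters) {
        return distanceToNearestTagMeters <= MAX_TAG_DISTANCE_METERS;
    }
}
```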

Auto Lineup

  • While the button is held down, control of the robot’s drivetrain is swapped to the robot code.
  • When the button is initially pressed, the robot is commanded to continue moving at its current velocity. A path is then generated asynchronously taking the robot from its current position to the target goal.

Path Generation

Choosing where to go

  • The same button that is used to line up to scoring nodes is also used to line up to pickup positions
    • We first decide whether we want to go to a pickup position or scoring node based on which side of the field we’re
      on + some velocity based lookahead.
      • Scoring
        • The grid that we want to go to is predicted based on the robot’s position + a velocity lookahead.
        • Once the grid is determined the actual node we want to go to is based on what is selected on the
          operator’s button panel
      • Pickup
        • The normal left-paddle will auto lineup to the inner double-substation
        • The right-bumper is used to line up to the outer double-substation
          • We initially tried position and velocity based prediction for this, but it was too unpredictable for
            our driver

Creating the path

  • We start the spline at the robot’s current position, using the robot’s current velocity as the initial heading
    and tangent length (times a scaling factor).
  • The spline ends at the target position with a tangent that points into the wall a short amount. (This helps the
    robot drive a smoother S-curve.)
  • Other points are dynamically added in between the start and end position if needed to avoid the charging station.
  • We then pass this to WPILib’s trajectory generator to return a path the robot can drive.
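In the real code this setup feeds WPILib’s trajectory generator; as a hedged illustration of the spline idea only, a plain cubic Hermite shows how the start tangent (scaled current velocity) and end tangent (a short push toward the wall) shape the curve. Everything here is illustrative, not the team’s implementation.

```java
// Illustrative cubic Hermite spline for one axis: the tangents at the two
// endpoints are the knobs described above.
public class LineupSpline {
    /** Evaluate a cubic Hermite spline at t in [0, 1]. */
    public static double hermite(double p0, double m0, double p1, double m1, double t) {
        double t2 = t * t, t3 = t2 * t;
        return (2*t3 - 3*t2 + 1) * p0 + (t3 - 2*t2 + t) * m0
             + (-2*t3 + 3*t2) * p1 + (t3 - t2) * m1;
    }

    /** Start tangent = current velocity times a scaling factor. */
    public static double startTangent(double velocity, double scale) {
        return velocity * scale;
    }
}
```

Running the same evaluation on x and y (with the end tangent pointed into the wall) traces the smooth S-curve the bullet points describe.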

Driving the path

  • We use the same path following code that we have for driving during the autonomous period.
  • Once the path is complete we have an additional position PID controller that is continually running.
  • Vision is continually correcting the position of the robot as it drives the path.
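The “hold position” controller at the end of the path might look like this minimal proportional sketch (the gain is an illustrative assumption, not the team’s tuned value):

```java
// Sketch of the position PID that keeps running after the path completes:
// proportional feedback on the remaining field error.
public class HoldPositionController {
    static final double kP = 3.0; // illustrative gain (m/s per m of error)

    /** Velocity command that drives the remaining position error to zero. */
    public static double velocityCommand(double currentMeters, double targetMeters) {
        return kP * (targetMeters - currentMeters);
    }
}
```

Because vision keeps correcting the pose estimate while this runs, the error term stays honest even if the wheels slipped during the path.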

Mechanism Toggling

  • The same button is used to bring the mechanism up for scoring and double-substation pickup.
    • The robot’s position on the field determines which setpoint to go to
  • When scoring, the selected position on the button panel also determines the setpoint to use.
  • The selected position also determines which piece mode to use which changes the strength of the grabber.

Controlling the mechanism

  • Each of the mechanisms controls its position independently with its own PID controller and feedforward to
    compensate for gravity.
  • Our MechanismStateManager class converts setpoints represented as 2D coordinates into the 1D setpoints that each
    mechanism needs to go to.
    • The MechanismStateManager also ensures we don’t command the mechanisms past their limits or into field elements.
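As a hedged sketch of what a 2D-to-1D conversion like this could look like, here is a toy elevator-plus-pivot geometry with soft-limit clamping. The geometry, limits, and names are assumptions for illustration, not the team’s real MechanismStateManager.

```java
// Toy 2D -> 1D conversion: a desired end-effector (x, y) becomes an elevator
// height and a pivot angle, clamped to illustrative soft limits.
public class MechanismStateManagerSketch {
    static final double ARM_LENGTH = 0.5;   // m, illustrative
    static final double MAX_ELEVATOR = 1.2; // m, illustrative soft limit
    static final double MIN_ELEVATOR = 0.0;

    static double clamp(double v, double lo, double hi) {
        return Math.max(lo, Math.min(hi, v));
    }

    /** @return {elevatorHeightMeters, pivotAngleRad} for a desired (x, y). */
    public static double[] toMechanismSetpoints(double x, double y) {
        // Horizontal reach fixes the pivot angle; the elevator makes up the height.
        double pivot = Math.acos(clamp(x / ARM_LENGTH, -1.0, 1.0));
        double elevator = clamp(y - ARM_LENGTH * Math.sin(pivot),
                                MIN_ELEVATOR, MAX_ELEVATOR);
        return new double[] { elevator, pivot };
    }
}
```

The clamp on the elevator setpoint is where a manager like this stops the mechanism from being commanded past its limits.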

If you all have any questions about anything feel free to ask below and I’ll try my best to respond!


3476 had an excellent robot this year and I’ve been very impressed by the auto builder stuff over the last couple of years. It was a pleasure to meet you and the rest of the 3476 students and mentors at champs this year. 3476 has been an inspiration to me for many years now.


Awesome explanation Varun. This code does more than I ever realized, and it really showed with its success in competition. Congrats on a great season!


Great bot, why do most teams use a Xbox controller instead of a single joystick?

Usually it’s a matter of comfort and familiarity with the game controllers the students are used to. I personally find full sized joysticks to be more comfortable, but lost that battle years ago…


I love how clean and beautiful your code is! I have a question: How did you tune your PID and motion profiling? Did you tune PID with the motion profile or without it first?

iirc we only use motion profiling for our turn PID. For that we first tuned the PID portion so it got to the target position as fast as possible, and then configured the profile to achieve a balance between how fast we get to our target angle and how much turning would slow down our robot while we drove a path.

The motion profiling part was something we played around with quite a lot throughout the season. Having it turn too quickly would cause us to fall behind on our trajectory (and then slam into the scoring nodes because the position PID is trying to catch us up), but if we were too slow we wouldn’t be done turning in time.
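The trade-off described above is essentially the choice of constraints for a trapezoidal velocity profile, where max velocity and max acceleration are the knobs being balanced. A minimal sketch, with all values illustrative:

```java
// Illustrative trapezoidal velocity profile: accelerate, cruise, decelerate.
// Assumes totalTime >= 2 * maxVel / maxAccel so a cruise phase exists.
public class TurnProfileSketch {
    /** Profile velocity at time t. */
    public static double profileVelocity(double t, double maxVel,
                                         double maxAccel, double totalTime) {
        double accelTime = maxVel / maxAccel;
        if (t < accelTime) return maxAccel * t;                  // ramp up
        if (t > totalTime - accelTime)
            return Math.max(0, maxAccel * (totalTime - t));      // ramp down
        return maxVel;                                           // cruise
    }
}
```

Lowering `maxAccel`/`maxVel` is the “turn more gently” direction; raising them is the “finish turning in time” direction.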

For our drive motors we only used feedforward to control the velocity of the robot. We would log our motor velocities/accelerations and then fit them to a line to get our feedforward values. We then ran our autos a bunch of times and hand tuned those values until we were happy with them. We don’t actually have a way to input acceleration through our drive class, but we faked this in autos by calculating a velocity command that would give us an acceleration equivalent to what we want.
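Fitting the logged samples to a line can be done with ordinary least squares, where the slope and intercept correspond roughly to a kV and kS feedforward term; the second method sketches the “faked acceleration” trick of folding the desired acceleration into the velocity command. The sample data and names are illustrative, not the team’s logs.

```java
// Least-squares line fit over logged (velocity, voltage) samples, plus the
// velocity-command trick for commanding acceleration.
public class FeedforwardFit {
    /** @return {intercept (~kS), slope (~kV)} of the best-fit line. */
    public static double[] fitLine(double[] velocity, double[] voltage) {
        int n = velocity.length;
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sx += velocity[i]; sy += voltage[i];
            sxx += velocity[i] * velocity[i];
            sxy += velocity[i] * voltage[i];
        }
        double slope = (n * sxy - sx * sy) / (n * sxx - sx * sx);
        double intercept = (sy - slope * sx) / n;
        return new double[] { intercept, slope };
    }

    /** "Faked" acceleration: bump the velocity command by accel * dt. */
    public static double velocityCommandWithAccel(double velocity, double accel, double dt) {
        return velocity + accel * dt;
    }
}
```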

For most of our other control loops we first figured out the appropriate constants to cancel out gravity (if needed) and then hand tuned our pid loops to act on top of them. Not having to worry about gravity really allowed us to push our pid loops to the maximum. (iirc our elevator ran at its maximum output power until we were within just a few centimeters of our goal position.)
(You may still see some leftover motion profiles in our robot code, but these should all be disabled. Our robot loop jittered too much to be able to properly run the motion profile from the rio & we weren’t able to get the motion profiles to work properly on our sparks.)
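A minimal sketch of “cancel gravity first, then let the PID run hard” for a pivoting mechanism, where the cosine gravity model, gains, and saturation are illustrative assumptions rather than the team’s real constants:

```java
// Gravity feedforward plus aggressive proportional feedback, saturated at
// the motor's voltage limit.
public class GravityCompensatedControl {
    static final double kG = 0.8;  // illustrative: volts to hold horizontal
    static final double kP = 12.0; // illustrative: pushed hard, as described

    public static double output(double angleRad, double setpointRad, double maxVolts) {
        double feedforward = kG * Math.cos(angleRad); // cancels gravity torque
        double feedback = kP * (setpointRad - angleRad);
        // Saturate, mirroring "run at max output until close to the goal".
        return Math.max(-maxVolts, Math.min(maxVolts, feedforward + feedback));
    }
}
```

With gravity already cancelled by the feedforward, the feedback term stays pinned at the voltage limit until the error shrinks, which matches the “maximum output until within a few centimeters” behavior described above.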

