Drive Team Placement of the Robot on the Field with Path Following

Hey all, we’re a first-time PathPlanner/fancy-auto team this year, and there is some gap in knowledge we seem to be missing in how to make it actually work in real life. Our code and odometry work perfectly in the simulator, and on the real robot the odometry tracking seems to match our real-world expectations (i.e. 1 meter measured is 1 meter traveled). However, on the real robot our paths just don’t work.

I’m having a hard time understanding what the mechanism will be for our students to put the robot on the field, power it on, and still have all of this work. I have been thinking about this for over a week now, and I have narrowed the real-life failures down to the likelihood that our gyro heading is off when the code starts up.

Our desired starting configuration is facing the speaker, which on the blue side would put our starting gyro angle of 0 degrees facing the blue alliance wall. However, when we built our paths, to maintain field-oriented poses, 0 degrees is facing to the right, toward the red alliance wall.

For the red alliance side, this isn’t an issue, since we’ll be starting out facing the red speaker.

So, the advice I’m seeking is to see what other teams are doing with respect to setting their robot on the field in the proper orientation and then still ending up with proper gyro headings.

Is the move just to put the robot down on the field against the alliance wall, facing the red side, and power on the robot from there, such that 0 is properly facing that direction?

We’re using a NavX, so we can’t “set the angle” to an arbitrary value; we can reset the yaw or set an angle adjustment, but we’ve had limited success trying that out so far. We’re competing in week 1 with an all-new set of drivers, so I’d really like to have this down to a standard set of procedures so they can set the robot on the field properly and the autos will work the way the software team has written them.

Just set the offset based on which alliance you’re on

Do you normally just keep it perpendicular to the field such that the offset is 180 or 0?

Simple thing is to use a wall, a line or an object to line it up.

Fancy thing is to use your vision system and LEDs or dashboard.

Yeah, face it towards the same field element every time. If red, set to one offset, if blue, the other.

Re-reading your post… you probably shouldn’t be using your gyro’s raw yaw. If your odometry is working, get your heading from the current pose.

You should probably have something like this:


    /*
     * Reset the robot heading based on the alliance color.
     * The robot front should be facing straight away from the operator
     * when this reset is used.
     */
    private void resetFieldHeading() {
        var alliance = DriverStation.getAlliance();
        if (alliance.isPresent()) {
            if (alliance.get() == DriverStation.Alliance.Red) {
                DataLogManager.log("%%%%%%%%%% resetFieldHeading: Red.");
                drivetrain.seedFieldRelative(new Pose2d(15.25, 5.5,
                        Rotation2d.fromDegrees(180)));
                m_joystickAlliance = 1;
                return;
            }
        }
        DataLogManager.log("%%%%%%%%%% resetFieldHeading: Blue.");
        drivetrain.seedFieldRelative(new Pose2d(1.3, 5.5,
                Rotation2d.fromDegrees(0.0)));
        m_joystickAlliance = -1;
    }

Pathplanner is doing something similar when you run an auto…

Also, note the joystick thing to flip the controls.
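
Concretely, when driving field centric the flip just scales the translation inputs by that ±1 factor; a rough Python sketch (the controller, factor, and drive-method names here are placeholders, not from any code above):

    def teleop_drive(self) -> None:
        # With the convention above (blue = -1, red = +1), multiplying the
        # translation inputs by the alliance factor makes "stick forward"
        # drive away from your own alliance wall on either side of the field.
        x_speed = self._joystick_alliance * self._controller.getLeftY()
        y_speed = self._joystick_alliance * self._controller.getLeftX()
        rot = -self._controller.getRightX()  # rotation needs no alliance flip
        self._drivetrain.drive_field_relative(x_speed, y_speed, rot)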

We’re using differential drive, so it’s slightly different, but thanks for the heads-up on that. I will double check.

Regarding your other statement:

Right now our odometry is using the Rotation2d from the gyro, which seems to be sort of standard for most example code I could find.

The current odometry update we are doing looks like this in our drivetrain periodic() method:

    pose = self._odometry.update(
        Rotation2d.fromDegrees(self.__get_gyro_heading()),
        self.__rotations_to_meters(
            self._left_leader.get_position().value_as_double
        ),
        self.__rotations_to_meters(
            self._right_leader.get_position().value_as_double
        ),
    )
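
For context, that update assumes the odometry object was constructed with the current gyro heading and (in this sketch) zeroed encoder distances, roughly like this:

    # In the drivetrain's __init__, using wpimath.kinematics.DifferentialDriveOdometry
    # and wpimath.geometry.Pose2d / Rotation2d:
    self._odometry = DifferentialDriveOdometry(
        Rotation2d.fromDegrees(self.__get_gyro_heading()),
        0.0,  # left wheel distance, meters (assumed zeroed at startup)
        0.0,  # right wheel distance, meters
        Pose2d(),  # starting pose; replaced when an auto or reset seeds it
    )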

Hopefully the Python readability is okay. I just recently changed our __get_gyro_heading() method to be:

    def __get_gyro_heading(self) -> float:
        # The NavX yaw is CW-positive and continuous, so negate it for the
        # CCW-positive convention and wrap the result to [-180, 180].
        angle = math.fmod(-self._gyro.getAngle(), 360)
        if angle < 0:
            return angle if angle >= -180 else angle + 360
        else:
            return angle if angle <= 180 else angle - 360

That returns the heading wrapped to between -180 and 180 degrees. It seems to function well in the simulator, but it hasn’t been tested on the real robot.
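
As a side note, Python’s math.remainder does that wrap in one step, so the same function could be written as the sketch below (equally untested on a real robot):

    def __get_gyro_heading(self) -> float:
        # math.remainder(x, 360) subtracts the nearest multiple of 360, which
        # lands in [-180, 180]; the leading minus converts the NavX's
        # CW-positive angle to the CCW-positive convention.
        return math.remainder(-self._gyro.getAngle(), 360.0)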

Do you know offhand if PathPlanner wants the gyro to be -360 to 360 or -180 to 180? Based on the GUI it seemed like it needed to be -180 to 180.

We use -180 to 180 (-PI, PI).

Yes, you send your gyro heading into the odometry. When you set the pose, it will calculate and store an offset. So if you are using the heading in your code, read it from the odometry’s current pose rather than from the gyro, and that will tell you your field-relative heading.
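
In Python terms that looks something like the sketch below (the method name is a placeholder, not from your code):

    def get_field_heading(self) -> Rotation2d:
        # Field-relative heading: read from the odometry's pose, which
        # already includes whatever offset was stored when the pose was last
        # reset, instead of from the raw gyro yaw.
        return self._odometry.getPose().rotation()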

If I’m being confusing, just ignore it… Maybe I don’t understand, but your comment about not being able to set the gyro on your NavX is what I’m reacting to…

I appreciate the help. I think I just worked out the bugs to the point where we’re getting (-pi, pi) in our gyro range when I plot the values via SmartDashboard.

Getting through all this, I think I realized why I have a difference between the real world and sim as well, so hopefully I can get this squared away and tested on Tuesday.

You might find the new field coordinate system documentation helpful. I recommend using the “Always blue origin” approach.

Yes, thanks. We did design all the paths on blue, and that always-blue-origin approach is what I’m trying to build a set of procedures around for the drive team.

I think the best thing, meaning the most repeatable process, will be to just make sure the robot is facing the right direction, toward the red alliance wall, when the robot code starts up. That way the offset for the gyro is either 180 or 0.
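
Combining that with the advice above to seed the pose, the pre-match reset I have in mind looks something like this sketch (the poses are the placeholder values from the earlier example, not our real starting spots, and the dashboard entry is just for the drive team to sanity-check the heading):

    def seed_field_pose(self) -> None:
        # Reset the odometry to the alliance-appropriate starting pose so the
        # stored heading offset is correct before the auto runs.
        alliance = DriverStation.getAlliance()
        if alliance == DriverStation.Alliance.kRed:
            start_pose = Pose2d(15.25, 5.5, Rotation2d.fromDegrees(180))
        else:
            start_pose = Pose2d(1.3, 5.5, Rotation2d.fromDegrees(0))

        self._odometry.resetPosition(
            Rotation2d.fromDegrees(self.__get_gyro_heading()),
            self.__rotations_to_meters(self._left_leader.get_position().value_as_double),
            self.__rotations_to_meters(self._right_leader.get_position().value_as_double),
            start_pose,
        )
        # Publish the resulting heading so the drive team can confirm the
        # robot is pointed the way the auto expects.
        SmartDashboard.putNumber(
            "Field heading (deg)", self._odometry.getPose().rotation().degrees()
        )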

All the advice on here (and from some on Discord) I think has brought a solid resolution to what we were seeing. I will hopefully successfully test tomorrow.

However, some of the lessons learned as to why our results were different in simulation and in the real world are:

  1. The NavX gyro is CW-positive, but we weren’t accounting for that in our odometry updates.
    In simulationPeriodic we do this:
        # Update the gyro simulation
        degrees = self._drivesim.getHeading().degrees()
        self.navx_yaw.set(degrees)

This was making the simulated NavX CCW-positive instead of CW-positive like the real one, which leads to:

  2. The odometry in the simulation and on the real robot were not congruent. Over time we changed the odometry tracking to be:
    pose = self._odometry.update(
        Rotation2d.fromDegrees(self.__get_gyro_heading()),
        self.__rotations_to_meters(
            self._left_leader.get_position().value_as_double
        ),
        self.__rotations_to_meters(
            self._right_leader.get_position().value_as_double
        ),
    )

    self._field.setRobotPose(pose)

This worked perfectly in the simulator, but not in real life. To make it more compatible with PathPlanner, we changed the __get_gyro_heading() method to look like this:

    def __get_gyro_heading(self) -> float:
        angle = math.fmod(-self._gyro.getAngle(), 360)
        if RobotBase.isSimulation():
            # The simulated NavX is fed CCW-positive headings, so flip the
            # sign back to match the real (CW-positive) gyro behavior.
            angle *= -1

        if angle < 0:
            return angle if angle >= -180 else angle + 360
        else:
            return angle if angle <= 180 else angle - 360

In the simulator this has proven to work well, in that we’re returning (-180, 180) degrees for our odometry updates and, as expected, everything is CCW-positive.
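
An alternative we may try later (just a sketch, not tested yet) is to do the sign flip once in simulationPeriodic instead, so the simulated NavX reports CW-positive values like the real one and __get_gyro_heading() needs no isSimulation() branch:

    def simulationPeriodic(self) -> None:
        # Feed the simulated NavX a CW-positive heading (the drive sim's
        # heading is CCW-positive), so sim and real gyro behave the same.
        degrees = self._drivesim.getHeading().degrees()
        self.navx_yaw.set(-degrees)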

Then to start everything off, autonomousInit looks like this:

    def autonomousInit(self) -> None:
        # If we're starting on the blue side, offset the Navx angle by 180
        # so 0 degrees points to the right for NWU
        self._drivetrain.set_alliance_offset()

        self._auto_command = self.getAutonomousCommand()

        if self._auto_command is not None:
            self._auto_command.schedule()

Where the offset method is:

    def set_alliance_offset(self) -> None:
        if DriverStation.getAlliance() == DriverStation.Alliance.kBlue:
            self._gyro.setAngleAdjustment(180)
        else:
            self._gyro.setAngleAdjustment(0)
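
One tweak I may still add (a sketch, untested) is a guard for the case where the driver station hasn’t reported an alliance yet, since getAlliance() can return None before the DS connects:

    def set_alliance_offset(self) -> None:
        alliance = DriverStation.getAlliance()
        if alliance is None:
            # No alliance reported yet (e.g. DS not connected); leave the
            # current adjustment alone rather than guessing.
            return

        if alliance == DriverStation.Alliance.kBlue:
            self._gyro.setAngleAdjustment(180)
        else:
            self._gyro.setAngleAdjustment(0)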

I will be able to test this tomorrow when we’re back in the shop but I’m cautiously optimistic that this will solve all the problems we’ve been seeing.

I mostly put all this in here for posterity, but I do appreciate everyone’s advice, thoughts, and help to get to this point. There’s definitely a lot to digest to get to this next level of control for autos.


I confuse myself constantly with the many levels of indirection…
Gyro zero
Pose zero
X/Y axes in my mind v. the field
Robot centric v. Field Centric v. Driver centric
Red driverstation v. Blue driverstation

Sometimes I start the sim and literally rotate my laptop to try and figure it out.

And of course it is always quick and dirty to just start flipping signs and dig yourself into a nested hole of frames of reference that Escher would be proud of…

Fortunately, our lead programmer is better at this than I am, and when he drives he is just as skilled holding the joystick upside down…
