Integrating Sensor Feedback with Talon SRX Motion Profiling

I have a question about integrating sensor feedback with motion profiling on the Talon SRX.

The Talon SRX motion profiling API sends profile points to the speed controllers in two steps: first, the points are loaded into a “top-level” buffer in the API; second, the points are pushed from the top-level buffer to a “bottom-level” buffer on the Talon itself. The latter step is suggested to be done on a separate thread running at a rate at least twice the motion profile’s sampling rate. It is, obviously, important to ensure that the bottom buffer never runs out of points to execute.
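For reference, a minimal sketch of such a feeder thread, assuming the Phoenix-era TalonSRX class and WPILib’s Notifier (the 5 ms period assumes 10 ms profile points; adjust to your profile’s sampling rate):

```java
import com.ctre.phoenix.motorcontrol.can.TalonSRX;
import edu.wpi.first.wpilibj.Notifier;

public class ProfileFeeder {
    private final Notifier feeder;

    public ProfileFeeder(TalonSRX talon) {
        // processMotionProfileBuffer() moves points from the API's top-level
        // buffer into the Talon's on-board (bottom-level) buffer.
        feeder = new Notifier(talon::processMotionProfileBuffer);
    }

    public void start() {
        // 5 ms is half of an assumed 10 ms point duration, per the "at least 2x" guidance.
        feeder.startPeriodic(0.005);
    }

    public void stop() {
        feeder.stop();
    }
}
```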

Now, imagine that I wish to modify my motion profile points in real time, as I send them to the Talon, in response to sensor data such as a gyro reading. To do this, I must modify the points before they are pushed down to the bottom-level buffer. However, I also clearly must keep at least a few “future” points loaded in that buffer at all times, to ensure it never runs out of points.

Say I decide to keep n points in the buffer at all times, to be safe. Well, if the period between my profile points is T milliseconds, I now have a built-in phase lag of nT milliseconds; that is, any output from a feedback loop responding to data from an additional sensor (such as a gyro) about the robot’s state can now only modify the robot’s trajectory nT milliseconds in the future. I don’t see any obvious way around this.

Is this a problem? To continue the gyro example, I think it could be mitigated somewhat by closing the angle feedback loop not on the robot’s current angle, but on a kinematic projection of the robot’s angle nT milliseconds in the future (this would not be hard to compute). I am unsure how well that would perform in practice, though.
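To be concrete about the projection I have in mind, here is a sketch assuming a WPILib Gyro-style getAngle()/getRate() interface; the other names are just placeholders:

```java
// First-order projection of the heading nT milliseconds ahead, assuming the
// gyro reports a heading (degrees) and a yaw rate (degrees per second).
// numBufferedPoints, pointPeriodMs, and targetHeading are placeholder names.
double lagSeconds = numBufferedPoints * pointPeriodMs / 1000.0;          // n * T
double projectedHeading = gyro.getAngle() + gyro.getRate() * lagSeconds; // kinematic projection
double headingError = targetHeading - projectedHeading;                  // close the loop on this
```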

When I talked to Omar in St. Louis, he said that next season the Talon SRX would be able to use their Pigeon IMU directly and perform heading correction on motion profiles.

This is nice, but we use a NavX, and heading correction is not the only form of sensor feedback we’d like to integrate (being able to re-compute profiles “on the fly” in response to vision data, for example, would be a huge asset).

Honestly, at this point you may just want to try writing your own drive control loop thread. Set up a thread at 200 Hz that repeatedly sets velocity setpoints on the Talon. Your thread is then responsible for computing the velocity values from your profile and applying any other sensor corrections to them. Basically, do your own version of the Talon motion profile.
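Something along these lines (a rough sketch assuming the Phoenix TalonSRX API and WPILib’s Notifier; the profile-sampling and correction methods are placeholders for your own code):

```java
import com.ctre.phoenix.motorcontrol.ControlMode;
import com.ctre.phoenix.motorcontrol.can.TalonSRX;
import edu.wpi.first.wpilibj.Notifier;

public class DriveLoop {
    private static final double PERIOD = 0.005; // 200 Hz

    private final TalonSRX left, right;
    private final Notifier loop = new Notifier(this::update);
    private double time = 0.0;

    public DriveLoop(TalonSRX left, TalonSRX right) {
        this.left = left;
        this.right = right;
    }

    public void start() {
        time = 0.0;
        loop.startPeriodic(PERIOD);
    }

    private void update() {
        time += PERIOD;
        // Sample the profile's desired wheel velocities at the current time and
        // fold in any sensor-based correction (gyro, vision, ...) right here.
        double correction = headingCorrection();
        double leftVel = profileLeftVelocity(time) + correction;
        double rightVel = profileRightVelocity(time) - correction;
        // Velocity setpoints are in the Talon's native units (encoder ticks per 100 ms).
        left.set(ControlMode.Velocity, leftVel);
        right.set(ControlMode.Velocity, rightVel);
    }

    // Placeholders: swap in your own profile sampling and correction logic.
    private double profileLeftVelocity(double t) { return 0.0; }
    private double profileRightVelocity(double t) { return 0.0; }
    private double headingCorrection() { return 0.0; }
}
```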

We thought it would be difficult because of the threading, but we ended up liking the flexibility a lot, and it was pretty easy. We reused the same drive thread for autonomous and teleop driving this year and it was great.

We could do this, but we’ve had very good results with the control algorithm of the Talon’s motion profile mode, and we’re not sure how a simple velocity servo would compare. Mathematically, a P loop in motion profile mode ought to be equivalent to an I loop on an ordinary velocity servo, but in practice it very likely behaves differently due to round-off error, filtering of the velocity signal, and so on. We’d rather keep expanding on what we’ve been doing than entirely rework our approach.
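To spell out the equivalence I’m appealing to (my own back-of-the-envelope, not anything from CTRE): in motion profile mode the position error is the time integral of the velocity error,

$$e_{\mathrm{pos}}(t) = \int_0^t \big(v_{\mathrm{ref}}(\tau) - v_{\mathrm{meas}}(\tau)\big)\, d\tau,$$

so a proportional term $K_p\,e_{\mathrm{pos}}$ is, on paper, the same control action as an integral term on the velocity error with $K_i = K_p$. The practical differences come from how the measured velocity is derived and filtered from the encoder counts.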

Nice! This is incredible news to hear. We really liked using the Talon’s on-board control this year, and our main ‘issue’ was the lack of a gyroscope when running motion profiles. Do you know any details on when this feature will be available?

While we would like to integrate gyro feedback for additional robustness, we found that it was fairly easy to achieve accurate headings by making sure to use empirically derived values for the wheel diameter and wheelbase when generating our profiles.

If achieving an accurate heading was a problem for you, I suggest trying this before implementing a gyro-based heading correction (though obviously you may still want to use a gyro!).
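For concreteness, here is a rough sketch of one way to do that calibration; the straight-drive and spin-in-place procedure and every variable name below are just an illustration, not any particular library’s API:

```java
// Empirical drivetrain calibration sketch for a differential drive.
// Assumes encoder readings already converted to inches using the nominal
// wheel diameter; all variables here are placeholders.

// 1) Drive a measured straight line to correct the wheel diameter.
double measuredDistance = 120.0;                               // inches, from a tape measure
double reportedDistance = (leftDistance + rightDistance) / 2.0;
double wheelDiameterScale = measuredDistance / reportedDistance;

// 2) Spin in place a known amount to back out an effective wheelbase,
//    using dTheta = (dRight - dLeft) / wheelbase for a differential drive.
double headingChangeRad = Math.toRadians(measuredTurnDegrees); // e.g. 10 full turns = 3600 deg
double effectiveWheelbase = (rightSpinDistance - leftSpinDistance) / headingChangeRad;
```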

This year we did integrate a gyroscope correction into our motion profile tracking, but everything was done in percentVBus mode, so it was a bit slower to account for error. We found this necessary due to inconsistencies between the two sides of our robot that would make it veer to one side on an uncorrected profile. We attempted to correct this by slowing down our motion profiles, but we were still getting oscillatory movement of up to a few inches in both directions, even on profiles that were just set to drive straight.
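(For reference, such a correction generally boils down to something like the sketch below. This assumes the Phoenix PercentOutput control mode, the successor to percentVBus, and uses illustrative names and gains rather than anyone’s exact code.)

```java
// General shape of a proportional heading correction layered on top of
// open-loop (percentVBus-style) outputs. kHeading, desiredHeading, and the
// base outputs are placeholders; the sign convention depends on the drivetrain.
double headingError = desiredHeading - gyro.getAngle();   // degrees
double turnCorrection = kHeading * headingError;
leftTalon.set(ControlMode.PercentOutput, leftBaseOutput + turnCorrection);
rightTalon.set(ControlMode.PercentOutput, rightBaseOutput - turnCorrection);
```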

Thanks for the link! We will definitely look into ways to improve our tracking without a gyroscope this summer. The method of finding an ‘artificial’ wheel base is quite interesting; we’ll see how it goes. 🙂

The NavX and the Pigeon, I believe, utilize the same IMU.

An “immutable portion” of a trajectory is an unavoidable fact of life when you are generating a trajectory asynchronously from a real-time trajectory follower and you want to avoid underrun. The common best practice for dealing with this in industry is basically:

  1. At your trajectory planning iteration start time, take a snapshot of the current robot pose and the remaining sequence of trajectory controls that have been buffered.

  2. Project where you believe the robot will be if it follows the buffered controls. This allows you to incorporate other sources of information about the robot pose (e.g. gyro or vision) into your forecast, since your pose estimate can fuse any arbitrary collection of inputs.

  3. Re-plan a trajectory from the projected robot state, and buffer points until at least N total points are now buffered.

  4. Repeat 1-3 at a sensible loop rate (even just 10 Hz is going to do pretty well with this form of control). A rough sketch of one iteration is below.
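All of the helper types in this sketch (PoseEstimator, Planner, PointBuffer, Pose, TrajectoryPoint) are hypothetical placeholders, not any particular library’s classes:

```java
// One iteration of the snapshot -> project -> re-plan -> top-up loop.
void replanIteration() {
    // 1. Snapshot the fused pose estimate and the controls still buffered.
    Pose current = poseEstimator.getLatestPose();            // encoders + gyro + vision
    List<TrajectoryPoint> buffered = buffer.snapshotRemaining();

    // 2. Forward-simulate the buffered controls to forecast the robot's state
    //    at the time the first newly planned point will start executing.
    Pose projected = kinematics.rollout(current, buffered);

    // 3. Re-plan from the projected state and top the buffer back up to N points.
    List<TrajectoryPoint> fresh = planner.planFrom(projected, goal);
    buffer.append(fresh.subList(0, Math.max(0, N - buffered.size())));
}
// 4. Call replanIteration() from a periodic task, e.g. every 100 ms (10 Hz).
```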

The Pigeon is available to the Talons directly via the data port or the CAN bus. For the NavX to work, the user code would have to send the heading to the Talons.

I don’t mean to ‘necro’ this thread, but does anyone know if there are any updates on this feature? Additionally, if implemented, how would each Talon be able to read values from both an encoder (through a breakout) and the Pigeon at the same time?

The Pigeon could be connected to any Talon. It doesn’t have to be a drive Talon. The Talon it is connected to will share the data across the CAN bus with the other Talons.

Another option would be to connect the Pigeon directly to the CAN bus.