Motion Profiling Help

Hi all!

Our team has been experimenting with motion profiling. We are using four Falcon 500 motors (interfaced through LabVIEW as TalonFX), with two motors and three wheels on each side. Our main problem has been getting the dimensions in PathPlanner to match the dimensions in real life. We first tried a path with a small S-turn, and now a semicircle. Currently the robot completes around 75% of the semicircle path after we double the position value output by PathPlanner. The Phoenix Tuner plots show the robot "following" the velocity graph correctly; however, it completes that 75% in around 3.5 seconds, while PathPlanner says the entire path should take over 10 seconds.

Source and PathPlanner files: Projects · Ironclad / mp test · GitLab

Any pointers as to what could be wrong? We’ve not yet tampered with any PID values of the motors. Max Velocity and Max Acceleration in PathPlanner are also estimates.

I’m not very familiar with the CTRE path following / motion profiling or with PathPlanner, but here are a few observations that might help. These are based on the TalonSRX, and I’m assuming there are similarities with what you are using; since you have Falcon motors, though, there may be differences. (I quickly looked through the code, so I could easily have missed something.)

  • Is there an example from CTRE to look at?

  • Calculate the feedback parameters from the wheel, gearbox (if needed), and encoder physical parameters (wheel radius, encoder counts/revolution). You might want a separate subVI just to do this calculation.
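Since your code is LabVIEW, here is the same calculation as a Python sketch so the arithmetic is easy to check against your subVI. The 2048 counts/rev is the Falcon 500's integrated sensor; the gear ratio and wheel radius below are made-up placeholders, so substitute your drivetrain's actual values.

```python
import math

# Assumed physical parameters -- replace with your drivetrain's values.
COUNTS_PER_MOTOR_REV = 2048.0   # Falcon 500 integrated sensor
GEAR_RATIO = 10.71              # motor revs per wheel rev (placeholder)
WHEEL_RADIUS_M = 0.0762         # 3 in wheel radius in meters (placeholder)

WHEEL_CIRCUMFERENCE_M = 2.0 * math.pi * WHEEL_RADIUS_M
COUNTS_PER_METER = COUNTS_PER_MOTOR_REV * GEAR_RATIO / WHEEL_CIRCUMFERENCE_M

def meters_to_counts(meters):
    """Distance in meters -> encoder counts (what you would command)."""
    return meters * COUNTS_PER_METER

def counts_to_meters(counts):
    """Encoder counts -> distance in meters (for diagnostics)."""
    return counts / COUNTS_PER_METER
```

A quick sanity check on this conversion factor is often where a "robot travels 2x (or 0.5x) the planned path" problem shows up, since a wrong gear ratio or wheel radius scales every position setpoint.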

  • It might be easier to configure individual parameters rather than use config “all”. Here are some crude screenshots. Note that the PID settings are separate. Also, this doesn’t use path following, just closed-loop speed control.

  • Figure out the PID and feedforward parameters from testing. The feedforward is the most important value. (I might have a sample project that helps with this testing. Let me know if you would like to see it.)
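The feedforward test above boils down to one division. Here is a Python sketch of the Phoenix 5 convention (closed-loop full-scale output is 1023, velocity is in counts per 100 ms); the 75% / 15000 measurement in the example is made up for illustration.

```python
# Estimate the velocity feedforward gain (kF): drive at a known percent
# output, read the measured velocity in native units (encoder counts per
# 100 ms), and solve for the gain that reproduces that speed.

FULL_OUTPUT = 1023.0  # Talon closed-loop full-scale output (Phoenix 5)

def estimate_kf(percent_output, measured_native_velocity):
    """kF such that feedforward alone reproduces the measured speed."""
    return FULL_OUTPUT * percent_output / measured_native_velocity

# Hypothetical example: at 75% output the sensor reported ~15000 counts/100ms.
kf = estimate_kf(0.75, 15000.0)
```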

  • While running, take the sensor readings and convert back to useful units for diagnostics.
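The conversion back to useful units looks like this as a Python sketch. The Talon reports velocity in counts per 100 ms; the gear ratio and wheel size here are assumptions for illustration.

```python
import math

COUNTS_PER_MOTOR_REV = 2048.0   # Falcon 500 integrated sensor
GEAR_RATIO = 10.71              # assumed
WHEEL_DIAMETER_FT = 0.5         # 6 in wheel (assumed)
WHEEL_CIRCUMFERENCE_FT = math.pi * WHEEL_DIAMETER_FT

def native_velocity_to_ft_per_sec(counts_per_100ms):
    """Talon native velocity (counts / 100 ms) -> wheel ft/sec."""
    counts_per_sec = counts_per_100ms * 10.0
    wheel_revs_per_sec = counts_per_sec / (COUNTS_PER_MOTOR_REV * GEAR_RATIO)
    return wheel_revs_per_sec * WHEEL_CIRCUMFERENCE_FT
```

Trending this value next to the planned velocity is one way to spot the ~3x timing discrepancy you're seeing.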

  • This isn’t for path following, but a way to set the desired speed for normal driving (This is based on being sent a desired value in FT/SEC.)
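Going the other direction, the command path converts a desired ft/sec into the Talon's native velocity setpoint (counts per 100 ms). Again a Python sketch with placeholder gear ratio and wheel size:

```python
import math

COUNTS_PER_MOTOR_REV = 2048.0   # Falcon 500 integrated sensor
GEAR_RATIO = 10.71              # assumed
WHEEL_DIAMETER_FT = 0.5         # 6 in wheel (assumed)
WHEEL_CIRCUMFERENCE_FT = math.pi * WHEEL_DIAMETER_FT

def ft_per_sec_to_native_velocity(ft_per_sec):
    """Desired wheel ft/sec -> Talon native velocity setpoint (counts / 100 ms)."""
    wheel_revs_per_sec = ft_per_sec / WHEEL_CIRCUMFERENCE_FT
    counts_per_sec = wheel_revs_per_sec * COUNTS_PER_MOTOR_REV * GEAR_RATIO
    return counts_per_sec / 10.0
```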

  • You might consider adding a gyro so you can track the robot’s rotation against what you think it should be. You could also add “odometry”, which uses the distance from the encoders plus the gyro heading to calculate a robot position. You can trend this and match it against your path to find how far off you are. The library here has some odometry routines (with a sample): http://github.com/jsimpso81
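The odometry idea is a few lines of math: each loop, advance the pose by the average wheel distance along the gyro heading. A minimal Python sketch (class and field names are my own, and this assumes the heading changes little between updates):

```python
import math

class SimpleOdometry:
    """Integrate encoder distance deltas and gyro heading into an (x, y)
    pose that can be trended against the planned path."""

    def __init__(self):
        self.x = 0.0  # meters
        self.y = 0.0  # meters

    def update(self, delta_left_m, delta_right_m, heading_rad):
        """Call every loop with the distance each side moved since the
        last call and the current gyro heading in radians."""
        d = (delta_left_m + delta_right_m) / 2.0
        self.x += d * math.cos(heading_rad)
        self.y += d * math.sin(heading_rad)
        return self.x, self.y
```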

  • If the paths are theoretical, then a lot of real-world things will affect turning, like friction and slippage. Is there a way to manually drive what you want and record it to a path?
