Calculating Odometry of a Swerve Drive

Does anyone know of an efficient, simple way to calculate the change in position of a swerve drive without wheel slip being a major problem? I have seen Ether’s forward kinematics calculator. I have recently started looking back over much of the LabVIEW-based swerve drive code that 900 has used in the past (for a variety of reasons that will be revealed eventually :slight_smile: ). We have code that calculates the change in position of a swerve drive relative to the field using encoders and some vector math. It basically works like this: if all of the wheels “agree” (within a specified margin), the x and y position deltas of all the wheels are averaged, and that average vector is rotated by the current angle of the robot (from a gyro/magnetometer). This is the same methodology as Ether’s forward kinematics calculator, but without calculating a margin of error. The result describes the motion of the robot over the last instant (typically 100 ms), and it works because the rotation components of the individual wheel movements average out. Wheel slip and uneven carpet will often cause the wheels to not “agree”, so a framework exists for handling that case.
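In rough C++, the core of that averaging step looks something like the sketch below (the Vec2 type and names are made up for illustration; each wheel delta is assumed to already be resolved into robot-frame x/y components from its drive encoder and steering angle):

#include <cmath>

struct Vec2 { double x, y; };

// Average the per-wheel robot-frame deltas, then rotate the result by
// the gyro heading to get a field-relative displacement for this step.
// The rotation components of the four wheels cancel in the average,
// leaving pure translation.
Vec2 fieldRelativeDelta(const Vec2 wheelDeltas[4], double headingRad)
{
	Vec2 avg{0.0, 0.0};
	for (int i = 0; i < 4; ++i) {
		avg.x += wheelDeltas[i].x / 4.0;
		avg.y += wheelDeltas[i].y / 4.0;
	}
	return Vec2{avg.x * std::cos(headingRad) - avg.y * std::sin(headingRad),
	            avg.x * std::sin(headingRad) + avg.y * std::cos(headingRad)};
}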

For the wheels to agree, the delta x of the two front wheels should be the same, the delta y of the two left wheels should be the same, and so on (all relative to the robot); a sketch of this check follows the footnotes below. Based on which wheels “contradict” each other, some math and logic is used to approximate the movement of the robot and figure out which wheel is slipping. If too many wheels disagree*, no calculation can be made and the robot is assumed to be static. I am not going into detail about how this works in this post, but I am willing to post code and/or explain it in detail if anyone wants. Are there simpler or better solutions to this problem (other than mechanical ones/adding other sensors**)?

*This is a simplification
**We have already exceeded our sensor allowance for the next 2 millennia
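That agreement check, in the same illustrative style as the sketch above (indices and the tolerance are hypothetical; Vec2 is the type from the earlier sketch):

#include <cmath>

enum Wheel { FL, FR, BL, BR };

// For a rigid chassis, the two front wheels must report the same x
// (strafe) delta, the two back wheels likewise, and the left/right
// pairs must agree on the y (forward) delta.
bool wheelsAgree(const Vec2 d[4], double tol)
{
	return std::fabs(d[FL].x - d[FR].x) < tol
	    && std::fabs(d[BL].x - d[BR].x) < tol
	    && std::fabs(d[FL].y - d[BL].y) < tol
	    && std::fabs(d[FR].y - d[BR].y) < tol;
}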

It hasn’t been done (that I know of, yet), but you may want to look into getting an optical flow setup working.

It’ll do what you want, but getting it up and running will be a bit of a task.


Look into formulating your odometry (forward kinematics) as a linear system of equations. The inputs are the x and y velocities of each wheel in your local robot reference frame (which you can obtain by measuring each steering angle and using some trig); your outputs are the overall robot velocity/displacement in x, y, and yaw. With 8 inputs and 3 outputs, this is an overdetermined system of equations, which means you will often find there is no perfect solution. But you can find the least-squares solution (which minimizes the sum of squared errors between your measured inputs and the inputs that would have been required to produce a perfectly consistent solution) using standard mathematical methods.

In the absence of extra knowledge about which inputs are most likely wrong (slipping), this method does a great job most of the time.

(Ether has posted all of the necessary equations in the past.)
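A sketch of the setup, using Eigen purely as an example library: each wheel at robot-frame position (rx, ry) contributes two equations, vx = Vx - w*ry and vy = Vy + w*rx (x right, y forward, w counterclockwise), and a QR solve returns the least-squares answer.

#include <Eigen/Dense>

// Stack two rows per wheel into an 8x3 system A * [Vx, Vy, w]^T = b,
// then take the solution that minimizes |A*x - b|^2.
Eigen::Vector3d solveChassisTwist(const double rx[4], const double ry[4],
                                  const double vx[4], const double vy[4])
{
	Eigen::Matrix<double, 8, 3> A;
	Eigen::Matrix<double, 8, 1> b;
	for (int i = 0; i < 4; ++i) {
		A.row(2 * i)     << 1, 0, -ry[i];
		A.row(2 * i + 1) << 0, 1,  rx[i];
		b(2 * i)     = vx[i];
		b(2 * i + 1) = vy[i];
	}
	return A.colPivHouseholderQr().solve(b);
}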


Optical flow, hmmmm… just sounds like weak-sauce SLAM to me :rolleyes:. Seriously though, we already have SLAM set up and running to some extent… (more details to come) :cool:

Thank you for the advice. I will look into using the least-squares solution; it sounds like what I am looking for. It is worth noting that some information about which wheel(s) are slipping can be determined by looking at discrepancies between different wheels. (If anyone wants me to go into the logic/math behind this, just ask.) From my understanding, this can be used very effectively when just one wheel is slipping, even if that wheel slips a lot; for example, if one wheel comes off the ground due to uneven carpet. (I am planning on doing some testing on this in the future.) However, this breaks down in the much more common situation where two or more wheels are each slipping somewhat, for whatever reason.

We were hoping to use the time-latency compensation that 254 did with our swerve for gear vision this year. We never had time to fully get it working, but we will continue to pursue it in the offseason if time allows. We came up with the following forward kinematics for a swerve drive. The code was unit tested and produced the correct result, but it hasn’t actually been run on a robot, so it might not be immune to the problems you mentioned. Since the system is overdetermined, as Jared mentioned, whenever we had the opportunity to choose one value over another we chose to average both values together. This obviously won’t fully account for wheel slip, but given that we can’t be sure which wheel is slipping, it seemed like a simple mechanism to reduce the effect.

// Forward kinematics following Ether's convention: A is the strafe
// component shared by the back wheels, B by the front wheels; C is the
// forward component shared by the right wheels, D by the left wheels.
RigidTransform2D::Delta Kinematics::forwardKinematics(double frDriveDelta, double flDriveDelta, double brDriveDelta, double blDriveDelta,
	double frRotationDelta, double flRotationDelta, double brRotationDelta, double blRotationDelta, double gyroDelta)
{
	double L = ROBOT_LENGTH;
	double W = ROBOT_WIDTH;

	// Resolve each wheel's measured distance into robot-frame strafe
	// (sin) and forward (cos) components using its steering angle.
	double FR_B = sin(frRotationDelta) * frDriveDelta;
	double FR_C = cos(frRotationDelta) * frDriveDelta;

	double FL_B = sin(flRotationDelta) * flDriveDelta;
	double FL_D = cos(flRotationDelta) * flDriveDelta;

	double BR_A = sin(brRotationDelta) * brDriveDelta;
	double BR_C = cos(brRotationDelta) * brDriveDelta;

	double BL_A = sin(blRotationDelta) * blDriveDelta;
	double BL_D = cos(blRotationDelta) * blDriveDelta;

	// The system is overdetermined: two wheels measure each of A, B,
	// C, and D, so average the pairs.
	double A = (BR_A + BL_A) / 2.0;
	double B = (FR_B + FL_B) / 2.0;
	double C = (FR_C + BR_C) / 2.0;
	double D = (FL_D + BL_D) / 2.0;

	// Two independent estimates of the rotation delta, averaged. Note
	// the sign of omega2: the recovery below assumes C = FWD - omega *
	// (W / 2.0) and D = FWD + omega * (W / 2.0), so omega = (D - C) / W.
	double omega1, omega2, omega;
	omega1 = (B - A) / L;
	omega2 = (D - C) / W;
	omega = (omega1 + omega2) / 2.0;

	// Remove the rotation contribution from A/B/C/D to recover the
	// translation, then average the pairs again.
	double STR, FWD, STR1, STR2, FWD1, FWD2;
	STR1 = omega * (L / 2.0) + A;
	STR2 = -omega * (L / 2.0) + B;
	FWD1 = omega * (W / 2.0) + C;
	FWD2 = -omega * (W / 2.0) + D;

	STR = (STR1 + STR2) / 2.0;
	FWD = (FWD1 + FWD2) / 2.0;

	return RigidTransform2D::Delta(FWD, STR, gyroDelta);
}
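For what it’s worth, integrating the returned delta into a field-relative pose could look something like this hypothetical helper (axis and heading conventions here are assumptions; str is treated as the robot-frame leftward component so the standard rotation matrix applies):

#include <cmath>

// Rotate the robot-frame (fwd, str) delta into the field frame using
// the heading at the time of measurement, then accumulate; the heading
// itself comes straight from the gyro delta.
void integratePose(double &x, double &y, double &heading,
                   double fwdDelta, double strDelta, double gyroDelta)
{
	x += fwdDelta * std::cos(heading) - strDelta * std::sin(heading);
	y += fwdDelta * std::sin(heading) + strDelta * std::cos(heading);
	heading += gyroDelta;
}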

This is a topic we are very much interested in too. If we make any additional progress we’ll share the results here.


SLAM is an algorithmic approach to localization.

Optical Flow is a sensor hierarchy and feedback method.

SLAM is cool and has a much larger body of work surrounding it. Optical Flow setups have so far been utilized only in the multirotor industry and a few other (niche) areas. Basically, using an Optical Flow setup would make SLAM easier.

I remember seeing a thread a while back that demonstrated the use of a standard optical mouse on a floor to track distance. Obviously, for a normal tank drive, one mouse would not be enough to calculate distance and position, given the way tanks turn by skidding. But a swerve drive translates freely in every direction, so a mouse at the center of rotation should be able to track distance driven and/or location (which would be a bit trickier, I imagine). I am not sure how accurate this would actually be, but I suspect it would be a relatively easy way to track how far the swerve has moved.

This might not be what you’re looking for, but it seems like a quick and dirty way to get it. I haven’t ever used a mouse for tracking a surface, but if anyone has any input on its use, that would be interesting to hear!

I’m using an optical mouse for position encoding in an unrelated robotics project with an omni drive system. You have to be really careful about the rotation of your robot, and even more careful about the height it sits off the floor and what floor you’re driving on. Too far off the surface and it won’t track. If you run it on a dirty floor, you’ll pick up hair, dirt, and all sorts of debris that will clog up the sensor. I wouldn’t recommend it for FRC.

If you can make it work, it actually provides pretty decent results. I used a 1000 DPI mouse with a USB polling overclock from 125 Hz to 250 Hz and noticed very little drift, and pretty high accuracy over a few meters, but your mileage may vary. Don’t forget to disable cursor acceleration (you don’t need to do this if you read from /dev/input/mice) and be wary of high speeds, since they tend to throw off the sensor a little in my experience.
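The raw read path on Linux looks roughly like the sketch below. /dev/input/mice yields 3-byte PS/2-style packets (button flags, dx, dy) with no pointer acceleration applied; the 1000 DPI scale is just the example figure from above.

#include <cstdio>
#include <fcntl.h>
#include <unistd.h>

int main()
{
	int fd = open("/dev/input/mice", O_RDONLY);
	if (fd < 0) return 1;

	double x = 0.0, y = 0.0;            // accumulated position, inches
	const double countsPerInch = 1000;  // sensor resolution (DPI)

	signed char packet[3];
	while (read(fd, packet, sizeof packet) == sizeof packet) {
		x += packet[1] / countsPerInch; // dx counts this packet
		y += packet[2] / countsPerInch; // dy counts this packet
		std::printf("x=%.3f in  y=%.3f in\n", x, y);
	}
	close(fd);
	return 0;
}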

Optical mouse sensors don’t play nice with carpet. Still, there are some interesting possibilities in this space for tracking odometry… I have yet to see it work for FRC.

Look up sparse optical flow.

We experimented with optical flow on a swerve drive after the 2015 season. We didn’t consider it feasible to use a sensor like the ones in an optical mouse, which has to touch the ground at all times, so we played around with the PX4Flow from Pixhawk. It’s effectively a camera sensor for a drone, but we swapped out the stock lens for one with a shorter focal length in order to get a wider picture and thus stay mounted relatively close to the ground. From our tests, we found that it actually tracked the motion of the robot fairly well in a few rare cases. However, it was impractical at anything beyond a crawling speed. The short focal length caused fisheye distortion, which confused the feature-recognition algorithm onboard the sensor: an object traveling across the frame at a constant rate would appear to travel more slowly at the edges than at the center.

Other factors that impeded its development for us included how differently the sensor behaved on different kinds of carpet. Depending on the carpet, there may or may not be enough features for the camera to track, and the only way to improve that is to increase the image-processing resolution, which decreased our frame rate due to the constraints of the onboard camera sensor.

Pretty much every option involved some trade-off that made it more trouble than it was worth.

I think the concept is possible, but only with custom hardware and software changes that solve some of the bottlenecks. I don’t think there’s anything plug-and-play out there that can accomplish this, though.

Good stuff. Thanks for sharing this.