Gyro math filters

I’ve been working on a couple of different filters for our gyro (without the robot, so I can’t verify anything I’m writing), and I’ve run into a problem. I’m trying to combine the gyro and the accelerometer in a complementary filter, but it just occurred to me that I’m measuring the wrong axis with the accelerometer: I’m measuring pitch/roll when I need to be measuring yaw. All of the filters I’ve been working on rely on the following flawed code.

#include <WPILib.h>
#include <cmath>

class DefaultRobot : public SimpleRobot {
	ADXL345_I2C *accl;
	Gyro *gyro;

	// Complementary-filter state, shared between OperatorControl() and
	// Complementary(), so it must be member data rather than locals.
	float tau;      // filter time constant, in seconds
	float a;        // blend coefficient, recomputed each loop
	float x_angleC; // filtered angle estimate

	public:

	DefaultRobot(void){
		accl = new ADXL345_I2C(1, ADXL345_I2C::kRange_2G);
		gyro = new Gyro(1);
		tau      = 0.075;
		a        = 0.0;
		x_angleC = 0.0;
	}

	void Autonomous(void){
		while(IsAutonomous()){
		}
	}

	void OperatorControl(void){
		DriverStationLCD *driverStationLCD = DriverStationLCD::GetInstance();
		gyro->SetSensitivity(0.007); // default sensitivity; adjust to account for drift (to a degree)
		gyro->Reset();

		float y, z, accAngle, gyrRate, angle;
		while (IsOperatorControl()){
			y = accl->GetAcceleration(ADXL345_I2C::kAxis_Y);
			z = accl->GetAcceleration(ADXL345_I2C::kAxis_Z);
			// atan2() returns radians; convert to degrees so the units
			// match the gyro rate (degrees per second).
			accAngle = atan2(y, z) * 180.0 / M_PI;
			gyrRate = gyro->GetRate();

			angle = Complementary(accAngle, gyrRate, 20);

			driverStationLCD->PrintfLine((DriverStationLCD::Line) 0, "Angle: %f degrees", angle);
			driverStationLCD->UpdateLCD();
			Wait(0.02); // match the 20 ms loop time the filter assumes
		}
	}

	float Complementary(float newAngle, float newRate, int looptime) {
		float dtC = float(looptime) / 1000.0;
		a = tau / (tau + dtC);
		x_angleC = a * (x_angleC + newRate * dtC) + (1 - a) * newAngle;
		return x_angleC;
	}
};

START_ROBOT_CLASS(DefaultRobot);

To determine any angle from the accelerometer, I need to use the Z axis (measuring the acceleration due to gravity and deriving the angle from it), but I don’t think the Z axis can help with yaw: the accelerometer always stays parallel to the ground, so its Z reading never changes as the robot turns. Since I realize I’ve been doing this all wrong, is it even possible to use the accelerometer when determining yaw rotation?

Do you want to do a full 6 degree of freedom system or do you want to assume that you stay on flat ground?

It isn’t possible to use an accelerometer for measuring yaw rotation. A magnetometer should be used instead since it’s essentially a compass. I’m not sure where you could get one that’s easy to set up or use. My team got one from SparkFun, but it requires a 3.3V source; we haven’t spent enough time figuring out how to power it yet.
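To illustrate the "essentially a compass" point, here is a minimal, hypothetical sketch (not tied to any particular magnetometer breakout): with the robot flat, heading comes from the two horizontal field components. Real compasses also need hard- and soft-iron calibration, and tilt compensation if the robot is not level.

```cpp
#include <cmath>

// Hypothetical helper: yaw/heading from horizontal magnetometer readings,
// assuming the sensor is level. Returns 0 along +X, pi/2 along +Y.
double compassHeadingRad(double mx, double my) {
    return std::atan2(my, mx);
}
```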

I’m assuming that the robot is always on a flat surface.

That’s pretty much the conclusion I arrived at. There must be some correlation between rotation and accelerometer output, though; centripetal acceleration, perhaps?
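There is such a correlation, but it only recovers the spin rate, never the heading. A sensor mounted off the rotation axis feels centripetal acceleration a_c = omega^2 * r, so you can back out |omega| but not its sign or any absolute angle. A pure-math sketch (names are illustrative):

```cpp
#include <cmath>

// Recover the spin-rate magnitude |omega| (rad/s) from the centripetal
// acceleration a_c (m/s^2) felt by a sensor r meters off the rotation axis.
// Note: no sign, and no absolute heading, can be recovered this way.
double spinRateFromCentripetal(double a_c, double r) {
    return std::sqrt(a_c / r);
}
```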

A magnetometer should be used instead since it’s essentially a compass.
I’ve heard mixed opinions about using compasses. We already have one of these, so I guess we’ll modify the ADXL345_I2C class into a magnetometer class (reusing its pre-written I2C code) and see how it goes from there.

An interesting experiment: put a smartphone on a flat surface and rotate it. The display usually doesn’t switch between portrait and landscape, unlike when you tilt the phone in normal use. That’s for the same reason: the accelerometer can’t distinguish orientations in the X-Y plane (yaw).
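A pure-math demonstration of that point: rotating the sensor about the vertical (Z) axis leaves the measured gravity vector unchanged, so any tilt angle computed from gravity is identical for every heading. The struct and function names below are just for illustration.

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

// Rotate a vector about the Z (vertical) axis by yawRad radians.
// Gravity points along -Z, so it is a fixed point of this rotation:
// every yaw angle produces the same accelerometer reading.
Vec3 rotateAboutZ(Vec3 v, double yawRad) {
    return { v.x * std::cos(yawRad) - v.y * std::sin(yawRad),
             v.x * std::sin(yawRad) + v.y * std::cos(yawRad),
             v.z };
}
```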

The key advantage of the complementary filter is that it’s far simpler than approaches like the Kalman filter, provided you make certain assumptions about the signals. For an aerial vehicle, for example, you can assume that a total accelerometer magnitude near 1 g indicates a stable orientation.
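That assumption can be sketched in code as a simple gate: trust the accelerometer correction only when the total acceleration magnitude is near 1 g, and integrate the gyro alone otherwise. The function name and the 0.1 g tolerance below are illustrative choices, not a standard.

```cpp
#include <cmath>

// Gate for the complementary-filter correction step: returns true when the
// total acceleration (in units of g) is close enough to 1 g that the
// accelerometer plausibly measures gravity alone.
bool accelerometerTrustworthy(double ax, double ay, double az) {
    double magG = std::sqrt(ax * ax + ay * ay + az * az);
    return std::fabs(magG - 1.0) < 0.1;
}
```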
If you have wheel encoders on a wheeled robot you can make similar assumptions about rotation based on the left-right count difference.

A Kalman filter-based state estimator would work well here. Do a search for “tilt.c” on the web for an example that many people have used.

-Kevin

I would not recommend using a magnetometer. Our team has spent quite some time trying to come up with an IMU solution for our robot, and magnetometers are very unreliable when surrounded by motors, all of which create large fluctuating magnetic fields. I would suggest using a combined accelerometer and gyroscope to get as accurate a measurement as possible, possibly with an off-board processor for higher accuracy and speed.

You might be interested in the Nav6 open source IMU. https://code.google.com/p/nav6

It’s got a 3-axis gyro, 3-axis accelerometer, and 3-axis magnetometer, plus a dedicated chip running fusion and auto-calibration algorithms. The open-source firmware converts all of that into yaw/pitch/roll and linear-acceleration values. It’s very accurate.

There are also cRIO libraries for C++, Java, and LabVIEW, along with sample robot code.

But if you want to learn the fusion algorithms and do it yourself, I recommend reading the wiki at the link above; under the Advanced Topics section, see the References page.

  • scott

I’ve had good experience with the IMU Sensor Fusion Algorithm from Sebastian Madgwick: http://www.x-io.co.uk/open-source-imu-and-ahrs-algorithms/