We were playing with the KOP Gyro today and came up with some helpful hints. There have been many great threads on the gyro but I did not see this particular technique/procedure.
The datasheet for the gyro says the nominal sensitivity is 0.007 V/deg/s, which is the value used by the Gyro class. But the datasheet also says the actual value can be anywhere between 0.0062 and 0.0078. Since the error accumulates, it can lead to significant drift (documented by many CDers). So getting the sensitivity correction right for your particular gyro is very important.
A couple of seconds after power-on, try this calibration loop (with the gyro level and not moving):
reset the accumulator (myGyro->Reset())
wait for a while (taskDelay(sysClkRateGet() / 4))
read the angle (myGyro->GetAngle())
if the angle is positive, decrease the sensitivity by 0.0001 (myGyro->SetSensitivity())
if the angle is negative, increase the sensitivity by 0.0001 (myGyro->SetSensitivity())
keep track of the choice to increase or decrease the sensitivity
if the angle is positive and the last adjustment was an increase, break out of the loop
if the angle is negative and the last adjustment was a decrease, break out of the loop
if the sensitivity would go outside 0.0062 to 0.0078, break out of the loop
After we did this, 3 different KOP gyros drifted less than 1 degree in 10 minutes. We could add temperature-based correction to make it a little better, but the datasheet says that effect is only +/- 2%, so I am not sure we will try. We could also make a smarter choice between the last two values to improve things a bit.
We still advocate resetting the accumulator (as recommended by many CDers) and only using the KOP gyro for tens of seconds at a time, but this calibration does make life easier and leaves us with less concern about the drift. PM me if you want the code.
// gyro should be out of cal mode by now,
// try to find and set the best sensitivity
float fAngleSensitivity = Gyro::kDefaultVoltsPerDegreePerSecond;
float fAngle;
bool bLastAdjustmentWasIncrease = false;
bool bFirstPass = true;
unsigned uLoop;
for(uLoop = 0; uLoop < 10; uLoop++)
{
    // reset the accumulator, stop a while and see if we drifted
    pGyro->Reset();
    taskDelay(sysClkRateGet() / 4);
    fAngle = pGyro->GetAngle();
    if(!bFirstPass && (fAngle > 0.0) && bLastAdjustmentWasIncrease)
    {
        // we were adjusting up then drifted positive - good enough
        break;
    }
    if(!bFirstPass && (fAngle < 0.0) && !bLastAdjustmentWasIncrease)
    {
        // we were adjusting down then drifted negative - good enough
        break;
    }
    bFirstPass = false;
    if(fAngle > 0.0)
    {
        fAngleSensitivity -= 0.0001;
        bLastAdjustmentWasIncrease = false;
    }
    else
    {
        fAngleSensitivity += 0.0001;
        bLastAdjustmentWasIncrease = true;
    }
    pGyro->SetSensitivity(fAngleSensitivity);
    if((fAngleSensitivity >= (Gyro::kDefaultVoltsPerDegreePerSecond + 0.0008)) ||
       (fAngleSensitivity <= (Gyro::kDefaultVoltsPerDegreePerSecond - 0.0008)))
    {
        // the next step would be outside any reasonable value according to the datasheet
        break;
    }
}
pGyro->Reset();
bCalibrated = true;
I’m a bit confused about the calibration method you are using. The sensitivity and the bias are two different things. If the angle drifts upwards, doesn’t that indicate that the bias is too low? How does adjusting the sensitivity downwards resolve this? I feel like I must be missing something in how this works.
This method is only to adjust the sensitivity, which the datasheet says could be anywhere from 0.0062 to 0.0078. The bias is a different matter. My gyros always come up at zero. How could there be a bias if one adopts the initial position of the robot as the zero-degree orientation?
The bias is the voltage output when there is no rotation. For a 5V gyro this is usually 2.5V nominal. WPILib calibrates the bias of the gyro as part of the constructor. Errors in this calibration, or drift of the bias point, result in drift in the signal.
The sensitivity is measured in mV/degree/s. Your calibration is at 0 degrees/s. How does this yield usable data about the sensitivity?
The 2.5V comes from a precision source, though it could change as read at the A2D because of A2D errors, cable length, etc. But I don’t think it matters because (as you pointed out) WPILib takes this out in the Gyro constructor.
I am attempting to find the per-part error in the sensitivity, not do a dynamic calibration of the sensitivity. If the sensitivity is not 0.007 (and it is unlikely to be exactly 0.007 per the datasheet), the output of GetAngle() and GetRate() drifts (often a lot, even just sitting on the bench). This is not a dynamic calibration - I don’t think that is required for the way we use the gyro in FRC.
Huge caveat: I have yet to try it on the robot. No doubt noise, shocks, and approaching the maximum rated rate are still problems. It does show 90-degree turns accurately on the bench and returns to zero. It does not seem to vary from power cycle to power cycle, but it does vary from gyro to gyro. It probably varies with temperature, but we only use the gyro in autonomous so I haven’t looked into it.
This is not a panacea. I was only trying to start out with a better sensitivity setting for any given gyro part/board.
The argument for bias vs. sensitivity does make sense; can the same pattern be applied to the bias instead of the sensitivity? (i.e., is there a Gyro::SetBias method?)
Also, what was the final sensitivity value you reached with this calibration? Was it a realistic value?
If what you did cut down on the drift significantly at 0 deg/s, it seems as though it would have to push the sensitivity to an extreme value, where any error in bias no longer accumulates into a significant angle (though the voltage error would still be present). In that case, the reported angle would be very inaccurate when any real rotation is applied.
We limited the change in sensitivity to the range specified in the datasheet. The final number was different for each gyro. We tried three and got 0.0064, 0.0065 and 0.0066. They still drift some but less so.
I realize this is not a full-scale dynamic calibration. Such a calibration is an elaborate exercise. But the gyros consistently drift sitting on the bench. If they are integrating noise, I figured we could at least try to get the at-rest measurements to hover around zero.