Example gyro code released.

What kind of gyro are you using? Did you try the manual method that uses Set_Gyro_Bias()? Just make sure the gyro is motionless as the bias is being calculated.


We are using ADXRS300ABG & ADXRS300EB-ND gyros. Yes, we tried the manual method that uses Set_Gyro_Bias().

How much drift are you seeing? (You need to calibrate the scaling factor first.)


Ok, I just looked into this as well and you’re right. The darn things are too expensive! (now we need to buy a new gyro…)

Edit: Actually, on second thought, does anyone know for sure if the fact that FIRST gave it to us in previous years makes a difference?

The values for “rate” and “angle” are empty in the terminal window. Once in a while, “angle” is 0. When we manually set the bias, the values for “rate” and “angle” are gibberish… Also, sometimes the dreaded “windows fatal error” screen comes up. We have followed the steps that were in the readme.txt file, and no changes to the code were made, except for changing to the right gyro model #.

I think you’ll need to find a different windows machine to use before you’ll be able to make any progress.


Hey, thanks. We got it to work. We took it to a “quiet” place, and it worked fine. Maybe it’s those machines that messed it up…

We used the code with the ADXRS300 as well and were seeing about 8 degrees of drift over 2-1/2 minutes on a motionless robot. We modified the code to average the bias over more than a single group of samples and increased the sample size to 16 to get the extra precision. That seemed to improve the results.
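For anyone curious what that modification might look like, here is a minimal sketch of averaging the bias over many raw samples while keeping sub-count precision. The function name and scale factor are my own illustrative assumptions, not names from the released frc_gyro code:

```c
/* Minimal sketch (assumed names, not the released frc_gyro code):
 * accumulate many motionless-gyro ADC readings and keep extra
 * fractional precision in the averaged bias. */
#include <assert.h>

#define BIAS_SCALE 16L  /* carry four extra fractional bits */

/* Average n raw 10-bit readings taken while the gyro is motionless;
 * returns the bias in units of 1/BIAS_SCALE ADC counts so that
 * sub-count precision survives the divide. */
long calc_gyro_bias(const unsigned int samples[], int n)
{
    long sum = 0;
    int i;

    for (i = 0; i < n; i++) {
        sum += samples[i];
    }
    return (sum * BIAS_SCALE) / n;
}
```

Averaging a larger pool of samples reduces the noise in the bias estimate, which is what slows the angle drift on a motionless robot.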

Can you tell us what you were seeing for drift over time? I just wasn’t sure what to expect for results and so we tried these modifications.

Also, I have a question. You’re locking the A/D onto channel 1, presumably to eliminate the acquisition time when switching channels. How much time does that actually take (for teams that want more than one analog input)?

And do you know what the conversion time is once the channel is switched?

Thanks again for posting the code.


Without actively tracking the bias, you’re getting performance that is about what I would expect from an ADXRS300. I’d be pretty happy with a gyro that only drifted a degree over the entire autonomous period. For reference, I have an ADXRS150EB that drifts a degree over about a minute, and I tested an old GyroChip that did even better.

No, I take control of the ADC hardware because I’m trying to be pretty efficient and not use up a lot of CPU time. You’ll notice that in response to the timer interrupt, I read the ADC result register and then immediately start another conversion. If you used the ADC after my conversion was done, I’d lose the conversion result the next time the timer interrupt fired off. One way around this is to increase the timer interrupt rate and interleave the ADC measurements.
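The interleaving idea above can be sketched roughly as follows. This is not the released code: the `adc_*` helpers only simulate the PIC ADC registers so the logic can be exercised off-target, and all of the names are illustrative assumptions:

```c
/* Sketch of interleaving two ADC channels across timer interrupts:
 * each tick reads the result of the conversion started last time,
 * then switches channels and starts the next conversion.  The adc_*
 * helpers simulate the PIC ADC hardware; names are assumptions. */
#include <assert.h>

static const unsigned int fake_input[2] = { 300u, 700u }; /* simulated sensors */
static int hw_channel = 0;      /* channel the simulated ADC is muxed to */
static unsigned int hw_result;  /* last completed conversion result      */

static unsigned int adc_read_result(void) { return hw_result; }
static void adc_select_channel(int ch)    { hw_channel = ch; }
static void adc_start_conversion(void)    { hw_result = fake_input[hw_channel]; }

static int current_channel = 0; /* channel whose conversion is pending */
static unsigned int latest[2];  /* freshest reading per channel        */

void adc_interleave_init(void)
{
    current_channel = 0;
    adc_select_channel(current_channel);
    adc_start_conversion();     /* prime the first conversion */
}

/* Called from the periodic timer interrupt. */
void timer_isr(void)
{
    /* The conversion that just finished belongs to the channel
     * selected on the previous interrupt. */
    latest[current_channel] = adc_read_result();

    /* Alternate channels; each sensor is now sampled at
     * half the interrupt rate. */
    current_channel ^= 1;
    adc_select_channel(current_channel);
    adc_start_conversion();
}

unsigned int get_latest(int ch) { return latest[ch]; }
```

The trade-off is that each channel is sampled at half the interrupt rate, which is why the interrupt rate would need to be increased to keep the gyro’s effective sample rate the same.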

I’m sure it’s considerably less time than the timer interrupt period, but I don’t know the exact value. The data sheet would be a good document to consult for this information.


Is it possible for us to use the ADC for say… another gyro?

Yes, but why would you?



I am a member of team 335 and one of two programmers. I have only been programming for a year and hardly know C (only what I have read in books).
I was wondering how to use the CMU camera and the gyro together for locating and directing a robot during the 15-second autonomous period. Here is the desired algorithm:

field [1/2 field length][1/2 field width]
Initialize camera

I had a question similar to Tom’s: I’d like to use another gyro not to measure heading, but as a tip sensor. I would also like to use the accelerometer (instead of quadrature encoders) in autonomous mode, which I can’t do with the ADC locked onto one channel.

The accelerometer is the sensor of choice to measure pitch or roll. The gyro code can be modified to interleave gyro and accelerometer measurements.

The encoders are a far better choice to measure distance than an accelerometer.


Yeah, a lot of people want to use the scripting code with the camera code. I started on the merge a few days ago and can’t really estimate when it’ll be done because I don’t yet fully understand what the camera code is doing. That coupled with camera code that’s sprinkled throughout the default code instead of being confined to one source file makes the merge a bit of a headache.


Wait… You are going to make it so we can use that scripting stuff to control the robot with the camera? If so then that is absolutely the best news I have ever heard :yikes: :ahh: :smiley: :slight_smile: . I wish the default camera code was designed more for autonomous operation, rather than with a joystick and such. It would have saved me a lot of trouble.

Don’t start celebrating yet. I haven’t a clue if we can get this working.


When I said using the accelerometer instead of quadrature encoders, I meant that it would be used alongside the gear tooth sensors, so my team doesn’t have to pay for quadrature encoders and can use what is given in the kit. The accelerometer would only be used to integrate to velocity, not position, so teams could use only the sensors provided in the kit and still have a little more knowledge of where their robot was going on the field.
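The velocity-integration idea above can be sketched in a few lines. The function name is an illustrative assumption, not released code, and note the caveat that any bias error in the accelerometer integrates linearly into the velocity estimate, which is one reason encoders and gear tooth sensors are usually preferred for odometry:

```c
/* Sketch (assumed helper, not released code): rectangular
 * integration of accelerometer samples into a velocity estimate.
 * Any constant bias error in accel[] accumulates linearly in v. */
#include <assert.h>
#include <math.h>

/* accel[]: samples in m/s^2, n: sample count, dt: sample period
 * in seconds.  Returns the estimated change in velocity. */
double integrate_velocity(const double accel[], int n, double dt)
{
    double v = 0.0;
    int i;

    for (i = 0; i < n; i++) {
        v += accel[i] * dt;
    }
    return v;
}
```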

I am having a bit of a problem with the gyro code, at least I think I am. Just to test out the gyro (a BEI GyroChip ARQS-00075), I plugged it in and downloaded the code (edit: the newest frc_gyro code), and in the terminal window the Gyro Angle output just keeps increasing without bound. When I unplug the gyro while this is running, it starts decreasing instead. To make sure the gyro was working, I tried outputting the raw analog value, and it seemed fine. Any ideas? (Sorry I don’t have more exact information; I don’t have all the hardware at my house!)

I doesn’t sound like you read the readme.txt file.