Gyro/Accelerometer

I am developing some gyro and accelerometer code for my team and have one big question: is there some standard ratio between the analog output from the gyro/accelerometer and their values in degrees/sec or meters/sec^2? I’m not even sure what units the accelerometer outputs.

I understand that there are documents at www.analog.com to help out, but they are way too confusing for me to understand. Can someone put it in simpler terms?

Look at http://www.kevin.org/frc
It has everything you need to get the gyro working quickly in frc_gyro.zip.
The accelerometer is a bit harder… but it’s not the best way to track position anyway.

I understand that Kevin Watson has developed code for the gyro, but I am developing something independently. I just want to know the ratio from analog value to actual value. I don’t want to go fishing through Mr. Watson’s code if someone already knows where to look.

It’s all in the spec sheets, which are at http://www.usfirst.org/robotics/2006/2006specsheets.htm
Keeping in mind that the analog signal is between 0 V and 5 V, and that the spec sheets give (on about the third page) the sensor’s output in units of <unit of acceleration or angular velocity> per volt, you can work out which analog value corresponds to which physical value.
Keep in mind that the sensor might not use the whole 0-1023 range (or whatever range you get depending on how many bits of precision your A/D has). You just need to read the spec sheet carefully.
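
Here’s a minimal sketch of that conversion, assuming a 10-bit A/D referenced to 5 V. The zero-rate voltage and sensitivity numbers below are placeholders, so swap in the figures from your own spec sheet:

```c
/* Minimal sketch: converting a raw A/D reading into a physical value.
   Assumes a 10-bit converter (0-1023) referenced to 5 V; the sensitivity
   and zero-rate voltage here are placeholders, not real spec-sheet data. */
#include <stdio.h>

#define ADC_MAX        1023.0   /* full-scale count of a 10-bit converter */
#define ADC_REF_VOLTS  5.0      /* analog reference voltage */

/* Hypothetical spec-sheet numbers; check your own part: */
#define ZERO_RATE_VOLTS 2.5     /* output voltage with no rotation */
#define VOLTS_PER_DPS   0.0125  /* 12.5 mV per deg/sec */

double counts_to_deg_per_sec(unsigned int counts)
{
    double volts = (counts / ADC_MAX) * ADC_REF_VOLTS;      /* counts -> volts */
    return (volts - ZERO_RATE_VOLTS) / VOLTS_PER_DPS;       /* volts -> deg/sec */
}

int main(void)
{
    unsigned int raw = 540;  /* pretend this came from the gyro's analog channel */
    printf("rate = %f deg/sec\n", counts_to_deg_per_sec(raw));
    return 0;
}
```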

What you are looking for is the sensitivity of the sensor; you should be able to find it in the data sheet for whichever sensor you are working with. It will be listed as something like 26 mV/(deg/sec) (just as an example), and you’ll need to do a little math to convert that to counts based on the micro’s A/D resolution.
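
To put some numbers on that, here’s a rough sketch using the 26 mV/(deg/sec) figure above and assuming a 10-bit A/D with a 5 V reference (about 4.88 mV per count); the bias reading is made up, so measure your own sensor at rest:

```c
/* Sketch of the sensitivity-to-counts math. The sensitivity value is the
   example figure from the post above; the A/D parameters and bias reading
   are assumptions, so adjust them for your actual hardware. */
#include <stdio.h>

#define MV_PER_COUNT       (5000.0 / 1024.0)  /* ~4.88 mV per A/D count */
#define SENSITIVITY_MV_DPS 26.0               /* example sensitivity in mV/(deg/sec) */

int main(void)
{
    double counts_per_dps = SENSITIVITY_MV_DPS / MV_PER_COUNT;  /* ~5.3 counts per deg/sec */

    int raw  = 620;   /* hypothetical reading while rotating */
    int bias = 512;   /* hypothetical reading at rest */
    double rate = (raw - bias) / counts_per_dps;

    printf("%.2f counts per deg/sec, rate = %.1f deg/sec\n",
           counts_per_dps, rate);
    return 0;
}
```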

Don’t forget the calibration routine I sent you when we gave you the sample gyro code to work off of; it should still be in your PMs. Once you have the values for 90, 180, 270, and 360/0 degrees, you can figure out the intermediate angles pretty simply.
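
For anyone else following along, one simple way to use calibration points like those is straight linear interpolation between them. This is only a sketch with made-up numbers, not the routine from the PM:

```c
/* Rough sketch: record the integrated gyro count at a few known headings,
   then interpolate linearly for anything in between. The calibration
   numbers here are invented; substitute your own measurements. */
#include <stdio.h>

static const long   cal_counts[]  = { 0, 4510, 9040, 13550, 18080 };
static const double cal_degrees[] = { 0.0, 90.0, 180.0, 270.0, 360.0 };
#define NUM_CAL 5

double counts_to_degrees(long counts)
{
    int i;
    for (i = 1; i < NUM_CAL; i++) {
        if (counts <= cal_counts[i]) {
            /* fraction of the way between the two surrounding calibration points */
            double frac = (double)(counts - cal_counts[i - 1]) /
                          (double)(cal_counts[i] - cal_counts[i - 1]);
            return cal_degrees[i - 1] +
                   frac * (cal_degrees[i] - cal_degrees[i - 1]);
        }
    }
    return cal_degrees[NUM_CAL - 1];  /* past the last calibration point */
}

int main(void)
{
    printf("%.1f degrees\n", counts_to_degrees(6800));  /* about 135 degrees */
    return 0;
}
```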