Quote:
|
Originally Posted by miketwalker
I've had some very interesting results when I've been working on using loop counts as timers for distance calculations. I kept getting results being half of what I wanted. So, I threw in a timer and manually timed how many loops would go by in a minute... and after dividing my results I found that each timer tick took very close to twice the 26.2ms data stream (since I did it by hand I don't know if it was exactly twice or just close). I was wondering why this is.
|
Since you talked earlier about using a lot of floats, I'd assume you are doing more than 26.2ms worth of calculations per loop. You can check the packet number: if you only get every other packet, your program is taking too long.
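A minimal sketch of that packet-number check in C. This is an illustration, not the controller's actual API: the counter is assumed to be an unsigned 8-bit value that increments once per packet and wraps at 255 (adapt the wiring to wherever your default code exposes the packet number):

```c
/* Hypothetical helper: count packets skipped between consecutive loop
 * iterations by differencing the 8-bit packet counter. */

static unsigned char last_packet = 0;
static int first_call = 1;

/* Returns how many packets were missed since the previous call. */
int packets_missed(unsigned char packet_num)
{
    int missed;
    if (first_call) {
        /* No previous packet to compare against yet. */
        first_call = 0;
        last_packet = packet_num;
        return 0;
    }
    /* unsigned char subtraction handles the wrap from 255 back to 0. */
    missed = (unsigned char)(packet_num - last_packet) - 1;
    last_packet = packet_num;
    return missed;
}
```

If your loop really is running at twice the packet period, this should report one missed packet on nearly every iteration, which matches the "half the expected count" result described in the quote.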