Integer division

The following is part of Kevin's accelerometer code:

 
#include "p18f452.h"
#include "accel.h"

unsigned int X_Axis_Total_Period = 2/1000; // x-axis total cycle time (T2 in the data sheet)
unsigned int X_Axis_Pulse_Period = 0; // x-axis amount of time the pulse is high (T1 in the data sheet)
unsigned int Y_Axis_Total_Period = 2/1000; // y-axis total cycle time (T2 in the data sheet)
unsigned int Y_Axis_Pulse_Period = 0; // y-axis amount of time the pulse is high (T1 in the data sheet)

My debugger says it loops in a file called "16-bit integer division", leading me to think it hangs on the 2/1000 definition of T2.

Any ideas?

Integer division: 2/1000 is equal to 0

In fact, 999/1000 = 0 unless your math routine does rounding…
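
For anyone following along, here is a minimal standalone C sketch (plain desktop C, not the PIC code) showing the truncation:

#include <stdio.h>

int main(void)
{
    unsigned int a = 2 / 1000;      /* both operands are integers, so the result truncates to 0 */
    unsigned int b = 999 / 1000;    /* also 0: the fractional part is simply dropped */
    double c = 2.0 / 1000.0;        /* 0.002: floating-point division keeps the fraction */

    printf("%u %u %f\n", a, b, c);  /* prints: 0 0 0.002000 */
    return 0;
}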

Lovely.

How do the programmers in FIRST usually get around things like that?

(How do they express non-integer values?)

Umm, you’re wrong. In my original code I initialize X_Axis_Total_Period and Y_Axis_Total_Period to zero.

-Kevin

The easiest way would be to use floating-point numbers. The more efficient way (because floating point takes more processor instructions) would be to scale everything by some power of 10, i.e., your 2/1000 would become 2 if you multiplied everything that uses that value by 1000.
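
A rough sketch of that scaling idea (the _ms names and the duty-cycle helper below are illustrative, not Kevin's actual code):

/* Keep the periods in whole milliseconds instead of fractional seconds. */
unsigned int X_Axis_Total_Period_ms = 2;   /* 2 ms, i.e. 2/1000 of a second scaled up by 1000 */
unsigned int X_Axis_Pulse_Period_ms = 0;   /* measured pulse-high time, also in whole ms */

/* Downstream math stays in integers; multiply before dividing so the
   intermediate result doesn't truncate to 0. */
unsigned int duty_percent(unsigned int pulse_ms, unsigned int total_ms)
{
    return (pulse_ms * 100U) / total_ms;
}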

This is precisely how the code works. Those variables aren’t in units of seconds, but in timer ticks. Each tick counts as 100 ns of elapsed time.

-Kevin
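
To put numbers on that: at 100 ns per tick, the intended 2 ms total cycle time works out to 20000 ticks. A minimal sketch of that conversion (the helper name is hypothetical, not from the actual code):

#define TICKS_PER_US 10U   /* 1 us / 100 ns = 10 ticks per microsecond */

/* Convert a period in microseconds to 100 ns timer ticks using integer math only.
   2000 us (2 ms) -> 20000 ticks, which still fits in a 16-bit unsigned int. */
unsigned int us_to_ticks(unsigned int microseconds)
{
    return microseconds * TICKS_PER_US;
}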