Quote:
|
Originally Posted by KenWittlief
int is signed, isn't it? So if a = 4 you end up with 40,000, which is a negative number divided by 262?
If this is what you changed and it got weird on you, just work with the tick counts in your code and forget the fancy translations.
You could also try running with a few printf statements and see what is happening at the start and end of each state - there might be something weird going on.
|
Nice catch, Ken. The code this is based on is an example that I posted on the forums here:
http://www.chiefdelphi.com/forums/sh...d.php?p=233912
The problem is that for any time period longer than about 3 seconds, 10000*a exceeds the maximum value of a plain "int" (which is 16 bits on this controller, so it tops out at 32,767), and the multiplication overflows and wraps around to a negative number.
You can solve this by changing the definitions of SECS_TO_TICKS and MSECS_TO_TICKS to the following, which use a "long" instead of an "int" (you'll notice that tickCount is already defined as a "long"... at least I was thinking clearly there!):
Code:
//A handy-dandy macro to convert a number of seconds into a number of 26.2ms ticks
#define SECS_TO_TICKS(a) ( ((long)(a)) * (long)10000 / (long)262 )
//And milliseconds too:
#define MSECS_TO_TICKS(a) ( ((long)(a)) * (long)10 / (long)262 )
Whee... you can never have enough type casts.
I remember when I originally typed in the code, for some reason I was thinking that seconds would only need to be multiplied by 1000. I caught that mistake, but forgot that by changing it to 10000 I also needed a wider integer type for the math.
So for anyone using the example I posted here, that change should fix it quite nicely.
Nice to see my examples are being used and are helpful for some teams.