Time

Is there any accurate way to tell time in autonomous mode?

My teammates on electronics say you can build a custom timer circuit, or you can figure out the loop time, add one to a counter every time the code loops, and work out the elapsed time from that. I’m not really into this stuff, so sorry I can’t go into more detail.

Cory

Thanks. Yeah, let’s just say counting loops isn’t very appealing.

There is one way: assuming your driver holds the joysticks forward for the entire autonomous mode, your code can tell the 15 seconds are over when the joysticks start giving a value. You could also watch for the auton_mode bit to change and mark the 15 seconds that way.

Then you could count loops in the background and work out the time of one loop by dividing the 15 seconds by the total number of loops that occurred during one autonomous period, then DEBUG the value.

Sorry if this is too complicated, but telling time isn’t really practical on the BASIC Stamp.
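For what it’s worth, here’s a minimal sketch of that calibration idea on the Stamp (the variable names are made up, I’m assuming an auton_mode bit that reads 1 during the autonomous period as mentioned above, and I haven’t run this):

```
' Sketch only - made-up names; assumes auton_mode reads 1 during the
' 15-second autonomous period and drops to 0 when it ends.

loop_count  VAR Word           ' passes through the loop during autonomous
ms_per_loop VAR Word           ' calibrated loop time in milliseconds

auton_loop:
  ' ... autonomous code goes here (including the normal SERIN/SEROUT) ...
  loop_count = loop_count + 1
  IF auton_mode = 1 THEN auton_loop   ' keep counting until the bit drops

  ' autonomous just ended, so 15000 ms went by:
  ms_per_loop = 15000 / loop_count
  DEBUG ? ms_per_loop                 ' print the result for calibration
```

Once you know ms_per_loop, you can count loops in later runs and multiply to estimate elapsed time.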

*Originally posted by Goya *
**There is one way: assuming your driver holds the joysticks forward for the entire autonomous mode, your code can tell the 15 seconds are over when the joysticks start giving a value. You could also watch for the auton_mode bit to change and mark the 15 seconds that way.

Then you could count loops in the background and work out the time of one loop by dividing the 15 seconds by the total number of loops that occurred during one autonomous period, then DEBUG the value.

Sorry if this is too complicated, but telling time isn’t really practical on the BASIC Stamp. **

Your driver will not be standing in the drivers’ booth. They cannot be in contact with the controls during the 15 seconds.

*Originally posted by Cory *
**My teammates on electronics say you can build a custom timer circuit, or you can figure out the loop time, add one to a counter every time the code loops, and work out the elapsed time from that. I’m not really into this stuff, so sorry I can’t go into more detail.

Cory **

Using a 555, where R × C sets the time (R in kΩ times C in µF comes out in milliseconds), right? Worked numbers are below.

If you are allowed to do that… I mean, what kind of input would you need?

Digital or analog?

Digital input depends on the bit width, and analog is always 0-255, right?
So I guess digital will do, because you only need to know when the signal goes high.

Thanks, Cory…
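On the 555 numbers: the usual monostable formula is t ≈ 1.1 × R × C, so something like R = 1 MΩ with C = 13.6 µF comes out to roughly 15 seconds. Those component values are just an illustration, not a recommendation, and whether a custom timing circuit is even legal is a separate question.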

Alright, here’s what I know:

The OI passes data to the RC in a very continuous, predictable fashion: every 26 milliseconds, one data packet is sent to the robot.

Now, the robot’s cycle is fully dependent on the code it’s running through, so it may miss a cycle or two. Therefore, InnovationFIRST created a variable named delta_T that keeps track of how many packets you missed in the last cycle. Can you see where this is going?

If you keep track of delta_T, you can tell how many milliseconds have passed. Errors will add up, as you get caught in the middle of cycles and so on… but it’s better than nothing.

Someone correct me if I’m wrong.
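Roughly, that bookkeeping could look like this on the Stamp (a sketch only: delta_t and the ~26 ms packet period come from the default code as described above, elapsed_ms is a name I made up, and I haven’t run this):

```
' Sketch only - delta_t and the ~26 ms packet period are from the default
' code; elapsed_ms is made up.  A Word rolls over after 65535 ms (about
' 65 seconds), which is plenty for a 15-second autonomous period.

elapsed_ms VAR Word            ' running total of milliseconds

main_loop:
  ' ... normal SERIN / input section of the default code ...

  ' delta_t = packets missed while our code ran, so (delta_t + 1)
  ' packets of about 26 ms each have gone by since the last pass.
  elapsed_ms = elapsed_ms + ((delta_t + 1) * 26)

  IF elapsed_ms < 15000 THEN still_autonomous
    ' 15 seconds are up - switch to whatever comes next here
still_autonomous:

  ' ... normal SEROUT / output section of the default code ...
  GOTO main_loop
```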

*Originally posted by Cory *
**Your driver will not be standing in the drivers’ booth. They cannot be in contact with the controls during the 15 seconds. **

Did you read this in the rules somewhere? If it’s there, I missed it, and I haven’t heard anyone else mention it before.

I think the drivers WILL be allowed in the drivers’ stations, and may touch the controls as much as they want. The operator interface, on the other hand, will refuse to forward any operator inputs (with the exception of the E-Stop button) to the robot until the 15 seconds have passed.

I read this in the rules.
Rule 7.6 states that the only operator control allowed is pressing the E-stop button. I also thought I read somewhere else that the operators will not even be near the controls. I may be wrong, though.

Cory

That’s the only control that is ACTIVATED, not the only one that can be touched. The controls won’t be working; in fact, all values will be fixed at 127 for the joysticks and 0 for the buttons.

So, it is legal to touch the controls, but no signals will be passed over to the robot. That way we’ll know when the autonomous mode is over.

However, if the program uses the auton_mode bit, the program will be able to know exactly when the 15 seconds are over, regardless of the driver’s joysticks.
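In code, that check is just a test on the bit each pass through the loop. A sketch (assuming the bit is called auton_mode, as above, and reads 1 during the 15 seconds; I haven’t verified the exact name or polarity against the default code):

```
' Sketch only - assumes auton_mode is 1 during autonomous and 0 afterwards.

  IF auton_mode = 0 THEN teleop_code
  ' still inside the 15 seconds: run the scripted autonomous routines
  GOTO end_of_pass

teleop_code:
  ' the bit dropped to 0, so autonomous is over: run normal joystick code

end_of_pass:
  ' fall through to the usual output section of the main loop
```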

*Originally posted by Gui Cavalcanti *
The OI passes data to the RC in a very continuous, predictable fashion: every 26 milliseconds, one data packet is sent to the robot.

Not entirely true…

Although the master processor gets updated data regularly, the Stamp chip only gets data, at best, every 25 ms. This is not an exact value, only an approximation, and it can be even longer if a packet of data is missed.

Basically, the packets are thrown at it one after another. If it’s still chugging away at some code when a packet gets there, that packet is gone. Then it has to wait another 25 ms for a new packet of data (a total of 50 ms), even if the Stamp finished executing its code at 26 ms and is sitting at the SERIN waiting for new data.

There is a variable in the code that will tell you how many packets have been sent. Normally, it will be 1, meaning that only 1 packet has been sent. If it is more, then it is (delta_t * 25ms).

There are commands that will pause the execution of your code. PAUSE [number_milliseconds] will halt execution of your code for that amount of time. I don’t think you can pause it for more than 8 packets of data though (200ms), or the RC flakes out. I’m not sure, haven’t tried it myself, just something I heard. Try it, it won’t break your robot or anything.
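Just to show the syntax (the 200 ms figure is only the rumor mentioned above, not something I’ve verified):

```
' Halt the program right here for 150 ms - under the rumored ~200 ms limit.
PAUSE 150
```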

There’s a thread on this already… instead of me trying to explain it, I’ll link there. I know there’s another one out there besides this one. Search for it.

http://www.chiefdelphi.com/forums/showthread.php?s=&threadid=10275&highlight=BASIC+stamp+time+deltat

*Originally posted by Jnadke *
**
There is a variable in the code that will tell you how many packets have been sent. Normally, it will be 1, meaning that only 1 packet has been sent. If it is more, then it is (delta_t * 25ms) **

I have used delta_t, and delta_t will not equal 1 if 1 packet has been sent. delta_t tells you how many packets you missed while executing your code; if you didn’t miss any, it will return 0. So the formula is actually more like ((delta_t + 1) * 25 ms). I don’t use this formula exactly, but keep in mind that if delta_t = 0 then 25 ms has passed, if it equals 1 then 50 ms, and so on. Again, this is not exact, but it is the closest you will be able to get.

In my personal experience I found that the Stamp gains 3 seconds over 1 minute. I guess this is not too bad, but it definitely is not good. Over 15 seconds it doesn’t gain too much, so if you are trying to keep track of time in autonomous mode you should be OK.

*Originally posted by rust710 *
**I have used delta_t, and delta_t will not equal 1 if 1 packet has been sent. delta_t tells you how many packets you missed while executing your code; if you didn’t miss any, it will return 0. So the formula is actually more like ((delta_t + 1) * 25 ms). I don’t use this formula exactly, but keep in mind that if delta_t = 0 then 25 ms has passed, if it equals 1 then 50 ms, and so on. Again, this is not exact, but it is the closest you will be able to get.

In my personal experience I found that the Stamp gains 3 seconds over 1 minute. I guess this is not too bad, but it definitely is not good. Over 15 seconds it doesn’t gain too much, so if you are trying to keep track of time in autonomous mode you should be OK. **

That is correct, delta_t is the number of packets missed.

The loop speed is actually 26ms, not 25. If you use 25, you will be off by 2.4 seconds over a minute.
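To spell out that arithmetic: counting 60 seconds at 25 ms per packet means 60,000 / 25 = 2,400 packets, but 2,400 packets at the real 26 ms each take 2,400 × 26 = 62,400 ms, about 62.4 seconds, so you end up roughly 2.4 seconds off per minute.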

*Originally posted by Joe Ross *
**The loop speed is actually 26ms, not 25. If you use 25, you will be off by 2.4 seconds over a minute. **

Oops. Well, I usually multiply seconds by 40 to get the number of counts and add until I get to that number. I wrote a counter that can theoretically count to 26 minutes but haven’t had time to test it. It would probably gain several minutes and is not worth it. Besides, can you imagine a 26-minute FIRST competition?
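(Presumably that limit comes from keeping the count in a single Word variable, which is an assumption on my part: a Word tops out at 65,535 counts, and at about 40 counts per second that’s 65,535 / 40 ≈ 1,638 seconds, somewhere around 27 minutes, before it rolls over.)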