#1
time-based or event-based TeleOp
This thread is a continuation of a discussion which began here. I'll post a couple of excerpts (these excerpts are not replies to each other, just pertinent info):
Last edited by Ether : 23-02-2012 at 11:43.
#2
Re: time-based or event-based TeleOp
Quote:
Teleop is just a slave task. It gets called every time there is a new packet available to process. If the packets came faster, then Teleop would just get called faster.
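In other words, the dispatch is event-driven rather than clocked. A rough C++ sketch of that shape (not the actual WPILib source; the helper names here are stand-ins for framework and user code):

```cpp
// Simplified sketch of the packet-driven dispatch described above: the
// shape of IterativeRobot's main loop, not the real WPILib source.
bool NewDataAvailable();   // stand-in: true when a fresh DS packet arrived
bool IsOperatorControl();  // stand-in: DS has us in teleop mode
void TeleopPeriodic();     // stand-in: user code, run once per packet

void RobotMainLoop()
{
    for (;;)
    {
        if (NewDataAvailable())       // no clock anywhere: the arrival of
        {                             // a DS packet is the only "tick"
            if (IsOperatorControl())
                TeleopPeriodic();     // faster packets => faster calls
            // autonomous/disabled dispatch elided
        }
    }
}
```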
#3
Re: time-based or event-based TeleOp
#4
Re: time-based or event-based TeleOp
Oh, I'd never use the nominal Teleop 20 ms as a timing device. I'd only tie an action to Teleop if the event/action doesn't make sense without a new driver order. The 20 ms certainly isn't guaranteed, and for many teams it isn't even close.

Anything I want to do based on a period, I do in Periodic Tasks. If it's a time-critical task, then I'll perform calculations based on a system time check. If it's a really critical task, then I'll use a Timed Loop structure. As you say, doing a system time check and calculation would work in Teleop, but the response action is still going to occur at wacky time intervals.

You've seen this Timing is Everything whitepaper before. I suppose I should add a watch on Teleop under different programming/cRIO/DS PC conditions that all contribute to really sloppy times.

Last edited by Mark McLeod : 23-02-2012 at 12:53.
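For teams not on LabVIEW, the system-time-check idea looks roughly like this in C++. Timer::GetFPGATimestamp() is the WPILib call for reading the FPGA clock; the integrator is a made-up example of rate-dependent math:

```cpp
// Sketch: scale rate-dependent math by measured elapsed time instead of
// assuming the nominal 20 ms per Teleop call. The integrator here is a
// made-up example; only Timer::GetFPGATimestamp() is real WPILib.
#include "Timer.h"

static double lastTime = -1.0;
static double integratedValue = 0.0;

void UpdateWithMeasuredDt(double rate)
{
    double now = Timer::GetFPGATimestamp();
    if (lastTime >= 0.0)
    {
        double dt = now - lastTime;    // actual period, not nominal 20 ms
        integratedValue += rate * dt;  // jittery call timing no longer
    }                                  // corrupts the result
    lastTime = now;
}
```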
#5
Re: time-based or event-based TeleOp
We recently ran into this issue as well. This is our first year using Jaguars on the CAN bus, and apparently a CAN bus command takes quite some time for a round trip. We are using 7 CANJaguars this year, and we have implemented our own cooperative multi-tasking robot loop instead of using either the SimpleRobot or IterativeRobot templates. It is based on the IterativeRobot class, though, so we are also using the "20 msec" default DS loop time.

The problem with this is that each loop involves updating each Jaguar, which means not only setting the speed (the Set() method) but possibly also calling GetPosition() or GetSpeed() if the Jaguar is involved in some sort of PID control. So the number of CAN messages per loop can be substantial. We found that our loop time consistently exceeded 50 msec, so we decided to do a SetPeriod of 100 msec; effectively, we are hammering the CAN bus less frequently. We are going to add some more diagnostic code to collect info such as the average number of CAN messages sent per loop and the average time spent in each loop sending CAN commands.

We are also experimenting with a bandwidth-limiting GetSpeed() method for the CANJaguar class. It checks the timestamp of the last query and, if the elapsed time between calls is shorter than a threshold, returns the last speed value instead of sending another CAN command to get a new one.

We probably don't have time this year, but after the season ends we would like to experiment with an alternate CANJaguar class that is more parallel to the PWM system: no matter how often Set() is called, it just updates a variable remembering the new speed, and a separate thread periodically sends the updates to the Jaguars. This would limit the bandwidth of the CAN traffic to a predictable level. Any comments on this approach?
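A minimal C++ sketch of both ideas, written as a wrapper around WPILib's CANJaguar rather than a modified class. The wrapper name, the 50 ms default threshold, and the Flush() hook are all made up for illustration, and real code would need a lock around the pending value if Set() and Flush() run on different threads:

```cpp
// Sketch of the throttled-GetSpeed and cached-Set ideas described above,
// as a wrapper around WPILib's CANJaguar. Illustrative names throughout.
#include "CANJaguar.h"
#include "Timer.h"

class ThrottledJaguar
{
public:
    explicit ThrottledJaguar(UINT8 deviceNumber, double minQueryPeriod = 0.050)
        : m_jag(deviceNumber),
          m_minQueryPeriod(minQueryPeriod),
          m_lastQueryTime(-1.0),
          m_cachedSpeed(0.0),
          m_pendingValue(0.0)
    {
    }

    // Bandwidth-limited speed read: only hit the CAN bus if the cached
    // value is older than the threshold; otherwise return the cache.
    double GetSpeed()
    {
        double now = Timer::GetFPGATimestamp();
        if (m_lastQueryTime < 0.0 || now - m_lastQueryTime >= m_minQueryPeriod)
        {
            m_cachedSpeed = m_jag.GetSpeed();  // actual CAN round trip
            m_lastQueryTime = now;
        }
        return m_cachedSpeed;
    }

    // PWM-style Set(): just remember the value; no CAN traffic here.
    void Set(double value) { m_pendingValue = value; }

    // Called from a separate periodic task/thread to push the latest
    // value to the controller at a predictable rate.
    void Flush() { m_jag.Set(m_pendingValue); }

private:
    CANJaguar m_jag;
    double m_minQueryPeriod;
    double m_lastQueryTime;
    double m_cachedSpeed;
    double m_pendingValue;
};
```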
Last edited by mikets : 23-02-2012 at 14:50.
#6
Re: time-based or event-based TeleOp
What's the downside of "Wait Until Next Multiple"? When would you ever want to use "Wait" instead?

I apologize if I've asked this question before, but I can't find the answer in my notes: how does LabVIEW implement parallel tasks on the cRIO? Does it use VxWorks to assign each parallel task to its own thread, or does LabVIEW have its own built-in preemptive time-slicing scheduler?
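For non-LabVIEW readers, the semantic difference between the two primitives can be sketched in C++; this is an analogy only, with std::chrono standing in for the LabVIEW timing functions and DoWork() as a placeholder:

```cpp
// Illustration (not LabVIEW): "Wait" vs "Wait Until Next Multiple"
// semantics in a periodic loop, using C++11 chrono as a stand-in.
#include <chrono>
#include <thread>

void DoWork() { /* one iteration of the task goes here */ }

// "Wait"-style loop: fixed sleep AFTER the work, so the real period is
// (work time + 20 ms) and the loop's phase drifts with the workload.
void LoopWithWait()
{
    for (;;)
    {
        DoWork();
        std::this_thread::sleep_for(std::chrono::milliseconds(20));
    }
}

// "Wait Until Next Multiple"-style loop: sleep until the clock reaches
// the next multiple of 20 ms. Iterations stay phase-aligned with no
// cumulative drift; the trade-offs are an unpredictable first-iteration
// delay and a skipped slot whenever the work overruns a boundary.
void LoopWithWaitUntilNextMultiple()
{
    using namespace std::chrono;
    const milliseconds period(20);
    for (;;)
    {
        DoWork();
        auto now = steady_clock::now().time_since_epoch();
        std::this_thread::sleep_for(period - (now % period));
    }
}
```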