07-02-2011, 22:22
Daniel_H
Registered User
FRC #1156 (Under Control)
Team Role: Mentor
 
Join Date: Jan 2003
Rookie Year: 2003
Location: Brazil
Posts: 171
Sensor Sampling Rates and divagations

This is an issue that has bugged me ever since I started using LabVIEW: sample rates.

If we use Get Sample Rate on an analog sensor, we see values such as 50 kHz, which is a common rate for a DSP application. BUT when we actually use the sensor readings, they are read inside while loops running at 100 Hz or even slower (at least in all the examples I found), which, I think, throws away almost all the samples. I want to know if I'm the only one bugged by that. It seems like there is a resistance to creating loops that run at higher frequencies.
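
To put numbers on it, here is a rough back-of-the-envelope sketch in Python (just the rates from above, nothing from my actual code):

sensor_rate = 50_000    # Hz, as reported by Get Sample Rate
loop_rate = 100         # Hz, typical example while-loop frequency

samples_per_iteration = sensor_rate / loop_rate      # 500 samples produced per pass
kept_per_iteration = 1                               # one read per loop pass
discarded = 1 - kept_per_iteration / samples_per_iteration
print(f"{discarded:.1%} of the samples are never seen by the loop")  # 99.8%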
Any DSP application would require a higher sampling rate, but when I try creating loops at higher frequencies, such as a second-order low-pass IIR filter in a 2 kHz loop, or even much simpler processing tasks, I find the task is too much for the processor. How come?
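For reference, this is roughly the amount of math I mean per iteration: one second-order (biquad) low-pass update, sketched here in Python rather than LabVIEW, with the 2 kHz rate and 50 Hz cutoff as example numbers only:

# One biquad low-pass update per loop iteration, Direct Form II transposed.
from scipy.signal import butter

fs = 2000.0          # loop rate in Hz (example)
fc = 50.0            # cutoff frequency in Hz (example)
b, a = butter(2, fc / (fs / 2))   # 2nd-order Butterworth coefficients, a[0] = 1

z1 = z2 = 0.0        # filter state (delay elements)

def filter_sample(x):
    """One biquad update; this is all the math each iteration needs."""
    global z1, z2
    y = b[0] * x + z1
    z1 = b[1] * x - a[1] * y + z2
    z2 = b[2] * x - a[2] * y
    return y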
I discovered it by plotting and comparing the actual running time with the time the loop thought had elapsed. When the actual running time was 12 s, the graph displayed something around 5 s (I supplied the graph with the correct deltaT in the cluster, before anyone asks).
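Roughly, the check I did looks like this (a Python sketch, not my LabVIEW code; do_work() is just a placeholder for the processing and the numbers are illustrative):

import time

dt = 1.0 / 2000.0            # intended loop period: 2 kHz
iterations = 0
start = time.monotonic()

def do_work():
    pass                     # placeholder for the per-sample processing

while iterations < 4000:     # nominally 2 s of loop time
    do_work()
    time.sleep(dt)           # stand-in for the loop's wait-until-next-period
    iterations += 1

nominal = iterations * dt            # what the graph thinks elapsed
actual = time.monotonic() - start    # what the wall clock says
print(f"nominal {nominal:.1f} s vs actual {actual:.1f} s")

If each iteration's work (plus timing overhead) takes longer than dt, the nominal time falls behind the wall clock, which is exactly the mismatch I saw on the plot.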
Anyway, I need to understand why it seems I am stuck in this millisecond world while working on a machine as powerful as the cRIO.

Maybe there are some misconceptions in the text above; I would much appreciate any enlightenment on this.
__________________
[<o>] gogogo [<o>]
http://undercontrol1156.com/