Quote:
|
Originally Posted by Manoel
accuracy. Also, don't forget to sample the signal at twice its frequency to avoid aliasing.
|
Could you elaborate on this a little? Exactly how do the Nyquist theorem, bandwidth, sampling rate, etc. relate to this application? How do you calculate the needed bandwidth for something like this? I don't know much about such things, but I would like to learn.

It seems to me that bandwidth shouldn't be of all that much concern here, since we're not doing anything that relies much on the spectral purity of the data, and the relevant data should be at relatively low frequency; i.e., a physical system such as a robot will only change acceleration so fast under normal conditions. Any data above a certain frequency is probably error, and it seems it would be advantageous not to digitize it. While undersampling would help eliminate high-frequency noise, it also creates problems with aliasing, as you mentioned. It seems to me that too high a bandwidth might actually be a bad thing, because you would be digitizing more of the high-frequency errors along with the relevant data, which should be rather low frequency. Would I be correct in assuming this, or am I totally confused?
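To make the aliasing concern concrete, here's a minimal sketch in Python/NumPy. The numbers are purely hypothetical (a 60 Hz vibration on the sensor, a 50 Hz sampling rate); the point is that undersampling doesn't simply discard high-frequency noise, it folds it down into the low-frequency band where the "real" robot motion lives:

```python
import numpy as np

# Hypothetical numbers, just to illustrate the aliasing concern:
# suppose the robot's genuine acceleration content is below ~10 Hz,
# but the sensor also picks up a 60 Hz vibration.
fs = 50.0        # sampling rate in Hz (below 2 * 60, so the vibration will alias)
f_noise = 60.0   # high-frequency noise component in Hz
t = np.arange(0, 1, 1 / fs)  # one second of sample instants

# Sampled at 50 Hz, the 60 Hz vibration produces exactly the same
# samples as a 10 Hz signal (|60 - 50| = 10 Hz), so it lands right
# in the middle of the band you actually care about.
sampled_noise = np.sin(2 * np.pi * f_noise * t)
alias = np.sin(2 * np.pi * (f_noise - fs) * t)

print(np.allclose(sampled_noise, alias))  # True: the two are indistinguishable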