Nyquist vs averaging
suppose the following:
1. you have an infinite amount of periodic data available for post-processing.
2. the information of interest is at a frequency higher than your data-acq card can sample.
3. you are able to cut the data into an infinite number of single-cycle data sets.
4. you interpolate each data set using a high-order scheme and double the number of points.
5. you then average an "infinite" number of resampled single-cycle data sets into one "super-sample".
question:
does this "super-sample" contain information beyond what the original sampling rate would normally allow?
does averaging the interpolated data get you a little more?
does this have a name, and is this ever done?
thanks in advance.
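for concreteness, here is a minimal Python/numpy sketch of the procedure in the post above; the sample rate, the test signal, and the use of linear interpolation in place of a high-order scheme are all illustrative assumptions:

    import numpy as np

    fs = 1000.0          # data-acq sample rate (Hz), illustrative
    f0 = 50.0            # fundamental of the periodic signal (Hz)
    cycles = 500         # stand-in for the "infinite" number of cycles
    spc = int(fs / f0)   # samples per cycle = 20 (an integer here, on purpose)

    t = np.arange(cycles * spc) / fs
    x = np.sin(2 * np.pi * f0 * t) + 0.3 * np.sin(2 * np.pi * 7 * f0 * t)

    # step 3: cut into single-cycle records
    records = x.reshape(cycles, spc)

    # step 4: interpolate each record onto twice as many points
    # (linear interp stands in for the "high-order scheme")
    n_old = np.arange(spc)
    n_new = np.linspace(0.0, spc - 1.0, 2 * spc)
    resampled = np.array([np.interp(n_new, n_old, r) for r in records])

    # step 5: average the resampled records into one "super-sample"
    super_sample = resampled.mean(axis=0)

because fs/f0 is an integer here, every cycle is sampled at exactly the same phases, so the "super-sample" is just a smoothed copy of one cycle and contains nothing above the original fs/2. the replies below explain when the answer changes.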
it could, assuming that the single-sample bandwidth is sufficiently high.
many sampling scopes and so-called "boxcar" averagers use this technique. there used to be sampling scopes that sampled at 40 MHz on periodic signals while the scope had 1 GHz analog bandwidth, so they could sample their way into 1 GHz periodic signals. however, if you tried that with a 40 MHz analog bandwidth, you'd get pretty much bupkis.
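as a rough sketch of that equivalent-time trick: sample a fast periodic signal with a slow clock deliberately offset from an integer fraction of the signal period, then fold the sample times back into one period. all numbers below are illustrative, and the sampler is assumed to have full analog bandwidth:

    import numpy as np

    f_sig = 1.0e9      # 1 GHz periodic signal
    f_samp = 39.9e6    # ~40 MHz sample clock; not an integer divisor of
                       # f_sig, so successive samples land at new phases
    n = 2000

    t = np.arange(n) / f_samp
    x = np.sin(2 * np.pi * f_sig * t)   # assumes >= 1 GHz analog bandwidth

    # fold each sample instant into a single signal period and sort:
    # the result is one cycle of the 1 GHz waveform, densely sampled in
    # "equivalent time" despite the ~40 MHz real-time rate
    phase = (t * f_sig) % 1.0
    order = np.argsort(phase)
    one_cycle = x[order]

if the analog path were instead limited to 40 MHz, x would already be missing the 1 GHz content before sampling, which is the "bupkis" case.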
"does this "super-sample" contain information
beyond what the original sampling rate would normally allow? "
also check out zoom FFT
cheers
greg locock
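for reference, a zoom FFT concentrates resolution in a narrow band by mixing that band to baseband, lowpass-decimating, and taking an ordinary FFT; a rough Python sketch with purely illustrative numbers:

    import numpy as np
    from scipy.signal import decimate

    fs = 100e3                               # original sample rate (Hz)
    t = np.arange(2**16) / fs
    x = np.cos(2 * np.pi * 20050.0 * t)      # tone just above 20 kHz

    f_center = 20e3                          # center of the band to zoom into
    bb = x * np.exp(-2j * np.pi * f_center * t)   # mix the band down to 0 Hz

    # lowpass + downsample in two stages (total decimation 64), so the
    # frequency resolution improves 64x for the same FFT length
    y = decimate(decimate(bb, 8, ftype='fir'), 8, ftype='fir')

    spec = np.abs(np.fft.fftshift(np.fft.fft(y)))
    freqs = np.fft.fftshift(np.fft.fftfreq(len(y), d=64 / fs)) + f_center
    print(freqs[np.argmax(spec)])            # ~20050 Hz

note that this buys resolution within the existing bandwidth; it does not recover anything above the original Nyquist limit.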
that would depend on the sampler. for the sampling scopes from about 10 yrs ago, the answer is yes. the issue back then was that there weren't memories fast enough or deep enough to do everything in real time, but a slower memory could be accommodated by a low-speed sampling clock that effectively interleaved samples from later times on a periodic signal, but only if the sampling bandwidth was sufficiently large to accurately capture the high-frequency signal.
under those conditions, the reconstructed signal would accurately represent the signal, but not the noise, as if it had been sampled in real time at the higher effective sampling rate.
eventually, those scopes faded away as memories got fast enough to store the data in real time.
ttfn
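a small sketch of the "signal but not the noise" point, assuming noise that is independent from cycle to cycle (record sizes and noise level are arbitrary):

    import numpy as np

    rng = np.random.default_rng(0)
    spc, n_rec = 64, 400                    # samples per cycle, cycles captured
    clean = np.sin(2 * np.pi * np.arange(spc) / spc)

    # every captured cycle carries the same signal but fresh noise
    records = clean + rng.normal(0.0, 0.5, size=(n_rec, spc))
    avg = records.mean(axis=0)

    print(np.std(records[0] - clean))   # ~0.5: single-cycle noise level
    print(np.std(avg - clean))          # ~0.5/sqrt(400) = 0.025 after averaging

the coherent signal survives the average while independent noise falls off as 1/sqrt(N).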
just rereading the OP, is this a homework question?
cheers
greg locock
thank you all for your input.
bandwidth, i'll look into that; there's a lowpass filter upstream of the data-acq card, which might make this a non-starter.
and this is not a homework problem (but it could be for some naysayer).
i just needed a countervailing opinion to tip the balance-of-merit towards science.
look up bandpass sampling; this technique is commonly used in radio receiver design. interpolation does not, in general, provide new information. you will need to swap your lowpass for a bandpass, since all the Nyquist zones will fold down.
for example, if your analog bandwidth is 500 MHz and your sample rate is 100 Msps, you will have ten Nyquist zones within your analog bandwidth. you could look at data from any one of these zones by using the proper filter. for instance, an IF of 70 MHz (above Nyquist) will fold down to 30 MHz. however, without a bandpass filter there will be ambiguity, since signals at 30 MHz show up at 30 MHz, but so do signals at 130 MHz, 170 MHz, etc. therefore you will need a bandpass filter that has reasonable attenuation at all the frequencies that could alias into your desired output.
peter
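the folding arithmetic above, checked with a short Python sketch (the sample rate and test frequencies come from the post; the alias() helper is hypothetical):

    fs = 100.0e6   # 100 Msps

    def alias(f, fs):
        # frequency observed after real sampling at fs (spectrum folding)
        f = f % fs
        return f if f <= fs / 2 else fs - f

    for f in (30e6, 70e6, 130e6, 170e6):
        print("%5.0f MHz -> %4.0f MHz" % (f / 1e6, alias(f, fs) / 1e6))
    # all four fold to 30 MHz, hence the need for a bandpass filter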