Journal ArticleDOI

Sampled signal reconstruction and Nyquist frequency

01 Sep 1996-IEEE Signal Processing Magazine (IEEE)-Vol. 13, Iss: 5, pp 26
TL;DR: The proposed method is nothing more than an analog form of the well-known filter banks, whose theory is consistent with the (correct) definition of the Nyquist frequency; in general, the method cannot be used with the zero-order hold operation unless the filters are designed with frequency-dependent magnitude to compensate for the spectrum alteration produced by the sample-and-hold operation.
Abstract: The discussion focuses only on those remarks of J. Lynn Smith (see ibid., p.14, May 1996) that were directly related to earlier comments (see ibid., p.41, July 1995). It complements Marple's (see ibid., p.24, January 1996) filter-bank analysis and tries to remove some clutter due to the improper test signals used. The frame of this discussion consists of the following three statements: the "successful" use of a sampling rate lower than the Nyquist rate is an illusion; the proposed method is nothing more than an analog form of the well-known filter banks, whose theory is consistent with the (correct) definition of the Nyquist frequency; and in general, the method cannot be used with the zero-order hold operation, unless the filters are designed with frequency-dependent magnitude to compensate for the spectrum alteration produced by the sample-and-hold operation.
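The zero-order-hold caveat in the abstract can be made concrete. The sketch below is an illustration under assumed parameters (a 1 kHz sampling rate), not code from the article: it computes the sinc-shaped droop a zero-order hold imposes on the spectrum, and the inverse-sinc gain a compensating filter would need to restore a flat passband.

```python
# Minimal sketch (assumed parameters, not from the article): a zero-order
# hold of width T = 1/fs multiplies the spectrum by a sinc envelope, so a
# reconstruction filter must apply the inverse-sinc gain to flatten it.
import numpy as np

fs = 1000.0                        # assumed sampling rate in Hz
f = np.linspace(0.0, fs / 2, 512)  # frequencies up to the Nyquist frequency

# ZOH magnitude response |sin(pi f T)/(pi f T)| with T = 1/fs.
# np.sinc(x) computes sin(pi x)/(pi x), so np.sinc(f/fs) is exactly this.
zoh_gain = np.abs(np.sinc(f / fs))

# Frequency-dependent compensation: boost each frequency by the inverse of
# the sinc droop so the cascade (ZOH * filter) is flat across the passband.
compensation = 1.0 / zoh_gain

# At the Nyquist frequency the droop is 2/pi ~ 0.637, about -3.9 dB,
# which the compensating filter must make up.
print("ZOH gain at Nyquist: %.3f (%.1f dB)"
      % (zoh_gain[-1], 20 * np.log10(zoh_gain[-1])))
print("Required boost at Nyquist: %.3f" % compensation[-1])
```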
References
Journal ArticleDOI
TL;DR: Signal processing analysis and simulation software tools should be used knowledgeably for purposes of productivity enhancement, and should not be used blindly, without the capability to determine when the answer provided by the tool "looks right."
Abstract: Restoring the Nyquist Barrier. "Results of data analyzed by software simulation tools are meaningless." This was my first impression after reading the SP Lite article "Breaking the Nyquist Barrier" by Lynn Smith in the July 1995 issue [1]. This article contains a number of fundamental conceptual errors upon which I shall comment. The author has also rediscovered filter banks, despite extensive published art on this topic. However, beyond these conceptual and rediscovery issues, I was most struck by the author's dependence on a software simulation tool to justify his erroneous conclusions, without an apparent full understanding of the graphical results that the tool produced.

The Smith article reinforces a concern that I have been expressing to my colleagues in academia regarding the extensive use of DSP software simulation tools in virtual signal environments as a means of teaching signal processing. A selection of DSP software tools was highlighted in the article by Ebel and Younan [7] that appeared in the November 1995 IEEE Signal Processing Magazine, an issue dedicated, coincidentally, to signal processing education. Specifically, there appears to be a growing dependence on these tools, with canned experiments, that fails to adequately prepare many students for solving real-world signal processing problems. This is most manifest during technical interviews that I often conduct with new graduates who are candidates for employment. Without access to software tools during the interview, I have observed with increasing incidence that these graduates, when presented with situations involving typical signal processing applications of importance to my employer, are unable to confidently propose signal processing operations using only knowledge of basic signal processing principles. The most evident difficulty has been their inability to relate properties of continuous time-domain and spatial-domain signals to discrete-domain digital representations of, and operations on, those signals. Mathematical normalization of parameters (for example, the assumption of a unity sampling rate, or expressing frequency in radian units), often utilized in academic treatments of signal processing operations, also handicaps students in forming an intuitive sense of time and frequency scale when confronted with actual signals and their transforms.

Signal processing analysis and simulation software tools should be used knowledgeably for purposes of productivity enhancement, and should not be used blindly, without the capability to determine when the answer provided by the tool "looks right." This viewpoint is reminiscent of the debate concerning the introduction of hand calculators in public schools, in which it was argued whether calculators should be used by students as a substitute before learning the mathematical operations they perform, or only as productivity aids after the students had gained substantial experience with those operations.

I would now like to demonstrate, from first principles, the "restoration" of the Nyquist barrier for the demonstration signal used in the Smith article [1] by showing that it was never broken in the first place. I will do this armed only with four basic waveforms (depicted in Fig. 1), their transforms, and two variants of the convolution theorem.
Specifically, if x(t) ↔ X(f) designates the Fourier transform relationship between the temporal waveform x(t) and its Fourier transform X(f), while y(t) ↔ Y(f) designates the Fourier transform relationship between …
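The folding that Marple's first-principles argument rests on is easy to verify numerically. The following sketch is my illustration, with an assumed 100 Hz sampling rate, not material from the letter: it shows that a tone above the Nyquist frequency produces exactly the same samples as its alias below it, so the barrier was never broken.

```python
# Minimal sketch (assumed parameters, not Marple's derivation): sampling at
# fs folds any component at f > fs/2 onto an alias |f - k*fs| below the
# Nyquist frequency, making the two tones indistinguishable from samples.
import numpy as np

fs = 100.0                 # assumed sampling rate in Hz
n = np.arange(32)          # sample indices
f_true = 70.0              # tone above the Nyquist frequency fs/2 = 50 Hz
f_alias = fs - f_true      # folds onto 30 Hz

x_true = np.cos(2 * np.pi * f_true * n / fs)
x_alias = np.cos(2 * np.pi * f_alias * n / fs)

# The two sampled sequences agree to machine precision: the 70 Hz tone
# simply masquerades as a 30 Hz tone once sampled.
print("max sample difference:", np.max(np.abs(x_true - x_alias)))
```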

5 citations

Journal ArticleDOI
TL;DR: A simple signal processing scheme to extend resolution to nearly twice the Nyquist frequency without aliasing emerged by placing one-dimensional signals into a simulated sample-and-hold process in a Mathcad application.
Abstract: The author discusses the notion that data sampled beyond the Nyquist frequency is meaningless. A simple signal processing scheme emerged to extend resolution to nearly twice the Nyquist frequency without the problem of aliasing. It starts by placing one-dimensional signals into a simulated sample-and-hold process in a Mathcad application.
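For readers without Mathcad, a simulated sample-and-hold of the kind the abstract describes can be sketched in a few lines of Python. This is a stand-in under assumed parameters, not Smith's worksheet: each sample is held constant over one sample period, producing the staircase waveform fed to the processing scheme.

```python
# Minimal sketch (assumed parameters, not Smith's Mathcad worksheet) of a
# simulated sample-and-hold: sample a tone, then hold each sample constant
# for one sample period on a finer time grid.
import numpy as np

fs = 100.0                           # assumed sampling rate in Hz
hold_factor = 8                      # fine-grid points per sample period
t_fine = np.arange(256) / (fs * hold_factor)

f0 = 12.0                            # assumed test tone, well below fs/2
x_fine = np.sin(2 * np.pi * f0 * t_fine)

# Sample-and-hold: keep every hold_factor-th point, then repeat it to fill
# the sample period, yielding the zero-order-hold staircase.
samples = x_fine[::hold_factor]
x_held = np.repeat(samples, hold_factor)

print("fine-grid points:", x_fine.size, " held staircase points:", x_held.size)
```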

4 citations