Author

Shuichi Itoh

Bio: Shuichi Itoh is an academic researcher from the University of Electro-Communications. The author has contributed to research on the topics of wavelets and the wavelet transform. The author has an h-index of 9 and has co-authored 28 publications receiving 497 citations.

Papers
Journal ArticleDOI
TL;DR: In this article, a formula is presented for recovering the original signal from its irregularly sampled values using wavelets; it extends the Walter sampling theorem to the irregular sampling case and generalizes the Paley-Wiener 1/4-theorem by removing the symmetry constraint on the sampling points.
Abstract: SUMMARY A formula for recovering the original signal from its irregularly sampled values using wavelets, which extends the Walter sampling theorem to the irregular sampling case and generalizes the Paley-Wiener 1/4-theorem by removing the symmetry constraint on sampling, is presented. © 1999 Scripta Technica, Electron Comm Jpn Pt 3, 82(5): 65-71, 1999. Key words: wavelet; sampling theorem; scaling function; orthogonality; biorthogonality. 1. Introduction. In digital signal and image processing, digital communications, and so forth, a continuous signal is usually represented and processed by using its discrete samples. How, then, are we to reconstruct the original signal from its discrete samples? The classical Shannon sampling theorem gives the following formula for band-limited finite-energy signals. A finite-energy $\sigma$-band continuous signal $f(t)$, $t \in \mathbb{R}$, that is, one with $\operatorname{supp} \hat{f}(\omega) \subseteq [-\sigma, \sigma]$ and $f \in L^2(\mathbb{R})$, can be recovered by the formula $f(t) = \sum_{n \in \mathbb{Z}} f(n\pi/\sigma)\,\frac{\sin(\sigma t - n\pi)}{\sigma t - n\pi}$, where $\hat{f}$ is the Fourier transform of $f(t)$ defined by $\hat{f}(\omega) = \int_{-\infty}^{\infty} f(t)\, e^{-i\omega t}\, dt$. If we let $\sigma =$ …
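The classical uniform case quoted in this introduction is easy to verify numerically. The following minimal Python sketch (mine, not the paper's; the paper's contribution is the irregular-sampling extension via wavelets) reconstructs a band-limited finite-energy signal from its unit-rate samples, i.e. the Shannon formula with $\sigma = \pi$:

```python
import numpy as np

# Truncated Shannon sinc series with sigma = pi (samples at the integers);
# np.sinc(x) = sin(pi*x)/(pi*x), matching the formula in the abstract.
def shannon_reconstruct(samples, sample_points, t):
    t = np.asarray(t, dtype=float)
    return sum(fn * np.sinc(t - n) for fn, n in zip(samples, sample_points))

n = np.arange(-40, 41)            # unit-rate sample grid, truncated to |n| <= 40
f = lambda t: np.sinc(0.8 * t)    # finite-energy, band-limited to [-0.8*pi, 0.8*pi]
t = np.linspace(-5.0, 5.0, 1001)
err = np.max(np.abs(f(t) - shannon_reconstruct(f(n), n, t)))
print(f"max truncation error on [-5, 5]: {err:.1e}")  # small; shrinks as |n| grows
```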

7 citations

Journal Article
TL;DR: An algorithm for designing a pattern classifier, using the MDL criterion and a binary data structure, is proposed; it partitions the K-dimensional attribute space and gives an estimated probability model for this partitioning.
Abstract: An algorithm for designing a pattern classifier, which uses the MDL criterion and a binary data structure, is proposed. The algorithm gives a partitioning of the K-dimensional attribute space and an estimated probability model for this partitioning. The volume of the bins in this partitioning is asymptotically upper bounded in probability by $O((\log N / N)^{K/(K+2)})$ for large $N$, where $N$ is the length of the training sequence. The redundancy of the code length and the divergence of the estimated model are asymptotically upper bounded by $O(K (\log N / N)^{2/(K+2)})$. The classification error is asymptotically upper bounded by $O(K^{1/2} (\log N / N)^{1/(K+2)})$.
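As a rough illustration of the flavour of such an algorithm, the sketch below grows a binary, axis-aligned partition of the attribute space and stops splitting a bin once a two-part MDL code length (data bits plus parameter bits) stops decreasing. All details here (the code-length assignment, the median split, the stopping rule) are my assumptions for illustration, not the paper's construction:

```python
import numpy as np

def code_length(labels, n_classes):
    """Bits for class labels in one bin: empirical entropy + MDL parameter cost."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = np.bincount(labels, minlength=n_classes) / n
    p = p[p > 0]
    return -n * np.sum(p * np.log2(p)) + 0.5 * (n_classes - 1) * np.log2(n)

def grow(X, y, n_classes, depth=0, max_depth=12):
    """Recursive binary partition; leaves hold estimated class probabilities."""
    axis = depth % X.shape[1]                    # cycle through the K attributes
    cut = np.median(X[:, axis])
    left = X[:, axis] <= cut
    split_bits = (code_length(y[left], n_classes) + code_length(y[~left], n_classes)
                  + np.log2(max(len(y), 2)))     # crude cost of describing the split
    if depth >= max_depth or left.all() or (~left).all() \
            or split_bits >= code_length(y, n_classes):
        return {"leaf": np.bincount(y, minlength=n_classes) / max(len(y), 1)}
    return {"axis": axis, "cut": cut,
            "lo": grow(X[left], y[left], n_classes, depth + 1, max_depth),
            "hi": grow(X[~left], y[~left], n_classes, depth + 1, max_depth)}

def classify(tree, x):
    while "leaf" not in tree:
        tree = tree["lo"] if x[tree["axis"]] <= tree["cut"] else tree["hi"]
    return int(np.argmax(tree["leaf"]))

rng = np.random.default_rng(0)
X = rng.uniform(size=(2000, 2))                  # N = 2000 samples, K = 2 attributes
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)        # two classes
tree = grow(X, y, n_classes=2)
print("training accuracy:",
      np.mean([classify(tree, x) == c for x, c in zip(X, y)]))
```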

4 citations

Proceedings ArticleDOI
27 Oct 1994
TL;DR: A new coding strategy is proposed by which the desired quality of the reproduced signal can be guaranteed at the minimum coding rate.
Abstract: This paper proposes a new coding strategy by which the desired quality of the reproduced signal can be guaranteed at the minimum coding rate. The idea was successfully introduced into a DOWT-based coding system for the ECG compression application.
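The strategy can be sketched in a few lines: transform, then keep the most significant coefficients, fewest first, until the distortion constraint is met. The sketch below (my illustration, not the authors' coder) uses an orthonormal Haar transform and PRD (percent RMS difference), a common ECG distortion measure; orthonormality lets the distortion be tracked directly from the dropped coefficient energy:

```python
import numpy as np

def haar_fwd(x):
    """Full orthonormal Haar analysis; len(x) must be a power of two."""
    x = np.asarray(x, dtype=float).copy()
    details, n = [], len(x)
    while n > 1:
        a = (x[0:n:2] + x[1:n:2]) / np.sqrt(2.0)    # approximation
        d = (x[0:n:2] - x[1:n:2]) / np.sqrt(2.0)    # detail
        details.append(d)
        x[: n // 2] = a
        n //= 2
    return np.concatenate([x[:1]] + details[::-1])  # [approx, coarse ... fine]

def haar_inv(c):
    """Inverse of haar_fwd."""
    x, pos, n = c[:1].astype(float), 1, 1
    while pos < len(c):
        d = c[pos:pos + n]
        y = np.empty(2 * n)
        y[0::2] = (x + d) / np.sqrt(2.0)
        y[1::2] = (x - d) / np.sqrt(2.0)
        x, pos, n = y, pos + n, 2 * n
    return x

def compress_to_prd(x, prd_target):
    """Keep largest-magnitude coefficients, fewest first, until PRD <= target."""
    c = haar_fwd(x)
    order = np.argsort(-np.abs(c))
    energy = np.sum(c ** 2)
    dropped = energy
    for k, idx in enumerate(order, start=1):
        dropped -= c[idx] ** 2
        # Orthonormal transform: error energy equals dropped coefficient energy.
        prd = 100.0 * np.sqrt(max(dropped, 0.0) / energy)
        if prd <= prd_target:
            kept = np.zeros_like(c)
            kept[order[:k]] = c[order[:k]]
            return haar_inv(kept), k, prd

t = np.linspace(0.0, 1.0, 512, endpoint=False)
x = np.sin(2 * np.pi * 5 * t) + 0.3 * np.sin(2 * np.pi * 40 * t)  # toy waveform
xr, k, prd = compress_to_prd(x, prd_target=5.0)
print(f"kept {k}/512 coefficients, PRD = {prd:.2f}% (target 5%)")
```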

3 citations


Cited by
Journal ArticleDOI
01 Apr 2000
TL;DR: The standard sampling paradigm is extended to functions in the more general class of "shift-invariant" function spaces, including splines and wavelets, and variations of sampling that can be understood from the same unifying perspective are reviewed.
Abstract: This paper presents an account of the current state of sampling, 50 years after Shannon's formulation of the sampling theorem. The emphasis is on regular sampling, where the grid is uniform. This topic has benefitted from a strong research revival during the past few years, thanks in part to the mathematical connections that were made with wavelet theory. To introduce the reader to the modern, Hilbert-space formulation, we reinterpret Shannon's sampling procedure as an orthogonal projection onto the subspace of band-limited functions. We then extend the standard sampling paradigm for a presentation of functions in the more general class of "shift-invariant" function spaces, including splines and wavelets. Practically, this allows for simpler (and possibly more realistic) interpolation models, which can be used in conjunction with a much wider class of (anti-aliasing) prefilters that are not necessarily ideal low-pass. We summarize and discuss the results available for the determination of the approximation error and of the sampling rate when the input of the system is essentially arbitrary; e.g., nonbandlimited. We also review variations of sampling that can be understood from the same unifying perspective. These include wavelets, multiwavelets, Papoulis generalized sampling, finite elements, and frames. Irregular sampling and radial basis functions are briefly mentioned.
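The orthogonal-projection view is easy to demonstrate numerically. In the discrete sketch below (construction mine, not the paper's), the subspace is the set of "half-band" signals; a signal in that subspace is recovered exactly from its every-other samples by zero-stuffing and projecting away the spectral image:

```python
import numpy as np

def project_halfband(x):
    """Orthogonal projection: zero all DFT bins with |frequency| >= 1/4 cycle/sample."""
    X = np.fft.fft(x)
    X[np.abs(np.fft.fftfreq(len(x))) >= 0.25] = 0.0
    return np.fft.ifft(X).real

def reconstruct_from_half(samples, N):
    """Zero-stuff to length N (gain 2), then project away the spectral image."""
    up = np.zeros(N)
    up[::2] = 2.0 * samples
    return project_halfband(up)

rng = np.random.default_rng(1)
N = 256
x = project_halfband(rng.standard_normal(N))  # a signal lying in the subspace
xr = reconstruct_from_half(x[::2], N)
print(f"max reconstruction error: {np.max(np.abs(x - xr)):.1e}")  # ~ 1e-15
```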

1,461 citations

Journal ArticleDOI
TL;DR: In this review, the emerging role of the wavelet transform in the interrogation of the ECG is discussed in detail, where both the continuous and the discrete transform are considered in turn.
Abstract: The wavelet transform has emerged over recent years as a powerful time-frequency analysis and signal coding tool favoured for the interrogation of complex nonstationary signals. Its application to biosignal processing has been at the forefront of these developments where it has been found particularly useful in the study of these, often problematic, signals: none more so than the ECG. In this review, the emerging role of the wavelet transform in the interrogation of the ECG is discussed in detail, where both the continuous and the discrete transform are considered in turn.
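As a small taste of the continuous transform discussed in the review, the sketch below (my illustration, with an assumed ECG-like sampling rate) computes a complex Morlet CWT by direct convolution; larger scales respond to lower frequencies, which is what gives the time-frequency picture used for ECG interrogation:

```python
import numpy as np

def morlet_cwt(x, scales, w0=6.0):
    """CWT by direct convolution; rows = scales, columns = time positions."""
    out = np.empty((len(scales), len(x)), dtype=complex)
    for i, s in enumerate(scales):
        t = np.arange(-4 * s, 4 * s + 1, dtype=float)          # ~8-scale support
        psi = np.pi ** -0.25 * np.exp(1j * w0 * t / s - (t / s) ** 2 / 2)
        out[i] = np.convolve(x, np.conj(psi[::-1]), mode="same") / np.sqrt(s)
    return out

fs = 250.0                                        # assumed ECG-like sampling rate
t = np.arange(0.0, 2.0, 1.0 / fs)
x = np.sin(2 * np.pi * 8 * t) + (t > 1.0) * np.sin(2 * np.pi * 25 * t)
scales = np.arange(2, 40)
W = morlet_cwt(x, scales)
print("dominant scale:", scales[np.argmax(np.sum(np.abs(W) ** 2, axis=1))])
```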

794 citations

Journal ArticleDOI
TL;DR: A unified framework for uniform and nonuniform sampling and reconstruction in shift-invariant subspaces is provided by bringing together wavelet theory, frame theory, reproducing kernel Hilbert spaces, approximation theory, amalgam spaces, and sampling.
Abstract: This article discusses modern techniques for nonuniform sampling and reconstruction of functions in shift-invariant spaces. It is a survey as well as a research paper and provides a unified framework for uniform and nonuniform sampling and reconstruction in shift-invariant subspaces by bringing together wavelet theory, frame theory, reproducing kernel Hilbert spaces, approximation theory, amalgam spaces, and sampling. Inspired by applications taken from communication, astronomy, and medicine, the following aspects will be emphasized: (a) The sampling problem is well defined within the setting of shift-invariant spaces. (b) The general theory works in arbitrary dimension and for a broad class of generators. (c) The reconstruction of a function from any sufficiently dense nonuniform sampling set is obtained by efficient iterative algorithms. These algorithms converge geometrically and are robust in the presence of noise. (d) To model the natural decay conditions of real signals and images, the sampling theory is developed in weighted $L^p$-spaces.
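Point (c) can be illustrated with a toy version of such an iteration (the operator choices below are my simplifications, not the survey's algorithms): alternately correct the current estimate at the nonuniform sample points and project back into the subspace:

```python
import numpy as np

def project_halfband(x):
    """Orthogonal projection onto a toy shift-invariant subspace (half-band signals)."""
    X = np.fft.fft(x)
    X[np.abs(np.fft.fftfreq(len(x))) >= 0.25] = 0.0
    return np.fft.ifft(X).real

def iterative_reconstruct(idx, vals, N, n_iter=500):
    """x <- P(x + residual-at-samples); converges geometrically for dense enough idx."""
    x = np.zeros(N)
    for _ in range(n_iter):
        e = np.zeros(N)
        e[idx] = vals - x[idx]        # residual at the nonuniform sample points
        x = project_halfband(x + e)   # correct, then project back into the subspace
    return x

rng = np.random.default_rng(2)
N = 256
f = project_halfband(rng.standard_normal(N))           # ground truth in the subspace
idx = np.sort(rng.choice(N, size=160, replace=False))  # nonuniform sampling set
fr = iterative_reconstruct(idx, f[idx], N)
print(f"relative error: {np.linalg.norm(f - fr) / np.linalg.norm(f):.1e}")
```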

762 citations

Journal ArticleDOI
J.D. Gibson
01 Apr 1987

385 citations

Proceedings ArticleDOI
24 Aug 2008
TL;DR: This work shows how a novel multi-resolution symbolic representation can be used to index datasets which are several orders of magnitude larger than anything else considered in the literature, allowing for the exact mining of truly massive real-world datasets.
Abstract: Current research in indexing and mining time series data has produced many interesting algorithms and representations. However, the algorithms and the size of data considered have generally not been representative of the increasingly massive datasets encountered in science, engineering, and business domains. In this work, we show how a novel multi-resolution symbolic representation can be used to index datasets which are several orders of magnitude larger than anything else considered in the literature. Our approach allows both fast exact search and ultra-fast approximate search. We show how to exploit the combination of both types of search as sub-routines in data mining algorithms, allowing for the exact mining of truly massive real-world datasets, containing millions of time series.
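The symbolic representation underlying such an index can be sketched in a few lines of SAX-style code (the multi-resolution, variable-cardinality indexing layer itself is omitted; the details here are assumptions for illustration): z-normalise, reduce with piecewise aggregate approximation, then quantise against equal-probability Gaussian breakpoints:

```python
import numpy as np

# Breakpoints cutting N(0, 1) into four equal-probability regions (alphabet size 4).
BREAKPOINTS = np.array([-0.6745, 0.0, 0.6745])

def sax_word(series, n_segments):
    """Symbolise a time series: z-normalise -> PAA -> quantise to symbol ids 0..3."""
    x = np.asarray(series, dtype=float)
    x = (x - x.mean()) / (x.std() + 1e-12)          # z-normalise
    paa = x.reshape(n_segments, -1).mean(axis=1)    # length must divide evenly
    return np.searchsorted(BREAKPOINTS, paa)

t = np.linspace(0.0, 1.0, 64, endpoint=False)
print(sax_word(np.sin(2 * np.pi * t), n_segments=8))  # eight symbol ids in 0..3
```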

375 citations