
Showing papers on "Wavelet published in 2000"


Journal ArticleDOI
TL;DR: An adaptive, data-driven threshold for image denoising via wavelet soft-thresholding is derived in a Bayesian framework, with the prior on the wavelet coefficients being the generalized Gaussian distribution widely used in image processing applications.
Abstract: The first part of this paper proposes an adaptive, data-driven threshold for image denoising via wavelet soft-thresholding. The threshold is derived in a Bayesian framework, and the prior used on the wavelet coefficients is the generalized Gaussian distribution (GGD) widely used in image processing applications. The proposed threshold is simple and closed-form, and it is adaptive to each subband because it depends on data-driven estimates of the parameters. Experimental results show that the proposed method, called BayesShrink, is typically within 5% of the MSE of the best soft-thresholding benchmark with the image assumed known. It also outperforms SureShrink (Donoho and Johnstone 1994, 1995; Donoho 1995) most of the time. The second part of the paper attempts to further validate claims that lossy compression can be used for denoising. The BayesShrink threshold can aid in the parameter selection of a coder designed with the intention of denoising, and thus achieving simultaneous denoising and compression. Specifically, the zero-zone in the quantization step of compression is analogous to the threshold value in the thresholding function. The remaining coder design parameters are chosen based on a criterion derived from Rissanen's minimum description length (MDL) principle. Experiments show that this compression method does indeed remove noise significantly, especially for large noise power. However, it introduces quantization noise and should be used only if bitrate were an additional concern to denoising.

2,917 citations
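
The BayesShrink rule summarized in the abstract above admits a compact illustration: estimate the noise level from the finest diagonal subband, set each subband's threshold to sigma^2 / sigma_x, and soft-threshold. The Python sketch below uses PyWavelets; the function name, wavelet choice, and decomposition depth are illustrative assumptions, not the authors' code.

```python
import numpy as np
import pywt

def bayes_shrink_denoise(image, wavelet="db4", level=3):
    """Hedged sketch of BayesShrink-style soft-thresholding (per-subband T = sigma^2 / sigma_x)."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    # Noise std estimated from the finest diagonal subband via the robust MAD estimate.
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    out = [coeffs[0]]  # keep the approximation subband untouched
    for detail in coeffs[1:]:
        shrunk = []
        for band in detail:
            # Signal std: sigma_x^2 = max(var(band) - sigma^2, 0)
            sigma_x = np.sqrt(max(band.var() - sigma ** 2, 0.0))
            # If sigma_x is zero, the subband is treated as pure noise and thresholded away.
            t = sigma ** 2 / sigma_x if sigma_x > 0 else np.abs(band).max()
            shrunk.append(pywt.threshold(band, t, mode="soft"))
        out.append(tuple(shrunk))
    return pywt.waverec2(out, wavelet)
```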


Journal ArticleDOI
TL;DR: A universal statistical model for texture images in the context of an overcomplete complex wavelet transform is presented, demonstrating the necessity of subgroups of the parameter set by showing examples of texture synthesis that fail when those parameters are removed from the set.
Abstract: We present a universal statistical model for texture images in the context of an overcomplete complex wavelet transform. The model is parameterized by a set of statistics computed on pairs of coefficients corresponding to basis functions at adjacent spatial locations, orientations, and scales. We develop an efficient algorithm for synthesizing random images subject to these constraints, by iteratively projecting onto the set of images satisfying each constraint, and we use this to test the perceptual validity of the model. In particular, we demonstrate the necessity of subgroups of the parameter set by showing examples of texture synthesis that fail when those parameters are removed from the set. We also demonstrate the power of our model by successfully synthesizing examples drawn from a diverse collection of artificial and natural textures.

1,978 citations


Journal ArticleDOI
01 Apr 2000
TL;DR: The standard sampling paradigm is extended for a representation of functions in the more general class of "shift-invariant" function spaces, including splines and wavelets, and variations of sampling that can be understood from the same unifying perspective are reviewed.
Abstract: This paper presents an account of the current state of sampling, 50 years after Shannon's formulation of the sampling theorem. The emphasis is on regular sampling, where the grid is uniform. This topic has benefitted from a strong research revival during the past few years, thanks in part to the mathematical connections that were made with wavelet theory. To introduce the reader to the modern, Hilbert-space formulation, we reinterpret Shannon's sampling procedure as an orthogonal projection onto the subspace of band-limited functions. We then extend the standard sampling paradigm for a representation of functions in the more general class of "shift-invariant" function spaces, including splines and wavelets. Practically, this allows for simpler, and possibly more realistic, interpolation models, which can be used in conjunction with a much wider class of (anti-aliasing) prefilters that are not necessarily ideal low-pass. We summarize and discuss the results available for the determination of the approximation error and of the sampling rate when the input of the system is essentially arbitrary; e.g., nonbandlimited. We also review variations of sampling that can be understood from the same unifying perspective. These include wavelets, multiwavelets, Papoulis generalized sampling, finite elements, and frames. Irregular sampling and radial basis functions are briefly mentioned.

1,461 citations


Journal ArticleDOI
TL;DR: A spatially adaptive wavelet thresholding method based on context modeling, a common technique used in image compression to adapt the coder to changing image characteristics, which yields significantly superior image quality and lower MSE than the best uniform thresholding with the original image assumed known.
Abstract: The method of wavelet thresholding for removing noise, or denoising, has been researched extensively due to its effectiveness and simplicity. Much of the literature has focused on developing the best uniform threshold or best basis selection. However, not much has been done to make the threshold values adaptive to the spatially changing statistics of images. Such adaptivity can improve the wavelet thresholding performance because it allows additional local information of the image (such as the identification of smooth or edge regions) to be incorporated into the algorithm. This work proposes a spatially adaptive wavelet thresholding method based on context modeling, a common technique used in image compression to adapt the coder to changing image characteristics. Each wavelet coefficient is modeled as a random variable of a generalized Gaussian distribution with an unknown parameter. Context modeling is used to estimate the parameter for each coefficient, which is then used to adapt the thresholding strategy. This spatially adaptive thresholding is extended to the overcomplete wavelet expansion, which yields better results than the orthogonal transform. Experimental results show that spatially adaptive wavelet thresholding yields significantly superior image quality and lower MSE than the best uniform thresholding with the original image assumed known.

875 citations


Journal ArticleDOI
TL;DR: In this paper, a denoising method based on wavelet analysis, an advanced version of the well-known soft-thresholding denoising method proposed by Donoho and Johnstone, is applied to feature extraction for mechanical vibration signals.

823 citations


Proceedings ArticleDOI
01 Jul 2000
TL;DR: A new progressive compression scheme is proposed for arbitrary topology, highly detailed and densely sampled meshes arising from geometry scanning; coupled with semi-regular wavelet transforms, zerotree coding, and subdivision-based reconstruction, it achieves improvements in error by a factor of four (12 dB) compared to other progressive coding schemes.
Abstract: We propose a new progressive compression scheme for arbitrary topology, highly detailed and densely sampled meshes arising from geometry scanning. We observe that meshes consist of three distinct components: geometry, parameter, and connectivity information. The latter two do not contribute to the reduction of error in a compression setting. Using semi-regular meshes, parameter and connectivity information can be virtually eliminated. Coupled with semi-regular wavelet transforms, zerotree coding, and subdivision based reconstruction we see improvements in error by a factor four (12dB) compared to other progressive coding schemes.

630 citations


Journal ArticleDOI
TL;DR: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed and is significantly more efficient in compression and in computation than previously proposed ECG compression schemes.
Abstract: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed in this paper. The SPIHT algorithm (A. Said and W.A. Pearlman, IEEE Trans. Ccts. Syst. II, vol. 6, p. 243-50, 1996) has achieved notable success in still image coding. The authors modified the algorithm for the one-dimensional case and applied it to compression of ECG data. Experiments on selected records from the MIT-BIH arrhythmia database revealed that the proposed codec is significantly more efficient in compression and in computation than previously proposed ECG compression schemes. The coder also attains exact bit rate control and generates a bit stream progressive in quality or rate.

521 citations


Journal ArticleDOI
TL;DR: The wavelet packet transform (WPT) is introduced as an alternative means of extracting time-frequency information from vibration signatures and significantly reduces the long training time that is often associated with the neural network classifier and improves its generalization capability.
Abstract: Condition monitoring of dynamic systems based on vibration signatures has generally relied upon Fourier-based analysis as a means of translating vibration signals in the time domain into the frequency domain. However, Fourier analysis provides a poor representation of signals well localized in time. In this case, it is difficult to detect and identify the signal pattern from the expansion coefficients because the information is diluted across the whole basis. The wavelet packet transform (WPT) is introduced as an alternative means of extracting time-frequency information from vibration signatures. The resulting WPT coefficients provide one with arbitrary time-frequency resolution of a signal. With the aid of statistical-based feature selection criteria, many of the feature components containing little discriminant information could be discarded, resulting in a feature subset having a reduced number of parameters without compromising the classification performance. The extracted reduced dimensional feature vector is then used as input to a neural network classifier. This significantly reduces the long training time that is often associated with the neural network classifier and improves its generalization capability.

515 citations
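
As a rough sketch of the feature-extraction stage described above, the code below computes wavelet-packet subband energies and ranks them with a simple between-class/within-class variance ratio. The ranking criterion, wavelet, depth, and all names are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np
import pywt

def wpt_energy_features(signal, wavelet="db4", level=4):
    """Energy of each terminal wavelet-packet node (frequency-ordered)."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="freq")
    return np.array([np.sum(np.square(node.data)) for node in nodes])

def select_features(features, labels, keep=8):
    """Keep the `keep` components with the largest between-class / within-class variance ratio."""
    features = np.asarray(features)      # shape: (n_signals, n_features)
    labels = np.asarray(labels)
    classes = np.unique(labels)
    means = np.array([features[labels == c].mean(axis=0) for c in classes])
    within = np.array([features[labels == c].var(axis=0) for c in classes]).mean(axis=0)
    score = means.var(axis=0) / (within + 1e-12)
    return np.argsort(score)[::-1][:keep]  # indices of the retained feature components
```

The retained energy components would then feed the neural network classifier described in the abstract.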


Journal ArticleDOI
TL;DR: In this paper, a wavelet-based approach is proposed for structural damage detection and health monitoring using simulation data generated from a simple structural model subjected to a harmonic excitation, which consists of multiple breakable springs and may suffer irreversible damage when the response exceeds a threshold value or the number of cycles of motion is accumulated beyond their fatigue life.
Abstract: A wavelet-based approach is proposed for structural damage detection and health monitoring. Characteristics of representative vibration signals under the wavelet transformation are examined. The methodology is then applied to simulation data generated from a simple structural model subjected to a harmonic excitation. The model consists of multiple breakable springs, some of which may suffer irreversible damage when the response exceeds a threshold value or the number of cycles of motion is accumulated beyond their fatigue life. In cases of either abrupt or accumulative damages, occurrence of damage and the moment when it occurs can be clearly determined in the details of the wavelet decomposition of these data. Similar results are observed for the real acceleration data of the seismic response recorded on the roof of a building during the 1971 San Fernando earthquake. Effects of noise intensity and damage severity are investigated and presented by a detectability map. Results show the great promise of the wavelet approach for damage detection and structural health monitoring.

474 citations
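
A minimal sketch of the detection idea, assuming abrupt damage shows up as a localized spike in the finest-scale detail coefficients of the response; the threshold rule, wavelet, and names are illustrative, not the authors' detectability criterion.

```python
import numpy as np
import pywt

def detect_damage_instants(response, wavelet="db4", k=5.0):
    """Flag samples where level-1 detail coefficients exceed k robust standard deviations."""
    _, d1 = pywt.dwt(response, wavelet)           # single-level wavelet decomposition
    sigma = np.median(np.abs(d1)) / 0.6745        # robust spread of the detail coefficients
    spikes = np.where(np.abs(d1) > k * sigma)[0]
    return 2 * spikes                             # detail index -> approximate sample index
```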


Proceedings ArticleDOI
05 Apr 2000
TL;DR: In this paper, a strategy for computing a digital curvelet transform is described, together with a software environment, Curvelet 256, implementing this strategy for 256 × 256 images, and some experiments conducted using it.
Abstract: Recently, Candes and Donoho introduced the curvelet transform, a new multiscale representation suited for objects which are smooth away from discontinuities across curves. Their proposal was intended for functions f defined on the continuum plane R^2. In this paper, we consider the problem of realizing this transform for digital data. We describe a strategy for computing a digital curvelet transform, we describe a software environment, Curvelet 256, implementing this strategy in the case of 256 × 256 images, and we describe some experiments we have conducted using it. Examples are available for viewing by web browser.

419 citations


Journal ArticleDOI
TL;DR: At low bit rates, reversible integer-to-integer and conventional versions of transforms were found to often yield results of comparable quality, with the best choice for a given application depending on the relative importance of the preceding criteria.
Abstract: In the context of image coding, a number of reversible integer-to-integer wavelet transforms are compared on the basis of their lossy compression performance, lossless compression performance, and computational complexity. Of the transforms considered, several were found to perform particularly well, with the best choice for a given application depending on the relative importance of the preceding criteria. Reversible integer-to-integer versions of numerous transforms are also compared to their conventional (i.e., nonreversible real-to-real) counterparts for lossy compression. At low bit rates, reversible integer-to-integer and conventional versions of transforms were found to often yield results of comparable quality. Factors affecting the compression performance of reversible integer-to-integer wavelet transforms are also presented, supported by both experimental data and theoretical arguments.
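
For context, the simplest reversible integer-to-integer transform in this family is the S-transform, an integer Haar realized by lifting with floor rounding. The sketch below, with illustrative names, shows why rounding inside the lifting steps does not break invertibility: the same floor-rounded quantity is added in the forward transform and subtracted in the inverse.

```python
import numpy as np

def s_transform_forward(x):
    """Integer Haar (S-transform) via lifting; x must be integer-valued with even length."""
    even, odd = x[0::2].astype(np.int64), x[1::2].astype(np.int64)
    d = odd - even                      # predict step: detail coefficients
    s = even + d // 2                   # update step with floor division: approximation
    return s, d

def s_transform_inverse(s, d):
    """Exact inverse: undo the update, then the predict step."""
    even = s - d // 2
    odd = d + even
    out = np.empty(even.size + odd.size, dtype=np.int64)
    out[0::2], out[1::2] = even, odd
    return out
```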

Book ChapterDOI
01 Jan 2000
TL;DR: A practical overview is given of three simple techniques to construct wavelets under general circumstances: interpolating subdivision, average interpolation, and lifting.
Abstract: We give a practical overview of three simple techniques to construct wavelets under general circumstances: interpolating subdivision, average interpolation, and lifting. We include examples concerning the construction of wavelets on an interval, weighted wavelets, and wavelets adapted to irregular samples.
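
A minimal sketch of one lifting step with a linear two-neighbor predictor and a simple update, in the spirit of the interpolating constructions discussed above; the periodic boundary handling and the specific coefficients (a CDF(2,2)-style step) are illustrative assumptions.

```python
import numpy as np

def lifting_forward(x):
    """One lifting step: split, predict the odds from neighboring evens, update the evens.
    Periodic boundary handling; a sketch of a single step, not a full multilevel transform."""
    even, odd = x[0::2].copy(), x[1::2].copy()
    # Predict: each odd sample is estimated by the average of its two even neighbors.
    d = odd - 0.5 * (even + np.roll(even, -1))
    # Update: the evens are corrected so the coarse signal keeps the local mean.
    s = even + 0.25 * (d + np.roll(d, 1))
    return s, d

def lifting_inverse(s, d):
    """Exact inverse: undo the update, then the predict, then merge."""
    even = s - 0.25 * (d + np.roll(d, 1))
    odd = d + 0.5 * (even + np.roll(even, -1))
    x = np.empty(even.size + odd.size)
    x[0::2], x[1::2] = even, odd
    return x
```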

Journal ArticleDOI
TL;DR: In this article, the wavelet transform is applied to rainfall rates and runoffs measured at different sampling rates, from daily to half-hourly sampling rate, to provide a simple interpretation of the distribution of energy between the different scales.

Journal ArticleDOI
TL;DR: In this paper, the analysis of voltage disturbance recordings in the time-frequency domain and the time-scale domain is discussed: the discrete short-time Fourier transform (STFT) is used for the time-frequency domain and dyadic and binary-tree wavelet filters for the time-scale domain, and the discrete STFT is also shown to be able to detect and analyze transients in a voltage disturbance.
Abstract: This paper discusses the analysis of voltage disturbance recordings in the time-frequency domain and in the time-scale domain. The discrete short-time Fourier transform (STFT) is used for the time-frequency domain; dyadic and binary-tree wavelet filters for the time-scale domain. The theory is explained with special emphasis on the analysis of voltage disturbance data. Dyadic wavelet filters are not suitable for the harmonic analysis of disturbance data. Filter center frequencies and bandwidths are inflexible, and the results do not give easy insight into the time behavior of the harmonics. On the other hand, band-pass filter outputs from the discrete STFT are well associated with harmonics and are thus more useful for power system analysis. With a properly chosen window size, the discrete STFT is also able to detect and analyze transients in a voltage disturbance. Overall, the STFT is more suitable than wavelet filters for the analysis of power system voltage disturbance data.
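
As a small illustration of the STFT analysis the paper favours, the snippet below computes a discrete STFT of a synthetic voltage waveform and reads off the time evolution of the fifth harmonic; the sampling rate, window length, and test signal are assumptions chosen for illustration.

```python
import numpy as np
from scipy.signal import stft

fs = 6400                                    # assumed sampling rate: 128 samples per 50 Hz cycle
t = np.arange(0, 0.4, 1 / fs)
# 50 Hz fundamental plus a 5th harmonic (250 Hz) that appears at t = 0.2 s.
v = np.sin(2 * np.pi * 50 * t) + 0.1 * (t > 0.2) * np.sin(2 * np.pi * 250 * t)

f, times, Z = stft(v, fs=fs, nperseg=256)    # discrete STFT, 25 Hz bin spacing
fifth = np.argmin(np.abs(f - 250))           # bin closest to the 5th harmonic
print(np.round(np.abs(Z[fifth]), 3))         # magnitude of the 5th harmonic over time
```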

Journal ArticleDOI
TL;DR: A line-based approach for the implementation of the wavelet transform is introduced, which yields the same results as a "normal" implementation, but where, unlike prior work, the memory issues arising from the need to synchronize encoder and decoder are addressed.
Abstract: This paper addresses the problem of low memory wavelet image compression. While wavelet or subband coding of images has been shown to be superior to more traditional transform coding techniques, little attention has been paid until recently to the important issue of whether both the wavelet transforms and the subsequent coding can be implemented in low memory without significant loss in performance. We present a complete system to perform low memory wavelet image coding. Our approach is "line-based" in that the images are read line by line and only the minimum required number of lines is kept in memory. There are two main contributions of our work. First, we introduce a line-based approach for the implementation of the wavelet transform, which yields the same results as a "normal" implementation, but where, unlike prior work, we address memory issues arising from the need to synchronize encoder and decoder. Second, we propose a novel context-based encoder which requires no global information and stores only a local set of wavelet coefficients. This low memory coder achieves performance comparable to state of the art coders at a fraction of their memory utilization.

Journal ArticleDOI
TL;DR: In this article, the authors define an evolutionary wavelet spectrum (EWS) which quantifies how process power varies locally over time and scale, and show how the EWS may be rigorously estimated by a smoothed wavelet periodogram.
Abstract: This article defines and studies a new class of non-stationary random processes constructed from discrete non-decimated wavelets which generalizes the Cramer (Fourier) representation of stationary time series. We define an evolutionary wavelet spectrum (EWS) which quantifies how process power varies locally over time and scale. We show how the EWS may be rigorously estimated by a smoothed wavelet periodogram and how both these quantities may be inverted to provide an estimable time-localized autocovariance. We illustrate our theory with a pedagogical example based on discrete non-decimated Haar wavelets and also a real medical time series example.

Journal ArticleDOI
TL;DR: In this article, a multiscale analysis of covariance between two time series using the discrete wavelet transform is presented. And the wavelet covariance and wavelet correlation are defined and applied to this problem as an alternative to traditional cross-spectrum analysis.
Abstract: Multiscale analysis of univariate time series has appeared in the literature at an ever increasing rate. Here we introduce the multiscale analysis of covariance between two time series using the discrete wavelet transform. The wavelet covariance and wavelet correlation are defined and applied to this problem as an alternative to traditional cross-spectrum analysis. The wavelet covariance is shown to decompose the covariance between two stationary processes on a scale by scale basis. Asymptotic normality is established for estimators of the wavelet covariance and correlation. Both quantities are generalized into the wavelet cross covariance and cross correlation in order to investigate possible lead/lag relationships. A thorough analysis of interannual variability for the Madden-Julian oscillation is performed using a 35+ year record of daily station pressure series. The time localization of the discrete wavelet transform allows the subseries, which are associated with specific physical time scales, to be partitioned into both seasonal periods (such as summer and winter) and also according to El Nino-Southern Oscillation (ENSO) activity. Differences in variance and correlation between these periods may then be firmly established through statistical hypothesis testing. The daily station pressure series used here show clear evidence of increased variance and correlation in winter across Fourier periods of 16-128 days. During warm episodes of ENSO activity, a reduced variance is observed across Fourier periods of 8-512 days for the station pressure series from Truk Island and little or no correlation between station pressure series for the same periods.
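
A hedged sketch of the scale-by-scale idea: apply a non-decimated wavelet transform to both series and compute the covariance and correlation of the detail coefficients level by level. Here pywt.swt is used as an illustrative stand-in for the MODWT-based estimators analyzed in the article, so the values are not the paper's unbiased estimators; names and parameters are assumptions.

```python
import numpy as np
import pywt

def wavelet_covariance(x, y, wavelet="haar", level=4):
    """Covariance and correlation of detail coefficients, one value per level (illustrative).
    The series length must be a multiple of 2**level for pywt.swt."""
    cx = pywt.swt(np.asarray(x, float), wavelet, level=level)   # list of (cA, cD) pairs
    cy = pywt.swt(np.asarray(y, float), wavelet, level=level)
    cov, corr = [], []
    for (_, dx), (_, dy) in zip(cx, cy):
        c = np.mean((dx - dx.mean()) * (dy - dy.mean()))
        cov.append(c)
        corr.append(c / (dx.std() * dy.std()))
    return np.array(cov), np.array(corr)
```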

Journal ArticleDOI
TL;DR: This paper's proposed recognition scheme is carried out in the wavelet domain using a set of multiple neural networks and is capable of providing a degree of belief for the identified disturbance waveform.
Abstract: Existing techniques for recognizing and identifying power quality disturbance waveforms are primarily based on visual inspection of the waveform. It is the purpose of this paper to bring to bear advances, especially in wavelet transforms, artificial neural networks, and the mathematical theory of evidence, to the problem of automatic power quality disturbance waveform recognition. Unlike past attempts to automatically identify disturbance waveforms where the identification is performed in the time domain using an individual artificial neural network, the proposed recognition scheme is carried out in the wavelet domain using a set of multiple neural networks. The outcomes of the networks are then integrated using decision making schemes such as a simple voting scheme or the Dempster-Shafer theory of evidence. With such a configuration, the classifier is capable of providing a degree of belief for the identified disturbance waveform.

Journal ArticleDOI
TL;DR: A time-frequency analysis of the intensities in time series was developed, which uses a filter bank of non-linearly scaled wavelets with specified time-resolution to extract time-frequency aspects of the signal.

Journal ArticleDOI
TL;DR: In this paper, wavelet-based analysis results for a gear pair affected by a fatigue crack are compared with those obtained by means of the well-accepted cepstrum analysis and time-synchronous average analysis.

Patent
08 May 2000
TL;DR: A Haar wavelet transform is used to obtain the signal wavelet coefficients, and the transformed signals may be filtered by deleting lower-amplitude signal wavelet coefficients.
Abstract: A device for monitoring heart rhythms. The device is provided with an amplifier for receiving electrogram signals, a memory for storing digitized electrogram segments including signals indicative of depolarizations of a chamber or chambers of a patient's heart, and a microprocessor and associated software for transforming and analyzing the digitized signals. The digitized signals are analyzed by first transforming the signals into signal wavelet coefficients using a wavelet transform. The higher amplitude ones of the signal wavelet coefficients are identified, and the higher amplitude ones of the signal wavelet coefficients are compared with a corresponding set of template wavelet coefficients derived from signals indicative of a heart depolarization of known type. The digitized signals may be transformed using a Haar wavelet transform to obtain the signal wavelet coefficients, and the transformed signals may be filtered by deleting lower amplitude ones of the signal wavelet coefficients. The transformed signals may be compared by ordering the signal and template wavelet coefficients by absolute amplitude and comparing the orders of the signal and template wavelet coefficients. Alternatively, the transformed signals may be compared by calculating distances between the signal and template wavelet coefficients. In preferred embodiments the Haar transform may be a simplified transform which also emphasizes the signal contribution of the wider wavelet coefficients.
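
One possible reading of the comparison step can be sketched as follows: Haar-transform the digitized segment, keep the positions of the highest-amplitude coefficients, and score the match against a template by the overlap of those positions. This is an illustrative interpretation of the patent text, not its actual implementation; all names and parameters are assumptions.

```python
import numpy as np
import pywt

def haar_signature(segment, keep=16, level=3):
    """Indices of the `keep` largest-magnitude Haar wavelet coefficients, by descending amplitude."""
    coeffs = np.concatenate(pywt.wavedec(segment, "haar", level=level))
    return np.argsort(np.abs(coeffs))[::-1][:keep]

def match_score(signal_segment, template_segment, keep=16):
    """Fraction of top-amplitude coefficient positions shared between signal and template."""
    sig = haar_signature(signal_segment, keep)
    tpl = haar_signature(template_segment, keep)
    return len(set(sig) & set(tpl)) / keep
```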

Journal ArticleDOI
TL;DR: The authors carry out low bit-rate compression of multispectral images by means of the Said and Pearlman's SPIHT algorithm, suitably modified to take into account the interband dependencies.
Abstract: The authors carry out low bit-rate compression of multispectral images by means of the Said and Pearlman's SPIHT algorithm, suitably modified to take into account the interband dependencies. Two techniques are proposed: in the first, a three-dimensional (3D) transform is taken (wavelet in the spatial domain, Karhunen-Loeve in the spectral domain) and a simple 3D SPIHT is used; in the second, after taking a spatial wavelet transform, spectral vectors of pixels are vector quantized and a gain-driven SPIHT is used. Numerous experiments on two sample multispectral images show very good performance for both algorithms.

Journal ArticleDOI
01 Feb 2000
TL;DR: WaveCluster is proposed, a novel clustering approach based on wavelet transforms, which satisfies all the above requirements and can effectively identify arbitrarily shaped clusters at different degrees of detail.
Abstract: Many applications require the management of spatial data in a multidimensional feature space. Clustering large spatial databases is an important problem, which tries to find the densely populated regions in the feature space to be used in data mining, knowledge discovery, or efficient information retrieval. A good clustering approach should be efficient and detect clusters of arbitrary shape. It must be insensitive to the noise (outliers) and the order of input data. We propose WaveCluster, a novel clustering approach based on wavelet transforms, which satisfies all the above requirements. Using the multiresolution property of wavelet transforms, we can effectively identify arbitrarily shaped clusters at different degrees of detail. We also demonstrate that WaveCluster is highly efficient in terms of time complexity. Experimental results on very large datasets are presented, which show the efficiency and effectiveness of the proposed approach compared to the other recent clustering methods.
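
A rough sketch of the WaveCluster pipeline for two-dimensional data: quantize the points onto a grid, take a 2-D wavelet transform of the grid counts, keep the dense cells of the smoothed approximation, and label connected components as clusters. The grid size, wavelet, decomposition level, and density threshold are illustrative assumptions, not the authors' settings.

```python
import numpy as np
import pywt
from scipy import ndimage

def wavecluster_2d(points, grid=128, level=2, quantile=0.9):
    """Assign each 2-D point a cluster label (0 = noise / low-density region)."""
    pts = np.asarray(points, float)
    lo, hi = pts.min(axis=0), pts.max(axis=0)
    cells = np.minimum(((pts - lo) / (hi - lo + 1e-12) * grid).astype(int), grid - 1)
    counts, _, _ = np.histogram2d(cells[:, 0], cells[:, 1],
                                  bins=grid, range=[[0, grid], [0, grid]])
    approx = pywt.wavedec2(counts, "haar", level=level)[0]   # smoothed, subsampled density grid
    dense = approx > np.quantile(approx, quantile)           # keep only the densest cells
    labels_grid, _ = ndimage.label(dense)                    # connected components = clusters
    coarse = cells // (2 ** level)                           # map each point to its coarse cell
    return labels_grid[coarse[:, 0], coarse[:, 1]]
```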

Journal ArticleDOI
TL;DR: The strength of the new method is that it can be easily extended to the whole class of second-generation wavelets, leaving the freedom and flexibility to choose the wavelet basis depending on the application.

Journal ArticleDOI
TL;DR: A general theory is presented for constructing linear as well as nonlinear pyramid decomposition schemes for signal analysis and synthesis, which provides the foundation of a general approach to constructing nonlinear wavelet decomposition schemes and filter banks.
Abstract: Interest in multiresolution techniques for signal processing and analysis is increasing steadily. An important instance of such a technique is the so-called pyramid decomposition scheme. This paper presents a general theory for constructing linear as well as nonlinear pyramid decomposition schemes for signal analysis and synthesis. The proposed theory is based on the following ingredients: 1) the pyramid consists of a (finite or infinite) number of levels such that the information content decreases toward higher levels and 2) each step toward a higher level is implemented by an (information-reducing) analysis operator, whereas each step toward a lower level is implemented by an (information-preserving) synthesis operator. One basic assumption is necessary: synthesis followed by analysis yields the identity operator, meaning that no information is lost by these two consecutive steps. Several examples of pyramid decomposition schemes are shown to be instances of the proposed theory: a particular class of linear pyramids, morphological skeleton decompositions, the morphological Haar pyramid, median pyramids, etc. Furthermore, the paper makes a distinction between single-scale and multiscale decomposition schemes, i.e., schemes without or with sample reduction. Finally, the proposed theory provides the foundation of a general approach to constructing nonlinear wavelet decomposition schemes and filter banks.
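
A toy instance of the pyramid condition stated above (synthesis followed by analysis yields the identity), using a pairwise-maximum analysis operator and sample-repetition synthesis. This is an illustrative example in the spirit of the morphological pyramids, not a construction taken from the paper.

```python
import numpy as np

def analysis(x):
    """Information-reducing step: pairwise maximum (a toy morphological analysis operator)."""
    return np.maximum(x[0::2], x[1::2])

def synthesis(y):
    """Information-preserving step: repeat each sample (toy synthesis operator)."""
    return np.repeat(y, 2)

x = np.array([3, 1, 4, 1, 5, 9, 2, 6])
level1 = analysis(x)                      # coarser pyramid level
detail = x - synthesis(level1)            # information lost going up one level (non-positive here)
# Synthesis followed by analysis is the identity, so no information is lost by the two steps.
assert np.array_equal(analysis(synthesis(level1)), level1)
# The original signal is recovered from the coarse level plus the detail.
assert np.array_equal(synthesis(level1) + detail, x)
```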

Proceedings ArticleDOI
05 Apr 2000
TL;DR: This work exploits the property of the sources to have a sparse representation in a corresponding signal dictionary, which provides faster and more robust computations when there are an equal number of sources and mixtures.
Abstract: The blind source separation problem is to extract the underlying source signals from a set of their linear mixtures, where the mixing matrix is unknown. This situation is common, e.g., in acoustics, radio, and medical signal processing. We exploit the property of the sources to have a sparse representation in a corresponding signal dictionary. Such a dictionary may consist of wavelets, wavelet packets, etc., or be obtained by learning from a given family of signals. Starting from the maximum a posteriori framework, which is applicable to the case of more sources than mixtures, we derive a few other categories of objective functions, which provide faster and more robust computations when there are an equal number of sources and mixtures. Our experiments with artificial signals and with musical sounds demonstrate significantly better separation than other known techniques.

Journal ArticleDOI
TL;DR: In this paper, a continuous wavelet transform (CWT) is used to detect and analyze voltage sags and transients, and a recursive algorithm is used and improved to compute the time-frequency plane of these electrical disturbances.
Abstract: This paper deals with the use of a continuous wavelet transform to detect and analyze voltage sags and transients. A recursive algorithm is used and improved to compute the time-frequency plane of these electrical disturbances. Characteristics of investigated signals are measured on a time-frequency plane. A comparison between measured characteristics and benchmark values detects the presence of disturbances in analyzed signals and characterizes the type of disturbances. Duration and magnitude of voltage sags are measured, transients are located in the width of the signal. Furthermore, meaningful time and frequency components of transients are measured. The whole method is implemented and tested over a sample representing recorded disturbances. Detection and measurement results are compared using classical methods.
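
A minimal sketch of using a continuous wavelet transform to localize a voltage sag: fine-scale wavelet energy peaks at abrupt changes such as the sag boundaries. The sampling rate, wavelet, scale range, and synthetic signal are assumptions for illustration, not the paper's recursive algorithm.

```python
import numpy as np
import pywt

fs = 6400                                          # assumed sampling rate
t = np.arange(0, 0.3, 1 / fs)
v = np.sin(2 * np.pi * 50 * t)
v[(t > 0.1025) & (t < 0.2025)] *= 0.6              # 40% voltage sag (boundaries off the zero crossings)

scales = np.arange(1, 9)                           # fine scales only: sensitive to abrupt changes
coef, _ = pywt.cwt(v, scales, "mexh", sampling_period=1 / fs)
energy = np.sum(coef ** 2, axis=0)                 # time profile of fine-scale wavelet energy
print(round(t[np.argmax(energy)], 4))              # time of maximum energy (expected near a sag boundary)
```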

Journal ArticleDOI
TL;DR: A wavelet-based multiresolution analysis method for metal artifact reduction, in which information is extracted from corrupted projection data, which is significantly more accurate for depiction of anatomical structures, especially in the immediate neighborhood of the prostheses.
Abstract: Traditional computed tomography (CT) reconstructions of total joint prostheses are limited by metal artifacts from corrupted projection data. Published metal artifact reduction methods are based on the assumption that severe attenuation of X-rays by prostheses renders corresponding portions of projection data unavailable, hence the "missing" data are either avoided (in iterative reconstruction) or interpolated (in filtered back-projection with data completion; typically, with filling data "gaps" via linear functions). Here, the authors propose a wavelet-based multiresolution analysis method for metal artifact reduction, in which information is extracted from corrupted projection data. The wavelet method improves image quality by a successive interpolation in the wavelet domain. Theoretical analysis and experimental results demonstrate that the metal artifacts due to both photon starving and beam hardening can be effectively suppressed using the authors' method. As compared to the filtered back-projection after linear interpolation, the wavelet-based reconstruction is significantly more accurate for depiction of anatomical structures, especially in the immediate neighborhood of the prostheses. This superior imaging precision is highly advantageous in geometric modeling for fitting hip prostheses.

Journal ArticleDOI
TL;DR: Comparison of shape-adaptive wavelet coding with other coding schemes for arbitrarily shaped visual objects shows that shape-adaptive wavelet coding always achieves better coding efficiency than other schemes.
Abstract: This paper presents a shape-adaptive wavelet coding technique for coding arbitrarily shaped still texture. This technique includes shape-adaptive discrete wavelet transforms (SA-DWTs) and extensions of zerotree entropy (ZTE) coding and embedded zerotree wavelet (EZW) coding. Shape-adaptive wavelet coding is needed for efficiently coding arbitrarily shaped visual objects, which is essential for object-oriented multimedia applications. The challenge is to achieve high coding efficiency while satisfying the functionality of representing arbitrarily shaped visual texture. One of the features of the SA-DWTs is that the number of coefficients after SA-DWTs is identical to the number of pixels in the original arbitrarily shaped visual object. Another feature of the SA-DWT is that the spatial correlation, locality properties of wavelet transforms, and self-similarity across subbands are well preserved in the SA-DWT. Also, for a rectangular region, the SA-DWT becomes identical to the conventional wavelet transforms. For the same reason, the extensions of ZTE and EZW to coding arbitrarily shaped visual objects carefully treat "don't care" nodes in the wavelet trees. Comparison of shape-adaptive wavelet coding with other coding schemes for arbitrarily shaped visual objects shows that shape-adaptive wavelet coding always achieves better coding efficiency than other schemes. One implementation of the shape-adaptive wavelet coding technique has been included in the new multimedia coding standard MPEG-4 for coding arbitrarily shaped still texture. Software implementation is also available.

Journal ArticleDOI
TL;DR: A new multiparadigm intelligent system approach is presented for the solution of the incident detection problem, employing advanced signal processing, pattern recognition, and classification techniques and produced excellent incident detection rates with no false alarms when tested using both real and simulated data.
Abstract: Traffic incidents are nonrecurrent and pseudorandom events that disrupt the normal flow of traffic and create a bottleneck in the road network. The probability of incidents is higher during peak flow rates when the systemwide effect of incidents is most severe. Model-based solutions to the incident detection problem have not produced practical, useful results primarily because the complexity of the problem does not lend itself to accurate mathematical and knowledge-based representations. A new multiparadigm intelligent system approach is presented for the solution of the problem, employing advanced signal processing, pattern recognition, and classification techniques. The methodology effectively integrates fuzzy, wavelet, and neural computing techniques to improve reliability and robustness. A wavelet-based denoising technique is employed to eliminate undesirable fluctuations in observed data from traffic sensors. Fuzzy c-mean clustering is used to extract significant information from the observed data and to reduce its dimensionality. A radial basis function neural network (RBFNN) is developed to classify the denoised and clustered observed data. The new model produced excellent incident detection rates with no false alarms when tested using both real and simulated data.