
Showing papers on "Noise reduction published in 2003"


Journal ArticleDOI
TL;DR: The performance of this method for removing noise from digital images substantially surpasses that of previously published methods, both visually and in terms of mean squared error.
Abstract: We describe a method for removing noise from digital images, based on a statistical model of the coefficients of an overcomplete multiscale oriented basis. Neighborhoods of coefficients at adjacent positions and scales are modeled as the product of two independent random variables: a Gaussian vector and a hidden positive scalar multiplier. The latter modulates the local variance of the coefficients in the neighborhood, and is thus able to account for the empirically observed correlation between the coefficient amplitudes. Under this model, the Bayesian least squares estimate of each coefficient reduces to a weighted average of the local linear estimates over all possible values of the hidden multiplier variable. We demonstrate through simulations with images contaminated by additive white Gaussian noise that the performance of this method substantially surpasses that of previously published methods, both visually and in terms of mean squared error.

2,439 citations
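The hidden-multiplier model above can be illustrated compactly. Below is a toy one-dimensional sketch, not the authors' full neighborhood/overcomplete-pyramid implementation: a single coefficient is modeled as sqrt(z) times a Gaussian, and the Bayes least-squares estimate is a weighted average of linear (Wiener) estimates over a grid of multiplier values. The flat prior on the grid and all parameter values are illustrative assumptions.

```python
import numpy as np

def gsm_bayes_estimate(y, prior_var=1.0, noise_var=0.1, z_grid=None):
    """Toy 1-D Bayes least-squares estimate under a Gaussian scale
    mixture prior: x = sqrt(z) * u, u ~ N(0, prior_var), z a hidden
    positive multiplier.  The estimate is a weighted average of
    Wiener estimates over candidate z values (flat prior on the grid;
    all names and constants here are illustrative)."""
    if z_grid is None:
        z_grid = np.linspace(0.01, 10.0, 200)
    signal_var = z_grid * prior_var
    # Wiener (local linear) estimate of x for each candidate multiplier z
    wiener = signal_var / (signal_var + noise_var) * y
    # posterior weight of each z: p(y | z) with a flat prior over the grid
    obs_var = signal_var + noise_var
    likelihood = np.exp(-0.5 * y ** 2 / obs_var) / np.sqrt(obs_var)
    weights = likelihood / likelihood.sum()
    return float(np.sum(weights * wiener))
```

The resulting shrinkage behaves like the heavy-tailed priors the model induces: small coefficients are shrunk strongly, large ones are left nearly intact.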


Journal ArticleDOI
TL;DR: A new method for image smoothing based on a fourth-order PDE model that demonstrates good noise suppression without destruction of important anatomical or functional detail, even at poor signal-to-noise ratio is introduced.
Abstract: We introduce a new method for image smoothing based on a fourth-order PDE model. The method is tested on a broad range of real medical magnetic resonance images, both in space and time, as well as on nonmedical synthesized test images. Our algorithm demonstrates good noise suppression without destruction of important anatomical or functional detail, even at poor signal-to-noise ratio. We have also compared our method with related PDE models.

883 citations
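A minimal sketch of a fourth-order scheme of this kind (in the spirit of You-Kaveh-type models): the update is u_t = -Lap(c(|Lap u|) * Lap u), so smoothing is driven by the Laplacian rather than the gradient. The exact PDE, boundary handling, and constants (k, dt, steps) here are illustrative assumptions, not the paper's.

```python
import numpy as np

def laplacian(u):
    # 5-point Laplacian with replicated (Neumann-like) borders
    p = np.pad(u, 1, mode="edge")
    return p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] - 4.0 * u

def fourth_order_smooth(img, k=10.0, dt=0.02, steps=50):
    """Explicit iteration of u_t = -Lap( c(|Lap u|) * Lap u ) with
    c(s) = 1 / (1 + (s/k)^2).  Smooth regions (small Laplacian) are
    diffused; strong features (large Laplacian) are preserved."""
    u = img.astype(float).copy()
    for _ in range(steps):
        lap = laplacian(u)
        c = 1.0 / (1.0 + (np.abs(lap) / k) ** 2)
        u -= dt * laplacian(c * lap)
    return u
```

The small time step keeps the explicit fourth-order update stable; implicit solvers would allow larger steps at higher cost.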


Journal ArticleDOI
TL;DR: A robust wavelet domain method for noise filtering in medical images that adapts itself to various types of image noise as well as to the preference of the medical expert; a single parameter can be used to balance the preservation of (expert-dependent) relevant details against the degree of noise reduction.
Abstract: We propose a robust wavelet domain method for noise filtering in medical images. The proposed method adapts itself to various types of image noise as well as to the preference of the medical expert; a single parameter can be used to balance the preservation of (expert-dependent) relevant details against the degree of noise reduction. The algorithm exploits generally valid knowledge about the correlation of significant image features across the resolution scales to perform a preliminary coefficient classification. This preliminary coefficient classification is used to empirically estimate the statistical distributions of the coefficients that represent useful image features on the one hand and mainly noise on the other. The adaptation to the spatial context in the image is achieved by using a wavelet domain indicator of the local spatial activity. The proposed method is of low complexity, both in its implementation and execution time. The results demonstrate its usefulness for noise suppression in medical ultrasound and magnetic resonance imaging. In these applications, the proposed method clearly outperforms single-resolution spatially adaptive algorithms, in terms of quantitative performance measures as well as in terms of visual quality of the images.

540 citations


Journal ArticleDOI
TL;DR: Preprocessing, which includes fixing bad and outlier pixels, local destriping, atmospheric correction, and minimum noise fraction smoothing, provides improved results and it is feasible to develop a consistent and standardized time series of data that is compatible with field-scale and airborne measured indexes.
Abstract: The benefits of Hyperion hyperspectral data to agriculture have been studied at sites in the Coleambally Irrigation Area of Australia. Hyperion can provide effective measures of agricultural performance through the use of established spectral indexes if systematic and random noise is managed. The noise management strategy includes recognition of "bad" pixels, reducing the effects of vertical striping, and compensation for atmospheric effects in the data. It also aims to reduce compounding of these effects by image processing. As the noise structure is different for Hyperion's two spectrometers, noise reduction methods are best applied to each separately. Results show that a local destriping algorithm reduces striping noise without introducing unwanted effects in the image. They also show how data smoothing can clean the data and how careful selection of stable Hyperion bands can minimize residual atmospheric effects following atmospheric correction. Comparing hyperspectral indexes derived from Hyperion with the same indexes derived from ground-measured spectra allowed us to assess the impact of some of these preprocessing options. It has been concluded that preprocessing, which includes fixing bad and outlier pixels, local destriping, atmospheric correction, and minimum noise fraction smoothing, provides improved results. If these or equivalent preprocessing steps are followed, it is feasible to develop a consistent and standardized time series of data that is compatible with field-scale and airborne measured indexes. Red-edge and leaf chlorophyll indexes based on the preprocessed data are shown to distinguish different levels of stress induced by water restrictions.

472 citations


Journal ArticleDOI
TL;DR: A decision-based, signal-adaptive median filtering algorithm for removal of impulse noise, which achieves accurate noise detection and high SNR measures without smearing the fine details and edges in the image.
Abstract: We propose a decision-based, signal-adaptive median filtering algorithm for removal of impulse noise. Our algorithm achieves accurate noise detection and high SNR measures without smearing the fine details and edges in the image. The notion of homogeneity level is defined for pixel values based on their global and local statistical properties. The cooccurrence matrix technique is used to represent the correlations between a pixel and its neighbors, and to derive the upper and lower bound of the homogeneity level. Noise detection is performed at two stages: noise candidates are first selected using the homogeneity level, and then a refining process follows to eliminate false detections. The noise detection scheme does not use a quantitative decision measure, but uses qualitative structural information, and it is not subject to burdensome computations for optimization of the threshold values. Empirical results indicate that our scheme performs significantly better than other median filters, in terms of noise suppression and detail preservation.

290 citations
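The decision-based idea (detect impulse candidates first, filter only those) can be sketched in a few lines. This is a generic decision-based median filter, NOT the paper's homogeneity-level/cooccurrence detector: the detector here simply flags the extreme salt-and-pepper values, which stands in for the paper's two-stage detection.

```python
import numpy as np

def decision_median(img, low=0, high=255):
    """Generic decision-based median sketch: only pixels flagged as
    impulse candidates (here, the extreme values `low`/`high`) are
    replaced by the median of their 3x3 neighborhood; all other
    pixels, including fine detail, are kept untouched."""
    img = img.astype(float)
    out = img.copy()
    p = np.pad(img, 1, mode="edge")
    # stack the nine 3x3-shifted views and take the median across them
    shifts = [p[i:i + img.shape[0], j:j + img.shape[1]]
              for i in range(3) for j in range(3)]
    med = np.median(np.stack(shifts), axis=0)
    mask = (img == low) | (img == high)
    out[mask] = med[mask]
    return out
```

Because unflagged pixels are copied through, fine details and edges are not smeared the way a plain median filter would smear them.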


Proceedings ArticleDOI
25 Jun 2003
TL;DR: A new, single-pass nonlinear filter for edge-preserving smoothing and visual detail removal for N dimensional signals in computer graphics, image processing and computer vision applications built from two modified forms of Tomasi and Manduchi's bilateral filter.
Abstract: We present a new, single-pass nonlinear filter for edge-preserving smoothing and visual detail removal for N dimensional signals in computer graphics, image processing and computer vision applications. Built from two modified forms of Tomasi and Manduchi's bilateral filter, the new "trilateral" filter smoothes signals towards a sharply-bounded, piecewise-linear approximation. Unlike bilateral filters or anisotropic diffusion methods that smooth towards piecewise constant solutions, the trilateral filter provides stronger noise reduction and better outlier rejection in high-gradient regions, and it mimics the edge-limited smoothing behavior of shock-forming PDEs by region finding with a fast min-max stack. Yet the trilateral filter requires only one user-set parameter, filters an input signal in a single pass, and does not use an iterative solver as required by most PDE methods. Like the bilateral filter, the trilateral filter easily extends to N-dimensional signals, yet it also offers better performance for many visual applications including appearance-preserving contrast reduction problems for digital photography and denoising polygonal meshes.

286 citations
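The building block the trilateral filter modifies is the plain Tomasi-Manduchi bilateral filter, which is short enough to sketch directly (this is the bilateral baseline, not the trilateral extension; the window radius and both sigmas are illustrative choices):

```python
import numpy as np

def bilateral(img, radius=2, sigma_s=2.0, sigma_r=10.0):
    """Plain bilateral filter: each output pixel is an average of its
    neighborhood, weighted by both spatial distance (sigma_s) and
    intensity difference (sigma_r), then normalized."""
    img = img.astype(float)
    p = np.pad(img, radius, mode="edge")
    num = np.zeros_like(img)
    den = np.zeros_like(img)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            shifted = p[radius + dy: radius + dy + img.shape[0],
                        radius + dx: radius + dx + img.shape[1]]
            w = np.exp(-(dx * dx + dy * dy) / (2 * sigma_s ** 2)
                       - (shifted - img) ** 2 / (2 * sigma_r ** 2))
            num += w * shifted
            den += w
    return num / den
```

On a clean step edge the range term suppresses cross-edge averaging almost entirely, while in flat noisy regions the filter behaves like a Gaussian blur; the trilateral filter's changes target exactly the high-gradient regions where this baseline smooths poorly.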


Book ChapterDOI
01 Jan 2003
TL;DR: This paper presents both theoretical and experimental justification for the constrained optimization type of numerical algorithm for restoring blurry, noisy images and results involve blurry images which have been further corrupted with multiplicative noise.
Abstract: In [447, 449, 450], a constrained optimization type of numerical algorithm for restoring blurry, noisy images was developed and successfully tested. In this paper we present both theoretical and experimental justification for the method. Our main theoretical results involve constrained nonlinear partial differential equations. Our main experimental results involve blurry images which have been further corrupted with multiplicative noise. As in the additive noise case of [447, 450], our numerical algorithm is simple to implement and is nonoscillatory (minimal ringing) and noninvasive (recovers sharp edges).

263 citations


Proceedings ArticleDOI
24 Nov 2003
TL;DR: A novel approach to image denoising using adaptive principal components, which assumes the image is corrupted by additive white Gaussian noise and performs well in terms of image visual fidelity and PSNR values.
Abstract: This paper presents a novel approach to image denoising using adaptive principal components. Our assumption is that the image is corrupted by additive white Gaussian noise. The new denoising technique performs well in terms of image visual fidelity and, in terms of PSNR values, compares very well against some of the most recently published denoising algorithms.

261 citations
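The core operation reads directly as a projection in patch space. A minimal sketch, in which one global PCA over patch vectors stands in for the paper's locally adaptive principal components (function and parameter names are illustrative):

```python
import numpy as np

def pca_denoise_patches(patches, keep=4):
    """Project zero-mean patch vectors onto the leading `keep`
    principal components and reconstruct, discarding low-variance
    directions assumed to be mostly noise.  One global PCA here
    stands in for the paper's locally adapted components."""
    patches = np.asarray(patches, dtype=float)
    mean = patches.mean(axis=0)
    centered = patches - mean
    # principal axes come from the SVD of the centered patch matrix
    U, s, Vt = np.linalg.svd(centered, full_matrices=False)
    comp = Vt[:keep]
    return centered @ comp.T @ comp + mean
```

When the clean patches live near a low-dimensional subspace, the discarded directions carry almost only noise, so the reconstruction error drops roughly in proportion to the fraction of dimensions removed.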


Journal ArticleDOI
15 Oct 2003
TL;DR: In this article, Chang et al. introduced a novel speckle reduction method based on soft thresholding the wavelet coefficients of the logarithmically transformed medical ultrasound image, which is based on the generalized Gaussian distributed (GGD) modeling of subband coefficients.
Abstract: The paper introduces a novel speckle reduction method based on soft thresholding the wavelet coefficients of the logarithmically transformed medical ultrasound image. The method is based on the generalized Gaussian distributed (GGD) modeling of subband coefficients. The proposed method is a variant of the recently published BayesShrink method (Chang, G et al., IEEE Trans. Image Processing, vol.9, no.9, p.1522-31, 2000) derived in the Bayesian framework for denoising natural images. It is scale adaptive because the parameters required for estimating the threshold depend on scale and subband data. The threshold is computed as Kσ²/σ_x, where σ and σ_x are the standard deviations of the noise and of the subband data of the noise-free image, respectively, and K is a scale parameter. Experimental results show that the proposed method performs better than the median filter as well as the homomorphic Wiener filter, especially in terms of feature preservation for better diagnosis as desired in medical image processing.

218 citations
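The quoted threshold rule is easy to make concrete. A sketch for one subband, using the usual median-absolute-deviation noise estimate and variance subtraction for σ_x (these estimators are standard BayesShrink choices; K=1 recovers the plain rule, and the names here are illustrative):

```python
import numpy as np

def soft(w, t):
    # soft thresholding: shrink coefficient magnitudes toward zero by t
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def bayes_shrink_threshold(coeffs, K=1.0):
    """BayesShrink-style threshold T = K * sigma^2 / sigma_x for one
    subband.  sigma is the MAD noise estimate; sigma_x comes from
    variance subtraction (clamped so a noise-only band gets a huge
    threshold and is zeroed out)."""
    sigma = np.median(np.abs(coeffs)) / 0.6745
    sigma_x = np.sqrt(max(np.mean(coeffs ** 2) - sigma ** 2, 1e-12))
    return K * sigma ** 2 / sigma_x
```

A subband that is mostly noise gets σ_x near zero and hence a very large threshold, while a feature-rich subband gets a small threshold, which is exactly the scale-adaptive behavior the abstract describes.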


Journal ArticleDOI
TL;DR: In this article, the authors investigated the noise reduction effect of 35 evergreen tree belts and found a negative logarithmic relationship between the visibility and relative attenuation, and a positive logarithmic relationship between attenuation and the width, length or height of the tree belts.

211 citations


Journal ArticleDOI
TL;DR: In this article, the authors investigated the invariance of random matching results to noise in a large class of two-strategy population games where payoffs may vary non-linearly with the distribution of strategies among the population.

Journal ArticleDOI
TL;DR: A new approach to deal with the noise inherent in the microarray image processing procedure is presented, to denoise the image noises before further image processing using stationary wavelet transform (SWT), which is particularly useful in image denoising.
Abstract: Microarray imaging is considered an important tool for large-scale analysis of gene expression. The accuracy of the gene expression measurements depends on the experiment itself and on further image processing. It is well known that the noise introduced during the experiment greatly affects the accuracy of the gene expression measurements, and eliminating the effect of this noise constitutes a challenging problem in microarray analysis. Traditionally, statistical methods are used to estimate the noise while the microarray images are being processed. In this paper, we present a new approach to dealing with the noise inherent in the microarray image processing procedure: denoising the images before further image processing using the stationary wavelet transform (SWT). The time-invariant characteristic of the SWT is particularly useful in image denoising. Testing on sample microarray images has shown an enhanced image quality. The results also show that the approach performs better in this procedure than the conventional discrete wavelet transform and the widely used adaptive Wiener filter.

Patent
Ankur Varma1, Dinei Florencio1
25 Apr 2003
TL;DR: In this paper, an array of one or more microphones is used to selectively eliminate noise emanating from known, generally fixed locations, and pass signals from a pre-specified region or regions with reduced distortion.
Abstract: Various embodiments reduce noise within a particular environment, while isolating and capturing speech in a manner that allows operation within an otherwise noisy environment. In one embodiment, an array of one or more microphones is used to selectively eliminate noise emanating from known, generally fixed locations, and pass signals from a pre-specified region or regions with reduced distortion.

Journal ArticleDOI
TL;DR: This work evaluates several 2-D denoising procedures using test images corrupted with additive Gaussian noise and finds that a combination of simple spatial filters leads to images that were grainier with smoother edges, though the error was smaller than in the wavelet-based methods.
Abstract: Techniques based on thresholding of wavelet coefficients are gaining popularity for denoising data. The idea is to transform the data into the wavelet basis, where the "large" coefficients are mainly the signal, and the "smaller" ones represent the noise. By suitably modifying these coefficients, the noise can be removed from the data. We evaluate several 2-D denoising procedures using test images corrupted with additive Gaussian noise. We consider global, level-dependent, and subband-dependent implementations of these techniques. Our results, using the mean squared error as a measure of the quality of denoising, show that the SureShrink and the BayesShrink methods consistently outperform the other wavelet-based techniques. In contrast, we found that a combination of simple spatial filters leads to images that were grainier with smoother edges, though the error was smaller than in the wavelet-based methods. © 2003 SPIE and IS&T. (DOI: 10.1117/1.1525793)

Journal ArticleDOI
TL;DR: Use of noise reduction filters decreased image noise at low-dose CT, with a statistically significant noise reduction in low-dose images processed with three of the filters.
Abstract: A prospective assessment of improvement in image quality at low–radiation-dose computed tomography (CT) of the abdomen by using noise reduction filters was performed. CT images acquired at standard and 50% reduced tube current were processed with six noise reduction filters and evaluated by three radiologists for image noise, sharpness, contrast, and overall image quality in terms of abdominal organ depiction. Quantitative image noise and contrast-to-noise ratio were measured. Baseline low-dose CT images were significantly worse than standard-dose CT images (P < .05). A statistically significant reduction of noise in low-dose images processed with three filters was noted. In conclusion, use of noise reduction filters decreased image noise at low-dose CT. © RSNA, 2003

Proceedings ArticleDOI
24 Nov 2003
TL;DR: The proposed AWA wavelet Wiener filter is superior to the traditional waveletWiener filter by about 0.5 dB (PSNR) and an interesting method to effectively combine the denoising results from both wavelet and spatial domains is shown and discussed.
Abstract: In this work, we consider the adaptive Wiener filtering of noisy images and image sequences. We begin by using an adaptive weighted averaging (AWA) approach to estimate the second-order statistics required by the Wiener filter. Experimentally, the resulting Wiener filter is improved by about 1 dB in the sense of peak-to-peak SNR (PSNR). Also, the subjective improvement is significant in that the annoying boundary noise, common with the traditional Wiener filter, has been greatly suppressed. The second, and more substantial, part of this paper extends the AWA concept to the wavelet domain. The proposed AWA wavelet Wiener filter is superior to the traditional wavelet Wiener filter by about 0.5 dB (PSNR). Furthermore, an interesting method to effectively combine the denoising results from both wavelet and spatial domains is shown and discussed. Our experimental results outperform or are comparable to state-of-art methods.

Proceedings Article
13 Oct 2003
TL;DR: This work introduces an approach that significantly improves the quality of images under variable illumination directions based on a multiplexing principle, in which multiple light sources illuminate the object simultaneously from different directions.
Abstract: Imaging of objects under variable lighting directions is an important and frequent practice in computer vision and image-based rendering. We introduce an approach that significantly improves the quality of such images. Traditional methods for acquiring images under variable illumination directions use only a single light source per acquired image. In contrast, our approach is based on a multiplexing principle, in which multiple light sources illuminate the object simultaneously from different directions. Thus, the object irradiance is much higher. The acquired images are then computationally demultiplexed. The number of image acquisitions is the same as in the single-source method. The approach is useful for imaging dim object areas. We give the optimal code by which the illumination should be multiplexed to obtain the highest quality output. For n images corresponding to n light sources, the noise is reduced by √n/2 relative to the signal. This noise reduction translates to a faster acquisition time or an increase in density of illumination direction samples. It also enables one to use lighting with high directional resolution using practical setups, as we demonstrate in our experiments.
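The multiplexing gain can be checked with a tiny Monte-Carlo sketch. For n=3 sources and the standard 3x3 S-matrix (each exposure turns on two of the three sources), the exact noise-reduction factor for S-matrix codes is (n+1)/(2√n), which approaches the abstract's √n/2 as n grows. The matrix, intensities, and trial count below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def demux_error(n_trials=10000, noise_std=1.0, seed=0):
    """Compare per-source RMS error of single-source acquisition vs
    S-matrix multiplexed acquisition followed by demultiplexing.
    Every exposure (either scheme) gets the same additive noise."""
    rng = np.random.default_rng(seed)
    S = np.array([[1.0, 1.0, 0.0],   # rows: which sources are ON per exposure
                  [1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0]])
    Sinv = np.linalg.inv(S)
    x = np.array([3.0, 5.0, 2.0])    # true per-source intensities
    err_single = err_mux = 0.0
    for _ in range(n_trials):
        err_single += np.sum(rng.normal(0.0, noise_std, 3) ** 2)
        m = S @ x + rng.normal(0.0, noise_std, 3)   # multiplexed exposures
        err_mux += np.sum((Sinv @ m - x) ** 2)
    return np.sqrt(err_single / n_trials), np.sqrt(err_mux / n_trials)
```

For n=3 the predicted error ratio is √3/2 ≈ 0.866, so even this smallest case already shows a measurable gain at equal acquisition count.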

Journal ArticleDOI
TL;DR: In this article, a new online secondary path modeling method with auxiliary noise power scheduling and adaptive filter norm manipulation is proposed to alleviate the increment of the residual noise due to the auxiliary noise.
Abstract: In many practical cases for active noise control (ANC), the online secondary path modeling methods that use auxiliary noise are often applied. However, the auxiliary noise contributes to residual noise, and thus deteriorates the noise control performance of ANC systems. Moreover, a sudden and large change in the secondary path leads to easy divergence of the existing online secondary path modeling methods. To mitigate these problems, this paper proposes a new online secondary path modeling method with auxiliary noise power scheduling and adaptive filter norm manipulation. The auxiliary noise power is scheduled based on the convergence status of an ANC system with consideration of the variation of the primary noise. The purpose is to alleviate the increment of the residual noise due to the auxiliary noise. In addition, the norm manipulation is applied to adaptive filters in the ANC system. The objective is to avoid over-updates of adaptive filters due to the sudden large change in the secondary path and thus prevent the ANC system from diverging. Computer simulations show the effectiveness and robustness of the proposed method.

Journal ArticleDOI
TL;DR: This paper investigates the effectiveness of a bilateral denoising filter in various biological electron microscopy applications and finds that bilateral filter holds a distinct advantage in being capable of effectively suppressing noise without blurring the high resolution details.

Journal ArticleDOI
TL;DR: Objective speech quality measures, informal listening tests, and the results of automatic speech recognition experiments indicate a substantial benefit from AMS-based noise suppression, in comparison to unprocessed noisy speech.
Abstract: A single-microphone noise suppression algorithm is described that is based on a novel approach for the estimation of the signal-to-noise ratio (SNR) in different frequency channels: The input signal is transformed into neurophysiologically-motivated spectro-temporal input features. These patterns are called amplitude modulation spectrograms (AMS), as they contain information of both center frequencies and modulation frequencies within each 32 ms-analysis frame. The different representations of speech and noise in AMS patterns are detected by a neural network, which estimates the present SNR in each frequency channel. Quantitative experiments show a reliable estimation of the SNR for most types of nonspeech background noise. For noise suppression, the frequency bands are attenuated according to the estimated present SNR using a Wiener filter approach. Objective speech quality measures, informal listening tests, and the results of automatic speech recognition experiments indicate a substantial benefit from AMS-based noise suppression, in comparison to unprocessed noisy speech.
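Once a per-band SNR estimate exists (in the paper it comes from a neural network applied to AMS patterns; here the estimate is simply an input), the attenuation step is the classic Wiener gain. A minimal sketch of that final stage only:

```python
import numpy as np

def wiener_gain(snr_db):
    """Classic Wiener gain G = SNR / (1 + SNR), computed per
    frequency band from an SNR estimate given in dB."""
    snr = 10.0 ** (np.asarray(snr_db, dtype=float) / 10.0)
    return snr / (1.0 + snr)

def suppress(spectrum, snr_db):
    # attenuate each band of a magnitude spectrum by its Wiener gain
    return spectrum * wiener_gain(snr_db)
```

Bands estimated to be speech-dominated (high SNR) pass almost unchanged, while noise-dominated bands are strongly attenuated, which is the behavior the listening tests above evaluate.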

Journal ArticleDOI
TL;DR: A reasonable and practical method for identifying the useful information in a signal contaminated by noise, providing a feasible tool for vibration analysis.
Abstract: The paper develops a reasonable and practical method for identifying the useful information in a signal that has been contaminated by noise, providing a feasible tool for vibration analysis. A new concept, the Singular Entropy (SE), is proposed based on the singular value decomposition technique. With the aid of the SE, a series of investigations was carried out to discover the distribution characteristics of noise-contaminated and pure signals, and an advanced noise reduction method was developed accordingly. The experiments showed that the proposed method applies not only to stationary signals but also to non-stationary signals.
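The SVD-based reduction step can be sketched as classic singular-spectrum-style filtering: embed the signal in a Hankel (trajectory) matrix, keep the leading singular components, and reconstruct by anti-diagonal averaging. The paper selects the retained order with its Singular Entropy; the fixed `rank` below is an illustrative stand-in for that criterion.

```python
import numpy as np

def svd_denoise(x, window=20, rank=2):
    """Hankel embedding + truncated SVD + anti-diagonal averaging.
    `window` and `rank` are illustrative; the paper chooses the
    order from the Singular Entropy of the decomposition."""
    x = np.asarray(x, dtype=float)
    n = len(x) - window + 1
    # column j holds the window starting at sample j, so H[i, j] = x[i + j]
    H = np.stack([x[j:j + window] for j in range(n)], axis=1)
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    Hr = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    # average entries that came from the same sample (anti-diagonals)
    out = np.zeros(len(x))
    cnt = np.zeros(len(x))
    for j in range(n):
        out[j:j + window] += Hr[:, j]
        cnt[j:j + window] += 1.0
    return out / cnt
```

A noisy sinusoid spans a rank-2 trajectory matrix, so rank-2 truncation recovers it while discarding most of the broadband noise.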

Journal ArticleDOI
TL;DR: In this paper, the authors evaluated the effectiveness of a noise reduction system implemented in a commercial digital multichannel compression hearing aid and found that the performance of the system was similar with and without noise reduction.
Abstract: We evaluated the effectiveness of a noise reduction system implemented in a commercial digital multichannel compression hearing aid. Eight experienced hearing aid wearers with moderate sensorineural hearing loss were fitted bilaterally according to the manufacturer's fitting guidelines. After a 3-month period of regular use of two programs, one with and one without the noise reduction system, speech recognition thresholds (SRTs) were measured in four types of background noise, including steady noise, and noises with spectral and/or temporal dips. SRTs were very similar with and without the noise reduction system; in both cases, SRTs were markedly lower than for unaided listening. SRTs were lower for the noises with dips than for the steady noise, especially for the aided conditions, indicating that amplification can help to 'listen in the dips'. Ratings of sound quality and listening comfort in the aided conditions were uniformly high and very similar with and without the noise reduction system.

Journal ArticleDOI
TL;DR: Noise reduction filters reduced image noise on low-radiation-dose chest CT images, with some compromise in image sharpness and contrast assessed qualitatively, and slightly altered modulation transfer function at higher spatial frequencies.
Abstract: Effect of noise reduction filters on chest computed tomographic (CT) images acquired with 50% radiation dose reduction was evaluated. Two sets of images were acquired with multi-detector row CT at standard (220-280 mA) and 50% reduced (110-140 mA) tube current at the level of the carina. After postprocessing with six noise reduction filters, images were compared with baseline standard-dose images for noise, sharpness, and contrast in lungs, mediastinum, and chest wall. Quantitative image noise was measured in descending thoracic aorta. Modulation transfer functions were calculated from CT images of 50-microm wire. Noise reduction filters reduced image noise on low-radiation-dose chest CT images, with some compromise in image sharpness and contrast assessed qualitatively, and slightly altered modulation transfer function at higher spatial frequencies.

Journal ArticleDOI
TL;DR: Noise reduction strategies in dual-energy imaging can be effective and should focus on blending various algorithms depending on anatomical locations, thus NOC or NOC combined with KCNR performed best in the tissue image.
Abstract: In this paper we describe a quantitative evaluation of the performance of three dual-energy noise reduction algorithms: Kalender's correlated noise reduction (KCNR), noise clipping (NOC), and edge-predictive adaptive smoothing (EPAS). These algorithms were compared to a simple smoothing filter approach, using the variance and noise power spectrum measurements of the residual noise in dual-energy images acquired with an a-Si TFT flat-panel x-ray detector. An estimate of the true noise was made through a new method with subpixel accuracy by subtracting an individual image from an ensemble average image. The results indicate that in the lung regions of the tissue image, all three algorithms reduced the noise by similar percentages at high spatial frequencies (KCNR=88%, NOC=88%, EPAS=84%, NOC/KCNR=88%) and somewhat less at low spatial frequencies (KCNR=45%, NOC=54%, EPAS=52%, NOC/KCNR=55%). At low frequencies, the presence of edge artifacts from KCNR made the performance worse, thus NOC or NOC combined with KCNR performed best. At high frequencies, KCNR performed best in the bone image, yet NOC performed best in the tissue image. Noise reduction strategies in dual-energy imaging can be effective and should focus on blending various algorithms depending on anatomical locations.

Journal ArticleDOI
TL;DR: A new class of filters for noise attenuation is introduced and its relationship with commonly used filtering techniques is investigated and it is indicated that the new filter outperforms the VMF, as well as other techniques currently used to eliminate impulsive noise in color images.
Abstract: In this paper, we address the problem of impulsive noise reduction in multichannel images. A new class of filters for noise attenuation is introduced and its relationship with commonly used filtering techniques is investigated. The computational complexity of the new filter is lower than that of the vector median filter (VMF). Extensive simulation experiments indicate that the new filter outperforms the VMF, as well as other techniques currently used to eliminate impulsive noise in color images.
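The baseline the new filter is compared against, the vector median filter, has a compact definition: the output is the window member minimizing the summed distance to all other members. A sketch of that selection rule (L1 distance and window handling are illustrative choices):

```python
import numpy as np

def vector_median(pixels):
    """Vector median of a set of color vectors: return the member
    that minimizes the summed L1 distance to all other members.
    Applied over sliding 3x3 windows, this is the classic VMF."""
    pixels = np.asarray(pixels, dtype=float)
    # dist[i] = sum over j of ||pixels[i] - pixels[j]||_1
    dist = np.abs(pixels[:, None, :] - pixels[None, :, :]).sum(axis=(1, 2))
    return pixels[np.argmin(dist)]
```

Because the output is always one of the input vectors, no new (possibly unnatural) colors are created, which is why VMF-style filters are the standard reference for impulsive noise in color images; the pairwise-distance computation is also what makes VMF costly and motivates lower-complexity alternatives like the paper's.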

Journal ArticleDOI
TL;DR: Results suggest that the proposed method offers significantly improved performance over conventional and previously reported global wavelet contrast enhancement methods.
Abstract: A method aimed at minimizing image noise while optimizing contrast of image features is presented. The method is generic and it is based on local modification of multiscale gradient magnitude values provided by the redundant dyadic wavelet transform. Denoising is accomplished by a spatially adaptive thresholding strategy, taking into account local signal and noise standard deviation. Noise standard deviation is estimated from the background of the mammogram. Contrast enhancement is accomplished by applying a local linear mapping operator on denoised wavelet magnitude values. The operator normalizes local gradient magnitude maxima to the global maximum of the first scale magnitude subimage. Coefficient mapping is controlled by a local gain limit parameter. The processed image is derived by reconstruction from the modified wavelet coefficients. The method is demonstrated with a simulated image with added Gaussian noise, while an initial quantitative performance evaluation using 22 images from the DDSM database was performed. Enhancement was applied globally to each mammogram, using the same local gain limit value. Quantitative contrast and noise metrics were used to evaluate the quality of processed image regions containing verified lesions. Results suggest that the method offers significantly improved performance over conventional and previously reported global wavelet contrast enhancement methods. The average contrast improvement, noise amplification and contrast-to-noise ratio improvement indices were measured as 9.04, 4.86 and 3.04, respectively. In addition, in a pilot preference study, the proposed method demonstrated the highest ranking, among the methods compared. The method was implemented in C++ and integrated into a medical image visualization tool.

Journal ArticleDOI
TL;DR: The proposed motion adaptive deinterlacing algorithm achieves cost-efficient hardware implementation with low complexity, low memory usage, and high-speed processing capability, and allows the audience to enjoy a high-quality TV sequence on their progressive devices.
Abstract: A motion adaptive deinterlacing algorithm is presented in this paper. It consists of the ELA-median directional interpolation, same-parity 4-field horizontal motion detection, morphological operation for noise reduction and adaptive threshold adjusting. The edges can be sharper when the ELA-median directional interpolation is adopted. The same-parity 4-field horizontal motion detection detects faster motion and makes more accurate determinations about where objects are going to move. The morphological operation for noise reduction and adaptive threshold adjusting preserve the actual texture of the original objects. The proposed method achieves cost-efficient hardware implementation with low complexity, low memory usage, and high-speed processing capability. In addition, it consumes less time in producing high-quality images and allows the audience to enjoy a high-quality TV sequence on their progressive devices. The experimental results show that the proposed algorithm is more cost-effective than previous systems.

Journal ArticleDOI
TL;DR: A new perceptually motivated approach is proposed for enhancement of speech corrupted by colored noise that takes into account the frequency masking properties of the human auditory system and reduces the perceptual effect of the residual noise.
Abstract: A new perceptually motivated approach is proposed for enhancement of speech corrupted by colored noise. The proposed approach takes into account the frequency masking properties of the human auditory system and reduces the perceptual effect of the residual noise. This new perceptual method is incorporated into a frequency-domain speech enhancement method and a subspace-based speech enhancement method. A better power spectrum/autocorrelation function estimator was also developed to improve the performance of the proposed algorithms. Objective measures and informal listening tests demonstrated significant improvements over other methods when tested with TIMIT sentences corrupted by various types of colored noise.

Journal ArticleDOI
TL;DR: A critical survey of the identification and modelling of railway noise sources is presented in this article, which summarizes the current knowledge of the physical source phenomena (mainly rolling and aerodynamic sources) as well as the potential for noise reduction.

Patent
25 Jul 2003
TL;DR: In this article, a motor drive system control provides global closed loop feedback to cooperatively operate system components to adaptively reduce noise and provide noise cancellation feedback, and an active EMI filter reduces differential and common mode noise on an input and provides a noise level indication to a system controller.
Abstract: A motor drive system control provides global closed loop feedback to cooperatively operate system components to adaptively reduce noise and provide noise cancellation feedback. An active EMI filter reduces differential and common mode noise on an input and provides a noise level indication to a system controller. Power switches in both a power converter and power inverter are cooperatively controlled with dynamic dv/dt control to reduce switching noise according to a profile specified by the controller. The dv/dt control is provided as an analog signal to a high voltage IC and codified as a pulse width for a level shifting circuit supplying control signals to the high voltage gate drive. A noise extraction circuit and technique obtain fast noise sampling to permit noise cancellation and adaptive noise reduction.