
Showing papers on "Median filter published in 2011"


Journal ArticleDOI
TL;DR: A modified decision-based unsymmetrical trimmed median filter algorithm is proposed for the restoration of grayscale and color images highly corrupted by salt-and-pepper noise; it gives better Peak Signal-to-Noise Ratio (PSNR) and Image Enhancement Factor (IEF).
Abstract: A modified decision-based unsymmetrical trimmed median filter algorithm for the restoration of grayscale and color images that are highly corrupted by salt-and-pepper noise is proposed in this paper. The proposed algorithm replaces a noisy pixel by the trimmed median value when pixel values other than 0 and 255 are present in the selected window; when all the pixel values in the window are 0 or 255, the noisy pixel is replaced by the mean of all the elements in the window. The proposed algorithm shows better results than the Standard Median Filter (MF), Decision Based Algorithm (DBA), Modified Decision Based Algorithm (MDBA), and Progressive Switched Median Filter (PSMF). Tested against different grayscale and color images, it gives better Peak Signal-to-Noise Ratio (PSNR) and Image Enhancement Factor (IEF).
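As a rough illustration of the decision rule described in this abstract, here is a minimal NumPy sketch (the 3x3 window and the 0/255 salt-and-pepper model are assumptions; this is not the authors' code):

```python
# Minimal sketch of a decision-based unsymmetrically trimmed median filter.
# Assumes 8-bit input where salt-and-pepper impulses take values 0 or 255.
import numpy as np

def dbutm_filter(img):
    out = img.astype(np.float64)
    padded = np.pad(img, 1, mode='edge').astype(np.float64)
    rows, cols = img.shape
    for i in range(rows):
        for j in range(cols):
            if img[i, j] not in (0, 255):
                continue                         # assumed noise-free, keep as-is
            window = padded[i:i + 3, j:j + 3].ravel()
            trimmed = window[(window != 0) & (window != 255)]
            if trimmed.size > 0:
                out[i, j] = np.median(trimmed)   # trimmed median
            else:
                out[i, j] = window.mean()        # all 0s/255s: use window mean
    return out.astype(img.dtype)
```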

550 citations


Journal ArticleDOI
01 Mar 2011
TL;DR: The proposed robust automatic crack-detection method for noisy concrete surface images includes two preprocessing steps and two detection steps; probabilistic relaxation is used to detect cracks coarsely and to suppress noise.
Abstract: In the maintenance of concrete structures, crack detection is important for inspection and diagnosis. However, it is difficult to detect cracks automatically. In this paper, we propose a robust automatic crack-detection method for noisy concrete surface images. The proposed method includes two preprocessing steps and two detection steps. The first preprocessing step is a subtraction process using the median filter to remove slight variations such as shading from concrete surface images; only the original image is used in the preprocessing. In the second preprocessing step, a multi-scale line filter with the Hessian matrix is used both to emphasize cracks against blebs or stains and to adapt to the width variation of cracks. After the preprocessing, probabilistic relaxation is used to detect cracks coarsely and to suppress noise; no parameters need to be optimized in the relaxation. Finally, using the results of the relaxation process, locally adaptive thresholding is performed to detect cracks more finely. We evaluate the robustness and accuracy of the proposed method quantitatively using 60 actual noisy concrete surface images.
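The first preprocessing step, subtracting a median-filtered version of the image to remove slow shading, can be sketched as follows (assuming OpenCV; the kernel size and the use of an absolute difference are guesses, not the paper's settings):

```python
# Sketch of median-filter background subtraction for shading removal.
import cv2

img = cv2.imread('concrete.png', cv2.IMREAD_GRAYSCALE)  # hypothetical input
background = cv2.medianBlur(img, 21)   # large kernel captures slow shading only
detail = cv2.absdiff(img, background)  # cracks and fine texture remain
```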

221 citations


Journal ArticleDOI
TL;DR: This paper proposes a novel approach for detecting median filtering in digital images, which can accurately detect median filtering in arbitrary images, even reliably in low-resolution and JPEG-compressed images, and can reliably detect tampering when part of a median-filtered image is inserted into a non-median-filtered image, or vice versa.
Abstract: Exposing the processing history of a digital image is an important problem for forensic analyzers and steganalyzers. As the median filter is a popular nonlinear denoising operator, the blind forensics of median filtering is particularly interesting. This paper proposes a novel approach for detecting median filtering in digital images, which can 1) accurately detect median filtering in arbitrary images, even reliably in low-resolution and JPEG-compressed images; and 2) reliably detect tampering when part of a median-filtered image is inserted into a non-median-filtered image, or vice versa. The effectiveness of the proposed approach is exhaustively evaluated on five different image databases.

182 citations


Proceedings ArticleDOI
09 Oct 2011
TL;DR: The experimental results show that stochastic implementations tolerate more noise and consume less hardware than their conventional counterparts, and the validity of the presented stochastic computational elements is demonstrated through four basic digital image processing algorithms.
Abstract: As device scaling continues to nanoscale dimensions, circuit reliability will become an ever greater problem. Stochastic computing, which computes with random bit streams (stochastic bit streams), can enable reliable computation using such unreliable devices. However, one of the major issues of stochastic computing is that applications implemented with this technique are limited by the available computational elements. In this paper, we first introduce and prove a stochastic absolute value function. Second, we present a mathematical analysis of a stochastic tanh function, a key component of a stochastic comparator. Third, we give a quantitative analysis of a one-parameter linear gain function and propose a new two-parameter version. The validity of the presented stochastic computational elements is demonstrated through four basic digital image processing algorithms: edge detection, frame-difference-based image segmentation, median-filter-based noise reduction, and image contrast stretching. Our experimental results show that stochastic implementations tolerate more noise and consume less hardware than their conventional counterparts.
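For readers unfamiliar with stochastic bit streams, the following sketch shows the usual unipolar encoding, where a value p in [0,1] is a random bit stream whose fraction of 1s is p, and multiplication reduces to a bitwise AND (stream length and seed are arbitrary):

```python
# Minimal sketch of unipolar stochastic computing on bit streams.
import numpy as np

rng = np.random.default_rng(0)
N = 10_000  # stream length; longer streams give lower representation error

def encode(p):
    return rng.random(N) < p   # Bernoulli(p) bit stream

def decode(bits):
    return bits.mean()         # estimate of the encoded value

a, b = encode(0.5), encode(0.4)
product = a & b                # AND of independent streams multiplies values
print(decode(product))         # approximately 0.5 * 0.4 = 0.2
```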

150 citations


01 Jan 2011
TL;DR: An improved median filter algorithm is implemented for the de-noising of highly corrupted images with edge preservation, and an algorithm is designed to calculate the PSNR and MSE.
Abstract: An improved median filter algorithm is implemented for the de-noising of highly corrupted images with edge preservation. Mean, median, and improved median filters are used for noise detection. The fundamentals of image processing and the image degradation and restoration processes are illustrated. Pictures are corrupted with different noise densities and reconstructed; the noise is Gaussian and impulse (salt-and-pepper) noise. An algorithm is designed to calculate the PSNR and MSE. The results are discussed for the mean, median, and improved median filters at different noise densities.
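The two quality measures named above are standard; a minimal sketch of MSE and PSNR for 8-bit images (the 255 peak is an assumption about the image depth):

```python
# Sketch of the MSE and PSNR measures used to compare the filters.
import numpy as np

def mse(reference, test):
    diff = reference.astype(np.float64) - test.astype(np.float64)
    return np.mean(diff ** 2)

def psnr(reference, test, peak=255.0):
    return 10.0 * np.log10(peak ** 2 / mse(reference, test))  # in dB
```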

136 citations


Proceedings ArticleDOI
16 Dec 2011
TL;DR: The experimental results show that the proposed approach can effectively enhance underwater images and reduce execution time; the method requires fewer computing resources and is well suited to real-time surveillance and underwater navigation.
Abstract: Blurred underwater images are a persistent problem in ocean engineering. In this paper, we propose an efficient, low-complexity underwater image enhancement method based on the dark channel prior. Our method employs the median filter instead of the soft matting procedure to estimate the depth map of the image. Moreover, a color correction method is adopted to enhance the color contrast of underwater images. The experimental results show that the proposed approach can effectively enhance underwater images and reduce execution time. Besides, the method requires fewer computing resources and is well suited to real-time surveillance and underwater navigation.
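A hedged sketch of the core idea, estimating the transmission from the dark channel and refining it with a median filter instead of soft matting, might look like this (patch size, scattering coefficient and kernel size are invented; the paper's exact chain may differ):

```python
# Sketch: dark channel prior with median-filter refinement (no soft matting).
import cv2
import numpy as np
from scipy.ndimage import minimum_filter, median_filter

img = cv2.imread('underwater.png').astype(np.float64) / 255.0  # hypothetical input
dark = minimum_filter(img.min(axis=2), size=15)       # dark channel
transmission = 1.0 - 0.95 * dark                      # coarse transmission map
transmission = median_filter(transmission, size=31)   # cheap edge-aware cleanup
```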

130 citations


Journal ArticleDOI
TL;DR: Experimental results show that the proposed GDWM outperforms other watermarking methods and is robust to a wide range of attacks, e.g., Gaussian filtering, amplitude scaling, median filtering, sharpening, JPEG compression, Gaussian noise, salt & pepper noise, and scaling.
Abstract: We propose a robust quantization-based image watermarking scheme, called the gradient direction watermarking (GDWM), based on the uniform quantization of the direction of gradient vectors. In GDWM, the watermark bits are embedded by quantizing the angles of significant gradient vectors at multiple wavelet scales. The proposed scheme has the following advantages: 1) increased invisibility of the embedded watermark because the watermark is embedded in significant gradient vectors, 2) robustness to amplitude scaling attacks because the watermark is embedded in the angles of the gradient vectors, and 3) increased watermarking capacity as the scheme uses multiple-scale embedding. The gradient vector at a pixel is expressed in terms of the discrete wavelet transform (DWT) coefficients. To quantize the gradient direction, the DWT coefficients are modified based on the derived relationship between the changes in the coefficients and the change in the gradient direction. Experimental results show that the proposed GDWM outperforms other watermarking methods and is robust to a wide range of attacks, e.g., Gaussian filtering, amplitude scaling, median filtering, sharpening, JPEG compression, Gaussian noise, salt & pepper noise, and scaling.
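The paper embeds bits by uniformly quantizing gradient angles derived from DWT coefficients; the scalar quantization step itself can be sketched with a generic quantization-index-modulation rule (the step size is invented, and the DWT machinery is omitted):

```python
# Sketch of embedding one bit in an angle via uniform quantization (QIM-style).
import numpy as np

DELTA = np.pi / 16  # quantization step (assumed)

def embed_bit(angle, bit):
    offset = 0.0 if bit == 0 else DELTA / 2   # bit selects one of two grids
    return np.round((angle - offset) / DELTA) * DELTA + offset

def extract_bit(angle):
    d0 = abs(angle - embed_bit(angle, 0))     # distance to the bit-0 grid
    d1 = abs(angle - embed_bit(angle, 1))     # distance to the bit-1 grid
    return 0 if d0 <= d1 else 1
```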

123 citations


Journal ArticleDOI
TL;DR: Experiments on real-life surveillance videos demonstrate that the proposed sequential technique for static background estimation obtains considerably better background estimates (both qualitatively and quantitatively) than median filtering and the recently proposed "intervals of stable intensity" method.
Abstract: For the purposes of foreground estimation, the true background model is unavailable in many practical circumstances and needs to be estimated from cluttered image sequences. We propose a sequential technique for static background estimation in such conditions, with low computational and memory requirements. Image sequences are analysed on a block-by-block basis. For each block location a representative set is maintained which contains distinct blocks obtained along its temporal line. The background estimation is carried out in a Markov Random Field framework, where the optimal labelling solution is computed using iterated conditional modes. The clique potentials are computed based on the combined frequency response of the candidate block and its neighbourhood. It is assumed that the most appropriate block results in the smoothest response, indirectly enforcing the spatial continuity of structures within a scene. Experiments on real-life surveillance videos demonstrate that the proposed method obtains considerably better background estimates (both qualitatively and quantitatively) than median filtering and the recently proposed "intervals of stable intensity" method. Further experiments on the Wallflower dataset suggest that the combination of the proposed method with a foreground segmentation algorithm results in improved foreground segmentation.

121 citations


Proceedings ArticleDOI
20 Jun 2011
TL;DR: This paper proposes an improved per-pixel confidence measure using a Random Forest regressor trained with real-world data and argues that an improved confidence measure leads to superior reconstructions in subsequent steps of traditional scan processing pipelines.
Abstract: Time-of-Flight cameras provide high-frame-rate depth measurements within a limited range of distances. These readings can be extremely noisy and display unique errors, for instance, where scenes contain depth discontinuities or materials with low infrared reflectivity. Previous works have treated the amplitude of each Time-of-Flight sample as a measure of confidence. In this paper, we demonstrate the shortcomings of this common lone heuristic, and propose an improved per-pixel confidence measure using a Random Forest regressor trained with real-world data. Using an industrial laser scanner for ground truth acquisition, we evaluate our technique on data from two different Time-of-Flight cameras. We argue that an improved confidence measure leads to superior reconstructions in subsequent steps of traditional scan processing pipelines. At the same time, confidence-annotated data reduces the need for point cloud smoothing and median filtering.
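The confidence regressor could be set up along these lines with scikit-learn (features, data and sign convention are all invented for illustration; the paper trains against laser-scanner ground truth):

```python
# Sketch: per-pixel ToF confidence from a Random Forest regressor.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# hypothetical training set: one row per pixel, columns are ToF features
# such as amplitude, intensity and local depth variance
X_train = np.random.rand(5000, 3)
y_train = np.random.rand(5000)            # stand-in for measured depth error

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
confidence = -model.predict(X_train)      # low predicted error = high confidence
```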

111 citations




Journal ArticleDOI
TL;DR: In this paper, an approach to impulse noise removal is presented: a switching filter that identifies noisy pixels and then corrects them using a median filter; pixels corrupted by noise are identified through an analysis of local intensity extrema.
Abstract: In this study an approach to impulse noise removal is presented. The introduced algorithm is a switching filter which identifies noisy pixels and then corrects them using a median filter. To identify pixels corrupted by noise, an analysis of local intensity extrema is applied. A comprehensive analysis of the algorithm's performance, in terms of peak signal-to-noise ratio (PSNR) and Structural SIMilarity (SSIM) index, is presented. Results obtained over a wide range of noise corruption (up to 98%) are shown and discussed, and a comparison with well-established methods for impulse noise removal is provided. The presented results reveal that the proposed algorithm outperforms other approaches to impulse noise removal and that its performance is close to that of the ideal switching median filter. For high noise densities, the method correctly detects up to 100% of noisy pixels.
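The switching idea, flag a pixel only if it is a local extremum and replace just the flagged pixels with the local median, can be sketched as follows (window size is assumed; the paper's detector is more elaborate):

```python
# Sketch of a switching median filter driven by local intensity extrema.
import numpy as np
from scipy.ndimage import minimum_filter, maximum_filter, median_filter

def switching_median(img, size=3):
    lo = minimum_filter(img, size)
    hi = maximum_filter(img, size)
    noisy = (img == lo) | (img == hi)   # candidate impulses: local extrema
    med = median_filter(img, size)
    return np.where(noisy, med, img)    # correct only the flagged pixels
```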

Journal Article
TL;DR: This paper proposes filtering techniques for the removal of speckle noise from digital images; quality is measured quantitatively using the signal-to-noise ratio, and the noise level is measured by the standard deviation.
Abstract: Reducing noise in medical images, satellite images, etc. is a challenge for researchers in digital image processing, and several approaches to noise reduction exist. Speckle noise is commonly found in synthetic aperture radar images, satellite images and medical images. This paper proposes filtering techniques for the removal of speckle noise from digital images. Quantitative measures are made using the signal-to-noise ratio, and the noise level is measured by the standard deviation.

BookDOI
30 Nov 2011
TL;DR: Clinical management and follow-up of patients with carotid disease and the use of ultrasound contrast agents in plaque characterization are reviewed.
Abstract: It is well known that speckle is a multiplicative noise that degrades visual evaluation in ultrasound imaging. This, together with recent advancements in ultrasound instrumentation and portable ultrasound devices, necessitates robust despeckling techniques for enhanced ultrasound medical imaging in both routine clinical practice and tele-consultation. The objective of this chapter is to introduce the theoretical background of a number of despeckle filtering techniques and to carry out a comparative evaluation of despeckle filtering based on texture analysis, image quality evaluation metrics, and visual evaluation by medical experts, on ultrasound images of the carotid artery bifurcation. In this chapter, a total of ten despeckle filters are presented based on local statistics, median filtering, pixel homogeneity, geometric filtering, homomorphic filtering, anisotropic diffusion, nonlinear coherence diffusion, and wavelet filtering. Our results suggest that the first-order statistics filter DsFlsmv gave the best performance, followed by the geometric filter DsFgf4d and the homogeneous mask area filter DsFlsminsc. These filters improved the class separation between the asymptomatic and symptomatic classes based on the statistics of the extracted texture features, gave only a marginal improvement in the classification success rate, and improved the visual assessment carried out by two experts. Most importantly, a despeckle filtering and evaluation protocol is proposed based on texture analysis, image quality evaluation metrics, and visual evaluation by experts. In conclusion, the proper selection of a despeckle filter is very important in the enhancement of ultrasonic imaging of the carotid artery. Further work is needed to evaluate, at a larger scale and in clinical practice, the performance of the proposed despeckle filters in the automated segmentation, texture analysis, and classification of carotid ultrasound imaging.

Journal ArticleDOI
TL;DR: The experiments show that the proposed method outperforms other state-of-the-art filters both visually and in terms of objective quality measures such as the mean absolute error (MAE), the peak-signal-to-noise ratio (PSNR) and the normalized color difference (NCD).
Abstract: In this paper, a new fuzzy filter for the removal of random impulse noise in color video is presented. By working with different successive filtering steps, a very good tradeoff between detail preservation and noise removal is obtained. One strong filtering step that should remove all noise at once would inevitably also remove a considerable amount of detail. Therefore, the noise is filtered step by step. In each step, noisy pixels are detected by the help of fuzzy rules, which are very useful for the processing of human knowledge where linguistic variables are used. Pixels that are detected as noisy are filtered, the others remain unchanged. Filtering of detected pixels is done by block matching based on a noise-adaptive mean absolute difference. The experiments show that the proposed method outperforms other state-of-the-art filters both visually and in terms of objective quality measures such as the mean absolute error (MAE), the peak-signal-to-noise ratio (PSNR) and the normalized color difference (NCD).

Proceedings ArticleDOI
29 Sep 2011
TL;DR: A method to expose image forgeries by detecting noise variance differences between original and tampered parts of an image is proposed; it achieves high detection accuracy and a low false positive rate, both quantitatively and qualitatively.
Abstract: Noise is unwanted in high-quality images, but it can aid image tampering. For example, noise can be intentionally added to an image to conceal tampered regions and to create special visual effects. It may also be introduced unnoticed during the camera imaging process, which makes the noise levels inconsistent in spliced images. In this paper, we propose a method to expose such image forgeries by detecting the noise variance differences between original and tampered parts of an image. The noise variance of local image blocks is estimated using a recently developed technique that requires no prior information about the imaging device or original image. The tampered region is segmented from the original image by a two-phase coarse-to-fine clustering of image blocks. Our experimental results demonstrate that the proposed method can effectively detect image forgeries with high detection accuracy and a low false positive rate, both quantitatively and qualitatively.
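A block-wise noise-level map of the kind used here can be sketched with a robust median-absolute-deviation estimator on a high-pass residual (the paper uses a more recent estimator, so treat this only as an illustration of the pipeline):

```python
# Sketch: per-block noise standard deviation for splicing detection.
import numpy as np

def block_noise_std(img, block=64):
    img = img.astype(np.float64)
    # high-pass residual: deviation from the four-neighbour average
    residual = img - 0.25 * (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
                             np.roll(img, 1, 1) + np.roll(img, -1, 1))
    h, w = img.shape
    sigmas = np.zeros((h // block, w // block))
    for bi in range(h // block):
        for bj in range(w // block):
            r = residual[bi*block:(bi+1)*block, bj*block:(bj+1)*block]
            sigmas[bi, bj] = np.median(np.abs(r)) / 0.6745  # robust MAD estimate
    return sigmas  # inconsistent entries hint at spliced regions
```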

Journal ArticleDOI
TL;DR: This study applies Contrast Limited Adaptive Histogram Equalization (CLAHE), an established computer image processing technique, to improve contrast in medical images.
Abstract: Problem statement: Many images are of low quality, making it difficult to detect and extract information. Such images must therefore undergo image enhancement, a collection of techniques that seek to improve the visual appearance of an image. Medical images are among the most important, because they are used in the especially sensitive medical field. The raw data obtained straight from medical acquisition devices may give a comparatively poor representation of image quality and may be degraded by several types of noise. Image Enhancement (IE) and denoising algorithms meeting the requirements of digital medical image enhancement are introduced. The main goal of this study is to improve the features and characteristics of medical images for correct diagnosis. Approach: The proposed technique starts with a median filter to remove noise, followed by an unsharp mask filter, probably the most common type of sharpening. Medical images are usually of poor quality, especially in contrast. To solve this problem, we apply Contrast Limited Adaptive Histogram Equalization (CLAHE), an established technique in computer image processing, to improve image contrast. Results: For testing purposes, more than 60 medical images of different sizes and types, covering different parts of the body, were used. In the experts' evaluation, the enhanced images improved by up to 80% over the originals, depending on the imaging modality. Conclusion: The proposed algorithms significantly increased the visibility of relevant details without distorting the images.
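The three-stage pipeline (median denoising, unsharp masking, CLAHE) maps naturally onto OpenCV; the kernel sizes, sharpening weights and CLAHE parameters below are guesses rather than the study's values:

```python
# Sketch of the median filter -> unsharp mask -> CLAHE enhancement pipeline.
import cv2

img = cv2.imread('mri.png', cv2.IMREAD_GRAYSCALE)    # hypothetical input
den = cv2.medianBlur(img, 3)                         # 1) remove impulse noise
blur = cv2.GaussianBlur(den, (0, 0), sigmaX=2.0)
sharp = cv2.addWeighted(den, 1.5, blur, -0.5, 0)     # 2) unsharp masking
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(sharp)                        # 3) adaptive contrast
```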

Journal ArticleDOI
Wei Wang, Peizhong Lu
TL;DR: An effective algorithm for removing impulse noise from corrupted images is presented under the framework of switching median filtering and provides better performance in terms of PSNR and MAE than many other median filters for impulse noise removal.
Abstract: An effective algorithm for removing impulse noise from corrupted images is presented under the framework of switching median filtering. Firstly, noisy pixels are distinguished by a Local Outlier Factor incorporated with Boundary Discriminative Noise Detection (LOFBDND). Then, a directional weighted median filter is adopted to remove the detected impulses by replacing each noisy pixel with the weighted mean of its neighbors in the filtering window. Our noise detection algorithm is accurate enough that both the miss detection rate and the false detection rate are very low. Extensive simulation results show that our method provides better performance in terms of PSNR and MAE than many other median filters for impulse noise removal.

Journal ArticleDOI
TL;DR: A recursive Bayesian regularization algorithm is applied to phase-sensitive ultrasound RF signals to improve displacement estimation; the optimal strain regularization parameter was found to be twice the nominal strain and did not vary significantly with algorithmic iterations.
Abstract: Noise artifacts due to signal decorrelation and reverberation are a considerable problem in ultrasound strain imaging. For block-matching methods, information from neighboring matching blocks has been utilized to regularize the estimated displacements. We apply a recursive Bayesian regularization algorithm developed by Hayton et al. [Artif. Intell., vol. 114, pp. 125-156, 1999] to phase-sensitive ultrasound RF signals to improve displacement estimation. The regularization parameter is reformulated, and its meaning examined in the context of strain imaging. Tissue-mimicking experimental phantoms and RF data incorporating finite-element models of the tissue deformation and frequency-domain ultrasound simulations are used to compute the optimal parameter with respect to nominal strain and algorithmic iterations. The optimal strain regularization parameter was found to be twice the nominal strain and did not vary significantly with algorithmic iterations. The technique demonstrates superior performance over median filtering in noise reduction at strains of 5% and higher in all quantitative experiments performed. For example, the strain SNR was 11 dB higher than that obtained using a median filter at 7% strain. Note that for applied deformations below 1%, where signal decorrelation errors are minimal, this approach may degrade the displacement image.

BookDOI
01 Jan 2011
TL;DR: In this book, single-channel and multichannel noise reduction are presented, each with a filtering vector and with a rectangular filtering matrix.
Abstract: Introduction.- Single-Channel Noise Reduction with a Filtering Vector.- Single-Channel Noise Reduction with a Rectangular Filtering Matrix.- Multichannel Noise Reduction with a Filtering Vector.- Multichannel Noise Reduction with a Rectangular Filtering Matrix.

Proceedings ArticleDOI
Chen Wei, Lei Sheng, Guo Lihua, Chen Yuquan, Pan Min
12 Dec 2011
TL;DR: The PPG signal reflects many physiological parameters, such as heart function, blood vessel elasticity and blood viscosity; it is therefore important to find efficient preprocessing and feature extraction algorithms for the raw PPG signal, which suffers interference from many other factors.
Abstract: The photoplethysmography (PPG) signal can reflect many physiological parameters, such as heart function, blood vessel elasticity, blood viscosity and so on. It is a novel noninvasive method with the advantages of convenience and accuracy. It is important to find efficient preprocessing and feature extraction algorithms for the original PPG signal, which is interfered with by many other factors. Several practical methods, including median filtering and FIR filtering, were used. A new algorithm based on the wavelet transform was proposed for eliminating baseline drift. Feature point extraction is another key issue, and an improved differential algorithm was used to solve this problem. Together, these practical algorithms provide an effective platform for physiological parameter detection.

Journal ArticleDOI
TL;DR: For ultrasound kidney images, morphological filtering seems to be the best enhancement option when the whole image is taken into consideration (as measured by MSE and PSNR), according to the evaluation.
Abstract: Different enhancement techniques applied to ultrasound kidney images were evaluated to determine the most suitable technique to apply before segmenting the edge of the kidney. Five common enhancement techniques were used: spatial domain filtering, frequency domain filtering, histogram processing, morphological filtering and wavelet filtering. The techniques were assessed by several methods: observer sensitivity, image quality measured by the MSE and PSNR, and application of a segmentation technique to the output images. In conclusion, for ultrasound kidney images, if the whole image is taken into consideration (by measuring MSE and PSNR), morphological filtering seems to be the best enhancement option. If the evaluation focuses on the kidney edges, the enhancement techniques to consider are median filtering and histogram equalization.

Journal ArticleDOI
TL;DR: This article addresses the conditions under which filtering can visibly improve image quality, and demonstrates that it is possible to roughly estimate whether or not the visual quality can clearly be improved by filtering.
Abstract: This article addresses the conditions under which filtering can visibly improve image quality. The key points are the following. First, we analyze filtering efficiency for 25 test images from the color image database TID2008. This database allows assessing filter efficiency for images corrupted by different noise types at several levels of noise variance. Second, the limit of filtering efficiency is determined for independent and identically distributed (i.i.d.) additive noise and compared to the output mean square error of state-of-the-art filters. Third, component-wise and vector denoising are studied, and the latter approach is demonstrated to be more efficient. Fourth, using modern visual quality metrics, we determine the levels of i.i.d. and spatially correlated noise for which the noise in original images, or the residual noise and distortions due to filtering in output images, is practically invisible. We also demonstrate that it is possible to roughly estimate whether or not the visual quality can clearly be improved by filtering.

Book ChapterDOI
20 Nov 2011
TL;DR: Experiments showed that the results produced by the proposed adaptive guided image filtering (AGF) are superior to those produced by unsharp masking-based techniques and comparable to ABF-filtered output.
Abstract: Sharpness enhancement and noise reduction play crucial roles in computer vision and image processing. The problem is to enhance the appearance and reduce the noise of digital images without causing halo artifacts. In this paper, we propose an adaptive guided image filtering (AGF) method able to perform halo-free edge slope enhancement and noise reduction simultaneously. The proposed method is developed based on guided image filtering (GIF) and the shift-variant technique that is part of adaptive bilateral filtering (ABF). Experiments showed that the results produced by our method are superior to those produced by unsharp masking-based techniques and comparable to ABF-filtered output. Our proposed AGF outperforms ABF in terms of computational complexity, as it is implemented using a fast and exact linear-time algorithm.
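The baseline guided image filter that AGF builds on is compact enough to sketch (this is the standard GIF, not the paper's adaptive variant; the box-filter implementation and parameter names are conventional):

```python
# Sketch of the plain guided image filter underlying AGF.
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(I, p, r, eps):
    """I: guide image, p: input image, r: window radius, eps: regularization."""
    size = 2 * r + 1
    mean_I = uniform_filter(I, size)
    mean_p = uniform_filter(p, size)
    var_I = uniform_filter(I * I, size) - mean_I ** 2
    cov_Ip = uniform_filter(I * p, size) - mean_I * mean_p
    a = cov_Ip / (var_I + eps)          # eps trades smoothing vs. edge keeping
    b = mean_p - a * mean_I
    return uniform_filter(a, size) * I + uniform_filter(b, size)
```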

Proceedings ArticleDOI
10 Oct 2011
TL;DR: To effectively resolve the automatic recognition problem of analog measuring instruments, a reading recognition method based on an improved Hough transform is proposed; the experimental results show that the method is highly effective for pointer angle recognition.
Abstract: To effectively resolve the automatic recognition problem of analog measuring instruments, a reading recognition method based on an improved Hough transform is proposed. Firstly, the paper presents the image preprocessing operations; to improve the image filtering, an improved adaptive median filter method is adopted. Secondly, we explain in detail the recognition principle for the pointer angle using the improved Hough transform. The pointer's reading is then obtained from the linear relationship between the scale and the rotation angle of the instrument dial. To further improve the recognition accuracy, a late correction is applied for the case where the pointer coincides exactly with a tick mark. Finally, automatic recognition software is written in the VC++ 6.0 IDE, and the recognition experiment is completed on an AC voltmeter. The experimental results show that the method is highly effective for pointer angle recognition. In addition, the advantages of this study are as follows: a simple algorithm, good real-time performance and high precision.
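Once the pointer angle is recovered from the Hough transform, the final reading is a linear map from angle to scale value; a sketch with made-up dial constants:

```python
# Sketch: converting a recovered pointer angle into an instrument reading.
ANGLE_MIN, ANGLE_MAX = 45.0, 315.0   # dial arc in degrees (assumed)
VALUE_MIN, VALUE_MAX = 0.0, 300.0    # voltmeter range in volts (assumed)

def angle_to_reading(angle_deg):
    frac = (angle_deg - ANGLE_MIN) / (ANGLE_MAX - ANGLE_MIN)
    return VALUE_MIN + frac * (VALUE_MAX - VALUE_MIN)

print(angle_to_reading(180.0))  # mid-scale pointer -> 150.0 V
```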

Journal ArticleDOI
TL;DR: By designing an adaptive threshold value for the extraction process, the proposed blind watermarking scheme is more robust against common attacks such as median filtering, average filtering, and Gaussian noise.
Abstract: This paper proposes a blind watermarking scheme based on wavelet tree quantization for copyright protection. In this quantization scheme, there is a large, significant difference between embedding a watermark bit 1 and a watermark bit 0, so neither the original image nor the watermark is required during the watermark extraction process. The watermarked images look lossless in comparison with the original ones, and the proposed method can effectively resist common image processing attacks, especially JPEG compression and low-pass filtering. Moreover, by designing an adaptive threshold value for the extraction process, our method is more robust against common attacks such as median filtering, average filtering, and Gaussian noise. Experimental results show that the watermarked image looks visually identical to the original, and the watermark can be effectively extracted.

Journal ArticleDOI
TL;DR: This paper presents a novel digital watermarking framework using electrocardiogram (ECG) signals and demographic text data as double watermarks; it protects patient medical information and prevents the mismatching of diagnostic information.

Journal ArticleDOI
TL;DR: Possible structures of automatic procedures are presented and discussed for several typical image-processing applications such as remote sensing data preprocessing and compression.
Abstract: In many modern applications, the methods and algorithms used for image processing require a priori knowledge or estimates of the noise type and its characteristics. The noise type and basic parameters can sometimes be known in advance or determined in an interactive manner, but it happens more and more often that they must be estimated blindly. The results of blind noise-type determination can be false, and the estimates of noise parameters have limited accuracy. Such false decisions and estimation errors affect the performance of image-processing techniques that rely on the obtained information. We address some issues of this negative influence. Possible structures of automatic procedures are presented and discussed for several typical image-processing applications such as remote sensing data preprocessing and compression.

Proceedings ArticleDOI
01 Dec 2011
TL;DR: This framework is semi-automated and allows for rapid processing, analysis and interpretation of slow waves via qualitative and quantitative measures including isochronal activation time mapping, and velocity and amplitude mapping.
Abstract: High resolution electrical mapping of slow waves on the stomach serosa has improved our understanding of gastric electrical activity in normal and diseased states. In order to assess the signals acquired from high resolution mapping, a robust framework is required. Our framework is semi-automated and allows for rapid processing, analysis and interpretation of slow waves via qualitative and quantitative measures including isochronal activation time mapping, and velocity and amplitude mapping. Noise removal techniques were validated on raw recorded signals, where three filters were evaluated for baseline drift removal and three filters for removal of high frequency interference. For baseline drift removal, the Gaussian moving median filter was most effective, while for eliminating high frequency interference the Savitzky-Golay filter was the most effective. Methods for assessing slow wave velocity and amplitude were investigated. To estimate slow wave velocity, a finite difference approach with interpolation and smoothing was used. To evaluate the slow wave amplitude and width, a peak and trough method based on Savitzky-Golay derivative filters was used. Together, these methods constitute a significantly improved framework for analyzing gastric high resolution mapping data.
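The two winning filters can be sketched with SciPy (the sampling rate, window lengths and polynomial order are illustrative; the paper's "Gaussian moving median" is approximated here by a plain moving median):

```python
# Sketch: baseline drift removal and high-frequency smoothing of slow waves.
import numpy as np
from scipy.ndimage import median_filter
from scipy.signal import savgol_filter

fs = 30                                  # samples per second (assumed)
signal = np.random.randn(60 * fs)        # stand-in for one recorded channel

baseline = median_filter(signal, size=20 * fs + 1)  # slow drift estimate
detrended = signal - baseline                       # baseline drift removed
smoothed = savgol_filter(detrended, window_length=2 * fs + 1, polyorder=3)
```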

26 Sep 2011
TL;DR: In this paper, the authors discuss methods for filtering spatial trajectories to reduce measurement noise and to estimate higher level properties of a trajectory like its speed and direction, using mean and median filtering, the Kalman filter and particle filter.
Abstract: A spatial trajectory is a sequence of (x,y) points, each with a time stamp. This chapter discusses low-level preprocessing of trajectories. First, it discusses how to reduce the size of the data required to store a trajectory, in order to save storage costs and reduce redundant data. The data reduction techniques can run in batch mode after the data is collected or in an on-line mode as the data is collected. Part of this discussion consists of methods to measure the error introduced by the data reduction techniques. The second part of the chapter discusses methods for filtering spatial trajectories to reduce measurement noise and to estimate higher-level properties of a trajectory, like its speed and direction. The methods include mean and median filtering, the Kalman filter, and the particle filter.
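Median filtering a trajectory, smoothing x and y independently with a sliding median to knock out GPS outliers, can be sketched as follows (the window length and synthetic data are arbitrary):

```python
# Sketch: median filtering the x and y components of a spatial trajectory.
import numpy as np
from scipy.signal import medfilt

# hypothetical noisy trajectory: one (x, y) row per time stamp
traj = np.cumsum(np.random.randn(200, 2), axis=0)
traj[50] += 40                           # inject a single gross outlier

k = 5                                    # odd window length (assumed)
smoothed = np.column_stack([medfilt(traj[:, 0], k),
                            medfilt(traj[:, 1], k)])
```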
