
Gaussian noise

About: Gaussian noise is a research topic. Over the lifetime, 25,934 publications have been published within this topic, receiving 552,078 citations.


Papers
Book
01 Jan 1958
TL;DR: The aim of this book is to clarify the role of noise in the development of linear and nonlinear systems and to provide a procedure for formalising the noise generated by these systems.
Abstract: Preface to the IEEE Press Edition. Preface. Errata. Introduction. Probability. Random Variables and Probability Distributions. Averages. Sampling. Spectral Analysis. Shot Noise. The Gaussian Process. Linear Systems. Noise Figures. Optimum Linear Systems. Nonlinear Devices: The Direct Method. Nonlinear Devices: The Transform Method. Statistical Detection of Signals. Appendix 1: The Impulse Function. Appendix 2: Integral Equations. Bibliography. Index.

1,473 citations

Journal ArticleDOI
TL;DR: This work analyzes the behavior of ℓ1-constrained quadratic programming (QP), also referred to as the Lasso, for recovering the sparsity pattern of a vector β* based on observations contaminated by noise, and establishes precise conditions on the problem dimension p, the number k of nonzero elements in β*, and the number of observations n.
Abstract: The problem of consistently estimating the sparsity pattern of a vector β* ∈ R^p based on observations contaminated by noise arises in various contexts, including signal denoising, sparse approximation, compressed sensing, and model selection. We analyze the behavior of ℓ1-constrained quadratic programming (QP), also referred to as the Lasso, for recovering the sparsity pattern. Our main result is to establish precise conditions on the problem dimension p, the number k of nonzero elements in β*, and the number of observations n that are necessary and sufficient for sparsity pattern recovery using the Lasso. We first analyze the case of observations made using deterministic design matrices and sub-Gaussian additive noise, and provide sufficient conditions for support recovery and ℓ∞-error bounds, as well as results showing the necessity of incoherence and bounds on the minimum value. We then turn to the case of random designs, in which each row of the design is drawn from a N(0, Σ) ensemble. For a broad class of Gaussian ensembles satisfying mutual incoherence conditions, we compute explicit values of thresholds 0 < θ_l(Σ) ≤ θ_u(Σ) < +∞ with the following properties: for any δ > 0, if n > 2(θ_u + δ) k log(p − k), then the Lasso succeeds in recovering the sparsity pattern with probability converging to one for large problems, whereas for n < 2(θ_l − δ) k log(p − k), the probability of successful recovery converges to zero. For the special case of the uniform Gaussian ensemble (Σ = I_{p×p}), we show that θ_l = θ_u = 1.

1,438 citations
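As a rough illustration of the setup described in this abstract, the sketch below draws a k-sparse β*, takes roughly n ≈ 2θ k log(p − k) Gaussian measurements with additive Gaussian noise, and checks whether the Lasso recovers the support. The constant theta, the regularization weight alpha, and the support threshold are arbitrary illustrative choices, not values from the paper.

```python
# Minimal sketch (assumed setup, not the paper's code): Lasso support recovery
# from noisy linear measurements under a uniform Gaussian design.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
p, k = 200, 5                                   # ambient dimension and sparsity
theta = 1.5                                     # assumed constant standing in for the paper's thresholds
n = int(2 * theta * k * np.log(p - k))          # sample size near the 2*theta*k*log(p-k) scaling

beta_star = np.zeros(p)
beta_star[rng.choice(p, k, replace=False)] = 1.0        # k nonzero entries

X = rng.standard_normal((n, p))                         # uniform Gaussian ensemble, Sigma = I
y = X @ beta_star + 0.5 * rng.standard_normal(n)        # additive Gaussian noise

lasso = Lasso(alpha=0.1).fit(X, y)                      # alpha chosen ad hoc for illustration
recovered = np.flatnonzero(np.abs(lasso.coef_) > 1e-3)
print("true support:     ", np.flatnonzero(beta_star))
print("recovered support:", recovered)
```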

Journal ArticleDOI
TL;DR: The authors propose FFDNet, a fast and flexible denoising convolutional neural network that takes a tunable noise level map as input and can handle a wide range of noise levels effectively with a single network.
Abstract: Due to the fast inference and good performance, discriminative learning methods have been widely studied in image denoising. However, these methods mostly learn a specific model for each noise level, and require multiple models for denoising images with different noise levels. They also lack flexibility to deal with spatially variant noise, limiting their applications in practical denoising. To address these issues, we present a fast and flexible denoising convolutional neural network, namely FFDNet, with a tunable noise level map as the input. The proposed FFDNet works on downsampled sub-images, achieving a good trade-off between inference speed and denoising performance. In contrast to the existing discriminative denoisers, FFDNet enjoys several desirable properties, including: 1) the ability to handle a wide range of noise levels (i.e., [0, 75]) effectively with a single network; 2) the ability to remove spatially variant noise by specifying a non-uniform noise level map; and 3) faster speed than benchmark BM3D even on CPU without sacrificing denoising performance. Extensive experiments on synthetic and real noisy images are conducted to evaluate FFDNet in comparison with state-of-the-art denoisers. The results show that FFDNet is effective and efficient, making it highly attractive for practical denoising applications.

1,430 citations
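The sketch below illustrates, under assumptions, the input format the abstract describes: the noisy image is split into four 2×-downsampled sub-images and stacked with a tunable noise level map before being fed to the network. The function name ffdnet_input and the uniform noise map are illustrative, not the authors' implementation.

```python
# Minimal sketch (assumed, not the authors' code): build FFDNet-style network input
# from a noisy grayscale image and a tunable noise level map.
import numpy as np

def ffdnet_input(noisy, sigma):
    """noisy: (H, W) image; sigma: scalar noise level or (H//2, W//2) noise level map."""
    H, W = noisy.shape
    # Reversible 2x downsampling: split the image into its four polyphase sub-images.
    subs = np.stack([noisy[0::2, 0::2], noisy[0::2, 1::2],
                     noisy[1::2, 0::2], noisy[1::2, 1::2]], axis=0)       # (4, H/2, W/2)
    noise_map = np.full((1, H // 2, W // 2), sigma) if np.isscalar(sigma) else sigma[None]
    return np.concatenate([subs, noise_map], axis=0)                      # (5, H/2, W/2)

clean = np.zeros((64, 64))
sigma = 25 / 255.0                                   # noise level within the [0, 75] range (scaled to [0, 1])
noisy = clean + sigma * np.random.randn(64, 64)      # additive white Gaussian noise
x = ffdnet_input(noisy, sigma)
print(x.shape)                                       # (5, 32, 32)
```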

Proceedings ArticleDOI
24 Apr 2005
TL;DR: This work proposes a simple distributed iterative scheme, based on distributed average consensus in the network, to compute the maximum-likelihood estimate of the parameters, and shows that it works in a network with dynamically changing topology, provided that the infinitely occurring communication graphs are jointly connected.
Abstract: We consider a network of distributed sensors, where each sensor takes a linear measurement of some unknown parameters, corrupted by independent Gaussian noise. We propose a simple distributed iterative scheme, based on distributed average consensus in the network, to compute the maximum-likelihood estimate of the parameters. This scheme does not involve explicit point-to-point message passing or routing; instead, it diffuses information across the network by updating each node's data with a weighted average of its neighbors' data (every node maintains the same data structure). At each step, every node can compute a local weighted least-squares estimate, which converges to the global maximum-likelihood solution. This scheme is robust to unreliable communication links. We show that it works in a network with dynamically changing topology, provided that the infinitely occurring communication graphs are jointly connected.

1,415 citations
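A minimal sketch of the idea, under assumed notation: each node keeps the sufficient statistics of its own linear measurement and repeatedly replaces them with a weighted average of its neighbors' statistics; the local least-squares estimate computed from those statistics then approaches the global maximum-likelihood estimate. The ring topology and Metropolis-style mixing weights are illustrative choices, not taken from the paper.

```python
# Minimal sketch (assumed network and weights): distributed average consensus on
# sufficient statistics (a_i a_i^T, a_i y_i) followed by a local LS solve.
import numpy as np

rng = np.random.default_rng(1)
n_nodes, d = 20, 3
theta_true = np.array([1.0, -2.0, 0.5])

A = rng.standard_normal((n_nodes, d))                       # one linear measurement per sensor
y = A @ theta_true + 0.1 * rng.standard_normal(n_nodes)     # independent Gaussian noise

# Node state: P_i = a_i a_i^T and q_i = a_i y_i.
P = np.array([np.outer(A[i], A[i]) for i in range(n_nodes)])
q = np.array([A[i] * y[i] for i in range(n_nodes)])

# Ring topology with Metropolis-style mixing weights (an assumed choice).
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    for j in ((i - 1) % n_nodes, (i + 1) % n_nodes):
        W[i, j] = 1 / 3
    W[i, i] = 1 - W[i].sum()

for _ in range(200):                                        # consensus iterations
    P = np.einsum('ij,jkl->ikl', W, P)                      # each node averages neighbors' P
    q = W @ q                                               # ... and neighbors' q

theta_hat = np.linalg.solve(P[0], q[0])                     # local estimate at node 0
print(theta_hat, theta_true)                                # close to the global ML estimate
```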

Journal ArticleDOI
TL;DR: A likelihood ratio decision rule is derived and its performance evaluated in both the noise-only and signal-plus-noise cases.
Abstract: A general problem of signal detection in a background of unknown Gaussian noise is addressed, using the techniques of statistical hypothesis testing. Signal presence is sought in one data vector, and another independent set of signal-free data vectors is available which share the unknown covariance matrix of the noise in the former vector. A likelihood ratio decision rule is derived and its performance evaluated in both the noise-only and signal-plus-noise cases.

1,411 citations
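A hedged sketch of this kind of likelihood ratio detector follows. It uses one common Kelly-type GLRT statistic for a known signal in unknown-covariance Gaussian noise, built from primary data and signal-free secondary data; the exact statistic and threshold setting in the paper may differ, and the real-valued data and threshold here are purely illustrative.

```python
# Minimal sketch (assumed Kelly-type GLRT form, real-valued for simplicity):
# test for a known signal s in primary data x, using K signal-free secondary
# vectors that share the unknown noise covariance.
import numpy as np

rng = np.random.default_rng(2)
N, K = 8, 32                                        # data dimension, number of secondary vectors
s = np.ones(N) / np.sqrt(N)                         # known signal (steering) vector

R = np.eye(N)                                       # true covariance, unknown to the detector
secondary = rng.multivariate_normal(np.zeros(N), R, size=K)    # noise-only training data
x = rng.multivariate_normal(np.zeros(N), R) + 2.0 * s          # primary vector, signal present

S = secondary.T @ secondary                         # scaled sample covariance from secondary data
Sinv = np.linalg.inv(S)
num = np.abs(s @ Sinv @ x) ** 2
den = (s @ Sinv @ s) * (1.0 + x @ Sinv @ x)
T = num / den                                       # likelihood-ratio test statistic
print("detect" if T > 0.1 else "no detect")         # threshold set ad hoc for illustration
```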


Network Information
Related Topics (5)
- Feature extraction: 111.8K papers, 2.1M citations (85% related)
- Network packet: 159.7K papers, 2.2M citations (83% related)
- Wireless: 133.4K papers, 1.9M citations (83% related)
- Image processing: 229.9K papers, 3.5M citations (83% related)
- Deep learning: 79.8K papers, 2.1M citations (82% related)
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2023    229
2022    513
2021    697
2020    765
2019    755
2018    723