Author

Russell M. Mersereau

Bio: Russell M. Mersereau is an academic researcher from Georgia Institute of Technology. The author has contributed to research in topics: Motion compensation & Image restoration. The author has an h-index of 48 and has co-authored 229 publications receiving 11,716 citations. Previous affiliations of Russell M. Mersereau include Massachusetts Institute of Technology & Georgia Tech Research Institute.


Papers
Journal ArticleDOI
TL;DR: This paper presents a new demosaicing technique that uses inter-channel correlation effectively in an alternating-projections scheme and outperforms several state-of-the-art demosaicing techniques, both visually and in terms of mean square error.
Abstract: Most commercial digital cameras use color filter arrays to sample red, green, and blue colors according to a specific pattern. At the location of each pixel only one color sample is taken, and the values of the other colors must be interpolated using neighboring samples. This color plane interpolation is known as demosaicing; it is one of the important tasks in a digital camera pipeline. If demosaicing is not performed appropriately, images suffer from highly visible color artifacts. In this paper we present a new demosaicing technique that uses inter-channel correlation effectively in an alternating-projections scheme. We have compared this technique with six state-of-the-art demosaicing techniques, and it outperforms all of them, both visually and in terms of mean square error.
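As a rough illustration of the sampling problem this abstract describes, the sketch below (ours, not the paper's alternating-projections algorithm) builds an RGGB Bayer mask and fills in the missing samples of each color plane by plain bilinear interpolation; the paper's method would refine such an initial estimate by exploiting inter-channel correlation.

```python
# A minimal sketch of Bayer CFA sampling and bilinear demosaicing.
# This is an illustrative baseline, not the alternating-projections method.
import numpy as np
from scipy.signal import convolve2d

def bayer_masks(h, w):
    """RGGB Bayer pattern: R at (even,even), B at (odd,odd), G elsewhere."""
    r = np.zeros((h, w), bool)
    b = np.zeros((h, w), bool)
    r[0::2, 0::2] = True
    b[1::2, 1::2] = True
    g = ~(r | b)
    return r, g, b

def bilinear_demosaic(cfa):
    """Interpolate each color plane independently from its sparse samples."""
    h, w = cfa.shape
    r_m, g_m, b_m = bayer_masks(h, w)
    k_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0    # bilinear weights for green
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0   # bilinear weights for red/blue
    out = np.zeros((h, w, 3))
    for idx, (mask, k) in enumerate([(r_m, k_rb), (g_m, k_g), (b_m, k_rb)]):
        plane = np.where(mask, cfa, 0.0)                        # keep only this channel's samples
        out[..., idx] = convolve2d(plane, k, mode="same", boundary="symm")
    return out
```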

687 citations

Journal ArticleDOI
TL;DR: The author begins by discussing the image formation process and examines the demosaicking methods in three groups: the first group consists of heuristic approaches, the second group formulates demosaicking as a restoration problem, and the third group is a generalization that uses the spectral filtering model given by Wandell.
Abstract: The author begins by discussing the image formation process. The demosaicking methods are examined in three groups: the first group consists of heuristic approaches. The second group formulates demosaicking as a restoration problem. The third group is a generalization that uses the spectral filtering model given in Wandell.

616 citations

Journal ArticleDOI
01 May 1990
TL;DR: In this paper, the authors discuss the use of iterative restoration algorithms for the removal of linear blurs from photographic images that may also be degraded by pointwise nonlinearities such as film saturation and additive noise.
Abstract: The authors discuss the use of iterative restoration algorithms for the removal of linear blurs from photographic images that may also be assumed to be degraded by pointwise nonlinearities such as film saturation and additive noise. Iterative algorithms allow for the incorporation of various types of prior knowledge about the class of feasible solutions, can be used to remove nonstationary blurs, and are fairly robust with respect to errors in the approximation of the blurring operator. Special attention is given to the problem of convergence of the algorithms, and classical solutions such as inverse filters, Wiener filters, and constrained least-squares filters are shown to be limiting solutions of variations of the iterations. Regularization is introduced as a means for preventing the excessive noise magnification that is typically associated with ill-conditioned inverse problems such as the deblurring problem, and it is shown that noise effects can be minimized by terminating the algorithms after a finite number of iterations. The role and choice of constraints on the class of feasible solutions are also discussed.
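To make the structure of such iterations concrete, here is a minimal 1-D sketch in the spirit of the algorithms discussed: a Landweber-style residual-correction step, a pointwise nonnegativity constraint applied each pass, and early stopping as implicit regularization. The operator, step size, and constraint are illustrative choices, not the authors' exact formulation.

```python
# A minimal 1-D constrained iterative restoration sketch (illustrative only).
import numpy as np

def blur(x, h):
    return np.convolve(x, h, mode="same")           # linear blur operator H

def restore(y, h, beta=0.5, n_iter=50):
    h_adj = h[::-1]                                  # adjoint of convolution (time-reversed kernel)
    x = y.copy()                                     # start from the blurred data
    for _ in range(n_iter):                          # a finite number of iterations limits noise growth
        x = x + beta * blur(y - blur(x, h), h_adj)   # residual-correction (Landweber-style) step
        x = np.clip(x, 0.0, None)                    # constraint: nonnegative intensities
    return x

# toy example: a spike train blurred by a short Gaussian kernel plus noise
rng = np.random.default_rng(0)
truth = np.zeros(128)
truth[[30, 60, 61, 95]] = 1.0
kernel = np.exp(-0.5 * (np.arange(-4, 5) / 1.5) ** 2)
kernel /= kernel.sum()
observed = blur(truth, kernel) + 0.01 * rng.standard_normal(128)
estimate = restore(observed, kernel)
```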

513 citations

Journal ArticleDOI
01 Apr 1981
TL;DR: It is shown that by predistorting the signal (and later removing this predistortion) it is possible to achieve spectral extrapolation, to broaden the class of signals for which these algorithms achieve convergence, and to improve their performance in the presence of broad-band noise.
Abstract: This paper describes a rather broad class of iterative signal restoration techniques which can be applied to remove the effects of many different types of distortions. These techniques also allow for the incorporation of prior knowledge of the signal in terms of the specification of a constraint operator. Conditions for convergence of the iteration under various combinations of distortions and constraints are explored. Particular attention is given to the use of iterative restoration techniques for constrained deconvolution, when the distortion band-limits the signal and spectral extrapolation must be performed. It is shown that by predistorting the signal (and later removing this predistortion) it is possible to achieve spectral extrapolation, to broaden the class of signals for which these algorithms achieve convergence, and to improve their performance in the presence of broad-band noise.
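The band-limited extrapolation case mentioned here can be sketched as a Gerchberg–Papoulis-style alternation between the constraint operator (band-limiting) and reimposing the known samples. The code below is an illustrative reading of that idea, not the paper's exact algorithm or its predistortion variant.

```python
# A compact sketch of iterative band-limited spectral extrapolation (illustrative).
import numpy as np

def bandlimit(x, keep):
    """Constraint operator: keep only the `keep` lowest (positive and negative) frequency bins."""
    X = np.fft.fft(x)
    X[keep:-keep] = 0.0
    return np.real(np.fft.ifft(X))

def extrapolate(known, known_idx, n, keep, n_iter=200):
    x = np.zeros(n)
    x[known_idx] = known                 # start from the observed segment
    for _ in range(n_iter):
        x = bandlimit(x, keep)           # enforce prior knowledge: the signal is band-limited
        x[known_idx] = known             # enforce the distortion model: observed samples are exact
    return x
```

Alternating the two projections gradually fills in the unobserved samples; convergence behavior depends on how severely the distortion band-limits the signal, which is exactly the regime the paper analyzes.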

465 citations


Cited by
Proceedings ArticleDOI
04 Jan 2000
TL;DR: The Low-Energy Adaptive Clustering Hierarchy (LEACH) as mentioned in this paper is a clustering-based protocol that utilizes randomized rotation of local cluster base stations (cluster-heads) to evenly distribute the energy load among the sensors in the network.
Abstract: Wireless distributed microsensor systems will enable the reliable monitoring of a variety of environments for both civil and military applications. In this paper, we look at communication protocols, which can have significant impact on the overall energy dissipation of these networks. Based on our findings that the conventional protocols of direct transmission, minimum-transmission-energy, multi-hop routing, and static clustering may not be optimal for sensor networks, we propose LEACH (Low-Energy Adaptive Clustering Hierarchy), a clustering-based protocol that utilizes randomized rotation of local cluster base stations (cluster-heads) to evenly distribute the energy load among the sensors in the network. LEACH uses localized coordination to enable scalability and robustness for dynamic networks, and incorporates data fusion into the routing protocol to reduce the amount of information that must be transmitted to the base station. Simulations show that LEACH can achieve as much as a factor of 8 reduction in energy dissipation compared with conventional routing protocols. In addition, LEACH is able to distribute energy dissipation evenly throughout the sensors, doubling the useful system lifetime for the networks we simulated.
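The rotation of the cluster-head role can be made concrete with the threshold-based self-election the abstract describes: each round, a node that has not recently served compares a random draw against a threshold that rises as the cycle progresses, so every node serves roughly once per 1/P rounds. The sketch below is an illustrative reading; variable names and bookkeeping are ours.

```python
# A small sketch of LEACH-style randomized, rotating cluster-head election (illustrative).
import random

def leach_threshold(p, round_num, was_head_recently):
    """Probability threshold for a node to elect itself cluster-head this round."""
    if was_head_recently:                       # nodes that served in the current cycle sit out
        return 0.0
    cycle = round(1 / p)                        # length of one rotation cycle in rounds
    return p / (1.0 - p * (round_num % cycle))

def elect_cluster_heads(nodes, p, round_num):
    """Each node draws a random number and compares it with its threshold."""
    heads = []
    for node in nodes:
        t = leach_threshold(p, round_num, node["was_head_recently"])
        if random.random() < t:
            node["was_head_recently"] = True
            heads.append(node["id"])
    if (round_num + 1) % round(1 / p) == 0:     # reset the exclusion set each 1/P-round cycle
        for node in nodes:
            node["was_head_recently"] = False
    return heads

nodes = [{"id": i, "was_head_recently": False} for i in range(100)]
for r in range(20):
    elect_cluster_heads(nodes, p=0.05, round_num=r)
```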

12,497 citations

01 Jan 2000
TL;DR: LEACH (Low-Energy Adaptive Clustering Hierarchy), a clustering-based protocol that utilizes randomized rotation of local cluster base stations (cluster-heads) to evenly distribute the energy load among the sensors in the network, is proposed.
Abstract: Wireless distributed microsensor systems will enable the reliable monitoring of a variety of environments for both civil and military applications. In this paper, we look at communication protocols, which can have significant impact on the overall energy dissipation of these networks. Based on our findings that the conventional protocols of direct transmission, minimum-transmission-energy, multi-hop routing, and static clustering may not be optimal for sensor networks, we propose LEACH (Low-Energy Adaptive Clustering Hierarchy), a clustering-based protocol that utilizes randomized rotation of local cluster base stations (cluster-heads) to evenly distribute the energy load among the sensors in the network. LEACH uses localized coordination to enable scalability and robustness for dynamic networks, and incorporates data fusion into the routing protocol to reduce the amount of information that must be transmitted to the base station. Simulations show that LEACH can achieve as much as a factor of 8 reduction in energy dissipation compared with conventional routing protocols. In addition, LEACH is able to distribute energy dissipation evenly throughout the sensors, doubling the useful system lifetime for the networks we simulated.

11,412 citations

Journal ArticleDOI
TL;DR: In this paper, a method for finding the optical flow pattern is presented which assumes that the apparent velocity of the brightness pattern varies smoothly almost everywhere in the image, and an iterative implementation is shown which successfully computes the optical flow for a number of synthetic image sequences.
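The smoothness assumption and iterative implementation can be sketched with a Horn–Schunck-style update, in which the flow at each pixel is pulled toward its neighborhood average and corrected by the brightness-constancy residual. The gradient estimators and the weight alpha below are simplified choices of ours, not necessarily those of the paper.

```python
# A minimal sketch of a smoothness-constrained optical flow iteration (illustrative).
import numpy as np
from scipy.ndimage import convolve

def optical_flow(f1, f2, alpha=1.0, n_iter=100):
    # simple finite-difference estimates of the spatial and temporal gradients
    ix = convolve(f1, np.array([[-0.5, 0.0, 0.5]]))
    iy = convolve(f1, np.array([[-0.5], [0.0], [0.5]]))
    it = f2 - f1
    avg = np.array([[0, 0.25, 0], [0.25, 0, 0.25], [0, 0.25, 0]])  # neighborhood mean
    u = np.zeros_like(f1)
    v = np.zeros_like(f1)
    for _ in range(n_iter):
        u_bar, v_bar = convolve(u, avg), convolve(v, avg)
        common = (ix * u_bar + iy * v_bar + it) / (alpha ** 2 + ix ** 2 + iy ** 2)
        u = u_bar - ix * common          # update driven by the brightness-constancy residual,
        v = v_bar - iy * common          # tempered by the smoothness weight alpha
    return u, v
```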

10,727 citations

Journal ArticleDOI
01 Apr 1988-Nature
TL;DR: In this paper, a sedimentological core and petrographic characterisation of samples from eleven boreholes from the Lower Carboniferous of Bowland Basin (Northwest England) is presented.
Abstract: Deposits of clastic carbonate-dominated (calciclastic) sedimentary slope systems in the rock record have been identified mostly as linearly-consistent carbonate apron deposits, even though most ancient clastic carbonate slope deposits fit the submarine fan systems better. Calciclastic submarine fans are consequently rarely described and are poorly understood, and very little is known especially about mud-dominated calciclastic submarine fan systems. Presented in this study is a sedimentological core and petrographic characterisation of samples from eleven boreholes from the Lower Carboniferous of the Bowland Basin (Northwest England) that reveals a >250 m thick calciturbidite complex deposited in a calciclastic submarine fan setting. Seven facies are recognised from core and thin section characterisation and are grouped into three carbonate turbidite sequences. They include: 1) Calciturbidites, comprising mostly high- to low-density, wavy-laminated bioclast-rich facies; 2) low-density densite mudstones which are characterised by planar laminated and unlaminated mud-dominated facies; and 3) Calcidebrites which are muddy or hyper-concentrated debris-flow deposits occurring as poorly-sorted, chaotic, mud-supported floatstones. These

9,929 citations

Journal ArticleDOI
TL;DR: An automated method for segmenting magnetic resonance head images into brain and non-brain has been developed; the method is described along with examples of results and the results of extensive quantitative testing against "gold-standard" hand segmentations and two other popular automated methods.
Abstract: An automated method for segmenting magnetic resonance head images into brain and non-brain has been developed. It is very robust and accurate and has been tested on thousands of data sets from a wide variety of scanners and taken with a wide variety of MR sequences. The method, Brain Extraction Tool (BET), uses a deformable model that evolves to fit the brain's surface by the application of a set of locally adaptive model forces. The method is very fast and requires no preregistration or other pre-processing before being applied. We describe the new method and give examples of results and the results of extensive quantitative testing against "gold-standard" hand segmentations, and two other popular automated methods.
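As a toy illustration of how a deformable model can evolve to fit a bright structure, the 2-D sketch below grows a closed contour under a neighbor-smoothness force and a locally adaptive intensity force. This is not BET itself; the forces, threshold, and 2-D setting are simplifications of ours for illustration only.

```python
# A toy 2-D deformable-contour sketch (illustrative; not the Brain Extraction Tool).
import numpy as np

def evolve_contour(image, n_pts=60, n_iter=200, step=0.5, thresh=0.5):
    h, w = image.shape
    cy, cx = h / 2.0, w / 2.0
    ang = np.linspace(0.0, 2.0 * np.pi, n_pts, endpoint=False)
    pts = np.stack([cy + 3 * np.sin(ang), cx + 3 * np.cos(ang)], axis=1)  # small initial circle
    for _ in range(n_iter):
        mid = 0.5 * (np.roll(pts, 1, axis=0) + np.roll(pts, -1, axis=0))
        smooth = mid - pts                                   # force pulling each vertex toward its neighbors
        normal = pts - [cy, cx]                              # crude outward direction from the centroid
        normal /= np.linalg.norm(normal, axis=1, keepdims=True) + 1e-9
        iy = np.clip(pts[:, 0].astype(int), 0, h - 1)
        ix = np.clip(pts[:, 1].astype(int), 0, w - 1)
        inside = image[iy, ix] > thresh                      # locally adaptive term: expand while inside bright tissue
        image_force = np.where(inside[:, None], normal, -normal)
        pts = pts + step * (0.5 * smooth + 0.5 * image_force)
    return pts
```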

9,887 citations