scispace - formally typeset

Showing papers by "David L. Donoho published in 1981"


Book Chapter (DOI)
01 Jan 1981
TL;DR: In this article, a simple and general framework is provided into which a number of the minimum entropy deconvolution (MED) procedures inspired by the work of Wiggins can be fit, making possible an analysis and comparison of these procedures according to the large-sample statistical properties of the coefficient estimates they produce.
Abstract: Publisher Summary This chapter provides a simple and general framework into which a number of the minimum entropy deconvolution (MED) procedures inspired by the work of Wiggins can be fit. This makes possible an analysis and comparison of these procedures according to the large-sample statistical properties of the coefficient estimates they produce. The chapter presents a demonstration of the role of entropy within this framework, vindicating Wiggins' intuitive use of the term. MED is a technique introduced by Wiggins for deconvolution without making prior assumptions about the delay characteristics of the filter f. The eye, using a judgement of simplicity, can identify the correct solution to the deconvolution problem even though correlation/spectrum technologies cannot. Wiggins used this simple observation as the basis for a formal procedure for deconvolution. The chapter discusses a basic theory of the MED technique in an ideal case. The MED idea and its variants are consistent ways of solving the deconvolution problem when the data follow the convolutional form. The chapter describes a partial order arising naturally from the probabilistic structure of the problem and presents a comparison of the precision of various proposals that have been made under the assumption that the size of the data sample is large. It further discusses the limitations of the current analysis and the application of these results in seismology and in other fields.
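As a concrete illustration (not taken from the chapter itself): Wiggins-style MED is commonly formulated as choosing an FIR filter f that maximizes a normalized fourth-moment ("varimax") criterion of the output y = f * x, so that the deconvolved trace is as "spiky" as possible. The sketch below, with all function names and parameters hypothetical, uses one common fixed-point iteration for this criterion:

```python
import numpy as np

def varimax(y):
    """Varimax norm: a scale-invariant, kurtosis-like
    measure of 'spikiness' that MED maximizes."""
    return np.sum(y**4) / np.sum(y**2)**2

def med_filter(x, L=15, iters=25, seed=0):
    """Sketch of a fixed-point MED iteration: repeatedly solve the
    Toeplitz normal equations R f = g, where R is the autocorrelation
    matrix of x and g_k = sum_t y_t^3 x_{t-k}, then renormalize f."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    r = np.correlate(x, x, mode='full')      # r[N-1+m] = sum_n x[n] x[n+m]
    R = np.array([[r[N - 1 + i - j] for j in range(L)] for i in range(L)])
    R += 1e-8 * np.trace(R) / L * np.eye(L)  # small ridge for numerical safety
    rng = np.random.default_rng(seed)
    f = rng.standard_normal(L)
    for _ in range(iters):
        y = np.convolve(x, f)                # full convolution, length N+L-1
        y3 = y**3
        g = np.array([y3[k:k + N] @ x for k in range(L)])
        f = np.linalg.solve(R, g)
        f /= np.linalg.norm(f)
    return f, np.convolve(x, f)

# Toy demonstration: a sparse spike train smeared by an unknown wavelet.
rng = np.random.default_rng(1)
spikes = np.zeros(400)
spikes[rng.choice(400, size=12, replace=False)] = rng.standard_normal(12) * 5
wavelet = np.array([0.3, 0.6, 1.0, 0.6, 0.3])
observed = np.convolve(spikes, wavelet, mode='same')

f, recovered = med_filter(observed)
# The deconvolved output should be spikier than the observed trace.
print(varimax(recovered) > varimax(observed))
```

Note that no prior assumption is made about the wavelet's delay characteristics: the filter is driven entirely by the spikiness criterion, which is the property the abstract highlights.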

461 citations