Author

Richard W. Hamming

Other affiliations: University of Edinburgh, Bell Labs
Bio: Richard W. Hamming is an academic researcher from the Naval Postgraduate School. The author has contributed to research in topics: Adaptive filter & Mathematical proof. The author has an h-index of 15 and has co-authored 31 publications receiving 11,123 citations. Previous affiliations of Richard W. Hamming include the University of Edinburgh & Bell Labs.

Papers
Journal ArticleDOI
TL;DR: The author was led to the study given in this paper from a consideration of large scale computing machines in which a large number of operations must be performed without a single error in the end result.
Abstract: The author was led to the study given in this paper from a consideration of large scale computing machines in which a large number of operations must be performed without a single error in the end result. This problem of “doing things right” on a large scale is not essentially new; in a telephone central office, for example, a very large number of operations are performed while the errors leading to wrong numbers are kept well under control, though they have not been completely eliminated. This has been achieved, in part, through the use of self-checking circuits. The occasional failure that escapes routine checking is still detected by the customer and will, if it persists, result in customer complaint, while if it is transient it will produce only occasional wrong numbers. At the same time the rest of the central office functions satisfactorily. In a digital computer, on the other hand, a single failure usually means the complete failure, in the sense that if it is detected no more computing can be done until the failure is located and corrected, while if it escapes detection then it invalidates all subsequent operations of the machine. Put in other words, in a telephone central office there are a number of parallel paths which are more or less independent of each other; in a digital machine there is usually a single long path which passes through the same piece of equipment many, many times before the answer is obtained.
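This is the 1950 paper that introduced error-detecting and error-correcting codes. As a purely illustrative sketch (not code from the paper), the following Python encodes 4 data bits into a Hamming(7,4) codeword and corrects any single-bit error; the function names and bit layout are assumptions made for the example.

def encode(data):                        # data: list of 4 bits
    # Positions 1..7; parity bits sit at positions 1, 2, 4, data at 3, 5, 6, 7.
    c = [0] * 8                          # index 0 unused, so positions are 1-based
    c[3], c[5], c[6], c[7] = data
    c[1] = c[3] ^ c[5] ^ c[7]            # parity over positions whose bit 1 is set
    c[2] = c[3] ^ c[6] ^ c[7]            # parity over positions whose bit 2 is set
    c[4] = c[5] ^ c[6] ^ c[7]            # parity over positions whose bit 4 is set
    return c[1:]

def correct(word):                       # word: list of 7 bits, at most one in error
    c = [0] + list(word)
    syndrome = 0                         # syndrome = binary position of the error
    for p in (1, 2, 4):
        if sum(c[i] for i in range(1, 8) if i & p) % 2:
            syndrome += p
    if syndrome:
        c[syndrome] ^= 1                 # flip the bit the syndrome points at
    return c[1:], syndrome

codeword = encode([1, 0, 1, 1])
codeword[2] ^= 1                         # inject a single-bit error at position 3
fixed, pos = correct(codeword)
print(fixed, "corrected error at position", pos)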

5,408 citations

Book
01 Jan 1962
TL;DR: This inexpensive paperback edition of a groundbreaking classic is unique in its emphasis on the frequency approach and its use in the solution of problems.
Abstract: From the Publisher: For this inexpensive paperback edition of a groundbreaking classic, the author has extensively rearranged, rewritten and enlarged the material. The book is unique in its emphasis on the frequency approach and its use in the solution of problems. Contents include: Fundamentals and Algorithms; Polynomial Approximation - Classical Theory; Fourier Approximation - Modern Theory; Exponential Approximation.

1,612 citations

Book
01 Jan 1980
Abstract: Coding and Information Theory.

938 citations

Journal ArticleDOI
01 Jul 1962

656 citations


Cited by
Journal ArticleDOI
TL;DR: In this paper, a method for finding the optical flow pattern is presented which assumes that the apparent velocity of the brightness pattern varies smoothly almost everywhere in the image, and an iterative implementation is shown which successfully computes the optical flow for a number of synthetic image sequences.
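A minimal sketch of that kind of iterative scheme (Horn-Schunck style) is given below in Python; the smoothness weight alpha, the iteration count, and the gradient estimates are illustrative assumptions, not values taken from the paper.

import numpy as np
from scipy.signal import convolve2d

def horn_schunck(im1, im2, alpha=1.0, n_iter=100):
    # Illustrative sketch: brightness-constancy term plus a smoothness term.
    im1, im2 = im1.astype(float), im2.astype(float)
    Ix = (np.gradient(im1, axis=1) + np.gradient(im2, axis=1)) / 2   # spatial gradients
    Iy = (np.gradient(im1, axis=0) + np.gradient(im2, axis=0)) / 2
    It = im2 - im1                                                   # temporal gradient
    u = np.zeros_like(im1)
    v = np.zeros_like(im1)
    kernel = np.array([[0, .25, 0], [.25, 0, .25], [0, .25, 0]])     # 4-neighbour average
    for _ in range(n_iter):
        u_avg = convolve2d(u, kernel, mode="same")   # local flow averages enforce smoothness
        v_avg = convolve2d(v, kernel, mode="same")
        t = (Ix * u_avg + Iy * v_avg + It) / (alpha**2 + Ix**2 + Iy**2)
        u = u_avg - Ix * t
        v = v_avg - Iy * t
    return u, v                                      # per-pixel flow components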

10,727 citations

Journal ArticleDOI
TL;DR: A computer movie was made to simulate urban growth in the Detroit region of Michigan, United States of America (1970).
Abstract: (1970). A Computer Movie Simulating Urban Growth in the Detroit Region. Economic Geography, Vol. 46 (Proceedings, International Geographical Union Commission on Quantitative Methods), pp. 234-240.

7,533 citations

Journal ArticleDOI
TL;DR: From incremental exercise tests on 10 subjects, the point of excess CO2 output (AT) predicted closely the lactate and HCO3- thresholds and could be more reliably determined by the V-slope method.
Abstract: Excess CO2 is generated when lactate is increased during exercise because its [H+] is buffered primarily by HCO3- (22 ml for each meq of lactic acid). We developed a method to detect the anaerobic threshold (AT), using computerized regression analysis of the slopes of the CO2 uptake (VCO2) vs. O2 uptake (VO2) plot, which detects the beginning of the excess CO2 output generated from the buffering of [H+], termed the V-slope method. From incremental exercise tests on 10 subjects, the point of excess CO2 output (AT) predicted closely the lactate and HCO3- thresholds. The mean gas exchange AT was found to correspond to a small increment of lactate above the mathematically defined lactate threshold [0.50 +/- 0.34 (SD) meq/l] and not to differ significantly from the estimated HCO3- threshold. The mean VO2 at AT computed by the V-slope analysis did not differ significantly from the mean value determined by a panel of six experienced reviewers using traditional visual methods, but the AT could be more reliably determined by the V-slope method. The respiratory compensation point, detected separately by examining the minute ventilation vs. VCO2 plot, was consistently higher than the AT (2.51 +/- 0.42 vs. 1.83 +/- 0.30 l/min of VO2). This method for determining the AT has significant advantages over others that depend on regular breathing pattern and respiratory chemosensitivity.
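As an illustration of the kind of regression analysis described (not the authors' implementation), the Python sketch below fits two straight lines to the VCO2-vs-VO2 data and takes the split with the lowest total squared error as the gas exchange AT; the minimum segment length is an assumption.

import numpy as np

def v_slope_at(vo2, vco2, min_pts=5):
    vo2, vco2 = np.asarray(vo2, float), np.asarray(vco2, float)
    order = np.argsort(vo2)                          # work in order of increasing VO2
    vo2, vco2 = vo2[order], vco2[order]
    best_split, best_err = None, np.inf
    for i in range(min_pts, len(vo2) - min_pts):
        err = 0.0
        for x, y in ((vo2[:i], vco2[:i]), (vo2[i:], vco2[i:])):
            slope, intercept = np.polyfit(x, y, 1)   # least-squares line for one segment
            err += np.sum((y - (slope * x + intercept)) ** 2)
        if err < best_err:
            best_split, best_err = i, err
    return vo2[best_split]                           # VO2 at the estimated breakpoint (AT)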

3,805 citations