Author

Gabor T. Herman

Bio: Gabor T. Herman is an academic researcher from the City University of New York. The author has contributed to research in the topics of iterative reconstruction and iterative methods, has an h-index of 70, and has co-authored 410 publications receiving 24,901 citations. Previous affiliations of Gabor T. Herman include the University at Buffalo and the University of Pennsylvania.


Papers
Journal ArticleDOI
TL;DR: The method works for totally asymmetric objects, requires little computer time or storage, and is also applicable to X-ray photography, where it may greatly reduce the exposure compared with current methods of body-section radiography.
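As a rough illustration of the kind of algebraic reconstruction technique this entry describes, the sketch below applies a Kaczmarz-style row-action update to a toy system of ray sums. The 2x2 image, relaxation parameter, and sweep count are illustrative assumptions, not the paper's original implementation.

```python
# Minimal sketch of a Kaczmarz-style ART update, assuming a toy system matrix A
# (one row per ray sum) and measured projections b; an illustration of the kind
# of iteration described, not the paper's original code.
import numpy as np

def art_reconstruct(A, b, n_sweeps=50, relax=1.0):
    """Cycle through the ray equations, projecting the current estimate
    onto each hyperplane a_i . x = b_i in turn."""
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for a_i, b_i in zip(A, b):
            denom = a_i @ a_i
            if denom > 0.0:
                x += relax * (b_i - a_i @ x) / denom * a_i
    return x

# Hypothetical 2x2 image probed by its row and column sums.
A = np.array([[1., 1., 0., 0.],   # sum of row 0
              [0., 0., 1., 1.],   # sum of row 1
              [1., 0., 1., 0.],   # sum of column 0
              [0., 1., 0., 1.]])  # sum of column 1
b = A @ np.array([1., 2., 3., 4.])
print(art_reconstruct(A, b).round(3))   # should approach [1, 2, 3, 4]
```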

2,609 citations

Book
01 Jan 1980
TL;DR: This book presents the fundamentals of computerized tomography and image reconstruction from projections.

2,025 citations

Book
01 Jan 1980
TL;DR: The article addresses the design, implementation, evaluation, and application of computer algorithms for solving the reconstruction problem in various biomedical areas and emphasizes the essential role of computers.
Abstract: This article covers the problem of reconstructing structures from data collected from transmitted or emitted radiation. The problem occurs in a wide range of areas, such as X-ray CT, emission tomography, photon migration imaging, and electron microscopic reconstruction. The article addresses the design, implementation, evaluation, and application of computer algorithms for solving the reconstruction problem in various biomedical areas and emphasizes the essential role of computers, because the underlying biomedical problems lead to mathematical problems in which the number of unknowns is in the millions.
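To make the scale argument concrete, here is a minimal sketch, not taken from the article, of how a discretized reconstruction problem becomes a sparse linear system A x = b that must be solved by computer. The crude single-bin projection model, the toy phantom, and the choice of LSQR as the iterative solver are all illustrative assumptions; real problems have millions of unknowns, as the abstract notes.

```python
# Hedged sketch: the reconstruction problem as a sparse linear system A x = b.
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.linalg import lsqr

n = 32                                        # n*n unknown pixel values
angles = np.linspace(0.0, np.pi, 48, endpoint=False)
ys, xs = np.mgrid[0:n, 0:n]
xc, yc = xs - n / 2 + 0.5, ys - n / 2 + 0.5   # pixel-centre coordinates

rows, cols, vals = [], [], []
for k, th in enumerate(angles):
    # Assign each pixel centre to the detector bin it projects onto at angle th.
    t = xc * np.cos(th) + yc * np.sin(th)
    bins = np.clip(np.floor(t + n / 2).astype(int), 0, n - 1).ravel()
    rows.append(k * n + bins)
    cols.append(np.arange(n * n))
    vals.append(np.ones(n * n))
A = coo_matrix((np.concatenate(vals), (np.concatenate(rows), np.concatenate(cols))),
               shape=(len(angles) * n, n * n)).tocsr()

phantom = np.zeros((n, n)); phantom[10:22, 10:22] = 1.0     # toy object
b = A @ phantom.ravel()                                      # simulated ray sums
x = lsqr(A, b, iter_lim=100)[0].reshape(n, n)                # iterative solve
print("mean absolute error:", np.abs(x - phantom).mean())
```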

1,217 citations

BookDOI
01 Jan 2009
TL;DR: This book presents the computational and mathematical procedures underlying data collection, image reconstruction, and image display in computerized tomography, including the fast calculation of a ray sum for a digitized picture, the task-oriented comparison of reconstruction algorithm performance, blob basis functions, and the linogram method for image reconstruction.
Abstract: This revised and updated text presents the computational and mathematical procedures underlying data collection, image reconstruction, and image display in computerized tomography. New topics include the fast calculation of a ray sum for a digitized picture, the task-oriented comparison of reconstruction algorithm performance, blob basis functions, and the linogram method for image reconstruction. Features: describes how projection data are obtained and how the resulting reconstructions are used; presents a comparative evaluation of reconstruction methods; investigates reconstruction algorithms; explores basis functions, functions to be optimized, norms, generalized inverses, least squares solutions, maximum entropy solutions, and most likely estimates; discusses SNARK09, a large programming system for image reconstruction; and concludes each chapter with helpful Notes and References sections. An excellent guide for practitioners, it can also serve as a textbook for an introductory graduate course.
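As one concrete example of the basis functions the abstract mentions, the following sketch evaluates a generalized Kaiser-Bessel ("blob") basis function of the kind used in such reconstructions. The parameter values (a, alpha, m) are illustrative assumptions, not the book's recommended choices.

```python
# Hedged sketch of a generalized Kaiser-Bessel ("blob") basis function.
import numpy as np
from scipy.special import iv   # modified Bessel function of the first kind

def blob(r, a=2.0, alpha=10.4, m=2):
    """Blob of support radius a, taper alpha, smoothness order m, at radius r."""
    r = np.asarray(r, dtype=float)
    s = np.sqrt(np.clip(1.0 - (r / a) ** 2, 0.0, None))   # zero outside the support
    return np.where(r <= a, (s ** m) * iv(m, alpha * s) / iv(m, alpha), 0.0)

radii = np.linspace(0.0, 2.5, 6)
print(np.round(blob(radii), 4))   # equals 1 at the centre and decays smoothly to 0 at r = a
```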

1,052 citations


Cited by
Journal ArticleDOI


08 Dec 2001-BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one; it seemed an odd beast, an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i —the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time—an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

Journal Article
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiments.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiments. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON

13,333 citations

Proceedings ArticleDOI
01 Aug 1987
TL;DR: In this paper, a divide-and-conquer approach generates inter-slice connectivity through a case table that defines triangle topology, and triangle vertices are computed using linear interpolation.
Abstract: We present a new algorithm, called marching cubes, that creates triangle models of constant density surfaces from 3D medical data. Using a divide-and-conquer approach to generate inter-slice connectivity, we create a case table that defines triangle topology. The algorithm processes the 3D medical data in scan-line order and calculates triangle vertices using linear interpolation. We find the gradient of the original data, normalize it, and use it as a basis for shading the models. The detail in images produced from the generated surface models is the result of maintaining the inter-slice connectivity, surface data, and gradient information present in the original 3D data. Results from computed tomography (CT), magnetic resonance (MR), and single-photon emission computed tomography (SPECT) illustrate the quality and functionality of marching cubes. We also discuss improvements that decrease processing time and add solid modeling capabilities.
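The abstract's two numerical ingredients, linear interpolation of triangle vertices along cube edges and central-difference gradients used as shading normals, can be sketched as follows. The toy volume and iso-value are illustrative assumptions, and the algorithm's full case table is omitted.

```python
# Hedged sketch of the per-edge vertex interpolation and gradient-normal steps
# that the marching cubes abstract describes; not the full algorithm.
import numpy as np

def interpolate_vertex(p0, p1, v0, v1, iso):
    """Linearly interpolate the iso-surface crossing along the cube edge p0-p1."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    if np.isclose(v0, v1):            # degenerate edge: fall back to the midpoint
        return 0.5 * (p0 + p1)
    t = (iso - v0) / (v1 - v0)        # fraction of the way from p0 to p1
    return p0 + t * (p1 - p0)

def gradient_normal(volume, x, y, z):
    """Central-difference gradient, normalized, used as the shading normal."""
    g = np.array([volume[x + 1, y, z] - volume[x - 1, y, z],
                  volume[x, y + 1, z] - volume[x, y - 1, z],
                  volume[x, y, z + 1] - volume[x, y, z - 1]], float)
    n = np.linalg.norm(g)
    return g / n if n > 0 else g

# Toy example on a hypothetical 3D density volume.
vol = np.fromfunction(lambda x, y, z: (x - 2.0) ** 2 + (y - 2.0) ** 2 + (z - 2.0) ** 2,
                      (5, 5, 5))
print(interpolate_vertex((2, 2, 0), (2, 2, 1), vol[2, 2, 0], vol[2, 2, 1], iso=3.0))
print(gradient_normal(vol, 2, 2, 1))
```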

13,231 citations

Journal Article
TL;DR: In this article, a convolution-backprojection formula is deduced for direct reconstruction of a three-dimensional density function from a set of two-dimensional projections; the formula has useful properties, including errors that are relatively small in many practical instances and a form that leads to convenient computation.
Abstract: A convolution-backprojection formula is deduced for direct reconstruction of a three-dimensional density function from a set of two-dimensional projections. The formula is approximate but has useful properties, including errors that are relatively small in many practical instances and a form that leads to convenient computation. It reduces to the standard fan-beam formula in the plane that is perpendicular to the axis of rotation and contains the point source. The algorithm is applied to a mathematical phantom as an example of its performance.
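The convolution-backprojection idea is easiest to see in its simpler two-dimensional parallel-beam form. The sketch below is that simpler relative (a basic filtered backprojection with a ramp filter), not the paper's cone-beam formula, and the phantom, nearest-neighbour sampling model, and filter choice are illustrative assumptions.

```python
# Hedged sketch of 2D parallel-beam filtered (convolution) backprojection.
import numpy as np

def filtered_backprojection(sinogram, angles):
    """sinogram: (n_angles, n_det) parallel projections; returns an n_det x n_det image."""
    n_det = sinogram.shape[1]
    # Ramp (Ram-Lak) filter applied in the Fourier domain.
    freqs = np.fft.fftfreq(n_det)
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * np.abs(freqs), axis=1))
    # Backproject: smear each filtered projection across the image along its angle.
    coords = np.arange(n_det) - n_det / 2 + 0.5
    X, Y = np.meshgrid(coords, coords)
    image = np.zeros((n_det, n_det))
    for proj, th in zip(filtered, angles):
        t = X * np.cos(th) + Y * np.sin(th) + n_det / 2 - 0.5
        idx = np.clip(np.round(t).astype(int), 0, n_det - 1)   # nearest-neighbour lookup
        image += proj[idx]
    return image * np.pi / len(angles)

# Tiny demo: forward-project a centred square phantom with the same crude model.
n = 64
phantom = np.zeros((n, n)); phantom[24:40, 24:40] = 1.0
angles = np.linspace(0.0, np.pi, 90, endpoint=False)
coords = np.arange(n) - n / 2 + 0.5
X, Y = np.meshgrid(coords, coords)
sino = np.array([np.bincount(
    np.clip(np.round(X * np.cos(th) + Y * np.sin(th) + n / 2 - 0.5).astype(int), 0, n - 1).ravel(),
    weights=phantom.ravel(), minlength=n) for th in angles])
recon = filtered_backprojection(sino, angles)
print(recon[32, 32] > recon[5, 5])   # the centre should come out brighter than a corner
```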

5,356 citations