Author

Allen Tannenbaum

Other affiliations: Boston University, General Electric, Indiana University
Bio: Allen Tannenbaum is an academic researcher from Stony Brook University. The author has contributed to research in the topics of image segmentation and segmentation. The author has an h-index of 70 and has co-authored 600 publications receiving 23,571 citations. Previous affiliations of Allen Tannenbaum include Boston University and General Electric.


Papers
Journal ArticleDOI
TL;DR: A natural framework that allows any region-based segmentation energy to be reformulated in a local way is proposed, and the localization of three well-known energies is demonstrated in order to illustrate how this framework can be applied to any energy.
Abstract: In this paper, we propose a natural framework that allows any region-based segmentation energy to be re-formulated in a local way. We consider local rather than global image statistics and evolve a contour based on local information. Localized contours are capable of segmenting objects with heterogeneous feature profiles that would be difficult to capture correctly using a standard global method. The presented technique is versatile enough to be used with any global region-based active contour energy and instill in it the benefits of localization. We describe this framework and demonstrate the localization of three well-known energies in order to illustrate how our framework can be applied to any energy. We then compare each localized energy to its global counterpart to show the improvements that can be achieved. Next, an in-depth study of the behaviors of these energies in response to the degree of localization is given. Finally, we show results on challenging images to illustrate the robust and accurate segmentations that are possible with this new class of active contour models.
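The localization idea lends itself to a short sketch: gather image statistics only inside a ball around each pixel and drive the contour with the resulting local mean-separation force. The snippet below is a minimal NumPy/SciPy illustration of that idea, assuming a localized Chan-Vese-style energy with the interior taken as phi > 0; the function names, kernel radius, and omission of the curvature penalty are choices made here for brevity, not details of the paper's implementation.

```python
# Minimal sketch of a *localized* region-based force: statistics are gathered
# only inside a disk of radius r around each pixel rather than globally.
import numpy as np
from scipy.signal import fftconvolve

def disk_kernel(r):
    """Binary disk of radius r, the local neighbourhood B(x, y)."""
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    return (x * x + y * y <= r * r).astype(float)

def localized_cv_speed(I, phi, r=10, eps=1e-8):
    """Force of a localized mean-separation (Chan-Vese-like) energy.

    phi > 0 is taken as the interior. Returns (I - u)^2 - (I - v)^2, where
    u and v are the interior / exterior means of I restricted to the local disk.
    """
    K = disk_kernel(r)
    Hin = (phi > 0).astype(float)
    Hout = 1.0 - Hin
    u = fftconvolve(Hin * I, K, mode="same") / (fftconvolve(Hin, K, mode="same") + eps)
    v = fftconvolve(Hout * I, K, mode="same") / (fftconvolve(Hout, K, mode="same") + eps)
    return (I - u) ** 2 - (I - v) ** 2

def evolve(I, phi, steps=200, dt=0.4, r=10, sigma=1.5):
    """Explicit gradient descent on the localized energy.

    The smoothed Dirac factor confines the update to a band around the zero
    level set; a curvature penalty would normally be added for smoothness.
    """
    for _ in range(steps):
        delta = sigma / (np.pi * (sigma ** 2 + phi ** 2))
        phi = phi - dt * delta * localized_cv_speed(I, phi, r)
    return phi
```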

1,149 citations

Proceedings ArticleDOI
20 Jun 1995
TL;DR: This paper analyzes geometric active contour models discussed previously from a curve evolution point of view and proposes some modifications based on gradient flows relative to certain new feature-based Riemannian metrics, leading to a novel snake paradigm in which the feature of interest may be considered to lie at the bottom of a potential well.
Abstract: In this paper, we analyze the geometric active contour models discussed previously from a curve evolution point of view and propose some modifications based on gradient flows relative to certain new feature-based Riemannian metrics. This leads to a novel snake paradigm in which the feature of interest may be considered to lie at the bottom of a potential well. Thus the snake is attracted very naturally and efficiently to the desired feature. Moreover, we consider some 3-D active surface models based on these ideas.
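The "potential well" can be made concrete with a feature-based conformal factor computed from smoothed image gradients. The sketch below uses the common choice g = 1/(1 + |grad(G_sigma * I)|^2), which is one example of such a metric; the exact feature and weighting used in the paper may differ.

```python
# Build an edge potential g that is ~1 in flat regions and ~0 on edges, so
# that image edges sit at the bottom of the potential well.
import numpy as np
from scipy.ndimage import gaussian_filter

def edge_potential(I, sigma=2.0):
    """Conformal factor g(x) derived from smoothed image gradients."""
    Is = gaussian_filter(I.astype(float), sigma)   # G_sigma * I
    gy, gx = np.gradient(Is)
    return 1.0 / (1.0 + gx ** 2 + gy ** 2)

# A contour evolving by gradient flow with respect to the weighted length
# integral of g * ds is attracted to minima of g, i.e. to the image edges.
```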

754 citations

Journal ArticleDOI
TL;DR: This work employs the previously formulated geometric active contour models for edge detection and segmentation of magnetic resonance imaging (MRI), computed tomography (CT), and ultrasound medical imagery; the feature-based metrics lead to a novel snake paradigm in which the feature of interest may be considered to lie at the bottom of a potential well.
Abstract: We employ the new geometric active contour models, previously formulated, for edge detection and segmentation of magnetic resonance imaging (MRI), computed tomography (CT), and ultrasound medical imagery. Our method is based on defining feature-based metrics on a given image which in turn leads to a novel snake paradigm in which the feature of interest may be considered to lie at the bottom of a potential well. Thus, the snake is attracted very quickly and efficiently to the desired feature.
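As a rough illustration of how such a snake is evolved numerically, the following is one generic explicit level-set step for a flow of the form phi_t = g (kappa + c) |grad phi| + grad g . grad phi, with g an edge potential as in the previous sketch and c a constant inflation speed. This is a textbook-style discretization for illustration, not the authors' scheme.

```python
# One explicit level-set update for a geometric active contour driven by an
# edge potential g (small on the desired feature).
import numpy as np

def curvature(phi, eps=1e-8):
    """kappa = div(grad phi / |grad phi|), via central differences."""
    gy, gx = np.gradient(phi)
    norm = np.sqrt(gx ** 2 + gy ** 2) + eps
    return np.gradient(gx / norm, axis=1) + np.gradient(gy / norm, axis=0)

def geometric_snake_step(phi, g, c=0.2, dt=0.2):
    """phi_t = g * (kappa + c) * |grad phi| + grad g . grad phi."""
    gy, gx = np.gradient(phi)
    grad_norm = np.sqrt(gx ** 2 + gy ** 2)
    g_y, g_x = np.gradient(g)
    advect = g_x * gx + g_y * gy     # pulls the front into the potential well
    return phi + dt * (g * (curvature(phi) + c) * grad_norm + advect)
```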

676 citations

Journal ArticleDOI
TL;DR: In this article, the use and design of linear periodic time-varying controllers for the feedback control of linear time-invariant discrete-time plants is considered, and the authors show that for a large class of robustness problems, periodic compensators are superior to time-invariant ones.
Abstract: This paper considers the use and design of linear periodic time-varying controllers for the feedback control of linear time-invariant discrete-time plants. We will show that for a large class of robustness problems, periodic compensators are superior to time-invariant ones. We will give explicit design techniques which can be easily implemented. In the context of periodic controllers, we also consider the strong and simultaneous stabilization problems. Finally, we show that for the problem of weighted sensitivity minimization for linear time-invariant plants, time-varying controllers offer no advantage over the time-invariant ones.
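To make the notion of a linear periodic time-varying controller concrete, the toy simulation below closes the loop around a scalar discrete-time plant with a feedback gain that cycles through a fixed schedule of period N. The plant and gain values are arbitrary illustrative numbers and do not reflect the design techniques of the paper.

```python
# Toy comparison: a constant feedback gain vs. a 2-periodic gain schedule on
# the scalar plant x[k+1] = a x[k] + b u[k].
import numpy as np

def simulate(a, b, gains, x0=1.0, steps=20):
    """Closed loop with u[k] = -gains[k mod N] * x[k]."""
    N = len(gains)
    x = np.empty(steps + 1)
    x[0] = x0
    for k in range(steps):
        u = -gains[k % N] * x[k]
        x[k + 1] = a * x[k] + b * u
    return x

# Time-invariant controller: a single gain repeated forever.
x_ti = simulate(a=1.5, b=1.0, gains=[1.2])
# Periodic controller: the gain alternates every sample (period N = 2).
x_pv = simulate(a=1.5, b=1.0, gains=[0.7, 1.7])
print(abs(x_ti[-1]), abs(x_pv[-1]))   # both decay; only the definition is illustrated
```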

672 citations


Cited by
Journal ArticleDOI

08 Dec 2001-BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one: an odd beast at first encounter, an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i —the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time—an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

Journal ArticleDOI
TL;DR: Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.
Abstract: Machine Learning is the study of methods for programming computers to learn. Computers are applied to a wide range of tasks, and for most of these it is relatively easy for programmers to design and implement the necessary software. However, there are many tasks for which this is difficult or impossible. These can be divided into four general categories. First, there are problems for which there exist no human experts. For example, in modern automated manufacturing facilities, there is a need to predict machine failures before they occur by analyzing sensor readings. Because the machines are new, there are no human experts who can be interviewed by a programmer to provide the knowledge necessary to build a computer system. A machine learning system can study recorded data and subsequent machine failures and learn prediction rules. Second, there are problems where human experts exist, but where they are unable to explain their expertise. This is the case in many perceptual tasks, such as speech recognition, hand-writing recognition, and natural language understanding. Virtually all humans exhibit expert-level abilities on these tasks, but none of them can describe the detailed steps that they follow as they perform them. Fortunately, humans can provide machines with examples of the inputs and correct outputs for these tasks, so machine learning algorithms can learn to map the inputs to the outputs. Third, there are problems where phenomena are changing rapidly. In finance, for example, people would like to predict the future behavior of the stock market, of consumer purchases, or of exchange rates. These behaviors change frequently, so that even if a programmer could construct a good predictive computer program, it would need to be rewritten frequently. A learning program can relieve the programmer of this burden by constantly modifying and tuning a set of learned prediction rules. Fourth, there are applications that need to be customized for each computer user separately. Consider, for example, a program to filter unwanted electronic mail messages. Different users will need different filters. It is unreasonable to expect each user to program his or her own rules, and it is infeasible to provide every user with a software engineer to keep the rules up-to-date. A machine learning system can learn which mail messages the user rejects and maintain the filtering rules automatically. Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis. Statistics focuses on understanding the phenomena that have generated the data, often with the goal of testing different hypotheses about those phenomena. Data mining seeks to find patterns in the data that are understandable by people. Psychological studies of human learning aspire to understand the mechanisms underlying the various learning behaviors exhibited by people (concept learning, skill acquisition, strategy change, etc.).
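The mail-filtering example in the fourth category can be made concrete in a few lines. The sketch below uses scikit-learn's bag-of-words features and a naive Bayes classifier on toy messages; the choice of tools and data is purely illustrative.

```python
# A filter learned from examples of messages the user kept or rejected.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

messages = ["win a free prize now", "meeting moved to 3pm",
            "free offer, click now", "lunch tomorrow?"]
rejected = [1, 0, 1, 0]   # 1 = the user rejected the message, 0 = kept it

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(messages)      # word-count features
model = MultinomialNB().fit(X, rejected)    # learn the filtering rule

print(model.predict(vectorizer.transform(["free prize offer"])))  # expected: [1]
```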

13,246 citations

Journal ArticleDOI
TL;DR: A new model for active contours to detect objects in a given image is proposed, based on techniques of curve evolution, the Mumford-Shah (1989) functional for segmentation, and level sets; it can detect objects whose boundaries are not necessarily defined by the gradient.
Abstract: We propose a new model for active contours to detect objects in a given image, based on techniques of curve evolution, Mumford-Shah (1989) functional for segmentation and level sets. Our model can detect objects whose boundaries are not necessarily defined by the gradient. We minimize an energy which can be seen as a particular case of the minimal partition problem. In the level set formulation, the problem becomes a "mean-curvature flow"-like evolving the active contour, which will stop on the desired boundary. However, the stopping term does not depend on the gradient of the image, as in the classical active contour models, but is instead related to a particular segmentation of the image. We give a numerical algorithm using finite differences. Finally, we present various experimental results and in particular some examples for which the classical snakes methods based on the gradient are not applicable. Also, the initial curve can be anywhere in the image, and interior contours are automatically detected.
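A compact sketch of the model's gradient-descent update helps make the point that the stopping term depends on region means rather than gradients. The explicit scheme below follows the standard Chan-Vese formulation with unit region weights and no area term; the step size and the smoothed Heaviside/Dirac are simplifications chosen here for illustration.

```python
# One explicit Chan-Vese update: the force compares each pixel to the mean
# intensity inside vs. outside the contour, with no dependence on gradients.
import numpy as np

def chan_vese_step(I, phi, mu=0.2, dt=0.5, eps=1.0):
    H = 0.5 * (1 + (2 / np.pi) * np.arctan(phi / eps))   # smoothed Heaviside
    delta = (eps / np.pi) / (eps ** 2 + phi ** 2)         # smoothed Dirac
    c1 = (I * H).sum() / (H.sum() + 1e-8)                 # mean inside (phi > 0)
    c2 = (I * (1 - H)).sum() / ((1 - H).sum() + 1e-8)     # mean outside
    gy, gx = np.gradient(phi)                             # curvature of level sets
    norm = np.sqrt(gx ** 2 + gy ** 2) + 1e-8
    kappa = np.gradient(gx / norm, axis=1) + np.gradient(gy / norm, axis=0)
    # gradient descent on the Chan-Vese energy
    return phi + dt * delta * (mu * kappa - (I - c1) ** 2 + (I - c2) ** 2)

# Iterating chan_vese_step from any initial phi (e.g. a circle placed anywhere
# in the image) drives the zero level set toward object boundaries.
```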

10,404 citations

Christopher M. Bishop
01 Jan 2006
TL;DR: Probability distributions, linear models for regression and classification, neural networks, kernel methods, graphical models, mixture models and EM, approximate inference, sampling methods, continuous latent variables, sequential data, and combining models are covered in this textbook.
Abstract: Probability Distributions.- Linear Models for Regression.- Linear Models for Classification.- Neural Networks.- Kernel Methods.- Sparse Kernel Machines.- Graphical Models.- Mixture Models and EM.- Approximate Inference.- Sampling Methods.- Continuous Latent Variables.- Sequential Data.- Combining Models.

10,141 citations

Journal ArticleDOI
20 Jun 1995
TL;DR: A novel scheme for the detection of object boundaries is presented, based on active contours evolving in time according to intrinsic geometric measures of the image, allowing stable boundary detection even when image gradients suffer from large variations, including gaps.
Abstract: A novel scheme for the detection of object boundaries is presented. The technique is based on active contours deforming according to intrinsic geometric measures of the image. The evolving contours naturally split and merge, allowing the simultaneous detection of several objects and of both interior and exterior boundaries. The proposed approach is based on the relation between active contours and the computation of geodesics or minimal distance curves. The minimal distance curve lies in a Riemannian space whose metric is defined by the image content. This geodesic approach to object segmentation allows one to connect classical "snakes" based on energy minimization with geometric active contours based on the theory of curve evolution. Previous models of geometric active contours are improved, as shown by a number of examples. Formal results concerning existence, uniqueness, stability, and correctness of the evolution are presented as well.
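The connection to minimal distance curves can be illustrated with a plain shortest-path computation: with an image-derived cost that is small on edges, the minimal-cost path between two boundary points tends to trace the object boundary. The Dijkstra search below on a 4-connected pixel grid is only meant to illustrate that geodesic view, not the level-set evolution used in the paper; the cost field g is assumed to come from an edge potential like the one sketched earlier.

```python
# Shortest path on the pixel grid with per-pixel cost g (small on edges).
import heapq
import numpy as np

def geodesic_path(g, start, goal):
    """Minimal-cost 4-connected path from start to goal on the cost field g."""
    h, w = g.shape
    dist = np.full((h, w), np.inf)
    prev = {}
    dist[start] = g[start]
    heap = [(g[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            break
        if d > dist[r, c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and d + g[nr, nc] < dist[nr, nc]:
                dist[nr, nc] = d + g[nr, nc]
                prev[(nr, nc)] = (r, c)
                heapq.heappush(heap, (dist[nr, nc], (nr, nc)))
    # walk back from goal to start to recover the minimal distance curve
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]
```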

5,566 citations