Author

Nancy M. Amato

Bio: Nancy M. Amato is an academic researcher from the University of Illinois at Urbana–Champaign. The author has contributed to research on topics including motion planning and probabilistic roadmap methods. The author has an h-index of 51 and has co-authored 268 publications receiving 8,988 citations. Previous affiliations of Nancy M. Amato include Texas A&M University and Google.


Papers
Proceedings Article
01 Aug 1998
TL;DR: This paper presents a new class of randomized path planning methods, known as Probabilistic Roadmap Methods (PRMs), which use randomization to construct a graph of representative paths in C-space whose vertices correspond to collision-free configurations of the robot.
Abstract: Recently, a new class of randomized path planning methods, known as Probabilistic Roadmap Methods (PRMs), has shown great potential for solving complicated high-dimensional problems. PRMs use randomization (usually during preprocessing) to construct a graph of representative paths in C-space (a roadmap) whose vertices correspond to collision-free configurations of the robot and in which two vertices are connected by an edge if a path between the two corresponding configurations can be found by a local planning method.
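To make the construction concrete, here is a minimal PRM sketch for a 2-D point robot among disk obstacles, under stated assumptions: the obstacle set, the straight-line local planner, and all names (collision_free, local_path_free, build_roadmap) are illustrative and not taken from the paper.

```python
import math
import random

OBSTACLES = [((5.0, 5.0), 2.0), ((2.0, 8.0), 1.0)]  # (center, radius) disks
BOUNDS = (0.0, 10.0)

def collision_free(q):
    """A configuration is free if it lies outside every obstacle disk."""
    return all(math.dist(q, c) > r for c, r in OBSTACLES)

def local_path_free(a, b, step=0.05):
    """Straight-line local planner: check evenly spaced points on segment a-b."""
    n = max(1, int(math.dist(a, b) / step))
    return all(
        collision_free((a[0] + (b[0] - a[0]) * t / n, a[1] + (b[1] - a[1]) * t / n))
        for t in range(n + 1)
    )

def build_roadmap(n_nodes=200, k=10):
    """Sample free configurations, then try to connect each to its k nearest."""
    nodes = []
    while len(nodes) < n_nodes:
        q = (random.uniform(*BOUNDS), random.uniform(*BOUNDS))
        if collision_free(q):
            nodes.append(q)
    edges = {i: [] for i in range(n_nodes)}
    for i, q in enumerate(nodes):
        nearest = sorted(range(n_nodes), key=lambda j: math.dist(q, nodes[j]))[1:k + 1]
        for j in nearest:
            if j not in edges[i] and local_path_free(q, nodes[j]):
                edges[i].append(j)
                edges[j].append(i)
    return nodes, edges
```

A query would then connect the start and goal configurations to the roadmap and search the resulting graph for a path.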

533 citations

Proceedings ArticleDOI
22 Apr 1996
TL;DR: A new randomized roadmap method for motion planning for many-DOF robots that can be used to obtain high-quality roadmaps even when C-space is crowded; the main novelty of the approach is that roadmap candidate points are chosen on C-obstacle surfaces.
Abstract: This paper presents a new randomized roadmap method for motion planning for many-DOF robots that can be used to obtain high-quality roadmaps even when C-space is crowded. The main novelty in the authors' approach is that roadmap candidate points are chosen on C-obstacle surfaces. As a consequence, the roadmap is likely to contain difficult paths, such as those traversing long, narrow passages in C-space. The approach can be used both for collision-free path planning and for manipulation planning of contact tasks. Experimental results with a planar articulated 6-DOF robot show that, after preprocessing, difficult path planning operations can often be carried out in less than a second.
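One way to read the surface-sampling idea: take a colliding sample, push it out of the C-obstacle along a random ray, and binary-search the free/colliding crossing so the node lands near the boundary. The sketch below does this for a 2-D point robot among disk obstacles; the geometry and every name are illustrative assumptions, not the paper's actual algorithm for articulated robots.

```python
import math
import random

OBSTACLES = [((5.0, 5.0), 2.0), ((2.0, 8.0), 1.0)]  # (center, radius) disks

def collision_free(q):
    return all(math.dist(q, c) > r for c, r in OBSTACLES)

def surface_sample(q_in, max_dist=5.0, tol=1e-3):
    """q_in is assumed colliding; return a free point near the C-obstacle surface."""
    a = random.uniform(0.0, 2.0 * math.pi)
    d = (math.cos(a), math.sin(a))                  # random outward ray
    lo, hi = 0.0, max_dist
    if not collision_free((q_in[0] + d[0] * hi, q_in[1] + d[1] * hi)):
        return None                                 # ray never left the obstacle; retry
    while hi - lo > tol:                            # binary-search the crossing
        mid = 0.5 * (lo + hi)
        if collision_free((q_in[0] + d[0] * mid, q_in[1] + d[1] * mid)):
            hi = mid                                # free: tighten from outside
        else:
            lo = mid                                # colliding: tighten from inside
    return (q_in[0] + d[0] * hi, q_in[1] + d[1] * hi)
```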

403 citations

Proceedings ArticleDOI
10 May 1999
TL;DR: A new method of sampling the configuration space in which randomly generated configurations are retracted onto the medial axis of the free space is proposed, and it is shown that sampling and retracting in this manner increases the number of nodes found in small volume corridors in a way that is independent of the volume of the corridor and depends only on the characteristics of the obstacles bounding it.
Abstract: Probabilistic roadmap planning methods have been shown to perform well in a number of practical situations, but their performance degrades when paths are required to pass through narrow passages in the free space. We propose a new method of sampling the configuration space in which randomly generated configurations, free or not, are retracted onto the medial axis of the free space. We give algorithms that perform this retraction while avoiding explicit computation of the medial axis, and we show that sampling and retracting in this manner increases the number of nodes found in small volume corridors in a way that is independent of the volume of the corridor and depends only on the characteristics of the obstacles bounding it. Theoretical and experimental results are given to show that this improves performance on problems requiring traversal of narrow passages.
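A hedged sketch of the retraction step, assuming a 2-D point robot among disks inside a square workspace: walk a free sample away from its nearest obstacle feature until a different feature becomes (nearly) equidistant, which approximates a point on the medial axis without ever computing the axis explicitly. All names and the geometry are illustrative; the paper's algorithms also handle colliding samples.

```python
import math

OBSTACLES = [((5.0, 5.0), 2.0), ((2.0, 8.0), 1.0)]  # (center, radius) disks
LO, HI = 0.0, 10.0                                   # square workspace walls

def nearest_feature(q):
    """Return (clearance, feature_id, outward_direction) of the closest feature."""
    feats = []
    for i, (c, r) in enumerate(OBSTACLES):
        d = max(math.dist(q, c), 1e-9)
        feats.append((d - r, ('disk', i), ((q[0] - c[0]) / d, (q[1] - c[1]) / d)))
    feats.append((q[0] - LO, ('wall', 'left'),   (1.0, 0.0)))
    feats.append((HI - q[0], ('wall', 'right'),  (-1.0, 0.0)))
    feats.append((q[1] - LO, ('wall', 'bottom'), (0.0, 1.0)))
    feats.append((HI - q[1], ('wall', 'top'),    (0.0, -1.0)))
    return min(feats, key=lambda f: f[0])

def retract_to_medial_axis(q, step=0.02, max_steps=1000):
    """Walk q away from its nearest feature until another feature takes over."""
    _, feat0, direction = nearest_feature(q)
    for _ in range(max_steps):
        q = (q[0] + direction[0] * step, q[1] + direction[1] * step)
        _, feat, direction = nearest_feature(q)
        if feat != feat0:   # two features now (nearly) equidistant from q:
            return q        # q lies approximately on the medial axis
    return q
```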

379 citations

Journal ArticleDOI
TL;DR: FIRM, a multi-query approach for planning under uncertainty that is a belief-space variant of probabilistic roadmap methods, is introduced as an abstract framework; the so-called SLQG-FIRM, a concrete instantiation of FIRM, focuses on kinematic systems and then extends to dynamical systems by sampling in the equilibrium space.
Abstract: In this paper we present the feedback-based information roadmap (FIRM), a multi-query approach for planning under uncertainty which is a belief-space variant of probabilistic roadmap methods. The crucial feature of FIRM is that the costs associated with the edges are independent of each other, and in this sense it is the first method that generates a graph in belief space that preserves the optimal substructure property. From a practical point of view, FIRM is a robust and reliable planning framework. It is robust since the solution is a feedback policy and there is no need for expensive replanning. It is reliable because accurate collision probabilities can be computed along the edges. In addition, FIRM is a scalable framework, where the complexity of planning with FIRM is a constant multiplier of the complexity of planning with PRM. In this paper, FIRM is introduced as an abstract framework. As a concrete instantiation of FIRM, we adopt stationary linear quadratic Gaussian (SLQG) controllers as belief stabilizers and introduce the so-called SLQG-FIRM. In SLQG-FIRM we focus on kinematic systems and then extend to dynamical systems by sampling in the equilibrium space. We investigate the performance of SLQG-FIRM in different scenarios.
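The practical payoff of path-independent edge costs is that ordinary shortest-path search applies directly to the belief graph, just as it does on a standard PRM. The toy graph and costs below are invented for illustration and stand in for FIRM's actual belief nodes and edge controllers.

```python
import heapq

# belief_graph[u] = list of (v, edge_cost); costs are path-independent by design
belief_graph = {
    'b0': [('b1', 2.0), ('b2', 5.0)],
    'b1': [('b2', 1.5), ('b3', 4.0)],
    'b2': [('b3', 1.0)],
    'b3': [],
}

def dijkstra(graph, start, goal):
    """Standard Dijkstra; valid here only because edge costs are independent."""
    dist, pq = {start: 0.0}, [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            return d
        if d > dist.get(u, float('inf')):
            continue
        for v, w in graph[u]:
            if d + w < dist.get(v, float('inf')):
                dist[v] = d + w
                heapq.heappush(pq, (d + w, v))
    return float('inf')

print(dijkstra(belief_graph, 'b0', 'b3'))  # -> 4.5 via b0 -> b1 -> b2 -> b3
```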

279 citations

Proceedings ArticleDOI
14 May 1998
TL;DR: A new local planning method called rotate-at-s is proposed that often outperforms the common straight-line-in-C-space method in crowded environments; recommendations are also given for selecting appropriate combinations of distance metrics and local planners for use in motion planning methods, particularly probabilistic roadmap methods.
Abstract: This paper presents a comparative evaluation of different distance metrics and local planners within the context of probabilistic roadmap methods for planning the motion of rigid objects in three-dimensional workspaces. The study concentrates on cluttered three-dimensional workspaces typical of, for example, virtual prototyping applications such as maintainability studies in mechanical CAD designs. Our results include recommendations for selecting appropriate combinations of distance metrics and local planners for such applications. Our study of distance metrics shows that the importance of the translational distance increases relative to the rotational distance as the environment becomes more crowded. We find that each local planner makes some connections that none of the others does, indicating that better-connected roadmaps will be constructed using multiple local planners. We propose a new local planning method we call rotate-at-s that often outperforms the common straight-line in C-space method in crowded environments.
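A minimal sketch of the rotate-at-s idea for a rigid body in the plane, as one illustrative reading of the method: translate to the fraction s of the straight-line segment, perform the entire rotation there, then finish the translation. The function below is an assumption for SE(2); the paper treats rigid bodies in 3-D workspaces, where the rotation step is analogous.

```python
import math

def rotate_at_s(qa, qb, s, n=20):
    """Yield configurations (x, y, theta) along the rotate-at-s path from qa to qb."""
    (xa, ya, ta), (xb, yb, tb) = qa, qb
    xs, ys = xa + s * (xb - xa), ya + s * (yb - ya)            # rotation waypoint
    dtheta = math.atan2(math.sin(tb - ta), math.cos(tb - ta))  # shortest turn
    for i in range(n + 1):                      # leg 1: translate, orientation fixed
        u = i / n
        yield (xa + u * (xs - xa), ya + u * (ys - ya), ta)
    for i in range(1, n + 1):                   # leg 2: rotate in place at s
        yield (xs, ys, ta + (i / n) * dtheta)
    for i in range(1, n + 1):                   # leg 3: translate at goal orientation
        u = i / n
        yield (xs + u * (xb - xs), ys + u * (yb - ys), ta + dtheta)

# Each yielded configuration would be collision-checked, as with any local planner;
# s = 0 and s = 1 recover "rotate first" and "rotate last" as special cases.
```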

181 citations


Cited by
01 May 1993
TL;DR: Comparing the results to the fastest reported vectorized Cray Y-MP and C90 algorithm shows that the current generation of parallel machines is competitive with conventional vector supercomputers even for small problems.
Abstract: Three parallel algorithms for classical molecular dynamics are presented. The first assigns each processor a fixed subset of atoms; the second assigns each a fixed subset of inter-atomic forces to compute; the third assigns each a fixed spatial region. The algorithms are suitable for molecular dynamics models which can be difficult to parallelize efficiently, namely those with short-range forces where the neighbors of each atom change rapidly. They can be implemented on any distributed-memory parallel machine which allows for message-passing of data between independently executing processors. The algorithms are tested on a standard Lennard-Jones benchmark problem for system sizes ranging from 500 to 100,000,000 atoms on several parallel supercomputers: the nCUBE 2, Intel iPSC/860 and Paragon, and Cray T3D. Comparing the results to the fastest reported vectorized Cray Y-MP and C90 algorithm shows that the current generation of parallel machines is competitive with conventional vector supercomputers even for small problems. For large problems, the spatial algorithm achieves parallel efficiencies of 90% and a 1840-node Intel Paragon performs up to 165 times faster than a single Cray C90 processor. Trade-offs between the three algorithms and guidelines for adapting them to more complex molecular dynamics simulations are also discussed.
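The spatial-decomposition strategy rests on the standard cell-list idea: bin atoms into cells at least as large as the force cutoff, so each atom interacts only with atoms in its own and neighboring cells. The serial 2-D sketch below, with illustrative names and a non-periodic box, shows just that binning step; in the parallel version each processor would own a block of cells and exchange boundary atoms by message passing, which is omitted here.

```python
import math
from collections import defaultdict

def short_range_pairs(positions, cutoff, box):
    """Return index pairs (i, j), i < j, closer than cutoff, via cell lists."""
    ncell = max(1, int(box / cutoff))        # cells no smaller than the cutoff
    size = box / ncell
    cells = defaultdict(list)
    for i, (x, y) in enumerate(positions):   # bin each atom into its cell
        cells[(int(x / size), int(y / size))].append(i)
    pairs = set()
    for (cx, cy), members in cells.items():
        for dx in (-1, 0, 1):                # own cell plus 8 neighbors (2-D)
            for dy in (-1, 0, 1):
                for j in cells.get((cx + dx, cy + dy), ()):
                    for i in members:
                        if i < j and math.dist(positions[i], positions[j]) < cutoff:
                            pairs.add((i, j))
    return pairs
```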

29,323 citations

Journal ArticleDOI
TL;DR: Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.
Abstract: Machine Learning is the study of methods for programming computers to learn. Computers are applied to a wide range of tasks, and for most of these it is relatively easy for programmers to design and implement the necessary software. However, there are many tasks for which this is difficult or impossible. These can be divided into four general categories. First, there are problems for which there exist no human experts. For example, in modern automated manufacturing facilities, there is a need to predict machine failures before they occur by analyzing sensor readings. Because the machines are new, there are no human experts who can be interviewed by a programmer to provide the knowledge necessary to build a computer system. A machine learning system can study recorded data and subsequent machine failures and learn prediction rules. Second, there are problems where human experts exist, but where they are unable to explain their expertise. This is the case in many perceptual tasks, such as speech recognition, hand-writing recognition, and natural language understanding. Virtually all humans exhibit expert-level abilities on these tasks, but none of them can describe the detailed steps that they follow as they perform them. Fortunately, humans can provide machines with examples of the inputs and correct outputs for these tasks, so machine learning algorithms can learn to map the inputs to the outputs. Third, there are problems where phenomena are changing rapidly. In finance, for example, people would like to predict the future behavior of the stock market, of consumer purchases, or of exchange rates. These behaviors change frequently, so that even if a programmer could construct a good predictive computer program, it would need to be rewritten frequently. A learning program can relieve the programmer of this burden by constantly modifying and tuning a set of learned prediction rules. Fourth, there are applications that need to be customized for each computer user separately. Consider, for example, a program to filter unwanted electronic mail messages. Different users will need different filters. It is unreasonable to expect each user to program his or her own rules, and it is infeasible to provide every user with a software engineer to keep the rules up-to-date. A machine learning system can learn which mail messages the user rejects and maintain the filtering rules automatically. Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis. Statistics focuses on understanding the phenomena that have generated the data, often with the goal of testing different hypotheses about those phenomena. Data mining seeks to find patterns in the data that are understandable by people. Psychological studies of human learning aspire to understand the mechanisms underlying the various learning behaviors exhibited by people (concept learning, skill acquisition, strategy change, etc.).

13,246 citations

Journal ArticleDOI
01 Apr 1988 - Nature
TL;DR: In this paper, a sedimentological core and petrographic characterisation of samples from eleven boreholes from the Lower Carboniferous of Bowland Basin (Northwest England) is presented.
Abstract: Deposits of clastic carbonate-dominated (calciclastic) sedimentary slope systems in the rock record have been identified mostly as linearly-consistent carbonate apron deposits, even though most ancient clastic carbonate slope deposits better fit submarine fan systems. Calciclastic submarine fans are consequently rarely described and are poorly understood, and very little is known about mud-dominated calciclastic submarine fan systems in particular. This study presents a sedimentological core and petrographic characterisation of samples from eleven boreholes from the Lower Carboniferous of the Bowland Basin (Northwest England) that reveals a >250 m thick calciturbidite complex deposited in a calciclastic submarine fan setting. Seven facies are recognised from core and thin-section characterisation and are grouped into three carbonate turbidite sequences. They include: 1) calciturbidites, comprising mostly high- to low-density, wavy-laminated, bioclast-rich facies; 2) low-density densite mudstones, characterised by planar-laminated and unlaminated mud-dominated facies; and 3) calcidebrites, which are muddy or hyper-concentrated debris-flow deposits occurring as poorly-sorted, chaotic, mud-supported floatstones. These

9,929 citations

Proceedings ArticleDOI
22 Jan 2006
TL;DR: Some of the major results in random graphs and some of the more challenging open problems are reviewed, along with newer models, including those related to the WWW.
Abstract: We will review some of the major results in random graphs and some of the more challenging open problems. We will cover algorithmic and structural questions. We will touch on newer models, including those related to the WWW.
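As a minimal illustration of the classical model such a survey builds on, the sketch below samples an Erdős–Rényi random graph G(n, p), in which each of the n(n-1)/2 possible edges appears independently with probability p; the function name is a hypothetical choice for illustration.

```python
import itertools
import random

def gnp(n, p):
    """Sample G(n, p): keep each of the n*(n-1)/2 possible edges with prob. p."""
    return [e for e in itertools.combinations(range(n), 2) if random.random() < p]

edges = gnp(100, 0.05)  # around p = ln(n)/n the graph becomes connected w.h.p.
```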

7,116 citations

MonographDOI
01 Jan 2006
TL;DR: This coherent and comprehensive book unifies material from several sources, including robotics, control theory, artificial intelligence, and algorithms, and covers planning under the differential constraints that arise when automating the motions of virtually any mechanical system.
Abstract: Planning algorithms are impacting technical disciplines and industries around the world, including robotics, computer-aided design, manufacturing, computer graphics, aerospace applications, drug design, and protein folding. This coherent and comprehensive book unifies material from several sources, including robotics, control theory, artificial intelligence, and algorithms. The treatment is centered on robot motion planning but integrates material on planning in discrete spaces. A major part of the book is devoted to planning under uncertainty, including decision theory, Markov decision processes, and information spaces, which are the “configuration spaces” of all sensor-based planning problems. The last part of the book delves into planning under differential constraints that arise when automating the motions of virtually any mechanical system. Developed from courses taught by the author, the book is intended for students, engineers, and researchers in robotics, artificial intelligence, and control theory as well as computer graphics, algorithms, and computational biology.

6,340 citations