
Showing papers published in "Computational Science and Engineering" in 1997


Journal ArticleDOI
01 Oct 1997
TL;DR: Geometric hashing, a technique originally developed in computer vision for matching geometric features against a database of such features, finds use in a number of other areas.
Abstract: Geometric hashing, a technique originally developed in computer vision for matching geometric features against a database of such features, finds use in a number of other areas. Matching is possible even when the recognizable database objects have undergone transformations or when only partial information is present. The technique is highly efficient and of low polynomial complexity.

618 citations
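To make the matching idea concrete, here is a minimal, translation-invariant sketch of geometric hashing in Python (an illustrative toy, not the authors' system; the model names, point sets, and grid size are assumptions). Preprocessing stores each model point's position relative to every choice of basis point; recognition lets scene points vote through the same table.

```python
from collections import defaultdict

def quantize(point, cell=1.0):
    # Snap a point to a coarse grid so nearby points hash to the same bin.
    return (round(point[0] / cell), round(point[1] / cell))

def build_table(models):
    # Preprocessing: for every model and every choice of basis point,
    # record the other points' positions relative to that basis.
    table = defaultdict(list)
    for name, pts in models.items():
        for i, basis in enumerate(pts):
            for j, p in enumerate(pts):
                if i != j:
                    rel = (p[0] - basis[0], p[1] - basis[1])
                    table[quantize(rel)].append(name)
    return table

def recognize(table, scene):
    # Recognition: try each scene point as a basis and let the remaining
    # points vote for models through the hash table.
    votes = defaultdict(int)
    for i, basis in enumerate(scene):
        for j, p in enumerate(scene):
            if i != j:
                rel = (p[0] - basis[0], p[1] - basis[1])
                for name in table.get(quantize(rel), ()):
                    votes[name] += 1
    return max(votes, key=votes.get) if votes else None
```

The full technique gains rotation and scale invariance by hashing coordinates relative to basis pairs or triples rather than single points, and the redundant set of bases per object is what makes it robust to clutter and partial occlusion.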


Journal ArticleDOI
01 Oct 1997
TL;DR: Flash, a similarity-searching algorithm akin to geometric hashing, proves suitable for one-to-many matching of fingerprints on large-scale databases.
Abstract: Fingerprint matching is used in many noncriminal identification applications. Flash, a similarity-searching algorithm akin to geometric hashing, proves suitable for one-to-many matching of fingerprints on large-scale databases.

269 citations


Proceedings Article
01 Apr 1997

255 citations


Journal ArticleDOI
01 Jan 1997
TL;DR: To realistically simulate the behavior of complete micromachines, algorithmic innovation is necessary in several areas.
Abstract: Technologies for fabricating a variety of MEMS devices have developed rapidly, but computational tools that allow engineers to quickly design and optimize these micromachines have not kept pace. Inadequate simulation tools force MEMS designers to resort to physical prototyping. To realistically simulate the behavior of complete micromachines, algorithmic innovation is necessary in several areas.

182 citations


Journal ArticleDOI
01 Jan 1997
TL;DR: Tests showed that many software codes widely used in science and engineering are not as accurate as one would like to think, and it is argued that better software engineering practices would help solve this problem.
Abstract: Extensive tests showed that many software codes widely used in science and engineering are not as accurate as we would like to think. It is argued that better software engineering practices would help solve this problem, but realizing that the problem exists is an important first step.

100 citations


Journal ArticleDOI
01 Jul 1997
TL;DR: A prototype suggests that the DEVS formalism can be combined with genetic algorithms running in parallel to serve as the basis of a very general, very fast class of simulation environments.
Abstract: DEVS-C++, a high-performance environment for modeling large-scale systems at high resolution, uses the DEVS (Discrete-EVent system Specification) formalism to represent both continuous and discrete processes. A prototype suggests that the DEVS formalism can be combined with genetic algorithms running in parallel to serve as the basis of a very general, very fast class of simulation environments.

96 citations


Journal ArticleDOI
01 Jul 1997
TL;DR: This issue's theme is the rapidly evolving enabling technology of problem-solving environments, defined as "a computer system that provides all the computational facilities necessary to solve a target class of problems" for scientific computing.
Abstract: This issue's theme is the rapidly evolving enabling technology of problem-solving environments, defined as "a computer system that provides all the computational facilities necessary to solve a target class of problems" for scientific computing. The 1991 and 1995 workshops on PSEs for physical simulation helped to define this research area and highlight the pertinent research issues. The first workshop made several recommendations that led to the 1995 NSF initiative on PSEs. These efforts have identified several PSE design goals together with approaches for realizing them.

93 citations


Journal ArticleDOI
01 Oct 1997
TL;DR: Computational steering is an emerging technology that integrates simulation, data analysis, visualization, and postprocessing, giving scientists a mechanism for analyzing and visualizing vast amounts of data.
Abstract: With today's large and complex applications, scientists have increasing difficulty analyzing and visualizing vast amounts of data. Computational steering is an emerging technology that addresses this problem, providing a mechanism for integrating simulation, data analysis, visualization, and postprocessing.

85 citations


Journal ArticleDOI
Andrew A. Berlin1, K.J. Gabriel
01 Jan 1997
TL;DR: The authors examine the problems and opportunities created by the control of large numbers of MEMS sensors and actuators, including coupling to the physical world and environment-driven, event-time demands on computation.
Abstract: How do you program a cloud of dust? That is just one computational challenge posed by MEMS, a technology in which multitudes of interacting tiny machines can add computational behavior to materials and the environment in an embedded, massively distributed fashion. Microelectromechanical systems, or MEMS, are an emerging set of technologies that make it possible to miniaturize and mass-produce large numbers of integrated sensors, actuators, and computers. By merging sensing and actuation with computation and communication, MEMS devices can be distributed throughout the environment, coated on surfaces, or embedded within everyday objects to create distributed systems for sensing, reasoning about, and responding to events in the physical world on a scale never before possible. Distributed MEMS applications go well beyond the scaling limits of today's computational paradigms, posing serious challenges and new opportunities for information technology. MEMS will draw on and drive computation in four key areas: (1) control of large numbers of distributed MEMS sensors and actuators; (2) distributed intelligence, raising the general intelligence and capability of machines and matter; (3) MEMS devices as computational elements; (4) multiple energy domain simulation, analysis, and design. We look briefly at only the first of these areas: the problems and opportunities created by the control of large numbers (thousands to millions) of MEMS sensors and actuators, including coupling to the physical world and environment-driven, event-time demands on computation.

78 citations


Journal ArticleDOI
01 Jan 1997
TL;DR: This work reports on progress towards computational MEMS, taking on the challenge of design and control of massively parallel arrays of microactuators.
Abstract: As improvements in fabrication technology for microelectromechanical systems, or MEMS, increase the availability and diversity of these micromachines, engineers are defining a growing number of tasks to which they can be put. The idea of carrying out tasks using large coordinated groups of MEMS units motivates the development of automated, algorithmic methods for designing and controlling these groups of devices. We report on progress towards computational MEMS, taking on the challenge of design and control of massively parallel arrays of microactuators. Arrays of MEMS devices can move and position tiny parts, such as integrated circuit chips, in flexible and predictable ways by oscillatory or ciliary action. The theory of programmable force fields can model this action, leading to algorithms for various types of micromanipulation that require no sensing of where the part is. Experiments support the theory.

74 citations
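The sensorless-manipulation claim can be illustrated with a toy programmable force field (an assumption for illustration only: a 1D "squeeze field" acting on an overdamped point part, not the authors' actuator-array model):

```python
def squeeze_force(x):
    # 1D "squeeze field": a constant-magnitude push toward the line x = 0.
    return -1.0 if x > 0 else (1.0 if x < 0 else 0.0)

def settle(x0, dt=0.01, steps=2000):
    # Overdamped motion: the part's velocity is proportional to the force,
    # so integrating the field carries any start position to the squeeze line.
    x = x0
    for _ in range(steps):
        x += dt * squeeze_force(x)
    return x
```

Because every initial position converges to the same line, the field positions the part without ever sensing where it is; the paper's ciliary arrays realize richer 2D fields of this kind.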


Journal ArticleDOI
01 Apr 1997
TL;DR: With the Stalk system, a user in a virtual reality environment can interact with a genetic algorithm running on a parallel computer to help in the search for likely geometric configurations.
Abstract: Several recent technologies-genetic algorithms, parallel and distributed computing, virtual reality, and high-speed networking-underlie a new approach to the computational study of how biomolecules interact or "dock" together. With the Stalk system, a user in a virtual reality environment can interact with a genetic algorithm running on a parallel computer to help in the search for likely geometric configurations.

Journal ArticleDOI
C. Apte1
01 Apr 1997
TL;DR: The crux of the appeal for this new technology lies in the data analysis algorithms, since they provide automated mechanisms for sifting through data and extracting useful information.
Abstract: Just what exactly is data mining? At a broad level, it is the process by which accurate and previously unknown information is extracted from large volumes of data. This information should be in a form that can be understood, acted upon, and used for improving decision processes. Obviously, with this definition, data mining encompasses a broad set of technologies, including data warehousing, database management, data analysis algorithms, and visualization. The crux of the appeal for this new technology lies in the data analysis algorithms, since they provide automated mechanisms for sifting through data and extracting useful information. The analysis capability of these algorithms, coupled with today's data warehousing and database management technology, makes corporate and industrial data mining possible. The data representation model for such algorithms is quite straightforward. Data is considered to be a collection of records, where each record is a collection of fields. Using this tabular data model, data mining algorithms are designed to operate on the contents under differing assumptions and to deliver results in differing formats. The data analysis algorithms (or data mining algorithms, as they are more popularly known nowadays) can be divided into three major categories based on the nature of their information extraction: predictive modeling (also called classification or supervised learning), clustering (also called segmentation or unsupervised learning), and frequent pattern extraction.
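As a concrete illustration of the tabular model and the third category, frequent pattern extraction, here is a minimal sketch (the records, field names, and support threshold are illustrative assumptions):

```python
from collections import Counter
from itertools import combinations

def frequent_pairs(records, min_support):
    # Frequent pattern extraction over the tabular model: count co-occurring
    # (field, value) pairs and keep those seen in at least min_support records.
    counts = Counter()
    for record in records:
        items = sorted(record.items())  # each record is a set of fields
        for a, b in combinations(items, 2):
            counts[(a, b)] += 1
    return {pattern: n for pattern, n in counts.items() if n >= min_support}
```

Real systems extend this idea to larger itemsets with pruning (as in Apriori-style algorithms), but the record-of-fields data model is the same.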

Journal ArticleDOI
A.P. Gueziec1, Xavier Pennec, N. Ayache
01 Oct 1997
TL;DR: Results show that two geometric hashing methods, based respectively on curves and characteristic features, can be used to compute 3D transformations that automatically register medical images of the same patient in a practical, fast, accurate, and reliable manner.
Abstract: To carefully compare pictures of the same thing taken from different views, the images must first be registered, or aligned so as to best superimpose them. Results show that two geometric hashing methods, based respectively on curves and characteristic features, can be used to compute 3D transformations that automatically register medical images of the same patient in a practical, fast, accurate, and reliable manner.

Journal ArticleDOI
01 Jul 1997
TL;DR: The SciNapse code generation system transforms high-level descriptions of partial differential equation problems into customized, efficient and documented C or Fortran code.
Abstract: The SciNapse code generation system transforms high-level descriptions of partial differential equation problems into customized, efficient and documented C or Fortran code. Modelers can specify mathematical problems, solution techniques and I/O formats with a concise blend of mathematical expressions and keywords. An algorithm template language supports convenient extension of the system's built-in knowledge base.

Journal ArticleDOI
Farid F. Abraham1
01 Apr 1997
TL;DR: Simulations of how materials crack at the atomic level are yielding surprising results that sometimes contradict existing theory, but that may explain recent physical experiments.
Abstract: How do materials fracture? The molecular dynamics methods used to model this important problem parallelize well, allowing bigger and more realistic computational experiments. Simulations of how materials crack at the atomic level are yielding surprising results that sometimes contradict existing theory, but that may explain recent physical experiments.

Journal ArticleDOI
01 Jul 1997
TL;DR: The Ctadel programming environment transforms high-level partial differential equation (PDE) problem specifications into efficient codes for serial, vector and parallel computer architectures using computing-cost heuristics and architecture-specific symbolic transformations.
Abstract: The Ctadel programming environment transforms high-level partial differential equation (PDE) problem specifications into efficient codes for serial, vector and parallel computer architectures using computing-cost heuristics and architecture-specific symbolic transformations. Ctadel-generated codes for the Hirlam weather forecasting system perform comparably with handwritten codes.

Journal ArticleDOI
01 Oct 1997
TL;DR: Using geometric hashing, algorithms can resolve problems in CAD software that cause defects in the boundaries of polyhedral objects: small gaps bounded by edges incident to one polyhedron face.
Abstract: Problems in CAD software sometimes cause defects in the boundaries of polyhedral objects: small gaps bounded by edges incident to one polyhedron face. Using geometric hashing, algorithms can resolve this problem, which has vexed layered manufacturing. Similar techniques hold promise for solving problems in such areas as computer vision, medical imaging, and molecular biology.

Journal ArticleDOI
01 Oct 1997
TL;DR: Parallelizing a model that simulates long-term populations of deer and panthers in the Everglades region of south Florida allows faster computation of complex scenarios and better exploration of hydrologic management alternatives.
Abstract: Individual-based models of animal ecology can take into account spatio-temporal details of a natural environment that are overly simplified in models based on averaged variables. Parallelizing a model that simulates long-term populations of deer and panthers in the Everglades region of south Florida allows faster computation of complex scenarios and better exploration of hydrologic management alternatives.

Journal ArticleDOI
01 Oct 1997
TL;DR: This special issue presents a small sample of the past 12 years' work in applied geometric hashing for efficiently accessing very large databases of stored geometric information; collectively, that work represents the birth of a new school: hashing and indexing for geometric pattern matching.
Abstract: This special issue presents a small sample of the past 12 years’ work in applied geometric hashing for efficiently accessing very large databases of stored geometric information. Originating in the mid-eighties at New York University’s Courant Institute of Mathematical Sciences, the method emerged initially as a general scheme for carrying out model-based object recognition in computer vision. Geometric hashing permitted the detection of objects from a given model database in scenes, where additional clutter might be present and the objects might partially occlude each other. The key observation was that a judicious choice of geometric invariants describing local scene features and proper encoding of geometric constraints inherent to rigid bodies let vision systems exploit the extremely efficient hashing methods for geometric data retrieval. Although easy to state, the problem of computer-based object recognition is very difficult to solve in its general form. In fact, until recently, computer vision was included among computer science’s Grand Challenges. However, numerous successful applications have emerged for specific tasks and constrained environments. In the geometric hashing approach, performance does not degrade linearly with the addition of new items to the database: such linear slowdown characterized all previous techniques. Geometric hashing’s indexing technique allows the speedy identification of relevant locations in the database while also obviating the need to sequentially search the entire database. By using a redundant set of indices for each object, this technique effectively handles the partial-occlusion problem. From the time it was outlined, the basic approach has found application in various fields that require geometric pattern partial matching. The first applications dealt mainly with computer-vision-related tasks.
Later, system developers successfully adapted and applied the method to such additional fields as CAD/CAM, medical imaging, and 3D protein and drug molecule analysis in molecular biology. Indeed, in each of these fields, working systems already exist that were built around the concept of hashing and indexing. The existence of such systems is perhaps the best proof of this technique’s validity and robustness. Collectively, all the work that has been carried out over the last 12 years represents the birth of a new school: hashing and indexing for geometric pattern matching. The “Articles” sidebar summarizes the contributions that comprise this special issue on geometric hashing. We hope scientists in other disciplines find the work relevant and potentially applicable to specific problems in their respective domains.

Journal ArticleDOI
01 Jul 1997
TL;DR: The Internet offers scientists around the world access to high-powered problem-solving environments (PSEs) and Purdue University's Net Pellpack PSE server can solve complex partial differential equations with common World Wide Web browsers that support Java applets.
Abstract: The Internet offers scientists around the world access to high-powered problem-solving environments (PSEs). With Purdue University's Net Pellpack PSE server, they can solve complex partial differential equations with common World Wide Web browsers that support Java applets.

Journal ArticleDOI
E. Peeters1
01 Jan 1997
TL;DR: The main challenges in commercializing MEMS, on both the business and technical levels, differ from the classic semiconductor problems; the successful MEMS venture today is likely to be one that focuses on differences from, rather than parallels with, mainstream semiconductor markets.
Abstract: Several microelectromechanical systems have achieved commercial success. The barriers can still be formidable, though, and the path to success is often much different for MEMS than it was for mainstream semiconductors. Maturing software for comprehensive modeling and design will help in the future. The entire field of MEMS has been enabled by the batch fabrication methods established in the semiconductor industry. We cannot predict the market success of MEMS-based products, however, by blindly applying the economy of scale and other economic models governing semiconductor markets. Although the parallels are both undeniable and enabling, the successful MEMS venture today is likely to be one that focuses on differences from, rather than parallels with, mainstream semiconductor markets. It is essential to recognize that the main challenges in commercializing MEMS, on both the business and technical levels, are different from the classic semiconductor problems.

Journal ArticleDOI
01 Jul 1997
TL;DR: A problem-solving environment called the "Workbench for Interactive Simulation of Ecosystems" (WISE) is described that uses object-oriented software design to link research models from various subdisciplines for more comprehensive ecosystem simulation and visualization.
Abstract: A problem-solving environment called the "Workbench for Interactive Simulation of Ecosystems" (WISE) is described that uses object-oriented software design to link research models from various subdisciplines for more comprehensive ecosystem simulation and visualization. A simulation manager component uses a clock to control the coupling mechanism, allowing dynamic querying between encapsulated models for the most recent values of state variables.

Journal ArticleDOI
01 Apr 1997
TL;DR: The authors contend that the main business of CSE is the creation of problem-solving environments for science and engineering.
Abstract: Computational science and engineering (CSE) is a young field, and not everyone agrees on exactly what it is. Some define it very broadly, others more narrowly. The authors contend that the main business of CSE is the creation of problem-solving environments for science and engineering.

Journal ArticleDOI
01 Jan 1997
TL;DR: A system called Network Distributed Global Memory simplifies parallel computing in a distributed memory environment by allowing processors to be programmed as though they had shared memory.
Abstract: A system called Network Distributed Global Memory simplifies parallel computing in a distributed memory environment by allowing processors to be programmed as though they had shared memory. NDGM manages message passing; applications perform puts and gets to a virtual buffer. Porting a 3D fluid dynamics code to NDGM was much easier than writing an explicit message passing version.
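The put/get programming model can be sketched with a toy, single-process stand-in (illustrative only: the class name, block partitioning, and routing are assumptions, and real NDGM performs actual message passing between processors):

```python
class VirtualBuffer:
    # Toy stand-in for the put/get model: a flat address space partitioned
    # into equal blocks across simulated processors. Here "routing" is just
    # index arithmetic; NDGM itself would turn these calls into messages.
    def __init__(self, size, nprocs):
        self.block = size // nprocs
        self.mem = [[0.0] * self.block for _ in range(nprocs)]

    def _locate(self, addr):
        # Map a global address to (owning processor, local offset).
        return addr // self.block, addr % self.block

    def put(self, addr, value):
        proc, offset = self._locate(addr)
        self.mem[proc][offset] = value   # would be a message to proc

    def get(self, addr):
        proc, offset = self._locate(addr)
        return self.mem[proc][offset]    # would be a request to proc
```

The appeal of the model is exactly what the abstract describes: application code addresses one logical buffer, while the distribution of the data and the messaging stay hidden behind put and get.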

Journal ArticleDOI
01 Jan 1997
TL;DR: The visual interactive desktop laboratory allows users to visualize on a workstation the results of large scale simulations performed on remote supercomputers and to interactively steer a simulation as the experiment progresses.
Abstract: The visual interactive desktop laboratory allows users to visualize on a workstation the results of large scale simulations performed on remote supercomputers and to interactively steer a simulation as the experiment progresses. Unlike conventional methods, the VIDL requires minimal data transfer and storage and uses a general, flexible environment rather than one specific to a subject or to certain hardware.

Journal ArticleDOI
01 Jan 1997
TL;DR: To realize economies of scale, seven aspects of MEMS manufacturing infrastructure must be improved: computer aided design and simulation; lithography; etching; parametric testing; functional testing; packaging; and capital equipment.
Abstract: Though microelectromechanical systems are often made from silicon, the existing semiconductor technology base faces numerous challenges to meet the special requirements of fabricating MEMS devices. To realize economies of scale, seven aspects of MEMS manufacturing infrastructure must be improved: computer aided design and simulation; lithography; etching; parametric testing; functional testing; packaging; and capital equipment.

Journal ArticleDOI
01 Oct 1997
TL;DR: An introduction to matchings between two groups of objects: a maximal matching tries to maximize the number of connections, and weighted variants optimize the sum of the connection weights instead.
Abstract: Finding a matching is a common task in computing that arises in physical and operational problems. If you have a set of possible connections between two groups of objects, you would like to know if there is a way to match each element from the first group with exactly one element of the second group. It is like arranging marriages between two groups or assigning a group of tasks to a group of people, where some connections are possible and some are not. It might be impossible to find a matching that uses all the people or objects, given the connections that are possible. A maximal matching tries to maximize the number of connections in the matching. A perfect matching is one in which you use all possible objects. Sometimes weights are attached to connections (in this case a maximal matching optimizes the sum of the weights), but we're not going to do that here. We will use interchangeably the terms objects and nodes for the objects, assignments, or people mentioned above. Similarly we will use the terms edges and connections interchangeably.
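A minimal sketch of a greedy algorithm that produces a maximal matching, in the strict sense of one that cannot be extended by adding another edge (note it may still be smaller than the best possible, or maximum, matching). The edge list and names below are illustrative assumptions:

```python
def greedy_maximal_matching(edges):
    # Scan the possible connections once, keeping an edge whenever both of
    # its endpoints are still unmatched. The result is maximal (no leftover
    # edge can be added), though not necessarily of maximum size.
    matched = set()
    matching = []
    for u, v in edges:
        if u not in matched and v not in matched:
            matching.append((u, v))
            matched.update((u, v))
    return matching
```

For the task-assignment edges [("ann", "t1"), ("ann", "t2"), ("bob", "t1"), ("cara", "t3")], the greedy scan keeps two pairs, while the maximum matching has three (ann-t2, bob-t1, cara-t3); augmenting-path algorithms close that gap at higher cost.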

Journal ArticleDOI
01 Apr 1997

Journal ArticleDOI
01 Apr 1997
TL;DR: A growing body of good teaching materials for CSE is available, on the Web and otherwise; when used well, the Web is especially suited to the needs of teaching computational science.
Abstract: The World Wide Web can be used both well and badly as an educational tool. When used well, it is especially suited to the needs of teaching computational science. It is argued that a growing body of good teaching materials for CSE is available, on the Web and otherwise.

Journal ArticleDOI
01 Jul 1997
TL;DR: This article explains how to reformulate the basic Metropolis algorithm so as to avoid the do-nothing steps and reduce the running time, while also keeping track of the simulated time as determined by the Metropolis algorithm.
Abstract: N. Metropolis's (1953) algorithm has often been used for simulating physical systems that pass among a set of states, with the probabilities of the system being in such states distributed like the Boltzmann function. There are literally thousands of different applications in the physical sciences and elsewhere. In this article, we explain how to reformulate the basic Metropolis algorithm so as to avoid the do-nothing steps and reduce the running time, while also keeping track of the simulated time as determined by the Metropolis algorithm. By the simulated time, we mean the number of Monte Carlo steps that would have been taken if the basic Metropolis algorithm had been used. This approach has already proved successful when used for parallel simulations of molecular beam epitaxy. We show an example.
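A minimal sketch of the rejection-free idea for a toy system (a particle hopping on a ring of sites; the energies, temperature, and function names are illustrative assumptions, not the authors' code): instead of proposing moves that may be rejected, each event samples how many basic do-nothing steps would have elapsed and then picks an accepted move directly.

```python
import math
import random

def rejection_free_step(energies, state, temp, rng):
    # One event for a particle hopping on a ring of sites, with Metropolis
    # acceptance probabilities toward the Boltzmann distribution at temp.
    n = len(energies)
    moves = [(state - 1) % n, (state + 1) % n]
    probs = [min(1.0, math.exp(-(energies[m] - energies[state]) / temp))
             for m in moves]
    # In basic Metropolis a move is proposed uniformly, so the chance that
    # a single step accepts anything is the mean acceptance probability.
    p_acc = sum(probs) / len(moves)
    if p_acc >= 1.0:
        elapsed = 1
    else:
        # Steps until the first acceptance are geometric(p_acc): this is
        # the simulated time the skipped do-nothing steps would have taken.
        elapsed = int(math.log(1.0 - rng.random()) / math.log(1.0 - p_acc)) + 1
    # Pick which move was accepted, weighted by its acceptance probability.
    new_state = rng.choices(moves, weights=probs)[0]
    return new_state, elapsed
```

Summing elapsed over events recovers the basic algorithm's step count; the same bookkeeping generalizes to many movers by weighting over all possible moves, as in n-fold-way kinetic Monte Carlo.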