
Showing papers presented at "Computational Science and Engineering in 1999"


Journal ArticleDOI
01 Jan 1999
TL;DR: A new scalable modeling framework and scalable parallel simulations make it possible to analyze the detailed behaviour of large, multidomain multiprotocol Internet models.
Abstract: A new scalable modeling framework and scalable parallel simulations make it possible to analyze the detailed behaviour of large, multidomain multiprotocol Internet models. The article focuses on simulation research. It describes the software designs that let us construct and run appropriately large models. After several years of research, we have developed a scalable network modeling framework, a scalable simulation framework (SSF), and scalable parallel discrete event simulators capable of modeling the Internet at unprecedented scales.

237 citations


Journal ArticleDOI
01 Mar 1999
TL;DR: The paper discusses the Sloan Digital Sky Survey, which promises to radically improve the way scientists do astronomy by digitally mapping half of the northern sky and could well lead to major new discoveries.
Abstract: Astronomy is about to undergo a major paradigm shift, with data sets becoming larger and more homogeneous, designed for the first time in a top-down fashion. The paper discusses the Sloan Digital Sky Survey. By digitally mapping half of the northern sky, this project promises to radically improve the way scientists do astronomy. In putting a digital sky on the desktops of astronomers, the SDSS could well lead to major new discoveries.

148 citations


Journal ArticleDOI
01 Nov 1999
TL;DR: This article reviews developments in four areas, including empirical statistical properties of prices, random-process models for price dynamics, agent-based modeling, and practical applications in finance.
Abstract: Physicists have recently begun doing research in finance, and even though this movement is less than five years old, interesting and useful contributions have already emerged. This article reviews these developments in four areas, including empirical statistical properties of prices, random-process models for price dynamics, agent-based modeling, and practical applications.

129 citations


Journal ArticleDOI
01 Sep 1999
TL;DR: In this article, the authors discuss only explicit methods, because they are useful for the varying boundary conditions found in realistic traffic simulations, where data is continuously fed into the simulation, and they are more flexible for the simulation of on-and off-ramps or entire road networks.
Abstract: The increasing need for efficient traffic optimization measures is making reliable, fast and robust methods for traffic simulation more and more important. Apart from developing cellular automata models of traffic flow, this need has stimulated studies of suitable numerical algorithms that can solve macroscopic traffic equations based on partial differential equations. The numerical integration of partial differential equations is a particularly difficult task, and there is no generally applicable method. In contrast to ordinary differential equations, the most natural explicit finite difference methods are often numerically unstable, even for very small discretizations of space and time. In general, numerical solutions to partial differential equations require special methods, which work only under certain conditions. Implicit integration methods are usually more stable but they require the frequent solution of linear systems with multidiagonal matrices. In this article, we discuss only explicit methods, because they are useful for the varying boundary conditions found in realistic traffic simulations, where data is continuously fed into the simulation. In addition, explicit methods are more flexible for the simulation of on- and off-ramps or entire road networks.
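The stability trade-off described above can be made concrete with a small sketch (in Python, which the article does not use; the LWR/Greenshields model, grid values, and function names are illustrative assumptions, not the authors' code). An explicit Lax-Friedrichs step for a macroscopic traffic equation is stable only while the time step respects the CFL condition, but it accepts varying boundary data naturally:

```python
import numpy as np

def lwr_step(rho, dx, dt, v_max=30.0, rho_max=0.15):
    """One explicit (Lax-Friedrichs) step for the LWR traffic equation
    rho_t + (rho * v(rho))_x = 0 with the Greenshields velocity relation.
    Stable while v_max * dt / dx <= 1 (the CFL condition)."""
    v = v_max * (1.0 - rho / rho_max)   # Greenshields speed-density relation
    q = rho * v                          # traffic flux (vehicles per second)
    new = rho.copy()
    # Update interior points; boundary cells are held fixed, mimicking
    # measured data being fed in continuously at the road ends.
    new[1:-1] = 0.5 * (rho[2:] + rho[:-2]) - dt / (2 * dx) * (q[2:] - q[:-2])
    return new

# demo: a density bump on a 5 km road (100 cells of 50 m), 50 steps of 0.5 s
rho = np.full(100, 0.05)
rho[40:60] = 0.10
for _ in range(50):
    rho = lwr_step(rho, dx=50.0, dt=0.5)
```

With v_max = 30 m/s, dx = 50 m, and dt = 0.5 s, the CFL number is 0.3, so the scheme stays stable and the bump propagates downstream while diffusing slightly.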

103 citations


Journal ArticleDOI
G. Myers1
01 May 1999
TL;DR: After introducing the latest DNA sequencing methods, this article describes three current approaches for completing the human DNA sequence.
Abstract: Computation is integrally and powerfully involved with the DNA sequencing technology that promises to reveal the complete human DNA sequence in the next several years. After introducing the latest DNA sequencing methods, this article describes three current approaches for completing the sequencing.

80 citations


Journal ArticleDOI
01 Sep 1999
TL;DR: The authors provide a survey of state-of-the-art molecular dynamics simulation of materials, shedding light on various facets of the rich phenomena of dynamic fracture, and conclude that within 10 years, petaflop computers will perform trillion-atom MD simulations that include the effects of microstructures spanning diverse length scales up to the mesoscale regime above a micron.
Abstract: The authors provide a survey of state-of-the-art molecular dynamics simulation of materials, shedding light on various facets of the rich phenomena of dynamic fracture. It is concluded that within 10 years, we will see petaflop computers perform trillion-atom MD simulations that include the effects of microstructures spanning diverse length scales up to the mesoscale regime above a micron. Within the same time frame, the interatomic potential models used in MD simulations will be refined with input from quantum-mechanical calculations of electronic structures. These atomistic simulations will be seamlessly combined with continuum schemes based on finite element methods to model truly macroscopic dynamic fractures at all length scales.

45 citations


Journal ArticleDOI
01 Jan 1999
TL;DR: The article presents a discussion on molecular dynamics simulation, which has proved capable of studying a broad range of phenomena associated with both simple and complex molecules and is destined to play an ever-increasing role in both science and engineering.
Abstract: The article presents a discussion on molecular dynamics (MD) simulation. MD requires a description of the molecules and the forces that act between them; a well known example is the Lennard-Jones potential, in which spherical particles repel one another at close range but otherwise attract. The MD simulation itself amounts to numerically integrating the equations of motion for systems of between a few hundred and a few million particles over many thousand (or more) timesteps. The paths the particles follow during the computation represent actual molecular trajectories. What does the future hold? MD simulation covers length scales ranging from the atomistic to entire microstructures. It has proved capable of studying a broad range of phenomena associated with both simple and complex molecules. It is free of many of the simplifying assumptions that tend to dominate theory and other modeling techniques. So, after making the reasonable extrapolation that computer power will continue to grow at its present rate, the author has little doubt that MD is destined to play an ever-increasing role in both science and engineering.
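As a toy illustration of the idea (a Python sketch, not from the article; the reduced units, time step, and function names are assumptions), the velocity-Verlet scheme below numerically integrates the equations of motion for a two-particle Lennard-Jones system:

```python
import numpy as np

def lj_force(r_vec):
    """Lennard-Jones force on particle 0 from particle 1 (reduced units,
    epsilon = sigma = 1): F(r) = 24 (2 r^-13 - r^-7) r_hat."""
    r = np.linalg.norm(r_vec)
    return 24.0 * (2.0 / r**13 - 1.0 / r**7) * (r_vec / r)

def velocity_verlet(pos, vel, dt=0.005, steps=1000):
    """Advance a two-particle LJ system with the velocity-Verlet scheme."""
    f = lj_force(pos[0] - pos[1])
    forces = np.array([f, -f])
    for _ in range(steps):
        pos = pos + vel * dt + 0.5 * forces * dt**2   # position update
        f = lj_force(pos[0] - pos[1])
        new_forces = np.array([f, -f])
        vel = vel + 0.5 * (forces + new_forces) * dt  # velocity update
        forces = new_forces
    return pos, vel

# demo: two particles released at rest inside the attractive well (r = 1.5)
pos = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
vel = np.zeros((2, 3))
final_pos, final_vel = velocity_verlet(pos, vel)
```

The particles oscillate in the potential well, and the symplectic integrator keeps the total energy nearly constant over thousands of timesteps, which is why schemes of this family dominate production MD codes.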

44 citations


Journal ArticleDOI
01 Mar 1999
TL;DR: The authors address the challenge of extracting high-precision measurements of cosmological parameters from upcoming megapixel cosmic-microwave data sets by discussing the computational challenges associated with current methods for going from the time streams to multifrequency sky maps.
Abstract: The authors address the challenge of extracting high-precision measurements of cosmological parameters from upcoming megapixel cosmic-microwave data sets. They discuss the computational challenges associated with current methods for going from the time streams to multifrequency sky maps, from these maps to the different sky signals, and from there to the statistical properties of the cosmic microwave background.

42 citations


Journal ArticleDOI
01 Jul 1999
TL;DR: The authors present a visualization methodology which provides users with a way to alter perspectives and interpret visualization so that they can quickly identify trends, outliers, and possible clusters while tuning for a particular context.
Abstract: The authors present a visualization methodology which provides users with a way to alter perspectives and interpret visualizations so that they can quickly identify trends, outliers, and possible clusters while tuning for a particular context. The technology developed for text mining is called Trust, or Text Representation Using Subspace Transformation. Trust provides an analysis environment that can supply meaningful representations of text documents; it also supports the ability to visually present a collection of documents in a meaningful context that allows for user insight into textual content. Unlike other similar technologies, Trust can generate views from different subspaces, providing content information as the basis of the visualization and allowing an analyst to specify subspaces based on content.

25 citations


Journal ArticleDOI
B. Segee1
01 Jan 1999
TL;DR: Methods in Neuronal Modeling: From Ions to Networks, 2nd edition, Christof Koch and Idan Segev, eds, MIT Press, Cambridge, Mass., 1998, 844 pp., ISBN 0-262-11231-0, $60.00, cloth.
Abstract: Methods in Neuronal Modeling: From Ions to Networks, 2nd edition, Christof Koch and Idan Segev, eds, MIT Press, Cambridge, Mass., 1998, 844 pp., ISBN 0-262-11231-0, $60.00, cloth.

23 citations


Journal ArticleDOI
01 Mar 1999
TL;DR: The authors convey some of the more exciting developments that have greatly helped their research efforts addressing the structural properties of metals by large-scale MD simulations (http://bifrost.lanl.gov/MD/MD.html), developments that can be applied to many other subfields of scientific computation.
Abstract: There has been a great leap forward in large scale molecular dynamics simulations due to both the growing availability of massively parallel supercomputers and the algorithmic work on parallel neighbor-list and cell-based MD codes. Simulations involving tens or hundreds of millions of atoms have gone from being a computational curiosity to being a routine scientific tool used to study diverse phenomena; everything from the propagation of cracks and shock waves through various materials to the surprisingly complex processes that occur when a pair of extended dislocations intersect. We convey some of the more exciting developments that have greatly helped our own research efforts, addressing the structural properties of metals by large-scale MD simulations (http://bifrost.lanl.gov/MD/MD.html) and that can be applied to many other subfields of scientific computation.
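The cell-based bookkeeping the abstract alludes to can be sketched as follows (a serial, purely illustrative Python version; the authors' parallel production codes are far more elaborate, and all names here are assumptions):

```python
import numpy as np
from collections import defaultdict

def cell_list_pairs(positions, box, cutoff):
    """Find all pairs closer than `cutoff` in a cubic periodic box using a
    linked-cell list: near-linear bookkeeping instead of the naive O(N^2)
    double loop, which is what makes huge-N neighbour searches feasible."""
    n_cells = max(1, int(box // cutoff))
    size = box / n_cells
    cells = defaultdict(list)                       # cell index -> atom ids
    for i, p in enumerate(positions):
        cells[tuple((p // size).astype(int) % n_cells)].append(i)
    pairs = set()
    for (cx, cy, cz), members in list(cells.items()):
        for dx in (-1, 0, 1):                       # scan the 27 neighbour
            for dy in (-1, 0, 1):                   # cells (with periodic
                for dz in (-1, 0, 1):               # wraparound)
                    nb = ((cx + dx) % n_cells, (cy + dy) % n_cells,
                          (cz + dz) % n_cells)
                    for i in members:
                        for j in cells.get(nb, []):
                            if i < j:
                                d = positions[i] - positions[j]
                                d -= box * np.round(d / box)  # nearest image
                                if np.dot(d, d) < cutoff**2:
                                    pairs.add((i, j))
    return pairs
```

Because each atom is only compared against atoms in its own and adjacent cells, the cost per step scales with the number of atoms rather than its square.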

Journal ArticleDOI
01 Nov 1999
TL;DR: To address difficulties with contact analysis in mechanical design-namely in planar mechanical systems with complex part shapes, tight fits, and contact changes-the authors have developed a general contact-analysis method based on configuration space computation.
Abstract: To address difficulties with contact analysis in mechanical design-namely in planar mechanical systems with complex part shapes, tight fits, and contact changes-the authors have developed a general contact-analysis method based on configuration space computation.

Journal ArticleDOI
01 Sep 1999
TL;DR: The article presents a generalized method, using the Fourier transform, for computing and graphically displaying the Fresnel diffraction intensity pattern for any planar aperture; the method obtains the entire pattern at once, in contrast to the standard methods' point-by-point evaluation process.
Abstract: The study of Fresnel diffraction is an integral part of any course in physical optics. Fresnel diffraction occurs when an aperture is illuminated with coherent light and the resulting diffraction pattern appears on a screen, a finite distance from the aperture. In general, the techniques found in standard optics texts' to compute the intensity of the diffraction pattern as a function of position on the screen are limited and aperture-specific. However, as texts on modern optics show, we can formulate Fresnel diffraction in terms of the Fourier transform of a modified aperture function. The article presents a generalized method, using the Fourier transform, for computing and graphically displaying the Fresnel diffraction intensity pattern for any planar aperture. With the advent of the PC and packaged mathematical software containing the fast Fourier transform algorithm, it is now possible to perform these calculations with minimum effort. The advantage of this method is that you can obtain the entire pattern at once, in contrast to the standard methods' point-by-point evaluation process.
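The "modified aperture function" approach can be sketched in a few lines of NumPy (the article predates this library; the wavelength, distance, and grid sizes below are illustrative choices, not the article's): multiply the aperture by a quadratic phase factor, take one FFT, and the whole intensity pattern appears at once.

```python
import numpy as np

# Fresnel pattern of a square aperture via the FFT of the "modified
# aperture function" A(x, y) * exp(i k (x^2 + y^2) / (2 z)).
wavelength = 633e-9              # He-Ne laser, metres (illustrative)
k = 2 * np.pi / wavelength
z = 0.5                          # aperture-to-screen distance, metres
N, width = 512, 4e-3             # grid points and physical grid width
x = np.linspace(-width / 2, width / 2, N)
X, Y = np.meshgrid(x, x)

aperture = (np.abs(X) < 0.5e-3) & (np.abs(Y) < 0.5e-3)   # 1 mm square
modified = aperture * np.exp(1j * k * (X**2 + Y**2) / (2 * z))
field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(modified)))
intensity = np.abs(field)**2
intensity /= intensity.max()     # the entire pattern, obtained at once
```

Displaying `intensity` (for example with a false-colour image plot) shows the characteristic ringed near-field pattern of a square aperture; changing `aperture` to any other mask reuses the same code unchanged, which is the generality the article emphasizes.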

Journal ArticleDOI
01 Jul 1999
TL;DR: This work discusses how it uses computer graphics, physics based modeling, and interactive visualization to assist knee-surgery study and osteotomies, and its patient-specific 3D knee surface model helps to calculate the contact stresses at the knee joint, perform virtual surgery, and record data from surgery simulation.
Abstract: Knee osteotomy is a kind of orthopedic surgery to realign the lower limb by opening or cutting a bone wedge from the leg. It is a better alternative than other types of knee-replacement surgeries, especially for young people. However, knee osteotomy requires understanding the imbalance of stresses at the knee joint, analyzing an abnormal gait cycle, and cutting the bone wedge precisely. Therefore, it is difficult and can cause further damage even though it is simply a bone cut. While some computer based surgical simulation systems have been developed to help surgeons perform knee surgeries, the knee models used either are not patient-specific or lack kinematic and kinetic information. We discuss how we use computer graphics, physics based modeling, and interactive visualization to assist knee-surgery study and osteotomies. Our patient-specific 3D knee surface model helps us calculate the contact stresses at the knee joint, perform virtual surgery, and record data from surgery simulation.

Journal ArticleDOI
01 Jan 1999
TL;DR: This work introduces a method for simulating the dust behaviors that a fast-traveling vehicle causes, which combines particle systems, rigid-body particle dynamics, computational fluid dynamics (CFD), rendering, and visualization techniques.
Abstract: Simulating physically realistic, complex dust behaviors is useful in interactive graphics applications, such as those used for education, entertainment, or training. Training in virtual environments is a major topic for research and applications, and generating dust behaviors in real time significantly increases the realism of the simulated training environment. We introduce a method for simulating the dust behaviors that a fast-traveling vehicle causes. Our method combines particle systems, rigid-body particle dynamics, computational fluid dynamics (CFD), rendering, and visualization techniques. Our work integrates physics-based computing and graphical visualization for applications in simulated virtual environments.

Journal ArticleDOI
01 Sep 1999
TL;DR: The authors review how progress in large-scale scientific computing during the last decade has helped researchers study dynamic fracture in materials, where the stress fields near the crack tip are highly nonlinear and the stress-field decay far from the tip is very slow.
Abstract: How materials fracture is one of the most fundamental problems in materials science and engineering. Typically, the stress fields near the crack tip are highly nonlinear, and stress-field decay far from the tip is very slow, making fracture a difficult problem. Progress in large-scale scientific computing during the last decade has helped researchers successfully study dynamic fracture.

Journal ArticleDOI
01 Jan 1999
TL;DR: This algorithm divides the environment into reflection, transmission, and possibly diffraction regions, then identifies infeasible image combinations and locations affected by similar propagation mechanisms a priori, enabling the fast computation of radio coverage that service providers require.
Abstract: Driven by our increasingly mobile society's ever growing demand for communications, today's providers of wireless telecommunication services must ensure reliable radio coverage "everywhere". Computer based techniques that reduce the need for expensive experimental measurements are invaluable tools for achieving this objective. Various computational algorithms based mainly on ray tracing have emerged in recent years for determining radio coverage. Although the output of these algorithms agrees well with measurement results, execution times remain high. To provide faster computational methods for determining radio coverage, we have developed a fast 3D method of regions (MR) algorithm. This algorithm divides the environment into reflection, transmission, and possibly diffraction regions, then identifies infeasible image combinations and locations affected by similar propagation mechanisms a priori. When implemented on a parallel machine, our algorithm provides close-to-ideal speedups, enabling the fast computation of radio coverage that service providers require.

Journal ArticleDOI
01 May 1999
TL;DR: This article shows how to perform simple abstract computations on state vectors and discusses the construction of lazy algorithms that enormously simplify manipulation of potentially infinite data structures or iterative processes.
Abstract: Modern functional programming languages and lazy functional techniques are useful for describing and implementing abstract mathematical objects in quantum mechanics. Scientists can use them both for pedagogical purposes and for real, not too computationally intensive, but conceptually and algorithmically difficult applications. This article shows how to perform simple abstract computations on state vectors and discusses the construction of lazy algorithms that enormously simplify manipulation of potentially infinite data structures or iterative processes. Lazy functional techniques can often replace the use of symbolic computer algebra packages, while also offering an interesting algorithmic complement to the manipulation of mathematical data. These techniques are more efficient than blindly used symbolic algebra and are easy to integrate with the numerical code.
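The article's setting is lazy functional languages; as a rough Python analogue (generators standing in for lazy lists; all names here are mine, not the article's), a power series can be represented as an infinite stream of coefficients that is only forced as far as a computation demands:

```python
from itertools import islice

def exp_series():
    """Lazily yield the Taylor coefficients of exp(x): 1, 1, 1/2!, 1/3!, ...
    The stream is conceptually infinite; nothing is computed until asked."""
    coeff, n = 1.0, 0
    while True:
        yield coeff
        n += 1
        coeff /= n

def series_sum(coeffs, x, terms=20):
    """Evaluate a lazily generated power series at x using `terms` terms;
    only that prefix of the infinite stream is ever forced."""
    return sum(c * x**n for n, c in enumerate(islice(coeffs, terms)))
```

For example, `series_sum(exp_series(), 1.0)` approximates e to machine precision with 20 terms, and the unconsumed tail of the stream is never computed, which is the essence of the lazy techniques the article advocates.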

Journal ArticleDOI
01 Sep 1999
TL;DR: In this paper, the authors discuss the microscale processes that control the development of fast fracture in brittle materials, along with the different approaches that can be used to study the problem of brittle fracture.
Abstract: Brittle fracture in solids has attracted the attention of engineers and physicists because of both its technological interest and inherent scientific curiosity. Bridging the different approaches requires extensive experimental investigations and multiscale computational schemes. The authors discuss the microscale processes that control the development of fast fracture in brittle materials.

Journal ArticleDOI
K. Northover1, A.W. Lo
01 Nov 1999
TL;DR: Over the past 50 years, first in the US and then throughout the rest of the world, involvement in financial markets has grown spectacularly and touches hundreds of millions of people.
Abstract: In this issue are articles that demonstrate various computational-finance business applications and discussion of current research directions and opportunities. It is hoped this special issue illustrates challenges for computational scientists in the financial markets. Many opportunities exist for research, but practical applications in the financial industry will not wait for the completion of research programs. As this year's appearance of online trading demonstrates, the markets develop at their own pace. Business users will continue to demand accurate modeling, rapid implementation, and the ability to deliver without delay.

Journal ArticleDOI
John R. Rice1
01 Mar 1999
TL;DR: The author's focus is on the less obvious challenges of development, and considers multiphysics phenomena, software validation, multiscale phenomena and computational intelligence.
Abstract: Computational science's driving force has been and will continue to be the steady and rapid growth in available raw computing power. This growth exceeds anything else witnessed in the history of technology. The challenge for the 21st Century is to exploit properly this enormous potential. Several fairly obvious directions of development can (and do) present great technical challenges, but the author's focus is on the less obvious challenges. He considers multiphysics phenomena, software validation, multiscale phenomena and computational intelligence.

Journal ArticleDOI
01 Jan 1999
TL;DR: This review sampled eight commercial products costing from $300 to $1,000 to see how useful they would be to this magazine's readers, and found that several of them meet the needs of S&E.
Abstract: 17 signed for the needs of pollsters, quality control, medical researchers, and sociologists. Whether they also fit the needs of S&E is a complicated question for which there is no single answer. This review sampled eight commercial products costing from $300 to $1,000 to see how useful they would be to this magazine's readers. Table 1 lists the main features of the programs we reviewed. Unless otherwise noted, all provide standard summary statistics (standard deviations and histograms) for thousands of data points, import and export data from tab-delimited files and computer spreadsheets, do simple linear regressions and ANOVA calculations, and can produce many types of formattable graphic displays and printouts. All also have extensive online help and free technical support. Other useful features of these programs, such as fast Fourier transforms (FFT), con-volutions, or integration, are listed in the table or given later in this article. When you evaluate these and other statistics programs, keep your individual needs in mind. Do you need weighted or nonlinear fits? Must you have publication quality graphs? Do you need exploratory or nonparametric statistics, or do you want to use the program to help design experiments? Many of the vendors offer demonstration or evaluation versions of these programs, which you can try out for a limited time with your own experimental data. We recommend that you do so before you buy, because several of these programs offer quite dissimilar interfaces that might not appeal to all users. Only when you have found programs meeting your needs should you factor in such issues as ease of use, quality of documentation, cost, technical support, and flexibility. If there is one feature where the needs of S&E differ from statisticians, it is in the area of curve fitting. 
Even the vocabulary differs—statisticians call it regression and often just need to find an equation—any equation—that will enable them to summarize their data mathematically. Scientists usually have a pretty good idea of the mathematical model describing their data, and they want to know not only the fitting parameters that best fit the data but also the uncertainty of those parameters. While most of the reviewed programs provided this information for simple linear or polynomial fits, many did not furnish the parameter uncertainties for nonlinear fits, such as exponentials or power laws, or allow fits to weighted data where individual data points have different variances. If you need this type …
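The distinction matters in code: a scientist's fit should return both the best-fit parameters and their uncertainties. A minimal NumPy sketch (illustrative only, not taken from the review; the reviewed commercial packages are interactive tools) for a weighted straight-line fit:

```python
import numpy as np

def weighted_linear_fit(x, y, sigma):
    """Weighted least-squares fit of y = a + b*x that returns both the
    parameters and their one-sigma uncertainties, the output the review
    notes many packages omit for weighted or nonlinear fits."""
    w = 1.0 / np.asarray(sigma)**2                 # weights from variances
    A = np.vstack([np.ones_like(x), x]).T          # design matrix [1, x]
    cov = np.linalg.inv(A.T @ (w[:, None] * A))    # parameter covariance
    params = cov @ A.T @ (w * y)                   # best-fit (a, b)
    errors = np.sqrt(np.diag(cov))                 # parameter uncertainties
    return params, errors
```

Points with different variances simply get different `sigma` entries; the covariance matrix falls out of the same normal equations that give the parameters, so there is no excuse for a package to withhold it.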

Journal ArticleDOI
01 Jan 1999
TL;DR: Computer simulations address the question of whether there is an evolutionary advantage of sexual over asexual reproduction and other areas addressed include mutation rates, biological aging of men and women and reasons for menopause in women.
Abstract: The article examines various computer simulations of biological and evolutionary systems. The simulations address the question of whether there is an evolutionary advantage of sexual over asexual reproduction. Other areas addressed by such simulations include mutation rates, biological aging of men and women and reasons for menopause in women.

Journal ArticleDOI
01 Jul 1999
TL;DR: In this article, a colleague of the authors recently asked for a way to sample from a Poisson distribution to simulate the way metallic flakes cluster in paint, where the flakes are not distributed uniformly at random throughout the paint binder but form clusters around center points.
Abstract: A colleague of the authors recently asked for a way to sample from a Poisson distribution to simulate the way metallic flakes cluster in paint. The flakes aren't distributed uniformly at random throughout the paint binder but form clusters around center points. The number of flakes around a center is distributed like the Poisson distribution. The article presents three ways to sample from the Poisson distribution. We describe the usual method, a faster method that uses some preprocessing, and an even faster method with more preprocessing. The last two methods work with any discrete probability distribution. Depending on how often we need to generate samples, paying extra for a deluxe model of sample generation might be worth it.
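The abstract does not spell the methods out; as an illustrative Python sketch (function names and the truncation point `n_max` are my choices), here are the "usual" method and a table-lookup variant that pays for preprocessing up front:

```python
import math
import random

def poisson_knuth(lam, rng=random.random):
    """The 'usual' method: multiply uniforms until the running product
    drops below exp(-lam). Cost per sample grows linearly with lam."""
    limit, k, prod = math.exp(-lam), 0, rng()
    while prod > limit:
        k += 1
        prod *= rng()
    return k

def poisson_table(lam, n_max=100):
    """Preprocessing variant: build the CDF once, then each sample is one
    uniform draw plus a lookup. As the article notes for its faster
    methods, this works for any discrete probability distribution."""
    pmf = [math.exp(-lam) * lam**k / math.factorial(k) for k in range(n_max)]
    cdf, total = [], 0.0
    for p in pmf:
        total += p
        cdf.append(total)
    def sample(rng=random.random):
        u = rng()
        for k, c in enumerate(cdf):   # linear scan; a bisection or alias
            if u <= c:                # table would make this O(1)-ish
                return k
        return n_max - 1
    return sample
```

The table version amortizes its setup cost when many samples are needed, which is exactly the "deluxe model" trade-off the article describes; a third, still-faster variant would replace the linear CDF scan with an alias table.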

Journal ArticleDOI
01 Sep 1999
TL;DR: This article outlines a statistically reliable and efficient method for analyzing signal counts superimposed on background counts that are not subtracted before the data are analyzed, and produces reliable confidence levels for the estimated signal.
Abstract: When phenomena occur at the threshold of detection capabilities, event rates are low and observers usually record data as discrete counts. In this article, I outline a statistically reliable and efficient method for analyzing signal counts superimposed on background counts that are not subtracted before the data are analyzed. The Poisson statistics of the data are used with a maximum likelihood estimation (MLE) of the parameters. Although I devised the method to improve estimates of yields in low-energy nuclear reactions, it is generally applicable because it is independent of background or signal-shape details. In addition to estimating total signal counts, the analysis produces reliable confidence levels for the estimated signal.
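The general shape of such an analysis (not the author's actual code; the binned model, bounds, and names below are assumptions) is a one-parameter Poisson likelihood maximised numerically, with counts modelled as signal-plus-background in each bin:

```python
import math

def poisson_nll(s, shape, background, counts):
    """Negative log-likelihood for counts_i ~ Poisson(s * shape_i + bkg_i),
    dropping the s-independent log(n_i!) terms."""
    nll = 0.0
    for f, b, n in zip(shape, background, counts):
        mu = s * f + b
        nll += mu - n * math.log(mu)
    return nll

def fit_signal(shape, background, counts, s_lo=0.0, s_hi=1e4, tol=1e-6):
    """Golden-section minimisation of the one-parameter NLL; scanning for
    the Delta(NLL) = 0.5 points would then give the confidence interval
    on the estimated signal."""
    g = (math.sqrt(5) - 1) / 2
    a, b = s_lo, s_hi
    while b - a > tol:
        c, d = b - g * (b - a), a + g * (b - a)
        if poisson_nll(c, shape, background, counts) < \
           poisson_nll(d, shape, background, counts):
            b = d
        else:
            a = c
    return (a + b) / 2
```

Because the likelihood uses the raw counts directly, no background subtraction is performed before the fit, matching the approach the abstract describes.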

Journal ArticleDOI
01 Sep 1999
TL;DR: Taken as a whole, research efforts in biocomputing and genomic evolution offer fertile ground for the development of new theoretical insights and future avenues for research in multiple fields, while also posing new ways of viewing old or existing problems.
Abstract: Taken as a whole, research efforts in biocomputing and genomic evolution offer fertile ground for the development of new theoretical insights and future avenues for research in multiple fields, while also posing new ways of viewing old or existing problems. Not only do biological and computational sciences stand to benefit, but continued work in this area might even provide the hands-on engineer with material for new tools and venues for technological innovation inspired by nature.

Journal ArticleDOI
01 May 1999
TL;DR: This paper considers how atomistic simulations can clarify, in quantitative detail, phase transitions of simple systems formed from atoms of a pure substance or from small molecules.
Abstract: Since Monte Carlo and molecular dynamics (MD) simulations were invented as a tool less than 50 years ago to study thermal properties of condensed matter, two general problems have come into focus: fluid properties and phase transitions. The paper considers how atomistic simulations can clarify, in quantitative detail, phase transitions of simple systems formed from atoms of a pure substance or from small molecules.

Journal ArticleDOI
01 Jan 1999

Journal ArticleDOI
01 Mar 1999
TL;DR: This theme issue examines the role of computation in modern cosmological investigations: the computational challenges associated with major observational projects and the challenges facing modern simulation efforts.
Abstract: We have constructed this theme issue on the role of computation in modern cosmological investigations principally because professionals across all disciplines of science and engineering are fascinated by questions related to the universe’s origin. However, an issue on this topic is also warranted because over the past decade, rapid technological advancements have triggered revolutionary developments in this field. Two of our theme articles discuss the computational challenges associated with major observational projects and two detail challenges facing modern simulation efforts. The authors hope that through CiSE's interdisciplinary

Journal ArticleDOI
01 Sep 1999
TL;DR: A fictional description of the terms of service that might be inflicted on newborn infants if their future life were governed by legal counsel for service providers on the Internet.
Abstract: A fictional description of the terms of service that might be inflicted on newborn infants if their future life were governed by legal counsel for service providers on the Internet.