
Showing papers published in "Computational Science and Engineering" in 2000


Journal ArticleDOI
01 Sep 2000
TL;DR: This software provides a platform on which the biomechanics community can build a library of simulations that can be exchanged, tested and improved through multi-institutional collaboration.
Abstract: Computer simulations provide a framework for exploring the biomechanics of movement. The authors have developed a software system that lets users create computer graphics simulations of human and animal movement. This software provides a platform on which the biomechanics community can build a library of simulations that can be exchanged, tested and improved through multi-institutional collaboration.

289 citations


Journal ArticleDOI
01 Sep 2000
TL;DR: VPython is an easy-to-use package for interactive 3D graphics that lets even beginning physics students without prior programming experience write programs with real-time 3D visualizations.
Abstract: VPython, a remarkably easy-to-use package for interactive 3D graphics, lets even beginning physics students without prior programming experience write programs with real-time 3D visualizations. Developed at Carnegie Mellon University, VPython is part of an effort to develop a more modern first-year physics curriculum.

98 citations
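
A minimal sketch of the kind of program the article describes, written against the modern `vpython` package (the 2000-era version was imported as the `Visual` module, so names may differ slightly); the classic bouncing-ball demo:

```python
from vpython import sphere, box, vector, rate

# scene objects: a ball above a floor
ball = sphere(pos=vector(0, 4, 0), radius=0.5)
floor = box(pos=vector(0, 0, 0), size=vector(8, 0.2, 8))

velocity = vector(0, 0, 0)
dt = 0.01
while True:
    rate(100)                         # cap the loop at 100 iterations/second
    velocity.y -= 9.8 * dt            # gravity
    ball.pos += velocity * dt
    if ball.pos.y < floor.pos.y + ball.radius:
        velocity.y = abs(velocity.y)  # elastic bounce off the floor
```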


Journal ArticleDOI
01 Jan 2000
TL;DR: The article outlines the decompositional approach, comments on its history, and surveys the six most widely used decompositions: Cholesky, pivoted LU, QR, spectral, Schur, and singular value decomposition.
Abstract: The introduction of matrix decomposition into numerical linear algebra revolutionized matrix computations. The article outlines the decompositional approach, comments on its history, and surveys the six most widely used decompositions: Cholesky decomposition; pivoted LU decomposition; QR decomposition; spectral decomposition; Schur decomposition; and singular value decomposition.

94 citations
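
All six decompositions the article surveys are available in standard numerical libraries; a brief NumPy/SciPy sketch (my example, not from the article):

```python
import numpy as np
from scipy.linalg import cholesky, lu, qr, eigh, schur, svd

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
S = A @ A.T + 5 * np.eye(5)       # symmetric positive definite

R = cholesky(S)                   # Cholesky: S = R.T @ R (R upper triangular)
P, L, U = lu(A)                   # pivoted LU: A = P @ L @ U
Q, Rq = qr(A)                     # QR: A = Q @ Rq
w, V = eigh(S)                    # spectral: S = V @ np.diag(w) @ V.T
T, Z = schur(A)                   # Schur: A = Z @ T @ Z.T (T quasi-triangular)
U2, s, Vt = svd(A)                # SVD: A = U2 @ np.diag(s) @ Vt
```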


Journal ArticleDOI
01 May 2000
TL;DR: The concept of self-organized criticality evolved from studies of three simple cellular automata models (the forest-fire, slider-block, and sandpile models), each associated with natural hazards whose frequency-size statistics are well approximated by power-law distributions.
Abstract: The concept of self-organized criticality evolved from studies of three simple cellular automata models: the forest-fire, slider-block, and sandpile models. Each model is associated with natural hazards whose frequency-size statistics are well approximated by power-law distributions. These distributions have important implications for probabilistic hazard assessments.

71 citations
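
An illustrative Bak-Tang-Wiesenfeld sandpile in NumPy (my sketch, not the article's code); a log-log histogram of the recorded avalanche sizes shows the roughly power-law statistics the article discusses:

```python
import numpy as np

def avalanche(grid, i, j):
    """Drop one grain at (i, j) and topple until stable; return the number
    of topplings (the avalanche size). Grains falling off the edge are
    lost, which is the model's dissipation mechanism."""
    grid[i, j] += 1
    size = 0
    while True:
        unstable = grid >= 4
        n = int(unstable.sum())
        if n == 0:
            return size
        size += n
        grid[unstable] -= 4
        grid[1:, :] += unstable[:-1, :]    # one grain to each neighbor
        grid[:-1, :] += unstable[1:, :]
        grid[:, 1:] += unstable[:, :-1]
        grid[:, :-1] += unstable[:, 1:]

rng = np.random.default_rng(1)
grid = rng.integers(0, 4, size=(64, 64))
sizes = [avalanche(grid, rng.integers(64), rng.integers(64))
         for _ in range(20000)]
```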


Journal ArticleDOI
01 Jan 2000
TL;DR: The author sketches the early days of eigenvalue hunting, then describes the QR (or orthogonal triangular) matrix factorization algorithm and its major virtues; the symmetric case brings guaranteed convergence and an elegant implementation.
Abstract: After a brief sketch of the early days of eigenvalue hunting, the author describes the QR (or orthogonal triangular) matrix factorization algorithm and its major virtues. The symmetric case brings with it guaranteed convergence and an elegant implementation. An account of the impressive discovery of the algorithm brings the article to a close.

65 citations
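
The heart of the algorithm is an iteration on similarity transforms; a bare-bones sketch (my example, not the article's):

```python
import numpy as np

def qr_eigenvalues(A, iters=500):
    """Unshifted QR iteration: factor A = QR, then form RQ = Q.T @ A @ Q.
    Each step is a similarity transform, so the eigenvalues are preserved,
    and A drifts toward (quasi-)triangular form with the eigenvalues on
    the diagonal. Production codes add Hessenberg reduction and shifts
    for rapid convergence."""
    A = np.array(A, dtype=float)
    for _ in range(iters):
        Q, R = np.linalg.qr(A)
        A = R @ Q
    return np.sort(np.diag(A))

S = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])            # symmetric: convergence guaranteed
print(qr_eigenvalues(S))
print(np.sort(np.linalg.eigvalsh(S)))      # agreement check
```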


Journal ArticleDOI
01 Nov 2000
TL;DR: The extremal dynamics of the Bak-Sneppen model can be converted into an optimization algorithm called extremal optimization, which repeatedly updates the least-fit component of the system.
Abstract: The extremal dynamics of the Bak-Sneppen model can be converted into an optimization algorithm called extremal optimization. Attractive features of the model include the following: it is straightforward to relate the sum of all fitnesses to the cost function of the system; in the self-organized critical state to which the system inevitably evolves, almost all species have a much better than random fitness; most species preserve a good fitness for long times unless they are connected to poorly adapted species, providing the system with a long memory; the system retains a potential for large, hill-climbing fluctuations at any stage; and the model accomplishes these features without any control parameters.

63 citations
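
A hedged sketch of the tau-EO variant on MAX-CUT, a toy problem of my choosing (the helper names and the objective are illustrative, not from the paper); as in the Bak-Sneppen dynamics, the update targets poorly adapted components:

```python
import random

def cut_value(adj, side):
    """Edges crossing the partition; adj is a symmetric adjacency list."""
    return sum(side[u] != side[v]
               for u in range(len(adj)) for v in adj[u] if u < v)

def extremal_optimization(adj, steps=5000, tau=1.4):
    """tau-EO sketch for MAX-CUT. Each vertex's fitness is the fraction of
    its edges that cross the cut. Rank k (1 = worst fitness) is chosen
    with probability ~ k**(-tau) and flipped unconditionally -- no
    temperature, no other control parameter."""
    n = len(adj)
    side = [random.randint(0, 1) for _ in range(n)]
    best, best_val = side[:], cut_value(adj, side)
    def fitness(v):
        deg = len(adj[v])
        return sum(side[v] != side[u] for u in adj[v]) / deg if deg else 1.0
    for _ in range(steps):
        ranked = sorted(range(n), key=fitness)            # worst fitness first
        k = min(int(random.random() ** (-1.0 / (tau - 1.0))), n)
        side[ranked[k - 1]] ^= 1                          # unconditional flip
        val = cut_value(adj, side)
        if val > best_val:
            best, best_val = side[:], val
    return best, best_val
```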


Journal ArticleDOI
01 Jul 2000
TL;DR: A parallel implementation of an electronic-structure application on the Cray T3D and T3E has enabled the authors to perform some breakthrough calculations, for example, predicting the optical properties of systems on the order of 1,000 atoms from first principles.
Abstract: The authors present a parallel implementation of an electronic-structure application on the Cray T3D and T3E. This implementation has enabled the authors to perform some breakthrough calculations, for example, predicting the optical properties of systems on the order of 1,000 atoms from first principles.

57 citations


Journal ArticleDOI
01 Nov 2000
TL;DR: The authors describe how vehicle crash simulation based on explicit finite element codes has been extended to perform design optimization and robustness assessment for vehicle crashworthiness.
Abstract: Over the past 10 years (1990-2000), the computer analysis of vehicle crashworthiness has become a powerful and effective tool, reducing the cost and time to market of new vehicles that meet corporate and government crash safety requirements. Crash simulation is fundamentally computation-intensive, and it requires fast and powerful supercomputers to ensure reasonable turnaround time for the analyses. Because the explicit FE method is computationally efficient, researchers generally prefer it for crash simulation. Explicit FE codes are available from several third-party mechanical computer-aided engineering (CAE) software vendors, and the major automotive manufacturers, including Ford Motor Company, use them for crash simulation. The article focuses on how this technology has been extended to perform design optimization and robustness assessment.

40 citations
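
The computational efficiency of the explicit FE method comes from advancing the equations of motion with no global linear solve; a generic central-difference sketch (the callables are hypothetical stand-ins, not any vendor's code):

```python
import numpy as np

def explicit_dynamics(mass, internal_force, external_force, u0, v0, dt, nsteps):
    """Central-difference (explicit) time integration, the core of explicit
    FE crash codes. `mass` is a lumped (diagonal) mass vector, so each step
    needs only elementwise division -- no global solve. Stability requires
    dt below the Courant limit of the smallest element. `internal_force`
    and `external_force` are hypothetical user-supplied callables."""
    u, v = u0.copy(), v0.copy()
    a = (external_force(0.0) - internal_force(u)) / mass
    for n in range(nsteps):
        v_half = v + 0.5 * dt * a             # half-step velocity
        u = u + dt * v_half                   # full-step displacement
        a = (external_force((n + 1) * dt) - internal_force(u)) / mass
        v = v_half + 0.5 * dt * a             # complete the velocity update
    return u, v
```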


Journal ArticleDOI
01 Jan 2000
TL;DR: This survey reviews the history and current importance of Krylov subspace iteration algorithms.
Abstract: This survey article reviews the history and current importance of Krylov subspace iteration algorithms.

34 citations
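
The best-known member of the family is the conjugate gradient method; a minimal sketch (my example, not from the survey) whose iterate after k steps is optimal over the Krylov subspace span{b, Ab, ..., A^(k-1)b}:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10):
    """Minimal CG for symmetric positive definite A. Each iteration costs
    one matrix-vector product; the k-th iterate minimizes the A-norm of
    the error over the Krylov subspace built from b."""
    x = np.zeros_like(b)
    r = b.copy()                  # residual b - A @ x
    p = r.copy()                  # search direction
    rs = r @ r
    for _ in range(len(b)):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x
```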


Journal ArticleDOI
Tamar Schlick, Daniel A. Beard, Jing Huang, D.A. Strahs, Xiaoliang Qian
01 Nov 2000
TL;DR: In this paper, the authors describe computational challenges, solution approaches, and applications that their group has performed in DNA dynamics simulation, and present a set of algorithms appropriate for DNA's impressive spectrum of spatial and temporal levels.
Abstract: Simulating DNA's dynamics requires a sophisticated array of algorithms appropriate for DNA's impressive spectrum of spatial and temporal levels. The authors describe computational challenges, solution approaches, and applications that their group has performed in DNA dynamics simulation.

33 citations


Journal ArticleDOI
01 May 2000
TL;DR: Interferometric synthetic aperture radar and its spatially dense, accurate deformation measurements have advanced studies of the Earth's crust and the mapping of glacier and ice-sheet motions in the environmentally sensitive and diagnostic polar regions.
Abstract: High-speed and large-volume computational capabilities have affected many branches of scientific research. Interferometric synthetic aperture radar (InSAR) and its spatially dense, accurate deformation measurements have advanced studies of the Earth's crust. The most important contributions are related to seismic and volcanic processes and the mapping of glacier and ice-sheet motions in the environmentally sensitive and diagnostic polar regions.

Journal ArticleDOI
01 Mar 2000
TL;DR: The Center for Astrophysical Thermonuclear Flashes is constructing a new generation of codes to study runaway thermonuclear burning on the surface or in the interior of evolved compact stars; its first version, Flash-1, is a major advance toward a fully flexible code for solving general astrophysical fluid dynamics problems.
Abstract: The Center for Astrophysical Thermonuclear Flashes is constructing a new generation of codes designed to study runaway thermonuclear burning on the surface or in the interior of evolved compact stars. The center has completed the first version of Flash, Flash-1, which addresses various astrophysics problems. Flash-1 represents a major advance toward a fully flexible code for solving general astrophysical fluid dynamics problems. Flash-1 is modular and adaptive and operates in parallel computing environments. It was designed to let users configure initial and boundary conditions, change algorithms, and add new physical effects with minimal effort. It uses the Paramesh library to manage a block-structured adaptive grid, placing resolution elements only where needed most. It also uses the message passing interface (MPI) library to achieve portability and scalability on a variety of different message passing parallel computers.

Journal ArticleDOI
M. Parrinello
01 Nov 2000
TL;DR: The author reviews the principles on which molecular dynamics is based and illustrates how, in combination with modern density functional theory for the electronic structures, it provides a powerful tool for studying complex chemical processes without adjustable parameters.
Abstract: The author reviews the principles on which molecular dynamics is based. He also illustrates how, in combination with modern density functional theory for the electronic structures, it provides a powerful tool for studying complex chemical processes without adjustable parameters.
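
The basic machinery of molecular dynamics is a time-reversible integrator driving Newton's equations; a velocity-Verlet sketch with a Lennard-Jones force (my illustration; in the ab initio approach the article reviews, the forces come from density functional theory rather than an empirical potential):

```python
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces for a small cluster (no cutoff, no
    periodic boundaries) -- a stand-in for the DFT forces of ab initio MD."""
    n = len(pos)
    f = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            d = pos[i] - pos[j]
            r2 = d @ d
            s6 = (sigma * sigma / r2) ** 3
            fij = 24.0 * eps * (2.0 * s6 * s6 - s6) / r2 * d
            f[i] += fij
            f[j] -= fij
    return f

def velocity_verlet(pos, vel, mass, dt, nsteps, force=lj_forces):
    """Velocity-Verlet integration: time-reversible and symplectic, which
    is why it conserves energy well over long trajectories."""
    f = force(pos)
    for _ in range(nsteps):
        vel += 0.5 * dt * f / mass
        pos += dt * vel
        f = force(pos)
        vel += 0.5 * dt * f / mass
    return pos, vel
```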

Journal ArticleDOI
01 Sep 2000
TL;DR: Extensive computer simulations have clarified the nature of the steady states and fluctuations in cellular automata traffic models, as well as the kinetics of evolution toward these steady states, and they let researchers explore new traffic rules and control systems without potentially hazardous experiments on real traffic.
Abstract: Computer simulation of vehicular traffic can give us insight into a wide variety of phenomena observed in real traffic flow, thereby leading to a greater qualitative understanding of its basic principles. These simulations also help us explore the effects of implementing new traffic rules and control systems without doing potentially hazardous experiments with real traffic. For the sake of computational efficiency, many of the microscopic traffic models developed in recent years have been formulated using the language of cellular automata. These models of vehicular traffic belong to a class of nonequilibrium systems called driven-diffusive lattice gases (B. Schmittmann and R.K.P. Zia, 1995; G. Schutz, 2000). Over the last few years, several groups have been doing extensive computer simulations to understand the nature of the steady states and fluctuations, as well as the kinetics of evolution toward these steady states. The starting points of our discussion are the Nagel-Schreckenberg model (K. Nagel and M. Schreckenberg, 1992) and the Biham-Middleton-Levine model (O. Biham et al., 1992).
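
A sketch of the Nagel-Schreckenberg update on a circular road (my implementation of the model's four published steps: accelerate, brake, randomize, move):

```python
import random

def nasch_step(road, vmax=5, p=0.25):
    """One parallel update of the Nagel-Schreckenberg cellular automaton.
    road[i] is a car's velocity (0..vmax) or None for an empty cell; the
    road is a ring, so the car density is conserved."""
    L = len(road)
    cars = [i for i, v in enumerate(road) if v is not None]
    new_road = [None] * L
    for idx, i in enumerate(cars):
        ahead = cars[(idx + 1) % len(cars)]
        gap = (ahead - i - 1) % L              # empty cells to the next car
        v = min(road[i] + 1, vmax)             # 1. accelerate
        v = min(v, gap)                        # 2. brake to avoid collision
        if v > 0 and random.random() < p:      # 3. random slowdown
            v -= 1
        new_road[(i + v) % L] = v              # 4. move
    return new_road

road = [0 if random.random() < 0.2 else None for _ in range(100)]
for _ in range(500):
    road = nasch_step(road)
```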

Journal ArticleDOI
01 Jul 2000
TL;DR: The author's approach generalizes previously reported results on wavelet-based operator compression to nonuniform grids and fairly general domains.
Abstract: Second-generation surface wavelets can be an effective tool for compressing integral operator matrices arising from large-scale simulations of 3D problems. The author's approach generalizes previously reported results on wavelet-based operator compression to nonuniform grids and fairly general domains.
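
A hedged illustration with first-generation wavelets on a uniform grid (PyWavelets), unlike the paper's second-generation surface wavelets, but showing the same compression principle: a kernel matrix that is smooth away from the diagonal has mostly negligible wavelet coefficients.

```python
import numpy as np
import pywt

# Dense kernel matrix K[i, j] = 1 / (1 + |x_i - x_j|): smooth off the
# diagonal, the situation in which wavelet coefficients decay rapidly.
x = np.linspace(0.0, 1.0, 256)
K = 1.0 / (1.0 + np.abs(x[:, None] - x[None, :]))

coeffs = pywt.wavedec2(K, 'db4', level=4)
arr, slices = pywt.coeffs_to_array(coeffs)

thresh = 1e-6 * np.abs(arr).max()
kept = np.abs(arr) > thresh
arr[~kept] = 0.0                                  # drop tiny coefficients

K_hat = pywt.waverec2(
    pywt.array_to_coeffs(arr, slices, output_format='wavedec2'), 'db4')
print(f"kept {kept.mean():.1%} of coefficients, "
      f"max error {np.abs(K - K_hat).max():.2e}")
```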

Journal ArticleDOI
01 Jan 2000
TL;DR: The article presents a brief description of the techniques used in the Fortran I compiler for the parsing of expressions, loop optimization, and register allocation.
Abstract: The Fortran I compiler was the first demonstration that it is possible to automatically generate efficient machine code from high-level languages. It has thus been enormously influential. The article presents a brief description of the techniques used in the Fortran I compiler for the parsing of expressions, loop optimization, and register allocation.
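
The Fortran I compiler parsed expressions by weighting operators with precedence levels; a modern relative of that idea is precedence climbing, sketched here (my illustration of the general technique, not the compiler's original scheme):

```python
import re

PREC = {'+': 1, '-': 1, '*': 2, '/': 2, '**': 3}

def tokenize(s):
    return re.findall(r'\*\*|[+\-*/()]|\d+|[A-Za-z]\w*', s)

def parse_expr(tokens, min_prec=1):
    """Precedence climbing: bind higher-precedence operators more tightly
    by recursing with a raised minimum precedence."""
    lhs = parse_atom(tokens)
    while tokens and tokens[0] in PREC and PREC[tokens[0]] >= min_prec:
        op = tokens.pop(0)
        # '**' is right-associative, the rest are left-associative
        next_min = PREC[op] if op == '**' else PREC[op] + 1
        lhs = (op, lhs, parse_expr(tokens, next_min))
    return lhs

def parse_atom(tokens):
    tok = tokens.pop(0)
    if tok == '(':
        node = parse_expr(tokens)
        tokens.pop(0)           # consume ')'
        return node
    return tok

print(parse_expr(tokenize('a + b * c ** 2')))
# ('+', 'a', ('*', 'b', ('**', 'c', '2')))
```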

Journal ArticleDOI
01 Mar 2000
TL;DR: In this article, the authors present a virtual test facility (VTF) for full 3D parallel simulation of the dynamic response of materials undergoing compression due to shock waves, including high explosives.
Abstract: The goal of the Caltech Center is to construct a virtual test facility (VTF): a problem solving environment for full 3D parallel simulation of the dynamic response of materials undergoing compression due to shock waves. The objective is to design a software environment that will: facilitate computation in a variety of experiments in which strong shock waves impinge on targets comprising various combinations of materials; compute the target materials' subsequent dynamic response; and validate these computations against experimental data. Successfully constructing such a facility requires modeling of the highest accuracy. We must model at atomistic scales to correctly describe the material properties of the target materials and high explosives; at intermediate (meso) scales to understand the micromechanical response of the target materials; and at continuum scales to capture properly the evolution of macroscopic effects. The article outlines such a test facility. Although it is a very simplified version of the facilities found in a shock-compression laboratory, our VTF includes all the basic features, offering a problem solving environment for validating experiments and facilitating further development of simulation capabilities.

Book ChapterDOI
01 Jan 2000
TL;DR: This series contains monographs of lecture notes type, lecture course material, and high-quality proceedings on topics described by the term "computational science and engineering".
Abstract: This series contains monographs of lecture notes type, lecture course material, and high-quality proceedings on topics described by the term "computational science and engineering". This includes theoretical aspects of scientific computing such as mathematical modeling, optimization methods, discretization techniques, multiscale approaches, fast solution algorithms, parallelization, and visualization methods as well as the application of these approaches throughout the disciplines of biology, chemistry, physics, engineering, earth sciences, and economics.

Book ChapterDOI
01 Jan 2000
TL;DR: A finite difference multigrid technique is presented that efficiently solves the optimal control problem associated with the solid fuel ignition model.
Abstract: We present a finite difference multigrid technique that efficiently solves the optimal control problem associated with the solid fuel ignition model.
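
A generic sketch of the multigrid machinery on the 1D Poisson problem (my example, not the authors' optimal-control solver): smooth, restrict the residual, recurse, prolong the correction, smooth again.

```python
import numpy as np

def v_cycle(u, f, h, sweeps=3):
    """One multigrid V-cycle for -u'' = f with zero Dirichlet boundaries
    on a grid of 2**k + 1 points."""
    def smooth(u, f, h, n):
        for _ in range(n):       # damped Jacobi (omega = 2/3)
            u[1:-1] += (1.0 / 3.0) * (h * h * f[1:-1]
                                      + u[:-2] + u[2:] - 2.0 * u[1:-1])
        return u
    u = smooth(u, f, h, sweeps)
    if len(u) <= 3:
        return u
    r = np.zeros_like(u)         # residual of the fine-grid equation
    r[1:-1] = f[1:-1] - (2.0 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
    e_c = v_cycle(np.zeros(len(u) // 2 + 1), r[::2].copy(), 2 * h, sweeps)
    e = np.zeros_like(u)         # prolong the coarse-grid correction
    e[::2] = e_c
    e[1:-1:2] = 0.5 * (e_c[:-1] + e_c[1:])
    return smooth(u + e, f, h, sweeps)

n = 2**7 + 1
h = 1.0 / (n - 1)
f = np.ones(n)
u = np.zeros(n)
for _ in range(10):
    u = v_cycle(u, f, h)         # each cycle cuts the error by a fixed factor
```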

Journal ArticleDOI
01 Nov 2000
TL;DR: The authors discuss their strategies for accurate modeling, and Cobalt, a Beowulf supercomputer they built and optimized for quantum chemistry applications.
Abstract: Developing new forms of plastic and improving the efficiency of the catalysts that produce them requires detailed insight into the reaction steps in the polymerization process. This calls for thorough analysis and hefty computational resources. The authors discuss their strategies for accurate modeling, and Cobalt, a Beowulf supercomputer they built and optimized for quantum chemistry applications.

Journal ArticleDOI
01 Jul 2000
TL;DR: At the Institute for Science Education, a simulation program named xyZET has been developed and combined with an introductory course in mechanics to demonstrate its usefulness in teaching and learning physics.
Abstract: At the Institute for Science Education, a simulation program named xyZET has been developed and combined with an introductory course in mechanics to demonstrate its usefulness in teaching and learning physics. One of its distinct features enables the visualization of the movement of interacting particles in 3D. It adds value to teaching and learning complex processes, where visualization can help reduce cognitive load. The program can be used during lectures, where a simulation can often speak for itself in place of verbal explanation. The program also supports the development of exercise material for individualized learning. Such material in the form of Web pages can be combined with xyZET simulations, controlled by applets. Its basic features are discussed.

Journal ArticleDOI
01 Nov 2000
TL;DR: Researchers at the Program for Climate Model Diagnosis and Intercomparison have built a prototype data server and catalog system for accessing data sets generated by climate models, using a variety of open-source software components and languages.
Abstract: Researchers at the Program for Climate Model Diagnosis and Intercomparison have built a prototype data server and catalog system for accessing data sets generated by climate models. The output of a single climate model run might be large, on the order of hundreds of gigabytes stored in thousands of physical files. A number of modeling groups around the world produce and manage such data sets. Organizing, managing, and providing distributed access to this data is a significant challenge. Our prototype addresses this challenge using a variety of open-source software components and languages, including the Python language, the Medusa server, and Lightweight Directory Access Protocol (LDAP) directories.

Book ChapterDOI
01 Jun 2000
TL;DR: In this article, the combination of numerical simulations, interactive data mining, and visualization has proven to be very powerful when attempting to understand complex astrophysical systems, and examples from numerical magneto-hydrodynamical simulations within the context of astrophysical dynamos are discussed.
Abstract: The combination of numerical simulations, interactive data mining, and visualization has proven to be very powerful when attempting to understand complex astrophysical systems. The present contribution aims at illustrating this by discussing examples from numerical magneto-hydrodynamical simulations within the context of astrophysical dynamos. Two qualitatively different simulations are discussed: a study of the magnetic field topology in a kinematic dynamo model, and a model of the buoyant rise of a twisted magnetic flux rope through a stellar convection zone.

Journal ArticleDOI
01 May 2000
TL;DR: One technique for working with extremely large integers, perhaps thousands of digits long, using only standard hardware and software: modular arithmetic applied in a way that lets us recover the actual integers if necessary.
Abstract: Describes a technique for working with extremely large integers, perhaps thousands of digits long, using only standard hardware and software. The technique uses modular arithmetic in a way that lets us recover the actual integers if necessary.
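
The recovery step is the Chinese remainder theorem; a sketch (my example, not the article's code) that does arithmetic on residues modulo pairwise-coprime primes and reconstructs the integer at the end:

```python
from math import prod

# pairwise-coprime prime moduli; CRT recovers any integer below their product
MODULI = [2**31 - 1, 10**9 + 7, 10**9 + 9]

def to_residues(x):
    return [x % m for m in MODULI]

def from_residues(res):
    """Chinese remainder theorem reconstruction."""
    M = prod(MODULI)
    x = 0
    for r, m in zip(res, MODULI):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # modular inverse (Python 3.8+)
    return x % M

a, b = 12345678901, 98765432109
# arithmetic happens independently in each small modulus -- no carries
prod_res = [(ra * rb) % m
            for ra, rb, m in zip(to_residues(a), to_residues(b), MODULI)]
assert from_residues(prod_res) == a * b    # valid while a*b < prod(MODULI)
```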

Journal ArticleDOI
01 Nov 2000
TL;DR: The Janus model, cautious and skeptical, has particular value when everything looks good, the present glitters, and the future seems destined to glow even more brightly.
Abstract: they are improving the universe. They mix thunderbolts and other blunt instruments of intervention with jealousy, treachery, and manipulation. Such direct management is hard work with uncertain results, so the goddesses and gods have encouraged the proliferation of myths that guide as well as entertain the mass audience down on earth. Myths tend, however, to ossify divine whims that no longer make sense in an unpredictable world.

Janus is different. He comes equipped with a rearview mirror as well as a searchlight. One face looks toward the future, like his fellow deities, but he might see more clearly than his eminent colleagues, because he perceives rather than interferes. His other face looks backward toward past disappointments. Janus is mindful that similar, or worse, disasters are inevitable.

Scientists and engineers are, for the most part, inclined to identify with the top-billed gods. After all, both scientists and the top-ranked deities analyze and attempt to solve the world's problems. Melioristic expectations of constant improvement, achievement, and progress are especially prominent in computing.

Computing cherishes its own myths. It is assumed that microprocessors will always get more powerful, bandwidth will always get wider, and this will always be good for everyone. Ever since the toddler days of computing five decades ago, forecasters and promoters have claimed that it would bring dazzling benefits. Many of these hopes have been realized. In these successes, you can easily forget that the delay between prediction and accomplishment has almost always been longer than initial expectations, and solutions are rarely as perfect as anticipated. A great deal of waste, confusion, and miscarried effort have accompanied constant shortfalls between ballyhoo and reality. Mortals in the computer business have replicated on earth the dramatic but messy work style of the frontline Roman deities.

Science, engineering, and the computational capabilities that have become so indispensable for so many aspects of science might accomplish even more if they adopted Janus as their patron. In particular, practitioners of these arts should give more attention to negative consequences and the prospects of failed promises. They should weigh carefully whether they've based their confidence in the future on myths or reality. The Janus model, cautious and skeptical, has particular value when everything looks good, the present glitters, and the future seems destined to glow even more brightly.

Gazing down at earth in the early 21st century, the top-billed stars of the Pantheon brag about the apparent strides in economics, a stubborn, opaque field whose dimness has been rendered less impenetrable by the application of computational tools. In the United States, for example, growth, unemployment, inflation, and the federal deficit have all moved in the preferred directions. The stock markets oscillate nervously, but they remain at nearly divine levels. Gods and mortals alike squabble over the explanations. Many readers of economic entrails, including that mighty augur, Alan Greenspan, chairman of the US Federal Reserve, attribute these happy omens to steady improvements

Journal ArticleDOI
01 Nov 2000
TL;DR: Computational chemistry needs to focus on method development at all length and time scales; the article illustrates the challenges and opportunities with two specific examples of industrial problems.
Abstract: To remain competitive, the US chemical industry needs to use all available tools to increase productivity. Computational chemistry can help design new materials and improve manufacturing efficiency. However, it needs to focus on method development at all length and time scales. The article illustrates the challenges and opportunities with two specific examples of industrial problems.

Journal ArticleDOI
01 Jul 2000
TL;DR: The paper discusses the use of Macromedia's Flash software to create and incorporate low-bandwidth animations that give visitors considerable flexibility in how they interact with the Web page.
Abstract: These days, a visit to the World Wide Web usually involves a visual treat of things that move, blink, fade in or out, and morph from one shape to another. They sometimes even make noise. Banner advertisements are particularly notorious for using these effects to grab your attention. There are other good uses for animation, however, not the least of which is to illustrate a technological or scientific principle that would otherwise be hard to get across in a static, 2D graphic. The paper discusses the use of Macromedia's Flash software to create and incorporate low-bandwidth animations that give visitors considerable flexibility in how they interact with the Web page.

Journal ArticleDOI
01 Nov 2000
TL;DR: The author has restructured the first-year Electricity and Magnetism course that he has been teaching for a number of years, using only Microsoft Excel and elements of Microsoft's Visual Basic for Applications, which students learn during the previous semester.
Abstract: The author has restructured the first-year Electricity and Magnetism course that he has been teaching for a number of years in the Physics Department of the University of Ioannina. This is a required course for all first-year physics students during the spring semester; about 150 students enter the physics program every year. The course includes a computer laboratory, which replaces the traditional recitation sessions accompanying the four-hour-per-week formal lectures. In looking at how to organize the class, he studied a number of introductory physics courses that use computers, but he found them to be mostly demonstrational: they ask students to execute a few instructions, most of which seem somewhat artificial, and then follow a demonstration that has been prepared for each student. However, he believes that the method most pedagogically beneficial to students is one in which they do everything themselves, understanding what they are doing at every step. To implement this approach, he considered several computer packages such as MathCad and Mathematica, but they would require an inordinately long period of instruction before a first-year student could produce useful results. Therefore, he used only Microsoft Excel and elements of Microsoft's Visual Basic for Applications, which students learn during the previous semester.

Book ChapterDOI
01 Jan 2000
TL;DR: An AUSM (Advection Upstream Splitting Method) based discretization, using an explicit third-order discretization for the convective part and a line-implicit central discretization for the acoustic and diffusive parts, has been developed for the incompressible and low-speed compressible Navier-Stokes equations.
Abstract: An AUSM (Advection Upstream Splitting Method) based discretization method, using an explicit third-order discretization for the convective part and a line-implicit central discretization for the acoustic part and the diffusive part, has been developed for incompressible and low-speed compressible Navier-Stokes equations. The lines are chosen in the direction of the grid points with the shortest connection. The semi-implicit line method is used in multistage form because of the explicit third-order discretization of the convective part. Multigrid is used as an acceleration technique. Due to the implicit treatment of the acoustic and diffusive terms, the stiffness otherwise caused by high-aspect-ratio cells is removed. Low-Mach-number stiffness is treated by a preconditioning technique. Extreme care has been taken to ensure physically correct behaviour of the discretization for vanishing Mach number: stabilization terms are added to the mass flux, and both pressure and temperature stabilization terms are necessary. The coefficients of these terms are chosen so that correct scaling with Mach number is obtained. A blend of the low-speed algorithm with the original AUSM algorithm developed for high-speed applications has been constructed so that the resulting algorithm can be used at all speeds.
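
For reference, a sketch of the original compressible AUSM flux (Liou-Steffen) that the paper blends into its all-speed scheme, transcribed from the standard splitting formulas (my example; the paper's low-Mach variant adds preconditioning and the stabilization terms described above):

```python
import numpy as np

def ausm_flux(rhoL, uL, pL, rhoR, uR, pR, gamma=1.4):
    """Original AUSM interface flux for the 1D Euler equations: split the
    interface Mach number and pressure into left/right contributions, then
    upwind the convected quantities by the interface Mach number's sign."""
    def mach_split(M):
        if abs(M) > 1.0:
            return 0.5 * (M + abs(M)), 0.5 * (M - abs(M))
        return 0.25 * (M + 1.0) ** 2, -0.25 * (M - 1.0) ** 2

    def pressure_split(M):
        if abs(M) > 1.0:
            return 0.5 * (1.0 + np.sign(M)), 0.5 * (1.0 - np.sign(M))
        return (0.25 * (M + 1.0) ** 2 * (2.0 - M),
                0.25 * (M - 1.0) ** 2 * (2.0 + M))

    aL, aR = np.sqrt(gamma * pL / rhoL), np.sqrt(gamma * pR / rhoR)
    HL = gamma / (gamma - 1.0) * pL / rhoL + 0.5 * uL**2   # total enthalpy
    HR = gamma / (gamma - 1.0) * pR / rhoR + 0.5 * uR**2

    M_half = mach_split(uL / aL)[0] + mach_split(uR / aR)[1]
    p_half = pressure_split(uL / aL)[0] * pL + pressure_split(uR / aR)[1] * pR

    phi = (np.array([rhoL, rhoL * uL, rhoL * HL]) * aL if M_half >= 0.0
           else np.array([rhoR, rhoR * uR, rhoR * HR]) * aR)
    return M_half * phi + np.array([0.0, p_half, 0.0])
```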

Book ChapterDOI
01 Jan 2000
TL;DR: A fundamental study is carried out to find the appropriate numerical treatment of source terms in two-equation turbulence models, allowing the use of the multigrid technique; convergence rates are compared for different models and test cases.
Abstract: Source terms in two-equation turbulence models require special attention with respect to numerical stability. They can be treated explicitly or implicitly, depending on the Jacobian of the source terms. A fundamental study is carried out to find the appropriate numerical treatment, allowing the use of the multigrid technique. Different low-Reynolds turbulence models are investigated (k-ε and k-ω based models). Convergence rates are compared for different test cases (flat-plate flow, backward-facing step flow). A comparison is made between solving all the equations in a coupled way and decoupling the NS subsystem from the turbulence subsystem; convergence rates are comparable. A comparison between the use of V- and W-cycles shows comparable convergence rates, too.
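
The Jacobian-dependent explicit/implicit split the abstract mentions is commonly realized as a point-implicit update; a hedged sketch of that general technique (my illustration, not the paper's exact scheme):

```python
def point_implicit_update(k, source, jacobian, dt):
    """Point-implicit treatment of a turbulence source term S(k). Only the
    stabilizing (negative-Jacobian, destruction) part is taken implicitly;
    production stays explicit. Solving the linearized equation
    (k_new - k)/dt = S(k) + J_imp * (k_new - k) gives the update below,
    which damps the source-term stiffness and helps keep k positive.
    `source` and `jacobian` are hypothetical user-supplied callables."""
    S = source(k)
    J_imp = min(jacobian(k), 0.0)     # implicit only where dS/dk < 0
    return k + dt * S / (1.0 - dt * J_imp)
```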