
Showing papers presented at "Computational Science and Engineering in 2001"


Journal ArticleDOI
01 May 2001
TL;DR: Two public-domain programs that jointly predict macroscopic behavior are described: OOF (Object-Oriented Finite elements) and ppm2oof (Portable Pixel Map to OOF format translator).
Abstract: Determining a material's macroscopic properties given its microscopic structure is of fundamental importance to materials science. The authors describe two public-domain programs that jointly predict macroscopic behavior: OOF (Object-Oriented Finite elements) and ppm2oof (Portable Pixel Map to OOF format translator). The programs start from an image of the microstructure and end with results from finite-element calculations.

225 citations


Journal ArticleDOI
01 Jul 2001
TL;DR: Simulation approaches that seamlessly combine continuum mechanics with atomistic simulations and quantum mechanics, as well as computational and visualization issues associated with these simulations on massively parallel computers are described.
Abstract: The authors describe simulation approaches that seamlessly combine continuum mechanics with atomistic simulations and quantum mechanics. They also discuss computational and visualization issues associated with these simulations on massively parallel computers. Scientists are combining continuum mechanics and atomistic simulations through integrated multidisciplinary efforts so that a single simulation couples diverse length scales. However, the complexity of these hybrid schemes poses an unprecedented challenge, and developments in scalable parallel algorithms as well as interactive and immersive visualization are crucial for their success. This article describes such multiscale simulation approaches and associated computational issues using recent work as an example.

100 citations


Journal ArticleDOI
15 Sep 2001
TL;DR: The authors describe how computational tools, frequently in conjunction with advanced imaging tools, help us understand biomechanical effects in health and disease.
Abstract: Factors such as local stresses seem to be important in the development of arterial diseases such as atherosclerosis. The authors describe how computational tools, frequently in conjunction with advanced imaging tools, help us understand these biomechanical effects in health and disease.

45 citations


Journal ArticleDOI
15 Sep 2001
TL;DR: Emphasizing both I/O and computation, the authors apply several numerical methods including finite element analysis, wave equation integration, linear algebra subroutines, fast Fourier transforms, filters, and a variety of PDEs and ODEs.
Abstract: The authors discuss methods for expressing and tuning the performance of parallel programs, using two programming models in the same program: distributed and shared memory. Such methods are important for anyone who uses these large machines for parallel programs as well as for those who study combinations of the two programming models. The article outlines applications in hydrology, computational chemistry, general science, seismic processing, aeronautics, and computational physics. Emphasizing both I/O and computation, they apply several numerical methods including finite element analysis, wave equation integration, linear algebra subroutines, fast Fourier transforms (FFTs), filters, and a variety of PDEs (partial differential equations) and ODEs (ordinary differential equations).

33 citations


Journal ArticleDOI
01 May 2001
TL;DR: The authors present the basic dynamic portfolio optimization problem and then consider three aspects of it: taxes, investor preferences, and portfolio constraints.
Abstract: The authors describe a relatively simple problem that all investors face: managing a portfolio of financial securities over time to optimize a particular objective function. They show how complex such a problem can become when real-world constraints are incorporated into its formulation. More specifically, the authors present the basic dynamic portfolio optimization problem and then consider three aspects of it: taxes, investor preferences, and portfolio constraints. These three issues are by no means exhaustive; they merely illustrate the kinds of challenges financial engineers face today.

26 citations


Journal ArticleDOI
01 Jul 2001
TL;DR: Because properties at the nanoscale are size-dependent, nanoscale science and engineering offer an entirely new design motif for developing advanced materials and their applications.
Abstract: The terms nanostructure, nanoscience, and nanotechnology are currently quite popular in both the scientific and the general press. These intermediate-length structures are intriguing because generally in the nanometer region, almost all physical and chemical properties of systems become size-dependent. For example, although the color of a piece of gold remains golden as it reduces from inches to millimeters to microns, the color changes substantially in the regime of nanometers. Similarly, the melting points of such particles change as they enter the nanoscale, where the surface energies become comparable to the bulk energies. Because properties at the nanoscale are size-dependent, nanoscale science and engineering offer an entirely new design motif for developing advanced materials and their applications.

18 citations


Journal ArticleDOI
01 Nov 2001
TL;DR: Simulated annealing is an optimization procedure based on the behavior of condensed matter at low temperatures that mirrors the annealing process that takes place in nature and employs methods originating in statistical mechanics to find global minima of systems with many degrees of freedom.
Abstract: Simulated annealing is an optimization procedure based on the behavior of condensed matter at low temperatures that mirrors the annealing process that takes place in nature. The procedure employs methods that originated in statistical mechanics to find global minima of systems with many degrees of freedom. Scott Kirkpatrick and his colleagues first realized the correspondence between combinatorial optimization problems and the way natural systems search for the ground state (lowest energy state).1 They applied Monte Carlo methods (used in statistical mechanics) to the solution of global optimization problems and showed that you could obtain better solutions by simulating the annealing process. They also generalized the basic scheme of Nicholas Metropolis and his colleagues by introducing a multitemperature approach, which lowers the temperature slowly in stages.2 At each stage, the Metropolis procedure simulates the system until it reaches equilibrium. At the outset, the system starts with a high T, and then a cooling (or annealing) scheme is applied by slowly decreasing T according to some given procedure. At each T, a series of random new states is generated, and states that improve the cost function are accepted. Rather than always rejecting states that don't improve the cost function, however, such states can be accepted with some finite probability that depends on the amount of increase and on T. This randomizes the iterative improvement phase and allows occasional uphill moves (moves that don't improve the solution) in an attempt to reduce the probability of falling into a local minimum. As T decreases, configurations that increase the cost function are more likely to be rejected, and the process eventually terminates with a configuration (solution) that has the lowest cost.
This whole procedure leads to a solution that is arbitrarily close to the global minimum.3 In addition to having a well-formulated cost function, the design of an efficient annealing algorithm requires three other ingredients:3,4
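The procedure described in the abstract — a Metropolis acceptance rule inside a slowly cooled temperature loop — can be sketched in a few lines. This is a generic illustration, not the article's code; the geometric cooling schedule, the sweep count, and the toy cost function are illustrative choices.

```python
import math
import random

def simulated_annealing(cost, neighbor, state,
                        t_start=10.0, t_end=1e-3, alpha=0.95, sweeps=100):
    """Minimize `cost` by Metropolis sampling under a geometric cooling schedule."""
    current, current_cost = state, cost(state)
    best, best_cost = current, current_cost
    t = t_start
    while t > t_end:
        for _ in range(sweeps):                 # equilibrate at fixed T
            cand = neighbor(current)
            delta = cost(cand) - current_cost
            # Always accept improvements; accept uphill moves with
            # probability exp(-delta/T), which becomes rare as T drops.
            if delta <= 0 or random.random() < math.exp(-delta / t):
                current, current_cost = cand, current_cost + delta
                if current_cost < best_cost:
                    best, best_cost = current, current_cost
        t *= alpha                              # slow cooling stage
    return best, best_cost

# Toy run: minimize f(x) = x^2 over the integers with +/-1 moves.
random.seed(0)
sol, val = simulated_annealing(lambda x: x * x,
                               lambda x: x + random.choice((-1, 1)),
                               state=50)
```

At high T the walk is nearly free; in the final near-greedy stages it descends to the global minimum at x = 0.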

15 citations


Journal ArticleDOI
15 Sep 2001
TL;DR: The Surface Evolver numerically simulates an instability observed as noncoalescing drops are pressed together, by assuming that the stability is attributable to a minimization of surface area under a uniform surface tension.
Abstract: In previous experimental studies of noncoalescing liquid drops, researchers encountered an instability as the drops were pressed together. The Surface Evolver simulates this instability numerically by assuming that the stability is attributable to a minimization of surface area under a uniform surface tension.

14 citations


Journal ArticleDOI
E.A. Lunney
15 Sep 2001
TL;DR: Computational chemistry, a technology that can play an important role in expediting the drug design phase, is highlighted in this article.
Abstract: Drug discovery is an extended process that can take as many as 15 years from the first compound synthesis in the laboratory until the therapeutic agent, or drug, is brought to market. Reducing the research timeline in the discovery stage is a key priority for pharmaceutical companies worldwide, and through the application and collaboration of current, advanced technologies, the industry is poised to achieve that goal. This article highlights computational chemistry, a technology that can play an important role in expediting the drug design phase.

14 citations


Journal ArticleDOI
01 Jan 2001
TL;DR: The article shows that a simple criterion can help predict which species survive in the long run, an example of the principle of mutual exclusion.
Abstract: The author describes the chemostat, a device used to study bacterial populations under nutrient-limited conditions. The article shows that a simple criterion can help predict which species survive in the long run, an example of the principle of mutual exclusion.
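The abstract does not spell out the survival criterion, but under the standard Monod growth model it is usually stated through each species' break-even nutrient concentration λ = KD/(m − D): the species whose λ is smallest (and below the input concentration) excludes the others. A minimal sketch with hypothetical parameter values:

```python
def break_even(m, K, D):
    """Break-even nutrient concentration lambda = K*D/(m - D) for Monod
    growth with maximum rate m, half-saturation K, dilution rate D."""
    if m <= D:
        return float('inf')      # this species washes out entirely
    return K * D / (m - D)

# Hypothetical species parameters: (max growth rate, half-saturation).
species = {"A": (1.0, 0.3), "B": (0.8, 0.1)}
D = 0.5                          # dilution rate of the chemostat
lam = {name: break_even(m, K, D) for name, (m, K) in species.items()}
winner = min(lam, key=lam.get)   # smallest break-even concentration survives
```

Here species B, despite its lower maximum growth rate, wins because its smaller half-saturation constant gives it the lower break-even concentration.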

11 citations


Journal ArticleDOI
01 Nov 2001
TL;DR: Computational models based on natural phenomena have gained popularity, and many perceive them to hold the promise of building more powerful massively parallel computers that can provide considerably more computing power than we currently have.
Abstract: Computational models based on natural phenomena have gained popularity. Some of these techniques are more esoteric, such as DNA-based and quantum computation. Many perceive them to hold the promise of building more powerful massively parallel computers that can provide considerably more computing power than we currently have.

Journal ArticleDOI
01 Mar 2001
TL;DR: SPAM is a technique which provides a versatile approach to simulating many difficult problems in continuum mechanics, such as the breakup of a cavitating fluid and the penetration of one solid by another.
Abstract: Smooth Particle Applied Mechanics (SPAM) is a technique which provides a versatile approach to simulating many difficult problems in continuum mechanics, such as the breakup of a cavitating fluid and the penetration of one solid by another. SPAM also provides a simple evaluation method for all the continuum variables, as well as the spatial gradients required by the evolution equations. This global evaluation simplifies interpolation, rezoning, and Fourier transformation. Because SPAM is so flexible and easy to program, it should be included in the toolkit of anyone doing simulations.
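The "simple evaluation method" the abstract refers to is the smooth-particle interpolation formula, ρ(x) = Σ_j m_j W(x − x_j, h). A minimal one-dimensional sketch, using Lucy's quartic kernel (a common choice in the SPAM/SPH literature, though the article's own kernel is not given here):

```python
def lucy_kernel_1d(r, h):
    """Lucy's quartic smoothing kernel in 1D, normalized so its integral is 1."""
    q = abs(r) / h
    if q >= 1.0:
        return 0.0
    return (5.0 / (4.0 * h)) * (1.0 + 3.0 * q) * (1.0 - q) ** 3

def density(x, particles, m, h):
    """Smooth-particle estimate rho(x) = sum_j m * W(x - x_j, h)."""
    return sum(m * lucy_kernel_1d(x - xj, h) for xj in particles)

# Particles of mass 0.1 spaced 0.1 apart should give unit density
# in the interior of the particle line.
dx, m, h = 0.1, 0.1, 0.3
particles = [j * dx for j in range(-50, 51)]
rho = density(0.0, particles, m, h)
```

The same weighted sum, with the kernel gradient in place of the kernel, yields the spatial gradients required by the evolution equations.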

Journal ArticleDOI
01 May 2001
TL;DR: The authors anticipate the greatest benefit will come from mathematical platforms that allow for computer-assisted insight generation, not from solutions of grand-challenge problems.
Abstract: Almost all interesting mathematical algorithmic questions relate to NP-hard questions. Such computation is prone to explode exponentially. The authors anticipate the greatest benefit will come from mathematical platforms that allow for computer-assisted insight generation, not from solutions of grand-challenge problems.

Journal ArticleDOI
01 Jul 2001
TL;DR: Which chemical elements in the given literature are most widely studied by experiment and simulation; how did the experiment and theory of these materials change from 1989 to 1999; and can the authors observe any trends from such work?
Abstract: Based on information derived from bibliographic titles in the INSPEC database, this report identifies trends in experimental and theoretical materials research. It considers three specific questions: which chemical elements in the given literature are most widely studied by experiment and simulation; how did the experiment and theory of these materials change from 1989 to 1999; and can we observe any trends from such work?

Journal ArticleDOI
01 Jul 2001
TL;DR: To meet the high computing-power and memory requirements of applying the full-potential linearized augmented plane wave method to complex problems, the authors have parallelized the FP-LAPW code WIEN97 for distributed-memory machines.
Abstract: To meet the high computing-power and memory requirements of applying the full-potential linearized augmented plane wave method to complex problems, the authors have parallelized the FP-LAPW code WIEN97 for distributed-memory machines. They then implemented the parallel code on a Cray T3E.

Journal ArticleDOI
01 Jan 2001
TL;DR: Today, you might argue the gender bias of such games and maybe even their usefulness, but you can't ignore the power of games as educational tools.
Abstract: Historically, useful adult survival skills have often been taught to children through games. In most societies through the ages, warfare and hunting games have been encouraged for boys, and childcare games have been encouraged for girls. Today, you might argue the gender bias of such games and maybe even their usefulness, but you can't ignore the power of games as educational tools.

Book ChapterDOI
01 Jan 2001
TL;DR: In this article, a conjugate gradient solver is developed for harmonic balanced finite element systems and applied to a transformer with ferromagnetic material; the approach avoids the construction of real equivalent systems, which suffer from worse spectral conditions for Krylov subspace solvers.
Abstract: A Conjugate Gradient solver is developed for harmonic balanced finite element systems. This approach avoids the construction of real equivalent systems, which suffer from worse spectral conditions for Krylov subspace solvers. The application to a transformer with ferromagnetic material shows the better convergence of the Conjugate Gradient solver.
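The point about avoiding real equivalent systems can be illustrated by running conjugate gradients directly in complex arithmetic on a Hermitian positive-definite system, instead of on the twice-as-large real reformulation. This is a generic textbook sketch on a tiny dense system, not the paper's solver:

```python
def cg_hermitian(A, b, tol=1e-12, max_iter=100):
    """Conjugate gradients on a complex Hermitian positive-definite
    system A x = b, run directly in complex arithmetic."""
    n = len(b)
    matvec = lambda v: [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    vdot = lambda u, v: sum(ui.conjugate() * vi for ui, vi in zip(u, v))
    x = [0j] * n
    r = list(b)                          # residual of the zero initial guess
    p = list(r)
    rs = vdot(r, r).real
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rs / vdot(p, Ap).real    # real for Hermitian PD systems
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = vdot(r, r).real
        if rs_new < tol * tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# Small Hermitian positive-definite test system.
A = [[4 + 0j, 1 - 1j],
     [1 + 1j, 3 + 0j]]
b = [1 + 0j, 2j]
x = cg_hermitian(A, b)
```

For an n-dimensional system the iteration converges in at most n steps in exact arithmetic; the real equivalent system doubles n and, as the abstract notes, worsens the spectrum seen by Krylov methods.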

Journal ArticleDOI
01 Jul 2001
TL;DR: The development of a complete predictive simulation capability for the effects of general anisotropic nonuniform stress on dopant diffusion in silicon is presented as an example for modern physical process modeling.
Abstract: The silicon-based metal oxide semiconductor field effect transistor (MOSFET) is at the heart of today's semiconductor industry. Because the switching speed of a MOSFET increases linearly with shrinking dimensions, the semiconductor industry has constantly improved computer performance by scaling a more or less unchanged device geometry. Despite the successful history of device miniaturization, scaling is reaching the physical limits of traditional device materials. With the reduction of gate lengths and the use of more exotic materials such as metal gates, the influence of stress on diffusion becomes a more prevalent component in determining the final dopant profile and subsequent device performance. We present the development of a complete predictive simulation capability for the effects of general anisotropic nonuniform stress on dopant diffusion in silicon as an example for modern physical process modeling. We also discuss how to effectively integrate predictive modeling tools such as this into the development of state-of-the-art semiconductor devices.

Journal ArticleDOI
01 May 2001
TL;DR: A problem-based learning environment where knowledge is acquired by performing well-defined tasks and the students choose their own focus, order, and pace as they explore the material.
Abstract: Computational methods are part of the problem-solving skills that professionals working in quantitative fields need. Advanced textbooks provide the mathematical foundation for one specific approach; however, they lack the overview and examples generally necessary to choose the right method and implement a practical solution. Internet technology can be of great value in this context. Thus, we created a problem-based learning environment where knowledge is acquired by performing well-defined tasks. We also switched from a teaching-centered course to a learning-centered course; the students choose their own focus, order, and pace as they explore the material (http://pde.fusion.kth.se). Since 1997, the concept has been validated three times in summer courses with about 15 participants geographically dispersed around Stockholm and Göteborg (Sweden). It is now being tested year round with distance-learning students on the Web.

Book ChapterDOI
01 Jan 2001
TL;DR: Besides differential forms, special tensor fields are introduced and discussed for the representation of physical fields in continuous medium problems.
Abstract: Besides differential forms, special tensor fields are introduced and discussed for the representation of physical fields in continuous medium problems.

Journal ArticleDOI
01 May 2001
TL;DR: Recently introduced mixed-signal oscilloscopes combine the functionality of traditional analog measurement with basic features of digital logic analyzers.
Abstract: Oscilloscopes are among the most important measurement tools available to electrical engineers, and incorporating digital technology into these tools has significantly increased their capabilities. Recently introduced mixed-signal oscilloscopes combine the functionality of traditional analog measurement with basic features of digital logic analyzers.

Journal ArticleDOI
01 Jan 2001
TL;DR: To create electronic devices far smaller than those currently available, researchers are turning to single, electron-conducting molecules or small groups of metal atoms.
Abstract: Silicon based transistors and circuits are just too big, say a small group of researchers in the new field of nanoelectronics. To create electronic devices far smaller than those currently available, researchers are turning to single, electron-conducting molecules or small groups of metal atoms. The effort has received support from the Defense Advanced Research Projects Agency in the United States and from the European Commission abroad.

Journal ArticleDOI
15 Sep 2001

Journal ArticleDOI
01 May 2001
TL;DR: The article focuses on the use of computers for analysing and simulating turbulent flow and the power of even modest computers now opens the door to this avenue of investigation.
Abstract: The article focuses on the use of computers for analysing and simulating turbulent flow. Turbulence theory has a well-earned reputation for being difficult, but, paradoxically, it is so difficult that it is actually accessible. Precisely because it is difficult to make progress analytically, an increasingly important part of turbulence theory is computational. The peculiarities of real turbulence suggest that new mathematics and new physics are waiting to be discovered. The power of even modest computers now opens the door to this avenue of investigation.

Journal ArticleDOI
01 May 2001
TL;DR: David Hilbert's problems spanned the spectrum from rather trivial (Problem 3 on the equality of the volumes of two tetrahedra) to the probably impossible (Problem 6 on the axiomatization of physics).
Abstract: In 1900, David Hilbert presented 23 problems at the International Congress of Mathematicians held in Paris. Hilbert's problems spanned the spectrum from rather trivial (Problem 3 on the equality of the volumes of two tetrahedra) to the probably impossible (Problem 6 on the axiomatization of physics). Nonetheless, they challenged many of the leading scientists and mathematicians of the 20th century and set the tone for mathematical research, especially in the early part of the century.

Journal ArticleDOI
01 Nov 2001
TL;DR: The authors argue that the real secret to Terra's success will lie in how the scientific community reacts to the data generated by NASA's Terra spacecraft, noting that advances in hardware and theoretical knowledge have combined to allow for such leaps.
Abstract: There is no doubt the amount of data generated by NASA's Terra spacecraft is impressive: at about a terabyte per day, it is billed as the largest-ever civilian computational science undertaking. While the investigators say advances in hardware and theoretical knowledge have combined to allow for such leaps, the real secret to Terra's success will lie in how the scientific community reacts to the data.

Journal ArticleDOI
01 Nov 2001
TL;DR: The LabPro is a compact device that can operate as a standard computer interface connected to the serial or USB bus on either a Mac or Windows computer, and can serve as an interface to a Texas Instruments graphical calculator or even as a stand-alone data collector.
Abstract: 9 scientists and engineers have special requirements—their interfaces must be easy to use, adaptable to many kinds of experiments, and not too expensive. For 25 years, I have struggled to reconcile these needs with different computer systems. Starting with home-built analog-to-digital converter chips connected to PDP-8s and 6502s, I progressed to similar external circuitry connected to the parallel port of the first IBM PCs in the 1980s, but all these solutions required students to know assembly language programming. As commercial I/O boards became available for both PCs and Unix systems, those turned up in labs, but commercial interfaces by National and others—with LabView, Matlab, or Mathcad software—eventually replaced the I/O boards in advanced labs. Unfortunately, these interfaces still demanded that students have considerable computer knowledge. More to the point, I couldn't afford to equip a first-year lab with dozens of expensive workstations as well as interfaces that cost nearly a thousand dollars. Another drawback was that many data acquisition boards took full-size I/O slots, and most PCs included only half-size ones. Fortunately, help was on the way. About 10 years back, Vernier produced an external Universal Lab Interface (ULI) for the Macintosh, which was less than half the price of previous devices, had many transducers, and included software that was not only student-friendly but also almost instructor-proof. I successfully equipped several networked freshman and sophomore physics labs with these devices. Even its appearance was appealing. The transparent cover let students see the chips and components making up the interface and removed some of the black-box mystery of other models. Alas, an update a few years later produced a streamlined and metal-shielded version that was less scrutable but otherwise worked as well (see Figure 1). Other manufacturers eventually followed with desktop interfaces as well. 
Recently, iMacs replaced the lab computers, but the ULI interface doesn't support the iMac USB. Evidently, Vernier realized this and produced the LabPro interface (see Figure 2), a compact device that can operate as a standard computer interface connected to the serial or USB bus on either a Mac or Windows computer. (Windows 95 or earlier does not support a USB connection, but you can use the serial connector on these machines.) The LabPro can also serve as an interface to a Texas Instruments graphical calculator or even as a stand-alone data collector. At US $220, it is fairly economical—even for high school laboratories—so through …

Proceedings Article
01 Jan 2001
TL;DR: In this paper, a subgrid model for linear convection-diffusion-reaction problems with fractal rough coefficients is proposed and a priori and a posteriori error estimates are presented.
Abstract: In this paper we propose and study a subgrid model for linear convection-diffusion-reaction problems with fractal rough coefficients. The subgrid model is based on extrapolation of a modeling residual from coarser scales, using a computed solution without a subgrid model on a finest scale as reference. We present a priori and a posteriori error estimates, and we show in experiments that a solution with the subgrid model on a scale h corresponds to a solution without the subgrid model on a scale less than h/4.

Journal ArticleDOI
01 May 2001
TL;DR: The authors present their 2nd February 2001 Department of Defense report, with minor edits, analyzing US government export control strategies for high-performance computing, and recommend a shift away from hardware controls to protections on critical defense applications software.
Abstract: The authors present their 2nd February 2001 Department of Defense report, with minor edits, analyzing US government export control strategies for high-performance computing. Due to advances in clustering technology, export controls on high-performance computer systems are found to be no longer effective. The authors recommend a shift away from hardware controls to protections on critical defense applications software.