
Showing papers published in "Computational Science and Engineering" in 2002


Journal ArticleDOI
01 May 2002
TL;DR: In this article, an object-oriented scripting interface to a mature density functional theory code is presented, which gives users a high-level, flexible handle on the code without rewriting the underlying number-crunching code.
Abstract: The authors have created an object-oriented scripting interface to a mature density functional theory code. The interface gives users a high-level, flexible handle on the code without rewriting the underlying number-crunching code. The authors also discuss the design issues and advantages of homogeneous interfaces.
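
Below is a minimal, self-contained sketch of the general idea described above, not the authors' actual interface: a thin object-oriented Python layer steering a number-crunching kernel without modifying it. The names (`Atoms`, `_legacy_total_energy`) are hypothetical, and a toy formula stands in for the wrapped compiled code.

```python
# Hypothetical names throughout; the real binding would call compiled routines.

def _legacy_total_energy(symbols, positions, cutoff_ev):
    """Stand-in for the wrapped Fortran/C kernel (a toy pair formula keeps it runnable)."""
    e = 0.0
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            d = sum((a - b) ** 2 for a, b in zip(positions[i], positions[j])) ** 0.5
            e += 1.0 / d - 2.0 / d ** 0.5   # cutoff_ev would be passed to the real code
    return e


class Atoms:
    """High-level, scriptable handle: users manipulate this object, not the kernel."""

    def __init__(self, symbols, positions):
        self.symbols = list(symbols)
        self.positions = [tuple(float(x) for x in p) for p in positions]

    def get_potential_energy(self, cutoff_ev=300.0):
        # The heavy lifting stays in the legacy code; Python only marshals arguments.
        return _legacy_total_energy(self.symbols, self.positions, cutoff_ev)


h2 = Atoms(["H", "H"], [(0.0, 0.0, 0.0), (0.0, 0.0, 0.74)])
print(h2.get_potential_energy())
```

Scripts can then set up, loop over, and combine calculations at a high level while the number-crunching code stays untouched, which is the design point the abstract makes.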

1,231 citations


Journal ArticleDOI
01 Mar 2002
TL;DR: The Zoltan library simplifies the load-balancing, data movement, unstructured-communication, and memory usage difficulties that arise in dynamic applications such as adaptive finite-element methods, particle methods, and crash simulations.
Abstract: The Zoltan library is a collection of data management services for parallel, unstructured, adaptive, and dynamic applications that is available as open-source software. It simplifies the load-balancing, data movement, unstructured-communication, and memory usage difficulties that arise in dynamic applications such as adaptive finite-element methods, particle methods, and crash simulations. Zoltan's data-structure-neutral design also lets a wide range of applications use it without imposing restrictions on application data structures. Its object-based interface provides a simple and inexpensive way for application developers to use the library and researchers to make new capabilities available under a common interface.
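
To illustrate the data-structure-neutral, callback-based style the abstract describes, here is a plain-Python sketch with hypothetical names. Zoltan itself is a C library; none of the calls below are its actual API, and the greedy partitioner only stands in for its load-balancing algorithms.

```python
class LoadBalancer:
    """Partitioner that sees application data only through user-supplied callbacks."""

    def __init__(self, num_objects_fn, object_weight_fn):
        self._num_objects = num_objects_fn   # callback: how many local objects exist
        self._weight = object_weight_fn      # callback: relative cost of one object

    def partition(self, num_parts):
        """Greedy weighted partitioning; returns {object_id: part}."""
        ids = sorted(range(self._num_objects()), key=self._weight, reverse=True)
        loads = [0.0] * num_parts
        assignment = {}
        for obj in ids:
            part = loads.index(min(loads))   # place each object on the least-loaded part
            assignment[obj] = part
            loads[part] += self._weight(obj)
        return assignment


# The application keeps its own mesh/particle structures and only supplies callbacks,
# so no restrictions are imposed on how its data is stored.
elements = [{"cost": c} for c in (3.0, 1.0, 2.0, 2.5, 0.5)]
lb = LoadBalancer(lambda: len(elements), lambda i: elements[i]["cost"])
print(lb.partition(num_parts=2))
```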

307 citations


Journal ArticleDOI
01 May 2002
TL;DR: The author reviews the fundamental technology-independent limits of computing and concludes that novel physically motivated computing paradigms can help in certain ways, but even they remain subject to some basic limits.
Abstract: Many of the fundamental limits on information processing, from thermodynamics, relativity, and quantum mechanics, are only a few decades away. Novel physically motivated computing paradigms, such as reversible computing and quantum computing, may help in certain ways, but even they remain subject to some basic limits. One can arrive at several firm conclusions regarding upper bounds on the limits of computing. In this article, the author only reviews the fundamental technology-independent limits.
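
One concrete example of such a technology-independent limit is Landauer's thermodynamic bound of k_B T ln 2 joules per irreversibly erased bit. The short calculation below is a standard textbook estimate, not a result taken from the article.

```python
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # room temperature, K

e_per_bit = k_B * T * math.log(2)     # ~2.9e-21 J per erased bit
bits_per_watt = 1.0 / e_per_bit       # irreversible bit erasures per second at 1 W

print(f"Landauer limit at {T:.0f} K: {e_per_bit:.3e} J per bit")
print(f"Max irreversible bit operations per second per watt: {bits_per_watt:.3e}")
```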

92 citations


Book ChapterDOI
01 Jan 2002
TL;DR: In this paper, a one-dimensional model of blood flow in human arteries is proposed for the case when an abrupt variation of the mechanical characteristics of an artery is caused by the presence of a vascular prosthesis (e.g. a stent).
Abstract: We investigate a one-dimensional model of blood flow in human arteries. In particular, we consider the case when an abrupt variation of the mechanical characteristics of an artery is caused by the presence of a vascular prosthesis (e.g. a stent). The derivation of the model and the numerical scheme adopted for its solution are detailed. Numerical experiments show the effectiveness of the model for the problem at hand.
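
The abstract does not reproduce the model or scheme, so the sketch below is only a generic stand-in: a linear 1D hyperbolic system with an abrupt jump in a stiffness-like coefficient (loosely mimicking a prosthesis) advanced with a Lax-Friedrichs step. All names and parameter values are illustrative and are not taken from the paper.

```python
# Generic system in conservation form: p_t + (K(x) u)_x = 0,  u_t + (p/rho)_x = 0.
import numpy as np

nx = 400
dx = 1.0 / nx
x = (np.arange(nx) + 0.5) * dx
rho = 1.0
K = np.where((x > 0.45) & (x < 0.55), 100.0, 25.0)   # stiffer segment ("prosthesis")
c_max = np.sqrt(K.max() / rho)
dt = 0.8 * dx / c_max                                 # CFL-limited time step

p = np.exp(-((x - 0.2) / 0.03) ** 2)                  # incoming pressure-like pulse
u = p / np.sqrt(K * rho)                              # approximately right-going wave

def lax_friedrichs(q, flux):
    """One Lax-Friedrichs update with crude copy-out boundaries."""
    qn = 0.5 * (np.roll(q, -1) + np.roll(q, 1)) \
        - 0.5 * dt / dx * (np.roll(flux, -1) - np.roll(flux, 1))
    qn[0], qn[-1] = qn[1], qn[-2]
    return qn

for _ in range(int(0.08 / dt)):
    p_new = lax_friedrichs(p, K * u)        # update p with flux K*u
    u_new = lax_friedrichs(u, p / rho)      # update u with flux p/rho
    p, u = p_new, u_new

# Part of the pulse reflects at the stiffness jump and part is transmitted.
print("peak |p| left of the insert :", np.abs(p[x < 0.45]).max())
print("peak |p| at/right of insert :", np.abs(p[x >= 0.45]).max())
```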

77 citations


Journal ArticleDOI
01 Nov 2002
TL;DR: In this paper, a hybrid-order treelike Markov model is proposed to predict Web access precisely while providing high coverage and scalability, which is crucial in the rapidly growing World Wide Web.
Abstract: Accurately predicting Web user access behavior can minimize user-perceived latency, which is crucial in the rapidly growing World Wide Web. Although traditional Markov models have helped predict user access behavior, they have serious limitations. Hybrid-order treelike Markov models predict Web access precisely while providing high coverage and scalability.
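
For orientation, a simplified fixed-order Markov predictor for page-access sequences is sketched below; the paper's hybrid-order tree-like model combines several orders and prunes states, which this toy version does not attempt.

```python
from collections import Counter, defaultdict


def train(sessions, order=2):
    """Count which page follows each length-`order` context."""
    model = defaultdict(Counter)
    for pages in sessions:
        for i in range(len(pages) - order):
            context = tuple(pages[i:i + order])
            model[context][pages[i + order]] += 1
    return model


def predict(model, recent_pages, order=2):
    """Return the most likely next page for the last `order` pages, if covered."""
    context = tuple(recent_pages[-order:])
    if context not in model:
        return None                       # no coverage for this context
    return model[context].most_common(1)[0][0]


sessions = [
    ["home", "news", "sports", "scores"],
    ["home", "news", "sports", "scores"],
    ["home", "news", "weather", "radar"],
]
m = train(sessions, order=2)
print(predict(m, ["news", "sports"]))     # -> "scores"
```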

74 citations


Book ChapterDOI
01 Jan 2002
TL;DR: In this paper, a numerical procedure for homogenization, which starts from a discretization of the multiscale differential equation, is described, where the discrete operator is represented in a wavelet space and projected onto a coarser subspace.
Abstract: Classical homogenization is an analytic technique for approximating multiscale differential equations. The number of scales is reduced and the resulting equations are easier to analyze or numerically approximate. The class of problems that classical homogenization applies to is quite restricted. We describe a numerical procedure for homogenization, which starts from a discretization of the multiscale differential equation. In this procedure the discrete operator is represented in a wavelet space and projected onto a coarser subspace. The wavelet homogenization applies to a wider class of problems than classical homogenization. The projection procedure is general, and we present a framework in Hilbert space that also applies to the differential equation directly. The wavelet-based homogenization technique is applied to discretizations of the Helmholtz equation. In one problem from electromagnetic compatibility, a subgrid-scale geometrical detail is represented on a coarser grid; in another, a waveguide filter is efficiently approximated in a lower dimension. The technique is also applied to the derivation of effective equations for a nonlinear problem and to the derivation of coarse-grid operators in multigrid. These multigrid methods work very well for equations with highly oscillatory or discontinuous coefficients.
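
A small numpy sketch of the projection step described above, in the usual formulation of wavelet homogenization: express the discrete operator in a one-level Haar basis and eliminate the fine-scale (detail) block with a Schur complement, leaving a coarse operator. The coefficient, grid size, and boundary handling are illustrative, not the paper's Helmholtz or multigrid examples.

```python
import numpy as np

n = 64
h = 1.0 / n
x = (np.arange(n) + 0.5) * h
a = 1.0 + 0.5 * np.sin(2 * np.pi * 16 * x)          # highly oscillatory coefficient

# Standard 3-point discretization of -d/dx( a(x) d/dx ) with Dirichlet-type ends.
L = np.zeros((n, n))
for i in range(n):
    am = a[i - 1] if i > 0 else a[i]
    ap = a[i + 1] if i < n - 1 else a[i]
    L[i, i] = (am + 2 * a[i] + ap) / (2 * h * h)
    if i > 0:
        L[i, i - 1] = -(am + a[i]) / (2 * h * h)
    if i < n - 1:
        L[i, i + 1] = -(ap + a[i]) / (2 * h * h)

# One level of the orthonormal Haar transform: averages (scaling) rows first,
# differences (detail) rows second.
W = np.zeros((n, n))
for k in range(n // 2):
    W[k, 2 * k] = W[k, 2 * k + 1] = 1 / np.sqrt(2)
    W[n // 2 + k, 2 * k] = 1 / np.sqrt(2)
    W[n // 2 + k, 2 * k + 1] = -1 / np.sqrt(2)

Lw = W @ L @ W.T
m = n // 2
Lss, Lsd = Lw[:m, :m], Lw[:m, m:]
Lds, Ldd = Lw[m:, :m], Lw[m:, m:]

# Coarse ("homogenized") operator: Schur complement that eliminates the details.
L_coarse = Lss - Lsd @ np.linalg.solve(Ldd, Lds)
print("coarse operator shape:", L_coarse.shape)
```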

48 citations


Journal ArticleDOI
01 May 2002
TL;DR: The theory of computational complexity has some interesting links to physics, in particular to quantum computing and statistical mechanics; the article contains an informal introduction to this theory and its links to physics.
Abstract: The theory of computational complexity has some interesting links to physics, in particular to quantum computing and statistical mechanics. The article contains an informal introduction to this theory and its links to physics.

48 citations


Journal ArticleDOI
01 Nov 2002
TL;DR: Soft-devices are software mechanisms that provide services to each other and to other virtual roles according to the content of their resources and related configuration information.
Abstract: Soft-devices are promising next-generation Web resources. They are software mechanisms that provide services to each other and to other virtual roles according to the content of their resources and related configuration information. They can contain various kinds of resources such as text, images, and other services. Configuring a resource in a soft-device is similar to installing software in a computer: both processes involve multi-step human-computer interactions.

46 citations


Journal ArticleDOI
01 May 2002
TL;DR: The author explores the application of patterns to dynamic-systems simulation, such as molecular dynamics, and identifies four design patterns that emerge in modeling such systems.
Abstract: Patterns are a well-understood methodology for object-oriented software architecture, especially for business applications. Scientific programmers have generally avoided object-oriented approaches because of their heavy computational overhead, but the benefits of using patterns for scientific problems can outweigh their costs. This article introduces the concept of object-oriented software patterns and discusses how they can be applied to scientific software problems. After a brief explanation of what patterns are and why they can be relevant to scientific software, the author explores the application of patterns to dynamic-systems simulation, such as molecular dynamics, and identifies four design patterns that emerge in modeling such systems. To illustrate how to reuse a general pattern for a specific problem, he applies one of the dynamic simulation patterns to the different problem of hydrodynamic chemistry tracers.
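
The abstract does not name the four patterns, so the sketch below only illustrates the general idea with one familiar pattern: a Strategy-style separation between a reusable time integrator and interchangeable force laws in a toy particle simulation. All class names are hypothetical.

```python
import numpy as np


class SpringForce:
    """One interchangeable 'strategy': a harmonic restoring force."""
    def __init__(self, k=1.0):
        self.k = k
    def __call__(self, positions):
        return -self.k * positions


class Simulation:
    """The integrator is written once and reused with any force object."""
    def __init__(self, positions, velocities, force, dt=0.01):
        self.x, self.v, self.force, self.dt = positions, velocities, force, dt
    def step(self):
        # velocity Verlet with unit masses
        a = self.force(self.x)
        self.x = self.x + self.v * self.dt + 0.5 * a * self.dt ** 2
        self.v = self.v + 0.5 * (a + self.force(self.x)) * self.dt


sim = Simulation(np.array([1.0]), np.array([0.0]), SpringForce(k=4.0))
for _ in range(100):
    sim.step()
print(sim.x, sim.v)   # swapping in a different force class needs no integrator changes
```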

33 citations


Journal ArticleDOI
01 Jul 2002
TL;DR: Two approaches locate specific features through algorithms geared to those features' underlying physics; a complementary technique denoises feature maps by exploiting spatial-scale coherence and using what the authors call feature-preserving wavelets.
Abstract: One effective way of exploring large scientific data sets is a process called feature mining. The two approaches described here locate specific features through algorithms that are geared to those features' underlying physics. Our intent with both approaches is to exploit the physics of the problem at hand to develop highly discriminating, application-dependent feature detection algorithms and then use available data mining algorithms to classify, cluster, and categorize the identified features. We have also developed a technique for denoising feature maps that exploits spatial-scale coherence and uses what we call feature-preserving wavelets. The examples presented demonstrate our feature mining approach as applied to steady computational fluid dynamics simulations on curvilinear grids.
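
As a toy stand-in for a physics-based feature detector of the kind described (not the authors' actual algorithms), the snippet below computes vorticity on a synthetic 2D velocity field and flags cells above a threshold as candidate vortex features.

```python
import numpy as np

n = 128
y, x = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n), indexing="ij")
r2 = x**2 + y**2
u = -y * np.exp(-8 * r2) + 0.3           # a single vortex embedded in a uniform stream
v = x * np.exp(-8 * r2)

dx = 2.0 / (n - 1)
dvdx = np.gradient(v, dx, axis=1)
dudy = np.gradient(u, dx, axis=0)
vorticity = dvdx - dudy                  # omega_z = dv/dx - du/dy

mask = np.abs(vorticity) > 0.5 * np.abs(vorticity).max()
print("cells flagged as vortex feature:", int(mask.sum()), "of", n * n)
```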

24 citations


Journal ArticleDOI
01 Sep 2002
TL;DR: The spectral element method offers distinct advantages for geophysical simulations, including geometric flexibility, accuracy, and scalability, which developers of atmospheric and oceanic models are capitalizing on to create new models that accurately and effectively simulate multi-scale flows in complex geometries.
Abstract: The spectral element method offers distinct advantages for geophysical simulations, including geometric flexibility, accuracy and scalability. Developers of atmospheric and oceanic models are capitalizing on these properties to create new models that can accurately and effectively simulate multi-scale flows in complex geometries.

Journal ArticleDOI
01 May 2002
TL;DR: Simulations work in practice because they exploit higher-level organizing principles in nature as discussed by the authors, and good code writing requires faithfulness to these principles and the discipline not to exceed their limits of validity.
Abstract: Simulations work in practice because they exploit higher-level organizing principles in nature. Good code writing requires faithfulness to these principles and the discipline not to exceed their limits of validity. An important exception is the use of simulation to search for new kinds of emergence.

Journal ArticleDOI
01 Jul 2002
TL;DR: Evans argues that without intellectual ownership of data as well as ideas, people will balk at data sharing, and that setting technical standards is a dangerous game because people get very protective of their own ways of doing things.
Abstract: "In addition to first-author publication, if you built a database or contributed a certain amount of data to it, it counts in some tangible way toward academic advancement. Without that, without intellectual ownership of data as well as ideas, people will balk, and it's naive to think they wouldn't." NIMH's Koslow agrees with Evans that change is needed in the reward system. He believes the NIH's new policy could gradually help change the prevailing cultural mindset. "Actually, on grants now, one of the review criteria is what individuals propose to do in terms of sharing data," he says. The issue of setting technical standards for neuroscience also looms. Evans says the Organization for Human Brain Mapping, a larger umbrella group of researchers, might spin off a subcommittee to address the issue. "But that's a dangerous game, because people get very protective of their own way of doing things. When you make recommendations as a committee, you're often standing there to be shot at by various factions. The OHBM may have a stronger credibility to recommend international standards, but the issue is fraught with politics." LONI's Toga says the prospect of formal standards in neuroscience technology will be a nonstarter. "I think you'll see folks who will build things that will become remarkably well accepted, and then, because of their large constituency, will emerge as standards," he says, noting the AIR program as an example. "A top-down approach in this field will never happen." Despite the potential obstacles to building a worldwide neuroscience network, Evans says the computational advances have profound implications for researchers and clinicians. "It's such an exciting time to be in this business," he says. "The speed of the computer allows us to conceive of things we couldn't even conceive of. It's not just a matter of doing things faster now that we could do before." "If we look at the area of functional connectivity, for example—if A fires off, what other parts of the brain fire—what's the temporal ordering between steps as you go from one part of the brain to another? If you think about that, you're looking at a 3D field, plus time, which becomes a 4D data set. For every voxel, you have to look up the time in all the others, so for every 3D voxel, you have to study …

Journal ArticleDOI
01 Jul 2002
TL;DR: The authors describe work performed using the catalog from the FIRST survey to classify galaxies with a bent-double morphology, meaning those galaxies that appear to be bent in shape.
Abstract: Astronomy data sets have led to interesting problems in mining scientific data. These problems will likely become more challenging as the astronomy community brings several surveys online as part of the National Virtual Observatory, giving rise to the possibility of mining data across many different surveys. In this article, we discuss the work we performed while using the catalog from the FIRST (Faint Images of the Radio Sky at Twenty centimetres) survey to classify galaxies with a bent-double morphology, meaning those galaxies that appear to be bent in shape. We describe the approach we took to mine this data, the issues we addressed in working with a real data set, and the lessons we learned in the process.
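
The snippet below is only a schematic of a classification step on made-up, catalog-style features (the opening angle and lobe flux ratio are hypothetical stand-ins, and the labeling rule is invented); it uses scikit-learn's decision tree as one common off-the-shelf classifier and involves none of the FIRST data or the paper's actual feature set.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400
angle = rng.uniform(90, 180, n)        # hypothetical: opening angle between components (deg)
flux_ratio = rng.uniform(0.2, 1.0, n)  # hypothetical: flux ratio of the two lobes
# Toy labeling rule: call a source "bent-double" if the angle is well below 180
# degrees and the lobes are comparable in flux.
label = ((angle < 150) & (flux_ratio > 0.5)).astype(int)

X = np.column_stack([angle, flux_ratio])
X_tr, X_te, y_tr, y_te = train_test_split(X, label, test_size=0.25, random_state=0)
clf = DecisionTreeClassifier(max_depth=3).fit(X_tr, y_tr)
print("held-out accuracy on the toy data:", clf.score(X_te, y_te))
```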

Journal ArticleDOI
01 Jul 2002
TL;DR: In previous parts, the authors used linear least squares to fit polynomials of various degrees to the annual global temperature anomalies for 1856 to 1999; they argue that higher-order polynomial behavior is rare in nature and that high-order fits give spurious wiggles between the data points.
Abstract: For Part II see ibid. vol.3 no.6 (2001). In previous parts we used linear least squares to fit polynomials of various degree to the annual global temperature anomalies for 1856 to 1999. Polynomials are much beloved by mathematicians but are of limited value for modeling measured data. Natural processes often display linear trends, and occasionally a constant acceleration process exhibits quadratic variation. However, higher-order polynomial behavior is rare in nature, which is more likely to produce exponentials, sinusoids, logistics, Gaussians, or other special functions. Modeling such behaviors with high-order polynomials usually gives spurious wiggles between the data points, and low-order polynomial fits give nonrandom residuals. We saw an example of this syndrome in Figure 4 of Part 1 (ibid. vol.3 no.5 (2001)), where we attempted to model a quasicyclic variation with a fifth-degree polynomial. That example also illustrated that polynomial fits usually give unrealistic extrapolations of the data.
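
The point about polynomial degree is easy to reproduce on synthetic data. The snippet below, which does not use the temperature series from the column, contrasts a low-degree and a high-degree least-squares fit to a non-polynomial (exponential) signal and shows how the high-degree fit extrapolates wildly beyond the data.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 30)
y = np.exp(1.5 * t) + 0.05 * rng.standard_normal(t.size)   # non-polynomial signal + noise

for deg in (1, 9):
    coeffs = np.polyfit(t, y, deg)
    resid = y - np.polyval(coeffs, t)
    extrap = np.polyval(coeffs, 1.5)                        # value outside the data range
    print(f"degree {deg}: rms residual {resid.std():.3f}, extrapolated value at t=1.5: {extrap:10.2f}")

print("true value at t=1.5:", np.exp(1.5 * 1.5))
```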

Journal ArticleDOI
01 Jul 2002
TL;DR: This article describes focused sampling strategies for mining scientific data based on the spatial aggregation language, which supports construction of data interpretation and control design applications for spatially distributed physical systems in a bottom-up manner.
Abstract: A novel framework leverages physical properties for mining in data-scarce domains. It interleaves bottom-up data mining with top-down data collection, leading to effective and explainable sampling strategies. This article describes focused sampling strategies for mining scientific data. Our approach is based on the spatial aggregation language (SAL), which supports construction of data interpretation and control design applications for spatially distributed physical systems in a bottom-up manner. Used as a basis for describing data mining algorithms, SAL programs also help exploit knowledge of physical properties such as continuity and locality in data fields. We also introduce a top-down sampling strategy that focuses data collection on only those regions deemed most important to support a data mining objective.

Journal ArticleDOI
01 Jan 2002
TL;DR: A notable example is the restoration of the Smithsonian Institution's trademark triceratops, mounted some 90 years ago, as part of a refurbishing of the dinosaur hall in the Smithsonian's National Museum of Natural History in Washington, DC, as discussed by the authors.
Abstract: Over the past 150 years, our perception of how dinosaurs looked has changed on the basis of museum displays and artists' renditions. Reproductions of lumbering dinosaurs at London's Crystal Palace during the mid 1800s look little like the creatures in the 40-year-old murals at Yale University's Peabody Museum of Natural History, or the computer-animated images of agile animals depicted in the Jurassic Park movies and recent television documentaries. Now, researchers are combining techniques from computer-aided design, rapid prototyping, and biomechanics to develop more accurate theories of dinosaurs' posture and movements. This information will affect paleontologists' concepts of how the animals lived and what they ate. Perhaps the most notable example of this synergy of paleontology and computing-related fields has been the restoration of the Smithsonian Institution's trademark dinosaur, a triceratops mounted some 90 years ago, as part of a refurbishing of the dinosaur hall in the Smithsonian's National Museum of Natural History in Washington, DC.

Journal ArticleDOI
01 Sep 2002
TL;DR: A semi-implicit 3D spectral-element dynamical core should achieve high performance levels on microprocessor-based parallel clusters and substantially accelerate the climate simulation rate.
Abstract: Spectral elements are a potential alternative to the spectral transform method widely used in atmospheric general circulation models. A semi-implicit formulation permits larger time steps than an explicit formulation. Thus, a semi-implicit 3D spectral-element dynamical core should achieve high performance levels on microprocessor-based parallel clusters and substantially accelerate the climate simulation rate.
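
A tiny calculation illustrates why implicit treatment of the fast terms permits larger time steps. It uses the generic oscillation equation du/dt = i*omega*u as a stand-in for fast gravity-wave-like terms, not the dynamical core's actual equations: forward Euler amplifies the fast mode at any step size, while the trapezoidal (Crank-Nicolson) update stays neutrally stable even when omega*dt is large.

```python
omega, dt = 50.0, 0.1                          # deliberately large omega*dt = 5

z = 1j * omega * dt
amp_explicit = abs(1 + z)                      # forward Euler amplification factor
amp_implicit = abs((1 + z / 2) / (1 - z / 2))  # trapezoidal amplification factor

print(f"forward Euler |amplification| per step : {amp_explicit:.3f}")
print(f"trapezoidal   |amplification| per step : {amp_implicit:.3f}")
```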

Journal ArticleDOI
01 Mar 2002
TL;DR: An overview of computational terminal ballistics methods and their application to large-scale problems of complex weapon-target interactions is provided.
Abstract: Large-scale numerical simulations of complex weapon-target interactions help guide experiments, illustrate physical processes, ascertain performance limits, extract transient response characteristics, and augment experimental databases. This article provides an overview of computational terminal ballistics methods and their application to large-scale problems.

Journal ArticleDOI
01 Sep 2002
TL;DR: The general circulation model, developed at the NASA Goddard Space Flight Center for climate simulations and other meteorological applications, emphasizes conservative and monotonic transport and achieves sufficient accuracy for the global consistency of climate processes.
Abstract: The general circulation model, developed at the NASA Goddard Space Flight Center for climate simulations and other meteorological applications, emphasizes conservative and monotonic transport and achieves sufficient accuracy for the global consistency of climate processes.
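
As a generic illustration of what "conservative and monotonic transport" means in a finite-volume setting (not the GCM's actual scheme), the sketch below advects a square tracer pulse with limited upwind fluxes: total mass is preserved to round-off, and no new extrema appear.

```python
import numpy as np

def minmod(a, b):
    """Limited slope: zero at extrema, otherwise the smaller one-sided difference."""
    return np.where(a * b > 0.0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

n, c = 200, 1.0                          # cells, constant wind speed
dx = 1.0 / n
dt = 0.4 * dx / c                        # CFL number 0.4
x = (np.arange(n) + 0.5) * dx
q = np.where((x > 0.1) & (x < 0.3), 1.0, 0.0)     # square tracer pulse, periodic domain

mass0 = q.sum() * dx
for _ in range(int(0.5 / dt)):
    slope = minmod(q - np.roll(q, 1), np.roll(q, -1) - q)
    face = q + 0.5 * (1.0 - c * dt / dx) * slope  # upwind-biased face values
    flux = c * face
    q = q - dt / dx * (flux - np.roll(flux, 1))   # conservative flux difference

print("mass change             :", abs(q.sum() * dx - mass0))   # ~ round-off only
print("min, max after advection:", q.min(), q.max())            # stays within [0, 1]
```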

Journal ArticleDOI
01 Jan 2002
TL;DR: In this paper, the authors present techniques to simplify complex computer simulations by partitioning parameter spaces and by using mathematical tools to visualize implementation effects.
Abstract: The authors present techniques to simplify complex computer simulations by partitioning parameter spaces and by using mathematical tools to visualize implementation effects.

Journal ArticleDOI
01 Mar 2002
TL;DR: The Naval Research Laboratory has developed the world's first eddy-resolving global ocean nowcast and forecast system that uses satellite observations and an ocean model running on a high-performance-computing platform to enhance real-time knowledge of the marine environment in which naval submarines and ships must operate.
Abstract: The Naval Research Laboratory has developed the world's first eddy-resolving global ocean nowcast and forecast system. It uses satellite observations and an ocean model running on a high-performance-computing platform to enhance real-time knowledge of the marine environment in which naval submarines and ships must operate.

Journal ArticleDOI
01 Sep 2002
TL;DR: The National Infrastructure Simulation and Analysis Center (NISAC), operated by a core partnership at Sandia and Los Alamos National Laboratories, is being envisioned as the central component of this protective architecture.
Abstract: The continuing availability of vital resources is paramount in ensuring the US national economy's well-being. Yet until recently, no attempt was made to form an overarching computational architecture capable of helping policymakers identify and protect those resources on a national level. The National Infrastructure Simulation and Analysis Center (NISAC), operated by a core partnership at Sandia and Los Alamos National Laboratories, is envisioned as the central component of this protective architecture. The laboratories plan to leverage their modeling and simulation experience in areas as varied as nuclear weapons and traffic patterns.

Journal ArticleDOI
01 Nov 2002
TL;DR: The Michelson-Morley experiment can be examined, using standard mathematical concepts of error analysis, as a case study in validation, and the author proposes two research directions.
Abstract: The Michelson-Morley experiment can be examined, using standard mathematical concepts of error analysis, as a case study in validation. A few philosophical principles focusing on scientific confirmation help elucidate the formal logical concepts that result in a foundation for scientific validation. The author proposes two research directions.
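
For context, the standard textbook numbers behind that case study are easy to reproduce (this is not the article's own error analysis, and the parameter values below are the usual textbook ones): the ether theory predicts a fringe shift of roughly 0.4 for the 1887 apparatus, far above the experiment's measured upper bound.

```python
L = 11.0          # effective optical path length of each arm, m
lam = 5.5e-7      # wavelength of the light used, m
v = 3.0e4         # Earth's orbital speed, m/s
c = 3.0e8         # speed of light, m/s

predicted_shift = 2 * L / lam * (v / c) ** 2
print(f"predicted fringe shift ~ {predicted_shift:.2f} fringes")
print("observed shift was bounded well below this (of order a few hundredths of a fringe)")
```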

Journal ArticleDOI
01 Sep 2002
TL;DR: This paper shows how to construct stable finite-difference schemes that preserve accurate numerical orthogonality of the solutions of the Schrödinger equation.
Abstract: Solutions of the Schrödinger equation that pertain to different energies are orthogonal by virtue of quantum dynamics. However, when we obtain such solutions numerically using library differential equation solvers, and when the inner product is defined by numerical quadrature, the result is not sufficiently orthogonal for certain purposes. This paper shows how to construct stable finite-difference schemes that preserve accurate numerical orthogonality of the solutions.
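
The underlying point can be illustrated with generic finite differences (not the paper's particular schemes): when the Hamiltonian is discretized as a symmetric matrix, its computed eigenstates are orthogonal in the discrete inner product to machine precision, by construction.

```python
import numpy as np

n = 500
L = 10.0
dx = L / (n + 1)
x = np.linspace(dx, L - dx, n)

# Symmetric tridiagonal H = -(1/2) d^2/dx^2 + V(x), harmonic well, hbar = m = 1.
V = 0.5 * (x - L / 2) ** 2
H = (np.diag(1.0 / dx**2 + V)
     + np.diag(-0.5 / dx**2 * np.ones(n - 1), 1)
     + np.diag(-0.5 / dx**2 * np.ones(n - 1), -1))

E, psi = np.linalg.eigh(H)            # eigenvectors of a symmetric matrix
overlap = psi[:, 0] @ psi[:, 3]       # discrete inner product of two different states
print("E0, E1, E2:", E[:3])           # close to 0.5, 1.5, 2.5
print("orthogonality defect:", abs(overlap))   # ~1e-16
```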