Author

Gerd Heber

Other affiliations: University of Delaware
Bio: Gerd Heber is an academic researcher from Cornell University. The author has contributed to research in topics: load balancing (computing) and parallel algorithms. The author has an h-index of 15 and has co-authored 39 publications receiving 1,640 citations. Previous affiliations of Gerd Heber include the University of Delaware.

Papers
Posted Content
TL;DR: Analyzing peta-scale datasets to find the subtle effects missed by previous studies requires algorithms that can simultaneously deal with huge datasets and detect very subtle effects: both needles in the haystack and very small haystacks that went undetected in previous measurements.
Abstract: This is a thought piece on data-intensive science requirements for databases and science centers. It argues that peta-scale datasets will be housed by science centers that provide substantial storage and processing for scientists who access the data via smart notebooks. Next-generation science instruments and simulations will generate these peta-scale datasets. The need to publish and share data and the need for generic analysis and visualization tools will finally create a convergence on common metadata standards. Database systems will be judged by their support of these metadata standards and by their ability to manage and access peta-scale datasets. The procedural stream-of-bytes-file-centric approach to data analysis is both too cumbersome and too serial for such large datasets. Non-procedural query and analysis of schematized self-describing data is both easier to use and allows much more parallelism.
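
As a rough illustration of the contrast the abstract draws between procedural, stream-of-bytes, file-centric analysis and non-procedural query over schematized, self-describing data, here is a minimal Python sketch. It is not from the paper; the binary record layout, the SQLite table name "events", and the energy threshold are assumptions made purely for illustration.

# Minimal sketch, assuming a hypothetical binary event file (pairs of
# little-endian doubles: energy, time) and an equivalent SQLite table
# named "events"; none of this comes from the paper itself.
import sqlite3
import struct

def bright_events_procedural(path, threshold):
    """Stream-of-bytes, file-centric style: the reader hard-codes the
    record layout and walks the file serially, record by record."""
    record = struct.Struct("<dd")          # (energy, time)
    hits = []
    with open(path, "rb") as f:
        while (chunk := f.read(record.size)) and len(chunk) == record.size:
            energy, time = record.unpack(chunk)
            if energy > threshold:
                hits.append((energy, time))
    return hits

def bright_events_declarative(db_path, threshold):
    """Non-procedural style over schematized, self-describing data: the
    query states *what* is wanted; the engine can index and parallelize."""
    with sqlite3.connect(db_path) as conn:
        return conn.execute(
            "SELECT energy, time FROM events WHERE energy > ?",
            (threshold,),
        ).fetchall()

The declarative form is the one the abstract argues scales: the same query runs unchanged on kilobytes or petabytes and leaves the engine free to index and parallelize, while the byte-level loop hard-codes both the layout and a serial access pattern.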

476 citations

Journal ArticleDOI
01 Dec 2005
TL;DR: In this article, the authors argue that the vast data stores created by new instruments and simulations require algorithms that can simultaneously deal with huge datasets and detect very subtle effects: both needles in the haystack and very small haystacks that went undetected in previous measurements.
Abstract: Scientific instruments and computer simulations are creating vast data stores that require new scientific methods to analyze and organize the data. Data volumes are approximately doubling each year. Since these new instruments have extraordinary precision, the data quality is also rapidly improving. Analyzing this data to find the subtle effects missed by previous studies requires algorithms that can simultaneously deal with huge datasets and that can find very subtle effects --- finding both needles in the haystack and finding very small haystacks that were undetected in previous measurements.

432 citations

Journal ArticleDOI
TL;DR: This work determined the dynamical stability of a universe of mathematical, nonlinear food web models with varying degrees of organizational complexity and found that the frequency of unpredictable, chaotic dynamics increases with the number of trophic levels in a food web but decreases with the degree of complexity.
Abstract: In mathematical models, very simple communities consisting of three or more species frequently display chaotic dynamics which implies that long-term predictions of the population trajectories in time are impossible. Communities in the wild tend to be more complex, but evidence for chaotic dynamics from such communities is scarce. We used supercomputing power to test the hypothesis that chaotic dynamics become less frequent in model ecosystems when their complexity increases. We determined the dynamical stability of a universe of mathematical, nonlinear food web models with varying degrees of organizational complexity. We found that the frequency of unpredictable, chaotic dynamics increases with the number of trophic levels in a food web but decreases with the degree of complexity. Our results suggest that natural food webs possess architectural properties that may intrinsically lower the likelihood of chaotic community dynamics.
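
As a hedged, minimal illustration of what determining the dynamical stability of a nonlinear food-web model can involve, the Python sketch below uses the textbook Hastings-Powell three-species food chain as a stand-in (the paper's own model universe and parameter choices are not reproduced here) and estimates the largest Lyapunov exponent; a positive value signals the unpredictable, chaotic dynamics whose frequency the study counts.

# Minimal sketch, not the authors' models: the Hastings-Powell three-species
# food chain with textbook parameter values, integrated with a hand-rolled
# RK4 stepper, plus a crude largest-Lyapunov-exponent estimate.
import numpy as np

A1, B1, A2, B2, D1, D2 = 5.0, 3.0, 0.1, 2.0, 0.4, 0.01

def food_chain(s):
    x, y, z = s                              # resource, consumer, predator
    fxy = A1 * x / (1.0 + B1 * x)            # type-II functional responses
    fyz = A2 * y / (1.0 + B2 * y)
    return np.array([x * (1.0 - x) - fxy * y,
                     fxy * y - fyz * z - D1 * y,
                     fyz * z - D2 * z])

def rk4_step(s, dt):
    k1 = food_chain(s)
    k2 = food_chain(s + 0.5 * dt * k1)
    k3 = food_chain(s + 0.5 * dt * k2)
    k4 = food_chain(s + dt * k3)
    return s + dt * (k1 + 2.0 * k2 + 2.0 * k3 + k4) / 6.0

def largest_lyapunov(s0, dt=0.01, steps=100_000, d0=1e-8):
    """Benettin method: follow a perturbed twin trajectory, renormalize its
    separation each step; the mean log-stretch rate estimates the exponent."""
    a = np.array(s0, dtype=float)
    b = a + np.array([d0, 0.0, 0.0])
    acc = 0.0
    for _ in range(steps):
        a, b = rk4_step(a, dt), rk4_step(b, dt)
        d = np.linalg.norm(b - a)
        acc += np.log(d / d0)
        b = a + (b - a) * (d0 / d)           # rescale separation back to d0
    return acc / (steps * dt)

# A positive exponent flags unpredictable, chaotic dynamics for this model.
print(largest_lyapunov([0.8, 0.2, 8.0]))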

116 citations

Journal ArticleDOI
TL;DR: In this paper, a crack-path prediction method based on as-manufactured component geometry is proposed to resolve the crack-path ambiguity of a simple test specimen, a "personalized" methodology whose consideration of as-manufactured characteristics is central to the Digital Twin concept.
Abstract: A simple, nonstandardized material test specimen, which fails along one of two different likely crack paths, is considered herein. The result of deviations in geometry on the order of tenths of a millimeter, this ambiguity in crack path motivates the consideration of as-manufactured component geometry in the design, assessment, and certification of structural systems. Herein, finite element models of as-manufactured specimens are generated and subsequently analyzed to resolve the crack-path ambiguity. The consequence and benefit of such a “personalized” methodology is the prediction of a crack path for each specimen based on its as-manufactured geometry, rather than a distribution of possible specimen geometries or nominal geometry. The consideration of as-manufactured characteristics is central to the Digital Twin concept. Therefore, this work is also intended to motivate its development.

106 citations

Journal ArticleDOI
TL;DR: In this paper, the authors use finite element methods to simulate arbitrarily shaped fatigue crack growth in a spiral bevel gear more efficiently and with much higher resolution than the previous boundary element-based approach of Spievak et al.

91 citations


Cited by
Journal ArticleDOI

08 Dec 2001-BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one: it seemed an odd beast at first, an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i —the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time—an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

Journal ArticleDOI
Fei Tao, Jiangfeng Cheng, Qinglin Qi, Meng Zhang, He Zhang, Fangyuan Sui
TL;DR: In this paper, a new method for product design, manufacturing, and service driven by the digital twin is proposed, and three cases are given to illustrate future applications of the digital twin in the three phases of a product, respectively.
Abstract: Nowadays, along with the application of new-generation information technologies in industry and manufacturing, the big data-driven manufacturing era is coming. However, although various big data in the entire product lifecycle, including product design, manufacturing, and service, can be obtained, it can be found that the current research on product lifecycle data mainly focuses on physical products rather than virtual models. Besides, due to the lack of convergence between product physical and virtual space, the data in product lifecycle is isolated, fragmented, and stagnant, which is useless for manufacturing enterprises. These problems lead to low level of efficiency, intelligence, sustainability in product design, manufacturing, and service phases. However, physical product data, virtual product data, and connected data that tie physical and virtual product are needed to support product design, manufacturing, and service. Therefore, how to generate and use converged cyber-physical data to better serve product lifecycle, so as to drive product design, manufacturing, and service to be more efficient, smart, and sustainable, is emphasized and investigated based on our previous study on big data in product lifecycle management. In this paper, a new method for product design, manufacturing, and service driven by digital twin is proposed. The detailed application methods and frameworks of digital twin-driven product design, manufacturing, and service are investigated. Furthermore, three cases are given to illustrate the future applications of digital twin in the three phases of a product respectively.

1,571 citations

Journal ArticleDOI
TL;DR: This paper thoroughly reviews the state of the art of DT research concerning the key components of DTs, the current development of DTs, and the major DT applications in industry, and outlines the current challenges and some possible directions for future work.
Abstract: Digital twin (DT) is one of the most promising enabling technologies for realizing smart manufacturing and Industry 4.0. DTs are characterized by the seamless integration between the cyber and physical spaces. The importance of DTs is increasingly recognized by both academia and industry. It has been almost 15 years since the concept of the DT was initially proposed. To date, many DT applications have been successfully implemented in different industries, including product design, production, prognostics and health management, and some other fields. However, at present, no paper has focused on the review of DT applications in industry. In an effort to understand the development and application of DTs in industry, this paper thoroughly reviews the state-of-the-art of the DT research concerning the key components of DTs, the current development of DTs, and the major DT applications in industry. This paper also outlines the current challenges and some possible directions for future work.

1,467 citations

Proceedings Article
01 Jan 2003

1,212 citations

Book ChapterDOI
01 Jan 2017
TL;DR: The Digital Twin, as discussed by the authors, links a physical system with its virtual equivalent to mitigate problematic issues, many of them due to human interaction, that arise across a system's creation, production, operations, and disposal.
Abstract: Systems do not simply pop into existence. They progress through lifecycle phases of creation, production, operations, and disposal. The issues leading to undesirable and unpredicted emergent behavior are set in place during the phases of creation and production and realized during the operational phase, with many of those problematic issues due to human interaction. We propose that the idea of the Digital Twin, which links the physical system with its virtual equivalent can mitigate these problematic issues. We describe the Digital Twin concept and its development, show how it applies across the product lifecycle in defining and understanding system behavior, and define tests to evaluate how we are progressing. We discuss how the Digital Twin relates to Systems Engineering and how it can address the human interactions that lead to “normal accidents.” We address both Digital Twin obstacles and opportunities, such as system replication and front running. We finish with NASA’s current work with the Digital Twin.

1,031 citations