Author

Elías Cueto

Bio: Elías Cueto is an academic researcher from the University of Zaragoza. The author has contributed to research in the topics: Finite element method & Model order reduction. The author has an h-index of 38 and has co-authored 224 publications receiving 4,659 citations. Previous affiliations of Elías Cueto include Dassault Aviation & Arts et Métiers ParisTech.


Papers
Journal ArticleDOI
TL;DR: This paper revisits a model reduction methodology based on the use of separated representations, the so-called Proper Generalized Decomposition (PGD), which allows the efficient treatment of models defined in degenerate domains as well as of the multidimensional models arising from multidimensional physics or from standard models when sources of variability are introduced as extra coordinates.

Abstract: This paper revisits a model reduction methodology based on the use of separated representations, the so-called Proper Generalized Decomposition (PGD). Space-time separated representations generalize Proper Orthogonal Decompositions (POD) while avoiding any a priori knowledge of the solution, in contrast to the vast majority of POD-based model reduction technologies as well as reduced-basis approaches. Moreover, the PGD allows the efficient treatment of models defined in degenerate domains as well as of the multidimensional models arising from multidimensional physics (quantum chemistry, kinetic theory descriptions, …) or from standard models when sources of variability are introduced as extra coordinates.
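
The core idea is easy to demonstrate: the unknown field is sought as a finite sum of products of one-dimensional functions, each new pair of factors computed by an alternating fixed-point loop. Below is a minimal sketch of such a greedy separated-representation solver for the Poisson problem -Δu = 1 on the unit square (not taken from the paper; grid size, mode count and iteration counts are illustrative choices):

```python
# A minimal PGD-style sketch (illustrative, not the paper's code) for
# -Δu = f on the unit square with u = 0 on the boundary.
# u(x, y) is sought as a sum of products  u ≈ Σ_k X_k(x) Y_k(y),
# each new pair (X, Y) computed by an alternating fixed-point loop.
import numpy as np

n = 64                                  # 1D grid points per direction
h = 1.0 / (n + 1)
# 1D stiffness and (lumped) mass matrices on the interior nodes
K = (np.diag(2*np.ones(n)) - np.diag(np.ones(n-1), 1) - np.diag(np.ones(n-1), -1)) / h**2
M = h * np.eye(n)
fx = np.ones(n); fy = np.ones(n)        # separable source f(x, y) = 1

Xs, Ys = [], []                         # accumulated modes
for k in range(10):                     # greedy enrichment
    X = np.random.rand(n); Y = np.random.rand(n)
    for _ in range(20):                 # alternating fixed point
        # Solve for X with Y frozen: (yMy*K + yKy*M) X = rhs
        yMy = Y @ M @ Y; yKy = Y @ K @ Y
        rhs = (fy @ M @ Y) * (M @ fx)
        for Xk, Yk in zip(Xs, Ys):      # subtract already-captured modes
            rhs -= (Y @ M @ Yk) * (K @ Xk) + (Y @ K @ Yk) * (M @ Xk)
        X = np.linalg.solve(yMy * K + yKy * M, rhs)
        # Solve for Y with X frozen (symmetric role)
        xMx = X @ M @ X; xKx = X @ K @ X
        rhs = (fx @ M @ X) * (M @ fy)
        for Xk, Yk in zip(Xs, Ys):
            rhs -= (X @ M @ Xk) * (K @ Yk) + (X @ K @ Xk) * (M @ Yk)
        Y = np.linalg.solve(xMx * K + xKx * M, rhs)
    Xs.append(X); Ys.append(Y)

u = sum(np.outer(X, Y) for X, Y in zip(Xs, Ys))   # reconstructed 2D field
print("max u ≈", u.max())               # ≈ 0.0737 for -Δu = 1 on the unit square
```

Note that only 1D operators are ever assembled or solved; the 2D field is reconstructed at the end, which is exactly what makes the approach attractive when the number of coordinates grows.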

590 citations

Journal ArticleDOI
TL;DR: This paper revisits a powerful discretization technique, the Proper Generalized Decomposition (PGD), illustrating its ability to solve highly multidimensional models.

Abstract: This paper revisits a powerful discretization technique, the Proper Generalized Decomposition (PGD), illustrating its ability to solve highly multidimensional models. This technique operates by constructing a separated representation of the solution, such that the solution complexity scales linearly with the dimension of the space in which the model is defined, instead of exhibiting the exponentially growing complexity characteristic of mesh-based discretization strategies. The PGD makes possible the efficient solution of models defined in multidimensional spaces, such as those encountered in quantum chemistry, kinetic theory descriptions of complex fluids, genetics (the chemical master equation) and financial mathematics, but also of models classically defined in standard space and time to which new extra coordinates can be added (parametric models, …), opening numerous possibilities (optimization, inverse identification, real-time simulation, …).
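
The linear-versus-exponential scaling claim is simple to quantify. The toy comparison below (my own illustration; the rank, grid size and dimension are arbitrary) counts the coefficients stored by a rank-R separated representation against a full tensor-product grid, and shows that evaluating the separated field at a point costs O(R·d) work:

```python
# Why separated representations scale linearly with dimension: a rank-R
# field u(x1,...,xd) ≈ Σ_k Π_j F_kj(xj) stores R*d*n coefficients
# instead of the n**d entries of a full grid. (Numbers are illustrative.)
import numpy as np

n, d, R = 100, 20, 10                   # grid points, dimensions, separated rank
full_grid = n**d                        # entries of a mesh-based discretization
separated = R * d * n                   # entries of the PGD-style representation
print(f"full grid : {full_grid:.3e} entries")   # 1e40 -- hopeless
print(f"separated : {separated} entries")       # 20,000 -- trivial

# Evaluating the separated field at a point costs O(R*d), not O(n**d):
rng = np.random.default_rng(0)
F = rng.random((R, d, n))               # 1D factor functions sampled on n nodes
idx = rng.integers(0, n, size=d)        # a grid point in d dimensions
u_val = sum(np.prod([F[k, j, idx[j]] for j in range(d)]) for k in range(R))
print("u at the sampled point:", u_val)
```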

351 citations

Journal ArticleDOI
TL;DR: This paper addresses a new paradigm in the field of simulation-based engineering sciences (SBES) to face the challenges posed by current ICT technologies, combining an off-line stage in which the general PGD solution, the vademecum, is computed, and an on-line phase in which real-time responses to queries are obtained.

Abstract: In this paper we address a new paradigm in the field of simulation-based engineering sciences (SBES) to face the challenges posed by current ICT technologies. Despite the impressive progress attained by simulation capabilities and techniques, some challenging problems remain intractable today. These problems, which are common to many branches of science and engineering, are of different natures. Among them, we can cite those related to high-dimensional problems, which do not admit mesh-based approaches due to the exponential increase in the number of degrees of freedom. In recent years we developed a novel technique, called Proper Generalized Decomposition (PGD). It is based on the assumption of a separated form of the unknown field, and it has demonstrated its capabilities in dealing with high-dimensional problems, overcoming the strong limitations of classical approaches. But the main opportunity offered by this technique is that it allows a completely new approach to classic problems, not necessarily high-dimensional ones. Many challenging problems can be efficiently cast into a multidimensional framework, and this opens new possibilities for solving old and new problems with strategies not envisioned until now. For instance, parameters in a model can be set as additional extra coordinates of the model. In a PGD framework, the resulting model is solved once and for all, in order to obtain a general solution that includes the solutions for every possible value of the parameters, that is, a sort of computational vademecum. Under this rationale, optimization of complex problems, uncertainty quantification, simulation-based control and real-time simulation are now at hand, even in highly complex scenarios, by combining an off-line stage in which the general PGD solution, the vademecum, is computed, and an on-line phase in which, even on deployed handheld platforms such as smartphones or tablets, real-time responses to our queries are obtained.
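
The off-line/on-line split is the whole trick: all the expensive work happens once, and a query reduces to evaluating a stored sum of products. The sketch below shows only the on-line phase (a schematic of my own; the modes are stand-ins, since a real vademecum would load them from an off-line parametric PGD solve):

```python
# Schematic vademecum query (assumptions mine). Off-line, a parametric PGD
# solve would produce modes X_k(x) and P_k(mu) with u(x, mu) ≈ Σ_k X_k(x) P_k(mu);
# here we fabricate such modes to show the on-line phase, which is cheap enough
# for a handheld device: no PDE is solved at query time.
import numpy as np

x  = np.linspace(0, 1, 200)             # space grid
mu = np.linspace(1, 10, 50)             # parameter grid (an extra coordinate)
# Stand-in modes; a real vademecum would load these from the off-line stage.
X_modes = [np.sin((k + 1) * np.pi * x) for k in range(5)]
P_modes = [mu**(-(k + 1)) for k in range(5)]

def query(mu_star):
    """Real-time evaluation of u(:, mu_star) from the precomputed modes."""
    return sum(Xk * np.interp(mu_star, mu, Pk)
               for Xk, Pk in zip(X_modes, P_modes))

u_fast = query(3.7)                     # milliseconds: suitable for on-line use
print(u_fast.shape, u_fast.max())
```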

265 citations

Journal ArticleDOI
TL;DR: This work proposes a new method, able to directly link data to computers in order to perform numerical simulations, that employs axiomatic, universal laws while minimizing the need for explicit, often phenomenological, models.

Abstract: Standard simulation in classical mechanics is based on the use of two very different types of equations. The first, of axiomatic character, is related to balance laws (momentum, mass, energy, …), whereas the second consists of models that scientists have extracted from collected natural or synthetic data. Even if one can be confident in the first type of equations, the second contains modeling errors. Moreover, this second type of equation remains too particular and often fails to describe new experimental results. The vast majority of existing models lack generality and must therefore be constantly adapted or enriched to describe new experimental findings. In this work we propose a new method, able to directly link data to computers in order to perform numerical simulations. These simulations employ axiomatic, universal laws while minimizing the need for explicit, often phenomenological, models. The technique is based on the use of manifold learning methodologies, which allow the relevant information to be extracted from large experimental datasets.
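
The manifold-learning ingredient can be illustrated on synthetic data (the dataset and the hidden law below are my own stand-ins, not the paper's experiments): a cloud of material states that in truth lives on a low-dimensional manifold is embedded to recover its intrinsic coordinate, from which a solver could then pick admissible states while enforcing only the balance laws:

```python
# Toy sketch of the data-driven ingredient: manifold learning uncovers the
# low-dimensional structure hidden in a large set of material states.
# Synthetic data (my assumption): noisy (strain, stress) states generated
# by an unknown 1D nonlinear law.
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

rng = np.random.default_rng(0)
strain = rng.uniform(0.0, 0.2, size=2000)
stress = 1e3 * np.tanh(25 * strain)          # hidden constitutive behaviour
states = np.column_stack([strain, stress])
states += 0.01 * states.std(axis=0) * rng.standard_normal(states.shape)

# The embedding recovers the single latent coordinate parametrizing the
# material manifold; a solver can then enforce only balance laws while
# picking admissible states from this manifold instead of from a model.
lle = LocallyLinearEmbedding(n_neighbors=12, n_components=1)
latent = lle.fit_transform(states)
print(latent.shape)                          # (2000, 1): one intrinsic dimension
```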

174 citations

Journal ArticleDOI
TL;DR: Data serve not only to enrich physically-based models but also the modeling and simulation viewpoints themselves, which could allow us to perform a tremendous leap forward by replacing big-data-based habits with the incipient smart-data paradigm.

Abstract: Engineering is evolving in the same way as society. Nowadays, data is acquiring a prominence never imagined. In the past, in the domain of materials, processes and structures, testing machines allowed the extraction of data that served in turn to calibrate state-of-the-art models. Some calibration procedures were even integrated within these testing machines. Thus, once the model had been calibrated, computer simulation could take place. However, data can offer much more than simple state-of-the-art model calibration, and not only through statistical analysis, but from the modeling and simulation viewpoints. This gives rise to the family of so-called twins: the virtual, the digital and the hybrid twin. Moreover, as discussed in the present paper, data serve not only to enrich physically-based models; they could also allow us to perform a tremendous leap forward, by replacing big-data-based habits with the incipient smart-data paradigm.

154 citations


Cited by
Journal ArticleDOI

1,604 citations

Posted Content
TL;DR: This work proposes the Learning without Forgetting method, which uses only new-task data to train the network while preserving its original capabilities, and performs favorably compared to commonly used feature extraction and fine-tuning adaptation techniques.

Abstract: When building a unified vision system or gradually adding new capabilities to a system, the usual assumption is that training data for all tasks is always available. However, as the number of tasks grows, storing and retraining on such data becomes infeasible. A new problem arises when we add new capabilities to a Convolutional Neural Network (CNN) but the training data for its existing capabilities are unavailable. We propose our Learning without Forgetting method, which uses only new-task data to train the network while preserving the original capabilities. Our method performs favorably compared to commonly used feature extraction and fine-tuning adaptation techniques, and performs similarly to multitask learning that uses the original task data we assume unavailable. A more surprising observation is that Learning without Forgetting may be able to replace fine-tuning when the old- and new-task datasets are similar, yielding improved new-task performance.
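
The training objective is compact enough to sketch in a few lines (a paraphrase of the idea, not the authors' released code; the function assumes the original network's responses to the new images were recorded once before training, as the paper prescribes): the new task gets an ordinary cross-entropy loss, while the old task's head is held close to those recorded responses through a temperature-softened distillation term:

```python
# Condensed sketch of the Learning-without-Forgetting loss (my paraphrase):
# the frozen original network provides soft targets for the old task, so no
# old-task data is needed while training on new-task data only.
import torch
import torch.nn.functional as F

def lwf_loss(new_task_logits, new_task_labels,
             old_head_logits, recorded_old_logits, T=2.0, lam=1.0):
    """Cross-entropy on the new task + distillation on the old task's head.

    recorded_old_logits: the original network's responses to the *new*
    images, computed once before training starts.
    """
    ce = F.cross_entropy(new_task_logits, new_task_labels)
    # Temperature-softened knowledge-distillation term
    soft_targets = F.softmax(recorded_old_logits / T, dim=1)
    log_probs = F.log_softmax(old_head_logits / T, dim=1)
    distill = -(soft_targets * log_probs).sum(dim=1).mean() * T * T
    return ce + lam * distill
```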

1,037 citations

Journal ArticleDOI
TL;DR: In many situations across computational science and engineering, multiple computational models are available that describe a system of interest, and these different models have varying evaluation costs.

Abstract: In many situations across computational science and engineering, multiple computational models are available that describe a system of interest. These different models have varying evaluation costs...
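
One representative multifidelity idea is to use a cheap low-fidelity model as a control variate, combining a few expensive high-fidelity evaluations with many cheap ones (a toy example of my own; the survey covers many such strategies, and both models below are stand-ins):

```python
# Multifidelity Monte Carlo with a control variate (illustrative example):
# most evaluations hit a cheap correlated surrogate, only a few hit the
# expensive high-fidelity model.
import numpy as np

rng = np.random.default_rng(1)
f_hi = lambda z: np.exp(z) * np.sin(5 * z)      # expensive model (stand-in)
f_lo = lambda z: z * np.sin(5 * z)              # cheap correlated surrogate

z_few  = rng.standard_normal(100)               # few high-fidelity samples
z_many = rng.standard_normal(100_000)           # many low-fidelity samples

alpha = (np.cov(f_hi(z_few), f_lo(z_few))[0, 1]
         / np.var(f_lo(z_few), ddof=1))         # control-variate coefficient
estimate = f_hi(z_few).mean() + alpha * (f_lo(z_many).mean() - f_lo(z_few).mean())
print("multifidelity estimate of E[f_hi]:", estimate)
```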

678 citations

Journal ArticleDOI
TL;DR: This work reviews the recent status of methodologies and techniques related to the construction of digital twins, mostly from a modeling perspective, to provide detailed coverage of the current challenges and enabling technologies along with recommendations and reflections for various stakeholders.

Abstract: A digital twin can be defined as a virtual representation of a physical asset, enabled through data and simulators, for real-time prediction, optimization, monitoring, control and improved decision making. Recent advances in computational pipelines, multiphysics solvers, artificial intelligence, big-data cybernetics, and data processing and management tools bring the promise of digital twins and their impact on society closer to reality. Digital twinning is now an important and emerging trend in many applications. Also referred to as a computational megamodel, device shadow, mirrored system, avatar or synchronized virtual prototype, the digital twin undoubtedly plays a transformative role not only in how we design and operate cyber-physical intelligent systems, but also in how we advance the modularity of multi-disciplinary systems to tackle fundamental barriers not addressed by current, evolutionary modeling practices. In this work, we review the recent status of methodologies and techniques related to the construction of digital twins, mostly from a modeling perspective. Our aim is to provide detailed coverage of the current challenges and enabling technologies, along with recommendations and reflections for various stakeholders.

660 citations

Journal ArticleDOI
TL;DR: This paper presents the most significant contributions of the past decade, which produce such impressive and perceivably realistic animations and simulations: finite element/difference/volume methods, mass-spring systems, mesh-free methods, coupled particle systems and reduced deformable models based on modal analysis.

Abstract: Physically based deformable models have been widely embraced by the Computer Graphics community. Many problems outlined in a previous survey by Gibson and Mirtich [GM97] have been addressed, thereby making these models interesting and useful for both offline and real-time applications, such as motion pictures and video games. In this paper, we present the most significant contributions of the past decade, which produce such impressive and perceivably realistic animations and simulations: finite element/difference/volume methods, mass-spring systems, mesh-free methods, coupled particle systems and reduced deformable models based on modal analysis. For completeness, we also make a connection to the simulation of other continua, such as fluids, gases and melting objects. Since time integration is inherent to all simulated phenomena, the general notion of time discretization is treated separately, while specifics are left to the respective models. Finally, we discuss areas of application, such as elastoplastic deformation and fracture, cloth and hair animation, virtual surgery simulation, interactive entertainment and fluid/smoke animation, and also suggest areas for future research.
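
Of the surveyed techniques, the mass-spring system is the simplest to sketch end to end (a bare-bones illustration of the general approach; stiffness, damping, mass and time step are chosen arbitrarily rather than taken from the survey): particles connected by Hooke springs, advanced with symplectic Euler integration:

```python
# Bare-bones mass-spring deformable model: a hanging chain of particles
# with spring and damping forces, integrated with symplectic Euler.
# (Parameters are illustrative.)
import numpy as np

n = 10                                          # particles on a hanging chain
pos = np.stack([np.zeros(n), -np.arange(n, dtype=float)], axis=1) * 0.1
vel = np.zeros_like(pos)
rest, k_s, k_d, m, dt = 0.1, 500.0, 1.0, 0.05, 1e-3
gravity = np.array([0.0, -9.81])

for step in range(2000):                        # simulate 2 seconds
    force = np.tile(m * gravity, (n, 1))
    for i in range(n - 1):                      # springs between neighbours
        d = pos[i + 1] - pos[i]
        L = np.linalg.norm(d); u = d / L
        fs = k_s * (L - rest) * u               # Hooke spring force
        fd = k_d * ((vel[i + 1] - vel[i]) @ u) * u  # damping along the spring
        force[i] += fs + fd; force[i + 1] -= fs + fd
    vel += dt * force / m                       # symplectic Euler: v first...
    vel[0] = 0.0                                # pin the first particle
    pos += dt * vel                             # ...then positions

print("tip position after 2 s:", pos[-1])
```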

636 citations