Author

Costas Papadimitriou

Bio: Costas Papadimitriou is an academic researcher from the University of Thessaly. The author has contributed to research on topics including Bayesian inference and uncertainty quantification. The author has an h-index of 37 and has co-authored 220 publications receiving 5,417 citations. Previous affiliations of Costas Papadimitriou include the Hong Kong University of Science and Technology and the California Institute of Technology.


Papers
Journal ArticleDOI
TL;DR: The theoretical and computational issues arising in the selection of the optimal sensor configuration for parameter estimation in structural dynamics are addressed, and two algorithms are proposed for constructing effective sensor configurations that are superior, in computational efficiency and accuracy, to the configurations provided by genetic algorithms.

367 citations

Journal ArticleDOI
TL;DR: In this article, a dual implementation of the Kalman filter is proposed for estimating the unknown input and states of a linear state-space model from sparse, noisy acceleration measurements; it avoids the numerical issues attributed to unobservability and rank deficiency in the augmented formulation of the problem.

304 citations

Journal ArticleDOI
TL;DR: The proposed entropy-based measure of uncertainty is well-suited for quantitatively evaluating and comparing the quality of the parameter estimates that can be achieved using sensor configurations with different numbers of sensors.
Abstract: A statistical methodology is presented for optimally locating the sensors in a structure for the purpose of extracting from the measured data the most information about the parameters of the model used to represent structural behavior. The methodology can be used in model updating and in damage detection and localization applications. It properly handles the unavoidable uncertainties in the measured data as well as the model uncertainties. The optimality criterion for the sensor locations is based on information entropy, which is a unique measure of the uncertainty in the model parameters. The uncertainty in these parameters is computed by a Bayesian statistical methodology, and then the entropy measure is minimized over the set of possible sensor configurations using a genetic algorithm. The information entropy measure is also extended to handle large uncertainties expected in the pretest nominal model of a structure. In experimental design, the proposed entropy-based measure of uncertainty is also well-suited for making quantitative evaluations and comparisons of the quality of the parameter estimates that can be achieved using sensor configurations with different numbers of sensors in each configuration. Simplified models for a shear building and a truss structure are used to illustrate the methodology.
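As a rough, self-contained illustration of this idea (not the paper's code): for a linear(ised) Gaussian model the information entropy of the parameter estimates decreases as the log-determinant of the Fisher information matrix grows, so a good sensor configuration maximizes log det Q over the candidate locations. The sensitivity matrix, noise level, and problem sizes below are made-up placeholders, and an exhaustive search over small configurations stands in for the genetic algorithm used in the paper.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

n_dofs, n_params, n_sensors = 20, 4, 3
sigma2 = 1e-2                      # assumed measurement noise variance

# Sensitivity of the predicted response at each candidate DOF with respect to
# each model parameter (in practice obtained from the structural model).
L = rng.standard_normal((n_dofs, n_params))

def log_det_fisher(config):
    # Fisher information accumulated over the selected sensor locations;
    # minimizing information entropy is equivalent (up to constants) to
    # maximizing this log-determinant.
    Q = sum(np.outer(L[i], L[i]) for i in config) / sigma2
    sign, logdet = np.linalg.slogdet(Q + 1e-12 * np.eye(n_params))
    return logdet if sign > 0 else -np.inf

# Exhaustive search over all configurations with n_sensors sensors
# (feasible only for small problems; the paper uses a genetic algorithm).
best = max(itertools.combinations(range(n_dofs), n_sensors), key=log_det_fisher)
print("selected sensor DOFs:", best, " log det Q:", round(log_det_fisher(best), 3))
```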

299 citations

Journal ArticleDOI
TL;DR: In this article, the concept of robust reliability is defined to account for uncertainties from structural modeling in addition to the uncertain excitation that a structure will experience during its lifetime, and a Bayesian probabilistic methodology for system identification is integrated to update the assessment of robust reliability based on dynamic test data.

254 citations

Journal ArticleDOI
TL;DR: In this paper, an asymptotic approximation for evaluating the probability integrals that arise in the determination of the reliability and response moments of uncertain dynamic systems subject to stochastic excitation is developed.
Abstract: An asymptotic approximation is developed for evaluating the probability integrals that arise in the determination of the reliability and response moments of uncertain dynamic systems subject to stochastic excitation. The method is applicable when the probabilities of failure or response moments conditional on the system parameters are available, and the effect of the uncertainty in the system parameters is to be investigated. In particular, a simple analytical formula for the probability of failure of the system is derived and compared to some existing approximations, including an asymptotic approximation based on second-order reliability methods. Simple analytical formulas are also derived for the sensitivity of the failure probability and response moments to variations in parameters of interest. Conditions for which the proposed asymptotic expansion is expected to be accurate are presented. Since numerical integration is only computationally feasible for investigating the accuracy of the proposed method for a small number of uncertain system parameters, simulation techniques are also used. A simple importance sampling method is shown to converge much more rapidly than straightforward Monte Carlo simulation. Simple structures subjected to white noise stochastic excitation are used to illustrate the accuracy of the proposed analytical approximation. Results from the computationally efficient perturbation method are also included for comparison. The results show that the asymptotic method gives acceptable approximations, even for systems with relatively large uncertainty, and in most cases, it outperforms the perturbation method.
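A minimal numerical sketch of the comparison described in this abstract (not the paper's asymptotic expansion): the failure probability P(F) = ∫ P(F|θ) p(θ) dθ is estimated both by plain Monte Carlo over the parameter prior and by a simple importance sampling scheme whose proposal is shifted toward the region where the integrand peaks. The toy conditional failure probability, prior, and proposal below are all illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Uncertain system parameter theta with prior p(theta) = N(1.0, 0.1^2)
prior_mean, prior_std = 1.0, 0.1

def p_fail_given_theta(theta):
    # Toy conditional failure probability: failure becomes likely for small theta
    return stats.norm.cdf(-(theta - 0.6) / 0.05)

n = 10_000

# Plain Monte Carlo over the prior p(theta)
theta_mc = rng.normal(prior_mean, prior_std, n)
pf_mc = p_fail_given_theta(theta_mc).mean()

# Importance sampling: proposal density shifted toward the peak of the
# integrand P(F|theta) p(theta) (the shift used here is an illustrative guess)
prop_mean, prop_std = 0.7, 0.1
theta_is = rng.normal(prop_mean, prop_std, n)
weights = (stats.norm.pdf(theta_is, prior_mean, prior_std)
           / stats.norm.pdf(theta_is, prop_mean, prop_std))
pf_is = np.mean(p_fail_given_theta(theta_is) * weights)

print(f"plain Monte Carlo estimate:   {pf_mc:.3e}")
print(f"importance sampling estimate: {pf_is:.3e}")
```

With the peak of the integrand far in the prior's tail, the plain Monte Carlo estimate is dominated by a handful of samples (or is zero), while the importance sampling estimate stabilizes with the same sample budget, consistent with the faster convergence reported in the abstract.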

230 citations


Cited by
01 May 1993
TL;DR: Comparing the results to the fastest reported vectorized Cray Y-MP and C90 algorithm shows that the current generation of parallel machines is competitive with conventional vector supercomputers even for small problems.
Abstract: Three parallel algorithms for classical molecular dynamics are presented. The first assigns each processor a fixed subset of atoms; the second assigns each a fixed subset of inter-atomic forces to compute; the third assigns each a fixed spatial region. The algorithms are suitable for molecular dynamics models which can be difficult to parallelize efficiently: those with short-range forces where the neighbors of each atom change rapidly. They can be implemented on any distributed-memory parallel machine which allows for message-passing of data between independently executing processors. The algorithms are tested on a standard Lennard-Jones benchmark problem for system sizes ranging from 500 to 100,000,000 atoms on several parallel supercomputers: the nCUBE 2, Intel iPSC/860 and Paragon, and Cray T3D. Comparing the results to the fastest reported vectorized Cray Y-MP and C90 algorithm shows that the current generation of parallel machines is competitive with conventional vector supercomputers even for small problems. For large problems, the spatial algorithm achieves parallel efficiencies of 90%, and a 1840-node Intel Paragon performs up to 165 times faster than a single Cray C90 processor. Trade-offs between the three algorithms and guidelines for adapting them to more complex molecular dynamics simulations are also discussed.
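As a rough sketch of the first strategy above (atom decomposition), and not the paper's implementation: each MPI rank owns a fixed block of atoms, coordinates are replicated on every rank, each rank computes the Lennard-Jones forces on its own atoms, and an all-reduce assembles the full force array. It assumes mpi4py and NumPy are available; the system size and parameters are placeholders.

```python
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_atoms = 512            # illustrative system size
eps, sigma = 1.0, 1.0    # Lennard-Jones parameters in reduced units

rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 10.0, (n_atoms, 3)) if rank == 0 else None
pos = comm.bcast(pos, root=0)          # replicate all coordinates on every rank

# Fixed block of atoms owned by this rank (atom decomposition)
lo = rank * n_atoms // size
hi = (rank + 1) * n_atoms // size

forces_local = np.zeros((n_atoms, 3))
for i in range(lo, hi):
    rij = pos[i] - pos                 # vectors from every atom j to atom i
    r2 = np.einsum("ij,ij->i", rij, rij)
    r2[i] = np.inf                     # exclude the self-interaction
    inv_r6 = (sigma**2 / r2) ** 3
    # Lennard-Jones force on atom i: 24*eps*(2*(s/r)^12 - (s/r)^6)/r^2 * rij
    coeff = 24.0 * eps * (2.0 * inv_r6**2 - inv_r6) / r2
    forces_local[i] = (coeff[:, None] * rij).sum(axis=0)

# Combine the per-rank contributions into the full force array on all ranks
forces = np.zeros_like(forces_local)
comm.Allreduce(forces_local, forces, op=MPI.SUM)
if rank == 0:
    print("max |F| =", np.abs(forces).max())
```

Run with, e.g., mpirun -n 4 python md_sketch.py. The force- and spatial-decomposition variants described in the abstract distribute the pairwise work differently and, in the spatial case, exchange only data for neighboring regions rather than the full coordinate array.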

29,323 citations

Christopher M. Bishop
01 Jan 2006
TL;DR: This book covers probability distributions, linear models for regression and classification, neural networks, kernel methods, graphical models, mixture models and EM, approximate inference, sampling methods, and combining models in the context of machine learning.
Abstract: Probability Distributions; Linear Models for Regression; Linear Models for Classification; Neural Networks; Kernel Methods; Sparse Kernel Machines; Graphical Models; Mixture Models and EM; Approximate Inference; Sampling Methods; Continuous Latent Variables; Sequential Data; Combining Models.

10,141 citations

Journal ArticleDOI
TL;DR: In this article, a subset simulation approach is proposed for computing the small failure probabilities encountered in reliability analysis of engineering systems; by introducing intermediate failure events, a small failure probability is expressed as a product of larger conditional failure probabilities.
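To make the product-of-conditional-probabilities idea concrete, here is a minimal, illustrative sketch rather than the algorithm as published: standard-normal inputs, an assumed toy limit-state function, intermediate thresholds chosen as sample quantiles, and a basic component-wise Metropolis step to sample each conditional level.

```python
import numpy as np

rng = np.random.default_rng(1)

def g(x):
    # Toy limit-state function: "failure" when g(x) exceeds the threshold
    return x.sum(axis=-1)

def subset_simulation(dim=10, threshold=12.0, n=1000, p0=0.1, max_levels=10):
    """Estimate P(g(X) > threshold) for X ~ N(0, I) as a product of larger
    conditional probabilities over nested intermediate failure events."""
    x = rng.standard_normal((n, dim))
    y = g(x)
    prob = 1.0
    for _ in range(max_levels):
        b = np.quantile(y, 1.0 - p0)       # intermediate threshold
        if b >= threshold:
            # Final level: current samples are conditional on the previous event
            return prob * np.mean(y > threshold)
        prob *= p0                         # P(next event | current event) ~ p0
        seeds = x[y > b]                   # samples lying in the new event
        x_new, y_new = [], []
        for s in seeds:
            cur = s.copy()
            for _ in range(n // len(seeds)):
                # Component-wise Metropolis step targeting N(0, I),
                # restricted to the current intermediate event {g > b}
                cand = cur + 0.5 * rng.standard_normal(dim)
                ratio = np.exp(0.5 * (cur**2 - cand**2))
                mask = rng.uniform(size=dim) < np.minimum(1.0, ratio)
                prop = np.where(mask, cand, cur)
                if g(prop) > b:
                    cur = prop
                x_new.append(cur.copy())
                y_new.append(g(cur))
        x, y = np.array(x_new), np.array(y_new)
    return prob * np.mean(y > threshold)

print("estimated failure probability:", subset_simulation())
```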

1,890 citations

Journal ArticleDOI

1,604 citations

07 Apr 2002
TL;DR: This review summarizes the structural health monitoring literature for the years 1996-2001 and finds that, although many more SHM studies are being reported, the investigators, in general, have not yet fully embraced the well-developed tools from statistical pattern recognition.
Abstract: Staff members at Los Alamos National Laboratory (LANL) produced a summary of the structural health monitoring literature in 1995. This presentation summarizes the outcome of an updated review covering the years 1996-2001. The updated review follows the LANL statistical pattern recognition paradigm for SHM, which addresses four topics: 1. Operational Evaluation; 2. Data Acquisition and Cleansing; 3. Feature Extraction; and 4. Statistical Modeling for Feature Discrimination. The literature has been reviewed based on how a particular study addresses these four topics. A significant observation from this review is that although there are many more SHM studies being reported, the investigators, in general, have not yet fully embraced the well-developed tools from statistical pattern recognition. As such, the discrimination procedures employed often lack the rigor necessary for this technology to evolve beyond demonstration problems carried out in laboratory settings.

1,467 citations