
# George Deodatis

Other affiliations: Princeton University, University of Innsbruck, J.P. Morgan & Co.

Bio: George Deodatis is an academic researcher at Columbia University whose work centers on stochastic processes and the Monte Carlo method. He has an h-index of 36 and has co-authored 125 publications that have received 5,798 citations. His previous affiliations include Princeton University and the University of Innsbruck.


##### Papers



1,069 citations


TL;DR: In this paper, a simulation algorithm is proposed to generate ergodic sample functions of a stationary, multivariate stochastic process according to its prescribed cross-spectral density matrix.

Abstract: A simulation algorithm is proposed to generate sample functions of a stationary, multivariate stochastic process according to its prescribed cross-spectral density matrix. If the components of the vector process correspond to different locations in space, then the process is nonhomogeneous in space. The ensemble cross-correlation matrix of the generated sample functions is identical to the corresponding target. The simulation algorithm generates ergodic sample functions in the sense that the temporal cross-correlation matrix of each and every generated sample function is identical to the corresponding target, when the length of the generated sample function is equal to one period (the generated sample functions are periodic). The proposed algorithm is based on an extension of the spectral representation method and is very efficient computationally since it takes advantage of the fast Fourier transform technique. The generated sample functions are Gaussian in the limit as the number of terms in the frequency discretization of the cross-spectral density matrix approaches infinity. An example involving simulation of turbulent wind velocity fluctuations is presented in order to demonstrate the capabilities and efficiency of the proposed algorithm.

446 citations
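In the univariate case the spectral representation method above reduces to a random-phase cosine series that can be evaluated with an FFT. The following is a minimal single-variate sketch in Python/NumPy, with an illustrative exponential spectrum standing in for a prescribed target; the full multivariate algorithm additionally decomposes the cross-spectral density matrix to couple the vector components.

```python
import numpy as np

def simulate_spectral(S, w_max, N, M=None, seed=0):
    """One sample function of a zero-mean stationary Gaussian process
    with (two-sided) power spectral density S(w), generated by the
    spectral representation method and evaluated with an FFT.

    N cosine terms over [0, w_max); M time points covering exactly one
    period T0 = 2*pi/dw, so the sample is ergodic over its period."""
    rng = np.random.default_rng(seed)
    if M is None:
        M = 4 * N                      # avoid aliasing of the cosine sums
    dw = w_max / N
    w = np.arange(N) * dw
    A = np.sqrt(2.0 * S(w) * dw)       # amplitudes from the target PSD
    A[0] = 0.0                         # no power at zero frequency
    phi = rng.uniform(0.0, 2.0 * np.pi, N)   # independent random phases
    C = np.zeros(M, dtype=complex)
    C[:N] = np.sqrt(2.0) * A * np.exp(1j * phi)
    # f(t_k) = sqrt(2) * sum_n A_n cos(w_n t_k + phi_n), via an inverse FFT
    f = np.real(M * np.fft.ifft(C))
    t = np.arange(M) * 2.0 * np.pi / (M * dw)
    return t, f
```

Because the frequencies are harmonics of dw, the temporal mean and variance of each sample over one period match the targets exactly, which is the ergodicity property the abstract emphasizes.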


TL;DR: In this article, the spectral representation method is used to simulate multi-dimensional, homogeneous Gaussian stochastic fields whose mean value, autocorrelation function, and power spectral density function match the prescribed targets.

Abstract: The subject of this paper is the simulation of multi-dimensional, homogeneous, Gaussian stochastic fields using the spectral representation method. Following this methodology, sample functions of the stochastic field can be generated using a cosine series formula. These sample functions accurately reflect the prescribed probabilistic characteristics of the stochastic field when the number of terms in the cosine series is large. The ensemble-averaged power spectral density or autocorrelation function approaches the corresponding target function as the sample size increases. In addition, the generated sample functions possess ergodic characteristics in the sense that the spatially-averaged mean value, autocorrelation function and power spectral density function are identical with the corresponding targets, when the averaging takes place over the multi-dimensional domain associated with the fundamental period of the cosine series. Another property of the simulated stochastic field is that it is asymptotically Gaussian as the number of terms in the cosine series approaches infinity. The most important feature of the method is that the cosine series formula can be numerically computed very efficiently using the Fast Fourier Transform technique. The main area of application of this method is the Monte Carlo solution of stochastic problems in structural engineering, engineering mechanics and physics. Specifically, the method has been applied to problems involving random loading (random vibration theory) and random material and geometric properties (response variability due to system stochasticity).

421 citations
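The two-dimensional version of the cosine-series formula can be sketched directly. This uses plain double summation for readability rather than the FFT evaluation the paper recommends, and an illustrative quadrant-symmetric spectrum stands in for a prescribed target.

```python
import numpy as np

def simulate_field_2d(S, k_max, N, M, seed=0):
    """One sample of a zero-mean, homogeneous, 2-D Gaussian field with
    quadrant-symmetric spectral density S(k1, k2), using the two-phase
    cosine-series (spectral representation) formula.  Direct summation
    for clarity; the paper evaluates the same series with an FFT."""
    rng = np.random.default_rng(seed)
    dk = k_max / N
    k = np.arange(N) * dk
    K1, K2 = np.meshgrid(k, k, indexing="ij")
    A = np.sqrt(2.0 * S(K1, K2) * dk * dk)
    A[0, :] = 0.0                      # no power on the zero-wavenumber
    A[:, 0] = 0.0                      # axes, making the sample ergodic
    p1 = rng.uniform(0, 2 * np.pi, (N, N))
    p2 = rng.uniform(0, 2 * np.pi, (N, N))
    x = np.linspace(0.0, 2 * np.pi / dk, M, endpoint=False)  # one period
    f = np.zeros((M, M))
    for i in range(N):
        for j in range(N):
            u = np.add.outer(k[i] * x, k[j] * x)      # k1*x1 + k2*x2
            v = np.add.outer(k[i] * x, -k[j] * x)     # k1*x1 - k2*x2
            f += np.sqrt(2.0) * A[i, j] * (np.cos(u + p1[i, j])
                                           + np.cos(v + p2[i, j]))
    return f
```

Averaged over the M-by-M grid spanning one fundamental period, the sample mean and variance reproduce the targets exactly, illustrating the ergodic property described in the abstract.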


TL;DR: A spectral-representation-based simulation algorithm is used in this paper to generate sample functions of a non-stationary, multivariate stochastic process with evolutionary power, according to its prescribed non-stationary cross-spectral density matrix.

370 citations


TL;DR: In this article, the response variability of finite element systems arising from the spatial randomness of the material properties is examined using the finite element method along with a first-order Neumann expansion of the stiffness matrix of the system.

Abstract: The response variability of finite element systems arising from the spatial randomness of the material properties is examined. The system is subjected to static loads of a deterministic nature. The problem is analyzed using the finite element method along with a first-order Neumann expansion of the stiffness matrix of the system. The covariance matrix of the response displacement vector is calculated analytically as a function of the number of finite elements. The finite element size necessary to obtain sufficiently accurate values of the stochastic response parameters is examined thoroughly. Various conclusions are drawn concerning the convergence of the coefficient of variation of the response strain to its final value as a function of the number of finite elements. The main advantage of the method is its computational efficiency, since all the response statistics are computed analytically and not through a Monte Carlo simulation.

240 citations
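The first-order Neumann expansion writes the perturbed solution as u = (K0 + dK)^(-1) F ≈ (I - K0^(-1) dK) K0^(-1) F. A toy sketch on a 1-D bar with element-wise random Young's modulus (illustrative numbers throughout; the paper propagates the response covariance analytically rather than per realization):

```python
import numpy as np

def bar_stiffness(E, L=1.0, A=1.0):
    """Global stiffness matrix of a 1-D bar with per-element Young's
    moduli E, fixed at the left end (free DOFs only)."""
    n = len(E)
    Le = L / n
    K = np.zeros((n, n))
    ke = np.array([[1.0, -1.0], [-1.0, 1.0]])
    for i, Ei in enumerate(E):
        k = Ei * A / Le
        if i == 0:
            K[0, 0] += k            # element touching the fixed node
        else:
            K[i - 1:i + 1, i - 1:i + 1] += k * ke
    return K

rng = np.random.default_rng(1)
n = 20
E0 = np.full(n, 200e9)                      # mean modulus (illustrative)
dE = 0.05 * E0 * rng.standard_normal(n)     # spatially random fluctuation
K0 = bar_stiffness(E0)
dK = bar_stiffness(E0 + dE) - K0            # stiffness perturbation
F = np.zeros(n)
F[-1] = 1.0e4                               # deterministic tip load

u0 = np.linalg.solve(K0, F)                 # deterministic response
u1 = u0 - np.linalg.solve(K0, dK @ u0)      # + first-order Neumann term
u_exact = np.linalg.solve(K0 + dK, F)
```

The error of the first-order approximation is second order in the fluctuation amplitude, so at a 5 % coefficient of variation the expansion recovers nearly all of the perturbation, which is what makes the analytical covariance calculation tractable.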

##### Cited by



TL;DR: In this article, generalized polynomial chaos expansions (PCE) are used to build surrogate models that allow one to compute the Sobol' indices analytically as a post-processing of the PCE coefficients.

1,934 citations
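Once a PCE with an orthonormal basis is available, the post-processing is a bookkeeping exercise over the multi-indices: each squared coefficient is a partial variance, and a Sobol' index is the sum of the squared coefficients whose multi-indices involve the variable(s) in question. A minimal sketch with hypothetical coefficient values:

```python
import numpy as np

def sobol_from_pce(coeffs, alphas):
    """First-order and total Sobol' indices from the coefficients of a
    polynomial chaos expansion in an orthonormal basis.  `alphas` lists
    the multi-indices, `coeffs` the matching PCE coefficients; the
    constant term (all-zero multi-index) carries the mean only."""
    alphas = np.asarray(alphas)
    c2 = np.asarray(coeffs) ** 2
    nonconst = alphas.sum(axis=1) > 0
    D = c2[nonconst].sum()                     # total variance
    d = alphas.shape[1]
    S, ST = np.empty(d), np.empty(d)
    for i in range(d):
        # multi-indices involving ONLY variable i -> first-order index
        only_i = nonconst & ((alphas.sum(axis=1) - alphas[:, i]) == 0)
        # multi-indices involving variable i at all -> total index
        any_i = nonconst & (alphas[:, i] > 0)
        S[i] = c2[only_i].sum() / D
        ST[i] = c2[any_i].sum() / D
    return S, ST

# hypothetical 2-variable PCE: coefficients and their multi-indices
S, ST = sobol_from_pce([1.0, 2.0, 1.0, 1.0, 2.0],
                       [[0, 0], [1, 0], [0, 1], [1, 1], [2, 0]])
```

For this toy expansion the total variance is 10, giving first-order indices [0.8, 0.1] and total indices [0.9, 0.2]; the interaction term [1, 1] accounts for the difference.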


Brown University

TL;DR: In this paper, the authors present a new algorithm to model the input uncertainty and its propagation in incompressible flow simulations, which is represented spectrally by employing orthogonal polynomial functionals from the Askey scheme as a trial basis to represent the random space.

1,412 citations


TL;DR: This paper proposes the use of outdoor millimeter wave communications for backhaul networking between cells and for mobile access within a cell, together with an efficient beam alignment technique using adaptive subspace sampling and hierarchical beam codebooks.

Abstract: Recently, there has been considerable interest in new tiered network cellular architectures, which would likely use many more cell sites than found today. Two major challenges will be i) providing backhaul to all of these cells and ii) finding efficient techniques to leverage higher frequency bands for mobile access and backhaul. This paper proposes the use of outdoor millimeter wave communications for backhaul networking between cells and mobile access within a cell. To overcome the outdoor impairments found in millimeter wave propagation, this paper studies beamforming using large arrays. However, such systems will require narrow beams, increasing sensitivity to movement caused by pole sway and other environmental concerns. To overcome this, we propose an efficient beam alignment technique using adaptive subspace sampling and hierarchical beam codebooks. A wind sway analysis is presented to establish a notion of beam coherence time. This highlights a previously unexplored tradeoff between array size and wind-induced movement. Generally, it is not possible to use larger arrays without risking a corresponding performance loss from wind-induced beam misalignment. The performance of the proposed alignment technique is analyzed and compared with other search and alignment methods. The results show significant performance improvement with reduced search time.

975 citations
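The hierarchical-codebook idea can be illustrated with a toy bisection search: each codebook level halves the angular sector and keeps the stronger of the two child beams, so a narrow beam is found in 2·L measurements rather than an exhaustive sweep of 2^L beams. This sketch uses an idealized sector-gain model, not the adaptive subspace sampling of the paper:

```python
import numpy as np

def hierarchical_align(measure, levels):
    """Descend a hierarchical beam codebook: at each level the current
    angular sector is split in two, both child beams are measured, and
    the stronger child is kept.  `measure(lo, hi)` returns the received
    power of a beam covering the sector [lo, hi)."""
    lo, hi = 0.0, 2.0 * np.pi
    for _ in range(levels):
        mid = 0.5 * (lo + hi)
        if measure(lo, mid) >= measure(mid, hi):
            hi = mid                   # signal lies in the left child
        else:
            lo = mid                   # signal lies in the right child
    return lo, hi

# toy channel: all power arrives from a single angle theta (assumption)
theta = 1.2345
def measure(lo, hi):
    return 1.0 if lo <= theta < hi else 0.01   # sidelobe floor
```

With 8 levels the search narrows 2*pi down to a sector of width 2*pi/256 using only 16 measurements; the paper's analysis adds the wind-sway coherence-time constraint that limits how narrow the final beam can usefully be.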

01 Jan 1990

TL;DR: This revision of ASCE/SEI 7-05 offers a complete update and reorganization of the wind load provisions, expanding them from one chapter into six, and includes new ultimate event wind maps with corresponding reductions in load factors.

Abstract: Minimum Design Loads for Buildings and Other Structures provides requirements for general structural design and includes means for determining dead, live, soil, flood, wind, snow, rain, atmospheric ice, and earthquake loads, as well as their combinations, which are suitable for inclusion in building codes and other documents. This Standard, a revision of ASCE/SEI 7-05, offers a complete update and reorganization of the wind load provisions, expanding them from one chapter into six. The Standard contains new ultimate event wind maps with corresponding reductions in load factors, so that the loads are not affected, and updates the seismic loads with new risk-targeted seismic maps. The snow, live, and atmospheric icing provisions are updated as well. In addition, the Standard includes a detailed Commentary with explanatory and supplementary information designed to assist building code committees and regulatory authorities. Standard ASCE/SEI 7 is an integral part of building codes in the United States. Many of the load provisions are substantially adopted by reference in the International Building Code and the NFPA 5000 Building Construction and Safety Code. Structural engineers, architects, and those engaged in preparing and administering local building codes will find this Standard an essential reference in their practice. Note: New orders are fulfilled from the second printing, which incorporates the errata to the first printing.

974 citations

16 Jun 2003

TL;DR: In this article, the authors presented methods of bridge fragility curve development on the basis of statistical analysis, and applied these methods in the assessment of seismic performance of expressway network systems.

Abstract: This report presents methods of bridge fragility curve development on the basis of statistical analysis. Both empirical and analytical fragility curves are considered. The empirical fragility curves are developed utilizing bridge damage data obtained from past earthquakes, particularly the 1994 Northridge and 1995 Hyogo-ken Nanbu (Kobe) earthquakes. Analytical fragility curves are constructed for typical bridges in the Memphis, Tennessee area utilizing nonlinear dynamic analysis. Two-parameter lognormal distribution functions are used to represent the fragility curves. These two parameters (referred to as fragility parameters) are estimated by two distinct methods. The first method is more traditional and uses the maximum likelihood procedure treating each event of bridge damage as a realization from a Bernoulli experiment. The second method is unique in that it permits simultaneous estimation of the fragility parameters of the family of fragility curves, each representing a particular state of damage, associated with a population of bridges. The method still utilizes the maximum likelihood procedure, however, each event of bridge damage is treated as a realization from a multi-outcome Bernoulli type experiment. These two methods of parameter estimation are used for each of the populations of bridges inspected for damage after the Northridge and Kobe earthquakes and with numerically simulated damage for the population of typical Memphis area bridges. Corresponding to these two methods of estimation, this report introduces statistical procedures for testing goodness of fit of the fragility curves and of estimating the confidence intervals of the fragility parameters. Some preliminary evaluations are made on the significance of the fragility curves developed as a function of ground intensity measures other than PGA. Furthermore, applications of fragility curves in the seismic performance estimation of expressway network systems are demonstrated. 
Exploratory research was performed to compare the empirical and analytical fragility curves developed in the major part of this report with those constructed utilizing the nonlinear static method currently promoted by the profession in conjunction with performance-based structural design. The conceptual and theoretical treatment discussed herein is believed to provide a theoretical basis and practical analytical tools for the development of fragility curves, and their application in the assessment of seismic performance of expressway network systems.

894 citations