Author

Gordon A. Fenton

Bio: Gordon A. Fenton is an academic researcher from Dalhousie University. The author has contributed to research on the topics of random fields and slope stability analysis. The author has an h-index of 39 and has co-authored 161 publications receiving 7,468 citations. Previous affiliations of Gordon A. Fenton include the University of Adelaide and Delft University of Technology.


Papers
Journal ArticleDOI
TL;DR: In this paper, the authors investigated the probability of failure of a cohesive slope using both simple and more advanced probabilistic analysis tools, and concluded that simplified probabilistic analysis, in which spatial variability is ignored by assuming perfect correlation, can lead to unconservative estimates of the failure probability.
Abstract: In this paper we investigate the probability of failure of a cohesive slope using both simple and more advanced probabilistic analysis tools. The influence of local averaging on the probability of failure of a test problem is thoroughly investigated. In the simple approach, classical slope stability analysis techniques are used, and the shear strength is treated as a single random variable. The advanced method, called the random finite-element method (RFEM), uses elastoplasticity combined with random field theory. The RFEM method is shown to offer many advantages over traditional probabilistic slope stability techniques, because it enables slope failure to develop naturally by “seeking out” the most critical mechanism. Of particular importance in this work is the conclusion that simplified probabilistic analysis, in which spatial variability is ignored by assuming perfect correlation, can lead to unconservative estimates of the probability of failure. This contradicts the findings of other investigators w...

794 citations
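
A minimal sketch of the "simple approach" described above, with the shear strength treated as a single lognormal random variable (perfect spatial correlation) and the probability of failure estimated by Monte Carlo simulation; the linear factor-of-safety model and all parameter values are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Sketch of the "simple" probabilistic analysis: the undrained shear
# strength c_u is a single lognormal random variable (equivalent to
# assuming perfect spatial correlation), and the probability of failure
# is estimated by Monte Carlo simulation. All numbers are illustrative.
rng = np.random.default_rng(0)

mean_cu, cov_cu = 50.0, 0.5                  # mean (kPa) and coeff. of variation
sigma_ln = np.sqrt(np.log(1.0 + cov_cu**2))  # lognormal parameters derived
mu_ln = np.log(mean_cu) - 0.5 * sigma_ln**2  # from the mean and c.o.v.

cu_crit = 40.0   # hypothetical strength giving FS = 1 for this slope

n = 100_000
cu = rng.lognormal(mu_ln, sigma_ln, n)       # strength realizations
fs = cu / cu_crit                            # factor of safety per realization
p_f = np.mean(fs < 1.0)                      # Monte Carlo estimate of P(failure)
print(f"estimated probability of failure: {p_f:.3f}")
```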

Book
02 Sep 2008
TL;DR: This book reviews probability theory, random processes, and random field models, and applies them to risk assessment in geotechnical engineering practice, covering groundwater flow, foundation settlement, bearing capacity, slope stability, earth pressure, mine pillar capacity, and liquefaction.
Abstract: Preface. Acknowledgements.
PART 1: THEORY.
Chapter 1: Review of Probability Theory. 1.1 Introduction. 1.2 Basic Set Theory. 1.3 Probability. 1.4 Conditional Probability. 1.5 Random Variables and Probability Distributions. 1.6 Measures of Central Tendency, Variability, and Association. 1.7 Linear Combinations of Random Variables. 1.8 Functions of Random Variables. 1.9 Common Discrete Probability Distributions. 1.10 Common Continuous Probability Distributions. 1.11 Extreme-Value Distributions.
Chapter 2: Discrete Random Processes. 2.1 Introduction. 2.2 Discrete-Time, Discrete-State Markov Chains. 2.3 Continuous-Time Markov Chains. 2.4 Queueing Models.
Chapter 3: Random Fields. 3.1 Introduction. 3.2 Covariance Function. 3.3 Spectral Density Function. 3.4 Variance Function. 3.5 Correlation Length. 3.6 Some Common Models. 3.7 Random Fields in Higher Dimensions.
Chapter 4: Best Estimates, Excursions, and Averages. 4.1 Best Linear Unbiased Estimation. 4.2 Threshold Excursions in One Dimension. 4.3 Threshold Excursions in Two Dimensions. 4.4 Averages.
Chapter 5: Estimation. 5.1 Introduction. 5.2 Choosing a Distribution. 5.3 Estimation in Presence of Correlation. 5.4 Advanced Estimation Techniques.
Chapter 6: Simulation. 6.1 Introduction. 6.2 Random-Number Generators. 6.3 Generating Nonuniform Random Variables. 6.4 Generating Random Fields. 6.5 Conditional Simulation of Random Fields. 6.6 Monte Carlo Simulation.
Chapter 7: Reliability-Based Design. 7.1 Acceptable Risk. 7.2 Assessing Risk. 7.3 Background to Design Methodologies. 7.4 Load and Resistance Factor Design. 7.5 Going Beyond Calibration. 7.6 Risk-Based Decision Making.
PART 2: PRACTICE.
Chapter 8: Groundwater Modeling. 8.1 Introduction. 8.2 Finite-Element Model. 8.3 One-Dimensional Flow. 8.4 Simple Two-Dimensional Flow. 8.5 Two-Dimensional Flow Beneath Water-Retaining Structures. 8.6 Three-Dimensional Flow. 8.7 Three-Dimensional Exit Gradient Analysis.
Chapter 9: Flow Through Earth Dams. 9.1 Statistics of Flow Through Earth Dams. 9.2 Extreme Hydraulic Gradient Statistics.
Chapter 10: Settlement of Shallow Foundations. 10.1 Introduction. 10.2 Two-Dimensional Probabilistic Foundation Settlement. 10.3 Three-Dimensional Probabilistic Foundation Settlement. 10.4 Strip Footing Risk Assessment. 10.5 Resistance Factors for Shallow-Foundation Settlement Design.
Chapter 11: Bearing Capacity. 11.1 Strip Footings on c-φ Soils. 11.2 Load and Resistance Factor Design of Shallow Foundations. 11.3 Summary.
Chapter 12: Deep Foundations. 12.1 Introduction. 12.2 Random Finite-Element Method. 12.3 Monte Carlo Estimation of Pile Capacity. 12.4 Summary.
Chapter 13: Slope Stability. 13.1 Introduction. 13.2 Probabilistic Slope Stability Analysis. 13.3 Slope Stability Reliability Model.
Chapter 14: Earth Pressure. 14.1 Introduction. 14.2 Passive Earth Pressures. 14.3 Active Earth Pressures: Retaining Wall Reliability.
Chapter 15: Mine Pillar Capacity. 15.1 Introduction. 15.2 Literature. 15.3 Parametric Studies. 15.4 Probabilistic Interpretation. 15.5 Summary.
Chapter 16: Liquefaction. 16.1 Introduction. 16.2 Model Size: Soil Liquefaction. 16.3 Monte Carlo Analysis and Results. 16.4 Summary.
PART 3: APPENDIXES.
Appendix A: Probability Tables. A.1 Normal Distribution. A.2 Inverse Student t-Distribution. A.3 Inverse Chi-Square Distribution.
Appendix B: Numerical Integration. B.1 Gaussian Quadrature.
Appendix C: Computing Variances and Covariances of Local Averages. C.1 One-Dimensional Case. C.2 Two-Dimensional Case. C.3 Three-Dimensional Case.
Index.

751 citations
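
As one concrete example of the local-averaging theory covered in Chapters 3 and 4, here is a sketch of Vanmarcke's variance function for the Markov (exponential) correlation model ρ(τ) = exp(−2|τ|/θ): the factor by which averaging over a window of length T reduces the point variance. The correlation model is one of the book's "common models"; the parameter values are illustrative:

```python
import math

def variance_function(T, theta):
    """Variance reduction factor gamma(T) for a local average over a window
    of length T of a process with Markov correlation
    rho(tau) = exp(-2|tau|/theta). gamma -> 1 as T -> 0 and decays like
    theta/T for large T."""
    x = 2.0 * T / theta
    return 2.0 * (x + math.exp(-x) - 1.0) / x**2

theta = 2.0                    # correlation length (illustrative)
for T in (0.1, 1.0, 10.0):     # averaging window lengths
    print(f"T = {T:5.1f}  gamma = {variance_function(T, theta):.3f}")
```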

Journal ArticleDOI
TL;DR: A fast and accurate method of generating realizations of a homogeneous Gaussian scalar random process in one, two, or three dimensions is presented, motivated first by the need to represent engineering properties as local averages and second to be able to condition the realization easily to incorporate known data or change resolution within sub-regions.
Abstract: A fast and accurate method of generating realizations of a homogeneous Gaussian scalar random process in one, two, or three dimensions is presented. The resulting discrete process represents local averages of a homogeneous random function defined by its mean and covariance function, the averaging being performed over incremental domains formed by different levels of discretization of the field. The approach is motivated first by the need to represent engineering properties as local averages (since many properties are not well defined at a point and show significant scale effects), and second to be able to condition the realization easily to incorporate known data or change resolution within sub-regions. The ability to condition the realization or increase the resolution in certain regions is an important contribution to finite element modeling of random phenomena. The Ornstein-Uhlenbeck and fractional Gaussian noise processes are used as illustrations.

490 citations
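
The paper's contribution is a fast subdivision-based generator of local averages; as a simpler point of reference only, here is a sketch that generates a 1-D homogeneous Gaussian process with the Ornstein-Uhlenbeck (exponential) covariance by direct Cholesky factorization of the covariance matrix. This is not the paper's algorithm, and it scales poorly to large grids:

```python
import numpy as np

# Reference generator for a 1-D homogeneous Gaussian process with
# Ornstein-Uhlenbeck (exponential) covariance, via Cholesky factorization
# of the full covariance matrix. This is NOT the paper's fast subdivision
# method; it is a simple baseline, practical only for small grids.
rng = np.random.default_rng(1)

n, dx = 256, 0.1                  # number of points and grid spacing
theta, sigma = 2.0, 1.0           # correlation length and standard deviation
t = np.arange(n) * dx
cov = sigma**2 * np.exp(-2.0 * np.abs(t[:, None] - t[None, :]) / theta)

L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))  # small jitter for stability
field = L @ rng.standard_normal(n)               # one realization of the process
```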

Journal ArticleDOI
TL;DR: This paper used random field theory and elasto-plastic finite element analysis to model soils with spatially varying shear strengths and to evaluate the extent to which spatial variability and cross-correlation of the strength parameters affect the results.
Abstract: Soils with spatially varying shear strengths are modeled using random field theory and elasto-plastic finite element analysis to evaluate the extent to which spatial variability and cross-correlati...

427 citations
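
A point-level sketch of the cross-correlation idea in the abstract: sampling two correlated strength parameters by applying the Cholesky factor of an assumed 2 × 2 correlation matrix to independent standard normals. The marginals, units, and correlation coefficient are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Sketch: sampling cross-correlated strength parameters at a point by
# applying the Cholesky factor of a 2 x 2 correlation matrix to
# independent standard normals. Marginals, units, and the correlation
# coefficient are illustrative assumptions.
rng = np.random.default_rng(2)

rho = -0.5                                   # assumed c-phi cross-correlation
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
z = L @ rng.standard_normal((2, 10_000))     # correlated standard normals

c = np.exp(np.log(10.0) + 0.3 * z[0])        # lognormal cohesion (kPa)
phi = 25.0 + 5.0 * z[1]                      # normal friction angle (degrees)
print(f"induced correlation: {np.corrcoef(z)[0, 1]:.3f}")
```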

Journal ArticleDOI
TL;DR: In this article, the authors investigated the probability of failure of slopes using both traditional and more advanced probabilistic analysis tools, and showed that simplified analyses in which spatial variability of soil properties is not properly accounted for can lead to unconservative estimates of the failure probability if the coefficient of variation of the shear strength parameters exceeds a critical value.
Abstract: The paper investigates the probability of failure of slopes using both traditional and more advanced probabilistic analysis tools. The advanced method, called the random finite-element method, uses elastoplasticity in a finite-element model combined with random field theory in a Monte Carlo framework. The traditional method, called the first-order reliability method, computes a reliability index, which is the shortest distance (in units of directional equivalent standard deviations) from the equivalent mean-value point to the limit state surface, and estimates the probability of failure from the reliability index. Numerical results show that simplified probabilistic analyses in which spatial variability of soil properties is not properly accounted for can lead to unconservative estimates of the probability of failure if the coefficient of variation of the shear strength parameters exceeds a critical value. The influences of slope inclination, factor of safety (based on mean strength values), and cross correlation between strength parameters on this critical value have been investigated by parametric studies in this paper. The results indicate when probabilistic approaches that do not model spatial variation may lead to unconservative estimates of slope failure probability and when more advanced probabilistic methods are warranted.

413 citations
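
A sketch of the first-order reliability method relation the abstract describes: for a linear limit state g(z) = b + aᵀz in standard normal space, the reliability index β is the shortest distance from the equivalent mean-value point (the origin) to the limit state surface, and the failure probability is estimated as Φ(−β). The coefficients below are illustrative:

```python
import numpy as np
from scipy.stats import norm

# Sketch of the first-order reliability method (FORM): for a linear limit
# state g(z) = b + a.z in standard normal space, beta is the shortest
# distance from the origin (the equivalent mean-value point) to the
# surface g = 0, and P_f is approximated by Phi(-beta).
a = np.array([1.5, -0.8])        # limit-state gradient (illustrative)
b = 3.0                          # safe-side offset, g(0) > 0 (illustrative)

beta = b / np.linalg.norm(a)     # Hasofer-Lind reliability index
p_f = norm.cdf(-beta)            # first-order estimate of failure probability
print(f"beta = {beta:.2f}, P_f = {p_f:.4f}")
```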


Cited by
Journal ArticleDOI

6,278 citations

Journal ArticleDOI
TL;DR: Current perspectives on the mechanisms that generate 24 h, short-term (~5 min), and ultra-short-term (<5 min) HRV are reviewed, along with the importance of HRV and its implications for health and performance.
Abstract: Healthy biological systems exhibit complex patterns of variability that can be described by mathematical chaos. Heart rate variability (HRV) consists of changes in the time intervals between consecutive heartbeats called interbeat intervals (IBIs). A healthy heart is not a metronome. The oscillations of a healthy heart are complex and constantly changing, which allow the cardiovascular system to rapidly adjust to sudden physical and psychological challenges to homeostasis. This article briefly reviews current perspectives on the mechanisms that generate 24 h, short-term (~5 min), and ultra-short-term (<5 min) HRV, the importance of HRV, and its implications for health and performance. The authors provide an overview of widely-used HRV time-domain, frequency-domain, and non-linear metrics. Time-domain indices quantify the amount of HRV observed during monitoring periods that may range from ~2 min to 24 h. Frequency-domain values calculate the absolute or relative amount of signal energy within component bands. Non-linear measurements quantify the unpredictability and complexity of a series of IBIs. The authors survey published normative values for clinical, healthy, and optimal performance populations. They stress the importance of measurement context, including recording period length, subject age, and sex, on baseline HRV values. They caution that 24 h, short-term, and ultra-short-term normative values are not interchangeable. They encourage professionals to supplement published norms with findings from their own specialized populations. Finally, the authors provide an overview of HRV assessment strategies for clinical and optimal performance interventions.

3,046 citations
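
A sketch of two of the widely used time-domain indices the abstract refers to, SDNN and RMSSD, computed from a series of interbeat intervals; the IBI series here is synthetic, for illustration only:

```python
import numpy as np

# Sketch of two standard time-domain HRV indices computed from a series
# of interbeat intervals (IBIs) in milliseconds. The IBI series is
# synthetic, for illustration only.
rng = np.random.default_rng(3)
ibi_ms = 800.0 + rng.normal(0.0, 40.0, 300)       # ~75 bpm with variability

sdnn = np.std(ibi_ms, ddof=1)                     # SD of all NN intervals
rmssd = np.sqrt(np.mean(np.diff(ibi_ms) ** 2))    # RMS of successive differences
print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")
```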

11 Jun 2010
Abstract: The validity of the cubic law for laminar flow of fluids through open fractures consisting of parallel planar plates has been established by others over a wide range of conditions with apertures ranging down to a minimum of 0.2 µm. The law may be given in simplified form by Q/Δh = C(2b)³, where Q is the flow rate, Δh is the difference in hydraulic head, C is a constant that depends on the flow geometry and fluid properties, and 2b is the fracture aperture. The validity of this law for flow in a closed fracture where the surfaces are in contact and the aperture is being decreased under stress has been investigated at room temperature by using homogeneous samples of granite, basalt, and marble. Tension fractures were artificially induced, and the laboratory setup used radial as well as straight flow geometries. Apertures ranged from 250 down to 4 µm, which was the minimum size that could be attained under a normal stress of 20 MPa. The cubic law was found to be valid whether the fracture surfaces were held open or were being closed under stress, and the results are not dependent on rock type. Permeability was uniquely defined by fracture aperture and was independent of the stress history used in these investigations. The effects of deviations from the ideal parallel plate concept only cause an apparent reduction in flow and may be incorporated into the cubic law by replacing C by C/ƒ. The factor ƒ varied from 1.04 to 1.65 in these investigations. The model of a fracture that is being closed under normal stress is visualized as being controlled by the strength of the asperities that are in contact. These contact areas are able to withstand significant stresses while maintaining space for fluids to continue to flow as the fracture aperture decreases. The controlling factor is the magnitude of the aperture, and since flow depends on (2b)³, a slight change in aperture evidently can easily dominate any other change in the geometry of the flow field. Thus one does not see any noticeable shift in the correlations of our experimental results in passing from a condition where the fracture surfaces were held open to one where the surfaces were being closed under stress.

1,557 citations
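
A worked sketch of the cubic law as given in the abstract, Q/Δh = C(2b)³, using the standard parallel-plate form of the constant, C = ρgw/(12µL), for straight flow; the abstract's factor ƒ would divide C to account for deviations from ideal parallel plates. The geometry and fluid values are illustrative assumptions:

```python
# Worked sketch of the cubic law Q/dh = C*(2b)**3, with the parallel-plate
# constant C = rho*g*w / (12*mu*L) for straight flow. The abstract's factor
# f would divide C to account for non-ideal surfaces. Geometry and fluid
# values below are illustrative assumptions.
rho, g, mu = 1000.0, 9.81, 1.0e-3   # water: density, gravity, viscosity (SI)
w, L = 0.1, 0.2                     # fracture width and flow length (m)
dh = 0.5                            # head difference (m)

C = rho * g * w / (12.0 * mu * L)
for b2 in (4e-6, 40e-6, 250e-6):    # apertures 2b spanning the tested range
    Q = C * b2**3 * dh              # flow rate (m^3/s); scales with (2b)**3
    print(f"2b = {b2 * 1e6:7.1f} um -> Q = {Q:.3e} m^3/s")
```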

Journal ArticleDOI
TL;DR: The authors conclude that a coherent heart is not a metronome because its rhythms are characterized by both complexity and stability over longer time scales.
Abstract: Heart rate variability (HRV), the change in the time intervals between adjacent heartbeats, is an emergent property of interdependent regulatory systems that operate on different time scales to adapt to challenges and achieve optimal performance. This article briefly reviews neural regulation of the heart, and its basic anatomy, the cardiac cycle, and the sinoatrial and atrioventricular pacemakers. The cardiovascular regulation center in the medulla integrates sensory information and input from higher brain centers, and afferent cardiovascular system inputs to adjust heart rate and blood pressure via sympathetic and parasympathetic efferent pathways. This article reviews sympathetic and parasympathetic influences on the heart, and examines the interpretation of HRV and the association between reduced HRV, risk of disease and mortality, and the loss of regulatory capacity. This article also discusses the intrinsic cardiac nervous system and the heart-brain connection, through which afferent information can influence activity in the subcortical and frontocortical areas, and motor cortex. It also considers new perspectives on the putative underlying physiological mechanisms and properties of the ultra-low-frequency (ULF), very-low-frequency (VLF), low-frequency (LF), and high-frequency (HF) bands. Additionally, it reviews the most common time and frequency domain measurements as well as standardized data collection protocols. In its final section, this article integrates Porges’ polyvagal theory, Thayer and colleagues’ neurovisceral integration model, Lehrer et al.’s resonance frequency model, and the Institute of HeartMath’s coherence model. The authors conclude that a coherent heart is not a metronome because its rhythms are characterized by both complexity and stability over longer time scales. Future research should expand understanding of how the heart and its intrinsic nervous system influence the brain.

1,102 citations
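
A sketch of frequency-domain band power for the LF and HF bands mentioned above, using the conventional band edges (LF 0.04–0.15 Hz, HF 0.15–0.4 Hz) and Welch's method; it assumes the interbeat-interval series has already been resampled to an evenly spaced tachogram, and the input series here is synthetic:

```python
import numpy as np
from scipy.signal import welch

# Sketch of LF/HF band power using conventional band edges
# (LF 0.04-0.15 Hz, HF 0.15-0.4 Hz). Assumes the interbeat-interval series
# has been resampled to an evenly spaced tachogram; input is synthetic.
fs = 4.0                                   # tachogram sampling rate (Hz), assumed
rng = np.random.default_rng(4)
t = np.arange(0, 300, 1.0 / fs)            # a 5-minute recording
ibi_s = (0.8 + 0.02 * np.sin(2 * np.pi * 0.1 * t)
         + 0.01 * rng.standard_normal(t.size))

f, psd = welch(ibi_s, fs=fs, nperseg=256)  # power spectral density

def band_power(lo, hi):
    m = (f >= lo) & (f < hi)
    return np.trapz(psd[m], f[m])

lf = band_power(0.04, 0.15)                # low-frequency power
hf = band_power(0.15, 0.40)                # high-frequency power
print(f"LF = {lf:.2e} s^2, HF = {hf:.2e} s^2, LF/HF = {lf / hf:.2f}")
```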

Journal ArticleDOI
TL;DR: A state-of-the-art review of past and recent developments in the stochastic finite element method (SFEM) area, indicating future directions as well as some open issues to be examined by the computational mechanics community.

851 citations