Author
George Shu Heng Pau
Other affiliations: Institute of High Performance Computing Singapore, Massachusetts Institute of Technology, National University of Singapore
Bio: George Shu Heng Pau is an academic researcher from Lawrence Berkeley National Laboratory. The author has contributed to research in topics including Markov chain Monte Carlo and adaptive mesh refinement, has an h-index of 15, and has co-authored 35 publications receiving 1063 citations. Previous affiliations of George Shu Heng Pau include the Institute of High Performance Computing Singapore and the Massachusetts Institute of Technology.
Papers
TL;DR: In this paper, the authors describe the modeling and simulation of the dissolution-diffusion-convection process based on a total velocity splitting formulation for a variable-density incompressible single-phase model.
312 citations
TL;DR: In this article, Lagrangian interpolation is used to approximate general functions by finite sums of well chosen, pre-defined, linearly independent interpolating functions; it is much simpler to implement than determining the best fits with respect to some Banach (or even Hilbert) norm.
Abstract: Lagrangian interpolation is a classical way to approximate general functions by finite sums of well-chosen, pre-defined, linearly independent interpolating functions; it is much simpler to implement than determining the best fits with respect to some Banach (or even Hilbert) norm. In addition, only partial knowledge is required (here, values on some set of points). The problem of defining the best sample of points is nevertheless rather complex and is in general open. In this paper we propose a way to derive such sets of points. We do not claim that the points resulting from the construction explained here are optimal in any sense. Nevertheless, the resulting interpolation method is proven to work under certain hypotheses, the process is very general and simple to implement, and, compared to situations where the best behavior is known, it is relatively competitive.
288 citations
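The abstract above turns on the gap between interpolating at a given point set and choosing that set well. A minimal sketch of the interpolation step itself, in plain Python, with Chebyshev nodes standing in for a well-chosen point set (the node formula and the target function are illustrative assumptions, not the paper's construction):

```python
# Classical Lagrange interpolation on a fixed point set.  The paper's
# contribution is a greedy way to *choose* the points; here Chebyshev
# nodes stand in for such a well-chosen set (an illustrative assumption).
import math

def lagrange_interpolant(xs, ys):
    """Return p with p(x) = sum_i ys[i] * L_i(x), L_i the Lagrange basis."""
    def p(x):
        total = 0.0
        for i, xi in enumerate(xs):
            li = 1.0
            for j, xj in enumerate(xs):
                if j != i:
                    li *= (x - xj) / (xi - xj)
            total += ys[i] * li
        return total
    return p

# Interpolate f = cos on [0, pi] at 6 Chebyshev nodes.
n = 6
nodes = [math.pi / 2 * (1 + math.cos((2 * k + 1) * math.pi / (2 * n)))
         for k in range(n)]
p = lagrange_interpolant(nodes, [math.cos(x) for x in nodes])

# Maximum error over a fine grid stays small because the nodes are well
# chosen; equispaced nodes would do noticeably worse for larger n.
grid = [i * math.pi / 200 for i in range(201)]
max_err = max(abs(p(x) - math.cos(x)) for x in grid)
```

Only function values at the sample points are needed, which is exactly the "partial knowledge" the abstract refers to.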
TL;DR: In this article, a layout optimization of passive constrained layer damping (PCLD) treatment for minimizing the vibration response of cylindrical shells is presented with consideration of broadband transverse force excitation.
109 citations
TL;DR: TOUGH3—a new base version of TOUGH—addresses the increasing complexity of the simulated processes as well as the growing size of model domains that need to be handled and incorporates many new features, addresses bugs, and improves the flexibility of data handling.
59 citations
Affiliations: Oak Ridge National Laboratory; Lawrence Berkeley National Laboratory; University of Michigan; National Center for Atmospheric Research; University of California, Irvine; Lund University; University of New South Wales; University of California, Berkeley; Met Office; Goddard Institute for Space Studies; Goddard Space Flight Center
TL;DR: In this article, the authors present a comprehensive and multi-faceted evaluation of Earth System Models (ESMs) projections, which use observations to constrain model predictions, inform model development, and identify needed field experiments.
Abstract: As earth system models (ESMs) become increasingly complex, there is a growing need for comprehensive and multi-faceted evaluation of model projections. To advance understanding of terrestrial biogeochemical processes and their interactions with hydrology and climate under conditions of increasing atmospheric carbon dioxide, new analysis methods are required that use observations to constrain model predictions, inform model development, and identify needed measurements and field experiments. Better representations of biogeochemistry-climate feedbacks and ecosystem processes in these models are essential for reducing the acknowledged substantial uncertainties in 21st century climate change projections.
42 citations
Cited by
TL;DR: In this paper, the authors review photonic crystals, named for the analogy between electron waves in crystals and light waves in artificial periodic dielectric structures; interest in such periodic structures has been stimulated by the rapid development of semiconductor technology, which now allows the fabrication of artificial structures whose period is comparable with the wavelength of light in the visible and infrared ranges.
Abstract: The term photonic crystals arises from the analogy between electron waves in crystals and light waves in artificial periodic dielectric structures. In recent years, the investigation of one-, two-, and three-dimensional periodic structures has attracted widespread attention in the optics community because of the great potential of such structures in advanced applied optical fields. The interest in periodic structures has been stimulated by the fast development of semiconductor technology that now allows the fabrication of artificial structures whose period is comparable with the wavelength of light in the visible and infrared ranges.
2,722 citations
TL;DR: The methodology proposed automatically adapts to the local structure when simulating paths across this manifold, providing highly efficient convergence and exploration of the target density, and substantial improvements in the time‐normalized effective sample size are reported when compared with alternative sampling approaches.
Abstract: The paper proposes Metropolis adjusted Langevin and Hamiltonian Monte Carlo sampling methods defined on the Riemann manifold to resolve the shortcomings of existing Monte Carlo algorithms when sampling from target densities that may be high dimensional and exhibit strong correlations. The methods provide fully automated adaptation mechanisms that circumvent the costly pilot runs that are required to tune proposal densities for Metropolis-Hastings or indeed Hamiltonian Monte Carlo and Metropolis adjusted Langevin algorithms. This allows for highly efficient sampling even in very high dimensions where different scalings may be required for the transient and stationary phases of the Markov chain. The methodology proposed exploits the Riemann geometry of the parameter space of statistical models and thus automatically adapts to the local structure when simulating paths across this manifold, providing highly efficient convergence and exploration of the target density. The performance of these Riemann manifold Monte Carlo methods is rigorously assessed by performing inference on logistic regression models, log-Gaussian Cox point processes, stochastic volatility models and Bayesian estimation of dynamic systems described by non-linear differential equations. Substantial improvements in the time-normalized effective sample size are reported when compared with alternative sampling approaches. MATLAB code that is available from http://www.ucl.ac.uk/statistics/research/rmhmc allows replication of all the results reported.
1,031 citations
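When the metric is taken to be the identity, the manifold samplers described above reduce to the classical Metropolis-adjusted Langevin algorithm (MALA). A minimal sketch of that Euclidean special case on a standard normal target (the step size, seed, and chain length are arbitrary illustrative choices; the paper's contribution is replacing the identity preconditioner with a position-dependent metric such as the Fisher information):

```python
# Plain one-dimensional MALA: Langevin drift proposal plus a
# Metropolis-Hastings correction for the asymmetric proposal density.
# Target: standard normal, for which grad log pi(x) = -x.
import math, random

random.seed(0)

def mala(grad_log_pi, log_pi, x0, step, n_samples):
    x, samples = x0, []
    for _ in range(n_samples):
        mean_fwd = x + 0.5 * step * grad_log_pi(x)
        prop = random.gauss(mean_fwd, math.sqrt(step))
        mean_bwd = prop + 0.5 * step * grad_log_pi(prop)
        # log q(prop | x) and log q(x | prop), up to a shared constant
        log_q_fwd = -(prop - mean_fwd) ** 2 / (2 * step)
        log_q_bwd = -(x - mean_bwd) ** 2 / (2 * step)
        log_alpha = log_pi(prop) - log_pi(x) + log_q_bwd - log_q_fwd
        if math.log(random.random()) < log_alpha:
            x = prop
        samples.append(x)
    return samples

samples = mala(lambda x: -x, lambda x: -0.5 * x * x, 0.0, 0.5, 20000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

For correlated, badly scaled targets this fixed-step sampler is exactly where the costly pilot tuning arises that the manifold methods are designed to avoid.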
01 Dec 2012
Abstract: We upscaled FLUXNET observations of carbon dioxide, water, and energy fluxes to the global scale using the machine learning technique, model tree ensembles (MTE). We trained MTE to predict site-level gross primary productivity (GPP), terrestrial ecosystem respiration (TER), net ecosystem exchange (NEE), latent energy (LE), and sensible heat (H) based on remote sensing indices, climate and meteorological data, and information on land use. We applied the trained MTEs to generate global flux fields at a 0.5° × 0.5° spatial resolution and a monthly temporal resolution from 1982 to 2008. Cross-validation analyses revealed good performance of MTE in predicting among-site flux variability with modeling efficiencies (MEf) between 0.64 and 0.84, except for NEE (MEf = 0.32). Performance was also good for predicting seasonal patterns (MEf between 0.84 and 0.89, except for NEE (0.64)). By comparison, predictions of monthly anomalies were not as strong (MEf between 0.29 and 0.52). Improved accounting of disturbance and lagged environmental effects, along with improved characterization of errors in the training data set, would contribute most to further reducing uncertainties. Our global estimates of LE (158 ± 7 × 10^18 J yr^-1), H (164 ± 15 × 10^18 J yr^-1), and GPP (119 ± 6 Pg C yr^-1) were similar to independent estimates. Our global TER estimate (96 ± 6 Pg C yr^-1) was likely underestimated by 5-10%. Hot spot regions of interannual variability in carbon fluxes occurred in semiarid to semihumid regions and were controlled by moisture supply. Overall, GPP was more important to interannual variability in NEE than TER. Our empirically derived fluxes may be used for calibration and evaluation of land surface process models and for exploratory and diagnostic assessments of the biosphere.
948 citations
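The workflow in the abstract above (train on site-level flux observations, cross-validate, then apply the trained model over a grid) can be sketched generically. The paper uses model tree ensembles (MTE); the toy k-nearest-neighbour regressor, the synthetic single-predictor "site" data, and the leave-one-out loop below are stand-ins chosen only to keep the sketch self-contained:

```python
# Hedged sketch of the upscaling workflow: fit a regressor on site-level
# (predictor, flux) pairs, score it by cross-validated modelling
# efficiency (MEf, i.e. Nash-Sutcliffe), then evaluate it on a grid.
import random

random.seed(1)

def knn_regress(train_x, train_y, x, k=5):
    """Predict as the mean flux of the k nearest training sites."""
    order = sorted(range(len(train_x)), key=lambda i: abs(train_x[i] - x))
    return sum(train_y[i] for i in order[:k]) / k

# Synthetic "sites": flux is a smooth function of one climate predictor.
def true_flux(t):
    return 2.0 * t + 1.0

sites_x = [random.uniform(0, 10) for _ in range(200)]
sites_y = [true_flux(t) + random.gauss(0, 0.2) for t in sites_x]

# Leave-one-out cross-validation, analogous to the paper's site-level MEf.
residuals = [
    sites_y[i] - knn_regress(sites_x[:i] + sites_x[i + 1:],
                             sites_y[:i] + sites_y[i + 1:], sites_x[i])
    for i in range(len(sites_x))
]
ss_res = sum(r * r for r in residuals)
mean_y = sum(sites_y) / len(sites_y)
ss_tot = sum((y - mean_y) ** 2 for y in sites_y)
mef = 1.0 - ss_res / ss_tot

# "Upscaling": evaluate the trained model on a regular grid of predictors.
grid = [knn_regress(sites_x, sites_y, t) for t in range(1, 10)]
```

MEf = 1 means perfect prediction and MEf = 0 means no better than the mean, which is the scale on which the abstract's 0.32-0.89 scores are read.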
01 Sep 2015
TL;DR: In this article, the authors provide a thorough introduction to the mathematical and algorithmic aspects of certified reduced basis methods for parametrized partial differential equations, including model construction, error estimation and computational efficiency.
Abstract: This book provides a thorough introduction to the mathematical and algorithmic aspects of certified reduced basis methods for parametrized partial differential equations. Central aspects ranging from model construction, error estimation and computational efficiency to empirical interpolation methods are discussed in detail for coercive problems. More advanced aspects associated with time-dependent problems, non-compliant and non-coercive problems and applications with geometric variation are also discussed as examples.
831 citations
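The offline/online split at the heart of reduced basis methods can be sketched on a deliberately tiny parametrized system A(mu) u = f with A(mu) = mu K, whose solution manifold is one-dimensional, so a single snapshot spans it exactly. Real applications need greedy snapshot selection and the certified error estimators the book develops; this sketch omits both (the matrices and parameter values below are made up for illustration):

```python
# Offline: solve the full model once at a training parameter and keep the
# snapshot as a one-vector reduced basis V.  Online: for a new mu, solve
# the 1x1 Galerkin-projected system V^T A(mu) V c = V^T f.

def mat_vec(A, x):
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def solve(A, b):
    """Gaussian elimination with partial pivoting (small dense systems)."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            fac = M[r][c] / M[c][c]
            M[r] = [mr - fac * mc for mr, mc in zip(M[r], M[c])]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k]
                              for k in range(r + 1, n))) / M[r][r]
    return x

K = [[2.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 2.0]]
f = [1.0, 0.0, 1.0]

mu_train = 1.0                      # offline stage: one expensive solve
snapshot = solve([[mu_train * k for k in row] for row in K], f)

mu = 3.7                            # online stage: cheap reduced solve
Kv = mat_vec(K, snapshot)
a_red = mu * sum(v * kv for v, kv in zip(snapshot, Kv))   # V^T A(mu) V
f_red = sum(v * fi for v, fi in zip(snapshot, f))         # V^T f
coef = f_red / a_red
u_rb = [coef * v for v in snapshot]

u_full = solve([[mu * k for k in row] for row in K], f)
err = max(abs(a - b) for a, b in zip(u_rb, u_full))
```

Because A depends on mu only through the scalar factor, the reduced solution matches the full one to roundoff; with genuinely parameter-dependent operators the basis grows and a posteriori error bounds certify the approximation.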
Affiliations: National Center for Atmospheric Research; Lawrence Berkeley National Laboratory; University of Utah; Los Alamos National Laboratory; Boston University; University of Texas at Austin; University of Maryland, College Park; University of Florida; Empresa Brasileira de Pesquisa Agropecuária; University of Notre Dame; Smithsonian Tropical Research Institute; Brookhaven National Laboratory; Japan Agency for Marine-Earth Science and Technology; Lund University; Princeton University; Ghent University; Columbia University; Harvard University
TL;DR: It is argued that stronger and more innovative connections to data are required to address gaps in understanding, and that constrained predictions at ecologically relevant spatial and temporal scales will require a similar investment of effort and intensified inter-disciplinary communication.
Abstract: Numerous current efforts seek to improve the representation of ecosystem ecology and vegetation demographic processes within Earth System Models (ESMs). These developments are widely viewed as an important step in developing greater realism in predictions of future ecosystem states and fluxes. Increased realism, however, leads to increased model complexity, with new features raising a suite of ecological questions that require empirical constraints. Here, we review the developments that permit the representation of plant demographics in ESMs, and identify issues raised by these developments that highlight important gaps in ecological understanding. These issues inevitably translate into uncertainty in model projections but also allow models to be applied to new processes and questions concerning the dynamics of real-world ecosystems. We argue that stronger and more innovative connections to data, across the range of scales considered, are required to address these gaps in understanding. The development of first-generation land surface models as a unifying framework for ecophysiological understanding stimulated much research into plant physiological traits and gas exchange. Constraining predictions at ecologically relevant spatial and temporal scales will require a similar investment of effort and intensified inter-disciplinary communication.
445 citations