Author

Qi-Man Shao

Bio: Qi-Man Shao is an academic researcher from The Chinese University of Hong Kong. The author has contributed to research in topics: Random variable & Law of the iterated logarithm. The author has an h-index of 41 and has co-authored 216 publications receiving 8,257 citations. Previous affiliations of Qi-Man Shao include Hangzhou University & University of Oregon.


Papers
Journal ArticleDOI
TL;DR: In this article, a simple Monte Carlo approach is proposed for approximating Bayesian credible and highest probability density (HPD) intervals for parameters of interest when a sample from their marginal posterior distribution can be generated with a Markov chain Monte Carlo (MCMC) sampling algorithm, along with a Monte Carlo method for computing HPD intervals from a sample generated by an importance sampling distribution.
Abstract: This article considers how to estimate Bayesian credible and highest probability density (HPD) intervals for parameters of interest and provides a simple Monte Carlo approach to approximate these Bayesian intervals when a sample of the relevant parameters can be generated from their respective marginal posterior distribution using a Markov chain Monte Carlo (MCMC) sampling algorithm. We also develop a Monte Carlo method to compute HPD intervals for the parameters of interest from the desired posterior distribution using a sample from an importance sampling distribution. We apply our methodology to a Bayesian hierarchical model that has a posterior density containing analytically intractable integrals that depend on the (hyper) parameters. We further show that our methods are useful not only for calculating the HPD intervals for the parameters of interest but also for computing the HPD intervals for functions of the parameters. Necessary theory is developed and illustrative examples—including a si...
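The HPD computation described in the abstract lends itself to a brief illustration. Below is a minimal sketch, not code from the paper, assuming a unimodal marginal posterior: given MCMC draws of a parameter, the 100(1 - alpha)% HPD interval is approximated by the shortest interval containing a (1 - alpha) fraction of the sorted draws. The function name hpd_interval and the Gamma sample used for demonstration are illustrative assumptions.

```python
import numpy as np

def hpd_interval(draws, alpha=0.05):
    """Approximate a 100(1 - alpha)% HPD interval from a posterior sample
    (assumes a unimodal marginal posterior): among all intervals spanning a
    (1 - alpha) fraction of the sorted draws, return the shortest one."""
    draws = np.sort(np.asarray(draws))
    n = len(draws)
    m = int(np.floor((1.0 - alpha) * n))      # number of draws each candidate interval covers
    widths = draws[m:] - draws[:n - m]        # widths of candidates (draws[j], draws[j + m])
    j = int(np.argmin(widths))                # shortest candidate approximates the HPD interval
    return draws[j], draws[j + m]

# Demonstration on a simulated "posterior sample" (stand-in for MCMC output).
rng = np.random.default_rng(0)
sample = rng.gamma(shape=3.0, scale=1.0, size=20_000)
lower, upper = hpd_interval(sample, alpha=0.05)
print(f"approximate 95% HPD interval: ({lower:.3f}, {upper:.3f})")
```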

844 citations

Book
21 Jan 2000
TL;DR: This book examines advanced Bayesian computational methods, presenting techniques for sampling from posterior distributions and discussing how to compute posterior quantities of interest from Markov chain Monte Carlo (MCMC) samples.
Abstract: This book examines advanced Bayesian computational methods. It presents methods for sampling from posterior distributions and discusses how to compute posterior quantities of interest using Markov chain Monte Carlo (MCMC) samples. This book examines each of these issues in detail and focuses heavily on computing various posterior quantities of interest from a given MCMC sample. Several topics are addressed, including techniques for MCMC sampling, Monte Carlo methods for estimation of posterior quantities, improving simulation accuracy, marginal posterior density estimation, estimation of normalizing constants, constrained parameter problems, highest posterior density interval calculations, computation of posterior modes, and posterior computations for proportional hazards models and Dirichlet process models. The authors also discuss computations involving model comparisons, including both nested and non-nested models, marginal likelihood methods, ratios of normalizing constants, Bayes factors, the Savage-Dickey density ratio, Stochastic Search Variable Selection, Bayesian Model Averaging, the reversible jump algorithm, and model adequacy using predictive and latent residual approaches. The book presents an equal mixture of theory and applications involving real data. The book is intended as a graduate textbook or a reference book for a one-semester course at the advanced master's or Ph.D. level. It would also serve as a useful reference book for applied or theoretical researchers as well as practitioners. Ming-Hui Chen is Associate Professor of Mathematical Sciences at Worcester Polytechnic Institute, Qi-Man Shao is Assistant Professor of Mathematics at the University of Oregon, and Joseph G. Ibrahim is Associate Professor of Biostatistics at the Harvard School of Public Health and Dana-Farber Cancer Institute.
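One listed topic, estimation of ratios of normalizing constants, can be illustrated with a generic importance-sampling identity rather than any specific estimator from the book: if p_i = q_i / c_i for i = 1, 2 and draws from p_2 are available, then c_1 / c_2 = E_{p_2}[q_1(theta) / q_2(theta)]. The Gaussian kernels below are hypothetical and chosen only so the exact ratio (0.5) is known.

```python
import numpy as np

def q1(x):
    # unnormalised N(1, 1) kernel; its normalising constant is c1 = sqrt(2*pi)
    return np.exp(-0.5 * (x - 1.0) ** 2)

def q2(x):
    # unnormalised N(0, 4) kernel; its normalising constant is c2 = sqrt(8*pi)
    return np.exp(-x ** 2 / 8.0)

rng = np.random.default_rng(1)
draws = rng.normal(loc=0.0, scale=2.0, size=100_000)   # draws from p2 = N(0, 4)

# Sample-mean estimate of c1 / c2 = E_{p2}[q1 / q2]; the exact value here is 0.5.
ratio_hat = np.mean(q1(draws) / q2(draws))
print(f"estimated ratio of normalizing constants: {ratio_hat:.4f} (exact 0.5)")
```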

762 citations

Book
04 Nov 2010
TL;DR: This book develops Stein's method for normal approximation, covering Berry-Esseen bounds for independent random variables, uniform and non-uniform bounds for non-linear statistics, moderate deviations, multivariate normal approximation, and non-normal approximation.
Abstract: Preface. 1. Introduction. 2. Fundamentals of Stein's Method. 3. Berry-Esseen Bounds for Independent Random Variables. 4. L^1 Bounds. 5. L^1 by Bounded Couplings. 6. L^1: Applications. 7. Non-uniform Bounds for Independent Random Variables. 8. Uniform and Non-uniform Bounds under Local Dependence. 9. Uniform and Non-uniform Bounds for Non-linear Statistics. 10. Moderate Deviations. 11. Multivariate Normal Approximation. 12. Discretized Normal Approximation. 13. Non-normal Approximation. 14. Extensions. References. Author Index. Subject Index. Notation.
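As a numerical companion to the Berry-Esseen chapters listed above (an illustration, not an excerpt from the book), the sketch below compares the exact Kolmogorov distance between a standardized Binomial sum and the standard normal law with the classical i.i.d. Berry-Esseen bound C * E|X_1 - mu|^3 / (sigma^3 * sqrt(n)); the constant 0.4748 is a published admissible value for the i.i.d. case and is an assumption of this example.

```python
import numpy as np
from scipy import stats

# Berry-Esseen illustration for i.i.d. Bernoulli(p) summands:
# sup_x |P((S_n - n*mu) / (sigma*sqrt(n)) <= x) - Phi(x)| <= C * rho / (sigma**3 * sqrt(n)).
p, n = 0.1, 200
mu, sigma = p, np.sqrt(p * (1 - p))
rho = p * (1 - p) * ((1 - p) ** 2 + p ** 2)        # E|X - mu|^3 for a Bernoulli(p) variable

k = np.arange(n + 1)
x = (k - n * mu) / (sigma * np.sqrt(n))            # support of the standardised sum
binom_cdf = stats.binom.cdf(k, n, p)
phi = stats.norm.cdf(x)

# The CDF of the sum is a step function, so check both sides of every jump.
sup_dist = max(np.max(np.abs(binom_cdf - phi)),
               np.max(np.abs(np.concatenate(([0.0], binom_cdf[:-1])) - phi)))

C = 0.4748                                          # assumed admissible Berry-Esseen constant
bound = C * rho / (sigma ** 3 * np.sqrt(n))
print(f"observed Kolmogorov distance {sup_dist:.4f} <= Berry-Esseen bound {bound:.4f}")
```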

468 citations

Book ChapterDOI
TL;DR: In this chapter, the authors survey inequalities, small ball probabilities, and applications of Gaussian processes, and show that the small ball probability is a key step in studying the lower limits of Gaussian processes.
Abstract: This chapter focuses on inequalities, small ball probabilities, and applications of Gaussian processes. It is well known that the large deviation result plays a fundamental role in studying the upper limits of Gaussian processes, such as the Strassen-type law of the iterated logarithm. The small ball estimate, by contrast, is notoriously difficult, and there are only a few Gaussian measures for which the small ball probability can be determined completely. The small ball probability is a key step in studying the lower limits of Gaussian processes. It has been found that the small ball estimate has close connections with various approximation quantities of compact sets and operators, and has a variety of applications in studies of Hausdorff dimensions, the rate of convergence in Strassen's law of the iterated logarithm, and empirical processes.
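To make the notion of a small ball probability concrete, the sketch below (an illustration, not material from the chapter) estimates P(sup_{0 <= t <= 1} |W_t| <= eps) for standard Brownian motion by crude Monte Carlo and compares it with the classical leading asymptotic (4/pi) * exp(-pi^2 / (8 * eps^2)); the choice eps = 0.5 and the discretization level are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

eps, n_steps = 0.5, 1_000
n_paths, batch = 100_000, 10_000
dt = 1.0 / n_steps

hits = 0
for _ in range(n_paths // batch):
    increments = rng.normal(scale=np.sqrt(dt), size=(batch, n_steps))
    paths = np.cumsum(increments, axis=1)            # discretised Brownian paths on [0, 1]
    # The discretised maximum understates the continuous supremum, so this
    # Monte Carlo estimate is biased slightly upward.
    hits += int(np.sum(np.max(np.abs(paths), axis=1) <= eps))

estimate = hits / n_paths
leading_term = (4.0 / np.pi) * np.exp(-np.pi ** 2 / (8.0 * eps ** 2))
print(f"Monte Carlo estimate: {estimate:.5f}   leading small ball asymptotic: {leading_term:.5f}")
```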

442 citations

Journal ArticleDOI
Qi-Man Shao
TL;DR: A comparison theorem on moment inequalities between negatively associated and independent random variables is established, extending the Hoeffding inequality on probability bounds for the sum of a random sample drawn without replacement from a finite population.
Abstract: Let $\{X_i, 1\le i\le n\}$ be a negatively associated sequence, and let $\{X_i^*, 1\le i\le n\}$ be a sequence of independent random variables such that $X_i^*$ and $X_i$ have the same distribution for each $i=1,2,\ldots,n$. It is shown in this paper that $Ef(\sum_{i=1}^{n} X_i) \le Ef(\sum_{i=1}^{n} X_i^*)$ for any convex function $f$ on $\mathbb{R}^1$, and that $Ef(\max_{1\le k\le n} \sum_{i=1}^{k} X_i) \le Ef(\max_{1\le k\le n} \sum_{i=1}^{k} X_i^*)$ for any increasing convex function $f$. Hence, most of the well-known inequalities, such as the Rosenthal maximal inequality and the Kolmogorov exponential inequality, remain true for negatively associated random variables. In particular, the comparison theorem on moment inequalities between negatively associated and independent random variables extends the Hoeffding inequality on the probability bounds for the sum of a random sample without replacement from a finite population.
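The Hoeffding connection in the last sentence is easy to probe numerically: a simple random sample drawn without replacement from a finite population is negatively associated, and the comparison theorem gives $Ef(\sum_{i=1}^{k} X_i) \le Ef(\sum_{i=1}^{k} X_i^*)$ for convex $f$, where the starred variables are the independent with-replacement counterparts. The sketch below uses an arbitrary ten-point population and $f(x) = e^x$; it illustrates the direction of the inequality and is not code from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

population = np.arange(10, dtype=float) / 10.0   # hypothetical finite population {0.0, ..., 0.9}
k, n_sim = 5, 50_000

# Sums of k draws without replacement (a negatively associated sequence) ...
without = np.array([rng.choice(population, size=k, replace=False).sum()
                    for _ in range(n_sim)])
# ... versus sums of k independent draws with replacement.
with_repl = rng.choice(population, size=(n_sim, k), replace=True).sum(axis=1)

f = np.exp                                       # a convex test function
print(f"E f(sum), without replacement: {f(without).mean():.4f}")
print(f"E f(sum), with replacement:    {f(with_repl).mean():.4f}  (expected to be >=)")
```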

322 citations


Cited by
Journal ArticleDOI
TL;DR: Convergence of Probability Measures is P. Billingsley's classic monograph on the theory of weak convergence of probability measures.
Abstract: Convergence of Probability Measures. By P. Billingsley. Chichester, Sussex, Wiley, 1968. xii, 253 p. 9 1/4“. 117s.

5,689 citations

01 Jan 2002
TL;DR: The mixed logit model is considered the most promising state-of-the-art discrete choice model currently available, but estimation and data issues are far from clear, and possibly for the first time there is an estimation method that requires extremely high-quality data.
Abstract: The mixed logit model is considered to be the most promising state-of-the-art discrete choice model currently available. Increasingly, researchers and practitioners are estimating mixed logit models of various degrees of sophistication with mixtures of revealed preference and stated choice data. It is timely to review progress in model estimation, since the learning curve is steep and the unwary are likely to fall into a chasm if not careful. These chasms are very deep indeed given the complexity of the mixed logit model. Although the theory is relatively clear, estimation and data issues are far from clear. Indeed, there is a great deal of potential mis-inference consequent on trying to extract increased behavioural realism from data that are often not able to comply with the demands of mixed logit models. Possibly for the first time, we now have an estimation method that requires extremely high-quality data if the analyst wishes to take advantage of the extended behavioural capabilities of such models. This paper focuses on the new opportunities offered by mixed logit models and some issues to be aware of to avoid misuse of such advanced discrete choice methods by the practitioner.
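For readers unfamiliar with the model, the computation whose data demands the paper warns about is the simulated choice probability P_ni ≈ (1/R) * sum_r L_ni(beta_r), a logit probability averaged over R draws of the random coefficients. The sketch below uses hypothetical attributes and an assumed normal coefficient distribution; it is not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated mixed logit choice probabilities for one hypothetical decision maker.
n_alts, n_draws = 3, 500
x = rng.normal(size=(n_alts, 2))                     # two attributes for each of three alternatives
beta_mean = np.array([1.0, -0.5])                    # assumed mean of the random coefficients
beta_sd = np.array([0.5, 0.3])                       # assumed standard deviations

beta_draws = beta_mean + beta_sd * rng.normal(size=(n_draws, 2))   # beta_r ~ N(mean, sd^2)
utilities = beta_draws @ x.T                                       # shape (n_draws, n_alts)
logit_probs = np.exp(utilities) / np.exp(utilities).sum(axis=1, keepdims=True)
simulated_prob = logit_probs.mean(axis=0)            # average the logit kernel over the draws

print("simulated mixed logit choice probabilities:", np.round(simulated_prob, 3))
```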

1,806 citations