
Showing papers by "Bernard W. Silverman" published in 2005


Journal ArticleDOI
25 Aug 2005 - Nature
TL;DR: It is demonstrated that there is a decelerating relationship between community respiration and increasing bacterial diversity, and both synergistic interactions among bacterial species and the composition of the bacterial community are important in determining the level of ecosystem functioning.
Abstract: Despite their importance, we are only beginning to understand how mixed communities of bacteria operate. There is a good reason for this: the microbial world is remarkably complex and dynamic so it is difficult to design experiments that ask the right questions. Laboratory microcosms are useful but involve small numbers of species in unreal situations. A natural ecosystem that can be manipulated experimentally is available, however. Rainpools that form in bark-lined depressions at the base of European beech trees are communities of up to 72 species, rather than the thousands found in, say, pond water. In this rainpool ecosystem the number of bacterial species (the biodiversity) strongly influences the rate at which the community provides a particular service (in this case, respiration). On this scale at least, species richness determines the level at which an ecosystem can function. Bacterial communities provide important services. They break down pollutants, municipal waste and ingested food, and they are the primary means by which organic matter is recycled to plants and other autotrophs. However, the processes that determine the rate at which these services are supplied are only starting to be identified. Biodiversity influences the way in which ecosystems function [1], but the form of the relationship between bacterial biodiversity and functioning remains poorly understood. Here we describe a manipulative experiment that measured how biodiversity affects the functioning of communities containing up to 72 bacterial species constructed from a collection of naturally occurring culturable bacteria. The experimental design allowed us to manipulate large numbers of bacterial species selected at random from those that were culturable. We demonstrate that there is a decelerating relationship between community respiration and increasing bacterial diversity. We also show that both synergistic interactions among bacterial species and the composition of the bacterial community are important in determining the level of ecosystem functioning.
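
The decelerating (saturating) form of the diversity-function relationship can be illustrated with a short curve-fitting sketch. The Python snippet below fits a Michaelis-Menten-type saturating curve to made-up richness/respiration values; the data, the choice of curve and the parameter names are illustrative assumptions, not the paper's analysis.

```python
# A minimal sketch: fit a saturating (decelerating) curve to hypothetical
# diversity-respiration data. Values are invented for illustration only.
import numpy as np
from scipy.optimize import curve_fit

richness = np.array([1, 2, 4, 8, 16, 32, 72], dtype=float)   # species per microcosm (hypothetical)
respiration = np.array([0.8, 1.4, 2.1, 2.6, 2.9, 3.1, 3.2])  # arbitrary units (hypothetical)

def michaelis_menten(s, vmax, k):
    """Saturating response: rises steeply at low richness, flattens at high richness."""
    return vmax * s / (k + s)

params, _ = curve_fit(michaelis_menten, richness, respiration, p0=[3.0, 2.0])
vmax, k = params
print(f"fitted asymptote ~ {vmax:.2f}, half-saturation richness ~ {k:.2f}")
```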

836 citations


Journal ArticleDOI
TL;DR: In this paper, the authors explore a class of empirical Bayes methods for level-dependent threshold selection in wavelet shrinkage, where the prior considered for each wavelet coefficient is a mixture of an atom of probability at zero and a heavy-tailed density.
Abstract: This paper explores a class of empirical Bayes methods for level-dependent threshold selection in wavelet shrinkage. The prior considered for each wavelet coefficient is a mixture of an atom of probability at zero and a heavy-tailed density. The mixing weight, or sparsity parameter, for each level of the transform is chosen by marginal maximum likelihood. If estimation is carried out using the posterior median, this is a random thresholding procedure; the estimation can also be carried out using other thresholding rules with the same threshold. Details of the calculations needed for implementing the procedure are included. In practice, the estimates are quick to compute and there is software available. Simulations on the standard model functions show excellent performance, and applications to data drawn from various fields of application are used to explore the practical performance of the approach. By using a general result on the risk of the corresponding marginal maximum likelihood approach for a single sequence, overall bounds on the risk of the method are found subject to membership of the unknown function in one of a wide range of Besov classes, covering also the case of f of bounded variation. The rates obtained are optimal for any value of the parameter p in (0, ∞], simultaneously for a wide range of loss functions, each dominating the Lq norm of the σth derivative, with σ ≥ 0 and 0 < q ≤ 2. Attention is paid to the distinction between sampling the unknown function within white noise and sampling at discrete points, and between placing constraints on the function itself and on the discrete wavelet transform of its sequence of values at the observation points. Results for all relevant combinations of these scenarios are obtained. In some cases a key feature of the theory is a particular boundary-corrected wavelet basis, details of which are discussed. Overall, the approach described seems so far unique in combining the properties of fast computation, good theoretical properties and good performance in simulations and in practice. A key feature appears to be that the estimate of sparsity adapts to three different zones of estimation, first where the signal is not sparse enough for thresholding to be of benefit, second where an appropriately chosen threshold results in substantially improved estimation, and third where the signal is so sparse that the zero estimate gives the optimum accuracy rate.
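
To make the procedure concrete, here is a minimal Python sketch of the core empirical Bayes step: a mixture prior of an atom at zero and a Laplace density, with the mixing weight chosen by marginal maximum likelihood. The unit noise variance, the Laplace scale a = 0.5, the synthetic test sequence and the simplified keep-or-kill rule at the end are all assumptions for illustration; the paper's estimator uses the posterior median (or other thresholding rules with the same threshold) rather than this crude stand-in.

```python
# A minimal sketch of empirical Bayes threshold selection for a sparse sequence
# observed in unit-variance Gaussian noise (not the paper's exact estimator).
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize_scalar

def laplace_marginal(x, a=0.5):
    """Density of x = theta + N(0,1) when theta has a Laplace density with rate a."""
    return 0.5 * a * np.exp(0.5 * a**2) * (
        np.exp(-a * x) * norm.cdf(x - a) + np.exp(a * x) * norm.cdf(-x - a)
    )

def neg_marginal_loglik(w, x, a=0.5):
    """Negative marginal log-likelihood of the mixing weight w (probability of a nonzero mean)."""
    dens = (1 - w) * norm.pdf(x) + w * laplace_marginal(x, a)
    return -np.sum(np.log(dens))

# Sparse test sequence: mostly zeros plus a few large means (hypothetical data).
rng = np.random.default_rng(0)
theta = np.zeros(200)
theta[:10] = 5.0
x = theta + rng.standard_normal(200)

# Marginal maximum likelihood choice of the sparsity parameter w.
res = minimize_scalar(neg_marginal_loglik, bounds=(1e-4, 1.0), args=(x,), method="bounded")
w_hat = res.x

# Posterior probability that each coefficient is nonzero; as a crude stand-in for the
# posterior-median rule, coefficients whose posterior odds favour the atom at zero are killed.
post_nonzero = w_hat * laplace_marginal(x) / (
    w_hat * laplace_marginal(x) + (1 - w_hat) * norm.pdf(x)
)
theta_hat = np.where(post_nonzero > 0.5, x, 0.0)
print(f"estimated sparsity weight: {w_hat:.3f}; nonzero estimates kept: {(theta_hat != 0).sum()}")
```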

317 citations


Journal ArticleDOI
TL;DR: A key feature appears to be that the estimate of sparsity adapts to three different zones of estimation, first where the signal is not sparse enough for thresholding to be of benefit, second where an appropriately chosen threshold results in substantially improved estimation, and third where the signals are so sparse that the zero estimate gives the optimum accuracy rate.
Abstract: This paper explores a class of empirical Bayes methods for level-dependent threshold selection in wavelet shrinkage. The prior considered for each wavelet coefficient is a mixture of an atom of probability at zero and a heavy-tailed density. The mixing weight, or sparsity parameter, for each level of the transform is chosen by marginal maximum likelihood. If estimation is carried out using the posterior median, this is a random thresholding procedure; the estimation can also be carried out using other thresholding rules with the same threshold. Details of the calculations needed for implementing the procedure are included. In practice, the estimates are quick to compute and there is software available. Simulations on the standard model functions show excellent performance, and applications to data drawn from various fields of application are used to explore the practical performance of the approach. By using a general result on the risk of the corresponding marginal maximum likelihood approach for a single sequence, overall bounds on the risk of the method are found subject to membership of the unknown function in one of a wide range of Besov classes, covering also the case of f of bounded variation. The rates obtained are optimal for any value of the parameter p in (0, ∞], simultaneously for a wide range of loss functions, each dominating the Lq norm of the σth derivative, with σ ≥ 0 and 0 < q ≤ 2.

310 citations


Journal ArticleDOI
TL;DR: It is shown that recombination influences genetic diversity only at the level of recombination hotspots, and that hotspots have no influence on substitution rates, suggesting that they are too ephemeral on an evolutionary time scale to have a strong influence on broader scale patterns of base composition and long-term molecular evolution.
Abstract: In humans, the rate of recombination, as measured on the megabase scale, is positively associated with the level of genetic variation, as measured at the genic scale. Despite considerable debate, it is not clear whether these factors are causally linked or, if they are, whether this is driven by the repeated action of adaptive evolution or molecular processes such as double-strand break formation and mismatch repair. We introduce three innovations to the analysis of recombination and diversity: fine-scale genetic maps estimated from genotype experiments that identify recombination hotspots at the kilobase scale, analysis of an entire human chromosome, and the use of wavelet techniques to identify correlations acting at different scales. We show that recombination influences genetic diversity only at the level of recombination hotspots. Hotspots are also associated with local increases in GC content and the relative frequency of GC-increasing mutations but have no effect on substitution rates. Broad-scale association between recombination and diversity is explained through covariance of both factors with base composition. To our knowledge, these results are the first evidence of a direct and local influence of recombination hotspots on genetic variation and the fate of individual mutations. However, that hotspots have no influence on substitution rates suggests that they are too ephemeral on an evolutionary time scale to have a strong influence on broader scale patterns of base composition and long-term molecular evolution.
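
The scale-by-scale correlation idea can be sketched in a few lines: wavelet-transform two aligned signals and correlate their detail coefficients level by level. The synthetic signals, the Haar wavelet, and the use of the PyWavelets library are assumptions for illustration; this is not the analysis pipeline used in the paper.

```python
# A minimal sketch of using a discrete wavelet transform to examine correlation
# between two spatial signals scale by scale. The signals are synthetic stand-ins
# for, say, recombination rate and diversity along a chromosome.
import numpy as np
import pywt

rng = np.random.default_rng(1)
n = 1024
shared_fine = rng.standard_normal(n)            # fine-scale component shared by both signals
recomb = shared_fine + rng.standard_normal(n)   # "recombination rate" (synthetic)
diversity = shared_fine + rng.standard_normal(n)  # "diversity" (synthetic)
recomb = recomb + 2 * np.sin(np.linspace(0, 4 * np.pi, n))  # broad trend added to one signal only

# Haar wavelet decomposition; detail coefficients at level j capture variation
# at a scale of roughly 2**j positions. The loop runs from coarsest to finest.
levels = 6
d_recomb = pywt.wavedec(recomb, "haar", level=levels)[1:]     # drop the coarse approximation
d_div = pywt.wavedec(diversity, "haar", level=levels)[1:]

for j, (dr, dd) in enumerate(zip(d_recomb, d_div)):
    r = np.corrcoef(dr, dd)[0, 1]
    print(f"detail level {levels - j}: correlation of detail coefficients = {r:+.2f}")
```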

275 citations


Journal ArticleDOI
TL;DR: In this article, a functional adaptive model estimation (FAME) approach is proposed to model the relationship between a scalar, Y, and a functional predictor, X(t), which extends generalized linear models, generalized additive models, and projection pursuit regression (PPR) to handle functional predictors.
Abstract: In this article we are interested in modeling the relationship between a scalar, Y, and a functional predictor, X(t). We introduce a highly flexible approach called functional adaptive model estimation (FAME), which extends generalized linear models (GLMs), generalized additive models (GAMs), and projection pursuit regression (PPR) to handle functional predictors. The FAME approach can model any of the standard exponential family of response distributions that are assumed for GLM or GAM while maintaining the flexibility of PPR. For example, standard linear or logistic regression with functional predictors, as well as far more complicated models, can easily be applied using this approach. We use a functional principal components decomposition of the predictor functions to aid visualization of the relationship between X(t) and Y. We also show how the FAME procedure can be extended to deal with multiple functional and standard finite-dimensional predictors, possibly with missing data. We illustrate the FAME ...
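
As a concrete, much-simplified special case of regressing a scalar response on a functional predictor, the sketch below discretizes each curve, approximates functional principal components by ordinary PCA, and fits a logistic regression on the leading scores. The synthetic curves, the number of components, and the use of scikit-learn are illustrative assumptions; FAME itself fits a more flexible additive, projection-pursuit-style model.

```python
# A minimal sketch of functional regression via functional principal component scores
# (a linear special case, not the FAME algorithm itself). Data are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n, n_grid = 200, 100
t = np.linspace(0, 1, n_grid)

# Synthetic functional predictors: random combinations of two smooth basis curves plus noise.
scores_true = rng.standard_normal((n, 2))
basis = np.vstack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
X_curves = scores_true @ basis + 0.1 * rng.standard_normal((n, n_grid))

# Binary response driven by the first underlying score, so a GLM on FPC scores can recover it.
y = (scores_true[:, 0] + 0.5 * rng.standard_normal(n) > 0).astype(int)

# Functional PCA approximated by ordinary PCA on the discretized curves.
fpca = PCA(n_components=3)
fpc_scores = fpca.fit_transform(X_curves)

# GLM (logistic regression) on the functional principal component scores.
model = LogisticRegression().fit(fpc_scores, y)
print("explained variance ratios:", np.round(fpca.explained_variance_ratio_, 3))
print("in-sample accuracy:", model.score(fpc_scores, y))
```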

120 citations


Journal ArticleDOI
TL;DR: The EbayesThresh package in the S language implements a class of Empirical Bayes thresholding methods that can take advantage of possible sparsity in the sequence, to improve the quality of estimation.
Abstract: Suppose that a sequence of unknown parameters is observed subject to independent Gaussian noise. The EbayesThresh package in the S language implements a class of Empirical Bayes thresholding methods that can take advantage of possible sparsity in the sequence, to improve the quality of estimation. The prior for each parameter in the sequence is a mixture of an atom of probability at zero and a heavy-tailed density. Within the package, this can be either a Laplace (double exponential) density or else a mixture of normal distributions with tail behavior similar to the Cauchy distribution. The mixing weight, or sparsity parameter, is chosen automatically by marginal maximum likelihood. If estimation is carried out using the posterior median, this is a random thresholding procedure; the estimation can also be carried out using other thresholding rules with the same threshold, and the package provides the posterior mean, and hard and soft thresholding, as additional options. This paper reviews the method, and gives details (far beyond those previously published) of the calculations needed for implementing the procedures. It explains and motivates both the general methodology, and the use of the EbayesThresh package, through simulated and real data examples. When estimating the wavelet transform of an unknown function, it is appropriate to apply the method level by level to the transform of the observed data. The package can carry out these calculations for wavelet transforms obtained using various packages in R and S-PLUS. Details, including a motivating example, are presented, and the application of the method to image estimation is also explored. The final topic considered is the estimation of a single sequence that may become progressively sparser along the sequence. An iterated least squares isotone regression method allows for the choice of a threshold that depends monotonically on the order in which the observations are made. An alternative possibility, also discussed in detail, is a particular parametric dependence of the sparsity parameter on the position in the sequence.
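
The level-by-level workflow described above can be mimicked in Python: wavelet-transform the noisy observations, threshold each detail level separately, and invert the transform. For brevity a per-level universal threshold stands in for the threshold the empirical Bayes method would choose, and PyWavelets rather than the EbayesThresh package is used; the test signal and noise level are made up.

```python
# A minimal sketch of level-by-level wavelet thresholding (stand-in threshold rule,
# not the EbayesThresh package or its empirical Bayes threshold).
import numpy as np
import pywt

rng = np.random.default_rng(3)
n = 512
t = np.linspace(0, 1, n)
signal = np.piecewise(t, [t < 0.3, (t >= 0.3) & (t < 0.7), t >= 0.7], [0.0, 4.0, 1.0])
noisy = signal + 0.5 * rng.standard_normal(n)

coeffs = pywt.wavedec(noisy, "db4", level=5)        # [approximation, coarse details, ..., fine details]
sigma = 0.5                                          # noise level, assumed known here for simplicity

denoised_coeffs = [coeffs[0]]                        # keep the coarse approximation untouched
for detail in coeffs[1:]:
    thr = sigma * np.sqrt(2 * np.log(len(detail)))   # per-level universal threshold (stand-in rule)
    denoised_coeffs.append(pywt.threshold(detail, thr, mode="soft"))

estimate = pywt.waverec(denoised_coeffs, "db4")[:n]
print("rough RMSE before:", np.sqrt(np.mean((noisy - signal) ** 2)).round(3))
print("rough RMSE after: ", np.sqrt(np.mean((estimate - signal) ** 2)).round(3))
```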

88 citations