Journal ArticleDOI

Pattern Recognition and Machine Learning

01 Aug 2007-Technometrics (Taylor & Francis)-Vol. 49, Iss: 3, pp 366-366
TL;DR: This book covers a broad range of topics for regular factorial designs, presents all of the material in a very mathematical fashion, and will surely become an invaluable resource for researchers and graduate students doing research in the design of factorial experiments.
Abstract: (2007). Pattern Recognition and Machine Learning. Technometrics: Vol. 49, No. 3, pp. 366-366.
Citations
Journal ArticleDOI
TL;DR: It is shown that machine learning plays a key role in many radiology applications, and the performance of machine learning-based automatic detection and diagnosis systems has been shown to be comparable to that of a well-trained and experienced radiologist.

526 citations


Cites background or methods from "Pattern Recognition and Machine Lea..."


  • ...So in this section we will give a concise introduction to the most important topics of machine learning (Bishop, 2006)....


  • ...Machine learning is the study of computer algorithms which can learn complex relationships or patterns from empirical data and make accurate decisions (Bishop, 2006; Duda et al., 2000; Mitchell, 1997)....


  • ...Given a probability model, maximum likelihood (ML) and MAP were widely used to obtain a point estimation in parametric space (Bishop, 2006)....


Posted Content
TL;DR: In this article, the authors explore the use of neural networks as an alternative to GPs to model distributions over functions, and show that performing adaptive basis function regression with a neural network as the parametric form performs competitively with state-of-the-art GP-based approaches, but scales linearly with the number of data rather than cubically.
Abstract: Bayesian optimization is an effective methodology for the global optimization of functions with expensive evaluations. It relies on querying a distribution over functions defined by a relatively cheap surrogate model. An accurate model for this distribution over functions is critical to the effectiveness of the approach, and is typically fit using Gaussian processes (GPs). However, since GPs scale cubically with the number of observations, it has been challenging to handle objectives whose optimization requires many evaluations, and as such, massively parallelizing the optimization. In this work, we explore the use of neural networks as an alternative to GPs to model distributions over functions. We show that performing adaptive basis function regression with a neural network as the parametric form performs competitively with state-of-the-art GP-based approaches, but scales linearly with the number of data rather than cubically. This allows us to achieve a previously intractable degree of parallelism, which we apply to large scale hyperparameter optimization, rapidly finding competitive models on benchmark object recognition tasks using convolutional networks, and image caption generation using neural language models.
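The scaling claim in this abstract can be illustrated with a small sketch: Bayesian linear regression on a fixed set of D basis functions costs O(N·D²) in the number of observations N, versus O(N³) for an exact GP. The RBF feature map below is only a stand-in for the hidden layer of a trained neural network, and all names and parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def phi(x, centers, width=0.1):
    # Stand-in RBF basis functions; in the actual method a neural net's
    # hidden units would play this role.
    return np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2 * width ** 2))

def bayes_linreg(Phi, y, alpha=1.0, beta=25.0):
    # Posterior over weights w ~ N(m, S), with
    #   S^-1 = alpha*I + beta * Phi^T Phi,   m = beta * S Phi^T y
    # Cost is O(N * D^2): linear in the number of observations N.
    D = Phi.shape[1]
    S = np.linalg.inv(alpha * np.eye(D) + beta * Phi.T @ Phi)
    m = beta * S @ Phi.T @ y
    return m, S

def predict(Phi_star, m, S, beta=25.0):
    # Predictive mean and variance at new inputs.
    mean = Phi_star @ m
    var = 1.0 / beta + np.sum(Phi_star @ S * Phi_star, axis=1)
    return mean, var

# Toy usage: noisy samples of sin(2*pi*x) on [0, 1].
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 200)
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(200)
centers = np.linspace(0, 1, 20)
m, S = bayes_linreg(phi(x, centers), y)
mean, var = predict(phi(np.array([0.25]), centers), m, S)
```

Because only the D×D matrix is inverted, adding observations grows the cost linearly, which is what makes the massive parallelism described in the abstract feasible.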

524 citations

Journal ArticleDOI
10 Dec 2008-Neuron
TL;DR: This study reconstructed visual images by combining local image bases of multiple scales, whose contrasts were independently decoded from fMRI activity by automatically selecting relevant voxels and exploiting their correlated patterns.

522 citations


Cites background or methods from "Pattern Recognition and Machine Lea..."

  • ...Our classification model is based on multinomial logistic regression (Bishop, 2006), in which each contrast class has a linear discriminant function that calculates the weighted sum of the inputs (voxel values)....


  • ...The sparse parameter estimation could avoid overfitting to noisy training data by pruning irrelevant voxels (Bishop, 2006), and thereby help to achieve high generalization (test) performance (Yamashita et al., 2008)....

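The multinomial logistic regression described in the first excerpt above, one linear discriminant per class whose weighted sums are turned into probabilities by a softmax, can be sketched as follows. Variable names and the toy data are illustrative, not taken from the paper.

```python
import numpy as np

def softmax(a):
    a = a - a.max(axis=1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(a)
    return e / e.sum(axis=1, keepdims=True)

def fit(X, y, n_classes, lr=0.1, n_iter=500):
    # Gradient descent on the multiclass cross-entropy; each class k gets
    # a linear discriminant a_k = w_k^T x + b_k.
    N, D = X.shape
    W = np.zeros((D, n_classes))
    b = np.zeros(n_classes)
    Y = np.eye(n_classes)[y]            # one-hot targets
    for _ in range(n_iter):
        P = softmax(X @ W + b)          # predicted class probabilities
        G = P - Y                       # cross-entropy gradient w.r.t. logits
        W -= lr * X.T @ G / N
        b -= lr * G.mean(axis=0)
    return W, b

# Toy usage: two well-separated Gaussian blobs instead of voxel patterns.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
W, b = fit(X, y, n_classes=2)
pred = softmax(X @ W + b).argmax(axis=1)
```

The sparse variant mentioned in the second excerpt would additionally drive most weight columns to zero, pruning uninformative inputs (voxels) to improve generalization.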

Posted Content
TL;DR: It is shown that a heuristic called the minimum information constraint, previously shown to mitigate over-regularisation in VAEs, can also be applied to improve unsupervised clustering performance in this variant of the variational autoencoder model with a Gaussian mixture as a prior distribution.
Abstract: We study a variant of the variational autoencoder model (VAE) with a Gaussian mixture as a prior distribution, with the goal of performing unsupervised clustering through deep generative models. We observe that the known problem of over-regularisation that has been shown to arise in regular VAEs also manifests itself in our model and leads to cluster degeneracy. We show that a heuristic called minimum information constraint that has been shown to mitigate this effect in VAEs can also be applied to improve unsupervised clustering performance with our model. Furthermore we analyse the effect of this heuristic and provide an intuition of the various processes with the help of visualizations. Finally, we demonstrate the performance of our model on synthetic data, MNIST and SVHN, showing that the obtained clusters are distinct, interpretable and result in achieving competitive performance on unsupervised clustering to the state-of-the-art results.
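The generative side of the model this abstract describes, a VAE whose prior is a Gaussian mixture, can be sketched in a few lines: draw a cluster assignment, draw a latent code from that cluster's Gaussian, then decode. The `decode` function below is a placeholder (the abstract does not specify the decoder network), and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
K, latent_dim = 3, 2
pi = np.full(K, 1.0 / K)                 # mixture weights over clusters
mu = rng.normal(0, 3, (K, latent_dim))   # per-cluster latent means
sigma = np.ones((K, latent_dim))         # per-cluster latent std devs

def decode(z):
    # Placeholder for the trained decoder network.
    return np.tanh(z)

def sample(n):
    # Ancestral sampling from the Gaussian-mixture prior, then decoding.
    ks = rng.choice(K, size=n, p=pi)                            # cluster draw
    z = mu[ks] + sigma[ks] * rng.standard_normal((n, latent_dim))  # latent draw
    return ks, decode(z)

ks, x = sample(500)
```

The cluster degeneracy discussed in the abstract corresponds to the learned `mu` values collapsing onto each other during training, which the minimum information constraint is used to counteract.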

521 citations

Journal ArticleDOI
TL;DR: A novel method to infer microbial community ecology directly from time-resolved metagenomics is presented, extending generalized Lotka–Volterra dynamics to account for external perturbations and suggests a subnetwork of bacterial groups implicated in protection against C. difficile.
Abstract: The intestinal microbiota is a microbial ecosystem of crucial importance to human health. Understanding how the microbiota confers resistance against enteric pathogens and how antibiotics disrupt that resistance is key to the prevention and cure of intestinal infections. We present a novel method to infer microbial community ecology directly from time-resolved metagenomics. This method extends generalized Lotka–Volterra dynamics to account for external perturbations. Data from recent experiments on antibiotic-mediated Clostridium difficile infection is analyzed to quantify microbial interactions, commensal-pathogen interactions, and the effect of the antibiotic on the community. Stability analysis reveals that the microbiota is intrinsically stable, explaining how antibiotic perturbations and C. difficile inoculation can produce catastrophic shifts that persist even after removal of the perturbations. Importantly, the analysis suggests a subnetwork of bacterial groups implicated in protection against C. difficile. Due to its generality, our method can be applied to any high-resolution ecological time-series data to infer community structure and response to external stimuli.
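The extended dynamics this abstract describes, generalized Lotka–Volterra growth plus an external perturbation term, can be sketched with forward-Euler integration: dx_i/dt = x_i (mu_i + Σ_j A_ij x_j + Σ_p B_ip u_p(t)). All parameter values below are illustrative, not inferred from the paper's data.

```python
import numpy as np

def glv_step(x, mu, A, B, u, dt=0.01):
    # One forward-Euler step of the perturbed gLV system:
    # per-capita growth = intrinsic rate + interactions + perturbation effect.
    growth = mu + A @ x + B @ u
    return x + dt * x * growth

rng = np.random.default_rng(0)
n = 3
mu = np.array([1.0, 0.8, 1.2])                        # intrinsic growth rates
A = -np.eye(n) + 0.05 * rng.standard_normal((n, n))   # interaction matrix
B = np.array([[-0.5], [-0.3], [0.1]])                 # susceptibility to perturbation
x = np.full(n, 0.5)                                   # initial abundances

for t in range(2000):
    # A transient "antibiotic" pulse, mimicking the perturbation in the study.
    u = np.array([1.0]) if 500 <= t < 700 else np.array([0.0])
    x = glv_step(x, mu, A, B, u)
```

With self-limiting diagonal terms in `A` the community relaxes back toward a stable equilibrium after the pulse; the paper's inference problem runs in the opposite direction, estimating `mu`, `A`, and `B` from observed time series.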

521 citations