Journal ArticleDOI

Pattern Recognition and Machine Learning

01 Aug 2007-Technometrics (Taylor & Francis)-Vol. 49, Iss: 3, pp 366-366
TL;DR: This book covers a broad range of topics for regular factorial designs, presents all of the material in a very mathematical fashion, and will surely become an invaluable resource for researchers and graduate students doing research in the design of factorial experiments.
Abstract: (2007). Pattern Recognition and Machine Learning. Technometrics: Vol. 49, No. 3, pp. 366-366.
Citations
Proceedings ArticleDOI
01 Jan 2009
TL;DR: This paper expresses the objective function purely in terms of the unknown segmentation, using higher-order cliques, and shows that the optimal decomposition for the problem can be computed efficiently via a parametric maxflow algorithm.
Abstract: Many interactive image segmentation approaches use an objective function which includes appearance models as an unknown variable. Since the resulting optimization problem is NP-hard the segmentation and appearance are typically optimized separately, in an EM-style fashion. One contribution of this paper is to express the objective function purely in terms of the unknown segmentation, using higher-order cliques. This formulation reveals an interesting bias of the model towards balanced segmentations. Furthermore, it enables us to develop a new dual decomposition optimization procedure, which provides additionally a lower bound. Hence, we are able to improve on existing optimizers, and verify that for a considerable number of real world examples we even achieve global optimality. This is important since we are able, for the first time, to analyze the deficiencies of the model. Another contribution is to establish a property of a particular dual decomposition approach which involves convex functions depending on foreground area. As a consequence, we show that the optimal decomposition for our problem can be computed efficiently via a parametric maxflow algorithm.
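
A hedged toy sketch of the dual decomposition mechanics the abstract relies on (not the paper's graph-cut formulation; the unary costs and the balance penalty below are made up): the energy is split into two subproblems coupled by Lagrange multipliers, each copy is solved exactly (the second, a sum of unary costs plus a convex function of the foreground area, is solved here by brute force over the area rather than by parametric maxflow), their sum gives a lower bound, and agreement of the two copies certifies global optimality.

```python
import numpy as np

def solve_unaries(costs):
    """min over binary x of sum_i costs[i] * x_i: take x_i = 1 wherever the cost is negative."""
    x = (costs < 0).astype(int)
    return x, float(costs[x == 1].sum())

def solve_unaries_plus_area(costs, area_penalty):
    """min over binary x of sum_i costs[i] * x_i + area_penalty(sum(x)).
    Brute force over the foreground area k; for each k the best choice is the k smallest costs
    (the paper solves this family exactly with parametric maxflow)."""
    order = np.argsort(costs)
    prefix = np.concatenate(([0.0], np.cumsum(costs[order])))
    vals = [prefix[k] + area_penalty(k) for k in range(len(costs) + 1)]
    k_best = int(np.argmin(vals))
    x = np.zeros(len(costs), dtype=int)
    x[order[:k_best]] = 1
    return x, float(vals[k_best])

def dual_decomposition(u1, u2, area_penalty, iters=500):
    """Maximise the dual lower bound of min_x E1(x) + E2(x) by splitting x into two copies."""
    lam = np.zeros_like(u1, dtype=float)
    best_lb = -np.inf
    for t in range(1, iters + 1):
        x1, v1 = solve_unaries(u1 + lam)                            # subproblem 1
        x2, v2 = solve_unaries_plus_area(u2 - lam, area_penalty)    # subproblem 2
        best_lb = max(best_lb, v1 + v2)     # v1 + v2 is a lower bound on the true optimum
        if np.array_equal(x1, x2):          # copies agree: certificate of global optimality
            return x1, best_lb
        lam += (1.0 / t) * (x1 - x2)        # subgradient ascent step on the multipliers
    return x2, best_lb

# toy "segmentation": 8 pixels, made-up unary costs plus a convex penalty on the foreground area
u1 = np.array([-2.0, -1.0, 0.5, 1.0, -0.5, 2.0, -1.5, 0.2])
u2 = np.array([0.3, -0.4, -0.2, 0.8, -0.6, 1.0, 0.4, -0.3])
balance = lambda k: 0.5 * (k - 4) ** 2
x, lb = dual_decomposition(u1, u2, balance)
print(x, lb)
```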

128 citations


Cites background from "Pattern Recognition and Machine Lea..."

  • ...(Note, it is well-known that MAP estimation with the GMM model is strictly speaking an ill-posed problem since by fitting a Gaussian to the color of a single pixel we may get an infinite likelihood - see [3], section 9....

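The singularity noted in this excerpt is easy to check numerically: a Gaussian component centred exactly on one pixel's value has density 1/(√(2π)·σ) at that point, which grows without bound as σ → 0. A tiny NumPy illustration (toy value, not from the paper):

```python
import numpy as np

x = 0.7                                        # colour value of a single pixel
for sigma in [1.0, 0.1, 1e-3, 1e-6]:
    dens = 1.0 / (np.sqrt(2 * np.pi) * sigma)  # N(x | mu = x, sigma^2) evaluated at its mean
    print(sigma, dens)                         # diverges as sigma -> 0
```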

Journal ArticleDOI
TL;DR: It is argued that the structural representationalist interpretation of generative and recognition models does not do justice to the role that these constructs play in active inference under the FEP, and an enactive interpretation of active inference is proposed.
Abstract: The aim of this article is to clarify how best to interpret some of the central constructs that underwrite the free-energy principle (FEP) – and its corollary, active inference – in theoretical neu...

128 citations


Cites background from "Pattern Recognition and Machine Lea..."

  • ...…models are structural representations rests on an oversimplified reading of these constructs, based in older Bayesian theories such as the Helmholtz machine (Dayan et al., 1995) and nonenactive appeals to variational Bayesian methods (Bishop, 2006), rather than on active inference under the FEP....


  • ..., 1995) and non-enactive appeals to variational Bayesian methods (Bishop, 2006), rather than on active inference under the FEP....


Journal ArticleDOI
TL;DR: The algorithm V-Bay, a variational Bayes algorithm for multiple-locus GWA analysis, is described; it is designed to identify weaker associations that may contribute to the missing heritability of complex diseases.
Abstract: Background: The success achieved by genome-wide association (GWA) studies in the identification of candidate loci for complex diseases has been accompanied by an inability to explain the bulk of heritability. Here, we describe the algorithm V-Bay, a variational Bayes algorithm for multiple locus GWA analysis, which is designed to identify weaker associations that may contribute to this missing heritability.

128 citations


Cites methods from "Pattern Recognition and Machine Lea..."

  • ...The specific type of variational method implemented in V-Bay is a mean-field approximation, where a high dimensional joint distribution of many variables (in this case genetic marker effects) is approximated by a product of many lower dimensional distributions [23]....


  • ...This is accomplished by taking the expectation of the log joint posterior density, with respect to each parameter’s density from the factorized form, and iterating until convergence [23]....


  • ...iterated through until convergence [23,27]....

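The excerpts above describe the generic mean-field recipe: factorize the posterior, update each factor from the expectation of the log joint density under the remaining factors, and iterate to convergence. Below is a minimal sketch of that recipe on the textbook univariate Gaussian model (unknown mean and precision with a Normal-Gamma prior), not V-Bay's genetic-marker model; names and hyperparameters are illustrative.

```python
import numpy as np

def cavi_gaussian(y, mu0=0.0, lam0=1.0, a0=1.0, b0=1.0, iters=50):
    """Mean-field VB for y_i ~ N(mu, 1/tau), mu ~ N(mu0, 1/(lam0*tau)), tau ~ Gamma(a0, b0).
    The posterior is approximated by q(mu) q(tau); each factor is refitted from the
    expected log joint under the other factor, iterating until convergence."""
    n, ybar = len(y), float(np.mean(y))
    e_tau = a0 / b0                          # initial guess for E[tau]
    a_n = a0 + (n + 1) / 2.0                 # fixed by the model
    for _ in range(iters):
        # q(mu) = N(mu_n, 1/lam_n)
        mu_n = (lam0 * mu0 + n * ybar) / (lam0 + n)
        lam_n = (lam0 + n) * e_tau
        # q(tau) = Gamma(a_n, b_n), using E[(y_i - mu)^2] = (y_i - mu_n)^2 + 1/lam_n
        e_data = np.sum((y - mu_n) ** 2) + n / lam_n
        e_prior = lam0 * ((mu_n - mu0) ** 2 + 1.0 / lam_n)
        b_n = b0 + 0.5 * (e_data + e_prior)
        e_tau = a_n / b_n
    return mu_n, lam_n, a_n, b_n

rng = np.random.default_rng(0)
y = rng.normal(2.0, 0.5, size=200)           # true mean 2, true precision 4
mu_n, lam_n, a_n, b_n = cavi_gaussian(y)
print(mu_n, a_n / b_n)                       # posterior mean of mu and E[tau]
```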

Journal ArticleDOI
TL;DR: Emotion research will leap forward when its focus changes from comparing averaged statistics of self-report data across people experiencing emotion in laboratories to characterizing patterns of data from individuals and clusters of similar individuals experiencing emotion in real life.
Abstract: Emotion research will leap forward when its focus changes from comparing averaged statistics of self-report data across people experiencing emotion in laboratories to characterizing patterns of data from individuals and clusters of similar individuals experiencing emotion in real life. Such an advance will come about through engineers and psychologists collaborating to create new ways for people to measure, share, analyze, and learn from objective emotional responses in situations that truly matter to people. This approach has the power to greatly advance the science of emotion while also providing personalized help to participants in the research.

128 citations


Cites background from "Pattern Recognition and Machine Lea..."

  • ...Is there a way to return to the meaningfulness of idiographic research while preserving the objectivity of nomothetic research? Increasingly, there is a way, using new technologies that allow ultradense objective measurements of individuals, coupled with pattern analysis tools that characterize not merely gross statistics like averages, but complex dynamic structures both within and across individuals (Bishop, 2006; Duda, Hart, & Stork, 2001; Jain & Dubes, 1988)....


  • ...…technologies that allow ultradense objective measurements of individuals, coupled with pattern analysis tools that characterize not merely gross statistics like averages, but complex dynamic structures both within and across individuals (Bishop, 2006; Duda, Hart, & Stork, 2001; Jain & Dubes, 1988)....


Journal ArticleDOI
TL;DR: The experimental results, analysis, and statistical tests demonstrate the ability of the proposed approach to improve prediction performance over all the base classifiers, the hybrid classifiers, and the traditional combination methods in terms of average accuracy, the area under the curve (AUC), the H-measure, and the Brier score.
Highlights: A new hybrid ensemble model for the credit scoring problem is proposed. An improved data filtering technique is developed based on the GNG method. GNG combined with MARS proved better than applying either method individually. The model is validated on four performance measures over seven credit datasets. Classifier decisions combined by consensus effectively improved prediction performance.
Abstract: During the last few years there has been marked attention towards the development of hybrid and ensemble systems, which have proved to be more accurate than single-classifier models. However, among the hybrid and ensemble models developed in the literature, little consideration has been given to: 1) combining data filtering and feature selection methods; 2) combining classifiers of different algorithms; and 3) exploring classifier output combination techniques other than the traditional ones found in the literature. In this paper, the aim is to improve predictive performance by presenting a new hybrid ensemble credit scoring model that combines two data pre-processing methods, based on Gabriel Neighbourhood Graph editing (GNG) and Multivariate Adaptive Regression Splines (MARS), in the hybrid modelling phase. In addition, a new classifier combination rule based on the consensus approach (ConsA) of different classification algorithms is proposed for the ensemble modelling phase. Several comparisons are carried out: 1) comparison of individual base classifiers with the GNG and MARS methods applied separately and combined, in order to choose the best results for the ensemble modelling phase; 2) comparison of the proposed approach with all the base classifiers and with ensemble classifiers using the traditional combination methods; and 3) comparison of the proposed approach with recent related studies in the literature. Five well-known base classifiers are used, namely neural networks (NN), support vector machines (SVM), random forests (RF), decision trees (DT), and naive Bayes (NB). The experimental results, analysis, and statistical tests demonstrate the ability of the proposed approach to improve prediction performance over all the base classifiers, the hybrid classifiers, and the traditional combination methods in terms of average accuracy, the area under the curve (AUC), the H-measure, and the Brier score. The model was validated over seven real-world credit datasets.
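
For orientation only, a hedged scikit-learn sketch of the simplest version of this setup: the same five base classifier families combined on synthetic data. It is not the paper's model: there is no GNG/MARS pre-processing, and the consensus (ConsA) combination rule is replaced by plain soft voting.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic, imbalanced stand-in for a credit dataset (the paper uses seven real credit datasets).
X, y = make_classification(n_samples=1000, n_features=20, weights=[0.8, 0.2], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# NN, SVM, RF, DT, and NB combined by simple soft voting instead of the paper's ConsA rule.
ensemble = VotingClassifier(
    estimators=[
        ("nn", MLPClassifier(max_iter=1000, random_state=0)),
        ("svm", SVC(probability=True, random_state=0)),
        ("rf", RandomForestClassifier(random_state=0)),
        ("dt", DecisionTreeClassifier(random_state=0)),
        ("nb", GaussianNB()),
    ],
    voting="soft",
)
print(ensemble.fit(X_tr, y_tr).score(X_te, y_te))
```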

127 citations


Cites background from "Pattern Recognition and Machine Lea..."

  • ...Bayesian classification is based on Bayesian theory and is a valuable measure when the input feature space is high-dimensional (Bishop, 2006)....

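The cited point is Bayes' theorem applied to classification; with a per-feature independence assumption the number of parameters stays linear in the dimension, which is what keeps it workable when the input feature space is high-dimensional. A minimal NumPy sketch of a Gaussian naive Bayes classifier (a generic illustration, not the exact classifier configuration used in the paper):

```python
import numpy as np

class GaussianNaiveBayes:
    """Class posterior via Bayes' theorem, assuming features are conditionally
    independent given the class and Gaussian within each class."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.priors_, self.means_, self.vars_ = [], [], []
        for c in self.classes_:
            Xc = X[y == c]
            self.priors_.append(len(Xc) / len(X))
            self.means_.append(Xc.mean(axis=0))
            self.vars_.append(Xc.var(axis=0) + 1e-9)    # variance floor for numerical stability
        return self

    def predict(self, X):
        scores = []
        for p, m, v in zip(self.priors_, self.means_, self.vars_):
            log_lik = -0.5 * np.sum(np.log(2 * np.pi * v) + (X - m) ** 2 / v, axis=1)
            scores.append(np.log(p) + log_lik)          # log prior + log likelihood per class
        return self.classes_[np.argmax(np.vstack(scores), axis=0)]

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 3)), rng.normal(2.0, 1.0, (50, 3))])
y = np.array([0] * 50 + [1] * 50)
print(GaussianNaiveBayes().fit(X, y).predict(X[:5]))
```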