
Andreas Damianou

Researcher at University of Oxford

Publications: 69
Citations: 3315

Andreas Damianou is an academic researcher at the University of Oxford. His research focuses on topics including Gaussian processes and inference. He has an h-index of 24, having co-authored 67 publications that have received 2,347 citations. His previous affiliations include the University of Leicester and Amazon.com.

Papers
Proceedings Article

Deep Gaussian Processes

TL;DR: Deep Gaussian process (GP) models are introduced; model selection via the variational bound shows that a five-layer hierarchy is justified even when modelling a digit data set containing only 150 examples.
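The core construction, composing GP layers so that the output of one becomes the input of the next, can be illustrated in a few lines. A toy numpy sketch with an RBF kernel (function names and settings here are illustrative, not from the paper's code):

```python
import numpy as np

def rbf_kernel(X, Z, variance=1.0, lengthscale=1.0):
    # Squared-exponential covariance between the rows of X and Z.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Z**2, 1)[None, :] - 2 * X @ Z.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sample_gp_layer(X, rng, jitter=1e-6):
    # Draw one function sample f ~ GP(0, k) evaluated at the inputs X.
    K = rbf_kernel(X, X) + jitter * np.eye(len(X))
    L = np.linalg.cholesky(K)
    return L @ rng.standard_normal((len(X), 1))

rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 50)[:, None]
# Deep GP prior: the output of one GP layer is the input of the next,
# so the composite function is richer than any single GP draw.
h = sample_gp_layer(X, rng)   # layer 1: X -> h
f = sample_gp_layer(h, rng)   # layer 2: h -> f
```

Stacking more layers simply repeats the second step; the paper's contribution is the variational machinery that makes such hierarchies trainable rather than this sampling view.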
Proceedings ArticleDOI

Variational Information Distillation for Knowledge Transfer

TL;DR: The authors propose an information-theoretic framework that formulates knowledge transfer as maximising the mutual information between the teacher and student networks; compared against existing approaches on both knowledge distillation and transfer learning tasks, their method consistently outperforms them.
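With a Gaussian variational distribution, the mutual-information objective described above reduces to a heteroscedastic regression loss from student features to teacher features. A minimal numpy sketch of that idea (names and shapes are assumptions for illustration, not the paper's implementation):

```python
import numpy as np

def vid_loss(teacher_feat, student_pred, log_var):
    # Negative Gaussian log-likelihood of teacher features under
    # q(t | s) = N(student_pred, exp(log_var)); minimising it maximises
    # a variational lower bound on I(teacher; student) (constants dropped).
    var = np.exp(log_var)
    return float(np.mean(0.5 * (log_var + (teacher_feat - student_pred) ** 2 / var)))

rng = np.random.default_rng(1)
t = rng.standard_normal((32, 64))           # teacher features (batch, units)
s = t + 0.1 * rng.standard_normal(t.shape)  # a student that tracks the teacher
loss = vid_loss(t, s, log_var=np.zeros(64))  # small residual -> low loss
```

A student whose features carry no information about the teacher's would incur a much larger loss, which is exactly the signal the distillation objective exploits.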
Journal ArticleDOI

Nonlinear information fusion algorithms for data-efficient multi-fidelity modelling.

TL;DR: Puts forth a probabilistic framework, based on Gaussian process regression and nonlinear autoregressive schemes, that can learn complex, nonlinear, space-dependent cross-correlations between models of variable fidelity and effectively safeguards against low-fidelity models that provide wrong trends.
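The nonlinear autoregressive idea, training the high-fidelity GP on inputs augmented with the low-fidelity model's output, can be sketched with a toy fidelity pair. A minimal numpy illustration (the kernel, lengthscales, and test functions are assumptions for the sketch, not the paper's setup):

```python
import numpy as np

def rbf(X, Z, ls=0.1):
    # Squared-exponential kernel between the rows of X and Z.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Z**2, 1)[None, :] - 2 * X @ Z.T
    return np.exp(-0.5 * d2 / ls**2)

def gp_predict(Xtr, ytr, Xte, noise=1e-4):
    # Posterior mean of a zero-mean GP with an RBF kernel.
    K = rbf(Xtr, Xtr) + noise * np.eye(len(Xtr))
    return rbf(Xte, Xtr) @ np.linalg.solve(K, ytr)

# Toy fidelity pair: a cheap, biased model and an expensive, accurate one.
f_lo = lambda x: np.sin(8 * x)
f_hi = lambda x: (x - np.sqrt(2)) * np.sin(8 * x) ** 2

X_lo = np.linspace(0, 1, 30)[:, None]; y_lo = f_lo(X_lo)   # many cheap runs
X_hi = np.linspace(0, 1, 8)[:, None];  y_hi = f_hi(X_hi)   # few expensive runs

Xte = np.linspace(0, 1, 100)[:, None]
m_lo = gp_predict(X_lo, y_lo, Xte)   # GP for the low-fidelity level
# Autoregressive step: the high-fidelity GP sees [x, f_lo(x)] as input,
# so it can learn a nonlinear cross-correlation between the two fidelities.
aug_tr = np.hstack([X_hi, gp_predict(X_lo, y_lo, X_hi)])
aug_te = np.hstack([Xte, m_lo])
m_hi = gp_predict(aug_tr, y_hi, aug_te)
```

Because the dependence on the low-fidelity output is learned by the GP rather than fixed to be linear, a low-fidelity model with the wrong trend can in principle be down-weighted, which is the safeguard the abstract refers to.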
Journal ArticleDOI

Variational inference for latent variables and uncertain inputs in Gaussian processes

TL;DR: A Bayesian method for training GP-LVMs that introduces a non-standard variational inference framework, allowing the latent variables to be approximately integrated out so that the model can be trained by maximising an analytic lower bound on the exact marginal likelihood.
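The flavour of such a bound, an expected GP likelihood under a Gaussian q(X) over the uncertain latent inputs minus a KL regulariser, can be illustrated with a Monte Carlo stand-in for the paper's analytic lower bound (all names and settings here are illustrative assumptions):

```python
import numpy as np

def rbf(X, Z, ls=1.0):
    # Squared-exponential kernel between the rows of X and Z.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Z**2, 1)[None, :] - 2 * X @ Z.T
    return np.exp(-0.5 * d2 / ls**2)

def gp_log_marginal(X, Y, noise=1e-2):
    # log p(Y | X) for D independent GP output dimensions sharing one kernel.
    N, D = Y.shape
    L = np.linalg.cholesky(rbf(X, X) + noise * np.eye(N))
    a = np.linalg.solve(L.T, np.linalg.solve(L, Y))
    return float(-0.5 * np.sum(Y * a)
                 - D * (np.sum(np.log(np.diag(L))) + 0.5 * N * np.log(2 * np.pi)))

def mc_elbo(Y, mu, log_s2, rng, n_samples=10):
    # E_q[log p(Y | X)] by sampling X ~ q(X) = N(mu, s2), minus the
    # analytic KL(q(X) || N(0, I)) for a factorised Gaussian q.
    s = np.exp(0.5 * log_s2)
    like = np.mean([gp_log_marginal(mu + s * rng.standard_normal(mu.shape), Y)
                    for _ in range(n_samples)])
    kl = 0.5 * np.sum(np.exp(log_s2) + mu**2 - 1.0 - log_s2)
    return like - kl

rng = np.random.default_rng(0)
Y = rng.standard_normal((15, 3))            # observed data: 15 points, 3 dims
mu = 0.1 * rng.standard_normal((15, 2))     # variational means of 2-d latents
log_s2 = np.full((15, 2), -2.0)             # variational log-variances
elbo = mc_elbo(Y, mu, log_s2, rng)
```

The paper's contribution is that, for particular kernels, the expectation over q(X) admits a closed form, so no sampling is needed; the Monte Carlo version above only conveys the structure of the objective.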