Matthew Thorpe
Researcher at University of Manchester
Publications - 39
Citations - 1496
Matthew Thorpe is an academic researcher at the University of Manchester. He has contributed to research topics including semi-supervised learning and graph (abstract data type), has an h-index of 14, and has co-authored 34 publications receiving 724 citations. Previous affiliations of Matthew Thorpe include the University of Warwick and the University of Cambridge.
Papers
Journal ArticleDOI
Common pitfalls and recommendations for using machine learning to detect and prognosticate for COVID-19 using chest radiographs and CT scans
Michael S. Roberts, Derek Driggs, Matthew Thorpe, Julian D. Gilbey, Michael Yeung, Stephan Ursprung, Angelica I. Aviles-Rivero, Christian Etmann, Cathal McCague, Lucian Beer, Jonathan R. Weir-McCall, Zhongzhao Teng, Effrossyni Gkrania-Klotsas, James H.F. Rudd, Evis Sala, Carola-Bibiane Schönlieb +17 more
TL;DR: The authors find that none of the models identified are of potential clinical use due to methodological flaws and/or underlying biases, a major weakness given the urgency with which validated COVID-19 models are needed.
Journal ArticleDOI
Optimal Mass Transport: Signal processing and machine-learning applications
TL;DR: A practical overview of the mathematical underpinnings of mass-transport-related methods, including numerical implementation, is provided, along with a review, with demonstrations, of several applications.
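As a hypothetical illustration of the transport distances this overview covers (not code from the paper itself), the 1-D p-Wasserstein distance between two empirical measures with equally many samples reduces to matching sorted samples, the monotone rearrangement:

```python
import numpy as np

def wasserstein_1d(x, y, p=2):
    """p-Wasserstein distance between two 1-D empirical measures
    with the same number of samples: the optimal transport plan
    simply pairs the i-th smallest sample of x with the i-th
    smallest sample of y."""
    xs, ys = np.sort(x), np.sort(y)
    return (np.mean(np.abs(xs - ys) ** p)) ** (1.0 / p)

# Two point clouds differing by a constant shift: the distance is the shift.
a = np.array([0.0, 1.0, 2.0])
b = a + 3.0
print(wasserstein_1d(a, b))  # 3.0
```

In higher dimensions no such closed form exists and one must solve a linear program or use entropic regularization, which is part of what makes the numerical-implementation material in the overview relevant.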
Journal ArticleDOI
Analysis of $p$-Laplacian Regularization in Semisupervised Learning
Dejan Slepčev, Matthew Thorpe +1 more
TL;DR: The authors investigate a family of regression problems in a semi-supervised setting, where the task is to assign real-valued labels to a set of sample points given only a small training subset of labeled points.
Posted Content
Deep Limits of Residual Neural Networks
Matthew Thorpe, Yves van Gennip +1 more
TL;DR: The variational analysis provides a discrete-to-continuum $\Gamma$-convergence result: the objective function of the residual neural network training step converges to a variational problem constrained by a system of ordinary differential equations, rigorously connecting the discrete setting to a continuum problem.
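The discrete-to-continuum connection rests on a standard observation, sketched here as a hypothetical illustration (the function names and the toy dynamics are not from the paper): a residual update x_{k+1} = x_k + h f(x_k) is a forward-Euler step of the ODE dx/dt = f(x), so increasing depth refines the time discretisation.

```python
def resnet_flow(x0, f, depth, T=1.0):
    """Apply `depth` residual blocks x <- x + h*f(x), which is
    forward-Euler integration of dx/dt = f(x) on [0, T] with
    step size h = T/depth."""
    h = T / depth
    x = x0
    for _ in range(depth):
        x = x + h * f(x)  # one residual block
    return x

# Toy dynamics dx/dt = -x, whose exact flow at T=1 is x0 * e^{-1}.
f = lambda x: -x
for depth in (10, 100, 1000):
    print(depth, resnet_flow(1.0, f, depth))
```

As the depth grows the output approaches e^{-1} ≈ 0.3679, the value of the continuum flow, which is the intuition behind taking deep limits of residual networks.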
Journal ArticleDOI
Large data and zero noise limits of graph-based semi-supervised learning algorithms
TL;DR: Scalings in which the graph Laplacian approaches a differential operator in the large-graph limit are used to develop an understanding of a number of algorithms for semi-supervised learning; in particular, the extensions to this graph setting of the probit algorithm, level-set methods, and kriging are studied.
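To make the graph-based setting concrete, here is a minimal sketch (an assumption for illustration, not the paper's algorithm) of the simplest such method, Laplace learning: labels on a few nodes are extended harmonically to the rest by solving a linear system in the graph Laplacian.

```python
import numpy as np

def laplace_learning(W, labeled_idx, labels):
    """Graph-based semi-supervised learning: given a weight matrix W,
    extend the known labels to unlabeled nodes by solving
    L_uu u = -L_ul f, where L = D - W is the graph Laplacian.
    The solution is harmonic (Laplacian-zero) on unlabeled nodes."""
    n = W.shape[0]
    L = np.diag(W.sum(axis=1)) - W  # unnormalised graph Laplacian
    unlabeled = [i for i in range(n) if i not in labeled_idx]
    f = np.array(labels, dtype=float)
    u = np.zeros(n)
    u[labeled_idx] = f
    A = L[np.ix_(unlabeled, unlabeled)]
    b = -L[np.ix_(unlabeled, labeled_idx)] @ f
    u[unlabeled] = np.linalg.solve(A, b)
    return u

# Path graph 0-1-2-3 with the endpoints labeled 0 and 1:
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
u = laplace_learning(W, [0, 3], [0.0, 1.0])
print(u)  # interior nodes interpolate linearly: [0, 1/3, 2/3, 1]
```

On a path graph the harmonic extension is linear interpolation, which matches the continuum intuition in the paper: as the graph grows, the graph Laplacian behaves like a differential operator and these discrete solutions approach solutions of a PDE.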