William A. P. Smith
Researcher at University of York
Publications - 202
Citations - 5631
William A. P. Smith is an academic researcher at the University of York. His research topics include statistical models and facial recognition systems. He has an h-index of 35, having co-authored 198 publications receiving 4489 citations. Previous affiliations of William A. P. Smith include Imperial College London and Daresbury Laboratory.
Papers
Book Chapter
“Look Ma, No Landmarks!” – Unsupervised, Model-Based Dense Face Alignment
TL;DR: This paper shows how to train an image-to-image network to predict dense correspondence between a face image and a 3D morphable model, using only the model itself for supervision. It further shows that both geometric and photometric parameters can be inferred directly from the correspondence map using linear least squares and a novel inverse spherical harmonic lighting model.
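The photometric step above relies on a standard trick: under a Lambertian reflectance assumption, image intensity is linear in a low-order spherical harmonic (SH) lighting basis evaluated at the surface normals, so the lighting coefficients can be recovered by linear least squares. The sketch below is illustrative only (the 9-term second-order SH basis and the function names `sh_basis` / `fit_sh_lighting` are assumptions, not the paper's actual inverse lighting model):

```python
import numpy as np

def sh_basis(normals):
    """Unnormalised second-order spherical harmonic basis (9 terms)
    evaluated at unit surface normals, shape (N, 3) -> (N, 9)."""
    x, y, z = normals[:, 0], normals[:, 1], normals[:, 2]
    return np.stack([
        np.ones_like(x),          # constant term
        x, y, z,                  # first-order terms
        x * y, x * z, y * z,      # second-order cross terms
        x**2 - y**2,              # second-order difference term
        3 * z**2 - 1,             # second-order zonal term
    ], axis=1)

def fit_sh_lighting(normals, intensities):
    """Recover 9 SH lighting coefficients l from the linear model
    B(n) @ l ~ I by ordinary linear least squares."""
    B = sh_basis(normals)
    coeffs, *_ = np.linalg.lstsq(B, intensities, rcond=None)
    return coeffs
```

Given per-pixel normals from the fitted model and the observed grey values, one call to `fit_sh_lighting` yields the illumination estimate; the same linear-least-squares structure is what makes the geometric parameters recoverable in closed form as well.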
Proceedings Article
Functional Faces: Groupwise Dense Correspondence Using Functional Maps
TL;DR: This paper proposes a groupwise variant of functional maps for computing dense correspondence between a set of 3D face meshes, and shows how a functional map provides a geometric constraint that can be used to filter feature matches between non-rigidly deforming surfaces.
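A functional map represents correspondence not point-to-point but as a small matrix C mapping coefficients of functions expressed in a truncated basis on one shape (typically Laplacian eigenfunctions) to the basis of another. Given matching descriptor functions on both shapes, C is found by least squares. The following is a minimal sketch of that core fitting step, assuming orthonormal basis columns and the illustrative name `fit_functional_map` (the paper's groupwise formulation and match-filtering constraint are not reproduced here):

```python
import numpy as np

def fit_functional_map(basis_src, basis_dst, desc_src, desc_dst):
    """Estimate a functional map C such that C @ A ~ B, where A and B
    are the descriptor functions projected into each shape's truncated
    basis (e.g. Laplace-Beltrami eigenfunctions)."""
    A = np.linalg.pinv(basis_src) @ desc_src   # (k, d) source coefficients
    B = np.linalg.pinv(basis_dst) @ desc_dst   # (k, d) target coefficients
    # Least-squares solution for the k-by-k map C
    C = B @ np.linalg.pinv(A)
    return C
```

Because C lives in a low-dimensional basis, checking whether a candidate feature match is consistent with C is cheap, which is what makes the filtering of matches between non-rigidly deforming surfaces practical.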
Journal Article
Molecular dynamics simulations of (001) MgO surface contacts: effects of tip structures and surface matching
TL;DR: In this paper, molecular dynamics simulations were carried out to study contact behavior of MgO pyramidal tips with atomically flat surfaces in the (001) direction, and it was found that detailed contact and withdrawal behavior depended very much upon the local geometrical structure of the contacting tips.
Journal Article
Gender discriminating models from facial surface normals
TL;DR: The classification accuracy, which is as high as 97%, demonstrates the effectiveness of using facial shape information for gender classification.
Journal Article
A shadow constrained conditional generative adversarial net for SRTM data restoration
TL;DR: Experimental results validate the superiority of the SCGAN over the comparison methods, i.e., interpolation, a convolutional neural network (CNN), and the baseline CGAN, for SRTM data restoration.