Andrew Zisserman

Researcher at University of Oxford

Publications: 808
Citations: 312028

Andrew Zisserman is an academic researcher from the University of Oxford. The author has contributed to research in the topics of convolutional neural networks and real images. The author has an h-index of 167 and has co-authored 808 publications receiving 261717 citations. Previous affiliations of Andrew Zisserman include the University of Edinburgh and Microsoft.

Papers
Proceedings Article

A Plane Measuring Device.

TL;DR: In this article, an uncertainty analysis is presented for both the errors in image localization and the uncertainty in the imaging transformation; the analysis is valid when the system is overdetermined as well as when the minimum number of correspondences is used.
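
As an illustration of the kind of imaging transformation the paper analyses, the sketch below estimates a plane-to-image homography from point correspondences with the Direct Linear Transform, assuming numpy; with exactly four correspondences the system is minimal, and with more it is overdetermined and solved in a least-squares sense. This is only a hedged illustration, not the paper's code or its uncertainty analysis.

```python
# Minimal DLT homography sketch (illustrative, not the paper's method).
import numpy as np

def estimate_homography(world_pts, image_pts):
    """world_pts, image_pts: (N, 2) arrays of matching points, N >= 4."""
    assert len(world_pts) == len(image_pts) >= 4
    rows = []
    for (X, Y), (x, y) in zip(world_pts, image_pts):
        rows.append([-X, -Y, -1, 0, 0, 0, x * X, x * Y, x])
        rows.append([0, 0, 0, -X, -Y, -1, y * X, y * Y, y])
    A = np.asarray(rows, dtype=float)
    # The homography is the right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def map_to_plane(H, image_pt):
    """Back-project an image point onto the world plane using H^{-1}."""
    p = np.linalg.inv(H) @ np.array([image_pt[0], image_pt[1], 1.0])
    return p[:2] / p[2]
```
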
Proceedings ArticleDOI

Human pose search using deep poselets

TL;DR: This work introduces 'deep poselets' for pose-sensitive detection of various body parts. They are built on convolutional neural network (CNN) features and significantly outperform previous instantiations of Berkeley poselets.
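
A hedged sketch of the general recipe (not the paper's implementation): candidate windows are described by CNN-style features and scored against one linear classifier per poselet. The feature extractor here is a stand-in random projection, and all names and shapes are illustrative.

```python
# Illustrative poselet-style scoring over CNN-like features (assumed setup).
import numpy as np

rng = np.random.default_rng(0)

def cnn_features(window):
    # Placeholder for a CNN feature extractor (e.g. an fc-layer activation);
    # here a fixed random projection of the flattened window pixels.
    proj = np.random.default_rng(42).standard_normal((4096, window.size))
    return proj @ window.ravel()

def score_poselets(windows, poselet_weights, poselet_biases):
    """Score each candidate window against each linear poselet classifier."""
    feats = np.stack([cnn_features(w) for w in windows])   # (num_windows, 4096)
    return feats @ poselet_weights.T + poselet_biases      # (num_windows, num_poselets)

# Toy usage: 3 candidate 32x32 windows, 5 hypothetical poselet classifiers.
windows = [rng.standard_normal((32, 32)) for _ in range(3)]
W = rng.standard_normal((5, 4096))
b = rng.standard_normal(5)
print(score_poselets(windows, W, b).shape)  # (3, 5)
```
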
Journal ArticleDOI

Human pose search using deep networks

TL;DR: Two novel ways of representing the image of a person striking a pose are proposed, one looking for parts and the other looking at the whole image; these representations are then used for retrieval.
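
For context, pose search of this kind typically reduces to nearest-neighbour retrieval over fixed-length descriptors. The sketch below, assuming numpy and synthetic descriptors, shows cosine-similarity retrieval; it is an illustration, not the specific representations proposed in the paper.

```python
# Nearest-neighbour pose retrieval over precomputed descriptors (illustrative).
import numpy as np

def l2_normalize(x, axis=-1, eps=1e-12):
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

def retrieve(query_desc, database_descs, top_k=5):
    """Return indices and scores of the top_k most similar database poses."""
    q = l2_normalize(query_desc)
    db = l2_normalize(database_descs)
    sims = db @ q                          # cosine similarities
    order = np.argsort(-sims)[:top_k]
    return order, sims[order]

rng = np.random.default_rng(0)
database = rng.standard_normal((1000, 512))   # 1000 synthetic pose descriptors
query = rng.standard_normal(512)
idx, scores = retrieve(query, database)
print(idx, scores)
```
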
Proceedings ArticleDOI

QuerYD: A Video Dataset with High-Quality Text and Audio Narrations

TL;DR: The QuerYD dataset, introduced in this paper, is a large-scale dataset for retrieval and event localisation in videos. It contains highly detailed, temporally aligned audio and text annotations that can be used to train and benchmark strong retrieval models.
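
As a rough illustration of how retrieval on such a dataset is commonly benchmarked (not a detail taken from the paper), the sketch below computes Recall@K from a query-by-video similarity matrix, assuming each query's ground-truth video shares its index.

```python
# Recall@K for text-to-video retrieval (illustrative benchmark metric).
import numpy as np

def recall_at_k(similarities, k):
    """similarities: (num_queries, num_videos); query i matches video i."""
    ranks = np.argsort(-similarities, axis=1)        # best-first video indices
    gt = np.arange(similarities.shape[0])[:, None]
    hits = (ranks[:, :k] == gt).any(axis=1)
    return hits.mean()

rng = np.random.default_rng(0)
sims = rng.standard_normal((100, 100))               # synthetic similarity scores
for k in (1, 5, 10):
    print(f"R@{k}: {recall_at_k(sims, k):.3f}")
```
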
Posted Content

Perceiver IO: A General Architecture for Structured Inputs & Outputs

TL;DR: Perceiver IO, introduced in this paper, learns to flexibly query the model's latent space to produce outputs of arbitrary size and semantics, and achieves state-of-the-art results on tasks with highly structured output spaces.
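
A hedged, NumPy-only sketch of the decoding idea described above: a set of output queries cross-attends to the latent array, so the number and semantics of the outputs are set by the queries rather than by the input size. Names and shapes are illustrative, not the reference implementation.

```python
# Output-query cross-attention in the spirit of Perceiver IO decoding (sketch).
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cross_attend(queries, latents, wq, wk, wv):
    """queries: (num_outputs, d_q); latents: (num_latents, d_l)."""
    q = queries @ wq                         # (num_outputs, d)
    k = latents @ wk                         # (num_latents, d)
    v = latents @ wv                         # (num_latents, d)
    attn = softmax(q @ k.T / np.sqrt(q.shape[-1]))
    return attn @ v                          # one output vector per query

rng = np.random.default_rng(0)
latents = rng.standard_normal((256, 512))    # latent array after the encoder
queries = rng.standard_normal((17, 128))     # e.g. 17 task-specific output queries
wq = rng.standard_normal((128, 64))
wk = rng.standard_normal((512, 64))
wv = rng.standard_normal((512, 64))
print(cross_attend(queries, latents, wq, wk, wv).shape)  # (17, 64)
```
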