
Sadeep Jayasumana

Researcher at Google

Publications -  30
Citations -  5216

Sadeep Jayasumana is an academic researcher at Google. His research topics include deep learning and conditional random fields. He has an h-index of 14 and has co-authored 27 publications receiving 4,500 citations. His previous affiliations include NICTA and the University of Oxford.

Papers
Proceedings ArticleDOI

Conditional Random Fields as Recurrent Neural Networks

TL;DR: A new form of convolutional neural network is introduced that combines the strengths of Convolutional Neural Networks (CNNs) with Conditional Random Field (CRF)-based probabilistic graphical modelling; it obtains top results on the challenging Pascal VOC 2012 segmentation benchmark.
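The core of the CRF-as-RNN formulation is mean-field inference unrolled as a fixed number of recurrent iterations. The sketch below is a minimal numpy illustration of that loop under simplifying assumptions: the function names are hypothetical, and the dense `affinity` matrix stands in for the efficient Gaussian filtering the paper actually uses over pixels.

```python
import numpy as np

def softmax(x):
    """Row-wise softmax (the CRF normalisation step)."""
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def mean_field_crf(unary, affinity, n_iters=5):
    """Mean-field CRF inference unrolled like an RNN.

    unary:    (N, L) unary potentials, e.g. per-site CNN logits.
    affinity: (N, N) pairwise affinities between sites (a toy dense
              stand-in for the paper's Gaussian filtering).
    Each iteration = message passing -> compatibility transform ->
    adding unaries -> normalisation, the steps mapped to CNN layers.
    """
    q = softmax(unary)
    # Potts-style compatibility: penalise assigning different labels
    # to strongly connected sites.
    compat = 1.0 - np.eye(unary.shape[1])
    for _ in range(n_iters):
        msg = affinity @ q                 # message passing (filtering)
        pairwise = msg @ compat            # label compatibility transform
        q = softmax(unary - pairwise)      # local update + normalisation
    return q
```

Because every step is differentiable, the whole loop can be trained end-to-end with the CNN that produces the unaries, which is the paper's central point.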
Proceedings ArticleDOI

Kernel Methods on the Riemannian Manifold of Symmetric Positive Definite Matrices

TL;DR: To encode the geometry of the manifold in the mapping, a family of provably positive definite kernels on the Riemannian manifold of SPD matrices is introduced; these kernels are derived from the Gaussian kernel but exploit different metrics on the manifold.
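One of the metrics for which such a kernel is provably positive definite is the log-Euclidean metric, where the distance between two SPD matrices is the Frobenius distance between their matrix logarithms. A minimal numpy sketch (function names are illustrative, not from the paper's code):

```python
import numpy as np

def spd_log(M):
    """Matrix logarithm of an SPD matrix via eigendecomposition.
    SPD matrices have real positive eigenvalues, so this is well defined."""
    w, V = np.linalg.eigh(M)
    return (V * np.log(w)) @ V.T

def log_euclidean_gaussian_kernel(X, Y, sigma=1.0):
    """Gaussian RBF kernel on SPD matrices with the log-Euclidean metric:
    k(X, Y) = exp(-||log(X) - log(Y)||_F^2 / (2 sigma^2))."""
    d = np.linalg.norm(spd_log(X) - spd_log(Y), "fro")
    return np.exp(-d**2 / (2.0 * sigma**2))
```

Plugging in a geodesic distance that does not yield a positive definite kernel (e.g. the affine-invariant metric, for general sigma) would break kernel-machine guarantees, which is why the choice of metric matters here.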
Posted Content

Long-tail learning via logit adjustment

TL;DR: The paper revisits the classic idea of logit adjustment based on label frequencies, applied either post hoc to a trained model or enforced in the loss during training, to encourage a large relative margin between the logits of rare and dominant labels.
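The post-hoc variant is a one-line correction: subtract a scaled log of each class's prior frequency from its logit, so rare classes need a smaller raw score to win the argmax. A minimal sketch, assuming access to the empirical class priors (the function name and `tau` default are illustrative):

```python
import numpy as np

def posthoc_logit_adjustment(logits, class_priors, tau=1.0):
    """Post-hoc logit adjustment for long-tail classification.

    logits:       (..., L) raw model scores.
    class_priors: (L,) empirical label frequencies, summing to 1.
    tau:          scaling factor; tau = 0 recovers the unadjusted logits.
    Subtracting tau * log(prior) boosts rare classes relative to
    dominant ones at prediction time, with no retraining.
    """
    return logits - tau * np.log(class_priors)
```

Example: with logits `[2.0, 1.2]` and priors `[0.99, 0.01]`, the raw argmax picks the dominant class 0, while the adjusted argmax picks the rare class 1.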
Journal ArticleDOI

Kernel Methods on Riemannian Manifolds with Gaussian RBF Kernels

TL;DR: It is shown that many popular algorithms designed for Euclidean spaces, such as support vector machines, discriminant analysis, and principal component analysis, can be generalized to Riemannian manifolds with the help of such positive definite Gaussian kernels.
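The reason Euclidean algorithms carry over is that kernel machines only ever touch the Gram matrix, and a positive definite manifold kernel yields a valid Gram matrix. The sketch below (numpy only; all names are illustrative) builds a log-Euclidean Gaussian Gram matrix over a set of SPD matrices and runs kernel PCA on it, one of the generalizations the paper discusses:

```python
import numpy as np

def spd_log(M):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return (V * np.log(w)) @ V.T

def gaussian_gram(mats, sigma=1.0):
    """Gram matrix of the log-Euclidean Gaussian kernel over SPD matrices."""
    logs = [spd_log(M) for M in mats]
    n = len(mats)
    K = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            d = np.linalg.norm(logs[i] - logs[j])
            K[i, j] = np.exp(-d**2 / (2.0 * sigma**2))
    return K

def kernel_pca(K, n_components=2):
    """Standard kernel PCA on a precomputed Gram matrix."""
    n = K.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J                       # centre in feature space
    w, V = np.linalg.eigh(Kc)
    idx = np.argsort(w)[::-1][:n_components]
    # Projected coordinates: eigenvectors scaled by sqrt(eigenvalue).
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))
```

Because the kernel is positive definite, the Gram matrix has no negative eigenvalues (up to numerical noise), so kernel PCA, SVMs, and discriminant analysis all remain well posed on the manifold-valued data.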