Niki Parmar

Researcher at Google

Publications - 41
Citations - 66,115

Niki Parmar is an academic researcher at Google. The author has contributed to research on topics including the Transformer (machine learning model) and machine translation, has an h-index of 22, and has co-authored 39 publications receiving 31,763 citations. Previous affiliations of Niki Parmar include the University of Southern California.

Papers
Proceedings Article

Conformer: Convolution-augmented Transformer for Speech Recognition

TL;DR: Conformer combines convolutional neural networks and Transformers to model both the local and global dependencies of an audio sequence in a parameter-efficient way, achieving state-of-the-art accuracy.
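
For readers who want the shape of the idea, here is a minimal PyTorch sketch of a Conformer-style block: two half-step (Macaron-style) feed-forward modules sandwiching a self-attention module and a convolution module. The dimensions, kernel size, and the use of plain rather than relative-position attention are illustrative simplifications, not the paper's exact configuration.

```python
# A hedged sketch of a Conformer-style block; hyperparameters are illustrative.
import torch
import torch.nn as nn

class ConformerBlock(nn.Module):
    def __init__(self, d_model=144, n_heads=4, conv_kernel=31, ff_mult=4):
        super().__init__()
        # Half-step feed-forward modules sandwich the block (Macaron style).
        self.ff1 = nn.Sequential(
            nn.LayerNorm(d_model),
            nn.Linear(d_model, ff_mult * d_model), nn.SiLU(),
            nn.Linear(ff_mult * d_model, d_model),
        )
        self.norm_attn = nn.LayerNorm(d_model)
        # The paper uses relative positional attention; plain MHSA here for brevity.
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Convolution module: pointwise conv + GLU, depthwise conv, pointwise conv.
        self.norm_conv = nn.LayerNorm(d_model)
        self.conv = nn.Sequential(
            nn.Conv1d(d_model, 2 * d_model, 1), nn.GLU(dim=1),
            nn.Conv1d(d_model, d_model, conv_kernel,
                      padding=conv_kernel // 2, groups=d_model),  # depthwise
            nn.BatchNorm1d(d_model), nn.SiLU(),
            nn.Conv1d(d_model, d_model, 1),
        )
        self.ff2 = nn.Sequential(
            nn.LayerNorm(d_model),
            nn.Linear(d_model, ff_mult * d_model), nn.SiLU(),
            nn.Linear(ff_mult * d_model, d_model),
        )
        self.norm_out = nn.LayerNorm(d_model)

    def forward(self, x):                      # x: (batch, time, d_model)
        x = x + 0.5 * self.ff1(x)
        h = self.norm_attn(x)
        x = x + self.attn(h, h, h, need_weights=False)[0]
        h = self.norm_conv(x).transpose(1, 2)  # Conv1d expects (batch, d, time)
        x = x + self.conv(h).transpose(1, 2)
        x = x + 0.5 * self.ff2(x)
        return self.norm_out(x)
```

In the paper, stacks of such blocks are applied to a downsampled spectrogram; the convolution module captures local dependencies while self-attention captures global ones.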
Proceedings Article

Stand-Alone Self-Attention in Vision Models

TL;DR: The results establish that stand-alone self-attention is an important addition to the vision practitioner's toolbox and is especially impactful when used in later layers.
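
A hedged sketch of the core operation follows: pixel-wise local self-attention over a k x k neighborhood, which the paper uses as a drop-in replacement for spatial convolution. This single-head version omits the relative positional embeddings and multi-head split used in the paper; the class and parameter names are illustrative.

```python
# A minimal single-head local 2D self-attention layer (simplified sketch).
import torch
import torch.nn as nn
import torch.nn.functional as F

class LocalSelfAttention2d(nn.Module):
    def __init__(self, channels, kernel_size=7):
        super().__init__()
        self.k = kernel_size                       # neighborhood side length (odd)
        self.q = nn.Conv2d(channels, channels, 1)
        self.kv = nn.Conv2d(channels, 2 * channels, 1)

    def forward(self, x):                          # x: (B, C, H, W)
        B, C, H, W = x.shape
        q = self.q(x)                              # one query per pixel
        k, v = self.kv(x).chunk(2, dim=1)
        pad = self.k // 2
        # Extract k*k neighborhoods: (B, C*k*k, H*W) -> (B, C, k*k, H*W)
        k = F.unfold(k, self.k, padding=pad).view(B, C, self.k ** 2, H * W)
        v = F.unfold(v, self.k, padding=pad).view(B, C, self.k ** 2, H * W)
        q = q.view(B, C, 1, H * W)
        attn = (q * k).sum(dim=1, keepdim=True) / C ** 0.5   # (B, 1, k*k, H*W)
        attn = attn.softmax(dim=2)
        out = (attn * v).sum(dim=2)                # weighted sum over neighborhood
        return out.view(B, C, H, W)
```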
Proceedings Article

Tensor2Tensor for Neural Machine Translation

TL;DR: Tensor2Tensor is a library of deep learning models and datasets that is well suited to neural machine translation and includes the reference implementation of the state-of-the-art Transformer model.
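
Tensor2Tensor is typically driven from the command line; the sketch below wraps its documented t2t-trainer entry point from Python. The problem, model, and hparams values follow the library's published translation examples, but exact flag names can vary across versions, and the data directory must first be populated with t2t-datagen.

```python
# A hedged sketch of launching a Tensor2Tensor Transformer training run.
# Prerequisite (not shown): t2t-datagen to populate --data_dir for the problem.
import subprocess

subprocess.run([
    "t2t-trainer",
    "--problem=translate_ende_wmt32k",   # WMT English-German translation
    "--model=transformer",               # the reference Transformer implementation
    "--hparams_set=transformer_base",
    "--data_dir=./t2t_data",
    "--output_dir=./t2t_train",
    "--train_steps=250000",
], check=True)
```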
Posted Content

One Model To Learn Them All

TL;DR: Tasks with less data benefit substantially from joint training with other tasks, while performance on large tasks degrades only slightly if at all; adding a block to the model never hurts performance and in most cases improves it on all tasks.
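
A toy illustration of the joint-training recipe the summary describes: one shared trunk, small task-specific heads, and a task sampled per step. The task names, layer sizes, and random data are placeholders, not the paper's MultiModel with its modality nets and mixture-of-experts blocks.

```python
# A hedged toy sketch of multi-task joint training with a shared trunk.
import random
import torch
import torch.nn as nn

shared = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 128))
heads = nn.ModuleDict({                    # hypothetical task heads
    "translate": nn.Linear(128, 32000),    # e.g. target-vocabulary logits
    "parse":     nn.Linear(128, 128),
    "classify":  nn.Linear(128, 1000),     # e.g. image classes
})
opt = torch.optim.Adam(list(shared.parameters()) + list(heads.parameters()))

for step in range(1000):
    task = random.choice(list(heads))      # sample a task each step
    x = torch.randn(16, 64)                # stand-in for a real batch
    y = torch.randint(heads[task].out_features, (16,))
    loss = nn.functional.cross_entropy(heads[task](shared(x)), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```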
Proceedings Article

Scaling Local Self-Attention for Parameter Efficient Visual Backbones

TL;DR: Self-attention is shown to deliver encouraging improvements in accuracy-parameter trade-offs over baseline convolutional models such as ResNet-50.
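
A hedged sketch of blocked local self-attention in the spirit of this paper's approach: queries are grouped into non-overlapping blocks, and each block attends over its own block plus a surrounding "halo" of pixels, which keeps the operation parameter- and memory-efficient. The function name and the block and halo sizes are illustrative; multi-head structure and relative-position terms are omitted.

```python
# A simplified sketch of blocked local attention with a halo.
import torch
import torch.nn.functional as F

def halo_attention(q, k, v, block=4, halo=1):
    """q, k, v: (B, C, H, W) with H and W divisible by `block`."""
    B, C, H, W = q.shape
    win = block + 2 * halo                         # haloed window side length
    nb = (H // block) * (W // block)               # number of query blocks
    # Queries per block: (B, nb, block*block, C)
    q = F.unfold(q, block, stride=block)           # (B, C*block*block, nb)
    q = q.view(B, C, block * block, nb).permute(0, 3, 2, 1)
    # Keys/values from haloed windows around each block: (B, nb, win*win, C)
    k = F.unfold(k, win, stride=block, padding=halo)
    k = k.view(B, C, win * win, nb).permute(0, 3, 2, 1)
    v = F.unfold(v, win, stride=block, padding=halo)
    v = v.view(B, C, win * win, nb).permute(0, 3, 2, 1)
    attn = (q @ k.transpose(-2, -1)) / C ** 0.5    # (B, nb, block^2, win^2)
    out = attn.softmax(-1) @ v                     # (B, nb, block^2, C)
    # Fold the non-overlapping block outputs back onto the image grid.
    out = out.permute(0, 3, 2, 1).reshape(B, C * block * block, nb)
    return F.fold(out, (H, W), block, stride=block)
```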