Sarah Adel Bargal
Researcher at Boston University
Publications: 44
Citations: 2,170
Sarah Adel Bargal is an academic researcher at Boston University. Her work spans computer science and artificial neural networks. She has an h-index of 15 and has co-authored 42 publications receiving 1,390 citations. Her previous affiliations include the Istituto Italiano di Tecnologia.
Papers
Journal ArticleDOI
Top-Down Neural Attention by Excitation Backprop
TL;DR: A new backpropagation scheme, called Excitation Backprop, is proposed to pass top-down signals down the network hierarchy via a probabilistic Winner-Take-All process; the concept of contrastive attention is also introduced to make the top-down attention maps more discriminative.
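The probabilistic Winner-Take-All idea can be illustrated with a minimal sketch: at each layer, a parent unit's "winner" probability is redistributed to its children in proportion to each child's activation times the positive (excitatory) connection weight, then normalized. This is a simplified, hypothetical illustration for fully connected layers, not the authors' implementation; the layer/weight layout below is an assumption.

```python
import numpy as np

def excitation_backprop(activations, weights, top_prior):
    """Propagate top-down winner probabilities to the bottom layer.

    activations: list of 1-D activation vectors, bottom layer first
    weights:     weights[l][i, j] connects layer-l unit i to layer-(l+1) unit j
    top_prior:   probability distribution over the top layer's units
    """
    p = top_prior
    for l in reversed(range(len(weights))):
        a = activations[l]                    # child-layer activations
        w_pos = np.maximum(weights[l], 0.0)   # keep excitatory connections only
        z = a[:, None] * w_pos                # unnormalized child scores per parent
        z_sum = z.sum(axis=0, keepdims=True)
        z_sum[z_sum == 0] = 1.0               # guard against empty columns
        cond = z / z_sum                      # P(child wins | parent won)
        p = cond @ p                          # marginal winner probs at layer l
    return p
```

Because each column of `cond` sums to one, the result stays a probability distribution; visualizing it over the input layer yields a top-down attention map.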
Journal ArticleDOI
Moments in Time Dataset: One Million Videos for Event Understanding
Mathew Monfort,Carl Vondrick,Aude Oliva,Alex Andonian,Bolei Zhou,Kandan Ramakrishnan,Sarah Adel Bargal,Tom Yan,Lisa M. Brown,Quanfu Fan,Dan Gutfreund +10 more
TL;DR: The Moments in Time dataset, a large-scale human-annotated collection of one million short videos corresponding to dynamic events unfolding within three seconds, can serve as a new challenge to develop models that scale to the level of complexity and abstract reasoning that a human processes on a daily basis.
Posted Content
Moments in Time Dataset: one million videos for event understanding
Mathew Monfort,Bolei Zhou,Sarah Adel Bargal,Alex Andonian,Tom Yan,Kandan Ramakrishnan,Lisa M. Brown,Quanfu Fan,Dan Gutfreund,Carl Vondrick,Aude Oliva +10 more
TL;DR: The Moments in Time dataset is a large-scale human-annotated collection of one million short videos corresponding to dynamic events unfolding within three seconds; each video is tagged with one action or activity label among 339 different classes.
Proceedings ArticleDOI
Emotion recognition in the wild from videos using images
TL;DR: This paper presents the implementation details of the proposed solution to the Emotion Recognition in the Wild 2016 Challenge, in the category of video-based emotion recognition; the solution achieves 59.42% validation accuracy, improving on the competition baseline of 38.81%.
Proceedings ArticleDOI
MIHash: Online Hashing with Mutual Information
TL;DR: This paper proposes an efficient quality measure for hash functions based on an information-theoretic quantity, mutual information, and uses it successfully as a criterion to eliminate unnecessary hash table updates; it also develops a novel hashing method, MIHash, that can be used in both online and batch settings.
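The mutual-information quality measure can be sketched as follows: for a query, compute Hamming distances from its hash code to all other codes, and measure the mutual information I(D; N) between the distance D and the neighbor indicator N (same label or not). A good hash function separates the neighbor and non-neighbor distance distributions, so I(D; N) is high. This is a hypothetical, simplified estimator over binary codes, not the paper's exact online formulation.

```python
import numpy as np

def _entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def mi_quality(codes, labels, query_idx):
    """Estimate I(D; N) for one query over a database of binary codes.

    codes:  (n, b) array of binary codes in {0, 1}
    labels: (n,) class labels defining the neighbor relation
    """
    q = codes[query_idx]
    mask = np.arange(len(codes)) != query_idx
    d = (codes[mask] != q).sum(axis=1)           # Hamming distances to query
    neighbor = labels[mask] == labels[query_idx]  # neighbor indicator N
    bins = np.arange(codes.shape[1] + 2)          # one bin per distance 0..b

    p_d = np.histogram(d, bins=bins)[0] / len(d)
    h_d = _entropy(p_d)                           # H(D)

    h_d_given_n = 0.0                             # H(D | N)
    for flag in (True, False):
        sel = d[neighbor == flag]
        if len(sel) == 0:
            continue
        p = np.histogram(sel, bins=bins)[0] / len(sel)
        h_d_given_n += (len(sel) / len(d)) * _entropy(p)

    return h_d - h_d_given_n                      # I(D; N) in bits
```

If the distance distribution is unchanged between neighbors and non-neighbors, the measure is near zero; when an update would not raise it, the hash table update can be skipped.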