Samyak Parajuli
Researcher at University of California, Berkeley
Publications - 7
Citations - 773
Samyak Parajuli is an academic researcher from the University of California, Berkeley. The author has contributed to research on the topics of Artificial neural network and Computer science, has an h-index of 3, and has co-authored 6 publications receiving 188 citations.
Papers
Posted Content
The Many Faces of Robustness: A Critical Analysis of Out-of-Distribution Generalization
Dan Hendrycks,Steven Basart,Norman Mu,Saurav Kadavath,Frank Wang,Evan Dorundo,Rahul Desai,Tyler Zhu,Samyak Parajuli,Mike Guo,Dawn Song,Jacob Steinhardt,Justin Gilmer +12 more
TL;DR: It is found that using larger models and artificial data augmentations can improve robustness on real-world distribution shifts, contrary to claims in prior work.
Proceedings Article
The Many Faces of Robustness: A Critical Analysis of Out-of-Distribution Generalization
Dan Hendrycks,Steven Basart,Norman Mu,Saurav Kadavath,Frank Wang,Evan Dorundo,Rahul Desai,Tyler Zhu,Samyak Parajuli,Mike Guo,Dawn Song,Jacob Steinhardt,Justin Gilmer +12 more
TL;DR: In this article, the authors introduce four new real-world distribution shift datasets consisting of changes in image style, image blurriness, geographic location, camera operation, and more.
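The augmentation finding above lends itself to a short illustration. Below is a minimal sketch of the kind of aggressive synthetic augmentation pipeline the paper evaluates, using torchvision's built-in AugMix transform as a stand-in (the paper's own DeepAugment method has no torchvision equivalent); the crop size and severity are illustrative assumptions, not settings from the paper.

```python
import torch
from torchvision import transforms

# Sketch of a robustness-oriented training pipeline. AugMix, one of the
# augmentations studied in the paper, ships with torchvision >= 0.13.
train_transform = transforms.Compose([
    transforms.RandomResizedCrop(224),   # illustrative input size
    transforms.AugMix(severity=3),       # randomized chains of simple augmentations
    transforms.ToTensor(),
])
```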
Posted Content
Inter-Level Cooperation in Hierarchical Reinforcement Learning
Abdul Rahman Kreidieh,Samyak Parajuli,Nathan Lichtle,Yiling You,Rayyan Nasr,Alexandre M. Bayen +5 more
TL;DR: It is hypothesized that improved cooperation between the internal agents of a hierarchy simplifies the credit assignment problem from the perspective of the high-level policies, yielding significant training improvements in tasks where intricate sequences of action primitives must be executed to improve performance.
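For context, a generic goal-conditioned two-level hierarchy of the kind this line of work builds on can be sketched in a few lines of PyTorch. All module names and dimensions below are hypothetical, and the sketch shows only one possible route to inter-level cooperation (keeping the goal differentiable so a worker-side loss can also update the manager); it is not the paper's exact mechanism.

```python
import torch
from torch import nn

class HighLevelPolicy(nn.Module):
    """Hypothetical manager: maps an observation to a goal for the worker."""
    def __init__(self, obs_dim: int, goal_dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(obs_dim, 64), nn.Tanh(), nn.Linear(64, goal_dim))

    def forward(self, obs: torch.Tensor) -> torch.Tensor:
        return self.net(obs)

class LowLevelPolicy(nn.Module):
    """Hypothetical worker: acts on the observation conditioned on the goal."""
    def __init__(self, obs_dim: int, goal_dim: int, act_dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(obs_dim + goal_dim, 64), nn.Tanh(), nn.Linear(64, act_dim))

    def forward(self, obs: torch.Tensor, goal: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([obs, goal], dim=-1))

high, low = HighLevelPolicy(8, 2), LowLevelPolicy(8, 2, 3)
obs = torch.randn(1, 8)
goal = high(obs)         # the goal stays differentiable w.r.t. the manager,
action = low(obs, goal)  # so gradients from the worker can flow back up
```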
Posted Content
Generalized Ternary Connect: End-to-End Learning and Compression of Multiplication-Free Deep Neural Networks
TL;DR: Generalized Ternary Connect (GTC) is proposed; it allows an arbitrary number of quantization levels while eliminating multiplications by restricting the parameters to integer powers of two, and it achieves superior compression with accuracy similar to several state-of-the-art methods for neural network compression.
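The power-of-two restriction is easy to illustrate. Below is a minimal, hypothetical forward quantizer that snaps each weight to a signed integer power of two, so every multiply can become a bit shift; it omits GTC's end-to-end training of the quantization levels, and the function name and level count are assumptions.

```python
import torch

def power_of_two_quantize(w: torch.Tensor, n_levels: int = 8) -> torch.Tensor:
    # Hypothetical sketch: round each weight magnitude to the nearest power of
    # two, keeping only the top n_levels exponents. Resulting values are +/-2^k,
    # so multiplication reduces to a bit shift in fixed-point hardware.
    sign = torch.sign(w)
    exp = torch.round(torch.log2(w.abs().clamp(min=1e-8)))
    max_exp = int(exp.max())
    exp = exp.clamp(min=max_exp - (n_levels - 1), max=max_exp)
    return sign * torch.pow(2.0, exp)

w = torch.randn(4, 4)
print(power_of_two_quantize(w))  # entries like 0.5, -0.25, 1.0, ...
```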
Posted Content
Dynamically Throttleable Neural Networks (TNN)
TL;DR: This work presents a runtime throttleable neural network (TNN) that can adaptively self-regulate its own performance target and computing resources; the TNN is designed with several properties that enable flexible dynamic execution based on runtime context.
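A throttling mechanism of this general kind can be sketched with a layer whose active width is chosen at inference time. The class and parameter names below are hypothetical, and the sketch omits the paper's learned controller and gating scheme.

```python
import torch
import torch.nn.functional as F
from torch import nn

class ThrottleableLinear(nn.Module):
    """Hypothetical sketch: a linear layer whose active width is a runtime knob."""
    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)

    def forward(self, x: torch.Tensor, utilization: float) -> torch.Tensor:
        k = max(1, int(utilization * self.fc.out_features))    # active output units
        y = F.linear(x, self.fc.weight[:k], self.fc.bias[:k])  # compute only k units
        return F.pad(y, (0, self.fc.out_features - k))         # zero-pad so shapes stay fixed

layer = ThrottleableLinear(16, 32)
x = torch.randn(4, 16)
cheap = layer(x, utilization=0.25)  # low-power mode: 8 of 32 units computed
full = layer(x, utilization=1.0)    # full-capacity mode
```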