Dan Moldovan
Researcher at Google
Publications - 8
Citations - 561
Dan Moldovan is an academic researcher at Google. His research focuses on topics including source code and the Python programming language. He has an h-index of 6 and has co-authored 8 publications receiving 237 citations.
Papers
Posted Content
Underspecification Presents Challenges for Credibility in Modern Machine Learning
Alexander D'Amour, Katherine Heller, Dan Moldovan, Ben Adlam, Babak Alipanahi, Alex Beutel, Christina Chen, Jonathan Deaton, Jacob Eisenstein, Matthew D. Hoffman, Farhad Hormozdiari, Neil Houlsby, Shaobo Hou, Ghassen Jerfel, Alan Karthikesalingam, Mario Lucic, Yi-An Ma, Cory Y. McLean, Diana Mincu, Akinori Mitani, Andrea Montanari, Zachary Nado, Vivek T. Natarajan, Christopher Nielson, Thomas F. Osborne, Rajiv Raman, Kim Ramasamy, Rory Sayres, Jessica Schrouff, Martin G. Seneviratne, Shannon Sequeira, Harini Suresh, Victor Veitch, Max Vladymyrov, Xuezhi Wang, Kellie Webster, Steve Yadlowsky, Taedong Yun, Xiaohua Zhai, D. Sculley +39 more
TL;DR: This work shows the need to explicitly account for underspecification in modeling pipelines intended for real-world deployment, and demonstrates that the problem appears in a wide variety of practical ML pipelines across domains.
Posted Content
On Robustness and Transferability of Convolutional Neural Networks
Josip Djolonga, Jessica Yung, Michael Tschannen, Rob Romijnders, Lucas Beyer, Alexander Kolesnikov, Joan Puigcerver, Matthias Minderer, Alexander D'Amour, Dan Moldovan, Sylvain Gelly, Neil Houlsby, Xiaohua Zhai, Mario Lucic +13 more
TL;DR: It is found that increasing both the training set size and the model size significantly improves distributional-shift robustness, and that, perhaps surprisingly, simple changes to the preprocessing pipeline can significantly mitigate robustness issues in some cases.
Proceedings ArticleDOI
On Robustness and Transferability of Convolutional Neural Networks
Josip Djolonga, Jessica Yung, Michael Tschannen, Rob Romijnders, Lucas Beyer, Alexander Kolesnikov, Joan Puigcerver, Matthias Minderer, Alexander D'Amour, Dan Moldovan, Sylvain Gelly, Neil Houlsby, Xiaohua Zhai, Mario Lucic +13 more
TL;DR: In this article, the authors investigate the impact of pre-training data size, model scale, and the data preprocessing pipeline on the distributional-shift robustness of CNNs.
Proceedings Article
Tangent: Automatic differentiation using source-code transformation for dynamically typed array programming
TL;DR: Techniques from the field of automatic differentiation that can give researchers expressive power, performance, and strong usability are explored, including source-code transformation (SCT), flexible gradient surgery, efficient in-place array operations, and higher-order derivatives.
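The source-code-transformation idea behind Tangent can be illustrated with a toy derivative generator. The sketch below is illustrative only and is not Tangent's implementation (Tangent operates on live Python functions and supports far more of the language): it parses a function's source into an AST, applies symbolic differentiation rules for `+` and `*`, and compiles the transformed expression into a new derivative function.

```python
import ast

def grad(src):
    # Toy source-code-transformation autodiff (illustrative only, NOT
    # Tangent's implementation). `src` is the source of a one-argument
    # function whose body is a single `return` of an expression built
    # from +, *, the argument, and numeric constants.
    func = ast.parse(src).body[0]
    arg = func.args.args[0].arg

    def d(node):
        # Symbolic derivative of an AST expression with respect to `arg`.
        if isinstance(node, ast.Name):
            return ast.Constant(1.0 if node.id == arg else 0.0)
        if isinstance(node, ast.Constant):
            return ast.Constant(0.0)
        if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Add):
            return ast.BinOp(d(node.left), ast.Add(), d(node.right))
        if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Mult):
            # Product rule: (u * v)' = u' * v + u * v'
            return ast.BinOp(
                ast.BinOp(d(node.left), ast.Mult(), node.right),
                ast.Add(),
                ast.BinOp(node.left, ast.Mult(), d(node.right)),
            )
        raise NotImplementedError(ast.dump(node))

    deriv = ast.Expression(d(func.body[-1].value))
    code = compile(ast.fix_missing_locations(deriv), "<grad>", "eval")
    return lambda x: eval(code, {arg: x})

df = grad("def f(x):\n    return x * x + 3.0 * x")
print(df(2.0))  # derivative of x^2 + 3x at x = 2 -> 7.0
```

Because the derivative is itself ordinary Python source, the same transformation could be applied again, which is how an SCT approach naturally supports higher-order derivatives.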
Posted Content
AutoGraph: Imperative-style Coding with Graph-based Performance
Dan Moldovan, James M. Decker, Fei Wang, Andrew A. Johnson, Brian Lee, Zachary Nado, D. Sculley, Tiark Rompf, Alexander B. Wiltschko +8 more
TL;DR: This work describes how the use of staged programming in Python, via source-code transformation, offers a midpoint between graph-based and imperative library design patterns, capturing the benefits of both.
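The staging idea can be sketched with a minimal tracer. This toy (hypothetical names, not AutoGraph's implementation: AutoGraph goes further by rewriting Python source so that native control flow such as `if` and `while` also becomes graph operations) shows only the basic step of recording imperative-looking arithmetic into a graph that is then executed separately.

```python
# Toy illustration of staging imperative code into a graph.
# Arithmetic on Tracer objects is recorded as Node objects rather than
# executed eagerly; the resulting graph is interpreted afterwards.

class Node:
    def __init__(self, op, inputs, value=None):
        self.op, self.inputs, self.value = op, inputs, value

def _lift(x):
    # Wrap plain Python numbers as constant graph nodes.
    return x.node if isinstance(x, Tracer) else Node("const", [], x)

class Tracer:
    def __init__(self, node):
        self.node = node
    def __add__(self, other):
        return Tracer(Node("add", [self.node, _lift(other)]))
    def __mul__(self, other):
        return Tracer(Node("mul", [self.node, _lift(other)]))

def to_graph(fn):
    # Stage `fn` by calling it once with a symbolic input, then return
    # a function that evaluates the recorded graph on a concrete value.
    inp = Node("input", [])
    out = fn(Tracer(inp)).node
    def run(x):
        cache = {}
        def ev(n):
            if id(n) in cache:
                return cache[id(n)]
            if n.op == "input":
                v = x
            elif n.op == "const":
                v = n.value
            elif n.op == "add":
                v = ev(n.inputs[0]) + ev(n.inputs[1])
            else:  # "mul"
                v = ev(n.inputs[0]) * ev(n.inputs[1])
            cache[id(n)] = v
            return v
        return ev(out)
    return run

compiled = to_graph(lambda x: x * x + x * 3)
print(compiled(4))  # 4*4 + 4*3 -> 28
```

The limitation this sketch exposes is exactly the one AutoGraph addresses: a pure tracer cannot see Python `if`/`while` statements, which run once at trace time, so source-code transformation is needed to stage data-dependent control flow into the graph as well.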