James Bergstra
Researcher at Université de Montréal
Publications - 45
Citations - 22978
James Bergstra is an academic researcher from Université de Montréal. The author has contributed to research in topics: Python (programming language) & Feature learning. The author has an h-index of 28 and has co-authored 45 publications receiving 18522 citations. Previous affiliations of James Bergstra include Harvard University & University of Waterloo.
Papers
Journal Article
Random search for hyper-parameter optimization
James Bergstra, Yoshua Bengio +1 more
TL;DR: This paper shows empirically and theoretically that randomly chosen trials are more efficient for hyper-parameter optimization than trials on a grid, and shows that random search is a natural baseline against which to judge progress in the development of adaptive (sequential) hyper-parameter optimization algorithms.
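The comparison the paper draws can be sketched in a few lines: give grid search and random search the same trial budget on a toy objective. The objective function, its optimum, and the search ranges below are assumptions for illustration only; the paper's experiments used real neural network training runs.

```python
import random

# Toy stand-in for validation error of a model with two hyper-parameters
# (learning rate and L2 penalty). Assumed for illustration.
def validation_error(lr, l2):
    return (lr - 1e-2) ** 2 + (l2 - 1e-4) ** 2

# Grid search: 8 x 8 = 64 trials on a log-spaced grid.
grid_lr = [10 ** (-4 + 4 * i / 7) for i in range(8)]
grid_l2 = [10 ** (-6 + 4 * i / 7) for i in range(8)]
grid_best = min(validation_error(lr, l2) for lr in grid_lr for l2 in grid_l2)

# Random search: the same budget of 64 trials, with each hyper-parameter
# drawn log-uniformly and independently.
random.seed(0)
rand_best = min(
    validation_error(10 ** random.uniform(-4, 0), 10 ** random.uniform(-6, -2))
    for _ in range(64)
)

print(f"grid best error:   {grid_best:.2e}")
print(f"random best error: {rand_best:.2e}")
```

The paper's key observation is that when only a few hyper-parameters matter, a grid wastes most of its budget repeating the same few values of the important ones, while random search covers each dimension with all 64 distinct values.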
Proceedings Article
Algorithms for Hyper-Parameter Optimization
TL;DR: This work contributes novel techniques for making response surface models P(y|x) in which many elements of hyper-parameter assignment (x) are known to be irrelevant given particular values of other elements.
Posted Content
Theano: A Python framework for fast computation of mathematical expressions
Rami Al-Rfou, Guillaume Alain, Amjad Almahairi, Christof Angermueller, Dzmitry Bahdanau, Nicolas Ballas, Frédéric Bastien, Justin Bayer, Anatoly Belikov, Alexander Belopolsky, Yoshua Bengio, Arnaud Bergeron, James Bergstra, Valentin Bisson, Josh Bleecher Snyder, Nicolas Bouchard, Nicolas Boulanger-Lewandowski, Xavier Bouthillier, Alexandre de Brébisson, Olivier Breuleux, Pierre Luc Carrier, Kyunghyun Cho, Jan Chorowski, Paul F. Christiano, Tim Cooijmans, Marc-Alexandre Côté, Myriam Côté, Aaron Courville, Yann N. Dauphin, Olivier Delalleau, Julien Demouth, Guillaume Desjardins, Sander Dieleman, Laurent Dinh, Mélanie Ducoffe, Vincent Dumoulin, Samira Ebrahimi Kahou, Dumitru Erhan, Ziye Fan, Orhan Firat, Mathieu Germain, Xavier Glorot, Ian Goodfellow, Matthew M. Graham, Caglar Gulcehre, Philippe Hamel, Iban Harlouchet, Jean-Philippe Heng, Balázs Hidasi, Sina Honari, Arjun Jain, Sébastien Jean, Kai Jia, Mikhail Korobov, Vivek Kulkarni, Alex Lamb, Pascal Lamblin, Eric Larsen, César Laurent, Sean Lee, Simon Lefrancois, Simon Lemieux, Nicholas Léonard, Zhouhan Lin, Jesse A. Livezey, Cory Lorenz, Jeremiah Lowin, Qianli Ma, Pierre-Antoine Manzagol, Olivier Mastropietro, Robert T. McGibbon, Roland Memisevic, Bart van Merriënboer, Vincent Michalski, Mehdi Mirza, Alberto Orlandi, Chris Pal, Razvan Pascanu, Mohammad Pezeshki, Colin Raffel, Daniel Renshaw, Matthew Rocklin, Adriana Romero, Markus Roth, Peter Sadowski, John Salvatier, François Savard, Jan Schlüter, John Schulman, Gabriel Schwartz, Iulian Vlad Serban, Dmitriy Serdyuk, Samira Shabanian, Étienne Simon, Sigurd Spieckermann, S. Ramana Subramanyam, Jakub Sygnowski, Jérémie Tanguay, Gijs van Tulder, Joseph Turian, Sebastian Urban, Pascal Vincent, Francesco Visin, Harm de Vries, David Warde-Farley, Dustin J. Webb, Matthew Willson, Kelvin Xu, Lijun Xue, Li Yao, Saizheng Zhang, Ying Zhang +111 more
TL;DR: The performance of Theano is compared against Torch7 and TensorFlow on several machine learning models and recently-introduced functionalities and improvements are discussed.
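Theano's central design, which the frameworks it is benchmarked against share, is to build a symbolic expression graph first and only then compile it into an executable function. The sketch below illustrates that define-then-compile idea in plain Python; the class and function names here are invented for illustration and are not Theano's API.

```python
# Toy symbolic-expression graph in the spirit of Theano's design.
# Names (Var, compile_fn) are hypothetical, not Theano's API.
class Var:
    def __init__(self, name=None, op=None, inputs=()):
        self.name, self.op, self.inputs = name, op, inputs

    def __add__(self, other):
        return Var(op="add", inputs=(self, other))

    def __mul__(self, other):
        return Var(op="mul", inputs=(self, other))

def compile_fn(inputs, output):
    # "Compile" the graph into a callable by recursively evaluating
    # each node against the bound input values.
    def evaluate(node, env):
        if node.op is None:          # leaf: a named input variable
            return env[node.name]
        a, b = (evaluate(i, env) for i in node.inputs)
        return a + b if node.op == "add" else a * b
    names = [v.name for v in inputs]
    return lambda *args: evaluate(output, dict(zip(names, args)))

x, y = Var("x"), Var("y")
f = compile_fn([x, y], x * y + x)   # symbolic: f(x, y) = x*y + x
print(f(2.0, 3.0))                  # → 8.0
```

In Theano itself, having the whole graph available before execution is what enables the optimizations and GPU code generation that the benchmarks measure.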
Proceedings Article
Making a Science of Model Search: Hyperparameter Optimization in Hundreds of Dimensions for Vision Architectures
TL;DR: This work proposes a meta-modeling approach to support automated hyperparameter optimization, with the goal of providing practical tools that replace hand-tuning with a reproducible and unbiased optimization process.
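The search spaces this line of work optimizes over are conditional ("tree-structured"): which hyper-parameters exist depends on earlier choices, such as which model family is selected. A minimal sketch of sampling from such a space is below; the particular space (classifier choice and its parameters) is an assumption for illustration.

```python
import random

# Hypothetical conditional search space: the classifier choice
# determines which further hyper-parameters are drawn at all.
def sample_config(rng):
    config = {"classifier": rng.choice(["svm", "mlp"])}
    if config["classifier"] == "svm":
        config["C"] = 10 ** rng.uniform(-3, 3)
    else:
        config["hidden_units"] = rng.choice([64, 128, 256])
        config["learning_rate"] = 10 ** rng.uniform(-4, -1)
    return config

rng = random.Random(0)
configs = [sample_config(rng) for _ in range(3)]
for c in configs:
    print(c)
```

A meta-model such as the one the paper proposes replaces the blind sampler with one that proposes configurations predicted to score well, while still respecting the conditional structure.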
Posted Content
Theano: new features and speed improvements
Frédéric Bastien, Pascal Lamblin, Razvan Pascanu, James Bergstra, Ian Goodfellow, Arnaud Bergeron, Nicolas Bouchard, David Warde-Farley, Yoshua Bengio +8 more
TL;DR: New features and efficiency improvements to Theano are presented, along with benchmarks demonstrating Theano's performance relative to Torch7, a recently introduced machine learning library, and to RNNLM, a C++ library targeted at recurrent neural networks.