Andrew Patterson
Researcher at University of Alberta
Publications - 18
Citations - 312
Andrew Patterson is an academic researcher from the University of Alberta. The author has contributed to research in topics including Computer science and Temporal difference learning, has an h-index of 8, and has co-authored 13 publications receiving 243 citations. Previous affiliations of Andrew Patterson include Indiana University.
Papers
Proceedings Article
Supervised autoencoders: Improving generalization performance with unsupervised regularizers
TL;DR: This work theoretically and empirically analyzes supervised autoencoders, neural networks that jointly predict both the targets and the inputs (reconstruction error), and provides a novel generalization result for linear autoencoders, proving uniform stability based on the inclusion of the reconstruction error.
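The idea described above can be illustrated with a minimal linear supervised autoencoder in numpy. This is a hedged sketch, not the authors' code: the data, hyperparameters, and variable names are all invented for illustration. A shared linear encoding is trained to minimize a joint loss, supervised error plus reconstruction error, where the reconstruction term acts as the regularizer the abstract refers to.

```python
import numpy as np

# Illustrative linear supervised autoencoder (NOT the paper's implementation).
# Shared code h = X @ W_enc feeds two heads: a reconstruction head (h @ W_dec)
# and a supervised head (h @ w_out). Both losses are minimized jointly.
rng = np.random.default_rng(0)
n, d, k = 200, 5, 3                       # samples, input dim, hidden dim (assumed)
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d)                # synthetic linear targets

W_enc = rng.normal(scale=0.1, size=(d, k))
W_dec = rng.normal(scale=0.1, size=(k, d))
w_out = rng.normal(scale=0.1, size=k)

def supervised_mse():
    h = X @ W_enc
    return float(np.mean((h @ w_out - y) ** 2))

mse_init = supervised_mse()

lr = 1e-3
for _ in range(2000):
    h = X @ W_enc
    err_rec = h @ W_dec - X               # reconstruction residual, shape (n, d)
    err_sup = h @ w_out - y               # supervised residual, shape (n,)
    # Gradient w.r.t. the shared representation combines both heads:
    g_h = err_rec @ W_dec.T + np.outer(err_sup, w_out)
    W_enc -= lr * (X.T @ g_h) / n
    W_dec -= lr * (h.T @ err_rec) / n
    w_out -= lr * (h.T @ err_sup) / n

mse_final = supervised_mse()
```

The key design point, per the abstract, is that the encoder weights `W_enc` receive gradient from both heads, so the reconstruction objective constrains the representation used for prediction.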
Journal ArticleDOI
The open diffusion data derivatives, brain data upcycling via integrated publishing of derivatives and reproducible open cloud services
Paolo Avesani, Brent McPherson, Soichi Hayashi, Cesar F. Caiafa, Robert Henschel, Eleftherios Garyfallidis, Lindsey Kitchell, Daniel Bullock, Andrew Patterson, Emanuele Olivetti, Olaf Sporns, Andrew J. Saykin, Lei Wang, Ivo D. Dinov, David Y. Hancock, Bradley Caron, Yiming Qian, Franco Pestilli +21 more
TL;DR: The Open Diffusion Data Derivatives (O3D) repository is described: an integrated collection of preserved brain data derivatives and processing pipelines, published together using a single digital-object-identifier.
Proceedings ArticleDOI
Organizing Experience: a Deeper Look at Replay Mechanisms for Sample-Based Planning in Continuous State Domains
TL;DR: In this article, a semi-parametric model-learning approach called Reweighted Experience Models (REMs) is proposed to sample next states or predecessors during planning.
Journal ArticleDOI
General Value Function Networks
TL;DR: This work introduces a novel RNN architecture, the General Value Function Network (GVFN), in which each internal state component corresponds to a prediction about the future represented as a value function, and shows that GVFNs are more robust to the truncation level.
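A toy sketch of the GVFN idea described above: each hidden unit is a value-function prediction with its own fixed discount, updated by TD(0) rather than backpropagation through time. This is an illustrative assumption-laden example, not the paper's architecture or hyperparameters; the cumulant (the next observation), the discounts, and the step size are all invented here.

```python
import numpy as np

# Illustrative GVFN-style recurrent predictor (NOT the paper's code).
# Hidden state h holds k value-function predictions; each unit i predicts the
# gamma_i-discounted sum of future observations, learned online with TD(0).
rng = np.random.default_rng(1)
T = 3000
gammas = np.array([0.0, 0.5, 0.8])      # one fixed discount per GVF unit (assumed)
k = len(gammas)
obs = np.sin(0.1 * np.arange(T + 2))    # toy observation stream

W = np.zeros((k, 1 + k))                # each unit: linear in [obs, previous state]
h = np.zeros(k)                         # hidden state = current GVF predictions
alpha = 0.01                            # TD step size (assumed)

for t in range(T):
    x_t = np.concatenate(([obs[t]], h))
    v_t = W @ x_t                       # predictions at time t (become next state)
    x_t1 = np.concatenate(([obs[t + 1]], v_t))
    v_t1 = W @ x_t1                     # bootstrapped predictions at t + 1
    # TD(0) error per unit: cumulant is the next observation, discount is gamma_i
    delta = obs[t + 1] + gammas * v_t1 - v_t
    W += alpha * np.outer(delta, x_t)
    h = v_t
```

The design choice this sketch highlights: because each state component has a TD target of its own, the recurrent state is trained by local temporal-difference errors, which is what allows the architecture to depend less on long backpropagation truncation windows.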