David Sussillo
Researcher at Google
Publications - 66
Citations - 7190
David Sussillo is an academic researcher at Google. He has contributed to research on topics including recurrent neural networks and computer science, has an h-index of 31, and has co-authored 56 publications receiving 5,474 citations. His previous affiliations include Columbia University and Stanford University.
Papers
Journal Article
Context-dependent computation by recurrent dynamics in prefrontal cortex
TL;DR: This work studies prefrontal cortex activity in macaque monkeys trained to flexibly select and integrate noisy sensory inputs towards a choice, and finds that the observed complexity and functional roles of single neurons are readily understood in the framework of a dynamical process unfolding at the level of the population.
Journal Article
Generating coherent patterns of activity from chaotic neural networks.
David Sussillo, L. F. Abbott +1 more
TL;DR: The results reproduce data on premovement activity in motor and premotor cortex, and suggest that synaptic plasticity may be a more rapid and powerful modulator of network activity than generally appreciated.
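The training method behind this paper drives a chaotic recurrent network to produce a desired output by rapidly adapting a readout with recursive least squares and feeding the readout back into the network. A minimal sketch of that idea, assuming a standard rate network with illustrative sizes, gain, and target signal (these parameters are not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
N, T, dt = 200, 2000, 0.1
g = 1.5  # gain > 1 puts the untrained network in the chaotic regime
W = g * rng.normal(0, 1 / np.sqrt(N), (N, N))  # random recurrent weights
wf = rng.uniform(-1, 1, N)   # feedback weights from readout to network
w = np.zeros(N)              # readout weights, trained online by RLS
P = np.eye(N)                # running estimate of the inverse rate correlation
x = 0.5 * rng.normal(size=N)

t = np.arange(T) * dt
target = np.sin(0.3 * t)     # pattern the readout should learn to produce
errs = []

for i in range(T):
    r = np.tanh(x)
    z = w @ r                       # current readout
    errs.append(z - target[i])
    x += dt * (-x + W @ r + wf * z)  # Euler step of the rate dynamics
    # Recursive-least-squares update that forces z toward the target
    Pr = P @ r
    c = 1.0 / (1.0 + r @ Pr)
    P -= c * np.outer(Pr, Pr)
    w -= c * (z - target[i]) * Pr
```

Because the readout error is suppressed on every step, the fed-back signal stays close to the target throughout training, which is what lets the network lock onto the pattern despite its chaotic unperturbed dynamics.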
Journal Article
A neural network that finds a naturalistic solution for the production of muscle activity
TL;DR: This work explores the hypothesis that motor cortex reflects dynamics appropriate for generating temporally patterned outgoing commands; to formalize this hypothesis, recurrent neural networks were trained to reproduce the muscle activity of reaching monkeys.
Journal Article
Opening the black box: Low-dimensional dynamics in high-dimensional recurrent neural networks
David Sussillo, Omri Barak +1 more
TL;DR: The hypothesis that fixed points, both stable and unstable, and the linearized dynamics around them, can reveal crucial aspects of how RNNs implement their computations is explored.
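The fixed-point analysis this paper describes can be sketched numerically: minimize a speed function q(x) = ½‖F(x) − x‖² of the network map and linearize around the minimum. A minimal sketch, assuming a small random (untrained) RNN purely for illustration; the weights, gain, and map F(x) = tanh(Wx) are assumptions, not the paper's trained models:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
N = 20
W = rng.normal(0, 1.2 / np.sqrt(N), (N, N))  # illustrative recurrent weights

def speed(x):
    # q(x) = 1/2 ||F(x) - x||^2 for the discrete map F(x) = tanh(Wx);
    # q = 0 exactly at a fixed point of the dynamics
    dx = np.tanh(W @ x) - x
    return 0.5 * dx @ dx

# Descend q from a state near the origin to locate a fixed point
res = minimize(speed, 0.1 * rng.normal(size=N), method="L-BFGS-B")
x_star = res.x

# Linearize around the fixed point: J_ij = (1 - tanh(w_i . x)^2) W_ij
J = (1 - np.tanh(W @ x_star) ** 2)[:, None] * W
eigvals = np.linalg.eigvals(J)
stable = np.all(np.abs(eigvals) < 1)  # |lambda| < 1 for a stable fixed point
```

Candidate points where q is small but nonzero (slow points) can be just as informative as true fixed points; the eigenvalues of the linearization J reveal the local expanding and contracting directions of the computation.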
Journal Article
Inferring single-trial neural population dynamics using sequential auto-encoders.
Chethan Pandarinath, Daniel J. O’Shea, Jasmine Collins, Rafal Jozefowicz, Sergey D. Stavisky, Jonathan C. Kao, Eric M. Trautmann, Matthew T. Kaufman, Stephen I. Ryu, Leigh R. Hochberg, Jaimie M. Henderson, Krishna V. Shenoy, Larry F. Abbott, David Sussillo +18 more
TL;DR: LFADS, a deep learning method for analyzing neural population activity, can extract neural dynamics from single-trial recordings, stitch separate datasets into a single model, and infer perturbations to these dynamics, for example those arising from behavioral choices.