SciSpace - Formally Typeset

Daniel Tranchina

Researcher at New York University

Publications - 60
Citations - 5028

Daniel Tranchina is an academic researcher at New York University. He has contributed to research on topics including population dynamics and receptive fields, has an h-index of 31, and has co-authored 59 publications receiving 4675 citations. His previous affiliations include the Albert Einstein College of Medicine and the Center for Neural Science.

Papers
Journal Article

Stochastic mRNA Synthesis in Mammalian Cells

TL;DR: The results demonstrate that gene expression in mammalian cells is subject to large, intrinsically random fluctuations and raise questions about how cells are able to function in the face of such noise.
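The large fluctuations described above are often illustrated with a simple birth-death model of transcription, in which mRNA molecules are synthesized at a constant rate and degraded independently. The following is a minimal sketch of that generic model simulated with the Gillespie algorithm; the rates are illustrative, not the parameters fitted in the paper.

```python
import random

def gillespie_mrna(k=10.0, gamma=1.0, t_end=500.0, seed=1):
    """Simulate mRNA copy number in a birth-death model with the
    Gillespie algorithm: synthesis at constant rate k, degradation
    at rate gamma per molecule. The stationary distribution is
    Poisson with mean k/gamma, so relative fluctuations are large
    when the mean copy number is small.
    """
    rng = random.Random(seed)
    t, n, area = 0.0, 0, 0.0
    while t < t_end:
        total = k + gamma * n          # total event rate
        wait = rng.expovariate(total)  # exponential time to next reaction
        area += n * wait               # accumulate time-weighted copy number
        t += wait
        if rng.random() < k / total:   # choose synthesis vs degradation
            n += 1
        else:
            n -= 1
    return area / t                    # time-averaged mRNA count

mean_copies = gillespie_mrna()         # close to k/gamma = 10
```

Because the stationary variance equals the mean in this model, halving the mean copy number doubles the squared coefficient of variation, which is one way low-abundance transcripts end up dominated by intrinsic noise.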
Journal Article

Recovery of Sparse Translation-Invariant Signals With Continuous Basis Pursuit

TL;DR: This work develops two implementations of continuous basis pursuit (CBP) for a one-dimensional translation-invariant source: one using a first-order Taylor approximation and another using a form of trigonometric spline. It examines the trade-off between sparsity and signal reconstruction accuracy in these methods.
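The first-order Taylor idea can be sketched in a few lines: a waveform shifted by a sub-grid amount is approximated as w(t - tau) ≈ w(t) - tau * w'(t), so fitting the observation against the waveform and its derivative recovers both amplitude and continuous shift. The Gaussian bump below is a hypothetical waveform, not one from the paper, and this toy omits the sparsity-inducing penalty of full CBP.

```python
import numpy as np

# Hypothetical waveform: a unit-width Gaussian bump on a fine grid.
t = np.linspace(-5, 5, 1001)
dt = t[1] - t[0]

def bump(shift=0.0):
    return np.exp(-0.5 * (t - shift) ** 2)

w = bump(0.0)
dw = np.gradient(w, dt)          # numerical derivative of the waveform

true_shift = 0.03                # a shift much smaller than the grid spacing regime
y = bump(true_shift)             # observed signal: continuously shifted copy

# First-order Taylor model: y ≈ a*w + b*dw, so the shift estimate is -b/a.
A = np.column_stack([w, dw])
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
est_shift = -b / a               # recovers true_shift to first order
```

The same two-column dictionary, replicated at every coarse grid position and combined with an L1 penalty on the amplitudes, is the essence of the Taylor variant of CBP.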
Journal Article

A population density approach that facilitates large-scale modeling of neural networks: analysis and an application to orientation tuning.

TL;DR: A computationally efficient method for simulating realistic networks of neurons, building on the approach introduced by Knight, Manin, and Sirovich (1996), in which integrate-and-fire neurons are grouped into large populations of similar neurons. The method captures dynamics of single-neuron activity that are missed in simple firing-rate models.
Journal Article

A model for the polarization of neurons by extrinsically applied electric fields.

TL;DR: The way in which the orientation of the various parts of the neuron affects its polarization is examined; the soma appears to be a likely site for action potential initiation when the field is strong enough to elicit suprathreshold polarization.
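A feel for field-induced polarization comes from the standard passive-cable result, not the paper's full neuron model: for a straight fiber of length L with sealed ends in a uniform field E aligned with its axis, the steady-state membrane polarization is V_m(x) = E * lambda * sinh(x / lambda) / cosh(L / (2 * lambda)), measured from the midpoint. One end depolarizes and the other hyperpolarizes, with the peak magnitude E * lambda * tanh(L / (2 * lambda)) at the ends; all numbers below are illustrative.

```python
import numpy as np

# Textbook steady-state polarization of a straight passive fiber in a
# uniform axial field (a cable-theory sketch, not the paper's model).
E = 10.0       # field strength along the fiber, mV/mm (illustrative)
lam = 1.0      # membrane space constant, mm
L = 4.0        # fiber length, mm

x = np.linspace(-L / 2, L / 2, 401)               # position from midpoint
v_m = E * lam * np.sinh(x / lam) / np.cosh(L / (2 * lam))

peak = E * lam * np.tanh(L / (2 * lam))           # polarization at either end
```

The antisymmetry of v_m about the midpoint is why orientation matters so much: segments aligned with the field polarize strongly at their ends, while segments perpendicular to it are barely polarized at all.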

A population density approach that facilitates large-scale modeling of neural networks

TL;DR: In this article, a population density approach is proposed to speed up large-scale neural network simulations by calculating the evolution of a probability density function for each population, from which population firing rates and the distribution of neurons over state space are obtained.