scispace - formally typeset
Author

Geir Halnes

Bio: Geir Halnes is an academic researcher from the Norwegian University of Life Sciences. The author has contributed to research in the topics of computational neuroscience and biological neuron models. The author has an h-index of 15 and has co-authored 55 publications receiving 667 citations. Previous affiliations of Geir Halnes include the Royal Institute of Technology and the Norwegian University of Science and Technology.


Papers
Journal ArticleDOI
TL;DR: This work introduces a new metric, the maximum eigenvalue of the connectance matrix, to identify and quantify cyclic energy pathways, capturing both the presence and the strength of these structural cycles.

92 citations
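The link between the maximum eigenvalue and structural cycles can be illustrated with a toy example. The sketch below is plain Python (the two matrices are illustrative, not taken from the paper) and relies on the standard fact that a directed graph is acyclic exactly when its adjacency matrix is nilpotent, so that all of its eigenvalues vanish; any structural cycle therefore forces the maximum eigenvalue away from zero.

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def is_nilpotent(A):
    """A directed graph is acyclic iff its adjacency matrix A is
    nilpotent (A^n = 0 for an n-node graph): all eigenvalues are then
    zero, so a cyclic graph must have a nonzero maximum eigenvalue."""
    n = len(A)
    P = A
    for _ in range(n - 1):
        P = matmul(P, A)  # P is now A^n
    return all(all(x == 0 for x in row) for row in P)

# Feed-forward chain 1 -> 2 -> 3 (acyclic): A^3 = 0, all eigenvalues 0.
chain = [[0, 1, 0],
         [0, 0, 1],
         [0, 0, 0]]

# Cycle 1 -> 2 -> 3 -> 1: eigenvalues are the cube roots of unity,
# so the maximum eigenvalue modulus is 1, signalling the cycle.
cycle = [[0, 1, 0],
         [0, 0, 1],
         [1, 0, 0]]
```

For weighted connectance matrices the same idea extends naturally: the larger the leading eigenvalue, the stronger the cycling.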

Journal ArticleDOI
TL;DR: Uncertainpy, an open-source Python toolbox, tailored to perform uncertainty quantification and sensitivity analysis of neuroscience models, is presented to the neuroscience community in a user-oriented manner.
Abstract: Computational models in neuroscience typically contain many parameters that are poorly constrained by experimental data. Uncertainty quantification and sensitivity analysis provide rigorous procedures to quantify how the model output depends on this parameter uncertainty. Unfortunately, the application of such methods is not yet standard within the field of neuroscience. Here we present Uncertainpy, an open-source Python toolbox, tailored to perform uncertainty quantification and sensitivity analysis of neuroscience models. Uncertainpy aims to make it quick and easy to get started with uncertainty analysis, without any need for detailed prior knowledge. The toolbox allows uncertainty quantification and sensitivity analysis to be performed on already existing models without needing to modify the model equations or model implementation. Uncertainpy bases its analysis on polynomial chaos expansions, which are more efficient than the more standard Monte-Carlo based approaches. Uncertainpy is tailored for neuroscience applications by its built-in capability for calculating characteristic features in the model output. The toolbox does not merely perform a point-to-point comparison of the "raw" model output (e.g., membrane voltage traces), but can also calculate the uncertainty and sensitivity of salient model response features such as spike timing, action potential width, average interspike interval, and other features relevant for various neural and neural network models. Uncertainpy comes with several common models and features built in, and including custom models and new features is easy. The aim of the current paper is to present Uncertainpy to the neuroscience community in a user-oriented manner.
To demonstrate its broad applicability, we perform an uncertainty quantification and sensitivity analysis of three case studies relevant for neuroscience: the original Hodgkin-Huxley point-neuron model for action potential generation, a multi-compartmental model of a thalamic interneuron implemented in the NEURON simulator, and a sparsely connected recurrent network model implemented in the NEST simulator.

60 citations
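The uncertainty-propagation idea described above can be illustrated without the toolbox itself. The following minimal sketch in plain Python (model, parameter ranges, and function names are illustrative, not taken from the paper or from Uncertainpy's API) propagates a uniform parameter uncertainty through an interspike-interval feature of a leaky integrate-and-fire neuron using plain Monte Carlo, the baseline approach that polynomial chaos expansions improve upon:

```python
import math
import random

def lif_isi(tau_m, R, I=1.0, V_th=0.5):
    """Analytic interspike interval of a leaky integrate-and-fire
    neuron driven by a constant current I (no refractory period).
    V(t) = R*I*(1 - exp(-t/tau_m)); solving V(t) = V_th for t gives
    the time of the threshold crossing, i.e. the ISI."""
    return -tau_m * math.log(1.0 - V_th / (R * I))

def monte_carlo_uq(n=10000, seed=1):
    """Propagate +/-10% uniform uncertainty in tau_m and R through
    the model and summarise the ISI feature by its mean and std."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        tau_m = rng.uniform(9.0, 11.0)   # membrane time constant, nominal 10 ms
        R = rng.uniform(0.9, 1.1)        # membrane resistance, nominal 1
        samples.append(lif_isi(tau_m, R))
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / n
    return mean, math.sqrt(var)
```

A polynomial chaos expansion would approximate the same feature statistics with far fewer model evaluations, which is the efficiency gain the abstract refers to.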

Journal ArticleDOI
TL;DR: A general electrodiffusive formalism for modeling ion concentration dynamics in a one-dimensional geometry that keeps the membrane potential and ion concentrations mutually consistent, ensures global particle/charge conservation, and accounts for diffusion and concentration-dependent variations in resistivity.
Abstract: The cable equation is a proper framework for modeling electrical neural signalling that takes place at a timescale at which the ionic concentrations vary little. However, in neural tissue there are also key dynamic processes that occur at longer timescales. For example, endured periods of intense neural signaling may cause the local extracellular K+ concentration to increase by several millimolar. The clearance of this excess K+ depends partly on diffusion in the extracellular space, partly on local uptake by astrocytes, and partly on intracellular transport (spatial buffering) within astrocytes. These processes, which take place on the timescale of seconds, demand a mathematical description able to account for the spatiotemporal variations in ion concentrations as well as the subsequent effects of these variations on the membrane potential. Here, we present a general electrodiffusive formalism for modeling ion concentration dynamics in a one-dimensional geometry, including both the intra- and extracellular domains. Based on the Nernst-Planck equations, this formalism keeps the membrane potential and ion concentrations mutually consistent, ensures global particle/charge conservation, and accounts for diffusion and concentration-dependent variations in resistivity. We apply the formalism to a model of astrocytes exchanging ions with the extracellular space. The simulations show that K+ removal from high-concentration regions is driven by a local depolarization of the astrocyte membrane, which concertedly (i) increases the local astrocytic uptake of K+, (ii) suppresses extracellular transport of K+, (iii) increases axial transport of K+ within astrocytes, and (iv) facilitates astrocytic release of K+ in regions where the extracellular concentration is low. Together, these mechanisms seem to provide a robust regulatory scheme for shielding the extracellular space from excess K+.

59 citations
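The formalism described above builds on the Nernst-Planck equations. As a reminder of their standard form (written here from general electrochemistry, not quoted from the paper), the axial flux of an ion species k combines a diffusive term and a field-driven drift term:

```latex
j_k = -D_k \frac{\partial c_k}{\partial x}
      - \frac{D_k z_k c_k}{\psi} \frac{\partial \phi}{\partial x},
\qquad \psi = \frac{RT}{F},
```

where $D_k$ is the diffusion constant, $c_k$ the concentration, $z_k$ the valence of species k, $\phi$ the electric potential, $F$ Faraday's constant, $R$ the gas constant, and $T$ the absolute temperature. Requiring that the sum of all such fluxes is consistent with the membrane currents is what ties the ion concentrations to the membrane potential in the formalism.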

18 Aug 2016
TL;DR: A conference supplement presenting abstracts including "Functional advantages of cell-type heterogeneity in neural circuits", "Dynamics and biomarkers of mental disorders", and "Objective criteria for computational neuroscience model selection".
Abstract: Table of contents
A1 Functional advantages of cell-type heterogeneity in neural circuits (Tatyana O. Sharpee)
A2 Mesoscopic modeling of propagating waves in visual cortex (Alain Destexhe)
A3 Dynamics and biomarkers of mental disorders (Mitsuo Kawato)
F1 Precise recruitment of spiking output at theta frequencies requires dendritic h-channels in multi-compartment models of oriens-lacunosum/moleculare hippocampal interneurons (Vladislav Sekulić, Frances K. Skinner)
F2 Kernel methods in reconstruction of current sources from extracellular potentials for single cells and the whole brains (Daniel K. Wójcik, Chaitanya Chintaluri, Dorottya Cserpán, Zoltán Somogyvári)
F3 The synchronized periods depend on intracellular transcriptional repression mechanisms in circadian clocks (Jae Kyoung Kim, Zachary P. Kilpatrick, Matthew R. Bennett, Kresimir Josić)
O1 Assessing irregularity and coordination of spiking-bursting rhythms in central pattern generators (Irene Elices, David Arroyo, Rafael Levi, Francisco B. Rodriguez, Pablo Varona)
O2 Regulation of top-down processing by cortically-projecting parvalbumin positive neurons in basal forebrain (Eunjin Hwang, Bowon Kim, Hio-Been Han, Tae Kim, James T. McKenna, Ritchie E. Brown, Robert W. McCarley, Jee Hyun Choi)
O3 Modeling auditory stream segregation, build-up and bistability (James Rankin, Pamela Osborn Popp, John Rinzel)
O4 Strong competition between tonotopic neural ensembles explains pitch-related dynamics of auditory cortex evoked fields (Alejandro Tabas, André Rupp, Emili Balaguer-Ballester)
O5 A simple model of retinal response to multi-electrode stimulation (Matias I. Maturana, David B. Grayden, Shaun L. Cloherty, Tatiana Kameneva, Michael R. Ibbotson, Hamish Meffin)
O6 Noise correlations in V4 area correlate with behavioral performance in visual discrimination task (Veronika Koren, Timm Lochmann, Valentin Dragoi, Klaus Obermayer)
O7 Input-location dependent gain modulation in cerebellar nucleus neurons (Maria Psarrou, Maria Schilstra, Neil Davey, Benjamin Torben-Nielsen, Volker Steuber)
O8 Analytic solution of cable energy function for cortical axons and dendrites (Huiwen Ju, Jiao Yu, Michael L. Hines, Liang Chen, Yuguo Yu)
O9 C. elegans interactome: interactive visualization of Caenorhabditis elegans worm neuronal network (Jimin Kim, Will Leahy, Eli Shlizerman)
O10 Is the model any good? Objective criteria for computational neuroscience model selection (Justas Birgiolas, Richard C. Gerkin, Sharon M. Crook)
O11 Cooperation and competition of gamma oscillation mechanisms (Atthaphon Viriyopase, Raoul-Martin Memmesheimer, Stan Gielen)
O12 A discrete structure of the brain waves (Yuri Dabaghian, Justin DeVito, Luca Perotti)
O13 Direction-specific silencing of the Drosophila gaze stabilization system (Anmo J. Kim, Lisa M. Fenk, Cheng Lyu, Gaby Maimon)
O14 What does the fruit fly think about values? A model of olfactory associative learning (Chang Zhao, Yves Widmer, Simon Sprecher, Walter Senn)
O15 Effects of ionic diffusion on power spectra of local field potentials (LFP) (Geir Halnes, Tuomo Mäki-Marttunen, Daniel Keller, Klas H. Pettersen, Ole A. Andreassen, Gaute T. Einevoll)
O16 Large-scale cortical models towards understanding relationship between brain structure abnormalities and cognitive deficits (Yasunori Yamada)
O17 Spatial coarse-graining the brain: origin of minicolumns (Moira L. Steyn-Ross, D. Alistair Steyn-Ross)
O18 Modeling large-scale cortical networks with laminar structure (Jorge F. Mejias, John D. Murray, Henry Kennedy, Xiao-Jing Wang)
O19 Information filtering by partial synchronous spikes in a neural population (Alexandra Kruscha, Jan Grewe, Jan Benda, Benjamin Lindner)
O20 Decoding context-dependent olfactory valence in Drosophila (Laurent Badel, Kazumi Ohta, Yoshiko Tsuchimoto, Hokto Kazama)
P1 Neural network as a scale-free network: the role of a hub (B. Kahng)
P2 Hemodynamic responses to emotions and decisions using near-infrared spectroscopy optical imaging (Nicoladie D. Tam)
P3 Phase space analysis of hemodynamic responses to intentional movement directions using functional near-infrared spectroscopy (fNIRS) optical imaging technique (Nicoladie D. Tam, Luca Pollonini, George Zouridakis)
P4 Modeling jamming avoidance of weakly electric fish (Jaehyun Soh, DaeEun Kim)
P5 Synergy and redundancy of retinal ganglion cells in prediction (Minsu Yoo, S. E. Palmer)
P6 A neural field model with a third dimension representing cortical depth (Viviana Culmone, Ingo Bojak)
P7 Network analysis of a probabilistic connectivity model of the Xenopus tadpole spinal cord (Andrea Ferrario, Robert Merrison-Hort, Roman Borisyuk)
P8 The recognition dynamics in the brain (Chang Sub Kim)
P9 Multivariate spike train analysis using a positive definite kernel (Taro Tezuka)
P10 Synchronization of burst periods may govern slow brain dynamics during general anesthesia (Pangyu Joo)
P11 The ionic basis of heterogeneity affects stochastic synchrony (Young-Ah Rho, Shawn D. Burton, G. Bard Ermentrout, Jaeseung Jeong, Nathaniel N. Urban)
P12 Circular statistics of noise in spike trains with a periodic component (Petr Marsalek)
P14 Representations of directions in EEG-BCI using Gaussian readouts (Hoon-Hee Kim, Seok-hyun Moon, Do-won Lee, Sung-beom Lee, Ji-yong Lee, Jaeseung Jeong)
P15 Action selection and reinforcement learning in basal ganglia during reaching movements (Yaroslav I. Molkov, Khaldoun Hamade, Wondimu Teka, William H. Barnett, Taegyo Kim, Sergey Markin, Ilya A. Rybak)
P17 Axon guidance: modeling axonal growth in T-Junction assay (Csaba Forro, Harald Dermutz, László Demkó, János Vörös)
P19 Transient cell assembly networks encode persistent spatial memories (Yuri Dabaghian, Andrey Babichev)
P20 Theory of population coupling and applications to describe high order correlations in large populations of interacting neurons (Haiping Huang)
P21 Design of biologically-realistic simulations for motor control (Sergio Verduzco-Flores)
P22 Towards understanding the functional impact of the behavioural variability of neurons (Filipa Dos Santos, Peter Andras)
P23 Different oscillatory dynamics underlying gamma entrainment deficits in schizophrenia (Christoph Metzner, Achim Schweikard, Bartosz Zurowski)
P24 Memory recall and spike frequency adaptation (James P. Roach, Leonard M. Sander, Michal R. Zochowski)
P25 Stability of neural networks and memory consolidation preferentially occur near criticality (Quinton M. Skilling, Nicolette Ognjanovski, Sara J. Aton, Michal Zochowski)
P26 Stochastic Oscillation in Self-Organized Critical States of Small Systems: Sensitive Resting State in Neural Systems (Sheng-Jun Wang, Guang Ouyang, Jing Guang, Mingsha Zhang, K. Y. Michael Wong, Changsong Zhou)
P27 Neurofield: a C++ library for fast simulation of 2D neural field models (Peter A. Robinson, Paula Sanz-Leon, Peter M. Drysdale, Felix Fung, Romesh G. Abeysuriya, Chris J. Rennie, Xuelong Zhao)
P28 Action-based grounding: Beyond encoding/decoding in neural code (Yoonsuck Choe, Huei-Fang Yang)
P29 Neural computation in a dynamical system with multiple time scales (Yuanyuan Mi, Xiaohan Lin, Si Wu)
P30 Maximum entropy models for 3D layouts of orientation selectivity (Joscha Liedtke, Manuel Schottdorf, Fred Wolf)
P31 A behavioral assay for probing computations underlying curiosity in rodents (Yoriko Yamamura, Jeffery R. Wickens)
P32 Using statistical sampling to balance error function contributions to optimization of conductance-based models (Timothy Rumbell, Julia Ramsey, Amy Reyes, Danel Draguljić, Patrick R. Hof, Jennifer Luebke, Christina M. Weaver)
P33 Exploration and implementation of a self-growing and self-organizing neuron network building algorithm (Hu He, Xu Yang, Hailin Ma, Zhiheng Xu, Yuzhe Wang)
P34 Disrupted resting state brain network in obese subjects: a data-driven graph theory analysis (Kwangyeol Baek, Laurel S. Morris, Prantik Kundu, Valerie Voon)
P35 Dynamics of cooperative excitatory and inhibitory plasticity (Everton J. Agnes, Tim P. Vogels)
P36 Frequency-dependent oscillatory signal gating in feed-forward networks of integrate-and-fire neurons (William F. Podlaski, Tim P. Vogels)
P37 Phenomenological neural model for adaptation of neurons in area IT (Martin Giese, Pradeep Kuravi, Rufin Vogels)
P38 ICGenealogy: towards a common topology of neuronal ion channel function and genealogy in model and experiment (Alexander Seeholzer, William Podlaski, Rajnish Ranjan, Tim Vogels)
P39 Temporal input discrimination from the interaction between dynamic synapses and neural subthreshold oscillations (Joaquin J. Torres, Fabiano Baroni, Roberto Latorre, Pablo Varona)
P40 Different roles for transient and sustained activity during active visual processing (Bart Gips, Eric Lowet, Mark J. Roberts, Peter de Weerd, Ole Jensen, Jan van der Eerden)
P41 Scale-free functional networks of 2D Ising model are highly robust against structural defects: neuroscience implications (Abdorreza Goodarzinick, Mohammad D. Niry, Alireza Valizadeh)
P42 High frequency neuron can facilitate propagation of signal in neural networks (Aref Pariz, Shervin S. Parsi, Alireza Valizadeh)
P43 Investigating the effect of Alzheimer’s disease related amyloidopathy on gamma oscillations in the CA1 region of the hippocampus (Julia M. Warburton, Lucia Marucci, Francesco Tamagnini, Jon Brown, Krasimira Tsaneva-Atanasova)
P44 Long-tailed distributions of inhibitory and excitatory weights in a balanced network with eSTDP and iSTDP (Florence I. Kleberg, Jochen Triesch)
P45 Simulation of EMG recording from hand muscle due to TMS of motor cortex (Bahar Moezzi, Nicolangelo Iannella, Natalie Schaworonkow, Lukas Plogmacher, Mitchell R. Goldsworthy, Brenton Hordacre, Mark D. McDonnell, Michael C. Ridding, Jochen Triesch)
P46 Structure and dynamics of axon network formed in primary cell culture (Martin Zapotocky, Daniel Smit, Coralie Fouquet, Alain Trembleau)
P47 Efficient signal processing and sampling in random networks that generate variability (Sakyasingha Dasgupta, Isao Nishikawa, Kazuyuki Aihara, Taro Toyoizumi)
P48 Modeling the effect of riluzole on bursting in respiratory neural networks (Daniel T. Robb, Nick Mellen, Natalia Toporikova)
P49 Mapping relaxation training using effective connectivity analysis (Rongxiang Tang)

59 citations


Cited by

Journal Article
TL;DR: The proposed methodology automatically adapts to the local structure when simulating paths across the manifold, providing highly efficient convergence and exploration of the target density; substantial improvements in the time-normalized effective sample size are reported compared with alternative sampling approaches.
Abstract: The paper proposes Metropolis adjusted Langevin and Hamiltonian Monte Carlo sampling methods defined on the Riemann manifold to resolve the shortcomings of existing Monte Carlo algorithms when sampling from target densities that may be high dimensional and exhibit strong correlations. The methods provide fully automated adaptation mechanisms that circumvent the costly pilot runs that are required to tune proposal densities for Metropolis-Hastings or indeed Hamiltonian Monte Carlo and Metropolis adjusted Langevin algorithms. This allows for highly efficient sampling even in very high dimensions where different scalings may be required for the transient and stationary phases of the Markov chain. The methodology proposed exploits the Riemann geometry of the parameter space of statistical models and thus automatically adapts to the local structure when simulating paths across this manifold, providing highly efficient convergence and exploration of the target density. The performance of these Riemann manifold Monte Carlo methods is rigorously assessed by performing inference on logistic regression models, log-Gaussian Cox point processes, stochastic volatility models and Bayesian estimation of dynamic systems described by non-linear differential equations. Substantial improvements in the time-normalized effective sample size are reported when compared with alternative sampling approaches. MATLAB code that is available from http://www.ucl.ac.uk/statistics/research/rmhmc allows replication of all the results reported.

1,031 citations
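The Metropolis-adjusted Langevin idea at the core of the paper above can be sketched in a few lines. The following is a minimal plain-Python illustration on a one-dimensional standard-normal target (a deliberately trivial case with a flat metric, so none of the Riemann-manifold machinery appears; function names and the step size are illustrative): the proposal drifts along the gradient of the log target and is corrected by a Metropolis-Hastings step that accounts for the proposal's asymmetry.

```python
import math
import random

def mala_gaussian(n_samples=5000, eps=0.9, seed=4):
    """Metropolis-adjusted Langevin sampling of a standard normal
    target pi(x) ~ exp(-x^2/2), with score grad log pi(x) = -x."""
    rng = random.Random(seed)
    log_pi = lambda x: -0.5 * x * x
    grad = lambda x: -x

    def log_q(x_to, x_from):
        # Log density (up to a constant) of the Langevin proposal
        # x_from -> x_to: Normal(x_from + (eps^2/2) grad, eps^2).
        mu = x_from + 0.5 * eps * eps * grad(x_from)
        return -((x_to - mu) ** 2) / (2 * eps * eps)

    x, chain = 0.0, []
    for _ in range(n_samples):
        mu = x + 0.5 * eps * eps * grad(x)
        prop = rng.gauss(mu, eps)
        # MH correction: the forward and reverse proposal densities differ.
        log_alpha = (log_pi(prop) + log_q(x, prop)) - (log_pi(x) + log_q(prop, x))
        if math.log(rng.random()) < log_alpha:
            x = prop
        chain.append(x)
    return chain
```

The Riemann-manifold variants replace the fixed step size eps with a position-dependent metric derived from the statistical model, which is what removes the manual tuning discussed in the abstract.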

Journal ArticleDOI

663 citations

01 Jan 2016
Using MPI: Portable Parallel Programming with the Message-Passing Interface

593 citations