Journal ArticleDOI

Modelling and analysis of local field potentials for studying the function of cortical circuits

TLDR
Careful mathematical modelling and analysis are needed to take full advantage of the opportunities that this signal offers in understanding signal processing in cortical circuits and, ultimately, the neural basis of perception and cognition.
Abstract
Local field potentials (LFPs) provide a wealth of information about synaptic processing in cortical populations but are difficult to interpret. Einevoll and colleagues consider the neural origin of cortical LFPs and discuss LFP modelling and analysis methods that can improve the interpretation of LFP data.
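A simple way to make the link between transmembrane currents and the recorded LFP concrete is the point-source forward model from volume-conductor theory, which underlies most biophysical LFP modelling schemes of this kind. Below is a minimal sketch, assuming an infinite homogeneous ohmic medium; the units, toy currents, and positions are illustrative only, not taken from the paper.

```python
import numpy as np

def lfp_point_source(currents, source_pos, electrode_pos, sigma=0.3):
    """Extracellular potential at one electrode from point current sources.

    Point-source approximation in an infinite, homogeneous, ohmic medium:
        phi(r, t) = (1 / (4 * pi * sigma)) * sum_n I_n(t) / |r - r_n|

    currents      : (n_sources, n_times) transmembrane currents [nA]
    source_pos    : (n_sources, 3) source positions [um]
    electrode_pos : (3,) electrode position [um]
    sigma         : extracellular conductivity [S/m]
    Returns phi with shape (n_times,); with these units the result is in mV.
    """
    dist = np.linalg.norm(source_pos - electrode_pos, axis=1)   # (n_sources,)
    weights = 1.0 / (4.0 * np.pi * sigma * dist)                # maps I_n -> phi
    return weights @ currents

# toy example: a current sink/source pair straddling an electrode 50 um away
t = np.linspace(0.0, 10.0, 1001)
I = np.vstack([np.sin(2 * np.pi * t), -np.sin(2 * np.pi * t)])  # nA
pos = np.array([[0.0, 0.0, 100.0], [0.0, 0.0, -100.0]])
phi = lfp_point_source(I, pos, electrode_pos=np.array([50.0, 0.0, 0.0]))
```

More realistic forward models distribute the return currents along the dendrites (line-source approximation) and can account for inhomogeneous or anisotropic conductivity, but the summation over weighted current sources stays the same.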


Citations
Journal ArticleDOI

Action potentials contribute to epileptic high-frequency oscillations recorded with electrodes remote from neurons.

TL;DR: HFOs recorded with electrodes remote from neurons could in fact be generated by clusters of action potentials, a finding that may extend the clinical interpretation of EEG.
Posted ContentDOI

Two frequency bands contain the most stimulus-related information in visual cortex

TL;DR: Using datasets from intracortical multi-electrode recordings and from large-scale electrocorticography grids, the authors investigate how visual features can be extracted from the local field potential (LFP) and how this compares with the information available from multi-unit activity (MUA).
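As an illustration of what "stimulus-related information in a frequency band" involves in practice, the sketch below bandpass-filters an LFP trace and computes band-limited power. This is not the paper's actual pipeline; the sampling rate, band edges, and toy signal are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def band_power(lfp, fs, band, order=4):
    """Mean power of a 1-D LFP trace inside band = (low, high) in Hz."""
    b, a = butter(order, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, lfp)          # zero-phase bandpass
    return np.mean(filtered ** 2)

fs = 1000.0                                  # assumed sampling rate, Hz
t = np.arange(0.0, 2.0, 1.0 / fs)
lfp = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)  # toy LFP
low_band = band_power(lfp, fs, (8.0, 30.0))   # illustrative band edges
gamma = band_power(lfp, fs, (30.0, 80.0))
```

Stimulus decoding would then feed such band powers (per trial and per channel) into a classifier, which is how band-limited LFP information is typically compared with MUA-based information.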
Journal ArticleDOI

Ambiguity in the interpretation of the low-frequency dielectric properties of biological tissues.

TL;DR: The authors reanalyse, in impedance representation, the raw data behind the main reference resource on the dielectric properties of biological tissues, and use a Kramers-Kronig validity test and parameter-estimation techniques to describe the data with two physical parametric models corresponding to opposing biophysical interpretations.
Posted ContentDOI

Hippocampal sharp wave-ripples and the associated sequence replay emerge from structured synaptic interactions in a network model of area CA3

TL;DR: The model provides a unifying framework for diverse phenomena involving hippocampal plasticity, representations, and dynamics, and shows that bidirectional replay requires the interplay of an experimentally confirmed, temporally symmetric plasticity rule and cellular adaptation.
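A "temporally symmetric" plasticity rule is an STDP kernel that depends only on the magnitude of the pre/post spike-time difference, not on its sign. The sketch below shows such a kernel in its simplest form; the amplitude and time constant are illustrative choices, not the model's fitted values.

```python
import numpy as np

def symmetric_stdp(delta_t_ms, a_plus=0.01, tau_ms=20.0):
    """Weight change for a pre/post spike-time difference delta_t (ms).

    Symmetric rule: potentiation depends only on |delta_t|, so pre-before-post
    and post-before-pre pairings are treated identically.
    """
    return a_plus * np.exp(-np.abs(delta_t_ms) / tau_ms)

dts = np.linspace(-100.0, 100.0, 201)
dw = symmetric_stdp(dts)      # bell-shaped curve, peaked at coincident spikes
```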
Journal ArticleDOI

An electrodiffusive neuron-extracellular-glia model for exploring the genesis of slow potentials in the brain.

TL;DR: The edNEG model combines compartmental neuron modelling with an electrodiffusive framework for intra- and extracellular ion concentration dynamics in a local piece of neuro-glial brain tissue.
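"Electrodiffusive" here means that ion fluxes are driven jointly by concentration gradients and the electric field, as captured by the Nernst-Planck flux equation. A minimal sketch of that flux for a single ion species follows; the parameter values in the example are illustrative only and not taken from the edNEG model.

```python
# Nernst-Planck flux for one ion species along one spatial axis:
#   j = -D * dc/dx - (D * z * F / (R * T)) * c * dphi/dx
#       diffusive term      electrical drift term

F = 96485.0      # C/mol, Faraday constant
R = 8.314        # J/(mol K), gas constant
T = 310.0        # K, body temperature

def nernst_planck_flux(c, dc_dx, dphi_dx, D, z):
    """Ion flux density [mol / (m^2 s)] from concentration and potential gradients.

    c       : local concentration [mol/m^3]
    dc_dx   : concentration gradient [mol/m^4]
    dphi_dx : electric potential gradient [V/m]
    D       : diffusion constant [m^2/s]
    z       : ion valence
    """
    return -D * dc_dx - (D * z * F / (R * T)) * c * dphi_dx

# e.g. K+ diffusing and drifting out of a region of elevated extracellular potassium
j_K = nernst_planck_flux(c=6.0, dc_dx=-50.0, dphi_dx=1.0, D=1.96e-9, z=1)
```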
References
Journal ArticleDOI

A mathematical theory of communication

TL;DR: This final installment of the paper considers the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now.
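For the band-limited Gaussian channel treated in the continuous case, the analysis leads to the familiar capacity formula C = W log2(1 + S/N) bits per second. A small worked example, with an arbitrarily chosen bandwidth and signal-to-noise ratio:

```python
import math

def gaussian_channel_capacity(bandwidth_hz, snr_linear):
    """Shannon capacity C = W * log2(1 + S/N) of a band-limited AWGN channel [bit/s]."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# e.g. a 3 kHz channel at 30 dB signal-to-noise ratio
snr = 10 ** (30 / 10)                                 # 30 dB -> factor of 1000
print(gaussian_channel_capacity(3000.0, snr))         # ~29.9 kbit/s
```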
MonographDOI

Causality: models, reasoning, and inference

TL;DR: This monograph develops the art and science of cause and effect, including the theory of inferred causation, causal diagrams, and the identification of causal effects.
Book ChapterDOI

Investigating causal relations by econometric models and cross-spectral methods

TL;DR: In this article, it is shown that the cross spectrum between two variables can be decomposed into two parts, each relating to a single causal arm of a feedback situation, and measures of causal lag and causal strength can then be constructed.
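Granger's cross-spectral decomposition builds on the basic time-domain notion that x "causes" y if past values of x improve the prediction of y beyond what y's own past provides. The sketch below illustrates only that time-domain notion, not the cross-spectral decomposition itself; the lag order, toy data, and function name are arbitrary assumptions.

```python
import numpy as np

def granger_improvement(x, y, lags=5):
    """Log ratio of residual variances when predicting y without vs. with past x.

    A positive value means past x improves the prediction of y, Granger's informal
    notion of 'x causes y'; a proper test would add an F-statistic and lag selection.
    """
    n = len(y)
    Y = y[lags:]                                                  # prediction targets
    own_past = np.array([y[t - lags:t] for t in range(lags, n)])  # y's own history
    x_past = np.array([x[t - lags:t] for t in range(lags, n)])    # x's history

    def resid_var(features):
        design = np.column_stack([np.ones(len(features)), features])
        beta, *_ = np.linalg.lstsq(design, Y, rcond=None)
        return np.var(Y - design @ beta)

    restricted = resid_var(own_past)                       # y past only
    full = resid_var(np.hstack([own_past, x_past]))        # y past + x past
    return np.log(restricted / full)

# toy example: y is a lagged, noisy copy of x, so x should "Granger-cause" y
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
y = np.roll(x, 2) + 0.5 * rng.standard_normal(2000)
print(granger_improvement(x, y))     # clearly > 0
```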
Journal ArticleDOI

Learning the parts of objects by non-negative matrix factorization

TL;DR: An algorithm for non-negative matrix factorization is demonstrated that learns parts of faces and semantic features of text, in contrast to methods such as principal components analysis and vector quantization, which learn holistic, not parts-based, representations.
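The multiplicative update rules popularized by Lee and Seung are the best-known way to compute this factorization. The sketch below uses the variant that minimizes the squared Frobenius reconstruction error (the Nature paper itself optimizes a divergence-type objective); the rank, iteration count, and toy data are illustrative.

```python
import numpy as np

def nmf(V, rank, n_iter=200, eps=1e-9, seed=0):
    """Factor a non-negative matrix V (m x n) as W @ H with W >= 0 and H >= 0,
    using multiplicative updates for the squared Frobenius error.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)     # update coefficients
        W *= (V @ H.T) / (W @ H @ H.T + eps)     # update basis ("parts")
    return W, H

# toy example on a random non-negative matrix
V = np.random.default_rng(1).random((20, 30))
W, H = nmf(V, rank=5)
print(np.linalg.norm(V - W @ H) / np.linalg.norm(V))   # rank-5 reconstruction error
```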