scispace - formally typeset
Author

Robert A. McDougal

Bio: Robert A. McDougal is an academic researcher from Yale University. He has contributed to research topics including Medicine and Biology, has an h-index of 14, and has co-authored 35 publications receiving 625 citations.

Papers
Journal ArticleDOI
TL;DR: A future is predicted for neuroscience that is increasingly fueled by new technology and high-performance computation, and increasingly in need of comprehensive, user-friendly databases such as ModelDB to integrate data for deeper insights into brain function in health and disease.
Abstract: Neuron modeling may be said to have originated with the Hodgkin and Huxley action potential model in 1952 and Rall's models of integrative activity of dendrites in 1964. Over the ensuing decades, these approaches have led to a massive development of increasingly accurate and complex data-based models of neurons and neuronal circuits. ModelDB was founded in 1996 to support this new field and enhance the scientific credibility and utility of computational neuroscience models by providing a convenient venue for sharing them. It has grown to include over 1100 published models covering more than 130 research topics. It is actively curated and developed to help researchers discover and understand models of interest. ModelDB also provides mechanisms to assist running models both locally and remotely, and has a graphical tool that enables users to explore the anatomical and biophysical properties that are represented in a model. Each of its capabilities is undergoing continued refinement and improvement in response to user experience. Large research groups (Allen Brain Institute, EU Human Brain Project, etc.) are emerging that collect data across multiple scales and integrate that data into many complex models, presenting new challenges of scale. We end by predicting a future for neuroscience increasingly fueled by new technology and high performance computation, and increasingly in need of comprehensive user-friendly databases such as ModelDB to provide the means to integrate the data for deeper insights into brain function in health and disease.

180 citations

Journal ArticleDOI
26 Apr 2019-eLife
TL;DR: The NetPyNE tool provides both programmatic and graphical interfaces to develop data-driven multiscale network models in NEURON, and facilitates model sharing by exporting and importing standardized formats (NeuroML and SONATA).
Abstract: The approximately 100 billion neurons in our brain are responsible for everything we do and experience. Experiments aimed at discovering how these cells encode and process information generate vast amounts of data. These data span multiple scales, from interactions between individual molecules to coordinated waves of electrical activity that spread across the entire brain surface. To understand how the brain works, we must combine and make sense of these diverse types of information. Computational modeling provides one way of doing this. Using equations, we can calculate the chemical and electrical changes that take place in neurons. We can then build models of neurons and neural circuits that reproduce the patterns of activity seen in experiments. Exploring these models can provide insights into how the brain itself works. Several software tools are available to simulate neural circuits, but none provide an easy way of incorporating data that span different scales, from molecules to cells to networks. Moreover, most of the models require familiarity with computer programming. Dura-Bernal et al. have now developed a new software tool called NetPyNE, which allows users without programming expertise to build sophisticated models of brain circuits. It features a user-friendly interface for defining the properties of the model at molecular, cellular and circuit scales. It also provides an easy and automated method to identify the properties of the model that enable it to reproduce experimental data. Finally, NetPyNE makes it possible to run the model on supercomputers and offers a variety of ways to visualize and analyze the resulting output. Users can save the model and output in standardized formats, making them accessible to as many people as possible. Researchers in labs across the world have used NetPyNE to study different brain regions, phenomena and diseases. 
The software also features in courses that introduce students to neurobiology and computational modeling. NetPyNE can help to interpret isolated experimental findings, and also makes it easier to explore interactions between brain activity at different scales. This will enable researchers to decipher how the brain encodes and processes information, and ultimately could make it easier to understand and treat brain disorders.
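The digest describes building models that compute a neuron's electrical changes from equations and reproduce experimentally observed activity. As a minimal illustration of the kind of single-neuron model such tools build on (a generic leaky integrate-and-fire sketch, not NetPyNE's actual API), a step current above threshold drives repetitive spiking:

```python
import numpy as np

def simulate_lif(i_inj_nA, dt=0.025, tau=10.0, r_m=10.0,
                 v_rest=-65.0, v_thresh=-50.0, v_reset=-65.0):
    """Leaky integrate-and-fire: dV/dt = (v_rest - V + R*I) / tau,
    with a reset to v_reset whenever V crosses v_thresh."""
    v = np.full(len(i_inj_nA), v_rest)
    spikes = []            # spike times in ms
    v_t = v_rest
    for t, i_t in enumerate(i_inj_nA):
        v_t += dt * ((v_rest - v_t + r_m * i_t) / tau)  # forward Euler step
        if v_t >= v_thresh:
            spikes.append(t * dt)
            v_t = v_reset
        v[t] = v_t
    return v, spikes

# 200 ms of a 2 nA step: steady state v_rest + R*I = -45 mV exceeds
# threshold (-50 mV), so the model fires periodically.
i_inj = np.full(8000, 2.0)
v, spikes = simulate_lif(i_inj)
```

All parameter values here are illustrative defaults, not taken from the paper; network tools such as NetPyNE wrap far more detailed conductance-based models of this general form.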

100 citations

Journal ArticleDOI
22 Jan 2020-eLife
TL;DR: Human Neocortical Neurosolver (HNN) has a graphical user interface designed to help researchers and clinicians interpret the neural origins of MEG/EEG and its ability to associate signals across scales makes it a unique tool for translational neuroscience research.
Abstract: Neurons carry information in the form of electrical signals. Each of these signals is too weak to detect on its own. But the combined signals from large groups of neurons can be detected using techniques called EEG and MEG. Sensors on or near the scalp detect changes in the electrical activity of groups of neurons from one millisecond to the next. These recordings can also reveal changes in brain activity due to disease. But how do EEG/MEG signals relate to the activity of neural circuits? While neuroscientists can rarely record electrical activity from inside the human brain, it is much easier to do so in other animals. Computer models can then compare these recordings from animals to the signals in human EEG/MEG to infer how the activity of neural circuits is changing. But building and interpreting these models requires advanced skills in mathematics and programming, which not all researchers possess. Neymotin et al. have therefore developed a user-friendly software platform that can help translate human EEG/MEG recordings into circuit-level activity. Known as the Human Neocortical Neurosolver, or HNN for short, the open-source tool enables users to develop and test hypotheses on the neural origin of EEG/MEG signals. The model simulates the electrical activity of cells in the outer layers of the human brain, the neocortex. By feeding human EEG/MEG data into the model, researchers can predict patterns of circuit-level activity that might have given rise to the EEG/MEG data. The HNN software includes tutorials and example datasets for commonly measured signals, including brain rhythms. It is free to use and can be installed on all major computer platforms or run online. HNN will help researchers and clinicians who wish to identify the neural origins of EEG/MEG signals in the healthy or diseased brain. Likewise, it will be useful to researchers studying brain activity in animals, who want to know how their findings might relate to human EEG/MEG signals. 
As HNN is suitable for users without training in computational neuroscience, it offers an accessible tool for discoveries in translational neuroscience.

68 citations

Journal ArticleDOI
TL;DR: NEURON's Reaction-Diffusion (rxd) module provides specification and simulation of intracellular chemical dynamics coupled with the electrophysiological dynamics of the cell membrane, allowing arbitrary reaction formulas to be written in Python syntax and transparently compiled into bytecode that uses NumPy for fast vectorized calculations.
Abstract: In order to support research on the role of cell biological principles (genomics, proteomics, signaling cascades and reaction dynamics) on the dynamics of neuronal response in health and disease, NEURON has developed a Reaction-Diffusion (rxd) module in Python which provides specification and simulation for these dynamics, coupled with the electrophysiological dynamics of the cell membrane. Arithmetic operations on species and parameters are overloaded, allowing arbitrary reaction formulas to be specified using Python syntax. These expressions are then transparently compiled into bytecode that uses NumPy for fast vectorized calculations. At each time step, rxd combines NEURON's integrators with SciPy’s sparse linear algebra library.
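The numerical scheme the abstract describes, vectorized reaction evaluation combined with a diffusion operator, can be sketched in plain NumPy (this is a conceptual illustration of the numerics, not the rxd API itself; the geometry and rate constants are made up):

```python
import numpy as np

# 1D reaction-diffusion sketch: du/dt = D * d2u/dx2 - k * u
# (simple first-order decay reaction), explicit finite differences
# on a cable with sealed (zero-flux) ends.
n, dx, dt = 100, 1.0, 0.01
D, k = 1.0, 0.05
u = np.zeros(n)
u[n // 2] = 100.0                        # initial pulse in the middle

for _ in range(1000):
    lap = np.empty(n)
    lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
    lap[0] = (u[1] - u[0]) / dx**2       # zero-flux boundaries
    lap[-1] = (u[-2] - u[-1]) / dx**2
    u += dt * (D * lap - k * u)          # vectorized diffusion + reaction
```

With zero-flux boundaries the discrete diffusion operator conserves mass exactly, so total substance decays only through the reaction term. In practice rxd uses implicit methods built on NEURON's integrators and SciPy's sparse linear algebra rather than this explicit scheme.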

63 citations

Journal ArticleDOI
07 Aug 2019-Neuron
TL;DR: Open Source Brain is a platform for sharing, viewing, analyzing, and simulating standardized models from different brain regions and species, and it is demonstrated how existing components can be reused by constructing new models of inhibition-stabilized cortical networks that match recent experimental results.

60 citations


Cited by
01 Jan 2010
Abstract (translated from Japanese): Treatment of central nervous system disorders aims to preserve the function of normal cells (neurons), yet functional deficits often stem from cell death, as in cerebrovascular disease. Conversely, in the treatment of brain tumors, therapies that aim to kill tumor cells, such as pharmacotherapy and radiotherapy, play a major role. In either case, understanding the mechanisms of cell death is essential for understanding the pathology and its treatment. The best-studied form of cell death at present is apoptosis; this review outlines the mitochondrial reactions and anti-apoptotic factors that play a central role in it.

2,716 citations

Book ChapterDOI
01 Jul 2010

532 citations

Journal ArticleDOI
25 Nov 2019
TL;DR: It is demonstrated that machine learning and multiscale modeling can naturally complement each other to create robust predictive models that integrate the underlying physics to manage ill-posed problems and explore massive design spaces.
Abstract: Fueled by breakthrough technology developments, the biological, biomedical, and behavioral sciences are now collecting more data than ever before. There is a critical need for time- and cost-efficient strategies to analyze and interpret these data to advance human health. The recent rise of machine learning as a powerful technique to integrate multimodality, multifidelity data, and reveal correlations between intertwined phenomena presents a special opportunity in this regard. However, machine learning alone ignores the fundamental laws of physics and can result in ill-posed problems or non-physical solutions. Multiscale modeling is a successful strategy to integrate multiscale, multiphysics data and uncover mechanisms that explain the emergence of function. However, multiscale modeling alone often fails to efficiently combine large datasets from different sources and different levels of resolution. Here we demonstrate that machine learning and multiscale modeling can naturally complement each other to create robust predictive models that integrate the underlying physics to manage ill-posed problems and explore massive design spaces. We review the current literature, highlight applications and opportunities, address open questions, and discuss potential challenges and limitations in four overarching topical areas: ordinary differential equations, partial differential equations, data-driven approaches, and theory-driven approaches. Towards these goals, we leverage expertise in applied mathematics, computer science, computational biology, biophysics, biomechanics, engineering mechanics, experimentation, and medicine. Our multidisciplinary perspective suggests that integrating machine learning and multiscale modeling can provide new insights into disease mechanisms, help identify new targets and treatment strategies, and inform decision making for the benefit of human health.
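The complementarity the review argues for, a mechanistic model supplying the physics while a learning procedure fits it to data, can be shown with a deliberately tiny toy (entirely illustrative, not an example from the paper): recovering the decay rate of the ODE dy/dt = -k·y from noisy observations by minimizing squared error over simulations.

```python
import numpy as np

# Synthetic "experimental" data from a known mechanistic model plus noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 5, 50)
k_true = 0.8
data = np.exp(-k_true * t) + rng.normal(0, 0.01, t.size)

def model(k):
    # Closed-form solution of dy/dt = -k*y with y(0) = 1.
    return np.exp(-k * t)

# Grid search stands in for the learning component: pick the parameter
# whose simulation best matches the data in mean squared error.
ks = np.linspace(0.1, 2.0, 400)
losses = [np.mean((model(k) - data) ** 2) for k in ks]
k_hat = ks[int(np.argmin(losses))]
```

Because the search is constrained to solutions of the governing equation, the fit cannot produce a non-physical trajectory, which is the review's central point about combining the two approaches.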

315 citations

Book ChapterDOI
01 Jan 2006

197 citations

Journal ArticleDOI
TL;DR: BioModels has become the world’s largest repository of curated models and emerged as the third most used data resource after PubMed and Google Scholar among the scientists who use modelling in their research.
Abstract: Computational modelling has become increasingly common in life science research. To provide a platform to support universal sharing, easy accessibility and model reproducibility, BioModels (https://www.ebi.ac.uk/biomodels/), a repository for mathematical models, was established in 2005. The current BioModels platform allows submission of models encoded in diverse modelling formats, including SBML, CellML, PharmML, COMBINE archive, MATLAB, Mathematica, R, Python or C++. The models submitted to BioModels are curated to verify the computational representation of the biological process and the reproducibility of the simulation results in the reference publication. The curation also involves encoding models in standard formats and annotation with controlled vocabularies following MIRIAM (minimal information required in the annotation of biochemical models) guidelines. BioModels now accepts large-scale submission of auto-generated computational models. With gradual growth in content over 15 years, BioModels currently hosts about 2000 models from the published literature. With about 800 curated models, BioModels has become the world's largest repository of curated models and emerged as the third most used data resource after PubMed and Google Scholar among the scientists who use modelling in their research. Thus, BioModels benefits modellers by providing access to reliable and semantically enriched curated models in standard formats that are easy to share, reproduce and reuse.

193 citations