
Showing papers in "Frontiers in Neurology in 2015"


Journal ArticleDOI
TL;DR: Standard operating procedures (SOPs) with step-by-step instructions for a number of different validation parameters are included in the present work, together with a validation report template, which allow for a well-ordered presentation of the results.
Abstract: Biochemical markers have a central position in the diagnosis and management of patients in clinical medicine, as well as in clinical research and drug development, including for brain disorders such as Alzheimer's disease. The enzyme-linked immunosorbent assay (ELISA) is frequently used for measurement of low-abundance biomarkers. However, the quality of ELISA methods varies, which may introduce both systematic and random errors. This urges the need for more rigorous control of assay performance, regardless of its use in a research setting, in clinical routine, or in drug development. The aim of a method validation is to present objective evidence that a method fulfills the requirements for its intended use. Although much has been published on which parameters to investigate in a method validation, less is available at a detailed level on how to perform the corresponding experiments. To remedy this, standard operating procedures (SOPs) with step-by-step instructions for a number of different validation parameters are included in the present work, together with a validation report template, which allow for a well-ordered presentation of the results. Even though the SOPs were developed with the intended use for immunochemical methods and to be used for multicenter evaluations, most of them are generic and can be used for other technologies as well.
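Precision is one of the validation parameters such SOPs typically cover. As a rough illustration only (the replicate values and the acceptance threshold below are hypothetical and not taken from the paper's SOPs), intra-assay and inter-assay coefficients of variation can be summarized from replicate ELISA measurements of a quality-control sample:

```python
# Illustrative sketch (not the SOP from the paper): summarizing ELISA precision
# as coefficients of variation (CV%) from replicate measurements of one sample.
import statistics

# Hypothetical concentrations (pg/mL) of the same QC sample,
# measured in triplicate on three separate runs/plates.
runs = [
    [102.0, 98.5, 101.2],   # run 1
    [95.4, 97.8, 99.1],     # run 2
    [104.3, 100.9, 103.0],  # run 3
]

def cv_percent(values):
    """Coefficient of variation in percent (SD / mean * 100)."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

# Intra-assay (repeatability): CV within each run, averaged across runs.
intra_cv = statistics.mean(cv_percent(run) for run in runs)

# Inter-assay (intermediate precision): CV of the run means across runs.
inter_cv = cv_percent([statistics.mean(run) for run in runs])

print(f"intra-assay CV: {intra_cv:.1f}%")
print(f"inter-assay CV: {inter_cv:.1f}%")
# A typical (hypothetical) acceptance criterion might be CV below ~15-20%.
```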

336 citations


Journal ArticleDOI
TL;DR: In this paper, the same operator delivered every impulse to every subject, and the compensatory eye movement response to a small, unpredictable, abrupt head rotation (head impulse) was measured with the ICS Impulse prototype system.
Abstract: Background/Hypothesis. The video Head Impulse Test (vHIT) is now widely used to test the function of each of the six semicircular canals individually by measuring the eye rotation response to an abrupt head rotation in the plane of the canal. The main measure of canal adequacy is the ratio of the eye movement response to the head movement stimulus, i.e., the gain of the vestibulo-ocular reflex (VOR). However, there is a need for normative data about how VOR gain is affected by age and by head velocity, to allow the response of any particular patient to be compared to the responses of healthy subjects in their age range. In this study we determined, for all six semicircular canals, normative values of VOR gain across a range of head velocities, for healthy subjects in each decade of life. Study Design. The VOR gain was measured for all canals across a range of head velocities for at least 10 healthy subjects in each decade age band: 10-19, 20-29, 30-39, 40-49, 50-59, 60-69, 70-79, and 80-89 years. Methods. The compensatory eye movement response to a small, unpredictable, abrupt head rotation (head impulse) was measured by the ICS Impulse prototype system. The same operator delivered every impulse to every subject. Results. VOR gain decreased at high head velocities, but was largely unaffected by age into the 80-89 year age group. There were some small but systematic differences between the two directions of head rotation, which appear to be largely due to the fact that in this study only the right eye was measured. The results are considered in relation to recent evidence about the effect of age on VOR performance. Conclusion. These normative values allow the results of any particular patient to be compared to the values of healthy people in their age range and so allow, for example, detection of whether a patient has a bilateral vestibular loss. VOR gain, as measured directly by the eye movement response to head rotation, seems largely unaffected by aging.
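Since the gain is simply the ratio of the eye movement response to the head movement stimulus, a minimal numerical sketch is shown below on synthetic velocity traces; this illustrates the concept only and is not the algorithm implemented in the ICS Impulse system.

```python
# Illustrative VOR gain calculation on synthetic data (not the ICS Impulse
# algorithm): gain = area under eye velocity / area under head velocity
# over the duration of the head impulse.
import numpy as np

fs = 250.0                                  # sampling rate in Hz (assumed)
t = np.arange(0, 0.2, 1.0 / fs)             # 200 ms impulse window

# Synthetic head impulse: bell-shaped velocity profile peaking near 150 deg/s.
head_velocity = 150.0 * np.exp(-((t - 0.1) ** 2) / (2 * 0.03 ** 2))

# Synthetic compensatory eye velocity (opposite direction, slightly reduced).
eye_velocity = -0.85 * head_velocity

# Gain as the ratio of the integrated absolute velocities over the impulse.
vor_gain = np.trapz(np.abs(eye_velocity), t) / np.trapz(np.abs(head_velocity), t)
print(f"VOR gain: {vor_gain:.2f}")          # ~0.85 for this synthetic impulse
```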

315 citations


Journal ArticleDOI
TL;DR: Observational and experimental research suggests that a Mediterranean-type diet may reduce risk for first ischemic stroke with an effect size comparable to statin therapy, and diet modification might represent an effective intervention for secondary prevention.
Abstract: Diet is strongly associated with risk for first stroke. In particular, observational and experimental research suggests that a Mediterranean-type diet may reduce risk for first ischemic stroke with an effect size comparable to statin therapy. These data for first ischemic stroke suggest that diet may also be associated with risk for recurrent stroke and that diet modification might represent an effective intervention for secondary prevention. However, research on dietary pattern after stroke is limited and direct experimental evidence for a therapeutic effect in secondary prevention does not exist. The uncertain state of science in this area is reflected in recent guidelines on secondary stroke prevention from the American Heart Association, in which the Mediterranean-type diet is listed with only a class IIa recommendation (level of evidence C). To change guidelines and practice, research is needed, starting with efforts to better define current nutritional practices of stroke patients. Food frequency questionnaires (FFQ) and mobile applications for real-time recording of intake are available for this purpose. Dietary strategies for secondary stroke prevention are low risk, high potential, and warrant further evaluation.

252 citations


Journal ArticleDOI
TL;DR: The selective responses of microglia and macrophages to hypoxia after stroke are discussed and relevant markers are reviewed with the aim of defining the different subpopulations of myeloid cells that are recruited to the injured site.
Abstract: Cells of myeloid origin, such as microglia and macrophages, act at the crossroads of several inflammatory mechanisms during pathophysiology. Besides pro-inflammatory activity (M1 polarization), myeloid cells acquire protective functions (M2) and participate in the neuroprotective innate mechanisms after brain injury. Experimental research is making considerable efforts to understand the rules that regulate the balance between toxic and protective brain innate immunity. Environmental changes affect microglia/macrophage functions. Hypoxia can affect myeloid cell distribution, activity, and phenotype. With their intrinsic differences, microglia and macrophages respond differently to hypoxia, the former depending on ATP to activate and the latter switching to anaerobic metabolism and adapting to hypoxia. Myeloid cell functions include homeostasis control, damage-sensing activity, chemotaxis, and phagocytosis, all distinctive features of these cells. Specific markers and morphologies make it possible to recognize each functional state. To ensure homeostasis and activate when needed, microglia/macrophage physiology is finely tuned. Microglia are controlled by several neuron-derived components, including contact-dependent inhibitory signals and soluble molecules. Changes in this control can cause chronic activation or priming with specific functional consequences. Strategies, such as stem cell treatment, may enhance microglia protective polarization. This review presents data from the literature that have greatly advanced our understanding of myeloid cell action in brain injury. We discuss the selective responses of microglia and macrophages to hypoxia after stroke and review relevant markers with the aim of defining the different subpopulations of myeloid cells that are recruited to the injured site. We also cover the functional consequences of chronically active microglia and review pivotal works on microglia regulation that offer new therapeutic possibilities for acute brain injury.

210 citations


Journal ArticleDOI
TL;DR: Better knowledge of molecular and functional properties of astrocytes should promote understanding of their specific role in MS pathophysiology, and consequently lead to development of novel and more successful therapeutic approaches.
Abstract: Multiple sclerosis is an inflammatory disorder causing central nervous system demyelination and axonal injury. Although its etiology remains elusive, several lines of evidence support the concept that autoimmunity plays a major role in disease pathogenesis. The course of MS is highly variable; nevertheless, the majority of patients initially present a relapsing-remitting clinical course. After 10-15 years of disease, this pattern becomes progressive in up to 50% of untreated patients, during which time clinical symptoms slowly cause constant deterioration over a period of many years. In about 15% of MS patients, however, disease progression is relentless from disease onset. Published evidence supports the concept that progressive multiple sclerosis reflects a poorly understood mechanism of insidious axonal degeneration and neuronal loss. Recently, the type of microglial cell and astrocyte activation and proliferation observed has suggested that resident central nervous system cells may play a critical role in disease progression. Astrocytes could contribute to this process through several mechanisms: a) as part of the innate immune system, b) as a source of cytotoxic factors, c) by inhibiting re-myelination and axonal regeneration through formation of a glial scar, and d) by contributing to axonal mitochondrial dysfunction. Furthermore, regulatory mechanisms mediated by astrocytes can be affected by aging. Notably, astrocytes might also limit the detrimental effects of pro-inflammatory factors, while providing support and protection for oligodendrocytes and neurons. Because of the dichotomy observed in astrocytic effects, the design of therapeutic strategies targeting astrocytes becomes a challenging endeavor. Better knowledge of the molecular and functional properties of astrocytes should therefore promote understanding of their specific role in multiple sclerosis pathophysiology, and consequently lead to the development of novel and more successful therapeutic approaches.

200 citations


Journal ArticleDOI
TL;DR: Though in their infancy, epigenetics studies have the potential to increase the understanding of the etiology of ASD and may assist in the development of biomarkers for its prediction, diagnosis, prognosis, and eventually in its prevention and intervention.
Abstract: Autism spectrum disorders (ASD) are a heterogeneous group of neurodevelopmental disorders characterized by problems with social communication, social interaction, and repetitive or restricted behaviors. ASD are comorbid with other disorders including attention deficit hyperactivity disorder, epilepsy, Rett syndrome, and Fragile X syndrome. Neither the genetic nor the environmental components have been characterized well enough to aid diagnosis or treatment of non-syndromic ASD. However, genome-wide association studies have amassed evidence suggesting involvement of hundreds of genes and a variety of associated genetic pathways. Recently, investigators have turned to epigenetics, a prime mediator of environmental effects on genomes and phenotype, to characterize changes in ASD that constitute a molecular level on top of DNA sequence. Though in their infancy, such studies have the potential to increase our understanding of the etiology of ASD and may assist in the development of biomarkers for its prediction, diagnosis, prognosis, and eventually in its prevention and intervention. This review focuses on the first few epigenome-wide association studies of ASD and discusses future directions.

198 citations


Journal ArticleDOI
TL;DR: TCD can be considered an adaptive mechanism to retrieve missing auditory input in tinnitus; depending on the amount of deafferentation, it may change to a persisting, and thus pathological, parahippocampocortical theta–gamma rhythm.
Abstract: Tinnitus is the perception of a sound in the absence of a corresponding external sound source. Pathophysiologically it has been attributed to bottom-up deafferentation and/or a top-down noise-cancelling deficit. Both mechanisms are proposed to alter auditory thalamocortical signal transmission, resulting in thalamocortical dysrhythmia (TCD). In deafferentation, TCD is characterized by a slowing down of resting state alpha to theta activity associated with an increase in surrounding gamma activity, resulting in persisting cross-frequency coupling between theta and gamma activity. Theta burst-firing increases network synchrony and recruitment, a mechanism which might enable long-range synchrony, which in turn could represent a means for finding the missing thalamocortical information and for gaining access to consciousness. Theta oscillations could function as a carrier wave to integrate the tinnitus-related focal auditory gamma activity in a consciousness enabling network, as envisioned by the global workspace model. This model suggests that focal activity in the brain does not reach consciousness, except if the focal activity becomes functionally coupled to a consciousness enabling network, aka the global workspace. In limited deafferentation, the missing information can be retrieved from the auditory cortical neighborhood, decreasing surround inhibition, resulting in TCD. When the deafferentation is too wide in bandwidth, it is hypothesized that the missing information is retrieved from theta-mediated parahippocampal auditory memory. This suggests that, based on the amount of deafferentation, TCD might change to a persisting, and thus pathological, parahippocampocortical theta–gamma rhythm. From a Bayesian point of view, in which the brain is conceived as a prediction machine that updates its memory-based predictions through sensory updating, tinnitus is the result of a prediction error between the predicted and sensed auditory input. The decrease in sensory updating is reflected by decreased alpha activity, and the prediction error results in theta–gamma and beta–gamma coupling. Thus, TCD can be considered an adaptive mechanism to retrieve missing auditory input in tinnitus.

197 citations


Journal ArticleDOI
TL;DR: The authors previously proposed that increasing structural damage, in combination with an optimum curve of “functional reorganization,” results in a delayed, non-linear development of cognitive dysfunction; recent functional connectivity findings raise doubts about this concept of functional reorganization in MS.
Abstract: FUNCTIONAL REORGANIZATION IN MS: AN OUTDATED CONCEPT? The current field of multiple sclerosis (MS) research is an active and highly interesting one: structural abnormalities such as inflammatory lesions and brain atrophy are studied with a wide array of advanced neuroimaging techniques (1). These techniques are subsequently used to try to explain the large clinical heterogeneity in patients. Clinically important in MS is cognitive dysfunction, which is present in 40–70% of all patients (2, 3). Cognitive impairment in MS receives much attention, as there is currently no proven effective treatment, yet symptoms may start already in early stages of the disease (4). Cognitive decline is known to exert deleterious effects on psychosocial functioning (2, 5, 6). Traditional structural imaging measures like lesion volumes are notoriously poorly related to cognitive function (7), so a move toward more sensitive, comprehensive measures is required, such as those that measure brain function in addition to brain structure. Historically, most early imaging studies used the paced auditory serial addition test (PASAT) to study cognition in MS, a task that measures information processing speed (8–10). These studies observed a combination of hyperactivation of frontal regions in response to the task and a recruitment of additional areas not normally attributed to the task in controls. The functional changes were mostly positively related to the amount of structural damage in the brain, and were stronger in patients who scored normally on the PASAT, indicating that it might be a beneficial process. Later studies investigated other cognitive domains and also showed such an apparently beneficial increased local activation, for example, during a memory task in the hippocampus (11) and during the N-back working memory task in the dorsolateral prefrontal cortex (DLPFC) (12). Importantly, these studies also showed decreased activation in cognitively impaired patients. The body of literature at that point in time led to our previous hypothesis of functional reorganization in MS (13). This hypothesis asserted that a “compensatory” change is seen in the brains of MS patients in the form of an increase in brain function, i.e., both increased activation and increased connectivity. Functional connectivity is conceptually quite different from task-based activation and reflects the amount of communication between brain regions, i.e., coherent patterns of firing typically measured with correlation measures. Early connectivity studies investigated the so-called “default mode network” (DMN), which is only coherently active during a resting state. Two such studies found DMN changes that were interpreted in the same way as the task-based activation studies: increased DMN connectivity in clinically isolated syndrome (CIS) patients (14) and decreased DMN connectivity in progressive MS, which was related to cognitive impairment (15). We proposed that increasing structural damage, in combination with an optimum curve of “functional reorganization,” results in a delayed, non-linear development of cognitive dysfunction. However, the previous model was mostly based on task-based activation studies, while the connectivity field was still in its infancy. As the concept of functional reorganization was gaining support, the field was primed for finding cognitively relevant connectivity changes.
Interestingly, recent studies have mostly related increased functional connectivity to cognitive dysfunction, raising doubts about the previous concept of functional reorganization in MS. In this paper, we will review this recent functional connectivity literature and reiterate the case around functional connectivity changes in MS and their potential effects on cognition. Which reported connectivity changes can justifiably be said to be “compensatory” or “beneficial”? Which are likely “maladaptive”? Can any such label be assigned at all, based on the neuroscientific studies available? Is it perhaps time to revise our previous model of functional reorganization?

165 citations


Journal ArticleDOI
TL;DR: This paper reviews the structural and functional neuroimaging evidence, as well as pharmacological studies, suggesting that dopamine plays a critical role in the phenomenon of fatigue, and concludes by discussing how specific aspects of the dopamine imbalance hypothesis can be tested in future research.
Abstract: Fatigue is one of the most pervasive symptoms of multiple sclerosis (MS), and has engendered hundreds of investigations on the topic. While there is a growing literature using various methods to study fatigue, a unified theory of fatigue in MS is yet to emerge. In the current review, we synthesize findings from neuroimaging, pharmacological, neuropsychological, and immunological studies of fatigue in MS, which point to a specific hypothesis of fatigue in MS: the dopamine imbalance hypothesis. The communication between the striatum and prefrontal cortex is reliant on dopamine, a modulatory neurotransmitter. Neuroimaging findings suggest that fatigue results from the disruption of communication between these regions. Supporting the dopamine imbalance hypothesis, structural and functional neuroimaging studies show abnormalities in the frontal and striatal regions that are heavily innervated by dopamine neurons. Further, dopaminergic psychostimulant medication has been shown to alleviate fatigue in individuals with traumatic brain injury, chronic fatigue syndrome, and in cancer patients, also indicating that dopamine might play an important role in fatigue perception. This paper reviews the structural and functional neuroimaging evidence as well as pharmacological studies that suggest that dopamine plays a critical role in the phenomenon of fatigue. We conclude with how specific aspects of the dopamine imbalance hypothesis can be tested in future research.

163 citations


Journal ArticleDOI
TL;DR: Evidence is examined that current protocols are reasonably close to the optimal depth and duration of cooling, but that the optimal rate of rewarming after hypothermia is unclear and the potential for combination treatments to augment hypothermic neuroprotection has considerable promise.
Abstract: Hypoxia-ischemia before or around the time of birth occurs in approximately 2/1000 live births and is associated with a high risk of death or lifelong disability. Therapeutic hypothermia is now well established as standard treatment for infants with moderate to severe hypoxic-ischemic encephalopathy but is only partially effective. There is compelling preclinical and clinical evidence that hypothermia is most protective when it is started as early as possible after hypoxia-ischemia. Further improvements in outcome from therapeutic hypothermia are very likely to arise from strategies to reduce the delay before starting treatment of affected infants. In this review, we examine evidence that current protocols are reasonably close to the optimal depth and duration of cooling, but that the optimal rate of rewarming after hypothermia is unclear. The potential for combination treatments to augment hypothermic neuroprotection has considerable promise, particularly with endogenous targets such as melatonin and erythropoietin, and noble gases such as xenon. We dissect the critical importance of preclinical studies using realistic delays in treatment and clinically relevant cooling protocols when examining combination treatment, and note that for many strategies overlapping mechanisms of action can substantially attenuate any additional effects.

150 citations


Journal ArticleDOI
TL;DR: Overall, insufficient sleep was associated with being female, White or Black/African-American, unemployed, without health insurance, and not married; decreased age, income, education, physical activity; worse diet and overall health; and increased household size, alcohol, and smoking.
Abstract: Insufficient sleep is associated with cardiometabolic disease and poor health. However, few studies have assessed its determinants in a nationally representative sample. Data from the 2009 behavioral risk factor surveillance system were used (N = 323,047 adults). Insufficient sleep was assessed as insufficient rest/sleep over 30 days. This was evaluated relative to sociodemographics (age, sex, race/ethnicity, marital status, region), socioeconomics (education, income, employment, insurance), health behaviors (diet, exercise, smoking, alcohol), and health/functioning (emotional support, BMI, mental/physical health). Overall, insufficient sleep was associated with being female, White or Black/African-American, unemployed, without health insurance, and not married; decreased age, income, education, physical activity; worse diet and overall health; and increased household size, alcohol, and smoking. These factors should be considered as risk factors for insufficient sleep.
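A hedged sketch of the kind of cross-sectional analysis described follows; the file name and column names are hypothetical placeholders rather than actual BRFSS variable names, and survey weighting is omitted for brevity.

```python
# Illustrative sketch (hypothetical columns; not the authors' BRFSS 2009 code):
# logistic regression of insufficient sleep on sociodemographic, socioeconomic,
# and health-behavior covariates.
import pandas as pd
import statsmodels.formula.api as smf

# Assume a BRFSS-style extract with one row per respondent.
df = pd.read_csv("brfss_2009_extract.csv")   # hypothetical file

# Binary outcome: any insufficient rest/sleep reported over the past 30 days.
df["insufficient_sleep"] = (df["days_insufficient_sleep"] > 0).astype(int)

model = smf.logit(
    "insufficient_sleep ~ age + C(sex) + C(race) + C(employment)"
    " + C(marital_status) + C(income_bracket) + C(education)"
    " + bmi + C(smoker) + alcohol_drinks_per_week",
    data=df,
).fit()

print(model.summary())   # exponentiate coefficients to read them as odds ratios
```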

Journal ArticleDOI
TL;DR: In this article, advanced imaging techniques including perfusion MRI, diffusion MRI, MR spectroscopy, and new PET imaging tracers are reviewed and discussed in the context of determining response and progression during treatment of glioblastoma, both in the standard-of-care and in the clinical trial setting.
Abstract: Glioblastoma, the most common malignant primary brain tumor in adults, is a devastating diagnosis with an average survival of 14-16 months using the current standard of care treatment. The determination of treatment response and clinical decision making is based on the accuracy of radiographic assessment. Nevertheless, challenges exist in the neuroimaging evaluation of patients undergoing treatment for malignant glioma. Differentiating treatment response from tumor progression is problematic and currently combines long-term follow-up using standard MRI with clinical status and corticosteroid-dependency assessments. In the clinical trial setting, treatment with gene therapy, vaccines, immunotherapy, and targeted biologicals similarly produces MRI changes mimicking disease progression. A neuroimaging method to clearly distinguish between pseudoprogression and tumor progression has unfortunately not been found to date. With the incorporation of antiangiogenic therapies, a further pitfall in imaging interpretation is pseudoresponse. The Macdonald criteria, which correlate tumor burden with contrast-enhanced imaging, proved insufficient and misleading in the context of rapid blood-brain barrier normalization following antiangiogenic treatment that is not accompanied by the expected survival benefit. Even improved criteria, such as the RANO criteria, which incorporate non-enhancing disease, clinical status, and need for corticosteroid use, fall short of definitively distinguishing tumor progression, pseudoresponse, and pseudoprogression. This review focuses on advanced imaging techniques including perfusion MRI, diffusion MRI, MR spectroscopy, and new PET imaging tracers. The relevant image analysis algorithms and interpretation methods of these promising techniques are discussed in the context of determining response and progression during treatment of glioblastoma, both in the standard-of-care and in the clinical trial context.

Journal ArticleDOI
TL;DR: There is some evidence that activity in contralesional M1 will impact the extent of motor function of the paretic limb in the subacute and chronic phase post-stroke and may serve as a new target for rehabilitation treatment strategies, the precise factors that specifically influence its role in the recovery process remain to be defined.
Abstract: Identification of optimal treatment strategies to improve recovery is limited by the incomplete understanding of the neurobiological principles of recovery. Motor cortex (M1) reorganization of the lesioned hemisphere (ipsilesional M1) plays a major role in post-stroke motor recovery and is a primary target for rehabilitation therapy. Reorganization of M1 in the hemisphere contralateral to the stroke (contralesional M1) may, however, serve as an additional source of cortical reorganization and related recovery. The extent and outcome of such reorganization depends on many factors, including lesion size and time since stroke. In the chronic phase post-stroke, contralesional M1 seems to interfere with motor function of the paretic limb in a subset of patients, possibly through abnormally increased inhibition of lesioned M1 by the contralesional M1. In such patients, decreasing contralesional M1 excitability by cortical stimulation results in improved performance of the paretic limb. However, emerging evidence suggests a potentially supportive role of contralesional M1. After infarction of M1 or its corticospinal projections, there is abnormally increased excitatory neural activity and activation in contralesional M1 that correlates with favorable motor recovery. Decreasing contralesional M1 excitability in these patients may result in deterioration of paretic limb performance. In animal stroke models, reorganizational changes in contralesional M1 depend on the lesion size and rehabilitation treatment and include long-term changes in neurotransmitter systems, dendritic growth, and synapse formation. While there is, therefore, some evidence that activity in contralesional M1 will impact the extent of motor function of the paretic limb in the subacute and chronic phase post-stroke and may serve as a new target for rehabilitation treatment strategies, the precise factors that specifically influence its role in the recovery process remain to be defined.

Journal ArticleDOI
TL;DR: Current understanding of mechanisms generating healthy REM sleep and how dysfunction of these circuits contributes to common REM sleep disorders such as cataplexy/narcolepsy and RBD are synthesized.
Abstract: REM sleep is generated and maintained by the interaction of a variety of neurotransmitter systems in the brainstem, forebrain and hypothalamus. Within these circuits lies a core region that is active during REM sleep, known as the subcoeruleus nucleus (SubC) or sublaterodorsal nucleus. It is hypothesized that glutamatergic SubC neurons regulate REM sleep and its defining features such as muscle paralysis and cortical activation. REM sleep paralysis is initiated when glutamatergic SubC neurons activate neurons in the ventral medial medulla (VMM), which causes release of GABA and glycine onto skeletal motoneurons. REM sleep timing is controlled by activity of GABAergic neurons in the ventrolateral periaqueductal gray (vlPAG) and dorsal paragigantocellular reticular nucleus (DPGi) as well as melanin-concentrating hormone (MCH) neurons in the hypothalamus and cholinergic cells in the laterodorsal (LDT) and pedunculo-pontine tegmentum (PPT) in the brainstem. Determining how these circuits interact with the SubC is important because breakdown in their communication is hypothesized to underlie cataplexy/narcolepsy and REM sleep behaviour disorder (RBD). This review synthesizes our current understanding of mechanisms generating healthy REM sleep and how dysfunction of these circuits contributes to common REM sleep disorders such as cataplexy/narcolepsy and RBD.

Journal ArticleDOI
TL;DR: It is proposed that the cell cycle may be synchronized or slowed down through coupling with the circadian clock, resulting in reduced tumor growth; the findings are integrated with earlier results in both healthy and cancerous cells.
Abstract: Uncontrolled cell proliferation is one of the key features leading to cancer. Seminal works in chronobiology have revealed that disruption of the circadian timing system in mice, either by surgical, genetic, or environmental manipulation, increased tumor development. In humans, shift work is a risk factor for cancer. Based on these observations, the link between the circadian clock and cell cycle has become intuitive. But despite identification of molecular connections between the two processes, the influence of the clock on the dynamics of the cell cycle has never been formally observed. Recently, two studies combining single live cell imaging with computational methods have shed light on robust coupling between clock and cell cycle oscillators. We recapitulate here these novel findings and integrate them with earlier results in both healthy and cancerous cells. Moreover, we propose that the cell cycle may be synchronized or slowed down through coupling with the circadian clock, which results in reduced tumor growth. More than ever, systems biology has become instrumental to understand the dynamic interaction between the circadian clock and cell cycle, which is critical in cellular coordination and for diseases such as cancer.
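The coupling idea can be pictured with a toy model of two phase oscillators in which the circadian clock pulls the cell-cycle phase toward its own; the periods, coupling strength, and form of the coupling below are illustrative assumptions, not the computational models used in the cited single-cell studies.

```python
# Conceptual toy model (not the published models): a circadian clock and a
# cell cycle as phase oscillators, with one-way coupling from clock to cell
# cycle that can entrain (and thereby slow) the cell cycle.
import numpy as np

dt = 0.01                      # time step (hours)
hours = 240                    # simulate 10 days
steps = int(hours / dt)

w_clock = 2 * np.pi / 24.0     # intrinsic clock frequency (24 h period)
w_cycle = 2 * np.pi / 20.0     # intrinsic cell-cycle frequency (20 h period)
k = 0.08                       # coupling strength (clock -> cell cycle), assumed

phi_clock, phi_cycle = 0.0, 0.0
divisions = 0
for _ in range(steps):
    phi_clock += w_clock * dt
    # Kuramoto-style coupling term pulls the cell-cycle phase toward the clock.
    phi_cycle_new = phi_cycle + (w_cycle + k * np.sin(phi_clock - phi_cycle)) * dt
    # Count mitoses as completed cell-cycle revolutions.
    divisions += int(phi_cycle_new // (2 * np.pi)) - int(phi_cycle // (2 * np.pi))
    phi_cycle = phi_cycle_new

print(f"cell divisions in {hours} h with coupling: {divisions}")
print(f"divisions expected without coupling (20 h cycle): {hours // 20}")
```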

Journal ArticleDOI
TL;DR: Recent translational studies using movement recognition technology as a method of assessing movement in high risk infants are identified and may lead to a greater understanding of the development of the nervous system in infants at high risk of motor impairment.
Abstract: Preterm birth is associated with increased risks of neurological and motor impairments such as cerebral palsy. The risks are highest in those born at the lowest gestations. Early identification of those most at risk is challenging, meaning that a critical window of opportunity to improve outcomes through therapy-based interventions may be missed. Clinically, the assessment of spontaneous general movements is an important tool, which can be used for the prediction of movement impairments in high risk infants. Movement recognition aims to capture and analyze relevant limb movements through computerized approaches focusing on continuous, objective, and quantitative assessment. Different methods of recording and analyzing infant movements have recently been explored in high risk infants. These range from camera-based solutions to body-worn miniaturized movement sensors used to record continuous time-series data that represent the dynamics of limb movements. Various machine learning methods have been developed and applied to the analysis of the recorded movement data. This analysis has focused on the detection and classification of atypical spontaneous general movements. This article aims to identify recent translational studies using movement recognition technology as a method of assessing movement in high risk infants. The application of this technology within pediatric practice represents a growing area of inter-disciplinary collaboration, which may lead to a greater understanding of the development of the nervous system in infants at high risk of motor impairment.
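As an illustration of the machine-learning step described above (entirely synthetic data and deliberately simple features; not a method from any of the reviewed studies), sensor windows could be reduced to summary features and passed to an off-the-shelf classifier:

```python
# Illustrative sketch: classifying short 3-axis accelerometer windows from
# limb-worn sensors as typical vs. atypical general movements using simple
# hand-crafted features and a random forest (synthetic data throughout).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def extract_features(window):
    """Per-axis mean, standard deviation, and peak-to-peak amplitude."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           window.max(axis=0) - window.min(axis=0)])

def synth_window(atypical):
    """Synthetic stand-in for a labeled 5-second window at 50 Hz."""
    base = rng.normal(0, 1.0, size=(250, 3))
    if atypical:
        base *= 0.4          # e.g. reduced movement variability
    return base

labels = rng.integers(0, 2, size=200)
X = np.array([extract_features(synth_window(bool(a))) for a in labels])
y = labels

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated ROC AUC: {scores.mean():.2f}")
```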

Journal ArticleDOI
TL;DR: This review summarizes common and less frequent adverse events that have been discovered in preclinical and clinical investigations assessing cell therapies for stroke, and describes potential complications of clinically applicable administration procedures as well as detrimental interactions between therapeutic cells and the pathophysiological environment into which they are placed.
Abstract: Cell therapies are increasingly recognized as a promising option to augment the limited therapeutic arsenal available to fight ischemic stroke. During the last two decades, accumulating preclinical evidence has indicated substantial efficacy for most cell treatment paradigms, and first clinical trials are currently underway to assess safety and feasibility in patients. However, the strong and still unmet demand for novel stroke treatment options and exciting findings reported from experimental studies may have drawn our attention away from potential side effects related to cell therapies and the ways by which they are commonly applied. This review summarizes common and less frequent adverse events that have been discovered in preclinical and clinical investigations assessing cell therapies for stroke. Such adverse events range from immunological and neoplastic complications over seizures to cell clotting and cell-induced embolism. It also describes potential complications of clinically applicable administration procedures, detrimental interactions between therapeutic cells and the pathophysiological environment into which they are placed, as well as problems related to cell manufacturing. Virtually every therapeutic intervention comes with a certain risk of complications. Side effects therefore do not generally compromise the value of cell treatments for stroke, but underestimating such complications might severely limit the therapeutic safety and efficacy of cell treatment protocols currently under development. On the other hand, a better understanding will provide opportunities to further improve existing therapeutic strategies and might help to define those circumstances under which an optimal effect can be realized. Hence, the review finally discusses strategies and recommendations allowing us to prevent or at least balance potential complications in order to ensure the maximum therapeutic benefit at minimum risk for stroke patients.

Journal ArticleDOI
TL;DR: Roles for a series of novel mechanisms in driving the immune suppression that is observed post-TBI are proposed, including the CNS-driven emergence into the circulation of myeloid-derived suppressor cells and suppressive neutrophil subsets.
Abstract: Nosocomial infections are a common occurrence in patients following traumatic brain injury (TBI) and are associated with an increased risk of mortality, longer length of hospital stay, and poor neurological outcome. Systemic immune suppression arising as a direct result of injury to the central nervous system (CNS) is considered to be primarily responsible for this increased incidence of infection, a view strengthened by recent studies that have reported novel changes in the composition and function of the innate and adaptive arms of the immune system post-TBI. However, our knowledge of the mechanisms that underlie TBI-induced immune suppression is equivocal at best. Here, after summarizing our current understanding of the impact of TBI on peripheral immunity and discussing CNS-mediated regulation of immune function, we propose roles for a series of novel mechanisms in driving the immune suppression that is observed post-TBI. These mechanisms, which have never been considered before in the context of TBI-induced immune paresis, include the CNS-driven emergence into the circulation of myeloid-derived suppressor cells and suppressive neutrophil subsets, and the release from injured tissue of nuclear and mitochondria-derived damage associated molecular patterns. Moreover, in an effort to further our understanding of the mechanisms that underlie TBI-induced changes in immunity, we pose throughout the review a series of questions, which if answered would address a number of key issues, such as establishing whether manipulating peripheral immune function has potential as a future therapeutic strategy by which to treat and/or prevent infections in the hospitalized TBI patient.

Journal ArticleDOI
TL;DR: The affected neurological circuits underlying gait and temporal processing in PD patients are examined and the current studies demonstrating the effects of RAS on improving these gait deficits are summarized.
Abstract: Gait abnormalities, such as shuffling steps, start hesitation, and freezing, are common and often incapacitating symptoms of Parkinson's disease (PD) and other parkinsonian disorders. Pharmacological and surgical approaches have only limited efficacy in treating these gait disorders. Rhythmic auditory stimulation (RAS), such as playing marching music and dance therapy, has been shown to be a safe, inexpensive, and an effective method in improving gait in PD patients. However, RAS that adapts to patients' movements may be more effective than rigid, fixed-tempo RAS used in most studies. In addition to auditory cueing, immersive virtual reality technologies that utilize interactive computer-generated systems through wearable devices are increasingly used for improving brain-body interaction and sensory-motor integration. Using multisensory cues, these therapies may be particularly suitable for the treatment of parkinsonian freezing and other gait disorders. In this review, we examine the affected neurological circuits underlying gait and temporal processing in PD patients and summarize the current studies demonstrating the effects of RAS on improving these gait deficits.

Journal ArticleDOI
TL;DR: This review examines the current literature regarding blood-based proteomic biomarkers of AD and its associated pathology; research in the field has adopted various “omics” approaches to this end.
Abstract: The complexity of Alzheimer’s disease (AD) and its long prodromal phase poses challenges for early diagnosis and yet allows for the possibility of the development of disease modifying treatments for secondary prevention. It is, therefore, of importance to develop biomarkers, in particular, in the preclinical or early phases that reflect the pathological characteristics of the disease and, moreover, could be of utility in triaging subjects for preventative therapeutic clinical trials. Much research has sought biomarkers for diagnostic purposes by comparing affected people to unaffected controls. However, given that AD pathology precedes disease onset, a pathology endophenotype design for biomarker discovery creates the opportunity for detection of much earlier markers of disease. Blood-based biomarkers potentially provide a minimally invasive option for this purpose and research in the field has adopted various “omics” approaches in order to achieve this. This review will, therefore, examine the current literature regarding blood-based proteomic biomarkers of AD and its associated pathology.

Journal ArticleDOI
TL;DR: The effect of pathogenic mutations on the structure of the channel protein, the rate of recurrent mutation, and changes in channel function underlying this devastating disorder are discussed.
Abstract: Mutations of the voltage-gated sodium channel SCN8A have been identified in approximately 1% of nearly 1,000 children with early-infantile epileptic encephalopathies (EIEE) who have been tested by DNA sequencing. EIEE caused by mutation of SCN8A is designated EIEE13 (OMIM #614558). Affected children have seizure onset before 18 months of age as well as developmental and cognitive disabilities, movement disorders, and a high incidence of sudden death (SUDEP). EIEE13 is caused by de novo missense mutations of evolutionarily conserved residues in the Nav1.6 channel protein. One-third of the mutations are recurrent, and many occur at CpG dinucleotides. In this review we discuss the effect of pathogenic mutations on the structure of the channel protein, the rate of recurrent mutation, and changes in channel function underlying this devastating disorder.

Journal ArticleDOI
TL;DR: 10 major recommendations are proposed as ways to identify an optimal functional recovery of vestibular loss patients, and the crucial role of active and early VR therapy and the importance of motivational and ecologic contexts are proposed.
Abstract: This review questions the relationships between the plastic events responsible for the recovery of vestibular function after a unilateral vestibular loss (vestibular compensation), which has been well described in animal models in recent decades, and the vestibular rehabilitation (VR) therapy elaborated on a more empirical basis for vestibular loss patients. The main objective is not to propose a catalogue of results but to provide clinicians with an understandable view on when and how to perform VR therapy, and why VR may benefit from basic knowledge and may influence the recovery process. With this perspective, 10 major recommendations are proposed as ways to identify an optimal functional recovery. Among them are the crucial role of active and early VR therapy, coinciding with a post-lesion sensitive period for neuronal network remodelling, the instructive role that VR therapy may play in this functional reorganisation, the need for progression in the VR therapy protocol, which is based mainly on adaptation processes, the necessity to take into account the sensorimotor, cognitive and emotional profile of the patient to propose individual or “a la carte” VR therapies, and the importance of motivational and ecologic contexts. There are very likely more than 10 such general principles, but these seem crucial for the fast recovery of vestibular loss patients and for ensuring a good quality of life.

Journal ArticleDOI
TL;DR: A narrative review of data from studies utilizing DTI, MRS, fMRI, EEG, and brain stimulation techniques focusing on TMS and its combination with uni- and multimodal neuroimaging methods to assess recovery after stroke is provided.
Abstract: Following stroke, the brain undergoes various stages of recovery where the central nervous system can reorganize neural circuitry (neuroplasticity) both spontaneously and with the aid of behavioral rehabilitation and non-invasive brain stimulation. Multiple neuroimaging techniques can characterize common structural and functional stroke-related deficits, and importantly, help predict recovery of function. Diffusion tensor imaging (DTI) typically reveals increased overall diffusivity throughout the brain following stroke, and is capable of indexing the extent of white matter damage. Magnetic resonance spectroscopy (MRS) provides an index of metabolic changes in surviving neural tissue after stroke, serving as a marker of brain function. The neural correlates of altered brain activity after stroke have been demonstrated by abnormal activation of sensorimotor cortices during task performance, and at rest, using functional magnetic resonance imaging (fMRI). Electroencephalography (EEG) has been used to characterize motor dysfunction in terms of increased cortical amplitude in the sensorimotor regions when performing upper limb movement, indicating abnormally increased cognitive effort and planning in individuals with stroke. Transcranial magnetic stimulation (TMS) work reveals changes in ipsilesional and contralesional cortical excitability in the sensorimotor cortices. The severity of motor deficits indexed using TMS has been linked to the magnitude of activity imbalance between the sensorimotor cortices. In this paper, we will provide a narrative review of data from studies utilizing DTI, MRS, fMRI, EEG, and brain stimulation techniques focusing on TMS and its combination with uni- and multimodal neuroimaging methods to assess recovery after stroke. Approaches that delineate the best measures with which to predict or positively alter outcomes will be highlighted.

Journal ArticleDOI
TL;DR: The finding that LDLs are decreased across the full range of audiometric frequencies, regardless of the pattern or degree of hearing loss, indicates that hyperacusis might be due to a generalized increase in auditory gain.
Abstract: Hyperacusis is a frequent auditory disorder where sounds of normal volume are perceived as too loud or even painfully loud. There is a high degree of co-morbidity between hyperacusis and tinnitus, most hyperacusis patients also have tinnitus, but only about 30-40% of tinnitus patients also show symptoms of hyperacusis. In order to elucidate the mechanisms of hyperacusis, detailed measurements of loudness discomfort levels (LDLs) across the hearing range would be desirable. However, previous studies have only reported LDLs for a restricted frequency range, e.g. from 0.5 to 4 kHz, or from 1 to 8 kHz. We have measured audiograms and LDLs in 381 patients with a primary complaint of hyperacusis for the full standard audiometric frequency range from 0.125 to 8 kHz. On average, patients had mild high-frequency hearing loss, but more than a third of the tested ears had normal hearing thresholds, i.e. ≤ 20 dB HL. LDLs were found to be significantly decreased compared to a normal-hearing reference group, with average values around 85 dB HL across the frequency range. However, receiver operating characteristic analysis showed that LDL measurements are neither sensitive nor specific enough to serve as a single test for hyperacusis. There was a moderate positive correlation between hearing thresholds and LDLs (r = 0.36), i.e. LDLs tended to be higher at frequencies where hearing loss was present, suggesting that hyperacusis is unlikely to be caused by hearing threshold increase, in contrast to tinnitus for which hearing loss is a main trigger. Moreover, our finding that LDLs are decreased across the full range of audiometric frequencies, regardless of the pattern or degree of hearing loss, indicates that hyperacusis might be due to a generalized increase in auditory gain. Tinnitus on the other hand is thought to be caused by neuroplastic changes in a restricted frequency range, suggesting that tinnitus and hyperacusis might not share a common mechanism.
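The two quantitative steps mentioned above (receiver operating characteristic analysis of LDLs as a single test, and the threshold-LDL correlation) can be sketched as follows; all numbers are synthetic and only loosely mimic the reported group means, not the actual patient data.

```python
# Illustrative sketch on synthetic data (not the study's measurements):
# ROC analysis of LDLs as a single hyperacusis test, and the correlation
# between hearing thresholds and LDLs.
import numpy as np
from scipy import stats
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

# Synthetic mean LDLs (dB HL) for hyperacusis patients and a reference group.
ldl_patients = rng.normal(85, 12, size=300)
ldl_controls = rng.normal(100, 10, size=120)

y_true = np.concatenate([np.ones_like(ldl_patients), np.zeros_like(ldl_controls)])
# Lower LDLs indicate hyperacusis, so score with the negated LDL.
scores = -np.concatenate([ldl_patients, ldl_controls])
print(f"ROC AUC for LDL as a single test: {roc_auc_score(y_true, scores):.2f}")

# Correlation between hearing threshold and LDL (moderate positive, per the paper).
thresholds_db = rng.normal(20, 15, size=300)
ldl_given_threshold = 80 + 0.3 * thresholds_db + rng.normal(0, 10, size=300)
r, p = stats.pearsonr(thresholds_db, ldl_given_threshold)
print(f"threshold-LDL correlation: r = {r:.2f} (p = {p:.1e})")
```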

Journal ArticleDOI
TL;DR: Functional neuroimaging techniques suggest a dysmodulation in the multimodal sensory integration and processing of vestibular and nociceptive information, resulting from a vestibulo-thalamo-cortical dysfunction, as the pathogenic mechanism underlying VM.
Abstract: Vestibular migraine (VM) is a common disorder in which genetic, epigenetic and environmental factors probably contribute to its development. The pathophysiology of VM is unknown; nevertheless in the last few years, several studies are contributing to understand the neurophysiological pathways involved in VM. The current hypotheses are mostly based on the knowledge of migraine itself. The evidence of trigeminal innervation of the labyrinth vessels and the localization of vasoactive neuropeptides in the perivascular afferent terminals of these trigeminal fibers support the involvement of the trigemino-vascular system. The neurogenic inflammation triggered by activation of the trigeminal-vestibulocochlear reflex, with the subsequent inner ear plasma protein extravasation and the release of inflammatory mediators, can contribute to a sustained activation and sensitization of the trigeminal primary afferent neurons explaining VM symptoms. The reciprocal connections between brainstem vestibular nuclei and the structures that modulate trigeminal nociceptive inputs (rostral ventromedial medulla, ventrolateral periaqueductal grey, locus coeruleus and nucleus raphe magnus) are critical to understand the pathophysiology of VM. Although cortical spreading depression can affect cortical areas involved in processing vestibular information, functional neuroimaging techniques suggest a dysmodulation in the multimodal sensory integration and processing of vestibular and nociceptive information, resulting from a vestibulo-thalamo-cortical dysfunction, as the pathogenic mechanism underlying VM. The elevated prevalence of VM suggests that multiple functional variants may confer a genetic susceptibility leading to a dysregulation of excitatory-inhibitory balance in brain structures involved in the processing of sensory information, vestibular inputs and pain. The interactions among several functional and structural neural networks could explain the pathogenic mechanisms of VM.

Journal ArticleDOI
TL;DR: In this article, a coordinated reset (CR) stimulation was developed to specifically counteract abnormal neuronal synchrony by desynchronization, and the potential efficacy of acoustic CR neuromodulation was shown in a clinical proof-of-concept trial.
Abstract: Tinnitus is the conscious perception of sound heard in the absence of physical sound sources external or internal to the body, reflected in aberrant neural synchrony of spontaneous or resting state brain activity. Neural synchrony is generated by the nearly simultaneous firing of individual neurons and by the synchronization of membrane potential changes in local neural groups, as reflected in the local field potentials, resulting in the presence of oscillatory brain waves in the EEG. Noise-induced hearing loss, often resulting in tinnitus, causes a reorganization of the tonotopic map in auditory cortex and increased spontaneous firing rates and neural synchrony. Spontaneous brain rhythms rely on neural synchrony. Abnormal neural synchrony in tinnitus appears to be confined to specific frequency bands of brain rhythms. Increases in delta-band activity are generated by deafferented/deprived neuronal networks resulting from hearing loss. Coordinated reset (CR) stimulation was developed in order to specifically counteract such abnormal neuronal synchrony by desynchronization. The goal of acoustic CR neuromodulation is to desynchronize tinnitus-related abnormal delta-band oscillations. CR neuromodulation does not require permanent stimulus delivery in order to achieve long-lasting desynchronization or even a full-blown anti-kindling, but may have cumulative effects, i.e., the effects of different CR epochs separated by pauses may accumulate. Unlike other approaches, acoustic CR neuromodulation does not intend to reduce tinnitus-related neuronal activity by employing lateral inhibition. The potential efficacy of acoustic CR neuromodulation was shown in a clinical proof-of-concept trial, in which effects achieved in 12 weeks of treatment delivered 4-6 h/day persisted through a preplanned 4-week therapy pause and showed sustained long-term effects after 10 months of therapy, with 75% of patients classified as responders.

Journal ArticleDOI
TL;DR: In this mini review, age-related degenerative processes that affect balance are presented and diagnostic and therapeutic approaches oriented to the specific impaired system, including visual, proprioceptive, and vestibular pathways, are proposed.
Abstract: The prevalence of vertigo and dizziness in people aged more than 60 years reaches 30%, and due to aging of the world population, the number of patients is rapidly increasing. The presence of dizziness in the elderly is a strong predictor of falls, which are the leading cause of accidental death in people older than 65 years. Balance disorders in the elderly constitute a major public health problem and require adequate diagnosis and management by trained physicians. In the elderly, common causes of vertigo may manifest differently, as patients tend to report less rotatory vertigo and more nonspecific dizziness and instability than younger patients, making diagnosis more complex. In this mini review, age-related degenerative processes that affect balance are presented. Diagnostic and therapeutic approaches oriented to the specific impaired system, including visual, proprioceptive, and vestibular pathways, are proposed. In addition, presbystasis (the loss of vestibular and balance functions associated with aging), benign paroxysmal positional vertigo, and stroke (in acute syndromes) should always be considered.

Journal ArticleDOI
TL;DR: NMT aims at enhancing sensory, cognitive, and motor functions (as in PD treatment, in which specific rhythmic techniques can strengthen and improve the rehabilitative process), and rhythm has a crucial role in rehabilitation.
Abstract: Parkinson's disease (PD) is a neurological disorder involving the progressive degeneration of the dopaminergic system, which gives rise to movement-related dysfunctions (such as bradykinesia, tremor, and rigidity) as well as other symptoms, mainly of cognitive and psychological nature. In the latter case, mood disorders prevail, frequently in the form of anxiety and depression, in all phases of the disease, sometimes even before the motor symptoms occur. Aarsland and colleagues (1) report that 35% of the patients affected by PD present with depression, whereas Richard (2) states that anxiety is found in 40% of the cases. The literature shows that playing and listening to music may modulate emotions, behaviors, movements, communication, and cognitive factors, modifying the activity of the brain areas involved in the perception and regulation of these aspects (3, 4). Music can produce substantial effects on movement-related symptoms as well as psychological ones in PD treatment. Concerning the first aspect, rhythm has a crucial role in rehabilitation, enhancing connections between the motor and auditory systems (5). The literature has shown how rhythmic auditory cue-based training can produce compensation within the cerebello-thalamo-cortical network, leading to beneficial effects, for example, improving not only speed and step length but also perceptual and motor timing abilities (6, 7). Areas involved in rhythm perception are closely related to those that regulate movement (such as the premotor cortex, supplementary motor area, cerebellum, and basal ganglia, especially the putamen) (8–18). A study conducted with fMRI (19) shows that whereas a regular pulse (in contrast to an irregular one) generally activates the basal ganglia in a significant way, this is not the case in PD. Other studies (7, 20) support the idea that external cues (in particular rhythmical cues) can modulate the activity within the impaired timing system. This may mean that a regular rhythmic pulse stimulates putamen activity, facilitating movement and providing an input for sequential movements and impaired automatized processes. Moreover, this could compensate for the lack of dopaminergic stimulation. Rhythm can also be perceived visually and through the tactile sense, but the reaction time of the human auditory system is shorter by 20–50 ms when compared to visual and tactile stimuli; moreover, the auditory system has a stronger capacity for perceiving rhythm periodicity and structure (6). Therefore, rhythm influences the kinetic system (through synchronization and adjustment of muscles to auditory stimuli), facilitates movement synchronization, coordination, and regularization, and may even produce an internal rhythm that persists in the absence of stimuli (21–23). Many studies report that musical rhythm in PD treatment can improve gait (speed, frequency, and step length), limb coordination, postural control, and balance (7, 18, 24–36). In view of the above, Neurologic Music Therapy (NMT), and especially Rhythmic Auditory Stimulation, one of its techniques, characterizes this approach to the disease: NMT aims at enhancing sensory, cognitive, and motor functions (as in PD treatment, in which specific rhythmic techniques can strengthen and improve the rehabilitative process).

Journal ArticleDOI
TL;DR: Through inclusion of additional significantly dysregulated analyte species, the new biomarker panel provides greater accuracy in the authors' cohort but remains limited by predictive power.
Abstract: We recently documented plasma lipid dysregulation in preclinical late-onset Alzheimer's disease (LOAD). A panel of 10 plasma lipids predicted phenoconversion and provided 90% sensitivity and 85% specificity in differentiating an at-risk group from those who would remain cognitively intact. Despite these encouraging results, low positive predictive values limit the clinical usefulness of this panel as a screening tool in subjects aged 70-80 years or younger. In this report, we re-examine our metabolomic data, analyzing baseline plasma specimens from our group of phenoconverters (n = 28) and a matched set of cognitively normal subjects (n = 73), and discover and internally validate a panel of 24 plasma metabolites. The new panel provides a classifier with a receiver operating characteristic area under the curve for the discovery and internal validation cohorts of 1.0 and 0.995 (95% confidence intervals of 1.0-1.0 and 0.981-1.0), respectively. Twenty-two of the 24 metabolites were significantly dysregulated lipids. While positive and negative predictive values were improved compared to our 10-lipid panel, low positive predictive values provide a reality check on the utility of such biomarkers in this age group (or younger). Through inclusion of additional significantly dysregulated analyte species, our new biomarker panel provides greater accuracy in our cohort but remains limited by predictive power. Unfortunately, the novel metabolite panel alone may not provide improvement in counseling and management of at-risk individuals but may further improve selection of subjects for LOAD secondary prevention trials. We expect that external validation will remain challenging due to our stringent study design, especially compared with more diverse subject cohorts. We do anticipate, however, external validation of reduced plasma lipid species as a predictor of phenoconversion to either prodromal or manifest LOAD.
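A hedged sketch of how such a panel classifier and its cross-validated discrimination might be evaluated is given below; the data are simulated with the reported group sizes and panel size, and the modeling choices (standardization plus L2-penalized logistic regression) are illustrative assumptions rather than the authors' pipeline.

```python
# Illustrative sketch (simulated data, not the authors' metabolomic pipeline):
# fitting a penalized logistic regression on a 24-analyte plasma panel and
# estimating discrimination of phenoconverters vs. cognitively normal subjects.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)

n_converters, n_normals, n_metabolites = 28, 73, 24
y = np.concatenate([np.ones(n_converters), np.zeros(n_normals)])

# Simulated metabolite levels with a modest group difference in each analyte.
X = rng.normal(0, 1, size=(y.size, n_metabolites))
X[y == 1] += 0.6

model = make_pipeline(StandardScaler(), LogisticRegression(penalty="l2", C=1.0))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
aucs = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
print(f"cross-validated ROC AUC: {aucs.mean():.2f} +/- {aucs.std():.2f}")

# Even with a high AUC, the positive predictive value can remain modest when
# phenoconversion is uncommon in the screened population, which is the
# limitation the abstract emphasizes.
```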

Journal ArticleDOI
TL;DR: A growing body of evidence suggests that the functional integrity of the SCN contributes to health, well-being, cognitive performance, and alertness; in contrast, deterioration of the 24-h rhythm is a risk factor for neurodegenerative disease, cancer, depression, and sleep disorders.
Abstract: In mammals, the suprachiasmatic nucleus (SCN) functions as a circadian clock that drives 24-h rhythms in both physiology and behavior. The SCN is a multicellular oscillator in which individual neurons function as cell-autonomous oscillators. The production of a coherent output rhythm is dependent upon mutual synchronization among single cells and requires both synaptic communication and gap junctions. Changes in phase-synchronization between individual cells have consequences on the amplitude of the SCN’s electrical activity rhythm, and these changes play a major role in the ability to adapt to seasonal changes. Both aging and sleep deprivation negatively affect the circadian amplitude of the SCN, whereas behavioral activity (i.e., exercise) has a positive effect on amplitude. Given that the amplitude of the SCN’s electrical activity rhythm is essential for achieving robust rhythmicity in physiology and behavior, the mechanisms that underlie neuronal synchronization warrant further study. A growing body of evidence suggests that the functional integrity of the SCN contributes to health, well-being, cognitive performance, and alertness; in contrast, deterioration of the 24-h rhythm is a risk factor for neurodegenerative disease, cancer, depression, and sleep disorders.