Journal ArticleDOI
TL;DR: The main challenges raised by imbalanced domains are discussed, a definition of the problem is proposed, the main approaches to these tasks are described, and a taxonomy of the methods is proposed.
Abstract: Many real-world data-mining applications involve obtaining predictive models using datasets with strongly imbalanced distributions of the target variable. Frequently, the least-common values of this target variable are associated with events that are highly relevant for end users (e.g., fraud detection, unusual returns on stock markets, anticipation of catastrophes, etc.). Moreover, the events may have different costs and benefits, which, when associated with the rarity of some of them on the available training data, creates serious problems to predictive modeling techniques. This article presents a survey of existing techniques for handling these important applications of predictive analytics. Although most of the existing work addresses classification tasks (nominal target variables), we also describe methods designed to handle similar problems within regression tasks (numeric target variables). In this survey, we discuss the main challenges raised by imbalanced domains, propose a definition of the problem, describe the main approaches to these tasks, propose a taxonomy of the methods, summarize the conclusions of existing comparative studies as well as some theoretical analyses of some methods, and refer to some related problems within predictive modeling.

730 citations
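As a concrete illustration of one data-level approach that surveys of this kind cover, here is a minimal random-oversampling sketch in NumPy (my own example, not code from the article; `X`, `y`, and the class labels are hypothetical):

```python
import numpy as np

def random_oversample(X, y, random_state=0):
    """Randomly duplicate minority-class rows until classes are balanced.

    A minimal illustration of a data-level approach to class imbalance;
    X is a 2-D feature array and y a 1-D array of class labels.
    """
    rng = np.random.default_rng(random_state)
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    X_parts, y_parts = [X], [y]
    for cls, count in zip(classes, counts):
        if count < target:
            idx = np.flatnonzero(y == cls)
            extra = rng.choice(idx, size=target - count, replace=True)
            X_parts.append(X[extra])
            y_parts.append(y[extra])
    return np.concatenate(X_parts), np.concatenate(y_parts)

# Example: 95 "normal" cases vs. 5 "rare" events.
X = np.random.randn(100, 3)
y = np.array([0] * 95 + [1] * 5)
X_bal, y_bal = random_oversample(X, y)
print(np.bincount(y_bal))  # -> [95 95]
```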


Journal ArticleDOI
18 Apr 2018-Nature
TL;DR: Normothermic machine perfusion of the liver improved early graft function, demonstrated by reduced peak serum aspartate transaminase levels and early allograft dysfunction rates, and improved organ utilization and preservation times, although no differences were seen in graft or patient survival.
Abstract: Liver transplantation is a highly successful treatment, but is severely limited by the shortage in donor organs. However, many potential donor organs cannot be used; this is because sub-optimal livers do not tolerate conventional cold storage and there is no reliable way to assess organ viability preoperatively. Normothermic machine perfusion maintains the liver in a physiological state, avoids cooling and allows recovery and functional testing. Here we show that, in a randomized trial with 220 liver transplantations, compared to conventional static cold storage, normothermic preservation is associated with a 50% lower level of graft injury, measured by hepatocellular enzyme release, despite a 50% lower rate of organ discard and a 54% longer mean preservation time. There was no significant difference in bile duct complications, graft survival or survival of the patient. If translated to clinical practice, these results would have a major impact on liver transplant outcomes and waiting list mortality.

730 citations


Journal ArticleDOI
TL;DR: It is found that approximately 1.3 billion people are at risk for cholera in endemic countries, and Sub-Saharan Africa accounts for the majority of this burden.
Abstract: Background The global burden of cholera is largely unknown because the majority of cases are not reported. The low reporting can be attributed to limited capacity of epidemiological surveillance and laboratories, as well as social, political, and economic disincentives for reporting. We previously estimated 2.8 million cases and 91,000 deaths annually due to cholera in 51 endemic countries. A major limitation in our previous estimate was that the endemic and non-endemic countries were defined based on the countries’ reported cholera cases. We overcame the limitation with the use of a spatial modelling technique in defining endemic countries, and accordingly updated the estimates of the global burden of cholera.

730 citations


Proceedings ArticleDOI
01 Apr 2017
TL;DR: This article showed that weight tying can reduce the size of neural translation models to less than half of their original size without harming their performance and proposed a new method of regularizing the output embedding.
Abstract: We study the topmost weight matrix of neural network language models. We show that this matrix constitutes a valid word embedding. When training language models, we recommend tying the input embedding and this output embedding. We analyze the resulting update rules and show that the tied embedding evolves in a more similar way to the output embedding than to the input embedding in the untied model. We also offer a new method of regularizing the output embedding. Our methods lead to a significant reduction in perplexity, as we are able to show on a variety of neural network language models. Finally, we show that weight tying can reduce the size of neural translation models to less than half of their original size without harming their performance.

730 citations
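To make the tying mechanism concrete, here is a minimal PyTorch sketch (my illustration under assumed dimensions, not the authors' code): the output projection reuses the input embedding matrix, so the model stores a single vocabulary-by-dimension matrix instead of two.

```python
import torch
import torch.nn as nn

class TiedLM(nn.Module):
    """Toy language model with tied input and output embeddings."""

    def __init__(self, vocab_size=10000, d_model=256, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.rnn = nn.LSTM(d_model, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size, bias=False)
        # Weight tying: the decoder reuses the embedding matrix, so the
        # model holds one |V| x d parameter matrix rather than two.
        # (Requires hidden == d_model so the shapes match.)
        self.out.weight = self.embed.weight

    def forward(self, tokens):
        h, _ = self.rnn(self.embed(tokens))
        return self.out(h)  # logits over the vocabulary

model = TiedLM()
logits = model(torch.randint(0, 10000, (2, 16)))
print(logits.shape)  # torch.Size([2, 16, 10000])
```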


Journal ArticleDOI
TL;DR: Deep learning predicts, from retinal images, cardiovascular risk factors—such as smoking status, blood pressure and age—not previously thought to be present or quantifiable in these images.
Abstract: Traditionally, medical discoveries are made by observing associations and then designing experiments to test these hypotheses. However, observing and quantifying associations in images can be difficult because of the wide variety of features, patterns, colors, values, and shapes in real data. In this paper, we use deep learning, a machine learning technique that learns its own features, to discover new knowledge from retinal fundus images. Using models trained on data from 284,335 patients, and validated on two independent datasets of 12,026 and 999 patients, we predict cardiovascular risk factors not previously thought to be present or quantifiable in retinal images, such as age (within 3.26 years), gender (0.97 AUC), smoking status (0.71 AUC), HbA1c (within 1.39%), systolic blood pressure (within 11.23 mmHg), as well as major adverse cardiac events (0.70 AUC). We further show that our models used distinct aspects of the anatomy to generate each prediction, such as the optic disc or blood vessels, opening avenues of further research.

730 citations


Journal ArticleDOI
TL;DR: Among patients with atherosclerotic vascular disease who were receiving intensive statin therapy, the use of anacetrapib resulted in a lower incidence of major coronary events than the use of placebo.
Abstract: Background Patients with atherosclerotic vascular disease remain at high risk for cardiovascular events despite effective statin-based treatment of low-density lipoprotein (LDL) cholesterol levels. The inhibition of cholesteryl ester transfer protein (CETP) by anacetrapib reduces LDL cholesterol levels and increases high-density lipoprotein (HDL) cholesterol levels. However, trials of other CETP inhibitors have shown neutral or adverse effects on cardiovascular outcomes. Methods We conducted a randomized, double-blind, placebo-controlled trial involving 30,449 adults with atherosclerotic vascular disease who were receiving intensive atorvastatin therapy and who had a mean LDL cholesterol level of 61 mg per deciliter (1.58 mmol per liter), a mean non-HDL cholesterol level of 92 mg per deciliter (2.38 mmol per liter), and a mean HDL cholesterol level of 40 mg per deciliter (1.03 mmol per liter). The patients were assigned to receive either 100 mg of anacetrapib once daily (15,225 patients) or matching placebo (15,224 patients). The primary outcome was the first major coronary event, a composite of coronary death, myocardial infarction, or coronary revascularization. Results During the median follow-up period of 4.1 years, the primary outcome occurred in significantly fewer patients in the anacetrapib group than in the placebo group (1640 of 15,225 patients [10.8%] vs. 1803 of 15,224 patients [11.8%]; rate ratio, 0.91; 95% confidence interval, 0.85 to 0.97; P=0.004). The relative difference in risk was similar across multiple prespecified subgroups. At the trial midpoint, the mean level of HDL cholesterol was higher by 43 mg per deciliter (1.12 mmol per liter) in the anacetrapib group than in the placebo group (a relative difference of 104%), and the mean level of non-HDL cholesterol was lower by 17 mg per deciliter (0.44 mmol per liter), a relative difference of -18%. There were no significant between-group differences in the risk of death, cancer, or other serious adverse events. Conclusions Among patients with atherosclerotic vascular disease who were receiving intensive statin therapy, the use of anacetrapib resulted in a lower incidence of major coronary events than the use of placebo. (Funded by Merck and others; Current Controlled Trials number, ISRCTN48678192; ClinicalTrials.gov number, NCT01252953; and EudraCT number, 2010-023467-18.)

730 citations


Journal ArticleDOI
TL;DR: Among patients with HCV genotype 2 or 3 with or without previous treatment, including those with compensated cirrhosis, 12 weeks of treatment with sofosbuvir-velpatasvir resulted in rates of sustained virologic response that were superior to those with standard treatment with sofosbuvir-ribavirin.
Abstract: Background In phase 2 trials, treatment with the combination of the nucleotide polymerase inhibitor sofosbuvir and the NS5A inhibitor velpatasvir resulted in high rates of sustained virologic response in patients chronically infected with hepatitis C virus (HCV) genotype 2 or 3. Methods We conducted two randomized, phase 3, open-label studies involving patients who had received previous treatment for HCV genotype 2 or 3 and those who had not received such treatment, including patients with compensated cirrhosis. In one trial, patients with HCV genotype 2 were randomly assigned in a 1:1 ratio to receive sofosbuvir–velpatasvir, in a once-daily, fixed-dose combination tablet (134 patients), or sofosbuvir plus weight-based ribavirin (132 patients) for 12 weeks. In a second trial, patients with HCV genotype 3 were randomly assigned in a 1:1 ratio to receive sofosbuvir–velpatasvir for 12 weeks (277 patients) or sofosbuvir–ribavirin for 24 weeks (275 patients). The primary end point for the two trials was a sustained virologic response.

730 citations


MonographDOI
21 Apr 2022
TL;DR: This book discusses methods in corpus linguistics: interpreting concordance lines, applications of corpora in applied linguistics, and more.
Abstract: Corpus Linguistics has revolutionised the world of language study and is an essential component of work in Applied Linguistics. This book, now in its second edition, provides a thorough introduction to all the key research issues in Corpus Linguistics, from the point of view of Applied Linguistics. The field has progressed a great deal since the first edition, so this edition has been completely rewritten to reflect these advances, whilst still maintaining the emphasis on hands-on corpus research of the first edition. It includes chapters on qualitative and quantitative research, applications in language teaching, discourse studies, and beyond. It also includes an extensive discussion of the place of Corpus Linguistics in linguistic theory, and provides numerous detailed examples of corpus studies throughout. Providing an accessible but thorough grounding to the fascinating, fast-moving field of Corpus Linguistics, this book is essential reading for the student and the researcher alike.

730 citations


Journal ArticleDOI
Rebecca Sims1, Sven J. van der Lee2, Adam C. Naj3, Céline Bellenguez4, +484 more (120 institutions)
TL;DR: Three new genome-wide significant nonsynonymous variants associated with Alzheimer's disease are observed, providing additional evidence that the microglia-mediated innate immune response contributes directly to the development of Alzheimer's Disease.
Abstract: We identified rare coding variants associated with Alzheimer's disease in a three-stage case–control study of 85,133 subjects. In stage 1, we genotyped 34,174 samples using a whole-exome microarray. In stage 2, we tested associated variants (P < 1 × 10^-4) in 35,962 independent samples using de novo genotyping and imputed genotypes. In stage 3, we used an additional 14,997 samples to test the most significant stage 2 associations (P < 5 × 10^-8) using imputed genotypes. We observed three new genome-wide significant nonsynonymous variants associated with Alzheimer's disease: a protective variant in PLCG2 (rs72824905: p.Pro522Arg, P = 5.38 × 10^-10, odds ratio (OR) = 0.68, minor allele frequency (MAF) = 0.0059 in cases, 0.0093 in controls), a risk variant in ABI3 (rs616338: p.Ser209Phe, P = 4.56 × 10^-10, OR = 1.43, MAF = 0.011 in cases, 0.008 in controls), and a new genome-wide significant variant in TREM2 (rs143332484: p.Arg62His, P = 1.55 × 10^-14, OR = 1.67, MAF = 0.0143 in cases, 0.0089 in controls), a known susceptibility gene for Alzheimer's disease. These protein-altering changes are in genes highly expressed in microglia and highlight an immune-related protein–protein interaction network enriched for previously identified risk genes in Alzheimer's disease. These genetic findings provide additional evidence that the microglia-mediated innate immune response contributes directly to the development of Alzheimer's disease.

730 citations


Book
25 Jun 2015
TL;DR: In this paper, a new set of standard fire behavior fuel models for use with Rothermel's surface fire spread model is described, along with the relationship of the new set to the original set of 13 fire behavior fuel models.
Abstract: This report describes a new set of standard fire behavior fuel models for use with Rothermel's surface fire spread model and the relationship of the new set to the original set of 13 fire behavior fuel models. To assist with transition to using the new fuel models, a fuel model selection guide, fuel model crosswalk, and set of fuel model photos are provided.

730 citations


Journal ArticleDOI
TL;DR: The TRIPOD Statement aims to improve the transparency of the reporting of a prediction model study regardless of the study methods used, and it is recommended that authors include a completed checklist in their submission.

Journal ArticleDOI
16 Jan 2020-Nature
TL;DR: The worldwide distribution and water supply of water towers (snowy or glacierized mountain ranges) are indexed, showing that the most important water towers are also the most vulnerable to socio-economic and climate-change stresses, with huge potential negative impacts on populations downstream.
Abstract: Mountains are the water towers of the world, supplying a substantial part of both natural and anthropogenic water demands1,2. They are highly sensitive and prone to climate change3,4, yet their importance and vulnerability have not been quantified at the global scale. Here we present a global water tower index (WTI), which ranks all water towers in terms of their water-supplying role and the downstream dependence of ecosystems and society. For each water tower, we assess its vulnerability related to water stress, governance, hydropolitical tension and future climatic and socio-economic changes. We conclude that the most important (highest WTI) water towers are also among the most vulnerable, and that climatic and socio-economic changes will affect them profoundly. This could negatively impact 1.9 billion people living in (0.3 billion) or directly downstream of (1.6 billion) mountainous areas. Immediate action is required to safeguard the future of the world’s most important and vulnerable water towers.

Proceedings ArticleDOI
27 Jun 2016
TL;DR: A new class of fast algorithms for convolutional neural networks is introduced using Winograd's minimal filtering algorithms, which compute minimal-complexity convolution over small tiles, making them fast with small filters and small batch sizes.
Abstract: Deep convolutional neural networks take GPU-days of computation to train on large data sets. Pedestrian detection for self-driving cars requires very low latency. Image recognition for mobile phones is constrained by limited processing resources. The success of convolutional neural networks in these situations is limited by how fast we can compute them. Conventional FFT-based convolution is fast for large filters, but state-of-the-art convolutional neural networks use small, 3 × 3 filters. We introduce a new class of fast algorithms for convolutional neural networks using Winograd's minimal filtering algorithms. The algorithms compute minimal complexity convolution over small tiles, which makes them fast with small filters and small batch sizes. We benchmark a GPU implementation of our algorithm with the VGG network and show state-of-the-art throughput at batch sizes from 1 to 64.
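As a concrete illustration of the minimal-filtering idea, here is a sketch of the classic 1-D Winograd F(2,3) algorithm in Python (an illustrative example, not the paper's GPU kernels): it produces two outputs of a 3-tap filter with four multiplications instead of six, and the 2-D tile algorithms are built by nesting this construction.

```python
import numpy as np

def winograd_f23(d, g):
    """Winograd F(2,3): two outputs of a 3-tap correlation with 4 multiplies.

    d: 4 input samples, g: 3 filter taps. Direct computation would need
    2 x 3 = 6 multiplications; the minimal algorithm uses 4.
    """
    d0, d1, d2, d3 = d
    g0, g1, g2 = g
    m1 = (d0 - d2) * g0
    m2 = (d1 + d2) * (g0 + g1 + g2) / 2
    m3 = (d2 - d1) * (g0 - g1 + g2) / 2
    m4 = (d1 - d3) * g2
    return np.array([m1 + m2 + m3, m2 - m3 - m4])

d = np.array([1.0, 2.0, 3.0, 4.0])
g = np.array([0.5, -1.0, 2.0])
direct = np.array([d[0:3] @ g, d[1:4] @ g])      # sliding 3-tap correlation
print(np.allclose(winograd_f23(d, g), direct))   # True
```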

Journal ArticleDOI
04 Apr 2019-Cell
TL;DR: The findings show that exosomal PD-L1 represents an unexplored therapeutic target for cancer patients, which could overcome resistance to current antibody approaches.

Journal ArticleDOI
02 Jun 2016-Nature
TL;DR: It is demonstrated that contextual memories encoded close in time are linked by directing storage into overlapping ensembles; in aged mice, increasing cellular excitability and activating a common ensemble of CA1 neurons during two distinct context exposures rescued the deficit in linking memories.
Abstract: Recent studies suggest that a shared neural ensemble may link distinct memories encoded close in time. According to the memory allocation hypothesis, learning triggers a temporary increase in neuronal excitability that biases the representation of a subsequent memory to the neuronal ensemble encoding the first memory, such that recall of one memory increases the likelihood of recalling the other memory. Here we show in mice that the overlap between the hippocampal CA1 ensembles activated by two distinct contexts acquired within a day is higher than when they are separated by a week. Several findings indicate that this overlap of neuronal ensembles links two contextual memories. First, fear paired with one context is transferred to a neutral context when the two contexts are acquired within a day but not across a week. Second, the first memory strengthens the second memory within a day but not across a week. Older mice, known to have lower CA1 excitability, do not show the overlap between ensembles, the transfer of fear between contexts, or the strengthening of the second memory. Finally, in aged mice, increasing cellular excitability and activating a common ensemble of CA1 neurons during two distinct context exposures rescued the deficit in linking memories. Taken together, these findings demonstrate that contextual memories encoded close in time are linked by directing storage into overlapping ensembles. Alteration of these processes by ageing could affect the temporal structure of memories, thus impairing efficient recall of related information.

Journal ArticleDOI
27 Apr 2020-BMJ
TL;DR: Several mechanisms through which the pandemic response is likely to affect health are summarised: economic effects, social isolation, family relationships, health related behaviours, disruption to essential services, disrupted education, transport and green space, social disorder, and psychosocial effects.
Abstract: Countries worldwide have implemented strict controls on movement in response to the covid-19 pandemic. The aim is to cut transmission by reducing close contact (box 1), but the measures have profound consequences. Several sectors are seeing steep reductions in business, and there has been panic buying in shops. Social, economic, and health consequences are inevitable. [Box 1: Social distancing measures] The health benefits of social distancing measures are obvious, with a slower spread of infection reducing the risk that health services will be overwhelmed. But they may also prolong the pandemic and the restrictions adopted to mitigate it.1 Policy makers need to balance these considerations while paying attention to broader effects on health and health equity. Several groups may be particularly vulnerable to the effects of both the pandemic and the social distancing measures (box 2). Table 1 summarises several mechanisms through which the pandemic response is likely to affect health: economic effects, social isolation, family relationships, health related behaviours, disruption to essential services, disrupted education, transport and green space, social disorder, and psychosocial effects. Figure 1 shows the complexity of the pathways through which these effects may arise. Below we expand on the first three mechanisms, using Scotland as an example. The appendix on bmj.com provides further details of mechanisms, effects, and mitigation measures. [Box 2: Groups at particular risk from responses to covid-19]

Journal ArticleDOI
TL;DR: This review describes recent results in hydrothermal liquefaction (HTL) of biomass in continuous-flow processing systems, for which process models have been developed and mass and energy balances determined.

Posted Content
TL;DR: It is demonstrated that a third approach, synchronous optimization with backup workers, can avoid asynchronous noise while mitigating for the worst stragglers and is empirically validated and shown to converge faster and to better test accuracies.
Abstract: Distributed training of deep learning models on large-scale training data is typically conducted with asynchronous stochastic optimization to maximize the rate of updates, at the cost of additional noise introduced from asynchrony. In contrast, the synchronous approach is often thought to be impractical due to idle time wasted on waiting for straggling workers. We revisit these conventional beliefs in this paper, and examine the weaknesses of both approaches. We demonstrate that a third approach, synchronous optimization with backup workers, can avoid asynchronous noise while mitigating for the worst stragglers. Our approach is empirically validated and shown to converge faster and to better test accuracies.
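The following toy simulation (my own sketch, not the paper's TensorFlow implementation; worker delays and gradients are fabricated) shows the backup-worker idea: launch N + b workers per step, aggregate the first N gradients to arrive, and drop the stragglers.

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor, as_completed

def worker_gradient(worker_id):
    """Simulated worker: a random compute delay plus a toy scalar gradient."""
    time.sleep(random.expovariate(5.0))   # occasional slow stragglers
    return worker_id, 1.0

def sync_step_with_backups(n_required=8, n_backup=2):
    """Average the first n_required gradients of n_required + n_backup workers."""
    grads = []
    with ThreadPoolExecutor(max_workers=n_required + n_backup) as pool:
        futures = [pool.submit(worker_gradient, i)
                   for i in range(n_required + n_backup)]
        for fut in as_completed(futures):
            grads.append(fut.result())
            if len(grads) == n_required:  # ignore the slowest n_backup results
                break
    workers_used = sorted(w for w, _ in grads)
    avg_grad = sum(g for _, g in grads) / n_required
    return workers_used, avg_grad

print(sync_step_with_backups())
```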

Journal ArticleDOI
TL;DR: In this study, the selective BTK inhibitor acalabrutinib had promising safety and efficacy profiles in patients with relapsed CLL, including those with chromosome 17p13.1 deletion.
Abstract: BACKGROUND Irreversible inhibition of Bruton’s tyrosine kinase (BTK) by ibrutinib represents an important therapeutic advance for the treatment of chronic lymphocytic leukemia (CLL). However, ibrutinib also irreversibly inhibits alternative kinase targets, which potentially compromises its therapeutic index. Acalabrutinib (ACP-196) is a more selective, irreversible BTK inhibitor that is specifically designed to improve on the safety and efficacy of first-generation BTK inhibitors. METHODS In this uncontrolled, phase 1–2, multicenter study, we administered oral acalabrutinib to 61 patients who had relapsed CLL to assess the safety, efficacy, pharmacokinetics, and pharmacodynamics of acalabrutinib. Patients were treated with acalabrutinib at a dose of 100 to 400 mg once daily in the dose-escalation (phase 1) portion of the study and 100 mg twice daily in the expansion (phase 2) portion. RESULTS The median age of the patients was 62 years, and patients had received a median of three previous therapies for CLL; 31% had chromosome 17p13.1 deletion, and 75% had unmutated immunoglobulin heavy-chain variable genes. No dose-limiting toxic effects occurred during the dose-escalation portion of the study. The most common adverse events observed were headache (in 43% of the patients), diarrhea (in 39%), and increased weight (in 26%). Most adverse events were of grade 1 or 2. At a median follow-up of 14.3 months, the overall response rate was 95%, including 85% with a partial response and 10% with a partial response with lymphocytosis; the remaining 5% of patients had stable disease. Among patients with chromosome 17p13.1 deletion, the overall response rate was 100%. No cases of Richter’s transformation (CLL that has evolved into large-cell lymphoma) and only one case of CLL progression have occurred. CONCLUSIONS In this study, the selective BTK inhibitor acalabrutinib had promising safety and efficacy profiles in patients with relapsed CLL, including those with chromosome 17p13.1 deletion. (Funded by Acerta Pharma and others; ClinicalTrials.gov number, NCT02029443.)

Journal ArticleDOI
TL;DR: Returning to level I sports after ACL reconstruction leads to a more than 4-fold increase in reinjury rates over 2 years; delaying return to sport until at least 9 months after surgery and more symmetrical quadriceps strength prior to return substantially reduce the reinjury rate.
Abstract: Background Knee reinjury after ACL reconstruction is common and increases the risk of osteoarthritis. There is sparse evidence to guide return to sport (RTS) decisions in this population. Objectives To assess the relationship between knee reinjury after ACL reconstruction and (1) return to level I sports, (2) timing of RTS and (3) knee function prior to return. Methods 106 patients who participated in pivoting sports participated in this prospective 2-year cohort study. Sports participation and knee reinjury were recorded monthly. Knee function was assessed with the Knee Outcome Survey—Activities of Daily Living Scale, global rating scale of function, and quadriceps strength and hop test symmetry. Pass RTS criteria were defined as scores >90 on all tests, failure as failing any. Results Patients who returned to level I sports had a 4.32 (p=0.048) times higher reinjury rate than those who did not. The reinjury rate was significantly reduced by 51% for each month RTS was delayed until 9 months after surgery, after which no further risk reduction was observed. 38.2% of those who failed RTS criteria suffered reinjuries versus 5.6% of those who passed (HR 0.16, p=0.075). More symmetrical quadriceps strength prior to return significantly reduced the knee reinjury rate. Conclusions Returning to level I sports after ACL reconstruction leads to a more than 4-fold increase in reinjury rates over 2 years. RTS 9 months or later after surgery and more symmetrical quadriceps strength prior to return substantially reduce the reinjury rate.

Journal ArticleDOI
Lianne Schmaal1, Derrek P. Hibar2, Philipp G. Sämann3, Geoffrey B. Hall4, Bernhard T. Baune5, Neda Jahanshad2, Joshua W. Cheung2, T.G.M. van Erp6, Daniel Bos7, M. A. Ikram7, Meike W. Vernooij7, Wiro J. Niessen8, Wiro J. Niessen7, Henning Tiemeier7, Henning Tiemeier9, A. Hofman7, Katharina Wittfeld10, Hans-Jörgen Grabe10, Hans-Jörgen Grabe11, Deborah Janowitz11, Robin Bülow11, M Selonke11, Henry Völzke11, Dominik Grotegerd12, Udo Dannlowski12, Udo Dannlowski13, Volker Arolt12, Nils Opel12, Walter Heindel12, Harald Kugel12, D. Hoehn3, Michael Czisch3, Baptiste Couvy-Duchesne14, Baptiste Couvy-Duchesne15, Miguel E. Rentería14, Lachlan T. Strike15, Margaret J. Wright15, Natalie T. Mills14, Natalie T. Mills15, G.I. de Zubicaray16, Katie L. McMahon15, Sarah E. Medland14, Nicholas G. Martin14, Nathan A. Gillespie17, Roberto Goya-Maldonado18, Oliver Gruber19, Bernd Krämer19, Sean N. Hatton20, Jim Lagopoulos20, Ian B. Hickie20, Thomas Frodl21, Thomas Frodl22, Angela Carballedo21, Eva-Maria Frey23, L. S. van Velzen1, B.W.J.H. Penninx1, M-J van Tol24, N.J. van der Wee25, Christopher G. Davey26, Ben J. Harrison26, Benson Mwangi27, Bo Cao27, Jair C. Soares27, Ilya M. Veer28, Henrik Walter28, D. Schoepf29, Bartosz Zurowski30, Carsten Konrad13, Elisabeth Schramm31, Claus Normann31, Knut Schnell19, Matthew D. Sacchet32, Ian H. Gotlib32, Glenda MacQueen33, Beata R. Godlewska34, Thomas Nickson35, Andrew M. McIntosh36, Andrew M. McIntosh35, Martina Papmeyer37, Martina Papmeyer35, Heather C. Whalley35, Jeremy Hall38, Jeremy Hall35, J.E. Sussmann35, Meng Li39, Martin Walter39, Martin Walter40, Lyubomir I. Aftanas, Ivan Brack, Nikolay A. Bokhan41, Nikolay A. Bokhan42, Nikolay A. Bokhan43, Paul M. Thompson2, Dick J. Veltman1 
TL;DR: In this article, the authors present the largest ever worldwide study by the ENIGMA (Enhancing Neuro Imaging Genetics through Meta-Analysis) Major Depressive Disorder Working Group on cortical structural alterations in MDD.
Abstract: The neuro-anatomical substrates of major depressive disorder (MDD) are still not well understood, despite many neuroimaging studies over the past few decades. Here we present the largest ever worldwide study by the ENIGMA (Enhancing Neuro Imaging Genetics through Meta-Analysis) Major Depressive Disorder Working Group on cortical structural alterations in MDD. Structural T1-weighted brain magnetic resonance imaging (MRI) scans from 2148 MDD patients and 7957 healthy controls were analysed with harmonized protocols at 20 sites around the world. To detect consistent effects of MDD and its modulators on cortical thickness and surface area estimates derived from MRI, statistical effects from sites were meta-analysed separately for adults and adolescents. Adults with MDD had thinner cortical gray matter than controls in the orbitofrontal cortex (OFC), anterior and posterior cingulate, insula and temporal lobes (Cohen's d effect sizes: -0.10 to -0.14). These effects were most pronounced in first episode and adult-onset patients (>21 years). Compared to matched controls, adolescents with MDD had lower total surface area (but no differences in cortical thickness) and regional reductions in frontal regions (medial OFC and superior frontal gyrus) and primary and higher-order visual, somatosensory and motor areas (d: -0.26 to -0.57). The strongest effects were found in recurrent adolescent patients. This highly powered global effort to identify consistent brain abnormalities showed widespread cortical alterations in MDD patients as compared to controls and suggests that MDD may impact brain structure in a highly dynamic way, with different patterns of alterations at different stages of life.
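For readers unfamiliar with the site-level meta-analysis step, here is a minimal inverse-variance fixed-effect sketch in Python (illustrative only; the ENIGMA working group uses its own harmonized protocols and moderator analyses, and the numbers below are made up):

```python
import numpy as np

def fixed_effect_meta(d, var):
    """Inverse-variance fixed-effect pooling of per-site Cohen's d values.

    d:   array of site-level effect sizes
    var: array of their sampling variances
    Returns the pooled effect and its standard error.
    """
    d, var = np.asarray(d, float), np.asarray(var, float)
    w = 1.0 / var                        # weight each site by its precision
    pooled = np.sum(w * d) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return pooled, se

# Hypothetical cortical-thickness effects from three sites
d_sites = [-0.12, -0.09, -0.15]
var_sites = [0.002, 0.004, 0.003]
pooled, se = fixed_effect_meta(d_sites, var_sites)
print(f"pooled d = {pooled:.3f} +/- {1.96 * se:.3f} (95% CI half-width)")
```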

Journal ArticleDOI
Abstract: A. Khosroshahi, Z. S. Wallace, J. L. Crowe, T. Akamizu, A. Azumi, M. N. Carruthers, S. T. Chari, E. Della-Torre, L. Frulloni, H. Goto, P. A. Hart, T. Kamisawa, S. Kawa, M. Kawano, M. H. Kim, Y. Kodama, K. Kubota, M. M. Lerch, M. Löhr, Y. Masaki, S. Matsui, T. Mimori, S. Nakamura, T. Nakazawa, H. Ohara, K. Okazaki, J. H. Ryu, T. Saeki, N. Schleinitz, A. Shimatsu, T. Shimosegawa, H. Takahashi, M. Takahira, A. Tanaka, M. Topazian, H. Umehara, G. J. Webster, T. E. Witzig, M. Yamamoto, W. Zhang, T. Chiba, and J. H. Stone

Proceedings ArticleDOI
01 Jul 2015
TL;DR: This paper proposed and implemented an effective technique to address the problem of out-of-vocabulary (OOV) word translation in NMT, which trains an NMT system on data that is augmented by the output of a word alignment algorithm, and then uses this information in a post-processing step that translates every OOV word using a dictionary.
Abstract: Neural Machine Translation (NMT) is a new approach to machine translation that has shown promising results that are comparable to traditional approaches. A significant weakness in conventional NMT systems is their inability to correctly translate very rare words: end-to-end NMTs tend to have relatively small vocabularies with a single unk symbol that represents every possible out-of-vocabulary (OOV) word. In this paper, we propose and implement an effective technique to address this problem. We train an NMT system on data that is augmented by the output of a word alignment algorithm, allowing the NMT system to emit, for each OOV word in the target sentence, the position of its corresponding word in the source sentence. This information is later utilized in a post-processing step that translates every OOV word using a dictionary. Our experiments on the WMT’14 English to French translation task show that this method provides a substantial improvement of up to 2.8 BLEU points over an equivalent NMT system that does not use this technique. With 37.5 BLEU points, our NMT system is the first to surpass the best result achieved on a WMT’14 contest task.
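A minimal sketch of the post-processing step in Python (illustrative; the `<unk:i>` token format, example sentence, and dictionary are my own stand-ins for the paper's alignment-based annotations): each OOV token carries the position of its aligned source word, which is then translated with a dictionary or copied verbatim.

```python
import re

# Hypothetical bilingual dictionary for OOV words (source -> target)
OOV_DICT = {"Villeneuve": "Villeneuve", "chocolaterie": "chocolate shop"}

def replace_unks(target_tokens, source_tokens):
    """Post-process NMT output whose OOV tokens carry a source position.

    A token such as '<unk:4>' means the system aligned this OOV to source
    word 4; we translate it with a dictionary, or copy it verbatim as a
    fallback (useful for names and numbers).
    """
    out = []
    for tok in target_tokens:
        m = re.fullmatch(r"<unk:(\d+)>", tok)
        if m:
            src_word = source_tokens[int(m.group(1))]
            out.append(OOV_DICT.get(src_word, src_word))
        else:
            out.append(tok)
    return " ".join(out)

src = "M. Villeneuve a ouvert une chocolaterie".split()
hyp = ["Mr.", "<unk:1>", "opened", "a", "<unk:5>"]
print(replace_unks(hyp, src))  # Mr. Villeneuve opened a chocolate shop
```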

Proceedings ArticleDOI
13 Apr 2016
TL;DR: This paper proposes a deep learning approach for accelerating magnetic resonance imaging (MRI) using a large number of existing high quality MR images as the training datasets and an off-line convolutional neural network to identify the mapping relationship between the MR images obtained from zero-filled and fully-sampled k-space data.
Abstract: This paper proposes a deep learning approach for accelerating magnetic resonance imaging (MRI) using a large number of existing high quality MR images as the training datasets. An off-line convolutional neural network is designed and trained to identify the mapping relationship between the MR images obtained from zero-filled and fully-sampled k-space data. The network is not only capable of restoring fine structures and details but is also compatible with online constrained reconstruction methods. Experimental results on real MR data have shown encouraging performance of the proposed method for efficient and accurate imaging.
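To make the training setup concrete, here is a hedged NumPy sketch (not the authors' code; the phantom image and undersampling pattern are invented) of how a zero-filled reconstruction is paired with its fully sampled counterpart; a CNN would then be trained to map the former to the latter.

```python
import numpy as np

def make_training_pair(image, keep_fraction=0.3, seed=0):
    """Build one (zero-filled, fully-sampled) image pair from a 2-D image.

    The fully sampled image is the input itself; the zero-filled image is
    obtained by discarding a random subset of k-space lines and applying
    the inverse FFT, which mimics accelerated acquisition.
    """
    rng = np.random.default_rng(seed)
    kspace = np.fft.fftshift(np.fft.fft2(image))
    mask = rng.random(image.shape[0]) < keep_fraction    # sampled phase-encode lines
    undersampled = kspace * mask[:, None]                # zero out missing lines
    zero_filled = np.abs(np.fft.ifft2(np.fft.ifftshift(undersampled)))
    return zero_filled.astype(np.float32), image.astype(np.float32)

phantom = np.zeros((128, 128))
phantom[32:96, 48:80] = 1.0           # toy "anatomy"
x, y = make_training_pair(phantom)
print(x.shape, y.shape)               # the CNN learns the mapping x -> y
```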

Journal ArticleDOI
TL;DR: A review of the latest developments in TICT research from a materials chemistry point of view can be found in this paper, where the authors present a compact overview of the current state-of-the-art.
Abstract: Twisted intramolecular charge transfer (TICT) is an electron transfer process that occurs upon photoexcitation in molecules that usually consist of a donor and acceptor part linked by a single bond. Following intramolecular twisting, the TICT state returns to the ground state either through red-shifted emission or by nonradiative relaxation. The emission properties are potentially environment-dependent, which makes TICT-based fluorophores ideal sensors for solvents, (micro)viscosity, and chemical species. Recently, several TICT-based materials have been discovered to become fluorescent upon aggregation. Furthermore, various recent studies in organic optoelectronics, non-linear optics and solar energy conversions utilised the concept of TICT to modulate the electronic-state mixing and coupling on charge transfer states. This review presents a compact overview of the latest developments in TICT research, from a materials chemistry point of view.

Proceedings Article
16 Mar 2016
TL;DR: This paper implements Bitcoin-NG, a new blockchain protocol designed to scale, which is Byzantine fault tolerant, is robust to extreme churn, and shares the same trust model as Bitcoin, obviating qualitative changes to the ecosystem.
Abstract: Cryptocurrencies, based on and led by Bitcoin, have shown promise as infrastructure for pseudonymous online payments, cheap remittance, trustless digital asset exchange, and smart contracts. However, Bitcoin-derived blockchain protocols have inherent scalability limits that trade off between throughput and latency, which withhold the realization of this potential. This paper presents Bitcoin-NG (Next Generation), a new blockchain protocol designed to scale. Bitcoin-NG is a Byzantine fault tolerant blockchain protocol that is robust to extreme churn and shares the same trust model as Bitcoin. In addition to Bitcoin-NG, we introduce several novel metrics of interest in quantifying the security and efficiency of Bitcoin-like blockchain protocols. We implement Bitcoin-NG and perform large-scale experiments at 15% the size of the operational Bitcoin system, using unchanged clients of both protocols. These experiments demonstrate that Bitcoin-NG scales optimally, with bandwidth limited only by the capacity of the individual nodes and latency limited only by the propagation time of the network.

Journal ArticleDOI
13 Jan 2017-Gut
TL;DR: This consensus report strongly recommends the implementation of FMT centres for the treatment of C. difficile infection and outlines the technical, regulatory, administrative, and laboratory requirements.
Abstract: Faecal microbiota transplantation (FMT) is an important therapeutic option for Clostridium difficile infection. Promising findings suggest that FMT may play a role also in the management of other disorders associated with the alteration of gut microbiota. Although the health community is assessing FMT with renewed interest and patients are becoming more aware, there are technical and logistical issues in establishing such a non-standardised treatment into the clinical practice with safety and proper governance. In view of this, an evidence-based recommendation is needed to drive the practical implementation of FMT. In this European Consensus Conference, 28 experts from 10 countries collaborated, in separate working groups and through an evidence-based process, to provide statements on the following key issues: FMT indications; donor selection; preparation of faecal material; clinical management and faecal delivery and basic requirements for implementing an FMT centre. Statements developed by each working group were evaluated and voted by all members, first through an electronic Delphi process, and then in a plenary consensus conference. The recommendations were released according to best available evidence, in order to act as guidance for physicians who plan to implement FMT, aiming at supporting the broad availability of the procedure, discussing other issues relevant to FMT and promoting future clinical research in the area of gut microbiota manipulation. This consensus report strongly recommends the implementation of FMT centres for the treatment of C. difficile infection as well as traces the guidelines of technicality, regulatory, administrative and laboratory requirements.

Journal ArticleDOI
TL;DR: The demonstrated helicity multiplexed metasurface hologram with its high performance opens avenues for future applications with functionality switchable optical devices.
Abstract: Metasurfaces are engineered interfaces that contain a thin layer of plasmonic or dielectric nanostructures capable of manipulating light in a desirable manner. Advances in metasurfaces have led to various practical applications ranging from lensing to holography. Metasurface holograms that can be switched by the polarization state of incident light have been demonstrated for achieving polarization multiplexed functionalities. However, practical application of these devices has been limited by their capability for achieving high efficiency and high image quality. Here we experimentally demonstrate a helicity multiplexed metasurface hologram with high efficiency and good image fidelity over a broad range of frequencies. The metasurface hologram features the combination of two sets of hologram patterns operating with opposite incident helicities. Two symmetrically distributed off-axis images are interchangeable by controlling the helicity of the input light. The demonstrated helicity multiplexed metasurface hologram with its high performance opens avenues for future applications with functionality switchable optical devices.

Journal ArticleDOI
TL;DR: A framework to demonstrate how the temporal dynamics of the embedding helps to quantify changes in stereotypes and attitudes toward women and ethnic minorities in the 20th and 21st centuries in the United States is developed.
Abstract: Word embeddings are a powerful machine-learning framework that represents each English word by a vector. The geometric relationship between these vectors captures meaningful semantic relationships between the corresponding words. In this paper, we develop a framework to demonstrate how the temporal dynamics of the embedding helps to quantify changes in stereotypes and attitudes toward women and ethnic minorities in the 20th and 21st centuries in the United States. We integrate word embeddings trained on 100 y of text data with the US Census to show that changes in the embedding track closely with demographic and occupation shifts over time. The embedding captures societal shifts-e.g., the women's movement in the 1960s and Asian immigration into the United States-and also illuminates how specific adjectives and occupations became more closely associated with certain populations over time. Our framework for temporal analysis of word embedding opens up a fruitful intersection between machine learning and quantitative social science.
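A minimal sketch of this kind of measurement in Python (illustrative; the toy vectors and word lists stand in for the decade-specific embeddings and curated occupation lists used in the paper): for each decade's embedding, compare how strongly occupation words associate with female versus male anchor words.

```python
import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def gender_association(emb, occupations, female=("she", "her"), male=("he", "him")):
    """Mean (female - male) cosine association of occupation words in one embedding.

    emb maps word -> vector for a single decade; a positive score means the
    occupations sit closer to the female anchor words in that decade.
    """
    scores = []
    for w in occupations:
        f = np.mean([cosine(emb[w], emb[a]) for a in female])
        m = np.mean([cosine(emb[w], emb[a]) for a in male])
        scores.append(f - m)
    return float(np.mean(scores))

# Toy decade embeddings (random vectors stand in for trained ones)
rng = np.random.default_rng(0)
vocab = ["she", "her", "he", "him", "nurse", "engineer"]
emb_1950 = {w: rng.normal(size=50) for w in vocab}
emb_2000 = {w: rng.normal(size=50) for w in vocab}
for year, emb in [(1950, emb_1950), (2000, emb_2000)]:
    print(year, round(gender_association(emb, ["nurse", "engineer"]), 3))
```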

Journal ArticleDOI
Jordan Fallon1
28 Mar 2019
TL;DR: The conceptual and stylistic limits of Undoing the Demos are discussed in this article, along with the destructive effects of contemporary neoliberalism, construed largely as an insidious form of rationality rather than simply an economic system, and the hollowing out of democratic political life which has ensued from its ascension.
Abstract: This review charts the substantive theoretical import, diagnostic utility, as well as the conceptual and stylistic limits of Wendy Brown’s Undoing the Demos. Brown adamantly charts the destructive effects of contemporary neoliberalism, construed largely as an insidious form of rationality rather than simply an economic system, and the hollowing out of democratic political life which has ensued from its ascension. The account of neoliberalism supplied by Undoing the Demos presents an indispensable tool with which to forge modalities of both analysis and resistance yet also contains important limitations which circumscribe some of the book’s utility and gesture toward the need for critical supplement.