
Journal ArticleDOI
TL;DR: The material covered includes tensor rank and rank decomposition; basic tensor factorization models and their relationships and properties; broad coverage of algorithms ranging from alternating optimization to stochastic gradient; statistical performance analysis; and applications ranging from source separation to collaborative filtering, mixture and topic modeling, classification, and multilinear subspace learning.
Abstract: Tensors or multiway arrays are functions of three or more indices $(i,j,k,\ldots)$ —similar to matrices (two-way arrays), which are functions of two indices $(r,c)$ for (row, column). Tensors have a rich history, stretching over almost a century, and touching upon numerous disciplines; but they have only recently become ubiquitous in signal and data analytics at the confluence of signal processing, statistics, data mining, and machine learning. This overview article aims to provide a good starting point for researchers and practitioners interested in learning about and working with tensors. As such, it focuses on fundamentals and motivation (using various application examples), aiming to strike an appropriate balance of breadth and depth that will enable someone having taken first graduate courses in matrix algebra and probability to get started doing research and/or developing tensor algorithms and software. Some background in applied optimization is useful but not strictly required. The material covered includes tensor rank and rank decomposition; basic tensor factorization models and their relationships and properties (including fairly good coverage of identifiability); broad coverage of algorithms ranging from alternating optimization to stochastic gradient; statistical performance analysis; and applications ranging from source separation to collaborative filtering, mixture and topic modeling, classification, and multilinear subspace learning.
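
As a concrete illustration of the alternating-optimization algorithms surveyed, here is a minimal NumPy sketch of CP decomposition by alternating least squares (CP-ALS) for a three-way tensor; the function name, random initialization, and fixed iteration count are choices made for this sketch, not anything prescribed by the article.

```python
# Illustrative sketch (not the article's code): rank-R CP decomposition of a
# 3-way tensor by alternating least squares (ALS), using only NumPy.
import numpy as np

def cp_als(X, rank, n_iter=200, seed=0):
    """Return factors A, B, C with X[i,j,k] ~ sum_r A[i,r]*B[j,r]*C[k,r]."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    for _ in range(n_iter):
        # Each update solves a linear least-squares problem with the other
        # two factors held fixed (the "alternating optimization" of the overview).
        A = np.einsum('ijk,jr,kr->ir', X, B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = np.einsum('ijk,ir,kr->jr', X, A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = np.einsum('ijk,ir,jr->kr', X, A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Usage: recover a planted rank-3 tensor and report the relative fit error.
rng = np.random.default_rng(1)
A0, B0, C0 = rng.standard_normal((5, 3)), rng.standard_normal((6, 3)), rng.standard_normal((7, 3))
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(X, rank=3)
print(np.linalg.norm(X - np.einsum('ir,jr,kr->ijk', A, B, C)) / np.linalg.norm(X))
```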

1,284 citations


Book
30 Dec 2020
TL;DR: Hayek argues that socialism has, from its origins, been mistaken on factual, and even on logical, grounds and that its repeated failures in the many different practical applications of socialist ideas that this century has witnessed were the direct outcome of these errors as mentioned in this paper.
Abstract: Hayek gives the main arguments for the free-market case and presents his manifesto on the "errors of socialism." Hayek argues that socialism has, from its origins, been mistaken on factual, and even on logical, grounds and that its repeated failures in the many different practical applications of socialist ideas that this century has witnessed were the direct outcome of these errors. He labels as the "fatal conceit" the idea that "man is able to shape the world around him according to his wishes." "The achievement of "The Fatal Conceit" is that it freshly shows why socialism must be refuted rather than merely dismissed--then refutes it again."--David R. Henderson, "Fortune." "Fascinating. . . . The energy and precision with which Mr. Hayek sweeps away his opposition is impressive."--Edward H. Crane, "Wall Street Journal" F. A. Hayek is considered a pioneer in monetary theory, the preeminent proponent of the libertarian philosophy, and the ideological mentor of the Reagan and Thatcher "revolutions."

1,284 citations


Journal ArticleDOI
TL;DR: Based on MMR estimates for 2015, scenario-based projections are constructed to highlight the accelerations needed to accomplish the Sustainable Development Goal (SDG) global target of less than 70 maternal deaths per 100,000 live births globally by 2030.

1,284 citations


Journal ArticleDOI
TL;DR: PM2.5 exposure may be related to additional causes of death than the five considered by the GBD and that incorporation of risk information from other, nonoutdoor, particle sources leads to underestimation of disease burden, especially at higher concentrations.
Abstract: Exposure to ambient fine particulate matter (PM2.5) is a major global health concern. Quantitative estimates of attributable mortality are based on disease-specific hazard ratio models that incorporate risk information from multiple PM2.5 sources (outdoor and indoor air pollution from use of solid fuels and secondhand and active smoking), requiring assumptions about equivalent exposure and toxicity. We relax these contentious assumptions by constructing a PM2.5-mortality hazard ratio function based only on cohort studies of outdoor air pollution that covers the global exposure range. We modeled the shape of the association between PM2.5 and nonaccidental mortality using data from 41 cohorts from 16 countries-the Global Exposure Mortality Model (GEMM). We then constructed GEMMs for five specific causes of death examined by the global burden of disease (GBD). The GEMM predicts 8.9 million [95% confidence interval (CI): 7.5-10.3] deaths in 2015, a figure 30% larger than that predicted by the sum of deaths among the five specific causes (6.9; 95% CI: 4.9-8.5) and 120% larger than the risk function used in the GBD (4.0; 95% CI: 3.3-4.8). Differences between the GEMM and GBD risk functions are larger for a 20% reduction in concentrations, with the GEMM predicting 220% higher excess deaths. These results suggest that PM2.5 exposure may be related to additional causes of death than the five considered by the GBD and that incorporation of risk information from other, nonoutdoor, particle sources leads to underestimation of disease burden, especially at higher concentrations.
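
For readers who want the shape of such a hazard-ratio function, a commonly reported GEMM-style parameterization is sketched below; treat the symbols, the logistic weighting term, and the counterfactual concentration c as assumptions of this sketch rather than a restatement of the paper's fitted model.

```latex
% Sketch of a GEMM-style hazard-ratio curve (parameterization assumed, fitted values omitted):
% a log-linear term in transformed concentration, damped at low exposures by a logistic weight.
\[
\mathrm{HR}(z) \;=\; \exp\!\left\{ \theta \,
\frac{\log\!\left(1 + z/\alpha\right)}{1 + \exp\!\left(-(z-\mu)/\nu\right)} \right\},
\qquad z = \max\!\left(0,\; \mathrm{PM}_{2.5} - c\right),
\]
% where c is a counterfactual concentration below which no excess risk is assumed,
% and theta, alpha, mu, nu control the slope and curvature of the association.
```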

1,283 citations


Journal ArticleDOI
TL;DR: The ESP block holds promise as a simple and safe technique for thoracic analgesia in both chronic neuropathic pain as well as acute postsurgical or posttraumatic pain.

1,283 citations


Journal ArticleDOI
TL;DR: The guideline update reflects changes in evidence since the previous guideline, and inpatients and outpatients with advanced cancer should receive dedicated palliative care services, early in the disease course, concurrent with active treatment.
Abstract: Purpose To provide evidence-based recommendations to oncology clinicians, patients, family and friend caregivers, and palliative care specialists to update the 2012 American Society of Clinical Oncology (ASCO) provisional clinical opinion (PCO) on the integration of palliative care into standard oncology care for all patients diagnosed with cancer. Methods ASCO convened an Expert Panel of members of the ASCO Ad Hoc Palliative Care Expert Panel to develop an update. The 2012 PCO was based on a review of a randomized controlled trial (RCT) by the National Cancer Institute Physicians Data Query and additional trials. The panel conducted an updated systematic review seeking randomized clinical trials, systematic reviews, and meta-analyses, as well as secondary analyses of RCTs in the 2012 PCO, published from March 2010 to January 2016. Results The guideline update reflects changes in evidence since the previous guideline. Nine RCTs, one quasiexperimental trial, and five secondary analyses from RCTs in the 2012 PCO on providing palliative care services to patients with cancer and/or their caregivers, including family caregivers, were found to inform the update. Recommendations Inpatients and outpatients with advanced cancer should receive dedicated palliative care services, early in the disease course, concurrent with active treatment. Referral of patients to interdisciplinary palliative care teams is optimal, and services may complement existing programs. Providers may refer family and friend caregivers of patients with early or advanced cancer to palliative care services.

1,283 citations


Journal ArticleDOI
TL;DR: The JASPAR 2018 CORE vertebrate collection of PFMs was used to predict TF-binding sites in the human genome and this update comes with a new web framework with an interactive and responsive user-interface, along with new features.
Abstract: JASPAR (http://jaspar.genereg.net) is an open-access database of curated, non-redundant transcription factor (TF)-binding profiles stored as position frequency matrices (PFMs) and TF flexible models (TFFMs) for TFs across multiple species in six taxonomic groups. In the 2018 release of JASPAR, the CORE collection has been expanded with 322 new PFMs (60 for vertebrates and 262 for plants) and 33 PFMs were updated (24 for vertebrates, 8 for plants and 1 for insects). These new profiles represent a 30% expansion compared to the 2016 release. In addition, we have introduced 316 TFFMs (95 for vertebrates, 218 for plants and 3 for insects). This release incorporates clusters of similar PFMs in each taxon and each TF class per taxon. The JASPAR 2018 CORE vertebrate collection of PFMs was used to predict TF-binding sites in the human genome. The predictions are made available to the scientific community through a UCSC Genome Browser track data hub. Finally, this update comes with a new web framework with an interactive and responsive user-interface, along with new features. All the underlying data can be retrieved programmatically using a RESTful API and through the JASPAR 2018 R/Bioconductor package.
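
The abstract notes that all underlying data can be retrieved programmatically through a RESTful API; a hedged Python sketch of such a query follows. The base URL, the /matrix/<ID>/ endpoint, the example matrix ID, and the JSON field names are assumptions to be checked against the current API documentation.

```python
# Hypothetical sketch of programmatic access to JASPAR; the base URL and the
# /matrix/<ID>/ endpoint are assumptions based on the RESTful API mentioned
# in the abstract -- check the current API docs before relying on these paths.
import requests

BASE = "https://jaspar.genereg.net/api/v1"  # assumed API root

# Fetch one profile by its JASPAR matrix ID (MA0004.1 is used here only as an example ID).
resp = requests.get(f"{BASE}/matrix/MA0004.1/", params={"format": "json"}, timeout=30)
resp.raise_for_status()
profile = resp.json()
print(profile.get("name"), profile.get("collection"))
print(profile.get("pfm"))  # assumed field: position frequency counts per base at each position
```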

1,282 citations


Posted Content
TL;DR: The problem of attributing the prediction of a deep network to its input features, a problem previously studied by several other works, is studied and two fundamental axioms that attribution methods ought to satisfy, Sensitivity and Implementation Invariance, are identified.
Abstract: We study the problem of attributing the prediction of a deep network to its input features, a problem previously studied by several other works. We identify two fundamental axioms---Sensitivity and Implementation Invariance that attribution methods ought to satisfy. We show that they are not satisfied by most known attribution methods, which we consider to be a fundamental weakness of those methods. We use the axioms to guide the design of a new attribution method called Integrated Gradients. Our method requires no modification to the original network and is extremely simple to implement; it just needs a few calls to the standard gradient operator. We apply this method to a couple of image models, a couple of text models and a chemistry model, demonstrating its ability to debug networks, to extract rules from a network, and to enable users to engage with models better.
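
Since the abstract notes that the method only needs a few calls to the standard gradient operator, a short framework-agnostic sketch may help; `grad_fn` is a placeholder for whatever autograd call returns the gradient of the scalar output with respect to the input, and the midpoint Riemann sum is one of several reasonable ways to approximate the path integral.

```python
# A minimal sketch of integrated gradients as described in the abstract; the
# model here is a stand-in callable and grad_fn is assumed to return the
# gradient of the scalar output w.r.t. the input (via autograd in practice).
import numpy as np

def integrated_gradients(x, baseline, grad_fn, steps=50):
    """Approximate IG_i = (x_i - x'_i) * integral_0^1 dF/dx_i(x' + a*(x - x')) da."""
    alphas = (np.arange(steps) + 0.5) / steps            # midpoint Riemann sum
    total = np.zeros_like(x, dtype=float)
    for a in alphas:
        total += grad_fn(baseline + a * (x - baseline))  # gradient at interpolated input
    return (x - baseline) * total / steps

# Toy usage: F(x) = sum(x**2), whose gradient is 2x, with a zero baseline.
F = lambda x: np.sum(x ** 2)
grad_fn = lambda x: 2 * x
x = np.array([1.0, -2.0, 3.0])
attributions = integrated_gradients(x, np.zeros_like(x), grad_fn)
# Completeness check: attributions sum to F(x) - F(baseline).
print(attributions, "sum:", attributions.sum(), "F(x)-F(x'):", F(x) - F(np.zeros_like(x)))
```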

1,282 citations


Journal ArticleDOI
TL;DR: This work demonstrates optical pumping of interlayer electric polarization, which may provoke further exploration of inter layer exciton condensation, as well as new applications in two-dimensional lasers, light-emitting diodes and photovoltaic devices.
Abstract: Monolayer transition metal dichalcogenide heterostructures with type II band alignment have generated wide interest in device physics at the two-dimensional limit. Here, Rivera et al. observe interlayer excitons in vertically stacked MoSe2–WSe2 heterostructures and demonstrate tunability of the energy and luminescence.

1,282 citations


Journal ArticleDOI
TL;DR: A review of axion cosmology covering models (the QCD axion, axion-like particles and axions in string theory), production mechanisms and initial conditions, the evolution of the cosmological axion field, constraints from the CMB and large-scale structure, galaxy formation, axion inflation, and gravitational as well as non-gravitational interactions and searches.
Abstract: This review covers: 1 Introduction. 2 Models: the QCD axion; the strong CP problem; PQWW, KSVZ, DFSZ; anomalies, instantons and the potential; couplings; axions in string theory. 3 Production and initial conditions: SSB and non-perturbative physics; the axion field during inflation and PQ SSB; cosmological populations (decay of parent, topological defects, thermal production, vacuum realignment). 4 The cosmological field: action; background evolution; misalignment for the QCD axion and ALPs; cosmological perturbation theory (initial conditions, early-time treatment, axion sound speed and Jeans scale, transfer functions and WDM); the Schrodinger picture; simulating axions; BEC. 5 CMB and LSS: primary anisotropies; matter power; combined constraints; isocurvature and inflation. 6 Galaxy formation: halo mass function; high-z and the EOR; density profiles; the CDM small-scale crises. 7 Accelerated expansion: the cc problem; axion inflation (natural and monodromy). 8 Gravitational interactions with black holes and pulsars. 9 Non-gravitational interactions: stellar astrophysics; LSW; vacuum birefringence; axion forces; direct detection with ADMX and CASPEr; axion decays; dark radiation; astrophysical magnetic fields; cosmological birefringence. 10 Conclusions. Appendices: A Theta vacua of gauge theories; B EFT for cosmologists; C Friedmann equations; D Cosmological fluids; E Bayes theorem and priors; F Degeneracies and sampling; G Sheth-Tormen HMF.

1,282 citations


Journal ArticleDOI
TL;DR: Timely detection and treatment may ensure better survival prognosis in patients suffering this event, which is associated with extremely high morbidity and mortality in all cases.

Journal ArticleDOI
TL;DR: This year, the battery industry celebrated the 25th anniversary of the introduction of the lithium ion rechargeable battery by Sony; as discussed by the authors, the system traces back to earlier work by Asahi Kasei, which used a combination of lower temperature carbons for the negative electrode to prevent solvent degradation and lithium cobalt dioxide modified somewhat from Goodenough's earlier work.
Abstract: This year, the battery industry celebrates the 25th anniversary of the introduction of the lithium ion rechargeable battery by Sony Corporation. The discovery of the system dates back to earlier work by Asahi Kasei in Japan, which used a combination of lower temperature carbons for the negative electrode to prevent solvent degradation and lithium cobalt dioxide modified somewhat from Goodenough’s earlier work. The development by Sony was carried out within a few years by bringing together technology in film coating from their magnetic tape division and electrochemical technology from their battery division. The past 25 years has shown rapid growth in the sales and in the benefits of lithium ion in comparison to all the earlier rechargeable battery systems. Recent work on new materials shows that there is a good likelihood that the lithium ion battery will continue to improve in cost, energy, safety and power capability and will be a formidable competitor for some years to come. © The Author(s) 2016. Published by ECS. This is an open access article distributed under the terms of the Creative Commons Attribution 4.0 License (CC BY, http://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse of the work in any medium, provided the original work is properly cited. [DOI: 10.1149/2.0251701jes] All rights reserved.

Journal ArticleDOI
TL;DR: In this article, guidelines summarize and evaluate all available evidence on a particular issue, with the aim of assisting health professionals in selecting the best management strategies for an individual patient with a given condition, taking into account the impact on outcome, as well as the risk-benefit ratio of particular diagnostic or therapeutic means.
Abstract: Guidelines summarize and evaluate all available evidence on a particular issue at the time of the writing process, with the aim of assisting health professionals in selecting the best management strategies for an individual patient with a given condition, taking into account the impact on outcome, as well as the risk–benefit ratio of particular diagnostic or therapeutic means. Guidelines and recommendations should help health professionals to make decisions in their daily practice. However, the final decisions concerning an individual patient must be made by the responsible health professional(s) in consultation with the patient and caregiver as appropriate.

Journal ArticleDOI
12 Feb 2015-Nature
TL;DR: A genome-wide association meta-analyses of traits related to waist and hip circumferences in up to 224,459 individuals implicated adipogenesis, angiogenesis, transcriptional regulation and insulin resistance as processes affecting fat distribution, providing insight into potential pathophysiological mechanisms.
Abstract: Body fat distribution is a heritable trait and a well-established predictor of adverse metabolic outcomes, independent of overall adiposity. To increase our understanding of the genetic basis of body fat distribution and its molecular links to cardiometabolic traits, here we conduct genome-wide association meta-analyses of traits related to waist and hip circumferences in up to 224,459 individuals. We identify 49 loci (33 new) associated with waist-to-hip ratio adjusted for body mass index (BMI), and an additional 19 loci newly associated with related waist and hip circumference measures (P < 5 × 10(-8)). In total, 20 of the 49 waist-to-hip ratio adjusted for BMI loci show significant sexual dimorphism, 19 of which display a stronger effect in women. The identified loci were enriched for genes expressed in adipose tissue and for putative regulatory elements in adipocytes. Pathway analyses implicated adipogenesis, angiogenesis, transcriptional regulation and insulin resistance as processes affecting fat distribution, providing insight into potential pathophysiological mechanisms.

Journal ArticleDOI
26 Mar 2020-BMJ
TL;DR: A long list is emerging from largely unadjusted analyses, with age near the top of the list.
Abstract: A long list is emerging from largely unadjusted analyses, with age near the top

Journal ArticleDOI
TL;DR: A comprehensive overview of the modern classification algorithms used in EEG-based BCIs is provided, the principles of these methods and guidelines on when and how to use them are presented, and a number of challenges to further advance EEG classification in BCI are identified.
Abstract: Objective: Most current Electroencephalography (EEG)-based Brain-Computer Interfaces (BCIs) are based on machine learning algorithms. There is a large diversity of classifier types that are used in this field, as described in our 2007 review paper. Now, approximately 10 years after this review publication, many new algorithms have been developed and tested to classify EEG signals in BCIs. The time is therefore ripe for an updated review of EEG classification algorithms for BCIs. Approach: We surveyed the BCI and machine learning literature from 2007 to 2017 to identify the new classification approaches that have been investigated to design BCIs. We synthesize these studies in order to present such algorithms, to report how they were used for BCIs, what were the outcomes, and to identify their pros and cons. Main results: We found that the recently designed classification algorithms for EEG-based BCIs can be divided into four main categories: adaptive classifiers, matrix and tensor classifiers, transfer learning and deep learning, plus a few other miscellaneous classifiers. Among these, adaptive classifiers were demonstrated to be generally superior to static ones, even with unsupervised adaptation. Transfer learning can also prove useful although the benefits of transfer learning remain unpredictable. Riemannian geometry-based methods have reached state-of-the-art performances on multiple BCI problems and deserve to be explored more thoroughly, along with tensor-based methods. Shrinkage linear discriminant analysis and random forests also appear particularly useful for small training samples settings. On the other hand, deep learning methods have not yet shown convincing improvement over state-of-the-art BCI methods. Significance: This paper provides a comprehensive overview of the modern classification algorithms used in EEG-based BCIs, presents the principles of these methods and guidelines on when and how to use them. It also identifies a number of challenges to further advance EEG classification in BCI.
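
As one concrete example of the small-training-sample methods the review highlights, the sketch below fits shrinkage-regularized LDA with scikit-learn on placeholder data; the feature matrix, labels, and dimensions are stand-ins, and any EEG-specific feature extraction is assumed to have happened beforehand.

```python
# Sketch of one of the "small training sample" methods highlighted in the review:
# shrinkage-regularized LDA via scikit-learn on placeholder features and labels.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_trials, n_features = 60, 64            # few trials, many features: a typical BCI regime
X = rng.standard_normal((n_trials, n_features))   # stand-in for extracted EEG features
y = rng.integers(0, 2, n_trials)                  # two mental classes (placeholder labels)

# 'lsqr' with shrinkage='auto' applies Ledoit-Wolf covariance shrinkage,
# which is what makes LDA usable when trials are scarce.
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
clf.fit(X, y)
print(clf.predict(X[:5]))
```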

Journal ArticleDOI
TL;DR: The rationale underlying the iterated racing procedures in irace is described and a number of recent extensions are introduced, including a restart mechanism to avoid premature convergence, the use of truncated sampling distributions to correctly handle parameter bounds, and an elitist racing procedure for ensuring that the best configurations returned are also those evaluated in the highest number of training instances.
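
For readers unfamiliar with racing, the toy Python sketch below illustrates only the core idea (evaluate candidate configurations instance by instance and eliminate clearly worse ones); it is not the irace package's R interface, and the crude mean-based elimination rule stands in for the statistical tests irace actually uses.

```python
# Toy racing loop, illustrative only; parameter sampling, restarts and the
# elitist variant described in the TL;DR are deliberately omitted.
import random
import statistics

def race(candidates, instances, evaluate, tolerance=0.5):
    """Evaluate candidates instance by instance; drop those clearly worse than the best."""
    scores = {c: [] for c in candidates}
    alive = list(candidates)
    for inst in instances:
        for c in alive:
            scores[c].append(evaluate(c, inst))
        means = {c: statistics.mean(scores[c]) for c in alive}
        best = min(means.values())
        # Crude mean-plus-tolerance rule standing in for irace's statistical tests.
        alive = [c for c in alive if means[c] <= best + tolerance]
    return alive

# Toy usage: a "configuration" is one numeric parameter; cost is noisy distance to 3.
random.seed(0)
candidates = [round(random.uniform(0, 10), 2) for _ in range(8)]
survivors = race(candidates, range(10), lambda c, i: abs(c - 3) + random.random())
print(survivors)
```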

Proceedings ArticleDOI
21 Jul 2017
TL;DR: The utility of the OctNet representation is demonstrated by analyzing the impact of resolution on several 3D tasks including 3D object classification, orientation estimation and point cloud labeling.
Abstract: We present OctNet, a representation for deep learning with sparse 3D data. In contrast to existing models, our representation enables 3D convolutional networks which are both deep and high resolution. Towards this goal, we exploit the sparsity in the input data to hierarchically partition the space using a set of unbalanced octrees where each leaf node stores a pooled feature representation. This allows memory allocation and computation to be focused on the relevant dense regions and enables deeper networks without compromising resolution. We demonstrate the utility of our OctNet representation by analyzing the impact of resolution on several 3D tasks including 3D object classification, orientation estimation and point cloud labeling.
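
The data structure described above can be illustrated with a toy Python octree that subdivides only occupied regions and stores a mean-pooled feature at each leaf; this is a conceptual sketch with made-up parameters (max_depth, min_points), not the OctNet implementation or its GPU kernels.

```python
# Conceptual sketch of the idea in the abstract: hierarchically partition 3D
# space with an unbalanced octree and keep a pooled feature at each leaf.
import numpy as np

def build_octree(points, feats, lo, hi, max_depth=4, min_points=8):
    """Return a nested dict: leaves keep a mean-pooled feature, empty cells are None."""
    if len(points) == 0:
        return None                                   # empty region: no memory spent here
    if max_depth == 0 or len(points) <= min_points:
        return {"leaf": True, "feature": feats.mean(axis=0), "n": len(points)}
    mid = (lo + hi) / 2.0
    children = []
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                c_lo = np.where([dx, dy, dz], mid, lo)
                c_hi = np.where([dx, dy, dz], hi, mid)
                mask = np.all((points >= c_lo) & (points < c_hi), axis=1)
                children.append(build_octree(points[mask], feats[mask], c_lo, c_hi,
                                             max_depth - 1, min_points))
    return {"leaf": False, "children": children}

# Toy usage: sparse points concentrated near one face of the unit cube.
rng = np.random.default_rng(0)
pts = rng.random((500, 3)) * np.array([1.0, 1.0, 0.05])
tree = build_octree(pts, rng.random((500, 16)), lo=np.zeros(3), hi=np.ones(3))
```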

Journal ArticleDOI
14 Jan 2016-Nature
TL;DR: The polymer outperformed a leading activated carbon for the rapid removal of a complex mixture of organic micropollutants at environmentally relevant concentrations, and the findings demonstrate the promise of porous cyclodextrin-based polymers for rapid, flow-through water treatment.
Abstract: The global occurrence in water resources of organic micropollutants, such as pesticides and pharmaceuticals, has raised concerns about potential negative effects on aquatic ecosystems and human health. Activated carbons are the most widespread adsorbent materials used to remove organic pollutants from water but they have several deficiencies, including slow pollutant uptake (of the order of hours) and poor removal of many relatively hydrophilic micropollutants. Furthermore, regenerating spent activated carbon is energy intensive (requiring heating to 500-900 degrees Celsius) and does not fully restore performance. Insoluble polymers of β-cyclodextrin, an inexpensive, sustainably produced macrocycle of glucose, are likewise of interest for removing micropollutants from water by means of adsorption. β-cyclodextrin is known to encapsulate pollutants to form well-defined host-guest complexes, but until now cross-linked β-cyclodextrin polymers have had low surface areas and poor removal performance compared to conventional activated carbons. Here we crosslink β-cyclodextrin with rigid aromatic groups, providing a high-surface-area, mesoporous polymer of β-cyclodextrin. It rapidly sequesters a variety of organic micropollutants with adsorption rate constants 15 to 200 times greater than those of activated carbons and non-porous β-cyclodextrin adsorbent materials. In addition, the polymer can be regenerated several times using a mild washing procedure with no loss in performance. Finally, the polymer outperformed a leading activated carbon for the rapid removal of a complex mixture of organic micropollutants at environmentally relevant concentrations. These findings demonstrate the promise of porous cyclodextrin-based polymers for rapid, flow-through water treatment.

Journal ArticleDOI
19 Apr 2016-Test
TL;DR: The present article reviews the most recent theoretical and methodological developments for random forests, with special attention given to the selection of parameters, the resampling mechanism, and variable importance measures.
Abstract: The random forest algorithm, proposed by L. Breiman in 2001, has been extremely successful as a general-purpose classification and regression method. The approach, which combines several randomized decision trees and aggregates their predictions by averaging, has shown excellent performance in settings where the number of variables is much larger than the number of observations. Moreover, it is versatile enough to be applied to large-scale problems, is easily adapted to various ad hoc learning tasks, and returns measures of variable importance. The present article reviews the most recent theoretical and methodological developments for random forests. Emphasis is placed on the mathematical forces driving the algorithm, with special attention given to the selection of parameters, the resampling mechanism, and variable importance measures. This review is intended to provide non-experts easy access to the main ideas.
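
A minimal sketch of the algorithm under review, using scikit-learn's implementation on synthetic data; the dataset, the number of trees, and the max_features choice are placeholders, shown mainly to connect the abstract's points about aggregation by averaging, resampling, and variable importance to runnable code.

```python
# Bootstrap-aggregated randomized trees via scikit-learn, with the out-of-the-box
# variable importances mentioned in the abstract. Data and parameters are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n, p = 200, 50                            # more variables than is comfortable for n
X = rng.standard_normal((n, p))
y = X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(n)

forest = RandomForestRegressor(
    n_estimators=500,        # number of randomized trees whose predictions are averaged
    max_features="sqrt",     # random subset of variables tried at each split
    oob_score=True,          # out-of-bag estimate from the bootstrap resampling
    random_state=0,
).fit(X, y)

print("OOB R^2:", round(forest.oob_score_, 3))
print("top variables:", np.argsort(forest.feature_importances_)[::-1][:5])
```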

Journal ArticleDOI
TL;DR: Experimental work is presented showing that two-dimensional boron sheets can be grown epitaxially on a Ag(111) substrate; density functional theory simulations agree well with experiments and indicate that both sheets are planar without obvious vertical undulations.
Abstract: A variety of two-dimensional materials have been reported in recent years, yet single-element systems such as graphene and black phosphorus have remained rare. Boron analogues have been predicted, as boron atoms possess a short covalent radius and the flexibility to adopt sp2 hybridization, features that favour the formation of two-dimensional allotropes, and one example of such a borophene material has been reported recently. Here, we present a parallel experimental work showing that two-dimensional boron sheets can be grown epitaxially on a Ag(111) substrate. Two types of boron sheet, a β12 sheet and a χ3 sheet, both exhibiting a triangular lattice but with different arrangements of periodic holes, are observed by scanning tunnelling microscopy. Density functional theory simulations agree well with experiments, and indicate that both sheets are planar without obvious vertical undulations. The boron sheets are quite inert to oxidization and interact only weakly with their substrate. We envisage that such boron sheets may find applications in electronic devices in the future. A variety of two-dimensional materials have been reported in the past few years, yet single-element systems—such as graphene and black phosphorus—have remained rare. 2D allotropes of boron have long been predicted and recently investigated. Two boron sheets have now been grown on a Ag(111) surface by molecular beam epitaxy that exhibit significant chemical stability against oxidation.

Posted Content
TL;DR: The superiority of the proposed HRNet in a wide range of applications, including human pose estimation, semantic segmentation, and object detection, is shown, suggesting that the HRNet is a stronger backbone for computer vision problems.
Abstract: High-resolution representations are essential for position-sensitive vision problems, such as human pose estimation, semantic segmentation, and object detection. Existing state-of-the-art frameworks first encode the input image as a low-resolution representation through a subnetwork that is formed by connecting high-to-low resolution convolutions in series (e.g., ResNet, VGGNet), and then recover the high-resolution representation from the encoded low-resolution representation. Instead, our proposed network, named as High-Resolution Network (HRNet), maintains high-resolution representations through the whole process. There are two key characteristics: (i) Connect the high-to-low resolution convolution streams in parallel; (ii) Repeatedly exchange the information across resolutions. The benefit is that the resulting representation is semantically richer and spatially more precise. We show the superiority of the proposed HRNet in a wide range of applications, including human pose estimation, semantic segmentation, and object detection, suggesting that the HRNet is a stronger backbone for computer vision problems. All the codes are available at this https URL.
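
The two characteristics named in the abstract (parallel multi-resolution streams and repeated information exchange) can be illustrated with a toy NumPy sketch of one fusion step; the pooling, upsampling, and summation used here are simplifications chosen for this sketch, not the actual HRNet blocks.

```python
# Toy sketch of the fusion pattern only: keep feature maps at several resolutions
# in parallel and exchange information between them. Not the HRNet layers or training.
import numpy as np

def downsample(x):                     # 2x2 average pooling
    h, w = x.shape
    return x[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample(x):                       # nearest-neighbour 2x upsampling
    return np.repeat(np.repeat(x, 2, axis=0), 2, axis=1)

def to_resolution(x, levels):
    """Resample a map by `levels` octaves: positive = coarser, negative = finer."""
    for _ in range(max(levels, 0)):
        x = downsample(x)
    for _ in range(max(-levels, 0)):
        x = upsample(x)
    return x

def exchange(streams):
    """Each stream receives the sum of all parallel streams resampled to its resolution."""
    return [sum(to_resolution(s, i - j) for j, s in enumerate(streams))
            for i, _ in enumerate(streams)]

# Three parallel streams at full, half and quarter resolution.
rng = np.random.default_rng(0)
streams = [rng.standard_normal((64 // 2 ** i, 64 // 2 ** i)) for i in range(3)]
streams = exchange(streams)            # one multi-resolution fusion step
print([s.shape for s in streams])      # resolutions are preserved: (64,64), (32,32), (16,16)
```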

Journal ArticleDOI
10 Mar 2017-Science
TL;DR: A metamaterial composed of a polymer layer embedded with microspheres, backed with a thin layer of silver, which shows a noontime radiative cooling power of 93 watts per square meter under direct sunshine is constructed.
Abstract: Passive radiative cooling draws heat from surfaces and radiates it into space as infrared radiation to which the atmosphere is transparent. However, the energy density mismatch between solar irradiance and the low infrared radiation flux from a near-ambient-temperature surface requires materials that strongly emit thermal energy and barely absorb sunlight. We embedded resonant polar dielectric microspheres randomly in a polymeric matrix, resulting in a metamaterial that is fully transparent to the solar spectrum while having an infrared emissivity greater than 0.93 across the atmospheric window. When backed with a silver coating, the metamaterial shows a noontime radiative cooling power of 93 watts per square meter under direct sunshine. More critically, we demonstrated high-throughput, economical roll-to-roll manufacturing of the metamaterial, which is vital for promoting radiative cooling as a viable energy technology.
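
As background for the reported 93 watts per square meter, the net cooling power of such a surface is usually written as a radiative balance; the general form below is standard in the radiative-cooling literature and is given for orientation only, not as the paper's own equation.

```latex
% Standard radiative-cooling power balance (general form, not taken from the paper):
\[
P_{\mathrm{cool}}(T_s) \;=\; P_{\mathrm{rad}}(T_s) \;-\; P_{\mathrm{atm}}(T_{\mathrm{amb}})
\;-\; P_{\mathrm{sun}} \;-\; P_{\mathrm{cond+conv}},
\]
% where P_rad is the thermal emission of the surface at temperature T_s, P_atm the
% absorbed downwelling atmospheric radiation, P_sun the absorbed solar irradiance
% (suppressed here by the metamaterial's solar transparency), and P_cond+conv the
% non-radiative heat gain. High emissivity across the 8-13 um atmospheric window
% raises P_rad, while near-zero solar absorptance keeps P_sun small.
```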

Posted Content
TL;DR: This paper describes a simple procedure called AutoAugment to automatically search for improved data augmentation policies, which achieves state-of-the-art accuracy on CIFAR-10, CIFAR-100, SVHN, and ImageNet (without additional data).
Abstract: Data augmentation is an effective technique for improving the accuracy of modern image classifiers. However, current data augmentation implementations are manually designed. In this paper, we describe a simple procedure called AutoAugment to automatically search for improved data augmentation policies. In our implementation, we have designed a search space where a policy consists of many sub-policies, one of which is randomly chosen for each image in each mini-batch. A sub-policy consists of two operations, each operation being an image processing function such as translation, rotation, or shearing, and the probabilities and magnitudes with which the functions are applied. We use a search algorithm to find the best policy such that the neural network yields the highest validation accuracy on a target dataset. Our method achieves state-of-the-art accuracy on CIFAR-10, CIFAR-100, SVHN, and ImageNet (without additional data). On ImageNet, we attain a Top-1 accuracy of 83.5% which is 0.4% better than the previous record of 83.1%. On CIFAR-10, we achieve an error rate of 1.5%, which is 0.6% better than the previous state-of-the-art. Augmentation policies we find are transferable between datasets. The policy learned on ImageNet transfers well to achieve significant improvements on other datasets, such as Oxford Flowers, Caltech-101, Oxford-IIIT Pets, FGVC Aircraft, and Stanford Cars.
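
The policy structure described in the abstract (sub-policies of two operations, each with a probability and a magnitude) can be sketched in a few lines of Python; the two image operations below are toy placeholders rather than members of the actual AutoAugment search space, and the numeric probabilities and magnitudes are made up for illustration.

```python
# Illustrative sketch of the policy structure only: a policy is a list of
# sub-policies, each sub-policy is two (operation, probability, magnitude) triples.
import random
import numpy as np

def translate_x(img, magnitude):       # shift columns by `magnitude` pixels (toy op)
    return np.roll(img, int(magnitude), axis=1)

def invert_blend(img, magnitude):      # blend the image with its negative (toy op)
    return (1 - magnitude) * img + magnitude * (img.max() - img)

OPS = {"TranslateX": translate_x, "InvertBlend": invert_blend}

# A made-up policy with two sub-policies.
policy = [
    [("TranslateX", 0.8, 4), ("InvertBlend", 0.3, 0.5)],
    [("InvertBlend", 0.6, 0.2), ("TranslateX", 0.5, 2)],
]

def augment(img, policy):
    """Pick one sub-policy at random and apply its two operations stochastically."""
    for name, prob, magnitude in random.choice(policy):
        if random.random() < prob:
            img = OPS[name](img, magnitude)
    return img

random.seed(0)
img = np.linspace(0, 1, 32 * 32).reshape(32, 32)   # stand-in for one image in a mini-batch
print(augment(img, policy).shape)
```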

Journal ArticleDOI
TL;DR: Up-to-date statistics on pancreatic cancer occurrence and outcome along with a better understanding of the etiology and identifying the causative risk factors are essential for the primary prevention of this disease.
Abstract: Pancreatic cancer is the seventh leading cause of cancer-related deaths worldwide. However, its toll is higher in more developed countries. Reasons for vast differences in mortality rates of pancreatic cancer are not completely clear yet, but it may be due to lack of appropriate diagnosis, treatment and cataloging of cancer cases. Because patients seldom exhibit symptoms until an advanced stage of the disease, pancreatic cancer remains one of the most lethal malignant neoplasms that caused 432,242 new deaths in 2018 (GLOBOCAN 2018 estimates). Globally, 458,918 new cases of pancreatic cancer have been reported in 2018, and 355,317 new cases are estimated to occur until 2040. Despite advancements in the detection and management of pancreatic cancer, the 5-year survival rate still stands at 9% only. To date, the causes of pancreatic carcinoma are still insufficiently known, although certain risk factors have been identified, such as tobacco smoking, diabetes mellitus, obesity, dietary factors, alcohol abuse, age, ethnicity, family history and genetic factors, Helicobacter pylori infection, non-O blood group and chronic pancreatitis. In general population, screening of large groups is not considered useful to detect the disease at its early stage, although newer techniques and the screening of tightly targeted groups (especially of those with family history), are being evaluated. Primary prevention is considered of utmost importance. Up-to-date statistics on pancreatic cancer occurrence and outcome along with a better understanding of the etiology and identifying the causative risk factors are essential for the primary prevention of this disease.

Journal ArticleDOI
TL;DR: In this paper, the authors discuss the use of the synthetic control method (Abadie and Gardeazabal, 2003; Abadie, Diamond, and Hainmueller, 2010) as a way to bridge the quantitative/qualitative divide in comparative politics.
Abstract: In recent years a widespread consensus has emerged about the necessity of establishing bridges between the quantitative and the qualitative approaches to empirical research in political science. In this article, we discuss the use of the synthetic control method (Abadie and Gardeazabal, 2003; Abadie, Diamond, and Hainmueller, 2010) as a way to bridge the quantitative/qualitative divide in comparative politics. The synthetic control method provides a systematic way to choose comparison units in comparative case studies. This systematization opens the door to precise quantitative inference in small-sample comparative studies, without precluding the application of qualitative approaches. That is, the synthetic control method allows researchers to put "qualitative flesh on quantitative bones" (Tarrow, 1995). We illustrate the main ideas behind the synthetic control method with an application where we study the economic impact of the 1990 German reunification in West Germany.
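
For readers new to the method, its core optimization can be written compactly in the notation of the cited Abadie et al. papers (a summary of the standard published formulation, not new material):

```latex
% Choose donor weights W = (w_2, ..., w_{J+1})' with w_j >= 0 and w_2 + ... + w_{J+1} = 1
% so that the weighted donor pool matches the treated unit's pre-treatment characteristics:
\[
W^{*} \;=\; \arg\min_{W}\; \left(X_{1} - X_{0} W\right)' V \left(X_{1} - X_{0} W\right),
\]
% where X_1 collects pre-treatment predictors of the treated unit, X_0 those of the
% J donor units, and V is a symmetric positive semidefinite weighting matrix. The
% counterfactual outcome of the treated unit in post-treatment period t is then
\[
\hat{Y}_{1t}^{N} \;=\; \sum_{j=2}^{J+1} w_{j}^{*}\, Y_{jt},
\qquad \text{effect estimate: } \hat{\tau}_{1t} = Y_{1t} - \hat{Y}_{1t}^{N}.
\]
```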

Journal ArticleDOI
09 Jun 2020-JAMA
TL;DR: How to interpret 2 types of diagnostic tests commonly in use for SARS-CoV-2 infections—reverse transcriptase–polymerase chain reaction (RT-PCR) and IgM and IgG enzyme-linked immunosorbent assay (ELISA)—and how the results may vary over time is described.
Abstract: The pandemic of coronavirus disease 2019 (COVID-19) continues to affect much of the world. Knowledge of diagnostic tests for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) is still evolving, and a clear understanding of the nature of the tests and interpretation of their findings is important. This Viewpoint describes how to interpret 2 types of diagnostic tests commonly in use for SARS-CoV-2 infections—reverse transcriptase–polymerase chain reaction (RT-PCR) and IgM and IgG enzyme-linked immunosorbent assay (ELISA)—and how the results may vary over time (Figure).

Journal ArticleDOI
TL;DR: A large scale benchmark for molecular machine learning consisting of multiple public datasets, metrics, featurizations and learning algorithms.
Abstract: Molecular machine learning has been maturing rapidly over the last few years. Improved methods and the presence of larger datasets have enabled machine learning algorithms to make increasingly accurate predictions about molecular properties. However, algorithmic progress has been limited due to the lack of a standard benchmark to compare the efficacy of proposed methods; most new algorithms are benchmarked on different datasets making it challenging to gauge the quality of proposed methods. This work introduces MoleculeNet, a large scale benchmark for molecular machine learning. MoleculeNet curates multiple public datasets, establishes metrics for evaluation, and offers high quality open-source implementations of multiple previously proposed molecular featurization and learning algorithms (released as part of the DeepChem open source library). MoleculeNet benchmarks demonstrate that learnable representations are powerful tools for molecular machine learning and broadly offer the best performance. However, this result comes with caveats. Learnable representations still struggle to deal with complex tasks under data scarcity and highly imbalanced classification. For quantum mechanical and biophysical datasets, the use of physics-aware featurizations can be more important than choice of particular learning algorithm.
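
Since the abstract notes that the implementations are released as part of the DeepChem open source library, a hedged sketch of loading one MoleculeNet dataset through DeepChem follows; the loader name, arguments, model class, and metric call reflect common DeepChem usage but should be checked against the documentation of the installed version.

```python
# Assumption-laden example: loading a MoleculeNet benchmark via DeepChem.
# Function and class names here are common DeepChem usage, not guaranteed by the paper.
import deepchem as dc

# Tox21 is one of the MoleculeNet datasets; 'ECFP' requests circular-fingerprint features.
tasks, datasets, transformers = dc.molnet.load_tox21(featurizer="ECFP")
train, valid, test = datasets

# A simple multitask classifier on the fingerprint features (1024 is the usual ECFP size).
model = dc.models.MultitaskClassifier(n_tasks=len(tasks), n_features=1024)
model.fit(train, nb_epoch=10)
print(model.evaluate(test, [dc.metrics.Metric(dc.metrics.roc_auc_score)]))
```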

Book ChapterDOI
08 Oct 2016
TL;DR: A new aerial video dataset and benchmark for low altitude UAV target tracking, as well as a photo-realistic UAV simulator that can be coupled with tracking methods to easily extend existing real-world datasets.
Abstract: In this paper, we propose a new aerial video dataset and benchmark for low altitude UAV target tracking, as well as a photo-realistic UAV simulator that can be coupled with tracking methods. Our benchmark provides the first evaluation of many state-of-the-art and popular trackers on 123 new and fully annotated HD video sequences captured from a low-altitude aerial perspective. Among the compared trackers, we determine which ones are the most suitable for UAV tracking both in terms of tracking accuracy and run-time. The simulator can be used to evaluate tracking algorithms in real-time scenarios before they are deployed on a UAV "in the field", as well as generate synthetic but photo-realistic tracking datasets with automatic ground truth annotations to easily extend existing real-world datasets. Both the benchmark and simulator are made publicly available to the vision community on our website to further research in the area of object tracking from UAVs (https://ivul.kaust.edu.sa/Pages/pub-benchmark-simulator-uav.aspx).

Journal ArticleDOI
03 Jul 2020-Science
TL;DR: Human small intestinal organoids (hSIOs) are readily infected by SARS-CoV and SARS-CoV-2 and support viral replication, demonstrating that intestinal organoids can serve as an experimental model for coronavirus infection and biology in the gut.
Abstract: Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) can cause coronavirus disease 2019 (COVID-19), an influenza-like disease that is primarily thought to infect the lungs with transmission through the respiratory route. However, clinical evidence suggests that the intestine may present another viral target organ. Indeed, the SARS-CoV-2 receptor angiotensin-converting enzyme 2 (ACE2) is highly expressed on differentiated enterocytes. In human small intestinal organoids (hSIOs), enterocytes were readily infected by SARS-CoV and SARS-CoV-2, as demonstrated by confocal and electron microscopy. Enterocytes produced infectious viral particles, whereas messenger RNA expression analysis of hSIOs revealed induction of a generic viral response program. Therefore, the intestinal epithelium supports SARS-CoV-2 replication, and hSIOs serve as an experimental model for coronavirus infection and biology.