scispace - formally typeset

Journal ArticleDOI
TL;DR: Physiologically, metformin has been shown to reduce hepatic glucose production, yet not all of its effects can be explained by this mechanism and there is increasing evidence of a key role for the gut.
Abstract: Metformin is a widely-used drug that results in clear benefits in relation to glucose metabolism and diabetes-related complications. The mechanisms underlying these benefits are complex and still not fully understood. Physiologically, metformin has been shown to reduce hepatic glucose production, yet not all of its effects can be explained by this mechanism and there is increasing evidence of a key role for the gut. At the molecular level the findings vary depending on the doses of metformin used and duration of treatment, with clear differences between acute and chronic administration. Metformin has been shown to act via both AMP-activated protein kinase (AMPK)-dependent and AMPK-independent mechanisms; by inhibition of mitochondrial respiration but also perhaps by inhibition of mitochondrial glycerophosphate dehydrogenase, and a mechanism involving the lysosome. In the last 10 years, we have moved from a simple picture, that metformin improves glycaemia by acting on the liver via AMPK activation, to a much more complex picture reflecting its multiple modes of action. More work is required to truly understand how this drug works in its target population: individuals with type 2 diabetes.

1,302 citations


Journal ArticleDOI
TL;DR: A synthetic strategy to grow Janus monolayers of transition metal dichalcogenides that breaks the out-of-plane structural symmetry is reported; the Janus structure of MoSSe is confirmed by means of scanning transmission electron microscopy and energy-dependent X-ray photoelectron spectroscopy.
Abstract: Structural symmetry-breaking plays a crucial role in determining the electronic band structures of two-dimensional materials. Tremendous efforts have been devoted to breaking the in-plane symmetry of graphene with electric fields on AB-stacked bilayers or stacked van der Waals heterostructures. In contrast, transition metal dichalcogenide monolayers are semiconductors with intrinsic in-plane asymmetry, leading to direct electronic bandgaps, distinctive optical properties and great potential in optoelectronics. Apart from their in-plane inversion asymmetry, an additional degree of freedom allowing spin manipulation can be induced by breaking the out-of-plane mirror symmetry with external electric fields or, as theoretically proposed, with an asymmetric out-of-plane structural configuration. Here, we report a synthetic strategy to grow Janus monolayers of transition metal dichalcogenides breaking the out-of-plane structural symmetry. In particular, based on a MoS2 monolayer, we fully replace the top-layer S with Se atoms. We confirm the Janus structure of MoSSe directly by means of scanning transmission electron microscopy and energy-dependent X-ray photoelectron spectroscopy, and prove the existence of vertical dipoles by second harmonic generation and piezoresponse force microscopy measurements.

1,302 citations


Posted Content
TL;DR: This paper details the improvements of CNNs in different aspects, including layer design, activation function, loss function, regularization, optimization and fast computation, and introduces various applications of convolutional neural networks in computer vision, speech and natural language processing.
Abstract: In the last few years, deep learning has led to very good performance on a variety of problems, such as visual recognition, speech recognition and natural language processing. Among different types of deep neural networks, convolutional neural networks have been most extensively studied. Leveraging the rapid growth in the amount of annotated data and the great improvements in the strength of graphics processing units, research on convolutional neural networks has advanced swiftly and achieved state-of-the-art results on various tasks. In this paper, we provide a broad survey of the recent advances in convolutional neural networks. We detail the improvements of CNNs in different aspects, including layer design, activation function, loss function, regularization, optimization and fast computation. We also introduce various applications of convolutional neural networks in computer vision, speech and natural language processing.
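The building blocks the survey enumerates (layer design, activation functions) can be illustrated with a minimal pure-Python convolution followed by a ReLU; this is an illustrative sketch, not code from the paper.

```python
def conv2d(image, kernel):
    """Valid-mode 2D convolution (no padding, stride 1) over nested lists."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = ih - kh + 1, iw - kw + 1
    out = [[0.0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            out[i][j] = sum(
                image[i + a][j + b] * kernel[a][b]
                for a in range(kh) for b in range(kw)
            )
    return out

def relu(feature_map):
    """ReLU, one of the activation functions the survey covers."""
    return [[max(0.0, v) for v in row] for row in feature_map]

# A 3x3 vertical-edge kernel applied to a 4x4 image yields a 2x2 map
# of positive responses around the edge between columns 1 and 2.
image = [[1, 1, 0, 0],
         [1, 1, 0, 0],
         [1, 1, 0, 0],
         [1, 1, 0, 0]]
kernel = [[1, 0, -1],
          [1, 0, -1],
          [1, 0, -1]]
fmap = relu(conv2d(image, kernel))
```

Real CNN layers add learned multi-channel kernels, padding, and stride, but the sliding-window multiply-accumulate above is the core operation.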

1,302 citations


Journal ArticleDOI
TL;DR: The European Space Agency's Planck satellite, dedicated to studying the early Universe and its subsequent evolution, was launched 14 May 2009 and scanned the microwave and submillimetre sky continuously between 12 August 2009 and 23 October 2013, as discussed by the authors.
Abstract: The European Space Agency's Planck satellite, dedicated to studying the early Universe and its subsequent evolution, was launched 14 May 2009 and scanned the microwave and submillimetre sky continuously between 12 August 2009 and 23 October 2013. In February 2015, ESA and the Planck Collaboration released the second set of cosmology products based on data from the entire Planck mission, including both temperature and polarization, along with a set of scientific and technical papers and a web-based explanatory supplement. This paper gives an overview of the main characteristics of the data and the data products in the release, as well as the associated cosmological and astrophysical science results and papers. The science products include maps of the cosmic microwave background (CMB), the thermal Sunyaev-Zeldovich effect, and diffuse foregrounds in temperature and polarization, catalogues of compact Galactic and extragalactic sources (including separate catalogues of Sunyaev-Zeldovich clusters and Galactic cold clumps), and extensive simulations of signals and noise used in assessing the performance of the analysis methods and the assessment of uncertainties. The likelihood code used to assess cosmological models against the Planck data is described, as well as a CMB lensing likelihood. Scientific results include cosmological parameters derived from CMB power spectra, gravitational lensing, and cluster counts, as well as constraints on inflation, non-Gaussianity, primordial magnetic fields, dark energy, and modified gravity.

1,302 citations


Journal ArticleDOI
TL;DR: Treatment with blinatumomab resulted in significantly longer overall survival than chemotherapy among adult patients with relapsed or refractory B‐cell precursor ALL, and remission rates within 12 weeks after treatment initiation were significantly higher.
Abstract: Background: Blinatumomab, a bispecific monoclonal antibody construct that enables CD3-positive T cells to recognize and eliminate CD19-positive acute lymphoblastic leukemia (ALL) blasts, was approved for use in patients with relapsed or refractory B-cell precursor ALL on the basis of single-group trials that showed efficacy and manageable toxic effects. Methods: In this multi-institutional phase 3 trial, we randomly assigned adults with heavily pretreated B-cell precursor ALL, in a 2:1 ratio, to receive either blinatumomab or standard-of-care chemotherapy. The primary end point was overall survival. Results: Of the 405 patients who were randomly assigned to receive blinatumomab (271 patients) or chemotherapy (134 patients), 376 patients received at least one dose. Overall survival was significantly longer in the blinatumomab group than in the chemotherapy group. The median overall survival was 7.7 months in the blinatumomab group and 4.0 months in the chemotherapy group (hazard ratio for death with blinatumomab...

1,301 citations


Proceedings ArticleDOI
10 Aug 2015
TL;DR: This work presents two case studies where high-performance generalized additive models with pairwise interactions (GA2Ms) are applied to real healthcare problems yielding intelligible models with state-of-the-art accuracy.
Abstract: In machine learning often a tradeoff must be made between accuracy and intelligibility. More accurate models such as boosted trees, random forests, and neural nets usually are not intelligible, but more intelligible models such as logistic regression, naive-Bayes, and single decision trees often have significantly worse accuracy. This tradeoff sometimes limits the accuracy of models that can be applied in mission-critical applications such as healthcare where being able to understand, validate, edit, and trust a learned model is important. We present two case studies where high-performance generalized additive models with pairwise interactions (GA2Ms) are applied to real healthcare problems yielding intelligible models with state-of-the-art accuracy. In the pneumonia risk prediction case study, the intelligible model uncovers surprising patterns in the data that previously had prevented complex learned models from being fielded in this domain, but because it is intelligible and modular allows these patterns to be recognized and removed. In the 30-day hospital readmission case study, we show that the same methods scale to large datasets containing hundreds of thousands of patients and thousands of attributes while remaining intelligible and providing accuracy comparable to the best (unintelligible) machine learning methods.
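The intelligibility of a GA2M comes from its form: a prediction is an intercept plus per-feature shape functions plus a few pairwise terms, each of which can be inspected on its own. A toy sketch (illustrative backfitting on binned features, not the paper's tree-based learner):

```python
from collections import defaultdict

def fit_ga2m(rows, y, n_rounds=20):
    """Toy GA2M over two discrete features: intercept + f1(x1) + f2(x2)
    + f12(x1, x2), fitted by backfitting each term on the residuals of
    the others. Illustrative only; real GA2Ms use boosted shallow trees."""
    intercept = sum(y) / len(y)
    f1, f2, f12 = defaultdict(float), defaultdict(float), defaultdict(float)
    terms = ((f1, lambda r: r[0]),
             (f2, lambda r: r[1]),
             (f12, lambda r: (r[0], r[1])))
    for _ in range(n_rounds):
        for table, key in terms:
            resid = defaultdict(list)
            for r, target in zip(rows, y):
                pred = intercept + f1[r[0]] + f2[r[1]] + f12[(r[0], r[1])]
                resid[key(r)].append(target - pred)
            for k, rs in resid.items():  # shift this term by mean residual
                table[k] += sum(rs) / len(rs)
    def predict(r):
        return intercept + f1[r[0]] + f2[r[1]] + f12[(r[0], r[1])]
    return predict

# XOR-like data: neither feature alone predicts y, but the single
# pairwise term captures the interaction exactly.
rows = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0.0, 1.0, 1.0, 0.0]
predict = fit_ga2m(rows, y)
```

Because every term is a lookup table over one feature (or one pair), each learned effect can be plotted and, as in the pneumonia case study, edited or removed if it encodes a spurious pattern.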

1,301 citations


Journal ArticleDOI
TL;DR: The Modules for Experiments in Stellar Astrophysics (MESA) Isochrones and Stellar Tracks (MIST) project as discussed by the authors provides a set of stellar evolutionary tracks and isochrones computed using MESA, a state-of-the-art 1D stellar evolution package.
Abstract: This is the first of a series of papers presenting the Modules for Experiments in Stellar Astrophysics (MESA) Isochrones and Stellar Tracks (MIST) project, a new comprehensive set of stellar evolutionary tracks and isochrones computed using MESA, a state-of-the-art open-source 1D stellar evolution package. In this work, we present models with solar-scaled abundance ratios covering a wide range of ages ($5 \leq \rm \log(Age)\;[yr] \leq 10.3$), masses ($0.1 \leq M/M_{\odot} \leq 300$), and metallicities ($-2.0 \leq \rm [Z/H] \leq 0.5$). The models are self-consistently and continuously evolved from the pre-main sequence to the end of hydrogen burning, the white dwarf cooling sequence, or the end of carbon burning, depending on the initial mass. We also provide a grid of models evolved from the pre-main sequence to the end of core helium burning for $-4.0 \leq \rm [Z/H] < -2.0$. We showcase extensive comparisons with observational constraints as well as with some of the most widely used existing models in the literature. The evolutionary tracks and isochrones can be downloaded from the project website at this http URL

1,301 citations


Journal ArticleDOI
TL;DR: A new periodontitis classification scheme has been adopted, in which forms of the disease previously recognized as "chronic" or "aggressive" are now grouped under a single category ("periodontitis") and are further characterized based on a multi-dimensional staging and grading system as mentioned in this paper.
Abstract: A new periodontitis classification scheme has been adopted, in which forms of the disease previously recognized as "chronic" or "aggressive" are now grouped under a single category ("periodontitis") and are further characterized based on a multi-dimensional staging and grading system. Staging is largely dependent upon the severity of disease at presentation as well as on the complexity of disease management, while grading provides supplemental information about biological features of the disease including a history-based analysis of the rate of periodontitis progression; assessment of the risk for further progression; analysis of possible poor outcomes of treatment; and assessment of the risk that the disease or its treatment may negatively affect the general health of the patient. Necrotizing periodontal diseases, whose characteristic clinical phenotype includes typical features (papilla necrosis, bleeding, and pain) and are associated with host immune response impairments, remain a distinct periodontitis category. Endodontic-periodontal lesions, defined by a pathological communication between the pulpal and periodontal tissues at a given tooth, occur in either an acute or a chronic form, and are classified according to signs and symptoms that have direct impact on their prognosis and treatment. Periodontal abscesses are defined as acute lesions characterized by localized accumulation of pus within the gingival wall of the periodontal pocket/sulcus, rapid tissue destruction and are associated with risk for systemic dissemination.

1,301 citations


Journal ArticleDOI
TL;DR: In this paper, the authors used a database of 45,813 first records of 16,926 established alien species and showed that the annual rate of first records worldwide has increased during the last 200 years, with 37% of all first records reported most recently (1970-2014).
Abstract: Although research on human-mediated exchanges of species has substantially intensified during the last centuries, we know surprisingly little about temporal dynamics of alien species accumulations across regions and taxa. Using a novel database of 45,813 first records of 16,926 established alien species, we show that the annual rate of first records worldwide has increased during the last 200 years, with 37% of all first records reported most recently (1970-2014). Inter-continental and inter-taxonomic variation can be largely attributed to the diaspora of European settlers in the nineteenth century and to the acceleration in trade in the twentieth century. For all taxonomic groups, the increase in numbers of alien species does not show any sign of saturation and most taxa even show increases in the rate of first records over time. This highlights that past efforts to mitigate invasions have not been effective enough to keep up with increasing globalization.

1,301 citations


Proceedings Article
02 Dec 2018
TL;DR: ProxylessNAS is presented, which can directly learn the architectures for large-scale target tasks and target hardware platforms and apply ProxylessNAS to specialize neural architectures for hardware with direct hardware metrics (e.g. latency) and provide insights for efficient CNN architecture design.
Abstract: Neural architecture search (NAS) has a great impact by automatically designing effective neural network architectures. However, the prohibitive computational demand of conventional NAS algorithms (e.g. 10^4 GPU hours) makes it difficult to directly search the architectures on large-scale tasks (e.g. ImageNet). Differentiable NAS can reduce the cost of GPU hours via a continuous representation of network architecture but suffers from high GPU memory consumption (growing linearly with candidate set size). As a result, these methods need to utilize proxy tasks, such as training on a smaller dataset, learning with only a few blocks, or training just for a few epochs. Architectures optimized on proxy tasks are not guaranteed to be optimal on the target task. In this paper, we present ProxylessNAS, which can directly learn the architectures for large-scale target tasks and target hardware platforms. We address the high memory consumption issue of differentiable NAS and reduce the computational cost (GPU hours and GPU memory) to the same level as regular training while still allowing a large candidate set. Experiments on CIFAR-10 and ImageNet demonstrate the effectiveness of directness and specialization. On CIFAR-10, our model achieves 2.08% test error with only 5.7M parameters, better than the previous state-of-the-art architecture AmoebaNet-B, while using 6× fewer parameters. On ImageNet, our model achieves 3.1% better top-1 accuracy than MobileNetV2, while being 1.2× faster with measured GPU latency. We also apply ProxylessNAS to specialize neural architectures for hardware with direct hardware metrics (e.g. latency) and provide insights for efficient CNN architecture design.
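A key idea behind hardware specialization in this line of work is modeling a layer's expected latency as the probability-weighted sum of its candidate ops' measured latencies, which makes latency differentiable with respect to the architecture parameters. A minimal sketch (op names and latency numbers are made up for illustration):

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of architecture logits."""
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    s = sum(exps)
    return [e / s for e in exps]

def expected_latency(arch_logits, op_latency_ms):
    """Expected network latency: for each layer, weight each candidate
    op's measured latency by its selection probability, then sum over
    layers. Differentiable in the logits, so it can be added to the
    training loss as a latency penalty."""
    total = 0.0
    for logits, latencies in zip(arch_logits, op_latency_ms):
        probs = softmax(logits)
        total += sum(p * t for p, t in zip(probs, latencies))
    return total

# Two layers, each choosing between a 3x3 conv, a 5x5 conv, and identity
# (hypothetical per-op latencies in milliseconds).
arch_logits = [[2.0, 0.0, -1.0], [0.0, 0.0, 0.0]]
op_latency_ms = [[1.2, 2.5, 0.1], [1.2, 2.5, 0.1]]
lat = expected_latency(arch_logits, op_latency_ms)
```

The first layer leans strongly toward the cheap 3×3 conv, the second is undecided, and the total reflects both; gradient descent on this quantity pushes probability mass toward faster ops.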

1,301 citations


Book ChapterDOI
08 Oct 2016
TL;DR: In this article, discriminative correlation filters (DCF) have demonstrated excellent performance for visual object tracking, and the key to their success is the ability to efficiently exploit available negative data.
Abstract: Discriminative Correlation Filters (DCF) have demonstrated excellent performance for visual object tracking. The key to their success is the ability to efficiently exploit available negative data b ...

Journal ArticleDOI
TL;DR: Baricitinib plus remdesivir was superior to remdesivir alone in reducing recovery time and accelerating improvement in clinical status among patients with Covid-19, notably among those receiving high-flow oxygen or noninvasive ventilation.
Abstract: Background Severe coronavirus disease 2019 (Covid-19) is associated with dysregulated inflammation. The effects of combination treatment with baricitinib, a Janus kinase inhibitor, plus remdesivir are not known. Methods We conducted a double-blind, randomized, placebo-controlled trial evaluating baricitinib plus remdesivir in hospitalized adults with Covid-19. All the patients received remdesivir (≤10 days) and either baricitinib (≤14 days) or placebo (control). The primary outcome was the time to recovery. The key secondary outcome was clinical status at day 15. Results A total of 1033 patients underwent randomization (with 515 assigned to combination treatment and 518 to control). Patients receiving baricitinib had a median time to recovery of 7 days (95% confidence interval [CI], 6 to 8), as compared with 8 days (95% CI, 7 to 9) with control (rate ratio for recovery, 1.16; 95% CI, 1.01 to 1.32; P = 0.03), and a 30% higher odds of improvement in clinical status at day 15 (odds ratio, 1.3; 95% CI, 1.0 to 1.6). Patients receiving high-flow oxygen or noninvasive ventilation at enrollment had a time to recovery of 10 days with combination treatment and 18 days with control (rate ratio for recovery, 1.51; 95% CI, 1.10 to 2.08). The 28-day mortality was 5.1% in the combination group and 7.8% in the control group (hazard ratio for death, 0.65; 95% CI, 0.39 to 1.09). Serious adverse events were less frequent in the combination group than in the control group (16.0% vs. 21.0%; difference, -5.0 percentage points; 95% CI, -9.8 to -0.3; P = 0.03), as were new infections (5.9% vs. 11.2%; difference, -5.3 percentage points; 95% CI, -8.7 to -1.9; P = 0.003). Conclusions Baricitinib plus remdesivir was superior to remdesivir alone in reducing recovery time and accelerating improvement in clinical status among patients with Covid-19, notably among those receiving high-flow oxygen or noninvasive ventilation. 
The combination was associated with fewer serious adverse events. (Funded by the National Institute of Allergy and Infectious Diseases; ClinicalTrials.gov number, NCT04401579.).

Journal ArticleDOI
TL;DR: Recently, research efforts have been able to tune and optimize pore spaces, immobilize specific functional groups, and introduce chiral pore environments to target MOF materials for methane storage, light hydrocarbon separations, enantioselective recognitions, carbon dioxide capture, and separations.
Abstract: Conspectus: Discoveries of novel functional materials have played very important roles in the development of science and technology and thus benefit our daily life. Among the diverse materials, metal–organic framework (MOF) materials are rapidly emerging as a unique type of porous and organic/inorganic hybrid materials which can be simply self-assembled from their corresponding inorganic metal ions/clusters with organic linkers, and can be straightforwardly characterized by various analytical methods. In terms of porosity, they are superior to other well-known porous materials such as zeolites and carbon materials, exhibiting extremely high porosity with surface area up to 7000 m2/g, tunable pore sizes and metrics through the interplay of both organic and inorganic components, with pore sizes ranging from 3 to 100 Å, and framework densities down to 0.13 g/cm3. Such unique features have enabled metal–organic frameworks to exhibit great potential for a broad range of applications in gas storage...

Journal ArticleDOI
TL;DR: The aim of this article is to review the different TEER measurement techniques and analyze their strengths and weaknesses, determine the significance of TEER in drug toxicity studies, and examine the various in vitro models and microfluidic organs-on-chips implementations using TEER measurements in some widely studied barrier models.
Abstract: Transepithelial/transendothelial electrical resistance (TEER) is a widely accepted quantitative technique to measure the integrity of tight junction dynamics in cell culture models of endothelial and epithelial monolayers. TEER values are strong indicators of the integrity of the cellular barriers before they are evaluated for transport of drugs or chemicals. TEER measurements can be performed in real time without cell damage and generally are based on measuring ohmic resistance or measuring impedance across a wide spectrum of frequencies. The measurements for various cell types have been reported with commercially available measurement systems and also with custom-built microfluidic implementations. Some of the barrier models that have been widely characterized using TEER include the blood–brain barrier (BBB), gastrointestinal (GI) tract, and pulmonary models. Variations in these values can arise due to factors such as temperature, medium formulation, and passage number of cells. The aim of this article ...
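TEER values are conventionally reported in unit-area terms: the blank (cell-free insert plus medium) resistance is subtracted from the measured resistance, and the remainder is multiplied by the membrane area so that readings from different insert sizes are comparable. A small sketch of that normalization (the numbers are made up for illustration):

```python
def teer_ohm_cm2(r_total_ohm, r_blank_ohm, area_cm2):
    """Unit-area TEER in ohm*cm^2: subtract the blank resistance of the
    cell-free insert, then scale by membrane area. Larger membranes give
    lower raw ohm readings, so multiplying by area makes values
    comparable across insert formats."""
    return (r_total_ohm - r_blank_ohm) * area_cm2

# Hypothetical example: a 1.12 cm^2 insert reads 250 ohm with a cell
# monolayer and 130 ohm without cells.
teer = teer_ohm_cm2(250.0, 130.0, 1.12)
```

This is also why factors the article lists (temperature, medium formulation, passage number) matter: they shift both the blank and the monolayer resistance, so the blank must be measured under matched conditions.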

Journal ArticleDOI
TL;DR: This article provides an overview of methods for wastewater treatment and describes the advantages and disadvantages of the available technologies.
Abstract: During the last 30 years, environmental issues about the chemical and biological contaminations of water have become a major concern for society, public authorities and the industry. Most domestic and industrial activities produce wastewaters containing undesirable toxic contaminants. In this context, a constant effort must be made to protect water resources. Current wastewater treatment methods involve a combination of physical, chemical and biological processes, and operations to remove insoluble particles and soluble contaminants from effluents. This article provides an overview of methods for wastewater treatment, and describes the advantages and disadvantages of available technologies.

Journal ArticleDOI
TL;DR: Transmembrane protein 119 (Tmem119), a cell-surface protein of unknown function, is identified as a highly expressed microglia-specific marker in both mouse and human, which will greatly facilitate understanding of microglial function in health and disease.
Abstract: The specific function of microglia, the tissue resident macrophages of the brain and spinal cord, has been difficult to ascertain because of a lack of tools to distinguish microglia from other immune cells, thereby limiting specific immunostaining, purification, and manipulation. Because of their unique developmental origins and predicted functions, the distinction of microglia from other myeloid cells is critically important for understanding brain development and disease; better tools would greatly facilitate studies of microglia function in the developing, adult, and injured CNS. Here, we identify transmembrane protein 119 (Tmem119), a cell-surface protein of unknown function, as a highly expressed microglia-specific marker in both mouse and human. We developed monoclonal antibodies to its intracellular and extracellular domains that enable the immunostaining of microglia in histological sections in healthy and diseased brains, as well as isolation of pure nonactivated microglia by FACS. Using our antibodies, we provide, to our knowledge, the first RNAseq profiles of highly pure mouse microglia during development and after an immune challenge. We used these to demonstrate that mouse microglia mature by the second postnatal week and to predict novel microglial functions. Together, we anticipate these resources will be valuable for the future study and understanding of microglia in health and disease.

Journal ArticleDOI
TL;DR: This epidemiological picture is an important benchmark for identifying persons at greater risk of suffering from psychological distress and the results are useful for tailoring psychological interventions targeting the post-traumatic nature of the distress.
Abstract: The uncontrolled spread of the coronavirus disease 2019 (COVID-19) has called for unprecedented measures, to the extent that the Italian government has imposed a quarantine on the entire country. Quarantine has a huge impact and can cause considerable psychological strain. The present study aims to establish the prevalence of psychiatric symptoms and identify risk and protective factors for psychological distress in the general population. An online survey was administered from 18-22 March 2020 to 2766 participants. Multivariate ordinal logistic regression models were constructed to examine the associations between sociodemographic variables, personality traits, and levels of depression, anxiety, and stress. Female gender, negative affect, and detachment were associated with higher levels of depression, anxiety, and stress. Having an acquaintance infected was associated with increased levels of both depression and stress, whereas a history of stressful situations and medical problems was associated with higher levels of depression and anxiety. Finally, those with an infected family member and young people who had to work outside their domicile presented higher levels of anxiety and stress, respectively. This epidemiological picture is an important benchmark for identifying persons at greater risk of suffering from psychological distress, and the results are useful for tailoring psychological interventions targeting the post-traumatic nature of the distress.

Proceedings ArticleDOI
15 Jun 2019
TL;DR: Benchmarks suggest that PointPillars is an appropriate encoding for object detection in point clouds; the authors also propose a lean downstream network.
Abstract: Object detection in point clouds is an important aspect of many robotics applications such as autonomous driving. In this paper, we consider the problem of encoding a point cloud into a format appropriate for a downstream detection pipeline. Recent literature suggests two types of encoders; fixed encoders tend to be fast but sacrifice accuracy, while encoders that are learned from data are more accurate, but slower. In this work, we propose PointPillars, a novel encoder which utilizes PointNets to learn a representation of point clouds organized in vertical columns (pillars). While the encoded features can be used with any standard 2D convolutional detection architecture, we further propose a lean downstream network. Extensive experimentation shows that PointPillars outperforms previous encoders with respect to both speed and accuracy by a large margin. Despite only using lidar, our full detection pipeline significantly outperforms the state of the art, even among fusion methods, with respect to both the 3D and bird’s eye view KITTI benchmarks. This detection performance is achieved while running at 62 Hz: a 2 - 4 fold runtime improvement. A faster version of our method matches the state of the art at 105 Hz. These benchmarks suggest that PointPillars is an appropriate encoding for object detection in point clouds.
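The pillar organization the abstract describes amounts to binning lidar points into vertical columns on the x-y grid, with no discretization along z. A simplified sketch (pillar size, the per-pillar point cap, and the raw-coordinate features are illustrative; the actual encoder augments each point with offsets and feeds pillars through a PointNet):

```python
from collections import defaultdict

def make_pillars(points, pillar_size=0.16, max_points=32):
    """Group (x, y, z) lidar points into vertical columns (pillars) on
    the x-y plane. Each pillar keeps at most max_points points; the z
    axis is not discretized, which is what lets the encoded features be
    consumed by a standard 2D convolutional detection backbone."""
    pillars = defaultdict(list)
    for x, y, z in points:
        key = (int(x // pillar_size), int(y // pillar_size))
        if len(pillars[key]) < max_points:
            pillars[key].append((x, y, z))
    return dict(pillars)

# Three points: two fall in the same pillar near the origin, one lands
# a few cells away along x.
points = [(0.05, 0.05, 1.0), (0.10, 0.12, 1.5), (0.50, 0.05, 0.3)]
pillars = make_pillars(points)
```

Because most pillars in a real scene are empty, the learned features for the non-empty ones are scattered back onto a dense 2D pseudo-image before detection, which is where the speed of the method comes from.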

Journal ArticleDOI
17 Mar 2015-Immunity
TL;DR: An integrated high-throughput transcriptional-metabolic profiling and analysis pipeline provides a highly integrated picture of the physiological modules supporting macrophage polarization, identifying potential pharmacologic control points for both macrophages.

Journal ArticleDOI
09 Mar 2018-Science
TL;DR: It is found that adopting a high-fiber diet promoted the growth of SCFA-producing organisms in diabetic humans and led to greater improvement in hemoglobin A1c levels, partly via increased glucagon-like peptide-1 production.
Abstract: The gut microbiota benefits humans via short-chain fatty acid (SCFA) production from carbohydrate fermentation, and deficiency in SCFA production is associated with type 2 diabetes mellitus (T2DM). We conducted a randomized clinical study of specifically designed isoenergetic diets, together with fecal shotgun metagenomics, to show that a select group of SCFA-producing strains was promoted by dietary fibers and that most other potential producers were either diminished or unchanged in patients with T2DM. When the fiber-promoted SCFA producers were present in greater diversity and abundance, participants had better improvement in hemoglobin A1c levels, partly via increased glucagon-like peptide-1 production. Promotion of these positive responders diminished producers of metabolically detrimental compounds such as indole and hydrogen sulfide. Targeted restoration of these SCFA producers may present a novel ecological approach for managing T2DM.

Journal ArticleDOI
TL;DR: Among patients with advanced breast cancer and a germline BRCA1/2 mutation, single‐agent talazoparib provided a significant benefit over standard chemotherapy with respect to progression‐free survival.
Abstract: Background The poly(adenosine diphosphate–ribose) polymerase inhibitor talazoparib has shown antitumor activity in patients with advanced breast cancer and germline mutations in BRCA1 and BRCA2 (BRCA1/2). Methods We conducted a randomized, open-label, phase 3 trial in which patients with advanced breast cancer and a germline BRCA1/2 mutation were assigned, in a 2:1 ratio, to receive talazoparib (1 mg once daily) or standard single-agent therapy of the physician’s choice (capecitabine, eribulin, gemcitabine, or vinorelbine in continuous 21-day cycles). The primary end point was progression-free survival, which was assessed by blinded independent central review. Results Of the 431 patients who underwent randomization, 287 were assigned to receive talazoparib and 144 were assigned to receive standard therapy. Median progression-free survival was significantly longer in the talazoparib group than in the standard-therapy group (8.6 months vs. 5.6 months; hazard ratio for disease progression or death, 0.54; 95% c...

Journal ArticleDOI
TL;DR: A simple scalable method is demonstrated to obtain graphene-based membranes with limited swelling, which exhibit 97% rejection for NaCl and decrease exponentially with decreasing sieve size, but water transport is weakly affected.
Abstract: Ion permeation and selectivity of graphene oxide membranes with sub-nm channels dramatically alters with the change in interlayer distance due to dehydration effects whereas permeation of water molecules remains largely unaffected. Graphene oxide membranes show exceptional molecular permeation properties, with promise for many applications1,2,3,4,5. However, their use in ion sieving and desalination technologies is limited by a permeation cutoff of ∼9 Å (ref. 4), which is larger than the diameters of hydrated ions of common salts4,6. The cutoff is determined by the interlayer spacing (d) of ∼13.5 Å, typical for graphene oxide laminates that swell in water2,4. Achieving smaller d for the laminates immersed in water has proved to be a challenge. Here, we describe how to control d by physical confinement and achieve accurate and tunable ion sieving. Membranes with d from ∼9.8 Å to 6.4 Å are demonstrated, providing a sieve size smaller than the diameters of hydrated ions. In this regime, ion permeation is found to be thermally activated with energy barriers of ∼10–100 kJ mol–1 depending on d. Importantly, permeation rates decrease exponentially with decreasing sieve size but water transport is weakly affected (by a factor of <2). The latter is attributed to a low barrier for the entry of water molecules and large slip lengths inside graphene capillaries. Building on these findings, we demonstrate a simple scalable method to obtain graphene-based membranes with limited swelling, which exhibit 97% rejection for NaCl.
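Thermally activated permeation follows an Arrhenius dependence, rate ∝ exp(−E/RT), so the reported barrier range of ∼10–100 kJ/mol spans many orders of magnitude in rate. A quick worked example of that exponential sensitivity (the temperature choice is illustrative):

```python
import math

R = 8.314  # molar gas constant, J/(mol*K)

def relative_rate(e_barrier_kj_mol, temp_k=300.0):
    """Arrhenius factor exp(-E/RT) for a permeation barrier E, relative
    to a barrier-free process. Only ratios of these factors are
    meaningful here; prefactors cancel."""
    return math.exp(-e_barrier_kj_mol * 1000.0 / (R * temp_k))

# Raising the barrier from 10 to 100 kJ/mol (the range reported as d
# shrinks) suppresses the permeation rate by roughly 15-16 orders of
# magnitude at ~300 K, consistent with the exponential decrease in ion
# permeation while water transport is only weakly affected.
suppression = relative_rate(10.0) / relative_rate(100.0)
```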

Book ChapterDOI
08 Oct 2016
TL;DR: Deep CORAL extends CORAL by aligning the correlations of layer activations in deep neural networks, learning a nonlinear transformation that aligns the second-order statistics of the source and target distributions.
Abstract: Deep neural networks are able to learn powerful representations from large quantities of labeled input data, however they cannot always generalize well across changes in input distributions. Domain adaptation algorithms have been proposed to compensate for the degradation in performance due to domain shift. In this paper, we address the case when the target domain is unlabeled, requiring unsupervised adaptation. CORAL [18] is a simple unsupervised domain adaptation method that aligns the second-order statistics of the source and target distributions with a linear transformation. Here, we extend CORAL to learn a nonlinear transformation that aligns correlations of layer activations in deep neural networks (Deep CORAL). Experiments on standard benchmark datasets show state-of-the-art performance. Our code is available at: https://github.com/VisionLearningGroup/CORAL.
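The second-order alignment at the heart of CORAL can be written as a loss: the squared Frobenius distance between the source and target feature covariance matrices, scaled by 1/(4d²). A minimal NumPy sketch of that loss (my illustration of the paper's Eq. 1, not the authors' released code):

```python
import numpy as np

def coral_loss(source, target):
    """CORAL loss: squared Frobenius distance between the feature
    covariance matrices of a source batch and a target batch,
    scaled by 1/(4 d^2), where d is the feature dimension."""
    d = source.shape[1]
    # np.cov with rowvar=False treats columns as features and
    # normalizes by (n - 1), matching the paper's covariance estimate
    cs = np.cov(source, rowvar=False)
    ct = np.cov(target, rowvar=False)
    return np.sum((cs - ct) ** 2) / (4.0 * d * d)

rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(128, 16))
tgt = rng.normal(0.0, 2.0, size=(128, 16))  # simulated domain shift: different scale
print(coral_loss(src, src))  # identical batches -> zero loss
print(coral_loss(src, tgt))  # covariance mismatch -> positive loss
```

In Deep CORAL this term is computed on batch activations of a chosen layer and added to the classification loss, so minimizing it pulls the target feature statistics toward the source's.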

Journal ArticleDOI
TL;DR: A new tool, robvis (Risk‐Of‐Bias VISualization), is presented, available as an R package and web app, which facilitates rapid production of publication‐quality risk‐of‐bias assessment figures.
Abstract: Despite a major increase in the range and number of software offerings now available to help researchers produce evidence syntheses, there is currently no generic tool for producing figures to display and explore the risk-of-bias assessments that routinely take place as part of systematic review. However, tools such as the R programming environment and Shiny (an R package for building interactive web apps) have made it straightforward to produce new tools to help in producing evidence syntheses. We present a new tool, robvis (Risk-Of-Bias VISualization), available as an R package and web app, which facilitates rapid production of publication-quality risk-of-bias assessment figures. We present a timeline of the tool's development and its key functionality.

Journal ArticleDOI
02 Jun 2016-Nature
TL;DR: It is demonstrated that proteogenomic analysis of breast cancer elucidates functional consequences of somatic mutations, narrows candidate nominations for driver genes within large deletions and amplified regions, and identifies therapeutic targets.
Abstract: Somatic mutations have been extensively characterized in breast cancer, but the effects of these genetic alterations on the proteomic landscape remain poorly understood. Here we describe quantitative mass-spectrometry-based proteomic and phosphoproteomic analyses of 105 genomically annotated breast cancers, of which 77 provided high-quality data. Integrated analyses provided insights into the somatic cancer genome including the consequences of chromosomal loss, such as the 5q deletion characteristic of basal-like breast cancer. Interrogation of the 5q trans-effects against the Library of Integrated Network-based Cellular Signatures, connected loss of CETN3 and SKP1 to elevated expression of epidermal growth factor receptor (EGFR), and SKP1 loss also to increased SRC tyrosine kinase. Global proteomic data confirmed a stromal-enriched group of proteins in addition to basal and luminal clusters, and pathway analysis of the phosphoproteome identified a G-protein-coupled receptor cluster that was not readily identified at the mRNA level. In addition to ERBB2, other amplicon-associated highly phosphorylated kinases were identified, including CDK12, PAK1, PTK2, RIPK2 and TLK2. We demonstrate that proteogenomic analysis of breast cancer elucidates the functional consequences of somatic mutations, narrows candidate nominations for driver genes within large deletions and amplified regions, and identifies therapeutic targets.

Journal ArticleDOI
04 Apr 2017-JAMA
TL;DR: Among patients in the United States diagnosed with thyroid cancer from 1974-2013, the overall incidence of thyroid cancer increased 3% annually, with increases in the incidence rate and thyroid cancer mortality rate for advanced-stage papillary thyroid cancer.
Abstract: Importance Thyroid cancer incidence has increased substantially in the United States over the last 4 decades, driven largely by increases in papillary thyroid cancer. It is unclear whether the increasing incidence of papillary thyroid cancer has been related to thyroid cancer mortality trends. Objective To compare trends in thyroid cancer incidence and mortality by tumor characteristics at diagnosis. Design, Setting, and Participants Trends in thyroid cancer incidence and incidence-based mortality rates were evaluated using data from the Surveillance, Epidemiology, and End Results-9 (SEER-9) cancer registry program, and annual percent change in rates was calculated using log-linear regression. Exposure Tumor characteristics. Main Outcomes and Measures Annual percent changes in age-adjusted thyroid cancer incidence and incidence-based mortality rates by histologic type and SEER stage for cases diagnosed during 1974-2013. Results Among 77 276 patients (mean [SD] age at diagnosis, 48 [16] years; 58 213 [75%] women) diagnosed with thyroid cancer from 1974-2013, papillary thyroid cancer was the most common histologic type (64 625 cases), and 2371 deaths from thyroid cancer occurred during 1994-2013. Thyroid cancer incidence increased, on average, 3.6% per year (95% CI, 3.2%-3.9%) during 1974-2013 (from 4.56 per 100 000 person-years in 1974-1977 to 14.42 per 100 000 person-years in 2010-2013), primarily related to increases in papillary thyroid cancer (annual percent change, 4.4% [95% CI, 4.0%-4.7%]). Papillary thyroid cancer incidence increased for all SEER stages at diagnosis (4.6% per year for localized, 4.3% per year for regional, 2.4% per year for distant, 1.8% per year for unknown). 
During 1994-2013, incidence-based mortality increased 1.1% per year (95% CI, 0.6%-1.6%) (from 0.40 per 100 000 person-years in 1994-1997 to 0.46 per 100 000 person-years in 2010-2013) overall and 2.9% per year (95% CI, 1.1%-4.7%) for SEER distant stage papillary thyroid cancer. Conclusions and Relevance Among patients in the United States diagnosed with thyroid cancer from 1974-2013, the overall incidence of thyroid cancer increased 3% annually, with increases in the incidence rate and thyroid cancer mortality rate for advanced-stage papillary thyroid cancer. These findings are consistent with a true increase in the occurrence of thyroid cancer in the United States.
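The annual percent change (APC) figures above come from log-linear regression: fit ln(rate) = a + b·year and report APC = 100·(e^b − 1). A small sketch of that calculation on synthetic data (the year range and 1974–1977 baseline rate are taken from the abstract; the perfectly exponential series is my illustration, not SEER data):

```python
import numpy as np

def annual_percent_change(years, rates):
    """Annual percent change from a log-linear fit ln(rate) = a + b*year:
    APC = 100 * (exp(b) - 1)."""
    b, a = np.polyfit(years, np.log(rates), 1)  # slope first in polyfit output
    return 100.0 * np.expm1(b)

# Synthetic series growing exactly 3.6% per year from the 1974-1977 baseline
years = np.arange(1974, 2014)
rates = 4.56 * 1.036 ** (years - years[0])  # per 100 000 person-years
print(round(annual_percent_change(years, rates), 2))  # recovers 3.6
```

On real registry data the fit is not exact, which is why the reported APCs carry confidence intervals.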

Journal Article
TL;DR: This report provides the most recent national data for 2017-2018 on obesity and severe obesity prevalence among adults by sex, age, and race and Hispanic origin.
Abstract: Obesity is associated with serious health risks (1). Severe obesity further increases the risk of obesity-related complications, such as coronary heart disease and end-stage renal disease (2,3). From 1999-2000 through 2015-2016, a significantly increasing trend in obesity was observed (4). This report provides the most recent national data for 2017-2018 on obesity and severe obesity prevalence among adults by sex, age, and race and Hispanic origin. Trends from 1999-2000 through 2017-2018 for adults aged 20 and over are also presented.

01 Jan 2015
TL;DR: A rich variety of diagnostic tests for these situations have been developed in the econometrics community, a collection of which has been implemented in the packages lmtest and strucchange covering the problems mentioned above.
Abstract: Linear regression is still one of the most popular tools for data analysis despite (or due to) its simple structure. Although it is appropriate in many situations, there are many pitfalls that might affect the quality of conclusions drawn from fitted models or might even lead to uninterpretable results. Some of these pitfalls that are considered especially important in applied econometrics are heteroskedasticity or serial correlation of the error terms, structural changes in the regression coefficients, nonlinearities, functional misspecification or omitted variables. Therefore, a rich variety of diagnostic tests for these situations has been developed in the econometrics community, a collection of which has been implemented in the packages lmtest and strucchange covering the problems mentioned above. These diagnostic tests are not only useful in econometrics but also in many other fields where linear regression is used, which we will demonstrate with an application from biostatistics. As Breiman (2001) argues it is important to assess the goodness-of-fit of data models, in particular not only using omnibus tests but tests designed for a certain direction of the alternative. These diagnostic checks do not have to be seen as pure significance procedures but also as an explorative tool to extract information about the structure of the data, especially in connection with residual plots or other diagnostic plots. As Brown, Durbin, and Evans (1975) argue for the recursive CUSUM test, these procedures can “be regarded as yardsticks for the interpretation of data rather than leading to hard and fast decisions.” Moreover, we will always be able to reject the null hypothesis provided we have enough data at hand. The question is not whether the model is wrong (it always is!) but if the irregularities are serious. 
The package strucchange implements a variety of procedures related to structural change of the regression coefficients and was already introduced in R News by Zeileis (2001) and described in more detail in Zeileis, Leisch, Hornik, and Kleiber (2002). Therefore, we will focus on the package lmtest in the following. Most of the tests and the datasets contained in the package are taken from the book of Kramer and Sonnberger (1986), which originally inspired us to write the package. Compared to the book, we implemented later versions of some tests and modern flexible interfaces for the procedures. Most of the tests are based on the OLS residuals of a linear model, which is specified by a formula argument. Instead of a formula a fitted model of class "lm" can also be supplied, which should work if the data are either contained in the object or still present in the workspace; however, this is not encouraged. The full references for the tests can be found on the help pages of the respective function. We present applications of the tests contained in lmtest to two different data sets: the first is a macroeconomic time series from the U.S. analysed by Stock and Watson (1996) and the second is data from a study on measurements of fetal mandible length discussed by Royston and Altman (1994).
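To make the flavor of these residual-based diagnostics concrete outside R, here is a minimal NumPy sketch of one of the tests lmtest provides (the Breusch–Pagan test for heteroskedasticity, `bptest` in the package); this is my simplified reimplementation, not lmtest's code:

```python
import numpy as np

def breusch_pagan_lm(y, x):
    """Breusch-Pagan LM statistic: regress y on x by OLS, then regress
    the squared residuals on x; LM = n * R^2 of that auxiliary
    regression, asymptotically chi^2 with (number of regressors) df."""
    n = len(y)
    X = np.column_stack([np.ones(n), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    u2 = (y - X @ beta) ** 2           # squared OLS residuals
    gamma, *_ = np.linalg.lstsq(X, u2, rcond=None)
    fitted = X @ gamma
    r2 = 1.0 - np.sum((u2 - fitted) ** 2) / np.sum((u2 - u2.mean()) ** 2)
    return n * r2

rng = np.random.default_rng(1)
x = rng.uniform(1.0, 5.0, size=500)
y_homo = 2.0 + 0.5 * x + rng.normal(0.0, 1.0, size=500)
y_hetero = 2.0 + 0.5 * x + rng.normal(0.0, x, size=500)  # error sd grows with x
print(breusch_pagan_lm(y_homo, x))    # small: consistent with homoskedasticity
print(breusch_pagan_lm(y_hetero, x))  # large: reject homoskedasticity
```

As the abstract stresses, such a statistic is best read as a yardstick alongside residual plots, not as a hard accept/reject rule.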

Journal ArticleDOI
TL;DR: 3.0 mg of liraglutide, as an adjunct to diet and exercise, was associated with reduced body weight and improved metabolic control in patients with type 2 diabetes and prediabetes.
Abstract: BACKGROUND Obesity is a chronic disease with serious health consequences, but weight loss is difficult to maintain through lifestyle intervention alone. Liraglutide, a glucagon-like peptide-1 analogue, has been shown to have potential benefit for weight management at a once-daily dose of 3.0 mg, injected subcutaneously. METHODS We conducted a 56-week, double-blind trial involving 3731 patients who did not have type 2 diabetes and who had a body-mass index (BMI; the weight in kilograms divided by the square of the height in meters) of at least 30 or a BMI of at least 27 if they had treated or untreated dyslipidemia or hypertension. We randomly assigned patients in a 2:1 ratio to receive once-daily subcutaneous injections of liraglutide at a dose of 3.0 mg (2487 patients) or placebo (1244 patients); both groups received counseling on lifestyle modification. The coprimary end points were the change in body weight and the proportions of patients losing at least 5% and more than 10% of their initial body weight. RESULTS At baseline, the mean (±SD) age of the patients was 45.1±12.0 years, the mean weight was 106.2±21.4 kg, and the mean BMI was 38.3±6.4; a total of 78.5% of the patients were women and 61.2% had prediabetes. At week 56, patients in the liraglutide group had lost a mean of 8.4±7.3 kg of body weight, and those in the placebo group had lost a mean of 2.8±6.5 kg (a difference of −5.6 kg; 95% confidence interval, −6.0 to −5.1; P<0.001, with last-observation-carried-forward imputation). A total of 63.2% of the patients in the liraglutide group as compared with 27.1% in the placebo group lost at least 5% of their body weight (P<0.001), and 33.1% and 10.6%, respectively, lost more than 10% of their body weight (P<0.001). The most frequently reported adverse events with liraglutide were mild or moderate nausea and diarrhea. Serious events occurred in 6.2% of the patients in the liraglutide group and in 5.0% of the patients in the placebo group. 
CONCLUSIONS In this study, 3.0 mg of liraglutide, as an adjunct to diet and exercise, was associated with reduced body weight and improved metabolic control. (Funded by Novo Nordisk; SCALE Obesity and Prediabetes NN8022-1839 ClinicalTrials.gov number, NCT01272219.)

Posted Content
TL;DR: This paper showed that adversarial training confers robustness to single-step attack methods; multi-step attacks transfer less well between models than single-step attacks, making single-step attacks the better choice for mounting black-box attacks.
Abstract: Adversarial examples are malicious inputs designed to fool machine learning models. They often transfer from one model to another, allowing attackers to mount black box attacks without knowledge of the target model's parameters. Adversarial training is the process of explicitly training a model on adversarial examples, in order to make it more robust to attack or to reduce its test error on clean inputs. So far, adversarial training has primarily been applied to small problems. In this research, we apply adversarial training to ImageNet. Our contributions include: (1) recommendations for how to successfully scale adversarial training to large models and datasets, (2) the observation that adversarial training confers robustness to single-step attack methods, (3) the finding that multi-step attack methods are somewhat less transferable than single-step attack methods, so single-step attacks are the best for mounting black-box attacks, and (4) resolution of a "label leaking" effect that causes adversarially trained models to perform better on adversarial examples than on clean examples, because the adversarial example construction process uses the true label and the model can learn to exploit regularities in the construction process.
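The canonical single-step attack discussed here is the fast gradient sign method (FGSM): perturb the input by ε times the sign of the loss gradient with respect to the input. A toy sketch against a hand-set logistic-regression model (my illustration with a deliberately large ε, nothing like the ImageNet models in the paper):

```python
import numpy as np

def fgsm(x, y, w, b, eps):
    """Fast gradient sign method against a logistic model p = sigmoid(w.x + b):
    move x by eps in the sign direction of the cross-entropy gradient w.r.t. x."""
    p = 1.0 / (1.0 + np.exp(-(x @ w + b)))
    grad_x = (p - y) * w  # d(cross-entropy)/dx for a logistic model
    return x + eps * np.sign(grad_x)

# Toy model that classifies x correctly before the attack
w = np.array([1.0, -2.0])
b = 0.0
x = np.array([2.0, -1.0])   # score w.x + b = 4.0 -> predicted class 1
y = 1.0                      # true label

x_adv = fgsm(x, y, w, b, eps=3.0)
score_adv = x_adv @ w + b
print(score_adv)  # the single step pushes the score across the decision boundary
```

Because FGSM needs only one gradient evaluation, adversarial examples built this way on a substitute model transfer well, which is why the paper recommends single-step methods for black-box attacks.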