Showing papers by "Dartmouth College" published in 2019
••
TL;DR: The Seventh Edition of the JCA Special Issue continues to serve as a key resource that guides the utilization of therapeutic apheresis (TA) in the treatment of human disease.
Abstract: The American Society for Apheresis (ASFA) Journal of Clinical Apheresis (JCA) Special Issue Writing Committee is charged with reviewing, updating, and categorizing indications for the evidence-based use of therapeutic apheresis in human disease. Since the 2007 JCA Special Issue (Fourth Edition), the Committee has incorporated systematic review and evidence-based approaches in the grading and categorization of apheresis indications. This Seventh Edition of the JCA Special Issue continues to maintain this methodology and rigor to make recommendations on the use of apheresis in a wide variety of diseases/conditions. The JCA Seventh Edition, like its predecessor, has consistently applied the category and grading system definitions in the fact sheets. The general layout and concept of a fact sheet that was used since the fourth edition has largely been maintained in this edition. Each fact sheet succinctly summarizes the evidence for the use of therapeutic apheresis in a specific disease entity. The Seventh Edition discusses 87 fact sheets (14 new fact sheets since the Sixth Edition) for therapeutic apheresis diseases and medical conditions, with 179 indications, which are separately graded and categorized within the listed fact sheets. Several diseases that are Category IV which have been described in detail in previous editions and do not have significant new evidence since the last publication are summarized in a separate table. The Seventh Edition of the JCA Special Issue serves as a key resource that guides the utilization of therapeutic apheresis in the treatment of human disease. J. Clin. Apheresis 31:149-162, 2016. © 2016 Wiley Periodicals, Inc.
1,691 citations
••
University of California, San Francisco1, University of Birmingham2, University of Liège3, Advocate Lutheran General Hospital4, Kantonsspital St. Gallen5, University of Adelaide6, Baylor College of Medicine7, Mayo Clinic8, University of Southern California9, Asahikawa Medical University10, University of Dundee11, Pontifical Catholic University of Chile12, Uppsala University13, University of Hong Kong14, Royal Adelaide Hospital15, University of Hamburg16, Sunnybrook Health Sciences Centre17, University of Minnesota18, Technische Universität München19, University of Cambridge20, University of Bologna21, Washington University in St. Louis22, Greenville Health System23, University of Bristol24, University of Ottawa25, Nagoya University26, University of Texas Southwestern Medical Center27, Shanghai Jiao Tong University28, Icahn School of Medicine at Mount Sinai29, Brigham and Women's Hospital30, Oregon Health & Science University31, University of Buenos Aires32, Duke University33, St. Elizabeth's Medical Center34, Dartmouth College35, University of Massachusetts Amherst36, University of the Witwatersrand37, Ghent University Hospital38, Sun Yat-sen University39
TL;DR: The Global Vascular Guidelines (GVG) propose a new Global Limb Anatomic Staging System (GLASS), which involves defining a preferred target artery path (TAP) and then estimating limb-based patency (LBP), resulting in three stages of complexity for intervention.
993 citations
••
TL;DR: While polarization was previously seen primarily in issue-based terms, a new type of division has emerged in the mass public in recent years: Ordinary Americans increasingly dislike and distrust...
Abstract: While polarization was previously seen primarily in issue-based terms, a new type of division has emerged in the mass public in recent years: Ordinary Americans increasingly dislike and distrust...
920 citations
••
05 Mar 2019
TL;DR: This paper proposes easy data augmentation (EDA) techniques for boosting performance on text classification tasks, consisting of synonym replacement, random insertion, random swap, and random deletion, and shows that EDA improves performance for both convolutional and recurrent neural networks.
Abstract: We present EDA: easy data augmentation techniques for boosting performance on text classification tasks. EDA consists of four simple but powerful operations: synonym replacement, random insertion, random swap, and random deletion. On five text classification tasks, we show that EDA improves performance for both convolutional and recurrent neural networks. EDA demonstrates particularly strong results for smaller datasets; on average, across five datasets, training with EDA while using only 50% of the available training set achieved the same accuracy as normal training with all available data. We also performed extensive ablation studies and suggest parameters for practical use.
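The four EDA operations are simple enough to sketch directly. A minimal illustration (a toy hand-made synonym table stands in for the WordNet lookup the paper uses; function names are my own):

```python
import random

# Toy synonym table standing in for WordNet (the paper's actual source).
SYNONYMS = {"quick": ["fast", "speedy"], "happy": ["glad", "joyful"]}

def synonym_replacement(words, n=1):
    """Replace up to n words that have an entry in the synonym table."""
    out = words[:]
    candidates = [i for i, w in enumerate(out) if w in SYNONYMS]
    random.shuffle(candidates)
    for i in candidates[:n]:
        out[i] = random.choice(SYNONYMS[out[i]])
    return out

def random_insertion(words, n=1):
    """Insert a synonym of a random eligible word at a random position, n times."""
    out = words[:]
    for _ in range(n):
        candidates = [w for w in out if w in SYNONYMS]
        if not candidates:
            return out
        syn = random.choice(SYNONYMS[random.choice(candidates)])
        out.insert(random.randrange(len(out) + 1), syn)
    return out

def random_swap(words, n=1):
    """Swap the words at two random positions, n times."""
    out = words[:]
    for _ in range(n):
        i, j = random.randrange(len(out)), random.randrange(len(out))
        out[i], out[j] = out[j], out[i]
    return out

def random_deletion(words, p=0.1):
    """Drop each word independently with probability p; keep at least one word."""
    out = [w for w in words if random.random() > p]
    return out if out else [random.choice(words)]
```

Each call produces one augmented sentence; the paper applies several such calls per training example and tunes n and p to the dataset size.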
789 citations
••
TL;DR: This review describes the imprinted polymer production processes, the techniques used for reporting, and the applications of the reported sensors, including those designed to detect toxic chemicals, toxins in foods, drugs, explosives, and pathogens.
Abstract: Molecularly imprinted polymers are synthetic receptors for a targeted molecule. As such, they are analogues of the natural antibody–antigen systems. In this review, after a recounting of the early history of the general field, we specifically focus on the application of these polymers as sensors. In these applications, the polymers are paired with a reporting system, which may be electrical, electrochemical, optical, or gravimetric. The presence of the targeted molecule effects a change in the reporting agent, and a calibrated quantity of the target is recorded. In this review, we describe the imprinted polymer production processes, the techniques used for reporting, and the applications of the reported sensors. A brief survey of recent applications to gas-phase sensing is included, but the focus is primarily on the development of sensors for targets in solution. Included among the applications are those designed to detect toxic chemicals, toxins in foods, drugs, explosives, and pathogens. The application...
749 citations
••
TL;DR: EDA consists of four simple but powerful operations (synonym replacement, random insertion, random swap, and random deletion); experiments show that EDA improves performance for both convolutional and recurrent neural networks.
Abstract: We present EDA: easy data augmentation techniques for boosting performance on text classification tasks. EDA consists of four simple but powerful operations: synonym replacement, random insertion, random swap, and random deletion. On five text classification tasks, we show that EDA improves performance for both convolutional and recurrent neural networks. EDA demonstrates particularly strong results for smaller datasets; on average, across five datasets, training with EDA while using only 50% of the available training set achieved the same accuracy as normal training with all available data. We also performed extensive ablation studies and suggest parameters for practical use.
666 citations
••
TL;DR: In this article, the authors show that anode-free lithium-metal pouch cells with a dual-salt LiDFOB/LiBF4 liquid electrolyte have 80% capacity remaining after 90 charge-discharge cycles.
Abstract: Cells with lithium-metal anodes are viewed as the most viable future technology, with higher energy density than existing lithium-ion batteries. Many researchers believe that for lithium-metal cells, the typical liquid electrolyte used in lithium-ion batteries must be replaced with a solid-state electrolyte to maintain the flat, dendrite-free lithium morphologies necessary for long-term stable cycling. Here, we show that anode-free lithium-metal pouch cells with a dual-salt LiDFOB/LiBF4 liquid electrolyte have 80% capacity remaining after 90 charge–discharge cycles, which is the longest life demonstrated to date for cells with zero excess lithium. The liquid electrolyte enables smooth dendrite-free lithium morphology comprised of densely packed columns even after 50 charge–discharge cycles. NMR measurements reveal that the electrolyte salts responsible for the excellent lithium morphology are slowly consumed during cycling. Extensive efforts have recently been geared towards developing all-solid-state batteries largely because of their potential to enable high-energy-density Li anodes. Here, the authors report a high-performance lithium pouch cell with no excess lithium, enabled by just a dual-salt liquid electrolyte.
520 citations
••
Queen Mary University of London1, Discovery Institute2, University of Glasgow3, Columbia University4, University College London5, King's College London6, Dartmouth College7, Brigham and Women's Hospital8, Leiden University Medical Center9, University of California, Los Angeles10, University of California, San Diego11, Temple University12, Brown University13, University of Edinburgh14, Babraham Institute15, Wellcome Trust Sanger Institute16, University of Bristol17, University of Essex18, CAS-MPG Partner Institute for Computational Biology19, RWTH Aachen University20, Macau University of Science and Technology21
TL;DR: Key challenges to understand clock mechanisms and biomarker utility are discussed, including dissecting the drivers and regulators of age-related changes in single-cell, tissue- and disease-specific models, as well as exploring other epigenomic marks, longitudinal and diverse population studies, and non-human models.
Abstract: Epigenetic clocks comprise a set of CpG sites whose DNA methylation levels measure subject age. These clocks are acknowledged as a highly accurate molecular correlate of chronological age in humans and other vertebrates. Also, extensive research is aimed at their potential to quantify biological aging rates and test longevity or rejuvenating interventions. Here, we discuss key challenges to understand clock mechanisms and biomarker utility. This requires dissecting the drivers and regulators of age-related changes in single-cell, tissue- and disease-specific models, as well as exploring other epigenomic marks, longitudinal and diverse population studies, and non-human models. We also highlight important ethical issues in forensic age determination and predicting the trajectory of biological aging in an individual.
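At their core, the clocks described here are penalized linear models from CpG methylation levels to age. A minimal numpy sketch on simulated data (ridge regression is used as a stand-in for the elastic net behind published clocks; all values below are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_cpgs = 200, 50

# Simulate ages and methylation beta-values: the first five CpGs drift with age.
age = rng.uniform(20, 80, n_subjects)
weights_true = np.zeros(n_cpgs)
weights_true[:5] = 0.004
beta = np.clip(0.5 + np.outer(age, weights_true)
               + rng.normal(0, 0.02, (n_subjects, n_cpgs)), 0, 1)

def fit_clock(X, y, lam=1.0):
    """Ridge regression on centered data: w = (X'X + lam*I)^-1 X'y."""
    Xc, yc = X - X.mean(0), y - y.mean()
    w = np.linalg.solve(Xc.T @ Xc + lam * np.eye(X.shape[1]), Xc.T @ yc)
    return w, y.mean() - X.mean(0) @ w

w, b = fit_clock(beta, age)
predicted_age = beta @ w + b
```

Real clocks are trained the same way, but on thousands of array CpGs across large cohorts, and the gap between predicted and chronological age is what is interpreted as accelerated biological aging.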
457 citations
••
TL;DR: Key advances in the application of 2D materials, from both a historical and analytical perspective, are summarized for four different groups of analytes: gases, volatile compounds, ions, and biomolecules.
Abstract: Electrically-transduced sensors, with their simplicity and compatibility with standard electronic technologies, produce signals that can be efficiently acquired, processed, stored, and analyzed. Two-dimensional (2D) nanomaterials, including graphene, phosphorene (BP), transition metal dichalcogenides (TMDCs), and others, have proven to be attractive for the fabrication of high-performance electrically-transduced chemical sensors due to their remarkable electronic and physical properties originating from their 2D structure. This review highlights the advances in electrically-transduced chemical sensing that rely on 2D materials. The structural components of such sensors are described, and the underlying operating principles for different types of architectures are discussed. The structural features, electronic properties, and surface chemistry of 2D nanostructures that dictate their sensing performance are reviewed. Key advances in the application of 2D materials, from both a historical and analytical perspective...
443 citations
••
University of California, Los Angeles1, University of California, Berkeley2, Goddard Space Flight Center3, Nagoya University4, Kanazawa University5, Tohoku University6, Korea Astronomy and Space Science Institute7, The Aerospace Corporation8, University of Washington9, Dartmouth College10, Montana State University11, University of California, Santa Cruz12, National Cheng Kung University13, Academia Sinica Institute of Astronomy and Astrophysics14, University of Tokyo15, National Central University16, National Oceanic and Atmospheric Administration17, Cooperative Institute for Research in Environmental Sciences18, Johns Hopkins University Applied Physics Laboratory19, Kyushu University20, Kyoto University21, National Institute of Polar Research22, University of Colorado Boulder23, University of Iowa24, University of New Hampshire25, Southwest Research Institute26, National Center for Atmospheric Research27, Université Paris-Saclay28, Boston University29, Braunschweig University of Technology30, University of Calgary31, University of Graz32, University of Minnesota33
TL;DR: The SPEDAS development history, goals, and current implementation are reviewed, and its “modes of use” are explained with examples geared for users and its technical implementation and requirements with software developers in mind are outlined.
Abstract: With the advent of the Heliophysics/Geospace System Observatory (H/GSO), a complement of multi-spacecraft missions and ground-based observatories to study the space environment, data retrieval, analysis, and visualization of space physics data can be daunting. The Space Physics Environment Data Analysis System (SPEDAS), a grass-roots software development platform (www.spedas.org), is now officially supported by NASA Heliophysics as part of its data environment infrastructure. It serves more than a dozen space missions and ground observatories and can integrate the full complement of past and upcoming space physics missions with minimal resources, following clear, simple, and well-proven guidelines. Free, modular, and configurable to the needs of individual missions, it works in both command-line mode (ideal for experienced users) and Graphical User Interface (GUI) mode (reducing the learning curve for first-time users). Both options have “crib-sheets,” user-command sequences in ASCII format that can facilitate record-and-repeat actions, especially for complex operations and plotting. Crib-sheets enhance scientific interactions, as users can move rapidly and accurately from exchanges of technical information on data processing to efficient discussions regarding data interpretation and science. SPEDAS can readily query and ingest all International Solar Terrestrial Physics (ISTP)-compatible products from the Space Physics Data Facility (SPDF), enabling access to a vast collection of historic and current mission data. The planned incorporation of Heliophysics Application Programmer’s Interface (HAPI) standards will facilitate data ingestion from distributed datasets that adhere to these standards. Although SPEDAS is currently Interactive Data Language (IDL)-based (and interfaces to Java-based tools such as Autoplot), efforts are underway to expand it further to work with Python (first as an interface tool and potentially even receiving an under-the-hood replacement). We review the SPEDAS development history, goals, and current implementation. We explain its “modes of use” with examples geared for users and outline its technical implementation and requirements with software developers in mind. We also describe SPEDAS personnel and software management, interfaces with other organizations, resources and support structure available to the community, and future development plans.
371 citations
••
01 Jan 2019
TL;DR: A forensic technique is described that models the facial expressions and movements that typify an individual’s speaking pattern; because deepfake creation violates these correlations, the technique can be used to authenticate video.
Abstract: The creation of sophisticated fake videos has been largely relegated to Hollywood studios or state actors. Recent advances in deep learning, however, have made it significantly easier to create sophisticated and compelling fake videos. With relatively modest amounts of data and computing power, the average person can, for example, create a video of a world leader confessing to illegal activity leading to a constitutional crisis, a military leader saying something racially insensitive leading to civil unrest in an area of military activity, or a corporate titan claiming that their profits are weak leading to global stock manipulation. These so called deep fakes pose a significant threat to our democracy, national security, and society. To contend with this growing threat, we describe a forensic technique that models facial expressions and movements that typify an individual’s speaking pattern. Although not visually apparent, these correlations are often violated by the nature of how deep-fake videos are created and can, therefore, be used for authentication.
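The idea behind the technique can be sketched as follows (a hypothetical simplification: simulated facial-movement time series stand in for the facial action-unit tracking a real system would use, and the signature and threshold here are illustrative, not the paper's): summarize a speaker by the pairwise correlations among movement signals across a video, then flag clips whose correlation signature lies far from the reference.

```python
import numpy as np

def motion_signature(features):
    """features: (n_frames, n_signals) facial-movement time series.
    Returns the upper triangle of the pairwise correlation matrix."""
    c = np.corrcoef(features.T)
    return c[np.triu_indices_from(c, k=1)]

def is_consistent(reference_sig, clip_features, tol=0.5):
    """Authenticate a clip by its distance to the reference signature."""
    d = np.linalg.norm(motion_signature(clip_features) - reference_sig)
    return d < tol

rng = np.random.default_rng(1)
# Real speaker: two movement signals strongly coupled; fake: decoupled.
t = rng.normal(size=(500, 1))
real = np.hstack([t, 0.9 * t + 0.1 * rng.normal(size=(500, 1)),
                  rng.normal(size=(500, 1))])
fake = rng.normal(size=(500, 3))
reference = motion_signature(real)
```

The key property exploited is exactly the one the abstract names: correlations that hold for the genuine speaker are broken by the frame-by-frame synthesis of a deepfake.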
••
University of California, Davis1, Pennsylvania State University2, Aarhus University3, University of Oxford4, University of Lapland5, Institute of Arctic and Alpine Research6, Dartmouth College7, Johns Hopkins University Applied Physics Laboratory8, Umeå University9, University College London10, Harvard University11, Joint Institute for the Study of the Atmosphere and Ocean12, National Oceanic and Atmospheric Administration13
TL;DR: Expected consequences of increased Arctic warming include ongoing loss of land and sea ice, threats to wildlife and traditional human livelihoods, increased methane emissions, and extreme weather at lower latitudes.
Abstract: Over the past decade, the Arctic has warmed by 0.75°C, far outpacing the global average, while Antarctic temperatures have remained comparatively stable. As Earth approaches 2°C warming, the Arctic and Antarctic may reach 4°C and 2°C mean annual warming, and 7°C and 3°C winter warming, respectively. Expected consequences of increased Arctic warming include ongoing loss of land and sea ice, threats to wildlife and traditional human livelihoods, increased methane emissions, and extreme weather at lower latitudes. With low biodiversity, Antarctic ecosystems may be vulnerable to state shifts and species invasions. Land ice loss in both regions will contribute substantially to global sea level rise, with up to 3 m rise possible if certain thresholds are crossed. Mitigation efforts can slow or reduce warming, but without them northern high latitude warming may accelerate in the next two to four decades. International cooperation will be crucial to foreseeing and adapting to expected changes.
••
TL;DR: A number of core pathways and mechanisms of fibrosis, outlined in this Review, are shared across different tissues and might therefore present targets for general antifibrotic strategies, enabling the development of antifibrotic compounds that are effective across different disease entities and organs.
Abstract: Fibrosis is defined as an excessive deposition of connective tissue components and can affect virtually every organ system, including the skin, lungs, liver and kidney. Fibrotic tissue remodelling often leads to organ malfunction and is commonly associated with high morbidity and mortality. The medical need for effective antifibrotic therapies is thus very high. However, the extraordinarily high costs of drug development and the rare incidence of many fibrotic disorders hinder the development of targeted therapies for individual fibrotic diseases. A potential strategy to overcome this challenge is to target common mechanisms and core pathways that are of central pathophysiological relevance across different fibrotic diseases. The factors influencing susceptibility to and initiation of these diseases are often distinct, with disease-specific and organ-specific risk factors, triggers and sites of first injury. Fibrotic remodelling programmes with shared fibrotic signalling responses such as transforming growth factor-β (TGFβ), platelet-derived growth factor (PDGF), WNT and hedgehog signalling drive disease progression in later stages of fibrotic diseases. The convergence towards shared responses has consequences for drug development as it might enable the development of general antifibrotic compounds that are effective across different disease entities and organs. Technological advances, including new models, single-cell technologies and gene editing, could provide new insights into the pathogenesis of fibrotic diseases and the development of drugs for their treatment. A number of core pathways and mechanisms of fibrosis, outlined in this Review, are shared across different tissues and might therefore present targets for general antifibrotic strategies. Organ-specific and disease-specific differences in fibrotic diseases could also provide insights for drug development efforts.
••
TL;DR: Dependent variables (DVs) from self-report surveys of self-regulation are found to have high test–retest reliability, while DVs derived from behavioral tasks do not; this is confirmed to be due to differences in between-subject variability.
Abstract: The ability to regulate behavior in service of long-term goals is a widely studied psychological construct known as self-regulation. This wide interest is in part due to the putative relations between self-regulation and a range of real-world behaviors. Self-regulation is generally viewed as a trait, and individual differences are quantified using a diverse set of measures, including self-report surveys and behavioral tasks. Accurate characterization of individual differences requires measurement reliability, a property frequently characterized in self-report surveys, but rarely assessed in behavioral tasks. We remedy this gap by (i) providing a comprehensive literature review on an extensive set of self-regulation measures and (ii) empirically evaluating test-retest reliability of this battery in a new sample. We find that dependent variables (DVs) from self-report surveys of self-regulation have high test-retest reliability, while DVs derived from behavioral tasks do not. This holds both in the literature and in our sample, although the test-retest reliability estimates in the literature are highly variable. We confirm that this is due to differences in between-subject variability. We also compare different types of task DVs (e.g., model parameters vs. raw response times) in their suitability as individual difference DVs, finding that certain model parameters are as stable as raw DVs. Our results provide greater psychometric footing for the study of self-regulation and provide guidance for future studies of individual differences in this domain.
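The role of between-subject variability can be illustrated directly (simulated data, not the study's battery): test-retest reliability is the ratio of between-subject variance to total variance, so shrinking individual differences while holding measurement noise fixed collapses the retest correlation.

```python
import numpy as np

def retest_correlation(between_sd, noise_sd, n=2000, seed=0):
    """Pearson r between two sessions: a stable trait plus independent noise."""
    rng = np.random.default_rng(seed)
    trait = rng.normal(0, between_sd, n)        # true individual differences
    session1 = trait + rng.normal(0, noise_sd, n)
    session2 = trait + rng.normal(0, noise_sd, n)
    return np.corrcoef(session1, session2)[0, 1]

# Survey-like measure: wide individual differences relative to noise.
r_survey = retest_correlation(between_sd=1.0, noise_sd=0.5)
# Task-like measure: identical noise, but subjects barely differ on the trait.
r_task = retest_correlation(between_sd=0.2, noise_sd=0.5)
```

In expectation r equals between-variance over total variance (about 0.80 for the first call and 0.14 for the second), mirroring the survey-versus-task gap the paper reports.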
••
Queen Mary University of London1, Winthrop-University Hospital2, Karolinska Institutet3, Boston Children's Hospital4, Fondazione IRCCS Ca' Granda Ospedale Maggiore Policlinico5, Harvard University6, University of Colorado Denver7, McMaster University8, University of Auckland9, Universitaire Ziekenhuizen Leuven10, University of Newcastle11, Dublin City University12, University of Tampere13, University of Birmingham14, Pennsylvania State University15, University of Otago16, QIMR Berghofer Medical Research Institute17, Dartmouth College18, Menzies Research Institute19, University of Delhi20, Jikei University School of Medicine21
TL;DR: Vitamin D supplementation was safe and protected against ARIs overall; incorporation of additional IPD from ongoing trials in the field has the potential to increase statistical power for analyses of secondary outcomes.
Abstract: Background Randomised controlled trials (RCTs) exploring the potential of vitamin D to prevent acute respiratory infections have yielded mixed results. Individual participant data (IPD) meta-analysis has the potential to identify factors that may explain this heterogeneity. Objectives To assess the overall effect of vitamin D supplementation on the risk of acute respiratory infections (ARIs) and to identify factors modifying this effect. Data sources MEDLINE, EMBASE, the Cochrane Central Register of Controlled Trials (CENTRAL), Web of Science, ClinicalTrials.gov and the International Standard Randomised Controlled Trials Number (ISRCTN) registry. Study selection Randomised, double-blind, placebo-controlled trials of supplementation with vitamin D3 or vitamin D2 of any duration having incidence of acute respiratory infection as a prespecified efficacy outcome were selected. Study appraisal Study quality was assessed using the Cochrane Collaboration Risk of Bias tool to assess sequence generation, allocation concealment, blinding of participants, personnel and outcome assessors, completeness of outcome data, evidence of selective outcome reporting and other potential threats to validity. Results We identified 25 eligible RCTs (a total of 11,321 participants, aged from 0 to 95 years). IPD were obtained for 10,933 out of 11,321 (96.6%) participants. Vitamin D supplementation reduced the risk of ARI among all participants [adjusted odds ratio (aOR) 0.88, 95% confidence interval (CI) 0.81 to 0.96; heterogeneity p < 0.001]. Subgroup analysis revealed that protective effects were seen in individuals receiving daily or weekly vitamin D without additional bolus doses (aOR 0.81, 95% CI 0.72 to 0.91), but not in those receiving one or more bolus doses (aOR 0.97, 95% CI 0.86 to 1.10; p = 0.05). 
Among those receiving daily or weekly vitamin D, protective effects of vitamin D were stronger in individuals with a baseline 25-hydroxyvitamin D [25(OH)D] concentration of < 25 nmol/l (aOR 0.30, 95% CI 0.17 to 0.53) than in those with a baseline 25(OH)D concentration of ≥ 25 nmol/l (aOR 0.75, 95% CI 0.60 to 0.95; p = 0.006). Vitamin D did not influence the proportion of participants experiencing at least one serious adverse event (aOR 0.98, 95% CI 0.80 to 1.20; p = 0.83). The body of evidence contributing to these analyses was assessed as being of high quality. Limitations Our study had limited power to detect the effects of vitamin D supplementation on the risk of upper versus lower respiratory infection, analysed separately. Conclusions Vitamin D supplementation was safe, and it protected against ARIs overall. Very deficient individuals and those not receiving bolus doses experienced the benefit. Incorporation of additional IPD from ongoing trials in the field has the potential to increase statistical power for analyses of secondary outcomes. Study registration This study is registered as PROSPERO CRD42014013953. Funding The National Institute for Health Research Health Technology Assessment programme.
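The pooling step in such a meta-analysis can be sketched with a fixed-effect inverse-variance average of per-trial log odds ratios (the trial counts below are made up for illustration; the study itself fitted one-stage IPD models with covariate adjustment, which this simple two-stage sketch only approximates):

```python
import math

# Per-trial 2x2 counts: (events_treated, n_treated, events_control, n_control)
trials = [(40, 200, 55, 200), (30, 150, 33, 150), (12, 100, 20, 100)]

def pooled_or(trials):
    """Fixed-effect inverse-variance pooling of per-trial log odds ratios."""
    num = den = 0.0
    for a, n1, c, n0 in trials:
        b, d = n1 - a, n0 - c
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d   # Woolf's variance estimate
        num += log_or / var                    # weight = 1 / variance
        den += 1 / var
    return math.exp(num / den)

or_pooled = pooled_or(trials)
```

The pooled OR necessarily lands inside the range of the individual trial ORs, with precise trials weighted most heavily; an OR below 1, as in the study's aOR of 0.88, indicates a protective effect.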
••
TL;DR: Electron paramagnetic resonance spectroscopy and X-ray photoelectron spectroscopy studies suggested that the chemiresistive response of COF-DC-8 involves charge transfer interactions between the analyte and the nickel-phthalocyanine component of the framework.
Abstract: This paper describes the synthesis of a novel intrinsically conductive two-dimensional (2D) covalent organic framework (COF) through the aromatic annulation of 2,3,9,10,16,17,23,24-octa-aminophthalocyanine nickel(II) and pyrene-4,5,9,10-tetraone. The intrinsic bulk conductivity of the COF material (termed COF-DC-8) reached 2.51 × 10⁻³ S/m, and increased by 3 orders of magnitude with I2 doping. Electronic calculations revealed an anisotropic band structure, with the possibility for significant contribution from out-of-plane charge-transport to the intrinsic bulk conductivity. Upon integration into chemiresistive devices, this conductive COF showed excellent responses to various reducing and oxidizing gases, including NH3, H2S, NO, and NO2, with parts-per-billion (ppb) limits of detection (for NH3 = 70 ppb, for H2S = 204 ppb, for NO = 5 ppb, and for NO2 = 16 ppb based on 1.5 min exposure). Electron paramagnetic resonance spectroscopy and X-ray photoelectron spectroscopy studies suggested that the chemiresistive response of the COF-DC-8 involves charge transfer interactions between the analyte and the nickel-phthalocyanine component of the framework.
••
TL;DR: The third CAFA challenge, CAFA3, featured an expanded analysis over the previous CAFA rounds, both in the volume of data analyzed and the types of analysis performed; it concluded that while predictions of the molecular function and biological process annotations have slightly improved over time, those of the cellular component have not.
Abstract: The Critical Assessment of Functional Annotation (CAFA) is an ongoing, global, community-driven effort to evaluate and improve the computational annotation of protein function. Here, we report on the results of the third CAFA challenge, CAFA3, that featured an expanded analysis over the previous CAFA rounds, both in terms of volume of data analyzed and the types of analysis performed. In a novel and major new development, computational predictions and assessment goals drove some of the experimental assays, resulting in new functional annotations for more than 1000 genes. Specifically, we performed experimental whole-genome mutation screening in Candida albicans and Pseudomonas aeruginosa genomes, which provided us with genome-wide experimental data for genes associated with biofilm formation and motility. We further performed targeted assays on selected genes in Drosophila melanogaster, which we suspected of being involved in long-term memory. We conclude that while predictions of the molecular function and biological process annotations have slightly improved over time, those of the cellular component have not. Term-centric prediction of experimental annotations remains equally challenging; although the performance of the top methods is significantly better than the expectations set by baseline methods in C. albicans and D. melanogaster, it leaves considerable room and need for improvement. Finally, we report that the CAFA community now involves a broad range of participants with expertise in bioinformatics, biological experimentation, biocuration, and bio-ontologies, working together to improve functional annotation, computational function prediction, and our ability to manage big data in the era of large experimental screens.
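CAFA scores protein-centric predictions with the Fmax metric: sweep a confidence threshold, compute precision and recall of each protein's predicted term set against its true annotations, and take the maximum F-measure. A minimal sketch (the data structures are my own; real CAFA evaluation also propagates terms up the GO hierarchy, which is omitted here):

```python
def fmax(truth, scores, thresholds=None):
    """truth: protein -> set of true terms.
    scores: protein -> {term: confidence in (0, 1]}.
    Returns the maximum harmonic mean of precision and recall over thresholds."""
    if thresholds is None:
        thresholds = [t / 100 for t in range(1, 101)]
    best = 0.0
    for tau in thresholds:
        precisions, recalls = [], []
        for prot, true_terms in truth.items():
            pred = {t for t, s in scores.get(prot, {}).items() if s >= tau}
            if pred:  # CAFA averages precision only over covered proteins
                precisions.append(len(pred & true_terms) / len(pred))
            recalls.append(len(pred & true_terms) / len(true_terms))
        if precisions:
            p = sum(precisions) / len(precisions)
            r = sum(recalls) / len(recalls)
            if p + r:
                best = max(best, 2 * p * r / (p + r))
    return best
```

For a single protein with truth {t1, t2} and predictions {t1: 0.9, t3: 0.4}, the best threshold keeps only t1, giving precision 1, recall 1/2, and Fmax = 2/3.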
••
Colorado State University1, University of Auckland2, New Mexico Institute of Mining and Technology3, University of Hull4, Macquarie University5, United States Forest Service6, Durham University7, University of Lausanne8, Dartmouth College9, Texas State University10, University of Texas at Austin11, University of Vienna12, Concordia University Wisconsin13
TL;DR: The value in evaluating boundaries between components of geomorphic systems as transition zones and examining the fluxes across them to understand landscape functioning is emphasized.
Abstract: Connectivity describes the efficiency of material transfer between geomorphic system components such as hillslopes and rivers or longitudinal segments within a river network. Representations of geomorphic systems as networks should recognize that the compartments, links, and nodes exhibit connectivity at differing scales. The historical underpinnings of connectivity in geomorphology involve management of geomorphic systems and observations linking surface processes to landform dynamics. Current work in geomorphic connectivity emphasizes hydrological, sediment, or landscape connectivity. Signatures of connectivity can be detected using diverse indicators that vary from contemporary processes to stratigraphic records or a spatial metric such as sediment yield that encompasses geomorphic processes operating over time and space. One approach to measuring connectivity is to determine the fundamental temporal and spatial scales for the phenomenon of interest and to make measurements at a sufficiently large multiple of the fundamental scales to reliably capture a representative sample. Another approach seeks to characterize how connectivity varies with scale, by applying the same metric over a wide range of scales or using statistical measures that characterize the frequency distributions of connectivity across scales. Identifying and measuring connectivity is useful in basic and applied geomorphic research and we explore the implications of connectivity for river management.
Common themes and ideas that merit further research include: increased understanding of the importance of capturing landscape heterogeneity and connectivity patterns; the potential to use graph and network theory metrics in analyzing connectivity; the need to understand which metrics best represent the physical system and its connectivity pathways, and to apply these metrics to the validation of numerical models; and the need to recognize the importance of low levels of connectivity in some situations. We emphasize the value in evaluating boundaries between components of geomorphic systems as transition zones and examining the fluxes across them to understand landscape functioning.
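As a toy illustration of the graph-theoretic approach mentioned above, a geomorphic system can be modeled as a directed graph of compartments, with a simple connectivity index defined as the fraction of sediment sources from which material can reach the outlet (both the network and the index here are made up for illustration):

```python
from collections import deque

# Directed links: hillslopes feed channel segments, which feed the outlet;
# "h3" drains to an internal basin and never reaches the outlet.
links = {
    "h1": ["c1"], "h2": ["c1"], "h3": ["basin"],
    "c1": ["c2"], "c2": ["outlet"], "basin": [], "outlet": [],
}

def reaches(graph, source, target):
    """Breadth-first search: can material routed from source reach target?"""
    seen, queue = {source}, deque([source])
    while queue:
        node = queue.popleft()
        if node == target:
            return True
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

sources = ["h1", "h2", "h3"]
connectivity_index = (sum(reaches(links, s, "outlet") for s in sources)
                      / len(sources))
```

Here two of the three hillslopes are coupled to the outlet, so the index is 2/3; blocking or restoring a single link changes the index, which is the kind of sensitivity such metrics are meant to expose.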
••
TL;DR: The marketing of prescription drugs, disease awareness campaigns, health services, and laboratory tests and the related consequences and regulation in the United States over a 20-year period (1997-2016) is reviewed.
Abstract: Importance Manufacturers, companies, and health care professionals and organizations use an array of promotional activities to sell and increase market share of their products and services. These activities seek to shape public and clinician beliefs about laboratory testing, the benefits and harms of prescription drugs, and some disease definitions. Objective To review the marketing of prescription drugs, disease awareness campaigns, health services, and laboratory tests and the related consequences and regulation in the United States over a 20-year period (1997-2016). Evidence Analysis (1997-2016) of consumer advertising (Kantar Media data for spending and number of ads); professional marketing (IQVIA Institute for Human Data Science, Open Payments Data [Centers for Medicare & Medicaid Services]); regulations and legal actions of the US Food and Drug Administration (FDA), Federal Trade Commission (FTC), state attorneys general, and US Department of Justice; and searches (1975-2018) of peer-reviewed medical literature (PubMed), business journals (Business Source Ultimate), and news media (Lexis Nexis) for articles about expenditures, content, and consequences and regulation of consumer and professional medical marketing. Spending is reported in 2016 dollars. Findings From 1997 through 2016, spending on medical marketing of drugs, disease awareness campaigns, health services, and laboratory testing increased from $17.7 to $29.9 billion. The most rapid increase was in direct-to-consumer (DTC) advertising, which increased from $2.1 billion (11.9%) of total spending in 1997 to $9.6 billion (32.0%) of total spending in 2016. DTC prescription drug advertising increased from $1.3 billion (79 000 ads) to $6 billion (4.6 million ads [including 663 000 TV commercials]), with a shift toward advertising high-cost biologics and cancer immunotherapies. 
Pharmaceutical companies increased DTC marketing about diseases treated by their drugs with increases in disease awareness campaigns from 44 to 401 and in spending from $177 million to $430 million. DTC advertising for health services increased from $542 million to $2.9 billion, with the largest spending increases by hospitals, dental centers, cancer centers, mental health and addiction clinics, and medical services (eg, home health). DTC spending on advertising for laboratory tests (such as genetic testing) increased from $75.4 million to $82.6 million, although the number of ads increased more substantially (from 14 100 to 255 300), reflecting an increase in less expensive electronic media advertising. Marketing to health care professionals by pharmaceutical companies accounted for most promotional spending and increased from $15.6 billion to $20.3 billion, including $5.6 billion for prescriber detailing, $13.5 billion for free samples, $979 million for direct physician payments (eg, speaking fees, meals) related to specific drugs, and $59 million for disease education. Manufacturers of FDA-approved laboratory tests paid $12.9 million to professionals in 2016. From 1997 through 2016, the number of consumer and professional drug promotional materials that companies submitted for FDA review increased from 34 182 to 97 252, while FDA violation letters for misleading drug marketing decreased from 156 to 11. Since 1997, 103 financial settlements between drug companies and federal and state governments resulted in more than $11 billion in fines for off-label or deceptive marketing practices. The FTC has acted against misleading marketing by a single for-profit cancer center. Conclusions and Relevance Medical marketing increased substantially from 1997 through 2016, especially DTC advertising for prescription drugs and health services. 
Pharmaceutical marketing to health professionals accounted for most spending and remains high even with new policies to limit industry influence. Despite the increase in marketing over 20 years, regulatory oversight remains limited.
••
Northwestern University1, University of California, San Diego2, University of Illinois at Urbana–Champaign3, Colorado State University4, University of Colorado Boulder5, City University of New York6, Dartmouth College7, University of Texas at Austin8, University of Wisconsin-Madison9, University of Minnesota10, National Scientific and Technical Research Council11, University of Los Andes12, University of Arizona13, J. Craig Venter Institute14
TL;DR: The findings indicate that mammalian gut microbiome plasticity in response to dietary shifts over both the lifespan of an individual host and the evolutionary history of a given host species is constrained by host physiological evolution, and the gut microbiome cannot be considered separately from host physiology when describing host nutritional strategies and the emergence of host dietary niches.
Abstract: Over the past decade several studies have reported that the gut microbiomes of mammals with similar dietary niches exhibit similar compositional and functional traits. However, these studies rely heavily on samples from captive individuals and often confound host phylogeny, gut morphology, and diet. To more explicitly test the influence of host dietary niche on the mammalian gut microbiome we use 16S rRNA gene amplicon sequencing and shotgun metagenomics to compare the gut microbiota of 18 species of wild non-human primates classified as either folivores or closely related non-folivores, evenly distributed throughout the primate order and representing a range of gut morphological specializations. While folivory results in some convergent microbial traits, collectively we show that the influence of host phylogeny on both gut microbial composition and function is much stronger than that of host dietary niche. This pattern does not result from differences in host geographic location or actual dietary intake at the time of sampling, but instead appears to result from differences in host physiology. These findings indicate that mammalian gut microbiome plasticity in response to dietary shifts over both the lifespan of an individual host and the evolutionary history of a given host species is constrained by host physiological evolution. Therefore, the gut microbiome cannot be considered separately from host physiology when describing host nutritional strategies and the emergence of host dietary niches.
••
TL;DR: It is concluded that self-regulation lacks coherence as a construct, and that data-driven ontologies lay the groundwork for a cumulative psychological science.
Abstract: Psychological sciences have identified a wealth of cognitive processes and behavioral phenomena, yet struggle to produce cumulative knowledge. Progress is hamstrung by siloed scientific traditions and a focus on explanation over prediction, two issues that are particularly damaging for the study of multifaceted constructs like self-regulation. Here, we derive a psychological ontology from a study of individual differences across a broad range of behavioral tasks, self-report surveys, and self-reported real-world outcomes associated with self-regulation. Though both tasks and surveys putatively measure self-regulation, they show little empirical relationship. Within tasks and surveys, however, the ontology identifies reliable individual traits and reveals opportunities for theoretic synthesis. We then evaluate predictive power of the psychological measurements and find that while surveys modestly and heterogeneously predict real-world outcomes, tasks largely do not. We conclude that self-regulation lacks coherence as a construct, and that data-driven ontologies lay the groundwork for a cumulative psychological science.
••
TL;DR: The information presented in this review depicts the multiple, beneficial effects of A. nodosum-based biostimulant extracts on plant growth and their defense responses and suggests new opportunities for further applications for marked benefits in production and quality in the agriculture and horticultural sectors.
Abstract: Abiotic and biotic stresses limit the growth and productivity of plants. In the current global scenario, in order to meet the requirements of the ever-increasing world population, chemical pesticides and synthetic fertilizers are used to boost agricultural production. These harmful chemicals pose a serious threat to the health of humans, animals, plants, and the entire biosphere. To minimize the agricultural chemical footprint, extracts of Ascophyllum nodosum (ANE) have been explored for their ability to improve plant growth and agricultural productivity. The scientific literature reviewed in this article explains how certain bioactive compounds present in these extracts improve plant tolerance to abiotic and/or biotic stresses, promote plant growth, and affect root/microbe interactions. These reports have highlighted the use of various seaweed extracts in improving nutrient use efficiency in treated plants. These studies include investigations of physiological, biochemical, and molecular mechanisms as evidenced using model plants. However, the various modes of action of A. nodosum extracts have not been previously reviewed. The information presented in this review depicts the multiple, beneficial effects of A. nodosum-based biostimulant extracts on plant growth and their defense responses and suggests new opportunities for further applications, with marked benefits in production and quality in the agricultural and horticultural sectors.
••
TL;DR: This study enrolled adults who fulfilled the 2010 American College of Rheumatology (ACR)-European League Against Rheumatism (EULAR) classification criteria for rheumatoid arthritis and evaluated the safety and efficacy of upadacitinib monotherapy after switching from methotrexate versus continuing methotrexate in patients with an inadequate response to methotrexate.
••
TL;DR: Extended follow-up of the NLST showed a similar NNS as the original analysis, and there was no overall increase in lung cancer incidence in the LDCT versus CXR arm.
••
TL;DR: Manufactured NKG2D-CAR T cells exhibited functional activity against autologous tumor cells in vitro, but modifications to enhance CAR T-cell expansion and target density may be needed to boost clinical activity.
Abstract: NKG2D ligands are widely expressed in solid and hematologic malignancies but absent or poorly expressed on healthy tissues. We conducted a phase I dose-escalation study to evaluate the safety and feasibility of a single infusion of NKG2D-chimeric antigen receptor (CAR) T cells, without lymphodepleting conditioning, in subjects with acute myeloid leukemia/myelodysplastic syndrome or relapsed/refractory multiple myeloma. Autologous T cells were transfected with a γ-retroviral vector encoding a CAR fusing human NKG2D with the CD3ζ signaling domain. Four dose levels (1 × 106-3 × 107 total viable T cells) were evaluated. Twelve subjects were infused [7 acute myeloid leukemia (AML) and 5 multiple myeloma]. NKG2D-CAR products demonstrated a median 75% vector-driven NKG2D expression on CD3+ T cells. No dose-limiting toxicities, cytokine release syndrome, or CAR T cell-related neurotoxicity were observed. No significant autoimmune reactions were noted, and none of the grade ≥3 adverse events were attributable to NKG2D-CAR T cells. At the single injection of low cell doses used in this trial, no objective tumor responses were observed. However, hematologic parameters transiently improved in one subject with AML at the highest dose, and cases of disease stability without further therapy or on subsequent treatments were noted. At 24 hours, the cytokine RANTES increased a median of 1.9-fold among all subjects and 5.8-fold among six AML patients. Consistent with preclinical studies, NKG2D-CAR T-cell expansion and persistence were limited. Manufactured NKG2D-CAR T cells exhibited functional activity against autologous tumor cells in vitro, but modifications to enhance CAR T-cell expansion and target density may be needed to boost clinical activity.
••
University of Hawaii1, European Space Research and Technology Centre2, Sea Education Association3, Utrecht University4, University of Oldenburg5, University of Southampton6, IFREMER7, Plymouth Marine Laboratory8, University of the Highlands and Islands9, University of Plymouth10, University of Southern Denmark11, University of California, San Diego12, University of Copenhagen13, Atlantic Oceanographic and Meteorological Laboratory14, National Institute of Oceanography, India15, Kyushu University16, Shirshov Institute of Oceanology17, California Institute of Technology18, Carnegie Institution for Science19, Williams College20, United States Environmental Protection Agency21, University of Exeter22, Commonwealth Scientific and Industrial Research Organisation23, International Council for the Exploration of the Sea24, VU University Amsterdam25, National Oceanic and Atmospheric Administration26, Dartmouth College27, Catholic University of the North28, University of São Paulo29
TL;DR: In this article, the authors discuss the structure of the future integrated marine debris observing system (IMDOS) that is required to provide long-term monitoring of the state of the anthropogenic pollution and support operational activities to mitigate impacts on the ecosystem and safety of maritime activity.
Abstract: Plastics and other artificial materials pose new risks to the health of the ocean. Anthropogenic debris travels across large distances and is ubiquitous in the water and on the shorelines, yet observations of its sources, composition, pathways, and distributions in the ocean are very sparse and inaccurate. Total amounts of plastics and other man-made debris in the ocean and on the shore, temporal trends in these amounts under exponentially increasing production, as well as degradation processes, vertical fluxes, and time scales are largely unknown. Present ocean circulation models are not able to accurately simulate the drift of debris because of its complex hydrodynamics. In this paper we discuss the structure of the future integrated marine debris observing system (IMDOS) that is required to provide long-term monitoring of the state of anthropogenic pollution and to support operational activities to mitigate impacts on the ecosystem and on the safety of maritime activity. The proposed observing system integrates remote sensing and in situ observations. Models are also used to optimize the design of the system and, in turn, they will be gradually improved using the products of the system. Remote sensing technologies will provide spatially coherent coverage and consistent surveying time series at local to global scale. Optical sensors, including high-resolution imaging, multi- and hyperspectral, fluorescence, and Raman technologies, as well as SAR, will be used to measure different types of debris. They will be implemented in a variety of platforms, from hand-held tools to ship-, buoy-, aircraft-, and satellite-based sensors. A network of in situ observations, including reports from volunteers, citizen scientists, and ships of opportunity, will be developed to provide data for calibration/validation of remote sensors and to monitor the spread of plastic pollution and other marine debris.
IMDOS will interact with other observing systems monitoring physical, chemical, and biological processes in the ocean and on shorelines, as well as the state of the ecosystem, maritime activities and safety, the drift of sea ice, etc. The synthesized data will support innovative multi-disciplinary research and serve a diverse community of users.
••
TL;DR: In this article, a convolutional neural network is used to identify regions of neoplastic cells, then aggregates those classifications to infer predominant and minor histologic patterns for any given whole-slide image.
Abstract: Classification of histologic patterns in lung adenocarcinoma is critical for determining tumor grade and treatment for patients. However, this task is often challenging due to the heterogeneous nature of lung adenocarcinoma and the subjective criteria for evaluation. In this study, we propose a deep learning model that automatically classifies the histologic patterns of lung adenocarcinoma on surgical resection slides. Our model uses a convolutional neural network to identify regions of neoplastic cells, then aggregates those classifications to infer predominant and minor histologic patterns for any given whole-slide image. We evaluated our model on an independent set of 143 whole-slide images. It achieved a kappa score of 0.525 and an agreement of 66.6% with three pathologists for classifying the predominant patterns, slightly higher than the inter-pathologist kappa score of 0.485 and agreement of 62.7% on this test set. All evaluation metrics for our model and the three pathologists were within 95% confidence intervals of agreement. If confirmed in clinical practice, our model can assist pathologists in improving classification of lung adenocarcinoma patterns by automatically pre-screening and highlighting cancerous regions prior to review. Our approach can be generalized to any whole-slide image classification task, and code is made publicly available at https://github.com/BMIRDS/deepslide .
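The patch-to-slide aggregation step described above can be sketched as follows. This is an illustrative simplification, not the authors' exact procedure: the pattern labels and the frequency-threshold rule (a minor pattern is any non-predominant pattern covering at least 5% of patches) are assumptions for the example.

```python
from collections import Counter

def aggregate_patterns(patch_labels, minor_threshold=0.05):
    """Infer predominant and minor histologic patterns from patch-level labels.

    patch_labels: list of per-patch classifications for one whole-slide image.
    Returns (predominant_pattern, sorted list of minor patterns).
    """
    counts = Counter(patch_labels)
    total = sum(counts.values())
    predominant, _ = counts.most_common(1)[0]
    minor = sorted(
        label for label, c in counts.items()
        if label != predominant and c / total >= minor_threshold
    )
    return predominant, minor

# Hypothetical patch predictions for one slide.
patches = ["acinar"] * 60 + ["lepidic"] * 25 + ["solid"] * 10 + ["micropapillary"] * 5
print(aggregate_patterns(patches))  # -> ('acinar', ['lepidic', 'micropapillary', 'solid'])
```

In practice the patch labels would come from the convolutional neural network's per-region classifications, and the thresholds would be tuned against pathologist annotations.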