Showing papers by "Pennsylvania State University published in 2021"
••
TL;DR: Using a variety of statistical and dynamic modeling approaches, the authors estimate that this variant has a 43 to 90% (range of 95% credible intervals, 38 to 130%) higher reproduction number than preexisting variants, and a fitted two-strain dynamic transmission model shows that VOC 202012/01 will lead to large resurgences of COVID-19 cases.
Abstract: A severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) variant, VOC 202012/01 (lineage B.1.1.7), emerged in southeast England in September 2020 and is rapidly spreading toward fixation. Using a variety of statistical and dynamic modeling approaches, we estimate that this variant has a 43 to 90% (range of 95% credible intervals, 38 to 130%) higher reproduction number than preexisting variants. A fitted two-strain dynamic transmission model shows that VOC 202012/01 will lead to large resurgences of COVID-19 cases. Without stringent control measures, including limited closure of educational institutions and a greatly accelerated vaccine rollout, COVID-19 hospitalizations and deaths across England in the first 6 months of 2021 were projected to exceed those in 2020. VOC 202012/01 has spread globally and exhibits a similar transmission increase (59 to 74%) in Denmark, Switzerland, and the United States.
1,935 citations
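The strain-replacement dynamic described above can be illustrated with a minimal two-strain SIR competition model. This is a toy sketch with illustrative parameter values (gamma and both beta values are assumptions), not the paper's fitted two-strain transmission model:

```python
def two_strain_sir(beta1, beta2, gamma=0.2, days=200, dt=0.1):
    """Two strains competing for one susceptible pool (forward Euler).

    The strain with the higher transmission rate grows faster and so
    drifts toward fixation among circulating infections."""
    S, I1, I2 = 0.99, 0.009, 0.001          # variant starts at 10% of cases
    for _ in range(int(days / dt)):
        new1 = beta1 * S * I1 * dt           # new infections, resident strain
        new2 = beta2 * S * I2 * dt           # new infections, variant
        S -= new1 + new2
        I1 += new1 - gamma * I1 * dt         # recovery at rate gamma
        I2 += new2 - gamma * I2 * dt
    return I1, I2

# Variant transmission rate ~56% higher, the midpoint of the 43-90% estimate
I1, I2 = two_strain_sir(beta1=0.25, beta2=0.39)
variant_share = I2 / (I1 + I2)               # grows toward 1 over time
```

Because both strains draw on the same susceptible pool, even a modest transmission advantage compounds exponentially, which is why a variant can head toward fixation well before it dominates hospitalizations.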
••
University of Nebraska Medical Center1, University of Texas Health Science Center at San Antonio2, Emory University3, National Institutes of Health4, Duke University5, University of California, Irvine6, University of Minnesota7, Cedars-Sinai Medical Center8, University of Florida9, Parkland Health & Hospital System10, University of California, San Diego11, Baylor College of Medicine12, University of Rochester13, Tan Tock Seng Hospital14, Scott & White Hospital15, University of California, San Francisco16, University of California, Davis17, University of Massachusetts Medical School18, University of Virginia19, Northwestern University20, Pennsylvania State University21, Providence Sacred Heart Medical Center and Children's Hospital22, University of Alabama at Birmingham23, Stanford University24, Denver Health Medical Center25, Seoul National University26, Changi General Hospital27, Kaiser Permanente28, Uniformed Services University of the Health Sciences29, Eli Lilly and Company30
TL;DR: Baricitinib plus remdesivir was superior to remdesivir alone in reducing recovery time and accelerating improvement in clinical status among patients with Covid-19, notably among those receiving high-flow oxygen or noninvasive ventilation.
Abstract: Background Severe coronavirus disease 2019 (Covid-19) is associated with dysregulated inflammation. The effects of combination treatment with baricitinib, a Janus kinase inhibitor, plus remdesivir are not known. Methods We conducted a double-blind, randomized, placebo-controlled trial evaluating baricitinib plus remdesivir in hospitalized adults with Covid-19. All the patients received remdesivir (≤10 days) and either baricitinib (≤14 days) or placebo (control). The primary outcome was the time to recovery. The key secondary outcome was clinical status at day 15. Results A total of 1033 patients underwent randomization (with 515 assigned to combination treatment and 518 to control). Patients receiving baricitinib had a median time to recovery of 7 days (95% confidence interval [CI], 6 to 8), as compared with 8 days (95% CI, 7 to 9) with control (rate ratio for recovery, 1.16; 95% CI, 1.01 to 1.32; P = 0.03), and a 30% higher odds of improvement in clinical status at day 15 (odds ratio, 1.3; 95% CI, 1.0 to 1.6). Patients receiving high-flow oxygen or noninvasive ventilation at enrollment had a time to recovery of 10 days with combination treatment and 18 days with control (rate ratio for recovery, 1.51; 95% CI, 1.10 to 2.08). The 28-day mortality was 5.1% in the combination group and 7.8% in the control group (hazard ratio for death, 0.65; 95% CI, 0.39 to 1.09). Serious adverse events were less frequent in the combination group than in the control group (16.0% vs. 21.0%; difference, -5.0 percentage points; 95% CI, -9.8 to -0.3; P = 0.03), as were new infections (5.9% vs. 11.2%; difference, -5.3 percentage points; 95% CI, -8.7 to -1.9; P = 0.003). Conclusions Baricitinib plus remdesivir was superior to remdesivir alone in reducing recovery time and accelerating improvement in clinical status among patients with Covid-19, notably among those receiving high-flow oxygen or noninvasive ventilation. 
The combination was associated with fewer serious adverse events. (Funded by the National Institute of Allergy and Infectious Diseases; ClinicalTrials.gov number, NCT04401579.).
1,301 citations
••
Daniel J. Klionsky1, Amal Kamal Abdel-Aziz2, Sara Abdelfatah3, Mahmoud Abdellatif4 +2980 more•Institutions (777)
TL;DR: In this article, the authors present a set of guidelines for investigators to select and interpret methods to examine autophagy and related processes, and for reviewers to provide realistic and reasonable critiques of reports that are focused on these processes.
Abstract: In 2008, we published the first set of guidelines for standardizing research in autophagy. Since then, this topic has received increasing attention, and many scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Thus, it is important to formulate on a regular basis updated guidelines for monitoring autophagy in different organisms. Despite numerous reviews, there continues to be confusion regarding acceptable methods to evaluate autophagy, especially in multicellular eukaryotes. Here, we present a set of guidelines for investigators to select and interpret methods to examine autophagy and related processes, and for reviewers to provide realistic and reasonable critiques of reports that are focused on these processes. These guidelines are not meant to be a dogmatic set of rules, because the appropriateness of any assay largely depends on the question being asked and the system being used. Moreover, no individual assay is perfect for every situation, calling for the use of multiple techniques to properly monitor autophagy in each experimental setting. Finally, several core components of the autophagy machinery have been implicated in distinct autophagic processes (canonical and noncanonical autophagy), implying that genetic approaches to block autophagy should rely on targeting two or more autophagy-related genes that ideally participate in distinct steps of the pathway. Along similar lines, because multiple proteins involved in autophagy also regulate other cellular pathways including apoptosis, not all of them can be used as a specific marker for bona fide autophagic responses. Here, we critically discuss current methods of assessing autophagy and the information they can, or cannot, provide. Our ultimate goal is to encourage intellectual and technical innovation in the field.
1,129 citations
••
Daniel Taliun1, Daniel N. Harris2, Michael D. Kessler2, Jedidiah Carlson3 +202 more•Institutions (61)
TL;DR: The Trans-Omics for Precision Medicine (TOPMed) project as discussed by the authors aims to elucidate the genetic architecture and biology of heart, lung, blood and sleep disorders, with the ultimate goal of improving diagnosis, treatment and prevention of these diseases.
Abstract: The Trans-Omics for Precision Medicine (TOPMed) programme seeks to elucidate the genetic architecture and biology of heart, lung, blood and sleep disorders, with the ultimate goal of improving diagnosis, treatment and prevention of these diseases. The initial phases of the programme focused on whole-genome sequencing of individuals with rich phenotypic data and diverse backgrounds. Here we describe the TOPMed goals and design as well as the available resources and early insights obtained from the sequence data. The resources include a variant browser, a genotype imputation server, and genomic and phenotypic data that are available through dbGaP (Database of Genotypes and Phenotypes)1. In the first 53,831 TOPMed samples, we detected more than 400 million single-nucleotide and insertion or deletion variants after alignment with the reference genome. Additional previously undescribed variants were detected through assembly of unmapped reads and customized analysis in highly variable loci. Among the more than 400 million detected variants, 97% have frequencies of less than 1% and 46% are singletons that are present in only one individual (53% among unrelated individuals). These rare variants provide insights into mutational processes and recent human evolutionary history. The extensive catalogue of genetic variation in TOPMed studies provides unique opportunities for exploring the contributions of rare and noncoding sequence variants to phenotypic variation. Furthermore, combining TOPMed haplotypes with modern imputation methods improves the power and reach of genome-wide association studies to include variants down to a frequency of approximately 0.01%. The goals, resources and design of the NHLBI Trans-Omics for Precision Medicine (TOPMed) programme are described, and analyses of rare variants detected in the first 53,831 samples provide insights into mutational processes and recent human evolutionary history.
801 citations
••
National Institutes of Health1, Wellcome Trust Sanger Institute2, University of Cambridge3, Rockefeller University4, University of California, Davis5, Leibniz Association6, Seoul National University7, University of Southern California8, European Bioinformatics Institute9, Max Planck Society10, Dresden University of Technology11, Radboud University Nijmegen12, University of St Andrews13, University of Massachusetts Amherst14, University of Adelaide15, University of Missouri16, East Carolina University17, University of Queensland18, Clemson University19, University of Otago20, University of Arizona21, Natural History Museum22, Bangor University23, University of Konstanz24, Harvard University25, Northeastern University26, University of Antwerp27, National Museum of Natural History28, University of Graz29, University of Florida30, University of Basel31, University of California, Santa Cruz32, Zoological Society of San Diego33, Pacific Biosciences34, Pompeu Fabra University35, University of Maryland, College Park36, Harbin Institute of Technology37, University of Chicago38, Oregon Health & Science University39, Qatar Airways40, Monash University Malaysia Campus41, University of Milan42, Goethe University Frankfurt43, Pennsylvania State University44, University of Los Andes45, University of Copenhagen46, Norwegian University of Science and Technology47, Agency for Science, Technology and Research48, Royal Ontario Museum49, Smithsonian Institution50, Howard Hughes Medical Institute51, Walter Reed Army Institute of Research52, University of East Anglia53, University College Dublin54, University of Illinois at Urbana–Champaign55, La Trobe University56, University of California, San Diego57, Nova Southeastern University58
TL;DR: The Vertebrate Genomes Project (VGP) as mentioned in this paper is an international effort to generate high-quality, complete reference genomes for all of the roughly 70,000 extant vertebrate species and to help to enable a new era of discovery across the life sciences.
Abstract: High-quality and complete reference genome assemblies are fundamental for the application of genomics to biology, disease, and biodiversity conservation. However, such assemblies are available for only a few non-microbial species1-4. To address this issue, the international Genome 10K (G10K) consortium5,6 has worked over a five-year period to evaluate and develop cost-effective methods for assembling highly accurate and nearly complete reference genomes. Here we present lessons learned from generating assemblies for 16 species that represent six major vertebrate lineages. We confirm that long-read sequencing technologies are essential for maximizing genome quality, and that unresolved complex repeats and haplotype heterozygosity are major sources of assembly error when not handled correctly. Our assemblies correct substantial errors, add missing sequence in some of the best historical reference genomes, and reveal biological discoveries. These include the identification of many false gene duplications, increases in gene sizes, chromosome rearrangements that are specific to lineages, a repeated independent chromosome breakpoint in bat genomes, and a canonical GC-rich pattern in protein-coding genes and their regulatory regions. Adopting these lessons, we have embarked on the Vertebrate Genomes Project (VGP), an international effort to generate high-quality, complete reference genomes for all of the roughly 70,000 extant vertebrate species and to help to enable a new era of discovery across the life sciences.
647 citations
••
TL;DR: It is hypothesized that age-related decline and dysregulation of immune function, i.e., immunosenescence and inflammaging, play a major role in the heightened vulnerability of older adults to severe COVID-19 outcomes, and the authors call for partitioning all immunological outcome data by age to better understand disease heterogeneity and aging.
495 citations
••
01 Oct 2021
TL;DR: More than half of COVID-19 survivors experienced persistent postacute sequelae (PASC) 6 months after recovery as mentioned in this paper, and most common PASC involved functional mobility impairments, pulmonary abnormalities, and mental health disorders.
Abstract: Importance Short-term and long-term persistent postacute sequelae of COVID-19 (PASC) have not been systematically evaluated. The incidence and evolution of PASC are dependent on time from infection, organ systems and tissue affected, vaccination status, variant of the virus, and geographic region. Objective To estimate organ system–specific frequency and evolution of PASC. Evidence Review PubMed (MEDLINE), Scopus, the World Health Organization Global Literature on Coronavirus Disease, and CoronaCentral databases were searched from December 2019 through March 2021. A total of 2100 studies were identified from databases and through cited references. Studies providing data on PASC in children and adults were included. The Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) guidelines for abstracting data were followed and performed independently by 2 reviewers. Quality was assessed using the Newcastle-Ottawa Scale for cohort studies. The main outcome was frequency of PASC diagnosed by (1) laboratory investigation, (2) radiologic pathology, and (3) clinical signs and symptoms. PASC were classified by organ system, i.e., neurologic; cardiovascular; respiratory; digestive; dermatologic; and ear, nose, and throat as well as mental health, constitutional symptoms, and functional mobility. Findings From a total of 2100 studies identified, 57 studies with 250 351 survivors of COVID-19 met inclusion criteria. The mean (SD) age of survivors was 54.4 (8.9) years, 140 196 (56%) were male, and 197 777 (79%) were hospitalized during acute COVID-19. High-income countries contributed 45 studies (79%). The median (IQR) proportion of COVID-19 survivors experiencing at least 1 PASC was 54.0% (45.0%-69.0%; 13 studies) at 1 month (short-term), 55.0% (34.8%-65.5%; 38 studies) at 2 to 5 months (intermediate-term), and 54.0% (31.0%-67.0%; 9 studies) at 6 or more months (long-term).
Most prevalent pulmonary sequelae, neurologic disorders, mental health disorders, functional mobility impairments, and general and constitutional symptoms were chest imaging abnormality (median [IQR], 62.2% [45.8%-76.5%]), difficulty concentrating (median [IQR], 23.8% [20.4%-25.9%]), generalized anxiety disorder (median [IQR], 29.6% [14.0%-44.0%]), general functional impairments (median [IQR], 44.0% [23.4%-62.6%]), and fatigue or muscle weakness (median [IQR], 37.5% [25.4%-54.5%]), respectively. Other frequently reported symptoms included cardiac, dermatologic, digestive, and ear, nose, and throat disorders. Conclusions and Relevance In this systematic review, more than half of COVID-19 survivors experienced PASC 6 months after recovery. The most common PASC involved functional mobility impairments, pulmonary abnormalities, and mental health disorders. These long-term PASC effects occur on a scale that could overwhelm existing health care capacity, particularly in low- and middle-income countries.
481 citations
••
TL;DR: In this article, the authors analyze the population of 47 compact binary mergers from LIGO–Virgo observations, finding that a fraction were dynamically assembled and that the BBH merger rate likely increases with redshift, but not faster than the star formation rate.
Abstract: We report on the population of 47 compact binary mergers detected with a false-alarm rate of […] 0.01 are dynamically assembled. Third, we estimate merger rates, finding R_BBH = 23.9 (+14.3/−8.6) Gpc⁻³ yr⁻¹ for BBHs and R_BNS = 320 (+490/−240) Gpc⁻³ yr⁻¹ for binary neutron stars. We find that the BBH rate likely increases with redshift (85% credibility) but not faster than the star formation rate (86% credibility). Additionally, we examine recent exceptional events in the context of our population models, finding that the asymmetric masses of GW190412 and the high component masses of GW190521 are consistent with our models, but the low secondary mass of GW190814 makes it an outlier.
468 citations
••
TL;DR: It was shown that being a woman, having fair/poor general health status, being 18 to 24 years old, spending 8 or more hours on screens daily, and knowing someone infected predicted higher levels of psychological impact when risk factors were considered simultaneously.
Abstract: Background University students are increasingly recognized as a vulnerable population, suffering from higher levels of anxiety, depression, substance abuse, and disordered eating compared to the general population. Therefore, when the nature of their educational experience radically changes—such as sheltering in place during the COVID-19 pandemic—the burden on the mental health of this vulnerable population is amplified. The objectives of this study are to 1) identify the array of psychological impacts COVID-19 has on students, 2) develop profiles to characterize students' anticipated levels of psychological impact during the pandemic, and 3) evaluate potential sociodemographic, lifestyle-related, and awareness of people infected with COVID-19 risk factors that could make students more likely to experience these impacts. Methods Cross-sectional data were collected through web-based questionnaires from seven U.S. universities. Representative and convenience sampling was used to invite students to complete the questionnaires in mid-March to early-May 2020, when most coronavirus-related sheltering in place orders were in effect. We received 2,534 completed responses, of which 61% were from women, 79% from non-Hispanic Whites, and 20% from graduate students. Results Exploratory factor analysis on close-ended responses resulted in two latent constructs, which we used to identify profiles of students with latent profile analysis, including high (45% of sample), moderate (40%), and low (14%) levels of psychological impact. Bivariate associations showed students who were women, were non-Hispanic Asian, in fair/poor health, of below-average relative family income, or who knew someone infected with COVID-19 experienced higher levels of psychological impact. Students who were non-Hispanic White, above-average social class, spent at least two hours outside, or less than eight hours on electronic screens were likely to experience lower levels of psychological impact. 
Multivariate modeling (mixed-effects logistic regression) showed that being a woman, having fair/poor general health status, being 18 to 24 years old, spending 8 or more hours on screens daily, and knowing someone infected predicted higher levels of psychological impact when risk factors were considered simultaneously. Conclusion Inadequate efforts to recognize and address college students’ mental health challenges, especially during a pandemic, could have long-term consequences on their health and education.
444 citations
•
24 Nov 2021
TL;DR: In this article, a multi-paradigm approach to analyzing dilemmas categorized as paradoxes is presented: individual rights/community standards, traditional curriculum/hidden curriculum, personal codes/professional codes, the American melting pot/the Chinese hot pot, and equality/equity.
Abstract: Part 1, Practice and Paradigms in the Study of Ethics: Multiple Ethical Paradigms and the Preparation of Educational Leaders in a Diverse and Complex Era; Viewing Ethical Dilemmas Through Multiple Paradigms; Figure 1, Diagrammatic Representation of the Ethic of the Profession. Part 2, A Multi-Paradigm Approach to Analyzing Dilemmas Categorized as Paradoxes: Individual Rights/Community Standards; Traditional Curriculum/Hidden Curriculum; Personal Codes/Professional Codes; The American Melting Pot/The Chinese Hot Pot; Equality/Equity. Part 3, Teaching as Scholarly Work: To the Instructor — Ethics, Ourselves and Our Pedagogy.
387 citations
••
TL;DR: In this paper, the authors survey an emerging technique called algorithm unrolling, or unfolding, which offers promise in improving interpretability and reducing training-data requirements by providing a concrete and systematic connection between iterative algorithms that are widely used in signal processing and deep neural networks.
Abstract: Deep neural networks provide unprecedented performance gains in many real-world problems in signal and image processing. Despite these gains, the future development and practical deployment of deep networks are hindered by their black-box nature, i.e., a lack of interpretability and the need for very large training sets. An emerging technique called algorithm unrolling, or unfolding, offers promise in eliminating these issues by providing a concrete and systematic connection between iterative algorithms that are widely used in signal processing and deep neural networks. Unrolling methods were first proposed to develop fast neural network approximations for sparse coding. More recently, this direction has attracted enormous attention, and it is rapidly growing in both theoretic investigations and practical applications. The increasing popularity of unrolled deep networks is due, in part, to their potential in developing efficient, high-performance (yet interpretable) network architectures from reasonably sized training sets.
••
TL;DR: In this article, the authors reported the observation of gravitational waves from two compact binary coalescences in LIGO's and Virgo's third observing run with properties consistent with neutron star-black hole (NSBH) binaries.
Abstract: We report the observation of gravitational waves from two compact binary coalescences in LIGO’s and Virgo’s third observing run with properties consistent with neutron star–black hole (NSBH) binaries. The two events are named GW200105_162426 and GW200115_042309, abbreviated as GW200105 and GW200115; the first was observed by LIGO Livingston and Virgo and the second by all three LIGO–Virgo detectors. The source of GW200105 has component masses of 8.9 (+1.2/−1.5) and 1.9 (+0.3/−0.2) M⊙, whereas the source of GW200115 has component masses of 5.7 (+1.8/−2.1) and 1.5 (+0.7/−0.3) M⊙ (all measurements quoted at the 90% credible level). The probability that the secondary’s mass is below the maximal mass of a neutron star is 89%–96% and 87%–98%, respectively, for GW200105 and GW200115, with the ranges arising from different astrophysical assumptions. The source luminosity distances are 280 (±110) Mpc and 300 (+150/−100) Mpc, respectively. The magnitude of the primary spin of GW200105 is less than 0.23 at the 90% credible level, and its orientation is unconstrained. For GW200115, the primary spin has a negative spin projection onto the orbital angular momentum at 88% probability. We are unable to constrain the spin or tidal deformation of the secondary component for either event. We infer an NSBH merger rate density of 45 (+75/−33) Gpc⁻³ yr⁻¹ when assuming that GW200105 and GW200115 are representative of the NSBH population, or 130 (+112/−69) Gpc⁻³ yr⁻¹ under the assumption of a broader distribution of component masses.
••
TL;DR: The data recorded by the Advanced LIGO and Advanced Virgo detectors during their first and second observing runs are described, including the gravitational-wave strain arrays, released as time series sampled at 16384 Hz.
••
PSL Research University1, University of Michigan2, University of Akron3, University of Strasbourg4, Boston University5, National Autonomous University of Mexico6, University of California, Santa Barbara7, Pennsylvania State University8, Imperial College London9, University of Naples Federico II10, Shanghai Jiao Tong University11, Leidos12, Ton Duc Thang University13, University of Bordeaux14, University of Manchester15, Western Washington University16, Polish Academy of Sciences17, North Carolina State University18, Ben-Gurion University of the Negev19, Forschungszentrum Jülich20, University of Texas Medical Branch21, University of Minnesota22, Fudan University23, University of Paris24
TL;DR: In this paper, the authors review what computer, in vitro, in vivo, and pharmacological experiments tell us about the accumulation and deposition of the oligomers of the amyloid-β (Aβ) and tau, α-synuclein, IAPP, and superoxide dismutase 1 proteins, which have been the mainstream concept underlying Alzheimer's disease, Parkinson's disease (PD), type II diabetes (T2D), and amyotrophic lateral sclerosis (ALS) research, respectively, for many years.
Abstract: Protein misfolding and aggregation is observed in many amyloidogenic diseases affecting either the central nervous system or a variety of peripheral tissues. Structural and dynamic characterization of all species along the pathways from monomers to fibrils is challenging by experimental and computational means because they involve intrinsically disordered proteins in most diseases. Yet understanding how amyloid species become toxic is the challenge in developing a treatment for these diseases. Here we review what computer, in vitro, in vivo, and pharmacological experiments tell us about the accumulation and deposition of the oligomers of the amyloid-β (Aβ) and tau, α-synuclein, IAPP, and superoxide dismutase 1 proteins, which have been the mainstream concept underlying Alzheimer's disease (AD), Parkinson's disease (PD), type II diabetes (T2D), and amyotrophic lateral sclerosis (ALS) research, respectively, for many years.
•
TL;DR: It is proved, information-theoretically, that the mixture of local and global models can reduce the generalization error, and a communication-reduced bilevel optimization method is proposed that reduces the number of communication rounds to $O(\sqrt{T})$ and achieves a convergence rate of $O(1/T)$ with some residual error.
Abstract: Investigation of the degree of personalization in federated learning algorithms has shown that only maximizing the performance of the global model will confine the capacity of the local models to personalize. In this paper, we advocate an adaptive personalized federated learning (APFL) algorithm, where each client will train their local models while contributing to the global model. We derive the generalization bound of mixture of local and global models, and find the optimal mixing parameter. We also propose a communication-efficient optimization method to collaboratively learn the personalized models and analyze its convergence in both smooth strongly convex and nonconvex settings. The extensive experiments demonstrate the effectiveness of our personalization schema, as well as the correctness of established generalization theories.
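The local/global mixture at the heart of the method can be sketched in a few lines. This is a simplified illustration of the idea only (fixed mixing weight, one gradient step per round, synthetic linear-regression clients — all assumptions for the sketch, not the authors' algorithm verbatim):

```python
import numpy as np

def mse_grad(w, X, y):
    # Gradient of mean squared error for a linear model
    return 2.0 * X.T @ (X @ w - y) / len(y)

def apfl_round(w_global, v_local, data, alpha=0.5, lr=0.05):
    """One communication round of an APFL-style scheme (simplified).

    Each client steps on the global model, then updates its local model
    through the personalized mixture v_bar_i = alpha*v_i + (1-alpha)*w;
    the server averages the clients' global-model updates."""
    client_models = []
    for i, (X, y) in enumerate(data):
        w = w_global - lr * mse_grad(w_global, X, y)       # global-model step
        v_bar = alpha * v_local[i] + (1 - alpha) * w       # personalized model
        v_local[i] = v_local[i] - lr * alpha * mse_grad(v_bar, X, y)
        client_models.append(w)
    return np.mean(client_models, axis=0)                  # FedAvg-style average

rng = np.random.default_rng(0)
thetas = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]      # heterogeneous clients
data = []
for th in thetas:
    X = rng.standard_normal((50, 2))
    data.append((X, X @ th))

w_global = np.zeros(2)
v_local = [np.zeros(2) for _ in thetas]
for _ in range(300):
    w_global = apfl_round(w_global, v_local, data)

personalized = [0.5 * v + 0.5 * w_global for v in v_local]
```

With heterogeneous clients, the averaged global model settles between the two ground truths, while each personalized mixture can still fit its own client; in the paper the mixing weight itself is adapted per client rather than fixed at 0.5 as here.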
••
TL;DR: The authors showed that infection-blocking immunity wanes rapidly but that disease-reducing immunity is long-lived, and that once the endemic phase is reached and primary exposure is in childhood, SARS-CoV-2 may be no more virulent than the common cold.
Abstract: We are currently faced with the question of how the severity of infection with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) may change in the years ahead. Our analysis of immunological and epidemiological data on endemic human coronaviruses (HCoVs) shows that infection-blocking immunity wanes rapidly but that disease-reducing immunity is long-lived. Our model, incorporating these components of immunity, recapitulates both the current severity of SARS-CoV-2 infection and the benign nature of HCoVs, suggesting that once the endemic phase is reached and primary exposure is in childhood, SARS-CoV-2 may be no more virulent than the common cold. We predict a different outcome for an emergent coronavirus that causes severe disease in children. These results reinforce the importance of behavioral containment during pandemic vaccine rollout, while prompting us to evaluate scenarios for continuing vaccination in the endemic phase.
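The immunity structure the authors describe can be caricatured in a few lines: transmission-blocking immunity that decays with time since the last infection, and disease-reducing immunity that, once acquired, keeps every subsequent infection mild. This is a toy sketch with made-up parameter values, not the paper's fitted model:

```python
import math
import random

def lifetime_infections(years=40.0, exposure_rate=0.5,
                        wane_rate=1.0, mild_severity=0.05, seed=0):
    """Simulate one person's exposures and infections.

    Infection-blocking immunity decays as exp(-wane_rate * time since
    last infection); disease-reducing immunity is permanent, so every
    infection after the first has severity `mild_severity`."""
    rng = random.Random(seed)
    t, t_last = 0.0, None
    severities = []
    while True:
        t += rng.expovariate(exposure_rate)        # waiting time to next exposure
        if t > years:
            break
        blocked = (t_last is not None and
                   rng.random() < math.exp(-wane_rate * (t - t_last)))
        if not blocked:
            severities.append(1.0 if not severities else mild_severity)
            t_last = t
    return severities

histories = [lifetime_infections(seed=s) for s in range(300)]
```

In this regime reinfections are common, since blocking immunity wanes within a couple of years, but only the primary infection is severe — the qualitative endemic scenario in which the virus circulates while causing mostly mild, cold-like illness.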
••
University of Sydney1, University of Utah2, Vaccine and Infectious Disease Organization3, University of Glasgow4, University of California, Berkeley5, University of California, San Diego6, University of California, Davis7, Imperial College London8, Pennsylvania State University9, University of Melbourne10, Wellcome Trust11, University of Otago12, Xi'an Jiaotong-Liverpool University13, Texas A&M University14, King's College London15, Medical University of Vienna16, University of Pennsylvania17, University of Arizona18, Scripps Research Institute19, Tulane University20, University of Edinburgh21
TL;DR: In this article, the authors review the current scientific evidence that may help clarify the origin of SARS-CoV-2, with a focus on how the virus emerged in the human population.
••
TL;DR: In this paper, a continuous-time model was proposed to estimate the effect of previous exporting activity on the costs of meeting new clients, and to characterize the cumulative effects of learning on firms' search intensities.
Abstract: Customs record data reveal a number of patterns in relationships Colombian firms have with their U.S. buyers. We interpret these patterns in terms of a continuous-time model in which heterogeneous sellers search for buyers in a market. Success in selling to a buyer reveals information to the seller about the appeal of her product in the market, affecting her incentive to search for more buyers. Fit using the method of simulated moments, the model replicates key patterns in the customs records and allows us to quantify several types of trade costs, including the search costs of identifying potential clients and the costs of maintaining business relationships with existing clients. It also allows us to estimate the effect of previous exporting activity on the costs of meeting new clients, and to characterize the cumulative effects of learning on firms' search intensities. Finally, we use our fitted model to explore the effects of these trade costs and learning effects on aggregate export dynamics.
••
Queen Mary University of London1, Harvard University2, University of Auckland3, St. Michael's Hospital4, Winthrop-University Hospital5, Karolinska Institutet6, University of Zurich7, Pontifical Catholic University of Chile8, University of Copenhagen9, Boston Children's Hospital10, University of Parma11, University of London12, University of Colorado Denver13, Ben-Gurion University of the Negev14, McMaster University15, Case Western Reserve University16, Katholieke Universiteit Leuven17, University of Tampere18, Columbia University Medical Center19, Medical University of Łódź20, Pennsylvania State University21, University of Birmingham22, University of Otago23, Jikei University School of Medicine24, QIMR Berghofer Medical Research Institute25, Dartmouth College26, University of Helsinki27, University College of Medical Sciences28, University of Melbourne29, Menzies Research Institute30, University of Delhi31
TL;DR: A 2017 meta-analysis of data from 25 randomised controlled trials (RCTs) of vitamin D supplementation for the prevention of acute respiratory infections (ARIs) revealed a protective effect of this intervention as discussed by the authors.
••
University of Maryland, Baltimore County1, University of Arizona2, Utrecht University3, Pennsylvania State University4, University of Queensland5, Max Planck Society6, National University of Cordoba7, University College London8, Northwest University (China)9, University of Maine10, University of Hong Kong11, United Nations Environment Programme12, University of Amsterdam13, Smithsonian Institution14, World Wide Fund for Nature15, Duke University16, Aarhus University17, Wildlife Conservation Society18
TL;DR: In this paper, the authors use the most up-to-date, spatially explicit global reconstruction of historical human populations and land use to show that this paradigm is likely wrong.
Abstract: Archaeological and paleoecological evidence shows that by 10,000 BCE, all human societies employed varying degrees of ecologically transformative land use practices, including burning, hunting, species propagation, domestication, cultivation, and others that have left long-term legacies across the terrestrial biosphere. Yet, a lingering paradigm among natural scientists, conservationists, and policymakers is that human transformation of terrestrial nature is mostly recent and inherently destructive. Here, we use the most up-to-date, spatially explicit global reconstruction of historical human populations and land use to show that this paradigm is likely wrong. Even 12,000 y ago, nearly three quarters of Earth’s land was inhabited and therefore shaped by human societies, including more than 95% of temperate and 90% of tropical woodlands. Lands now characterized as “natural,” “intact,” and “wild” generally exhibit long histories of use, as do protected areas and Indigenous lands, and current global patterns of vertebrate species richness and key biodiversity areas are more strongly associated with past patterns of land use than with present ones in regional landscapes now characterized as natural. The current biodiversity crisis can seldom be explained by the loss of uninhabited wildlands, resulting instead from the appropriation, colonization, and intensifying use of the biodiverse cultural landscapes long shaped and sustained by prior societies. Recognizing this deep cultural connection with biodiversity will therefore be essential to resolve the crisis.
••
TL;DR: In this article, the authors focus on the available mechanistic models of additive manufacturing (AM) that have been adequately validated, and evaluate the functionality of AM models in understanding the printability of commonly used AM alloys and the fabrication of functionally graded alloys.
••
University of Leicester1, Pennsylvania State University2, Delft University of Technology3, University of Cassino4, University of Colorado Boulder5, Tallinn University of Technology6, University of Hong Kong7, National University of Singapore8, Queensland University of Technology9, Virginia Tech10, Technical University of Denmark11, University of California, Berkeley12, Aalborg University13, McGill University14, NHS Lanarkshire15, Edinburgh Napier University16
TL;DR: In this article, the authors present a review of the most commonly held dogmas on airborne transmission of severe acute respiratory syndrome coronavirus-2 (SARS-CoV-2) in the COVID-19 pandemic.
••
TL;DR: Management scholars study phenomena marked by complex interdependencies, in which multiple explanatory factors combine to bring about an outcome of interest; the authors advance configurational theorizing as a way to explain such causal complexity.
Abstract: Management scholars study phenomena marked by complex interdependencies where multiple explanatory factors combine to bring about an outcome of interest. Yet, theorizing about causal complexity can prove challenging for the correlational theorizing that is predominant in the field of management, given its “net effects thinking” that emphasizes the unique contribution of individual explanatory factors. In contrast, configurational theories and thinking are well-suited to explaining causally complex phenomena. In this article, we seek to advance configurational theorizing by providing a model of the configurational theorizing process which consists of three iterative stages—scoping, linking and naming. In each stage, we develop and offer several heuristics aimed at stimulating configurational theorizing. That is, these theorizing heuristics are intended to help scholars discover configurations of explanatory factors, probe the connections among these factors, and articulate the orchestrating themes that underpin their coherence. We conclude with a discussion of how configurational theorizing advances theory development in the field of management and organizations, and beyond.
••
TL;DR: The authors conduct an explication of “fake news,” a concept that has ballooned to include more than simply false information, with partisans weaponizing it to cast aspersions on the veracity of claims made by those who are politically opposed to them.
Abstract: As the scourge of “fake news” continues to plague our information environment, attention has turned toward devising automated solutions for detecting problematic online content. But, in order to bu...
••
Ohio State University1, Indiana University2, University of Maryland, Baltimore3, University of Michigan4, University of Iowa5, Indiana University – Purdue University Indianapolis6, Michigan State University7, University of Nebraska–Lincoln8, University of Wisconsin-Madison9, Purdue University10, University of Minnesota11, Northwestern University12, University of Maryland, College Park13, Pennsylvania State University14, Rutgers University15, University of Nebraska Medical Center16
TL;DR: The Big Ten Conference requires comprehensive cardiac testing including cardiac magnetic resonance (CMR) imaging for all athletes with COVID-19, allowing comparison of screening approaches for safe return to play as mentioned in this paper.
Abstract: Importance Myocarditis is a leading cause of sudden death in competitive athletes. Myocardial inflammation is known to occur with SARS-CoV-2. Different screening approaches for detection of myocarditis have been reported. The Big Ten Conference requires comprehensive cardiac testing including cardiac magnetic resonance (CMR) imaging for all athletes with COVID-19, allowing comparison of screening approaches. Objective To determine the prevalence of myocarditis in athletes with COVID-19 and compare screening strategies for safe return to play. Design, Setting, and Participants Big Ten COVID-19 Cardiac Registry principal investigators were surveyed for aggregate observational data from March 1, 2020, through December 15, 2020, on athletes with COVID-19. For athletes with myocarditis, presence of cardiac symptoms and details of cardiac testing were recorded. Myocarditis was categorized as clinical or subclinical based on the presence of cardiac symptoms and CMR findings. Subclinical myocarditis was classified as probable or possible myocarditis based on other testing abnormalities. Myocarditis prevalence across universities was determined. The utility of different screening strategies was evaluated. Exposures SARS-CoV-2 by polymerase chain reaction testing. Main Outcome and Measure Myocarditis via cardiovascular diagnostic testing. Results Representing 13 universities, cardiovascular testing was performed in 1597 athletes (964 men [60.4%]). Thirty-seven (including 27 men) were diagnosed with COVID-19 myocarditis (overall 2.3%; range per program, 0%-7.6%); 9 had clinical myocarditis and 28 had subclinical myocarditis. If cardiac testing was based on cardiac symptoms alone, only 5 athletes would have been detected (detected prevalence, 0.31%). Cardiac magnetic resonance imaging for all athletes yielded a 7.4-fold increase in detection of myocarditis (clinical and subclinical). Follow-up CMR imaging performed in 27 (73.0%) demonstrated resolution of T2 elevation in all (100%) and late gadolinium enhancement in 11 (40.7%). Conclusions and Relevance In this cohort study of 1597 US competitive athletes with CMR screening after COVID-19 infection, 37 athletes (2.3%) were diagnosed with clinical and subclinical myocarditis. Variability was observed in prevalence across universities, and testing protocols were closely tied to the detection of myocarditis. Variable ascertainment and unknown implications of CMR findings underscore the need for standardized timing and interpretation of cardiac testing. These unique CMR imaging data provide a more complete understanding of the prevalence of clinical and subclinical myocarditis in college athletes after COVID-19 infection. The role of CMR in routine screening for athletes' safe return to play should be explored further.
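The prevalence figures in the registry abstract follow directly from the reported counts, and can be checked with a few lines of arithmetic (variable names are ours; all counts are taken from the abstract):

```python
# Counts reported in the Big Ten COVID-19 Cardiac Registry abstract
athletes_tested = 1597
men = 964
myocarditis_total = 37   # clinical + subclinical
symptom_detected = 5     # athletes flagged by cardiac symptoms alone
followup_cmr = 27
lge_persisting = 11      # late gadolinium enhancement at follow-up

print(f"men: {men / athletes_tested:.1%}")                                  # 60.4%
print(f"overall prevalence: {myocarditis_total / athletes_tested:.1%}")     # 2.3%
print(f"symptom-only detection: {symptom_detected / athletes_tested:.2%}")  # 0.31%
print(f"CMR detection gain: {myocarditis_total / symptom_detected:.1f}x")   # 7.4x
print(f"persistent LGE at follow-up: {lge_persisting / followup_cmr:.1%}")  # 40.7%
```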
••
TL;DR: In this paper, the removal of heavy metals and dyes by clay-based adsorbents, from natural clays to 1D clay nanotubes and 2D clay nanosheets, has been summarized.
••
TL;DR: In this paper, the authors examined advances in metal printing focusing on metallurgy, as well as the use of mechanistic models and machine learning and the role they play in the expansion of the additive manufacturing of metals.
Abstract: Additive manufacturing enables the printing of metallic parts, such as customized implants for patients, durable single-crystal parts for use in harsh environments, and the printing of parts with site-specific chemical compositions and properties from 3D designs. However, the selection of alloys, printing processes and process variables results in an exceptional diversity of microstructures, properties and defects that affect the serviceability of the printed parts. Control of these attributes using the rich knowledge base of metallurgy remains a challenge because of the complexity of the printing process. Transforming 3D designs created in the virtual world into high-quality products in the physical world needs a new methodology not commonly used in traditional manufacturing. Rapidly developing powerful digital tools such as mechanistic models and machine learning, when combined with the knowledge base of metallurgy, have the potential to shape the future of metal printing. Starting from product design to process planning and process monitoring and control, these tools can help improve microstructure and properties, mitigate defects, automate part inspection and accelerate part qualification. Here, we examine advances in metal printing focusing on metallurgy, as well as the use of mechanistic models and machine learning and the role they play in the expansion of the additive manufacturing of metals. Several key industries routinely use metal printing to make complex parts that are difficult to produce by conventional manufacturing. Here, we show that a synergistic combination of metallurgy, mechanistic models and machine learning is driving the continued growth of metal printing.
••
TL;DR: 1,3-dimethyl-3-imidazolium hexafluorophosphate (DMIMPF6) ionic liquid is adopted to passivate the perovskite surface and to reduce the energy barrier between the perovskite and the hole transport layer, providing firm support for the understanding of the passivation effect.
Abstract: Surface defects have been a key constraint for perovskite photovoltaics. Herein, 1,3-dimethyl-3-imidazolium hexafluorophosphate (DMIMPF6 ) ionic liquid (IL) is adopted to passivate the surface of a formamidinium-cesium lead iodide perovskite (Cs0.08 FA0.92 PbI3 ) and also reduce the energy barrier between the perovskite and hole transport layer. Theoretical simulations and experimental results demonstrate that Pb-cluster and Pb-I antisite defects can be effectively passivated by [DMIM]+ bonding with the Pb2+ ion on the perovskite surface, leading to significantly suppressed non-radiative recombination. As a result, the solar cell efficiency was increased to 23.25 % from 21.09 %. Meanwhile, the DMIMPF6 -treated perovskite device demonstrated long-term stability because the hydrophobic DMIMPF6 layer blocked moisture permeation.
••
Pennsylvania State University1, Stanford University2, Duke University3, Vienna University of Technology4, Ioffe Institute5, Intel6, Purdue University7, University of Illinois at Urbana–Champaign8, Katholieke Universiteit Leuven9, University of Hong Kong10, Indian Institute of Science11, King Abdullah University of Science and Technology12, Indian Institute of Technology Delhi13
TL;DR: In this paper, the development of 2D field-effect transistors for use in future VLSI technologies is reviewed, and the key performance indicators for aggressively scaled 2D transistors are discussed.
Abstract: Field-effect transistors based on two-dimensional (2D) materials have the potential to be used in very large-scale integration (VLSI) technology, but whether they can be used at the front end of line or at the back end of line through monolithic or heterogeneous integration remains to be determined. To achieve this, multiple challenges must be overcome, including reducing the contact resistance, developing stable and controllable doping schemes, advancing mobility engineering and improving high-κ dielectric integration. The large-area growth of uniform 2D layers is also required to ensure low defect density, low device-to-device variation and clean interfaces. Here we review the development of 2D field-effect transistors for use in future VLSI technologies. We consider the key performance indicators for aggressively scaled 2D transistors and discuss how these should be extracted and reported. We also highlight potential applications of 2D transistors in conventional micro/nanoelectronics, neuromorphic computing, advanced sensing, data storage and future interconnect technologies. This Review examines the development of field-effect transistors based on two-dimensional materials and considers the challenges that need to be addressed for the devices to be incorporated into very large-scale integration (VLSI) technology.
••
TL;DR: In this article, the authors benchmark device-to-device variation in field-effect transistors (FETs) based on monolayer MoS2 and WS2 films grown using a metal-organic chemical vapor deposition process.
Abstract: Here we benchmark device-to-device variation in field-effect transistors (FETs) based on monolayer MoS2 and WS2 films grown using a metal-organic chemical vapor deposition process. Our study involves 230 MoS2 FETs and 160 WS2 FETs with channel lengths ranging from 5 μm down to 100 nm. We use statistical measures to evaluate key FET performance indicators for benchmarking these two-dimensional (2D) transition metal dichalcogenide (TMD) monolayers against existing literature as well as ultra-thin body Si FETs. Our results show consistent performance of 2D FETs across 1 × 1 cm2 chips owing to the high-quality and uniform growth of these TMDs followed by clean transfer onto device substrates. We are able to demonstrate a record-high carrier mobility of 33 cm2 V−1 s−1 in WS2 FETs, which is a 1.5X improvement compared to the best reported in the literature. Our experimental demonstrations confirm the technological viability of 2D FETs in future integrated circuits. Here, the authors perform a benchmark study of field-effect transistors (FETs) based on 2D transition metal dichalcogenides, i.e., 230 MoS2 and 160 WS2 FETs, and track device-to-device variations to gauge their technological viability in future integrated circuits.
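The kind of statistical device-to-device benchmarking described above can be sketched in a few lines (the mobility values below are hypothetical stand-ins, not the paper's measured data; only the summary statistics shown are the point):

```python
import statistics

# Hypothetical per-device field-effect mobilities (cm^2 V^-1 s^-1) for two
# monolayer TMD film types, standing in for measured FET populations.
mobilities = {
    "MoS2": [18.2, 21.5, 19.8, 17.4, 22.1, 20.3],
    "WS2":  [29.7, 33.0, 31.2, 28.5, 32.4, 30.1],
}

for film, values in mobilities.items():
    median = statistics.median(values)
    spread = statistics.stdev(values)
    # The coefficient of variation summarizes device-to-device variation
    cv = spread / statistics.mean(values)
    print(f"{film}: median {median:.2f}, stdev {spread:.2f}, CV {cv:.1%}")
```

A real benchmark over hundreds of FETs would report distributions of several indicators (mobility, threshold voltage, on/off ratio) per channel length, but the per-population summary logic is the same.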