
Showing papers by "Pennsylvania State University" published in 2021


Journal ArticleDOI
09 Apr 2021-Science
TL;DR: Using a variety of statistical and dynamic modeling approaches, the authors estimate that this variant has a 43 to 90% (range of 95% credible intervals, 38 to 130%) higher reproduction number than preexisting variants, and a fitted two-strain dynamic transmission model shows that VOC 202012/01 will lead to large resurgences of COVID-19 cases.
Abstract: A severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) variant, VOC 202012/01 (lineage B.1.1.7), emerged in southeast England in September 2020 and is rapidly spreading toward fixation. Using a variety of statistical and dynamic modeling approaches, we estimate that this variant has a 43 to 90% (range of 95% credible intervals, 38 to 130%) higher reproduction number than preexisting variants. A fitted two-strain dynamic transmission model shows that VOC 202012/01 will lead to large resurgences of COVID-19 cases. Without stringent control measures, including limited closure of educational institutions and a greatly accelerated vaccine rollout, COVID-19 hospitalizations and deaths across England in the first 6 months of 2021 were projected to exceed those in 2020. VOC 202012/01 has spread globally and exhibits a similar transmission increase (59 to 74%) in Denmark, Switzerland, and the United States.
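The paper's fitted two-strain transmission model is far more detailed (interventions, vaccination, regional fits), but the core competition dynamic it rests on can be sketched as a minimal two-strain SIR model in which the variant's higher transmission rate drives its share of infections upward. All parameter values below are hypothetical illustrations, not the paper's estimates:

```python
# Minimal two-strain SIR with complete cross-immunity (hypothetical
# parameters; a sketch, not the paper's fitted model).

def two_strain_sir(beta1, beta2, gamma, days, dt=0.1,
                   i1_0=1e-3, i2_0=1e-5):
    """Forward-Euler integration; R0_k = beta_k / gamma.
    Returns one (S, I1, I2) tuple of population fractions per day."""
    s, i1, i2 = 1.0 - i1_0 - i2_0, i1_0, i2_0
    out = []
    for step in range(int(days / dt)):
        if step % int(1 / dt) == 0:
            out.append((s, i1, i2))
        new1 = beta1 * s * i1 * dt      # new infections, resident strain
        new2 = beta2 * s * i2 * dt      # new infections, variant
        s -= new1 + new2
        i1 += new1 - gamma * i1 * dt
        i2 += new2 - gamma * i2 * dt
    return out

# Variant with 50% higher transmissibility (illustrative numbers only):
traj = two_strain_sir(beta1=0.2, beta2=0.3, gamma=0.1, days=300)
variant_share = [i2 / (i1 + i2) for _, i1, i2 in traj]
# While susceptibles remain, the variant's share of infections only grows.
```

The qualitative behavior, a more transmissible variant sweeping toward fixation while susceptibles are depleted faster, is what the paper's statistical estimates of the reproduction-number advantage quantify.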

1,935 citations


Journal ArticleDOI
TL;DR: Baricitinib plus remdesivir was superior to remdesivir alone in reducing recovery time and accelerating improvement in clinical status among patients with Covid-19, notably among those receiving high-flow oxygen or noninvasive ventilation.
Abstract: Background Severe coronavirus disease 2019 (Covid-19) is associated with dysregulated inflammation. The effects of combination treatment with baricitinib, a Janus kinase inhibitor, plus remdesivir are not known. Methods We conducted a double-blind, randomized, placebo-controlled trial evaluating baricitinib plus remdesivir in hospitalized adults with Covid-19. All the patients received remdesivir (≤10 days) and either baricitinib (≤14 days) or placebo (control). The primary outcome was the time to recovery. The key secondary outcome was clinical status at day 15. Results A total of 1033 patients underwent randomization (with 515 assigned to combination treatment and 518 to control). Patients receiving baricitinib had a median time to recovery of 7 days (95% confidence interval [CI], 6 to 8), as compared with 8 days (95% CI, 7 to 9) with control (rate ratio for recovery, 1.16; 95% CI, 1.01 to 1.32; P = 0.03), and a 30% higher odds of improvement in clinical status at day 15 (odds ratio, 1.3; 95% CI, 1.0 to 1.6). Patients receiving high-flow oxygen or noninvasive ventilation at enrollment had a time to recovery of 10 days with combination treatment and 18 days with control (rate ratio for recovery, 1.51; 95% CI, 1.10 to 2.08). The 28-day mortality was 5.1% in the combination group and 7.8% in the control group (hazard ratio for death, 0.65; 95% CI, 0.39 to 1.09). Serious adverse events were less frequent in the combination group than in the control group (16.0% vs. 21.0%; difference, -5.0 percentage points; 95% CI, -9.8 to -0.3; P = 0.03), as were new infections (5.9% vs. 11.2%; difference, -5.3 percentage points; 95% CI, -8.7 to -1.9; P = 0.003). Conclusions Baricitinib plus remdesivir was superior to remdesivir alone in reducing recovery time and accelerating improvement in clinical status among patients with Covid-19, notably among those receiving high-flow oxygen or noninvasive ventilation. 
The combination was associated with fewer serious adverse events. (Funded by the National Institute of Allergy and Infectious Diseases; ClinicalTrials.gov number, NCT04401579.).

1,301 citations


Journal ArticleDOI
TL;DR: In this article, the authors present a set of guidelines for investigators to select and interpret methods to examine autophagy and related processes, and for reviewers to provide realistic and reasonable critiques of reports that are focused on these processes.
Abstract: In 2008, we published the first set of guidelines for standardizing research in autophagy. Since then, this topic has received increasing attention, and many scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Thus, it is important to formulate on a regular basis updated guidelines for monitoring autophagy in different organisms. Despite numerous reviews, there continues to be confusion regarding acceptable methods to evaluate autophagy, especially in multicellular eukaryotes. Here, we present a set of guidelines for investigators to select and interpret methods to examine autophagy and related processes, and for reviewers to provide realistic and reasonable critiques of reports that are focused on these processes. These guidelines are not meant to be a dogmatic set of rules, because the appropriateness of any assay largely depends on the question being asked and the system being used. Moreover, no individual assay is perfect for every situation, calling for the use of multiple techniques to properly monitor autophagy in each experimental setting. Finally, several core components of the autophagy machinery have been implicated in distinct autophagic processes (canonical and noncanonical autophagy), implying that genetic approaches to block autophagy should rely on targeting two or more autophagy-related genes that ideally participate in distinct steps of the pathway. Along similar lines, because multiple proteins involved in autophagy also regulate other cellular pathways including apoptosis, not all of them can be used as a specific marker for bona fide autophagic responses. Here, we critically discuss current methods of assessing autophagy and the information they can, or cannot, provide. Our ultimate goal is to encourage intellectual and technical innovation in the field.

1,129 citations


Journal ArticleDOI
Daniel Taliun1, Daniel N. Harris2, Michael D. Kessler2, Jedidiah Carlson3  +202 moreInstitutions (61)
10 Feb 2021-Nature
TL;DR: The Trans-Omics for Precision Medicine (TOPMed) project as discussed by the authors aims to elucidate the genetic architecture and biology of heart, lung, blood and sleep disorders, with the ultimate goal of improving diagnosis, treatment and prevention of these diseases.
Abstract: The Trans-Omics for Precision Medicine (TOPMed) programme seeks to elucidate the genetic architecture and biology of heart, lung, blood and sleep disorders, with the ultimate goal of improving diagnosis, treatment and prevention of these diseases. The initial phases of the programme focused on whole-genome sequencing of individuals with rich phenotypic data and diverse backgrounds. Here we describe the TOPMed goals and design as well as the available resources and early insights obtained from the sequence data. The resources include a variant browser, a genotype imputation server, and genomic and phenotypic data that are available through dbGaP (Database of Genotypes and Phenotypes). In the first 53,831 TOPMed samples, we detected more than 400 million single-nucleotide and insertion or deletion variants after alignment with the reference genome. Additional previously undescribed variants were detected through assembly of unmapped reads and customized analysis in highly variable loci. Among the more than 400 million detected variants, 97% have frequencies of less than 1% and 46% are singletons that are present in only one individual (53% among unrelated individuals). These rare variants provide insights into mutational processes and recent human evolutionary history. The extensive catalogue of genetic variation in TOPMed studies provides unique opportunities for exploring the contributions of rare and noncoding sequence variants to phenotypic variation. Furthermore, combining TOPMed haplotypes with modern imputation methods improves the power and reach of genome-wide association studies to include variants down to a frequency of approximately 0.01%. The goals, resources and design of the NHLBI Trans-Omics for Precision Medicine (TOPMed) programme are described, and analyses of rare variants detected in the first 53,831 samples provide insights into mutational processes and recent human evolutionary history.

801 citations


Journal ArticleDOI
Arang Rhie1, Shane A. McCarthy2, Shane A. McCarthy3, Olivier Fedrigo4, Joana Damas5, Giulio Formenti4, Sergey Koren1, Marcela Uliano-Silva6, William Chow2, Arkarachai Fungtammasan, J. H. Kim7, Chul Hee Lee7, Byung June Ko7, Mark Chaisson8, Gregory Gedman4, Lindsey J. Cantin4, Françoise Thibaud-Nissen1, Leanne Haggerty9, Iliana Bista2, Iliana Bista3, Michelle Smith2, Bettina Haase4, Jacquelyn Mountcastle4, Sylke Winkler10, Sylke Winkler11, Sadye Paez4, Jason T. Howard, Sonja C. Vernes12, Sonja C. Vernes10, Sonja C. Vernes13, Tanya M. Lama14, Frank Grützner15, Wesley C. Warren16, Christopher N. Balakrishnan17, Dave W Burt18, Jimin George19, Matthew T. Biegler4, David Iorns, Andrew Digby, Daryl Eason, Bruce C. Robertson20, Taylor Edwards21, Mark Wilkinson22, George F. Turner23, Axel Meyer24, Andreas F. Kautt24, Andreas F. Kautt25, Paolo Franchini24, H. William Detrich26, Hannes Svardal27, Hannes Svardal28, Maximilian Wagner29, Gavin J. P. Naylor30, Martin Pippel10, Milan Malinsky31, Milan Malinsky2, Mark Mooney, Maria Simbirsky, Brett T. Hannigan, Trevor Pesout32, Marlys L. Houck33, Ann C Misuraca33, Sarah B. Kingan34, Richard Hall34, Zev N. Kronenberg34, Ivan Sović34, Christopher Dunn34, Zemin Ning2, Alex Hastie, Joyce V. Lee, Siddarth Selvaraj, Richard E. Green32, Nicholas H. Putnam, Ivo Gut35, Jay Ghurye36, Erik Garrison32, Ying Sims2, Joanna Collins2, Sarah Pelan2, James Torrance2, Alan Tracey2, Jonathan Wood2, Robel E. Dagnew8, Dengfeng Guan3, Dengfeng Guan37, Sarah E. London38, David F. Clayton19, Claudio V. Mello39, Samantha R. Friedrich39, Peter V. Lovell39, Ekaterina Osipova10, Farooq O. Al-Ajli40, Farooq O. Al-Ajli41, Simona Secomandi42, Heebal Kim7, Constantina Theofanopoulou4, Michael Hiller43, Yang Zhou, Robert S. Harris44, Kateryna D. Makova44, Paul Medvedev44, Jinna Hoffman1, Patrick Masterson1, Karen Clark1, Fergal J. Martin9, Kevin L. Howe9, Paul Flicek9, Brian P. 
Walenz1, Woori Kwak, Hiram Clawson32, Mark Diekhans32, Luis R Nassar32, Benedict Paten32, Robert H. S. Kraus24, Robert H. S. Kraus10, Andrew J. Crawford45, M. Thomas P. Gilbert46, M. Thomas P. Gilbert47, Guojie Zhang, Byrappa Venkatesh48, Robert W. Murphy49, Klaus-Peter Koepfli50, Beth Shapiro51, Beth Shapiro32, Warren E. Johnson50, Warren E. Johnson52, Federica Di Palma53, Tomas Marques-Bonet, Emma C. Teeling54, Tandy Warnow55, Jennifer A. Marshall Graves56, Oliver A. Ryder33, Oliver A. Ryder57, David Haussler32, Stephen J. O'Brien58, Jonas Korlach34, Harris A. Lewin5, Kerstin Howe2, Eugene W. Myers11, Eugene W. Myers10, Richard Durbin3, Richard Durbin2, Adam M. Phillippy1, Erich D. Jarvis4, Erich D. Jarvis51 
National Institutes of Health1, Wellcome Trust Sanger Institute2, University of Cambridge3, Rockefeller University4, University of California, Davis5, Leibniz Association6, Seoul National University7, University of Southern California8, European Bioinformatics Institute9, Max Planck Society10, Dresden University of Technology11, Radboud University Nijmegen12, University of St Andrews13, University of Massachusetts Amherst14, University of Adelaide15, University of Missouri16, East Carolina University17, University of Queensland18, Clemson University19, University of Otago20, University of Arizona21, Natural History Museum22, Bangor University23, University of Konstanz24, Harvard University25, Northeastern University26, University of Antwerp27, National Museum of Natural History28, University of Graz29, University of Florida30, University of Basel31, University of California, Santa Cruz32, Zoological Society of San Diego33, Pacific Biosciences34, Pompeu Fabra University35, University of Maryland, College Park36, Harbin Institute of Technology37, University of Chicago38, Oregon Health & Science University39, Qatar Airways40, Monash University Malaysia Campus41, University of Milan42, Goethe University Frankfurt43, Pennsylvania State University44, University of Los Andes45, University of Copenhagen46, Norwegian University of Science and Technology47, Agency for Science, Technology and Research48, Royal Ontario Museum49, Smithsonian Institution50, Howard Hughes Medical Institute51, Walter Reed Army Institute of Research52, University of East Anglia53, University College Dublin54, University of Illinois at Urbana–Champaign55, La Trobe University56, University of California, San Diego57, Nova Southeastern University58
28 Apr 2021-Nature
TL;DR: The Vertebrate Genomes Project (VGP) as mentioned in this paper is an international effort to generate high quality, complete reference genomes for all of the roughly 70,000 extant vertebrate species and to help to enable a new era of discovery across the life sciences.
Abstract: High-quality and complete reference genome assemblies are fundamental for the application of genomics to biology, disease, and biodiversity conservation. However, such assemblies are available for only a few non-microbial species1-4. To address this issue, the international Genome 10K (G10K) consortium5,6 has worked over a five-year period to evaluate and develop cost-effective methods for assembling highly accurate and nearly complete reference genomes. Here we present lessons learned from generating assemblies for 16 species that represent six major vertebrate lineages. We confirm that long-read sequencing technologies are essential for maximizing genome quality, and that unresolved complex repeats and haplotype heterozygosity are major sources of assembly error when not handled correctly. Our assemblies correct substantial errors, add missing sequence in some of the best historical reference genomes, and reveal biological discoveries. These include the identification of many false gene duplications, increases in gene sizes, chromosome rearrangements that are specific to lineages, a repeated independent chromosome breakpoint in bat genomes, and a canonical GC-rich pattern in protein-coding genes and their regulatory regions. Adopting these lessons, we have embarked on the Vertebrate Genomes Project (VGP), an international effort to generate high-quality, complete reference genomes for all of the roughly 70,000 extant vertebrate species and to help to enable a new era of discovery across the life sciences.

647 citations


Journal ArticleDOI
TL;DR: It is hypothesized that age-related decline and dysregulation of immune function, i.e., immunosenescence and inflammaging, play a major role in the heightened vulnerability of older adults to severe COVID-19 outcomes, and the authors advocate partitioning all immunological outcome data by age to better understand disease heterogeneity and aging.

495 citations


Journal ArticleDOI
01 Oct 2021
TL;DR: More than half of COVID-19 survivors experienced persistent postacute sequelae (PASC) 6 months after recovery as mentioned in this paper, and most common PASC involved functional mobility impairments, pulmonary abnormalities, and mental health disorders.
Abstract: Importance Short-term and long-term persistent postacute sequelae of COVID-19 (PASC) have not been systematically evaluated. The incidence and evolution of PASC are dependent on time from infection, organ systems and tissue affected, vaccination status, variant of the virus, and geographic region. Objective To estimate organ system–specific frequency and evolution of PASC. Evidence Review PubMed (MEDLINE), Scopus, the World Health Organization Global Literature on Coronavirus Disease, and CoronaCentral databases were searched from December 2019 through March 2021. A total of 2100 studies were identified from databases and through cited references. Studies providing data on PASC in children and adults were included. The Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) guidelines for abstracting data were followed and performed independently by 2 reviewers. Quality was assessed using the Newcastle-Ottawa Scale for cohort studies. The main outcome was frequency of PASC diagnosed by (1) laboratory investigation, (2) radiologic pathology, and (3) clinical signs and symptoms. PASC were classified by organ system, ie, neurologic; cardiovascular; respiratory; digestive; dermatologic; and ear, nose, and throat as well as mental health, constitutional symptoms, and functional mobility. Findings From a total of 2100 studies identified, 57 studies with 250 351 survivors of COVID-19 met inclusion criteria. The mean (SD) age of survivors was 54.4 (8.9) years, 140 196 (56%) were male, and 197 777 (79%) were hospitalized during acute COVID-19. High-income countries contributed 45 studies (79%). The median (IQR) proportion of COVID-19 survivors experiencing at least 1 PASC was 54.0% (45.0%-69.0%; 13 studies) at 1 month (short-term), 55.0% (34.8%-65.5%; 38 studies) at 2 to 5 months (intermediate-term), and 54.0% (31.0%-67.0%; 9 studies) at 6 or more months (long-term). 
Most prevalent pulmonary sequelae, neurologic disorders, mental health disorders, functional mobility impairments, and general and constitutional symptoms were chest imaging abnormality (median [IQR], 62.2% [45.8%-76.5%]), difficulty concentrating (median [IQR], 23.8% [20.4%-25.9%]), generalized anxiety disorder (median [IQR], 29.6% [14.0%-44.0%]), general functional impairments (median [IQR], 44.0% [23.4%-62.6%]), and fatigue or muscle weakness (median [IQR], 37.5% [25.4%-54.5%]), respectively. Other frequently reported symptoms included cardiac, dermatologic, digestive, and ear, nose, and throat disorders. Conclusions and Relevance In this systematic review, more than half of COVID-19 survivors experienced PASC 6 months after recovery. The most common PASC involved functional mobility impairments, pulmonary abnormalities, and mental health disorders. These long-term PASC effects occur on a scale that could overwhelm existing health care capacity, particularly in low- and middle-income countries.

481 citations


Journal ArticleDOI
Richard J. Abbott1, T. D. Abbott2, Sheelu Abraham3, Fausto Acernese4  +1428 moreInstitutions (155)
TL;DR: In this article, the authors report on the population of 47 compact binary mergers detected with a false-alarm rate of less than 1 per year, estimate merger rates, and find that the BBH rate likely increases with redshift, but not faster than the star formation rate.
Abstract: We report on the population of 47 compact binary mergers detected with a false-alarm rate of less than 1 per year. We estimate merger rates, finding RBBH = 23.9 (+14.3/−8.6) Gpc−3 yr−1 for binary black holes (BBHs) and RBNS = 320 (+490/−240) Gpc−3 yr−1 for binary neutron stars. We find that the BBH rate likely increases with redshift (85% credibility) but not faster than the star formation rate (86% credibility). Additionally, we examine recent exceptional events in the context of our population models, finding that the asymmetric masses of GW190412 and the high component masses of GW190521 are consistent with our models, but the low secondary mass of GW190814 makes it an outlier.

468 citations


Journal ArticleDOI
07 Jan 2021-PLOS ONE
TL;DR: The study showed that being a woman, having fair/poor general health status, being 18 to 24 years old, spending 8 or more hours on screens daily, and knowing someone infected predicted higher levels of psychological impact when risk factors were considered simultaneously.
Abstract: Background University students are increasingly recognized as a vulnerable population, suffering from higher levels of anxiety, depression, substance abuse, and disordered eating compared to the general population. Therefore, when the nature of their educational experience radically changes—such as sheltering in place during the COVID-19 pandemic—the burden on the mental health of this vulnerable population is amplified. The objectives of this study are to 1) identify the array of psychological impacts COVID-19 has on students, 2) develop profiles to characterize students' anticipated levels of psychological impact during the pandemic, and 3) evaluate potential sociodemographic, lifestyle-related, and awareness of people infected with COVID-19 risk factors that could make students more likely to experience these impacts. Methods Cross-sectional data were collected through web-based questionnaires from seven U.S. universities. Representative and convenience sampling was used to invite students to complete the questionnaires in mid-March to early-May 2020, when most coronavirus-related sheltering in place orders were in effect. We received 2,534 completed responses, of which 61% were from women, 79% from non-Hispanic Whites, and 20% from graduate students. Results Exploratory factor analysis on close-ended responses resulted in two latent constructs, which we used to identify profiles of students with latent profile analysis, including high (45% of sample), moderate (40%), and low (14%) levels of psychological impact. Bivariate associations showed students who were women, were non-Hispanic Asian, in fair/poor health, of below-average relative family income, or who knew someone infected with COVID-19 experienced higher levels of psychological impact. Students who were non-Hispanic White, above-average social class, spent at least two hours outside, or less than eight hours on electronic screens were likely to experience lower levels of psychological impact. 
Multivariate modeling (mixed-effects logistic regression) showed that being a woman, having fair/poor general health status, being 18 to 24 years old, spending 8 or more hours on screens daily, and knowing someone infected predicted higher levels of psychological impact when risk factors were considered simultaneously. Conclusion Inadequate efforts to recognize and address college students’ mental health challenges, especially during a pandemic, could have long-term consequences on their health and education.

444 citations


Book
24 Nov 2021
TL;DR: In this book, a multi-paradigm approach to analyzing ethical dilemmas is presented, with dilemmas categorized as paradoxes: individual rights/community standards, traditional curriculum/hidden curriculum, personal codes/professional codes, the American melting pot/the Chinese hot pot, and equality/equity.
Abstract: Part 1, Practice and Paradigms in the Study of Ethics: Multiple Ethical Paradigms and the Preparation of Educational Leaders in a Diverse and Complex Era; Viewing Ethical Dilemmas Through Multiple Paradigms; Figure 1, Diagrammatic Representation of the Ethic of the Profession. Part 2, A Multi-Paradigm Approach to Analyzing Dilemmas Categorized as Paradoxes: Individual Rights/Community Standards; Traditional Curriculum/Hidden Curriculum; Personal Codes/Professional Codes; The American Melting Pot/The Chinese Hot Pot; Equality/Equity. Part 3, Teaching as Scholarly Work: To the Instructor - Ethics, Ourselves and Our Pedagogy.

387 citations


Journal ArticleDOI
TL;DR: In this paper, an emerging technique called algorithm unrolling, or unfolding, offers promise in eliminating these issues by providing a concrete and systematic connection between iterative algorithms that are widely used in signal processing and deep neural networks.
Abstract: Deep neural networks provide unprecedented performance gains in many real-world problems in signal and image processing. Despite these gains, the future development and practical deployment of deep networks are hindered by their black-box nature, i.e., a lack of interpretability and the need for very large training sets. An emerging technique called algorithm unrolling, or unfolding, offers promise in eliminating these issues by providing a concrete and systematic connection between iterative algorithms that are widely used in signal processing and deep neural networks. Unrolling methods were first proposed to develop fast neural network approximations for sparse coding. More recently, this direction has attracted enormous attention, and it is rapidly growing in both theoretic investigations and practical applications. The increasing popularity of unrolled deep networks is due, in part, to their potential in developing efficient, high-performance (yet interpretable) network architectures from reasonably sized training sets.
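As an illustration of the unrolling idea (a sketch, not the article's own code): ISTA for sparse coding repeats a fixed linear step followed by soft-thresholding, and LISTA-style unrolling turns a fixed number of those iterations into network layers whose matrices and thresholds become learnable parameters. In this sketch the parameters are set to their classical ISTA values rather than trained, and all dimensions and values are hypothetical:

```python
import numpy as np

def soft_threshold(x, theta):
    """Proximal operator of the L1 norm (elementwise shrinkage)."""
    return np.sign(x) * np.maximum(np.abs(x) - theta, 0.0)

def unrolled_ista(y, W_e, S, thetas):
    """Unrolled ISTA: a fixed number of iterations treated as layers.
    In LISTA, W_e, S and the per-layer thresholds would be trained;
    here they keep their classical ISTA values."""
    x = soft_threshold(W_e @ y, thetas[0])
    for theta in thetas[1:]:
        x = soft_threshold(S @ x + W_e @ y, theta)
    return x

# Tiny sparse-coding demo with ISTA-derived (untrained) parameters:
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 16)) / np.sqrt(8)   # dictionary
L = np.linalg.norm(A, 2) ** 2                   # Lipschitz constant of the gradient
W_e = A.T / L                                   # plays (1/L) A^T
S = np.eye(16) - (A.T @ A) / L                  # plays I - (1/L) A^T A
x_true = np.zeros(16)
x_true[[2, 9]] = [1.0, -0.5]                    # 2-sparse code
y = A @ x_true
x_hat = unrolled_ista(y, W_e, S, thetas=[0.01] * 100)
# x_hat approximates the sparse code; training W_e, S and thetas (the
# LISTA idea) reaches comparable accuracy in far fewer layers.
```

The interpretability claim in the abstract comes from exactly this correspondence: each layer of the unrolled network is one iteration of a known algorithm, so the learned parameters retain a signal-processing meaning.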

Journal ArticleDOI
Richard J. Abbott1, T. D. Abbott2, Sheelu Abraham3, Fausto Acernese4  +1692 moreInstitutions (195)
TL;DR: In this article, the authors reported the observation of gravitational waves from two compact binary coalescences in LIGO's and Virgo's third observing run with properties consistent with neutron star-black hole (NSBH) binaries.
Abstract: We report the observation of gravitational waves from two compact binary coalescences in LIGO’s and Virgo’s third observing run with properties consistent with neutron star–black hole (NSBH) binaries. The two events are named GW200105_162426 and GW200115_042309, abbreviated as GW200105 and GW200115; the first was observed by LIGO Livingston and Virgo and the second by all three LIGO–Virgo detectors. The source of GW200105 has component masses 8.9 (+1.2/−1.5) M⊙ and 1.9 (+0.3/−0.2) M⊙, whereas the source of GW200115 has component masses 5.7 (+1.8/−2.1) M⊙ and 1.5 (+0.7/−0.3) M⊙ (all measurements quoted at the 90% credible level). The probability that the secondary’s mass is below the maximal mass of a neutron star is 89%–96% and 87%–98%, respectively, for GW200105 and GW200115, with the ranges arising from different astrophysical assumptions. The source luminosity distances are 280 (±110) Mpc and 300 (+150/−100) Mpc, respectively. The magnitude of the primary spin of GW200105 is less than 0.23 at the 90% credible level, and its orientation is unconstrained. For GW200115, the primary spin has a negative spin projection onto the orbital angular momentum at 88% probability. We are unable to constrain the spin or tidal deformation of the secondary component for either event. We infer an NSBH merger rate density of 45 (+75/−33) Gpc−3 yr−1 when assuming that GW200105 and GW200115 are representative of the NSBH population or 130 (+112/−69) Gpc−3 yr−1 under the assumption of a broader distribution of component masses.

Journal ArticleDOI
Richard J. Abbott1, T. D. Abbott2, Sheelu Abraham3, Fausto Acernese4  +1335 moreInstitutions (144)
TL;DR: The data recorded by these instruments during their first and second observing runs are described, including the gravitational-wave strain arrays, released as time series sampled at 16384 Hz.

Journal ArticleDOI
TL;DR: In this paper, the authors review what computer, in vitro, in vivo, and pharmacological experiments tell us about the accumulation and deposition of the oligomers of the (Aβ, tau), α-synuclein, IAPP, and superoxide dismutase 1 proteins, which have been the mainstream concept underlying Alzheimer's disease, Parkinson's disease (PD), type II diabetes (T2D), and amyotrophic lateral sclerosis (ALS) research.
Abstract: Protein misfolding and aggregation is observed in many amyloidogenic diseases affecting either the central nervous system or a variety of peripheral tissues. Structural and dynamic characterization of all species along the pathways from monomers to fibrils is challenging by experimental and computational means because they involve intrinsically disordered proteins in most diseases. Yet understanding how amyloid species become toxic is the challenge in developing a treatment for these diseases. Here we review what computer, in vitro, in vivo, and pharmacological experiments tell us about the accumulation and deposition of the oligomers of the (Aβ, tau), α-synuclein, IAPP, and superoxide dismutase 1 proteins, which have been the mainstream concept underlying Alzheimer's disease (AD), Parkinson's disease (PD), type II diabetes (T2D), and amyotrophic lateral sclerosis (ALS) research, respectively, for many years.

Journal Article
TL;DR: It is proved information-theoretically that a mixture of local and global models can reduce the generalization error, and a communication-efficient bilevel optimization method is proposed that reduces the number of communication rounds to $O(\sqrt{T})$ and achieves a convergence rate of $O(1/T)$ with some residual error.
Abstract: Investigation of the degree of personalization in federated learning algorithms has shown that only maximizing the performance of the global model will confine the capacity of the local models to personalize. In this paper, we advocate an adaptive personalized federated learning (APFL) algorithm, where each client will train their local models while contributing to the global model. We derive the generalization bound of mixture of local and global models, and find the optimal mixing parameter. We also propose a communication-efficient optimization method to collaboratively learn the personalized models and analyze its convergence in both smooth strongly convex and nonconvex settings. The extensive experiments demonstrate the effectiveness of our personalization schema, as well as the correctness of established generalization theories.
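The core of the adaptive personalization idea can be sketched as follows (a minimal illustration assuming linear models and squared loss, not the authors' implementation): each client keeps a local model v and mixes it with the global model w through a per-client weight alpha, which is itself updated by gradient descent:

```python
import numpy as np

def personalized_predict(alpha, v_local, w_global, x):
    """Personalized prediction: convex mixture of the client's local
    model v and the shared global model w (linear models for brevity)."""
    return x @ (alpha * v_local + (1.0 - alpha) * w_global)

def update_alpha(alpha, v_local, w_global, grad_mixed, lr):
    """Gradient step on the mixing weight: by the chain rule,
    d(loss)/d(alpha) = <gradient wrt the mixed model, v - w>."""
    g = grad_mixed @ (v_local - w_global)
    return float(np.clip(alpha - lr * g, 0.0, 1.0))

# Toy single-client demo with squared loss: the local model fits this
# client's data perfectly, the global one does not, so alpha drifts
# toward 1 (hypothetical numbers throughout).
x = np.array([1.0, 2.0])
y = 5.0
v = np.array([1.0, 2.0])       # local model:  x @ v = 5 (perfect fit)
w = np.array([0.0, 0.0])       # global model: x @ w = 0
alpha = 0.5
for _ in range(100):
    pred = personalized_predict(alpha, v, w, x)
    grad_mixed = (pred - y) * x        # gradient of 0.5 * (pred - y)**2
    alpha = update_alpha(alpha, v, w, grad_mixed, lr=0.01)
```

A client whose data the global model already fits well would see the opposite drift, with alpha moving toward 0; the paper's contribution is analyzing this mixture's generalization and convergence, not the mechanics sketched here.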

Journal ArticleDOI
12 Feb 2021-Science
TL;DR: The authors showed that infection-blocking immunity wanes rapidly but that disease-reducing immunity is long-lived, and that once the endemic phase is reached and primary exposure is in childhood, SARS-CoV-2 may be no more virulent than the common cold.
Abstract: We are currently faced with the question of how the severity of infection with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) may change in the years ahead. Our analysis of immunological and epidemiological data on endemic human coronaviruses (HCoVs) shows that infection-blocking immunity wanes rapidly but that disease-reducing immunity is long-lived. Our model, incorporating these components of immunity, recapitulates both the current severity of SARS-CoV-2 infection and the benign nature of HCoVs, suggesting that once the endemic phase is reached and primary exposure is in childhood, SARS-CoV-2 may be no more virulent than the common cold. We predict a different outcome for an emergent coronavirus that causes severe disease in children. These results reinforce the importance of behavioral containment during pandemic vaccine rollout, while prompting us to evaluate scenarios for continuing vaccination in the endemic phase.


ReportDOI
TL;DR: In this paper, a continuous-time model was proposed to estimate the effect of previous exporting activity on the costs of meeting new clients, and to characterize the cumulative effects of learning on firms' search intensities.
Abstract: Customs record data reveal a number of patterns in the relationships Colombian firms have with their U.S. buyers. We interpret these patterns in terms of a continuous-time model in which heterogeneous sellers search for buyers in a market. Success in selling to a buyer reveals information to the seller about the appeal of her product in the market, affecting her incentive to search for more buyers. Fit using the method of simulated moments, the model replicates key patterns in the customs records and allows us to quantify several types of trade costs, including the search costs of identifying potential clients and the costs of maintaining business relationships with existing clients. It also allows us to estimate the effect of previous exporting activity on the costs of meeting new clients, and to characterize the cumulative effects of learning on firms' search intensities. Finally, we use our fitted model to explore the effects of these trade costs and learning effects on aggregate export dynamics.

Journal ArticleDOI
TL;DR: A 2017 meta-analysis of data from 25 randomised controlled trials (RCTs) of vitamin D supplementation for the prevention of acute respiratory infections (ARIs) revealed a protective effect of this intervention as discussed by the authors.

Journal ArticleDOI
TL;DR: In this paper, the authors use the most up-to-date, spatially explicit global reconstruction of historical human populations and land use to show that this paradigm is likely wrong.
Abstract: Archaeological and paleoecological evidence shows that by 10,000 BCE, all human societies employed varying degrees of ecologically transformative land use practices, including burning, hunting, species propagation, domestication, cultivation, and others that have left long-term legacies across the terrestrial biosphere. Yet, a lingering paradigm among natural scientists, conservationists, and policymakers is that human transformation of terrestrial nature is mostly recent and inherently destructive. Here, we use the most up-to-date, spatially explicit global reconstruction of historical human populations and land use to show that this paradigm is likely wrong. Even 12,000 y ago, nearly three quarters of Earth’s land was inhabited and therefore shaped by human societies, including more than 95% of temperate and 90% of tropical woodlands. Lands now characterized as “natural,” “intact,” and “wild” generally exhibit long histories of use, as do protected areas and Indigenous lands, and current global patterns of vertebrate species richness and key biodiversity areas are more strongly associated with past patterns of land use than with present ones in regional landscapes now characterized as natural. The current biodiversity crisis can seldom be explained by the loss of uninhabited wildlands, resulting instead from the appropriation, colonization, and intensifying use of the biodiverse cultural landscapes long shaped and sustained by prior societies. Recognizing this deep cultural connection with biodiversity will therefore be essential to resolve the crisis.

Journal ArticleDOI
TL;DR: In this article, the authors focus on the available mechanistic models of additive manufacturing (AM) that have been adequately validated and evaluate the functionality of AM models in understanding of the printability of commonly used AM alloys and the fabrication of functionally graded alloys.


Journal ArticleDOI
TL;DR: Management scholars study phenomena marked by complex interdependencies, where multiple explanatory factors combine to bring about an outcome of interest; configurational theorizing is well suited to explaining such causal complexity.
Abstract: Management scholars study phenomena marked by complex interdependencies where multiple explanatory factors combine to bring about an outcome of interest. Yet, theorizing about causal complexity can prove challenging for the correlational theorizing that is predominant in the field of management, given its “net effects thinking” that emphasizes the unique contribution of individual explanatory factors. In contrast, configurational theories and thinking are well-suited to explaining causally complex phenomena. In this article, we seek to advance configurational theorizing by providing a model of the configurational theorizing process which consists of three iterative stages—scoping, linking and naming. In each stage, we develop and offer several heuristics aimed at stimulating configurational theorizing. That is, these theorizing heuristics are intended to help scholars discover configurations of explanatory factors, probe the connections among these factors, and articulate the orchestrating themes that underpin their coherence. We conclude with a discussion of how configurational theorizing advances theory development in the field of management and organizations, and beyond.

Journal ArticleDOI
TL;DR: An explication of “fake news” that, as a concept, has ballooned to include more than simply false information, with partisans weaponizing it to cast aspersions on the veracity of claims made by those who are politically opposed to them is conducted.
Abstract: As the scourge of “fake news” continues to plague our information environment, attention has turned toward devising automated solutions for detecting problematic online content. But, in order to bu...

Journal ArticleDOI
TL;DR: The Big Ten Conference requires comprehensive cardiac testing including cardiac magnetic resonance (CMR) imaging for all athletes with COVID-19, allowing comparison of screening approaches for safe return to play as mentioned in this paper.
Abstract: Importance Myocarditis is a leading cause of sudden death in competitive athletes. Myocardial inflammation is known to occur with SARS-CoV-2. Different screening approaches for detection of myocarditis have been reported. The Big Ten Conference requires comprehensive cardiac testing including cardiac magnetic resonance (CMR) imaging for all athletes with COVID-19, allowing comparison of screening approaches. Objective To determine the prevalence of myocarditis in athletes with COVID-19 and compare screening strategies for safe return to play. Design, Setting, and Participants Big Ten COVID-19 Cardiac Registry principal investigators were surveyed for aggregate observational data from March 1, 2020, through December 15, 2020, on athletes with COVID-19. For athletes with myocarditis, presence of cardiac symptoms and details of cardiac testing were recorded. Myocarditis was categorized as clinical or subclinical based on the presence of cardiac symptoms and CMR findings. Subclinical myocarditis was classified as probable or possible myocarditis based on other testing abnormalities. Myocarditis prevalence across universities was determined. The utility of different screening strategies was evaluated. Exposures SARS-CoV-2 by polymerase chain reaction testing. Main Outcome and Measure Myocarditis via cardiovascular diagnostic testing. Results Representing 13 universities, cardiovascular testing was performed in 1597 athletes (964 men [60.4%]). Thirty-seven (including 27 men) were diagnosed with COVID-19 myocarditis (overall 2.3%; range per program, 0%-7.6%); 9 had clinical myocarditis and 28 had subclinical myocarditis. If cardiac testing was based on cardiac symptoms alone, only 5 athletes would have been detected (detected prevalence, 0.31%). Cardiac magnetic resonance imaging for all athletes yielded a 7.4-fold increase in detection of myocarditis (clinical and subclinical). Follow-up CMR imaging performed in 27 (73.0%) demonstrated resolution of T2 elevation in all (100%) and late gadolinium enhancement in 11 (40.7%). Conclusions and Relevance In this cohort study of 1597 US competitive athletes with CMR screening after COVID-19 infection, 37 athletes (2.3%) were diagnosed with clinical and subclinical myocarditis. Variability was observed in prevalence across universities, and testing protocols were closely tied to the detection of myocarditis. Variable ascertainment and unknown implications of CMR findings underscore the need for standardized timing and interpretation of cardiac testing. These unique CMR imaging data provide a more complete understanding of the prevalence of clinical and subclinical myocarditis in college athletes after COVID-19 infection. The role of CMR in routine screening for athletes' safe return to play should be explored further.
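The registry's headline percentages follow directly from the reported counts, which makes the comparison of screening strategies easy to verify:

```python
# Counts reported in the Big Ten COVID-19 Cardiac Registry summary.
total_tested = 1597
myocarditis = 37        # 9 clinical + 28 subclinical
symptom_detected = 5    # athletes who would be caught by symptom-based screening alone

overall_prevalence = 100 * myocarditis / total_tested   # prevalence with CMR for all
symptom_only = 100 * symptom_detected / total_tested    # prevalence detectable by symptoms
fold_increase = myocarditis / symptom_detected          # detection gain from universal CMR
```

These reproduce the abstract's figures: an overall prevalence of about 2.3%, a symptom-detected prevalence of about 0.31%, and a 7.4-fold increase in detection with universal CMR screening.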

Journal ArticleDOI
TL;DR: In this paper, the removal of heavy metals and dyes by clay-based adsorbents, from natural clays to 1D clay nanotubes and 2D Clay nanosheets, has been summarized.

Journal ArticleDOI
TL;DR: In this paper, the authors examined advances in metal printing focusing on metallurgy, as well as the use of mechanistic models and machine learning and the role they play in the expansion of the additive manufacturing of metals.
Abstract: Additive manufacturing enables the printing of metallic parts, such as customized implants for patients, durable single-crystal parts for use in harsh environments, and the printing of parts with site-specific chemical compositions and properties from 3D designs. However, the selection of alloys, printing processes and process variables results in an exceptional diversity of microstructures, properties and defects that affect the serviceability of the printed parts. Control of these attributes using the rich knowledge base of metallurgy remains a challenge because of the complexity of the printing process. Transforming 3D designs created in the virtual world into high-quality products in the physical world needs a new methodology not commonly used in traditional manufacturing. Rapidly developing powerful digital tools such as mechanistic models and machine learning, when combined with the knowledge base of metallurgy, have the potential to shape the future of metal printing. Starting from product design to process planning and process monitoring and control, these tools can help improve microstructure and properties, mitigate defects, automate part inspection and accelerate part qualification. Here, we examine advances in metal printing focusing on metallurgy, as well as the use of mechanistic models and machine learning and the role they play in the expansion of the additive manufacturing of metals. Several key industries routinely use metal printing to make complex parts that are difficult to produce by conventional manufacturing. Here, we show that a synergistic combination of metallurgy, mechanistic models and machine learning is driving the continued growth of metal printing.

Journal ArticleDOI
TL;DR: 1,3-dimethyl-3-imidazolium hexafluorophosphate (DMIMPF 6 ) ionic liquid is adopted to passivate the perovskite surface and also reduce the energy barrier between the perOVskite and hole transport layer to provide firm support to the understanding of the passivation effect.
Abstract: Surface defects have been a key constraint for perovskite photovoltaics. Herein, 1,3-dimethyl-3-imidazolium hexafluorophosphate (DMIMPF6) ionic liquid (IL) is adopted to passivate the surface of a formamidinium-cesium lead iodide perovskite (Cs0.08FA0.92PbI3) and also reduce the energy barrier between the perovskite and hole transport layer. Theoretical simulations and experimental results demonstrate that Pb-cluster and Pb-I antisite defects can be effectively passivated by [DMIM]+ bonding with the Pb2+ ion on the perovskite surface, leading to significantly suppressed non-radiative recombination. As a result, the solar cell efficiency was increased to 23.25% from 21.09%. Meanwhile, the DMIMPF6-treated perovskite device demonstrated long-term stability because the hydrophobic DMIMPF6 layer blocked moisture permeation.

DOI
25 Nov 2021
TL;DR: In this paper, the development of 2D field-effect transistors for use in future VLSI technologies is reviewed, and the key performance indicators for aggressively scaled 2D transistors are discussed.
Abstract: Field-effect transistors based on two-dimensional (2D) materials have the potential to be used in very large-scale integration (VLSI) technology, but whether they can be used at the front end of line or at the back end of line through monolithic or heterogeneous integration remains to be determined. To achieve this, multiple challenges must be overcome, including reducing the contact resistance, developing stable and controllable doping schemes, advancing mobility engineering and improving high-κ dielectric integration. The large-area growth of uniform 2D layers is also required to ensure low defect density, low device-to-device variation and clean interfaces. Here we review the development of 2D field-effect transistors for use in future VLSI technologies. We consider the key performance indicators for aggressively scaled 2D transistors and discuss how these should be extracted and reported. We also highlight potential applications of 2D transistors in conventional micro/nanoelectronics, neuromorphic computing, advanced sensing, data storage and future interconnect technologies. This Review examines the development of field-effect transistors based on two-dimensional materials and considers the challenges that need to be addressed for the devices to be incorporated into very large-scale integration (VLSI) technology.

Journal ArticleDOI
TL;DR: In this article, the authors benchmark device-to-device variation in field effect transistors (FETs) based on monolayer MoS2 and WS2 films grown using metal-organic chemical vapor deposition process.
Abstract: Here we benchmark device-to-device variation in field-effect transistors (FETs) based on monolayer MoS2 and WS2 films grown using metal-organic chemical vapor deposition process. Our study involves 230 MoS2 FETs and 160 WS2 FETs with channel lengths ranging from 5 μm down to 100 nm. We use statistical measures to evaluate key FET performance indicators for benchmarking these two-dimensional (2D) transition metal dichalcogenide (TMD) monolayers against existing literature as well as ultra-thin body Si FETs. Our results show consistent performance of 2D FETs across 1 × 1 cm2 chips owing to high quality and uniform growth of these TMDs followed by clean transfer onto device substrates. We are able to demonstrate record high carrier mobility of 33 cm2 V−1 s−1 in WS2 FETs, which is a 1.5X improvement compared to the best reported in the literature. Our experimental demonstrations confirm the technological viability of 2D FETs in future integrated circuits. Here, the authors perform a benchmark study of field-effect transistors (FETs) based on 2D transition metal dichalcogenides, i.e., 230 MoS2 and 160 WS2 FETs, and track device-to-device variations to gauge the technological viability in future integrated circuits.