
Journal ArticleDOI
TL;DR: The results, which mirror those found previously for political fake news, suggest that nudging people to think about accuracy is a simple way to improve choices about what to share on social media.
Abstract: Across two studies with more than 1,700 U.S. adults recruited online, we present evidence that people share false claims about COVID-19 partly because they simply fail to think sufficiently about whether or not the content is accurate when deciding what to share. In Study 1, participants were far worse at discerning between true and false content when deciding what they would share on social media relative to when they were asked directly about accuracy. Furthermore, greater cognitive reflection and science knowledge were associated with stronger discernment. In Study 2, we found that a simple accuracy reminder at the beginning of the study (i.e., judging the accuracy of a non-COVID-19-related headline) nearly tripled the level of truth discernment in participants' subsequent sharing intentions. Our results, which mirror those found previously for political fake news, suggest that nudging people to think about accuracy is a simple way to improve choices about what to share on social media.
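In this line of work, "truth discernment" is the gap between engagement with true and with false headlines. A toy illustration of the sharing-intention version with invented rates (not the paper's data):

```python
# Truth discernment for sharing = share rate for true headlines minus share
# rate for false headlines. The rates below are made-up numbers chosen only
# to illustrate how a "nearly tripled" discernment would look.
def discernment(share_rate_true, share_rate_false):
    return share_rate_true - share_rate_false

baseline = discernment(0.40, 0.34)   # sharing barely favors true headlines
nudged = discernment(0.45, 0.28)     # after an accuracy reminder
assert round(baseline, 2) == 0.06
assert nudged > 2.5 * baseline       # roughly a tripling of discernment
```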

914 citations


Journal ArticleDOI
TL;DR: Pregnant women in Zika virus-affected areas should protect themselves from mosquito bites by using air conditioning, screens, or nets when indoors, wearing long sleeves and pants, using permethrin-treated clothing and gear, and using insect repellents when outdoors.
Abstract: In early 2015, an outbreak of Zika virus, a flavivirus transmitted by Aedes mosquitoes, was identified in northeast Brazil, an area where dengue virus was also circulating. By September, reports of an increase in the number of infants born with microcephaly in Zika virus-affected areas began to emerge, and Zika virus RNA was identified in the amniotic fluid of two women whose fetuses had been found to have microcephaly by prenatal ultrasound. The Brazil Ministry of Health (MoH) established a task force to investigate the possible association of microcephaly with Zika virus infection during pregnancy and a registry for incident microcephaly cases (head circumference ≥2 standard deviations [SD] below the mean for sex and gestational age at birth) and pregnancy outcomes among women suspected to have had Zika virus infection during pregnancy. Among a cohort of 35 infants with microcephaly born during August-October 2015 in eight of Brazil's 26 states and reported to the registry, the mothers of all 35 had lived in or visited Zika virus-affected areas during pregnancy, 25 (71%) infants had severe microcephaly (head circumference >3 SD below the mean for sex and gestational age), 17 (49%) had at least one neurologic abnormality, and among 27 infants who had neuroimaging studies, all had abnormalities. Tests for other congenital infections were negative. All infants had a lumbar puncture as part of the evaluation and cerebrospinal fluid (CSF) samples were sent to a reference laboratory in Brazil for Zika virus testing; results are not yet available. Further studies are needed to confirm the association of microcephaly with Zika virus infection during pregnancy and to understand any other adverse pregnancy outcomes associated with Zika virus infection. 
Pregnant women in Zika virus-affected areas should protect themselves from mosquito bites by using air conditioning, screens, or nets when indoors, wearing long sleeves and pants, using permethrin-treated clothing and gear, and using insect repellents when outdoors. Pregnant and lactating women can use all U.S. Environmental Protection Agency (EPA)-registered insect repellents according to the product label.
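The registry's cutoffs quoted above amount to a z-score rule on head circumference. A minimal sketch with invented reference values (real growth charts are sex- and gestational-age-specific):

```python
# Classification rule as quoted in the abstract: microcephaly if head
# circumference is at least 2 SD below the mean for sex and gestational age,
# "severe" if more than 3 SD below. The mean and SD used here are invented
# placeholders, not values from any growth chart.
def classify(hc_mm, mean_mm, sd_mm):
    z = (hc_mm - mean_mm) / sd_mm
    if z < -3:
        return "severe microcephaly"
    if z <= -2:
        return "microcephaly"
    return "within normal range"

assert classify(290, 340, 15) == "severe microcephaly"   # z ~ -3.3
assert classify(305, 340, 15) == "microcephaly"          # z ~ -2.3
assert classify(335, 340, 15) == "within normal range"
```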

914 citations


Journal ArticleDOI
08 Jul 2016-PLOS ONE
TL;DR: Poor wellbeing and moderate to high levels of burnout are associated, in the majority of studies reviewed, with poor patient safety outcomes such as medical errors; however, the lack of prospective studies reduces the ability to determine causality.
Abstract: Objective To determine whether there is an association between healthcare professionals’ wellbeing and burnout, with patient safety. Design Systematic research review. Data Sources PsycINFO (1806 to July 2015), Medline (1946 to July 2015), Embase (1947 to July 2015) and Scopus (1823 to July 2015) were searched, along with reference lists of eligible articles. Eligibility Criteria for Selecting Studies Quantitative, empirical studies that included i) either a measure of wellbeing or burnout, and ii) patient safety, in healthcare staff populations. Results Forty-six studies were identified. Sixteen out of the 27 studies that measured wellbeing found a significant correlation between poor wellbeing and worse patient safety, with six additional studies finding an association with some but not all scales used, and one study finding a significant association but in the opposite direction to the majority of studies. Twenty-one out of the 30 studies that measured burnout found a significant association between burnout and patient safety, whilst a further four studies found an association between one or more (but not all) subscales of the burnout measures employed and patient safety. Conclusions Poor wellbeing and moderate to high levels of burnout are associated, in the majority of studies reviewed, with poor patient safety outcomes such as medical errors; however, the lack of prospective studies reduces the ability to determine causality. Further prospective studies, research in primary care, conducted within the UK, and a clearer definition of healthcare staff wellbeing are needed. Implications This review illustrates the need for healthcare organisations to consider improving employees’ mental health as well as creating safer work environments when planning interventions to improve patient safety. Systematic Review Registration PROSPERO registration number: CRD42015023340.

914 citations


Journal ArticleDOI
TL;DR: Two pregnant women from the state of Paraiba, diagnosed with fetal microcephaly and considered part of the ‘microcephaly cluster’, tested positive for Zika virus on amniotic fluid analysis, most likely representing the first diagnoses of intrauterine transmission of the virus.
Abstract: An unexpected upsurge in diagnosis of fetal and pediatric microcephaly has been reported in the Brazilian press recently. Cases have been diagnosed in nine Brazilian states so far. By 28 November 2015, 646 cases had been reported in Pernambuco state alone. Although reports have circulated regarding the declaration of a state of national health emergency, there is no information on the imaging and clinical findings of affected cases. Authorities are considering different theories behind the ‘microcephaly outbreak’, including a possible association with the emergence of Zika virus disease within the region, the first case of which was detected in May 2015 [1]. Zika virus is a mosquito-borne disease closely related to yellow fever, dengue, West Nile and Japanese encephalitis viruses [2]. It was first identified in 1947 in the Zika Valley in Uganda and causes a mild disease with fever, erythema and arthralgia. Interestingly, vertical transmission to the fetus has not been reported previously, although two cases of perinatal transmission, occurring around the time of delivery and causing mild disease in the newborns, have been described [3]. We have recently examined two pregnant women from the state of Paraiba who were diagnosed with fetal microcephaly and were considered part of the ‘microcephaly cluster’ as both women suffered from symptoms related to Zika virus infection. Although both patients had negative blood results for Zika virus, amniocentesis and subsequent quantitative real-time polymerase chain reaction [4], performed after ultrasound diagnosis of fetal microcephaly and analyzed at the Oswaldo Cruz Foundation, Rio de Janeiro, Brazil, were positive for Zika virus in both patients, most likely representing the first diagnoses of intrauterine transmission of the virus. The sequencing analysis identified in both cases a genotype of Asian origin. In Case 1, fetal ultrasound examination was performed at 30.1 weeks’ gestation.
Head circumference (HC) was 246 mm (2.6 SD below expected value) and weight was estimated as 1179 g (21st percentile). Abdominal circumference (AC), femur length (FL) and transcranial Doppler were normal for gestational age, as was the width of the lateral ventricles. Anomalies were limited to the brain and included brain atrophy with coarse calcifications involving the white matter of the frontal lobes, including the caudate, lentostriatal vessels and cerebellum. Corpus callosal and vermian dysgenesis and enlarged cisterna magna were observed (Figure 1). In Case 2, fetal ultrasound examination was performed at 29.2 weeks’ gestation. HC was 229 mm (3.1 SD below expected value). …
Figure 1 Case 1: (a) Transabdominal axial ultrasound image shows cerebral calcifications with failure of visualization of a normal vermis (large arrow). Calcifications are also present in the brain parenchyma (small arrow). (b) Transvaginal sagittal image shows dysgenesis of the corpus callosum (small arrow) and vermis (large arrow). (c) Coronal plane shows a wide interhemispheric fissure (large arrow) due to brain atrophy and bilateral parenchymatic coarse calcifications (small arrows). (d) Calcifications are visible in this more posterior coronal view and can be seen to involve the caudate (arrows).

914 citations


Proceedings Article
06 Oct 2017
TL;DR: In this article, the authors examined six extensions to the DQN algorithm and empirically studied their combination, showing that the combination provided state-of-the-art performance on the Atari 2600 benchmark.
Abstract: The deep reinforcement learning community has made several independent improvements to the DQN algorithm. However, it is unclear which of these extensions are complementary and can be fruitfully combined. This paper examines six extensions to the DQN algorithm and empirically studies their combination. Our experiments show that the combination provides state-of-the-art performance on the Atari 2600 benchmark, both in terms of data efficiency and final performance. We also provide results from a detailed ablation study that shows the contribution of each component to overall performance.
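The six extensions combined in this work (per the paper: double Q-learning, prioritized replay, dueling networks, multi-step learning, distributional RL, and noisy nets) are studied via a full combination plus one ablation per component. A hypothetical configuration scaffold for that design, not the authors' code:

```python
# Component names are the six DQN extensions named in the Rainbow paper; the
# config scaffold itself is a made-up illustration of the ablation design.
COMPONENTS = [
    "double_q", "prioritized_replay", "dueling",
    "multi_step", "distributional", "noisy_nets",
]

def make_configs():
    """Return the full-agent config and one ablation per component."""
    full = {c: True for c in COMPONENTS}
    ablations = [{**full, c: False} for c in COMPONENTS]
    return full, ablations

full, ablations = make_configs()
assert len(ablations) == 6                                # one ablation each
assert all(sum(cfg.values()) == 5 for cfg in ablations)   # each drops exactly one
```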

914 citations


Journal ArticleDOI
05 Nov 2015-Nature
TL;DR: A remarkable plasticity of PTEN expression in metastatic tumour cells in response to different organ microenvironments is demonstrated, underpinning an essential role of co-evolution between the metastatic cells and their microenvironment during the adaptive metastatic outgrowth.
Abstract: The development of life-threatening cancer metastases at distant organs requires disseminated tumour cells' adaptation to, and co-evolution with, the drastically different microenvironments of metastatic sites. Cancer cells of common origin manifest distinct gene expression patterns after metastasizing to different organs. Clearly, the dynamic interaction between metastatic tumour cells and extrinsic signals at individual metastatic organ sites critically affects the subsequent metastatic outgrowth. Yet, it is unclear when and how disseminated tumour cells acquire the essential traits from the microenvironment of metastatic organs that prime their subsequent outgrowth. Here we show that both human and mouse tumour cells with normal expression of PTEN, an important tumour suppressor, lose PTEN expression after dissemination to the brain, but not to other organs. The PTEN level in PTEN-loss brain metastatic tumour cells is restored after leaving the brain microenvironment. This brain microenvironment-dependent, reversible PTEN messenger RNA and protein downregulation is epigenetically regulated by microRNAs from brain astrocytes. Mechanistically, astrocyte-derived exosomes mediate an intercellular transfer of PTEN-targeting microRNAs to metastatic tumour cells, while astrocyte-specific depletion of PTEN-targeting microRNAs or blockade of astrocyte exosome secretion rescues the PTEN loss and suppresses brain metastasis in vivo. Furthermore, this adaptive PTEN loss in brain metastatic tumour cells leads to an increased secretion of the chemokine CCL2, which recruits IBA1-expressing myeloid cells that reciprocally enhance the outgrowth of brain metastatic tumour cells via enhanced proliferation and reduced apoptosis.
Our findings demonstrate a remarkable plasticity of PTEN expression in metastatic tumour cells in response to different organ microenvironments, underpinning an essential role of co-evolution between the metastatic cells and their microenvironment during the adaptive metastatic outgrowth. Our findings signify the dynamic and reciprocal cross-talk between tumour cells and the metastatic niche; importantly, they provide new opportunities for effective anti-metastasis therapies, especially of consequence for brain metastasis patients.

914 citations


Journal ArticleDOI
TL;DR: This paper provides a data-driven approach to partition the data into subpopulations that differ in the magnitude of their treatment effects, and proposes an “honest” approach to estimation, whereby one sample is used to construct the partition and another to estimate treatment effects for each subpopulation.
Abstract: In this paper we propose methods for estimating heterogeneity in causal effects in experimental and observational studies and for conducting hypothesis tests about the magnitude of differences in treatment effects across subsets of the population. We provide a data-driven approach to partition the data into subpopulations that differ in the magnitude of their treatment effects. The approach enables the construction of valid confidence intervals for treatment effects, even with many covariates relative to the sample size, and without “sparsity” assumptions. We propose an “honest” approach to estimation, whereby one sample is used to construct the partition and another to estimate treatment effects for each subpopulation. Our approach builds on regression tree methods, modified to optimize for goodness of fit in treatment effects and to account for honest estimation. Our model selection criterion anticipates that bias will be eliminated by honest estimation and also accounts for the effect of making additional splits on the variance of treatment effect estimates within each subpopulation. We address the challenge that the “ground truth” for a causal effect is not observed for any individual unit, so that standard approaches to cross-validation must be modified. Through a simulation study, we show that, for our preferred method, honest estimation results in nominal coverage for 90% confidence intervals, whereas coverage ranges between 74% and 84% for nonhonest approaches. Honest estimation requires estimating the model with a smaller sample size; the cost in terms of mean squared error of treatment effects for our preferred method ranges between 7% and 22%.
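The honest sample-splitting idea can be sketched on toy data: one half of the sample chooses the partition (here a fixed median split stands in for the paper's causal tree), and the held-out half, never used for splitting, estimates the treatment effect within each subgroup:

```python
import random

random.seed(0)
# Toy data: (covariate x, treatment w, outcome y); the true treatment effect
# is 2.0 when x > 0.5 and 0.0 otherwise. Entirely synthetic illustration.
data = []
for _ in range(2000):
    x = random.random()
    w = random.randint(0, 1)
    tau = 2.0 if x > 0.5 else 0.0
    y = tau * w + random.gauss(0, 1)
    data.append((x, w, y))

# Honest split: "build" picks the partition, "estimate" measures the effects.
build, estimate = data[:1000], data[1000:]

# Toy partition rule: split at the median covariate of the build sample
# (a real causal tree would search splits to maximize effect heterogeneity).
threshold = sorted(p[0] for p in build)[len(build) // 2]

def effect(sample):
    """Difference in mean outcomes, treated minus control."""
    treated = [y for x, w, y in sample if w == 1]
    control = [y for x, w, y in sample if w == 0]
    return sum(treated) / len(treated) - sum(control) / len(control)

left = [p for p in estimate if p[0] <= threshold]
right = [p for p in estimate if p[0] > threshold]
print(round(effect(left), 2), round(effect(right), 2))  # ~0 and ~2
```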

913 citations


Journal ArticleDOI
TL;DR: A large international community sample was recruited to complete measures of self-perceived risk of contracting COVID-19, fear of the virus, moral foundations, political orientation, and behavior change in response to the pandemic, and the only predictor of positive behavior change was fear of COVID-19, with no effect of politically relevant variables.
Abstract: In the current context of the global pandemic of coronavirus disease-2019 (COVID-19), health professionals are working with social scientists to inform government policy on how to slow the spread of the virus. An increasing amount of social scientific research has looked at the role of public message framing, for instance, but few studies have thus far examined the role of individual differences in emotional and personality-based variables in predicting virus-mitigating behaviors. In this study, we recruited a large international community sample (N = 324) to complete measures of self-perceived risk of contracting COVID-19, fear of the virus, moral foundations, political orientation, and behavior change in response to the pandemic. Consistently, the only predictor of positive behavior change (e.g., social distancing, improved hand hygiene) was fear of COVID-19, with no effect of politically relevant variables. We discuss these data in relation to the potentially functional nature of fear in global health crises.

913 citations


Journal ArticleDOI
25 Feb 2020-JAMA
TL;DR: The authors describe the current status of 2019-nCoV, assess the response, and offer proposals for bringing the outbreak under control.
Abstract: On December 31, 2019, China reported to the World Health Organization (WHO) cases of pneumonia in Wuhan, Hubei Province, China, now designated 2019-nCoV. Mounting cases and deaths pose major public health and governance challenges. China’s imposition of an unprecedented cordon sanitaire (a guarded area preventing anyone from leaving) in Hubei Province has also sparked controversy concerning its implementation and effectiveness. Cases have now spread to 4 continents. We describe the current status of 2019-nCoV, assess the response, and offer proposals for bringing the outbreak under control.

913 citations


Journal ArticleDOI
TL;DR: This paper provides an analysis and synthesis of the differing perspectives held by three prominent methodologists, Robert K. Yin, Sharan Merriam, and Robert E. Stake, on the use of the case study method in the field of educational research.
Abstract: Case study is one of the most frequently used qualitative research methodologies. However, it still does not have a legitimate status as a social science research strategy because it does not have well-defined and well-structured protocols (Yin, 2002), so emerging researchers who plan to utilize case study usually become confused "as to what a case study is and how it can be differentiated from other types of qualitative research" (Merriam, 1998, p. xi). Research methodologists do not have a consensus on the design and implementation of case study, which makes it a contested terrain and hampers its full evolution. In this paper, I aim to provide an analysis and synthesis of the differing perspectives which are held by three prominent methodologists, namely Robert K. Yin, Sharan Merriam, and Robert E. Stake, on the utilization of case study method in the field of educational research. I will zero in on the ensuing works: Robert K. Yin's Case Study Research: Design and Methods (2002), Sharan B. Merriam's Qualitative Research and Case Study Applications in Education (1998), and Robert E. Stake's The Art of Case Study Research (1995). I selected these three methodologists and their particular books for the following reasons. First, Yin, Merriam and Stake are the three seminal authors who provide procedures to follow when conducting case study research (Creswell, Hanson, Plano, & Morales, 2007) which aid educational researchers to construct a roadmap in their utilization of case study. They are seen as three foundational methodologists in the area of case study research whose methodological suggestions largely impact educational researchers' decisions concerning case study design. Second, previous work on case study detailed the design (Baxter & Jack, 2008), introduction (Tellis, 1997a), and application of case study methodology (Tellis, 1997b) for broader audience of novice qualitative researchers. 
I believe this paper would be most beneficial and fruitful by exposing novice researchers to a spectrum of different views and conceptualizations of case study that are provided by prominent research methodologists from differing vantage points. This exposure would help them construct or position their own understanding in this spectrum so that they can conduct their research with a dependable and defensible design. Therefore, I present each one of the three distinctive stances on the knotty design issues in case study methodology through points of divergence, convergence, and complementarity. Finally, I opted to concentrate on their particular books for the juxtaposition in this paper, because in these seminal volumes they conscientiously expound upon case study research in its entirety by providing valuable insights into its every step from how it is being conceptualized to how it is communicated to the readers. Thus, the readers of the current paper will have a synthesis and analysis of three complete guides to case study methods, from which they can select the tools that are most appropriate and functional for their own research purposes. In this paper, I endeavor to scrutinize the areas where these three perspectives diverge, converge and complement one another in varying dimensions of case study research. I am going to follow six categorical dimensions which the three scholars mostly converge upon in their seminal texts on case study method: Epistemological Commitments, Defining Case and Case Study, Designing Case Study, Gathering Data, Analyzing Data, and Validating Data. Researcher's Position Prior to moving on to present a comparison of three case study perspectives, I believe readers need to know my identity as a researcher, my investment in this topic, and my intentions in this project. 
I just completed my doctoral degree in the field of applied linguistics with a dissertation focusing on English as a second language (ESL) teacher candidates' professional identity development. …

913 citations


Journal ArticleDOI
TL;DR: In this paper, the authors consider stochastic programs where the distribution of the uncertain parameters is only observable through a finite training dataset and use the Wasserstein metric to construct a ball in the space of probability distributions centered at the uniform distribution on the training samples.
Abstract: We consider stochastic programs where the distribution of the uncertain parameters is only observable through a finite training dataset. Using the Wasserstein metric, we construct a ball in the space of (multivariate and non-discrete) probability distributions centered at the uniform distribution on the training samples, and we seek decisions that perform best in view of the worst-case distribution within this Wasserstein ball. The state-of-the-art methods for solving the resulting distributionally robust optimization problems rely on global optimization techniques, which quickly become computationally excruciating. In this paper we demonstrate that, under mild assumptions, the distributionally robust optimization problems over Wasserstein balls can in fact be reformulated as finite convex programs—in many interesting cases even as tractable linear programs. Leveraging recent measure concentration results, we also show that their solutions enjoy powerful finite-sample performance guarantees. Our theoretical results are exemplified in mean-risk portfolio optimization as well as uncertainty quantification.
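One consequence of this construction is easy to check numerically: for an L-Lipschitz loss on the real line, the worst-case expectation over a Wasserstein-1 ball of radius ε around the empirical distribution is the empirical mean loss plus εL (Kantorovich-Rubinstein duality), attained by transporting every sample a distance ε in the ascent direction. A sketch with loss(x) = |x| (so L = 1), not code from the paper:

```python
# Worst case over the Wasserstein-1 ball: for an L-Lipschitz loss, the
# adversary's best move is to shift each sample by eps along the direction of
# steepest increase, adding exactly eps * L to the empirical mean loss.
# Samples and radius below are made-up numbers for illustration.
samples = [0.5, 1.5, 2.0, 3.0]   # all positive, so |x| increases to the right
eps = 0.25                        # Wasserstein-ball radius

def loss(x):
    return abs(x)                 # 1-Lipschitz

empirical = sum(loss(x) for x in samples) / len(samples)
worst_case = sum(loss(x + eps) for x in samples) / len(samples)
assert abs(worst_case - (empirical + eps)) < 1e-12
print(empirical, worst_case)      # 1.75 2.0
```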

Journal ArticleDOI
Richard J. Abbott, T. D. Abbott, Sheelu Abraham, Fausto Acernese, +1334 more authors (150 institutions)
TL;DR: In this paper, the authors reported the observation of a compact binary coalescence involving a 22.2–24.3 M⊙ black hole and a compact object with a mass of 2.50–2.67 M⊙ (all measurements quoted at the 90% credible level). The gravitational-wave signal, GW190814, was observed during LIGO's and Virgo's third observing run on 2019 August 14 at 21:10:39 UTC and has a signal-to-noise ratio of 25 in the three-detector network.
Abstract: We report the observation of a compact binary coalescence involving a 22.2–24.3 M⊙ black hole and a compact object with a mass of 2.50–2.67 M⊙ (all measurements quoted at the 90% credible level). The gravitational-wave signal, GW190814, was observed during LIGO's and Virgo's third observing run on 2019 August 14 at 21:10:39 UTC and has a signal-to-noise ratio of 25 in the three-detector network. The source was localized to 18.5 deg² at a distance of ${241}_{-45}^{+41}$ Mpc; no electromagnetic counterpart has been confirmed to date. The source has the most unequal mass ratio yet measured with gravitational waves, ${0.112}_{-0.009}^{+0.008}$, and its secondary component is either the lightest black hole or the heaviest neutron star ever discovered in a double compact-object system. The dimensionless spin of the primary black hole is tightly constrained to ≤0.07. Tests of general relativity reveal no measurable deviations from the theory, and its prediction of higher-multipole emission is confirmed at high confidence. We estimate a merger rate density of 1–23 Gpc⁻³ yr⁻¹ for the new class of binary coalescence sources that GW190814 represents. Astrophysical models predict that binaries with mass ratios similar to this event can form through several channels, but are unlikely to have formed in globular clusters. However, the combination of mass ratio, component masses, and the inferred merger rate for this event challenges all current models of the formation and mass distribution of compact-object binaries.
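A quick arithmetic check of the quoted mass ratio against GW190814's component masses (primary 22.2–24.3 M⊙, secondary 2.50–2.67 M⊙ at the 90% credible level; midpoints of the ranges are used here, so the result differs slightly from the paper's median value of 0.112):

```python
# Consistency check using range midpoints (a rough sketch, not the paper's
# posterior medians, which give q = 0.112).
m1 = (22.2 + 24.3) / 2   # primary black hole mass, solar masses
m2 = (2.50 + 2.67) / 2   # secondary compact object mass, solar masses
q = m2 / m1
assert abs(q - 0.112) < 0.005
print(round(q, 3))        # → 0.111
```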

Journal ArticleDOI
TL;DR: In this article, single atoms of palladium and platinum supported on graphitic carbon nitride (g-C3N4) were investigated by density functional theory calculations for the first time.
Abstract: Reducing carbon dioxide to hydrocarbon fuel with solar energy is significant for high-density solar energy storage and carbon balance. In this work, single atoms of palladium and platinum supported on graphitic carbon nitride (g-C3N4), i.e., Pd/g-C3N4 and Pt/g-C3N4, respectively, acting as photocatalysts for CO2 reduction were investigated by density functional theory calculations for the first time. During CO2 reduction, the individual metal atoms function as the active sites, while g-C3N4 provides the source of hydrogen (H*) from the hydrogen evolution reaction. The complete, as-designed photocatalysts exhibit excellent activity in CO2 reduction. HCOOH is the preferred product of CO2 reduction on the Pd/g-C3N4 catalyst with a rate-determining barrier of 0.66 eV, while the Pt/g-C3N4 catalyst prefers to reduce CO2 to CH4 with a rate-determining barrier of 1.16 eV. In addition, deposition of atom catalysts on g-C3N4 significantly enhances the visible-light absorption, rendering them ideal for visible-light reduction of CO2. Our findings open a new avenue of CO2 reduction for renewable energy supply.

Journal ArticleDOI
13 Dec 2019-Science
TL;DR: The first integrated global-scale intergovernmental assessment of the status, trends, and future of the links between people and nature provides an unprecedented picture of the extent of our mutual dependence, the breadth and depth of the ongoing and impending crisis, and the interconnectedness among sectors and regions.
Abstract: The human impact on life on Earth has increased sharply since the 1970s, driven by the demands of a growing population with rising average per capita income. Nature is currently supplying more materials than ever before, but this has come at the high cost of unprecedented global declines in the extent and integrity of ecosystems, distinctness of local ecological communities, abundance and number of wild species, and the number of local domesticated varieties. Such changes reduce vital benefits that people receive from nature and threaten the quality of life of future generations. Both the benefits of an expanding economy and the costs of reducing nature's benefits are unequally distributed. The fabric of life on which we all depend-nature and its contributions to people-is unravelling rapidly. Despite the severity of the threats and lack of enough progress in tackling them to date, opportunities exist to change future trajectories through transformative action. Such action must begin immediately, however, and address the root economic, social, and technological causes of nature's deterioration.

Journal ArticleDOI
TL;DR: In patients with diabetes and recent worsening heart failure, sotagliflozin therapy, initiated before or shortly after discharge, resulted in a significantly lower total number of deaths from cardiovascular causes and hospitalizations and urgent visits for heart failure than placebo.
Abstract: Background Sodium–glucose cotransporter 2 (SGLT2) inhibitors reduce the risk of hospitalization for heart failure or death from cardiovascular causes among patients with stable heart failu...

Proceedings Article
12 Jul 2020
TL;DR: This work obtains tight convergence rates for FedAvg and proves that it suffers from 'client drift' when the data is heterogeneous (non-iid), resulting in unstable and slow convergence, and proposes a new algorithm (SCAFFOLD) which uses control variates (variance reduction) to correct for the 'client drift' in its local updates.
Abstract: Federated Averaging (FedAvg) has emerged as the algorithm of choice for federated learning due to its simplicity and low communication cost. However, in spite of recent research efforts, its performance is not fully understood. We obtain tight convergence rates for FedAvg and prove that it suffers from 'client drift' when the data is heterogeneous (non-iid), resulting in unstable and slow convergence. As a solution, we propose a new algorithm (SCAFFOLD) which uses control variates (variance reduction) to correct for the 'client drift' in its local updates. We prove that SCAFFOLD requires significantly fewer communication rounds and is not affected by data heterogeneity or client sampling. Further, we show that (for quadratics) SCAFFOLD can take advantage of similarity in the client's data yielding even faster convergence. The latter is the first result to quantify the usefulness of local steps in distributed optimization.
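The control-variate correction can be sketched on a toy problem: two clients with quadratic objectives f_i(x) = (x - a_i)²/2 and heterogeneous optima, one client sampled per round. Plain FedAvg drifts toward whichever client was sampled last, while SCAFFOLD's corrected local steps converge to the global optimum. A hypothetical sketch following the paper's "option II" control-variate update, not the authors' implementation:

```python
# Two clients with optima a_i; the global optimum is their mean (5.0).
# One client participates per round, which is where client drift shows up.
A = [0.0, 10.0]
K, LR, ROUNDS = 10, 0.1, 200   # local steps, local step size, rounds

def fedavg():
    x = 0.0
    for t in range(ROUNDS):
        a = A[t % 2]                 # "sample" one client (alternating)
        y = x
        for _ in range(K):
            y -= LR * (y - a)        # plain local steps: drift toward a
        x = y                        # server adopts the client's iterate
    return x

def scaffold():
    x, c, ci = 0.0, 0.0, [0.0, 0.0]  # server state, server and client controls
    for t in range(ROUNDS):
        i = t % 2
        y = x
        for _ in range(K):
            # corrected local gradient: g_i(y) - c_i + c
            y -= LR * ((y - A[i]) - ci[i] + c)
        ci_new = ci[i] - c + (x - y) / (K * LR)   # "option II" update
        c += (ci_new - ci[i]) / len(A)
        ci[i] = ci_new
        x = y
    return x

print(round(fedavg(), 2), round(scaffold(), 2))  # FedAvg far from 5, SCAFFOLD near 5
```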

Posted Content
TL;DR: This article proposed an adaptive attention model with a visual sentinel to decide whether to attend to the image and where, in order to extract meaningful information for sequential word generation, which set the new state-of-the-art by a significant margin.
Abstract: Attention-based neural encoder-decoder frameworks have been widely adopted for image captioning. Most methods force visual attention to be active for every generated word. However, the decoder likely requires little to no visual information from the image to predict non-visual words such as "the" and "of". Other words that may seem visual can often be predicted reliably just from the language model e.g., "sign" after "behind a red stop" or "phone" following "talking on a cell". In this paper, we propose a novel adaptive attention model with a visual sentinel. At each time step, our model decides whether to attend to the image (and if so, to which regions) or to the visual sentinel, in order to extract meaningful information for sequential word generation. We test our method on the COCO image captioning 2015 challenge dataset and Flickr30K. Our approach sets the new state-of-the-art by a significant margin.
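The sentinel mechanism described above can be sketched as an attention softmax with one extra slot: the context fed to the word predictor blends attended region features with a "sentinel" vector from the language model, and the sentinel's weight is the probability of not looking at the image. All vectors and logits below are made-up numbers, not learned values:

```python
import math

def softmax(zs):
    m = max(zs)
    es = [math.exp(z - m) for z in zs]
    s = sum(es)
    return [e / s for e in es]

def adaptive_context(region_feats, region_logits, sentinel, sentinel_logit):
    """Blend region features and the sentinel via one extended softmax.

    Returns the mixed context vector and beta, the sentinel's attention
    weight (probability of relying on the language model, not the image).
    """
    weights = softmax(region_logits + [sentinel_logit])
    beta = weights[-1]
    ctx = [beta * s_d for s_d in sentinel]
    for w, feat in zip(weights[:-1], region_feats):
        for d, f_d in enumerate(feat):
            ctx[d] += w * f_d
    return ctx, beta

regions = [[1.0, 0.0], [0.0, 1.0]]
# A non-visual word ("the") gives the sentinel a large logit -> beta near 1;
# a visual word gives the regions large logits -> beta near 0.
_, beta_nonvisual = adaptive_context(regions, [0.1, 0.2], [0.5, 0.5], 5.0)
_, beta_visual = adaptive_context(regions, [3.0, 0.2], [0.5, 0.5], -2.0)
assert beta_nonvisual > 0.9 and beta_visual < 0.1
```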

Journal ArticleDOI
TL;DR: Leisure-time physical activity was associated with lower risks of many cancer types, and most of these associations were evident regardless of body size or smoking history, supporting broad generalizability of findings.
Abstract: Importance Leisure-time physical activity has been associated with lower risk of heart disease and all-cause mortality, but its association with risk of cancer is not well understood. Objective To determine the association of leisure-time physical activity with incidence of common types of cancer and whether associations vary by body size and/or smoking. Design, Setting, and Participants We pooled data from 12 prospective US and European cohorts with self-reported physical activity (baseline, 1987-2004). We used multivariable Cox regression to estimate hazard ratios (HRs) and 95% confidence intervals for associations of leisure-time physical activity with incidence of 26 types of cancer. Leisure-time physical activity levels were modeled as cohort-specific percentiles on a continuous basis and cohort-specific results were synthesized by random-effects meta-analysis. Hazard ratios for high vs low levels of activity are based on a comparison of risk at the 90th vs 10th percentiles of activity. The data analysis was performed from January 1, 2014, to June 1, 2015. Exposures Leisure-time physical activity of a moderate to vigorous intensity. Main Outcomes and Measures Incident cancer during follow-up. Results A total of 1.44 million participants (median [range] age, 59 [19-98] years; 57% female) and 186 932 cancers were included.
High vs low levels of leisure-time physical activity were associated with lower risks of 13 cancers: esophageal adenocarcinoma (HR, 0.58; 95% CI, 0.37-0.89), liver (HR, 0.73; 95% CI, 0.55-0.98), lung (HR, 0.74; 95% CI, 0.71-0.77), kidney (HR, 0.77; 95% CI, 0.70-0.85), gastric cardia (HR, 0.78; 95% CI, 0.64-0.95), endometrial (HR, 0.79; 95% CI, 0.68-0.92), myeloid leukemia (HR, 0.80; 95% CI, 0.70-0.92), myeloma (HR, 0.83; 95% CI, 0.72-0.95), colon (HR, 0.84; 95% CI, 0.77-0.91), head and neck (HR, 0.85; 95% CI, 0.78-0.93), rectal (HR, 0.87; 95% CI, 0.80-0.95), bladder (HR, 0.87; 95% CI, 0.82-0.92), and breast (HR, 0.90; 95% CI, 0.87-0.93). Body mass index adjustment modestly attenuated associations for several cancers, but 10 of 13 inverse associations remained statistically significant after this adjustment. Leisure-time physical activity was associated with higher risks of malignant melanoma (HR, 1.27; 95% CI, 1.16-1.40) and prostate cancer (HR, 1.05; 95% CI, 1.03-1.08). Associations were generally similar between overweight/obese and normal-weight individuals. Smoking status modified the association for lung cancer but not other smoking-related cancers. Conclusions and Relevance Leisure-time physical activity was associated with lower risks of many cancer types. Health care professionals counseling inactive adults should emphasize that most of these associations were evident regardless of body size or smoking history, supporting broad generalizability of findings.

Book ChapterDOI
08 Oct 2016
TL;DR: In this article, the authors proposed improved body part detectors that generate effective bottom-up proposals for body parts, image-conditioned pairwise terms that allow the proposals to be assembled into a variable number of consistent body part configurations, and an incremental optimization strategy that explores the search space more efficiently, leading both to better performance and significant speed-up factors.
Abstract: The goal of this paper is to advance the state-of-the-art of articulated pose estimation in scenes with multiple people. To that end we contribute on three fronts. We propose (1) improved body part detectors that generate effective bottom-up proposals for body parts; (2) novel image-conditioned pairwise terms that allow the proposals to be assembled into a variable number of consistent body part configurations; and (3) an incremental optimization strategy that explores the search space more efficiently, leading both to better performance and significant speed-up factors. Evaluation is done on two single-person and two multi-person pose estimation benchmarks. The proposed approach significantly outperforms the best known multi-person pose estimation results while demonstrating competitive performance on the task of single-person pose estimation (models and code available at http://pose.mpi-inf.mpg.de).

Journal ArticleDOI
TL;DR: The physical, chemical and biochemical characteristics of the existing photoacoustic contrast agents are critically reviewed, highlighting key applications and present challenges for molecular PAI.
Abstract: Photoacoustic imaging (PAI) is an emerging tool that bridges the traditional depth limits of ballistic optical imaging and the resolution limits of diffuse optical imaging. Using the acoustic waves generated in response to the absorption of pulsed laser light, it provides noninvasive images of absorbed optical energy density at depths of several centimeters with a resolution of ∼100 μm. This versatile and scalable imaging modality has now shown potential for molecular imaging, which enables visualization of biological processes with systemically introduced contrast agents. Understanding the relative merits of the vast range of contrast agents available, from small-molecule dyes to gold and carbon nanostructures to liposome encapsulations, is a considerable challenge. Here we critically review the physical, chemical and biochemical characteristics of the existing photoacoustic contrast agents, highlighting key applications and present challenges for molecular PAI.

Journal ArticleDOI
TL;DR: Besides satisfying Koch's postulates, this readily available hamster model is an important tool for studying transmission, pathogenesis, treatment, and vaccination against SARS-CoV-2.
Abstract: Background A physiological small-animal model that resembles COVID-19 with low mortality is lacking. Methods Molecular docking on the binding between angiotensin-converting enzyme 2 (ACE2) of common laboratory mammals and the receptor-binding domain of the surface spike protein of SARS-CoV-2 suggested that the golden Syrian hamster is an option. Virus challenge, contact transmission, and passive immunoprophylaxis studies were performed. Serial organ tissues and blood were harvested for histopathology, viral load and titer, chemokine/cytokine level, and neutralizing antibody titer. Results The Syrian hamster could be consistently infected by SARS-CoV-2. Maximal clinical signs of rapid breathing, weight loss, histopathological changes from the initial exudative phase of diffuse alveolar damage with extensive apoptosis to the later proliferative phase of tissue repair, airway and intestinal involvement with viral nucleocapsid protein expression, high lung viral load, and spleen and lymphoid atrophy associated with marked chemokine/cytokine activation were observed within the first week of virus challenge. The mean lung virus titer was between 10⁵ and 10⁷ TCID50/g. Challenged index hamsters consistently infected naive contact hamsters housed within the same cages, resulting in similar pathology but not weight loss. All infected hamsters recovered and developed mean serum neutralizing antibody titers ≥1:427 14 days postchallenge. Immunoprophylaxis with early convalescent serum achieved significant decrease in lung viral load but not in lung pathology. No consistent nonsynonymous adaptive mutation of the spike was found in viruses isolated from the infected hamsters. Conclusions Besides satisfying Koch's postulates, this readily available hamster model is an important tool for studying transmission, pathogenesis, treatment, and vaccination against SARS-CoV-2.

Proceedings ArticleDOI
07 Jun 2015
TL;DR: DeepID2+ as discussed by the authors improves the performance by increasing the dimension of hidden representations and adding supervision to early convolutional layers, achieving state-of-the-art performance on LFW and YouTube Faces benchmarks.
Abstract: This paper designs a high-performance deep convolutional network (DeepID2+) for face recognition. It is learned with the identification-verification supervisory signal. By increasing the dimension of hidden representations and adding supervision to early convolutional layers, DeepID2+ achieves new state-of-the-art on LFW and YouTube Faces benchmarks. Through empirical studies, we have discovered three properties of its deep neural activations critical for the high performance: sparsity, selectiveness and robustness. (1) It is observed that neural activations are moderately sparse. Moderate sparsity maximizes the discriminative power of the deep net as well as the distance between images. It is surprising that DeepID2+ still can achieve high recognition accuracy even after the neural responses are binarized. (2) Its neurons in higher layers are highly selective to identities and identity-related attributes. We can identify different subsets of neurons which are either constantly excited or inhibited when different identities or attributes are present. Although DeepID2+ is not taught to distinguish attributes during training, it has implicitly learned such high-level concepts. (3) It is much more robust to occlusions, although occlusion patterns are not included in the training set.

Journal ArticleDOI
TL;DR: There is currently very limited information on the nature and prevalence of post‐COVID‐19 symptoms after hospital discharge.
Abstract: BACKGROUND: There is currently very limited information on the nature and prevalence of post-COVID-19 symptoms after hospital discharge. METHODS: A purposive sample of 100 survivors discharged from a large university hospital were assessed 4 to 8 weeks after discharge by a multidisciplinary team of rehabilitation professionals using a specialist telephone screening tool designed to capture symptoms and their impact on daily life. The telephone version of the EQ-5D-5L was also completed. RESULTS: Participants were between 29 and 71 days (mean 48 days) postdischarge from hospital. Thirty-two participants required treatment in an intensive care unit (ICU group) and 68 were managed on hospital wards without needing ICU care (ward group). New illness-related fatigue was the most commonly reported symptom, reported by 72% of participants in the ICU group and 60.3% in the ward group. The next most common symptoms were breathlessness (65.6% in the ICU group and 42.6% in the ward group) and psychological distress (46.9% in the ICU group and 23.5% in the ward group). There was a clinically significant drop in EQ-5D-5L scores in 68.8% of the ICU group and 45.6% of the ward group. CONCLUSIONS: This is the first study from the United Kingdom reporting on postdischarge symptoms. We recommend planning rehabilitation services to manage these symptoms appropriately and maximize the functional return of COVID-19 survivors.

Journal ArticleDOI
TL;DR: The American Cancer Society estimates the number of new cancer cases and deaths in the United States and compiles the most recent data on population-based cancer occurrence and outcomes using incidence data collected by central cancer registries and mortality data collected by the National Center for Health Statistics, as discussed by the authors.
Abstract: Each year, the American Cancer Society estimates the numbers of new cancer cases and deaths in the United States and compiles the most recent data on population‐based cancer occurrence and outcomes using incidence data collected by central cancer registries and mortality data collected by the National Center for Health Statistics. In 2023, 1,958,310 new cancer cases and 609,820 cancer deaths are projected to occur in the United States. Cancer incidence increased for prostate cancer by 3% annually from 2014 through 2019 after two decades of decline, translating to an additional 99,000 new cases; otherwise, however, incidence trends were more favorable in men compared to women. For example, lung cancer in women decreased at one half the pace of men (1.1% vs. 2.6% annually) from 2015 through 2019, and breast and uterine corpus cancers continued to increase, as did liver cancer and melanoma, both of which stabilized in men aged 50 years and older and declined in younger men. However, a 65% drop in cervical cancer incidence during 2012 through 2019 among women in their early 20s, the first cohort to receive the human papillomavirus vaccine, foreshadows steep reductions in the burden of human papillomavirus‐associated cancers, the majority of which occur in women. Despite the pandemic, and in contrast with other leading causes of death, the cancer death rate continued to decline from 2019 to 2020 (by 1.5%), contributing to a 33% overall reduction since 1991 and an estimated 3.8 million deaths averted. This progress increasingly reflects advances in treatment, which are particularly evident in the rapid declines in mortality (approximately 2% annually during 2016 through 2020) for leukemia, melanoma, and kidney cancer, despite stable/increasing incidence, and accelerated declines for lung cancer. 
In summary, although cancer mortality rates continue to decline, future progress may be attenuated by rising incidence for breast, prostate, and uterine corpus cancers, which also happen to have the largest racial disparities in mortality.

Journal ArticleDOI
TL;DR: MendelianRandomization is a software package for the R open-source software environment that performs Mendelian randomization analyses using summarized data to implement the inverse-variance weighted, MR-Egger and weighted median methods for multiple genetic variants.
Abstract: MendelianRandomization is a software package for the R open-source software environment that performs Mendelian randomization analyses using summarized data. The core functionality is to implement the inverse-variance weighted, MR-Egger and weighted median methods for multiple genetic variants. Several options are available to the user, such as the use of robust regression, fixed- or random-effects models and the penalization of weights for genetic variants with heterogeneous causal estimates. Extensions to these methods, such as allowing for variants to be correlated, can be chosen if appropriate. Graphical commands allow summarized data to be displayed in an interactive graph, or the plotting of causal estimates from multiple methods, for comparison. Although the main method of data entry is directly by the user, there is also an option for allowing summarized data to be incorporated from the PhenoScanner database of genotype-phenotype associations. We hope to develop this feature in future versions of the package. The R software environment is available for download from [https://www.r-project.org/]. The MendelianRandomization package can be downloaded from the Comprehensive R Archive Network (CRAN) within R, or directly from [https://cran.r-project.org/web/packages/MendelianRandomization/]. Both R and the MendelianRandomization package are released under GNU General Public Licenses (GPL-2|GPL-3).
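The inverse-variance weighted (IVW) method named above combines per-variant Wald ratios (SNP-outcome association divided by SNP-exposure association), weighting each by its precision. As a minimal NumPy sketch of the standard fixed-effect IVW estimator — not the package's own R implementation, and with a function name of our choosing:

```python
import numpy as np

def ivw_estimate(beta_x, beta_y, se_y):
    """Fixed-effect inverse-variance weighted causal estimate.

    beta_x: per-variant SNP-exposure association estimates
    beta_y: per-variant SNP-outcome association estimates
    se_y:   standard errors of the SNP-outcome estimates
    """
    beta_x, beta_y, se_y = map(np.asarray, (beta_x, beta_y, se_y))
    ratio = beta_y / beta_x              # per-variant Wald ratio estimates
    w = beta_x**2 / se_y**2              # inverse-variance weights
    est = np.sum(w * ratio) / np.sum(w)  # weighted average across variants
    se = np.sqrt(1.0 / np.sum(w))        # fixed-effect standard error
    return est, se
```

The random-effects, robust-regression, and MR-Egger variants offered by the package modify the weighting or add an intercept term on top of this same ratio structure.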

Journal ArticleDOI
Hongzhou Lu1
TL;DR: Three general methods, which include existing broad-spectrum antiviral drugs using standard assays, screening of a chemical library containing many existing compounds or databases, and the redevelopment of new specific drugs based on the genome and biophysical understanding of individual coronaviruses, are used to discover the potential antiviral treatment of human pathogen coronavirus.
Abstract: As of January 22, 2020, a total of 571 cases of the 2019 novel coronavirus (2019-nCoV) have been reported in 25 provinces (districts and cities) in China. At present, there is no approved vaccine or antiviral treatment for human or animal coronaviruses, so identifying drug treatment options as soon as possible is critical for the response to the 2019-nCoV outbreak. Three general approaches are used to discover potential antiviral treatments for human-pathogenic coronaviruses: testing existing broad-spectrum antiviral drugs using standard assays; screening a chemical library containing many existing compounds or databases; and the development of new specific drugs based on the genome and biophysical understanding of individual coronaviruses. Lopinavir/ritonavir, nucleoside analogues, neuraminidase inhibitors, remdesivir, the peptide EK1, arbidol, RNA synthesis inhibitors (such as TDF and 3TC), anti-inflammatory drugs (such as hormones and other molecules), and traditional Chinese medicines, such as ShuFengJieDu Capsules and Lianhuaqingwen Capsules, could be drug treatment options for 2019-nCoV. However, the efficacy and safety of these drugs for 2019-nCoV still need to be confirmed by clinical trials.

Journal Article
TL;DR: Evidence considered by ACIP in recommending 9vHPV as one of three HPV vaccines that can be used for vaccination is summarized and recommendations for vaccine use are provided.
Abstract: During its February 2015 meeting, the Advisory Committee on Immunization Practices (ACIP) recommended 9-valent human papillomavirus (HPV) vaccine (9vHPV) (Gardasil 9, Merck and Co., Inc.) as one of three HPV vaccines that can be used for routine vaccination. HPV vaccine is recommended for routine vaccination at age 11 or 12 years. ACIP also recommends vaccination for females aged 13 through 26 years and males aged 13 through 21 years not vaccinated previously. Vaccination is also recommended through age 26 years for men who have sex with men and for immunocompromised persons (including those with HIV infection) if not vaccinated previously. 9vHPV is a noninfectious, virus-like particle (VLP) vaccine. Similar to quadrivalent HPV vaccine (4vHPV), 9vHPV contains HPV 6, 11, 16, and 18 VLPs. In addition, 9vHPV contains HPV 31, 33, 45, 52, and 58 VLPs. 9vHPV was approved by the Food and Drug Administration (FDA) on December 10, 2014, for use in females aged 9 through 26 years and males aged 9 through 15 years. For these recommendations, ACIP reviewed additional data on 9vHPV in males aged 16 through 26 years. 9vHPV and 4vHPV are licensed for use in females and males. Bivalent HPV vaccine (2vHPV), which contains HPV 16, 18 VLPs, is licensed for use in females. This report summarizes evidence considered by ACIP in recommending 9vHPV as one of three HPV vaccines that can be used for vaccination and provides recommendations for vaccine use.

Posted Content
TL;DR: A graph auto-encoder framework based on differentiable message passing on the bipartite interaction graph that shows competitive performance on standard collaborative filtering benchmarks and outperforms recent state-of-the-art methods.
Abstract: We consider matrix completion for recommender systems from the point of view of link prediction on graphs. Interaction data such as movie ratings can be represented by a bipartite user-item graph with labeled edges denoting observed ratings. Building on recent progress in deep learning on graph-structured data, we propose a graph auto-encoder framework based on differentiable message passing on the bipartite interaction graph. Our model shows competitive performance on standard collaborative filtering benchmarks. In settings where complementary feature information or structured data such as a social network is available, our framework outperforms recent state-of-the-art methods.

Journal ArticleDOI
TL;DR: The principal objective of this review is to summarize the present knowledge on the use, advances, advantages and weaknesses of a large number of experimental techniques that are available for the characterization of nanoparticles.
Abstract: Nanostructures have attracted huge interest as a rapidly growing class of materials for many applications. Several techniques have been used to characterize the size, crystal structure, elemental composition and a variety of other physical properties of nanoparticles. In several cases, there are physical properties that can be evaluated by more than one technique. Different strengths and limitations of each technique complicate the choice of the most suitable method, and often a combinatorial characterization approach is needed. In addition, given that the significance of nanoparticles in basic research and applications is constantly increasing, it is necessary that researchers from separate fields overcome the challenges in the reproducible and reliable characterization of nanomaterials after their synthesis and further processing (e.g. annealing) stages. The principal objective of this review is to summarize the present knowledge on the use, advances, advantages and weaknesses of a large number of experimental techniques that are available for the characterization of nanoparticles. The characterization techniques are classified according to the concept/group of the technique used, the information they can provide, or the materials for which they are intended. We describe the main characteristics of the techniques and their operating principles and we give various examples of their use, presenting them in a comparative mode, when possible, in relation to the property studied in each case.