
Journal ArticleDOI
TL;DR: A low diagnostic threshold is suggested in acutely ill patients, in patients with predisposing factors, and in pregnant women with unexplained persistent nausea, fatigue, and hypotension; a short corticotropin test is recommended as the "gold standard" diagnostic tool to establish the diagnosis.
Abstract: Objective: This clinical practice guideline addresses the diagnosis and treatment of primary adrenal insufficiency. Participants: The Task Force included a chair, selected by The Clinical Guidelines Subcommittee of the Endocrine Society, eight additional clinicians experienced with the disease, a methodologist, and a medical writer. The co-sponsoring associations (European Society of Endocrinology and the American Association for Clinical Chemistry) had participating members. The Task Force received no corporate funding or remuneration in connection with this review. Evidence: This evidence-based guideline was developed using the Grading of Recommendations, Assessment, Development, and Evaluation (GRADE) system to determine the strength of recommendations and the quality of evidence. Consensus Process: The evidence used to formulate recommendations was derived from two commissioned systematic reviews as well as other published systematic reviews and studies identified by the Task Force. The guideline was ...

1,015 citations


Journal ArticleDOI
15 Mar 2016-Heart
TL;DR: The findings suggest that deficiencies in social relationships are associated with an increased risk of developing CHD and stroke in high-income countries.
Abstract: Background The influence of social relationships on morbidity is widely accepted, but the size of the risk to cardiovascular health is unclear. Objective We undertook a systematic review and meta-analysis to investigate the association between loneliness or social isolation and incident coronary heart disease (CHD) and stroke. Methods Sixteen electronic databases were systematically searched for longitudinal studies set in high-income countries and published up until May 2015. Two independent reviewers screened studies for inclusion and extracted data. We assessed quality using a component approach and pooled data for analysis using random effects models. Results Of the 35 925 records retrieved, 23 papers met inclusion criteria for the narrative review. They reported data from 16 longitudinal datasets, for a total of 4628 CHD and 3002 stroke events recorded over follow-up periods ranging from 3 to 21 years. Reports of 11 CHD studies and 8 stroke studies provided data suitable for meta-analysis. Poor social relationships were associated with a 29% increase in risk of incident CHD (pooled relative risk: 1.29, 95% CI 1.04 to 1.59) and a 32% increase in risk of stroke (pooled relative risk: 1.32, 95% CI 1.04 to 1.68). Subgroup analyses did not identify any differences by gender. Conclusions Our findings suggest that deficiencies in social relationships are associated with an increased risk of developing CHD and stroke. Future studies are needed to investigate whether interventions targeting loneliness and social isolation can help to prevent two of the leading causes of death and disability in high-income countries. Study registration number CRD42014010225.
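The pooled relative risks above come from random-effects models. As a minimal sketch of how such pooling works, the Python code below implements DerSimonian-Laird random-effects pooling of log relative risks; the study-level values are hypothetical, not the data from this review.

```python
import numpy as np

def pool_random_effects(rr, ci_low, ci_high):
    """DerSimonian-Laird random-effects pooling of relative risks.

    rr, ci_low, ci_high: per-study relative risk and 95% CI bounds.
    Returns the pooled RR with its 95% CI.
    """
    y = np.log(rr)                                               # log relative risks
    v = ((np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)) ** 2   # variances from CI width
    w = 1.0 / v                                                  # fixed-effect weights
    y_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fe) ** 2)                              # Cochran's Q
    k = len(y)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (v + tau2)                                      # random-effects weights
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se = 1.0 / np.sqrt(np.sum(w_re))
    return np.exp(y_re), np.exp(y_re - 1.96 * se), np.exp(y_re + 1.96 * se)

# Hypothetical study-level estimates (illustrative only):
rr, lo, hi = pool_random_effects(np.array([1.5, 1.2, 1.1]),
                                 np.array([1.1, 0.9, 0.8]),
                                 np.array([2.0, 1.6, 1.5]))
print(f"pooled RR {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```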

1,015 citations


Proceedings ArticleDOI
14 Jun 2017
TL;DR: In this paper, the authors propose a novel deep neural network architecture that learns without any significant increase in the number of parameters, achieving state-of-the-art performance on CamVid and comparable results on the Cityscapes dataset.
Abstract: Pixel-wise semantic segmentation for visual scene understanding not only needs to be accurate, but also efficient in order to find any use in real-time applications. Existing algorithms, even though accurate, do not focus on utilizing the parameters of neural networks efficiently. As a result, they are huge in terms of parameters and number of operations, and hence slow too. In this paper, we propose a novel deep neural network architecture that learns without any significant increase in the number of parameters. Our network uses only 11.5 million parameters and 21.2 GFLOPs for processing an image of resolution 3 × 640 × 360. It gives state-of-the-art performance on CamVid and comparable results on the Cityscapes dataset. We also compare our network's processing time on an NVIDIA GPU and an embedded system device with existing state-of-the-art architectures for different image resolutions.

1,015 citations


Proceedings ArticleDOI
18 May 2015
TL;DR: Empirically, AutoRec's compact and efficiently trainable model outperforms state-of-the-art CF techniques (biased matrix factorization, RBM-CF and LLORMA) on the Movielens and Netflix datasets.
Abstract: This paper proposes AutoRec, a novel autoencoder framework for collaborative filtering (CF). Empirically, AutoRec's compact and efficiently trainable model outperforms state-of-the-art CF techniques (biased matrix factorization, RBM-CF and LLORMA) on the Movielens and Netflix datasets.
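For orientation, below is a minimal PyTorch sketch of an item-based autoencoder in the spirit of AutoRec: each training example is an item's partially observed rating vector over all users, and the reconstruction loss counts observed entries only. The hidden size, optimizer settings, and toy data are illustrative assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn

class AutoRec(nn.Module):
    """Item-based autoencoder for collaborative filtering (I-AutoRec-style sketch)."""
    def __init__(self, num_users, hidden=500):
        super().__init__()
        self.encoder = nn.Linear(num_users, hidden)
        self.decoder = nn.Linear(hidden, num_users)

    def forward(self, r):
        # Reconstruct an item's rating vector from its hidden code
        return self.decoder(torch.sigmoid(self.encoder(r)))

def masked_mse(pred, ratings, mask):
    # Reconstruction error on observed ratings only
    return ((pred - ratings) ** 2 * mask).sum() / mask.sum()

# Toy usage with random data (illustrative shapes only):
num_users, num_items = 100, 40
ratings = torch.randint(0, 6, (num_items, num_users)).float()  # 0 = unobserved
mask = (ratings > 0).float()
model = AutoRec(num_users)
opt = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
for _ in range(10):
    opt.zero_grad()
    loss = masked_mse(model(ratings), ratings, mask)
    loss.backward()
    opt.step()
```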

1,015 citations


Journal ArticleDOI
TL;DR: Authors/Task Force Members: Michele Brignole* (Chairperson), Angel Moya* (Co-chairperson) (Spain), Frederik J. de Lange (The Netherlands), Jean-Claude Deharo (France), Perry M. Elliott (UK), Alessandra Fanciulli (Austria), Artur Fedorowski (Sweden), Raffaello Furlan (Italy), Rose Anne Kenny (Ireland), Alfonso Martín (Spain) ...

1,015 citations


Journal ArticleDOI
TL;DR: The potential consequences for health inequalities of the lockdown measures implemented internationally as a response to the COVID-19 pandemic are explored, focusing on the likely unequal impacts of the economic crisis.
Abstract: This essay examines the implications of the COVID-19 pandemic for health inequalities. It outlines historical and contemporary evidence of inequalities in pandemics, drawing on international research into the Spanish influenza pandemic of 1918, the H1N1 outbreak of 2009 and the emerging international estimates of socio-economic, ethnic and geographical inequalities in COVID-19 infection and mortality rates. It then examines how these inequalities in COVID-19 are related to existing inequalities in chronic diseases and the social determinants of health, arguing that we are experiencing a syndemic pandemic. It then explores the potential consequences for health inequalities of the lockdown measures implemented internationally as a response to the COVID-19 pandemic, focusing on the likely unequal impacts of the economic crisis. The essay concludes by reflecting on the longer-term public health policy responses needed to ensure that the COVID-19 pandemic does not increase health inequalities for future generations.

1,015 citations


Journal ArticleDOI
TL;DR: An insight into the analogies, state-of-the-art technologies, concepts, and prospects under the umbrella of perovskite materials (both inorganic-organic hybrid halide perovskites and ferroelectric perovskites) for future multifunctional energy conversion and storage devices is provided.
Abstract: An insight into the analogies, state-of-the-art technologies, concepts, and prospects under the umbrella of perovskite materials (both inorganic-organic hybrid halide perovskites and ferroelectric perovskites) for future multifunctional energy conversion and storage devices is provided. Often, these are considered entirely different branches of research; however, considering them simultaneously and holistically can provide several new opportunities. Recent advancements have highlighted the potential of hybrid perovskites for high-efficiency solar cells. The intrinsic polar properties of these materials, including the potential for ferroelectricity, provide additional possibilities for simultaneously exploiting several energy conversion mechanisms such as the piezoelectric, pyroelectric, and thermoelectric effect and electrical energy storage. The presence of these phenomena can support the performance of perovskite solar cells. The energy conversion using these effects (piezo-, pyro-, and thermoelectric effect) can also be enhanced by a change in the light intensity. Thus, there lies a range of possibilities for tuning the structural, electronic, optical, and magnetic properties of perovskites to simultaneously harvest energy using more than one mechanism to realize an improved efficiency. This requires a basic understanding of concepts, mechanisms, corresponding material properties, and the underlying physics involved with these effects.
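For reference, the harvesting effects named above are governed, to first order, by the standard textbook constitutive relations (not specific to this article):

```latex
Q = d\,F \quad \text{(piezoelectric)}, \qquad
I_{p} = p\,A\,\frac{dT}{dt} \quad \text{(pyroelectric)}, \qquad
V = S\,\Delta T \quad \text{(thermoelectric/Seebeck)}
```

where d is the piezoelectric charge coefficient, F the applied force, p the pyroelectric coefficient, A the electrode area, and S the Seebeck coefficient; the light-intensity dependence discussed above enters through the temperature terms.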

1,015 citations


Journal ArticleDOI
11 May 2017-PLOS ONE
TL;DR: The ability to create and deploy reproducible environments across these centers, a previously unmet need, makes Singularity a game changing development for computational science.
Abstract: Here we present Singularity, software developed to bring containers and reproducibility to scientific computing. Using Singularity containers, developers can work in reproducible environments of their choosing and design, and these complete environments can easily be copied and executed on other platforms. Singularity is an open source initiative that harnesses the expertise of system and software engineers and researchers alike, and integrates seamlessly into common workflows for both of these groups. As its primary use case, Singularity brings mobility of computing to both users and HPC centers, providing a secure means to capture and distribute software and compute environments. This ability to create and deploy reproducible environments across these centers, a previously unmet need, makes Singularity a game changing development for computational science.

1,015 citations


Journal ArticleDOI
TL;DR: NNPDF3.1, as discussed by the authors, updates NNPDF3.0, the first global set of PDFs determined using a methodology validated by a closure test. The update is motivated by recent progress in methodology and available data; on the methodological side, the charm PDF is parametrized and determined alongside the light-quark and gluon ones, increasing the number of independent PDFs from seven to eight.
Abstract: We present a new set of parton distributions, NNPDF3.1, which updates NNPDF3.0, the first global set of PDFs determined using a methodology validated by a closure test. The update is motivated by recent progress in methodology and available data, and involves both. On the methodological side, we now parametrize and determine the charm PDF alongside the light-quark and gluon ones, thereby increasing from seven to eight the number of independent PDFs. On the data side, we now include the D0 electron and muon W asymmetries from the final Tevatron dataset, the complete LHCb measurements of W and Z production in the forward region at 7 and 8 TeV, and new ATLAS and CMS measurements of inclusive jet and electroweak boson production. We also include for the first time top-quark pair differential distributions and the transverse momentum of the Z bosons from ATLAS and CMS. We investigate the impact of parametrizing charm and provide evidence that the accuracy and stability of the PDFs are thereby improved. We study the impact of the new data by producing a variety of determinations based on reduced datasets. We find that both improvements have a significant impact on the PDFs, with some substantial reductions in uncertainties, but with the new PDFs generally in agreement with the previous set at the one-sigma level. The most significant changes are seen in the light-quark flavor separation, and in increased precision in the determination of the gluon. We explore the implications of NNPDF3.1 for LHC phenomenology at Run II, compare with recent LHC measurements at 13 TeV, provide updated predictions for Higgs production cross-sections and discuss the strangeness and charm content of the proton in light of our improved dataset and methodology. The NNPDF3.1 PDFs are delivered for the first time both as Hessian sets, and as optimized Monte Carlo sets with a compressed number of replicas.
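Since the set is delivered both as Hessian and as Monte Carlo replica ensembles, a common workflow with the MC sets is to take the central value and 1-sigma PDF uncertainty as the mean and standard deviation over replicas. A minimal sketch with toy arrays (real grids would be read through a library such as LHAPDF):

```python
import numpy as np

# Toy stand-in for a PDF evaluated on an x-grid, one row per MC replica
rng = np.random.default_rng(0)
replicas = rng.normal(loc=0.5, scale=0.05, size=(100, 50))  # 100 replicas x 50 x-points

central = replicas.mean(axis=0)        # MC central value: replica mean
sigma = replicas.std(axis=0, ddof=1)   # 1-sigma PDF uncertainty: replica spread
print(central[:3], sigma[:3])
```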

1,014 citations


Journal ArticleDOI
TL;DR: Novel concepts and paradigms are described here that have emerged, targeting superior TE materials and higher TE performance, including band convergence, "phonon-glass electron-crystal", multiscale phonon scattering, resonant states, anharmonicity, etc.
Abstract: The past two decades have witnessed the rapid growth of thermoelectric (TE) research. Novel concepts and paradigms are described here that have emerged, targeting superior TE materials and higher TE performance. These superior aspects include band convergence, "phonon-glass electron-crystal", multiscale phonon scattering, resonant states, anharmonicity, etc. Based on these concepts, some new TE materials with distinct features have been identified, including solids with high band degeneracy, with cages in which atoms rattle, with nanostructures at various length scales, etc. In addition, the performance of classical materials has been improved remarkably. However, the figure of merit zT of most TE materials is still lower than 2.0, generally around 1.0, due to interrelated TE properties. In order to realize an "overall zT > 2.0," it is imperative that the interrelated properties are decoupled more thoroughly, or new degrees of freedom are added to the overall optimization problem. The electrical and thermal transport must be synergistically optimized. Here, a detailed discussion about the commonly adopted strategies to optimize individual TE properties is presented. Then, four main compromises between the TE properties are elaborated from the point of view of the underlying mechanisms and decoupling strategies. Finally, some representative systems of synergistic optimization are also presented, which can serve as references for other TE materials. In conclusion, some of the newest ideas for the future are discussed.
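For reference, the dimensionless figure of merit discussed throughout is defined by the standard relation

```latex
zT = \frac{S^{2}\sigma}{\kappa_{e} + \kappa_{L}}\,T
```

where S is the Seebeck coefficient, σ the electrical conductivity, κ_e and κ_L the electronic and lattice thermal conductivities, and T the absolute temperature. S, σ, and κ_e all depend on carrier concentration, which is precisely the interrelation that the decoupling strategies above target.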

1,014 citations


Journal ArticleDOI
02 Nov 2017-Nature
TL;DR: A genome-wide association study of breast cancer in 122,977 cases and 105,974 controls of European ancestry and 14,068 cases and 13,104 controls of East Asian ancestry finds that heritability of breast cancer due to all single-nucleotide polymorphisms in regulatory features was 2–5-fold enriched relative to the genome-wide average.
Abstract: Breast cancer risk is influenced by rare coding variants in susceptibility genes, such as BRCA1, and many common, mostly non-coding variants. However, much of the genetic contribution to breast cancer risk remains unknown. Here we report the results of a genome-wide association study of breast cancer in 122,977 cases and 105,974 controls of European ancestry and 14,068 cases and 13,104 controls of East Asian ancestry. We identified 65 new loci that are associated with overall breast cancer risk at P < 5 × 10^-8. The majority of credible risk single-nucleotide polymorphisms in these loci fall in distal regulatory elements, and by integrating in silico data to predict target genes in breast cells at each locus, we demonstrate a strong overlap between candidate target genes and somatic driver genes in breast tumours. We also find that heritability of breast cancer due to all single-nucleotide polymorphisms in regulatory features was 2–5-fold enriched relative to the genome-wide average, with strong enrichment for particular transcription factor binding sites. These results provide further insight into genetic susceptibility to breast cancer and will improve the use of genetic risk scores for individualized screening and prevention.

Journal ArticleDOI
19 Aug 2016-Science
TL;DR: Microscopy of an evolving quantum system indicates that the full quantum state remains pure, whereas thermalization occurs on a local scale; entanglement creates local entropy that validates the use of statistical physics for local observables.
Abstract: Statistical mechanics relies on the maximization of entropy in a system at thermal equilibrium. However, an isolated quantum many-body system initialized in a pure state remains pure during Schrödinger evolution, and in this sense it has static, zero entropy. We experimentally studied the emergence of statistical mechanics in a quantum state and observed the fundamental role of quantum entanglement in facilitating this emergence. Microscopy of an evolving quantum system indicates that the full quantum state remains pure, whereas thermalization occurs on a local scale. We directly measured entanglement entropy, which assumes the role of the thermal entropy in thermalization. The entanglement creates local entropy that validates the use of statistical physics for local observables. Our measurements are consistent with the eigenstate thermalization hypothesis.
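The entanglement entropy measured here is, in the standard definition, the von Neumann entropy of the reduced density matrix of a subsystem A:

```latex
S_A = -\operatorname{Tr}\!\left(\rho_A \ln \rho_A\right),
\qquad \rho_A = \operatorname{Tr}_B\,|\psi\rangle\langle\psi|
```

so the global state |ψ⟩ can stay pure (zero total entropy) while S_A grows as A entangles with the rest of the system B.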

Journal ArticleDOI
TL;DR: In this article, the authors report findings in five patients who presented with venous thrombosis and thrombocytopenia 7 to 10 days after receiving the first dose of the ChAdOx1 nCoV-19 adenoviral vector vaccine against coronavirus disease 2019 (Covid-19).
Abstract: We report findings in five patients who presented with venous thrombosis and thrombocytopenia 7 to 10 days after receiving the first dose of the ChAdOx1 nCoV-19 adenoviral vector vaccine against coronavirus disease 2019 (Covid-19). The patients were health care workers who were 32 to 54 years of age. All the patients had high levels of antibodies to platelet factor 4-polyanion complexes; however, they had had no previous exposure to heparin. Because the five cases occurred in a population of more than 130,000 vaccinated persons, we propose that they represent a rare vaccine-related variant of spontaneous heparin-induced thrombocytopenia that we refer to as vaccine-induced immune thrombotic thrombocytopenia.

Posted Content
TL;DR: DeepLIFT as mentioned in this paper decomposes the output prediction of a neural network on a specific input by backpropagating the contributions of all neurons in the network to every feature of the input.
Abstract: The purported "black box" nature of neural networks is a barrier to adoption in applications where interpretability is essential. Here we present DeepLIFT (Deep Learning Important FeaTures), a method for decomposing the output prediction of a neural network on a specific input by backpropagating the contributions of all neurons in the network to every feature of the input. DeepLIFT compares the activation of each neuron to its 'reference activation' and assigns contribution scores according to the difference. By optionally giving separate consideration to positive and negative contributions, DeepLIFT can also reveal dependencies which are missed by other approaches. Scores can be computed efficiently in a single backward pass. We apply DeepLIFT to models trained on MNIST and simulated genomic data, and show significant advantages over gradient-based methods. Video tutorial: this http URL, ICML slides: this http URL, ICML talk: this https URL, code: this http URL.
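Below is a minimal NumPy sketch of the core idea for a single dense + ReLU layer, using DeepLIFT's "rescale" rule relative to a reference input. The released implementation handles full networks and the optional separate positive/negative contributions, which are omitted here; all names and shapes are illustrative.

```python
import numpy as np

def deeplift_rescale(x, x_ref, W, b, v):
    """Contribution scores for a one-layer ReLU network s(x) = v . relu(W x + b),
    using DeepLIFT's rescale rule relative to a reference input x_ref."""
    pre, pre_ref = W @ x + b, W @ x_ref + b
    d_pre = pre - pre_ref
    d_act = np.maximum(pre, 0) - np.maximum(pre_ref, 0)
    # Rescale rule: multiplier = delta-activation / delta-preactivation
    m = np.divide(d_act, d_pre, out=np.zeros_like(d_pre), where=np.abs(d_pre) > 1e-10)
    # Chain back to inputs: contribution_i = sum_j v_j * m_j * W[j, i] * (x_i - x_ref_i)
    return (v * m) @ W * (x - x_ref)

rng = np.random.default_rng(0)
W, b, v = rng.normal(size=(4, 3)), rng.normal(size=4), rng.normal(size=4)
x, x_ref = rng.normal(size=3), np.zeros(3)
contrib = deeplift_rescale(x, x_ref, W, b, v)
s = lambda z: v @ np.maximum(W @ z + b, 0)
# Summation-to-delta: contributions add up to s(x) - s(x_ref)
assert np.isclose(contrib.sum(), s(x) - s(x_ref))
```

Note the summation-to-delta check at the end: the scores decompose the change in output relative to the reference, which is the property that distinguishes this family of methods from raw gradients.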

Proceedings Article
24 May 2019
TL;DR: The authors show that the unsupervised learning of disentangled representations is fundamentally impossible without inductive biases on both the models and the data, and suggest that future work on disentanglement learning should be explicit about the role of inductive bias and (implicit) supervision.
Abstract: The key idea behind the unsupervised learning of disentangled representations is that real-world data is generated by a few explanatory factors of variation which can be recovered by unsupervised learning algorithms. In this paper, we provide a sober look at recent progress in the field and challenge some common assumptions. We first theoretically show that the unsupervised learning of disentangled representations is fundamentally impossible without inductive biases on both the models and the data. Then, we train more than 12000 models covering most prominent methods and evaluation metrics in a reproducible large-scale experimental study on seven different data sets. We observe that while the different methods successfully enforce properties "encouraged" by the corresponding losses, well-disentangled models seemingly cannot be identified without supervision. Furthermore, increased disentanglement does not seem to lead to a decreased sample complexity of learning for downstream tasks. Our results suggest that future work on disentanglement learning should be explicit about the role of inductive biases and (implicit) supervision, investigate concrete benefits of enforcing disentanglement of the learned representations, and consider a reproducible experimental setup covering several data sets.

Journal ArticleDOI
TL;DR: The performance of a supercapacitor can be characterized by a series of key parameters, including the cell capacitance, operating voltage, equivalent series resistance, power density, energy density, and time constant.
Abstract: The performance of a supercapacitor can be characterized by a series of key parameters, including the cell capacitance, operating voltage, equivalent series resistance, power density, energy density, and time constant. To accurately measure these parameters, a variety of methods have been proposed and are used in academia and industry. As a result, some confusion has been caused due to the inconsistencies between different evaluation methods and practices. Such confusion hinders effective communication of new research findings, and creates a hurdle in transferring novel supercapacitor technologies from research labs to commercial applications. Based on public sources, this article is an attempt to inventory, critique and hopefully streamline the commonly used instruments, key performance metrics, calculation methods, and major affecting factors for supercapacitor performance evaluation. Thereafter the primary sources of inconsistencies are identified and possible solutions are suggested, with emphasis on device performance vs. material properties and the rate dependency of supercapacitors. We hope, by using reliable, intrinsic, and comparable parameters produced, the existing inconsistencies and confusion can be largely eliminated so as to facilitate further progress in the field.
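For orientation, in the simple RC picture with a constant-current (galvanostatic) discharge, the key parameters named above are related by the standard expressions

```latex
C = \frac{I\,\Delta t}{\Delta V}, \qquad
E = \tfrac{1}{2}\,C V^{2}, \qquad
P_{\max} = \frac{V^{2}}{4\,R_{\mathrm{ESR}}}, \qquad
\tau = R\,C
```

where C is the cell capacitance, V the operating voltage, R_ESR the equivalent series resistance, and τ the time constant. Much of the inconsistency the article discusses traces to how these quantities are measured and normalized, per device or per unit of active material, and at what rate.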

Journal ArticleDOI
01 Dec 2016-BMJ Open
TL;DR: The AXIS tool was developed so that it can be used across disciplines to aid the inclusion of CSSs in systematic reviews, guidelines and clinical decision-making.
Abstract: Objectives: The aim of this study was to develop a critical appraisal (CA) tool that addressed study design and reporting quality as well as the risk of bias in cross-sectional studies (CSSs). In addition, the aim was to produce a help document to guide the non-expert user through the tool. Design: An initial scoping review of the published literature and key epidemiological texts was undertaken prior to the formation of a Delphi panel to establish key components for a CA tool for CSSs. A consensus of 80% was required from the Delphi panel for any component to be included in the final tool. Results: An initial list of 39 components was identified through examination of existing resources. An international Delphi panel of 18 medical and veterinary experts was established. After 3 rounds of the Delphi process, the Appraisal tool for Cross-Sectional Studies (AXIS tool) was developed by consensus and consisted of 20 components. A detailed explanatory document was also developed with the tool, giving expanded explanation of each question and providing simple interpretations and examples of the epidemiological concepts being examined in each question to aid non-expert users. Conclusions: CA of the literature is a vital step in evidence synthesis and therefore evidence-based decision-making in a number of different disciplines. The AXIS tool is therefore unique and was developed in a way that it can be used across disciplines to aid the inclusion of CSSs in systematic reviews, guidelines and clinical decision-making.

Journal ArticleDOI
TL;DR: Global evidence linking sleep disturbance, sleep duration, and inflammation in adult humans is assessed and sleep disturbance and long sleep duration are associated with increases in markers of systemic inflammation.

Journal ArticleDOI
TL;DR: The mechanisms underlying fibrosis and approaches to therapy are reviewed; when fibrotic tissue becomes excessive, it can have diverse pathophysiological effects on a number of organ systems.
Abstract: Fibrosis is a consequence of the inflammatory response. When fibrotic tissue becomes excessive, it can have diverse pathophysiological effects on a number of organ systems. The mechanisms underlying fibrosis and approaches to therapy are reviewed.

Journal ArticleDOI
TL;DR: SPIEC-EASI (SParse InversE Covariance Estimation for Ecological Association Inference) is presented, a statistical method for the inference of microbial ecological networks from amplicon sequencing datasets that outperforms state-of-the-art methods at recovering edges and network properties on synthetic data under a variety of scenarios.
Abstract: 16S ribosomal RNA (rRNA) gene and other environmental sequencing techniques provide snapshots of microbial communities, revealing phylogeny and the abundances of microbial populations across diverse ecosystems. While changes in microbial community structure are demonstrably associated with certain environmental conditions (from metabolic and immunological health in mammals to ecological stability in soils and oceans), identification of underlying mechanisms requires new statistical tools, as these datasets present several technical challenges. First, the abundances of microbial operational taxonomic units (OTUs) from amplicon-based datasets are compositional. Counts are normalized to the total number of counts in the sample. Thus, microbial abundances are not independent, and traditional statistical metrics (e.g., correlation) for the detection of OTU-OTU relationships can lead to spurious results. Secondly, microbial sequencing-based studies typically measure hundreds of OTUs on only tens to hundreds of samples; thus, inference of OTU-OTU association networks is severely under-powered, and additional information (or assumptions) is required for accurate inference. Here, we present SPIEC-EASI (SParse InversE Covariance Estimation for Ecological Association Inference), a statistical method for the inference of microbial ecological networks from amplicon sequencing datasets that addresses both of these issues. SPIEC-EASI combines data transformations developed for compositional data analysis with a graphical model inference framework that assumes the underlying ecological association network is sparse. To reconstruct the network, SPIEC-EASI relies on algorithms for sparse neighborhood and inverse covariance selection. To provide a synthetic benchmark in the absence of an experimentally validated gold-standard network, SPIEC-EASI is accompanied by a set of computational tools to generate OTU count data from a set of diverse underlying network topologies. SPIEC-EASI outperforms state-of-the-art methods to recover edges and network properties on synthetic data under a variety of scenarios. SPIEC-EASI also reproducibly predicts previously unknown microbial associations using data from the American Gut project.
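A minimal sketch of the two-step idea, pairing a centered log-ratio transform (to address compositionality) with sparse inverse covariance selection: scikit-learn's graphical lasso stands in for the paper's full pipeline here, and the pseudocount and toy data are illustrative assumptions.

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

def infer_network(counts, pseudocount=1.0):
    """Infer a sparse OTU association network from a samples x OTUs count matrix.

    Step 1: centered log-ratio (clr) transform to handle compositionality.
    Step 2: sparse inverse covariance (graphical lasso); nonzero off-diagonal
    precision entries are the inferred conditional-dependence edges.
    """
    x = counts + pseudocount                          # avoid log(0)
    log_x = np.log(x)
    clr = log_x - log_x.mean(axis=1, keepdims=True)   # clr transform per sample
    model = GraphicalLassoCV().fit(clr)
    precision = model.precision_
    edges = (np.abs(precision) > 1e-8) & ~np.eye(precision.shape[0], dtype=bool)
    return edges, precision

# Toy usage: 60 samples, 10 OTUs of random counts (illustrative only)
rng = np.random.default_rng(0)
counts = rng.poisson(lam=20, size=(60, 10))
edges, prec = infer_network(counts)
print("inferred edges:", int(edges.sum()) // 2)
```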

Journal ArticleDOI
TL;DR: The emergence of multisystem inflammatory syndrome in children in New York State coincided with widespread SARS-CoV-2 transmission; this hyperinflammatory syndrome with dermatologic, mucocutaneous, and gastrointestinal manifestations was associated with cardiac dysfunction.
Abstract: Background A multisystem inflammatory syndrome in children (MIS-C) is associated with coronavirus disease 2019. The New York State Department of Health (NYSDOH) established active, statewide...

Proceedings Article
19 Jun 2016
TL;DR: In this article, a semi-supervised learning framework based on graph embeddings is proposed, where given a graph between instances, an embedding for each instance is trained to jointly predict the class label and the neighborhood context in the graph.
Abstract: We present a semi-supervised learning framework based on graph embeddings. Given a graph between instances, we train an embedding for each instance to jointly predict the class label and the neighborhood context in the graph. We develop both transductive and inductive variants of our method. In the transductive variant of our method, the class labels are determined by both the learned embeddings and input feature vectors, while in the inductive variant, the embeddings are defined as a parametric function of the feature vectors, so predictions can be made on instances not seen during training. On a large and diverse set of benchmark tasks, including text classification, distantly supervised entity extraction, and entity classification, we show improved performance over many of the existing models.
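Below is a minimal PyTorch sketch of the transductive variant's objective: one embedding per instance, trained jointly on a supervised label loss and an unsupervised graph-context loss (skip-gram with negative sampling). The context sampler and hyperparameters here are simplified placeholders, not the paper's setup.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphEmbed(nn.Module):
    def __init__(self, num_nodes, num_classes, dim=64):
        super().__init__()
        self.emb = nn.Embedding(num_nodes, dim)   # one embedding per instance
        self.clf = nn.Linear(dim, num_classes)    # label predictor

    def label_loss(self, nodes, labels):
        # Supervised part: predict class labels from embeddings
        return F.cross_entropy(self.clf(self.emb(nodes)), labels)

    def context_loss(self, nodes, pos, neg):
        # Unsupervised part: skip-gram with negative sampling over graph neighbors
        e, ep, en = self.emb(nodes), self.emb(pos), self.emb(neg)
        pos_score = F.logsigmoid((e * ep).sum(-1))
        neg_score = F.logsigmoid(-(e * en).sum(-1))
        return -(pos_score + neg_score).mean()

# Toy joint training step (random graph samples, illustrative only)
torch.manual_seed(0)
model = GraphEmbed(num_nodes=100, num_classes=3)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
labeled = torch.arange(10)
labels = torch.randint(0, 3, (10,))
src, pos, neg = (torch.randint(0, 100, (32,)) for _ in range(3))
loss = model.label_loss(labeled, labels) + model.context_loss(src, pos, neg)
opt.zero_grad(); loss.backward(); opt.step()
```

In the inductive variant described above, the embedding lookup would be replaced by a parametric function of the input features, so unseen instances can be embedded at test time.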

Journal ArticleDOI
TL;DR: An in-depth presentation is given of olex2.refine, the new refinement engine integrated in the Olex2 program.
Abstract: This paper describes the mathematical basis for olex2.refine, the new refinement engine which is integrated within the Olex2 program. Precise and clear equations are provided for every computation performed by this engine, including structure factors and their derivatives, constraints, restraints and twinning; a general overview is also given of the different components of the engine and their relation to each other. A framework for adding multiple general constraints with dependencies on common physical parameters is described. Several new restraints on atomic displacement parameters are also presented.
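The central quantity such an engine computes, the structure factor of reflection hkl, takes the standard crystallographic form (shown for orientation; the actual engine adds displacement, occupancy, and dispersion terms and their derivatives):

```latex
F(hkl) = \sum_{j} f_j \exp\!\bigl[2\pi i\,(h x_j + k y_j + l z_j)\bigr]
```

with f_j the scattering factor of atom j and (x_j, y_j, z_j) its fractional coordinates in the unit cell.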

Journal ArticleDOI
TL;DR: Traditional infection-control and public health strategies rely heavily on early detection of disease to contain spread, but when Covid-19 burst onto the global scene, public health officials initially doubted its ability to tackle infectious disease outbreaks.
Abstract: Traditional infection-control and public health strategies rely heavily on early detection of disease to contain spread. When Covid-19 burst onto the global scene, public health officials initially...

Journal ArticleDOI
TL;DR: The large proportion of asymptomatic children indicates the difficulty in identifying paediatric patients who do not have clear epidemiological information, leading to a dangerous situation in community-acquired infections.
Abstract: Summary Background Since December, 2019, an outbreak of coronavirus disease 2019 (COVID-19) has spread globally. Little is known about the epidemiological and clinical features of paediatric patients with COVID-19. Methods We retrospectively retrieved data for paediatric patients (aged 0–16 years) with confirmed COVID-19 from electronic medical records in three hospitals in Zhejiang, China. We recorded patients' epidemiological and clinical features. Findings From Jan 17 to March 1, 2020, 36 children (mean age 8·3 [SD 3·5] years) were identified to be infected with severe acute respiratory syndrome coronavirus 2. The route of transmission was by close contact with family members (32 [89%]) or a history of exposure to the epidemic area (12 [33%]); eight (22%) patients had both exposures. 19 (53%) patients had moderate clinical type with pneumonia; 17 (47%) had mild clinical type and either were asymptomatic (ten [28%]) or had acute upper respiratory symptoms (seven [19%]). Common symptoms on admission were fever (13 [36%]) and dry cough (seven [19%]). Of those with fever, four (11%) had a body temperature of 38·5°C or higher, and nine (25%) had a body temperature of 37·5–38·5°C. Typical abnormal laboratory findings were elevated creatine kinase MB (11 [31%]), decreased lymphocytes (11 [31%]), leucopenia (seven [19%]), and elevated procalcitonin (six [17%]). Besides radiographic presentations, variables that were associated significantly with severity of COVID-19 were decreased lymphocytes, elevated body temperature, and high levels of procalcitonin, D-dimer, and creatine kinase MB. All children received interferon alfa by aerosolisation twice a day, 14 (39%) received lopinavir–ritonavir syrup twice a day, and six (17%) needed oxygen inhalation. Mean time in hospital was 14 (SD 3) days. By Feb 28, 2020, all patients were cured. Interpretation Although all paediatric patients in our cohort had mild or moderate type of COVID-19, the large proportion of asymptomatic children indicates the difficulty in identifying paediatric patients who do not have clear epidemiological information, leading to a dangerous situation in community-acquired infections. Funding Ningbo Clinical Research Center for Children's Health and Diseases, Ningbo Reproductive Medicine Centre, and Key Scientific and Technological Innovation Projects of Wenzhou.

Proceedings ArticleDOI
27 Jun 2016
TL;DR: A novel approach for real-time facial reenactment of a monocular target video sequence (e.g., Youtube video) that addresses the under-constrained problem of facial identity recovery from monocular video by non-rigid model-based bundling and re-renders the manipulated output video in a photo-realistic fashion.
Abstract: We present a novel approach for real-time facial reenactment of a monocular target video sequence (e.g., Youtube video). The source sequence is also a monocular video stream, captured live with a commodity webcam. Our goal is to animate the facial expressions of the target video by a source actor and re-render the manipulated output video in a photo-realistic fashion. To this end, we first address the under-constrained problem of facial identity recovery from monocular video by non-rigid model-based bundling. At run time, we track facial expressions of both source and target video using a dense photometric consistency measure. Reenactment is then achieved by fast and efficient deformation transfer between source and target. The mouth interior that best matches the re-targeted expression is retrieved from the target sequence and warped to produce an accurate fit. Finally, we convincingly re-render the synthesized target face on top of the corresponding video stream such that it seamlessly blends with the real-world illumination. We demonstrate our method in a live setup, where Youtube videos are reenacted in real time.

Journal ArticleDOI
12 Jun 2015-Science
TL;DR: A brave new world with a wider view: researchers have long attempted to follow animals as they move through their environment, but such efforts were limited to short distances and times, and to species large enough to carry large batteries and transmitters; new technologies have opened up new frontiers in animal tracking and remote data collection.
Abstract: BACKGROUND Global aquatic environments are changing profoundly as a result of human actions; consequently, so too are the ways in which organisms are distributing themselves through space and time. Our ability to predict organism and community responses to these alterations will be dependent on knowledge of animal movements, interactions, and how the physiological and environmental processes underlying them shape species distributions. These patterns and processes ultimately structure aquatic ecosystems and provide the wealth of ecosystem services upon which humans depend. Until recently, the vast size, opacity, and dynamic nature of the aquatic realm have impeded our efforts to understand these ecosystems. With rapid technological advancement over the past several decades, a suite of electronic tracking devices (e.g., acoustic and satellite transmitters) that can remotely monitor animals in these challenging environments are now available. Aquatic telemetry technology is rapidly accelerating our ability to observe animal behavior and distribution and, as a consequence, is fundamentally altering our understanding of the structure and function of global aquatic ecosystems. These advances provide the toolbox to define how future global aquatic management practices must evolve. ADVANCES Aquatic telemetry has emerged through technological advances in miniaturization, battery engineering, and software and hardware development, allowing the monitoring of organisms whose habitats range from the poles to the tropics and the photic zone to the abyssal depths. This is enabling the characterization of the horizontal and vertical movements of individuals, populations, and entire communities over scales of meters to tens of thousands of kilometers and over time frames of hours to years and even over the entire lifetimes of individuals. Electronic tags can now be equipped with sensors that measure ambient physical parameters (depth, temperature, conductivity, fluorescence), providing simultaneous monitoring of animals’ environments. By linking telemetry with biologgers (e.g., jaw-motion sensors), it is possible to monitor individual feeding events. In addition, other devices on instrumented animals can communicate with one another, providing insights into predator-prey interactions and social behavior. Coupling telemetry with minute nonlethal biopsy allows understanding of how trophic dynamics, population connectivity, and gene-level basis for organismal health and condition relate to movement. These advances are revolutionizing the scope and scales of questions that can be addressed on the causes and consequences of animal distribution and movement. OUTLOOK Aquatic animal telemetry has advanced rapidly, yet new challenges present themselves in coordination of monitoring across large-spatial scales (ocean basins), data sharing, and data assimilation. The continued advancement of aquatic telemetry lies in establishing and maintaining accessible and cost-effective infrastructure and in promoting multidisciplinary tagging approaches to maximize cost benefits. A united global network and centralized database will provide the mechanism for global telemetry data and will promote a transparent environment for data sharing that will, in turn, increase global communication, scope for collaboration, intellectual advancement, and funding opportunities. 
An overarching global network will realize the potential of telemetry, which is essential for advancing scientific knowledge and effectively managing globally shared aquatic resources and their ecosystems in the face of mounting human pressures and environmental change.

Journal ArticleDOI
17 Jul 2019
TL;DR: The authors propose Session-based Recommendation with Graph Neural Networks (SR-GNN) to capture complex transitions of items, which are difficult to reveal with conventional sequential methods.
Abstract: The problem of session-based recommendation aims to predict user actions based on anonymous sessions. Previous methods model a session as a sequence and estimate user representations besides item representations to make recommendations. Though they have achieved promising results, they are insufficient to obtain accurate user vectors in sessions and neglect complex transitions of items. To obtain accurate item embeddings and take complex transitions of items into account, we propose a novel method, i.e. Session-based Recommendation with Graph Neural Networks, SR-GNN for brevity. In the proposed method, session sequences are modeled as graph-structured data. Based on the session graph, GNN can capture complex transitions of items, which are difficult to reveal with previous conventional sequential methods. Each session is then represented as the composition of the global preference and the current interest of that session using an attention network. Extensive experiments conducted on two real datasets show that SR-GNN evidently outperforms the state-of-the-art session-based recommendation methods consistently.
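A minimal sketch of the first step, building a directed session graph from an anonymous click sequence; the GNN and attention layers of SR-GNN then operate on this adjacency. Normalizing each row by the node's outdegree follows a common convention and is an assumption here.

```python
import numpy as np

def session_graph(session):
    """Build the node list and normalized outgoing adjacency for a session.

    Nodes are the unique items; an edge i -> j is added for each consecutive
    click (i, j), and each row is normalized by the node's outdegree.
    """
    nodes = list(dict.fromkeys(session))           # unique items, order preserved
    index = {item: k for k, item in enumerate(nodes)}
    n = len(nodes)
    adj = np.zeros((n, n))
    for a, b in zip(session, session[1:]):
        adj[index[a], index[b]] += 1.0
    out_deg = adj.sum(axis=1, keepdims=True)
    adj = np.divide(adj, out_deg, out=np.zeros_like(adj), where=out_deg > 0)
    return nodes, adj

nodes, adj = session_graph(["v1", "v2", "v3", "v2", "v4"])
print(nodes)   # ['v1', 'v2', 'v3', 'v4']
print(adj)     # row-normalized transition counts
```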

Journal ArticleDOI
TL;DR: In this article, the authors present cosmological parameter constraints from a tomographic weak gravitational lensing analysis of ~450 deg$^2$ of imaging data from the Kilo Degree Survey (KiDS) for a flat $\Lambda$CDM cosmology with a prior on $H_0$ that encompasses the most recent direct measurements.
Abstract: We present cosmological parameter constraints from a tomographic weak gravitational lensing analysis of ~450 deg$^2$ of imaging data from the Kilo Degree Survey (KiDS). For a flat $\Lambda$CDM cosmology with a prior on $H_0$ that encompasses the most recent direct measurements, we find $S_8\equiv\sigma_8\sqrt{\Omega_{\rm m}/0.3}=0.745\pm0.039$. This result is in good agreement with other low redshift probes of large scale structure, including recent cosmic shear results, along with pre-Planck cosmic microwave background constraints. A $2.3$-$\sigma$ tension in $S_8$ and 'substantial discordance' in the full parameter space is found with respect to the Planck 2015 results. We use shear measurements for nearly 15 million galaxies, determined with a new improved 'self-calibrating' version of lensfit validated using an extensive suite of image simulations. Four-band $ugri$ photometric redshifts are calibrated directly with deep spectroscopic surveys. The redshift calibration is confirmed using two independent techniques based on angular cross-correlations and the properties of the photometric redshift probability distributions. Our covariance matrix is determined using an analytical approach, verified numerically with large mock galaxy catalogues. We account for uncertainties in the modelling of intrinsic galaxy alignments and the impact of baryon feedback on the shape of the non-linear matter power spectrum, in addition to the small residual uncertainties in the shear and redshift calibration. The cosmology analysis was performed blind. Our high-level data products, including shear correlation functions, covariance matrices, redshift distributions, and Monte Carlo Markov Chains are available at http://kids.strw.leidenuniv.nl.

Journal ArticleDOI
07 Apr 2016
TL;DR: In this paper, the authors explore and discuss how soil scientists can help to reach the recently adopted UN Sustainable Development Goals (SDGs) in the most effective manner and recommend the following steps to be taken by the soil science community as a whole: (i) embrace the UN SDGs, as they provide a platform that allows soil science to demonstrate its relevance for realizing a sustainable society by 2030; (ii) show the specific value of soil science: research should explicitly show how using modern soil information can improve the results of inter- and transdisciplinary studies on SDGs related to food security
Abstract: . In this forum paper we discuss how soil scientists can help to reach the recently adopted UN Sustainable Development Goals (SDGs) in the most effective manner. Soil science, as a land-related discipline, has important links to several of the SDGs, which are demonstrated through the functions of soils and the ecosystem services that are linked to those functions (see graphical abstract in the Supplement). We explore and discuss how soil scientists can rise to the challenge both internally, in terms of our procedures and practices, and externally, in terms of our relations with colleague scientists in other disciplines, diverse groups of stakeholders and the policy arena. To meet these goals we recommend the following steps to be taken by the soil science community as a whole: (i) embrace the UN SDGs, as they provide a platform that allows soil science to demonstrate its relevance for realizing a sustainable society by 2030; (ii) show the specific value of soil science: research should explicitly show how using modern soil information can improve the results of inter- and transdisciplinary studies on SDGs related to food security, water scarcity, climate change, biodiversity loss and health threats; (iii) take leadership in overarching system analysis of ecosystems, as soils and soil scientists have an integrated nature and this places soil scientists in a unique position; (iii) raise awareness of soil organic matter as a key attribute of soils to illustrate its importance for soil functions and ecosystem services; (iv) improve the transfer of knowledge through knowledge brokers with a soil background; (v) start at the basis: educational programmes are needed at all levels, starting in primary schools, and emphasizing practical, down-to-earth examples; (vi) facilitate communication with the policy arena by framing research in terms that resonate with politicians in terms of the policy cycle or by considering drivers, pressures and responses affecting impacts of land use change; and finally (vii) all this is only possible if researchers, with soil scientists in the front lines, look over the hedge towards other disciplines, to the world at large and to the policy arena, reaching over to listen first, as a basis for genuine collaboration.