Proceedings ArticleDOI
31 Jan 2019
TL;DR: The authors proposed a multi-task deep neural network (MT-DNN) for learning representations across multiple natural language understanding (NLU) tasks, which not only leverages large amounts of cross-task data, but also benefits from a regularization effect that leads to more general representations to help adapt to new tasks and domains.
Abstract: In this paper, we present a Multi-Task Deep Neural Network (MT-DNN) for learning representations across multiple natural language understanding (NLU) tasks. MT-DNN not only leverages large amounts of cross-task data, but also benefits from a regularization effect that leads to more general representations to help adapt to new tasks and domains. MT-DNN extends the model proposed in Liu et al. (2015) by incorporating a pre-trained bidirectional transformer language model, known as BERT (Devlin et al., 2018). MT-DNN obtains new state-of-the-art results on ten NLU tasks, including SNLI, SciTail, and eight out of nine GLUE tasks, pushing the GLUE benchmark to 82.7% (2.2% absolute improvement) as of February 25, 2019 on the latest GLUE test set. We also demonstrate using the SNLI and SciTail datasets that the representations learned by MT-DNN allow domain adaptation with substantially fewer in-domain labels than the pre-trained BERT representations. Our code and pre-trained models will be made publicly available.
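
The core architectural idea, a shared text encoder with lightweight task-specific output heads trained on mini-batches drawn from different tasks, can be sketched as follows. This is a minimal illustration under my own assumptions (a toy bag-of-words encoder and two made-up tasks standing in for BERT and the GLUE tasks), not the authors' released implementation.

```python
# Minimal sketch of the shared-encoder / task-specific-head pattern used by
# multi-task NLU models such as MT-DNN. Toy encoder and tasks are placeholders,
# not the authors' implementation (which builds on a pre-trained BERT).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiTaskModel(nn.Module):
    def __init__(self, vocab_size=1000, hidden=128,
                 task_classes={"nli": 3, "sentiment": 2}):
        super().__init__()
        # Shared representation layers (BERT in the real model).
        self.embed = nn.EmbeddingBag(vocab_size, hidden)
        self.encoder = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU())
        # One small output head per task.
        self.heads = nn.ModuleDict({t: nn.Linear(hidden, c)
                                    for t, c in task_classes.items()})

    def forward(self, token_ids, task):
        h = self.encoder(self.embed(token_ids))
        return self.heads[task](h)

model = MultiTaskModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

def train_step(task, token_ids, labels):
    # Mini-batches from different tasks update the shared encoder jointly,
    # which is the source of the cross-task regularization effect.
    opt.zero_grad()
    loss = F.cross_entropy(model(token_ids, task), labels)
    loss.backward()
    opt.step()
    return loss.item()

# Example: alternate batches from the two toy tasks.
x = torch.randint(0, 1000, (8, 16))
train_step("nli", x, torch.randint(0, 3, (8,)))
train_step("sentiment", x, torch.randint(0, 2, (8,)))
```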

647 citations


Journal ArticleDOI
TL;DR: In this paper, the electron mean free path λ and carrier relaxation time τ of the twenty most conductive elemental metals were determined by numerical integration over the Fermi surface obtained from first principles, using constant λ or τ approximations and wave-vector-dependent Fermi velocities vf(k).
Abstract: The electron mean free path λ and carrier relaxation time τ of the twenty most conductive elemental metals are determined by numerical integration over the Fermi surface obtained from first principles, using constant λ or τ approximations and wave-vector-dependent Fermi velocities vf(k). The average vf deviates considerably from the free-electron prediction, even for elements with spherical Fermi surfaces including Cu (29% deviation). The calculated product of the bulk resistivity times λ indicates that, in the limit of narrow wires, Rh, Ir, and Ni are 2.1, 1.8, and 1.6 times more conductive than Cu, while various metals including Mo, Co, and Ru approximately match the Cu resistivity, suggesting that these metals are promising candidates to replace Cu for narrow interconnect lines.
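
For orientation, the figure of merit behind this comparison is the product of bulk resistivity and mean free path. Below is a hedged sketch of the standard constant-λ Boltzmann-transport expressions (free-electron benchmark and general Fermi-surface form); these are textbook relations, not equations quoted from this paper.

```latex
% Constant-lambda relaxation-time picture (standard relations, not quoted from the paper).
\[
  \rho\,\lambda \;=\; \frac{\hbar k_F}{n e^{2}}
  \qquad \text{(free-electron gas, } n = k_F^{3}/3\pi^{2}\text{)},
\]
\[
  \sigma \;=\; \frac{e^{2}\lambda}{12\pi^{3}\hbar}\oint_{\mathrm{FS}} dS
  \quad\Longrightarrow\quad
  \rho\,\lambda \;=\; \frac{12\pi^{3}\hbar}{e^{2} S_F}
  \qquad \text{(constant } \lambda \text{, Fermi-surface area } S_F\text{)}.
\]
```

A smaller ρλ product means lower resistivity once the wire dimensions, rather than phonon or defect scattering, limit λ, which is the basis for ranking metals such as Rh, Ir, and Ni against Cu for narrow interconnects.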

647 citations


Book ChapterDOI
08 Oct 2016
TL;DR: In this paper, a fully automatic image colorization system was developed, which leverages recent advances in deep networks, exploiting both low-level and semantic representations, and trained a model to predict per-pixel color histograms.
Abstract: We develop a fully automatic image colorization system. Our approach leverages recent advances in deep networks, exploiting both low-level and semantic representations. As many scene elements naturally appear according to multimodal color distributions, we train our model to predict per-pixel color histograms. This intermediate output can be used to automatically generate a color image, or further manipulated prior to image formation. On both fully and partially automatic colorization tasks, we outperform existing methods. We also explore colorization as a vehicle for self-supervised visual representation learning.
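
A minimal sketch of the "predict a per-pixel color histogram, then collapse it to a color" idea described above. The bin count, the toy backbone, and the simple expectation-based decoding are my own illustrative assumptions, not the authors' architecture.

```python
# Sketch: predict a per-pixel distribution over K quantized color bins from a
# grayscale input, train with cross-entropy, then decode colors by taking the
# expectation over bin centers. Backbone and bin layout are toy placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F

K = 32                                   # number of quantized color bins (assumption)
bin_centers = torch.rand(K, 2)           # stand-in (a, b) chroma value per bin

backbone = nn.Sequential(
    nn.Conv2d(1, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, K, 1),                 # per-pixel logits over the K bins
)

def loss_fn(gray, target_bins):
    # target_bins: (B, H, W) index of the ground-truth color bin per pixel
    logits = backbone(gray)              # (B, K, H, W)
    return F.cross_entropy(logits, target_bins)

def decode(gray):
    # Collapse the predicted histogram to a single chroma value per pixel.
    probs = backbone(gray).softmax(dim=1)                 # (B, K, H, W)
    return torch.einsum("bkhw,kc->bchw", probs, bin_centers)  # (B, 2, H, W)

gray = torch.rand(2, 1, 32, 32)
targets = torch.randint(0, K, (2, 32, 32))
print(loss_fn(gray, targets).item(), decode(gray).shape)
```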

647 citations


Book ChapterDOI
19 Sep 2017
TL;DR: A novel approach combining multi-scale and irregular isothetic representations of the input contour, as an extension of a previous work, improves the representation of the contour by 1-D intervals and then achieves the decomposition of the contour into maximal arcs or segments.
Abstract: The reconstruction of noisy digital shapes is a complex question, and a lot of contributions have been proposed to address this problem, including blurred segment decomposition or adaptive tangential covering, for instance. In this article, we propose a novel approach combining multi-scale and irregular isothetic representations of the input contour, as an extension of a previous work [Vacavant et al., A Combined Multi-Scale/Irregular Algorithm for the Vectorization of Noisy Digital Contours, CVIU 2013]. Our new algorithm improves the representation of the contour by 1-D intervals, and achieves afterwards the decomposition of the contour into maximal arcs or segments. Our experiments with synthetic and real images show that our contribution can be employed as a relevant option for noisy shape reconstruction.

646 citations


Journal ArticleDOI
TL;DR: In this review, it is attempted to cover all recent aspects of [2 + 2] photocycloaddition chemistry with an emphasis on synthetically relevant, regio-, and stereoselective reactions.
Abstract: The [2 + 2] photocycloaddition is undisputedly the most important and most frequently used photochemical reaction. In this review, it is attempted to cover all recent aspects of [2 + 2] photocycloaddition chemistry with an emphasis on synthetically relevant, regio-, and stereoselective reactions. The review aims to comprehensively discuss relevant work, which was done in the field in the last 20 years (i.e., from 1995 to 2015). Organization of the data follows a subdivision according to mechanism and substrate classes. Cu(I) and PET (photoinduced electron transfer) catalysis are treated separately in sections 2 and 4, whereas the vast majority of photocycloaddition reactions which occur by direct excitation or sensitization are divided within section 3 into individual subsections according to the photochemically excited olefin.

646 citations


Journal ArticleDOI
20 Jun 2017-JAMA
TL;DR: It is concluded with moderate certainty that screening for obesity in children and adolescents 6 years and older is of moderate net benefit and clinicians should offer or refer them to comprehensive, intensive behavioral interventions to promote improvements in weight status.
Abstract: Importance: Based on year 2000 Centers for Disease Control and Prevention growth charts, approximately 17% of children and adolescents aged 2 to 19 years in the United States have obesity, and almost 32% of children and adolescents are overweight or have obesity. Obesity in children and adolescents is associated with morbidity such as mental health and psychological issues, asthma, obstructive sleep apnea, orthopedic problems, and adverse cardiovascular and metabolic outcomes (eg, high blood pressure, abnormal lipid levels, and insulin resistance). Children and adolescents may also experience teasing and bullying behaviors based on their weight. Obesity in childhood and adolescence may continue into adulthood and lead to adverse cardiovascular outcomes or other obesity-related morbidity, such as type 2 diabetes. Subpopulation Considerations: Although the overall rate of child and adolescent obesity has stabilized over the last decade after increasing steadily for 3 decades, obesity rates continue to increase in certain populations, such as African American girls and Hispanic boys. These racial/ethnic differences in obesity prevalence are likely a result of both genetic and nongenetic factors (eg, socioeconomic status, intake of sugar-sweetened beverages and fast food, and having a television in the bedroom). Objective: To update the 2010 US Preventive Services Task Force (USPSTF) recommendation on screening for obesity in children 6 years and older. Evidence Review: The USPSTF reviewed the evidence on screening for obesity in children and adolescents and the benefits and harms of weight management interventions. Findings: Comprehensive, intensive behavioral interventions (≥26 contact hours) in children and adolescents 6 years and older who have obesity can result in improvements in weight status for up to 12 months; there is inadequate evidence regarding the effectiveness of less intensive interventions. The harms of behavioral interventions can be bounded as small to none, and the harms of screening are minimal. Therefore, the USPSTF concluded with moderate certainty that screening for obesity in children and adolescents 6 years and older is of moderate net benefit. Conclusions and Recommendation: The USPSTF recommends that clinicians screen for obesity in children and adolescents 6 years and older and offer or refer them to comprehensive, intensive behavioral interventions to promote improvements in weight status. (B recommendation)

646 citations


Journal ArticleDOI
TL;DR: A survey of the Hough Transform and its variants, their limitations and the modifications made to overcome them, implementation issues in software and hardware, and applications in various fields.

646 citations


Journal ArticleDOI
TL;DR: A systematic review of articles that have used the unified theory of acceptance and use of technology (UTAUT), indicating that general purpose systems and specialized business systems were examined in the majority of the articles applying the model.
Abstract: The purpose of this paper is to perform a systematic review of articles that have used the unified theory of acceptance and use of technology (UTAUT). The results produced in this research are based on the literature analysis of 174 existing articles on the UTAUT model, performed by collecting data including demographic details, methodological details, limitations, and significance of relationships between the constructs from the available articles based on the UTAUT. The findings indicated that general purpose systems and specialized business systems were examined in the majority of the articles using the UTAUT. The analysis also indicated that the cross-sectional approach, survey methods, and structural equation modelling analysis techniques were the most explored research methodologies, whereas SPSS was found to be the most widely used analysis tool. Moreover, the weight analysis of independent variables indicates that variables such as performance expectancy and behavioural intention qualified for the best predictor category. The analysis also suggested single-subject or biased samples as the most frequently reported limitation across studies. The search activities were centered on occurrences of keywords to avoid tracing a large number of publications where these keywords might have been used as casual words in the main text; however, we acknowledge that there may be a number of studies which lack keywords in the title but still focus upon the UTAUT in some form. This is the first research of its type which has extensively examined the literature on the UTAUT and provided researchers with accumulative knowledge about the model.

646 citations


Posted Content
TL;DR: Zhang et al. propose a deep-learning-based approach, called ST-ResNet, to collectively forecast the inflow and outflow of crowds in each and every region of a city.
Abstract: Forecasting the flow of crowds is of great importance to traffic management and public safety, yet it is a very challenging task, affected by many complex factors such as inter-region traffic, events, and weather. In this paper, we propose a deep-learning-based approach, called ST-ResNet, to collectively forecast the inflow and outflow of crowds in each and every region of a city. We design an end-to-end structure of ST-ResNet based on unique properties of spatio-temporal data. More specifically, we employ the framework of residual neural networks to model the temporal closeness, period, and trend properties of the crowd traffic, respectively. For each property, we design a branch of residual convolutional units, each of which models the spatial properties of the crowd traffic. ST-ResNet learns to dynamically aggregate the output of the three residual neural networks based on data, assigning different weights to different branches and regions. The aggregation is further combined with external factors, such as weather and day of the week, to predict the final traffic of crowds in each and every region. We evaluate ST-ResNet on two types of crowd flows in Beijing and NYC, finding that its performance exceeds six well-known methods.
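
A compressed sketch of the three-branch residual structure with learned fusion described above. Layer sizes, grid shape, branch depths, and the external-factor handling are my own illustrative assumptions rather than the published configuration.

```python
# Sketch of the ST-ResNet idea: three residual-CNN branches over "closeness",
# "period", and "trend" stacks of city-grid frames, fused with learnable
# per-branch weight maps and combined with external features.
import torch
import torch.nn as nn

class ResUnit(nn.Module):
    def __init__(self, c):
        super().__init__()
        self.conv1 = nn.Conv2d(c, c, 3, padding=1)
        self.conv2 = nn.Conv2d(c, c, 3, padding=1)
    def forward(self, x):
        return x + self.conv2(torch.relu(self.conv1(torch.relu(x))))

class Branch(nn.Module):
    def __init__(self, in_frames, flows=2, c=32, n_units=2):
        super().__init__()
        self.inp = nn.Conv2d(in_frames * flows, c, 3, padding=1)
        self.body = nn.Sequential(*[ResUnit(c) for _ in range(n_units)])
        self.out = nn.Conv2d(c, flows, 3, padding=1)
    def forward(self, x):                       # x: (B, frames*flows, H, W)
        return self.out(self.body(self.inp(x)))

class STResNetSketch(nn.Module):
    def __init__(self, H=32, W=32, flows=2, ext_dim=8):
        super().__init__()
        self.close, self.period, self.trend = Branch(3), Branch(1), Branch(1)
        # Learnable fusion weights: one map per branch, flow, and grid cell.
        self.w = nn.ParameterDict({k: nn.Parameter(torch.ones(flows, H, W))
                                   for k in ("c", "p", "t")})
        self.ext = nn.Linear(ext_dim, flows * H * W)  # external factors (weather, weekday)
        self.flows, self.H, self.W = flows, H, W
    def forward(self, xc, xp, xt, ext):
        fused = (self.w["c"] * self.close(xc) + self.w["p"] * self.period(xp)
                 + self.w["t"] * self.trend(xt))
        fused = fused + self.ext(ext).view(-1, self.flows, self.H, self.W)
        return torch.tanh(fused)                # scaled in/out flows

m = STResNetSketch()
out = m(torch.rand(4, 6, 32, 32), torch.rand(4, 2, 32, 32),
        torch.rand(4, 2, 32, 32), torch.rand(4, 8))
print(out.shape)  # (4, 2, 32, 32)
```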

646 citations


Journal ArticleDOI
TL;DR: The results expand the knowledge of the microbial regulation of immunity and may provide a basis for the development of microbiome-based therapeutics in autoimmune diseases.
Abstract: The gut microbiota regulates T cell functions throughout the body. We hypothesized that intestinal bacteria impact the pathogenesis of multiple sclerosis (MS), an autoimmune disorder of the CNS and thus analyzed the microbiomes of 71 MS patients not undergoing treatment and 71 healthy controls. Although no major shifts in microbial community structure were found, we identified specific bacterial taxa that were significantly associated with MS. Akkermansia muciniphila and Acinetobacter calcoaceticus, both increased in MS patients, induced proinflammatory responses in human peripheral blood mononuclear cells and in monocolonized mice. In contrast, Parabacteroides distasonis, which was reduced in MS patients, stimulated antiinflammatory IL-10–expressing human CD4+CD25+ T cells and IL-10+FoxP3+ Tregs in mice. Finally, microbiota transplants from MS patients into germ-free mice resulted in more severe symptoms of experimental autoimmune encephalomyelitis and reduced proportions of IL-10+ Tregs compared with mice “humanized” with microbiota from healthy controls. This study identifies specific human gut bacteria that regulate adaptive autoimmune responses, suggesting therapeutic targeting of the microbiota as a treatment for MS.

646 citations


Journal ArticleDOI
TL;DR: It is predicted that plastics ingestion is increasing in seabirds, that it will reach 99% of all species by 2050, and that effective waste management can reduce this threat.
Abstract: Plastic pollution in the ocean is a global concern; concentrations reach 580,000 pieces per km2 and production is increasing exponentially. Although a large number of empirical studies provide emerging evidence of impacts to wildlife, there has been little systematic assessment of risk. We performed a spatial risk analysis using predicted debris distributions and ranges for 186 seabird species to model debris exposure. We adjusted the model using published data on plastic ingestion by seabirds. Eighty of 135 (59%) species with studies reported in the literature between 1962 and 2012 had ingested plastic, and, within those studies, on average 29% of individuals had plastic in their gut. Standardizing the data for time and species, we estimate the ingestion rate would reach 90% of individuals if these studies were conducted today. Using these results from the literature, we tuned our risk model and were able to capture 71% of the variation in plastic ingestion based on a model including exposure, time, study method, and body size. We used this tuned model to predict risk across seabird species at the global scale. The highest area of expected impact occurs at the Southern Ocean boundary in the Tasman Sea between Australia and New Zealand, which contrasts with previous work identifying this area as having low anthropogenic pressures and concentrations of marine debris. We predict that plastics ingestion is increasing in seabirds, that it will reach 99% of all species by 2050, and that effective waste management can reduce this threat.

Journal ArticleDOI
TL;DR: This narrative review investigates the molecular mechanisms of hepatic steatosis in NAFLD, focusing on the four major pathways contributing to lipid homeostasis in the liver.
Abstract: Non-alcoholic fatty liver disease (NAFLD) is currently the world’s most common liver disease, estimated to affect up to one-fourth of the population. Hallmarked by hepatic steatosis, NAFLD is associated with a multitude of detrimental effects and increased mortality. This narrative review investigates the molecular mechanisms of hepatic steatosis in NAFLD, focusing on the four major pathways contributing to lipid homeostasis in the liver. Hepatic steatosis is a consequence of lipid acquisition exceeding lipid disposal, i.e., the uptake of fatty acids and de novo lipogenesis surpassing fatty acid oxidation and export. In NAFLD, hepatic uptake and de novo lipogenesis are increased, while a compensatory enhancement of fatty acid oxidation is insufficient in normalizing lipid levels and may even promote cellular damage and disease progression by inducing oxidative stress, especially with compromised mitochondrial function and increased oxidation in peroxisomes and cytochromes. While lipid export initially increases, it plateaus and may even decrease with disease progression, sustaining the accumulation of lipids. Fueled by lipo-apoptosis, hepatic steatosis leads to systemic metabolic disarray that adversely affects multiple organs, placing abnormal lipid metabolism associated with NAFLD in close relation to many of the current life-style-related diseases.

Journal ArticleDOI
TL;DR: This work realizes a dendrite-free Li metal anode by introducing an anion-immobilized composite solid electrolyte, where anions are tethered to polymer chains and ceramic particles to inhibit lithium dendrites and construct safe batteries.
Abstract: Lithium metal is strongly regarded as a promising electrode material in next-generation rechargeable batteries due to its extremely high theoretical specific capacity and lowest reduction potential. However, the safety issue and short lifespan induced by uncontrolled dendrite growth have hindered the practical applications of lithium metal anodes. Hence, we propose a flexible anion-immobilized ceramic–polymer composite electrolyte to inhibit lithium dendrites and construct safe batteries. Anions in the composite electrolyte are tethered by a polymer matrix and ceramic fillers, inducing a uniform distribution of space charges and lithium ions that contributes to a dendrite-free lithium deposition. The dissociation of anions and lithium ions also helps to reduce the polymer crystallinity, rendering stable and fast transportation of lithium ions. Ceramic fillers in the electrolyte extend the electrochemically stable window to as wide as 5.5 V and provide a barrier to short circuiting for realizing safe batteries at elevated temperature. The anion-immobilized electrolyte can be applied in all-solid-state batteries and exhibits a small polarization of 15 mV. Cooperated with LiFePO4 and LiNi0.5Co0.2Mn0.3O2 cathodes, the all-solid-state lithium metal batteries render excellent specific capacities of above 150 mAh⋅g−1 and well withstand mechanical bending. These results reveal a promising opportunity for safe and flexible next-generation lithium metal batteries.

Journal ArticleDOI
TL;DR: The obtained data indicate that obese persons in the Ukrainian adult population have a significantly higher level of Firmicutes and a lower level of Bacteroidetes compared to normal-weight and lean adults.
Abstract: Metagenomic studies confirm that obesity is associated with the composition of the gut microbiota. There are some controversies, however, about the composition of gut microbial communities in obese individuals in different populations. To examine the association between body mass index and microbiota composition in the Ukrainian population, fecal concentrations of Bacteroidetes, Firmicutes, Actinobacteria and the Firmicutes/Bacteroidetes (F/B) ratio were analyzed in 61 adult individuals. The relative abundance of Actinobacteria was small (5–7%) and comparable across BMI categories. The content of Firmicutes gradually increased while the content of Bacteroidetes decreased with increasing body mass index (BMI). The F/B ratio also rose with increasing BMI. In an unadjusted logistic regression model, the F/B ratio was significantly associated with BMI (OR = 1.23, 95% CI 1.09–1.38). This association remained significant after adjusting for confounders such as age, sex, tobacco smoking and physical activity (OR = 1.33, 95% CI 1.11–1.60). The obtained data indicate that obese persons in the Ukrainian adult population have a significantly higher level of Firmicutes and a lower level of Bacteroidetes compared to normal-weight and lean adults.
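
The association reported above is a standard adjusted logistic regression, with odds ratios obtained by exponentiating the coefficients. A minimal sketch on simulated data follows; the variable names and the simulation are made up for illustration and are not the study's dataset.

```python
# Sketch: adjusted logistic regression of obesity on the Firmicutes/Bacteroidetes
# ratio, reporting odds ratios and 95% CIs. Data are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "fb_ratio": rng.gamma(2.0, 1.0, n),
    "age": rng.integers(18, 70, n),
    "male": rng.integers(0, 2, n),
    "smoker": rng.integers(0, 2, n),
    "active": rng.integers(0, 2, n),
})
logit_p = -2.0 + 0.3 * df["fb_ratio"] + 0.01 * df["age"]
df["obese"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Unadjusted and confounder-adjusted models, mirroring the study design.
m0 = smf.logit("obese ~ fb_ratio", data=df).fit(disp=0)
m1 = smf.logit("obese ~ fb_ratio + age + male + smoker + active", data=df).fit(disp=0)

for name, m in [("unadjusted", m0), ("adjusted", m1)]:
    or_ = np.exp(m.params["fb_ratio"])
    lo, hi = np.exp(m.conf_int().loc["fb_ratio"])
    print(f"{name}: OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```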


Journal ArticleDOI
TL;DR: This research paper develops, explicates, and provides evidence for the utility of a Framework for Evaluation in Design Science (FEDS) together with a process to guide design science researchers in developing a strategy for evaluating the artefacts they develop within a DSR project.
Abstract: Evaluation of design artefacts and design theories is a key activity in Design Science Research (DSR), as it provides feedback for further development and (if done correctly) assures the rigour of the research. However, the extant DSR literature provides insufficient guidance on evaluation to enable Design Science Researchers to effectively design and incorporate evaluation activities into a DSR project that can achieve DSR goals and objectives. To address this research gap, this research paper develops, explicates, and provides evidence for the utility of a Framework for Evaluation in Design Science (FEDS) together with a process to guide design science researchers in developing a strategy for evaluating the artefacts they develop within a DSR project. A FEDS strategy considers why, when, how, and what to evaluate. FEDS includes a two-dimensional characterisation of DSR evaluation episodes (particular evaluations), with one dimension being the functional purpose of the evaluation (formative or summative) and the other dimension being the paradigm of the evaluation (artificial or naturalistic). The FEDS evaluation design process is comprised of four steps: (1) explicate the goals of the evaluation, (2) choose the evaluation strategy or strategies, (3) determine the properties to evaluate, and (4) design the individual evaluation episode(s). The paper illustrates the framework with two examples and provides evidence of its utility via a naturalistic, summative evaluation through its use on an actual DSR project.

Journal ArticleDOI
TL;DR: A new version of KEGG Mapper is reported, a suite of KEGG mapping tools available at the KEGG website, together with the KOALA family tools for automatic assignment of KO (KEGG Orthology) identifiers used in the mapping.
Abstract: KEGG is a reference knowledge base for biological interpretation of large-scale molecular datasets, such as genome and metagenome sequences. It accumulates experimental knowledge about high-level functions of the cell and the organism represented in terms of KEGG molecular networks, including KEGG pathway maps, BRITE hierarchies, and KEGG modules. By the process called KEGG mapping, a set of protein coding genes in the genome, for example, can be converted to KEGG molecular networks enabling interpretation of cellular functions and other high-level features. Here we report a new version of KEGG Mapper, a suite of KEGG mapping tools available at the KEGG website (https://www.kegg.jp/ or https://www.genome.jp/kegg/), together with the KOALA family tools for automatic assignment of KO (KEGG Orthology) identifiers used in the mapping.
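
Mapping KO identifiers to pathways can also be scripted against the public KEGG REST service. The sketch below uses the generic REST `link` operation rather than the KEGG Mapper web tools themselves, and the example KO identifier is arbitrary.

```python
# Sketch: map a KO (KEGG Orthology) identifier to KEGG pathways via the public
# KEGG REST API's "link" operation. This illustrates scripted KEGG mapping in
# general, not the KEGG Mapper web interface itself.
import requests

def ko_to_pathways(ko_id: str) -> list[str]:
    # e.g. https://rest.kegg.jp/link/pathway/ko:K00161
    resp = requests.get(f"https://rest.kegg.jp/link/pathway/{ko_id}", timeout=30)
    resp.raise_for_status()
    pathways = []
    for line in resp.text.strip().splitlines():
        if not line:
            continue
        # A typical line is tab-separated, e.g. "ko:K00161<TAB>path:ko00010".
        _, pathway = line.split("\t")
        pathways.append(pathway)
    return pathways

if __name__ == "__main__":
    print(ko_to_pathways("ko:K00161"))
```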

Journal ArticleDOI
18 Mar 2020-BMJ
TL;DR: In this article, the authors provide guidance on how to calculate the sample size required to develop a clinical prediction model.
Abstract: Clinical prediction models aim to predict outcomes in individuals, to inform diagnosis or prognosis in healthcare. Hundreds of prediction models are published in the medical literature each year, yet many are developed using a dataset that is too small for the total number of participants or outcome events. This leads to inaccurate predictions and consequently incorrect healthcare decisions for some individuals. In this article, the authors provide guidance on how to calculate the sample size required to develop a clinical prediction model.
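
As a flavour of the kind of calculation involved, one widely used criterion of this type targets a global shrinkage factor S (for example 0.9) given the number of candidate predictor parameters p and an anticipated Cox–Snell R². The sketch below implements my understanding of such a shrinkage-based criterion; it is an assumption-laden illustration, not a substitute for the formulas and full guidance in the article.

```python
# Hedged sketch: minimum sample size from a shrinkage-based criterion of the
# kind used for clinical prediction models. Choose a target shrinkage factor S
# (e.g. 0.9), the number of candidate predictor parameters p, and an anticipated
# Cox-Snell R^2, then solve n = p / ((S - 1) * ln(1 - R2_cs / S)).
# This reflects my reading of one such criterion, not the article's full method.
import math

def min_sample_size(p: int, r2_cs: float, shrinkage: float = 0.9) -> int:
    n = p / ((shrinkage - 1) * math.log(1 - r2_cs / shrinkage))
    return math.ceil(n)

# Example: 20 candidate predictor parameters, anticipated Cox-Snell R^2 of 0.1.
print(min_sample_size(p=20, r2_cs=0.1))
```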

Journal ArticleDOI
TL;DR: The quality of a narrative review may be improved by borrowing from the systematic review methodologies that are aimed at reducing bias in the selection of articles for review and employing an effective bibliographic research strategy.
Abstract: Reviews provide a synthesis of published literature on a topic and describe its current state-of-art. Reviews in clinical research are thus useful when designing studies or developing practice guidelines. The two standard types of reviews are (a) systematic and (b) non-systematic or narrative review. Unlike systematic reviews that benefit from guidelines such as PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) statement, there are no acknowledged guidelines for narrative reviews. I have attempted to define the best practice recommendations for the preparation of a narrative review in clinical research. The quality of a narrative review may be improved by borrowing from the systematic review methodologies that are aimed at reducing bias in the selection of articles for review and employing an effective bibliographic research strategy. The dynamics of narrative review writing, the organizational pattern of the text, the analysis, and the synthesis processes are also discussed.

Journal ArticleDOI
TL;DR: A large, monthly frequency, macroeconomic database designed to be updated monthly using the Federal Reserve Economic Data (FRED) database is described, and it is suggested that diffusion indexes constructed as the partial sum of the factor estimates can potentially be useful for the study of business cycle chronology.
Abstract: This article describes a large, monthly frequency, macroeconomic database with the goal of establishing a convenient starting point for empirical analysis that requires “big data.” The dataset mimics the coverage of those already used in the literature but has three appealing features. First, it is designed to be updated monthly using the Federal Reserve Economic Data (FRED) database. Second, it will be publicly accessible, facilitating comparison of related research and replication of empirical work. Third, it will relieve researchers from having to manage data changes and revisions. We show that factors extracted from our dataset share the same predictive content as those based on various vintages of the so-called Stock–Watson dataset. In addition, we suggest that diffusion indexes constructed as the partial sum of the factor estimates can potentially be useful for the study of business cycle chronology. Supplementary materials for this article are available online.
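
A minimal sketch of the downstream use described above: extract principal-component factors from the transformed, standardized monthly panel and form a diffusion index as the partial sum of a factor estimate. The random panel below is a placeholder for the FRED-MD csv distributed with the article, and the factor count is an arbitrary choice.

```python
# Sketch: extract principal-component factors from a standardized monthly macro
# panel and build a diffusion index as the cumulative (partial) sum of the first
# factor estimate. The random panel stands in for the FRED-MD data, already
# transformed to stationarity.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
dates = pd.date_range("2000-01-01", periods=240, freq="MS")
panel = pd.DataFrame(rng.standard_normal((240, 50)), index=dates)  # stand-in panel

X = StandardScaler().fit_transform(panel.fillna(panel.mean()))
pca = PCA(n_components=8)
factors = pd.DataFrame(pca.fit_transform(X), index=dates,
                       columns=[f"F{i+1}" for i in range(8)])

# Diffusion index: partial sum of the first factor, a rough business-cycle tracker.
diffusion_index = factors["F1"].cumsum()
print(diffusion_index.tail())
```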

Proceedings Article
03 Jul 2018
TL;DR: This paper proposes a meta-learning algorithm that learns to assign weights to training examples based on their gradient directions, which can be easily implemented on any type of deep network, does not require any additional hyperparameter tuning, and achieves impressive performance on class imbalance and corrupted label problems where only a small amount of clean validation data is available.
Abstract: Deep neural networks have been shown to be very powerful modeling tools for many supervised learning tasks involving complex input patterns. However, they can also easily overfit to training set biases and label noises. In addition to various regularizers, example reweighting algorithms are popular solutions to these problems, but they require careful tuning of additional hyperparameters, such as example mining schedules and regularization hyperparameters. In contrast to past reweighting methods, which typically consist of functions of the cost value of each example, in this work we propose a novel meta-learning algorithm that learns to assign weights to training examples based on their gradient directions. To determine the example weights, our method performs a meta gradient descent step on the current mini-batch example weights (which are initialized from zero) to minimize the loss on a clean unbiased validation set. Our proposed method can be easily implemented on any type of deep network, does not require any additional hyperparameter tuning, and achieves impressive performance on class imbalance and corrupted label problems where only a small amount of clean validation data is available.
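
A condensed sketch of the meta-gradient reweighting step described above, using a toy MLP, random data, and `torch.func.functional_call` for the virtual parameter update. It follows my reading of the algorithm rather than the authors' released code.

```python
# Sketch of gradient-based example reweighting: take a virtual SGD step with
# per-example weights eps (initialized to zero), measure the loss on a small
# clean validation batch under the virtual parameters, and keep only examples
# whose gradients align with improving that validation loss.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.func import functional_call  # PyTorch >= 2.0

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
inner_lr = 0.1

def meta_reweight_step(x, y, x_val, y_val):
    params = dict(model.named_parameters())
    # 1) per-example weights, initialized to zero, differentiated through
    eps = torch.zeros(x.size(0), requires_grad=True)
    losses = F.cross_entropy(functional_call(model, params, (x,)), y, reduction="none")
    weighted_loss = (eps * losses).sum()
    # 2) one virtual SGD step on the training mini-batch
    grads = torch.autograd.grad(weighted_loss, list(params.values()), create_graph=True)
    new_params = {k: p - inner_lr * g for (k, p), g in zip(params.items(), grads)}
    # 3) validation loss under the virtual parameters, gradient w.r.t. eps
    val_loss = F.cross_entropy(functional_call(model, new_params, (x_val,)), y_val)
    grad_eps = torch.autograd.grad(val_loss, eps)[0]
    # 4) keep only examples whose gradient direction helps the clean set
    w = torch.clamp(-grad_eps, min=0.0)
    w = w / w.sum() if w.sum() > 0 else w
    # 5) real update with the (detached) weights
    opt.zero_grad()
    loss = (w.detach() * F.cross_entropy(model(x), y, reduction="none")).sum()
    loss.backward()
    opt.step()
    return loss.item()

x, y = torch.randn(16, 20), torch.randint(0, 2, (16,))
x_val, y_val = torch.randn(8, 20), torch.randint(0, 2, (8,))
print(meta_reweight_step(x, y, x_val, y_val))
```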

Journal ArticleDOI
TL;DR: The authors found strong support for the expressive model: a multi-item partisan identity scale better accounts for campaign activity than a strong stance on subjectively important policy issues, the strength of ideological self-placement, or a measure of ideological identity.
Abstract: Party identification is central to the study of American political behavior, yet there remains disagreement over whether it is largely instrumental or expressive in nature. We draw on social identity theory to develop the expressive model and conduct four studies to compare it to an instrumental explanation of campaign involvement. We find strong support for the expressive model: a multi-item partisan identity scale better accounts for campaign activity than a strong stance on subjectively important policy issues, the strength of ideological self-placement, or a measure of ideological identity. A series of experiments underscore the power of partisan identity to generate action-oriented emotions that drive campaign activity. Strongly identified partisans feel angrier than weaker partisans when threatened with electoral loss and more positive when reassured of victory. In contrast, those who hold a strong and ideologically consistent position on issues are no more aroused emotionally than others by party threats or reassurances. In addition, threat and reassurance to the party's status arouse greater anger and enthusiasm among partisans than does a threatened loss or victory on central policy issues. Our findings underscore the power of an expressive partisan identity to drive campaign involvement and generate strong emotional reactions to ongoing campaign events.

Journal ArticleDOI
TL;DR: In this article, the authors studied the convergence rate of the Wasserstein distance of a sample of a given probability distribution with respect to the probability distribution of a random sample, and provided a non-asymptotic bound on the convergence of the sample to a stationary Markov chain.
Abstract: Let $\mu_N$ be the empirical measure associated to an $N$-sample of a given probability distribution $\mu$ on $\mathbb{R}^d$. We are interested in the rate of convergence of $\mu_N$ to $\mu$, when measured in the Wasserstein distance of order $p>0$. We provide some satisfying non-asymptotic $L^p$-bounds and concentration inequalities, for any values of $p>0$ and $d\ge 1$. We also extend the non-asymptotic $L^p$-bounds to stationary $\rho$-mixing sequences, Markov chains, and to some interacting particle systems.
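
As a quick numerical illustration of the quantity being bounded (not of the paper's bounds themselves), one can estimate E[W_1(mu_N, mu)] for a one-dimensional Gaussian by Monte Carlo and watch it shrink as N grows; the reference-sample size and repetition count below are arbitrary choices.

```python
# Numerical illustration: Monte Carlo estimate of E[W_1(mu_N, mu)] for a
# standard 1-D Gaussian, approximating mu by a large reference sample, to see
# the decay of the empirical Wasserstein distance as N grows.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
reference = rng.standard_normal(200_000)   # proxy for the true distribution mu

for N in (10, 100, 1_000, 10_000):
    est = np.mean([wasserstein_distance(rng.standard_normal(N), reference)
                   for _ in range(30)])
    print(f"N = {N:>6}:  E[W_1] ~ {est:.4f}")
```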

Journal ArticleDOI
17 Mar 2015-Immunity
TL;DR: The evidence and mechanisms that mitochondrial dependent signaling controls innate and adaptive immune responses are discussed.

Journal ArticleDOI
Bonnie R. Joubert1, Janine F. Felix2, Paul Yousefi3, Kelly M. Bakulski4, Allan C. Just5, Carrie V. Breton6, Sarah E. Reese1, Christina A. Markunas1, Christina A. Markunas7, Rebecca C Richmond8, Cheng-Jian Xu9, Leanne K. Küpers9, Sam S. Oh10, Cathrine Hoyo11, Olena Gruzieva12, Cilla Söderhäll12, Lucas A. Salas13, Nour Baïz14, Hongmei Zhang15, Johanna Lepeule16, Carlos Ruiz13, Symen Ligthart2, Tianyuan Wang1, Jack A. Taylor1, Liesbeth Duijts, Gemma C Sharp8, Soesma A Jankipersadsing9, Roy Miodini Nilsen17, Ahmad Vaez9, Ahmad Vaez18, M. Daniele Fallin4, Donglei Hu10, Augusto A. Litonjua19, Bernard F. Fuemmeler7, Karen Huen3, Juha Kere12, Inger Kull12, Monica Cheng Munthe-Kaas20, Ulrike Gehring21, Mariona Bustamante, Marie José Saurel-Coubizolles22, Bilal M. Quraishi15, Jie Ren6, Jörg Tost, Juan R. González13, Marjolein J. Peters2, Siri E. Håberg23, Zongli Xu1, Joyce B. J. van Meurs2, Tom R. Gaunt8, Marjan Kerkhof9, Eva Corpeleijn9, Andrew P. Feinberg24, Celeste Eng10, Andrea A. Baccarelli25, Sara E. Benjamin Neelon4, Asa Bradman3, Simon Kebede Merid12, Anna Bergström12, Zdenko Herceg26, Hector Hernandez-Vargas26, Bert Brunekreef21, Mariona Pinart, Barbara Heude27, Susan Ewart28, Jin Yao6, Nathanaël Lemonnier29, Oscar H. Franco2, Michael C. Wu30, Albert Hofman25, Albert Hofman2, Wendy L. McArdle8, Pieter van der Vlies9, Fahimeh Falahi9, Matthew W. Gillman25, Lisa F. Barcellos3, Ashok Kumar31, Ashok Kumar32, Ashok Kumar12, Magnus Wickman33, Magnus Wickman12, Stefano Guerra, Marie-Aline Charles27, John W. Holloway34, Charles Auffray29, Henning Tiemeier2, George Davey Smith8, Dirkje S. Postma9, Marie-France Hivert25, Brenda Eskenazi3, Martine Vrijheid13, Hasan Arshad34, Josep M. Antó, Abbas Dehghan2, Wilfried Karmaus15, Isabella Annesi-Maesano14, Jordi Sunyer, Akram Ghantous26, Göran Pershagen12, Nina Holland3, Susan K. Murphy7, Dawn L. DeMeo19, Esteban G. Burchard10, Christine Ladd-Acosta4, Harold Snieder9, Wenche Nystad23, Gerard H. Koppelman9, Caroline L Relton8, Vincent W. V. Jaddoe2, Allen J. Wilcox1, Erik Melén33, Erik Melén12, Stephanie J. London1 
TL;DR: This large scale meta-analysis of methylation data identified numerous loci involved in response to maternal smoking in pregnancy with persistence into later childhood and provide insights into mechanisms underlying effects of this important exposure.
Abstract: Epigenetic modifications, including DNA methylation, represent a potential mechanism for environmental impacts on human disease. Maternal smoking in pregnancy remains an important public health problem that impacts child health in a myriad of ways and has potential lifelong consequences. The mechanisms are largely unknown, but epigenetics most likely plays a role. We formed the Pregnancy And Childhood Epigenetics (PACE) consortium and meta-analyzed, across 13 cohorts (n = 6,685), the association between maternal smoking in pregnancy and newborn blood DNA methylation at over 450,000 CpG sites (CpGs) by using the Illumina 450K BeadChip. Over 6,000 CpGs were differentially methylated in relation to maternal smoking at genome-wide statistical significance (false discovery rate, 5%), including 2,965 CpGs corresponding to 2,017 genes not previously related to smoking and methylation in either newborns or adults. Several genes are relevant to diseases that can be caused by maternal smoking (e.g., orofacial clefts and asthma) or adult smoking (e.g., certain cancers). A number of differentially methylated CpGs were associated with gene expression. We observed enrichment in pathways and processes critical to development. In older children (5 cohorts, n = 3,187), 100% of CpGs gave at least nominal levels of significance, far more than expected by chance (p value < 2.2 × 10(-16)). Results were robust to different normalization methods used across studies and cell type adjustment. In this large scale meta-analysis of methylation data, we identified numerous loci involved in response to maternal smoking in pregnancy with persistence into later childhood and provide insights into mechanisms underlying effects of this important exposure.

Journal ArticleDOI
TL;DR: In this paper, the authors present two new measurements of the Hubble parameter H(z) obtained with the cosmic chronometer method up to z ∼ 2, crossing for the first time the z ∼ 1.75 limit available in the literature.
Abstract: One of the most compelling tasks of modern cosmology is to constrain the expansion history of the Universe, since this measurement can give insights on the nature of dark energy and help to estimate cosmological parameters. In this letter are presented two new measurements of the Hubble parameter H(z) obtained with the cosmic chronometer method up to z ∼ 2. Taking advantage of near-infrared spectroscopy of the few very massive and passive galaxies observed at z > 1.4 available in literature, the differential evolution of this population is estimated and calibrated with different stellar population synthesis models to constrain H(z), including in the final error budget all possible sources of systematic uncertainties (star formation history, stellar metallicity, model dependences). This analysis is able to extend significantly the redshift range coverage with respect to present-day constraints, crossing for the first time the limit at z ∼ 1.75. The new H(z) data are used to estimate the gain in accuracy on cosmological parameters with respect to previous measurements in two cosmological models, finding a small but detectable improvement (∼5 per cent) in particular on Ω_M and w_0. Finally, a simulation of a Euclid-like survey has been performed to forecast the expected improvement with future data. The provided constraints have been obtained just with the cosmic chronometers approach, without any additional data, and the results show the high potentiality of this method to constrain the expansion history of the Universe at these redshifts.
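
The "cosmic chronometer" method rests on the differential-age relation between redshift and cosmic time for a passively evolving galaxy population; in the usual notation (a standard statement of the method, not an equation quoted from this letter):

```latex
% Differential-age (cosmic chronometer) relation: the Hubble parameter follows
% from the measured age difference dt of the population over a small redshift
% interval dz.
\[
  H(z) \;=\; -\,\frac{1}{1+z}\,\frac{dz}{dt}
        \;\approx\; -\,\frac{1}{1+z}\,\frac{\Delta z}{\Delta t}.
\]
```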

Journal ArticleDOI
Da Deng1
TL;DR: Li-ion batteries are the powerhouse for the digital electronic revolution in this modern mobile society, exclusively used in mobile phones and laptop computers, as discussed by the author; much effort has since been put into further improving the performance of Li-ion batteries, achieving significant progress.
Abstract: Li-ion batteries are the powerhouse for the digital electronic revolution in this modern mobile society, exclusively used in mobile phones and laptop computers. The success of commercial Li-ion batteries in the 1990s was not an overnight achievement, but a result of intensive research and contributions by many great scientists and engineers. Since then, much effort has been put into further improving the performance of Li-ion batteries, and significant progress has been achieved. To meet the increasing demand for energy storage, particularly from increasingly popular electric vehicles, intensified research is required to develop next-generation Li-ion batteries with dramatically improved performance, including improved specific energy and volumetric energy density, cyclability, charging rate, stability, and safety. There are still notable challenges in the development of next-generation Li-ion batteries. New battery concepts have to be further developed to go beyond Li-ion batteries in the future. In this tutorial review, the focus is to introduce the basic concepts, highlight the recent progress, and discuss the challenges regarding Li-ion batteries. A brief discussion of popularly studied "beyond Li-ion" batteries is also provided.

Journal ArticleDOI
TL;DR: The evolution of nature's enzymes can lead to the discovery of new reactivity, transformations not known in biology, and reactivity inaccessible by small‐molecule catalysts.
Abstract: Tailor-made: Discussed herein is the ability to adapt biology's mechanisms for innovation and optimization to solving problems in chemistry and engineering. The evolution of nature's enzymes can lead to the discovery of new reactivity, transformations not known in biology, and reactivity inaccessible by small-molecule catalysts.

Journal ArticleDOI
TL;DR: The perceptions of teachers and students regarding the advantages, limitations, and recommendations for online learning are explored, and the use of online learning in medical and dental institutes in Pakistan is supported, considering its various advantages.
Abstract: Objective: During COVID-19 pandemic, the institutions in Pakistan have started online learning. This study explores the perception of teachers and students regarding its advantages, limitations and recommendations. Methods: This qualitative case study was conducted from March to April 2020. Using maximum variation sampling, 12 faculty members and 12 students from University College of Medicine and University College of Dentistry, Lahore were invited to participate. Four focus group interviews, two each with the faculty and students of medicine and dentistry, were carried out. Data were transcribed verbatim and thematically analyzed using Atlas Ti. Results: The advantages included remote learning, comfort, accessibility, while the limitations involved inefficiency and difficulty in maintaining academic integrity. The recommendations were to train faculty on using online modalities and developing lesson plan with reduced cognitive load and increased interactivities. Conclusion: The current study supports the use of online learning in medical and dental institutes, considering its various advantages. Online learning modalities encourage student-centered learning and they are easily manageable during this lockdown situation.

Journal ArticleDOI
TL;DR: In this paper, the authors forecasted the prevalence of dementia attributable to the three dementia risk factors included in the Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2019 (high body-mass index, high fasting plasma glucose, and smoking) from 2019 to 2050, using relative risks and forecasted risk factor prevalence to predict GBD risk-attributable prevalence in 2050 globally and by world region and country.
Abstract: Given the projected trends in population ageing and population growth, the number of people with dementia is expected to increase. In addition, strong evidence has emerged supporting the importance of potentially modifiable risk factors for dementia. Characterising the distribution and magnitude of anticipated growth is crucial for public health planning and resource prioritisation. This study aimed to improve on previous forecasts of dementia prevalence by producing country-level estimates and incorporating information on selected risk factors. We forecasted the prevalence of dementia attributable to the three dementia risk factors included in the Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2019 (high body-mass index, high fasting plasma glucose, and smoking) from 2019 to 2050, using relative risks and forecasted risk factor prevalence to predict GBD risk-attributable prevalence in 2050 globally and by world region and country. Using linear regression models with education included as an additional predictor, we then forecasted the prevalence of dementia not attributable to GBD risks. To assess the relative contribution of future trends in GBD risk factors, education, population growth, and population ageing, we did a decomposition analysis. We estimated that the number of people with dementia would increase from 57·4 (95% uncertainty interval 50·4-65·1) million cases globally in 2019 to 152·8 (130·8-175·9) million cases in 2050. Despite large increases in the projected number of people living with dementia, age-standardised both-sex prevalence remained stable between 2019 and 2050 (global percentage change of 0·1% [-7·5 to 10·8]). We estimated that there were more women with dementia than men with dementia globally in 2019 (female-to-male ratio of 1·69 [1·64-1·73]), and we expect this pattern to continue to 2050 (female-to-male ratio of 1·67 [1·52-1·85]). There was geographical heterogeneity in the projected increases across countries and regions, with the smallest percentage changes in the number of projected dementia cases in high-income Asia Pacific (53% [41-67]) and western Europe (74% [58-90]), and the largest in north Africa and the Middle East (367% [329-403]) and eastern sub-Saharan Africa (357% [323-395]). Projected increases in cases could largely be attributed to population growth and population ageing, although their relative importance varied by world region, with population growth contributing most to the increases in sub-Saharan Africa and population ageing contributing most to the increases in east Asia. Growth in the number of individuals living with dementia underscores the need for public health planning efforts and policy to address the needs of this group. Country-level estimates can be used to inform national planning efforts and decisions. Multifaceted approaches, including scaling up interventions to address modifiable risk factors and investing in research on biological mechanisms, will be key in addressing the expected increases in the number of individuals affected by dementia. Funding: Bill & Melinda Gates Foundation and Gates Ventures.
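
The risk-attributable part of such a forecast is, at its core, a population-attributable-fraction calculation. The sketch below shows that arithmetic for a single risk factor with made-up numbers; the GBD forecasting framework referenced above is considerably more elaborate.

```python
# Sketch: population attributable fraction (PAF) arithmetic for one risk factor,
# and the share of dementia prevalence attributable to it. All numbers are
# hypothetical placeholders, not GBD estimates.
def paf(exposure_prevalence: float, relative_risk: float) -> float:
    # PAF = p(RR - 1) / (p(RR - 1) + 1)
    excess = exposure_prevalence * (relative_risk - 1.0)
    return excess / (excess + 1.0)

dementia_prevalence = 0.07          # hypothetical prevalence in older adults
p_exposed, rr = 0.25, 1.6           # hypothetical risk-factor prevalence and RR

fraction = paf(p_exposed, rr)
print(f"PAF = {fraction:.3f}")
print(f"Attributable prevalence = {dementia_prevalence * fraction:.4f}")
```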