
Showing papers by "National University of Singapore" published in 2016


Journal ArticleDOI
Daniel J. Klionsky1, Kotb Abdelmohsen2, Akihisa Abe3, Joynal Abedin4  +2519 moreInstitutions (695)
TL;DR: In this paper, the authors present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. 
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to target by gene knockout or RNA interference more than one autophagy-related protein. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways implying that not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular assays, we hope to encourage technical innovation in the field.

5,187 citations


Journal ArticleDOI
29 Jul 2016-Science
TL;DR: Two-dimensional heterostructures with an extended range of functionalities yield a range of possible applications, and spectrum reconstruction in graphene interacting with hBN allowed several groups to study the Hofstadter butterfly effect and topological currents in such a system.
Abstract: BACKGROUND: Materials by design is an appealing idea that is very hard to realize in practice. Combining the best of different ingredients in one ultimate material is a task for which we currently have no general solution. However, we do have some successful examples to draw upon: Composite materials and III-V heterostructures have revolutionized many aspects of our lives. Still, we need a general strategy to solve the problem of mixing and matching crystals with different properties, creating combinations with predetermined attributes and functionalities.

ADVANCES: Two-dimensional (2D) materials offer a platform that allows creation of heterostructures with a variety of properties. One-atom-thick crystals now comprise a large family of these materials, collectively covering a very broad range of properties. The first material to be included was graphene, a zero-overlap semimetal. The family of 2D crystals has grown to include metals (e.g., NbSe2), semiconductors (e.g., MoS2), and insulators [e.g., hexagonal boron nitride (hBN)]. Many of these materials are stable at ambient conditions, and we have come up with strategies for handling those that are not. Surprisingly, the properties of such 2D materials are often very different from those of their 3D counterparts. Furthermore, even the study of familiar phenomena (like superconductivity or ferromagnetism) in the 2D case, where there is no long-range order, raises many thought-provoking questions. A plethora of opportunities appear when we start to combine several 2D crystals in one vertical stack. Held together by van der Waals forces (the same forces that hold layered materials together), such heterostructures allow a far greater number of combinations than any traditional growth method. As the family of 2D crystals is expanding day by day, so too is the complexity of the heterostructures that could be created with atomic precision.
When stacking different crystals together, the synergetic effects become very important. In the first-order approximation, charge redistribution might occur between the neighboring (and even more distant) crystals in the stack. Neighboring crystals can also induce structural changes in each other. Furthermore, such changes can be controlled by adjusting the relative orientation between the individual elements. Such heterostructures have already led to the observation of numerous exciting physical phenomena. Thus, spectrum reconstruction in graphene interacting with hBN allowed several groups to study the Hofstadter butterfly effect and topological currents in such a system. The possibility of positioning crystals in very close (but controlled) proximity to one another allows for the study of tunneling and drag effects. The use of semiconducting monolayers leads to the creation of optically active heterostructures. The extended range of functionalities of such heterostructures yields a range of possible applications. Now the highest-mobility graphene transistors are achieved by encapsulating graphene with hBN. Photovoltaic and light-emitting devices have been demonstrated by combining optically active semiconducting layers and graphene as transparent electrodes.

OUTLOOK: Currently, most 2D heterostructures are assembled by direct stacking of individual monolayer flakes of different materials. Although this method allows ultimate flexibility, it is slow and cumbersome. Thus, techniques involving transfer of large-area crystals grown by chemical vapor deposition (CVD), direct growth of heterostructures by CVD or physical epitaxy, or one-step growth in solution are being developed. Currently, we are at the same level as we were with graphene 10 years ago: plenty of interesting science and unclear prospects for mass production.
Given the fast progress of graphene technology over the past few years, we can expect similar advances in the production of the heterostructures, making the science and applications more achievable.

4,851 citations


Journal ArticleDOI
Haidong Wang1, Mohsen Naghavi1, Christine Allen1, Ryan M Barber1  +841 moreInstitutions (293)
TL;DR: The Global Burden of Disease 2015 Study provides a comprehensive assessment of all-cause and cause-specific mortality for 249 causes in 195 countries and territories from 1980 to 2015, finding several countries in sub-Saharan Africa had very large gains in life expectancy, rebounding from an era of exceedingly high loss of life due to HIV/AIDS.

4,804 citations


Journal ArticleDOI
TL;DR: An overview of UAV-aided wireless communications is provided, by introducing the basic networking architecture and main channel characteristics, highlighting the key design considerations as well as the new opportunities to be exploited.
Abstract: Wireless communication systems that include unmanned aerial vehicles promise to provide cost-effective wireless connectivity for devices without infrastructure coverage. Compared to terrestrial communications or those based on high-altitude platforms, on-demand wireless systems with low-altitude UAVs are in general faster to deploy, more flexibly reconfigured, and likely to have better communication channels due to the presence of short-range line-of-sight links. However, the utilization of highly mobile and energy-constrained UAVs for wireless communications also introduces many new challenges. In this article, we provide an overview of UAV-aided wireless communications, by introducing the basic networking architecture and main channel characteristics, highlighting the key design considerations as well as the new opportunities to be exploited.

3,145 citations


Journal ArticleDOI
Bin Zhou1, Yuan Lu2, Kaveh Hajifathalian2, James Bentham1  +494 moreInstitutions (170)
TL;DR: In this article, the authors used a Bayesian hierarchical model to estimate trends in diabetes prevalence, defined as fasting plasma glucose of 7.0 mmol/L or higher, or history of diagnosis with diabetes, or use of insulin or oral hypoglycaemic drugs in 200 countries and territories in 21 regions, by sex and from 1980 to 2014.

2,782 citations


Journal ArticleDOI
TL;DR: This paper provides further justification to prioritise promotion of regular physical activity worldwide as part of a comprehensive strategy to reduce non-communicable diseases.

1,369 citations


Journal ArticleDOI
26 Jul 2016-eLife
TL;DR: The height differential between the tallest and shortest populations was 19-20 cm a century ago, and has remained the same for women and increased for men a century later despite substantial changes in the ranking of countries.
Abstract: Being taller is associated with enhanced longevity, and higher education and earnings. We reanalysed 1472 population-based studies, with measurement of height on more than 18.6 million participants, to estimate mean height for people born between 1896 and 1996 in 200 countries. The largest gain in adult height over the past century has occurred in South Korean women and Iranian men, who became 20.2 cm (95% credible interval 17.5–22.7) and 16.5 cm (13.3–19.7) taller, respectively. In contrast, there was little change in adult height in some sub-Saharan African countries and in South Asia over the century of analysis. The tallest people over these 100 years are men born in the Netherlands in the last quarter of the 20th century, whose average heights surpassed 182.5 cm, and the shortest were women born in Guatemala in 1896 (140.3 cm; 135.8–144.8). The height differential between the tallest and shortest populations was 19–20 cm a century ago; a century later it has remained the same for women but increased for men, despite substantial changes in the ranking of countries.

1,348 citations


Proceedings Article
12 Feb 2016
TL;DR: Correlation alignment (CORAL) as discussed by the authors minimizes domain shift by aligning the second-order statistics of source and target distributions, without requiring any target labels, and it can be implemented in four lines of Matlab code.
Abstract: Unlike human learning, machine learning often fails to handle changes between training (source) and test (target) input distributions. Such domain shifts, common in practical scenarios, severely damage the performance of conventional machine learning methods. Supervised domain adaptation methods have been proposed for the case when the target data have labels, including some that perform very well despite being "frustratingly easy" to implement. However, in practice, the target domain is often unlabeled, requiring unsupervised adaptation. We propose a simple, effective, and efficient method for unsupervised domain adaptation called CORrelation ALignment (CORAL). CORAL minimizes domain shift by aligning the second-order statistics of source and target distributions, without requiring any target labels. Even though it is extraordinarily simple–it can be implemented in four lines of Matlab code–CORAL performs remarkably well in extensive evaluations on standard benchmark datasets.
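The CORAL transform is simple enough to restate. Below is a hypothetical re-sketch in NumPy rather than the paper's four lines of Matlab (function and variable names are our own): whiten the source features with a regularized source covariance, then re-color them with the target covariance.

```python
import numpy as np

def _sym_mat_pow(C, p):
    """Power of a symmetric positive-definite matrix via eigendecomposition."""
    w, V = np.linalg.eigh(C)
    return (V * np.clip(w, 1e-12, None) ** p) @ V.T

def coral(Ds, Dt, eps=1.0):
    """Align second-order statistics of source features Ds to target Dt.

    Ds, Dt: (n_samples, n_features) arrays; eps regularizes the covariances.
    """
    d = Ds.shape[1]
    Cs = np.cov(Ds, rowvar=False) + eps * np.eye(d)  # regularized source covariance
    Ct = np.cov(Dt, rowvar=False) + eps * np.eye(d)  # regularized target covariance
    # Whiten with source statistics, then re-color with target statistics.
    return Ds @ _sym_mat_pow(Cs, -0.5) @ _sym_mat_pow(Ct, 0.5)
```

After the transform, any off-the-shelf classifier trained on the aligned source features can be applied to the target domain, which is what makes the method "frustratingly easy".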

1,340 citations


Journal ArticleDOI
TL;DR: In this article, the authors review various studies on the resource potential of natural gas hydrate, current research progress in laboratory settings, and several recent field trials, and discuss possible limitations of each production method and the challenges to be addressed for large-scale production.

1,236 citations


Proceedings ArticleDOI
24 Oct 2016
TL;DR: This paper investigates the security of running smart contracts based on Ethereum in an open distributed network like those of cryptocurrencies, and proposes ways to enhance the operational semantics of Ethereum to make contracts less vulnerable.
Abstract: Cryptocurrencies record transactions in a decentralized data structure called a blockchain. Two of the most popular cryptocurrencies, Bitcoin and Ethereum, support the feature to encode rules or scripts for processing transactions. This feature has evolved to give practical shape to the ideas of smart contracts, or full-fledged programs that are run on blockchains. Recently, Ethereum's smart contract system has seen steady adoption, supporting tens of thousands of contracts holding millions of dollars' worth of virtual coins. In this paper, we investigate the security of running smart contracts based on Ethereum in an open distributed network like those of cryptocurrencies. We introduce several new security problems in which an adversary can manipulate smart contract execution to gain profit. These bugs suggest subtle gaps in the understanding of the distributed semantics of the underlying platform. As a refinement, we propose ways to enhance the operational semantics of Ethereum to make contracts less vulnerable. For developers writing contracts for the existing Ethereum system, we build a symbolic execution tool called Oyente to find potential security bugs. Among 19,336 existing Ethereum contracts, Oyente flags 8,833 as vulnerable, including the TheDAO bug, which led to a 60 million US dollar loss in June 2016. We also discuss the severity of other attacks for several case studies which have source code available and confirm the attacks (which target only our accounts) in the main Ethereum network.
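One of the bug classes the paper discusses, transaction-ordering dependence, can be illustrated without any Ethereum machinery. The toy Python model below (not Solidity, and not the paper's code; all names are hypothetical) shows how a miner's choice of ordering for two pending transactions changes a contract's payout:

```python
class PuzzleContract:
    """Hypothetical bounty contract: an owner funds a reward, a solver claims it."""
    def __init__(self, reward):
        self.reward = reward

    def update_reward(self, new_reward):   # owner's transaction
        self.reward = new_reward

    def submit_solution(self):             # solver's transaction
        paid, self.reward = self.reward, 0
        return paid

def run(order):
    """Execute the two pending transactions in the given order; return the payout."""
    c = PuzzleContract(reward=100)
    payout = 0
    for tx in order:
        if tx == "update":
            c.update_reward(1)             # owner slashes the bounty
        else:
            payout = c.submit_solution()   # solver claims whatever is left
    return payout
```

The same two transactions yield a payout of 100 in one order and 1 in the other, and the miner, not the users, decides which order lands in the block.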

1,232 citations


Journal ArticleDOI
TL;DR: In this paper, the authors discuss the underpinnings of the topological band theory and its materials applications, and propose a framework for predicting new classes of topological materials.
Abstract: First-principles band theory, properly augmented by topological considerations, has provided a remarkably successful framework for predicting new classes of topological materials. This Colloquium discusses the underpinnings of the topological band theory and its materials applications.

Journal ArticleDOI
TL;DR: The intricate web of crosstalk among the often redundant multitudes of signaling intermediates is just beginning to be understood, and future research employing genome-scale systems biology approaches will undoubtedly lead to a better understanding of plant development.
Abstract: Being sessile organisms, plants are often exposed to a wide array of abiotic and biotic stresses. Abiotic stress conditions include drought, heat, cold and salinity, whereas biotic stress arises mainly from bacteria, fungi, viruses, nematodes and insects. To adapt to such adverse situations, plants have evolved well-developed mechanisms that help to perceive the stress signal and enable optimal growth response. Phytohormones play critical roles in helping the plants to adapt to adverse environmental conditions. The elaborate hormone signaling networks and their ability to crosstalk make them ideal candidates for mediating defense responses. Recent research findings have helped to clarify the elaborate signaling networks and the sophisticated crosstalk occurring among the different hormone signaling pathways. In this review, we summarize the roles of the major plant hormones in regulating abiotic and biotic stress responses with special focus on the significance of crosstalk between different hormones in generating a sophisticated and efficient stress response. We divided the discussion into the roles of ABA, salicylic acid, jasmonates and ethylene separately at the start of the review. Subsequently, we have discussed the crosstalk among them, followed by crosstalk with growth promoting hormones (gibberellins, auxins and cytokinins). These have been illustrated with examples drawn from selected abiotic and biotic stress responses. The discussion on seed dormancy and germination serves to illustrate the fine balance that can be enforced by the two key hormones ABA and GA in regulating plant responses to environmental signals. The intricate web of crosstalk among the often redundant multitudes of signaling intermediates is just beginning to be understood. Future research employing genome-scale systems biology approaches to solve problems of such magnitude will undoubtedly lead to a better understanding of plant development. 
Therefore, discovering additional crosstalk mechanisms among various hormones in coordinating growth under stress will be an important theme in the field of abiotic stress research. Such efforts will help to reveal important points of genetic control that can be useful to engineer stress tolerant crops.

Journal ArticleDOI
TL;DR: Numerical results show that by optimizing the trajectory of the relay and power allocations adaptive to its induced channel variation, mobile relaying is able to achieve significant throughput gains over the conventional static relaying.
Abstract: In this paper, we consider a novel mobile relaying technique, where the relay nodes are mounted on unmanned aerial vehicles (UAVs) and hence are capable of moving at high speed. Compared with conventional static relaying, mobile relaying offers a new degree of freedom for performance enhancement via careful relay trajectory design. We study the throughput maximization problem in mobile relaying systems by optimizing the source/relay transmit power along with the relay trajectory, subject to practical mobility constraints (on the UAV’s speed and initial/final relay locations), as well as the information-causality constraint at the relay. It is shown that for the fixed relay trajectory, the throughput-optimal source/relay power allocations over time follow a “staircase” water filling structure, with non-increasing and non-decreasing water levels at the source and relay, respectively. On the other hand, with given power allocations, the throughput can be further improved by optimizing the UAV’s trajectory via successive convex optimization. An iterative algorithm is thus proposed to optimize the power allocations and relay trajectory alternately. Furthermore, for the special case with free initial and final relay locations, the jointly optimal power allocation and relay trajectory are derived. Numerical results show that by optimizing the trajectory of the relay and power allocations adaptive to its induced channel variation, mobile relaying is able to achieve significant throughput gains over the conventional static relaying.
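The paper's "staircase" power allocation generalizes classic water-filling, which is easy to sketch. The function below is an illustration of the standard single-link version, not the paper's algorithm: maximize the sum-rate over parallel channel uses subject to a total power budget, bisecting on the water level.

```python
import numpy as np

def water_filling(gains, P_total, tol=1e-9):
    """Maximize sum(log2(1 + p_i * g_i)) subject to sum(p_i) = P_total, p_i >= 0.

    `gains` are channel power gains normalized by noise. The optimal allocation
    is p_i = max(0, mu - 1/g_i) for a water level mu found here by bisection.
    """
    g = np.asarray(gains, dtype=float)
    lo, hi = 0.0, P_total + (1.0 / g).max()   # mu is bracketed in [lo, hi]
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        if np.maximum(0.0, mu - 1.0 / g).sum() > P_total:
            hi = mu                            # water level too high
        else:
            lo = mu                            # water level too low
    return np.maximum(0.0, mu - 1.0 / g)
```

As expected, stronger channels receive more power; the paper's result adds the twist that, over a UAV trajectory, the source's water levels are non-increasing and the relay's non-decreasing in time.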

Proceedings ArticleDOI
24 Oct 2016
TL;DR: ELASTICO is the first candidate for a secure sharding protocol in the presence of Byzantine adversaries, and scalability experiments on Amazon EC2 with up to 1,600 nodes confirm ELASTICO's theoretical scaling properties.
Abstract: Cryptocurrencies, such as Bitcoin and 250 similar alt-coins, embody at their core a blockchain protocol --- a mechanism for a distributed network of computational nodes to periodically agree on a set of new transactions. Designing a secure blockchain protocol relies on an open challenge in security, that of designing a highly scalable agreement protocol open to manipulation by Byzantine or arbitrarily malicious nodes. Bitcoin's blockchain agreement protocol exhibits security, but does not scale: it processes 3--7 transactions per second at present, irrespective of the available computation capacity at hand. In this paper, we propose a new distributed agreement protocol for permissionless blockchains called ELASTICO. ELASTICO scales transaction rates almost linearly with available computation for mining: the more the computation power in the network, the higher the number of transaction blocks selected per unit time. ELASTICO is efficient in its network messages and tolerates Byzantine adversaries of up to one-fourth of the total computational power. Technically, ELASTICO uniformly partitions or parallelizes the mining network (securely) into smaller committees, each of which processes a disjoint set of transactions (or "shards"). While sharding is common in non-Byzantine settings, ELASTICO is the first candidate for a secure sharding protocol in the presence of Byzantine adversaries. Our scalability experiments on Amazon EC2 with up to 1,600 nodes confirm ELASTICO's theoretical scaling properties.
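ELASTICO's partitioning step can be sketched in a few lines. This is a toy model, not the full protocol (identifiers, the nonce, and the committee count are illustrative): each node's proof-of-work-style hash deterministically assigns it to one of 2**s committees, so the network splits near-uniformly and each committee processes one shard.

```python
import hashlib

def committee_of(node_id, nonce, s=2):
    """Committee index taken from the last s bits of a PoW-style identity hash."""
    h = hashlib.sha256(f"{node_id}:{nonce}".encode()).digest()
    return h[-1] & ((1 << s) - 1)   # index in [0, 2**s)

# Partition 1,600 toy nodes (the scale of the paper's EC2 experiments)
# into 2**2 = 4 committees.
committees = {}
for node in range(1600):
    committees.setdefault(committee_of(node, nonce=0), []).append(node)
```

Because the hash output is effectively uniform, the committees come out roughly equal in size without any coordination, which is what lets throughput grow with the number of committees.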

Journal ArticleDOI
TL;DR: The in vitro organoid model facilitates an accurate study of a range of in vivo biological processes including tissue renewal, stem cell/niche functions and tissue responses to drugs, mutation or damage.
Abstract: The in vitro organoid model is a major technological breakthrough that has already been established as an essential tool in many basic biology and clinical applications. This near-physiological 3D model facilitates an accurate study of a range of in vivo biological processes including tissue renewal, stem cell/niche functions and tissue responses to drugs, mutation or damage. In this Review, we discuss the current achievements, challenges and potential applications of this technique.

Proceedings ArticleDOI
07 Jul 2016
TL;DR: A new learning algorithm based on the element-wise Alternating Least Squares (eALS) technique is designed, for efficiently optimizing a Matrix Factorization (MF) model with variably-weighted missing data and exploiting this efficiency to then seamlessly devise an incremental update strategy that instantly refreshes a MF model given new feedback.
Abstract: This paper contributes improvements to both the effectiveness and efficiency of Matrix Factorization (MF) methods for implicit feedback. We highlight two critical issues in existing works. First, due to the large space of unobserved feedback, most existing works resort to assigning a uniform weight to the missing data to reduce computational complexity. However, such a uniform assumption is invalid in real-world settings. Second, most methods are also designed in an offline setting and fail to keep up with the dynamic nature of online data. We address these two issues in learning MF models from implicit feedback. We first propose to weight the missing data based on item popularity, which is more effective and flexible than the uniform-weight assumption. However, such non-uniform weighting poses an efficiency challenge in learning the model. To address this, we design a new learning algorithm based on the element-wise Alternating Least Squares (eALS) technique for efficiently optimizing an MF model with variably-weighted missing data. We exploit this efficiency to seamlessly devise an incremental update strategy that instantly refreshes an MF model given new feedback. Through comprehensive experiments on two public datasets in both offline and online protocols, we show that our implemented, open-source (https://github.com/hexiangnan/sigir16-eals) eALS consistently outperforms state-of-the-art implicit MF methods.
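The popularity-based weighting itself is a one-liner. A minimal sketch (function name, the weight budget W0, and the exponent alpha are our illustrative choices, not values fixed by the abstract): item i's unobserved entries get a weight proportional to its interaction count raised to alpha, normalized so the weights sum to W0.

```python
import numpy as np

def missing_data_weights(interaction_counts, W0=512.0, alpha=0.5):
    """Non-uniform weights for missing entries: w_i = W0 * f_i**alpha / sum_j f_j**alpha."""
    f = np.asarray(interaction_counts, dtype=float)
    p = f ** alpha
    return W0 * p / p.sum()

# Popular items receive larger weights on their missing entries, reflecting
# that a user skipping a popular item is stronger evidence of dislike.
w = missing_data_weights([1000, 100, 10, 1])
```

With alpha = 0 this degenerates to the uniform-weight baseline the paper argues against; alpha > 0 interpolates toward fully popularity-proportional weighting.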

Journal ArticleDOI
TL;DR: This review article studies the applications of water-stable MOFs in five major areas: adsorption, membrane separation, sensing, catalysis, and proton conduction.
Abstract: The recent advancement of water-stable metal–organic frameworks (MOFs) expands the applications of this unique porous material. This review article studies their applications in five major areas: adsorption, membrane separation, sensing, catalysis, and proton conduction. These applications are either conducted in a water-containing environment or directly target water treatment processes. The representative and significant studies in each area are comprehensively reviewed and discussed, to serve as a reference for researchers working in related areas. The article closes with a summary and future outlook on the applications of water-stable MOFs.

Journal ArticleDOI
11 Jul 2016-Nature
TL;DR: In this paper, the authors performed whole-genome sequencing in 2,657 European individuals with and without diabetes, and exome sequencing for 12,940 individuals from five ancestry groups.
Abstract: The genetic architecture of common traits, including the number, frequency, and effect sizes of inherited variants that contribute to individual risk, has been long debated. Genome-wide association studies have identified scores of common variants associated with type 2 diabetes, but in aggregate, these explain only a fraction of the heritability of this disease. Here, to test the hypothesis that lower-frequency variants explain much of the remainder, the GoT2D and T2D-GENES consortia performed whole-genome sequencing in 2,657 European individuals with and without diabetes, and exome sequencing in 12,940 individuals from five ancestry groups. To increase statistical power, we expanded the sample size via genotyping and imputation in a further 111,548 subjects. Variants associated with type 2 diabetes after sequencing were overwhelmingly common and most fell within regions previously identified by genome-wide association studies. Comprehensive enumeration of sequence variation is necessary to identify functional alleles that provide important clues to disease pathophysiology, but large-scale sequencing does not support the idea that lower-frequency variants have a major role in predisposition to type 2 diabetes.

Journal ArticleDOI
TL;DR: In this paper, the authors combined satellite-based estimates, chemical transport model simulations, and ground measurements from 79 different countries to produce global estimates of annual average fine particle (PM2.5) and ozone concentrations at 0.1° × 0. 1° spatial resolution for five-year intervals from 1990 to 2010 and the year 2013.
Abstract: Exposure to ambient air pollution is a major risk factor for global disease. Assessment of the impacts of air pollution on population health and evaluation of trends relative to other major risk factors requires regularly updated, accurate, spatially resolved exposure estimates. We combined satellite-based estimates, chemical transport model simulations, and ground measurements from 79 different countries to produce global estimates of annual average fine particle (PM2.5) and ozone concentrations at 0.1° × 0.1° spatial resolution for five-year intervals from 1990 to 2010 and the year 2013. These estimates were applied to assess population-weighted mean concentrations for 1990-2013 for each of 188 countries. In 2013, 87% of the world's population lived in areas exceeding the World Health Organization Air Quality Guideline of 10 μg/m(3) PM2.5 (annual average). Between 1990 and 2013, global population-weighted PM2.5 increased by 20.4% driven by trends in South Asia, Southeast Asia, and China. Decreases in population-weighted mean concentrations of PM2.5 were evident in most high income countries. Population-weighted mean concentrations of ozone increased globally by 8.9% from 1990-2013 with increases in most countries-except for modest decreases in North America, parts of Europe, and several countries in Southeast Asia.
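The population-weighted mean concentrations reported above are plain weighted averages over grid cells. A sketch with made-up numbers (not the study's data):

```python
def population_weighted_mean(concentrations, populations):
    """Weighted average of concentrations, weighting each cell by its population."""
    total = sum(populations)
    return sum(c * p for c, p in zip(concentrations, populations)) / total

# Three hypothetical 0.1-degree grid cells: the populous, polluted cell dominates
# the country-level figure, which is the point of population weighting.
pwm = population_weighted_mean(
    concentrations=[35.0, 8.0, 12.0],          # annual-average PM2.5, ug/m^3
    populations=[9_000_000, 500_000, 1_500_000],
)
```

In this toy case the result is about 30.6 ug/m^3, well above the WHO guideline of 10 ug/m^3 even though two of the three cells are below it.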

Journal ArticleDOI
TL;DR: In this article, the authors review progress in using such biopolymers as reinforcement fillers, antioxidants, UV absorbents, antimicrobial agents, carbon precursors, and biomaterials for tissue engineering and gene therapy.

Book ChapterDOI
22 Feb 2016
TL;DR: In this article, the authors analyze how fundamental and circumstantial bottlenecks in Bitcoin limit the ability of its current peer-to-peer overlay network to support substantially higher throughputs and lower latencies.
Abstract: The increasing popularity of blockchain-based cryptocurrencies has made scalability a primary and urgent concern. We analyze how fundamental and circumstantial bottlenecks in Bitcoin limit the ability of its current peer-to-peer overlay network to support substantially higher throughputs and lower latencies. Our results suggest that reparameterization of block size and intervals should be viewed only as a first increment toward achieving next-generation, high-load blockchain protocols, and major advances will additionally require a basic rethinking of technical approaches. We offer a structured perspective on the design space for such approaches. Within this perspective, we enumerate and briefly discuss a number of recently proposed protocol ideas and offer several new ideas and open challenges.
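The throughput ceiling the reparameterization argument targets follows from simple arithmetic: block size divided by average transaction size, divided by the block interval. The numbers below use the commonly cited 2016-era Bitcoin parameters; the average transaction size is an assumption for illustration.

```python
# Back-of-the-envelope Bitcoin throughput under 2016-era parameters.
BLOCK_SIZE_BYTES = 1_000_000   # 1 MB block-size limit
BLOCK_INTERVAL_S = 600         # 10-minute target block interval
AVG_TX_BYTES = 250             # assumed average transaction size

tx_per_second = BLOCK_SIZE_BYTES / AVG_TX_BYTES / BLOCK_INTERVAL_S  # ~6.7 tx/s
```

This lands in the oft-quoted 3-7 tx/s range; raising the block size or shrinking the interval moves the ceiling linearly, which is why the authors call reparameterization only a first increment.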

Journal ArticleDOI
11 Nov 2016-Science
TL;DR: The full range and scale of climate change effects on global biodiversity that have been observed in natural systems are described, and a set of core ecological processes that underpin ecosystem functioning and support services to people are identified.
Abstract: Most ecological processes now show responses to anthropogenic climate change. In terrestrial, freshwater, and marine ecosystems, species are changing genetically, physiologically, morphologically, and phenologically and are shifting their distributions, which affects food webs and results in new interactions. Disruptions scale from the gene to the ecosystem and have documented consequences for people, including unpredictable fisheries and crop yields, loss of genetic diversity in wild crop varieties, and increasing impacts of pests and diseases. In addition to the more easily observed changes, such as shifts in flowering phenology, we argue that many hidden dynamics, such as genetic changes, are also taking place. Understanding shifts in ecological processes can guide human adaptation strategies. In addition to reducing greenhouse gases, climate action and policy must therefore focus equally on strategies that safeguard biodiversity and ecosystems.

Journal ArticleDOI
TL;DR: A systematic and critical review of the production of activated carbon from hydrochars is presented in this paper, where the current knowledge gaps and challenges involved in the hydrothermal carbonization of biomass waste are critically evaluated with suggestions for further research.

Journal ArticleDOI
25 Mar 2016-Science
TL;DR: To contribute data about replicability in economics, 18 studies published in the American Economic Review and the Quarterly Journal of Economics between 2011 and 2014 were replicated; 11 of the 18 (61%) yielded a significant effect in the same direction as the original study.
Abstract: The replicability of some scientific findings has recently been called into question. To contribute data about replicability in economics, we replicated 18 studies published in the American Economic Review and the Quarterly Journal of Economics between 2011 and 2014. All of these replications followed predefined analysis plans that were made publicly available beforehand, and they all have a statistical power of at least 90% to detect the original effect size at the 5% significance level. We found a significant effect in the same direction as in the original study for 11 replications (61%); on average, the replicated effect size is 66% of the original. The replicability rate varies between 67% and 78% for four additional replicability indicators, including a prediction market measure of peer beliefs.
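The headline figures in this abstract are simple ratios; a short sketch (using only the numbers quoted in the abstract) makes the arithmetic explicit:

```python
# Replication figures as reported in the abstract above (Science, 2016).
n_studies = 18
n_replicated = 11  # significant effect in the same direction as the original

replication_rate = n_replicated / n_studies
print(f"Replication rate: {replication_rate:.0%}")  # ~61%, matching the abstract
```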

Journal ArticleDOI
01 Dec 2016-Nature
TL;DR: In this article, the authors present a comprehensive analysis of warming-induced changes in soil carbon stocks by assembling data from 49 field experiments located across North America, Europe and Asia, and provide estimates of soil carbon sensitivity to warming that may help to constrain Earth system model projections.
Abstract: The majority of the Earth's terrestrial carbon is stored in the soil. If anthropogenic warming stimulates the loss of this carbon to the atmosphere, it could drive further planetary warming. Despite evidence that warming enhances carbon fluxes to and from the soil, the net global balance between these responses remains uncertain. Here we present a comprehensive analysis of warming-induced changes in soil carbon stocks by assembling data from 49 field experiments located across North America, Europe and Asia. We find that the effects of warming are contingent on the size of the initial soil carbon stock, with considerable losses occurring in high-latitude areas. By extrapolating this empirical relationship to the global scale, we provide estimates of soil carbon sensitivity to warming that may help to constrain Earth system model projections. Our empirical relationship suggests that global soil carbon stocks in the upper soil horizons will fall by 30 ± 30 petagrams of carbon to 203 ± 161 petagrams of carbon under one degree of warming, depending on the rate at which the effects of warming are realized. Under the conservative assumption that the response of soil carbon to warming occurs within a year, a business-as-usual climate scenario would drive the loss of 55 ± 50 petagrams of carbon from the upper soil horizons by 2050. This value is around 12-17 per cent of the expected anthropogenic emissions over this period. Despite the considerable uncertainty in our estimates, the direction of the global soil carbon response is consistent across all scenarios. This provides strong empirical support for the idea that rising temperatures will stimulate the net loss of soil carbon to the atmosphere, driving a positive land carbon-climate feedback that could accelerate climate change.
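As a consistency check on the abstract's percentages, the implied business-as-usual anthropogenic emissions through 2050 can be back-calculated from the stated soil carbon loss; the inputs come straight from the abstract, and the calculation is purely illustrative:

```python
# Central estimate of soil carbon loss from upper horizons by 2050 (Pg C),
# under the abstract's business-as-usual scenario.
soil_c_loss = 55.0

# The abstract states this loss is 12-17% of expected anthropogenic emissions
# over the period, so the implied emissions total is loss / fraction.
emissions_low = soil_c_loss / 0.17   # ~324 Pg C
emissions_high = soil_c_loss / 0.12  # ~458 Pg C
print(f"Implied anthropogenic emissions: {emissions_low:.0f}-{emissions_high:.0f} Pg C")
```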

Journal ArticleDOI
TL;DR: Regular monitoring of vaccine attitudes – coupled with monitoring of local immunization rates – at the national and sub-national levels can identify populations with declining confidence and acceptance.

Journal ArticleDOI
TL;DR: In this article, the physical properties of phosphorene and its applications as a 2D semiconductor material are reviewed. Unlike graphene, phosphorene has an anisotropic orthorhombic structure that is ductile along one of the in-plane crystal directions but stiff along the other.
Abstract: 2D materials are the focus of an intense research effort because of their unique properties and their potential for revealing intriguing new phenomena. Phosphorene, a monolayer of black phosphorus, earned its place among the family of 2D semiconductor materials when recent results unveiled its high carrier mobility, high optical and UV absorption, and other attractive properties, which are of particular interest for optoelectronic applications. Unlike graphene, phosphorene has an anisotropic orthorhombic structure that is ductile along one of the in-plane crystal directions but stiff along the other. This results in unusual mechanical, electronic, optical and transport properties that reflect the anisotropy of the lattice. This Review summarizes the physical properties of phosphorene and highlights the recent progress made in the preparation, isolation and characterization of this material. The role of defects and doping is discussed, and phosphorene-based devices are surveyed; finally, the remaining challenges and potential applications of phosphorene are outlined. Phosphorene is a 2D material exhibiting remarkable mechanical, electronic and optical properties. In this Review, we survey fabrication techniques and discuss theoretical and experimental findings, exploring phosphorene from its fundamental properties to its implementation in devices.

Journal ArticleDOI
TL;DR: A new trimethylaluminum vapor-based crosslinking method renders the nanocrystal films insoluble; the resulting near-complete film coverage, coupled with the natural confinement of injected charges within the perovskite crystals, facilitates electron-hole capture and gives rise to a remarkable electroluminescence yield.
Abstract: The preparation of highly efficient perovskite nanocrystal light-emitting diodes is shown. A new trimethylaluminum vapor-based crosslinking method to render the nanocrystal films insoluble is applied. The resulting near-complete nanocrystal film coverage, coupled with the natural confinement of injected charges within the perovskite crystals, facilitates electron-hole capture and gives rise to a remarkable electroluminescence yield of 5.7%.

Journal ArticleDOI
TL;DR: The characteristics of lncRNAs, including their roles, functions, and working mechanisms are summarized, methods for identifying and annotating lnc RNAs are described, and future opportunities for lncRNA-based therapies using antisense oligonucleotides are discussed.

Journal ArticleDOI
Lourens Poorter1, Frans Bongers1, T. Mitchell Aide2, Angelica M. Almeyda Zambrano3, Patricia Balvanera4, Justin M. Becknell5, Vanessa K. Boukili6, Pedro H. S. Brancalion7, Eben N. Broadbent3, Robin L. Chazdon6, Dylan Craven8, Dylan Craven9, Jarcilene S. Almeida-Cortez10, George A. L. Cabral10, Ben H. J. de Jong, Julie S. Denslow11, Daisy H. Dent12, Daisy H. Dent9, Saara J. DeWalt13, Juan Manuel Dupuy, Sandra M. Durán14, Mário M. Espírito-Santo, María C. Fandiño, Ricardo Gomes César7, Jefferson S. Hall9, José Luis Hernández-Stefanoni, Catarina C. Jakovac15, Catarina C. Jakovac1, André Braga Junqueira15, André Braga Junqueira1, Deborah K. Kennard16, Susan G. Letcher17, Juan Carlos Licona, Madelon Lohbeck18, Madelon Lohbeck1, Erika Marin-Spiotta19, Miguel Martínez-Ramos4, Paulo Eduardo dos Santos Massoca15, Jorge A. Meave4, Rita C. G. Mesquita15, Francisco Mora4, Rodrigo Muñoz4, Robert Muscarella20, Robert Muscarella21, Yule Roberta Ferreira Nunes, Susana Ochoa-Gaona, Alexandre Adalardo de Oliveira7, Edith Orihuela-Belmonte, Marielos Peña-Claros1, Eduardo A. Pérez-García4, Daniel Piotto, Jennifer S. Powers22, Jorge Rodríguez-Velázquez4, I. Eunice Romero-Pérez4, Jorge Ruiz23, Jorge Ruiz24, Juan Saldarriaga, Arturo Sanchez-Azofeifa14, Naomi B. Schwartz20, Marc K. Steininger, Nathan G. Swenson25, Marisol Toledo, María Uriarte20, Michiel van Breugel26, Michiel van Breugel27, Michiel van Breugel9, Hans van der Wal28, Maria das Dores Magalhães Veloso, Hans F. M. Vester29, Alberto Vicentini15, Ima Célia Guimarães Vieira30, Tony Vizcarra Bentos15, G. Bruce Williamson31, G. Bruce Williamson15, Danaë M. A. Rozendaal1, Danaë M. A. Rozendaal32, Danaë M. A. Rozendaal6 
11 Feb 2016-Nature
TL;DR: A biomass recovery map of Latin America is presented, which illustrates geographical and climatic variation in carbon sequestration potential during forest regrowth and will support policies to minimize forest loss in areas where biomass resilience is naturally low and promote forest regeneration and restoration in humid tropical lowland areas with high biomass resilience.
Abstract: Land-use change occurs nowhere more rapidly than in the tropics, where the imbalance between deforestation and forest regrowth has large consequences for the global carbon cycle. However, considerable uncertainty remains about the rate of biomass recovery in secondary forests, and how these rates are influenced by climate, landscape, and prior land use. Here we analyse aboveground biomass recovery during secondary succession in 45 forest sites and about 1,500 forest plots covering the major environmental gradients in the Neotropics. The studied secondary forests are highly productive and resilient. Aboveground biomass recovery after 20 years was on average 122 megagrams per hectare (Mg ha(-1)), corresponding to a net carbon uptake of 3.05 Mg C ha(-1) yr(-1), 11 times the uptake rate of old-growth forests. Aboveground biomass stocks took a median time of 66 years to recover to 90% of old-growth values. Aboveground biomass recovery after 20 years varied 11.3-fold (from 20 to 225 Mg ha(-1)) across sites, and this recovery increased with water availability (higher local rainfall and lower climatic water deficit). We present a biomass recovery map of Latin America, which illustrates geographical and climatic variation in carbon sequestration potential during forest regrowth. The map will support policies to minimize forest loss in areas where biomass resilience is naturally low (such as seasonally dry forest regions) and promote forest regeneration and restoration in humid tropical lowland areas with high biomass resilience.
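The abstract's net carbon uptake figure follows directly from its biomass number under the standard assumption that roughly half of dry biomass is carbon; note that the 0.5 carbon fraction below is an assumption for illustration, not a value stated in the abstract:

```python
biomass_20yr = 122.0   # average aboveground biomass recovery after 20 years (Mg ha^-1)
carbon_fraction = 0.5  # assumed carbon content of dry biomass (typical convention)
years = 20

uptake = biomass_20yr * carbon_fraction / years
print(f"Net carbon uptake: {uptake:.2f} Mg C ha^-1 yr^-1")  # 3.05, as in the abstract
```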