
Showing papers by "ETH Zurich" published in 2016


Journal ArticleDOI
Daniel J. Klionsky1, Kotb Abdelmohsen2, Akihisa Abe3, Joynal Abedin4 +2519 more · Institutions (695)
TL;DR: In this paper, the authors present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macro-autophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to target more than one autophagy-related protein by gene knockout or RNA interference. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways, implying that not all Atg proteins can be used as specific markers for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular assays, we hope to encourage technical innovation in the field.
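As a concrete illustration of the flux-versus-abundance distinction stressed above, a common readout discussed in these guidelines is LC3-II turnover: LC3-II is quantified with and without a lysosomal inhibitor (e.g., bafilomycin A1), and the inhibitor-induced accumulation approximates flux. A minimal sketch with hypothetical densitometry values:

```python
# Illustrative sketch (not reproduced from the guidelines verbatim): estimating
# autophagic flux from an LC3-II turnover assay. LC3-II is measured by western
# blot densitometry with and without a lysosomal inhibitor (e.g. bafilomycin A1).
# All variable names and numbers below are hypothetical.

def lc3_flux(lc3ii_plus_inhibitor: float, lc3ii_minus_inhibitor: float) -> float:
    """Net autophagic flux ~ LC3-II accumulated when degradation is blocked."""
    return lc3ii_plus_inhibitor - lc3ii_minus_inhibitor

# Hypothetical values, normalized to a loading control:
basal = lc3_flux(lc3ii_plus_inhibitor=2.4, lc3ii_minus_inhibitor=1.0)
treated = lc3_flux(lc3ii_plus_inhibitor=2.5, lc3ii_minus_inhibitor=2.3)

# A treatment that raises LC3-II without raising flux (treated < basal here)
# indicates a block in degradation rather than increased autophagy.
print(f"basal flux: {basal:.1f}, treated flux: {treated:.1f}")
```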

5,187 citations


Book ChapterDOI
08 Oct 2016
TL;DR: Temporal Segment Networks (TSN), as discussed by the authors, combine a sparse temporal sampling strategy and video-level supervision to enable efficient and effective learning using the whole action video, and obtain state-of-the-art performance on the HMDB51 and UCF101 datasets.
Abstract: Deep convolutional networks have achieved great success for visual recognition in still images. However, for action recognition in videos, the advantage over traditional methods is not so evident. This paper aims to discover the principles to design effective ConvNet architectures for action recognition in videos and to learn these models given limited training samples. Our first contribution is temporal segment network (TSN), a novel framework for video-based action recognition, which is based on the idea of long-range temporal structure modeling. It combines a sparse temporal sampling strategy and video-level supervision to enable efficient and effective learning using the whole action video. The other contribution is our study on a series of good practices in learning ConvNets on video data with the help of temporal segment network. Our approach obtains the state-of-the-art performance on the datasets of HMDB51 (69.4%) and UCF101 (94.2%). We also visualize the learned ConvNet models, which qualitatively demonstrates the effectiveness of temporal segment network and the proposed good practices (models and code at https://github.com/yjxiong/temporal-segment-networks).
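For intuition, here is a minimal sketch of the sparse temporal sampling and segmental consensus idea, assuming a generic `snippet_model` callable that maps one snippet to class scores (an assumption for illustration; the actual TSN uses two-stream ConvNets and applies video-level supervision to the consensus during training):

```python
# Sketch of TSN-style sparse sampling + segmental consensus, assuming
# len(video_frames) >= num_segments.
import numpy as np

def tsn_predict(video_frames, snippet_model, num_segments=3, rng=None):
    rng = rng or np.random.default_rng()
    n = len(video_frames)
    # Split the video into equal-duration segments and sample one snippet each.
    bounds = np.linspace(0, n, num_segments + 1, dtype=int)
    snippet_ids = [rng.integers(lo, hi) for lo, hi in zip(bounds[:-1], bounds[1:])]
    scores = np.stack([snippet_model(video_frames[i]) for i in snippet_ids])
    # Segmental consensus: average the snippet-level class scores.
    return scores.mean(axis=0)
```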

2,778 citations


Journal ArticleDOI
Mingxun Wang1, Jeremy Carver1, Vanessa V. Phelan2, Laura M. Sanchez2, Neha Garg2, Yao Peng1, Don D. Nguyen1, Jeramie D. Watrous2, Clifford A. Kapono1, Tal Luzzatto-Knaan2, Carla Porto2, Amina Bouslimani2, Alexey V. Melnik2, Michael J. Meehan2, Wei-Ting Liu3, Max Crüsemann4, Paul D. Boudreau4, Eduardo Esquenazi, Mario Sandoval-Calderón5, Roland D. Kersten6, Laura A. Pace2, Robert A. Quinn7, Katherine R. Duncan8, Cheng-Chih Hsu1, Dimitrios J. Floros1, Ronnie G. Gavilan, Karin Kleigrewe4, Trent R. Northen9, Rachel J. Dutton10, Delphine Parrot11, Erin E. Carlson12, Bertrand Aigle13, Charlotte Frydenlund Michelsen14, Lars Jelsbak14, Christian Sohlenkamp5, Pavel A. Pevzner1, Anna Edlund15, Anna Edlund16, Jeffrey S. McLean15, Jeffrey S. McLean17, Jörn Piel18, Brian T. Murphy19, Lena Gerwick4, Chih-Chuang Liaw20, Yu-Liang Yang21, Hans-Ulrich Humpf22, Maria Maansson14, Robert A. Keyzers23, Amy C. Sims24, Andrew R. Johnson25, Ashley M. Sidebottom25, Brian E. Sedio26, Andreas Klitgaard14, Charles B. Larson4, Charles B. Larson2, Cristopher A. Boya P., Daniel Torres-Mendoza, David Gonzalez2, Denise Brentan Silva27, Denise Brentan Silva28, Lucas Miranda Marques28, Daniel P. Demarque28, Egle Pociute, Ellis C. O’Neill4, Enora Briand4, Enora Briand11, Eric J. N. Helfrich18, Eve A. Granatosky29, Evgenia Glukhov4, Florian Ryffel18, Hailey Houson, Hosein Mohimani1, Jenan J. Kharbush4, Yi Zeng1, Julia A. Vorholt18, Kenji L. Kurita30, Pep Charusanti1, Kerry L. McPhail31, Kristian Fog Nielsen14, Lisa Vuong, Maryam Elfeki19, Matthew F. Traxler32, Niclas Engene33, Nobuhiro Koyama2, Oliver B. Vining31, Ralph S. Baric24, Ricardo Pianta Rodrigues da Silva28, Samantha J. Mascuch4, Sophie Tomasi11, Stefan Jenkins9, Venkat R. Macherla, Thomas Hoffman, Vinayak Agarwal4, Philip G. Williams34, Jingqui Dai34, Ram P. Neupane34, Joshua R. Gurr34, Andrés M. C. Rodríguez28, Anne Lamsa1, Chen Zhang1, Kathleen Dorrestein2, Brendan M. Duggan2, Jehad Almaliti2, Pierre-Marie Allard35, Prasad Phapale, Louis-Félix Nothias36, Theodore Alexandrov, Marc Litaudon36, Jean-Luc Wolfender35, Jennifer E. Kyle37, Thomas O. Metz37, Tyler Peryea38, Dac-Trung Nguyen38, Danielle VanLeer38, Paul Shinn38, Ajit Jadhav38, Rolf Müller, Katrina M. Waters37, Wenyuan Shi15, Xueting Liu39, Lixin Zhang39, Rob Knight1, Paul R. Jensen4, Bernhard O. Palsson1, Kit Pogliano1, Roger G. Linington30, Marcelino Gutiérrez, Norberto Peporine Lopes28, William H. Gerwick2, William H. Gerwick4, Bradley S. Moore2, Bradley S. Moore4, Pieter C. Dorrestein4, Pieter C. Dorrestein2, Nuno Bandeira1, Nuno Bandeira2 
TL;DR: In GNPS, crowdsourced curation of freely available community-wide reference MS libraries will underpin improved annotations, and data-driven social networking should facilitate identification of spectra and foster collaborations.
Abstract: The potential of the diverse chemistries present in natural products (NP) for biotechnology and medicine remains untapped because NP databases are not searchable with raw data and the NP community has no way to share data other than in published papers. Although mass spectrometry (MS) techniques are well-suited to high-throughput characterization of NP, there is a pressing need for an infrastructure to enable sharing and curation of data. We present Global Natural Products Social Molecular Networking (GNPS; http://gnps.ucsd.edu), an open-access knowledge base for community-wide organization and sharing of raw, processed or identified tandem mass (MS/MS) spectrometry data. In GNPS, crowdsourced curation of freely available community-wide reference MS libraries will underpin improved annotations. Data-driven social-networking should facilitate identification of spectra and foster collaborations. We also introduce the concept of 'living data' through continuous reanalysis of deposited data.
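To illustrate the kind of spectral matching that underlies molecular networking in such a knowledge base, here is a simplified plain cosine score over m/z-matched peaks; GNPS itself uses a more elaborate modified cosine that also permits precursor-mass-shifted matches, so this is a sketch of the idea rather than the deployed algorithm:

```python
# Simplified spectral similarity for molecular networking: plain cosine over
# greedily m/z-matched peaks (GNPS's "modified cosine" is more sophisticated).
import numpy as np

def cosine_score(spec_a, spec_b, mz_tol=0.02):
    """spec_a, spec_b: lists of (mz, intensity) pairs."""
    norm_a = np.sqrt(sum(i**2 for _, i in spec_a))
    norm_b = np.sqrt(sum(i**2 for _, i in spec_b))
    score, used_b = 0.0, set()
    for mz_a, int_a in spec_a:
        # Greedy match: first unused peak in spec_b within the m/z tolerance.
        for j, (mz_b, int_b) in enumerate(spec_b):
            if j not in used_b and abs(mz_a - mz_b) <= mz_tol:
                score += int_a * int_b
                used_b.add(j)
                break
    return score / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Spectrum pairs scoring above a threshold (e.g. 0.7) become edges in the network.
```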

2,365 citations


Journal ArticleDOI
30 Jun 2016-Nature
TL;DR: Substantial enhancement or over-delivery on current INDCs by additional national, sub-national and non-state actions is required to maintain a reasonable chance of meeting the target of keeping warming well below 2 degrees Celsius.
Abstract: The Paris climate agreement aims at holding global warming to well below 2 degrees Celsius and to "pursue efforts" to limit it to 1.5 degrees Celsius. To accomplish this, countries have submitted Intended Nationally Determined Contributions (INDCs) outlining their post-2020 climate action. Here we assess the effect of current INDCs on reducing aggregate greenhouse gas emissions, its implications for achieving the temperature objective of the Paris climate agreement, and potential options for overachievement. The INDCs collectively lower greenhouse gas emissions compared to where current policies stand, but still imply a median warming of 2.6-3.1 degrees Celsius by 2100. More can be achieved, because the agreement stipulates that targets for reducing greenhouse gas emissions are strengthened over time, both in ambition and scope. Substantial enhancement or over-delivery on current INDCs by additional national, sub-national and non-state actions is required to maintain a reasonable chance of meeting the target of keeping warming well below 2 degrees Celsius.

2,333 citations


Journal ArticleDOI
John Allison1, K. Amako2, John Apostolakis3, Pedro Arce4, Makoto Asai5, Tsukasa Aso6, Enrico Bagli, Alexander Bagulya7, Sw. Banerjee8, G. Barrand9, B. R. Beck10, Alexey Bogdanov11, D. Brandt, Jeremy M. C. Brown12, Helmut Burkhardt3, Ph Canal8, D. Cano-Ott4, Stephane Chauvie, Kyung-Suk Cho13, G.A.P. Cirrone14, Gene Cooperman15, M. A. Cortés-Giraldo16, G. Cosmo3, Giacomo Cuttone14, G.O. Depaola17, Laurent Desorgher, X. Dong15, Andrea Dotti5, Victor Daniel Elvira8, Gunter Folger3, Ziad Francis18, A. Galoyan19, L. Garnier9, M. Gayer3, K. Genser8, Vladimir Grichine7, Vladimir Grichine3, Susanna Guatelli20, Susanna Guatelli21, Paul Gueye22, P. Gumplinger23, Alexander Howard24, Ivana Hřivnáčová9, S. Hwang13, Sebastien Incerti25, Sebastien Incerti26, A. Ivanchenko3, Vladimir Ivanchenko3, F.W. Jones23, S. Y. Jun8, Pekka Kaitaniemi27, Nicolas A. Karakatsanis28, Nicolas A. Karakatsanis29, M. Karamitrosi30, M.H. Kelsey5, Akinori Kimura31, Tatsumi Koi5, Hisaya Kurashige32, A. Lechner3, S. B. Lee33, Francesco Longo34, M. Maire, Davide Mancusi, A. Mantero, E. Mendoza4, B. Morgan35, K. Murakami2, T. Nikitina3, Luciano Pandola14, P. Paprocki3, J Perl5, Ivan Petrović36, Maria Grazia Pia, W. Pokorski3, J. M. Quesada16, M. Raine, Maria A.M. Reis37, Alberto Ribon3, A. Ristic Fira36, Francesco Romano14, Giorgio Ivan Russo14, Giovanni Santin38, Takashi Sasaki2, D. Sawkey39, J. I. Shin33, Igor Strakovsky40, A. Taborda37, Satoshi Tanaka41, B. Tome, Toshiyuki Toshito, H.N. Tran42, Pete Truscott, L. Urbán, V. V. Uzhinsky19, Jerome Verbeke10, M. Verderi43, B. Wendt44, H. Wenzel8, D. H. Wright5, Douglas Wright10, T. Yamashita, J. Yarba8, H. Yoshida45 
TL;DR: Geant4 as discussed by the authors is a software toolkit for the simulation of the passage of particles through matter, which is used by a large number of experiments and projects in a variety of application domains, including high energy physics, astrophysics and space science, medical physics and radiation protection.
Abstract: Geant4 is a software toolkit for the simulation of the passage of particles through matter. It is used by a large number of experiments and projects in a variety of application domains, including high energy physics, astrophysics and space science, medical physics and radiation protection. Over the past several years, major changes have been made to the toolkit in order to accommodate the needs of these user communities, and to efficiently exploit the growth of computing power made available by advances in technology. The adaptation of Geant4 to multithreading, advances in physics, detector modeling and visualization, extensions to the toolkit, including biasing and reverse Monte Carlo, and tools for physics and release validation are discussed here.

2,260 citations


Journal ArticleDOI
TL;DR: Simultaneous localization and mapping (SLAM) as mentioned in this paper consists in the concurrent construction of a model of the environment (the map), and the estimation of the state of the robot moving within it.
Abstract: Simultaneous localization and mapping (SLAM) consists in the concurrent construction of a model of the environment (the map), and the estimation of the state of the robot moving within it. The SLAM community has made astonishing progress over the last 30 years, enabling large-scale real-world applications and witnessing a steady transition of this technology to industry. We survey the current state of SLAM and consider future directions. We start by presenting what is now the de-facto standard formulation for SLAM. We then review related work, covering a broad set of topics including robustness and scalability in long-term mapping, metric and semantic representations for mapping, theoretical performance guarantees, active SLAM and exploration, and other new frontiers. This paper simultaneously serves as a position paper and tutorial to those who are users of SLAM. By looking at the published research with a critical eye, we delineate open challenges and new research issues that still deserve careful scientific investigation. The paper also contains the authors' take on two questions that often animate discussions during robotics conferences: Do robots need SLAM? and Is SLAM solved?
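The de-facto standard formulation referred to above is maximum a posteriori (MAP) estimation over a factor graph. In generic notation (a paraphrase assuming Gaussian measurement noise; X collects poses and landmarks, the z_k are measurements with models h_k and covariances Σ_k):

```latex
% MAP / factor-graph formulation of SLAM (generic paraphrase):
\mathcal{X}^{\star}
  = \arg\max_{\mathcal{X}} \; p(\mathcal{X} \mid \mathcal{Z})
  = \arg\min_{\mathcal{X}} \sum_{k} \big\lVert h_k(\mathcal{X}_k) - z_k \big\rVert^{2}_{\Sigma_k}
```

Each factor touches only a small variable subset X_k, which is what makes the resulting nonlinear least-squares problem sparse and efficiently solvable.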

2,039 citations


Journal ArticleDOI
21 Apr 2016-Cell
TL;DR: It is concluded that transcript levels by themselves are not sufficient to predict protein levels in many scenarios and to thus explain genotype-phenotype relationships and that high-quality data quantifying different levels of gene expression are indispensable for the complete understanding of biological processes.

1,996 citations


Journal ArticleDOI
TL;DR: The Scenario Model Intercomparison Project (ScenarioMIP) as discussed by the authors is the primary activity within Phase 6 of the Coupled Model Intercomparison Project (CMIP6) that will provide multi-model climate projections based on alternative scenarios of future emissions and land use changes produced with integrated assessment models.
Abstract: Projections of future climate change play a fundamental role in improving understanding of the climate system as well as characterizing societal risks and response options. The Scenario Model Intercomparison Project (ScenarioMIP) is the primary activity within Phase 6 of the Coupled Model Intercomparison Project (CMIP6) that will provide multi-model climate projections based on alternative scenarios of future emissions and land use changes produced with integrated assessment models. In this paper, we describe ScenarioMIP's objectives, experimental design, and its relation to other activities within CMIP6. The ScenarioMIP design is one component of a larger scenario process that aims to facilitate a wide range of integrated studies across the climate science, integrated assessment modeling, and impacts, adaptation, and vulnerability communities, and will form an important part of the evidence base in the forthcoming Intergovernmental Panel on Climate Change (IPCC) assessments. At the same time, it will provide the basis for investigating a number of targeted science and policy questions that are especially relevant to scenario-based analysis, including the role of specific forcings such as land use and aerosols, the effect of a peak and decline in forcing, the consequences of scenarios that limit warming to below 2 °C, the relative contributions to uncertainty from scenarios, climate models, and internal variability, and long-term climate system outcomes beyond the 21st century. To serve this wide range of scientific communities and address these questions, a design has been identified consisting of eight alternative 21st century scenarios plus one large initial condition ensemble and a set of long-term extensions, divided into two tiers defined by relative priority. Some of these scenarios will also provide a basis for variants planned to be run in other CMIP6-Endorsed MIPs to investigate questions related to specific forcings. Harmonized, spatially explicit emissions and land use scenarios generated with integrated assessment models will be provided to participating climate modeling groups by late 2016, with the climate model simulations run within the 2017–2018 time frame, and output from the climate model projections made available and analyses performed over the 2018–2020 period.

1,758 citations


Proceedings ArticleDOI
27 Jun 2016
TL;DR: This work presents a new benchmark dataset and evaluation methodology for the area of video object segmentation, named DAVIS (Densely Annotated VIdeo Segmentation), and provides a comprehensive analysis of several state-of-the-art segmentation approaches using three complementary metrics.
Abstract: Over the years, datasets and benchmarks have proven their fundamental importance in computer vision research, enabling targeted progress and objective comparisons in many fields. At the same time, legacy datasets may impede the evolution of a field due to saturated algorithm performance and the lack of contemporary, high quality data. In this work we present a new benchmark dataset and evaluation methodology for the area of video object segmentation. The dataset, named DAVIS (Densely Annotated VIdeo Segmentation), consists of fifty high quality, Full HD video sequences, spanning multiple occurrences of common video object segmentation challenges such as occlusions, motion blur and appearance changes. Each video is accompanied by densely annotated, pixel-accurate and per-frame ground truth segmentation. In addition, we provide a comprehensive analysis of several state-of-the-art segmentation approaches using three complementary metrics that measure the spatial extent of the segmentation, the accuracy of the silhouette contours and the temporal coherence. The results uncover strengths and weaknesses of current approaches, opening up promising directions for future works.
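Of the three complementary metrics, the region-similarity measure is the Jaccard index (intersection over union) between predicted and ground-truth masks; a minimal sketch follows (the contour-accuracy and temporal-coherence measures are not reproduced here):

```python
# Sketch of DAVIS-style region similarity J (Jaccard index / IoU) per frame.
import numpy as np

def jaccard(mask_pred: np.ndarray, mask_gt: np.ndarray) -> float:
    """IoU between boolean segmentation masks of one frame."""
    inter = np.logical_and(mask_pred, mask_gt).sum()
    union = np.logical_or(mask_pred, mask_gt).sum()
    return inter / union if union else 1.0  # both masks empty -> perfect

def j_mean(preds, gts):
    """Mean region similarity over all annotated frames of a sequence."""
    return float(np.mean([jaccard(p, g) for p, g in zip(preds, gts)]))
```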

1,656 citations


Journal ArticleDOI
15 Sep 2016-Nature
TL;DR: Powerful mass-spectrometry-based technologies now provide unprecedented insights into the composition, structure, function and control of the proteome, shedding light on complex biological processes and phenotypes.
Abstract: Numerous biological processes are concurrently and coordinately active in every living cell. Each of them encompasses synthetic, catalytic and regulatory functions that are, almost always, carried out by proteins organized further into higher-order structures and networks. For decades, the structures and functions of selected proteins have been studied using biochemical and biophysical methods. However, the properties and behaviour of the proteome as an integrated system have largely remained elusive. Powerful mass-spectrometry-based technologies now provide unprecedented insights into the composition, structure, function and control of the proteome, shedding light on complex biological processes and phenotypes.

1,458 citations


Book ChapterDOI
08 Oct 2016
TL;DR: The core contributions are the joint estimation of depth and normal information, pixelwise view selection using photometric and geometric priors, and a multi-view geometric consistency term for the simultaneous refinement and image-based depth and normal fusion.
Abstract: This work presents a Multi-View Stereo system for robust and efficient dense modeling from unstructured image collections. Our core contributions are the joint estimation of depth and normal information, pixelwise view selection using photometric and geometric priors, and a multi-view geometric consistency term for the simultaneous refinement and image-based depth and normal fusion. Experiments on benchmarks and large-scale Internet photo collections demonstrate state-of-the-art performance in terms of accuracy, completeness, and efficiency.
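As a sketch of the photometric scoring that such pixelwise view selection builds on, normalized cross-correlation (NCC) between a reference patch and its warp into a source view is the standard measure in PatchMatch-style MVS; the warping step is omitted here, and the function below only scores already-extracted patches:

```python
# Sketch of an NCC photometric consistency score for a depth/normal hypothesis.
import numpy as np

def ncc(patch_ref: np.ndarray, patch_src: np.ndarray, eps: float = 1e-8) -> float:
    """Normalized cross-correlation between two equally-sized image patches."""
    a = patch_ref - patch_ref.mean()
    b = patch_src - patch_src.mean()
    return float((a * b).sum() / (np.sqrt((a**2).sum() * (b**2).sum()) + eps))

# Scores near 1 support the hypothesized depth/normal; the system described above
# additionally weights source views per pixel using photometric and geometric priors.
```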

Journal ArticleDOI
TL;DR: Eleven datasets are provided, ranging from slow flights under good visual conditions to dynamic flights with motion blur and poor illumination, enabling researchers to thoroughly test and evaluate their algorithms.
Abstract: This paper presents visual-inertial datasets collected on-board a micro aerial vehicle. The datasets contain synchronized stereo images, IMU measurements and accurate ground truth. The first batch of datasets facilitates the design and evaluation of visual-inertial localization algorithms on real flight data. It was collected in an industrial environment and contains millimeter accurate position ground truth from a laser tracking system. The second batch of datasets is aimed at precise 3D environment reconstruction and was recorded in a room equipped with a motion capture system. The datasets contain 6D pose ground truth and a detailed 3D scan of the environment. Eleven datasets are provided in total, ranging from slow flights under good visual conditions to dynamic flights with motion blur and poor illumination, enabling researchers to thoroughly test and evaluate their algorithms. All datasets contain raw sensor measurements, spatio-temporally aligned sensor data and ground truth, extrinsic and intrinsic calibrations and datasets for custom calibrations.
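A minimal loader sketch, assuming the ASL folder layout commonly used for such visual-inertial datasets (mav0/imu0/data.csv with one header row and nanosecond timestamps); the exact paths and column order are assumptions here and should be checked against each download:

```python
# Sketch: stream IMU samples from a dataset in the assumed ASL CSV layout.
import csv
from pathlib import Path

def load_imu(dataset_root: str):
    """Yield (t_ns, gyro_xyz, accel_xyz) tuples from the IMU CSV."""
    with open(Path(dataset_root) / "mav0" / "imu0" / "data.csv") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            t_ns = int(row[0])
            gyro = tuple(map(float, row[1:4]))   # assumed: w_x, w_y, w_z [rad/s]
            accel = tuple(map(float, row[4:7]))  # assumed: a_x, a_y, a_z [m/s^2]
            yield t_ns, gyro, accel
```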

Journal ArticleDOI
21 Jul 2016-Nature
TL;DR: It is shown how the human gut microbiome impacts the serum metabolome and associates with insulin resistance in 277 non-diabetic Danish individuals, and it is suggested that microbial targets may have the potential to diminish insulin resistance and reduce the incidence of common metabolic and cardiovascular disorders.
Abstract: Insulin resistance is a forerunner state of ischaemic cardiovascular disease and type 2 diabetes. Here we show how the human gut microbiome impacts the serum metabolome and associates with insulin resistance in 277 non-diabetic Danish individuals. The serum metabolome of insulin-resistant individuals is characterized by increased levels of branched-chain amino acids (BCAAs), which correlate with a gut microbiome that has an enriched biosynthetic potential for BCAAs and is deprived of genes encoding bacterial inward transporters for these amino acids. Prevotella copri and Bacteroides vulgatus are identified as the main species driving the association between biosynthesis of BCAAs and insulin resistance, and in mice we demonstrate that P. copri can induce insulin resistance, aggravate glucose intolerance and augment circulating levels of BCAAs. Our findings suggest that microbial targets may have the potential to diminish insulin resistance and reduce the incidence of common metabolic and cardiovascular disorders.

Journal ArticleDOI
26 Jan 2016-ACS Nano
TL;DR: It is found that ligand binding to the NC surface is highly dynamic, and therefore, ligands are easily lost during the isolation and purification procedures, and when a small amount of both oleic acid and oleylamine is added, the NCs can be purified, maintaining optical, colloidal, and material integrity.
Abstract: Lead halide perovskite materials have attracted significant attention in the context of photovoltaics and other optoelectronic applications, and recently, research efforts have been directed to nanostructured lead halide perovskites. Colloidal nanocrystals (NCs) of cesium lead halides (CsPbX3, X = Cl, Br, I) exhibit bright photoluminescence, with emission tunable over the entire visible spectral region. However, previous studies on CsPbX3 NCs did not address key aspects of their chemistry and photophysics such as surface chemistry and quantitative light absorption. Here, we elaborate on the synthesis of CsPbBr3 NCs and their surface chemistry. In addition, the intrinsic absorption coefficient was determined experimentally by combining elemental analysis with accurate optical absorption measurements. 1H solution nuclear magnetic resonance spectroscopy was used to characterize sample purity, elucidate the surface chemistry, and evaluate the influence of purification methods on the surface composition. We find that ligand binding to the NC surface is highly dynamic, and therefore, ligands are easily lost during the isolation and purification procedures; when a small amount of both oleic acid and oleylamine is added, the NCs can be purified while maintaining optical, colloidal, and material integrity.

Journal ArticleDOI
TL;DR: The report includes the description of a computational machinery for nonlinear optical spectroscopy through an interface to the QM/MM package Cobramm.
Abstract: In this report, we summarize and describe the recent unique updates and additions to the Molcas quantum chemistry program suite as contained in release version 8. These updates include natural and spin orbitals for studies of magnetic properties, local and linear scaling methods for the Douglas-Kroll-Hess transformation, the generalized active space concept in MCSCF methods, a combination of multiconfigurational wave functions with density functional theory in the MC-PDFT method, additional methods for computation of magnetic properties, methods for diabatization, analytical gradients of state average complete active space SCF in association with density fitting, methods for constrained fragment optimization, large-scale parallel multireference configuration interaction including analytic gradients via the interface to the Columbus package, and approximations of the CASPT2 method to be used for computations of large systems. In addition, the report includes the description of a computational machinery for nonlinear optical spectroscopy through an interface to the QM/MM package Cobramm. Further, a module to run molecular dynamics simulations is added, two surface hopping algorithms are included to enable nonadiabatic calculations, and the DQ method for diabatization is added. Finally, we report on the subject of improvements with respect to alternative file options and parallelization.

Proceedings ArticleDOI
24 Oct 2016
TL;DR: This paper introduces a novel quantitative framework to analyse the security and performance implications of various consensus and network parameters of PoW blockchains and devise optimal adversarial strategies for double-spending and selfish mining while taking into account real world constraints.
Abstract: Proof of Work (PoW) powered blockchains currently account for more than 90% of the total market capitalization of existing digital cryptocurrencies. Although the security provisions of Bitcoin have been thoroughly analysed, the security guarantees of variant (forked) PoW blockchains (which were instantiated with different parameters) have not received much attention in the literature. This opens the question whether existing security analysis of Bitcoin's PoW applies to other implementations which have been instantiated with different consensus and/or network parameters. In this paper, we introduce a novel quantitative framework to analyse the security and performance implications of various consensus and network parameters of PoW blockchains. Based on our framework, we devise optimal adversarial strategies for double-spending and selfish mining while taking into account real world constraints such as network propagation, different block sizes, block generation intervals, information propagation mechanism, and the impact of eclipse attacks. Our framework therefore allows us to capture existing PoW-based deployments as well as PoW blockchain variants that are instantiated with different parameters, and to objectively compare the tradeoffs between their performance and security provisions.
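For intuition on the double-spending side of such an analysis, the classic Nakamoto gambler's-ruin model gives a closed-form success probability. Note that the framework described above is a richer MDP that additionally models propagation delays, block sizes, intervals and eclipse attacks, so this sketch captures only the baseline case:

```python
# Sketch of the classic Nakamoto double-spend estimate (simplified model,
# not the paper's MDP framework): probability that an attacker controlling a
# fraction q of the hashrate overtakes z confirmations.
from math import exp, factorial

def double_spend_success(q: float, z: int) -> float:
    p = 1.0 - q
    if q >= p:
        return 1.0  # a majority attacker eventually always succeeds
    lam = z * q / p  # expected attacker progress while z honest blocks are mined
    return 1.0 - sum(
        (lam**k * exp(-lam) / factorial(k)) * (1.0 - (q / p) ** (z - k))
        for k in range(z + 1)
    )

print(double_spend_success(q=0.10, z=6))  # ~2e-4 with 10% of the hashrate
```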

Journal ArticleDOI
TL;DR: This work uses recently available data on infrastructure, land cover and human access into natural areas to construct a globally standardized measure of the cumulative human footprint on the terrestrial environment at 1 km2 resolution from 1993 to 2009.
Abstract: Human pressures on the environment are changing spatially and temporally, with profound implications for the planet's biodiversity and human economies. Here we use recently available data on infrastructure, land cover and human access into natural areas to construct a globally standardized measure of the cumulative human footprint on the terrestrial environment at 1 km2 resolution from 1993 to 2009. We note that while the human population has increased by 23% and the world economy has grown 153%, the human footprint has increased by just 9%. Still, 75% of the planet's land surface is experiencing measurable human pressures. Moreover, pressures are perversely intense, widespread and rapidly intensifying in places with high biodiversity. Encouragingly, we discover decreases in environmental pressures in the wealthiest countries and those with strong control of corruption. Clearly the human footprint on Earth is changing, yet there are still opportunities for conservation gains. Habitat loss and urbanization are primary components of human impact on the environment. Here, Venter et al. use global data on infrastructure, agriculture, and urbanization to show that the human footprint is growing slower than the human population, but footprints are increasing in biodiverse regions.

Journal ArticleDOI
TL;DR: This work designed cobalt-based multilayered thin films in which the cobalt layer is sandwiched between two heavy metals and so provides additive interfacial Dzyaloshinskii-Moriya interactions (DMIs), which reach a value close to 2 mJ m⁻² in the case of the Ir|Co|Pt asymmetric multilayers.
Abstract: Facing the ever-growing demand for data storage will most probably require a new paradigm. Nanoscale magnetic skyrmions are anticipated to solve this issue as they are arguably the smallest spin textures in magnetic thin films in nature. We designed cobalt-based multilayered thin films in which the cobalt layer is sandwiched between two heavy metals and so provides additive interfacial Dzyaloshinskii-Moriya interactions (DMIs), which reach a value close to 2 mJ m⁻² in the case of the Ir|Co|Pt asymmetric multilayers. Using a magnetization-sensitive scanning X-ray transmission microscopy technique, we imaged small magnetic domains at very low fields in these multilayers. The study of their behaviour in a perpendicular magnetic field allows us to conclude that they are actually magnetic skyrmions stabilized by the large DMI. This discovery of stable sub-100 nm individual skyrmions at room temperature in a technologically relevant material opens the way for device applications in the near future.
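The interaction being engineered here has the textbook antisymmetric-exchange form (sign conventions vary between authors); the DMI vectors d_ij are set by the two heavy-metal interfaces, and the asymmetric stacking makes the interfacial contributions add rather than cancel:

```latex
% Dzyaloshinskii-Moriya energy between neighbouring spins S_i, S_j
% (generic textbook form; d_ij is determined by the interfaces):
E_{\mathrm{DM}} = -\sum_{\langle i,j \rangle} \mathbf{d}_{ij} \cdot \left( \mathbf{S}_i \times \mathbf{S}_j \right)
```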

Journal ArticleDOI
TL;DR: The COSMOS2015 catalog as discussed by the authors contains precise photometric redshifts and stellar masses for more than half a million objects over the 2 deg² COSMOS field, which is highly optimized for the study of galaxy evolution and environments in the early universe.
Abstract: We present the COSMOS2015 catalog, which contains precise photometric redshifts and stellar masses for more than half a million objects over the 2 deg² COSMOS field. Including new YJHKs images from the UltraVISTA-DR2 survey, Y-band images from Subaru/Hyper-Suprime-Cam, and infrared data from the Spitzer Large Area Survey with the Hyper-Suprime-Cam Spitzer legacy program, this near-infrared-selected catalog is highly optimized for the study of galaxy evolution and environments in the early universe. To maximize catalog completeness for bluer objects and at higher redshifts, objects have been detected on a χ² sum of the YJHKs and z⁺⁺ images. The catalog contains ~6 × 10⁵ objects in the 1.5 deg² UltraVISTA-DR2 region and ~1.5 × 10⁵ objects are detected in the “ultra-deep stripes” (0.62 deg²) at Ks ≤ 24.7 (3σ, 3″, AB magnitude). Through a comparison with the zCOSMOS-bright spectroscopic redshifts, we measure a photometric redshift precision of σ(Δz/(1+zs)) = 0.007 and a catastrophic failure fraction of η = 0.5%. At 3 < z < 6, using the unique database of spectroscopic redshifts in COSMOS, we find σ(Δz/(1+zs)) = 0.021 and η = 13.2%. The deepest regions reach a 90% completeness limit of 10¹⁰ M⊙ to z = 4. Detailed comparisons of the color distributions, number counts, and clustering show excellent agreement with the literature in the same mass ranges. COSMOS2015 represents a unique, publicly available, valuable resource with which to investigate the evolution of galaxies within their environment back to the earliest stages of the history of the universe. The COSMOS2015 catalog is distributed via anonymous ftp and through the usual astronomical archive systems (CDS, ESO Phase 3, IRSA).
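The quoted precision σ(Δz/(1+zs)) is conventionally computed with the normalized median absolute deviation (NMAD), and the catastrophic failure fraction η as the fraction of objects with |Δz|/(1+zs) above a cut (0.15 is the common convention; that these are the exact estimators used here is an assumption):

```python
# Sketch of standard photo-z quality statistics (NMAD scatter + outlier fraction).
import numpy as np

def photoz_stats(z_phot: np.ndarray, z_spec: np.ndarray, outlier_cut: float = 0.15):
    dz = (z_phot - z_spec) / (1.0 + z_spec)
    # NMAD: robust estimate of the scatter, insensitive to catastrophic outliers.
    sigma_nmad = 1.4826 * np.median(np.abs(dz - np.median(dz)))
    eta = float(np.mean(np.abs(dz) > outlier_cut))  # catastrophic failure fraction
    return sigma_nmad, eta
```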

Journal ArticleDOI
TL;DR: In this article, the authors quantify potential global impacts of different negative emissions technologies on various factors (such as land, greenhouse gas emissions, water, albedo, nutrients and energy) to determine the biophysical limits to, and economic costs of, their widespread application.
Abstract: To have a >50% chance of limiting warming below 2 °C, most recent scenarios from integrated assessment models (IAMs) require large-scale deployment of negative emissions technologies (NETs). These are technologies that result in the net removal of greenhouse gases from the atmosphere. We quantify potential global impacts of the different NETs on various factors (such as land, greenhouse gas emissions, water, albedo, nutrients and energy) to determine the biophysical limits to, and economic costs of, their widespread application. Resource implications vary between technologies and need to be satisfactorily addressed if NETs are to have a significant role in achieving climate goals.

Journal ArticleDOI
TL;DR: A variational representation of quantum states based on artificial neural networks with a variable number of hidden neurons and a reinforcement-learning scheme that is capable of both finding the ground state and describing the unitary time evolution of complex interacting quantum systems.
Abstract: The challenge posed by the many-body problem in quantum physics originates from the difficulty of describing the non-trivial correlations encoded in the exponential complexity of the many-body wave function. Here we demonstrate that systematic machine learning of the wave function can reduce this complexity to a tractable computational form, for some notable cases of physical interest. We introduce a variational representation of quantum states based on artificial neural networks with a variable number of hidden neurons. A reinforcement-learning scheme is then demonstrated, capable of either finding the ground state or describing the unitary time evolution of complex interacting quantum systems. We show that this approach achieves very high accuracy in the description of equilibrium and dynamical properties of prototypical interacting spin models in both one and two dimensions, thus offering a new powerful tool to solve the quantum many-body problem.
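The variational representation in question is a restricted Boltzmann machine (RBM): tracing out the binary hidden units leaves a closed-form amplitude for each spin configuration. A minimal sketch of that amplitude:

```python
# Sketch of the RBM ansatz for a quantum state: the (log-)amplitude of a spin
# configuration sigma (entries ±1) after tracing out M binary hidden units.
# a (N,), b (M,) and W (N, M) are complex variational parameters.
import numpy as np

def log_psi(sigma: np.ndarray, a: np.ndarray, b: np.ndarray, W: np.ndarray) -> complex:
    """log psi(sigma) = sum_i a_i s_i + sum_j log(2 cosh(b_j + sum_i W_ij s_i))."""
    theta = b + W.T @ sigma  # effective angles of the hidden units
    return complex(a @ sigma + np.sum(np.log(2.0 * np.cosh(theta))))

# The variational energy <psi|H|psi>/<psi|psi> is then estimated by Monte Carlo
# sampling of |psi(sigma)|^2 and optimized over (a, b, W).
```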


Journal ArticleDOI
TL;DR: This review covers technical aspects of tES, as well as applications like exploration of brain physiology, modelling approaches, tES in cognitive neurosciences, and interventional approaches to help the reader to appropriately design and conduct studies involving these brain stimulation techniques.

Journal ArticleDOI
20 Oct 2016-Cell
TL;DR: Elevating L-arginine levels induced global metabolic changes including a shift from glycolysis to oxidative phosphorylation in activated T cells and promoted the generation of central memory-like cells endowed with higher survival capacity and, in a mouse model, anti-tumor activity.

Journal ArticleDOI
TL;DR: In this article, a review of multiferroic thin-film heterostructures, device architectures, and domain and interface effects is presented. But the focus of the field is now shifting into neighbouring research areas, as discussed in this review.
Abstract: Materials with a coexistence of magnetic and ferroelectric order — multiferroics — provide an efficient route for the control of magnetism by electric fields. The study of multiferroics dates back to the 1950s, but in recent years, key discoveries in theory, synthesis and characterization techniques have led to a new surge of interest in these materials. Different mechanisms, such as lone-pair, geometric, charge-ordering and spin-driven effects, can support multiferroicity. The general focus of the field is now shifting into neighbouring research areas, as we discuss in this Review. Multiferroic thin-film heterostructures, device architectures, and domain and interface effects are explored. The violation of spatial and inversion symmetry in multiferroic materials is a key feature because it determines their properties. Other aspects, such as the non-equilibrium dynamics of multiferroics, are underrated and should be included in the topics that will define the future of the field. Multiferroic materials exhibit magnetic and ferroelectric order at the same time and provide a way to control magnetism with electric fields. We discuss the mechanisms supporting multiferroicity, multiferroic thin films and heterostructures, the non-equilibrium dynamics of multiferroics, fundamental symmetry issues and the impact of multiferroics on other research areas.

Journal ArticleDOI
TL;DR: In this paper, the state-of-the-art algorithms for vital node identification in real networks are reviewed and compared, and extensive empirical analyses are provided to compare well-known methods on disparate real networks.

Journal ArticleDOI
01 Nov 2016-Energy
TL;DR: In this paper, the authors demonstrate how the MERRA and MERRA-2 global meteorological reanalyses as well as the Meteosat-based CM-SAF SARAH satellite dataset can be used to produce hourly PV simulations across Europe.
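To illustrate the PV conversion step such simulations apply to reanalysis or satellite weather data, here is a generic irradiance-plus-temperature model with textbook coefficients; these are illustrative values, not the paper's fitted parameters:

```python
# Sketch of a simple PV output model driven by hourly weather data
# (generic NOCT cell-temperature model and temperature coefficient).
def pv_power(ghi_wm2: float, temp_air_c: float, capacity_kw: float,
             gamma: float = -0.005, noct: float = 48.0) -> float:
    """Hourly PV output (kW) from in-plane irradiance and air temperature."""
    t_cell = temp_air_c + ghi_wm2 * (noct - 20.0) / 800.0  # NOCT cell-temp estimate
    derate = 1.0 + gamma * (t_cell - 25.0)                 # efficiency vs. 25 °C STC
    return capacity_kw * (ghi_wm2 / 1000.0) * max(derate, 0.0)

print(pv_power(ghi_wm2=800, temp_air_c=25, capacity_kw=1.0))  # ~0.69 kW
```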

Journal ArticleDOI
17 Jun 2016-Science
TL;DR: The amount of the oxidized form of cellular nicotinamide adenine dinucleotide (NAD+), through its effect on mitochondrial activity, is demonstrated to act as a pivotal switch modulating muscle SC (MuSC) senescence, and it is further demonstrated that NR delays senescence of neural SCs and melanocyte SCs and increases mouse life span.
Abstract: Adult stem cells (SCs) are essential for tissue maintenance and regeneration yet are susceptible to senescence during aging. We demonstrate the importance of the amount of the oxidized form of cellular nicotinamide adenine dinucleotide (NAD(+)) and its effect on mitochondrial activity as a pivotal switch to modulate muscle SC (MuSC) senescence. Treatment with the NAD(+) precursor nicotinamide riboside (NR) induced the mitochondrial unfolded protein response and synthesis of prohibitin proteins, and this rejuvenated MuSCs in aged mice. NR also prevented MuSC senescence in the mdx (C57BL/10ScSn-Dmd(mdx)/J) mouse model of muscular dystrophy. We furthermore demonstrate that NR delays senescence of neural SCs and melanocyte SCs and increases mouse life span. Strategies that conserve cellular NAD(+) may reprogram dysfunctional SCs and improve life span in mammals.

Book ChapterDOI
22 Feb 2016
TL;DR: In this article, the authors analyze how fundamental and circumstantial bottlenecks in Bitcoin limit the ability of its current peer-to-peer overlay network to support substantially higher throughputs and lower latencies.
Abstract: The increasing popularity of blockchain-based cryptocurrencies has made scalability a primary and urgent concern. We analyze how fundamental and circumstantial bottlenecks in Bitcoin limit the ability of its current peer-to-peer overlay network to support substantially higher throughputs and lower latencies. Our results suggest that reparameterization of block size and intervals should be viewed only as a first increment toward achieving next-generation, high-load blockchain protocols, and major advances will additionally require a basic rethinking of technical approaches. We offer a structured perspective on the design space for such approaches. Within this perspective, we enumerate and briefly discuss a number of recently proposed protocol ideas and offer several new ideas and open challenges.
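The reparameterization point can be made with back-of-the-envelope throughput arithmetic (the average transaction size is a rough assumption):

```python
# Sketch: maximum transaction throughput of a PoW chain from block size and
# interval alone; numbers are illustrative (~250 bytes per average transaction).
def max_tps(block_size_bytes: int, block_interval_s: float,
            avg_tx_bytes: int = 250) -> float:
    return block_size_bytes / avg_tx_bytes / block_interval_s

print(max_tps(1_000_000, 600))  # Bitcoin-like parameters: ~6.7 tx/s
print(max_tps(8_000_000, 150))  # aggressive reparameterization: ~213 tx/s,
                                # still far below card-network throughput
```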

Proceedings Article
01 Jan 2016
TL;DR: In this article, the Dynamic Filter Network (DFN) is proposed, where filters are generated dynamically conditioned on an input, and a wide variety of filtering operations can be learned this way, including local spatial transformations, selective (de)blurring, and adaptive feature extraction.
Abstract: In a traditional convolutional layer, the learned filters stay fixed after training. In contrast, we introduce a new framework, the Dynamic Filter Network, where filters are generated dynamically conditioned on an input. We show that this architecture is a powerful one, with increased flexibility thanks to its adaptive nature, yet without an excessive increase in the number of model parameters. A wide variety of filtering operations can be learned this way, including local spatial transformations, but also others like selective (de)blurring or adaptive feature extraction. Moreover, multiple such layers can be combined, e.g. in a recurrent architecture. We demonstrate the effectiveness of the dynamic filter network on the tasks of video and stereo prediction, and reach state-of-the-art performance on the moving MNIST dataset with a much smaller model. By visualizing the learned filters, we illustrate that the network has picked up flow information by only looking at unlabelled training data. This suggests that the network can be used to pretrain networks for various supervised tasks in an unsupervised way, like optical flow and depth estimation.
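A minimal sketch of the core mechanism, assuming a hypothetical `filter_net` callable that maps the input to k×k filter weights; in the paper the generated filters can also vary per output position (dynamic local filtering), which is omitted here for brevity:

```python
# Sketch of a dynamic convolution: the filter is produced from the input itself
# by a filter-generating network, rather than being fixed after training.
import numpy as np

def dynamic_filtering(x: np.ndarray, filter_net, k: int = 3) -> np.ndarray:
    """x: 2D input; filter_net: hypothetical callable returning k*k weights."""
    w = filter_net(x).reshape(k, k)  # sample-specific, dynamically generated filter
    pad = k // 2
    xp = np.pad(x, pad)
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = (xp[i:i + k, j:j + k] * w).sum()  # plain cross-correlation
    return out
```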