
Showing papers by "Technical University of Denmark" published in 2019


Journal ArticleDOI
TL;DR: A deep neural network-based approach that improves SP prediction across all domains of life and distinguishes between three types of prokaryotic SPs is presented.
Abstract: Signal peptides (SPs) are short amino acid sequences in the amino terminus of many newly synthesized proteins that target proteins into, or across, membranes. Bioinformatic tools can predict SPs from amino acid sequences, but most cannot distinguish between various types of signal peptides. We present a deep neural network-based approach that improves SP prediction across all domains of life and distinguishes between three types of prokaryotic SPs.

2,732 citations


Journal ArticleDOI
TL;DR: AntiSMASH 5 adds detection rules for clusters encoding the biosynthesis of acyl-amino acids, β-lactones, fungal RiPPs, RaS-RiPPs, polybrominated diphenyl ethers, C-nucleosides, PPY-like ketones and lipolanthines, and provides more detailed predictions for type II polyketide synthase-encoding gene clusters.
Abstract: Secondary metabolites produced by bacteria and fungi are an important source of antimicrobials and other bioactive compounds. In recent years, genome mining has seen broad applications in identifying and characterizing new compounds as well as in metabolic engineering. Since 2011, the 'antibiotics and secondary metabolite analysis shell-antiSMASH' (https://antismash.secondarymetabolites.org) has assisted researchers in this, both as a web server and a standalone tool. It has established itself as the most widely used tool for identifying and analysing biosynthetic gene clusters (BGCs) in bacterial and fungal genome sequences. Here, we present an entirely redesigned and extended version 5 of antiSMASH. antiSMASH 5 adds detection rules for clusters encoding the biosynthesis of acyl-amino acids, β-lactones, fungal RiPPs, RaS-RiPPs, polybrominated diphenyl ethers, C-nucleosides, PPY-like ketones and lipolanthines. For type II polyketide synthase-encoding gene clusters, antiSMASH 5 now offers more detailed predictions. The HTML output visualization has been redesigned to improve the navigation and visual representation of annotations. We have again improved the runtime of analysis steps, making it possible to deliver comprehensive annotations for bacterial genomes within a few minutes. A new output file in the standard JavaScript object notation (JSON) format is aimed at downstream tools that process antiSMASH results programmatically.

2,084 citations
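Since the new JSON output is aimed at downstream tools that process antiSMASH results programmatically, a minimal consumption sketch in Python might look like the following. The file name and the "records"/"id" keys are assumptions for illustration, not the documented antiSMASH 5 schema.

```python
import json

# Load an antiSMASH 5 result file (the path is hypothetical).
with open("my_genome_results.json") as fh:
    results = json.load(fh)

# Walk the top-level records and report their identifiers.
# The keys "records" and "id" are assumed here for illustration;
# consult the antiSMASH documentation for the actual schema.
for record in results.get("records", []):
    print(record.get("id", "<unnamed record>"))
```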


Journal ArticleDOI
TL;DR: A broad and historical view of different aspects and their complex interplay in CO2R catalysis on Cu is taken, with the purpose of providing new insights, critical evaluations, and guidance to the field with regard to research directions and best practices.
Abstract: To date, copper is the only heterogeneous catalyst that has shown a propensity to produce valuable hydrocarbons and alcohols, such as ethylene and ethanol, from electrochemical CO2 reduction (CO2R). There are a variety of factors that impact CO2R activity and selectivity, including the catalyst surface structure, morphology, composition, the choice of electrolyte ions and pH, and the electrochemical cell design. Many of these factors are often intertwined, which can complicate catalyst discovery and design efforts. Here we take a broad and historical view of these different aspects and their complex interplay in CO2R catalysis on Cu, with the purpose of providing new insights, critical evaluations, and guidance to the field with regard to research directions and best practices. First, we describe the various experimental probes and complementary theoretical methods that have been used to discern the mechanisms by which products are formed, and next we present our current understanding of the complex reaction networks for CO2R on Cu. We then analyze two key methods that have been used in attempts to alter the activity and selectivity of Cu: nanostructuring and the formation of bimetallic electrodes. Finally, we offer some perspectives on the future outlook for electrochemical CO2R.

2,055 citations


Journal ArticleDOI
TL;DR: This work critically discusses the advantages and disadvantages of a unified terminology, proposes a definition and categorization framework, and highlights areas of uncertainty on how to define and categorize plastic debris.
Abstract: The accumulation of plastic litter in natural environments is a global issue. Concerns over potential negative impacts on the economy, wildlife, and human health provide strong incentives for improving the sustainable use of plastics. Despite the many voices raised on the issue, we lack a consensus on how to define and categorize plastic debris. This is evident for microplastics, where inconsistent size classes are used and where the materials to be included are under debate. While this is inherent in an emerging research field, an ambiguous terminology results in confusion and miscommunication that may compromise progress in research and mitigation measures. Therefore, we need to be explicit on what exactly we consider plastic debris. Thus, we critically discuss the advantages and disadvantages of a unified terminology, propose a definition and categorization framework, and highlight areas of uncertainty. Going beyond size classes, our framework includes physicochemical properties (polymer composition, solid state, solubility) as defining criteria and size, shape, color, and origin as classifiers for categorization. Acknowledging the rapid evolution of our knowledge on plastic pollution, our framework will promote consensus building within the scientific and regulatory community based on a solid scientific foundation.

1,119 citations


Journal ArticleDOI
TL;DR: Climate change strongly impacts regions at high latitudes and altitudes that store large amounts of carbon in still-frozen ground, and the authors show that the consequence of these changes is global warming of permafrost at depths greater than 10 m in the Northern Hemisphere, in mountains, and in Antarctica.
Abstract: Permafrost warming has the potential to amplify global climate change, because thawing frozen sediments unlocks stored soil organic carbon. Yet to date, no globally consistent assessment of permafrost temperature change has been compiled. Here we use a global data set of permafrost temperature time series from the Global Terrestrial Network for Permafrost to evaluate temperature change across permafrost regions for the period since the International Polar Year (2007–2009). During the reference decade between 2007 and 2016, ground temperature near the depth of zero annual amplitude in the continuous permafrost zone increased by 0.39 ± 0.15 °C. Over the same period, discontinuous permafrost warmed by 0.20 ± 0.10 °C. Permafrost in mountains warmed by 0.19 ± 0.05 °C and in Antarctica by 0.37 ± 0.10 °C. Globally, permafrost temperature increased by 0.29 ± 0.12 °C. The observed trend follows the Arctic amplification of air temperature increase in the Northern Hemisphere. In the discontinuous zone, however, ground warming occurred due to increased snow thickness while air temperature remained statistically unchanged.

906 citations


Journal ArticleDOI
01 Jan 2019-Nature
TL;DR: It is demonstrated that a strategy that uses multi-epitope, personalized neoantigen vaccination, which has previously been tested in patients with high-risk melanoma, is feasible for tumours such as glioblastoma, which typically have a relatively low mutation load and an immunologically ‘cold’ tumour microenvironment.
Abstract: Neoantigens, which are derived from tumour-specific protein-coding mutations, are exempt from central tolerance, can generate robust immune responses1,2 and can function as bona fide antigens that facilitate tumour rejection3. Here we demonstrate that a strategy that uses multi-epitope, personalized neoantigen vaccination, which has previously been tested in patients with high-risk melanoma4–6, is feasible for tumours such as glioblastoma, which typically have a relatively low mutation load1,7 and an immunologically ‘cold’ tumour microenvironment8. We used personalized neoantigen-targeting vaccines to immunize patients newly diagnosed with glioblastoma following surgical resection and conventional radiotherapy in a phase I/Ib study. Patients who did not receive dexamethasone—a highly potent corticosteroid that is frequently prescribed to treat cerebral oedema in patients with glioblastoma—generated circulating polyfunctional neoantigen-specific CD4+ and CD8+ T cell responses that were enriched in a memory phenotype and showed an increase in the number of tumour-infiltrating T cells. Using single-cell T cell receptor analysis, we provide evidence that neoantigen-specific T cells from the peripheral blood can migrate into an intracranial glioblastoma tumour. Neoantigen-targeting vaccines thus have the potential to favourably alter the immune milieu of glioblastoma.

844 citations


Journal ArticleDOI
22 May 2019-Nature
TL;DR: A protocol for the electrochemical reduction of nitrogen to ammonia enables isotope-sensitive quantification of the ammonia produced and the identification and removal of contaminants, and should help to prevent false positives from appearing in the literature.
Abstract: The electrochemical synthesis of ammonia from nitrogen under mild conditions using renewable electricity is an attractive alternative1–4 to the energy-intensive Haber–Bosch process, which dominates industrial ammonia production. However, there are considerable scientific and technical challenges5,6 facing the electrochemical alternative, and most experimental studies reported so far have achieved only low selectivities and conversions. The amount of ammonia produced is usually so small that it cannot be firmly attributed to electrochemical nitrogen fixation7–9 rather than contamination from ammonia that is either present in air, human breath or ion-conducting membranes9, or generated from labile nitrogen-containing compounds (for example, nitrates, amines, nitrites and nitrogen oxides) that are typically present in the nitrogen gas stream10, in the atmosphere or even in the catalyst itself. Although these sources of experimental artefacts are beginning to be recognized and managed11,12, concerted efforts to develop effective electrochemical nitrogen reduction processes would benefit from benchmarking protocols for the reaction and from a standardized set of control experiments designed to identify and then eliminate or quantify the sources of contamination. Here we propose a rigorous procedure using 15N2 that enables us to reliably detect and quantify the electrochemical reduction of nitrogen to ammonia. We demonstrate experimentally the importance of various sources of contamination, and show how to remove labile nitrogen-containing compounds from the nitrogen gas as well as how to perform quantitative isotope measurements with cycling of 15N2 gas to reduce both contamination and the cost of isotope measurements. Following this protocol, we find that no ammonia is produced when using the most promising pure-metal catalysts for this reaction in aqueous media, and we successfully confirm and quantify ammonia synthesis using lithium electrodeposition in tetrahydrofuran13. The use of this rigorous protocol should help to prevent false positives from appearing in the literature, thus enabling the field to focus on viable pathways towards the practical electrochemical reduction of nitrogen to ammonia.

819 citations


Journal ArticleDOI
TL;DR: This protocol provides an overview of all new features of the COBRA Toolbox and can be adapted to generate and analyze constraint-based models in a wide variety of scenarios.
Abstract: Constraint-based reconstruction and analysis (COBRA) provides a molecular mechanistic framework for integrative analysis of experimental molecular systems biology data and quantitative prediction of physicochemically and biochemically feasible phenotypic states. The COBRA Toolbox is a comprehensive desktop software suite of interoperable COBRA methods. It has found widespread application in biology, biomedicine, and biotechnology because its functions can be flexibly combined to implement tailored COBRA protocols for any biochemical network. This protocol is an update to the COBRA Toolbox v.1.0 and v.2.0. Version 3.0 includes new methods for quality-controlled reconstruction, modeling, topological analysis, strain and experimental design, and network visualization, as well as network integration of chemoinformatic, metabolomic, transcriptomic, proteomic, and thermochemical data. New multi-lingual code integration also enables an expansion in COBRA application scope via high-precision, high-performance, and nonlinear numerical optimization solvers for multi-scale, multi-cellular, and reaction kinetic modeling, respectively. This protocol provides an overview of all these new features and can be adapted to generate and analyze constraint-based models in a wide variety of scenarios. The COBRA Toolbox v.3.0 provides an unparalleled depth of COBRA methods.

719 citations


Journal ArticleDOI
Andrea Cossarizza, Hyun-Dong Chang, Andreas Radbruch, Andreas Acs, and 459 more authors (160 institutions)
TL;DR: These guidelines are a consensus work of a considerable number of members of the immunology and flow cytometry community, providing the theory and key practical aspects of flow cytometry and enabling immunologists to avoid the common errors that often undermine immunological data.
Abstract: These guidelines are a consensus work of a considerable number of members of the immunology and flow cytometry community. They provide the theory and key practical aspects of flow cytometry enabling immunologists to avoid the common errors that often undermine immunological data. Notably, there are comprehensive sections of all major immune cell types with helpful Tables detailing phenotypes in murine and human cells. The latest flow cytometry techniques and applications are also described, featuring examples of the data that can be generated and, importantly, how the data can be analysed. Furthermore, there are sections detailing tips, tricks and pitfalls to avoid, all written and peer-reviewed by leading experts in the field, making this an essential research companion.

698 citations


Journal ArticleDOI
TL;DR: A wealth of candidates is being investigated to improve the catalysts found in acidic and alkaline electrolysers, but attention should be focused on developing stable water oxidation catalysts with improved intrinsic activity, not only increased geometric activity.
Abstract: A wealth of candidates is being investigated to improve the catalysts found in acidic and alkaline electrolysers. However, attention should be focused on developing stable water oxidation catalysts with improved intrinsic activity — not only increased geometric activity — alongside best practice for data collection.

631 citations


Journal ArticleDOI
TL;DR: An overview of these new P2P electricity markets is contributed, starting with the motivation, challenges and market designs and moving to potential future developments in this field, with recommendations provided while considering a test case.
Abstract: The advent of more proactive consumers, the so-called "prosumers", with production and storage capabilities, is empowering consumers and bringing new opportunities and challenges to the operation of power systems in a market environment. Recently, a novel proposal for the design and operation of electricity markets has emerged: these so-called peer-to-peer (P2P) electricity markets conceptually allow prosumers to directly share their electrical energy and investment. Such P2P markets rely on a consumer-centric and bottom-up perspective by giving consumers the opportunity to freely choose the way they buy their electric energy. A community can also be formed by prosumers who want to collaborate on investment or on operational energy management. This paper contributes an overview of these new P2P markets, starting with the motivation, challenges and market designs and moving to potential future developments in this field, providing recommendations while considering a test case.

Journal ArticleDOI
TL;DR: It is suggested that global AMR gene diversity and abundance vary by region, and that improving sanitation and health could potentially limit the global burden of AMR.
Abstract: Antimicrobial resistance (AMR) is a serious threat to global public health, but obtaining representative data on AMR for healthy human populations is difficult. Here, we use metagenomic analysis of untreated sewage to characterize the bacterial resistome from 79 sites in 60 countries. We find systematic differences in abundance and diversity of AMR genes between Europe/North-America/Oceania and Africa/Asia/South-America. Antimicrobial use data and bacterial taxonomy explain only a minor part of the AMR variation that we observe. We find no evidence for cross-selection between antimicrobial classes, or for an effect of air travel between sites. However, AMR gene abundance strongly correlates with socio-economic, health and environmental factors, which we use to predict AMR gene abundances in all countries in the world. Our findings suggest that global AMR gene diversity and abundance vary by region, and that improving sanitation and health could potentially limit the global burden of AMR. We propose metagenomic analysis of sewage as an ethically acceptable and economically feasible approach for continuous global surveillance and prediction of AMR.

Journal ArticleDOI
TL;DR: Simba, as discussed by the authors, is the next generation of the Mufasa cosmological galaxy formation simulations, run with Gizmo's meshless finite mass hydrodynamics; it includes updates to Mufasa's sub-resolution star formation and feedback prescriptions, and introduces black hole growth via the torque-limited accretion model of Angles-Alcazar et al. (2017).
Abstract: We introduce the Simba simulations, the next generation of the Mufasa cosmological galaxy formation simulations run with Gizmo's meshless finite mass hydrodynamics. Simba includes updates to Mufasa's sub-resolution star formation and feedback prescriptions, and introduces black hole growth via the torque-limited accretion model of Angles-Alcazar et al. (2017) from cold gas and Bondi accretion from hot gas, along with black hole feedback via kinetic bipolar outflows and X-ray energy. Ejection velocities are taken to be ~10^3 km/s at high Eddington ratios, increasing to ~8000 km/s at Eddington ratios below 2%, with a constant momentum input of 20L/c. Simba further includes an on-the-fly dust production, growth, and destruction model. Our Simba run with (100 Mpc/h)^3 and 1024^3 gas elements reproduces numerous observables, including galaxy stellar mass functions at z=0-6, the stellar mass--star formation rate main sequence, HI and H2 fractions, the mass-metallicity relation at z=0 and z=2, star-forming galaxy sizes, hot gas fractions in massive halos, and z=0 galaxy dust properties. However, Simba also yields an insufficiently sharp truncation of the z=0 mass function, and too-large sizes for low-mass quenched galaxies. We show that Simba's jet feedback is primarily responsible for quenching massive galaxies.
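As a worked reading of the quoted feedback parameters (our arithmetic, not text from the paper), a constant momentum input of 20L/c ties the outflow mass loading to the ejection velocity:

```latex
\dot{M}_{\mathrm{out}}\, v_{\mathrm{out}} = 20\,\frac{L}{c}
\qquad\Longrightarrow\qquad
\dot{M}_{\mathrm{out}} = \frac{20\,L}{c\, v_{\mathrm{out}}},
```

so at fixed luminosity, raising v_out from ~1000 km/s to ~8000 km/s at low Eddington ratios cuts the ejected mass flux by roughly a factor of eight.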

Journal ArticleDOI
TL;DR: A review of approaches for business model innovation for circular economy and/or sustainability, based on a systematic analysis of academic literature and practitioner-based methodologies, is presented in this paper.

Journal ArticleDOI
30 Sep 2019
TL;DR: During the development of TargetP 2.0, a state-of-the-art method for predicting N-terminal sorting signals, a previously overlooked biological signal for subcellular targeting was found by examining the output of the deep learning model.
Abstract: In bioinformatics, machine learning methods have been used to predict features embedded in the sequences. In contrast to what is generally assumed, machine learning approaches can also provide new insights into the underlying biology. Here, we demonstrate this by presenting TargetP 2.0, a novel state-of-the-art method to identify N-terminal sorting signals, which direct proteins to the secretory pathway, mitochondria, and chloroplasts or other plastids. By examining the strongest signals from the attention layer in the network, we find that the second residue in the protein, that is, the one following the initial methionine, has a strong influence on the classification. We observe that two-thirds of chloroplast and thylakoid transit peptides have an alanine in position 2, compared with 20% in other plant proteins. We also note that in fungi and single-celled eukaryotes, less than 30% of the targeting peptides have an amino acid that allows the removal of the N-terminal methionine compared with 60% for the proteins without targeting peptide. The importance of this feature for predictions has not been highlighted before.
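The position-2 statistic quoted above (two-thirds of chloroplast and thylakoid transit peptides carrying alanine after the initial methionine) is easy to reproduce on one's own sequence set; the sketch below, with a hypothetical FASTA file name, simply tallies the residue at position 2.

```python
from collections import Counter

def position2_counts(fasta_path):
    """Count the residue directly following the initial methionine
    for every sequence in a FASTA file."""
    counts = Counter()
    seq_lines = []

    def flush():
        seq = "".join(seq_lines)
        if len(seq) > 1:
            counts[seq[1]] += 1

    with open(fasta_path) as fh:
        for line in fh:
            line = line.strip()
            if line.startswith(">"):
                flush()
                seq_lines.clear()
            elif line:
                seq_lines.append(line)
    flush()
    return counts

# Hypothetical input file; report the alanine fraction at position 2.
counts = position2_counts("plant_proteins.fasta")
print(counts["A"] / sum(counts.values()))
```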

Journal ArticleDOI
TL;DR: An overview of the application of ML to optical communications and networking is provided, relevant literature is classified and surveyed, and an introductory tutorial on ML is provided for researchers and practitioners interested in this field.
Abstract: Today’s telecommunication networks have become sources of enormous amounts of widely heterogeneous data. This information can be retrieved from network traffic traces, network alarms, signal quality indicators, users’ behavioral data, etc. Advanced mathematical tools are required to extract meaningful information from these data and to take decisions pertaining to the proper functioning of the networks. Among these mathematical tools, machine learning (ML) is regarded as one of the most promising methodological approaches to perform network-data analysis and enable automated network self-configuration and fault management. The adoption of ML techniques in the field of optical communication networks is motivated by the unprecedented growth of network complexity faced by optical networks in the last few years. Such complexity increase is due to the introduction of a huge number of adjustable and interdependent system parameters (e.g., routing configurations, modulation format, symbol rate, coding schemes, etc.) that are enabled by the usage of coherent transmission/reception technologies, advanced digital signal processing, and compensation of nonlinear effects in optical fiber propagation. In this paper we provide an overview of the application of ML to optical communications and networking. We classify and survey relevant literature dealing with the topic, and we also provide an introductory tutorial on ML for researchers and practitioners interested in this field. Although a good number of research papers have recently appeared, the application of ML to optical networks is still in its infancy: to stimulate further work in this area, we conclude this paper by proposing possible new research directions.

Journal ArticleDOI
25 Oct 2019-Science
TL;DR: This Review explores grand challenges in wind energy research that must be addressed to enable wind energy to supply one-third to one-half, or even more, of the world’s electricity needs.
Abstract: Harvested by advanced technical systems honed over decades of research and development, wind energy has become a mainstream energy resource. However, continued innovation is needed to realize the potential of wind to serve the global demand for clean energy. Here, we outline three interdependent, cross-disciplinary grand challenges underpinning this research endeavor. The first is the need for a deeper understanding of the physics of atmospheric flow in the critical zone of plant operation. The second involves science and engineering of the largest dynamic, rotating machines in the world. The third encompasses optimization and control of fleets of wind plants working synergistically within the electricity grid. Addressing these challenges could enable wind power to provide as much as half of our global electricity needs and perhaps beyond.

Journal ArticleDOI
27 Feb 2019-Nature
TL;DR: In this paper, the authors report the complete biosynthesis of the major cannabinoids cannabigerolic acid, Δ9-tetrahydrocannabinolic acid and cannabidiolic acid in Saccharomyces cerevisiae, from the simple sugar galactose.
Abstract: Cannabis sativa L. has been cultivated and used around the globe for its medicinal properties for millennia1. Some cannabinoids, the hallmark constituents of Cannabis, and their analogues have been investigated extensively for their potential medical applications2. Certain cannabinoid formulations have been approved as prescription drugs in several countries for the treatment of a range of human ailments3. However, the study and medicinal use of cannabinoids has been hampered by the legal scheduling of Cannabis, the low in planta abundances of nearly all of the dozens of known cannabinoids4, and their structural complexity, which limits bulk chemical synthesis. Here we report the complete biosynthesis of the major cannabinoids cannabigerolic acid, Δ9-tetrahydrocannabinolic acid, cannabidiolic acid, Δ9-tetrahydrocannabivarinic acid and cannabidivarinic acid in Saccharomyces cerevisiae, from the simple sugar galactose. To accomplish this, we engineered the native mevalonate pathway to provide a high flux of geranyl pyrophosphate and introduced a heterologous, multi-organism-derived hexanoyl-CoA biosynthetic pathway5. We also introduced the Cannabis genes that encode the enzymes involved in the biosynthesis of olivetolic acid6, as well as the gene for a previously undiscovered enzyme with geranylpyrophosphate:olivetolate geranyltransferase activity and the genes for corresponding cannabinoid synthases7,8. Furthermore, we established a biosynthetic approach that harnessed the promiscuity of several pathway genes to produce cannabinoid analogues. Feeding different fatty acids to our engineered strains yielded cannabinoid analogues with modifications in the part of the molecule that is known to alter receptor binding affinity and potency9. We also demonstrated that our biological system could be complemented by simple synthetic chemistry to further expand the accessible chemical space. Our work presents a platform for the production of natural and unnatural cannabinoids that will allow for more rigorous study of these compounds and could be used in the development of treatments for a variety of human health problems.

Journal ArticleDOI
TL;DR: MIBiG 2.0 is presented, which encompasses major updates to the schema, the data, and the online repository itself, and improves the user experience by adding new features such as query searches and a statistics page, and enabled direct link-outs to chemical structure databases.
Abstract: Fueled by the explosion of (meta)genomic data, genome mining of specialized metabolites has become a major technology for drug discovery and studying microbiome ecology. In these efforts, computational tools like antiSMASH have played a central role through the analysis of Biosynthetic Gene Clusters (BGCs). Thousands of candidate BGCs from microbial genomes have been identified and stored in public databases. Interpreting the function and novelty of these predicted BGCs requires comparison with a well-documented set of BGCs of known function. The MIBiG (Minimum Information about a Biosynthetic Gene Cluster) Data Standard and Repository was established in 2015 to enable curation and storage of known BGCs. Here, we present MIBiG 2.0, which encompasses major updates to the schema, the data, and the online repository itself. Over the past five years, 851 new BGCs have been added. Additionally, we performed extensive manual data curation of all entries to improve the annotation quality of our repository. We also redesigned the data schema to ensure the compliance of future annotations. Finally, we improved the user experience by adding new features such as query searches and a statistics page, and enabled direct link-outs to chemical structure databases. The repository is accessible online at https://mibig.secondarymetabolites.org/.

Journal ArticleDOI
01 Jun 2019-Proteins
TL;DR: The accuracy of NetSurfP-2.0 is assessed and it is found to consistently produce state-of-the-art predictions for each of its output features; the processing time has been optimized to allow predicting more than 1000 proteins in less than 2 hours, and complete proteomes in less than 1 day.
Abstract: The ability to predict local structural features of a protein from the primary sequence is of paramount importance for unraveling its function in absence of experimental structural information. Two main factors affect the utility of potential prediction tools: their accuracy must enable extraction of reliable structural information on the proteins of interest, and their runtime must be low to keep pace with sequencing data being generated at a constantly increasing speed. Here, we present NetSurfP-2.0, a novel tool that can predict the most important local structural features with unprecedented accuracy and runtime. NetSurfP-2.0 is sequence-based and uses an architecture composed of convolutional and long short-term memory neural networks trained on solved protein structures. Using a single integrated model, NetSurfP-2.0 predicts solvent accessibility, secondary structure, structural disorder, and backbone dihedral angles for each residue of the input sequences. We assessed the accuracy of NetSurfP-2.0 on several independent test datasets and found it to consistently produce state-of-the-art predictions for each of its output features. We observe a correlation of 80% between predictions and experimental data for solvent accessibility, and a precision of 85% on secondary structure 3-class predictions. In addition to improved accuracy, the processing time has been optimized to allow predicting more than 1000 proteins in less than 2 hours, and complete proteomes in less than 1 day.
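The abstract describes a single integrated model of convolutional and long short-term memory layers emitting several per-residue outputs. The following is a rough, hypothetical sketch of that kind of architecture; the layer sizes and head definitions are our illustrative assumptions, not the published NetSurfP-2.0 configuration.

```python
import torch
import torch.nn as nn

class PerResidueNet(nn.Module):
    """Sketch of a CNN + bidirectional LSTM multi-task model.

    Input:  (batch, seq_len, in_features) per-residue encodings.
    Output: per-residue heads for relative solvent accessibility (RSA),
    3-class secondary structure, disorder, and phi/psi dihedral angles.
    """

    def __init__(self, in_features=20, conv_channels=32, lstm_hidden=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(in_features, conv_channels, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(conv_channels, conv_channels, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        self.lstm = nn.LSTM(conv_channels, lstm_hidden, num_layers=2,
                            batch_first=True, bidirectional=True)
        d = 2 * lstm_hidden
        self.rsa_head = nn.Linear(d, 1)        # regression, sigmoid -> [0, 1]
        self.ss3_head = nn.Linear(d, 3)        # helix / strand / coil logits
        self.disorder_head = nn.Linear(d, 1)   # per-residue disorder logit
        self.angle_head = nn.Linear(d, 4)      # sin/cos of phi and psi

    def forward(self, x):
        # Conv1d expects (batch, channels, seq_len); transpose in and out.
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)
        h, _ = self.lstm(h)
        return {
            "rsa": torch.sigmoid(self.rsa_head(h)).squeeze(-1),
            "ss3": self.ss3_head(h),
            "disorder": self.disorder_head(h).squeeze(-1),
            "angles": torch.tanh(self.angle_head(h)),  # bounded sin/cos
        }

# Example: 2 sequences of length 100 with one-hot amino-acid encoding.
out = PerResidueNet()(torch.randn(2, 100, 20))
print({k: tuple(v.shape) for k, v in out.items()})
```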

Journal ArticleDOI
20 Mar 2019-Joule
TL;DR: In this paper, density functional theory (DFT) calculations of adsorption energies on a random subset of the available binding sites on the surface of the HEA IrPdPtRhRu were used to train a model that predicts the adsorption energies of the remaining sites.

Journal ArticleDOI
TL;DR: It is shown that simple combinations of the top algorithms result in higher kappa metric values than any algorithm individually, with 0.93 for the best combination.
Abstract: Automated detection of cancer metastases in lymph nodes has the potential to improve the assessment of prognosis for patients. To enable fair comparison between the algorithms for this purpose, we set up the CAMELYON17 challenge in conjunction with the IEEE International Symposium on Biomedical Imaging 2017 Conference in Melbourne. Over 300 participants registered on the challenge website, of which 23 teams submitted a total of 37 algorithms before the initial deadline. Participants were provided with 899 whole-slide images (WSIs) for developing their algorithms. The developed algorithms were evaluated based on the test set encompassing 100 patients and 500 WSIs. The evaluation metric used was a quadratic weighted Cohen’s kappa. We discuss the algorithmic details of the 10 best pre-conference and two post-conference submissions. All these participants used convolutional neural networks in combination with pre- and postprocessing steps. Algorithms differed mostly in neural network architecture, training strategy, and pre- and postprocessing methodology. Overall, the kappa metric ranged from 0.89 to −0.13 across all submissions. The best results were obtained with pre-trained architectures such as ResNet. Confusion matrix analysis revealed that all participants struggled with reliably identifying isolated tumor cells, the smallest type of metastasis, with detection rates below 40%. Qualitative inspection of the results of the top participants showed categories of false positives, such as nerves or contamination, which could be targeted for further optimization. Last, we show that simple combinations of the top algorithms result in higher kappa metric values than any algorithm individually, with 0.93 for the best combination.
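Since the evaluation metric is a quadratic weighted Cohen's kappa, a compact reference implementation may make the metric concrete. The five-class toy labels below are our own example (CAMELYON17 uses ordinal pN-stages), not challenge data.

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted Cohen's kappa for ordinal labels 0..n_classes-1."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    # Observed confusion matrix.
    O = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        O[t, p] += 1
    # Expected matrix under independence of the two raters.
    hist_t = O.sum(axis=1)
    hist_p = O.sum(axis=0)
    E = np.outer(hist_t, hist_p) / O.sum()
    # Quadratic disagreement weights, zero on the diagonal.
    i, j = np.indices((n_classes, n_classes))
    W = (i - j) ** 2 / (n_classes - 1) ** 2
    return 1.0 - (W * O).sum() / (W * E).sum()

# Tiny example with 5 ordinal classes.
print(quadratic_weighted_kappa([0, 1, 2, 4, 3], [0, 1, 1, 4, 3], 5))
```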

Journal ArticleDOI
TL;DR: In this paper, a multi-scale modeling approach that combines size-modified Poisson-Boltzmann theory with ab initio simulations of field effects on critical reaction intermediates is presented.
Abstract: Solid–liquid interface engineering has recently emerged as a promising technique to optimize the activity and product selectivity of the electrochemical reduction of CO2. In particular, the cation identity and the interfacial electric field have been shown to have a particularly significant impact on the activity of desired products. Using a combination of theoretical and experimental investigations, we show the cation size and its resultant impact on the interfacial electric field to be the critical factor behind the ion specificity of electrochemical CO2 reduction. We present a multi-scale modeling approach that combines size-modified Poisson–Boltzmann theory with ab initio simulations of field effects on critical reaction intermediates. The model shows an unprecedented quantitative agreement with experimental trends in cation effects on CO production on Ag, C2 production on Cu, CO vibrational signatures on Pt and Cu as well as Au(111) single crystal experimental double layer capacitances. The insights obtained represent quantitative evidence for the impact of cations on the interfacial electric field. Finally, we present design principles to increase the activity and selectivity of any field-sensitive electrochemical process based on the surface charging properties: the potential of zero charge, the ion size, and the double layer capacitance.
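For context, "size-modified Poisson–Boltzmann theory" commonly refers to the Borukhov–Andelman–Orland form, which for a symmetric z:z electrolyte of bulk concentration c_0 and effective ion size a caps the interfacial ion density. This is the textbook rendering, which may differ in detail from the model used in the paper:

```latex
\nabla \cdot \left( \varepsilon \nabla \phi \right)
  = \frac{2 z e c_0 \sinh\!\big(z e \phi / k_B T\big)}
         {1 - \nu + \nu \cosh\!\big(z e \phi / k_B T\big)},
\qquad \nu = 2 a^{3} c_0 ,
```

which reduces to the classical Poisson–Boltzmann equation as the packing fraction ν → 0 and is what makes the predicted double-layer field depend on cation size.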

Journal ArticleDOI
TL;DR: An expanded GWAS of birth weight and subsequent analysis using structural equation modeling and Mendelian randomization decomposes maternal and fetal genetic contributions and illuminates causal links between birth weight, blood pressure and glycemic traits.
Abstract: Birth weight variation is influenced by fetal and maternal genetic and non-genetic factors, and has been reproducibly associated with future cardio-metabolic health outcomes. In expanded genome-wide association analyses of own birth weight (n = 321,223) and offspring birth weight (n = 230,069 mothers), we identified 190 independent association signals (129 of which are novel). We used structural equation modeling to decompose the contributions of direct fetal and indirect maternal genetic effects, then applied Mendelian randomization to illuminate causal pathways. For example, both indirect maternal and direct fetal genetic effects drive the observational relationship between lower birth weight and higher later blood pressure: maternal blood pressure-raising alleles reduce offspring birth weight, but only direct fetal effects of these alleles, once inherited, increase later offspring blood pressure. Using maternal birth weight-lowering genotypes to proxy for an adverse intrauterine environment provided no evidence that it causally raises offspring blood pressure, indicating that the inverse birth weight-blood pressure association is attributable to genetic effects, and not to intrauterine programming.
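The decomposition of direct fetal and indirect maternal effects can be made concrete with a schematic model (our simplification of the structural equation approach, not the paper's full specification). Writing birth weight as a function of the child's and mother's allele counts at a variant:

```latex
BW = \beta_{\mathrm{fetal}}\, G_{\mathrm{child}}
   + \beta_{\mathrm{maternal}}\, G_{\mathrm{mother}} + \varepsilon ,
\qquad \mathrm{Corr}(G_{\mathrm{child}}, G_{\mathrm{mother}}) = \tfrac{1}{2},
```

a naive regression of own birth weight on G_child alone estimates roughly β_fetal + ½β_maternal; jointly modeling own and offspring birth weight is what allows the two coefficients to be separated.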

Journal ArticleDOI
TL;DR: A relaxed consensus + innovation (RCI) approach is described to solve the multi-bilateral economic dispatch (MBED) problem in a fully decentralized manner, allowing for a market clearing that converges with a negligible optimality gap while only a limited amount of information is shared.
Abstract: With the sustained deployment of distributed generation capacities and the more proactive role of consumers, power systems and their operation are drifting away from a conventional top–down hierarchical structure. Electricity market structures, however, have not yet embraced that evolution. Respecting the high-dimensional, distributed and dynamic nature of modern power systems would translate to designing peer-to-peer markets or, at least, to using such an underlying decentralized structure to enable a bottom–up approach to future electricity markets. A peer-to-peer market structure based on a multi-bilateral economic dispatch (MBED) formulation is introduced, allowing for multi-bilateral trading with product differentiation, for instance based on consumer preferences. A relaxed consensus + innovation (RCI) approach is described to solve the MBED in a fully decentralized manner. A set of realistic case studies and their analysis allow us to show that such peer-to-peer market structures can effectively yield market outcomes that are different from centralized market structures and optimal in terms of respecting consumers' preferences while maximizing social welfare. Additionally, the RCI solving approach allows for a fully decentralized market clearing that converges with a negligible optimality gap, with a limited amount of information being shared.
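Schematically, a multi-bilateral economic dispatch can be written as below; this is our condensed sketch under simplifying assumptions (single period, no network constraints), not the paper's exact formulation:

```latex
\min_{p,\,P} \; \sum_{n \in \Omega} \Big( C_n(p_n)
      + \sum_{m \in \omega_n} \gamma_{nm} P_{nm} \Big)
\quad \text{s.t.} \quad
p_n = \sum_{m \in \omega_n} P_{nm}, \qquad
P_{nm} + P_{mn} = 0 ,
```

where P_nm is the bilateral trade between peers n and m, γ_nm prices product differentiation (e.g., a preference for local or green energy), and the reciprocity constraint P_nm + P_mn = 0 is what the consensus + innovation iterations negotiate bilaterally, so that no central operator needs to see all trades.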

Journal ArticleDOI
TL;DR: A microkinetic model for CO2 and CO reduction on copper, based on ab initio simulations, is presented to elucidate the impact of pH on competing reaction pathways and to show how reaction conditions can lead to significant enhancements in selectivity and activity towards higher-value C2 products.
Abstract: We present a microkinetic model for CO2 reduction (CO2R) on Cu(211) towards C2 products, based on energetics estimated from an explicit solvent model. We show that the differences in both Tafel slopes and pH dependence for C1 versus C2 activity arise from differences in their multi-step mechanisms. We find the depletion in C2 products observed at high overpotential and high pH to arise from the second-order dependence of C-C coupling on CO coverage, which decreases due to competition from the C1 pathway. We further demonstrate that CO2 and CO reduction at a fixed pH yield similar activities, due to the facile kinetics for CO2 reduction to CO on Cu, which suggests that C2 products are favored for CO2R under alkaline conditions. The mechanistic insights of this work elucidate how reaction conditions can lead to significant enhancements in selectivity and activity towards higher-value C2 products.
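The selectivity argument in this abstract reduces to reaction orders in adsorbed CO; in our schematic notation (not the paper's full microkinetic model):

```latex
r_{\mathrm{C_2}} \propto \theta_{\mathrm{CO}}^{2},
\qquad
r_{\mathrm{C_1}} \propto \theta_{\mathrm{CO}}
\qquad\Longrightarrow\qquad
\frac{r_{\mathrm{C_2}}}{r_{\mathrm{C_1}}} \propto \theta_{\mathrm{CO}} ,
```

so any condition that depletes the CO coverage, such as competition from the C1 pathway at high overpotential and high pH, suppresses C2 products faster than C1 products.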

Journal ArticleDOI
TL;DR: Technological solutions including conventional activated sludge, membrane bioreactors, moving bed biofilm reactors, and nature-based solutions such as constructed wetlands are compared for the achievable removal efficiencies of the selected contaminants of emerging concern (CECs) and their potential to act as reservoirs of antibiotic-resistant bacteria and antibiotic-resistance genes (ARB&ARGs).

Journal ArticleDOI
TL;DR: In this article, a polarization-orthogonal excitation collection scheme is designed to minimize the polarization filtering loss under resonant excitation, achieving a single-photon efficiency of 0.60.
Abstract: An optimal single-photon source should deterministically deliver one, and only one, photon at a time, with no trade-off between the source’s efficiency and the photon indistinguishability. However, all reported solid-state sources of indistinguishable single photons had to rely on polarization filtering, which reduced the efficiency by 50%, fundamentally limiting the scaling of photonic quantum technologies. Here, we overcome this long-standing challenge by coherently driving quantum dots deterministically coupled to polarization-selective Purcell microcavities. We present two examples: narrowband, elliptical micropillars and broadband, elliptical Bragg gratings. A polarization-orthogonal excitation–collection scheme is designed to minimize the polarization filtering loss under resonant excitation. We demonstrate a polarized single-photon efficiency of 0.60 ± 0.02 (0.56 ± 0.02), a single-photon purity of 0.975 ± 0.005 (0.991 ± 0.003) and an indistinguishability of 0.975 ± 0.006 (0.951 ± 0.005) for the micropillar (Bragg grating) device. Our work provides promising solutions for truly optimal single-photon sources combining near-unity indistinguishability and near-unity system efficiency simultaneously. Single-photon sources with a single-photon efficiency of 0.60, a single-photon purity of 0.975 and an indistinguishability of 0.975 are demonstrated. This is achieved by fabricating elliptical resonators around site-registered quantum dots.

Journal ArticleDOI
TL;DR: The recent progress in the development of multifunctional and self‐healable hydrogels for various tissue engineering applications is discussed in detail and their potential applications within the rapidly expanding areas of bioelectronics, cyborganics, and soft robotics are highlighted.
Abstract: Given their durability and long-term stability, self-healable hydrogels have, in the past few years, emerged as promising replacements for the many brittle hydrogels currently being used in preclinical or clinical trials. To this end, the incompatibility between hydrogel toughness and rapid self-healing remains unaddressed, and therefore most of the self-healable hydrogels still face serious challenges within the dynamic and mechanically demanding environment of human organs/tissues. Furthermore, depending on the target tissue, the self-healing hydrogels must comply with a wide range of properties including electrical, biological, and mechanical. Notably, the incorporation of nanomaterials into double-network hydrogels is showing great promise as a feasible way to generate self-healable hydrogels with the above-mentioned attributes. Here, the recent progress in the development of multifunctional and self-healable hydrogels for various tissue engineering applications is discussed in detail. Their potential applications within the rapidly expanding areas of bioelectronic hydrogels, cyborganics, and soft robotics are further highlighted.

Journal ArticleDOI
TL;DR: Using a polyphasic approach combining phenotype, physiology, sequence and extrolite data, eight new species are described in Aspergillus section Flavi, including A. aflatoxiformans, A. austwickii and A. cerealis, in addition to A. togoensis.