
Showing papers by "Tokyo Institute of Technology" published in 2019


Journal ArticleDOI
TL;DR: A meta-analysis of eight geographically and technically diverse fecal shotgun metagenomic studies of colorectal cancer identified a core set of 29 species significantly enriched in CRC metagenomes, establishing globally generalizable, predictive taxonomic and functional microbiome CRC signatures as a basis for future diagnostics.
Abstract: Association studies have linked microbiome alterations with many human diseases. However, they have not always reported consistent results, thereby necessitating cross-study comparisons. Here, a meta-analysis of eight geographically and technically diverse fecal shotgun metagenomic studies of colorectal cancer (CRC, n = 768), which was controlled for several confounders, identified a core set of 29 species significantly enriched in CRC metagenomes (false discovery rate (FDR) < 1 × 10−5). CRC signatures derived from single studies maintained their accuracy in other studies. By training on multiple studies, we improved detection accuracy and disease specificity for CRC. Functional analysis of CRC metagenomes revealed enriched protein and mucin catabolism genes and depleted carbohydrate degradation genes. Moreover, we inferred elevated production of secondary bile acids from CRC metagenomes, suggesting a metabolic link between cancer-associated gut microbes and a fat- and meat-rich diet. Through extensive validations, this meta-analysis firmly establishes globally generalizable, predictive taxonomic and functional microbiome CRC signatures as a basis for future diagnostics. Cross-study analysis defines fecal microbial species associated with colorectal cancer.

615 citations
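
The study above calls species "significantly enriched" at FDR < 1 × 10−5. As a hedged illustration of how such a false-discovery-rate threshold is typically applied, here is a minimal Benjamini–Hochberg step-up procedure; the p-values are invented for illustration and the study's actual statistical pipeline may differ.

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up: True where the null is rejected at FDR alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):       # largest rank passing p <= (k/m)*alpha
        if pvals[i] <= rank / m * alpha:
            k_max = rank
    reject = [False] * m
    for rank, i in enumerate(order, start=1):       # reject everything up to that rank
        if rank <= k_max:
            reject[i] = True
    return reject

# Invented p-values for five hypothetical species (not the study's data):
print(benjamini_hochberg([1e-7, 0.003, 0.04, 0.20, 0.65]))  # → [True, True, False, False, False]
```

Note the step-up property: a p-value of 0.003 is rejected at overall FDR 0.05 only because a smaller p-value precedes it in the ranking.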


Journal ArticleDOI
TL;DR: Large-cohort multi-omics data indicate that shifts in the microbiome and metabolome occur from the very early stages of the development of colorectal cancer, which is of possible etiological and diagnostic importance.
Abstract: In most cases of sporadic colorectal cancers, tumorigenesis is a multistep process, involving genomic alterations in parallel with morphologic changes. In addition, accumulating evidence suggests that the human gut microbiome is linked to the development of colorectal cancer. Here we performed fecal metagenomic and metabolomic studies on samples from a large cohort of 616 participants who underwent colonoscopy to assess taxonomic and functional characteristics of gut microbiota and metabolites. Microbiome and metabolome shifts were apparent in cases of multiple polypoid adenomas and intramucosal carcinomas, in addition to more advanced lesions. We found two distinct patterns of microbiome elevations. First, the relative abundance of Fusobacterium nucleatum spp. was significantly (P < 0.005) elevated continuously from intramucosal carcinoma to more advanced stages. Second, Atopobium parvulum and Actinomyces odontolyticus, which co-occurred in intramucosal carcinomas, were significantly (P < 0.005) increased only in multiple polypoid adenomas and/or intramucosal carcinomas. Metabolome analyses showed that branched-chain amino acids and phenylalanine were significantly (P < 0.005) increased in intramucosal carcinomas and bile acids, including deoxycholate, were significantly (P < 0.005) elevated in multiple polypoid adenomas and/or intramucosal carcinomas. We identified metagenomic and metabolomic markers to discriminate cases of intramucosal carcinoma from the healthy controls. Our large-cohort multi-omics data indicate that shifts in the microbiome and metabolome occur from the very early stages of the development of colorectal cancer, which is of possible etiological and diagnostic importance. Colorectal cancer stages are associated with distinct microbial and metabolomic profiles that could shed light on cancer progression.

599 citations


Proceedings ArticleDOI
15 Jun 2019
TL;DR: This work proposes an approach where a single convolutional neural network plays a dual role: It is simultaneously a dense feature descriptor and a feature detector, and shows that this model can be trained using pixel correspondences extracted from readily available large-scale SfM reconstructions, without any further annotations.
Abstract: In this work we address the problem of finding reliable pixel-level correspondences under difficult imaging conditions. We propose an approach where a single convolutional neural network plays a dual role: It is simultaneously a dense feature descriptor and a feature detector. By postponing the detection to a later stage, the obtained keypoints are more stable than their traditional counterparts based on early detection of low-level structures. We show that this model can be trained using pixel correspondences extracted from readily available large-scale SfM reconstructions, without any further annotations. The proposed method obtains state-of-the-art performance on both the difficult Aachen Day-Night localization dataset and the InLoc indoor localization benchmark, as well as competitive performance on other benchmarks for image matching and 3D reconstruction.

594 citations
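
The describe-then-detect idea above can be sketched in a few lines: a single dense feature map yields both the descriptors (the channel vector at each pixel) and the keypoints. This is a simplified, hard-decision caricature of the paper's soft detection score, run on an invented toy feature map rather than real CNN activations.

```python
import numpy as np

def detect_keypoints(feats):
    """feats: (C, H, W) dense feature map -> list of (y, x) keypoints.
    A pixel is a keypoint if its strongest channel attains a strict 3x3
    spatial maximum there (hard version of the paper's soft score)."""
    C, H, W = feats.shape
    kps = []
    for y in range(1, H - 1):
        for x in range(1, W - 1):
            c = int(np.argmax(feats[:, y, x]))          # channel selection
            patch = feats[c, y - 1:y + 2, x - 1:x + 2]  # 3x3 neighborhood
            if feats[c, y, x] == patch.max() and (patch == patch.max()).sum() == 1:
                kps.append((y, x))
    return kps

feats = np.zeros((2, 5, 5))
feats[1, 2, 3] = 1.0                                    # one synthetic response peak
print(detect_keypoints(feats))   # → [(2, 3)]
```

The descriptor at each detected keypoint is then simply `feats[:, y, x]`, which is what makes the detection "postponed": no separate low-level corner detector runs first.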


Journal ArticleDOI
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4 +1491 more (239 institutions)
TL;DR: In this article, the authors present the second volume of the Future Circular Collider Conceptual Design Report, devoted to the electron-positron collider FCC-ee, covering the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan.
Abstract: In response to the 2013 Update of the European Strategy for Particle Physics, the Future Circular Collider (FCC) study was launched, as an international collaboration hosted by CERN. This study covers a highest-luminosity high-energy lepton collider (FCC-ee) and an energy-frontier hadron collider (FCC-hh), which could, successively, be installed in the same 100 km tunnel. The scientific capabilities of the integrated FCC programme would serve the worldwide community throughout the 21st century. The FCC study also investigates an LHC energy upgrade, using FCC-hh technology. This document constitutes the second volume of the FCC Conceptual Design Report, devoted to the electron-positron collider FCC-ee. After summarizing the physics discovery opportunities, it presents the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan. FCC-ee can be built with today’s technology. Most of the FCC-ee infrastructure could be reused for FCC-hh. Combining concepts from past and present lepton colliders and adding a few novel elements, the FCC-ee design promises outstandingly high luminosity. This will make the FCC-ee a unique precision instrument to study the heaviest known particles (Z, W and H bosons and the top quark), offering great direct and indirect sensitivity to new physics.

526 citations


Journal ArticleDOI
TL;DR: The combined analysis of heterogeneous CRC cohorts identified reproducible microbiome biomarkers and accurate disease-predictive models that can form the basis for clinical prognostic tests and hypothesis-driven mechanistic studies.
Abstract: Several studies have investigated links between the gut microbiome and colorectal cancer (CRC), but questions remain about the replicability of biomarkers across cohorts and populations. We performed a meta-analysis of five publicly available datasets and two new cohorts and validated the findings on two additional cohorts, considering in total 969 fecal metagenomes. Unlike microbiome shifts associated with gastrointestinal syndromes, the gut microbiome in CRC showed reproducibly higher richness than controls (P < 0.01), partially due to expansions of species typically derived from the oral cavity. Meta-analysis of the microbiome functional potential identified gluconeogenesis and the putrefaction and fermentation pathways as being associated with CRC, whereas the stachyose and starch degradation pathways were associated with controls. Predictive microbiome signatures for CRC trained on multiple datasets showed consistently high accuracy in datasets not considered for model training and independent validation cohorts (average area under the curve, 0.84). Pooled analysis of raw metagenomes showed that the choline trimethylamine-lyase gene was overabundant in CRC (P = 0.001), identifying a relationship between microbiome choline metabolism and CRC. The combined analysis of heterogeneous CRC cohorts thus identified reproducible microbiome biomarkers and accurate disease-predictive models that can form the basis for clinical prognostic tests and hypothesis-driven mechanistic studies. Multicohort analysis identifies microbial signatures of colorectal cancer in fecal microbiomes.

478 citations
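
The headline number above (average AUC 0.84 on datasets not used for training) comes from leave-one-dataset-out validation. A hedged, minimal sketch of that evaluation scheme follows; the cohorts and the trivial one-marker scorer are invented for illustration, whereas the study trained richer models on real metagenomes.

```python
def auc(scores, labels):
    """AUC via the Mann-Whitney U statistic (no external libraries needed)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def lodo_auc(cohorts):
    """Leave-one-dataset-out: learn the marker's direction on the other
    cohorts, then score the held-out cohort with it."""
    out = []
    for k in range(len(cohorts)):
        train = [c for i, c in enumerate(cohorts) if i != k]
        pos = [x for xs, ys in train for x, y in zip(xs, ys) if y == 1]
        neg = [x for xs, ys in train for x, y in zip(xs, ys) if y == 0]
        sign = 1.0 if sum(pos) / len(pos) >= sum(neg) / len(neg) else -1.0
        xs, ys = cohorts[k]
        out.append(auc([sign * x for x in xs], ys))
    return out

# Two toy cohorts: one marker abundance per sample, 1 = CRC, 0 = control.
cohorts = [([0.9, 0.8, 0.1, 0.2], [1, 1, 0, 0]),
           ([0.7, 0.6, 0.3, 0.2], [1, 1, 0, 0])]
print(lodo_auc(cohorts))   # → [1.0, 1.0]
```

The point of the scheme is that the held-out cohort contributes nothing to training, so a high AUC there is evidence of a cross-population signal rather than a cohort-specific artifact.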


Journal ArticleDOI
TL;DR: In this paper, a generalized Bloch band theory for one-dimensional spatially periodic tight-binding models is established, and the Brillouin zone is defined for non-Hermitian systems.
Abstract: In spatially periodic Hermitian systems, such as electronic systems in crystals, the band structure is described by the band theory in terms of the Bloch wave functions, which reproduce energy levels for large systems with open boundaries. In this paper, we establish a generalized Bloch band theory in one-dimensional spatially periodic tight-binding models. We show how to define the Brillouin zone in non-Hermitian systems. From this Brillouin zone, one can calculate continuum bands, which reproduce the band structure in an open chain. As an example, we apply our theory to the non-Hermitian Su-Schrieffer-Heeger model. We also show the bulk-edge correspondence between the winding number and existence of the topological edge states.

466 citations
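
The failure of conventional Bloch theory that motivates the generalized Brillouin zone can be seen numerically: for a non-Hermitian SSH chain with asymmetric intracell hopping, the open-chain spectrum differs qualitatively from the periodic (Bloch, |e^{ik}| = 1) spectrum. The sketch below uses illustrative parameters, not values from the paper.

```python
import numpy as np

def ssh_chain(n_cells, t1_right, t1_left, t2, periodic=False):
    """Non-Hermitian SSH chain: asymmetric intracell hopping, reciprocal intercell."""
    n = 2 * n_cells
    H = np.zeros((n, n))
    for c in range(n_cells):
        a, b = 2 * c, 2 * c + 1
        H[b, a] = t1_right                # intracell hop, amplified direction
        H[a, b] = t1_left                 # intracell hop, attenuated direction
        if c + 1 < n_cells or periodic:
            nxt = (2 * c + 2) % n
            H[nxt, b] = t2                # intercell hop (Hermitian)
            H[b, nxt] = t2
    return H

open_im = np.abs(np.linalg.eigvals(ssh_chain(10, 1.5, 0.5, 1.0)).imag).max()
bloch_im = np.abs(np.linalg.eigvals(ssh_chain(10, 1.5, 0.5, 1.0, periodic=True)).imag).max()
# Open chain: a diagonal gauge transformation maps H to a Hermitian SSH model
# with effective hopping sqrt(t1_right * t1_left), so its spectrum is real.
# Periodic chain: the conventional Bloch bands acquire O(0.1) imaginary parts.
print(open_im, bloch_im)
```

This mismatch between open and periodic spectra is exactly why the paper replaces the unit circle |e^{ik}| = 1 with a generalized Brillouin zone when computing continuum bands.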


Journal ArticleDOI
TL;DR: In this article, a comprehensive review discusses the pseudo-kinetics and mechanisms of photodegradation reactions, as well as the operational factors that govern the adsorption and photodegradation of dye molecules, including the initial dye concentration, pH of the solution, temperature of the reaction medium, and light intensity.
Abstract: Due to its low cost, environmentally friendly process, and lack of secondary contamination, the photodegradation of dyes is regarded as a promising technology for industrial wastewater treatment. This technology demonstrates the light-enhanced generation of charge carriers and reactive radicals that non-selectively degrade various organic dyes into water, CO2, and other organic compounds via direct photodegradation or a sensitization-mediated degradation process. The overall efficiency of the photocatalysis system is closely dependent upon operational parameters that govern the adsorption and photodegradation of dye molecules, including the initial dye concentration, pH of the solution, temperature of the reaction medium, and light intensity. Additionally, the charge-carrier properties of the photocatalyst strongly affect the generation of reactive species in the heterogeneous photodegradation and thereby dictate the photodegradation efficiency. Herein, this comprehensive review discusses the pseudo kinetics and mechanisms of the photodegradation reactions. The operational factors affecting the photodegradation of either cationic or anionic dye molecules, as well as the charge-carrier properties of the photocatalyst, are also fully explored. By further analyzing past works to clarify key active species for photodegradation reactions and optimal conditions, this review provides helpful guidelines that can be applied to foster the development of efficient photodegradation systems.

464 citations
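
Dye photodegradation of the kind reviewed above is commonly modeled with pseudo-first-order kinetics, ln(C0/C) = k_app·t. As a hedged sketch, the snippet below fits the apparent rate constant from concentration-time data; the readings are synthetic, generated from an ideal decay for illustration only.

```python
import numpy as np

def fit_pseudo_first_order(t, conc):
    """Least-squares slope of ln(C0/C) versus t, constrained through the origin."""
    t = np.asarray(t, dtype=float)
    y = np.log(conc[0] / np.asarray(conc, dtype=float))  # ln(C0/C) at each time
    return float(np.sum(t * y) / np.sum(t * t))          # k_app, in 1/min here

t = [0, 10, 20, 30, 40]                              # irradiation time, min
conc = [20.0 * np.exp(-0.05 * ti) for ti in t]       # synthetic ideal decay, mg/L
k_app = fit_pseudo_first_order(t, conc)
print(round(k_app, 4))   # → 0.05
```

With real data, deviations from a straight ln(C0/C)-vs-t line are themselves informative: they often indicate adsorption saturation or light-screening effects at high initial dye concentration, two of the operational factors the review discusses.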


Journal ArticleDOI
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4 +1496 more (238 institutions)
TL;DR: In this paper, the authors describe the detailed design and preparation of a construction project for a post-LHC circular energy frontier collider in collaboration with national institutes, laboratories and universities worldwide, and enhanced by a strong participation of industrial partners.
Abstract: Particle physics has arrived at an important moment of its history. The discovery of the Higgs boson, with a mass of 125 GeV, completes the matrix of particles and interactions that has constituted the “Standard Model” for several decades. This model is a consistent and predictive theory, which has so far proven successful at describing all phenomena accessible to collider experiments. However, several experimental facts do require the extension of the Standard Model, and explanations are needed for observations such as the abundance of matter over antimatter, the striking evidence for dark matter and the non-zero neutrino masses. Theoretical issues such as the hierarchy problem and, more generally, the dynamical origin of the Higgs mechanism likewise point to the existence of physics beyond the Standard Model. This report contains the description of a novel research infrastructure based on a highest-energy hadron collider with a centre-of-mass collision energy of 100 TeV and an integrated luminosity at least a factor of 5 larger than the HL-LHC. It will extend the current energy frontier by almost an order of magnitude. The mass reach for direct discovery will extend to several tens of TeV and will allow, for example, the production of new particles whose existence could be exposed indirectly by precision measurements during the preceding e+e– collider phase. This collider will also precisely measure the Higgs self-coupling and thoroughly explore the dynamics of electroweak symmetry breaking at the TeV scale, to elucidate the nature of the electroweak phase transition. WIMPs as thermal dark matter candidates will be discovered, or ruled out. As a single project, this particle collider infrastructure will serve the world-wide physics community for about 25 years and, in combination with a lepton collider (see FCC conceptual design report volume 2), will provide a research tool until the end of the 21st century.
Collision energies beyond 100 TeV can be considered when using high-temperature superconductors. The European Strategy for Particle Physics (ESPP) update 2013 stated “To stay at the forefront of particle physics, Europe needs to be in a position to propose an ambitious post-LHC accelerator project at CERN by the time of the next Strategy update”. The FCC study has implemented the ESPP recommendation by developing a long-term vision for an “accelerator project in a global context”. This document describes the detailed design and preparation of a construction project for a post-LHC circular energy frontier collider “in collaboration with national institutes, laboratories and universities worldwide”, enhanced by a strong participation of industrial partners. Now, a coordinated preparation effort can be based on a core of an ever-growing consortium of already more than 135 institutes worldwide. The technology for constructing a high-energy circular hadron collider can be brought to the technology readiness level required for construction within the coming ten years through a focused R&D programme. The FCC-hh concept comprises, in the baseline scenario, a power-saving, low-temperature superconducting magnet system based on an evolution of the Nb3Sn technology pioneered at the HL-LHC; an energy-efficient cryogenic refrigeration infrastructure based on a neon-helium (Nelium) light gas mixture; a high-reliability and low-loss cryogen distribution infrastructure based on Invar; high-power distributed beam transfer using superconducting elements; and local magnet energy recovery and re-use technologies that are already being gradually introduced at other CERN accelerators.
On a longer timescale, high-temperature superconductors can be developed together with industrial partners to achieve an even more energy-efficient particle collider or to reach even higher collision energies. The re-use of the LHC and its injector chain, which also serve a concurrently running physics programme, is an essential lever for achieving an overall sustainable research infrastructure at the energy frontier. Strategic R&D for FCC-hh aims at minimising construction cost and energy consumption, while maximising the socio-economic impact. It will mitigate technology-related risks and ensure that industry can benefit from an acceptable utility. Concerning the implementation, a preparatory phase of about eight years is both necessary and adequate to establish the project governance and organisation structures, to build the international machine and experiment consortia, to develop a territorial implantation plan in agreement with the host states’ requirements, to optimise the use of land and underground volumes, and to prepare the civil engineering project. Such a large-scale, international fundamental research infrastructure, tightly involving industrial partners and providing training at all education levels, will be a strong motor of economic and societal development in all participating nations. The FCC study has implemented a set of actions towards a coherent vision for the world-wide high-energy and particle physics community, providing a collaborative framework for topically complementary and geographically well-balanced contributions. This conceptual design report lays the foundation for a subsequent infrastructure preparatory and technical design phase.

425 citations


Journal ArticleDOI
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4 +1501 more (239 institutions)
TL;DR: In this article, the physics opportunities of the Future Circular Collider (FC) were reviewed, covering its e+e-, pp, ep and heavy ion programs, and the measurement capabilities of each FCC component, addressing the study of electroweak, Higgs and strong interactions.
Abstract: We review the physics opportunities of the Future Circular Collider, covering its e+e-, pp, ep and heavy ion programmes. We describe the measurement capabilities of each FCC component, addressing the study of electroweak, Higgs and strong interactions, the top quark and flavour, as well as phenomena beyond the Standard Model. We highlight the synergy and complementarity of the different colliders, which will contribute to a uniquely coherent and ambitious research programme, providing an unmatchable combination of precision and sensitivity to new physics.

407 citations


Journal ArticleDOI
TL;DR: The U-Net-based concrete crack detection method proposed in the present study is compared with the DCNN-based method; U-Net is found to be more elegant than DCNN, with greater robustness, effectiveness, and detection accuracy.

364 citations


Journal ArticleDOI
TL;DR: This study investigates the use of end-to-end representation learning for compounds and proteins, integrates the representations, and develops a new CPI prediction approach combining a graph neural network (GNN) for compounds and a convolutional neural network (CNN) for proteins.
Abstract: Motivation In bioinformatics, machine learning-based methods that predict the compound-protein interactions (CPIs) play an important role in the virtual screening for drug discovery. Recently, end-to-end representation learning for discrete symbolic data (e.g. words in natural language processing) using deep neural networks has demonstrated excellent performance on various difficult problems. For the CPI problem, data are provided as discrete symbolic data, i.e. compounds are represented as graphs where the vertices are atoms, the edges are chemical bonds, and proteins are sequences in which the characters are amino acids. In this study, we investigate the use of end-to-end representation learning for compounds and proteins, integrate the representations, and develop a new CPI prediction approach by combining a graph neural network (GNN) for compounds and a convolutional neural network (CNN) for proteins. Results Our experiments using three CPI datasets demonstrated that the proposed end-to-end approach achieves competitive or higher performance as compared to various existing CPI prediction methods. In addition, the proposed approach significantly outperformed existing methods on an unbalanced dataset. This suggests that data-driven representations of compounds and proteins obtained by end-to-end GNNs and CNNs are more robust than traditional chemical and biological features obtained from databases. Although analyzing deep learning models is difficult due to their black-box nature, we address this issue using a neural attention mechanism, which allows us to consider which subsequences in a protein are more important for a drug compound when predicting its interaction. The neural attention mechanism also provides effective visualization, which makes it easier to analyze a model even when modeling is performed using real-valued representations instead of discrete features. Availability and implementation https://github.com/masashitsubaki. 
Supplementary information Supplementary data are available at Bioinformatics online.
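
The neural attention mechanism described above can be sketched compactly: given a compound embedding and per-subsequence protein embeddings, softmaxed relevance scores say which protein regions matter for the predicted interaction. The shapes and random values below are invented stand-ins; the paper's actual GNN and CNN encoders (see the linked repository) are omitted.

```python
import numpy as np

def attention_pool(compound_vec, protein_mat):
    """compound_vec: (d,), protein_mat: (L, d) -> (pooled (d,), weights (L,)).
    Softmax over dot-product relevance scores of the L protein subsequences."""
    scores = protein_mat @ compound_vec            # relevance of each subsequence
    scores = scores - scores.max()                 # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()
    pooled = weights @ protein_mat                 # attention-weighted protein summary
    return pooled, weights

rng = np.random.default_rng(1)
compound = rng.normal(size=4)                      # stand-in for a GNN compound embedding
protein = rng.normal(size=(6, 4))                  # stand-in for CNN subsequence features
pooled, weights = attention_pool(compound, protein)
print(weights.round(3), round(float(weights.sum()), 6))   # weights sum to 1
```

The `weights` vector is what enables the visualization the abstract mentions: plotted along the protein sequence, it highlights which subsequences the model attends to for a given drug compound.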

Journal ArticleDOI
TL;DR: In this paper, the authors presented MERIT Hydro, a new global flow direction map at 3-arc sec resolution (90 m at the equator) derived from the latest elevation data (MERIT DEM) and water body data sets (G1WBM, Global Surface Water Occurrence, and OpenStreetMap).
Abstract: High‐resolution raster hydrography maps are a fundamental data source for many geoscience applications. Here we introduce MERIT Hydro, a new global flow direction map at 3‐arc sec resolution (~90 m at the equator) derived from the latest elevation data (MERIT DEM) and water body data sets (G1WBM, Global Surface Water Occurrence, and OpenStreetMap). We developed a new algorithm to extract river networks near automatically by separating actual inland basins from dummy depressions caused by the errors in input elevation data. After a minimum amount of hand editing, the constructed hydrography map shows good agreement with existing quality‐controlled river network data sets in terms of flow accumulation area and river basin shape. The location of river streamlines was realistically aligned with existing satellite‐based global river channel data. Relative error in the drainage area was <0.05 for 90% of Global Runoff Data Center (GRDC) gauges, confirming the accuracy of the delineated global river networks. Discrepancies in flow accumulation area were found mostly in arid river basins containing depressions that are occasionally connected at high water levels and thus resulting in uncertain watershed boundaries. MERIT Hydro improves on existing global hydrography data sets in terms of spatial coverage (between N90 and S60) and representation of small streams, mainly due to increased availability of high‐quality baseline geospatial data sets. The new flow direction and flow accumulation maps, along with accompanying supplementary layers on hydrologically adjusted elevation and channel width, will advance geoscience studies related to river hydrology at both global and local scales. Plain Language Summary Rivers play important roles in global hydrological and biogeochemical cycles, and many socioeconomic activities also depend on water resources in river basins. 
Global‐scale frontier studies of river networks and surface waters require that all rivers on the Earth are precisely mapped at high resolution, but until now, no such map has been produced. Here we present “MERIT Hydro,” the first high‐resolution, global map of river networks developed by combining the latest global map of land surface elevation with the latest maps of water bodies that were built using satellites and open databases. Surface flow direction of each 3‐arc sec pixel (~90‐m size at the equator) is mapped across the entire globe except Antarctica, and many supplemental maps (such as flow accumulation area, river width, and a vectorized river network) are generated. MERIT Hydro thus represents a major advance in our ability to represent the global river network and is a data set that is anticipated to enhance a wide range of geoscience applications including flood risk assessment, aquatic carbon emissions, and climate modeling.
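
The two core layers MERIT Hydro provides, a flow direction map and the flow accumulation derived from it, can be sketched with the classic D8 scheme: each cell drains to its steepest-descent neighbor, and counts are passed downhill. The 3×3 DEM is a toy, and this deliberately ignores the paper's real difficulties (inland depressions, water-body merging, hand editing, diagonal-distance weighting).

```python
import numpy as np

def d8_accumulation(dem):
    """Toy D8: each cell drains to its lowest strictly-lower 8-neighbor;
    accumulation counts the number of cells draining through each cell."""
    h, w = dem.shape
    downstream = {}
    for y in range(h):
        for x in range(w):
            best = None
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if (dy, dx) != (0, 0) and 0 <= ny < h and 0 <= nx < w \
                            and dem[ny, nx] < dem[y, x] \
                            and (best is None or dem[ny, nx] < dem[best]):
                        best = (ny, nx)
            if best is not None:                 # pits/outlets have no downstream
                downstream[(y, x)] = best
    acc = np.ones((h, w), dtype=int)             # every cell drains itself
    for cell in sorted(downstream, key=lambda c: -dem[c]):   # high to low
        acc[downstream[cell]] += acc[cell]       # pass the count downhill
    return acc

dem = np.array([[0, 1, 2], [1, 2, 3], [2, 3, 4]], dtype=float)  # tilted plane
print(d8_accumulation(dem))   # outlet (0, 0) accumulates all 9 cells
```

Processing cells in order of decreasing elevation guarantees each cell's count is final before it is passed downstream, since water only flows downhill in this scheme.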

Journal ArticleDOI
TL;DR: The current state of knowledge for the biospace in which life operates on Earth is reviewed and discussed in a planetary context, highlighting knowledge gaps and areas of opportunity.
Abstract: Prokaryotic life has dominated most of the evolutionary history of our planet, evolving to occupy virtually all available environmental niches. Extremophiles, especially those thriving under multiple extremes, represent a key area of research for multiple disciplines, spanning from the study of adaptations to harsh conditions, to the biogeochemical cycling of elements. Extremophile research also has implications for origin of life studies and the search for life on other planetary and celestial bodies. In this article, we will review the current state of knowledge for the biospace in which life operates on Earth and will discuss it in a planetary context, highlighting knowledge gaps and areas of opportunity.

Journal ArticleDOI
Tomotada Akutsu1, Masaki Ando1, Masaki Ando2, Koya Arai2 +199 more (48 institutions)
TL;DR: KAGRA as discussed by the authors is a 2.5-generation GW detector with two 3 km baseline arms arranged in an 'L' shape, similar to the second generations of Advanced LIGO and Advanced Virgo, but it will be operating at cryogenic temperatures with sapphire mirrors.
Abstract: The recent detections of gravitational waves (GWs) reported by the LIGO and Virgo collaborations have made a significant impact on physics and astronomy. A global network of GW detectors will play a key role in uncovering the unknown nature of the sources in coordinated observations with astronomical telescopes and detectors. Here we introduce KAGRA, a new GW detector with two 3 km baseline arms arranged in an ‘L’ shape. KAGRA’s design is similar to the second generations of Advanced LIGO and Advanced Virgo, but it will be operating at cryogenic temperatures with sapphire mirrors. This low-temperature feature is advantageous for improving the sensitivity around 100 Hz and is considered to be an important feature for the third-generation GW detector concept (for example, the Einstein Telescope of Europe or the Cosmic Explorer of the United States). Hence, KAGRA is often called a 2.5-generation GW detector based on laser interferometry. KAGRA’s first observation run is scheduled in late 2019, aiming to join the third observation run of the advanced LIGO–Virgo network. When operating along with the existing GW detectors, KAGRA will be helpful in locating GW sources more accurately and determining the source parameters with higher precision, providing information for follow-up observations of GW trigger candidates.

Journal ArticleDOI
TL;DR: Structural and biochemical data suggest that the essential autophagy protein Atg2 acts as a lipid-transfer protein that supplies phospholipids from the source organelle to the isolation membranes (IMs) for autophagosome formation.
Abstract: A key event in autophagy is autophagosome formation, whereby the newly synthesized isolation membrane (IM) expands to form a complete autophagosome using endomembrane-derived lipids. Atg2 physically links the edge of the expanding IM with the endoplasmic reticulum (ER), a role that is essential for autophagosome formation. However, the molecular function of Atg2 during ER-IM contact remains unclear, as does the mechanism of lipid delivery to the IM. Here we show that the conserved amino-terminal region of Schizosaccharomyces pombe Atg2 includes a lipid-transfer-protein-like hydrophobic cavity that accommodates phospholipid acyl chains. Atg2 bridges highly curved liposomes, thereby facilitating efficient phospholipid transfer in vitro, a function that is inhibited by mutations that impair autophagosome formation in vivo. These results suggest that Atg2 acts as a lipid-transfer protein that supplies phospholipids for autophagosome formation.

Journal ArticleDOI
Georges Aad1, Alexander Kupco2, Samuel Webb3, Timo Dreyer4 +3380 more (206 institutions)
TL;DR: In this article, a search for high-mass dielectron and dimuon resonances in the mass range of 250 GeV to 6 TeV was performed at the Large Hadron Collider.

Journal ArticleDOI
E. Kou, Phillip Urquijo1, Wolfgang Altmannshofer2, F. Beaujean3 +558 more (140 institutions)
TL;DR: The Belle II detector as mentioned in this paper is a state-of-the-art detector for heavy flavor physics, quarkonium and exotic states, searches for dark sectors, and many other areas.
Abstract: The Belle II detector will provide a major step forward in precision heavy flavor physics, quarkonium and exotic states, searches for dark sectors, and many other areas. The sensitivity to a large number of key observables can be improved by about an order of magnitude compared to the current measurements, and up to two orders in very clean search measurements. This increase in statistical precision arises not only due to the increased luminosity, but also from improved detector efficiency and precision for many channels. Many of the most interesting observables tend to have very small theoretical uncertainties that will therefore not limit the physics reach. This book has presented many new ideas for measurements, both to elucidate the nature of current anomalies seen in flavor, and to search for new phenomena in a plethora of observables that will become accessible with the Belle II dataset. The simulation used for the studies in this book was state of the art at the time, though we are learning a lot more about the experiment during the commissioning period. The detector is in operation, and working spectacularly well.

Journal ArticleDOI
TL;DR: In this paper, the authors compared the energy efficiency of liquid H2, methylcyclohexane (MCH), and ammonia (NH3), and concluded that NH3 has the highest total energy efficiency, followed by liquid H2 and MCH.

Journal ArticleDOI
Morad Aaboud, Georges Aad1, Brad Abbott2, Dale Charles Abbott3 +2936 more (198 institutions)
TL;DR: An exclusion limit on the H→invisible branching ratio of 0.26(0.17_{-0.05}^{+0.07}) at 95% confidence level is observed (expected) in combination with the results at sqrt[s]=7 and 8 TeV.
Abstract: Dark matter particles, if sufficiently light, may be produced in decays of the Higgs boson. This Letter presents a statistical combination of searches for H→invisible decays where H is produced according to the standard model via vector boson fusion, Z(ll)H, and W/Z(had)H, all performed with the ATLAS detector using 36.1 fb^{-1} of pp collisions at a center-of-mass energy of sqrt[s]=13 TeV at the LHC. In combination with the results at sqrt[s]=7 and 8 TeV, an exclusion limit on the H→invisible branching ratio of 0.26(0.17_{-0.05}^{+0.07}) at 95% confidence level is observed (expected).

Journal ArticleDOI
TL;DR: A cell-free protein synthesis system and small proteoliposomes are combined inside a giant unilamellar vesicle to synthesize protein by the production of ATP by light, powering transcription and translation.
Abstract: Attempts to construct an artificial cell have widened our understanding of living organisms. Many intracellular systems have been reconstructed by assembling molecules; however, a system that synthesizes its own constituents from a self-sufficient energy supply has, to the best of our knowledge, not been developed. Here, we combine a cell-free protein synthesis system and small proteoliposomes, which consist of purified ATP synthase and bacteriorhodopsin, inside a giant unilamellar vesicle to synthesize protein using ATP produced by light. The photosynthesized ATP is consumed as a substrate for transcription and as an energy source for translation, eventually driving the synthesis of bacteriorhodopsin or the constituent proteins of ATP synthase, the original essential components of the proteoliposome. The de novo photosynthesized bacteriorhodopsin and the parts of ATP synthase integrate into the artificial photosynthetic organelle and enhance its ATP photosynthetic activity through positive feedback of the products. Our artificial photosynthetic cell system paves the way to constructing an energetically independent artificial cell.

Journal ArticleDOI
Georges Aad1, Alexander Kupco2, Samuel Webb3, Timo Dreyer4  +2962 moreInstitutions (195)
TL;DR: In this article, an improved energy clustering algorithm is introduced, and its implications for the measurement and identification of prompt electrons and photons are discussed in detail, covering the corrections and calibrations that affect performance, such as energy calibration and identification and isolation efficiencies.
Abstract: This paper describes the reconstruction of electrons and photons with the ATLAS detector, employed for measurements and searches exploiting the complete LHC Run 2 dataset. An improved energy clustering algorithm is introduced, and its implications for the measurement and identification of prompt electrons and photons are discussed in detail. Corrections and calibrations that affect performance, including energy calibration, identification and isolation efficiencies, and the measurement of the charge of reconstructed electron candidates are determined using up to 81 fb−1 of proton-proton collision data collected at √s=13 TeV between 2015 and 2017.

Journal ArticleDOI
TL;DR: In this article, the authors present ultra-high energy-resolution spectroscopic imaging (SI)-STM to clarify the nature of the vortex bound states in Fe(Se,Te), and show that some vortices host the Majorana bound states but others do not.
Abstract: Majorana quasiparticles in condensed matter are important for topological quantum computing1–3, but remain elusive. Vortex cores of topological superconductors may accommodate Majorana quasiparticles that appear as the Majorana bound state (MBS) at zero energy4,5. The iron-based superconductor Fe(Se,Te) possesses a superconducting topological surface state6–9 that was investigated by scanning tunnelling microscopy (STM) studies, which suggest such a zero-energy vortex bound state (ZVBS)10,11. Here we present ultrahigh energy-resolution spectroscopic imaging (SI)–STM to clarify the nature of the vortex bound states in Fe(Se,Te). We found the ZVBS at 0 ± 20 μeV, which constrained its MBS origin, and showed that some vortices host the ZVBS but others do not. We show that the fraction of vortices hosting the ZVBS decreases with increasing magnetic field and that local quenched disorders are not related to the ZVBS. Our observations elucidate the necessary conditions to realize the ZVBS, which paves the way towards controllable Majorana quasiparticles. High-resolution spectroscopy supports the presence of Majorana bound states in an iron-based topological superconductor.

Journal ArticleDOI
21 Jun 2019
TL;DR: In this paper, the authors demonstrate the successful discovery of new polymers with high thermal conductivity, inspired by machine-learning-assisted polymer chemistry, using a molecular design algorithm trained to recognize quantitative structure-property relationships.
Abstract: The use of machine learning in computational molecular design has great potential to accelerate the discovery of innovative materials. However, its practical benefits still remain unproven in real-world applications, particularly in polymer science. We demonstrate the successful discovery of new polymers with high thermal conductivity, inspired by machine-learning-assisted polymer chemistry. This discovery was made by the interplay between machine intelligence trained on a substantially limited amount of polymeric properties data, expertise from laboratory synthesis and advanced technologies for thermophysical property measurements. Using a molecular design algorithm trained to recognize quantitative structure-property relationships with respect to thermal conductivity and other targeted polymeric properties, we identified thousands of promising hypothetical polymers. From these candidates, three were selected for monomer synthesis and polymerization because of their synthetic accessibility and their potential for ease of processing in further applications. The synthesized polymers reached thermal conductivities of 0.18–0.41 W/mK, which are comparable to those of state-of-the-art polymers in non-composite thermoplastics.
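The screen-then-select loop the abstract describes (train a property model on limited data, score a large pool of hypothetical polymers, shortlist a few for synthesis) can be sketched with entirely synthetic stand-ins; the ridge surrogate and random descriptors below are illustrative assumptions, not the authors' Bayesian molecular design algorithm.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins: 200 "known" polymers with 10 descriptors each plus a
# measured property, and 10,000 hypothetical candidates to screen.
X_known = rng.standard_normal((200, 10))
w_true = rng.standard_normal(10)
y_known = X_known @ w_true + 0.1 * rng.standard_normal(200)
X_pool = rng.standard_normal((10_000, 10))

# Ridge regression as a minimal surrogate for the QSPR property model.
lam = 1.0
A = X_known.T @ X_known + lam * np.eye(10)
w_hat = np.linalg.solve(A, X_known.T @ y_known)

# Score every hypothetical candidate, then shortlist the top 3 "for synthesis".
scores = X_pool @ w_hat
shortlist = np.argsort(scores)[::-1][:3]
```

In the actual workflow the shortlist would additionally be filtered by synthetic accessibility and processability, as the abstract notes.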

Journal ArticleDOI
Georges Aad1, Alexander Kupco2, Samuel Webb3, Timo Dreyer4  +2961 moreInstitutions (196)
TL;DR: In this article, the ATLAS Collaboration during Run 2 of the Large Hadron Collider (LHC) was used to identify jets containing b-hadrons, and the performance of the algorithms was evaluated in the s...
Abstract: The algorithms used by the ATLAS Collaboration during Run 2 of the Large Hadron Collider to identify jets containing b-hadrons are presented. The performance of the algorithms is evaluated in the s ...

Journal ArticleDOI
TL;DR: In this paper, an analytical framework for P2P microgrids is developed based on a literature review as well as expert interviews; the framework incorporates technological, economic, social, environmental, and institutional dimensions.
Abstract: The future of energy is complex, with fluctuating renewable resources in increasingly distributed systems. It is suggested that blockchain technology is a timely innovation with potential to facilitate this future. Peer-to-peer (P2P) microgrids can support renewable energy as well as economically empower consumers and prosumers. However, the rapid development of blockchain and prospects for P2P energy networks is coupled with several grey areas in the institutional landscape. The purpose of this paper is to holistically explore potential challenges of blockchain-based P2P microgrids, and propose practical implications for institutional development as well as academia. An analytical framework for P2P microgrids is developed based on literature review as well as expert interviews. The framework incorporates 1) Technological, 2) Economic, 3) Social, 4) Environmental and 5) Institutional dimensions. Directions for future work in practical and academic contexts are identified. It is suggested that bridging the gap from technological to institutional readiness would require the incorporation of all dimensions as well as their inter-relatedness. Gradual institutional change leveraging community-building and regulatory sandbox approaches are proposed as potential pathways in incorporating this multi-dimensionality, reducing cross-sectoral silos, and facilitating interoperability between current and future systems. By offering insight through holistic conceptualization, this paper aims to contribute to expanding research in building the pillars of a more substantiated institutional arch for blockchain in the energy sector.

Journal ArticleDOI
TL;DR: To facilitate widespread use of transfer learning, a pretrained model library called XenonPy.MDL is developed, which comprises more than 140 000 pretrained models for various properties of small molecules, polymers, and inorganic crystalline materials and which has autonomously identified rather nontrivial transferability across different properties transcending the different disciplines of materials science.
Abstract: There is a growing demand for the use of machine learning (ML) to derive fast-to-evaluate surrogate models of materials properties. In recent years, a broad array of materials property databases ha...
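The cross-property transferability described above can be illustrated with a toy linear example (all data and the two-parameter "head" below are synthetic assumptions, not the XenonPy.MDL API): a model fitted on a data-rich source property is reused for a correlated, data-poor target property by fitting only a scale and an offset.

```python
import numpy as np

rng = np.random.default_rng(7)

# Source property: plenty of labelled data. Target property: only 5 labelled
# samples, but strongly correlated with the source (a toy stand-in for the
# cross-property transferability the library is built to exploit).
d = 20
w_src = rng.standard_normal(d)
X_src = rng.standard_normal((500, d))
y_src = X_src @ w_src + 0.05 * rng.standard_normal(500)

X_tgt = rng.standard_normal((5, d))
y_tgt = 2.0 * (X_tgt @ w_src) + 0.3     # target is an affine map of source

# "Pretrained model": plain least squares on the source task.
w_hat, *_ = np.linalg.lstsq(X_src, y_src, rcond=None)

# Transfer: freeze w_hat, fit only a 2-parameter head (scale + offset)
# on the 5 target samples.
z = X_tgt @ w_hat
a, b = np.polyfit(z, y_tgt, 1)

# Evaluate the transferred model on fresh target-property data.
X_test = rng.standard_normal((100, d))
y_test = 2.0 * (X_test @ w_src) + 0.3
err = np.mean((a * (X_test @ w_hat) + b - y_test) ** 2)
```

Fitting 2 parameters instead of 20 is what makes 5 target samples sufficient; training from scratch on the same 5 samples would badly overfit.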

Journal ArticleDOI
TL;DR: In this article, natural magnetic van der Waals heterostructures of (MnBi2Te4)m(Bi 2Te3)n that exhibit controllable magnetic properties while maintaining their topological surface states were reported.
Abstract: Heterostructures having both magnetism and topology are promising materials for the realization of exotic topological quantum states, but they remain challenging to synthesize and engineer. Here, we report natural magnetic van der Waals heterostructures of (MnBi2Te4)m(Bi2Te3)n that exhibit controllable magnetic properties while maintaining their topological surface states. The interlayer antiferromagnetic exchange coupling is gradually weakened as the separation of magnetic layers increases, and an anomalous Hall effect that is well coupled with magnetization and shows ferromagnetic hysteresis was observed below 5 K. The obtained homogeneous heterostructure, with its atomically sharp interfaces and intrinsic magnetic properties, will be an ideal platform for studying the quantum anomalous Hall effect, axion insulator states, and the topological magnetoelectric effect.

Journal ArticleDOI
TL;DR: In this paper, it was shown that topologically nontrivial band degeneracies can appear as exceptional surfaces in non-Hermitian systems with parity-time and parity-particle-hole symmetries.
Abstract: Non-Hermiticity makes possible novel topological phenomena that are not allowed in Hermitian systems. The authors show that topologically nontrivial band degeneracies can appear as exceptional surfaces in non-Hermitian systems with parity-time and parity-particle-hole symmetries. It is shown that when parity-time or parity-particle-hole symmetry is present, d-dimensional non-Hermitian systems can have (d-1)-dimensional exceptional surfaces. This work suggests new topological phases with nodal band structures protected by non-Hermitian topology and symmetry.
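A minimal, zero-dimensional version of the parity-time-symmetric degeneracies discussed here is the textbook 2x2 Hamiltonian H = [[i*gamma, kappa], [kappa, -i*gamma]], whose eigenvalues are +/- sqrt(kappa^2 - gamma^2): they coalesce at the exceptional point kappa = gamma, are real in the PT-unbroken phase kappa > gamma, and become purely imaginary for kappa < gamma. The quick numerical check below uses this standard toy model, not the paper's (d-1)-dimensional construction.

```python
import numpy as np

def eigvals(kappa, gamma=1.0):
    """Eigenvalues of the PT-symmetric 2x2 Hamiltonian
    H = [[i*gamma, kappa], [kappa, -i*gamma]] (non-Hermitian for gamma != 0)."""
    H = np.array([[1j * gamma, kappa],
                  [kappa, -1j * gamma]])
    return np.linalg.eigvals(H)

# PT-unbroken phase (kappa > gamma): real spectrum +/- sqrt(3).
unbroken = eigvals(2.0)
# Exceptional point (kappa = gamma): the two eigenvalues coalesce at zero.
ep = eigvals(1.0)
# PT-broken phase (kappa < gamma): purely imaginary pair.
broken = eigvals(0.5)
```

At the exceptional point the matrix is defective (the eigenvectors coalesce too), which is what distinguishes non-Hermitian degeneracies from ordinary Hermitian band crossings.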

Journal ArticleDOI
TL;DR: This paper reviews the key features of the three aforementioned types of fuzzy systems, points out the historical rationale for each type and its current research mainstreams, and focuses on fuzzy model-based approaches developed via the Lyapunov stability theorem and linear matrix inequality (LMI) formulations.
Abstract: More than 40 years after fuzzy logic control appeared as an effective tool to deal with complex processes, research on fuzzy control systems has constantly evolved. Mamdani fuzzy control was originally introduced as a model-free control approach based on experts' experience and knowledge. Owing to the lack of a systematic framework for studying Mamdani fuzzy systems, we have witnessed growing interest in fuzzy model-based approaches with Takagi-Sugeno fuzzy systems and singleton-type fuzzy systems (also called piecewise multiaffine systems) over the past decades. This paper reviews the key features of these three types of fuzzy systems. Through these features, we point out the historical rationale for each type of fuzzy system and its current research mainstreams. However, the focus is on fuzzy model-based approaches developed via the Lyapunov stability theorem and linear matrix inequality (LMI) formulations. Finally, our personal viewpoint on the perspectives and challenges of future fuzzy control research is discussed.
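In its simplest form, the LMI-based stability analysis mentioned above asks for a single matrix P > 0 satisfying A_i^T P + P A_i < 0 for every local model A_i of the Takagi-Sugeno system; any such common quadratic Lyapunov function certifies stability of the blended dynamics x' = sum_i h_i(z) A_i x. The sketch below only verifies a candidate P numerically (solving the LMI itself requires an SDP solver); the local models A_i and the candidate P are illustrative.

```python
import numpy as np

def is_common_lyapunov(P, subsystems, tol=1e-9):
    """Check that symmetric P > 0 satisfies A^T P + P A < 0 for every
    local model A of a Takagi-Sugeno fuzzy system."""
    if np.any(np.linalg.eigvalsh(P) <= tol):
        return False  # P is not positive definite
    return all(np.all(np.linalg.eigvalsh(A.T @ P + P @ A) < -tol)
               for A in subsystems)

# Two stable local models of an illustrative TS fuzzy system.
A1 = np.array([[-1.0, 0.0],
               [0.0, -2.0]])
A2 = np.array([[-2.0, 1.0],
               [0.0, -1.0]])

# Candidate Lyapunov matrix: P = I happens to certify both subsystems,
# hence stability of the blended system for any membership functions h_i.
P = np.eye(2)
ok = is_common_lyapunov(P, [A1, A2])
```

In practice P is not guessed but obtained by feeding the same inequalities to an LMI/SDP solver, which is exactly the systematic framework the TS approach provides over Mamdani-type design.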