
Showing papers by "Technion – Israel Institute of Technology published in 2019"


Journal ArticleDOI
TL;DR: This amended and improved digestion method (INFOGEST 2.0) avoids challenges associated with the original method, such as the inclusion of the oral phase and the use of gastric lipase.
Abstract: Developing a mechanistic understanding of the impact of food structure and composition on human health has increasingly involved simulating digestion in the upper gastrointestinal tract. These simulations have used a wide range of different conditions that often have very little physiological relevance, and this impedes the meaningful comparison of results. The standardized protocol presented here is based on an international consensus developed by the COST INFOGEST network. The method is designed to be used with standard laboratory equipment and requires limited experience to encourage a wide range of researchers to adopt it. It is a static digestion method that uses constant ratios of meal to digestive fluids and a constant pH for each step of digestion. This makes the method simple to use but not suitable for simulating digestion kinetics. Using this method, food samples are subjected to sequential oral, gastric and intestinal digestion while parameters such as electrolytes, enzymes, bile, dilution, pH and time of digestion are based on available physiological data. This amended and improved digestion method (INFOGEST 2.0) avoids challenges associated with the original method, such as the inclusion of the oral phase and the use of gastric lipase. The method can be used to assess the endpoints resulting from digestion of foods by analyzing the digestion products (e.g., peptides/amino acids, fatty acids, simple sugars) and evaluating the release of micronutrients from the food matrix. The whole protocol can be completed in ~7 d, including ~5 d required for the determination of enzyme activities.

1,394 citations


Journal ArticleDOI
TL;DR: Among patients with newly diagnosed advanced ovarian cancer who had a response to platinum-based chemotherapy, those who received niraparib had significantly longer progression-free survival than those who received placebo, regardless of the presence or absence of homologous-recombination deficiency.
Abstract: Background Niraparib, an inhibitor of poly(adenosine diphosphate [ADP]–ribose) polymerase (PARP), has been associated with significantly increased progression-free survival among patients ...

1,106 citations


Journal ArticleDOI
02 Jan 2019
TL;DR: A neural model that represents a snippet of code as a single fixed-length continuous vector (a "code embedding") which can be used to predict semantic properties of the snippet, making it the first to successfully predict method names based on a large, cross-project corpus.
Abstract: We present a neural model for representing snippets of code as continuous distributed vectors (``code embeddings''). The main idea is to represent a code snippet as a single fixed-length code vector, which can be used to predict semantic properties of the snippet. To this end, code is first decomposed to a collection of paths in its abstract syntax tree. Then, the network learns the atomic representation of each path while simultaneously learning how to aggregate a set of them. We demonstrate the effectiveness of our approach by using it to predict a method's name from the vector representation of its body. We evaluate our approach by training a model on a dataset of 12M methods. We show that code vectors trained on this dataset can predict method names from files that were unobserved during training. Furthermore, we show that our model learns useful method name vectors that capture semantic similarities, combinations, and analogies. A comparison of our approach to previous techniques over the same dataset shows an improvement of more than 75%, making it the first to successfully predict method names based on a large, cross-project corpus. Our trained model, visualizations and vector similarities are available as an interactive online demo at http://code2vec.org. The code, data and trained models are available at https://github.com/tech-srl/code2vec.
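
The following minimal sketch (not the authors' code; the tiny vocabularies, dimensions, and function names are assumptions made for illustration) shows the core aggregation idea described above: embed each AST path-context, combine the set with soft attention, and return one fixed-length code vector.

```python
# Illustrative sketch of the code2vec aggregation idea: embed each path-context
# (source token, AST path, target token), combine the set with soft attention,
# and return a single fixed-length code vector. Vocabularies and sizes are toys.
import numpy as np

rng = np.random.default_rng(0)
d = 8                                        # embedding width (toy size)
tokens = {"x": 0, "y": 1, "max": 2}          # hypothetical token vocabulary
paths = {"Name^Gt_If": 0, "Name^Return": 1}  # hypothetical AST-path vocabulary

tok_emb = rng.normal(size=(len(tokens), d))
path_emb = rng.normal(size=(len(paths), d))
W = rng.normal(size=(d, 3 * d))              # combines (source, path, target) into one context vector
a = rng.normal(size=d)                       # global attention vector

def code_vector(path_contexts):
    """Aggregate a list of (source, path, target) triples into one code vector."""
    ctx = []
    for s, p, t in path_contexts:
        c = np.concatenate([tok_emb[tokens[s]], path_emb[paths[p]], tok_emb[tokens[t]]])
        ctx.append(np.tanh(W @ c))           # combined context vector
    ctx = np.stack(ctx)
    att = np.exp(ctx @ a); att /= att.sum()  # soft attention over contexts
    return att @ ctx                         # weighted sum -> fixed-length vector

# A toy snippet such as "return x > y ? x : y", reduced to two path-contexts.
v = code_vector([("x", "Name^Gt_If", "y"), ("x", "Name^Return", "max")])
print(v.shape)  # (8,)
```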

849 citations


Book ChapterDOI
04 Oct 2019
TL;DR: In this article, the players jointly run a (possibly probabilistic) Turing machine M on their private inputs while keeping the maximum possible privacy about them; in the probabilistic case, all players agree on a single string y, selected with the right probability distribution, as M's output.
Abstract: The players wish to correctly run a given Turing machine M on their inputs x_i while keeping the maximum possible privacy about them. That is, they want to compute y = M(x_1, ..., x_n) without revealing more about the x_i's than is already contained in the value y itself. For instance, if M computes the sum of the x_i's, no single player should be able to learn more than the sum of the inputs of the other parties. Here M may very well be a probabilistic Turing machine. In this case, all players want to agree on a single string y, selected with the right probability distribution, as M's output.
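
As a concrete illustration of the sum example above, here is a minimal additive secret-sharing sketch (an illustration of the privacy goal only, not the paper's general protocol for arbitrary Turing machines; the prime modulus and inputs are arbitrary choices):

```python
# Each player splits its input into random additive shares modulo a public prime,
# players exchange shares, and only the total sum is ever reconstructed.
import secrets

P = 2**61 - 1  # a public prime modulus (illustrative choice)

def share(x, n):
    """Split x into n additive shares mod P."""
    parts = [secrets.randbelow(P) for _ in range(n - 1)]
    parts.append((x - sum(parts)) % P)
    return parts

inputs = [7, 13, 42]                       # the players' private inputs x_i
n = len(inputs)
shares = [share(x, n) for x in inputs]     # shares[i][j] goes from player i to player j

# Each player j locally sums the shares it received; the partial sums reveal only the total.
partial = [sum(shares[i][j] for i in range(n)) % P for j in range(n)]
total = sum(partial) % P
print(total == sum(inputs) % P)            # True: only the sum is learned
```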

682 citations


Proceedings ArticleDOI
02 May 2019
TL;DR: SinGAN, an unconditional generative model that can be learned from a single natural image, is introduced, trained to capture the internal distribution of patches within the image, and is then able to generate high quality, diverse samples that carry the same visual content as the image.
Abstract: We introduce SinGAN, an unconditional generative model that can be learned from a single natural image. Our model is trained to capture the internal distribution of patches within the image, and is then able to generate high quality, diverse samples that carry the same visual content as the image. SinGAN contains a pyramid of fully convolutional GANs, each responsible for learning the patch distribution at a different scale of the image. This allows generating new samples of arbitrary size and aspect ratio, that have significant variability, yet maintain both the global structure and the fine textures of the training image. In contrast to previous single image GAN schemes, our approach is not limited to texture images, and is not conditional (i.e. it generates samples from noise). User studies confirm that the generated samples are commonly confused to be real images. We illustrate the utility of SinGAN in a wide range of image manipulation tasks.
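
A minimal PyTorch sketch of the coarse-to-fine sampling pipeline described above (untrained, with simplified per-scale generators and an assumed scale factor; it illustrates the structure only, not the authors' implementation):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ScaleGenerator(nn.Module):
    """Fully convolutional generator for one scale of the pyramid (simplified)."""
    def __init__(self, channels=32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(channels, channels, 3, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(channels, 3, 3, padding=1),
        )
    def forward(self, noise, prev):
        # Residual refinement of the upsampled coarser sample, given injected noise.
        return prev + self.body(prev + noise)

def sample(generators, coarsest_shape=(1, 3, 25, 25), scale=4/3):
    """Generate one sample by walking the pyramid from coarse to fine."""
    x = torch.zeros(coarsest_shape)
    for i, g in enumerate(generators):
        if i > 0:  # upsample the previous scale's output before refining it
            h, w = int(x.shape[2] * scale), int(x.shape[3] * scale)
            x = F.interpolate(x, size=(h, w), mode="bilinear", align_corners=False)
        z = torch.randn_like(x)
        x = g(z, x)
    return x

gens = [ScaleGenerator() for _ in range(5)]  # untrained, for structure only
img = sample(gens)
print(img.shape)
```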

660 citations


Journal ArticleDOI
Nasim Mavaddat1, Kyriaki Michailidou1, Kyriaki Michailidou2, Joe Dennis1  +307 moreInstitutions (105)
TL;DR: PRSs optimized for prediction of estrogen receptor (ER)-specific disease were developed from the largest available genome-wide association dataset and empirically validated; the resulting PRS is a powerful and reliable predictor of breast cancer risk that may improve breast cancer prevention programs.
Abstract: Stratification of women according to their risk of breast cancer based on polygenic risk scores (PRSs) could improve screening and prevention strategies. Our aim was to develop PRSs, optimized for prediction of estrogen receptor (ER)-specific disease, from the largest available genome-wide association dataset and to empirically validate the PRSs in prospective studies. The development dataset comprised 94,075 case subjects and 75,017 control subjects of European ancestry from 69 studies, divided into training and validation sets. Samples were genotyped using genome-wide arrays, and single-nucleotide polymorphisms (SNPs) were selected by stepwise regression or lasso penalized regression. The best performing PRSs were validated in an independent test set comprising 11,428 case subjects and 18,323 control subjects from 10 prospective studies and 190,040 women from UK Biobank (3,215 incident breast cancers). For the best PRSs (313 SNPs), the odds ratio for overall disease per 1 standard deviation in ten prospective studies was 1.61 (95%CI: 1.57-1.65) with area under receiver-operator curve (AUC) = 0.630 (95%CI: 0.628-0.651). The lifetime risk of overall breast cancer in the top centile of the PRSs was 32.6%. Compared with women in the middle quintile, those in the highest 1% of risk had 4.37- and 2.78-fold risks, and those in the lowest 1% of risk had 0.16- and 0.27-fold risks, of developing ER-positive and ER-negative disease, respectively. Goodness-of-fit tests indicated that this PRS was well calibrated and predicts disease risk accurately in the tails of the distribution. This PRS is a powerful and reliable predictor of breast cancer risk that may improve breast cancer prevention programs.
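
For readers unfamiliar with PRSs, the sketch below illustrates (with made-up effect sizes and genotypes, not the 313-SNP score itself) how such a score is computed as a weighted sum of risk-allele dosages and how the reported odds ratio per standard deviation is applied:

```python
import numpy as np

rng = np.random.default_rng(1)
n_people, n_snps = 1000, 313
betas = rng.normal(0, 0.05, n_snps)          # per-SNP log odds ratios (illustrative)
dosages = rng.binomial(2, 0.3, (n_people, n_snps)).astype(float)  # 0/1/2 risk alleles

prs = dosages @ betas                        # raw score: sum_i beta_i * dosage_i
prs_std = (prs - prs.mean()) / prs.std()     # standardized score

# With a reported odds ratio of 1.61 per standard deviation, a woman one SD above
# the mean has her odds of disease multiplied by ~1.61, two SD above by ~1.61**2.
or_per_sd = 1.61
print(or_per_sd ** prs_std[:3])
```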

653 citations


Journal ArticleDOI
TL;DR: Pembrolizumab monotherapy demonstrated durable antitumor activity and manageable safety in patients with advanced cervical cancer and the US Food and Drug Administration granted accelerated approval of pembrolizumab for patients with advanced PD-L1-positive cervical cancer who experienced progression during or after chemotherapy.
Abstract: PURPOSEKEYNOTE-158 (ClinicalTrials.gov identifier: NCT02628067) is a phase II basket study investigating the antitumor activity and safety of pembrolizumab in multiple cancer types. We present inte...

575 citations


Journal ArticleDOI
10 Jan 2019-Nature
TL;DR: In a phase I trial, highly individualized peptide vaccines against unmutated tumour antigens and neoepitopes elicited sustained responses in CD8+ and CD4+ T cells, respectively, in patients with newly diagnosed glioblastoma.
Abstract: Patients with glioblastoma currently do not sufficiently benefit from recent breakthroughs in cancer treatment that use checkpoint inhibitors1,2. For treatments using checkpoint inhibitors to be successful, a high mutational load and responses to neoepitopes are thought to be essential3. There is limited intratumoural infiltration of immune cells4 in glioblastoma and these tumours contain only 30–50 non-synonymous mutations5. Exploitation of the full repertoire of tumour antigens—that is, both unmutated antigens and neoepitopes—may offer more effective immunotherapies, especially for tumours with a low mutational load. Here, in the phase I trial GAPVAC-101 of the Glioma Actively Personalized Vaccine Consortium (GAPVAC), we integrated highly individualized vaccinations with both types of tumour antigens into standard care to optimally exploit the limited target space for patients with newly diagnosed glioblastoma. Fifteen patients with glioblastomas positive for human leukocyte antigen (HLA)-A*02:01 or HLA-A*24:02 were treated with a vaccine (APVAC1) derived from a premanufactured library of unmutated antigens followed by treatment with APVAC2, which preferentially targeted neoepitopes. Personalization was based on mutations and analyses of the transcriptomes and immunopeptidomes of the individual tumours. The GAPVAC approach was feasible and vaccines that had poly-ICLC (polyriboinosinic-polyribocytidylic acid-poly-l-lysine carboxymethylcellulose) and granulocyte–macrophage colony-stimulating factor as adjuvants displayed favourable safety and strong immunogenicity. Unmutated APVAC1 antigens elicited sustained responses of central memory CD8+ T cells. APVAC2 induced predominantly CD4+ T cell responses of T helper 1 type against predicted neoepitopes.

568 citations


Journal ArticleDOI
TL;DR: This expert consensus report is neither a guideline update nor a position statement, but rather a summary and consensus view in the form of consensus recommendations.
Abstract: The European Society of Cardiology (ESC) has published a series of guidelines on heart failure (HF) over the last 25 years, most recently in 2016. Given the amount of new information that has become available since then, the Heart Failure Association (HFA) of the ESC recognized the need to review and summarise recent developments in a consensus document. Here we report from the HFA workshop that was held in January 2019 in Frankfurt, Germany. This expert consensus report is neither a guideline update nor a position statement, but rather a summary and consensus view in the form of consensus recommendations. The report describes how these guidance statements are supported by evidence, it makes some practical comments, and it highlights new research areas and how progress might change the clinical management of HF. We have avoided re-interpretation of information already considered in the 2016 ESC/HFA guidelines. Specific new recommendations have been made based on the evidence from major trials published since 2016, including sodium-glucose co-transporter 2 inhibitors in type 2 diabetes mellitus, MitraClip for functional mitral regurgitation, atrial fibrillation ablation in HF, tafamidis in cardiac transthyretin amyloidosis, rivaroxaban in HF, implantable cardioverter-defibrillators in non-ischaemic HF, and telemedicine for HF. In addition, new trial evidence from smaller trials and updated meta-analyses have given us the chance to provide refined recommendations in selected other areas. Further, new trial evidence is due in many of these areas and others over the next 2 years, in time for the planned 2021 ESC guidelines on the diagnosis and treatment of acute and chronic heart failure.

467 citations


Journal ArticleDOI
TL;DR: In this paper, a review of the latest activities on both fundamental aspects of Mg-based hydrides and their applications is presented, together with a historical overview of the topic and an outline of projected future developments.

411 citations


Journal ArticleDOI
TL;DR: Increased amounts of bandwidth are required to guarantee both high-quality/high-rate wireless services (4G and 5G) and reliable sensing capabilities, such as for automotive radar, air traffic control, earth geophysical monitoring, and security applications.
Abstract: Increased amounts of bandwidth are required to guarantee both high-quality/high-rate wireless services (4G and 5G) and reliable sensing capabilities, such as for automotive radar, air traffic control, earth geophysical monitoring, and security applications. Therefore, coexistence between radar and communication systems using overlapping bandwidths has come to be a primary investigation field in recent years. Various signal processing techniques, such as interference mitigation, precoding or spatial separation, and waveform design, allow both radar and communications to share the spectrum.

Journal ArticleDOI
TL;DR: Genome-wide association analyses based on whole-genome sequencing and imputation identify 40 new risk variants for colorectal cancer, including a strongly protective low-frequency variant at CHD1 and loci implicating signaling and immune function in disease etiology.
Abstract: To further dissect the genetic architecture of colorectal cancer (CRC), we performed whole-genome sequencing of 1,439 cases and 720 controls, imputed discovered sequence variants and Haplotype Reference Consortium panel variants into genome-wide association study data, and tested for association in 34,869 cases and 29,051 controls. Findings were followed up in an additional 23,262 cases and 38,296 controls. We discovered a strongly protective 0.3% frequency variant signal at CHD1. In a combined meta-analysis of 125,478 individuals, we identified 40 new independent signals at P < 5 × 10-8, bringing the number of known independent signals for CRC to ~100. New signals implicate lower-frequency variants, Kruppel-like factors, Hedgehog signaling, Hippo-YAP signaling, long noncoding RNAs and somatic drivers, and support a role for immune function. Heritability analyses suggest that CRC risk is highly polygenic, and larger, more comprehensive studies enabling rare variant analysis will improve understanding of biology underlying this risk and influence personalized screening strategies and drug development.

Proceedings ArticleDOI
07 Jan 2019
TL;DR: In this article, distance metric learning (DML) is applied to object classification, both in the standard regime of rich training data and in the few-shot scenario, where each category is represented by only a few examples.
Abstract: Distance metric learning (DML) has been successfully applied to object classification, both in the standard regime of rich training data and in the few-shot scenario, where each category is represented by only a few examples. In this work, we propose a new method for DML that simultaneously learns the backbone network parameters, the embedding space, and the multi-modal distribution of each of the training categories in that space, in a single end-to-end training process. Our approach outperforms state-of-the-art methods for DML-based object classification on a variety of standard fine-grained datasets. Furthermore, we demonstrate the effectiveness of our approach on the problem of few-shot object detection, by incorporating the proposed DML architecture as a classification head into a standard object detection model. We achieve the best results on the ImageNet-LOC dataset compared to strong baselines, when only a few training examples are available. We also offer the community a new episodic benchmark based on the ImageNet dataset for the few-shot object detection task.
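
The following sketch (random placeholder embeddings and centers; not the authors' architecture) illustrates the classification rule implied by a multi-modal class representation in an embedding space: each class keeps several mode centers, and a query is scored by its distance to the nearest mode of each class.

```python
import numpy as np

rng = np.random.default_rng(2)
d, n_classes, modes_per_class = 16, 5, 3
centers = rng.normal(size=(n_classes, modes_per_class, d))  # learned class modes (placeholder)

def classify(embedding, sigma=1.0):
    """Score each class by its closest mode; softmax over negative squared distances."""
    d2 = ((centers - embedding) ** 2).sum(-1)        # (n_classes, modes_per_class)
    logits = -d2.min(axis=1) / (2 * sigma ** 2)      # nearest-mode distance per class
    p = np.exp(logits - logits.max())
    return p / p.sum()

query = rng.normal(size=d)                            # stand-in for a backbone embedding
print(classify(query))
```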

Journal ArticleDOI
TL;DR: A novel lead-free halide is presented, namely Rb2CuBr3, as a scintillator with exceptionally high light yield, providing nontoxicity, high radioluminescence intensity, and good stability, thus laying good foundations for potential application in low-dose radiography.
Abstract: Scintillators are widely utilized for radiation detection in many fields, such as nondestructive inspection, medical imaging, and space exploration. Lead halide perovskite scintillators have recently received extensive research attention owing to their tunable emission wavelength, low detection limit, and ease of fabrication. However, the low light yields toward X-ray irradiation and the lead toxicity of these perovskites severely restrict their practical application. A novel lead-free halide is presented, namely Rb2CuBr3, as a scintillator with exceptionally high light yield. Rb2CuBr3 exhibits a 1D crystal structure and enjoys strong carrier confinement and near-unity photoluminescence quantum yield (98.6%) in violet emission. The high photoluminescence quantum yield combined with negligible self-absorption from self-trapped exciton emission and strong X-ray absorption capability enables a record high light yield of ≈91056 photons per MeV among perovskite and related scintillators. Overall, Rb2CuBr3 provides nontoxicity, high radioluminescence intensity, and good stability, thus laying good foundations for potential application in low-dose radiography.

Journal ArticleDOI
TL;DR: The first known attempt to quantify the volume of managed aquifer recharge (MAR) at global scale, and to illustrate the advancement of all the major types of MAR and relate these to research and regulatory advancements is presented in this article.
Abstract: The last 60 years has seen unprecedented groundwater extraction and overdraft as well as development of new technologies for water treatment that together drive the advance in intentional groundwater replenishment known as managed aquifer recharge (MAR). This paper is the first known attempt to quantify the volume of MAR at global scale, and to illustrate the advancement of all the major types of MAR and relate these to research and regulatory advancements. Faced with changing climate and rising intensity of climate extremes, MAR is an increasingly important water management strategy, alongside demand management, to maintain, enhance and secure stressed groundwater systems and to protect and improve water quality. During this time, scientific research—on hydraulic design of facilities, tracer studies, managing clogging, recovery efficiency and water quality changes in aquifers—has underpinned practical improvements in MAR and has had broader benefits in hydrogeology. Recharge wells have greatly accelerated recharge, particularly in urban areas and for mine water management. In recent years, research into governance, operating practices, reliability, economics, risk assessment and public acceptance of MAR has been undertaken. Since the 1960s, implementation of MAR has accelerated at a rate of 5%/year, but is not keeping pace with increasing groundwater extraction. Currently, MAR has reached an estimated 10 km3/year, ~2.4% of groundwater extraction in countries reporting MAR (or ~1.0% of global groundwater extraction). MAR is likely to exceed 10% of global extraction, based on experience where MAR is more advanced, to sustain quantity, reliability and quality of water supplies.

Journal ArticleDOI
TL;DR: This review aims to bring together the multidisciplinary and interdisciplinary features of MDR cancers by deciphering the molecular mechanisms underlying anticancer drug resistance, to pave the way towards the development of novel precision medicine treatment modalities that are able to surmount distinct and well-defined mechanisms of anticancer drug resistance.

Journal ArticleDOI
20 May 2019
TL;DR: A number of families of accelerating optical waves have been identified in the paraxial and non-paraxial domains in space and/or time, with different methods developed to control at will their trajectory, amplitude, and beam width as mentioned in this paper.
Abstract: Over the last dozen years, the area of accelerating waves has made considerable advances not only in terms of fundamentals and experimental demonstrations, but also in connection to a wide range of applications. Starting from the prototypical Airy beam that was proposed and observed in 2007, new families of accelerating waves have been identified in the paraxial and nonparaxial domains in space and/or time, with different methods developed to control at will their trajectory, amplitude, and beam width. Accelerating optical waves exhibit a number of highly desirable attributes. They move along a curved or accelerating trajectory while being resilient to perturbations (self-healing) and are diffraction-free. It is because of these particular features that accelerating waves have been utilized in a variety of applications in the areas of filamentation, beam focusing, particle manipulation, biomedical imaging, plasmons, and material processing, among others.
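
As a small numerical illustration of the prototypical Airy beam mentioned above, the sketch below propagates a finite-energy 1D Airy profile with a paraxial angular-spectrum step and prints the main-lobe position, which follows the expected parabolic (accelerating) trajectory. Units and parameter values are arbitrary normalized choices.

```python
import numpy as np
from scipy.special import airy

N, L = 4096, 200.0                    # grid points and window (dimensionless units)
x = np.linspace(-L / 2, L / 2, N)
a = 0.05                              # exponential truncation -> finite energy
u0 = airy(x)[0] * np.exp(a * x)       # Ai(x) * exp(a x) initial field

k = 2 * np.pi * np.fft.fftfreq(N, d=x[1] - x[0])

def propagate(u, z):
    """Paraxial free-space propagation over distance z (normalized units)."""
    return np.fft.ifft(np.fft.fft(u) * np.exp(-1j * k**2 * z / 2))

for z in (0.0, 5.0, 10.0):
    peak = x[np.argmax(np.abs(propagate(u0, z)))]
    print(f"z={z:5.1f}  main-lobe position ~ {peak:6.2f}")  # shifts roughly as z**2 / 4
```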

Journal ArticleDOI
TL;DR: ESGE recommends the use of high volume or low volume PEG-based regimens as well as that of non-PEG- based agents that have been clinically validated for routine bowel preparation as an acceptable alternative to split dosing.
Abstract: ESGE recommends a low fiber diet on the day preceding colonoscopy. Strong recommendation, moderate quality evidence. ESGE recommends the use of enhanced instructions for bowel preparation. Strong recommendation, moderate quality evidence. ESGE suggests adding oral simethicone to bowel preparation. Weak recommendation, moderate quality evidence. ESGE recommends split-dose bowel preparation for elective colonoscopy. Strong recommendation, high quality evidence. ESGE recommends, for patients undergoing afternoon colonoscopy, a same-day bowel preparation as an acceptable alternative to split dosing. Strong recommendation, high quality evidence. ESGE recommends to start the last dose of bowel preparation within 5 hours of colonoscopy, and to complete it at least 2 hours before the beginning of the procedure. Strong recommendation, moderate quality evidence. ESGE recommends the use of high volume or low volume PEG-based regimens as well as that of non-PEG-based agents that have been clinically validated for routine bowel preparation. In patients at risk for hydroelectrolyte disturbances, the choice of laxative should be individualized. Strong recommendation, moderate quality evidence.

Journal ArticleDOI
TL;DR: An integrated high-dimensional measurement of immune age describes a person’s immune status better than chronological age and predicts all-cause mortality beyond well-established risk factors in the Framingham Heart Study.
Abstract: Immune responses generally decline with age. However, the dynamics of this process at the individual level have not been characterized, hindering quantification of an individual's immune age. Here, we use multiple 'omics' technologies to capture population- and individual-level changes in the human immune system of 135 healthy adult individuals of different ages sampled longitudinally over a nine-year period. We observed high inter-individual variability in the rates of change of cellular frequencies that was dictated by their baseline values, allowing identification of steady-state levels toward which a cell subset converged and the ordered convergence of multiple cell subsets toward an older adult homeostasis. These data form a high-dimensional trajectory of immune aging (IMM-AGE) that describes a person's immune status better than chronological age. We show that the IMM-AGE score predicted all-cause mortality beyond well-established risk factors in the Framingham Heart Study, establishing its potential use in clinics for identification of patients at risk.

Journal ArticleDOI
01 May 2019-Nature
TL;DR: In this article, an analogue black hole was constructed with improvements compared with the previous setup, such as reduced magnetic field noise, enhanced mechanical and thermal stability and redesigned optics, and the correlation spectrum of the Hawking radiation was measured in an analog black hole composed of rubidium atoms.
Abstract: The entropy of a black hole1 and Hawking radiation2 should have the same temperature given by the surface gravity, within a numerical factor of the order of unity. In addition, Hawking radiation should have a thermal spectrum, which creates an information paradox3,4. However, the thermality should be limited by greybody factors5, at the very least6. It has been proposed that the physics of Hawking radiation could be verified in an analogue system7, an idea that has been carefully studied and developed theoretically8–18. Classical white-hole analogues have been investigated experimentally19–21, and other analogue systems have been presented22,23. The theoretical works and our long-term study of this subject15,24–27 enabled us to observe spontaneous Hawking radiation in an analogue black hole28. The observed correlation spectrum showed thermality at the lowest and highest energies, but the overall spectrum was not of the thermal form, and no temperature could be ascribed to it. Theoretical studies of our observation made predictions about the thermality and Hawking temperature29–33. Here we construct an analogue black hole with improvements compared with our previous setup, such as reduced magnetic field noise, enhanced mechanical and thermal stability and redesigned optics. We find that the correlation spectrum of Hawking radiation agrees well with a thermal spectrum, and its temperature is given by the surface gravity, confirming the predictions of Hawking’s theory. The Hawking radiation observed is in the regime of linear dispersion, in analogy with a real black hole, and the radiation inside the black hole is composed of negative-energy partner modes only, as predicted. The spectrum of Hawking radiation is measured in an analogue black hole composed of rubidium atoms, confirming Hawking’s prediction that Hawking radiation is thermal with a temperature given by the surface gravity.
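
For reference, the standard relation between Hawking temperature and surface gravity reads as follows (this is the textbook formula, quoted for context rather than taken from the paper's text; in the sonic analogue, the ratio κ/c is replaced by the gradient of the difference between flow speed and sound speed at the horizon):

```latex
k_B T_H = \frac{\hbar\,\kappa}{2\pi c},
\qquad
\left.\frac{\kappa}{c}\right|_{\text{analogue}} = \left|\frac{\partial\,(v - c_s)}{\partial x}\right|_{\text{horizon}}
```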

Journal ArticleDOI
TL;DR: This update focuses on resuscitation and risk assessment; preendoscopic, endoscopic, and pharmacologic management; and secondary prophylaxis for recurrent UGIB, as well as assessing the methodological quality of existing systematic reviews.
Abstract: This guideline updates the 2010 International Consensus Recommendations on Management of Patients With Nonvariceal Upper Gastrointestinal Bleeding.

Proceedings Article
01 Jan 2019
TL;DR: This paper introduces the first practical 4-bit post-training quantization approach: it does not involve training the quantized model (fine-tuning), nor does it require the availability of the full dataset, and it achieves accuracy that is just a few percent below the state-of-the-art baseline across a wide range of convolutional models.
Abstract: Convolutional neural networks require significant memory bandwidth and storage for intermediate computations, apart from substantial computing resources. Neural network quantization has significant benefits in reducing the amount of intermediate results, but it often requires the full dataset and time-consuming fine-tuning to recover the accuracy lost after quantization. This paper introduces the first practical 4-bit post-training quantization approach: it does not involve training the quantized model (fine-tuning), nor does it require the availability of the full dataset. We target the quantization of both activations and weights and suggest three complementary methods for minimizing quantization error at the tensor level, two of which admit a closed-form analytical solution. Combining these methods, our approach achieves accuracy that is just a few percent below the state-of-the-art baseline across a wide range of convolutional models. The source code to replicate all experiments is available on GitHub: https://github.com/submission2019/cnn-quantization.
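
A hedged sketch of one ingredient of such a pipeline, a per-tensor clipping threshold chosen to minimize quantization mean-squared error without any training data, is shown below; it uses a simple grid search for clarity rather than the paper's closed-form solutions, and all names and values are illustrative.

```python
import numpy as np

def quantize(x, clip, n_bits=4):
    """Symmetric uniform quantization of x into 2**n_bits levels within [-clip, clip]."""
    levels = 2 ** (n_bits - 1) - 1
    scale = clip / levels
    return np.clip(np.round(x / scale), -levels - 1, levels) * scale

def best_clip(x, n_bits=4, grid=200):
    """Search the clipping value that minimizes MSE between x and its quantized copy."""
    candidates = np.linspace(np.abs(x).max() / grid, np.abs(x).max(), grid)
    errors = [np.mean((x - quantize(x, c, n_bits)) ** 2) for c in candidates]
    return candidates[int(np.argmin(errors))]

rng = np.random.default_rng(3)
w = rng.normal(0, 1, 10000)                    # stand-in for a weight tensor
c = best_clip(w)
print(c, np.mean((w - quantize(w, c)) ** 2))   # chosen clip and its quantization MSE
```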

Journal ArticleDOI
Georges Aad1, Alexander Kupco2, Samuel Webb3, Timo Dreyer4  +3380 moreInstitutions (206)
TL;DR: In this article, a search for high-mass dielectron and dimuon resonances in the mass range of 250 GeV to 6 TeV was performed at the Large Hadron Collider.

Journal ArticleDOI
E. Kou, Phillip Urquijo1, Wolfgang Altmannshofer2, F. Beaujean3  +558 moreInstitutions (140)
TL;DR: The Belle II detector as mentioned in this paper is a state-of-the-art detector for heavy flavor physics, quarkonium and exotic states, searches for dark sectors, and many other areas.
Abstract: The Belle II detector will provide a major step forward in precision heavy flavor physics, quarkonium and exotic states, searches for dark sectors, and many other areas. The sensitivity to a large number of key observables can be improved by about an order of magnitude compared to the current measurements, and up to two orders in very clean search measurements. This increase in statistical precision arises not only due to the increased luminosity, but also from improved detector efficiency and precision for many channels. Many of the most interesting observables tend to have very small theoretical uncertainties that will therefore not limit the physics reach. This book has presented many new ideas for measurements, both to elucidate the nature of current anomalies seen in flavor, and to search for new phenomena in a plethora of observables that will become accessible with the Belle II dataset. The simulation used for the studies in this book was state of the art at the time, though we are learning a lot more about the experiment during the commissioning period. The detector is in operation, and working spectacularly well.

Journal ArticleDOI
Morad Aaboud, Georges Aad1, Brad Abbott2, Dale Charles Abbott3  +2936 moreInstitutions (198)
TL;DR: An exclusion limit on the H→invisible branching ratio of 0.26 (0.17 +0.07/−0.05 expected) at 95% confidence level is observed in combination with the results at √s = 7 and 8 TeV.
Abstract: Dark matter particles, if sufficiently light, may be produced in decays of the Higgs boson. This Letter presents a statistical combination of searches for H→invisible decays where H is produced according to the standard model via vector boson fusion, Z(ll)H, and W/Z(had)H, all performed with the ATLAS detector using 36.1 fb⁻¹ of pp collisions at a center-of-mass energy of √s = 13 TeV at the LHC. In combination with the results at √s = 7 and 8 TeV, an exclusion limit on the H→invisible branching ratio of 0.26 (0.17 +0.07/−0.05) at 95% confidence level is observed (expected).

Journal ArticleDOI
19 Sep 2019-Cell
TL;DR: The analysis of melanoma patient tumor data recapitulates the results in terms of overall survival and response to immune checkpoint therapy, highlighting the importance of clonal mutations in robust immune surveillance and the need to quantify patient ITH (intratumor heterogeneity) to determine the response to checkpoint blockade.

Journal ArticleDOI
TL;DR: In this paper, the authors decouple the water oxidation and reduction reactions by dividing the process into two steps: an electrochemical step that reduces water at the cathode and oxidizes the anode, followed by a spontaneous chemical step, driven faster at higher temperature, which reduces the anode back to its initial state by oxidizing water.
Abstract: Electrolytic hydrogen production faces technological challenges to improve its efficiency, economic value and potential for global integration. In conventional water electrolysis, the water oxidation and reduction reactions are coupled in both time and space, as they occur simultaneously at an anode and a cathode in the same cell. This introduces challenges, such as product separation, and sets strict constraints on material selection and process conditions. Here, we decouple these reactions by dividing the process into two steps: an electrochemical step that reduces water at the cathode and oxidizes the anode, followed by a spontaneous chemical step that is driven faster at higher temperature, which reduces the anode back to its initial state by oxidizing water. This enables overall water splitting at average cell voltages of 1.44–1.60 V with nominal current densities of 10–200 mA cm−2 in a membrane-free, two-electrode cell. This allows us to produce hydrogen at low voltages in a simple, cyclic process with high efficiency, robustness, safety and scale-up potential. Conventionally, the two half reactions involved in water electrolysis occur simultaneously, presenting materials and process challenges. Here, the authors decouple these to split water efficiently in two steps: electrochemical hydrogen evolution, followed by spontaneous oxygen evolution at elevated temperature.
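
A schematic of the two-step bookkeeping described above, written with a generic regenerable hydroxide/oxyhydroxide anode couple M(OH)2/MOOH as an illustrative stand-in (the abstract does not specify the anode chemistry):

```latex
% Electrochemical step (cell under current):
\text{cathode:}\quad 2\,\mathrm{H_2O} + 2e^- \rightarrow \mathrm{H_2} + 2\,\mathrm{OH^-}
\qquad
\text{anode:}\quad 2\,\mathrm{M(OH)_2} + 2\,\mathrm{OH^-} \rightarrow 2\,\mathrm{MOOH} + 2\,\mathrm{H_2O} + 2e^-
% Spontaneous chemical step (faster at elevated temperature):
4\,\mathrm{MOOH} + 2\,\mathrm{H_2O} \rightarrow 4\,\mathrm{M(OH)_2} + \mathrm{O_2}
% Net over a full cycle: 2 H2O -> 2 H2 + O2
```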

Journal ArticleDOI
TL;DR: This article aims to review nature-inspired chemical sensors for enabling fast, relatively inexpensive, and minimally invasive diagnostics and follow-up of the health conditions via monitoring of biomarkers and volatile biomarkers.
Abstract: This article aims to review nature-inspired chemical sensors for enabling fast, relatively inexpensive, and minimally (or non-) invasive diagnostics and follow-up of the health conditions. It can be achieved via monitoring of biomarkers and volatile biomarkers, that are excreted from one or combination of body fluids (breath, sweat, saliva, urine, seminal fluid, nipple aspirate fluid, tears, stool, blood, interstitial fluid, and cerebrospinal fluid). The first part of the review gives an updated compilation of the biomarkers linked with specific sickness and/or sampling origin. The other part of the review provides a didactic examination of the concepts and approaches related to the emerging chemistries, sensing materials, and transduction techniques used for biomarker-based medical evaluations. The strengths and pitfalls of each approach are discussed and criticized. Future perspective with relation to the information and communication era is presented and discussed.

Journal ArticleDOI
Georges Aad1, Alexander Kupco2, Samuel Webb3, Timo Dreyer4  +2962 moreInstitutions (195)
TL;DR: In this article, an improved energy clustering algorithm is introduced, and its implications for the measurement and identification of prompt electrons and photons are discussed in detail, along with the corrections and calibrations that affect performance, including energy calibration and identification and isolation efficiencies.
Abstract: This paper describes the reconstruction of electrons and photons with the ATLAS detector, employed for measurements and searches exploiting the complete LHC Run 2 dataset. An improved energy clustering algorithm is introduced, and its implications for the measurement and identification of prompt electrons and photons are discussed in detail. Corrections and calibrations that affect performance, including energy calibration, identification and isolation efficiencies, and the measurement of the charge of reconstructed electron candidates are determined using up to 81 fb−1 of proton-proton collision data collected at √s=13 TeV between 2015 and 2017.

Journal ArticleDOI
TL;DR: Emulsion templating now extends far beyond the original hydrophobic porous polymers that were synthesized within surfactant-stabilized water-in-oil high internal phase emulsions (HIPEs).
Abstract: Emulsion templating presently extends far beyond the original hydrophobic porous polymers that were synthesized within surfactant-stabilized water-in-oil high internal phase emulsions (HIPEs) by us...