
Showing papers by "Technion – Israel Institute of Technology" published in 2016


Journal ArticleDOI
Daniel J. Klionsky, Kotb Abdelmohsen, Akihisa Abe, Joynal Abedin, and 2,519 more authors (695 institutions)
TL;DR: In this paper, the authors present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. 
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to target by gene knockout or RNA interference more than one autophagy-related protein. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways implying that not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular assays, we hope to encourage technical innovation in the field.

5,187 citations


Journal ArticleDOI
TL;DR: Bound states in the continuum (BICs) are waves that remain localized even though they coexist with a continuous spectrum of radiating waves that can carry energy away.
Abstract: Bound states in the continuum (BICs) are waves that remain localized even though they coexist with a continuous spectrum of radiating waves that can carry energy away. Their very existence defies conventional wisdom. Although BICs were first proposed in quantum mechanics, they are a general wave phenomenon and have since been identified in electromagnetic waves, acoustic waves in air, water waves and elastic waves in solids. These states have been studied in a wide range of material systems, such as piezoelectric materials, dielectric photonic crystals, optical waveguides and fibres, quantum dots, graphene and topological insulators. In this Review, we describe recent developments in this field with an emphasis on the physical mechanisms that lead to BICs across seemingly very different materials and types of waves. We also discuss experimental realizations, existing applications and directions for future work. The fascinating wave phenomenon of ‘bound states in the continuum’ spans different material and wave systems, including electron, electromagnetic and mechanical waves. In this Review, we focus on the common physical mechanisms underlying these bound states, whilst also discussing recent experimental realizations, current applications and future opportunities for research.

1,612 citations


Journal ArticleDOI
TL;DR: The eigenstate thermalization hypothesis (ETH) as discussed by the authors is a natural extension of quantum chaos and random matrix theory (RMT) that allows one to describe thermalization in isolated chaotic systems without invoking the notion of an external bath.
Abstract: This review gives a pedagogical introduction to the eigenstate thermalization hypothesis (ETH), its basis, and its implications to statistical mechanics and thermodynamics. In the first part, ETH is introduced as a natural extension of ideas from quantum chaos and random matrix theory (RMT). To this end, we present a brief overview of classical and quantum chaos, as well as RMT and some of its most important predictions. The latter include the statistics of energy levels, eigenstate components, and matrix elements of observables. Building on these, we introduce the ETH and show that it allows one to describe thermalization in isolated chaotic systems without invoking the notion of an external bath. We examine numerical evidence of eigenstate thermalization from studies of many-body lattice systems. We also introduce the concept of a quench as a means of taking isolated systems out of equilibrium, and discuss results of numerical experiments on quantum quenches. The second part of the review explores the i...

1,536 citations


Proceedings Article
08 Feb 2016
TL;DR: A binary matrix multiplication GPU kernel is written with which it is possible to run the MNIST BNN 7 times faster than with an unoptimized GPU kernel, without suffering any loss in classification accuracy.
Abstract: We introduce a method to train Binarized Neural Networks (BNNs) - neural networks with binary weights and activations at run-time. At train-time the binary weights and activations are used for computing the parameter gradients. During the forward pass, BNNs drastically reduce memory size and accesses, and replace most arithmetic operations with bit-wise operations, which is expected to substantially improve power-efficiency. To validate the effectiveness of BNNs, we conducted two sets of experiments on the Torch7 and Theano frameworks. On both, BNNs achieved nearly state-of-the-art results over the MNIST, CIFAR-10 and SVHN datasets. We also report our preliminary results on the challenging ImageNet dataset. Last but not least, we wrote a binary matrix multiplication GPU kernel with which it is possible to run our MNIST BNN 7 times faster than with an unoptimized GPU kernel, without suffering any loss in classification accuracy. The code for training and running our BNNs is available on-line.
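The bit-wise arithmetic that such kernels exploit can be illustrated with a small sketch. The snippet below is an illustration, not the paper's actual Torch7/Theano code: it binarizes real-valued vectors with a sign function and shows that the dot product of two {-1, +1} vectors can be computed from packed bit masks with XOR and popcount, the trick behind binary matrix multiplication.

```python
import numpy as np

def binarize(x):
    # Deterministic binarization: +1 for non-negative entries, -1 otherwise.
    return np.where(np.asarray(x) >= 0, 1, -1)

def pack(v):
    # Pack a {-1, +1} vector into an integer bit mask (bit i set iff v[i] == +1).
    return sum(1 << i for i, x in enumerate(v) if x == 1)

def xnor_dot(a_bits, b_bits, n):
    # Matching bits contribute +1 and differing bits -1, so with d differing
    # positions the dot product is (n - d) - d = n - 2 * popcount(a XOR b).
    return n - 2 * bin(a_bits ^ b_bits).count("1")

rng = np.random.default_rng(0)
a = binarize(rng.standard_normal(16))
b = binarize(rng.standard_normal(16))
assert int(a @ b) == xnor_dot(pack(a), pack(b), 16)  # bit-wise result matches
```

On hardware, `pack` corresponds to storing 32 or 64 weights per machine word, which is where the memory and speed savings come from.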

1,425 citations


01 Jan 2016
TL;DR: The ubiquitin pathway is a highly complex, temporally controlled and tightly regulated process, which plays important roles in a broad array of basic cellular processes as discussed by the authors, including cell cycle and growth regulators, components of signal transduction pathways, enzymes of house keeping and cell-specific metabolic pathways.
Abstract: The discovery of the ubiquitin pathway and its many substrates and functions has revolutionized our concept of intracellular protein degradation. Once regarded as an unregulated, non-specific terminal scavenger process, proteolysis of cellular proteins is now understood to be a highly complex, temporally controlled and tightly regulated process which plays important roles in a broad array of basic cellular processes. It is carried out by a complex cascade of enzymes and displays a high degree of specificity towards its numerous substrates. Among these are cell cycle and growth regulators, components of signal transduction pathways, enzymes of housekeeping and cell-specific metabolic pathways, and mutated or post-translationally damaged proteins. The system is also involved in processing major histocompatibility complex (MHC) class I antigens. For many years it was thought that activity of the system is limited to the cytosol and probably to the nucleus. However, recent experimental evidence has demonstrated that membrane-anchored and even secretory pathway-compartmentalized proteins are also targeted by the system. These proteins must first be translocated in a retrograde manner into the cytosol, as components of the pathway have not been identified in the endoplasmic reticulum (ER) lumen. With the multiple cellular targets, it is not surprising that the system is involved in the regulation of many basic cellular processes such as cell cycle and division, differentiation and development, the response to stress and extracellular modulators, morphogenesis of neuronal networks, modulation of cell surface receptors, ion channels and the secretory pathway, DNA repair, regulation of the immune and inflammatory responses, biogenesis of organelles and apoptosis. One would also predict that aberrations in such a complex system may be implicated in the pathogenesis of many diseases, both inherited and acquired. Recent evidence shows that this is indeed the case.
Degradation of a protein by the ubiquitin system involves two distinct …

1,256 citations


Posted Content
TL;DR: A binary matrix multiplication GPU kernel is programmed with which it is possible to run the MNIST QNN 7 times faster than with an unoptimized GPU kernel, without suffering any loss in classification accuracy.
Abstract: We introduce a method to train Quantized Neural Networks (QNNs) --- neural networks with extremely low precision (e.g., 1-bit) weights and activations, at run-time. At train-time the quantized weights and activations are used for computing the parameter gradients. During the forward pass, QNNs drastically reduce memory size and accesses, and replace most arithmetic operations with bit-wise operations. As a result, power consumption is expected to be drastically reduced. We trained QNNs over the MNIST, CIFAR-10, SVHN and ImageNet datasets. The resulting QNNs achieve prediction accuracy comparable to their 32-bit counterparts. For example, our quantized version of AlexNet with 1-bit weights and 2-bit activations achieves $51\%$ top-1 accuracy. Moreover, we quantize the parameter gradients to 6-bits as well which enables gradients computation using only bit-wise operation. Quantized recurrent neural networks were tested over the Penn Treebank dataset, and achieved comparable accuracy as their 32-bit counterparts using only 4-bits. Last but not least, we programmed a binary matrix multiplication GPU kernel with which it is possible to run our MNIST QNN 7 times faster than with an unoptimized GPU kernel, without suffering any loss in classification accuracy. The QNN code is available online.
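A common way to realize extremely low-precision values is uniform quantization. The sketch below is a generic illustration under that assumption, not the paper's exact scheme:

```python
import numpy as np

def quantize(x, bits):
    """Uniformly quantize values in [-1, 1] to 2**bits levels.
    A generic scheme for illustration; the paper's quantizer may differ."""
    levels = 2 ** bits - 1
    x = np.clip(np.asarray(x, dtype=float), -1.0, 1.0)
    return 2.0 * np.round((x + 1.0) / 2.0 * levels) / levels - 1.0

x = np.array([-0.7, 0.02, 0.9])
print(quantize(x, 1))  # 1 bit collapses every value to -1 or +1
print(quantize(x, 8))  # 8 bits track the inputs closely
```

During training, such quantizers are typically paired with the straight-through estimator: the rounding is treated as the identity in the backward pass so gradients still reach the underlying real-valued parameters.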

1,232 citations


Journal ArticleDOI
TL;DR: A droplet-based, single-cell RNA-seq method is implemented to determine the transcriptomes of over 12,000 individual pancreatic cells from four human donors and two mouse strains and provides a resource for the discovery of novel cell type-specific transcription factors, signaling receptors, and medically relevant genes.
Abstract: Although the function of the mammalian pancreas hinges on complex interactions of distinct cell types, gene expression profiles have primarily been described with bulk mixtures. Here we implemented a droplet-based, single-cell RNA-seq method to determine the transcriptomes of over 12,000 individual pancreatic cells from four human donors and two mouse strains. Cells could be divided into 15 clusters that matched previously characterized cell types: all endocrine cell types, including rare epsilon-cells; exocrine cell types; vascular cells; Schwann cells; quiescent and activated stellate cells; and four types of immune cells. We detected subpopulations of ductal cells with distinct expression profiles and validated their existence with immunohistochemistry stains. Moreover, among human beta-cells, we detected heterogeneity in the regulation of genes relating to functional maturation and levels of ER stress. Finally, we deconvolved bulk gene expression samples using the single-cell data to detect disease-associated differential expression. Our dataset provides a resource for the discovery of novel cell type-specific transcription factors, signaling receptors, and medically relevant genes.
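The deconvolution idea can be sketched in a few lines: per-cluster mean expression profiles from the single-cell data form a "signature" matrix, and a bulk sample is modeled as a mixture of cell types. The toy example below fits the mixture by least squares; the genes, cell types, and numbers are invented for illustration, and the paper's actual method may differ.

```python
import numpy as np

# Signature matrix (genes x cell types): per-cluster mean expression derived
# from single-cell data. All names and values here are made up.
signatures = np.array([
    [5.0, 0.1, 0.2],   # marker gene enriched in alpha cells
    [0.2, 4.0, 0.1],   # marker gene enriched in beta cells
    [0.1, 0.3, 3.0],   # marker gene enriched in ductal cells
    [1.0, 1.0, 1.0],   # uniformly expressed gene
])

# Synthetic bulk sample: 60% alpha, 30% beta, 10% ductal
true_props = np.array([0.6, 0.3, 0.1])
bulk = signatures @ true_props

# Least-squares deconvolution, clipped and normalized to proportions
est, *_ = np.linalg.lstsq(signatures, bulk, rcond=None)
est = np.clip(est, 0, None)
est /= est.sum()
print(np.round(est, 3))  # recovers the simulated mixture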

1,046 citations


Journal ArticleDOI
26 Aug 2016-Science
TL;DR: Recent progress in tailoring and combining quantum dots to build electronic and optoelectronic devices is reviewed; new ligand chemistries and matrix materials have been reported that provide freedom to control the dynamics of excitons and charge carriers and to design device interfaces.
Abstract: BACKGROUND The Information Age was founded on the semiconductor revolution, marked by the growth of high-purity semiconductor single crystals. The resultant design and fabrication of electronic devices exploits our ability to control the concentration, motion, and dynamics of charge carriers in the bulk semiconductor solid state. Our desire to introduce electronics everywhere is fueled by opportunities to create intelligent and enabling devices for the information, communication, consumer product, health, and energy sectors. This demand for ubiquitous electronics is spurring the design of materials that exhibit engineered physical properties and that can enable new fabrication methods for low-cost, large-area, and flexible devices. Semiconductors, which are at the heart of electronics and optoelectronics, come with high demands on chemical purity and structural perfection. Alternatives to silicon technology are expected to combine the electronic and optical properties of inorganic semiconductors (high charge carrier mobility, precise n- and p-type doping, and the ability to engineer the band gap energy) with the benefits of additive device manufacturing: low cost, large area, and the use of solution-based fabrication techniques. Along these lines, colloidal semiconductor quantum dots (QDs), which are nanoscale crystals of analogous bulk semiconductor crystals, offer a powerful platform for device engineers. Colloidal QDs may be tailored in size, shape, and composition and their surfaces functionalized with molecular ligands of diverse chemistry. At the nanoscale (typically 2 to 20 nm), quantum and dielectric confinement effects give rise to the prized size-, shape-, and composition-tunable electronic and optical properties of QDs. Surface ligands enable the stabilization of QDs in the form of colloids, allowing their bottom-up assembly into QD solids. 
The physical properties of QD solids can be designed by selecting the characteristics of the individual QD building blocks and by controlling the electronic communication between the QDs in the solid state. These QD solids can be engineered with application-specific electronic and optoelectronic properties for the large-area, solution-based assembly of devices. ADVANCES The large surface-to-volume ratio of QDs places a substantial importance on the composition and structure of the surface in defining the physical properties that govern the concentration, motion, and dynamics of excitations and charge carriers in QD solids. Recent studies have shown pathways to passivate uncoordinated atoms at the QD surface that act to trap and scatter charge carriers. Surface atoms, ligands, and ions can serve as dopants to control the electron affinity of QD solids. Surface ligands and surrounding matrices control the barriers to electronic, excitonic, and thermal transport between QDs and between QDs and matrices. New ligand chemistries and matrix materials have been reported that provide freedom to control the dynamics of excitons and charge carriers and to design device interfaces. These advances in engineering the chemical and physical properties of the QD surface have been translated into recent achievements of high-mobility transistors and circuits, high-quantum-yield photodetectors and light-emitting devices, and high-efficiency photovoltaic devices. OUTLOOK The dominant role and dynamic nature of the QD surface, and the strong motive to build novel QD devices, will drive the exploration of new surface chemistries and matrix materials, processes for their assembly and integration with other materials in devices, and measurements and simulations with which to map the relationship between surface chemistry and materials and device properties. 
Challenges remain to achieve full control over the carrier type, concentration, and mobility in the QD channel and the barriers and traps at device interfaces that limit the gain and speed of QD electronics. Surface chemistries that allow for both long carrier lifetime and high carrier mobility and the freedom to engineer the bandgap and band alignment of QDs and other device layers are needed to exploit physics particular to QDs and to advance device architectures that contribute to improving the performance of QD optoelectronics. The importance of thermal transport in QD solids and their devices is an essential emerging topic that promises to become of greater importance as we develop QD devices.

930 citations


Journal ArticleDOI
19 Oct 2016-Neuron
TL;DR: A newly evolved variant of adeno-associated virus, rAAV2-retro, permits robust retrograde access to projection neurons with efficiency comparable to classical synthetic retrograde tracers and enables sufficient sensor/effector expression for functional circuit interrogation and in vivo genome editing in targeted neuronal populations.

925 citations


Journal ArticleDOI
TL;DR: Improvements in economics, resolution, and ease of use make CEL-Seq2 uniquely suited to single-cell RNA-seq analysis.
Abstract: Single-cell transcriptomics requires a method that is sensitive, accurate, and reproducible. Here, we present CEL-Seq2, a modified version of our CEL-Seq method, with threefold higher sensitivity, lower costs, and less hands-on time. We implemented CEL-Seq2 on Fluidigm’s C1 system, providing its first single-cell, on-chip barcoding method, and we detected gene expression changes accompanying the progression through the cell cycle in mouse fibroblast cells. We also compare with Smart-Seq to demonstrate CEL-Seq2’s increased sensitivity relative to other available methods. Collectively, the improvements make CEL-Seq2 uniquely suited to single-cell RNA-Seq analysis in terms of economics, resolution, and ease of use.
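Barcoding protocols like CEL-Seq2 quantify expression by counting unique molecular identifiers (UMIs) per cell and gene rather than raw reads, which collapses PCR duplicates. A minimal sketch of that counting step (the field names and barcodes are illustrative, not the published pipeline):

```python
from collections import defaultdict

def count_umis(reads):
    """Count unique UMIs per (cell_barcode, gene): PCR duplicates of the
    same molecule share a UMI and are counted only once."""
    molecules = defaultdict(set)
    for cell, umi, gene in reads:
        molecules[(cell, gene)].add(umi)
    return {key: len(umis) for key, umis in molecules.items()}

reads = [
    ("CELL1", "AAGT", "Ins1"),
    ("CELL1", "AAGT", "Ins1"),  # PCR duplicate: same UMI, counted once
    ("CELL1", "CGTA", "Ins1"),  # distinct molecule of the same gene
    ("CELL2", "AAGT", "Gcg"),
]
print(count_umis(reads))  # {('CELL1', 'Ins1'): 2, ('CELL2', 'Gcg'): 1}
```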

875 citations


Journal ArticleDOI
TL;DR: In this paper, the authors combined satellite-based estimates, chemical transport model simulations, and ground measurements from 79 different countries to produce global estimates of annual average fine particle (PM2.5) and ozone concentrations at 0.1° × 0. 1° spatial resolution for five-year intervals from 1990 to 2010 and the year 2013.
Abstract: Exposure to ambient air pollution is a major risk factor for global disease. Assessment of the impacts of air pollution on population health and evaluation of trends relative to other major risk factors requires regularly updated, accurate, spatially resolved exposure estimates. We combined satellite-based estimates, chemical transport model simulations, and ground measurements from 79 different countries to produce global estimates of annual average fine particle (PM2.5) and ozone concentrations at 0.1° × 0.1° spatial resolution for five-year intervals from 1990 to 2010 and the year 2013. These estimates were applied to assess population-weighted mean concentrations for 1990-2013 for each of 188 countries. In 2013, 87% of the world's population lived in areas exceeding the World Health Organization Air Quality Guideline of 10 μg/m(3) PM2.5 (annual average). Between 1990 and 2013, global population-weighted PM2.5 increased by 20.4% driven by trends in South Asia, Southeast Asia, and China. Decreases in population-weighted mean concentrations of PM2.5 were evident in most high income countries. Population-weighted mean concentrations of ozone increased globally by 8.9% from 1990-2013 with increases in most countries-except for modest decreases in North America, parts of Europe, and several countries in Southeast Asia.
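The two headline statistics here, the population-weighted mean concentration and the share of people living above the WHO guideline, reduce to simple weighted sums over grid cells. A toy sketch with invented numbers:

```python
import numpy as np

def population_weighted_mean(conc, pop):
    """sum(C_i * P_i) / sum(P_i) over grid cells."""
    conc, pop = np.asarray(conc, float), np.asarray(pop, float)
    return float((conc * pop).sum() / pop.sum())

def fraction_above(conc, pop, guideline=10.0):
    """Share of the population living in cells exceeding `guideline`."""
    conc, pop = np.asarray(conc, float), np.asarray(pop, float)
    return float(pop[conc > guideline].sum() / pop.sum())

# Three toy grid cells: annual-average PM2.5 (ug/m^3) and population counts
pm25 = [8.0, 15.0, 40.0]
pop = [1000, 5000, 500]
print(population_weighted_mean(pm25, pop))  # ~15.85 ug/m^3
print(fraction_above(pm25, pop))            # ~0.85 above the 10 ug/m^3 guideline
```

Weighting by population rather than area is what makes the metric track exposure: a clean, empty cell contributes little regardless of its size.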

Journal ArticleDOI
01 Sep 2016-Nature
TL;DR: It is demonstrated that a dynamical encircling of an exceptional point is analogous to the scattering through a two-mode waveguide with suitably designed boundaries and losses, and mode transitions are induced that transform this device into a robust and asymmetric switch between different waveguide modes.
Abstract: Physical systems with loss or gain have resonant modes that decay or grow exponentially with time. Whenever two such modes coalesce both in their resonant frequency and their rate of decay or growth, an 'exceptional point' occurs, giving rise to fascinating phenomena that defy our physical intuition. Particularly intriguing behaviour is predicted to appear when an exceptional point is encircled sufficiently slowly, such as a state-flip or the accumulation of a geometric phase. The topological structure of exceptional points has been experimentally explored, but a full dynamical encircling of such a point and the associated breakdown of adiabaticity have remained out of reach of measurement. Here we demonstrate that a dynamical encircling of an exceptional point is analogous to the scattering through a two-mode waveguide with suitably designed boundaries and losses. We present experimental results from a corresponding waveguide structure that steers incoming waves around an exceptional point during the transmission process. In this way, mode transitions are induced that transform this device into a robust and asymmetric switch between different waveguide modes. This work will enable the exploration of exceptional point physics in system control and state transfer schemes at the crossroads between fundamental research and practical applications.

Journal ArticleDOI
TL;DR: Treatment with rosuvastatin at a dose of 10 mg per day resulted in a significantly lower risk of cardiovascular events than placebo in an intermediate-risk, ethnically diverse population without cardiovascular disease.
Abstract: BackgroundPrevious trials have shown that the use of statins to lower cholesterol reduces the risk of cardiovascular events among persons without cardiovascular disease. Those trials have involved persons with elevated lipid levels or inflammatory markers and involved mainly white persons. It is unclear whether the benefits of statins can be extended to an intermediate-risk, ethnically diverse population without cardiovascular disease. MethodsIn one comparison from a 2-by-2 factorial trial, we randomly assigned 12,705 participants in 21 countries who did not have cardiovascular disease and were at intermediate risk to receive rosuvastatin at a dose of 10 mg per day or placebo. The first coprimary outcome was the composite of death from cardiovascular causes, nonfatal myocardial infarction, or nonfatal stroke, and the second coprimary outcome additionally included revascularization, heart failure, and resuscitated cardiac arrest. The median follow-up was 5.6 years. ResultsThe overall mean low-density lipop...

Journal ArticleDOI
TL;DR: The results support the use of isavuconazole for the primary treatment of patients with invasive mould disease; non-inferiority was demonstrated.

Journal ArticleDOI
Georges Aad1, Brad Abbott2, Jalal Abdallah3, Ovsat Abdinov4, Baptiste Abeloos5, Rosemarie Aben6, Ossama AbouZeid7, N. L. Abraham8, Halina Abramowicz9, Henso Abreu10, Ricardo Abreu11, Yiming Abulaiti12, Bobby Samir Acharya13, Bobby Samir Acharya14, Leszek Adamczyk15, David H. Adams16, Jahred Adelman17, Stefanie Adomeit18, Tim Adye19, A. A. Affolder20, Tatjana Agatonovic-Jovin21, Johannes Agricola22, Juan Antonio Aguilar-Saavedra23, Steven Ahlen24, Faig Ahmadov4, Faig Ahmadov25, Giulio Aielli26, Henrik Akerstedt12, T. P. A. Åkesson27, Andrei Akimov, Gian Luigi Alberghi28, Justin Albert29, S. Albrand30, M. J. Alconada Verzini31, Martin Aleksa32, Igor Aleksandrov25, Calin Alexa, Gideon Alexander9, Theodoros Alexopoulos33, Muhammad Alhroob2, Malik Aliev34, Gianluca Alimonti, John Alison35, Steven Patrick Alkire36, Bmm Allbrooke8, Benjamin William Allen11, Phillip Allport37, Alberto Aloisio38, Alejandro Alonso39, Francisco Alonso31, Cristiano Alpigiani40, Mahmoud Alstaty1, B. Alvarez Gonzalez32, D. Álvarez Piqueras41, Mariagrazia Alviggi38, Brian Thomas Amadio42, K. Amako, Y. Amaral Coutinho43, Christoph Amelung44, D. Amidei45, S. P. Amor Dos Santos46, António Amorim47, Simone Amoroso32, Glenn Amundsen44, Christos Anastopoulos48, Lucian Stefan Ancu49, Nansi Andari17, Timothy Andeen50, Christoph Falk Anders51, G. Anders32, John Kenneth Anders20, Kelby Anderson35, Attilio Andreazza52, Andrei51, Stylianos Angelidakis53, Ivan Angelozzi6, Philipp Anger54, Aaron Angerami36, Francis Anghinolfi32, Alexey Anisenkov55, Nuno Anjos56 
Aix-Marseille University1, University of Oklahoma2, University of Iowa3, Azerbaijan National Academy of Sciences4, Université Paris-Saclay5, University of Amsterdam6, University of California, Santa Cruz7, University of Sussex8, Tel Aviv University9, Technion – Israel Institute of Technology10, University of Oregon11, Stockholm University12, King's College London13, International Centre for Theoretical Physics14, AGH University of Science and Technology15, Brookhaven National Laboratory16, Northern Illinois University17, Ludwig Maximilian University of Munich18, Rutherford Appleton Laboratory19, University of Liverpool20, University of Belgrade21, University of Göttingen22, University of Granada23, Boston University24, Joint Institute for Nuclear Research25, University of Rome Tor Vergata26, Lund University27, University of Bologna28, University of Victoria29, University of Grenoble30, National University of La Plata31, CERN32, National Technical University of Athens33, University of Salento34, University of Chicago35, Columbia University36, University of Birmingham37, University of Naples Federico II38, University of Copenhagen39, University of Washington40, University of Valencia41, Lawrence Berkeley National Laboratory42, Federal University of Rio de Janeiro43, Brandeis University44, University of Michigan45, University of Coimbra46, University of Lisbon47, University of Sheffield48, University of Geneva49, University of Texas at Austin50, Heidelberg University51, University of Milan52, National and Kapodistrian University of Athens53, Dresden University of Technology54, Novosibirsk State University55, IFAE56
TL;DR: In this article, combined ATLAS and CMS measurements of the Higgs boson production and decay rates, as well as constraints on its couplings to vector bosons and fermions, are presented.
Abstract: Combined ATLAS and CMS measurements of the Higgs boson production and decay rates, as well as constraints on its couplings to vector bosons and fermions, are presented. The combination is based on the analysis of five production processes, namely gluon fusion, vector boson fusion, and associated production with a $W$ or a $Z$ boson or a pair of top quarks, and of the six decay modes $H \to ZZ, WW$, $\gamma\gamma, \tau\tau, bb$, and $\mu\mu$. All results are reported assuming a value of 125.09 GeV for the Higgs boson mass, the result of the combined measurement by the ATLAS and CMS experiments. The analysis uses the CERN LHC proton--proton collision data recorded by the ATLAS and CMS experiments in 2011 and 2012, corresponding to integrated luminosities per experiment of approximately 5 fb$^{-1}$ at $\sqrt{s}=7$ TeV and 20 fb$^{-1}$ at $\sqrt{s} = 8$ TeV. The Higgs boson production and decay rates measured by the two experiments are combined within the context of three generic parameterisations: two based on cross sections and branching fractions, and one on ratios of coupling modifiers. Several interpretations of the measurements with more model-dependent parameterisations are also given. The combined signal yield relative to the Standard Model prediction is measured to be 1.09 $\pm$ 0.11. The combined measurements lead to observed significances for the vector boson fusion production process and for the $H \to \tau\tau$ decay of $5.4$ and $5.5$ standard deviations, respectively. The data are consistent with the Standard Model predictions for all parameterisations considered.
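The real combination is a joint likelihood fit over all channels; as a much-simplified illustration, two Gaussian measurements of a signal strength can be combined by inverse-variance weighting. The per-experiment numbers below are invented, not the experiments' actual inputs.

```python
import math

def combine(measurements):
    """Inverse-variance weighted average of (value, sigma) pairs.
    A textbook simplification of the full likelihood combination."""
    weights = [1.0 / sigma ** 2 for _, sigma in measurements]
    total = sum(weights)
    mean = sum(w * value for w, (value, _) in zip(weights, measurements)) / total
    return mean, math.sqrt(1.0 / total)

# Hypothetical per-experiment signal strengths (illustrative only)
mu, sigma = combine([(1.18, 0.15), (1.00, 0.14)])
print(f"mu = {mu:.2f} +/- {sigma:.2f}")
```

Note that the combined uncertainty is always smaller than either input uncertainty, which is the point of combining the datasets.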

Journal ArticleDOI
19 Aug 2016
TL;DR: This review highlights and discusses the current technical and scientific involvement of microorganisms in enzyme production and their present status in the worldwide enzyme market.
Abstract: The biocatalytic potential of microorganisms has been employed for centuries to produce bread, wine, vinegar and other common products without understanding the biochemical basis of their ingredients. Microbial enzymes have gained interest for their widespread uses in industries and medicine owing to their stability, catalytic activity, and greater ease of production and optimization compared with plant and animal enzymes. The use of enzymes in various industries (e.g., food, agriculture, chemicals, and pharmaceuticals) is increasing rapidly due to reduced processing time, low energy input, cost effectiveness, and nontoxic and eco-friendly characteristics. Microbial enzymes are capable of degrading toxic chemical compounds of industrial and domestic wastes (phenolic compounds, nitriles, amines, etc.) either via degradation or conversion. In this review, we highlight and discuss the current technical and scientific involvement of microorganisms in enzyme production and their present status in the worldwide enzyme market.

01 Jan 2016
TL;DR: In this article, the authors combined satellite-based estimates, chemical transport model (CTM) simulations and ground measurements from 79 different countries to produce new global estimates of annual average fine particle (PM2.5) and ozone concentrations at 0.1° × 0. 1° spatial resolution for five-year intervals from 1990-2010 and the year 2013.
Abstract: Exposure to ambient air pollution is a major risk factor for global disease. Assessment of the impacts of air pollution on population health and the evaluation of trends relative to other major risk factors requires regularly updated, accurate, spatially resolved exposure estimates. We combined satellite-based estimates, chemical transport model (CTM) simulations and ground measurements from 79 different countries to produce new global estimates of annual average fine particle (PM2.5) and ozone concentrations at 0.1° × 0.1° spatial resolution for five-year intervals from 1990-2010 and the year 2013. These estimates were then applied to assess population-weighted mean concentrations for 1990 – 2013 for each of 188 countries. In 2013, 87% of the world’s population lived in areas exceeding the World Health Organization (WHO) Air Quality Guideline of 10 μg/m3 PM2.5 (annual average). Between 1990 and 2013, decreases in population-weighted mean concentrations of PM2.5 were evident in most high income countries, in contrast to increases estimated in South Asia, throughout much of Southeast Asia, and in China. Population-weighted mean concentrations of ozone increased in most countries from 1990 - 2013, with modest decreases in North America, parts of Europe, and several countries in Southeast Asia.


Journal ArticleDOI
01 Jan 2016-Science
TL;DR: Substantial barriers remain for the clinical application of selection-inverting treatment strategies, and the development of fast, genomic diagnostics that can identify not only the pathogen’s current resistance profile but also its future potential for evolution of resistance is required.
Abstract: Antibiotic treatment has two conflicting effects: the desired, immediate effect of inhibiting bacterial growth and the undesired, long-term effect of promoting the evolution of resistance. Although these contrasting outcomes seem inextricably linked, recent work has revealed several ways by which antibiotics can be combined to inhibit bacterial growth while, counterintuitively, selecting against resistant mutants. Decoupling treatment efficacy from the risk of resistance can be achieved by exploiting specific interactions between drugs, and the ways in which resistance mutations to a given drug can modulate these interactions or increase the sensitivity of the bacteria to other compounds. Although their practical application requires much further development and validation, and relies on advances in genomic diagnostics, these discoveries suggest novel paradigms that may restrict or even reverse the evolution of resistance.

Journal ArticleDOI
TL;DR: Therapy with candesartan at a dose of 16 mg per day plus hydrochlorothiazide at a dose of 12.5 mg per day was not associated with a lower rate of major cardiovascular events than placebo among persons at intermediate risk who did not have cardiovascular disease.
Abstract: Background Antihypertensive therapy reduces the risk of cardiovascular events among high-risk persons and among those with a systolic blood pressure of 160 mm Hg or higher, but its role in persons at intermediate risk and with lower blood pressure is unclear. Methods In one comparison from a 2-by-2 factorial trial, we randomly assigned 12,705 participants at intermediate risk who did not have cardiovascular disease to receive either candesartan at a dose of 16 mg per day plus hydrochlorothiazide at a dose of 12.5 mg per day or placebo. The first coprimary outcome was the composite of death from cardiovascular causes, nonfatal myocardial infarction, or nonfatal stroke; the second coprimary outcome additionally included resuscitated cardiac arrest, heart failure, and revascularization. The median follow-up was 5.6 years. Results The mean blood pressure of the participants at baseline was 138.1/81.9 mm Hg; the decrease in blood pressure was 6.0/3.0 mm Hg greater in the active-treatment group than in the placebo...

Journal ArticleDOI
TL;DR: Vacuum Rabi splitting, a manifestation of strong coupling, is demonstrated using silver bowtie plasmonic cavities loaded with semiconductor quantum dots (QDs).
Abstract: Strong coupling at the limit of a single quantum emitter had not previously been reported. Here, Santhosh et al. show that a transparency dip is observed in the scattering spectra of individual silver bowties with one to a few quantum dots, placing the plasmonic bowtie-quantum dot constructs close to the strong coupling regime.

Journal ArticleDOI
TL;DR: Machine learning combining clinical and CCTA data was found to predict 5-year all-cause mortality significantly better than existing clinical or CCTA metrics alone.
Abstract: Aims Traditional prognostic risk assessment in patients undergoing non-invasive imaging is based upon a limited selection of clinical and imaging findings. Machine learning (ML) can consider a greater number and complexity of variables. Therefore, we investigated the feasibility and accuracy of ML to predict 5-year all-cause mortality (ACM) in patients undergoing coronary computed tomographic angiography (CCTA), and compared the performance to existing clinical or CCTA metrics. Methods and results The analysis included 10,030 patients with suspected coronary artery disease and 5-year follow-up from the COronary CT Angiography EvaluatioN For Clinical Outcomes: An InteRnational Multicenter (CONFIRM) registry. All patients underwent CCTA as their standard of care. Twenty-five clinical and 44 CCTA parameters were evaluated, including segment stenosis score (SSS), segment involvement score (SIS), modified Duke index (DI), number of segments with non-calcified, mixed or calcified plaques, age, sex, standard cardiovascular risk factors, and Framingham risk score (FRS). Machine learning involved automated feature selection by information gain ranking, model building with a boosted ensemble algorithm, and 10-fold stratified cross-validation. Seven hundred and forty-five patients died during 5-year follow-up. Machine learning exhibited a higher area under the curve compared with the FRS or CCTA severity scores alone (SSS, SIS, DI) for predicting all-cause mortality (ML: 0.79 vs. FRS: 0.61, SSS: 0.64, SIS: 0.64, DI: 0.62; P < 0.001). Conclusions Machine learning combining clinical and CCTA data was found to predict 5-year ACM significantly better than existing clinical or CCTA metrics alone.
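The first stage of the ML pipeline described here, automated feature selection by information gain ranking, can be sketched in a few lines. The entropy-based scoring below is the standard definition; the feature names and toy data are hypothetical and unrelated to the registry's actual variables:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H(Y) of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """IG(Y; X) = H(Y) - sum_x p(x) * H(Y | X = x), for a discrete feature."""
    n = len(labels)
    cond = 0.0
    for value in set(feature):
        subset = [y for x, y in zip(feature, labels) if x == value]
        cond += len(subset) / n * entropy(subset)
    return entropy(labels) - cond

# Toy binary outcome (e.g. 5-year mortality) and two hypothetical features:
# one perfectly informative, one pure noise.
labels        = [0, 0, 0, 0, 1, 1, 1, 1]
stenosis_high = [0, 0, 0, 0, 1, 1, 1, 1]  # tracks the outcome exactly
coin_flip     = [0, 1, 0, 1, 0, 1, 0, 1]  # carries no information

ranked = sorted(
    {"stenosis_high": stenosis_high, "coin_flip": coin_flip}.items(),
    key=lambda kv: information_gain(kv[1], labels),
    reverse=True,
)
# ranked[0][0] == "stenosis_high" (IG = 1.0 bit vs. 0.0 for the noise feature)
```

In practice the top-ranked features would then feed a boosted ensemble evaluated with stratified 10-fold cross-validation, as the abstract outlines; this sketch only covers the ranking step.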

Journal ArticleDOI
TL;DR: This review article examined the recent crystal structures of ABC proteins to depict the functionally important structural elements, such as domains, conserved motifs, and critical amino acids that are involved in ATP-binding and drug efflux.

Journal ArticleDOI
Georges Aad1, Brad Abbott2, Jalal Abdallah3, Ovsat Abdinov4  +2828 moreInstitutions (191)
TL;DR: In this article, the performance of the ATLAS muon identification and reconstruction using the first LHC dataset recorded at √s = 13 TeV in 2015 was evaluated and compared to Monte Carlo simulations.
Abstract: This article documents the performance of the ATLAS muon identification and reconstruction using the first LHC dataset recorded at √s = 13 TeV in 2015. Using a large sample of J/ψ→μμ and Z→μμ decays from 3.2 fb−1 of pp collision data, measurements of the reconstruction efficiency, as well as of the momentum scale and resolution, are presented and compared to Monte Carlo simulations. The reconstruction efficiency is measured to be close to 99% over most of the covered phase space (|η| < 2.5). For |η| > 2.2, the pT resolution for muons from Z→μμ decays is 2.9%, while the precision of the momentum scale for low-pT muons from J/ψ→μμ decays is about 0.2%.

Journal ArticleDOI
09 Sep 2016-Science
TL;DR: The MEGA-plate provides a versatile platform for studying microbial adaptation and directly visualizing evolutionary dynamics, and it is found that evolution is not always led by the most resistant mutants; highly resistant mutants may be trapped behind more sensitive lineages.
Abstract: A key aspect of bacterial survival is the ability to evolve while migrating across spatially varying environmental challenges. Laboratory experiments, however, often study evolution in well-mixed systems. Here, we introduce an experimental device, the microbial evolution and growth arena (MEGA)–plate, in which bacteria spread and evolved on a large antibiotic landscape (120 × 60 centimeters) that allowed visual observation of mutation and selection in a migrating bacterial front. While resistance increased consistently, multiple coexisting lineages diversified both phenotypically and genotypically. Analyzing mutants at and behind the propagating front, we found that evolution is not always led by the most resistant mutants; highly resistant mutants may be trapped behind more sensitive lineages. The MEGA-plate provides a versatile platform for studying microbial adaptation and directly visualizing evolutionary dynamics.

Journal ArticleDOI
TL;DR: In this paper, the entanglement between pairs of particles inside and outside an analogue black hole is studied, offering tantalizing insights into the field of black hole thermodynamics.
Abstract: Hawking radiation is observed emanating from an analogue black hole, with measurements of the entanglement between the pairs of particles inside and outside the hole offering tantalizing insights into the field of black hole thermodynamics.

Journal ArticleDOI
TL;DR: This contribution investigates a new paradigm from machine learning, namely deep learning, by examining design configurations of deep convolutional neural networks and the impact of different hyper-parameter settings on the accuracy of defect detection results.

Journal ArticleDOI
TL;DR: The aim is to demonstrate that, using a double bond as a chemical handle, metal-assisted long-distance activation could be used as a powerful synthetic strategy, leading to a selective reaction at a position distal to the initial double bond.
Abstract: Exploiting the reactivity of one functional group within a molecule to generate a reaction at a different position is an ongoing challenge in organic synthesis. Effective remote functionalization protocols have the potential to provide access to almost any derivatives but are difficult to achieve. The difficulty is more pronounced for acyclic systems where flexible alkyl chains are present between the initiating functional group and the desired reactive centres. In this Review, we discuss the concept of remote functionalization of alkenes using metal complexes, leading to a selective reaction at a position distal to the initial double bond. We aim to show the vast opportunity provided by this growing field through selected and representative examples. Our aim is to demonstrate that using a double bond as a chemical handle, metal-assisted long-distance activation could be used as a powerful synthetic strategy.

Journal ArticleDOI
03 Jun 2016-Science
TL;DR: The alliance between the shared-aperture concepts and the geometric phase phenomenon arising from spin-orbit interaction provides a route to implement photonic spin-control multifunctional metasurfaces to improve functionality in photonics.
Abstract: The shared-aperture phased antenna array developed in the field of radar applications is a promising approach for increased functionality in photonics. The alliance between the shared-aperture concepts and the geometric phase phenomenon arising from spin-orbit interaction provides a route to implement photonic spin-control multifunctional metasurfaces. We adopted a thinning technique within the shared-aperture synthesis and investigated interleaved sparse nanoantenna matrices and the spin-enabled asymmetric harmonic response to achieve helicity-controlled multiple structured wavefronts such as vortex beams carrying orbital angular momentum. We used multiplexed geometric phase profiles to simultaneously measure spectrum characteristics and the polarization state of light, enabling integrated on-chip spectropolarimetric analysis. The shared-aperture metasurface platform opens a pathway to novel types of nanophotonic functionality.

Book ChapterDOI
22 Feb 2016
TL;DR: This work studies decentralized cryptocurrency protocols in which the participants do not deplete physical scarce resources, and presents novel pure Proof of Stake protocols, arguing that they help mitigate problems that the existing protocols exhibit.
Abstract: We study decentralized cryptocurrency protocols in which the participants do not deplete physical scarce resources. Such protocols commonly rely on Proof of Stake, i.e., on mechanisms that extend voting power to the stakeholders of the system. We offer an analysis of existing protocols that enjoy substantial popularity. We then present our novel pure Proof of Stake protocols, and argue that they help in mitigating problems that the existing protocols exhibit.
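The core mechanism that "extends voting power to the stakeholders" can be illustrated with a toy stake-proportional leader election. This sketch is not any of the protocols analyzed in the chapter; it only shows the basic idea, and the hash-based draw ignores real-world subtleties such as grinding attacks and modulo bias:

```python
import hashlib

def select_leader(stakes, seed):
    """Pick a block proposer with probability proportional to stake,
    using a deterministic pseudo-random draw derived from a shared seed.
    `stakes` maps holder name -> stake (positive integers)."""
    total = sum(stakes.values())
    # Map the seed deterministically to a point in [0, total).
    digest = hashlib.sha256(seed.encode()).digest()
    point = int.from_bytes(digest, "big") % total
    # Walk the (sorted, hence deterministic) stake intervals until the
    # point falls inside one; larger stakes cover larger intervals.
    cumulative = 0
    for holder, stake in sorted(stakes.items()):
        cumulative += stake
        if point < cumulative:
            return holder
    raise RuntimeError("unreachable: point < total by construction")

# Hypothetical stake distribution for illustration.
stakes = {"alice": 60, "bob": 30, "carol": 10}
leader = select_leader(stakes, "round-1")
```

Every node running this with the same seed elects the same leader, and over many rounds each stakeholder is chosen in rough proportion to its stake, which is the "voting power" property the abstract refers to.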