Institution
Technion – Israel Institute of Technology
Education • Haifa, Israel
About: Technion – Israel Institute of Technology is an education organization based in Haifa, Israel. It is known for research contributions in the topics of Population and Nonlinear systems. The organization has 31,714 authors who have published 79,377 publications receiving 2,603,976 citations. The organization is also known as: Technion Israel Institute of Technology and Ṭekhniyon, Makhon ṭekhnologi le-Yiśraʼel.
Papers published on a yearly basis
Papers
TL;DR: A statistical framework for discovering enriched sequence motifs in ranked lists of DNA sequences is implemented in a software application termed DRIM (discovery of rank imbalanced motifs); the framework proves highly effective for identifying regulatory sequence elements in a variety of applications.
Abstract: Computational methods for discovery of sequence elements that are enriched in a target set compared with a background set are fundamental in molecular biology research. One example is the discovery of transcription factor binding motifs that are inferred from ChIP–chip (chromatin immuno-precipitation on a microarray) measurements. Several major challenges in sequence motif discovery still require consideration: (i) the need for a principled approach to partitioning the data into target and background sets; (ii) the lack of rigorous models and of an exact p-value for measuring motif enrichment; (iii) the need for an appropriate framework for accounting for motif multiplicity; (iv) the tendency, in many of the existing methods, to report presumably significant motifs even when applied to randomly generated data. In this paper we present a statistical framework for discovering enriched sequence elements in ranked lists that resolves these four issues. We demonstrate the implementation of this framework in a software application, termed DRIM (discovery of rank imbalanced motifs), which identifies sequence motifs in lists of ranked DNA sequences. We applied DRIM to ChIP–chip and CpG methylation data and obtained the following results. (i) Identification of 50 novel putative transcription factor (TF) binding sites in yeast ChIP–chip data. The biological function of some of them was further investigated to gain new insights on transcription regulation networks in yeast. For example, our discoveries enable the elucidation of the network of the TF ARO80. Another finding concerns a systematic TF binding enhancement to sequences containing CA repeats. (ii) Discovery of novel motifs in human cancer CpG methylation data. Remarkably, most of these motifs are similar to DNA sequence elements bound by the Polycomb complex that promotes histone methylation. Our findings thus support a model in which histone methylation and CpG methylation are mechanistically linked. 
Overall, we demonstrate that the statistical framework embodied in the DRIM software tool is highly effective for identifying regulatory sequence elements in a variety of applications ranging from expression and ChIP–chip to CpG methylation data. DRIM is publicly available at http://bioinfo.cs.technion.ac.il/drim.
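The core idea behind enrichment in ranked lists — scanning every prefix cutoff of the ranked sequence list and scoring how surprisingly many motif occurrences fall above it — can be sketched with a minimum-hypergeometric-style score. This is a simplified illustration, not DRIM itself: the function names are hypothetical, and DRIM additionally computes an exact p-value for the minimized statistic rather than reporting the raw minimum.

```python
from math import comb

def hypergeom_tail(b, N, B, n):
    """P(X >= b) for X ~ Hypergeometric(N, B, n): the chance of seeing at
    least b motif-containing sequences in a random size-n prefix."""
    denom = comb(N, n)
    return sum(comb(B, k) * comb(N - B, n - k)
               for k in range(b, min(B, n) + 1)) / denom

def min_hypergeom_score(occurrences):
    """occurrences: 0/1 list over sequences ranked from most to least
    relevant (1 = motif present). Scans all prefix cutoffs and returns
    the smallest hypergeometric tail probability — a data-driven split
    into 'target' (prefix) and 'background' (rest), with no fixed cutoff."""
    N, B = len(occurrences), sum(occurrences)
    best, b = 1.0, 0
    for n, hit in enumerate(occurrences, start=1):
        b += hit
        best = min(best, hypergeom_tail(b, N, B, n))
    return best
```

Because the minimum is taken over all cutoffs, the raw score is optimistically biased; an exact or corrected p-value for the minimized statistic is what makes such a framework avoid reporting "significant" motifs on random data.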
687 citations
TL;DR: The rationale for developing reconfigurable manufacturing systems (RMS), which possess the advantages of both dedicated lines and flexible systems, is explained in this article, and a rigorous mathematical method is introduced for designing RMS with the recommended structure.
686 citations
TL;DR: Shadow arrays are introduced that keep track of the incremental changes to the synaptic weights during a single pass of back-propagation learning; the synapses are then ordered by decreasing sensitivity so that the network can be efficiently pruned by discarding the last items of the sorted list.
Abstract: The sensitivity of the global error (cost) function to the inclusion/exclusion of each synapse in the artificial neural network is estimated. Introduced are shadow arrays which keep track of the incremental changes to the synaptic weights during a single pass of back-propagating learning. The synapses are then ordered by decreasing sensitivity numbers so that the network can be efficiently pruned by discarding the last items of the sorted list. Unlike previous approaches, this simple procedure does not require a modification of the cost function, does not interfere with the learning process, and demands a negligible computational overhead.
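The shadow-array idea — accumulating a per-weight sensitivity estimate as a side effect of ordinary training, then pruning the least sensitive weights — can be sketched as follows. This is a simplified toy version under assumed plain gradient descent, using a first-order |g·Δw| accumulator rather than the paper's exact sensitivity formula; all function names are illustrative.

```python
def train_with_shadow(weights, grads_per_step, lr=0.1):
    """Alongside each weight, keep a 'shadow' accumulator that sums the
    estimated loss change |g * delta_w| contributed by every update.
    grads_per_step: one gradient vector per training step.
    Returns final weights and per-weight sensitivity estimates."""
    shadow = [0.0] * len(weights)
    for grads in grads_per_step:
        for i, g in enumerate(grads):
            delta = -lr * g              # plain gradient-descent update
            shadow[i] += abs(g * delta)  # first-order sensitivity contribution
            weights[i] += delta
    return weights, shadow

def prune(weights, shadow, keep):
    """Zero out all but the `keep` weights with the largest accumulated
    sensitivity — i.e. discard the tail of the sorted list."""
    order = sorted(range(len(weights)), key=lambda i: shadow[i], reverse=True)
    kept = set(order[:keep])
    return [w if i in kept else 0.0 for i, w in enumerate(weights)]
```

As in the abstract, the bookkeeping piggybacks on the existing learning pass: no change to the cost function, no interference with the updates themselves, and only one extra multiply-accumulate per weight per step.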
684 citations
TL;DR: In this paper, the results of the four LEP experiments were combined to determine fundamental properties of the W boson and the electroweak theory, including the branching fraction of W and the trilinear gauge-boson self-couplings.
684 citations
04 Oct 2019
TL;DR: In this article, the authors consider a (possibly probabilistic) Turing machine M that a group of players wish to run jointly on their private inputs: all players agree on a single string y, selected with the right probability distribution, as M's output, while keeping the maximum possible privacy about their inputs.
Abstract: The players want to correctly run a given Turing machine M on their inputs x_i while keeping the maximum possible privacy about them. That is, they want to compute y = M(x_1, ..., x_n) without revealing more about the x_i's than is already contained in the value y itself. For instance, if M computes the sum of the x_i's, no single player should be able to learn more than the sum of the inputs of the other parties. Here M may very well be a probabilistic Turing machine. In this case, all players want to agree on a single string y, selected with the right probability distribution, as M's output.
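The sum example from the abstract — every player learns the sum of the inputs and nothing more — can be illustrated with additive secret sharing. This is only a sketch of that one special case, not the paper's general protocol for arbitrary machines; it assumes honest-but-curious players and private pairwise channels, and the modulus is an arbitrary choice.

```python
import random

PRIME = 2**61 - 1  # assumed field size; any modulus larger than the sum works

def share(secret, n):
    """Split `secret` into n additive shares summing to it mod PRIME.
    Any n-1 of the shares are uniformly random and reveal nothing."""
    shares = [random.randrange(PRIME) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def secure_sum(inputs):
    """Each player i splits x_i into shares and sends one share to every
    player; each player publishes the sum of the shares it holds. The
    grand total mod PRIME equals sum(x_i), and nothing beyond the sum
    is revealed to any single player."""
    n = len(inputs)
    all_shares = [share(x, n) for x in inputs]
    # column j = the shares held by player j
    published = [sum(all_shares[i][j] for i in range(n)) % PRIME
                 for j in range(n)]
    return sum(published) % PRIME
```

Each published value is a sum of uniformly random shares, so individually it carries no information about any single x_i; only the combination of all n published values determines the sum.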
682 citations
Authors
Showing all 31937 results
| Name | H-index | Papers | Citations |
|---|---|---|---|
| Robert Langer | 281 | 2324 | 326306 |
| Nicholas G. Martin | 192 | 1770 | 161952 |
| Tobin J. Marks | 159 | 1621 | 111604 |
| Grant W. Montgomery | 157 | 926 | 108118 |
| David Eisenberg | 156 | 697 | 112460 |
| David J. Mooney | 156 | 695 | 94172 |
| Dirk Inzé | 149 | 647 | 74468 |
| Jerrold M. Olefsky | 143 | 595 | 77356 |
| Joseph J.Y. Sung | 142 | 1240 | 92035 |
| Deborah Estrin | 135 | 562 | 106177 |
| Bruce Yabsley | 133 | 1191 | 84889 |
| Jerry W. Shay | 133 | 639 | 74774 |
| Richard N. Bergman | 130 | 477 | 91718 |
| Shlomit Tarem | 129 | 1306 | 86919 |
| Allen Mincer | 129 | 1040 | 80059 |