Institution

Technion – Israel Institute of Technology

Education · Haifa, Israel

About: Technion – Israel Institute of Technology is an education organization based in Haifa, Israel. It is known for research contributions in the topics: Population and Nonlinear system. The organization has 31,714 authors who have published 79,377 publications, receiving 2,603,976 citations. The organization is also known as: Technion Israel Institute of Technology and Ṭekhniyon, Makhon ṭekhnologi le-Yiśraʼel.


Papers
Journal ArticleDOI
TL;DR: These findings show that human ES cells have great potential to become an unlimited cell source for neurons in culture, and may then be used in transplantation therapies for neural pathologies.

466 citations

Journal ArticleDOI
15 Apr 2005-Polymer
TL;DR: In this paper, the authors describe the results of an experimental investigation and modeling of multiple jets during the electrospinning of polymer solutions, and demonstrate how external electric fields and the mutual electrical interaction of multiple charged jets influence their paths and evolution during electrospinning.

466 citations

Journal ArticleDOI
TL;DR: In this paper, an experimental study is presented to evaluate the effectiveness of micro-surface structure, produced by laser texturing, in improving the tribological properties of reciprocating automotive components, including piston rings and cylinder liners.
Abstract: An experimental study is presented to evaluate the effectiveness of micro-surface structure, produced by laser texturing, in improving the tribological properties of reciprocating automotive components. The test rig and test specimens are described and some test results are presented. Good correlation is found with the theoretical prediction of friction reduction on a simple, yet representative, test specimen. The potential benefit of laser surface texturing under conditions of lubricant starvation is also presented. Finally, friction reduction with actual production piston rings and cylinder liner segments is demonstrated. Presented at the 57th Annual Meeting in Houston, Texas, May 19–23, 2002.

465 citations

Proceedings ArticleDOI
13 Oct 2003
TL;DR: This work exploits a recently proposed approximation technique, locality-sensitive hashing (LSH), to reduce the computational complexity of adaptive mean shift; in this implementation of LSH, the optimal parameters of the data structure are determined by a pilot learning procedure, and the partitions are data driven.
Abstract: Feature space analysis is the main module in many computer vision tasks. The most popular technique, k-means clustering, however, has two inherent limitations: the clusters are constrained to be spherically symmetric and their number has to be known a priori. In nonparametric clustering methods, like the one based on mean shift, these limitations are eliminated but the amount of computation becomes prohibitively large as the dimension of the space increases. We exploit a recently proposed approximation technique, locality-sensitive hashing (LSH), to reduce the computational complexity of adaptive mean shift. In our implementation of LSH the optimal parameters of the data structure are determined by a pilot learning procedure, and the partitions are data driven. As an application, the performance of mode-based and k-means-based textons is compared in a texture classification study.

465 citations
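
A minimal sketch of the idea in the abstract above: locality-sensitive hashing replaces the exhaustive neighbor scan inside each mean-shift iteration. This is an illustrative reimplementation assuming a standard random-projection LSH, not the authors' code; the function names, table counts, bucket width, and bandwidth are hypothetical placeholders, not the values the paper's pilot learning procedure would select.

import numpy as np

def build_lsh(points, num_tables=4, num_projections=8, bucket_width=1.0, seed=0):
    # Random-projection LSH: each table hashes a point to a tuple of
    # floor((p . a + b) / w) values, one per projection.
    rng = np.random.default_rng(seed)
    dim = points.shape[1]
    tables = []
    for _ in range(num_tables):
        A = rng.normal(size=(num_projections, dim))
        b = rng.uniform(0, bucket_width, size=num_projections)
        buckets = {}
        keys = np.floor((points @ A.T + b) / bucket_width).astype(int)
        for i, key in enumerate(map(tuple, keys)):
            buckets.setdefault(key, []).append(i)
        tables.append((A, b, buckets))
    return tables

def lsh_neighbors(q, tables, bucket_width=1.0):
    # Union of the buckets q falls into across all tables: an
    # approximate neighborhood instead of a scan over every point.
    cand = set()
    for A, b, buckets in tables:
        key = tuple(np.floor((A @ q + b) / bucket_width).astype(int))
        cand.update(buckets.get(key, []))
    return np.fromiter(cand, dtype=int)

def mean_shift_mode(q, points, tables, bandwidth=1.0, iters=20):
    # Mean shift with a flat kernel: move q to the mean of its
    # LSH-approximated neighbors within the bandwidth until convergence.
    for _ in range(iters):
        idx = lsh_neighbors(q, tables)
        if idx.size == 0:
            break
        nbrs = points[idx]
        nbrs = nbrs[np.linalg.norm(nbrs - q, axis=1) <= bandwidth]
        if len(nbrs) == 0:
            break
        new_q = nbrs.mean(axis=0)
        if np.linalg.norm(new_q - q) < 1e-6:
            break
        q = new_q
    return q

Each query point is shifted until it converges to a mode of the density; points converging to the same mode form one cluster, which is what removes the spherical-cluster and known-k limitations of k-means.
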

Journal ArticleDOI
01 Apr 1997
TL;DR: It is constructively proved that NARX networks with a finite number of parameters are computationally as strong as fully connected recurrent networks, and thus Turing machines; this raises the issue of what amount of feedback or recurrence is necessary for a network to be Turing equivalent, and what restrictions on feedback limit computational power.
Abstract: Recently, fully connected recurrent neural networks have been proven to be computationally rich: at least as powerful as Turing machines. This work focuses on another network which is popular in control applications and has been found to be very effective at learning a variety of problems. These networks are based upon Nonlinear AutoRegressive models with eXogenous Inputs (NARX models), and are therefore called NARX networks. As opposed to other recurrent networks, NARX networks have a limited feedback which comes only from the output neuron rather than from hidden states. They are formalized by y(t) = Ψ(u(t−n_u), ..., u(t−1), u(t), y(t−n_y), ..., y(t−1)), where u(t) and y(t) represent the input and output of the network at time t, n_u and n_y are the input and output orders, and the function Ψ is the mapping performed by a multilayer perceptron. We constructively prove that NARX networks with a finite number of parameters are computationally as strong as fully connected recurrent networks and thus Turing machines. We conclude that in theory one can use NARX models rather than conventional recurrent networks without any computational loss, even though their feedback is limited. Furthermore, these results raise the issue of what amount of feedback or recurrence is necessary for any network to be Turing equivalent and what restrictions on feedback limit computational power.

462 citations
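
A minimal sketch of the NARX recurrence from the abstract above, with Ψ realized as a one-hidden-layer perceptron. The structure follows the formula y(t) = Ψ(u(t−n_u), ..., u(t), y(t−n_y), ..., y(t−1)); the class name, layer sizes, and random weights are illustrative placeholders, since the paper concerns expressive power rather than a particular training setup.

import numpy as np

class NARXNetwork:
    # y(t) = psi(u(t-n_u), ..., u(t-1), u(t), y(t-n_y), ..., y(t-1)),
    # where psi is a one-hidden-layer perceptron. Note the feedback is
    # taken only from the output, never from the hidden states.
    def __init__(self, n_u=2, n_y=2, hidden=8, seed=0):
        rng = np.random.default_rng(seed)
        self.n_u, self.n_y = n_u, n_y
        in_dim = (n_u + 1) + n_y          # current + n_u past inputs, n_y past outputs
        self.W1 = rng.normal(scale=0.5, size=(hidden, in_dim))
        self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(scale=0.5, size=hidden)
        self.b2 = 0.0

    def run(self, u):
        # Roll the tapped delay lines forward over an input sequence u.
        u_taps = np.zeros(self.n_u + 1)   # u(t-n_u) ... u(t)
        y_taps = np.zeros(self.n_y)       # y(t-n_y) ... y(t-1)
        outputs = []
        for u_t in u:
            u_taps = np.append(u_taps[1:], u_t)
            x = np.concatenate([u_taps, y_taps])
            h = np.tanh(self.W1 @ x + self.b1)
            y_t = self.W2 @ h + self.b2   # psi applied to the taps
            y_taps = np.append(y_taps[1:], y_t)
            outputs.append(y_t)
        return np.array(outputs)

net = NARXNetwork()
print(net.run(np.sin(np.linspace(0, 3, 10))))
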


Authors


Name                  H-index   Papers   Citations
Robert Langer         281       2324     326306
Nicholas G. Martin    192       1770     161952
Tobin J. Marks        159       1621     111604
Grant W. Montgomery   157       926      108118
David Eisenberg       156       697      112460
David J. Mooney       156       695      94172
Dirk Inzé             149       647      74468
Jerrold M. Olefsky    143       595      77356
Joseph J.Y. Sung      142       1240     92035
Deborah Estrin        135       562      106177
Bruce Yabsley         133       1191     84889
Jerry W. Shay         133       639      74774
Richard N. Bergman    130       477      91718
Shlomit Tarem         129       1306     86919
Allen Mincer          129       1040     80059
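
For reference, the H-index column above is the largest h such that the author has h papers with at least h citations each. A minimal sketch of that computation (the function name and sample data are illustrative):

def h_index(citations):
    # Largest h such that h papers have at least h citations each.
    cites = sorted(citations, reverse=True)
    h = 0
    while h < len(cites) and cites[h] >= h + 1:
        h += 1
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers with >= 4 citations each
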
Network Information

Related Institutions (5)
Imperial College London: 209.1K papers, 9.3M citations (93% related)
Massachusetts Institute of Technology: 268K papers, 18.2M citations (92% related)
University of Illinois at Urbana–Champaign: 225.1K papers, 10.1M citations (92% related)
Stanford University: 320.3K papers, 21.8M citations (92% related)
University of Toronto: 294.9K papers, 13.5M citations (92% related)

Performance Metrics

Number of papers from the institution in previous years:

Year   Papers
2023   147
2022   390
2021   3,397
2020   3,526
2019   3,273
2018   3,131