Institution

Technion – Israel Institute of Technology

Education · Haifa, Israel

About: Technion – Israel Institute of Technology is an education organization based in Haifa, Israel. It is known for research contributions in the topics: Population and Upper and lower bounds. The organization has 31,714 authors who have published 79,377 publications receiving 2,603,976 citations. The organization is also known as: Technion Israel Institute of Technology and Ṭekhniyon, Makhon ṭekhnologi le-Yiśraʼel.


Papers
Journal ArticleDOI
TL;DR: In this article, a real-world dataset of positive severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) test results after inoculation with the BNT162b2 messenger RNA vaccine was analyzed, and the authors found that the viral load was substantially reduced for infections occurring 12-37 days after the first dose of vaccine.
Abstract: Beyond their substantial protection of individual vaccinees, coronavirus disease 2019 (COVID-19) vaccines might reduce viral load in breakthrough infection and thereby further suppress onward transmission. In this analysis of a real-world dataset of positive severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) test results after inoculation with the BNT162b2 messenger RNA vaccine, we found that the viral load was substantially reduced for infections occurring 12–37 d after the first dose of vaccine. These reduced viral loads hint at a potentially lower infectiousness, further contributing to vaccine effect on virus spread. Breakthrough infections of SARS-CoV-2 occurring 12 or more days after the first dose of the BNT162b2 mRNA vaccine were associated with lower viral loads than those found in unvaccinated individuals, suggesting that the vaccine might reduce infectiousness.
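The comparison underlying this result is between cycle-threshold (Ct) values of RT-qPCR tests, where a higher Ct indicates a lower viral load. Below is a minimal sketch of such a group comparison on synthetic data with an assumed effect size; it is illustrative only, not the paper's actual pipeline or numbers.

```python
# Illustrative only: synthetic Ct values; a higher Ct means a lower viral load.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ct_unvaccinated = rng.normal(25.0, 4.0, size=500)  # assumed control group
ct_breakthrough = rng.normal(27.5, 4.0, size=200)  # infections >= 12 d after dose 1

t, p = stats.ttest_ind(ct_breakthrough, ct_unvaccinated, equal_var=False)
print(f"mean Ct shift: {ct_breakthrough.mean() - ct_unvaccinated.mean():.2f} cycles (p = {p:.2g})")
```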

373 citations

Proceedings Article
01 May 2017
TL;DR: In this paper, a "random walk on a random landscape" statistical model, known to exhibit similar "ultra-slow" diffusion behavior, is proposed to explain the origin of the "generalization gap" in large-batch training of deep learning models, and adapting the training regime is shown to eliminate the gap.
Abstract: Background: Deep learning models are typically trained using stochastic gradient descent or one of its variants. These methods update the weights using their gradient, estimated from a small fraction of the training data. It has been observed that when using large batch sizes there is a persistent degradation in generalization performance - known as the "generalization gap" phenomenon. Identifying the origin of this gap and closing it had remained an open problem. Contributions: We examine the initial high learning rate training phase. We find that the weight distance from its initialization grows logarithmically with the number of weight updates. We therefore propose a "random walk on a random landscape" statistical model which is known to exhibit similar "ultra-slow" diffusion behavior. Following this hypothesis we conducted experiments to show empirically that the "generalization gap" stems from the relatively small number of updates rather than the batch size, and can be completely eliminated by adapting the training regime used. We further investigate different techniques to train models in the large-batch regime and present a novel algorithm named "Ghost Batch Normalization" which enables significant decrease in the generalization gap without increasing the number of updates. To validate our findings we conduct several additional experiments on MNIST, CIFAR-10, CIFAR-100 and ImageNet. Finally, we reassess common practices and beliefs concerning training of deep models and suggest they may not be optimal to achieve good generalization.
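A minimal sketch of the "Ghost Batch Normalization" idea described above: a large batch is split into small virtual ("ghost") batches, and normalization statistics are computed per ghost batch rather than over the full large batch. This is an illustrative PyTorch implementation under assumed defaults, not the authors' code.

```python
import torch
import torch.nn as nn

class GhostBatchNorm(nn.Module):
    """Normalize each small 'ghost' chunk of a large batch with its own
    batch statistics, mimicking small-batch noise inside a large batch."""
    def __init__(self, num_features: int, ghost_batch_size: int = 32):
        super().__init__()
        self.ghost_batch_size = ghost_batch_size
        self.bn = nn.BatchNorm1d(num_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            # Split the large batch and normalize each chunk independently.
            chunks = x.split(self.ghost_batch_size, dim=0)
            return torch.cat([self.bn(c) for c in chunks], dim=0)
        return self.bn(x)  # at eval time, use the accumulated running stats

# Usage: a batch of 4096 samples is normalized as 128 ghost batches of 32.
gbn = GhostBatchNorm(num_features=64)
out = gbn(torch.randn(4096, 64))
```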

373 citations

Journal ArticleDOI
TL;DR: In this article, the authors apply the theoretical concepts of local isotropy to explain the behaviour of liquid-in-liquid dispersions subjected to turbulent agitation, deriving relations for the influence of turbulence on both the break-up and coalescence of individual droplets and comparing them with experimental evidence.
Abstract: The present paper is concerned with the conditions of flow in tanks containing stirred fluids. An attempt is made to apply the theoretical concepts of local isotropy to explain the behaviour of liquid-in-liquid dispersions subjected to turbulent agitation. Relations describing quantitatively the influence of turbulence on both break-up and coalescence of individual droplets are derived and are compared with experimental evidence. A special type of dispersion is described in which droplet size is controlled by the prevention of coalescence due to turbulence. The dependence of droplet size on energy dissipation per unit mass, as predicted by the theory of local isotropy, is put to an experimental test using geometrically similar vessels of different sizes. Though the results are not entirely conclusive, experimental evidence suggests that the hypothesis of locally isotropic flow may be applicable to the flow conditions described in the paper, and that statistical theories of turbulence can be of practical value in estimating droplet sizes in agitated dispersions.
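The droplet-size dependence on energy dissipation that the paper tests is commonly written as the Kolmogorov–Hinze scaling, d_max = C (sigma / rho_c)^0.6 * epsilon^-0.4. A hedged numerical sketch follows: the constant C ≈ 0.725 is Hinze's classical value, and the fluid properties below are assumed example inputs, not data from the paper.

```python
def max_stable_droplet_diameter(sigma: float, rho_c: float, epsilon: float,
                                C: float = 0.725) -> float:
    """Kolmogorov-Hinze estimate of the largest stable droplet diameter (m).

    sigma   -- interfacial tension (N/m)
    rho_c   -- continuous-phase density (kg/m^3)
    epsilon -- turbulent energy dissipation per unit mass (W/kg)
    """
    return C * (sigma / rho_c) ** 0.6 * epsilon ** -0.4

# Example (assumed values): oil dispersed in water under moderate agitation.
print(max_stable_droplet_diameter(sigma=0.03, rho_c=1000.0, epsilon=1.0))  # ~1.4 mm
```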

373 citations

Journal ArticleDOI
TL;DR: An elastic-plastic model for contacting rough surfaces that is based on accurate Finite Element Analysis (FEA) of an elastic-plastic single asperity contact is presented in this paper.
Abstract: An elastic-plastic model for contacting rough surfaces that is based on accurate Finite Element Analysis (FEA) of an elastic-plastic single asperity contact is presented. The plasticity index π is ...
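For context, the plasticity index in rough-surface contact models is classically defined (Greenwood–Williamson) from the effective elastic modulus E', the hardness H, the asperity-height standard deviation, and the asperity tip radius. The sketch below uses that classical definition with assumed example values; it is not the paper's FEA-based model.

```python
import math

def plasticity_index(E_eff: float, H: float, sigma_s: float, R: float) -> float:
    """Greenwood-Williamson plasticity index: psi = (E'/H) * sqrt(sigma_s / R).
    psi >> 1 suggests predominantly plastic asperity contact; small psi, elastic."""
    return (E_eff / H) * math.sqrt(sigma_s / R)

# Assumed example: steel-like surface (E' ~ 115 GPa, H ~ 2 GPa),
# roughness 0.5 um, asperity tip radius 50 um.
print(plasticity_index(E_eff=115e9, H=2e9, sigma_s=0.5e-6, R=50e-6))  # ~5.75
```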

373 citations

Book ChapterDOI
20 Jan 1997
TL;DR: A new optimized standard implementation of DES on 64-bit processors is described, which is about twice as fast as the fastest known standard DES implementation on the same processor.
Abstract: In this paper we describe a fast new DES implementation. This implementation is about five times faster than the fastest known DES implementation on a (64-bit) Alpha computer, and about three times faster than our new optimized DES implementation on 64-bit computers. This implementation uses a non-standard representation and views the processor as a SIMD computer, i.e., as 64 parallel one-bit processors computing the same instruction. We also discuss the application of this implementation to other ciphers. We describe a new optimized standard implementation of DES on 64-bit processors, which is about twice as fast as the fastest known standard DES implementation on the same processor. Our implementations can also be used for fast exhaustive search in software, which can find a key in only a few days or a few weeks on existing parallel computers and computer networks.
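The "non-standard representation" alluded to here is bitslicing: a 64-bit register holds the same bit position from 64 independent blocks, so each bitwise instruction acts as 64 one-bit processors running in parallel. Below is a minimal illustrative sketch of the transposition and a single "gate"; it is not Biham's actual DES code.

```python
def bitslice(blocks):
    """Transpose 64 64-bit blocks: slices[i] packs bit i of every block."""
    assert len(blocks) == 64
    slices = []
    for i in range(64):
        w = 0
        for j, b in enumerate(blocks):
            w |= ((b >> i) & 1) << j
        slices.append(w)
    return slices

# In the sliced domain, one bitwise operation processes all 64 blocks at
# once; a cipher round is then evaluated as a network of such gates.
def xor_gate(a: int, b: int) -> int:
    return a ^ b

slices = bitslice(list(range(64)))      # toy input blocks 0..63
mixed = xor_gate(slices[0], slices[1])  # one instruction, 64 lanes
```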

372 citations


Authors

Showing all 31937 results

Name | H-index | Papers | Citations
Robert Langer | 281 | 2324 | 326306
Nicholas G. Martin | 192 | 1770 | 161952
Tobin J. Marks | 159 | 1621 | 111604
Grant W. Montgomery | 157 | 926 | 108118
David Eisenberg | 156 | 697 | 112460
David J. Mooney | 156 | 695 | 94172
Dirk Inzé | 149 | 647 | 74468
Jerrold M. Olefsky | 143 | 595 | 77356
Joseph J.Y. Sung | 142 | 1240 | 92035
Deborah Estrin | 135 | 562 | 106177
Bruce Yabsley | 133 | 1191 | 84889
Jerry W. Shay | 133 | 639 | 74774
Richard N. Bergman | 130 | 477 | 91718
Shlomit Tarem | 129 | 1306 | 86919
Allen Mincer | 129 | 1040 | 80059
Network Information
Related Institutions (5)
Imperial College London: 209.1K papers, 9.3M citations, 93% related
Massachusetts Institute of Technology: 268K papers, 18.2M citations, 92% related
University of Illinois at Urbana–Champaign: 225.1K papers, 10.1M citations, 92% related
Stanford University: 320.3K papers, 21.8M citations, 92% related
University of Toronto: 294.9K papers, 13.5M citations, 92% related

Performance
Metrics
No. of papers from the Institution in previous years
Year | Papers
2023 | 147
2022 | 390
2021 | 3,397
2020 | 3,526
2019 | 3,273
2018 | 3,131