Institution

Waseda University

Education · Tokyo, Japan
About: Waseda University is an education organization based in Tokyo, Japan. It is known for its research contributions in the topics Large Hadron Collider and Catalysis. The organization has 24,220 authors who have published 46,859 publications, receiving 837,855 citations. The organization is also known as Waseda daigaku and Sōdai.


Papers
Proceedings ArticleDOI
07 Dec 2015
TL;DR: This paper uses Convolutional Neural Networks to learn discriminant patch representations, in particular training a Siamese network with pairs of (non-)corresponding patches, to develop 128-D descriptors whose Euclidean distances reflect patch similarity and which can be used as a drop-in replacement for any task involving SIFT.
Abstract: Deep learning has revolutionized image-level tasks such as classification, but patch-level tasks, such as correspondence, still rely on hand-crafted features, e.g. SIFT. In this paper we use Convolutional Neural Networks (CNNs) to learn discriminant patch representations and in particular train a Siamese network with pairs of (non-)corresponding patches. We deal with the large number of potential pairs by combining stochastic sampling of the training set with an aggressive mining strategy biased towards patches that are hard to classify. By using the L2 distance during both training and testing we develop 128-D descriptors whose Euclidean distances reflect patch similarity, and which can be used as a drop-in replacement for any task involving SIFT. We demonstrate consistent performance gains over the state of the art, and generalize well against scaling and rotation, perspective transformation, non-rigid deformation, and illumination changes. Our descriptors are efficient to compute, amenable to modern GPUs, and publicly available.
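The core idea above can be sketched in a few lines: descriptors are compared with the plain Euclidean (L2) distance, and training batches are biased towards "hard" negatives that sit close to the anchor. This is a minimal illustrative sketch, not the authors' code; the function names and the mining helper are hypothetical.

```python
import numpy as np

def l2_distance(desc_a: np.ndarray, desc_b: np.ndarray) -> float:
    """Euclidean distance between two 128-D patch descriptors."""
    return float(np.linalg.norm(desc_a - desc_b))

def hard_negative_mining(anchor: np.ndarray, negatives: np.ndarray, k: int) -> np.ndarray:
    """Keep the k negatives closest to the anchor, i.e. the hardest to
    distinguish from it; these dominate the training batch."""
    dists = np.linalg.norm(negatives - anchor, axis=1)
    return negatives[np.argsort(dists)[:k]]
```

Because the same L2 distance is used at train and test time, the learned descriptors slot directly into any pipeline that ranks SIFT descriptors by distance.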

848 citations

Journal ArticleDOI
Markus Ackermann, Marco Ajello, Alice Allafort, Luca Baldini, +197 more · Institutions (42)
15 Feb 2013-Science
TL;DR: The characteristic pion-decay feature is detected in the gamma-ray spectra of two SNRs, IC 443 and W44, with the Fermi Large Area Telescope, providing direct evidence that cosmic-ray protons are accelerated in SNRs.
Abstract: Cosmic rays are particles (mostly protons) accelerated to relativistic speeds. Despite wide agreement that supernova remnants (SNRs) are the sources of galactic cosmic rays, unequivocal evidence for the acceleration of protons in these objects is still lacking. When accelerated protons encounter interstellar material, they produce neutral pions, which in turn decay into gamma rays. This offers a compelling way to detect the acceleration sites of protons. The identification of pion-decay gamma rays has been difficult because high-energy electrons also produce gamma rays via bremsstrahlung and inverse Compton scattering. We detected the characteristic pion-decay feature in the gamma-ray spectra of two SNRs, IC 443 and W44, with the Fermi Large Area Telescope. This detection provides direct evidence that cosmic-ray protons are accelerated in SNRs.
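The spectral signature described above follows from simple kinematics: a neutral pion at rest decays into two photons, each carrying half the pion's rest energy, which places a characteristic feature near 67.5 MeV in the gamma-ray spectrum. A minimal numeric sketch (the constant is the standard neutral-pion rest mass energy):

```python
# Rest-frame kinematics of the decay pi0 -> 2 gamma.
M_PI0_MEV = 134.977  # neutral pion rest mass energy, MeV

def photon_energy_mev() -> float:
    """Each photon carries half the pion rest energy in the pion rest frame."""
    return M_PI0_MEV / 2.0
```

Doppler shifts of moving pions smear this line into the broad "pion bump" that the Fermi LAT resolved in IC 443 and W44.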

846 citations

Book
28 Nov 1988
TL;DR: This book contains various types of mathematical descriptions of curves and surfaces, such as Ferguson, Coons, Spline, Bézier, and B-spline curves and surfaces, in a unified way so that beginners can easily understand the whole spectrum of parametric curves and surfaces.
Abstract: This book contains various types of mathematical descriptions of curves and surfaces, such as Ferguson, Coons, Spline, Bézier, and B-spline curves and surfaces. The materials are classified and arranged in a unified way so that beginners can easily understand the whole spectrum of parametric curves and surfaces. This book will be useful to many researchers, designers, teachers, and students who are working on curves and surfaces. The book can be used as a textbook in computer-aided design classes.
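A representative technique from the Bézier family covered by such texts is de Casteljau's algorithm, which evaluates a Bézier curve by repeated linear interpolation of its control points. A minimal sketch (not taken from the book):

```python
def de_casteljau(control_points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] by repeatedly
    interpolating adjacent control points until one point remains."""
    pts = [tuple(float(c) for c in p) for p in control_points]
    while len(pts) > 1:
        pts = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts, pts[1:])]
    return pts[0]
```

For the quadratic curve with control points (0,0), (1,2), (2,0), the midpoint t = 0.5 evaluates to (1, 1), matching the Bernstein-form expansion 0.25·P0 + 0.5·P1 + 0.25·P2.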

825 citations

Journal ArticleDOI
TL;DR: It is shown that normal HSCs maintain intracellular hypoxia and stabilize hypoxia-inducible factor-1 alpha (HIF-1α) protein, indicating that HSCs maintain cell-cycle quiescence through the precise regulation of HIF-1α levels.

808 citations

Proceedings ArticleDOI
30 Mar 2018
TL;DR: In this article, a new open source platform for end-to-end speech processing named ESPnet is introduced; it mainly focuses on end-to-end automatic speech recognition (ASR) and adopts the widely used dynamic neural network toolkits Chainer and PyTorch as its main deep learning engines.
Abstract: This paper introduces a new open source platform for end-to-end speech processing named ESPnet. ESPnet mainly focuses on end-to-end automatic speech recognition (ASR), and adopts the widely used dynamic neural network toolkits Chainer and PyTorch as its main deep learning engines. ESPnet also follows the Kaldi ASR toolkit style for data processing, feature extraction/format, and recipes to provide a complete setup for speech recognition and other speech processing experiments. This paper explains the major architecture of this software platform and several important functionalities that differentiate ESPnet from other open source ASR toolkits, and reports experimental results on major ASR benchmarks.
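The ASR benchmark results mentioned above are conventionally reported as word error rate (WER): the minimum number of word substitutions, insertions, and deletions needed to turn the hypothesis into the reference, divided by the reference length. A minimal sketch via Levenshtein distance over words (illustrative only, not ESPnet's scoring code):

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + insertions + deletions) / reference word count,
    computed with dynamic-programming edit distance over words."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between ref[:i] and hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution / match
    return d[len(ref)][len(hyp)] / len(ref)
```

For example, "the cat sat" against "the bat sat" is one substitution out of three reference words, a WER of 1/3.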

806 citations


Authors

Showing all 24378 results

Name                 H-index  Papers  Citations
Yusuke Nakamura      179      2,076   160,313
Yoshio Bando         147      1,234   80,883
Charles Maguire      142      1,197   95,026
Kazunori Kataoka     138      908     70,412
Senta Greene         134      1,346   90,697
Intae Yu             134      1,372   89,870
Kohei Yorita         131      1,389   91,177
Wei Xie              128      1,281   77,097
Susumu Kitagawa      125      809     69,594
Leon O. Chua         122      824     71,612
Jun Kataoka          121      603     54,274
S. Youssef           120      683     65,110
Katsuhiko Mikoshiba  120      866     62,394
Yusuke Yamauchi      117      1,000   51,685
Teruo Okano          117      476     47,081
Network Information
Related Institutions (5)
Tokyo Institute of Technology
101.6K papers, 2.3M citations

96% related

University of Tsukuba
79.4K papers, 1.9M citations

94% related

University of Tokyo
337.5K papers, 10.1M citations

94% related

Osaka University
185.6K papers, 5.1M citations

94% related

Nagoya University
128.2K papers, 3.2M citations

93% related

Performance
Metrics
No. of papers from the Institution in previous years
Year  Papers
2023  80
2022  237
2021  2,347
2020  2,467
2019  2,367
2018  2,289