Institution

IBM

Company
Armonk, New York, United States
About: IBM is a company based in Armonk, New York, United States. It is known for research contributions in the topics Layer (electronics) and Cache. The organization has 134,567 authors who have published 253,905 publications receiving 7,458,795 citations. The organization is also known as International Business Machines Corporation and Big Blue.


Papers
Journal ArticleDOI
TL;DR: In this paper, metallic, oxygen-deficient compounds in the Ba−La−Cu−O system with the composition BaxLa5−xCu5O5(3−y) were prepared in polycrystalline form; samples with x = 1 and 0.75, y > 0, annealed below 900°C under reducing conditions, consist of three phases, one of them a perovskite-like mixed-valent copper compound.
Abstract: Metallic, oxygen-deficient compounds in the Ba−La−Cu−O system, with the composition BaxLa5−xCu5O5(3−y), have been prepared in polycrystalline form. Samples with x = 1 and 0.75, y > 0, annealed below 900°C under reducing conditions, consist of three phases, one of them a perovskite-like mixed-valent copper compound. Upon cooling, the samples show a linear decrease in resistivity, then an approximately logarithmic increase, interpreted as a beginning of localization. Finally, an abrupt decrease by up to three orders of magnitude occurs, reminiscent of the onset of percolative superconductivity. The highest onset temperature is observed in the 30 K range. It is markedly reduced by high current densities. Thus, it results partially from the percolative nature, but possibly also from 2D superconducting fluctuations of double perovskite layers of one of the phases present.

10,272 citations

Book
J.P. Biersack, James F. Ziegler
01 Aug 1985
TL;DR: This paper reviews existing widely cited tables of ion stopping and ranges and gives a brief exposition of what modern calculations can determine.
Abstract: The stopping and range of ions in matter is physically very complex, and there are few simple approximations which are accurate. However, if modern calculations are performed, the ion distributions can be calculated with good accuracy, typically better than 10%. This review will be in several sections: a) A brief exposition of what can be determined by modern calculations. b) A review of existing widely-cited tables of ion stopping and ranges. c) A review of the calculation of accurate ion stopping powers.

10,060 citations

Journal ArticleDOI
Peter D. Welch
TL;DR: In this article, the use of the fast Fourier transform in power spectrum analysis is described, and the method involves sectioning the record and averaging modified periodograms of the sections.
Abstract: The use of the fast Fourier transform in power spectrum analysis is described. Principal advantages of this method are a reduction in the number of computations and in required core storage, and convenient application in nonstationarity tests. The method involves sectioning the record and averaging modified periodograms of the sections.

9,705 citations
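The sectioning-and-averaging procedure described in the abstract above can be sketched in a few lines. The following Python snippet is a minimal illustration, not the paper's own implementation: the Hann window, the segment length, the 50% overlap, and the simplified one-sided scaling are all assumed choices.

```python
import numpy as np

def welch_psd(x, fs=1.0, nperseg=256, noverlap=128):
    """Estimate the power spectral density by sectioning the record,
    windowing each section, and averaging the modified periodograms,
    as described in the abstract above (normalization simplified)."""
    window = np.hanning(nperseg)            # taper applied to each section
    scale = fs * np.sum(window ** 2)        # normalization of the modified periodogram
    step = nperseg - noverlap
    n_segments = (len(x) - noverlap) // step
    psd = np.zeros(nperseg // 2 + 1)
    for k in range(n_segments):
        seg = x[k * step : k * step + nperseg] * window
        psd += np.abs(np.fft.rfft(seg)) ** 2 / scale
    psd /= n_segments                       # average over the sections
    freqs = np.fft.rfftfreq(nperseg, d=1.0 / fs)
    return freqs, psd
```

Library routines such as scipy.signal.welch implement the same averaging of modified periodograms with more careful detrending and scaling.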

Journal ArticleDOI
TL;DR: This article provides an overview of progress and represents the shared views of four research groups that have had recent successes in using DNNs for acoustic modeling in speech recognition.
Abstract: Most current speech recognition systems use hidden Markov models (HMMs) to deal with the temporal variability of speech and Gaussian mixture models (GMMs) to determine how well each state of each HMM fits a frame or a short window of frames of coefficients that represents the acoustic input. An alternative way to evaluate the fit is to use a feed-forward neural network that takes several frames of coefficients as input and produces posterior probabilities over HMM states as output. Deep neural networks (DNNs) that have many hidden layers and are trained using new methods have been shown to outperform GMMs on a variety of speech recognition benchmarks, sometimes by a large margin. This article provides an overview of this progress and represents the shared views of four research groups that have had recent successes in using DNNs for acoustic modeling in speech recognition.

9,091 citations
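As a rough illustration of the hybrid setup the abstract describes, the NumPy sketch below maps a context window of acoustic feature frames to posterior probabilities over HMM states with a feed-forward network. The layer sizes, the ReLU activation, and the random weights are assumptions standing in for a trained model, not the configuration used by the four research groups.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed sizes: an 11-frame context window of 40 coefficients per frame,
# one hidden layer, and 3000 HMM states.
n_frames, n_coeffs, n_states, hidden = 11, 40, 3000, 2048

def relu(a):
    return np.maximum(a, 0.0)

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

# Randomly initialized weights stand in for a trained network.
W1 = rng.normal(scale=0.01, size=(n_frames * n_coeffs, hidden))
b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.01, size=(hidden, n_states))
b2 = np.zeros(n_states)

def state_posteriors(frames):
    """frames: (n_frames, n_coeffs) window of coefficients -> (n_states,) posteriors."""
    h = relu(frames.reshape(-1) @ W1 + b1)
    return softmax(h @ W2 + b2)

# In a hybrid DNN-HMM system these posteriors are divided by the state priors
# to obtain scaled likelihoods for Viterbi decoding.
posteriors = state_posteriors(rng.normal(size=(n_frames, n_coeffs)))
```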

Journal ArticleDOI
TL;DR: A comprehensive description of the primal-dual interior-point algorithm with a filter line-search method for nonlinear programming is provided, including the feasibility restoration phase for the filter method, second-order corrections, and inertia correction of the KKT matrix.
Abstract: We present a primal-dual interior-point algorithm with a filter line-search method for nonlinear programming. Local and global convergence properties of this method were analyzed in previous work. Here we provide a comprehensive description of the algorithm, including the feasibility restoration phase for the filter method, second-order corrections, and inertia correction of the KKT matrix. Heuristics are also considered that allow faster performance. This method has been implemented in the IPOPT code, which we demonstrate in a detailed numerical study based on 954 problems from the CUTEr test set. An evaluation is made of several line-search options, and a comparison is provided with two state-of-the-art interior-point codes for nonlinear programming.

7,966 citations
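To make the algorithmic ingredients concrete, here is a minimal NumPy sketch of a single primal-dual Newton step on the perturbed KKT conditions of the barrier subproblem, restricted to the bound-constrained case x >= 0. The filter line-search, feasibility restoration, second-order corrections, and inertia correction that the paper covers are omitted, and the toy problem, the fraction-to-the-boundary parameter, and the barrier-update schedule are illustrative assumptions rather than IPOPT's actual implementation.

```python
import numpy as np

def barrier_kkt_step(grad_f, hess_f, x, z, mu):
    """One Newton step on the perturbed KKT conditions of
        min f(x) - mu * sum(log x_i)
    for x >= 0, with dual variables z for the bounds."""
    n = len(x)
    r_dual = grad_f(x) - z            # dual feasibility residual
    r_comp = x * z - mu               # perturbed complementarity residual
    K = np.block([[hess_f(x), -np.eye(n)],
                  [np.diag(z), np.diag(x)]])
    step = np.linalg.solve(K, -np.concatenate([r_dual, r_comp]))
    dx, dz = step[:n], step[n:]
    # Fraction-to-the-boundary rule keeps x and z strictly positive.
    tau = 0.995
    alpha_x = min(1.0, float(np.min(-tau * x[dx < 0] / dx[dx < 0]))) if np.any(dx < 0) else 1.0
    alpha_z = min(1.0, float(np.min(-tau * z[dz < 0] / dz[dz < 0]))) if np.any(dz < 0) else 1.0
    return x + alpha_x * dx, z + alpha_z * dz

# Toy problem: minimize (x1 - 1)^2 + (x2 + 2)^2 subject to x >= 0.
grad = lambda x: 2.0 * (x - np.array([1.0, -2.0]))
hess = lambda x: 2.0 * np.eye(2)
x, z, mu = np.ones(2), np.ones(2), 1.0
for _ in range(30):
    x, z = barrier_kkt_step(grad, hess, x, z, mu)
    mu *= 0.5                          # shrink the barrier parameter
print(x)                               # approaches (1, 0); the bound x2 >= 0 is active
```

In the full method described in the paper, each trial step would additionally be screened by the filter, which judges both the objective value and the constraint violation before accepting a step.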


Authors


Name | H-index | Papers | Citations
Zhong Lin Wang | 245 | 2,529 | 259,003
Anil K. Jain | 183 | 1,016 | 192,151
Hyun-Chul Kim | 176 | 4,076 | 183,227
Rodney S. Ruoff | 164 | 666 | 194,902
Tobin J. Marks | 159 | 1,621 | 111,604
Jean M. J. Fréchet | 154 | 726 | 90,295
Albert-László Barabási | 152 | 438 | 200,119
György Buzsáki | 150 | 446 | 96,433
Stanislas Dehaene | 149 | 456 | 86,539
Philip S. Yu | 148 | 1,914 | 107,374
James M. Tour | 143 | 859 | 91,364
Thomas P. Russell | 141 | 1,012 | 80,055
Naomi J. Halas | 140 | 435 | 82,040
Steven G. Louie | 137 | 777 | 88,794
Daphne Koller | 135 | 367 | 71,073
Network Information
Related Institutions (5)
Carnegie Mellon University: 104.3K papers, 5.9M citations, 93% related
Georgia Institute of Technology: 119K papers, 4.6M citations, 92% related
Bell Labs: 59.8K papers, 3.1M citations, 90% related
Microsoft: 86.9K papers, 4.1M citations, 89% related
Massachusetts Institute of Technology: 268K papers, 18.2M citations, 88% related

Performance Metrics
No. of papers from the Institution in previous years
Year | Papers
2023 | 30
2022 | 137
2021 | 3,163
2020 | 6,336
2019 | 6,427
2018 | 6,278