Institution

Bell Labs

Company
About: Bell Labs is a company based in Murray Hill, New Jersey, United States. It is known for its research contributions in the topics: Laser & Optical fiber. The organization has 36499 authors who have published 59862 publications receiving 3190823 citations. The organization is also known as: Bell Laboratories & AT&T Bell Laboratories.


Papers
Journal Article
S. P. Lloyd
TL;DR: In this article, the authors derive necessary conditions that the quanta and associated quantization intervals of an optimum finite quantization scheme must satisfy to achieve minimum average quantization noise power.
Abstract: It has long been realized that in pulse-code modulation (PCM), with a given ensemble of signals to handle, the quantum values should be spaced more closely in the voltage regions where the signal amplitude is more likely to fall. It has been shown by Panter and Dite that, in the limit as the number of quanta becomes infinite, the asymptotic fractional density of quanta per unit voltage should vary as the one-third power of the probability density per unit voltage of signal amplitudes. In this paper the corresponding result for any finite number of quanta is derived; that is, necessary conditions are found that the quanta and associated quantization intervals of an optimum finite quantization scheme must satisfy. The optimization criterion used is that the average quantization noise power be a minimum. It is shown that the result obtained here goes over into the Panter and Dite result as the number of quanta becomes large. The optimum quantization schemes for 2^b quanta, b = 1, 2, …, 7, are given numerically for Gaussian and for Laplacian distributions of signal amplitudes.
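
The two conditions derived here are the basis of what is now called the Lloyd (or Lloyd-Max) algorithm: with the quanta fixed, place each interval boundary at the midpoint between adjacent quanta; with the intervals fixed, move each quantum to the conditional mean of its interval. A minimal numerical sketch in Python, fitting the quantizer to samples rather than to a known density (function and parameter names are mine, not the paper's):

import numpy as np

def lloyd_max(samples, n_levels, n_iters=100):
    """Alternate Lloyd's two necessary conditions for a scalar
    quantizer with minimum average quantization noise power."""
    # Start the quanta at evenly spaced sample quantiles.
    levels = np.quantile(samples, (np.arange(n_levels) + 0.5) / n_levels)
    for _ in range(n_iters):
        # Interval condition: boundaries midway between adjacent quanta.
        edges = (levels[:-1] + levels[1:]) / 2
        idx = np.digitize(samples, edges)
        # Quantum condition: each level at the conditional mean of its cell.
        for k in range(n_levels):
            cell = samples[idx == k]
            if cell.size:
                levels[k] = cell.mean()
    return levels

# Optimum 2^b levels, b = 3, for a Gaussian amplitude distribution.
rng = np.random.default_rng(0)
print(lloyd_max(rng.standard_normal(100_000), 8))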

11,872 citations

Proceedings Article
01 Jul 1992
TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions.
Abstract: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented. The technique is applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions. The effective number of parameters is adjusted automatically to match the complexity of the problem. The solution is expressed as a linear combination of supporting patterns. These are the subset of training patterns that are closest to the decision boundary. Bounds on the generalization performance based on the leave-one-out method and the VC-dimension are given. Experimental results on optical character recognition problems demonstrate the good generalization obtained when compared with other learning algorithms.
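
This is the algorithm now known as the support vector machine. A brief illustration of the margin-maximization idea with the three kinds of classification functions the paper names, using scikit-learn's SVC as an assumed modern stand-in for the paper's own implementation:

from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Linear (perceptron-like), polynomial, and radial-basis-function kernels.
for kernel in ("linear", "poly", "rbf"):
    clf = SVC(kernel=kernel).fit(X_train, y_train)
    # The solution is a linear combination of the supporting patterns,
    # i.e. the training points closest to the decision boundary.
    print(kernel, clf.support_.size, "supporting patterns,",
          f"test accuracy {clf.score(X_test, y_test):.3f}")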

11,211 citations

Journal Article
William S. Cleveland
TL;DR: Robust locally weighted regression, as discussed by the authors, is a method for smoothing a scatterplot in which the fitted value at x_k is the value of a polynomial fit to the data using weighted least squares, where the weight for (x_i, y_i) is large if x_i is close to x_k and small if it is not.
Abstract: The visual information on a scatterplot can be greatly enhanced, with little additional cost, by computing and plotting smoothed points. Robust locally weighted regression is a method for smoothing a scatterplot, (x_i, y_i), i = 1, …, n, in which the fitted value at x_k is the value of a polynomial fit to the data using weighted least squares, where the weight for (x_i, y_i) is large if x_i is close to x_k and small if it is not. A robust fitting procedure is used that guards against deviant points distorting the smoothed points. Visual, computational, and statistical issues of robust locally weighted regression are discussed. Several examples, including data on lead intoxication, are used to illustrate the methodology.
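
Cleveland's procedure is the LOWESS smoother found in standard statistics libraries. A short usage sketch with statsmodels (the synthetic scatterplot and parameter values are my own choices): frac sets the fraction of points entering each local weighted fit, and it sets the number of robustness iterations that downweight deviant points.

import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

# A curved trend with noise, plus a few deviant points.
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 10, 200))
y = np.sin(x) + rng.normal(0, 0.2, 200)
y[::25] += 3.0

# frac: local neighborhood size; it: robustifying iterations.
smoothed = lowess(y, x, frac=0.3, it=3)
print(smoothed[:5])  # rows of (x, fitted value), sorted by x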

10,225 citations

Journal Article
TL;DR: This paper demonstrates how constraints from the task domain can be integrated into a backpropagation network through the architecture of the network, an approach successfully applied to the recognition of handwritten zip code digits provided by the U.S. Postal Service.
Abstract: The ability of learning networks to generalize can be greatly enhanced by providing constraints from the task domain. This paper demonstrates how such constraints can be integrated into a backpropagation network through the architecture of the network. This approach has been successfully applied to the recognition of handwritten zip code digits provided by the U.S. Postal Service. A single network learns the entire recognition operation, going from the normalized image of the character to the final classification.
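
The architectural constraints in question are local receptive fields with shared weights followed by subsampling, the design now called a convolutional network. A minimal sketch of that idea in PyTorch (layer sizes and the 28x28 input are illustrative choices, not the paper's exact configuration):

import torch
import torch.nn as nn

class DigitNet(nn.Module):
    """Shared-weight local connections (convolutions) encode the
    task-domain constraint; pooling adds tolerance to small shifts."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 4, kernel_size=5), nn.Tanh(), nn.AvgPool2d(2),
            nn.Conv2d(4, 12, kernel_size=5), nn.Tanh(), nn.AvgPool2d(2),
        )
        self.classifier = nn.Linear(12 * 4 * 4, 10)

    def forward(self, x):  # x: batch of normalized 1x28x28 digit images
        return self.classifier(self.features(x).flatten(1))

net = DigitNet()
print(net(torch.randn(1, 1, 28, 28)).shape)  # torch.Size([1, 10])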

9,775 citations

Journal Article
Philip W. Anderson
TL;DR: In this article, a simple model for spin diffusion or conduction in the "impurity band" is presented; such processes involve transport in a lattice that is in some sense random, and diffusion is expected to take place via quantum jumps between localized sites.
Abstract: This paper presents a simple model for such processes as spin diffusion or conduction in the "impurity band." These processes involve transport in a lattice which is in some sense random, and in them diffusion is expected to take place via quantum jumps between localized sites. In this simple model the essential randomness is introduced by requiring the energy to vary randomly from site to site. It is shown that at low enough densities no diffusion at all can take place, and the criteria for transport to occur are given.
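
The model is easy to probe numerically: build a tight-binding lattice with site energies drawn at random, diagonalize, and measure how concentrated each eigenstate is. A sketch under those assumptions (1D chain, uniform disorder of width W, nearest-neighbor hopping t; all names are mine), using the inverse participation ratio to separate localized from extended states:

import numpy as np

def participation(n_sites=500, W=4.0, t=1.0, seed=0):
    """1D Anderson model: random site energies in [-W/2, W/2],
    nearest-neighbor hopping t."""
    rng = np.random.default_rng(seed)
    H = np.diag(rng.uniform(-W / 2, W / 2, n_sites))
    H += np.diag(np.full(n_sites - 1, t), 1)
    H += np.diag(np.full(n_sites - 1, t), -1)
    _, vecs = np.linalg.eigh(H)
    # Inverse participation ratio per eigenstate: ~1/n_sites for an
    # extended state, O(1) for a state localized on a few sites.
    return (np.abs(vecs) ** 4).sum(axis=0)

# Stronger disorder W drives the states toward localization.
for W in (0.5, 4.0, 16.0):
    print(W, participation(W=W).mean())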

9,647 citations


Authors


Name                     H-index   Papers   Citations
Yoshua Bengio            202       1033     420313
David R. Williams        178       2034     138789
John A. Rogers           177       1341     127390
Zhenan Bao               169       865      106571
Stephen R. Forrest       148       1041     111816
Bernhard Schölkopf       148       1092     149492
Thomas S. Huang          146       1299     101564
Kurt Wüthrich            143       739      103253
John D. Joannopoulos     137       956      100831
Steven G. Louie          137       777      88794
Joss Bland-Hawthorn      136       1114     77593
Marvin L. Cohen          134       979      87767
Federico Capasso         134       1189     76957
Christos Faloutsos       127       789      77746
Robert J. Cava           125       1042     71819
Network Information
Related Institutions (5)
IBM
253.9K papers, 7.4M citations

90% related

Georgia Institute of Technology
119K papers, 4.6M citations

89% related

University of California, Santa Barbara
80.8K papers, 4.6M citations

89% related

Massachusetts Institute of Technology
268K papers, 18.2M citations

88% related

Princeton University
146.7K papers, 9.1M citations

87% related

Performance Metrics
No. of papers from the Institution in previous years
Year   Papers
2023   3
2022   45
2021   479
2020   712
2019   750
2018   862