Institution
Bell Labs
Company
About: Bell Labs is a research organization based in Murray Hill, New Jersey, United States. It is known for research contributions in the topics: Laser & Optical fiber. The organization has 36499 authors who have published 59862 publications receiving 3190823 citations. The organization is also known as: Bell Laboratories & AT&T Bell Laboratories.
Topics: Laser, Optical fiber, Amplifier, Semiconductor laser theory, Signal
Papers published on a yearly basis
Papers
TL;DR: In this article, the authors derive necessary conditions that the quanta and associated quantization intervals of an optimum finite quantization scheme must satisfy in order to achieve minimum average quantization noise power.
Abstract: It has long been realized that in pulse-code modulation (PCM), with a given ensemble of signals to handle, the quantum values should be spaced more closely in the voltage regions where the signal amplitude is more likely to fall. It has been shown by Panter and Dite that, in the limit as the number of quanta becomes infinite, the asymptotic fractional density of quanta per unit voltage should vary as the one-third power of the probability density per unit voltage of signal amplitudes. In this paper the corresponding result for any finite number of quanta is derived; that is, necessary conditions are found that the quanta and associated quantization intervals of an optimum finite quantization scheme must satisfy. The optimization criterion used is that the average quantization noise power be a minimum. It is shown that the result obtained here goes over into the Panter and Dite result as the number of quanta becomes large. The optimum quantization schemes for 2^b quanta, b = 1, 2, …, 7, are given numerically for Gaussian and for Laplacian distributions of signal amplitudes.
11,872 citations
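The two necessary conditions derived in the paper (each interval boundary is the midpoint of its neighboring quanta, and each quantum is the centroid of its interval) suggest an alternating iteration, now known as the Lloyd-Max algorithm. The following is a minimal sketch of that iteration on sampled data; the function name and the quantile-based initialization are illustrative choices, not from the paper.

```python
import numpy as np

def lloyd_max(samples, n_levels, iters=100):
    """Alternate the paper's two necessary conditions for a
    minimum-MSE scalar quantizer: boundaries at midpoints of
    adjacent levels, levels at the centroid of their interval."""
    # Initialize the levels at evenly spaced sample quantiles.
    levels = np.quantile(samples, (np.arange(n_levels) + 0.5) / n_levels)
    for _ in range(iters):
        # Interval boundaries: midpoints between neighboring levels.
        edges = (levels[:-1] + levels[1:]) / 2
        idx = np.searchsorted(edges, samples)  # assign samples to intervals
        # Centroid condition: each level is the conditional mean of its cell.
        levels = np.array([samples[idx == k].mean() for k in range(n_levels)])
    return levels

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
levels = lloyd_max(x, 4)  # 2^b quanta with b = 2
edges = (levels[:-1] + levels[1:]) / 2
mse = np.mean((x - levels[np.searchsorted(edges, x)]) ** 2)
```

For a Gaussian source with four quanta, the iteration converges to levels near ±0.45 and ±1.51, matching the tabulated optimum in the paper.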
01 Jul 1992
TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented; it is applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions.
Abstract: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented. The technique is applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions. The effective number of parameters is adjusted automatically to match the complexity of the problem. The solution is expressed as a linear combination of supporting patterns. These are the subset of training patterns that are closest to the decision boundary. Bounds on the generalization performance based on the leave-one-out method and the VC-dimension are given. Experimental results on optical character recognition problems demonstrate the good generalization obtained when compared with other learning algorithms.
11,211 citations
TL;DR: Robust locally weighted regression as discussed by the authors is a method for smoothing a scatterplot, in which the fitted value at x_k is the value of a polynomial fit to the data using weighted least squares, where the weight for (x_i, y_i) is large if x_i is close to x_k and small if it is not.
Abstract: The visual information on a scatterplot can be greatly enhanced, with little additional cost, by computing and plotting smoothed points. Robust locally weighted regression is a method for smoothing a scatterplot, (x_i, y_i), i = 1, …, n, in which the fitted value at x_k is the value of a polynomial fit to the data using weighted least squares, where the weight for (x_i, y_i) is large if x_i is close to x_k and small if it is not. A robust fitting procedure is used that guards against deviant points distorting the smoothed points. Visual, computational, and statistical issues of robust locally weighted regression are discussed. Several examples, including data on lead intoxication, are used to illustrate the methodology.
10,225 citations
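The method above (LOWESS) fits a low-degree polynomial at each point with tricube distance weights, then iteratively downweights deviant points with bisquare robustness weights. A minimal sketch, assuming a degree-1 local fit and two robustness iterations; the function name, the fraction parameter, and the toy data are illustrative.

```python
import numpy as np

def lowess(x, y, frac=0.25, robust_iters=2):
    """Robust locally weighted regression: at each x_k, fit a line by
    weighted least squares with tricube weights over the nearest
    frac*n points, then downweight large residuals and refit."""
    n = len(x)
    r = max(2, int(np.ceil(frac * n)))   # neighborhood size
    delta = np.ones(n)                   # bisquare robustness weights
    fitted = np.zeros(n)
    for _ in range(robust_iters + 1):
        for k in range(n):
            d = np.abs(x - x[k])
            h = np.sort(d)[r - 1]                         # local bandwidth
            w = np.clip(1 - (d / h) ** 3, 0, None) ** 3   # tricube weight
            sw = np.sqrt(w * delta)
            # Weighted linear least squares centered at x[k].
            A = np.vstack([np.ones(n), x - x[k]]).T
            beta, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
            fitted[k] = beta[0]           # intercept = fit at x[k]
        res = y - fitted
        s = np.median(np.abs(res))
        delta = np.clip(1 - (res / (6 * s)) ** 2, 0, 1) ** 2
    return fitted

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 10, 80))
y = np.sin(x) + rng.normal(0, 0.2, 80)
y[10] += 5                # one deviant point, as in the abstract's concern
smooth = lowess(x, y)
```

After the robustness iterations the spiked point receives near-zero weight, so the smoothed curve passes close to the underlying trend rather than being dragged toward the outlier.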
TL;DR: This paper demonstrates how constraints from the task domain can be integrated into a backpropagation network through the architecture of the network, successfully applied to the recognition of handwritten zip code digits provided by the U.S. Postal Service.
Abstract: The ability of learning networks to generalize can be greatly enhanced by providing constraints from the task domain. This paper demonstrates how such constraints can be integrated into a backpropagation network through the architecture of the network. This approach has been successfully applied to the recognition of handwritten zip code digits provided by the U.S. Postal Service. A single network learns the entire recognition operation, going from the normalized image of the character to the final classification.
9,775 citations
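The key architectural constraint in this line of work is local connectivity with weight sharing: the same small kernel is applied at every image position, so the network assumes feature detectors are position-independent and needs far fewer free parameters than a fully connected layer. A minimal sketch of that building block (a single valid-mode 2-D correlation, as deep-learning libraries implement "convolution"); the sizes are illustrative.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Slide one shared kernel over the image (valid mode): the same
    weights are reused at every location, which is the weight-sharing
    constraint built into the network architecture."""
    H, W = image.shape
    kH, kW = kernel.shape
    out = np.zeros((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kH, j:j + kW] * kernel)
    return out

image = np.random.default_rng(3).normal(size=(16, 16))
kernel = np.random.default_rng(4).normal(size=(5, 5))
feat = conv2d_valid(image, kernel)           # 12 x 12 feature map

shared_params = kernel.size                  # 25 shared weights
dense_params = image.size * feat.size        # a dense layer of the same shape
```

Here one feature map costs 25 weights instead of the 36,864 a fully connected layer of the same output size would need, which is how the architecture encodes the task-domain constraint and improves generalization.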
TL;DR: In this article, a simple model for spin diffusion or conduction in the "impurity band" is presented. These processes involve transport in a lattice that is in some sense random, in which diffusion is expected to take place via quantum jumps between localized sites.
Abstract: This paper presents a simple model for such processes as spin diffusion or conduction in the "impurity band." These processes involve transport in a lattice which is in some sense random, and in them diffusion is expected to take place via quantum jumps between localized sites. In this simple model the essential randomness is introduced by requiring the energy to vary randomly from site to site. It is shown that at low enough densities no diffusion at all can take place, and the criteria for transport to occur are given.
9,647 citations
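The model's central claim (no diffusion at strong enough randomness) can be illustrated numerically with a 1-D tight-binding chain whose site energies are drawn at random, comparing how spread out the eigenstates are with and without disorder. This is a toy sketch of the model class, not a computation from the paper; the inverse participation ratio (IPR) used here is a standard localization diagnostic, scaling like 1/n for extended states and staying O(1) for localized ones.

```python
import numpy as np

def mean_ipr(n_sites, disorder, seed=0):
    """1-D tight-binding chain: unit hopping between neighbors, site
    energies uniform on [-W/2, W/2]. Returns the mean inverse
    participation ratio sum |psi_j|^4 over all eigenstates."""
    rng = np.random.default_rng(seed)
    eps = rng.uniform(-disorder / 2, disorder / 2, n_sites)
    hop = np.ones(n_sites - 1)
    H = np.diag(eps) + np.diag(hop, 1) + np.diag(hop, -1)
    _, vecs = np.linalg.eigh(H)          # columns are normalized eigenstates
    return np.mean(np.sum(vecs ** 4, axis=0))

ipr_clean = mean_ipr(200, disorder=0.0)   # ordered lattice: extended states
ipr_random = mean_ipr(200, disorder=8.0)  # strong site disorder: localized
```

On the ordered chain the eigenstates spread over all sites and the IPR is tiny (~1/n); with strong random site energies each state concentrates on a few sites and the IPR is orders of magnitude larger, in the spirit of the paper's conclusion that sufficiently strong randomness shuts off diffusion.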
Authors
Showing all 36526 results
Name | H-index | Papers | Citations |
---|---|---|---|
Yoshua Bengio | 202 | 1033 | 420313 |
David R. Williams | 178 | 2034 | 138789 |
John A. Rogers | 177 | 1341 | 127390 |
Zhenan Bao | 169 | 865 | 106571 |
Stephen R. Forrest | 148 | 1041 | 111816 |
Bernhard Schölkopf | 148 | 1092 | 149492 |
Thomas S. Huang | 146 | 1299 | 101564 |
Kurt Wüthrich | 143 | 739 | 103253 |
John D. Joannopoulos | 137 | 956 | 100831 |
Steven G. Louie | 137 | 777 | 88794 |
Joss Bland-Hawthorn | 136 | 1114 | 77593 |
Marvin L. Cohen | 134 | 979 | 87767 |
Federico Capasso | 134 | 1189 | 76957 |
Christos Faloutsos | 127 | 789 | 77746 |
Robert J. Cava | 125 | 1042 | 71819 |