Journal ArticleDOI
The theory of information and coding: A mathematical framework for communication
G. Longo
Vol. 67, Iss. 10, pp. 1467-1468
Abstract:
14. M. Metivier, The stochastic integral with respect to processes with values in a reflexive Banach space, Theor. Probability 19 (1974), 758-787.
15. M. Metivier and G. Pistone, Une formule d'isométrie pour l'intégrale stochastique Hilbertienne et équations d'évolution linéaires stochastiques, Z. Wahrscheinlichkeitstheorie und Verw. Gebiete 33 (1975), 1-18.
16. P. A. Meyer, A decomposition theorem for supermartingales, Illinois J. Math. 6 (1962), 193-205.
17. ___, Intégrales stochastiques. IV, Lecture Notes in Math., vol. 39, Springer-Verlag, Berlin and New York, 1967, pp. 142-162.
18. ___, Un cours sur les intégrales stochastiques, Lecture Notes in Math., vol. 511, Springer-Verlag, Berlin and New York, 1976, pp. 245-400.
19. ___, Intégrales Hilbertiennes, Lecture Notes in Math., vol. 581, Springer-Verlag, Berlin and New York, 1977, pp. 446-461.
20. R. E. A. C. Paley, N. Wiener and A. Zygmund, Notes on random functions, Math. Z. 37 (1933), 647-668.
21. J. Pellaumail, Sur l'intégrale stochastique et la décomposition de Doob-Meyer, Astérisque 9 (1973), 1-125.
22. P. E. Protter, Markov solutions of stochastic differential equations, Z. Wahrscheinlichkeitstheorie und Verw. Gebiete 41 (1977), 39-58.
23. ___, A comparison of stochastic integrals, Ann. Probability (to appear).
24. R. L. Stratonovich, A new representation for stochastic integrals and equations, SIAM J. Control 4 (1966), 362-371.
25. N. Wiener, Differential-space, J. Math. and Physics 2 (1923), 131-174.
Citations
Book
Elements of information theory
Thomas M. Cover, Joy A. Thomas
TL;DR: The authors examine the role of entropy, inequalities, and randomness in the design and construction of codes in a rapidly changing environment.
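Entropy, the central quantity in Cover and Thomas's treatment, is computable directly from a discrete distribution; a minimal illustrative sketch (not code from the book itself):

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries exactly 1 bit of uncertainty;
# a deterministic outcome carries none.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([1.0]))       # 0.0
```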
Journal ArticleDOI
Good error-correcting codes based on very sparse matrices
TL;DR: It is proved that sequences of codes exist which, when optimally decoded, achieve information rates up to the Shannon limit, and experimental results for binary-symmetric channels and Gaussian channels demonstrate that practical performance substantially better than that of standard convolutional and concatenated codes can be achieved.
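The "Shannon limit" referenced here is the channel capacity. For the binary-symmetric channel it has the closed form C = 1 − h₂(p), where h₂ is the binary entropy function; a small sketch of that formula (illustrative, not from the paper):

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity (bits/channel use) of the binary symmetric channel
    with crossover probability p: the Shannon limit no code can beat."""
    return 1.0 - h2(p)
```

A noiseless channel (p = 0) has capacity 1; a useless one (p = 0.5) has capacity 0.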
Journal ArticleDOI
Vector quantization in speech coding
John Makhoul, S. Roucos, H. Gish et al.
TL;DR: This tutorial review presents the basic concepts employed in vector quantization and gives a realistic assessment of its benefits and costs when compared to scalar quantization, and focuses primarily on the coding of speech signals and parameters.
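The core operation in vector quantization is mapping each input vector to the nearest codebook entry; the sketch below shows only that nearest-neighbor encoding rule under a squared-error criterion (the codebook itself would normally be trained, e.g. with the LBG algorithm; this is not the tutorial's code):

```python
def quantize(vectors, codebook):
    """Map each vector to its nearest codebook entry (squared-error rule)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return [min(codebook, key=lambda c: dist2(v, c)) for v in vectors]

# Two-entry codebook: each input snaps to the closer centroid.
codebook = [(0, 0), (1, 1)]
print(quantize([(0.1, 0.2), (0.9, 0.8)], codebook))  # [(0, 0), (1, 1)]
```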
Journal ArticleDOI
Finding the Number of Clusters in a Dataset
TL;DR: A simple, yet powerful nonparametric method for choosing the number of clusters based on distortion, a quantity that measures the average distance, per dimension, between each observation and its closest cluster center, is developed.
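The method summarized above (Sugar and James's "jump" method) transforms the per-dimension distortion curve and picks the k with the largest jump; a small sketch assuming the distortions d₁..d_K have already been computed by a clustering pass:

```python
def jump_method(distortions, p):
    """Given average per-dimension distortions d_1..d_K for k = 1..K
    cluster centers, and data dimension p, return the k with the largest
    jump in the transformed distortion d_k ** (-p / 2)."""
    y = [d ** (-p / 2) for d in distortions]
    # Jump at k is Y_k - Y_{k-1}, with Y_0 taken as 0.
    jumps = [y[0]] + [y[k] - y[k - 1] for k in range(1, len(y))]
    return max(range(len(jumps)), key=lambda k: jumps[k]) + 1

# Distortion drops sharply from k=1 to k=2, then flattens: pick k=2.
print(jump_method([4.0, 1.0, 0.9, 0.85], p=2))  # 2
```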
Journal ArticleDOI
On channel capacity per unit cost
TL;DR: It is shown that, if the input alphabet contains a zero-cost symbol, then the capacity per unit cost admits a simple expression as the maximum normalized divergence between two conditional output distributions.
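The expression referred to here is a maximum, over costly input symbols x, of the divergence D(P(·|x) ‖ P(·|0)) normalized by the cost of x, where 0 is the zero-cost symbol. A minimal numerical sketch (the channel, costs, and helper names are illustrative assumptions, not from the paper):

```python
import math

def kl(p, q):
    """Divergence D(p || q) in bits between discrete distributions."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def capacity_per_unit_cost(channel, costs, free_symbol):
    """With a zero-cost input symbol, capacity per unit cost is the maximum
    over costly inputs x of D(P(.|x) || P(.|free)) / cost(x).
    `channel` maps each input symbol to its output distribution."""
    q = channel[free_symbol]
    return max(kl(channel[x], q) / costs[x]
               for x in channel if x != free_symbol and costs[x] > 0)

# Toy binary-input channel: symbol 0 is free, symbol 1 costs 1 unit.
channel = {0: [0.5, 0.5], 1: [0.9, 0.1]}
costs = {0: 0.0, 1: 1.0}
print(capacity_per_unit_cost(channel, costs, free_symbol=0))
```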
References
On Personal Storage Systems: Architecture and Design Considerations
TL;DR: In this paper, the authors propose a set of data-management mechanisms to mitigate these limitations, as well as potential countermeasures.
Posted Content
A Novel Mutual Information-based Feature Selection Algorithm.
Pietro Cassara, Alessandro Rozza et al.
TL;DR: This work proposes a novel algorithm for the optimization problem underlying mutual-information-based feature selection, formalizing an approach that also automatically estimates the number of dimensions to retain and ranks the candidate features from most to least probable.
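Mutual-information feature ranking of the kind this algorithm builds on can be sketched for discrete features as follows (an illustrative baseline, not the authors' method; the helper names are mine):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in bits, estimated from paired discrete samples."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def rank_features(features, labels):
    """Rank feature columns by mutual information with the labels,
    most informative first."""
    scores = {name: mutual_information(col, labels)
              for name, col in features.items()}
    return sorted(scores, key=scores.get, reverse=True)

# A feature that copies the label outranks one independent of it.
features = {"copy": [0, 0, 1, 1], "noise": [0, 1, 0, 1]}
print(rank_features(features, labels=[0, 0, 1, 1]))  # ['copy', 'noise']
```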