RELATION BETWEEN SHANNON ENTROPY, RENYI ENTROPY AND INFORMATION
DOI: https://doi.org/10.59367/fj4cbh36
Abstract
This paper provides evidence that the Renyi entropy and information are both limiting cases of the Shannon entropy. Experimental evidence supports these findings. Finally, we offer some broad conclusions on the usefulness of entropy metrics. An appendix gives a brief history of the concept of physical entropy.
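For context, the standard definitions behind this relationship can be stated as follows; this is background notation only, not the paper's own derivation, and the symbols used here are not taken from the paper itself. For a discrete distribution p = (p_1, ..., p_N), the Shannon entropy and the Renyi entropy of order α are

H(p) = -\sum_{i=1}^{N} p_i \log p_i, \qquad H_\alpha(p) = \frac{1}{1-\alpha} \log \sum_{i=1}^{N} p_i^{\alpha}, \quad \alpha \ge 0,\ \alpha \neq 1,

with the well-known limit \lim_{\alpha \to 1} H_\alpha(p) = H(p), and the Hartley information \log N recovered at \alpha = 0.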
References
P A Bromiley, M Pokric, and N A Thacker. Computing covariances for mutual information coregistration. In Proceedings MIUA 2004, 2004.
P A Bromiley, M Pokric, and N A Thacker. Empirical evaluation of covariance matrices for mutual information coregistration. In Proceedings MICCAI 2004, 2004.
P A Bromiley, M. Pokric, and N A Thacker. Tina memo 2004-001: Empirical evaluation of covariance matrices for mutual information coregistration. Technical report, Imaging Science and Biomedical Engineering Division, Medical School, University of Manchester, 2004.
P A Bromiley and N A Thacker. Tina memo 2003-002: Computing covariances for mutual information coregistration 2. Technical report, Imaging Science and Biomedical Engineering Division, Medical School, University of Manchester, 2003.
A Cho. A fresh take on disorder, or disorderly science? Science, 297:1268–1269, 2002.
R V L Hartley. Transmission of information. Bell System Technical Journal, page 535, July 1928.
J Havrda and F Charvat. Quantification method of classification processes: concept of structural α-entropy. Kybernetika, 3:30–35, 1967.
H Nyquist. Certain factors affecting telegraph speed. Bell System Technical Journal, page 324, April 1924.
H Nyquist. Certain topics in telegraph transmission theory. A.I.E.E. Trans., page 617, April 1928.
K Ord and S Arnold. Kendall’s Advanced Theory of Statistics: Distribution Theory. Arnold, 1998.
A Renyi. On measures of entropy and information. In Proc. Fourth Berkeley Symp. Math. Stat. Prob., 1960, volume 1, page 547, Berkeley, 1961. University of California Press.
C E Shannon. A mathematical theory of communication. Bell System Technical Journal, 27:379–423 and 623–656, July and October 1948.
N A Thacker and P A Bromiley. Tina memo 2001-013: Computing covariances for mutual information coregistration. Technical report, Imaging Science and Biomedical Engineering Division, Medical School, University of Manchester, 2001.
C Tsallis. Possible generalization of Boltzmann-Gibbs statistics. Journal of Statistical Physics, 52:479–487, 1988.
P Viola and W M Wells. Alignment by maximisation of mutual information. International Journal of Computer Vision, 24(2):137–154, 1997.
License
Copyright (c) 2023 International Journal of Futuristic Innovation in Engineering, Science and Technology (IJFIEST)
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.