A view on weighted exponential entropy and an examination of some of its features

Document Type : Research Paper

Authors

Department of Statistics, University of Sistan and Baluchestan, Zahedan, Iran

Abstract

One alternative to Shannon entropy is a measure of information called exponential entropy. Shannon and exponential entropies depend only on the event probabilities. These measures have also been extended to incorporate a set of weights associated with the events; such weights may reflect additional characteristics of the events, such as their relative importance. In this paper, axiomatic derivations and properties of weighted exponential entropy, parallel to those obtained for weighted entropy, are investigated. The relation between the weighted exponential entropy of X and that of a strictly monotone, nonnegative function of X is obtained. The generalized weighted entropy and the generalized weighted exponential entropy for continuous random variables are also presented.
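The measures discussed above can be illustrated for a discrete distribution. The sketch below assumes Guiasu's weighted entropy, -Σ w_i p_i log p_i, and the Pal-Pal form of exponential entropy, Σ p_i e^(1-p_i); the weighted exponential variant shown is an illustrative combination of the two, not necessarily the exact definition used in the paper.

```python
import math

def shannon_entropy(p):
    """Shannon entropy: -sum p_i * log(p_i), natural logarithm."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def weighted_entropy(p, w):
    """Guiasu's (1971) weighted entropy: -sum w_i * p_i * log(p_i)."""
    return -sum(wi * pi * math.log(pi) for pi, wi in zip(p, w) if pi > 0)

def exponential_entropy(p):
    """Pal-Pal exponential entropy: sum p_i * e^(1 - p_i)."""
    return sum(pi * math.exp(1 - pi) for pi in p)

def weighted_exponential_entropy(p, w):
    """Illustrative weighted analogue: sum w_i * p_i * e^(1 - p_i)."""
    return sum(wi * pi * math.exp(1 - pi) for pi, wi in zip(p, w))

# Illustrative distribution and importance weights (hypothetical values).
p = [0.5, 0.3, 0.2]
w = [1.0, 2.0, 3.0]
print(shannon_entropy(p), weighted_entropy(p, w))
print(exponential_entropy(p), weighted_exponential_entropy(p, w))
```

With uniform weights w_i = 1, the weighted measures reduce to their unweighted counterparts, which is the sanity check suggested by the axiomatic treatment the abstract refers to.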


References

[1] A. D. Alnaser, A. I. Rawashdeh and A. Talal. (2022). On using Shannon entropy measure for formulating new weighted exponential distribution. Journal of Taibah University for Science, 16(1), 1035-1047. https://doi.org/10.1080/16583655.2022.2135806
[2] B. A. Bhat and M. A. K. Baig. (2019). A new two parametric weighted generalized entropy for lifetime distributions. Journal of Modern Applied Statistical Methods, 18(2), 2-21. https://doi.org/10.22237/jmasm/1604189340
[3] M. Belis and S. Guiasu. (1968). A quantitative-qualitative measure of information in cybernetic systems. IEEE Transactions on Information Theory, 14(4), 593-594. https://doi.org/10.1109/tit.1968.1054185
[4] L. L. Campbell. (1966). Exponential entropy as a measure of extent of a distribution. Z. Wahrscheinlichkeitstheorie verw. Geb., 5, 217-225. https://doi.org/10.1007/bf00533058
[5] T. M. Cover and J. A. Thomas. (2006). Elements of Information Theory. Second Edition, Wiley, New York. https://doi.org/10.1002/047174882x
[6] S. Das. (2017). On weighted generalized entropy. Communications in Statistics-Theory and Methods, 46 (12), 5707-5727. https://doi.org/10.1080/03610926.2014.960583
[7] A. Di Crescenzo and M. Longobardi. (2006). On weighted residual and past entropies. Scientiae Mathematicae Japonicae, 64, 255-266. https://doi.org/10.48550/arXiv.math/0703489
[8] D. E. Cunha, S. Guimaraes and G. Dial. (1982). On axiomatic characterization of weighted entropy. Information Sciences, 27(2), 91-97. https://doi.org/10.1016/0020-0255(82)90053-6
[9] G. Dial and I. J. Taneja. (1981). On weighted entropy of type (α, β) and its generalizations. Aplikace matematiky, 26(6), 418-425. https://doi.org/10.21136/AM.1981.103931
[10] S. Guiasu. (1971). Weighted entropy. Reports on Mathematical Physics, 2, 165-179. http://dx.doi.org/10.1016/0034-4877(71)90002-4
[11] J. Havrda and F. Charvát. (1967). Quantification method of classification processes: concept of structural a-entropy. Kybernetika, 3, 30-35. http://dml.cz/dmlcz/125526
[12] P. L. Kannappan and P. K. Sahoo. (1986). On the general solution of a functional equation connected to sum form information measures on open domain. International Journal of Mathematics and Mathematical Sciences, 9, 545-550. https://doi.org/10.1155/S0161171286000686
[13] J. N. Kapur. (1983). A comparative assessment of various measures of entropy. Journal of Information and Optimization Sciences, 4(3), 207-232. https://doi.org/10.1080/02522667.1983.10698762
[14] T. O. Kvalseth. (2001). On weighted exponential entropies. Perceptual and Motor Skills, 92(1), 3-7. https://doi.org/10.2466/pms.2001.92.1.3
[15] M. Mahdy. (2018). Weighted entropy measure: a new measure of information with its properties in reliability theory and stochastic orders. Journal of Statistical Theory and Applications, 17(4), 703-718. https://doi.org/10.2991/jsta.2018.17.4.11
[16] D. N. Nawrocki and W. H. Harding. (1986). State-value weighted entropy as a measure of investment risk. Applied Economics, 18(4), 411-419. https://doi.org/10.1080/00036848600000038
[17] N. R. Pal and S. K. Pal. (1989). Object-background segmentation using new definitions of entropy. IEE Proceedings E, 136(4), 284-295. https://doi.org/10.1049/ip-e.1989.0039
[18] N. R. Pal and S. K. Pal. (1991). Entropy: a new definition and its applications. IEEE Transactions on Systems, Man, and Cybernetics, 21, 1260-1270. https://doi.org/10.1109/21.120079
[19] O. Parkash and H. C. Taneja. (1986). Characterization of the quantitative-qualitative measure of inaccuracy for discrete generalized probability distributions. Communications in Statistics-Theory and Methods, 15, 3763-3771. https://doi.org/10.1080/03610928608829345
[20] F. M. Reza. (1994). An Introduction to Information Theory. Dover Publications, New York. ISBN 0-486-68210-2
[21] A. Rényi. (1961). On measures of entropy and information. Proc. Fourth Berkeley Symp. I. Berkeley: UC Press, 547-561. http://projecteuclid.org/euclid.bsmsp/1200512181.
[22] C. E. Shannon. (1948). A mathematical theory of communication. Bell System Technical Journal, 27, 379-423. https://doi.org/10.1002/j.1538-7305.1948.tb01338.x
[23] R. P. Singh and J. D. Bhardwaj. (1992). On parametric weighted information improvement. Information Sciences, 59,149-163. https://doi.org/10.1016/0020-0255(92)90048-D
[24] C. Tsallis. (1988). Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys., 52, 479-487. https://doi.org/10.1007/BF01016429
[25] S. Yu and T. Huang. (2017). Exponential weighted entropy and exponential weighted mutual information. Neurocomputing, 249, 86-94. https://doi.org/10.1016/j.neucom.2017.03.075

History

• Receive Date: 10 November 2022
• Revise Date: 10 June 2023
• Accept Date: 13 September 2023