
The unified extropy and its versions in classical and Dempster–Shafer theories

Published online by Cambridge University Press:  23 October 2023

Francesco Buono*
Affiliation:
Università di Napoli Federico II
Yong Deng**
Affiliation:
University of Electronic Science and Technology of China
Maria Longobardi***
Affiliation:
Università di Napoli Federico II

*Postal address: RWTH Aachen University, Aachen, Germany. Email address: francesco.buono3@unina.it
**Postal address: University of Electronic Science and Technology of China, China. Email address: dengentropy@uestc.edu.cn
***Postal address: Università di Napoli Federico II, Naples, Italy.

Abstract

Measures of uncertainty are a topic of considerable and growing interest. Recently, the introduction of extropy as a measure of uncertainty, dual to Shannon entropy, has opened up new aspects of the subject. Because many versions of entropy exist, a unified formulation has been introduced to handle all of them conveniently. Here we consider the possibility of defining a unified formulation for extropy by introducing a measure that depends on two parameters. For particular choices of these parameters, the measure recovers the well-known formulations of extropy. Moreover, the unified formulation of extropy is also analyzed in the context of the Dempster–Shafer theory of evidence, and an application to classification problems is given.
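To make the entropy–extropy duality mentioned in the abstract concrete, here is a minimal Python sketch of discrete Shannon entropy alongside the extropy of Lad, Sanfilippo and Agrò (2015); the function names are illustrative, not from the paper.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(X) = -sum_i p_i * log(p_i), natural log."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def extropy(p):
    """Extropy J(X) = -sum_i (1 - p_i) * log(1 - p_i), the complementary
    dual of Shannon entropy (Lad, Sanfilippo and Agro, 2015)."""
    return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)

# For any two-point distribution the two measures coincide,
# since {p, 1 - p} and {1 - p, p} are the same multiset.
p2 = [0.3, 0.7]
print(shannon_entropy(p2))  # ≈ 0.6109
print(extropy(p2))          # ≈ 0.6109
```

For distributions on more than two points the two measures generally differ, which is what makes a separate unified treatment of extropy worthwhile.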

Type
Original Article
Copyright
© The Author(s), 2023. Published by Cambridge University Press on behalf of Applied Probability Trust


References

Balakrishnan, N., Buono, F. and Longobardi, M. (2022). A unified formulation of entropy and its application. Physica A 596, 127214.
Balakrishnan, N., Buono, F. and Longobardi, M. (2022). On Tsallis extropy with an application to pattern recognition. Statist. Prob. Lett. 180, 109241.
Buono, F. and Longobardi, M. (2020). A dual measure of uncertainty: the Deng extropy. Entropy 22, 582.
Dempster, A. P. (1967). Upper and lower probabilities induced by a multivalued mapping. Ann. Math. Statist. 38, 325–339.
Deng, Y. (2016). Deng entropy. Chaos Solitons Fractals 91, 549–553.
Deng, Y. (2020). Uncertainty measure in evidence theory. Sci. China Inf. Sci. 63, 210201.
Di Crescenzo, A. and Longobardi, M. (2002). Entropy-based measure of uncertainty in past lifetime distributions. J. Appl. Prob. 39, 434–440.
Dua, D. and Graff, C. (2019). UCI Machine Learning Repository. Available at http://archive.ics.uci.edu/ml.
Kang, B. Y., Li, Y., Deng, Y., Zhang, Y. J. and Deng, X. Y. (2012). Determination of basic probability assignment based on interval numbers and its application. Acta Electron. Sin. 40, 1092–1096.
Kazemi, M. R., Tahmasebi, S., Buono, F. and Longobardi, M. (2021). Fractional Deng entropy and extropy and some applications. Entropy 23, 623.
Lad, F., Sanfilippo, G. and Agrò, G. (2015). Extropy: complementary dual of entropy. Statist. Sci. 30, 40–58.
Liu, F., Gao, X. and Deng, Y. (2019). Generalized belief entropy and its application in identifying conflict evidence. IEEE Access 7, 126625–126633.
Mirali, M. and Baratpour, S. (2017). Some results on weighted cumulative entropy. J. Iranian Statist. Soc. 16, 21–32.
Rao, M., Chen, Y., Vemuri, B. and Wang, F. (2004). Cumulative residual entropy: a new measure of information. IEEE Trans. Inf. Theory 50, 1220–1228.
Shafer, G. (1976). A Mathematical Theory of Evidence. Princeton University Press.
Shannon, C. E. (1948). A mathematical theory of communication. Bell System Tech. J. 27, 379–423.
Smets, P. (2000). Data fusion in the transferable belief model. In Proceedings of the Third International Conference on Information Fusion, vol. 1, pp. PS21–PS33. IEEE.
Toomaj, A., Sunoj, S. and Navarro, J. (2017). Some properties of the cumulative residual entropy of coherent and mixed systems. J. Appl. Prob. 54, 379–393.
Tsallis, C. (1988). Possible generalization of Boltzmann–Gibbs statistics. J. Statist. Phys. 52, 479–487.
Ubriaco, M. R. (2009). Entropies based on fractional calculus. Phys. Lett. A 373, 2516–2519.
Zhou, Q. and Deng, Y. (2021). Belief eXtropy: measure uncertainty from negation. Commun. Statist. Theory Meth. 52, 3825–3847.