Towards an Axiomatization for the Generalization of the Kullback-Leibler Divergence to Belief Functions
Available Online August 2011.
- https://doi.org/10.2991/eusflat.2011.28
- Dempster-Shafer theory of belief functions, channel capacity, Kullback-Leibler divergence
- In his information theory, Shannon defined a notion of uncertainty, the entropy, which has been generalized in several ways to belief functions. He also defined the channel capacity, for which this paper proposes the first generalization to belief functions. To do so, we must first generalize the Kullback-Leibler (KL) divergence, for which the present work proposes some axioms. The list of axioms is not yet exhaustive, since the proposed solution is not unique. The generalization nevertheless has many practical uses: the notion of channel capacity serves, for example, to characterize and optimize sensor systems, and extending it to belief functions makes it possible to include imprecise sensors such as the human. Finally, we show an example of a gradient algorithm for computing the generalized channel capacity.
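The abstract builds on the classical KL divergence and channel capacity, which it then generalizes to belief functions. The generalized versions are not reproduced in this abstract; as background only, the classical notions can be sketched as follows. Note that this sketch uses the well-known Blahut-Arimoto iteration rather than the gradient algorithm the paper itself proposes, which operates on belief functions.

```python
import numpy as np

def kl_divergence(p, q):
    """Classical Kullback-Leibler divergence D(p || q) in bits."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute nothing
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def channel_capacity(W, iters=500):
    """Capacity (in bits) of a discrete memoryless channel with
    transition matrix W[x][y] = P(y | x), via Blahut-Arimoto."""
    W = np.asarray(W, dtype=float)
    n_in = W.shape[0]
    r = np.full(n_in, 1.0 / n_in)  # input distribution, start uniform
    for _ in range(iters):
        q = r @ W  # induced output distribution
        # KL divergence of each row W(.|x) from the output distribution
        d = np.array([kl_divergence(W[x], q) for x in range(n_in)])
        r = r * np.exp2(d)  # multiplicative update toward the optimum
        r /= r.sum()
    q = r @ W
    # Capacity = mutual information at the optimizing input distribution
    return float(sum(r[x] * kl_divergence(W[x], q) for x in range(n_in)))
```

For instance, for a binary symmetric channel with crossover probability 0.1, `channel_capacity([[0.9, 0.1], [0.1, 0.9]])` returns approximately 0.531 bits, matching the closed form 1 − H(0.1). The capacity is itself expressed through KL divergences, which is why the paper must axiomatize a generalized KL divergence before generalizing the capacity.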
- Open Access
- This is an open access article distributed under the CC BY-NC license.
Cite this article
TY  - CONF
AU  - Hélène Soubaras
PY  - 2011/08
DA  - 2011/08
TI  - Towards an Axiomatization for the Generalization of the Kullback-Leibler Divergence to Belief Functions
PB  - Atlantis Press
SP  - 1090
EP  - 1097
SN  - 1951-6851
UR  - https://doi.org/10.2991/eusflat.2011.28
DO  - https://doi.org/10.2991/eusflat.2011.28
ID  - Soubaras2011/08
ER  -