Some Inequalities on ‘Useful’ Mean g-deviation with Applications in Information Theory
Abstract
The objective of this correspondence is to elaborate on some recent inequality results: we present a new refinement of the ‘useful’ Jensen inequality, together with applications in information theory. A refinement of Jensen's inequality is established for convex functions defined on a convex subset of a linear space. We derive sharp lower bounds for the ‘useful’ mean deviation, the ‘useful’ mean h-absolute deviation, and ‘useful’ divergence measures, and finally give applications to divergence measures. Uniqueness results for the ‘useful’ KL divergence and the ‘useful’ Jeffreys divergence are obtained.
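The abstract does not define the ‘useful’ divergences themselves (these weight probabilities by utilities in the full paper), but the classical KL and Jeffreys divergences they generalize are standard. A minimal sketch of those classical definitions, assuming finite probability vectors with matching support:

```python
import math

def kl_divergence(p, q):
    """Classical Kullback-Leibler divergence KL(P || Q) in nats.
    Assumes p and q are probability vectors and q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jeffreys_divergence(p, q):
    """Jeffreys divergence: the symmetrised sum KL(P || Q) + KL(Q || P)."""
    return kl_divergence(p, q) + kl_divergence(q, p)

p = [0.5, 0.5]
q = [0.25, 0.75]
print(kl_divergence(p, q))        # non-negative; zero iff p == q
print(jeffreys_divergence(p, q))  # symmetric in p and q
```

Note that KL is asymmetric, which is precisely what the Jeffreys divergence repairs; the paper's ‘useful’ variants refine lower bounds for both.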
Article Details
Licensing
TURCOMAT publishes articles under the Creative Commons Attribution 4.0 International License (CC BY 4.0). This license permits any use of the work, provided the original author(s) and source are credited, thereby facilitating the free exchange and use of research for the advancement of knowledge.
Detailed Licensing Terms
Attribution (BY): Users must give appropriate credit, provide a link to the license, and indicate if changes were made. Users may do so in any reasonable manner, but not in any way that suggests the licensor endorses them or their use.
No Additional Restrictions: Users may not apply legal terms or technological measures that legally restrict others from doing anything the license permits.