The objective of this paper is to elaborate on some recent results on inequalities: we give a new refinement of the ‘useful’ Jensen's inequality, together with applications in information theory. A refinement of Jensen's inequality is established for convex functions defined on a convex subset of a linear space. We provide sharper lower bounds for the ‘useful’ mean deviation, the ‘useful’ mean h-absolute deviation, and ‘useful’ divergences, and finally we give applications to divergence measures. In particular, uniqueness results for the ‘useful’ KL-divergence and the ‘useful’ Jeffreys divergence are obtained.
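To illustrate the connection between Jensen's inequality and the divergence measures mentioned here, the following is a minimal sketch of the classical (unweighted) KL and Jeffreys divergences; the ‘useful’ weighted variants studied in the paper are not reproduced, and the function names are illustrative. Nonnegativity of the KL divergence follows from Jensen's inequality applied to the convex function −log.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) = sum_i p_i * log(p_i / q_i).

    Terms with p_i == 0 contribute 0 by the convention 0 * log 0 = 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jeffreys_divergence(p, q):
    """Jeffreys divergence: the symmetrized KL, J(p, q) = D(p||q) + D(q||p)."""
    return kl_divergence(p, q) + kl_divergence(q, p)

p = [0.2, 0.5, 0.3]
q = [0.4, 0.4, 0.2]

# Jensen's inequality for -log implies D(p||q) >= 0, with equality iff p == q.
print(kl_divergence(p, q) >= 0)                          # True
print(jeffreys_divergence(p, q) >= kl_divergence(p, q))  # True, since D(q||p) >= 0
```

Refinements of Jensen's inequality of the kind described in the abstract tighten the trivial lower bound 0 in such nonnegativity statements.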