International Research Journal of Engineering and Technology (IRJET)
Volume: 11 Issue: 10 | Oct 2024
e-ISSN: 2395-0056 | p-ISSN: 2395-0072
www.irjet.net
APPROXIMATE CONVEXITY OF USEFUL INFORMATION OF J-DIVERGENCE OF TYPE ALPHA IN CONNECTION WITH J-S MIXTURE DISTANCE MODELS

Rohit Kumar Verma1, M. BhagyaLaxmi2

1 Associate Professor, Department of Mathematics, Bharti Vishwavidyalaya, Durg, C.G., India
2 Research Scholar (Ph.D.), Department of Mathematics, Bharti Vishwavidyalaya, Durg, C.G., India
Abstract - In this work, we review the Kullback-Leibler divergence and the Jeffreys distance divergence measures for the flexible family of multivariate R-norm distributions, and we use the Jeffreys divergence measure to compare members of the multivariate R-norm family. A J-divergence measure based on the Renyi-Tsallis entropy is, much like the Kullback-Leibler divergence, related to Shannon's entropy. In this paper, we characterize the sum of two general measures associated with two distributions of discrete random variables. One of these measures is logarithmic, while the other contains a power of the variable and is named the J-divergence based on Renyi-Tsallis entropy measures. Some illustrative examples are given to support the findings and to further exhibit the adequacy of the measure.

Keywords - Shannon's Entropy, Kullback-Leibler Divergence, J-Divergence, Information Measure, Jensen-Shannon.
1. INTRODUCTION

1.1 KULLBACK-LEIBLER DIVERGENCE (KL-DIVERGENCE)

The relative entropy from Q to P, for discrete probability distributions P and Q specified on the same sample space X, is defined as in [12, 13, 15]

\[
D_{KL}(P \,\|\, Q) = \sum_{x \in X} P(x) \log \frac{P(x)}{Q(x)},
\]

or, equivalently,

\[
D_{KL}(P \,\|\, Q) = -\sum_{x \in X} P(x) \log \frac{Q(x)}{P(x)}.
\]
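As a small worked illustration (our own numbers, not taken from [12, 13, 15]), let X = {0, 1}, P = (1/2, 1/2), and Q = (3/4, 1/4), with base-2 logarithms:

\[
D_{KL}(P \,\|\, Q) = \tfrac{1}{2}\log_2\!\frac{1/2}{3/4} + \tfrac{1}{2}\log_2\!\frac{1/2}{1/4} = 1 - \tfrac{1}{2}\log_2 3 \approx 0.2075 \ \text{bits},
\]
\[
D_{KL}(Q \,\|\, P) = \tfrac{3}{4}\log_2\!\frac{3/4}{1/2} + \tfrac{1}{4}\log_2\!\frac{1/4}{1/2} = \tfrac{3}{4}\log_2 3 - 1 \approx 0.1887 \ \text{bits}.
\]

The two directed divergences differ, which is precisely the asymmetry that symmetric measures such as the J-divergence are designed to remove.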
This study develops several new generalized measures of useful relative information and examines their particular cases. These measures also yield novel and useful information measures, together with their relationships to various entropy measures. Relative entropy is defined only under the following conditions: (1) for all x, Q(x) = 0 implies P(x) = 0 (absolute continuity of P with respect to Q); (2) if Q(x) = 0 for some x with P(x) > 0, then $D_{KL}(P \,\|\, Q) = +\infty$.
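The following is a minimal sketch of these conditions in code; the function name kl_divergence, the list-based inputs, the use of base-2 logarithms, and the usual convention that a term with P(x) = 0 contributes zero are our own illustrative choices:

```python
import math

def kl_divergence(p, q):
    """Discrete D_KL(P || Q) in bits, under the conditions above:
    a term with P(x) = 0 contributes 0 (usual convention), and
    Q(x) = 0 with P(x) > 0 makes the divergence +infinity."""
    total = 0.0
    for px, qx in zip(p, q):
        if px == 0:
            continue                # 0 * log(0 / q) is taken as 0
        if qx == 0:
            return math.inf         # P not absolutely continuous w.r.t. Q
        total += px * math.log2(px / qx)
    return total

# The worked example above: P = (1/2, 1/2), Q = (3/4, 1/4)
print(kl_divergence([0.5, 0.5], [0.75, 0.25]))  # ~0.2075 bits
print(kl_divergence([0.5, 0.5], [1.0, 0.0]))    # inf: Q(x)=0 where P(x)>0
```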
For distributions P and Q of a continuous random variable, relative entropy is defined to be the integral

\[
D_{KL}(P \,\|\, Q) = \int_{-\infty}^{\infty} p(x) \log \frac{p(x)}{q(x)} \, dx,
\]
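A hedged numerical sketch of this definition: the two normal densities, the integration limits, and the use of scipy below are our own illustrative choices; the closed-form expression for the divergence between two univariate normals is standard.

```python
import math
from scipy.integrate import quad
from scipy.stats import norm

# D_KL(P || Q) for P = N(0, 1) and Q = N(1, 2^2), in nats.
mu1, s1, mu2, s2 = 0.0, 1.0, 1.0, 2.0
p = norm(mu1, s1).pdf
q = norm(mu2, s2).pdf

# Direct numerical evaluation of the defining integral.
integrand = lambda x: p(x) * math.log(p(x) / q(x))
numeric, _ = quad(integrand, -20, 20)

# Known closed form for two univariate normal distributions.
closed = math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

print(numeric, closed)  # both ~0.4431 nats
```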
where p and q are the probability densities of P and Q. It has the following general formulation: if P and Q are probability measures on a measurable space X, and P is absolutely continuous with respect to Q, then the relative entropy from Q to P is defined as
\[
D_{KL}(P \,\|\, Q) = \int_X \log\!\left(\frac{dP}{dQ}\right) P(dx),
\]

where dP/dQ is the Radon-Nikodym derivative of P with respect to Q.
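To see that this general formulation is consistent with the discrete definition above, note that for distributions on a countable sample space the Radon-Nikodym derivative is simply the ratio of point masses (a short check, added here for the argument's continuity):

\[
\frac{dP}{dQ}(x) = \frac{P(x)}{Q(x)}
\quad\Longrightarrow\quad
\int_X \log\!\left(\frac{dP}{dQ}\right) P(dx)
= \sum_{x \in X} P(x) \log \frac{P(x)}{Q(x)}
= D_{KL}(P \,\|\, Q).
\]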