
Chain Rule for Mutual Information

The mutual information of random variables X and Y is defined by

I(X; Y) = H(X) − H(X|Y) = H(Y) − H(Y|X)
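As a quick numerical check, the sketch below computes I(X; Y) both ways from a small joint distribution p(x, y) over two binary variables. The probabilities and variable names are assumptions chosen only for illustration, and NumPy is assumed available.

```python
import numpy as np

# Hypothetical joint distribution p(x, y) over two binary variables
# (rows index x, columns index y); the numbers are made up for illustration.
p_xy = np.array([[0.30, 0.20],
                 [0.10, 0.40]])

def entropy(p):
    """Shannon entropy in bits of a probability array, ignoring zero entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_x = p_xy.sum(axis=1)   # marginal p(x)
p_y = p_xy.sum(axis=0)   # marginal p(y)

# Conditional entropies via H(A|B) = H(A, B) - H(B)
h_xy = entropy(p_xy)
h_x_given_y = h_xy - entropy(p_y)
h_y_given_x = h_xy - entropy(p_x)

# Both forms of the definition give the same mutual information.
i_xy_1 = entropy(p_x) - h_x_given_y   # H(X) - H(X|Y)
i_xy_2 = entropy(p_y) - h_y_given_x   # H(Y) - H(Y|X)
print(i_xy_1, i_xy_2)                 # both are about 0.1245 bits here
```

Both expressions print the same value, as the two forms of the definition require.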

The conditional mutual information of random variables X and Y given Z is defined by

I(X; Y|Z) = H(X|Z) − H(X|Y,Z)
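This quantity can be computed from a joint distribution p(x, y, z) using the identity H(A|B) = H(A, B) − H(B) for the conditional entropies. The sketch below is a minimal illustration with a made-up distribution; the names and numbers are assumptions, not part of the original material.

```python
import numpy as np

# Hypothetical joint distribution p(x, y, z) over three binary variables,
# indexed as p_xyz[x, y, z]; the numbers are made up for illustration.
p_xyz = np.array([[[0.10, 0.05], [0.15, 0.10]],
                  [[0.05, 0.20], [0.10, 0.25]]])

def entropy(p):
    """Shannon entropy in bits of a probability array, ignoring zero entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_z  = p_xyz.sum(axis=(0, 1))   # marginal p(z)
p_xz = p_xyz.sum(axis=1)        # marginal p(x, z)
p_yz = p_xyz.sum(axis=0)        # marginal p(y, z)

# Conditional entropies via H(A|B) = H(A, B) - H(B)
h_x_given_z  = entropy(p_xz)  - entropy(p_z)
h_x_given_yz = entropy(p_xyz) - entropy(p_yz)

# I(X; Y | Z) = H(X|Z) - H(X|Y, Z)
i_xy_given_z = h_x_given_z - h_x_given_yz
print(i_xy_given_z)
```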

The chain rule for mutual information of random variables X1, X2, …, Xn and Y states that

I(X1, X2, …, Xn; Y) = Σ (i = 1 to n) I(Xi; Y | Xi−1, Xi−2, …, X1)

That is, the information that X1, …, Xn jointly provide about Y decomposes into a sum of conditional mutual information terms, each giving the additional information that Xi contributes about Y beyond X1, …, Xi−1.
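For two variables the chain rule reads I(X1, X2; Y) = I(X1; Y) + I(X2; Y | X1). The sketch below checks this identity numerically on a made-up joint distribution p(x1, x2, y); all names and numbers are assumptions chosen only for illustration.

```python
import numpy as np

# Hypothetical joint distribution p(x1, x2, y), indexed as p[x1, x2, y];
# the numbers are made up purely for illustration.
p = np.array([[[0.12, 0.08], [0.10, 0.15]],
              [[0.05, 0.20], [0.18, 0.12]]])

def entropy(q):
    """Shannon entropy in bits of a probability array, ignoring zero entries."""
    q = q[q > 0]
    return -np.sum(q * np.log2(q))

p_x1   = p.sum(axis=(1, 2))   # marginal p(x1)
p_y    = p.sum(axis=(0, 1))   # marginal p(y)
p_x1x2 = p.sum(axis=2)        # marginal p(x1, x2)
p_x1y  = p.sum(axis=1)        # marginal p(x1, y)

# Left-hand side: I(X1, X2; Y)
lhs = entropy(p_x1x2) + entropy(p_y) - entropy(p)

# Right-hand side: I(X1; Y) + I(X2; Y | X1)
i_x1_y          = entropy(p_x1) + entropy(p_y) - entropy(p_x1y)
i_x2_y_given_x1 = (entropy(p_x1x2) - entropy(p_x1)) - (entropy(p) - entropy(p_x1y))
rhs = i_x1_y + i_x2_y_given_x1

print(lhs, rhs, np.isclose(lhs, rhs))   # the two sides agree
```

The check reports agreement for any valid joint distribution, since the identity is algebraic in the underlying entropies.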
