
Mutual Information between two nodes

cgPerera
2014-07-30
2014-08-06
  • cgPerera

    cgPerera - 2014-07-30

    Hi,

    I am trying to identify the node that can give me the best evidence with respect to a certain hypothesis.
    Is there any way to find the mutual information (cross entropy) between two nodes in a Bayesian network? The mutual information between nodes X and Y is defined as I(X, Y) = H(X) − H(X|Y). To calculate it, I need P(X|Y) and P(X, Y). Is there a way to obtain P(X|Y) and P(X, Y) for any two nodes X and Y?

    Your prompt response is highly appreciated.
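
    In case it helps later readers: once you have the joint P(X, Y) (the marginals follow from it by summation, and P(X|Y) = P(X, Y)/P(Y)), the mutual information can be computed directly. Below is a minimal sketch in Python using the equivalent form I(X; Y) = Σ P(x, y) log[ P(x, y) / (P(x) P(y)) ], which equals H(X) − H(X|Y); the function name and the example tables are illustrative, not part of any UnBBayes API.

    ```python
    import math

    def mutual_information(joint):
        """Compute I(X;Y) in bits from a joint distribution table.

        joint[i][j] is P(X=i, Y=j); rows index states of X,
        columns index states of Y. Entries must sum to 1.
        """
        px = [sum(row) for row in joint]            # marginal P(X)
        py = [sum(col) for col in zip(*joint)]      # marginal P(Y)
        mi = 0.0
        for i, row in enumerate(joint):
            for j, pxy in enumerate(row):
                if pxy > 0.0:
                    # I(X;Y) = sum_{x,y} P(x,y) * log2( P(x,y) / (P(x)P(y)) )
                    mi += pxy * math.log2(pxy / (px[i] * py[j]))
        return mi

    # Two perfectly correlated binary nodes: I(X;Y) = H(X) = 1 bit
    print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))  # 1.0
    # Two independent binary nodes: I(X;Y) = 0
    print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
    ```

    Picking the evidence node is then a matter of computing this against the hypothesis node for each candidate and taking the maximum.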

  • Shou Matsumoto

    Shou Matsumoto - 2014-08-02

    Hello.

    My apologies (for my ignorance) if I'm wrong, but as far as I remember there is no method in the core package of UnBBayes (i.e. UnBBayes without plug-ins) precisely for this purpose, so you may need to calculate such mutual information "manually", or write your own code...

    However, in the learning plug-in (the id of the plug-in is unbbayes.learning.<version#>), there is a class called unbbayes.learning.CBLToolkit. In this class, the method "mutualInformation(LearningNode xi, LearningNode xk)" may be close to what you are looking for (sorry, I could not check/compare it in details myself). The method "conditionalMutualInformation(int v1, int v2, ArrayList sep)" may also be useful.

    Unfortunately, these methods belong to the Bayesian learning module, so their signatures use classes/objects specific to it (you may need to change some code if you want to run them outside the context of Bayesian learning).

    Sorry if this was not the answer you were expecting.

    Sincerely,


    Last edit: Shou Matsumoto 2014-08-02
  • cgPerera

    cgPerera - 2014-08-06

    Hi,

    Thank you for the valuable information. I will check it out and post an update on how it goes.

    Thanks
