Computing conditional entropy
I am trying to compute the conditional entropy of arbitrary sets of nodes in a Bayesian network, given some evidence that has been inserted into the network. For this, I tried the method `jointMutualInformation` in `LazyPropagation`. However, this method seems to give wrong results when evidence is present in the network. The reason appears to be that it removes any evidence from the network before computing, see here. Is this intended behaviour? If so, is there a quick workaround to compute `jointMutualInformation` with the evidence kept in the network?
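One workaround idea, sketched below in plain Python, is to bypass `jointMutualInformation` and compute the conditional entropy directly from posteriors that already incorporate the evidence, using the identity H(X | Y, e) = H(X, Y | e) - H(Y | e). The probability tables here are illustrative numbers, not from the question; in practice they would come from the inference engine's joint posterior (e.g. a joint target in `LazyPropagation`), which does respect evidence.

```python
import math

def entropy(probs):
    """Shannon entropy (in bits) of a probability table given as a flat
    iterable of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0.0)

def conditional_entropy(joint_xy, marginal_y):
    """H(X | Y, e) = H(X, Y | e) - H(Y | e), where both arguments are
    posteriors that already condition on the evidence e."""
    return entropy(joint_xy) - entropy(marginal_y)

# Illustrative (assumed) posteriors over two binary variables X and Y:
# P(X, Y | e) as a flat table over (x, y) pairs, and P(Y | e) obtained
# by summing X out of the joint.
joint = [0.4, 0.1, 0.2, 0.3]     # P(x0,y0|e), P(x0,y1|e), P(x1,y0|e), P(x1,y1|e)
marg_y = [0.4 + 0.2, 0.1 + 0.3]  # P(y0|e), P(y1|e)
print(conditional_entropy(joint, marg_y))
```

Since the joint posterior is computed with the evidence in place, this avoids relying on whatever evidence handling `jointMutualInformation` does internally.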
Edited by Robert Passmann