Sequential evidence updates on a binary node converge to an extreme
I'm trying to use pyAgrum for sequential updating of a belief network, but it is not behaving as I expect. Take a simple single-node example of a belief in a fair coin:
import pyAgrum as gum

bn = gum.fastBN("Coin")
bn.cpt("Coin").fillWith([0.5, 0.5])  # uniform prior: fair coin
Let's imagine we get evidence of 100 throws giving 70 heads; to add that evidence:
ie = gum.LazyPropagation(bn)
ie.setEvidence({"Coin": [0.3, 0.7]})  # soft (likelihood) evidence
ie.makeInference()
ie.posterior("Coin")
provides a posterior distribution of 0.3/0.7, which is a little surprising: the 'network' changes its mind completely in the face of evidence. Moreover, if we then accept the posterior as a new prior and add a new observation (which, for this thought experiment, happens to give the same result):
bn.cpt("Coin").fillWith(ie.posterior("Coin"))  # posterior becomes the new prior
ie = gum.LazyPropagation(bn)
ie.setEvidence({"Coin": [0.3, 0.7]})
ie.makeInference()
ie.posterior("Coin")
This then gives (on my system at least) 0.1552/0.8448, and repeating the procedure converges to 0/1. I may be misunderstanding something, but I had expected the posterior to converge to 0.3/0.7. Where am I going wrong?
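To illustrate the compounding numerically, here is a pure-Python sketch (no pyAgrum needed) under the assumption that soft evidence acts as a likelihood vector that multiplies the current prior and is renormalized. It reproduces the 0.1552/0.8448 figure on the second round and drifts toward 0/1 thereafter:

```python
# Sketch of the update I believe is happening (assumption, not the
# documented pyAgrum semantics): each round multiplies the prior by
# the likelihood vector [0.3, 0.7] and renormalizes, so feeding the
# posterior back in as the prior compounds the same evidence.
prior = [0.5, 0.5]
likelihood = [0.3, 0.7]

for i in range(20):
    unnorm = [p * l for p, l in zip(prior, likelihood)]
    total = sum(unnorm)
    prior = [u / total for u in unnorm]
    # round 1: [0.3, 0.7]; round 2: ~[0.1552, 0.8448]

print(prior)  # prior[1] approaches 1.0
```

The ratio after n rounds is (0.3/0.7)^n, which explains the monotone drift to an extreme rather than convergence to 0.3/0.7.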