I am currently taking the PGM course by Daphne Koller on Coursera. There, we generally model a Bayesian network as a cause-and-effect directed graph over the variables in the observed data. But in the PyMC tutorials and examples I've seen, the model is not quite structured the same way as a PGM, or at least I'm confused: in PyMC, the parents of an observed real-world variable are often the parameters of the distribution used to model that variable.
Now my question is really a practical one. Suppose I have three variables for which data is observed (A, B, C); let's assume they are all continuous, just for the sake of it. From domain knowledge, one can say that A and B cause C. So we have a BN here: A and B are the parents and C is the child. From the BN factorization, P(A, B, C) = P(C | A, B) * P(A) * P(B).
I can say A and B are normal distributions with some mu and sigma, but how do I model P(C | A, B)? What I really want to learn is how to build this BN in PyMC so that I can query it. Or do I have to augment the BN with the parameters of the distributions in some fashion?
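For concreteness, here is roughly what I imagine the model might look like. The data, the priors, and the linear link between A, B and the mean of C are all made up purely to illustrate; I'm not sure this is the right way to express P(C | A, B):

```python
import numpy as np
import pymc as pm

# Made-up observed data, just for illustration
rng = np.random.default_rng(0)
a_obs = rng.normal(0.0, 1.0, size=200)
b_obs = rng.normal(5.0, 2.0, size=200)
c_obs = 2.0 * a_obs + 0.5 * b_obs + rng.normal(0.0, 0.3, size=200)

with pm.Model() as model:
    # Priors on the parameters of A's and B's marginal distributions
    mu_a = pm.Normal("mu_a", 0.0, 10.0)
    sigma_a = pm.HalfNormal("sigma_a", 5.0)
    mu_b = pm.Normal("mu_b", 0.0, 10.0)
    sigma_b = pm.HalfNormal("sigma_b", 5.0)

    # P(A) and P(B): the observed "root" variables of the BN
    A = pm.Normal("A", mu=mu_a, sigma=sigma_a, observed=a_obs)
    B = pm.Normal("B", mu=mu_b, sigma=sigma_b, observed=b_obs)

    # P(C | A, B): here I am just guessing a linear dependence of
    # C's mean on A and B -- this is the part I'm unsure about
    beta_a = pm.Normal("beta_a", 0.0, 10.0)
    beta_b = pm.Normal("beta_b", 0.0, 10.0)
    sigma_c = pm.HalfNormal("sigma_c", 5.0)
    C = pm.Normal("C", mu=beta_a * A + beta_b * B, sigma=sigma_c, observed=c_obs)

    idata = pm.sample()
```

Is this the right way to encode the edge from A, B to C, i.e. by picking some functional form for how C's distribution depends on A and B? Or is there a more direct way to represent the conditional P(C | A, B)?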
Is this problem solvable using PyMC, or have I got some fundamentals wrong?
Any help would be appreciated!