Questions tagged [belief-propagation]

Belief propagation, also known as sum-product message passing, is a message-passing algorithm for performing inference on graphical models, such as Bayesian networks and Markov random fields. It calculates the marginal distribution for each unobserved node, conditional on any observed nodes. Belief propagation is commonly used in artificial intelligence and information theory and has demonstrated empirical success in numerous applications including low-density parity-check codes, turbo codes, free energy approximation, and satisfiability.
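
In concrete terms, on a tree-structured factor graph the sum-product messages and resulting marginals take the following standard form (added here for orientation; $N(\cdot)$ denotes the neighbors of a node and $\mathbf{x}_f$ the variables attached to factor $f$):

$$\mu_{x \to f}(x) = \prod_{g \in N(x) \setminus \{f\}} \mu_{g \to x}(x), \qquad \mu_{f \to x}(x) = \sum_{\mathbf{x}_f \setminus x} f(\mathbf{x}_f) \prod_{y \in N(f) \setminus \{x\}} \mu_{y \to f}(y)$$

$$P(x) \propto \prod_{f \in N(x)} \mu_{f \to x}(x)$$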

28 questions
5
votes
1 answer

Belief propagation using pgmpy lib - algorithm understanding

I am starting to use the pgmpy library to implement probabilistic graphical models. The probability I get using this library differs from the one I get manually (e.g., using SamIam). Here is a screenshot of the very small graphical model made in…
Olga
  • 53
  • 3
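
For readers new to the library, here is a minimal sketch of exact inference with pgmpy's BeliefPropagation. The two-node network, variable names, and CPD values are illustrative assumptions, not the asker's model; older pgmpy releases name the model class BayesianModel rather than BayesianNetwork.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import BeliefPropagation

# Toy two-node network A -> B (illustrative values only).
model = BayesianNetwork([("A", "B")])
model.add_cpds(
    TabularCPD("A", 2, [[0.6], [0.4]]),
    TabularCPD("B", 2, [[0.7, 0.2], [0.3, 0.8]],
               evidence=["A"], evidence_card=[2]),
)

bp = BeliefPropagation(model)
print(bp.query(variables=["A"], evidence={"B": 1}))  # P(A | B=1)
```

Comparing a model this small against a hand calculation (or SamIam) helps isolate discrepancies; a common culprit is the ordering of parent configurations in the columns of TabularCPD.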
4
votes
1 answer

How do you do belief propagation on nodes with conditional dependence?

Take this graph: [diagram: a box node $S$ marked $+$ with edges to circle variable nodes $X_1$ and $X_2$; the original LaTeX diagram is garbled in this excerpt] …
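
The usual resolution, sketched here in generic notation rather than the asker's exact setup: group the jointly interacting variables into a single factor $f$; the factor-to-variable message then sums over all configurations of the remaining neighbors:

$$\mu_{f \to S}(s) = \sum_{x_1, x_2} f(s, x_1, x_2)\, \mu_{X_1 \to f}(x_1)\, \mu_{X_2 \to f}(x_2)$$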
4
votes
0 answers

Can (loopy) belief propagation be used to learn from a data set?

I'm trying to expand my experience with restricted Boltzmann machines to a more general class of graphical models and currently learning about belief propagation using message passing algorithms. One thing I can't understand is whether BP may be…
3
votes
0 answers

Model or State Uncertainty in Queueing Model due to uncertain arrival rate

$\textbf{Introduction}$ I am currently modelling a scenario where two queues need to be served by a single server under a non-preemptive discipline. I am comfortable generating the optimal policy via Value or Policy Iteration when given the arrival…
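
Since the excerpt mentions Value Iteration, here is a minimal, generic sketch over a finite MDP with known transition probabilities. The data layout (P[s][a] as a list of (prob, next_state, reward) tuples) and the toy two-state MDP are illustrative assumptions; they do not encode the asker's queueing model or arrival rates.

```python
import numpy as np

def value_iteration(P, n_states, n_actions, gamma=0.95, tol=1e-8):
    """P[s][a] -> list of (prob, next_state, reward) outcome tuples."""
    V = np.zeros(n_states)
    while True:
        Q = np.full((n_states, n_actions), -np.inf)  # -inf marks unavailable actions
        for s in range(n_states):
            for a, outcomes in P[s].items():
                Q[s, a] = sum(p * (r + gamma * V[s2]) for p, s2, r in outcomes)
        V_new = Q.max(axis=1)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=1)  # value function and greedy policy
        V = V_new

# Toy 2-state, 2-action MDP purely for demonstration.
P = {
    0: {0: [(1.0, 0, 0.0)], 1: [(0.5, 1, 1.0), (0.5, 0, 0.0)]},
    1: {0: [(1.0, 0, 2.0)], 1: [(1.0, 1, 0.5)]},
}
V, policy = value_iteration(P, n_states=2, n_actions=2)
print(V, policy)
```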
3
votes
2 answers

What is the difference between belief propagation and loopy belief propagation?

Belief propagation (BP) is an algorithm (or a family of algorithms) that can be used to perform inference on graphical models (e.g. a Bayesian network). BP can produce exact results on cycle-free graphs (or trees). BP is a message passing algorithm:…
user82135
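
To make the contrast concrete: loopy BP runs the same update on a graph with cycles, attaching an iteration index and hoping for convergence. For a pairwise MRF with node potentials $\psi_i$ and edge potentials $\psi_{ij}$:

$$\mu^{(t+1)}_{i \to j}(x_j) = \sum_{x_i} \psi_i(x_i)\, \psi_{ij}(x_i, x_j) \prod_{k \in N(i) \setminus \{j\}} \mu^{(t)}_{k \to i}(x_i), \qquad b_i(x_i) \propto \psi_i(x_i) \prod_{k \in N(i)} \mu_{k \to i}(x_i)$$

On a tree this terminates with exact marginals; on a loopy graph the iteration may or may not converge, and the resulting beliefs $b_i$ are approximations.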
3
votes
1 answer

Bethe approximation for factor graphs

I am confused about computing the Bethe approximation for factor graphs here. It generalizes the Bethe approximation from the pairwise case. However, I am wondering why (75) leads to (78) via (76): We can verify that $$\hat b_i(x_i)\propto…
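
The linked derivation is not reproduced here, but for orientation, the standard fixed-point beliefs of the factor-graph Bethe approximation (in the notation of Yedidia, Freeman, and Weiss) are

$$\hat b_i(x_i) \propto \prod_{a \in N(i)} \mu_{a \to i}(x_i), \qquad \hat b_a(\mathbf{x}_a) \propto f_a(\mathbf{x}_a) \prod_{i \in N(a)} \prod_{b \in N(i) \setminus \{a\}} \mu_{b \to i}(x_i)$$

which is presumably the form the cited equations (75)-(78) are manipulating.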
3
votes
0 answers

Is the posterior a sufficient statistic when observations are conditionally independent?

Suppose there are two random variables, $X_1$ and $X_2$, and we're trying to infer $\theta$. If $X_1$ and $X_2$ are conditionally independent, then is $f(\theta|X_1)$ a sufficient statistic for $X_1$? I.e., will this hold: $f(\theta|X_1,X_2) =…
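
The key step the question is circling, written out (standard Bayesian updating, added for reference): if $X_1 \perp X_2 \mid \theta$, then

$$f(\theta \mid x_1, x_2) \propto f(x_2 \mid \theta)\, f(x_1 \mid \theta)\, f(\theta) \propto f(x_2 \mid \theta)\, f(\theta \mid x_1)$$

so the posterior given $X_1$ can stand in for the prior when $X_2$ arrives; in that sense it carries everything the update needs from $X_1$.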
3
votes
1 answer

How do you find mathematical expressions for the posterior marginals i.e. $P(x_n|y_0, ... , y_n)$ in an HMM?

My goal is to find closed-form equations for the posterior marginals $P(x_n|y_0, ... , y_n)$ in a general HMM. I was told that we can calculate them exactly via BP (belief propagation, though I'm not sure how BP comes into the picture...) in an iterative…
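
For reference, BP on a chain specializes to the forward (filtering) recursion, which yields exactly these marginals: with $\alpha_0(x_0) = p(x_0)\, p(y_0 \mid x_0)$,

$$\alpha_n(x_n) = p(y_n \mid x_n) \sum_{x_{n-1}} p(x_n \mid x_{n-1})\, \alpha_{n-1}(x_{n-1}), \qquad P(x_n \mid y_0, \ldots, y_n) = \frac{\alpha_n(x_n)}{\sum_{x_n'} \alpha_n(x_n')}$$

The $\alpha$ quantities are precisely the forward BP messages on the HMM's chain-structured graph.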
2
votes
1 answer

Do variational approximations capture the flow of influence or "conditional independence" relationships in graphical models?

Probabilistic Graphical Models (PGMs) are used to model all sorts of complex decision processes, such as medical diagnoses or robot positions, etc. In common machine learning textbooks, like Christopher Bishop's book on pattern recognition or…
2
votes
0 answers

What is the relation between message passing and probabilities in Bayesian inference?

The belief propagation algorithm is a message passing algorithm that can be used to estimate marginal probabilities on Bayesian networks. What is the definition of these messages? What is the relation between the messages and the marginal…
user82135
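
In short, and complementing the sum-product updates sketched under the tag description above: on a tree, the message $\mu_{f \to x}(x)$ is proportional to the probability of the observed evidence in the branch of the graph reached through $f$, given $x$, so the normalized product of all incoming messages recovers the conditional marginal:

$$P(x \mid \text{evidence}) \propto \prod_{f \in N(x)} \mu_{f \to x}(x)$$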
2
votes
2 answers

How does Dempster-Shafer relate to Machine Learning?

I read Dempster-Shafer can be thought of as a generalization of Bayesian theory. Say I have data from disparate sources that indicate the class of some object. If I have some prior beliefs about the world, or by examination of some corpus of…
Pavel Komarov
  • 1,097
  • 9
  • 19
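
For the "combining disparate sources" part of the question, the relevant primitive is Dempster's rule of combination, which fuses two basic mass assignments $m_1, m_2$ into

$$m_{1 \oplus 2}(A) = \frac{1}{1 - K} \sum_{B \cap C = A} m_1(B)\, m_2(C) \quad (A \neq \emptyset), \qquad K = \sum_{B \cap C = \emptyset} m_1(B)\, m_2(C)$$

where $K$ measures the conflict between the sources; Bayesian conditioning is recovered as the special case where all mass sits on singletons.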
2
votes
1 answer

How to compute the Gibbs free energy in the Bethe approximation for an MRF

Hi, I am learning loopy belief propagation for MRFs. The general roadmap is to define a Bethe approximation, which is exact on a tree but only approximate on general graphs. I'm currently stuck at computing the Bethe entropy. Let's consider a…
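
For reference (standard Yedidia-Freeman-Weiss form for a pairwise MRF with node degrees $d_i$), the Bethe entropy counts each edge belief once and corrects for the overcounted node beliefs:

$$H_{\text{Bethe}} = -\sum_{(i,j) \in E} \sum_{x_i, x_j} b_{ij}(x_i, x_j) \ln b_{ij}(x_i, x_j) + \sum_i (d_i - 1) \sum_{x_i} b_i(x_i) \ln b_i(x_i)$$

The Bethe free energy is then $G_{\text{Bethe}} = U_{\text{Bethe}} - H_{\text{Bethe}}$, with average energy $U_{\text{Bethe}} = -\sum_{(i,j) \in E} \sum_{x_i, x_j} b_{ij}(x_i, x_j) \ln \psi_{ij}(x_i, x_j) - \sum_i \sum_{x_i} b_i(x_i) \ln \phi_i(x_i)$.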
1
vote
0 answers

Factor graph vs "moral graph" (preprocessing for belief propagation)

Some articles* about belief propagation formulate the algorithm in terms of undirected graphical models. Thus if we had a directed (acyclic) graphical model, it would need to be preprocessed into an undirected graph before the algorithm can be…
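
A hedged illustration of that preprocessing step with pgmpy (method names as in recent pgmpy releases; the toy v-structure A -> C <- B and its CPD values are illustrative assumptions):

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD

# Toy v-structure A -> C <- B (illustrative values only).
model = BayesianNetwork([("A", "C"), ("B", "C")])
model.add_cpds(
    TabularCPD("A", 2, [[0.5], [0.5]]),
    TabularCPD("B", 2, [[0.5], [0.5]]),
    TabularCPD("C", 2, [[0.9, 0.6, 0.4, 0.1],
                        [0.1, 0.4, 0.6, 0.9]],
               evidence=["A", "B"], evidence_card=[2, 2]),
)

moral = model.to_markov_model()   # moralize: marry parents, drop directions
print(moral.edges())              # note the added A - B edge
factor_graph = moral.to_factor_graph()
```

Either representation supports belief propagation: the factor graph keeps one factor per CPD, while the moral graph represents each CPD as a potential over the corresponding family clique.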
1
vote
1 answer

Introduction to approximate message passing

I'm interested in learning approximate message passing from the paper "Message Passing Algorithms for Compressed Sensing: I. Motivation and Construction". My background is in computer science and engineering and I have never taken a course on…
1
vote
1 answer

Why do we fit Recurrent Neural Networks with backprop instead of message passing/expectation propagation, as with hidden Markov models?

The form of a Recurrent Neural Network (RNN) seems to resemble that of a hidden Markov model. With a hidden Markov model we have transitions between discrete states, as well as an emission variable that charts the observed measurement, from which we…