Questions tagged [markov-random-field]

58 questions
26 votes · 3 answers

What's the difference between a Markov Random Field and a Conditional Random Field?

If I fix the values of the observed nodes of an MRF, does it become a CRF?
6 votes · 1 answer

Use MCMC posterior as prior for future inference

Would you kindly let me know how to use an estimated posterior distribution as the prior for another Bayesian update? Or even use it iteratively, e.g. in my case the posterior is updated according to a spatially correlated prior update?…
colddie
6 votes · 2 answers

Markov Random Fields vs Hidden Markov Model

I'm fairly new to these topics and wanted to know whether there is any relation between Markov Random Fields and Hidden Markov Models (Markov chains). They feel completely different from each other, though in some sources…
6 votes · 1 answer

Markov Random Field Non-Positive Distribution

The joint distribution in a Markov Network can be represented as $P(X=x) = \frac{1}{Z}\prod_k \phi_k(x_k)$, where $\phi_k$ represents the $k^{th}$ factor. While reading Improving Markov Network Structure Learning Using Decision Trees, I came across a line…
statBeginner
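The factorization in the excerpt above can be made concrete on a toy network. The sketch below (my own illustration, not from the question, with hypothetical potential tables) enumerates every state of a binary three-node chain to compute the partition function $Z$ explicitly:

```python
import itertools
import numpy as np

# Hypothetical pairwise potentials for a 3-node chain a - b - c,
# all variables binary: phi_ab[a, b] and phi_bc[b, c].
phi_ab = np.array([[4.0, 1.0], [1.0, 4.0]])
phi_bc = np.array([[2.0, 1.0], [1.0, 2.0]])

def unnormalized(a, b, c):
    # Product of factors, one per clique of the chain.
    return phi_ab[a, b] * phi_bc[b, c]

# Partition function Z: sum the factor product over every joint state.
Z = sum(unnormalized(a, b, c)
        for a, b, c in itertools.product([0, 1], repeat=3))

def p(a, b, c):
    # P(X = x) = (1/Z) * prod_k phi_k(x_k)
    return unnormalized(a, b, c) / Z

# The normalized distribution sums to one by construction.
total = sum(p(a, b, c) for a, b, c in itertools.product([0, 1], repeat=3))
print(round(total, 10))  # 1.0
```

Brute-force enumeration like this is only feasible for tiny networks; for real models $Z$ is the hard part, which is exactly why the positivity/structure-learning questions in this tag arise.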
4 votes · 0 answers

Relation between Gaussian Processes and Gaussian Markov Random Fields

As a non-expert in the field, I am trying to relate Gaussian Processes (GP) and Gaussian Markov Random Fields (GMRF). I might just be confused by the fact that different resources use different formalisms. Here I try to report the main definitions and my…
4 votes · 1 answer

Mean field theory and neural networks

The mean field algorithm has been proposed for use in combination with convolutional networks and recursive neural networks. What is the purpose of doing this? Is the goal to estimate a probability distribution which can be used for the loss function…
4 votes · 1 answer

Confusion regarding terminology related to the junction tree algorithm

As far as I understand, the "junction tree algorithm" is a general inference framework that roughly consists of four steps: 1) triangulate, 2) construct a junction tree, 3) propagate probabilities/pass messages and 4) perform intra-clique…
4 votes · 1 answer

Sampling a random binary matrix with "Gaussian" probability distribution

Let $A_{ij}$ be a $n\times n$ random binary matrix with probability mass function $P(A)$ given by $$ \log P(A)=-\frac 12 \mathrm{tr}\left[\left(A-M\right)^TV\left(A-M\right)\right] + C, $$ where $M$ and $V$ are also $n\times n$ matrices,…
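The question above is listed as unanswered here, but one standard approach to sampling from an unnormalized log-density like this is single-site Metropolis. The sketch below is my own hedged illustration (all names and the choice of $M$ and $V$ are mine; it assumes a symmetric $V$): it proposes flipping one entry of $A$ and accepts based on the log-probability difference, in which $C$ and the normalizer cancel.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_p(A, M, V):
    # Unnormalized log-density: -1/2 * tr[(A - M)^T V (A - M)]
    D = A - M
    return -0.5 * np.trace(D.T @ V @ D)

def metropolis_step(A, M, V):
    # Propose flipping one uniformly chosen entry of the binary matrix.
    n = A.shape[0]
    i, j = rng.integers(n, size=2)
    B = A.copy()
    B[i, j] = 1 - B[i, j]
    # Accept with probability min(1, P(B)/P(A)); C cancels in the ratio.
    if np.log(rng.random()) < log_p(B, M, V) - log_p(A, M, V):
        return B
    return A

n = 4
M = np.full((n, n), 0.5)   # hypothetical "mean" matrix
V = 2.0 * np.eye(n)        # hypothetical symmetric positive-definite V
A = rng.integers(0, 2, size=(n, n)).astype(float)
for _ in range(1000):
    A = metropolis_step(A, M, V)
print(A)  # one (approximate) sample after 1000 single-flip sweeps
```

Recomputing the full trace per step is $O(n^3)$; for larger $n$ one would update only the terms touched by the flipped entry, but the simple version keeps the sketch obviously correct.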
3 votes · 1 answer

Probability of at least one success in a long string of connected events

I have $N$ events ($i$ from $1$ to $N$), each with an estimated probability of success $p(i)$. If all my events were independent I'd be able to calculate the probability of at least one success as $1 - \prod_i (1 - p(i))$. But some events are not…
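The independence baseline in the excerpt above takes only a few lines to check; handling the correlated case needs a joint model (e.g. Monte Carlo samples from an MRF over the events), but the baseline is:

```python
import numpy as np

def p_at_least_one(p):
    # Under independence: P(at least one success) = 1 - prod_i (1 - p_i)
    p = np.asarray(p, dtype=float)
    return 1.0 - np.prod(1.0 - p)

probs = [0.1, 0.2, 0.5]        # illustrative per-event success probabilities
print(p_at_least_one(probs))   # 1 - 0.9 * 0.8 * 0.5 = 0.64
```

Given dependent events, the same quantity can be estimated as the fraction of joint samples containing at least one success, once a joint distribution over the events is specified.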
3 votes · 1 answer

Markov random field potentials

Consider a pairwise Markov random field. For any two neighbours $A$ and $B$, is it correct to use any function to describe the relationship between them? Is there any constraint or condition that a function needs to satisfy to be a valid…
3 votes · 1 answer

Understanding the distribution of the Spike & Slab Restricted Boltzmann Machine (ssRBM)

The ssRBM is described as a way to model mean and covariance using Restricted Boltzmann Machines. I'm reading the paper that introduced the spike and slab Restricted Boltzmann Machine. I have yet to do more than skim the follow-up paper that fine-tuned…
2 votes · 1 answer

How can I derive the joint distribution for this Markov network?

I am reading Bayesian Reasoning And Machine Learning and I'm not sure how to do exercise 4.6 on p. 80. The undirected graph represents a Markov network with nodes $x_1, x_2, x_3, x_4, x_5$, counting clockwise around the pentagon with potentials…
Slim Shady
2 votes · 1 answer

How can I show that these two variables in a Markov network are marginally independent?

I am reading "Bayesian Reasoning And Machine Learning" and I'm doing exercise 4.2 on page 79. This is the exercise: Consider the Markov network $$p(a,b,c)=\phi(a,b)\phi(b,c)$$ Nominally, by summing over $b$, the variables $a$ and $c$ are dependent.…
Slim Shady
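One way the claim in the exercise above can arise is when a potential is rank one: if $\phi(a,b) = f(a)g(b)$, then summing over $b$ leaves $p(a,c) \propto f(a)h(c)$, which factorizes. The sketch below (my own numeric check with hypothetical binary potential tables, not the book's solution) verifies this:

```python
import numpy as np

# p(a, b, c) proportional to phi(a, b) * phi(b, c), all variables binary.
# Rank-one choice phi(a, b) = f(a) * g(b) makes a and c marginally independent.
f = np.array([1.0, 3.0])
g = np.array([2.0, 5.0])
phi_ab = np.outer(f, g)                      # rank-one potential (hypothetical)
phi_bc = np.array([[1.0, 4.0], [2.0, 1.0]])  # arbitrary second potential

joint = phi_ab[:, :, None] * phi_bc[None, :, :]  # axes: (a, b, c)
joint /= joint.sum()                             # normalize

p_ac = joint.sum(axis=1)    # marginalize out b
p_a = p_ac.sum(axis=1)
p_c = p_ac.sum(axis=0)

# Marginal independence: p(a, c) == p(a) * p(c) for every (a, c).
print(np.allclose(p_ac, np.outer(p_a, p_c)))  # True
```

With a generic (full-rank) $\phi(a,b)$ the same check would return False, which matches the excerpt's remark that summing over $b$ nominally leaves $a$ and $c$ dependent.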
2 votes · 0 answers

Decomposition of a Gaussian Markov random field in independent subfields

A zero-mean GMRF (i.e., a multivariate normal distribution whose precision matrix is sparse) with precision $Q \in \mathbb{R}^{n \times n}$ and covariance $\Sigma = Q^{-1}$ is eigendecomposed as $Q = V \Lambda V^\top$ and $\Sigma = V \Lambda^{-1}…
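The eigendecomposition in the excerpt above implies that the rotated field $y = V^\top x$ has independent components with variances $1/\lambda_i$, since $\operatorname{Cov}(y) = V^\top \Sigma V = \Lambda^{-1}$. A small numeric sketch (my own, on a hypothetical chain-graph precision) checks this and uses it to draw an exact sample:

```python
import numpy as np

# Hypothetical sparse precision of a zero-mean GMRF on a 1-D chain.
n = 5
Q = 2.5 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # positive definite

lam, V = np.linalg.eigh(Q)          # Q = V diag(lam) V^T

# In the eigenbasis, y = V^T x has a diagonal covariance:
# Cov(y) = V^T Sigma V = diag(1 / lam).
Sigma = np.linalg.inv(Q)
cov_y = V.T @ Sigma @ V
print(np.allclose(cov_y, np.diag(1.0 / lam)))  # True

# Exact sampling: draw independent y_i ~ N(0, 1/lam_i), map back with x = V y.
rng = np.random.default_rng(0)
y = rng.standard_normal(n) / np.sqrt(lam)
x = V @ y   # one exact sample of the GMRF
```

Note that the eigen-subfields are independent in the rotated coordinates, not over subsets of the original sites; the eigenvectors of a sparse $Q$ are generally dense.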
2 votes · 1 answer

Do variational approximations capture the flow of influence or "conditional independence" relationships in graphical models?

Probabilistic Graphical Models (PGMs) are used to model all sorts of complex decision processes, such as medical diagnoses or robot positions, etc. In common machine learning textbooks, like Christopher Bishop's book on pattern recognition or…