\[ P(y, x; a, b) = \frac{1}{Z}\exp(-\| y - (ax + b) \|_2) \]
The binary segmentation problem: A more practical example
For each pixel define a binary random variable \(y_i\) that denotes whether
the pixel belongs to foreground or background.
\[
Y^* = \arg \max_{y_i \in \{0, 1\}\,\forall i} P(Y, X)
\]
If the image is 100 x 100 pixels, what is the
size of the search space?
\( 2^{100 \times 100} \)
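To get a feel for how large that is, a quick sanity check (Python's integers are arbitrary precision, so we can compute the count exactly):

```python
# Number of binary labelings of a 100 x 100 image:
# each of the 10,000 pixels independently takes a value in {0, 1}.
num_pixels = 100 * 100
search_space = 2 ** num_pixels

# The count has over 3000 decimal digits -- far too many
# labelings to ever enumerate by brute force.
print(len(str(search_space)))  # → 3011
```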
Two unanswered questions
What should we do about the high dimensional search space?
What is so "graphical" about "Probabilistic Graphical Models"?
Recall some probability
Independence: Random variables \(y\) and \(x\) are independent iff
\[ P(y, x) = P(y)P(x) \]
Conditional probability of \(y\) given \(x\):
\[
P(y|x) = \frac{P(y, x)}{P(x)} \qquad \text{or equivalently} \qquad P(y, x) = P(y|x)P(x)
\]
Conditional independence:
\(y\) and \(x\) are conditionally independent given \(z\) iff
\[ P(y, x|z) = P(y|z)P(x|z) \]
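This definition can be checked numerically. Below is a toy joint distribution over three binary variables, constructed so that it factorizes as \(p(x,y,z) = p(z)\,p(x|z)\,p(y|z)\) (all the probability values are made up for illustration); the loop verifies \(P(x, y|z) = P(x|z)P(y|z)\) for every assignment:

```python
import itertools

# Toy joint over binary x, y, z, built so that x and y are
# conditionally independent given z (illustrative numbers only).
p_z = {0: 0.6, 1: 0.4}
p_x_given_z = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}
p_y_given_z = {0: {0: 0.5, 1: 0.5}, 1: {0: 0.9, 1: 0.1}}

joint = {(x, y, z): p_z[z] * p_x_given_z[z][x] * p_y_given_z[z][y]
         for x, y, z in itertools.product([0, 1], repeat=3)}

def p(**fixed):
    """Marginal probability of the given variable assignments."""
    return sum(pr for (x, y, z), pr in joint.items()
               if all({'x': x, 'y': y, 'z': z}[k] == v
                      for k, v in fixed.items()))

# Check P(x, y | z) == P(x | z) P(y | z) for every assignment.
for x, y, z in itertools.product([0, 1], repeat=3):
    lhs = p(x=x, y=y, z=z) / p(z=z)
    rhs = (p(x=x, z=z) / p(z=z)) * (p(y=y, z=z) / p(z=z))
    assert abs(lhs - rhs) < 1e-12
print("conditional independence holds")
```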
Note that independence is intimately linked to factorization.
Probabilistic graphical models (PGMs) represent
conditional independence relations between random variables.
Different types of PGMs
Bayes net
Factor graphs
Conditional Random Fields
Markov Random Fields
Markov Random Fields (MRF)
Define a graph \(G = (V, E)\) where \(V\) is a set of random
variables and \(E\) is a set of edges, such that each random
variable (or set of random variables) is conditionally independent
of all non-neighboring variables given its neighbors.
Examples
Draw an MRF for two random variables (not independent).
Draw an MRF for three random variables (not independent).
Draw an MRF for four random variables (not independent).
Draw an MRF for four random variables that are all independent.
Draw an MRF for three random variables \(x,y,z\) such that
\(x\) and \(y\) are conditionally independent given \(z\).
Draw an MRF for four random variables \(w, x, y, z\) such that
\(x\) and \(z\) are conditionally independent given \(y\) and \(w\),
and \(y\) and \(w\) are conditionally independent given \(x\) and \(z\).
Let's make some independence assumptions
Assume that RV \(y_1\) is independent of the rest of the graph given its neighbors \(y_2, y_3, y_4, y_5\).
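This neighbor assumption is exactly what Gibbs sampling (see "For further reading" below) exploits: the full conditional of one variable depends only on its neighbors. A minimal sketch on a tiny Ising-style grid MRF, where the grid size, the coupling strength `beta`, and the pairwise potential \(\exp(\beta\,[y_i = y_j])\) are all assumptions made for illustration:

```python
import math
import random

random.seed(0)

# Ising-style MRF on an n x n grid with 4-connectivity and a
# made-up pairwise potential exp(beta * [y_i == y_j]).
n, beta = 5, 0.8
y = [[random.randint(0, 1) for _ in range(n)] for _ in range(n)]

def neighbors(i, j):
    """4-connected grid neighbors of pixel (i, j)."""
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        if 0 <= i + di < n and 0 <= j + dj < n:
            yield i + di, j + dj

def gibbs_update(i, j):
    # Because y_ij is conditionally independent of the rest of the
    # grid given its neighbors, the full conditional only needs
    # the neighboring labels -- not all n*n variables.
    scores = []
    for label in (0, 1):
        s = sum(beta * (label == y[a][b]) for a, b in neighbors(i, j))
        scores.append(math.exp(s))
    p1 = scores[1] / (scores[0] + scores[1])
    y[i][j] = 1 if random.random() < p1 else 0

# One full sweep of Gibbs sampling over the grid.
for i in range(n):
    for j in range(n):
        gibbs_update(i, j)
print(sum(map(sum, y)))  # number of foreground pixels after one sweep
```

Each update touches at most four neighbors, so a sweep costs \(O(n^2)\) rather than anything exponential in the number of pixels.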
The following slides are borrowed from L. Ladicky
For further reading
For Gibbs sampling: D. J. C. MacKay, “Introduction to Monte Carlo methods,” in Learning in Graphical Models. Springer, 1998, pp. 175–204.
For belief propagation: F. R. Kschischang, B. J. Frey, and H.-A. Loeliger, “Factor graphs and the sum-product algorithm,” IEEE Transactions on Information Theory, vol. 47, no. 2, pp. 498–519, 2001.