Whitened Expectation Propagation: Non-Lambertian Shape from Shading and Shadow
Summary (2 min read)
1. Introduction
- Probabilistic inference for large loopy graphical models has become an important subfield with a growing body of applications, including many in computer vision.
- These methods have resulted in significant progress for several applications.
- The principal difference between BP and Gaussian EP can thus be summarized by a trade-off in their respective approximating families: BP favors flexible non-Gaussian marginals, while Gaussian EP favors a flexible covariance structure.
- Another possible explanation is that for a grid-based graphical model with D pixels, Gaussian EP requires O(D²) space and O(D³) run time.
- Finally, the authors use the method to efficiently perform inference over large cliques produced by cast shadows and by global spatial priors.
2. Expectation Propagation
- The family P̃ is chosen so that E_P̃[τj(x)] can be estimated easily.
- EP achieves this goal by approximating each potential function φi(x) with an exponential family distribution P̃i(xi | θ^(i)).
- Regardless of the rank of each potential, the posterior covariance matrix S remains full-rank and must be stored as a D×D matrix.
- For large problems with tens of thousands of variables or more, this becomes limiting.
- When the underlying graphical model is highly sparse, such as a nearest-neighbor pairwise-connected MRF, each iteration can be performed in time O(D^1.5) [2].
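To make the cost concrete, the following is a minimal sketch (not the paper's implementation; all names are illustrative) of a single Gaussian-EP-style rank-1 site update on a dense posterior. Storing S dense is the O(D²) space cost, and each Sherman-Morrison update touches all of S, which is why a full sweep over O(D) sites reaches O(D³) time.

```python
import numpy as np

def rank_one_update(S, m, v, delta_precision, delta_shift):
    """Fold a rank-1 site update (extra precision and shift along the
    direction v) into the dense posterior covariance S and mean m,
    via the Sherman-Morrison identity. Illustrative sketch only."""
    Sv = S @ v                                    # O(D^2) work per site
    denom = 1.0 + delta_precision * (v @ Sv)
    S_new = S - (delta_precision / denom) * np.outer(Sv, Sv)
    m_new = m + S_new @ (delta_shift * v)
    return S_new, m_new

D = 100
S = np.eye(D)                 # dense D x D posterior covariance: O(D^2) space
m = np.zeros(D)
v = np.zeros(D); v[0] = 1.0   # a site acting on a single coordinate
S, m = rank_one_update(S, m, v, delta_precision=2.0, delta_shift=1.0)
# Adding precision 2 along v shrinks the variance there from 1 to 1/(1+2).
```

With O(D) sites per sweep and O(D²) work each, one EP sweep is O(D³), matching the bullet above.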
3. Whitened EP
- For many problems of computer vision, both the number of variables D and the number of potentials N grow linearly with the number of pixels.
- Low-rank potentials of large clique size have a wide array of promising applications in computer vision [17, 10].
- Expectation propagation can be made more efficient by limiting the forms of covariance structure expressible by the posterior covariance S; whitened EP does so with respect to the covariance of natural scenes.
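A minimal sketch of the whitening idea, under the assumption that a scene covariance is available (the stand-in covariance and all names below are illustrative, not the paper's): re-express the variables in a basis where the scene covariance becomes the identity, then restrict the posterior covariance to be diagonal in that basis, so it needs only O(D) parameters instead of a dense D×D matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 50
A = rng.standard_normal((D, D))
S_scene = A @ A.T / D + np.eye(D)      # stand-in SPD "scene" covariance

# Whitening transform W from the eigendecomposition: W @ S_scene @ W.T == I.
w, U = np.linalg.eigh(S_scene)
W = U @ np.diag(w ** -0.5) @ U.T       # whitening
Wi = U @ np.diag(w ** 0.5) @ U.T       # inverse whitening

# Posterior covariance constrained to be diagonal in the whitened basis:
d = np.ones(D)                          # O(D) parameters instead of O(D^2)
S_implied = Wi @ np.diag(d) @ Wi.T      # with d == 1 this recovers S_scene
```

Setting d to all ones recovers the scene covariance itself; EP updates then only adjust the D diagonal entries, which is the source of the linear scaling claimed in this section.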
4. Shape from Shading
- Whitened EP permits inference over images in linear time with respect to both pixels and clique size.
- In particular, the authors are interested in whether Gaussian message approximation will be effective when the potentials φi are highly non-Gaussian.
- In recent years, several methods have been developed that solve the classical SfS problem well as long as surface reflectance R is assumed to be Lambertian [19, 17, 6, 3, 7].
- For each pixel, one potential φR(p, q|i) enforces the surface normal to be consistent with the known pixel intensity i(x, y).
- Whitened EP provides two benefits for spatial priors.
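To illustrate the per-pixel potential φR(p, q | i) described above, here is a hedged sketch under a Lambertian reflectance assumption (the paper handles arbitrary reflectance; the Gaussian noise model, σ value, and function names here are illustrative assumptions): the surface gradient (p, q) determines a unit normal, the normal and light direction determine a rendered intensity, and the potential penalizes mismatch with the observed intensity.

```python
import numpy as np

def lambertian_intensity(p, q, L):
    """Rendered intensity for surface gradient (p, q) under unit light
    direction L, using the Lambertian model max(0, n . L) with
    unit normal n proportional to (-p, -q, 1)."""
    n = np.array([-p, -q, 1.0])
    n = n / np.linalg.norm(n)
    return max(0.0, float(n @ L))

def log_phi_R(p, q, i_obs, L, sigma=0.05):
    """Log of an (illustrative) Gaussian reflectance potential for one
    pixel: penalizes deviation of rendered from observed intensity."""
    r = lambertian_intensity(p, q, L) - i_obs
    return -0.5 * (r / sigma) ** 2

L_dir = np.array([0.0, 0.0, 1.0])   # overhead light
# A flat patch (p = q = 0) faces the light head-on, so its intensity is 1,
# and the potential is maximized when the observed intensity is also 1.
```

Note the potential is highly non-Gaussian in (p, q), which is exactly the regime the authors probe when testing Gaussian message approximation.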
5. Conclusions
- The methods in this paper reduce the run time of EP from cubic to linear in the number of pixels for visual inference, while retaining a run time that is linear in clique size.
- The computational expense of inference for large cliques has prohibited the investigation of complex probabilistic models for vision.
- The authors' hope is that whitened EP will facilitate further research in these directions.
- Results for whitened EP on SfS show that the sacrifice in performance for this approach is small, even in problems with highly non-Gaussian potentials.
- Performance remained strong for surfaces with arbitrary reflectance and arbitrary lighting, which is a novel finding in SfS.