Abstract #2477
Bayesian Parallel Imaging with Edge-Preserving Priors
Raj A, Singh G, Zabih R
UCSF
Existing parallel imaging methods are limited by a fundamental tradeoff: suppressing background noise introduces aliasing artifacts. Bayesian methods offer a promising alternative; however, previous methods with spatial priors assume that intensities vary smoothly over the entire image and therefore blur edges. We introduce an edge-preserving prior which instead assumes that intensities are piecewise smooth, and show how to compute its Bayesian estimate efficiently. This is formulated as an optimization problem that requires minimizing a large non-convex objective function, to which traditional continuous minimization methods cannot be applied. However, it is closely related to labeling problems in computer vision for which discrete optimization methods have been developed in recent years. We extend these algorithms, which are based on graph cuts, to address our problem. An empirical analysis indicates a significant improvement in overall image quality compared to conventional SENSE reconstruction.
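
To make the formulation concrete, the following is a minimal sketch of the kind of objective described above, not the authors' exact model. Assuming undersampled Fourier encoding F, coil sensitivity maps S_k, observed coil data y_k with noise variance sigma^2, and an edge-preserving truncated-quadratic penalty over neighboring pixel pairs N with weight lambda and truncation level tau (all illustrative assumptions), the MAP image estimate minimizes

\[
\hat{x} \;=\; \arg\min_{x}\;
\frac{1}{2\sigma^{2}} \sum_{k} \bigl\lVert y_{k} - F S_{k}\, x \bigr\rVert^{2}
\;+\;
\lambda \sum_{(p,q)\in\mathcal{N}} \min\!\bigl( \lvert x_{p} - x_{q} \rvert^{2},\, \tau \bigr).
\]

The truncation at tau keeps large intensity jumps from being penalized without bound, which is what makes the prior piecewise smooth rather than globally smooth; it also makes the objective non-convex, so gradient-based continuous solvers are not directly applicable.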
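As a toy illustration of the graph-cut machinery referred to above (not the authors' extended algorithm), the sketch below uses the open-source PyMaxflow library to restore a noisy binary image with a single max-flow/min-cut computation; the image, smoothness weight, and unary costs are hypothetical stand-ins for the data and prior terms of a real reconstruction.

    import numpy as np
    import maxflow  # PyMaxflow

    # Hypothetical noisy binary image standing in for the quantity being labeled.
    rng = np.random.default_rng(0)
    img = (rng.random((64, 64)) > 0.5).astype(float)

    g = maxflow.Graph[float]()
    nodes = g.add_grid_nodes(img.shape)        # one graph node per pixel
    g.add_grid_edges(nodes, weights=0.5)       # pairwise smoothness term (4-neighborhood)
    g.add_grid_tedges(nodes, img, 1.0 - img)   # unary data terms as terminal edge capacities
    g.maxflow()                                # solve the min-cut exactly
    labels = g.get_grid_segments(nodes)        # boolean label per pixel

Multi-label intensity reconstruction is handled in the graph-cut literature by repeating moves of this binary form (e.g., expansion moves), with each move solved exactly by one such cut.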