Seminar, 2020

15 October
A common task in inverse problems and imaging is finding a solution that is sparse, in the sense that most of its components vanish. In the framework of compressed sensing, general results guaranteeing exact recovery have been proven. In practice, sparse solutions are often computed by combining \ell_1-penalized least squares optimization with an appropriate numerical scheme. A computationally efficient alternative for finding sparse solutions to linear inverse problems is provided by Bayesian hierarchical models, in which sparsity is encoded by a conditionally Gaussian prior whose variance parameter obeys a generalized gamma distribution. In this talk, we discuss the analytic properties of this class of hypermodels, together with their sparsity-promoting effects. The typical benefits of non-convex penalty terms are coupled with convexity guarantees, paving the way for a hybrid solver that offers the best of both worlds.
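As an illustration of the hierarchical model described in the abstract, the sketch below alternates between a least-squares update of the unknown (conditionally Gaussian given the variances) and a closed-form update of the variances. It assumes a gamma hyperprior (the r = 1 member of the generalized gamma family) and an IAS-style alternating minimization; all parameter names and values here are illustrative assumptions, not details taken from the talk.

```python
# Hypothetical sketch: alternating MAP estimation for a conditionally
# Gaussian prior x_j ~ N(0, theta_j) with theta_j gamma-distributed.
# Parameter names (beta, vartheta) and values are illustrative only.
import numpy as np

def ias_sparse_solve(A, b, beta=1.501, vartheta=1e-2, n_iter=50):
    """Alternate an x-update (stacked least squares) with a closed-form
    theta-update obtained from the stationarity condition of the MAP
    objective under a gamma hyperprior."""
    m, n = A.shape
    theta = np.full(n, vartheta)      # initial variances
    eta = beta - 1.5                  # exponent in the theta-update
    x = np.zeros(n)
    for _ in range(n_iter):
        # x-update: minimize ||A x - b||^2 + sum_j x_j^2 / theta_j
        # via the stacked least-squares system [A; D] x = [b; 0]
        D = np.diag(1.0 / np.sqrt(theta))
        M = np.vstack([A, D])
        rhs = np.concatenate([b, np.zeros(n)])
        x, *_ = np.linalg.lstsq(M, rhs, rcond=None)
        # theta-update: positive root of the stationarity condition
        theta = vartheta * (eta / 2 + np.sqrt(eta**2 / 4 + x**2 / (2 * vartheta)))
    return x

# Small demo: recover a 3-sparse vector from an underdetermined system.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = ias_sparse_solve(A, b)
print(np.argsort(-np.abs(x_hat))[:3])  # indices of the largest components
```

For small eta the effective penalty behaves like a (non-convex) approximation of \ell_1, which is how models of this type promote sparsity while retaining tractable updates.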
