2023 Seminar

We consider structured optimization problems defined as the sum of a smooth convex function and a proper, lower semicontinuous (l.s.c.), convex (typically nonsmooth) function in reflexive variable exponent Lebesgue spaces $L^{p(\cdot)}$. Due to their intrinsic space-variant properties, such spaces can be naturally used as solution spaces and combined with space-variant functionals for the solution of ill-posed inverse problems. For this purpose, we propose a new proximal gradient algorithm in $L^{p(\cdot)}$, where the proximal step, rather than depending on the natural (non-separable) $L^{p(\cdot)}$-norm, is defined in terms of its modular function, which, thanks to its separability, allows for the efficient computation of the algorithmic iterates. To highlight the effectiveness of this modeling, we report numerical tests on computed tomography (CT) imaging.
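For reference, the standard definitions of the modular of $L^{p(\cdot)}$ and of the associated Luxemburg norm make the separability point explicit: the modular is an integral of a pointwise quantity, whereas the norm couples all points through an infimum over a common scaling parameter.

\[
  \rho_{p(\cdot)}(u) = \int_\Omega |u(x)|^{p(x)}\,\mathrm{d}x,
  \qquad
  \|u\|_{p(\cdot)} = \inf\bigl\{\lambda > 0 \,:\, \rho_{p(\cdot)}(u/\lambda) \le 1\bigr\}.
\]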
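The following is a minimal discrete sketch, not the algorithm presented in the seminar, of how the separability of the modular makes the proximal step cheap: with a linearized smooth term and a pointwise modular penalty, each iterate splits into independent one-dimensional problems with closed-form solutions. The data fit $f(u)=\tfrac12\|Au-b\|^2$, the penalty $g(u)=\lambda\|u\|_1$, the exponent map p, and the parameters lam, tau are all hypothetical placeholders; step-size conditions from the Banach-space analysis are omitted.

import numpy as np

def modular_prox_grad_step(u, A, b, p, lam, tau):
    """One iterate: minimize <grad f(u), t - u> + lam * ||t||_1
    + (1/tau) * sum_i |t_i - u_i|^{p_i} / p_i, which decouples into
    independent scalar problems solved in closed form (p_i in (1, 2])."""
    a = A.T @ (A @ u - b)            # gradient of the smooth data-fit term
    q = 1.0 / (p - 1.0)              # exponent inverting the duality map
    # Stationary points of the strictly convex scalar surrogate on the
    # branches t > 0 and t < 0 of the l1 term.
    t_pos = u - np.sign(a + lam) * (tau * np.abs(a + lam)) ** q
    t_neg = u - np.sign(a - lam) * (tau * np.abs(a - lam)) ** q
    # Exactly one case holds: positive branch, negative branch, or the
    # kink at zero (where the subdifferential contains 0).
    return np.where(t_pos > 0, t_pos, np.where(t_neg < 0, t_neg, 0.0))

# Tiny usage example with random data (hypothetical sizes and parameters).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
p = rng.uniform(1.2, 2.0, size=10)   # space-variant exponent map p(x)
u = np.zeros(10)
for _ in range(200):
    u = modular_prox_grad_step(u, A, b, p, lam=0.1, tau=1e-3)

Note that for the constant exponent p_i = 2 the update reduces to the classical soft-thresholding step of ISTA, which is one way to sanity-check the sketch.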
