Seminar of 2023

29 September 2023
We investigate a Tikhonov method that embeds a graph Laplacian operator in the penalty term (graphLa+). The novelty lies in building the graph Laplacian from a first approximation of the solution obtained by any other reconstruction method. Consequently, the penalty term becomes dynamic, depending on and adapting to the observed data and noise. We demonstrate that graphLa+ is a regularization method and we rigorously establish both its convergence and stability properties. Moreover, we present selected numerical experiments in 2D computerized tomography, where we combine the graphLa+ method with several reconstructors: Filtered Back Projection (graphLa+FBP), standard Tikhonov (graphLa+Tik), Total Variation (graphLa+TV) and a trained deep neural network (graphLa+Net). The improvement in the quality of the approximated solutions granted by the graphLa+ approach is remarkable for each of these methods. In particular, graphLa+Net outperforms every other method, providing a robust and stable way to employ deep neural networks in applications involving inverse problems.
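The following is a minimal sketch of the graphLa+ idea described above, not the authors' implementation: given a linear forward operator `A`, noisy data `b` and an initial reconstruction `x0` from any method (FBP, Tikhonov, TV, a neural network, ...), a graph Laplacian is assembled from intensity similarities in `x0` and used as the Tikhonov penalty. The neighborhood radius, the Gaussian weight function and the parameter names are illustrative assumptions.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def graph_laplacian_from_image(img, radius=1, sigma=1e-2):
    """Sparse graph Laplacian whose edge weights reflect intensity
    similarity between nearby pixels of the initial reconstruction."""
    h, w = img.shape
    n = h * w
    rows, cols, vals = [], [], []
    for i in range(h):
        for j in range(w):
            p = i * w + j
            for di in range(-radius, radius + 1):
                for dj in range(-radius, radius + 1):
                    if di == 0 and dj == 0:
                        continue
                    ii, jj = i + di, j + dj
                    if 0 <= ii < h and 0 <= jj < w:
                        q = ii * w + jj
                        # Gaussian similarity weight (assumed form)
                        wgt = np.exp(-(img[i, j] - img[ii, jj]) ** 2 / sigma)
                        rows.append(p); cols.append(q); vals.append(wgt)
    W = sp.csr_matrix((vals, (rows, cols)), shape=(n, n))
    D = sp.diags(np.asarray(W.sum(axis=1)).ravel())
    return D - W  # L = D - W

def graphla_plus(A, b, x0_img, lam=1e-2):
    """Tikhonov reconstruction with the graph-Laplacian penalty built
    from the first approximation x0_img:
        min_x ||A x - b||^2 + lam * x^T L x,
    solved via the normal equations (A^T A + lam L) x = A^T b."""
    L = graph_laplacian_from_image(x0_img)
    AtA = sp.csr_matrix(A.T @ A)
    Atb = A.T @ b
    x, _ = spla.cg(AtA + lam * L, Atb, maxiter=500)
    return x.reshape(x0_img.shape)
```

In this sketch the first approximation only shapes the penalty; swapping FBP, Tikhonov, TV or a trained network as the source of `x0_img` yields the graphLa+FBP, graphLa+Tik, graphLa+TV and graphLa+Net variants mentioned in the abstract.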
