Seminar, 2023
September 29, 2023
Davide Bianchi
Numerical analysis seminar
We investigate a Tikhonov method that embeds a graph Laplacian operator in the penalty term (graphLa+). The novelty lies in building the graph Laplacian based on a first approximation of the solution derived by any other reconstruction method. Consequently, the penalty term becomes dynamic, depending on and adapting to the observed data and noise. We demonstrate that graphLa+ is a regularization method and we rigorously establish both its convergence and stability properties.
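The core idea can be illustrated on a toy problem. The sketch below is only a minimal illustration of the two-step scheme described above, not the paper's implementation: all names, the 1D blur operator, the neighbour-graph construction, and the parameter values (`sigma`, `lam0`, `lam`) are illustrative assumptions.

```python
# Illustrative sketch of the graphLa+ idea on a 1D deblurring toy problem.
# All operators and parameters are assumptions, not taken from the paper.
import numpy as np

def graph_laplacian(x0, sigma=0.1):
    """Weighted path-graph Laplacian whose edge weights adapt to an
    initial reconstruction x0: similar neighbours get strong edges,
    so the penalty smooths within regions but not across jumps."""
    n = len(x0)
    W = np.zeros((n, n))
    for i in range(n - 1):
        w = np.exp(-(x0[i] - x0[i + 1]) ** 2 / sigma ** 2)
        W[i, i + 1] = W[i + 1, i] = w
    D = np.diag(W.sum(axis=1))
    return D - W

# Forward operator: a simple moving-average blur (stand-in for the
# tomography operator in the actual experiments).
n = 50
A = np.zeros((n, n))
for i in range(n):
    for j in range(max(0, i - 2), min(n, i + 3)):
        A[i, j] = 1.0 / 5
rng = np.random.default_rng(0)
x_true = (np.arange(n) > n // 2).astype(float)   # piecewise-constant signal
b = A @ x_true + 0.01 * rng.standard_normal(n)   # noisy blurred data

# Step 1: any cheap first reconstruction (here: standard Tikhonov).
lam0 = 1e-2
x0 = np.linalg.solve(A.T @ A + lam0 * np.eye(n), A.T @ b)

# Step 2: graphLa+ -- Tikhonov with the data-adaptive graph Laplacian
# penalty: min_x ||A x - b||^2 + lam * ||L x||^2.
L = graph_laplacian(x0)
lam = 1e-1
x_graphla = np.linalg.solve(A.T @ A + lam * L.T @ L, A.T @ b)
```

Because the Laplacian is built from the first approximation `x0`, the penalty term changes with the observed data, which is exactly what makes the regularization dynamic.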
Moreover, we present selected numerical experiments in 2D computerized tomography, where we combine the graphLa+ method with several reconstructors: Filtered Back Projection (graphLa+FBP), standard Tikhonov (graphLa+Tik), Total Variation (graphLa+TV), and a trained deep neural network (graphLa+Net). The graphLa+ approach markedly improves the quality of the approximate solutions for every base method. In particular, graphLa+Net outperforms all the other combinations, providing a robust and stable way to deploy deep neural networks in applications involving inverse problems.