Seminar, 2023

Density compensation has been introduced artificially in iterative SENSE reconstructions as a preconditioner to accelerate the convergence of the conjugate gradient algorithm. It turns out, however, that density compensation appears naturally if the problem is reformulated with appropriate Euclidean products on the image space and the data space. The same holds for compressed sensing reconstruction, which can be seen as solving the same least-squares problem as iterative SENSE, but with L1 regularization. Modifying the Euclidean products also changes the adjoint operator of the linear model, which has important consequences for implementations. The purpose of this talk is to illustrate how the conjugate gradient method can be modified to take different inner products into account.
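The mechanism described above can be sketched in a few lines. This is a minimal illustration, not the talk's implementation: the function name `cg_weighted`, the use of a dense matrix `A` in place of an encoding operator, and the choice of weights are all assumptions made here. Introducing a data-space inner product ⟨u, v⟩_W = uᴴ diag(w) v changes the adjoint of the model from Aᴴ to Aᴴ diag(w), so the conjugate gradient iteration is applied to the weighted normal equations (Aᴴ W A) x = Aᴴ W b, with the density-compensation weights appearing naturally in the right-hand side and in every operator application.

```python
import numpy as np

def cg_weighted(A, b, w, n_iter=50, tol=1e-10):
    """Conjugate gradient for min_x ||A x - b||_W^2, where
    ||r||_W^2 = r^H diag(w) r and w > 0 are data-space weights
    (playing the role of density compensation).

    Under this weighted inner product the adjoint of A is
    A^H diag(w), so CG runs on (A^H W A) x = A^H W b.
    """
    def adjoint_w(r):
        # Adjoint with respect to the weighted data-space product.
        return A.conj().T @ (w * r)

    x = np.zeros(A.shape[1], dtype=complex)
    r = adjoint_w(b)          # residual of the weighted normal equations
    p = r.copy()
    rs = np.vdot(r, r).real
    for _ in range(n_iter):
        Ap = adjoint_w(A @ p)            # apply A^H W A
        alpha = rs / np.vdot(p, Ap).real
        x += alpha * p
        r -= alpha * Ap
        rs_new = np.vdot(r, r).real
        if rs_new < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Tiny demo: consistent overdetermined system with nonuniform weights.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 4)) + 1j * rng.standard_normal((8, 4))
x_true = rng.standard_normal(4) + 1j * rng.standard_normal(4)
b = A @ x_true
w = rng.uniform(0.5, 2.0, size=8)  # density-compensation-like weights
x = cg_weighted(A, b, w)
```

Because the weights enter through the inner product rather than as an ad-hoc preconditioner, the same structure carries over to the L1-regularized (compressed sensing) case, where the data-fidelity gradient uses the same weighted adjoint.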
