Seminars of 2024

24 April
We are interested in the numerical solution of the matrix least squares problem min_X ∥AXB + CXD − F∥_F, where A and C have full column rank, B and D have full row rank, F is an n×n matrix of low rank, and ∥·∥_F denotes the Frobenius norm. We derive a matrix-oriented implementation of LSQR, and devise an implementation of the truncation step that exploits the properties of the method. Experimental comparisons with the Conjugate Gradient method applied to the normal matrix equation, and with a (new) sketched implementation of matrix LSQR, illustrate the competitiveness of the proposed algorithm. We also explore the applicability of our method in the context of Kronecker-based Dictionary Learning, and devise a representation of the data that appears promising for classification purposes.
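As a minimal illustration of the problem being solved (not the authors' matrix-oriented algorithm), the sketch below applies SciPy's standard vector LSQR to the vectorized problem, using the identity vec(AXB) = (Bᵀ⊗A) vec(X). A `LinearOperator` applies the map X ↦ AXB + CXD and its adjoint without ever forming the Kronecker matrix; the random test matrices and dimensions are assumptions for demonstration only.

```python
# Sketch: solve min_X ||A X B + C X D - F||_F with vector LSQR on the
# vectorized operator vec(X) -> vec(A X B + C X D). This is a baseline
# illustration of the problem, not the matrix-oriented LSQR of the talk.
import numpy as np
from scipy.sparse.linalg import LinearOperator, lsqr

rng = np.random.default_rng(0)
m, n, p, q = 8, 6, 6, 8                      # A, C: m x n; B, D: p x q; X: n x p
A, C = rng.standard_normal((m, n)), rng.standard_normal((m, n))
B, D = rng.standard_normal((p, q)), rng.standard_normal((p, q))
X_true = rng.standard_normal((n, 1)) @ rng.standard_normal((1, p))  # rank-1 target
F = A @ X_true @ B + C @ X_true @ D          # consistent right-hand side

def matvec(x):
    # vec(X) -> vec(A X B + C X D), column-major to match vec(.)
    X = x.reshape(n, p, order="F")
    return (A @ X @ B + C @ X @ D).ravel(order="F")

def rmatvec(y):
    # adjoint map: vec(R) -> vec(A^T R B^T + C^T R D^T)
    R = y.reshape(m, q, order="F")
    return (A.T @ R @ B.T + C.T @ R @ D.T).ravel(order="F")

op = LinearOperator((m * q, n * p), matvec=matvec, rmatvec=rmatvec)
x = lsqr(op, F.ravel(order="F"), atol=1e-12, btol=1e-12, iter_lim=5000)[0]
X = x.reshape(n, p, order="F")
print(np.linalg.norm(A @ X @ B + C @ X @ D - F))  # small residual
```

The matrix-oriented method described in the abstract avoids this vectorization entirely, working with low-rank factors of the iterates and truncating them at each step; the sketch only shows what problem those iterations are solving.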
