Periodic seminars
DEPARTMENT OF MATHEMATICS

SCUBE

A series of semester seminars organized at the Department of Mathematics, University of Bologna. The presentations mainly cover numerical linear algebra problems and their wide range of applications; broader contributions are very welcome.
Organized by: Davide Palitta and Valeria Simoncini

Past seminars

2025
09 January
Giovanni Seraghiti
Part of the series: SCUBE
Numerical analysis seminar
In this seminar, I will talk about Objective Function Free Optimization (OFFO) in the context of pruning the parameters of a given model. OFFO algorithms are methods in which the objective function is never computed; instead, they rely only on derivative information, that is, on the gradient in the first-order case. I will give an overview of the main OFFO methods, focusing on adaptive algorithms such as Adagrad, Adam, RMSprop, and ADADELTA, gradient methods that share the common characteristic of depending only on current and past gradient information to adaptively determine the step size at each iteration.

Next, I will briefly discuss the most popular pruning approaches. As the name implies, pruning a model, typically a neural network, refers to the process of reducing its size and complexity, typically by removing parameters that are considered unnecessary for its performance. Pruning emerges as a compression technique for neural networks, alternative to matrix and tensor factorization or quantization. I will mainly focus on pruning-aware methods, which use specific rules to classify parameters as relevant or irrelevant at each iteration, enhancing convergence to a solution of the problem at hand that is robust to pruning irrelevant parameters after training.

Finally, I will introduce a novel deterministic algorithm that is both adaptive and pruning-aware, based on a modified Adagrad scheme, which converges to a solution robust to pruning with complexity $\log(k)/k$. I will illustrate some preliminary results on different applications.
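As a toy illustration of the adaptive step-size idea described above (not the speaker's algorithm), the sketch below runs plain Adagrad on a two-variable quadratic and then applies a hypothetical magnitude-based pruning rule; the function, learning rate, and threshold `tau` are invented for the example:

```python
import math

def adagrad_prune(grad, x0, lr=0.5, eps=1e-8, tau=0.05, steps=200):
    """Illustrative Adagrad loop followed by a simple magnitude-based
    pruning mask (hypothetical rule; not the algorithm from the talk).
    grad: function returning the gradient at x as a list of floats."""
    x = list(x0)
    acc = [0.0] * len(x)           # running sum of squared gradients
    for _ in range(steps):
        g = grad(x)
        for i in range(len(x)):
            acc[i] += g[i] ** 2    # Adagrad: step size adapts to past gradients
            x[i] -= lr * g[i] / (math.sqrt(acc[i]) + eps)
    # prune: zero out parameters whose magnitude falls below tau
    return [xi if abs(xi) > tau else 0.0 for xi in x]

# Minimize f(x) = x0^2 + 10*(x1 - 1)^2: the first coordinate is driven
# toward 0 and gets pruned away; the second survives near 1.
sol = adagrad_prune(lambda x: [2 * x[0], 20 * (x[1] - 1)], [1.0, -1.0])
```

Note how each coordinate gets its own effective step size `lr / sqrt(acc[i])`, which is the shared trait of the adaptive OFFO methods listed in the abstract.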
2024
16 December
This seminar presents two recent works focused on sparse signal recovery and inverse problems. The first part introduces the truncated Huber penalty, a non-convex penalty function designed for robust signal recovery. We explore its application in constrained and unconstrained models, proving theoretical properties of the optimal solutions. An efficient algorithm based on the block coordinate descent method is also discussed, along with applications in signal and image processing. The second part covers a generalized Tikhonov regularization framework with spatially varying weights estimated via a neural network. This end-to-end approach integrates adaptive parameter estimation, improving detail preservation.
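As a rough sketch of the kind of non-convex penalty discussed (the exact parametrization in the work may differ), one plausible truncated Huber penalty caps the classical Huber function at a fixed level, so that large residuals all pay the same cost:

```python
def huber(t, delta=1.0):
    """Classical Huber function: quadratic near 0, linear in the tails."""
    a = abs(t)
    return 0.5 * a * a if a <= delta else delta * (a - 0.5 * delta)

def truncated_huber(t, delta=1.0, tau=2.0):
    """One plausible truncated Huber penalty (illustrative parametrization):
    the Huber value is capped at huber(tau), trading convexity for
    robustness, since outliers beyond tau no longer grow the penalty."""
    return min(huber(t, delta), huber(tau, delta))
```

The cap is what makes the penalty non-convex, and it is also why specialized schemes such as block coordinate descent are natural for the resulting models.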
2024
26 November
Paolo Zuzolo
Part of the series: SCUBE
Numerical analysis seminar
3D shape analysis tasks often involve characterizing a 3D object by an invariant, computationally efficient, and discriminative numerical representation, called a shape descriptor. Among these, spectral-based shape descriptors have become increasingly widespread, since the spectrum is an isometry invariant and is thus independent of the object’s representation, including parametrization and spatial position [1]. However, large spectral decompositions and the choice of the most significant eigen-couples become computationally expensive for large sets of data points. We introduce a concise learning-based shape descriptor, computed through a Generalized Graph Neural Network (G-GNN) [2]. The G-GNN is an unsupervised graph neural network leveraging spectral-based convolutional operators derived from a learnable, energy-driven evolution process. Applied to a 3D polygonal mesh, the G-GNN makes it possible to learn features acting as a global shape descriptor of the 3D object. Using a Dirichlet-like energy on the 3D mesh leads to a spectral and intrinsic shape descriptor, tied to the isometry-invariant Laplace-Beltrami operator. Finally, by equipping the G-GNN with a suitable shape retrieval loss, the spectral shape descriptor can be employed in non-linear dimensionality reduction problems, since it can define an optimal embedding, squeezing the latent information of a 3D model into a compact low-dimensional shape representation.

[1] Martin Reuter, Franz-Erich Wolter, Niklas Peinecke, Laplace–Beltrami spectra as ‘Shape-DNA’ of surfaces and solids, Computer-Aided Design, Volume 38, Issue 4, 2006, Pages 342-366, ISSN 0010-4485, https://doi.org/10.1016/j.cad.2005.10.011.
[2] D. Lazzaro, S. Morigi, P. Zuzolo, Learning intrinsic shape representations via spectral mesh convolutions, Neurocomputing, Volume 598, 2024, 128152, ISSN 0925-2312, https://doi.org/10.1016/j.neucom.2024.128152.
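The spectral "Shape-DNA" idea from the abstract can be illustrated on a toy combinatorial graph Laplacian, a crude stand-in for the Laplace-Beltrami discretizations used on real meshes; the tiny graph and its known eigenpairs are chosen purely for the example:

```python
def graph_laplacian(n, edges):
    """Combinatorial graph Laplacian L = D - A of an undirected graph,
    a discrete analogue of the Laplace-Beltrami operator on a mesh."""
    L = [[0] * n for _ in range(n)]
    for i, j in edges:
        L[i][i] += 1   # degree on the diagonal
        L[j][j] += 1
        L[i][j] -= 1   # minus adjacency off the diagonal
        L[j][i] -= 1
    return L

def matvec(L, v):
    return [sum(Lij * vj for Lij, vj in zip(row, v)) for row in L]

# Path graph on 3 vertices: a toy "mesh". Its spectrum {0, 1, 3} plays
# the role of the isometry-invariant spectral descriptor of the shape.
L = graph_laplacian(3, [(0, 1), (1, 2)])
# Verify the known eigenpairs directly: L v = lam v.
for lam, v in [(0, [1, 1, 1]), (1, [1, 0, -1]), (3, [1, -2, 1])]:
    assert matvec(L, v) == [lam * vi for vi in v]
```

The spectrum is unchanged under any relabeling-consistent rigid motion of the vertices, which is the invariance property the abstract relies on; the cost of computing many eigenpairs on large meshes is what motivates the learned descriptor.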
2024
10 October
Kai Bergermann, Math Dept, TU-Chemnitz, Germany
Part of the series: SCUBE
Numerical analysis seminar
Multiplex networks are used to model complex systems from myriad applications. They generalize classical complex networks by recording different types of relationships, different interactions, or changing interactions over time between the same entities in different layers. They possess natural linear algebraic representations in terms of structured matrices, which makes efficient numerical linear algebra techniques a valuable tool for their analysis. In this talk, we give an overview of several network science problems that can be formulated in terms of matrix function expressions, which we approximate by polynomial and rational Krylov methods. We discuss centrality measures, the solution of stiff systems of non-linear differential equations with exponential Runge--Kutta integrators, as well as un- and semi-supervised community detection. Additionally, we present a nonlinear spectral method for core-periphery detection in multiplex networks. All presented methods have a linear runtime scaling, which allows the treatment of large-scale multiplex networks, and we present numerical experiments for all considered problems.
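As a minimal sketch of a matrix-function centrality measure, the example below computes subgraph centrality, the diagonal of exp(A), for a small single-layer graph via a dense truncated Taylor series; practical codes, like those in the talk, would instead use polynomial or rational Krylov approximations of f(A)b, and the graph here is invented for illustration:

```python
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm_taylor(A, terms=20):
    """Dense truncated Taylor series for the matrix exponential exp(A).
    Only a small illustrative stand-in: at scale one approximates matrix
    function expressions with polynomial/rational Krylov methods."""
    n = len(A)
    E = [[float(i == j) for j in range(n)] for i in range(n)]  # identity
    P = [row[:] for row in E]                                  # A^k term
    fact = 1.0
    for k in range(1, terms):
        P = matmul(P, A)
        fact *= k
        for i in range(n):
            for j in range(n):
                E[i][j] += P[i][j] / fact
    return E

# Star graph: node 0 linked to nodes 1, 2, 3. Subgraph centrality
# diag(exp(A)) counts weighted closed walks and ranks the hub highest.
A = [[0, 1, 1, 1], [1, 0, 0, 0], [1, 0, 0, 0], [1, 0, 0, 0]]
E = expm_taylor(A)
centrality = [E[i][i] for i in range(4)]
```

For multiplex networks, the same idea is applied to a structured supra-adjacency matrix coupling the layers, which is where exploiting matrix structure with Krylov methods pays off.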