Drafted 27-10-2023 09:56
Location:
Videoconference link
Start: 10/11/2023 17:30
End: 10/11/2023 18:30
Abstract
Iterative (implicit) regularization is a classic idea in regularization theory that has recently become popular in machine learning. On the one hand, it makes it possible to design efficient algorithms that control numerical and statistical accuracy at the same time. On the other hand, it sheds light on the learning curves observed while training neural networks. In this talk, we focus on iterative regularization in the context of classification. After contrasting this setting with regression and inverse problems, we develop an iterative regularization approach based on the hinge loss function, which is frequently used in practice. More precisely, we consider a diagonal approach for a family of algorithms for which we prove convergence as well as rates of convergence. Our approach compares favorably with other alternatives, as also confirmed by numerical simulations.
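For readers unfamiliar with the idea, the sketch below illustrates the general principle of iterative (implicit) regularization for classification: plain subgradient descent on the hinge loss, where the number of iterations plays the role of the regularization parameter (early stopping). It is a minimal illustration under assumed toy data, step size, and stopping times, not the diagonal approach presented in the talk.

# Minimal sketch: iterative (implicit) regularization via early-stopped
# subgradient descent on the hinge loss. The iteration count acts as the
# regularization parameter; data, step size, and stopping rule are assumptions.
import numpy as np

def hinge_subgradient_descent(X, y, n_iters=100, step=0.1):
    """Subgradient descent on the average hinge loss.

    X: (n_samples, n_features) array, y: labels in {-1, +1}.
    Returns the iterate after n_iters steps; stopping earlier yields
    a more strongly (implicitly) regularized classifier.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        margins = y * (X @ w)
        # Subgradient of the mean hinge loss: -y_i * x_i on violated margins.
        active = margins < 1
        grad = -(y[active, None] * X[active]).sum(axis=0) / n
        w -= step * grad
    return w

# Toy usage: two Gaussian blobs, classifiers obtained at different stopping times.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])
for t in (5, 50, 500):
    w = hinge_subgradient_descent(X, y, n_iters=t)
    acc = np.mean(np.sign(X @ w) == y)
    print(f"iterations={t:4d}  train accuracy={acc:.2f}  ||w||={np.linalg.norm(w):.2f}")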
About the Speaker
Vassilis Apidopoulos is a postdoctoral researcher in the Laboratory for Computational and Statistical Learning (LCSL) at the MaLGa Research Center (Università di Genova), working with Silvia Villa and Lorenzo Rosasco on implicit regularization. He completed his Ph.D. at the Institut de Mathématiques de Bordeaux with a thesis on inertial gradient descent algorithms. Before that, he did his Master's studies at the Université Claude Bernard Lyon 1 and the École Normale Supérieure de Lyon, and his undergraduate studies in the Department of Mathematics of the Aristotle University of Thessaloniki in Greece. His research interests lie in the field of optimization, with a particular focus on machine learning applications.
Event link