
Angelos Nikitaras: Solving least-squares problems involving large dense matrices

SCCS Colloquium


With the advent of Deep Learning, training large-scale neural networks has become a crucial task across a wide range of applications. Typically, training is performed with gradient-based optimization algorithms, most notably Stochastic Gradient Descent and its variants. Random feature models offer a different approach: the parameters of the hidden layers are sampled from a data-agnostic distribution, such as a normal distribution, and only the output layer is trained. Building on this concept, the Sampling Where It Matters (SWIM) algorithm was recently proposed. In SWIM, the hidden-layer parameters are constructed using a data-driven sampling scheme, followed by solving a linear least-squares problem for the output layer. This linear problem is dense and highly ill-conditioned, posing significant challenges for traditional numerical solvers.
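To make the setting concrete, the following is a minimal sketch of a random-feature fit in Python: the hidden weights are sampled (here, data-agnostically from a standard normal; SWIM would replace this step with its data-driven scheme), and the output layer is obtained from a dense least-squares solve. The function name, width, and tanh activation are illustrative assumptions, not details from the talk.

```python
import numpy as np

def random_feature_fit(X, y, width=512, seed=0):
    """Minimal random-feature sketch: hidden weights are sampled rather
    than trained, and only the output layer is fit by least squares.
    All names and choices here are illustrative."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.standard_normal((d, width))   # sampled hidden weights
    bias = rng.standard_normal(width)     # sampled hidden biases
    Phi = np.tanh(X @ W + bias)           # hidden-layer feature matrix

    # Dense, often ill-conditioned least-squares problem for the
    # output layer: min_c ||Phi c - y||_2.
    c, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return W, bias, c
```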


In this work, we investigate numerical methods for solving the least-squares problems that arise in the SWIM algorithm, focusing on scalability and efficiency. We propose using LSRN, a recently developed iterative solver that relies on randomized preconditioning to significantly improve the convergence rate. In addition, we introduce an alternative approach that divides the problem into smaller subproblems, which are solved sequentially. A series of numerical experiments demonstrates the effectiveness of both methods, showing substantial improvements in training speed and scalability.
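For reference, below is a minimal sketch of LSRN-style randomized preconditioning for an overdetermined problem min ||Ax - b||, following Meng, Saunders, and Mahoney's LSRN: an SVD of a Gaussian sketch of A yields a preconditioner N = V Sigma^{-1} such that AN is well conditioned with high probability, and the preconditioned system is then solved with LSQR. The oversampling factor and tolerances are illustrative assumptions, not the settings used in the talk.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, lsqr

def lsrn_solve(A, b, oversampling=2.0, seed=0):
    """Sketch of LSRN-style preconditioning for min ||Ax - b||_2 with a
    tall, dense A (m >> n). Oversampling and tolerances are illustrative."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    s = int(np.ceil(oversampling * n))    # sketch size s = gamma * n

    # 1) Gaussian row sketch: the small s-by-n matrix G A captures the
    #    row space of A.
    A_sketch = rng.standard_normal((s, m)) @ A

    # 2) Economy SVD of the sketch; N = V * Sigma^{-1} preconditions A so
    #    that A N is well conditioned with high probability.
    _, sigma, Vt = np.linalg.svd(A_sketch, full_matrices=False)
    N = Vt.T / sigma                      # divide column i by sigma_i

    # 3) Solve min ||(A N) y - b|| with LSQR, accessing A only through
    #    matrix-vector products, then map back via x = N y.
    AN = LinearOperator((m, n),
                        matvec=lambda y: A @ (N @ y),
                        rmatvec=lambda r: N.T @ (A.T @ r))
    y = lsqr(AN, b, atol=1e-10, btol=1e-10)[0]
    return N @ y
```

Because LSQR touches A only through matrix-vector products, the iteration parallelizes naturally for large dense matrices, which is what makes this family of methods attractive at scale.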

Guided research presentation. Angelos is advised by Prof. Dr. Felix Dietrich.