IEOR Seminar Series: Jelena Diakonikolas, University of Wisconsin-Madison

Title: Cyclic Block Coordinate Methods on a Finer Scale: Tighter Bounds and New Methods

Abstract: Cyclic block coordinate methods represent one of the simplest and most intuitive approaches in continuous optimization. They operate by partitioning coordinates into subsets and cyclically updating these subsets. These methods have received significant attention in the literature, with the alternating minimization method from the 1970s being one of the earliest examples. An even earlier method of Kaczmarz from 1937 can arguably also be viewed as a cyclic coordinate method. Today, cyclic (block) coordinate methods are widely employed in practical applications and serve as the default choice in software packages designed for large-scale statistical learning, including SparseNet and GLMNet.
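To make the idea concrete, here is a minimal sketch of a cyclic (single-coordinate) method on a quadratic objective. The quadratic objective, function names, and exact per-coordinate minimization are illustrative assumptions, not details from the talk; with exact updates on a quadratic, the cycle reduces to the classical Gauss-Seidel sweep.

```python
import numpy as np

def cyclic_coordinate_descent(A, b, x0, n_epochs=100):
    """Minimize f(x) = 0.5 * x^T A x - b^T x by cycling over coordinates.

    Illustrative sketch (assumed setup, not from the talk): each pass
    updates coordinates 0, 1, ..., n-1 in a fixed cyclic order, exactly
    minimizing f along one coordinate at a time.
    """
    x = x0.astype(float).copy()
    n = len(x)
    for _ in range(n_epochs):
        for i in range(n):  # one cycle = one pass over all coordinates
            # grad_i f(x) = A[i] @ x - b[i]; setting it to zero gives
            # the exact minimizer along coordinate i (Gauss-Seidel update)
            x[i] = (b[i] - A[i] @ x + A[i, i] * x[i]) / A[i, i]
    return x
```

For a symmetric positive definite `A`, these sweeps converge to the solution of `A x = b`; replacing the exact per-coordinate minimization with a gradient step along each coordinate gives the cyclic coordinate gradient descent method discussed below.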

Despite their broad adoption in practice, the empirical success of cyclic methods is not matched by the existing theory. In particular, even though these methods are often preferred over their randomized or full-vector-update counterparts, their theoretical convergence bounds are usually substantially worse. For example, the theoretical convergence bound for cyclic coordinate gradient descent is worse than the bound of (full vector) gradient descent by a factor scaling linearly with the dimension, and this is tight in the worst case.

In this talk, I will present a novel perspective on cyclic methods that leads to a more fine-grained characterization of their non-asymptotic convergence. I will then present novel cyclic methods with provably better scaling as a function of the number of blocks (the dimension, in the case of single-coordinate blocks). These results break a nearly decade-old computational barrier for cyclic methods by a factor scaling with the square root of the dimension. I will further provide examples of problems and cyclic methods for which the convergence of cyclic methods is no worse than that of randomized or full-vector-update methods, even in the worst case.

Bio: Jelena Diakonikolas is an assistant professor in the Department of Computer Sciences at the University of Wisconsin-Madison. Her primary research interests are in algorithm design and complexity analysis for large-scale continuous optimization problems, with a particular focus on challenges arising in machine learning applications. Prior to joining UW-Madison, she held postdoctoral positions at UC Berkeley and Boston University. She received her PhD from Columbia University. Jelena is a recipient of a Simons-Berkeley Research Fellowship, the Morton B. Friedman Prize for Excellence at Columbia Engineering, and a Qualcomm Innovation Fellowship.