IEOR Seminar: Vasilis Charisopoulos, Cornell University
February 8 @ 3:00 pm – 4:30 pm
A superlinearly convergent first-order method for nonsmooth optimization
Nonsmooth optimization problems appear throughout machine learning and signal processing. However, standard first-order methods for nonsmooth optimization can be slow for “poorly conditioned” problems. In this talk, I will present a locally accelerated first-order method that is less sensitive to conditioning and achieves superlinear (i.e., double-exponential) convergence near solutions for a broad family of problems. The algorithm is inspired by Newton’s method for solving nonlinear equations.
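The talk's algorithm itself is not described here, but the classical Newton iteration it draws inspiration from can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the speaker's method): textbook Newton's method for a scalar nonlinear equation, which exhibits the kind of fast local convergence the abstract alludes to.

```python
# Textbook Newton's method for a scalar nonlinear equation f(x) = 0.
# This is a generic illustration, NOT the algorithm from the talk:
# near a solution, the iteration converges quadratically (the number
# of correct digits roughly doubles per step).

def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Iterate x_{k+1} = x_k - f(x_k) / f'(x_k) until |f(x_k)| <= tol."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) <= tol:
            break
        x = x - fx / df(x)
    return x

# Example: solve x^2 - 2 = 0; the positive root is sqrt(2).
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
```

The novelty discussed in the talk lies in achieving this kind of superlinear behavior with a first-order method, i.e., without the derivative (Hessian) information Newton's method normally requires.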
Vasilis is a final-year PhD candidate at Cornell ORIE, advised by Damek Davis. He works on optimization and numerical linear algebra methods for machine learning and scientific computing. In particular, he studies what drives the success of heuristically motivated methods, which often find optimal solutions of nonconvex optimization problems despite NP-hardness, as well as the tradeoffs between communication, statistical efficiency, and privacy that arise when these algorithms are deployed in massively distributed computing environments. In support of this research, Vasilis was awarded the 2020-2021 Andreas G. Leventis scholarship and was a Cornell University nominee for the 2021-2022 Google PhD Fellowship. Outside research, Vasilis is passionate about teaching and access to education. He was a recipient of the Cornelia Ye Outstanding Teaching Assistant Award in 2021 and served as an instructor for Cornell's Prison Education Program during Fall 2019.
1174 Etcheverry Hall or Zoom