Webinar Series: Mathematical Foundations of Data Science

June 9th, 3pm EDT

From Stochastic Frank-Wolfe to the Ellipsoid Method: Recent Progress on Practical Optimization in Data Science (the Frank-Wolfe Method) and Theoretical Optimization (the Ellipsoid Method)

Speaker:

Robert M. Freund, Sloan School of Management, MIT

Abstract:

This talk will report on two ongoing research projects: one on the stochastic Frank-Wolfe method, and the other on the ellipsoid algorithm.  The first part of the talk reports on recent progress on a stochastic version of the Frank-Wolfe method ("STOCHFW") for constrained finite-sum minimization problems, which arise in empirical loss minimization in particular and are ubiquitous in machine learning applications.  The Frank-Wolfe method is especially attractive in this setting because the algorithm itself encourages desirable solution structure (such as sparsity or low rank) without sacrificing computational guarantees.  We present STOCHFW, a novel stochastic version of the Frank-Wolfe method designed for the regime where the sample size n is huge and computing an exact gradient is too expensive.  STOCHFW matches or improves on the best computational guarantees for this class of algorithms, depending on the regime, and we present computational results that point to improved performance in practice as well.
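For readers unfamiliar with the method, a minimal sketch of a mini-batch Frank-Wolfe iteration for a finite-sum problem is given below.  This is a generic textbook variant (least-squares losses over an l1-ball, with the standard 2/(t+2) step size) meant only to illustrate the structure of the iteration and why the iterates stay sparse; it is not the STOCHFW method presented in the talk, and all function and parameter names are illustrative.

```python
import numpy as np

def lmo_l1_ball(grad, tau):
    """Linear minimization oracle over C = {x : ||x||_1 <= tau}:
    the minimizer of <grad, s> over C is a signed vertex of the ball."""
    s = np.zeros_like(grad)
    i = np.argmax(np.abs(grad))
    s[i] = -tau * np.sign(grad[i])
    return s

def minibatch_frank_wolfe(A, b, tau, n_iters=500, batch_size=32, seed=0):
    """Frank-Wolfe for min_{x in C} (1/n) sum_i 0.5*(a_i^T x - b_i)^2,
    using mini-batch gradient estimates instead of the exact gradient."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for t in range(n_iters):
        # Mini-batch estimate of the gradient of the average loss at x.
        idx = rng.choice(n, size=batch_size, replace=False)
        grad = A[idx].T @ (A[idx] @ x - b[idx]) / batch_size
        # Frank-Wolfe step: move toward the LMO vertex, keeping x a convex
        # combination of vertices (hence at most t+1 nonzero coordinates).
        s = lmo_l1_ball(grad, tau)
        gamma = 2.0 / (t + 2.0)
        x = (1 - gamma) * x + gamma * s
    return x

if __name__ == "__main__":
    # Small synthetic example: sparse ground truth recovered with few nonzeros.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((1000, 50))
    x_true = np.zeros(50); x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(1000)
    x_hat = minibatch_frank_wolfe(A, b, tau=5.0)
    print("nonzeros in solution:", np.count_nonzero(np.round(x_hat, 3)))
```

Because each iterate is a convex combination of at most t+1 vertices of the constraint set, sparsity (or low rank, for nuclear-norm balls) comes from the algorithm itself, which is the structural property the abstract refers to.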

The second part of the talk concerns the ellipsoid algorithm, a fundamental algorithm for computing a solution to a system of linear inequalities P: Ax <= u when the solution set S := { x : Ax <= u } has positive volume.  However, when P is infeasible the ellipsoid algorithm has no built-in mechanism for proving that P is infeasible.  (This is in contrast to the other two fundamental algorithms for tackling P, namely the simplex method and interior-point methods, each of which can easily be implemented so that it either produces a solution of P or proves that P is infeasible by producing a solution to the alternative system ALT: A^T w = 0, u^T w < 0, w >= 0.)  Motivated by this, we develop an Oblivious Ellipsoid Algorithm (OEA) that produces a solution of P when P is feasible and proves infeasibility otherwise, with improved complexity over the standard ellipsoid algorithm in certain regimes.
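For context, a minimal sketch of the classical central-cut ellipsoid method for P is shown below.  It illustrates the behavior described above: when P is feasible (and S has positive volume) the ellipsoid center eventually satisfies all constraints, but when P is infeasible the method simply exhausts its iteration budget without producing any certificate such as a solution of ALT.  This is the standard textbook algorithm, not the OEA from the talk; the function name, the radius bound R, and the iteration budget are illustrative assumptions.

```python
import numpy as np

def ellipsoid_feasibility(A, u, R=1e3, max_iters=10000):
    """Classical central-cut ellipsoid method for P: Ax <= u.
    Assumes dimension n >= 2 and that any solution lies in the ball of radius R.
    Returns a feasible point if one is found; returns None when the budget is
    exhausted, with no proof that P is infeasible."""
    m, n = A.shape
    c = np.zeros(n)            # ellipsoid center
    Q = (R ** 2) * np.eye(n)   # shape matrix: E = {x : (x-c)^T Q^{-1} (x-c) <= 1}
    for _ in range(max_iters):
        violated = np.where(A @ c > u)[0]
        if violated.size == 0:
            return c                       # c satisfies Ax <= u
        a = A[violated[0]]                 # a violated row: a^T c > u_i
        b = Q @ a / np.sqrt(a @ Q @ a)     # step direction in the metric of Q
        # Central-cut update: replace E by the smallest ellipsoid containing
        # E intersected with the half-space {x : a^T x <= a^T c}, which still
        # contains every solution of P.
        c = c - b / (n + 1)
        Q = (n ** 2 / (n ** 2 - 1.0)) * (Q - (2.0 / (n + 1)) * np.outer(b, b))
    return None  # budget exhausted: no solution found and no infeasibility certificate
```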

Joint work with Geoffrey Négiar, Gideon Dresdner, Alicia Yi-Ting Tsai, Laurent El Ghaoui, Francesco Locatello, Fabian Pedregosa, Jourdain Lamperski, and Michael Todd.

Bio:

Robert Freund is the Theresa Seley Professor in Management Science at the MIT Sloan School of Management.  His research interests are in convex optimization, computational complexity and related computational science, convex geometry, large-scale nonlinear optimization in business analytics and machine learning, and related mathematical systems.  He received his B.A. in Mathematics from Princeton University and his M.S. and Ph.D. degrees in Operations Research from Stanford University.  He is a former co-director of the MIT Operations Research Center and of the MIT Program in Computation for Design and Optimization, and a former Chair of the INFORMS Optimization Section.  He also served a term as Deputy Dean of the MIT Sloan School (2008-11).  He is an INFORMS Fellow and a recipient of the MIT Samuel M. Seegal Prize (2020).  He received the Longuet-Higgins Prize in computer vision (2007) as well as numerous teaching and education awards at MIT in conjunction with the course and textbook (co-authored with Dimitris Bertsimas) Data, Models, and Decisions: The Fundamentals of Management Science.

Sponsor

Georgia Institute of Technology
Northwestern University
Pennsylvania State University
Princeton University
University of Illinois at Urbana-Champaign
National Institute of Statistical Sciences
Harvard University

Location

Online Webinar