
Branch-and-Bound Performance Estimation Programming: A Unified Methodology for Constructing Optimal Optimization Methods


Abstract: First-order methods (FOMs) are optimization algorithms that can be described and analyzed using only the values and (sub)gradients of the functions to be minimized. FOMs are the main workhorses of modern large-scale optimization and machine learning due to their low per-iteration costs, minimal memory requirements, and dimension-independent convergence guarantees. As datasets continue to grow, the demand for faster FOMs has become a central challenge of today's big-data era. To that end, we present Branch-and-Bound Performance Estimation Programming (BnB-PEP), the first unified methodology for constructing optimal (provably fastest) FOMs for convex and nonconvex optimization. BnB-PEP poses the problem of finding the optimal FOM as a nonconvex but practically tractable quadratically constrained quadratic optimization problem and solves it to certifiable global optimality with a custom branch-and-bound algorithm. By exploiting specific problem structures, our open-source branch-and-bound algorithm outperforms the latest off-the-shelf software by orders of magnitude, reducing the time to discover the optimal FOMs from hours to seconds and from weeks to minutes. We apply BnB-PEP to several practically relevant convex and nonconvex setups and obtain FOMs with bounds that improve upon prior state-of-the-art results. Furthermore, we use the BnB-PEP methodology to find proofs with potential-function structures, thereby systematically generating analytical convergence proofs. Recently, BnB-PEP has helped pave the way for several fundamental results in optimization: (i) novel momentum-free accelerated gradient descent methods that broke with decades of conventional wisdom, (ii) the optimal FOM for constrained convex optimization, which outperforms the celebrated FISTA algorithm of Beck and Teboulle, and (iii) the first non-asymptotic convergence theory for the widely used nonlinear conjugate gradient methods.
(joint work with Bart Van Parys and Ernest K. Ryu)
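The abstract's key computational idea, branch-and-bound with certified global optimality, can be illustrated with a toy sketch. This is not the BnB-PEP solver itself (which solves QCQPs arising from performance estimation); it is a minimal spatial branch-and-bound that globally minimizes a one-dimensional nonconvex function, assuming a known Lipschitz constant `L` supplied by the caller:

```python
import heapq

def branch_and_bound(f, a, b, L, tol=1e-4, max_iter=100_000):
    """Toy spatial branch-and-bound: global minimum of an L-Lipschitz f on [a, b].

    Lower bound on a box [lo, hi]: min endpoint value minus L*(hi-lo)/2.
    This is valid because every x in the box lies within (hi-lo)/2 of the
    nearer endpoint, and f can drop at most L per unit of distance.
    """
    best_x, best_val = a, f(a)          # incumbent (global upper bound)
    if f(b) < best_val:
        best_x, best_val = b, f(b)
    # Min-heap of (lower_bound, lo, hi): always explore the most promising box.
    heap = [(min(f(a), f(b)) - L * (b - a) / 2, a, b)]
    for _ in range(max_iter):
        if not heap:
            break                        # every box was pruned: optimality certified
        lb, lo, hi = heapq.heappop(heap)
        if lb > best_val - tol:          # certificate: no remaining box can beat incumbent
            break
        mid = (lo + hi) / 2              # branch: bisect the box
        if f(mid) < best_val:
            best_x, best_val = mid, f(mid)
        for l, h in ((lo, mid), (mid, hi)):
            child_lb = min(f(l), f(h)) - L * (h - l) / 2
            if child_lb < best_val - tol:    # bound: keep only boxes that might win
                heapq.heappush(heap, (child_lb, l, h))
    return best_x, best_val
```

For example, minimizing the nonconvex function f(x) = (x² - 1)² over [-2, 2] (Lipschitz constant 24 on that interval) certifies the global minimum value 0, attained at x = ±1. The essential point, which carries over to BnB-PEP's much harder QCQPs, is that the algorithm terminates with a proof of global optimality (the gap between the incumbent and the best remaining lower bound is below tolerance), not merely a stationary point.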
 

Date and Time

  • Date: 05 Mar 2025
  • Time: 04:00 PM to 05:00 PM
  • All times are (UTC-05:00) Eastern Time (US & Canada)

Location

  • 141 Warren St
  • New Jersey Institute of Technology
  • Newark, New Jersey
  • United States 07103
  • Building: ECE
  • Room Number: 202

Registration

  • Starts 26 February 2025 09:00 AM
  • Ends 05 March 2025 03:00 PM
  • All times are (UTC-05:00) Eastern Time (US & Canada)
  • No Admission Charge

Speakers

Shuvomoy Das Gupta of Columbia University

 




Agenda

- Talk by Shuvomoy Das Gupta at 4:00 PM
- Boxed dinner after the talk at 5:00 PM
- You do not need to be an IEEE member to attend this meeting.



