CSDE News & Events

On Gradient-Based Optimization: Accelerated, Stochastic and Nonconvex (Urban@UW Taskar Memorial Lecture, 3/1/18)

Posted: 2/26/2018 (Local Events)

Abstract
Many new theoretical challenges have arisen in the area of gradient-based optimization for large-scale statistical data analysis, driven by the needs of applications and the opportunities provided by new hardware and software platforms. I discuss several recent, related results in this area: (1) a new framework for understanding Nesterov acceleration, obtained by taking a continuous-time Lagrangian/Hamiltonian/symplectic perspective; (2) how to escape saddle points efficiently in nonconvex optimization; and (3) the acceleration of Langevin diffusion.
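For orientation, the short Python sketch below illustrates toy versions of the three update rules named in the abstract: a Nesterov-style accelerated gradient step (whose continuous-time limit motivates the Lagrangian/Hamiltonian viewpoint), a gradient step with occasional random perturbation (the basic idea behind escaping saddle points), and an unadjusted Langevin step. The test function, step sizes, and perturbation constants are arbitrary illustrative choices and are not taken from the talk.

# Minimal sketch of the three update rules; illustrative only, not code from the talk.
import numpy as np

def grad(x):
    # Gradient of a simple quadratic test function f(x) = 0.5 * ||x||^2.
    return x

def nesterov_step(x, x_prev, k, step=0.1):
    # (1) Accelerated gradient update with the (k-1)/(k+2) momentum schedule;
    # in continuous time this recursion behaves like a second-order ODE.
    y = x + (k - 1) / (k + 2) * (x - x_prev)   # extrapolation (momentum) point
    return y - step * grad(y), x

def perturbed_gd_step(x, rng, step=0.1, radius=0.01, tol=1e-3):
    # (2) Gradient step plus a small random perturbation near stationary points,
    # the mechanism used to escape saddle points in nonconvex optimization.
    if np.linalg.norm(grad(x)) < tol:
        x = x + rng.uniform(-radius, radius, size=x.shape)
    return x - step * grad(x)

def langevin_step(x, rng, step=0.01):
    # (3) Unadjusted Langevin update: gradient step plus Gaussian noise.
    return x - step * grad(x) + np.sqrt(2 * step) * rng.standard_normal(x.shape)

rng = np.random.default_rng(0)
x = x_prev = np.array([1.0, -2.0])
for k in range(1, 51):
    x, x_prev = nesterov_step(x, x_prev, k)
print("Nesterov iterate after 50 steps:", x)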


Date: 03/01/2018

Time: 3:30-4:30 PM

Location: Electrical Engineering Building, Room 105