Stochastic Gradient Descent: Strong convergence guarantees -- without parameter tuning

Neyman Seminar
Aug 30, 2018, 04:00 PM - 05:00 PM | 60 Evans Hall
Rachel Ward, UT Austin
Stochastic Gradient Descent is the basic optimization algorithm behind the powerful deep learning architectures that are becoming increasingly omnipresent in society. However, existing theoretical guarantees of convergence rely on knowing certain properties of the optimization problem, such as the maximal curvature and the noise level, which are not known a priori in practice. Thus, in practice, hyper...
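
For reference, below is a minimal sketch of plain SGD on a synthetic least-squares problem, illustrating the hand-chosen step size the abstract alludes to; the problem setup, variable names, and the step-size value are illustrative assumptions, not material from the talk.

    import numpy as np

    # Minimal SGD sketch on a synthetic least-squares problem.
    # The step size `lr` is the hyperparameter in question: classical
    # guarantees tie its safe range to the (unknown) curvature and
    # gradient-noise level, so in practice it is tuned by trial and error.

    rng = np.random.default_rng(0)
    n, d = 1000, 10
    A = rng.normal(size=(n, d))
    x_true = rng.normal(size=d)
    b = A @ x_true + 0.1 * rng.normal(size=n)

    x = np.zeros(d)
    lr = 0.01                                    # hand-tuned step size (assumed value)
    for step in range(5000):
        i = rng.integers(n)                      # sample one data point
        grad = (A[i] @ x - b[i]) * A[i]          # stochastic gradient of 0.5*(a_i @ x - b_i)**2
        x -= lr * grad                           # SGD update

    print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))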