Machine Learning and Data Science PhD Student Seminar Series (Session 74): Non-Asymptotic Convergence Analysis for Nonsmooth Nonconvex Optimization
Speaker: Kun Chen (陈坤)
Time: 2024-06-20, 16:00-17:00
Venue: Tencent Meeting 627-5441-1672
Abstract:
The theoretical analysis of nonsmooth nonconvex optimization has garnered tremendous research attention over the years, and the asymptotic results are well established. In recent years, the field has attracted renewed interest owing to the prevalence of neural networks with nonsmooth activations such as ReLU. While gradient-based non-asymptotic analysis is well understood for convex or smooth optimization problems, many questions remain open in the nonsmooth nonconvex case. The breakthrough result of Zhang et al. [2020] proposed a randomized gradient-based algorithm with finite-time complexity guarantees that finds Goldstein stationary points of nonsmooth nonconvex functions. Since then, several follow-up works have proposed variants of the algorithm in first-order and zeroth-order settings, along with lower bounds on the complexity.
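As background (a standard definition, added here rather than taken from the original abstract): for a Lipschitz function f, the Goldstein subdifferential of f at x with radius \(\delta > 0\) is the convex hull of Clarke subdifferentials over the \(\delta\)-ball,
\[
\partial_\delta f(x) \;=\; \operatorname{conv}\Bigl(\textstyle\bigcup_{y \in \mathbb{B}_\delta(x)} \partial f(y)\Bigr),
\]
and x is a \((\delta, \epsilon)\)-Goldstein stationary point if \(\min\{\|g\| : g \in \partial_\delta f(x)\} \le \epsilon\). Unlike the exact Clarke criterion \(0 \in \partial f(x)\), this relaxed notion admits finite-time algorithmic guarantees for Lipschitz functions.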
In this talk, we will first introduce the different notions of stationarity used in non-asymptotic convergence analysis and explain why the Goldstein stationary point is the widely accepted choice. We will then present the corresponding gradient-based and zeroth-order methods, along with their variants and applications. Finally, we will discuss the limitations of this line of work and some lower bounds for the problem.
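To make the flavor of these methods concrete, below is a minimal Python sketch of the gradient-sampling idea underlying this line of work: averaging Clarke subgradients drawn from a delta-ball yields an element of the Goldstein subdifferential, which drives a normalized descent step. This is an illustration under simplifying assumptions, not the exact algorithm of Zhang et al. [2020] or any method from the talk; the test function f and all parameter choices here are hypothetical.

import numpy as np

# Hypothetical nonsmooth test function: f(x) = |x_0| + 0.5*|x_1|.
def f(x):
    return abs(x[0]) + 0.5 * abs(x[1])

def subgrad(x):
    # One valid Clarke subgradient of f at x (sign(0) = 0 is an arbitrary selection).
    return np.array([np.sign(x[0]), 0.5 * np.sign(x[1])])

def goldstein_direction(x, delta, m, rng):
    # Average subgradients sampled uniformly from the delta-ball around x.
    # The average is a convex combination, hence lies in the Goldstein
    # subdifferential conv( union over y in B_delta(x) of subgrad f(y) ).
    grads = []
    for _ in range(m):
        u = rng.normal(size=x.shape)
        u *= delta * rng.uniform() ** (1.0 / x.size) / np.linalg.norm(u)
        grads.append(subgrad(x + u))
    return np.mean(grads, axis=0)

def sampled_descent(x0, delta=0.1, eps=1e-2, m=64, max_iter=500, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = goldstein_direction(x, delta, m, rng)
        if np.linalg.norm(g) <= eps:
            # The sampled estimate is small, so x is plausibly an approximate
            # (delta, eps)-Goldstein stationary point (a Monte Carlo check,
            # not a certificate).
            return x
        x = x - delta * g / np.linalg.norm(g)  # normalized step of length delta
    return x

print(sampled_descent([1.3, -0.7]))

In this sketch the normalized step of length delta mirrors the key design choice of this algorithm family: moving by exactly delta keeps the next iterate inside the ball over which the descent direction was certified.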
About the forum: This online forum is organized by Professor Zhihua Zhang's (张志华) machine learning lab and is held biweekly (except during public holidays). Each session invites a PhD student to give a systematic and in-depth introduction to a frontier topic, including but not limited to machine learning, high-dimensional statistics, operations research and optimization, and theoretical computer science.