Young Researchers' Seminar in Applied Mathematics (Lunch Session) — Convergence Analysis of Forward-Backward Accelerated Algorithms
Speaker: Li Bowen (Academy of Mathematics and Systems Science, Chinese Academy of Sciences)
Time: 2024-06-12, 11:45–13:30
Venue: Siyuan Hall, Zhihua Building (智华楼四元厅)
Abstract:
A significant milestone in modern gradient-based optimization is Nesterov’s accelerated gradient descent (NAG) method. This forward-backward technique has been further enhanced by its proximal generalization, the fast iterative shrinkage-thresholding algorithm (FISTA), which finds extensive applications in image science and engineering. In this talk, I will present a tighter inequality for the proximal gradient step at the iterates. By incorporating this tighter inequality into a well-constructed Lyapunov function, we achieve proximal subgradient norm minimization for convex objective functions using the phase-space representation. This approach provides a unified framework for proving the convergence of forward-backward algorithms. A key open question in the literature is whether NAG and FISTA exhibit linear convergence for strongly convex functions without prior knowledge of the modulus of strong convexity. We address this question using the high-resolution ordinary differential equation (ODE) framework. Our analysis introduces a new Lyapunov function whose kinetic-energy coefficient adapts dynamically over the iterations. This advancement offers deeper insight into the convergence behavior of these algorithms.
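To make the forward-backward structure discussed above concrete, the following is a minimal sketch of FISTA applied to the LASSO problem min_x ½‖Ax − b‖² + λ‖x‖₁, where the forward step is a gradient step on the smooth part and the backward step is the proximal (soft-thresholding) step. This is a standard textbook instance for illustration only; the step size 1/L and the problem data are assumptions, not part of the talk.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista(A, b, lam, iters=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 with FISTA.

    Illustrative sketch: fixed step size 1/L, where L = ||A||_2^2 is the
    Lipschitz constant of the gradient of the smooth part.
    """
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(iters):
        grad = A.T @ (A @ y - b)                        # forward (gradient) step input
        x_new = soft_threshold(y - grad / L, lam / L)   # backward (proximal) step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # Nesterov momentum extrapolation
        x, t = x_new, t_new
    return x

# Small synthetic sparse-recovery example (hypothetical data).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = np.zeros(10)
x_true[[1, 4]] = [2.0, -1.5]
b = A @ x_true
x_hat = fista(A, b, lam=0.1)
```

Dropping the momentum step (setting y = x_new) recovers plain ISTA, whose O(1/k) rate the acceleration improves to O(1/k²).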
Speaker bio:
Li Bowen is a Ph.D. student at the Academy of Mathematics and Systems Science, Chinese Academy of Sciences, working on optimization theory and algorithms. His current work studies first-order optimization algorithms through the framework of differential equations. He has proposed a unified analysis framework for Nesterov's accelerated gradient method and the fast iterative shrinkage-thresholding algorithm (FISTA), and obtained new linear convergence results within this framework. He has also derived high-resolution differential equations for the alternating direction method of multipliers (ADMM) and the primal-dual hybrid gradient method (PDHG), together with the corresponding Lyapunov functions and convergence results.
Registration:
Lunch will be provided from 11:45 based on questionnaire responses (the talk runs 12:30–13:30). Teachers and students who would like to reserve lunch, please fill out the registration form by 6 pm Tuesday: https://www.wjx.cn/vm/ecBCo5Y.aspx#