Title: On the global convergence of coordinate gradient descent for non-convex optimization
Speaker: Prof. 李颖洲 (Fudan University)
Time: August 24, 2022, 10:00-11:00
Tencent Meeting ID: 378-354-102
Abstract: Coordinate descent methods are considered for eigenvalue problems based on a reformulation of the leading eigenvalue problem as a nonconvex optimization problem. The convergence of several deterministic coordinate methods is analyzed and compared. We also analyze the global convergence property of coordinate gradient descent with random choice of coordinates and stepsizes. Under generic assumptions, we prove that the algorithm iterates will almost surely escape strict saddle points of the objective function. As a result, the algorithm is guaranteed to converge to local minima if all saddle points are strict. Numerical examples of applications to quantum many-body problems demonstrate the efficiency of the proposed coordinate descent methods and provide benchmarks.
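
To make the setting concrete, the following is a minimal Python sketch of coordinate gradient descent with a random choice of coordinates, applied to one common nonconvex reformulation of the leading eigenvalue problem, f(x) = ||x x^T - A||_F^2 / 4 (for symmetric A with positive leading eigenvalue, the global minimizers are x = ±sqrt(lambda_max) v_max). The test matrix A, the fixed stepsize, and the iteration count are illustrative assumptions, not the speaker's exact algorithm or objective.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative symmetric matrix; the shift makes the leading eigenvalue
# positive and dominant so the reformulation above has the usual minimizers.
n = 50
B = rng.standard_normal((n, n))
A = (B + B.T) / 2 + n * np.eye(n)

def partial_grad(x, i):
    # For f(x) = ||x x^T - A||_F^2 / 4 the gradient is (||x||^2) x - A x,
    # so the i-th partial derivative needs only one row of A.
    return (x @ x) * x[i] - A[i, :] @ x

x = rng.standard_normal(n)                  # generic random initialization
step = 1.0 / (4.0 * np.linalg.norm(A, 2))   # conservative fixed stepsize (an assumption)

for _ in range(200_000):
    i = rng.integers(n)                     # coordinate chosen uniformly at random
    x[i] -= step * partial_grad(x, i)       # single-coordinate gradient update

# The Rayleigh quotient of the iterate should approach the leading eigenvalue.
rayleigh = x @ (A @ x) / (x @ x)
print("Rayleigh quotient:", rayleigh)
print("Leading eigenvalue:", np.linalg.eigvalsh(A)[-1])

With a generic random initialization and random coordinate choices, the iterates avoid the strict saddle points (x = 0 and the non-leading eigenvector directions), which is the behavior the talk's convergence analysis addresses.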
Host: 张雷洪