||Zirui Zhou, Hong Kong Baptist University
||10:00-11:00 Dec. 19, 2018
||Zibin S301, Fudan University
||The cubic regularization (CR) method has attracted much attention lately in the optimization community. While it is well known that the CR method is globally convergent and enjoys a superior global iteration complexity, existing results on its local quadratic convergence require a stringent non-degeneracy condition. In this talk, we show that under a local error bound (EB) condition, which is a much weaker requirement than the existing non-degeneracy condition, the sequence of iterates generated by the CR method converges at least Q-quadratically to a second-order critical point. This indicates that adding cubic regularization not only equips Newton’s method with remarkable global convergence properties but also enables it to converge quadratically even in the presence of degenerate solutions. As a byproduct, we show that without assuming convexity, the proposed EB condition is equivalent to a quadratic growth condition, which could be of independent interest. To demonstrate the usefulness and relevance of our convergence analysis, we focus on two concrete nonconvex optimization problems that arise in phase retrieval and low-rank matrix recovery and show that, with overwhelming probability, the sequence of iterates generated by the CR method for solving these two problems converges at least Q-quadratically to a global minimizer. We also present numerical results of the CR method when applied to these two problems to support and complement our theoretical development.
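||For readers unfamiliar with the method, the following is a minimal, illustrative sketch (not the speaker's implementation) of one standard form of the CR iteration: at each step, the cubic-regularized Newton subproblem min_s g^T s + (1/2) s^T H s + (M/6)||s||^3 is solved via its first-order condition (H + λI)s = −g with λ = (M/2)||s||, using bisection on λ. The function names and the choice of regularization parameter `M` are assumptions for illustration only.

```python
import numpy as np

def cr_step(g, H, M):
    # Solve the CR subproblem  min_s g^T s + 0.5 s^T H s + (M/6)||s||^3
    # via its optimality condition (H + lam*I) s = -g with lam = (M/2)||s||.
    # ||s(lam)|| decreases in lam, so phi(lam) below is increasing: bisect.
    lam_min = float(np.linalg.eigvalsh(H)[0])
    lo, hi = max(0.0, -lam_min) + 1e-12, 1e8

    def phi(lam):
        s = np.linalg.solve(H + lam * np.eye(len(g)), -g)
        return lam - (M / 2.0) * np.linalg.norm(s)

    for _ in range(200):          # bisection on the multiplier lam
        mid = 0.5 * (lo + hi)
        if phi(mid) < 0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return np.linalg.solve(H + lam * np.eye(len(g)), -g)

def cubic_regularized_newton(grad, hess, x0, M=1.0, iters=30):
    # Plain CR iteration with a fixed regularization parameter M
    # (practical variants adapt M; this sketch keeps it constant).
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x + cr_step(grad(x), hess(x), M)
    return x
```

On a simple strongly convex quadratic, e.g. f(x) = (1/2) x^T A x with A positive definite, the iterates contract rapidly to the minimizer x* = 0, consistent with the quadratic local rate the talk establishes under the much weaker EB condition.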
||Dr. Zirui Zhou is currently an assistant professor in the Department of Mathematics at Hong Kong Baptist University. Prior to that, he was an Alan Mekler Postdoctoral Fellow in the Department of Mathematics at Simon Fraser University, and he received his PhD from the Department of Systems Engineering and Engineering Management at the Chinese University of Hong Kong. Dr. Zhou’s research lies mainly in continuous optimization and its applications to machine learning, signal processing, and data analysis. His work has been published in Mathematical Programming, Optimization Methods & Software, ICML, and NIPS.