PROJECT TITLE:
Global Optimality in Low-Rank Matrix Optimization - 2018
This project considers the minimization of a general objective function f(X) over the set of rectangular n × m matrices of rank at most r. To reduce the computational burden, we factorize the variable X into a product of two smaller matrices and optimize over these two factors rather than over X. Despite the resulting nonconvexity, recent studies in matrix completion and matrix sensing have shown that the factored problem has no spurious local minima and obeys the so-called strict saddle property (the function has a direction of negative curvature at every critical point that is not a local minimum). We analyze the global geometry for a general yet well-conditioned objective function f(X) whose restricted strong convexity and restricted strong smoothness constants are comparable. In particular, we show that the reformulated objective function has no spurious local minima and obeys the strict saddle property. These geometric properties imply that a range of iterative optimization algorithms (such as gradient descent) can provably solve the factored problem with global convergence.
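To make the factored approach concrete, the short Python sketch below runs gradient descent on the two factors U and V for the simplest well-conditioned instance, f(X) = 0.5 * ||X - M||_F^2, whose restricted strong convexity and smoothness constants are both equal to 1. This is an illustration only, not the project's code: the dimensions, step size, iteration count, and initialization scale are all assumed values chosen for the demo.

import numpy as np

# Minimal sketch: factored gradient descent on the well-conditioned objective
# f(X) = 0.5 * ||X - M||_F^2 (restricted strong convexity and smoothness
# constants both equal 1). All dimensions and hyperparameters below are
# illustrative assumptions, not taken from the paper.
rng = np.random.default_rng(0)
n, m, r = 30, 20, 3

# Rank-r ground truth M, normalized so the problem is well scaled.
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))
M /= np.linalg.norm(M)

def grad_f(X):
    """Gradient of f(X) = 0.5 * ||X - M||_F^2 at X."""
    return X - M

# Optimize g(U, V) = f(U V^T) over the two factors instead of over X itself.
U = 0.1 * rng.standard_normal((n, r))
V = 0.1 * rng.standard_normal((m, r))
step = 0.2  # illustrative step size
for _ in range(2000):
    G = grad_f(U @ V.T)  # chain rule: dg/dU = G V, dg/dV = G^T U
    U, V = U - step * G @ V, V - step * G.T @ U

print('relative error:',
      np.linalg.norm(U @ V.T - M) / np.linalg.norm(M))

With a small random initialization, the iterates converge to the rank-r ground truth, consistent with the geometric result that the factored problem has no spurious local minima and every non-minimizing critical point is a strict saddle.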