PROJECT TITLE :
Parallel Selective Algorithms for Nonconvex Big Data Optimization
We propose a decomposition framework for the parallel optimization of the sum of a differentiable (possibly nonconvex) function and a (block) separable nonsmooth, convex one. The latter term is usually employed to enforce structure in the solution, typically sparsity. Our framework is very flexible and includes both fully parallel Jacobi schemes and Gauss–Seidel (i.e., sequential) ones, as well as virtually all possibilities “in between” with only a subset of variables updated at each iteration. Our theoretical convergence results improve on existing ones, and numerical results on LASSO, logistic regression, and some nonconvex quadratic problems show that the new method consistently outperforms existing algorithms.
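To make the idea concrete, below is a minimal sketch (not the paper's actual algorithm) of a parallel selective update applied to LASSO, min_x 0.5*||Ax - b||^2 + lam*||x||_1. All names, the greedy "largest move" selection rule, and the parameter choices are illustrative assumptions: every coordinate computes its candidate proximal step in parallel, but only a subset of coordinates is updated per iteration, covering the range between fully parallel (frac = 1) and nearly sequential (frac ≈ 1/n) schemes mentioned in the abstract.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * |.| (elementwise shrinkage toward zero)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def parallel_selective_lasso(A, b, lam, frac=0.5, iters=300):
    """Illustrative parallel selective proximal step for LASSO:
        min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
    Each iteration computes all coordinate-wise proximal candidates in
    parallel, then updates only the fraction `frac` of coordinates with
    the largest proposed moves (a hypothetical greedy selection rule).
    Using the global Lipschitz constant of the smooth part keeps the
    objective monotonically non-increasing."""
    m, n = A.shape
    x = np.zeros(n)
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of A^T(Ax - b)
    k = max(1, int(frac * n))            # coordinates updated per iteration
    for _ in range(iters):
        grad = A.T @ (A @ x - b)         # gradient of the smooth part
        # Candidate proximal step for every coordinate, computed at once.
        x_hat = soft_threshold(x - grad / L, lam / L)
        d = x_hat - x
        S = np.argsort(-np.abs(d))[:k]   # greedily select the largest moves
        x[S] += d[S]                     # update only the selected block
    return x
```

Setting frac = 1 recovers a fully parallel (Jacobi-style) proximal-gradient sweep, while small values of frac update only a few variables per iteration, mirroring the "in between" schemes described above.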