PROJECT TITLE : PPD: A Scalable and Efficient Parallel Primal-Dual Coordinate Descent Algorithm

ABSTRACT: Dual Coordinate Descent (DCD) is one of the most widely used optimization methods. Because DCD is inherently sequential, it is difficult to parallelize: naively running multiple DCD threads that each update a batch of coordinates simultaneously produces inaccurate updates and slow convergence. Other parallelization methods rely on approximate functions whose approximation quality depends on the degree of parallelism; this dependency leads to poor scalability and, again, slow convergence. In this paper, we propose PPD, a new parallel primal-dual DCD algorithm designed to address these challenges. PPD exploits a block data distribution to derive a new approximate function that is independent of the degree of parallelism. In addition, PPD employs a novel primal-dual acceleration scheme that moves it toward the optimal solution more quickly. Through a series of experiments, we demonstrate the scalability and efficiency advantages of PPD.
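To make the sequential nature of DCD concrete, the sketch below implements the classical (serial) dual coordinate descent baseline for an L2-regularized L1-loss linear SVM. This is an illustrative baseline only, not the paper's PPD algorithm: each coordinate update reads and writes the shared primal vector `w`, which is exactly the dependency that makes naive multi-threaded batching inaccurate. The function name and hyperparameters are illustrative assumptions.

```python
import numpy as np

def dcd_linear_svm(X, y, C=1.0, epochs=50, seed=0):
    """Sequential dual coordinate descent for an L2-regularized
    L1-loss linear SVM (the classical serial baseline; PPD's
    parallel scheme is NOT reproduced here)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)                 # dual variables, one per example
    w = np.zeros(d)                     # primal vector: w = sum_i alpha_i*y_i*x_i
    Q = np.einsum("ij,ij->i", X, X)     # diagonal of the Gram matrix
    for _ in range(epochs):
        for i in rng.permutation(n):    # one coordinate at a time -- sequential
            if Q[i] == 0:
                continue
            g = y[i] * (w @ X[i]) - 1   # gradient of the dual objective in alpha_i
            a_new = min(max(alpha[i] - g / Q[i], 0.0), C)  # box constraint [0, C]
            w += (a_new - alpha[i]) * y[i] * X[i]  # shared-state update
            alpha[i] = a_new
    return w

# Tiny linearly separable example
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w = dcd_linear_svm(X, y)
pred = np.sign(X @ w)
```

The inner loop's dependence on the freshly updated `w` is why two threads updating different coordinates concurrently would each work from a stale primal vector, which is the inaccuracy the abstract describes.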