PROJECT TITLE:
For high-dimensional data, it is often desirable to group similar features together during the training process. This can reduce the estimation variance and improve the stability of feature selection, leading to better generalization. Moreover, it also helps in understanding and interpreting the data. The Octagonal Shrinkage and Clustering Algorithm for Regression (OSCAR) is a recent sparse-modeling approach that uses an l1-regularizer and a pairwise l∞-regularizer on the feature coefficients to encourage such feature grouping. Computationally, however, its optimization procedure is very expensive. In this paper, we propose an efficient solver based on the accelerated gradient method. We show that its key proximal step can be solved by a highly efficient, simple iterative group-merging algorithm. Given d input features, this reduces the empirical time complexity from O(d^2)–O(d^5) for the existing solvers to only O(d). Experimental results on a range of toy and real-world datasets demonstrate that OSCAR is a competitive sparse-modeling approach, with the added ability of automatic feature grouping.
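As a concrete illustration, the OSCAR regularizer described above, lam1 * Σ|w_i| + lam2 * Σ_{i<j} max(|w_i|, |w_j|), can be evaluated directly. The sketch below (a hypothetical helper, not the paper's solver code) exploits the fact that after sorting the absolute coefficients in descending order, the k-th largest value is the maximum in exactly (d - k) of the pairs, which collapses the O(d^2) pairwise sum into a single weighted sum:

```python
def oscar_penalty(w, lam1, lam2):
    """Evaluate the OSCAR regularizer for coefficient vector w.

    Computes lam1 * sum_i |w_i| + lam2 * sum_{i<j} max(|w_i|, |w_j|).
    Illustrative sketch only; the paper's contribution is an efficient
    solver for the associated proximal step, not this evaluation.
    """
    a = sorted((abs(x) for x in w), reverse=True)  # |w| in descending order
    d = len(a)
    l1_term = sum(a)
    # a[k] (0-indexed) is the larger element in (d - 1 - k) pairs,
    # so the pairwise l-infinity sum is a weighted sum over sorted values.
    pairwise_term = sum((d - 1 - k) * a[k] for k in range(d))
    return lam1 * l1_term + lam2 * pairwise_term
```

For example, with w = [3, -1, 2] the pairwise term is max(3,1) + max(3,2) + max(1,2) = 8, matching the weighted sum 2*3 + 1*2 + 0*1.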