
Sparse Generalized Eigenvalue Problem Via Smooth Optimization





In this paper, we consider an $\ell_0$-norm penalized formulation of the generalized eigenvalue problem (GEP), aimed at extracting the leading sparse generalized eigenvector of a matrix pair. The formulation involves maximization of a discontinuous nonconcave objective function over a nonconvex constraint set, and is therefore computationally intractable. To tackle the problem, we first approximate the $\ell_0$-norm by a continuous surrogate function. Then an algorithm is developed via iteratively majorizing the surrogate function by a quadratic separable function, which at each iteration reduces to a regular generalized eigenvalue problem. A preconditioned steepest ascent algorithm for finding the leading generalized eigenvector is provided. A systematic way based on smoothing is proposed to deal with the "singularity issue" that arises when a quadratic function is used to majorize the nondifferentiable surrogate function. For sparse GEPs with special structure, algorithms that admit a closed-form solution at every iteration are derived. Numerical experiments show that the proposed algorithms match or outperform existing algorithms in terms of computational complexity and support recovery.
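The majorization scheme described above can be sketched in a few lines. The sketch below is an illustration, not the paper's exact algorithm: it assumes the concave surrogate $g(t) = \log(1 + t/p)$ for the $\ell_0$-norm (one common choice), majorizes the penalty at each iterate by a separable quadratic so that the subproblem becomes a regular GEP, and smooths the weight with a small `eps` to avoid the singularity at zero-valued entries. The names `sparse_gev`, `rho`, `p`, and `eps` are illustrative choices, and the dense GEP solver `scipy.linalg.eigh` stands in for the paper's preconditioned steepest ascent method.

```python
import numpy as np
from scipy.linalg import eigh


def sparse_gev(A, B, rho=0.1, p=0.01, eps=1e-8, iters=50):
    """Illustrative MM sketch: maximize x'Ax - rho * sum_i g(|x_i|)
    subject to x'Bx = 1, with g(t) = log(1 + t/p) surrogating the
    l0-norm.  At each iterate the concave penalty is majorized by a
    separable quadratic w_i * x_i^2 + const, so the subproblem is a
    regular GEP: maximize x'(A - diag(w))x subject to x'Bx = 1."""
    # initialize with the dense leading generalized eigenvector
    _, V = eigh(A, B)
    x = V[:, -1]
    for _ in range(iters):
        # majorize g(|x_i|) at the current iterate:
        #   |x_i| <= (x_i^2 / |x_i^k| + |x_i^k|) / 2,  g'(t) = 1/(p + t),
        # giving weights w_i = rho / (2 |x_i^k| (p + |x_i^k|));
        # eps smooths the division for entries that have shrunk to zero
        w = rho / (2.0 * (np.abs(x) + eps) * (p + np.abs(x)))
        # subproblem is a regular GEP; take the leading eigenvector
        _, V = eigh(A - np.diag(w), B)
        x = V[:, -1]
    return x
```

Because `g` is concave in $|x_i|$, each quadratic upper bound touches the surrogate at the current iterate, so the objective is monotonically improved across iterations, which is the standard convergence argument for majorization schemes of this kind.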
