PROJECT TITLE:
Rectified Gaussian Scale Mixtures and the Sparse Non-Negative Least Squares Problem - 2018
In this project, we develop a Bayesian evidence maximization framework to solve the sparse non-negative least squares (S-NNLS) problem. We introduce a family of probability densities, referred to as the rectified Gaussian scale mixture (R-GSM), to model the sparsity-enforcing prior distribution for the solution. The R-GSM prior encompasses a variety of heavy-tailed densities, such as the rectified Laplacian and rectified Student's t distributions, with a proper choice of the mixing density. We utilize the hierarchical representation induced by the R-GSM prior and develop an evidence maximization framework based on the expectation-maximization (EM) algorithm. Using the EM-based method, we estimate the hyper-parameters and obtain a point estimate of the solution. We refer to the proposed method as rectified sparse Bayesian learning (R-SBL). We present four R-SBL variants that offer a range of trade-offs between computational complexity and the quality of the E-step computation. These variants include Markov chain Monte Carlo EM, linear minimum mean-square-error estimation, approximate message passing, and a diagonal approximation. Through numerical experiments, we show that the proposed R-SBL method outperforms existing S-NNLS solvers in terms of both signal and support recovery performance, and is also very robust to the structure of the design matrix.
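To make the R-GSM prior concrete, here is a minimal, hedged sketch (not the paper's code) of drawing samples from such a mixture: each coefficient's variance is drawn from a mixing density, and the coefficient itself is a zero-mean Gaussian of that variance restricted to the non-negative half-line. The function name `sample_rgsm` and the specific mixing choices are illustrative assumptions; the paper's exact mixing densities and normalizations may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_rgsm(n, mixing="exponential", scale=1.0):
    """Illustrative R-GSM sampler (assumed construction, not the paper's code).

    Each coefficient has variance gamma drawn from a mixing density p(gamma);
    the coefficient is a zero-mean N(0, gamma) draw restricted to [0, inf).
    """
    if mixing == "exponential":
        # Exponential mixing yields rectified-Laplacian-like heavy tails.
        gamma = rng.exponential(scale, size=n)
    elif mixing == "inverse-gamma":
        # Inverse-gamma mixing yields rectified Student's-t-like tails.
        gamma = scale / rng.gamma(1.0, 1.0, size=n)
    else:
        raise ValueError(f"unknown mixing density: {mixing}")
    # A zero-mean Gaussian truncated to the non-negative half-line is
    # equivalent in distribution to the absolute value of a N(0, gamma) draw.
    return np.abs(rng.normal(0.0, np.sqrt(gamma)))

# Usage: draw 10,000 non-negative, heavy-tailed prior samples.
x = sample_rgsm(10000)
```

Because the samples are rectified, they are all non-negative by construction, which is what makes this family suitable as a prior for the non-negative solution vector in S-NNLS.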