PROJECT TITLE: Convergence Analysis of the Locally Competitive Algorithm for Sparse Approximation
We present an analysis of the Locally Competitive Algorithm (LCA), a Hopfield-style neural network that efficiently solves sparse approximation problems (e.g., approximating a vector from a dictionary using only a few nonzero coefficients). This class of problems plays an important role both in theories of neural coding and in signal processing applications. However, the LCA lacks an analysis of its convergence properties, and previous results on neural networks for nonsmooth optimization do not apply to the specifics of the LCA architecture. We show that the LCA has desirable convergence properties, such as stability and global convergence to the optimum of the objective function when it is unique. Under some mild conditions, the support of the solution is also proven to be reached in finite time. Furthermore, some restrictions on the problem specifics allow us to characterize the convergence rate of the system by showing that the LCA converges exponentially fast with an analytically bounded convergence rate. We support our analysis with several illustrative simulations.
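To make the architecture concrete, below is a minimal sketch of the LCA dynamics as they are usually described: internal state variables evolve under a feedforward drive and lateral inhibition, and outputs are produced by a thresholding activation. This is an illustrative Euler simulation, not the authors' implementation; the dictionary `Phi`, threshold `lam`, time constant `tau`, and step size `dt` are all assumed parameters chosen for the example.

```python
import numpy as np

def soft_threshold(u, lam):
    # Thresholding activation: outputs are zero below lam, shrunk above.
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def lca(Phi, s, lam=0.05, tau=1.0, dt=0.1, n_steps=2000):
    """Euler integration of the LCA internal-state dynamics (sketch).

    Phi : (m, n) dictionary with (roughly) unit-norm columns
    s   : (m,) signal to approximate sparsely
    """
    n = Phi.shape[1]
    u = np.zeros(n)                 # internal states
    G = Phi.T @ Phi - np.eye(n)     # lateral inhibition (competition term)
    b = Phi.T @ s                   # feedforward drive
    for _ in range(n_steps):
        a = soft_threshold(u, lam)  # active coefficients
        u += (dt / tau) * (b - u - G @ a)
    return soft_threshold(u, lam)

# Hypothetical usage: recover a sparse code from a random dictionary.
rng = np.random.default_rng(0)
m, n = 20, 30
Phi = rng.standard_normal((m, n))
Phi /= np.linalg.norm(Phi, axis=0)  # normalize columns
x = np.zeros(n)
x[[2, 7, 19]] = 1.0                 # 3-sparse ground truth
s = Phi @ x
a = lca(Phi, s)
```

With the soft-threshold activation, the fixed points of these dynamics correspond to minimizers of the familiar l1-regularized least-squares objective, which is one way to connect the network's convergence to the sparse approximation problem it solves.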