PROJECT TITLE: Flexible Multi-layer Sparse Approximations of Matrices and Applications - 2016

ABSTRACT: In this paper we propose a family of new algorithms for non-negative matrix/tensor factorization (NMF/NTF) and sparse non-negative coding and representation, with potential applications in computational neuroscience, multi-sensory and multi-dimensional data analysis, and text mining. We have developed a class of local algorithms that extend the Hierarchical Alternating Least Squares (HALS) algorithms we proposed in [1]. For this purpose, we perform simultaneous constrained minimization of a set of robust cost functions known as alpha and beta divergences. Our algorithms are locally stable and work well for NMF-based blind source separation (BSS), not only in the over-determined case but also in the under-determined (over-complete) case (i.e., a system with fewer sensors than sources), provided the data are sufficiently sparse. The NMF learning rules are extended and generalized to N-th order non-negative tensor factorization (NTF). Moreover, the new algorithms can easily be accommodated to different noise statistics by adjusting a single parameter. Extensive experimental results confirm the validity and high performance of the developed algorithms, especially when the multi-layer hierarchical approach [1] is used.
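The abstract describes local HALS-style update rules for NMF but does not spell them out. The sketch below is an illustrative NumPy implementation of a standard HALS iteration for the squared-Euclidean cost (the beta-divergence special case with beta = 2), not the paper's full alpha/beta-divergence family; the function name `hals_nmf` and all parameter defaults are assumptions for this example.

```python
import numpy as np

def hals_nmf(V, rank, n_iter=200, eps=1e-9, seed=0):
    """Hierarchical Alternating Least Squares NMF: V ~= W @ H with W, H >= 0.

    Illustrative sketch of the squared-Euclidean (beta = 2) case only;
    the paper's alpha/beta-divergence updates are more general.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(n_iter):
        # Update each row of H in turn, holding W fixed (local rule).
        WtW = W.T @ W
        WtV = W.T @ V
        for j in range(rank):
            H[j] = np.maximum(eps, H[j] + (WtV[j] - WtW[j] @ H) / max(WtW[j, j], eps))
        # Update each column of W in turn, holding H fixed.
        HHt = H @ H.T
        VHt = V @ H.T
        for j in range(rank):
            W[:, j] = np.maximum(eps, W[:, j] + (VHt[:, j] - W @ HHt[:, j]) / max(HHt[j, j], eps))
    return W, H
```

Each inner step is a closed-form non-negative least-squares update for a single factor column (or row), which is what makes the algorithm "local"; the multi-layer approach mentioned in the abstract would apply such a factorization repeatedly, feeding one layer's output into the next.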