Minority Oversampling in Kernel Adaptive Subspaces for Class Imbalanced Datasets - 2018


The class imbalance problem in machine learning occurs when certain classes are underrepresented relative to others, biasing learning toward the majority classes. To deal with skewed class distributions, many learning methods that feature minority oversampling have been proposed and shown to be effective. To reduce information loss during feature-space projection, this study proposes a novel oversampling algorithm, named minority oversampling in kernel adaptive subspaces (MOKAS), which exploits the invariant-feature extraction capability of a kernel version of the adaptive-subspace self-organizing map. Synthetic instances are generated from well-trained subspaces, and their pre-images are then reconstructed in the input space. These instances capture nonlinear structures present in the minority-class data distribution and help learning algorithms counterbalance the skewed class distribution in a desirable manner. Experimental results on both real and synthetic data show that the proposed MOKAS can model complex data distributions and outperforms a collection of state-of-the-art oversampling algorithms.
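The abstract describes the general pipeline: project minority samples into a kernel subspace, generate synthetic points there, and map them back to the input space as pre-images. The paper's kernel adaptive-subspace SOM is not reproduced here; as a hedged illustration only, the sketch below substitutes scikit-learn's `KernelPCA` (with its built-in pre-image reconstruction) for the kernel subspace step, and interpolates between random minority pairs in that subspace. All function and parameter names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

def kernel_subspace_oversample(X_min, n_new, n_components=2, seed=0):
    """Illustrative kernel-subspace oversampling (NOT the MOKAS algorithm):
    fit a kernel subspace on minority samples, interpolate synthetic points
    in that subspace, and reconstruct their pre-images in the input space."""
    # KernelPCA stands in for the kernel adaptive-subspace SOM of the paper.
    kpca = KernelPCA(n_components=n_components, kernel="rbf", gamma=0.5,
                     fit_inverse_transform=True)  # enables pre-image recovery
    Z = kpca.fit_transform(X_min)                 # minority data in subspace

    rng = np.random.default_rng(seed)
    pairs = rng.integers(0, len(Z), size=(n_new, 2))
    lam = rng.random((n_new, 1))
    # Interpolate between random minority pairs inside the kernel subspace.
    Z_new = lam * Z[pairs[:, 0]] + (1.0 - lam) * Z[pairs[:, 1]]
    # Reconstruct approximate pre-images back in the original input space.
    return kpca.inverse_transform(Z_new)

# Usage: oversample a small 3-dimensional minority class to 5 extra points.
X_minority = np.random.RandomState(0).randn(20, 3)
X_synth = kernel_subspace_oversample(X_minority, n_new=5)
```

The synthetic points live in the input space (here, shape `(5, 3)`) and can be appended to the training set to rebalance the class distribution before fitting a classifier.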
