Extreme Learning Machines Using Mixture Correntropy

PROJECT TITLE : Mixture Correntropy-Based Kernel Extreme Learning Machines

ABSTRACT: As a natural extension of ELM to kernel learning, the kernel-based extreme learning machine (KELM) has achieved outstanding performance on a wide range of regression and classification problems. Compared with the standard ELM, KELM usually generalizes better because it does not require the number of hidden nodes to be specified in advance and avoids the random projection mechanism. However, because KELM is derived under the minimum mean square error (MMSE) criterion, which implicitly assumes Gaussian noise, its performance can deteriorate significantly in non-Gaussian cases. To improve robustness, this article proposes a mixture correntropy-based KELM (MC-KELM), which adopts the recently proposed maximum mixture correntropy criterion as the optimization criterion in place of MMSE. In addition, an online sequential version of MC-KELM, called MCOS-KELM, is developed to handle data that arrive sequentially (one by one or chunk by chunk). Experimental results on regression and classification data sets are reported to validate the superior performance of the new methods.
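To give a feel for how a mixture-correntropy criterion differs from plain MMSE, below is a minimal NumPy sketch of an iteratively reweighted kernel ELM. The closed-form KELM solution beta = (Omega + I/C)^(-1) T is the standard kernel ELM result; the mixture correntropy weighting (parameters alpha, s1, s2) and the simple fixed-point loop are illustrative assumptions, not the exact update derived in the paper.

```python
# Illustrative sketch only: standard KELM plus an assumed mixture-correntropy
# reweighting loop. Function and parameter names (alpha, s1, s2, C, gamma)
# are hypothetical, not taken from the paper.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and the rows of B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def mc_kelm_fit(X, T, C=100.0, gamma=1.0, alpha=0.5, s1=0.5, s2=2.0, iters=10):
    """Sketch of a mixture-correntropy KELM fit via iterative reweighting.

    alpha : mixing coefficient of the two Gaussian kernels in the mixture
    s1,s2 : the two kernel bandwidths of the mixture correntropy criterion
    """
    n = X.shape[0]
    Omega = rbf_kernel(X, X, gamma)
    # Standard KELM / kernel ridge solution under the MMSE criterion.
    beta = np.linalg.solve(Omega + np.eye(n) / C, T)
    for _ in range(iters):
        e = T - Omega @ beta                     # training residuals
        # Mixture correntropy of each residual: large errors get small weights,
        # which is what makes the estimator robust to non-Gaussian noise.
        w = (alpha * np.exp(-e**2 / (2 * s1**2))
             + (1 - alpha) * np.exp(-e**2 / (2 * s2**2)))
        w = np.maximum(w, 1e-8)
        # Weighted kernel ridge solve: beta = (Omega + diag(1/w)/C)^(-1) T
        beta = np.linalg.solve(Omega + np.diag(1.0 / w) / C, T)
    return beta

def mc_kelm_predict(X_train, beta, X_new, gamma=1.0):
    return rbf_kernel(X_new, X_train, gamma) @ beta

# Toy usage: sine regression contaminated with a few gross outliers
# (a simple stand-in for non-Gaussian noise).
rng = np.random.default_rng(0)
X = np.linspace(0, 6, 80)[:, None]
T = np.sin(X).ravel() + 0.05 * rng.standard_normal(80)
T[::10] += 3.0                                   # inject outliers
beta = mc_kelm_fit(X, T)
pred = mc_kelm_predict(X, beta, X)
```

With the weights fixed at 1 the loop reduces to ordinary KELM under MMSE; the down-weighting of large residuals is the intuition behind the robustness claim, while the paper's MC-KELM and its online MCOS-KELM variant derive their own specific solutions.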