Generalized SMO Algorithm for SVM-Based Multitask Learning

ABSTRACT: Exploiting additional information to improve traditional inductive learning is an active research area in Machine Learning. In many supervised-learning applications, training data can be naturally separated into several groups, and incorporating this group information into learning may improve generalization. Recently, Vapnik proposed a general approach to formalizing such problems, known as "learning with structured data," together with a support vector machine (SVM) based optimization formulation called ${\rm SVM}+$. Liang and Cherkassky showed the connection between ${\rm SVM}+$ and multitask learning (MTL) approaches in Machine Learning, and proposed an SVM-based formulation for MTL classification called ${\rm SVM}{+}{\rm MTL}$. Training the ${\rm SVM}{+}{\rm MTL}$ classifier requires solving a large quadratic programming problem whose cost scales as $O(n^{3})$ with sample size $n$, so computationally efficient algorithms are needed to make ${\rm SVM}{+}{\rm MTL}$ practical. This brief generalizes Platt's sequential minimal optimization (SMO) algorithm to the ${\rm SVM}{+}{\rm MTL}$ setting. Empirical results show that, for typical ${\rm SVM}{+}{\rm MTL}$ problems, the proposed generalized SMO achieves over a 100-times speed-up compared with general-purpose optimization routines.
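To make the idea behind the brief concrete, the sketch below shows the core SMO mechanism that the paper generalizes: instead of solving the full $O(n^{3})$ quadratic program at once, SMO repeatedly picks a pair of dual variables $(\alpha_i, \alpha_j)$ and optimizes them analytically while preserving the equality constraint $\sum_k \alpha_k y_k = 0$. This is a minimal, simplified SMO for a standard linear soft-margin SVM (random second-choice heuristic, no working-set caching), not the paper's ${\rm SVM}{+}{\rm MTL}$ variant; all function and variable names here are illustrative.

```python
import numpy as np

def simplified_smo(X, y, C=1.0, tol=1e-4, max_passes=20, seed=0):
    """Simplified SMO for a linear soft-margin SVM (illustrative sketch).

    Repeatedly updates a pair (alpha_i, alpha_j) in closed form while
    keeping sum(alpha * y) = 0 -- the pairwise-update idea that the brief
    extends to the SVM+MTL dual problem.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    K = X @ X.T                      # linear kernel (Gram matrix)
    alpha = np.zeros(n)
    b = 0.0
    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            # prediction error on example i under the current dual variables
            Ei = (alpha * y) @ K[:, i] + b - y[i]
            if (y[i] * Ei < -tol and alpha[i] < C) or (y[i] * Ei > tol and alpha[i] > 0):
                j = rng.integers(n - 1)
                if j >= i:
                    j += 1           # pick a random j != i
                Ej = (alpha * y) @ K[:, j] + b - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                # box [L, H] for alpha_j implied by the equality constraint
                if y[i] != y[j]:
                    L, H = max(0.0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0.0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                if L == H:
                    continue
                eta = 2 * K[i, j] - K[i, i] - K[j, j]   # second derivative along the pair
                if eta >= 0:
                    continue
                alpha[j] = np.clip(aj_old - y[j] * (Ei - Ej) / eta, L, H)
                if abs(alpha[j] - aj_old) < 1e-7:
                    continue
                alpha[i] = ai_old + y[i] * y[j] * (aj_old - alpha[j])
                # update the threshold b from the KKT conditions
                b1 = b - Ei - y[i] * (alpha[i] - ai_old) * K[i, i] \
                     - y[j] * (alpha[j] - aj_old) * K[i, j]
                b2 = b - Ej - y[i] * (alpha[i] - ai_old) * K[i, j] \
                     - y[j] * (alpha[j] - aj_old) * K[j, j]
                if 0 < alpha[i] < C:
                    b = b1
                elif 0 < alpha[j] < C:
                    b = b2
                else:
                    b = (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    w = (alpha * y) @ X              # recover the primal weight vector
    return w, b
```

The generalized algorithm in the brief follows the same pattern but must additionally respect the extra constraints introduced by the ${\rm SVM}+$ correcting functions, one per task group, when selecting and updating variable pairs.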