PROJECT TITLE :
A Holistic Approach for Distributed Dimensionality Reduction of Big Data - 2018
With the exponential growth of data volume, big data has placed an unprecedented burden on current computing infrastructure. Dimensionality reduction of big data has attracted a great deal of attention recently as an efficient way to extract the core information, which is smaller to store and faster to process. This project addresses three fundamental problems closely connected to distributed dimensionality reduction of big data: big data fusion, the dimensionality reduction algorithm, and the construction of a distributed computing platform. A chunk tensor method is presented to fuse unstructured, semi-structured, and structured data into a unified model in which all characteristics of the heterogeneous data are appropriately arranged along the tensor orders. A Lanczos-based high-order singular value decomposition algorithm is proposed to reduce the dimensionality of the unified model. Theoretical analyses of the algorithm are provided in terms of storage scheme, convergence property, and computation cost. To execute the dimensionality reduction task, this project employs the transparent computing paradigm to construct a distributed computing platform and uses a four-objective optimization model to schedule the tasks. Experimental results demonstrate that the proposed holistic approach is efficient for distributed dimensionality reduction of big data.
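The project's Lanczos-based, distributed variant is not reproduced here, but the underlying idea of high-order singular value decomposition can be sketched in a few lines. The following is a minimal, single-machine illustration in numpy: it computes a truncated Tucker-style decomposition by taking the leading left singular vectors of each mode unfolding (via a full SVD for simplicity, where the paper would use a Lanczos iteration), then projects the tensor onto those factors to obtain a smaller core. The function names and rank choices are illustrative, not taken from the paper.

```python
import numpy as np

def truncated_hosvd(tensor, ranks):
    """Truncated higher-order SVD (illustrative sketch).

    For each mode, unfold the tensor into a matrix, keep the leading
    `ranks[mode]` left singular vectors, then project the tensor onto
    every factor matrix to form a reduced core tensor.
    """
    factors = []
    for mode, r in enumerate(ranks):
        # Unfold along `mode`: move that axis to the front, flatten the rest.
        unfolded = np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfolded, full_matrices=False)
        factors.append(U[:, :r])
    # Contract each factor (transposed) with the corresponding mode.
    core = tensor
    for mode, U in enumerate(factors):
        core = np.moveaxis(
            np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors

def reconstruct(core, factors):
    """Expand the core back through the factor matrices (approximation)."""
    t = core
    for mode, U in enumerate(factors):
        t = np.moveaxis(
            np.tensordot(U, np.moveaxis(t, mode, 0), axes=1), 0, mode)
    return t
```

With full ranks the reconstruction is exact; truncating the ranks yields the compressed representation (core plus small factor matrices) that is cheaper to store and process, which is the effect the abstract describes.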