PROJECT TITLE:
Leveraging Conceptualization for Short-Text Embedding - 2018
Most short-text embedding models represent each short text solely by the literal meanings of its words, which leaves them unable to discriminate among the senses of the ever-present polysemous words. To strengthen the semantic representation of short texts, we (i) propose a novel short-text conceptualization algorithm that assigns the associated concepts to each short text, and (ii) introduce the conceptualization results into the learning of conceptual short-text embeddings. The resulting semantic representation is more expressive than widely used text representation models such as the latent topic model. The short-text conceptualization algorithm is based on a novel co-ranking framework that lets the two kinds of signals (the words and the concepts) fully interplay to derive a solid conceptualization for each short text. We further extend the conceptual short-text embedding models with an attention-based model that selects the relevant words within a context to make more accurate predictions. Experiments on real-world datasets demonstrate that the proposed conceptual short-text embedding model and short-text conceptualization algorithm are more effective than state-of-the-art methods.
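The co-ranking idea above can be sketched as a HITS-style mutual reinforcement between the words of a short text and their candidate concepts: a concept ranks highly if it is supported by highly ranked words, and vice versa. The toy isA association matrix below is purely illustrative (real systems draw such scores from a knowledge base such as Probase); it is a minimal sketch, not the paper's actual algorithm.

```python
import numpy as np

# Hypothetical isA associations for the short text "apple ipad juice":
# rows = words, cols = candidate concepts, entries = illustrative
# association strengths (e.g. something like P(concept | word)).
words = ["apple", "ipad", "juice"]
concepts = ["company", "device", "fruit"]
A = np.array([
    [0.6, 0.1, 0.3],   # "apple" relates to company, device, and fruit
    [0.2, 0.8, 0.0],   # "ipad" relates mostly to device
    [0.0, 0.0, 0.9],   # "juice" relates to fruit
])

def co_rank(A, iters=50):
    """Mutually reinforce word and concept scores over the bipartite
    word-concept graph: concepts are ranked by the scores of the words
    supporting them, and words are re-ranked by the concepts they evoke."""
    w = np.ones(A.shape[0]) / A.shape[0]   # uniform initial word scores
    c = A.T @ w
    c /= c.sum()
    for _ in range(iters):
        w = A @ c                          # words re-scored by concept support
        w /= w.sum()
        c = A.T @ w                        # concepts re-scored by word support
        c /= c.sum()
    return w, c

word_scores, concept_scores = co_rank(A)
```

After convergence, `concept_scores` gives a normalized ranking over the candidate concepts for the whole short text, which is the kind of conceptualization signal the embedding model can then consume.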
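The attention-based extension can likewise be sketched with generic dot-product attention: context words whose embeddings align with a query vector receive larger softmax weights, so the prediction context is dominated by the relevant words. This is an assumed, simplified scoring function for illustration, not the paper's exact model.

```python
import numpy as np

def attention_context(context_embeddings, query):
    """Weight context word embeddings by softmax dot-product attention
    against a query vector and return their weighted average."""
    scores = context_embeddings @ query       # one relevance score per word
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()
    return weights @ context_embeddings       # attention-weighted context

# Toy 2-D example: the query aligns strongly with the first context word,
# so the attended vector stays close to that word's embedding.
ctx = np.array([[1.0, 0.0],
                [0.0, 1.0]])
query = np.array([10.0, 0.0])
attended = attention_context(ctx, query)
```

Replacing a uniform average of context embeddings with these attention weights is what lets the model suppress irrelevant words when predicting a target word or concept.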