Abstract
Textual network embedding leverages rich text information associated with a network to learn low-dimensional vector representations of its vertices. Rather than relying on typical natural language processing (NLP) approaches, recent research exploits the relationship between texts on the same edge to embed text within the graph. However, these models fail to measure the full connectivity between any two texts in the graph. We present diffusion maps for textual network embedding (DMTE), which integrates global structural information of the graph to capture the semantic relatedness between texts, applying a diffusion-convolution operation to the text inputs. In addition, a new objective function is designed to efficiently preserve high-order proximity using the graph diffusion. Experimental results show that the proposed approach outperforms state-of-the-art methods on vertex-classification and link-prediction tasks.
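The diffusion-convolution step described above can be pictured as propagating per-vertex text features through successive steps of a graph diffusion before they are combined into vertex embeddings. The sketch below is only illustrative and not the paper's exact formulation: the function name `diffusion_features`, the row-normalized transition matrix, the truncation at `n_hops` steps, and the simple mean combination are all assumptions made for the example.

```python
import numpy as np

def diffusion_features(adj, text_feats, n_hops=3):
    """Hypothetical sketch: aggregate per-vertex text features over a
    truncated graph diffusion (powers of the row-normalized transition
    matrix), one plausible reading of a diffusion-convolution step.

    adj:        (V, V) adjacency matrix of the textual network
    text_feats: (V, d) text embeddings (e.g., averaged word vectors)
    n_hops:     number of diffusion steps to keep in the truncated series
    """
    # Row-normalized transition matrix P (guard against isolated vertices)
    deg = adj.sum(axis=1, keepdims=True)
    P = adj / np.maximum(deg, 1e-12)

    # Propagate features through successive diffusion steps:
    # [X, PX, P^2 X, ..., P^k X], then combine (here: a simple mean)
    hops, prop = [text_feats], text_feats
    for _ in range(n_hops):
        prop = P @ prop
        hops.append(prop)
    return np.mean(hops, axis=0)   # (V, d) diffusion-smoothed text features


# Toy usage on a 4-vertex network with 8-dimensional text features
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
X = np.random.rand(4, 8)
Z = diffusion_features(A, X)
print(Z.shape)  # (4, 8)
```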
| Original language | English (US) |
| --- | --- |
| Title of host publication | Advances in Neural Information Processing Systems |
| Publisher | Neural information processing systems foundation |
| Pages | 7587-7597 |
| Number of pages | 11 |
| State | Published - Jan 1 2018 |
| Externally published | Yes |