Seminars


Neural Graph Embedding Methods for Natural Language Processing

Series: Ph.D. Thesis Defence (Online)

Speaker: Mr. Shikhar Vashishth, Ph.D. Student, Dept. of CSA

Date/Time: May 04, 10:30

Location: Microsoft Meeting Link

Faculty Advisors: Prof. Partha Pratim Talukdar & Prof. Chiranjib Bhattacharyya

Abstract:
Graphs are all around us, ranging from citation and social networks to Knowledge Graphs (KGs). They are among the most expressive data structures and have been used to model a wide variety of problems. Knowledge Graphs are structured representations of facts, where nodes represent entities and edges represent relationships between them. Recent research has produced several large KGs, such as DBpedia, YAGO, NELL, and Freebase. However, all of them tend to be sparse, with very few facts per entity; for instance, the NELL KG contains only 1.34 facts per entity. In the first part of the thesis, we propose three solutions to alleviate this problem: (1) KG Canonicalization, i.e., identifying and merging duplicate entities in a KG; (2) Relation Extraction, i.e., automatically extracting semantic relationships between entities from unstructured text; and (3) Link Prediction, i.e., inferring missing facts from the known facts in a KG. For KG canonicalization, we propose CESI (Canonicalization using Embeddings and Side Information), a novel approach that performs canonicalization over learned embeddings of Open KGs. The method extends recent advances in KG embedding by incorporating relevant noun phrase (NP) and relation phrase side information in a principled manner. For relation extraction, we propose RESIDE, a distantly supervised neural relation extraction method that utilizes additional side information from KGs for improved relation extraction. Finally, for link prediction, we propose InteractE, which extends ConvE, a convolutional neural network-based link prediction method, by increasing the number of feature interactions through three key ideas: feature permutation, a novel feature reshaping, and circular convolution. Through extensive experiments on multiple datasets, we demonstrate the effectiveness of our proposed methods.
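To make the link-prediction component more concrete, the following is a minimal PyTorch sketch of an InteractE-style scoring function, combining feature permutation, 2D reshaping, and circular convolution on top of a ConvE-like architecture. All class names and dimensions, and the single-permutation simplification, are illustrative assumptions, not the exact model described in the thesis.

import torch
import torch.nn as nn
import torch.nn.functional as F

class InteractEStyleScorer(nn.Module):
    """Hypothetical sketch: ConvE-style scorer with feature permutation and circular convolution."""
    def __init__(self, num_entities, num_relations, dim=200, h=10, w=20, channels=32, kernel=9):
        super().__init__()
        assert h * w == dim
        self.ent = nn.Embedding(num_entities, dim)
        self.rel = nn.Embedding(num_relations, dim)
        # One fixed random permutation of the concatenated (entity, relation) features;
        # the actual method uses several such permutations.
        self.register_buffer("perm", torch.randperm(2 * dim))
        self.conv = nn.Conv2d(1, channels, kernel)
        self.fc = nn.Linear(channels * 2 * h * w, dim)
        self.h, self.w, self.kernel = h, w, kernel

    def forward(self, subj_idx, rel_idx):
        e, r = self.ent(subj_idx), self.rel(rel_idx)          # (B, dim) each
        x = torch.cat([e, r], dim=1)[:, self.perm]            # feature permutation
        x = x.view(-1, 1, 2 * self.h, self.w)                 # reshape onto a 2D grid
        pad = self.kernel // 2
        x = F.pad(x, (pad, pad, pad, pad), mode="circular")   # wrap-around padding for circular convolution
        x = torch.relu(self.conv(x))
        x = self.fc(x.flatten(1))                             # (B, dim) query vector
        return x @ self.ent.weight.t()                        # scores over all candidate objects

The key design point is that permuting and reshaping the features before a convolution with circular (wrap-around) padding lets far more entity-relation feature pairs interact inside each receptive field than a plain reshaping would.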

Traditional neural networks, such as Convolutional Neural Networks and Recurrent Neural Networks, are constrained to handle Euclidean data. However, graphs are prominent in Natural Language Processing (NLP). Recently, Graph Convolutional Networks (GCNs) have been proposed to address this shortcoming and have been successfully applied to several problems. In the second part of the thesis, we utilize GCNs for the Document Timestamping problem, which is an essential component of tasks such as document retrieval and summarization. For this, we propose NeuralDater, which leverages GCNs to jointly exploit the syntactic and temporal graph structures of a document, obtaining state-of-the-art performance on the problem. We also propose SynGCN, a flexible graph convolution-based method for learning word embeddings that utilizes the dependency context of a word instead of its linear context to learn more meaningful word embeddings. In the third part of the thesis, we address two limitations of existing GCN models: (1) The standard neighborhood aggregation scheme puts no constraints on the number of nodes that can influence the representation of a target node. This leads to noisy representations of hub nodes, whose neighborhood covers almost the entire graph within a few hops. To address this shortcoming, we propose ConfGCN (Confidence-based GCN), which estimates confidences to determine the influence of one node on another during aggregation, thereby restricting its influence neighborhood. (2) Most existing GCN models are limited to handling undirected graphs. However, a more general and pervasive class of graphs is relational graphs, where each edge has a label and direction associated with it. Existing approaches for handling such graphs suffer from over-parameterization and are restricted to learning representations of nodes only. We propose CompGCN, a novel graph convolutional framework that jointly embeds entities and relations in a relational graph. CompGCN is parameter-efficient and scales with the number of relations. It leverages a variety of entity-relation composition operations from KG embedding techniques and achieves demonstrably superior results on node classification, link prediction, and graph classification tasks.
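As an illustration of the relational message passing used in the last part, the sketch below shows a CompGCN-style layer that composes a neighbor's embedding with the embedding of the connecting relation before aggregation. The layer interface, the loop-based aggregation, and the choice of composition operators are simplifying assumptions for exposition, not the thesis implementation (which also handles edge directions and self-loops).

import torch
import torch.nn as nn

class CompGCNLayerSketch(nn.Module):
    """Hypothetical sketch of a CompGCN-style layer that updates node and relation embeddings."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.w = nn.Linear(in_dim, out_dim, bias=False)       # transforms composed messages
        self.w_rel = nn.Linear(in_dim, out_dim, bias=False)   # transforms relation embeddings

    @staticmethod
    def compose(h_neighbor, h_rel, op="sub"):
        # Entity-relation composition operators borrowed from KG embedding methods.
        if op == "sub":    # TransE-style subtraction
            return h_neighbor - h_rel
        if op == "mult":   # DistMult-style elementwise product
            return h_neighbor * h_rel
        raise ValueError(f"unknown composition operator: {op}")

    def forward(self, node_feats, rel_feats, edges):
        # edges: iterable of (source, relation, target) index triples.
        out = torch.zeros(node_feats.size(0), self.w.out_features)
        deg = torch.zeros(node_feats.size(0), 1)
        for s, r, t in edges:
            msg = self.compose(node_feats[s], rel_feats[r])   # compose neighbor with relation
            out[t] += self.w(msg)                             # accumulate transformed messages
            deg[t] += 1
        out = out / deg.clamp(min=1)                          # mean aggregation over neighbors
        return torch.relu(out), self.w_rel(rel_feats)         # updated node and relation embeddings

Because relations are represented by embeddings that are transformed alongside the node embeddings, the parameter count grows with the embedding dimension rather than with a separate weight matrix per relation, which addresses the over-parameterization issue mentioned above.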
