Published on Sun Nov 04 2018

Logic Attention Based Neighborhood Aggregation for Inductive Knowledge Graph Embedding

Peifeng Wang, Jialong Han, Chenliang Li, Rong Pan

Knowledge graph embedding aims at modeling entities and relations with low-dimensional vectors. Most previous methods require that all entities be seen during training. We introduce a novel aggregator, the Logic Attention Network (LAN), which embeds new entities by aggregating their neighbors with both rules- and network-based attention weights.

Abstract

Knowledge graph embedding aims at modeling entities and relations with low-dimensional vectors. Most previous methods require that all entities should be seen during training, which is impractical for real-world knowledge graphs with new entities emerging on a daily basis. Recent efforts on this issue suggest training a neighborhood aggregator in conjunction with the conventional entity and relation embeddings, which may help embed new entities inductively via their existing neighbors. However, their neighborhood aggregators neglect the unordered and unequal natures of an entity's neighbors. To this end, we summarize the desired properties that may lead to effective neighborhood aggregators. We also introduce a novel aggregator, namely, Logic Attention Network (LAN), which addresses the properties by aggregating neighbors with both rules- and network-based attention weights. By comparing with conventional aggregators on two knowledge graph completion tasks, we experimentally validate LAN's superiority in terms of the desired properties.
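As a rough illustration of the idea, the sketch below embeds an unseen entity as an attention-weighted sum of its neighbors' representations, combining a rule-style confidence with a neural attention score. The function names, the multiplicative combination of the two weights, and the toy inputs are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def aggregate_unseen_entity(neighbor_embs, rule_conf, attn_vec):
    """Embed an unseen entity as a weighted sum of its neighbors' embeddings.

    neighbor_embs : (n, d) array of (relation-specific) neighbor representations.
    rule_conf     : (n,) rule-based confidence for each neighbor's relation,
                    an illustrative stand-in for LAN's logic attention.
    attn_vec      : (d,) learned query vector for network-based attention.
    """
    # Network-based attention: score each neighbor against the query vector.
    neural_weights = softmax(neighbor_embs @ attn_vec)
    # Combine both attention sources; the multiplicative mix is an assumption.
    weights = softmax(np.log(rule_conf + 1e-9) + np.log(neural_weights + 1e-9))
    # Permutation-invariant aggregation: a weighted mean of the neighbors.
    return weights @ neighbor_embs

# Toy usage: three neighbors in a 4-dimensional embedding space.
rng = np.random.default_rng(0)
neighbors = rng.normal(size=(3, 4))
embedding = aggregate_unseen_entity(neighbors,
                                    rule_conf=np.array([0.9, 0.5, 0.1]),
                                    attn_vec=rng.normal(size=4))
print(embedding.shape)  # (4,)
```

Because the weighted sum is symmetric in the neighbors, the aggregator is permutation invariant, matching the unordered nature of an entity's neighborhood, while the two attention weights account for neighbors being unequal.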

Sat Aug 11 2018
Machine Learning
Knowledge Graph Embedding with Entity Neighbors and Deep Memory Network
Knowledge Graph Embedding (KGE) aims to represent entities and relations in a low-dimensional continuous vector space. Recent works focus on incorporating structural knowledge with additional information.
Sun Dec 13 2020
Artificial Intelligence
Context-Enhanced Entity and Relation Embedding for Knowledge Graph Completion
Most research on knowledge graph completion learns representations of entities and relations to predict missing links in incomplete knowledge graphs. We propose a model named AggrE, which conducts efficient aggregations separately on entity context and relation context over multiple hops.
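A minimal sketch of the general idea, assuming mean pooling in place of AggrE's learned multi-hop aggregation and a TransE-style score for link prediction; all names and shapes here are illustrative.

```python
import numpy as np

def context_enhanced_score(head, rel, tail, head_context, rel_context):
    """Score a triple with context-enhanced embeddings. Entity context and
    relation context are aggregated separately (means, as a stand-in for the
    learned multi-hop aggregation) before a simple translational score."""
    h = head + head_context.mean(axis=0)   # entity enhanced by its context
    r = rel + rel_context.mean(axis=0)     # relation enhanced by its context
    return -np.linalg.norm(h + r - tail)   # TransE-style plausibility score

rng = np.random.default_rng(4)
d = 5
score = context_enhanced_score(rng.normal(size=d), rng.normal(size=d),
                               rng.normal(size=d),
                               rng.normal(size=(3, d)),   # 3 entity-context vectors
                               rng.normal(size=(2, d)))   # 2 relation-context vectors
print(score)
```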
Fri Aug 28 2020
Machine Learning
HittER: Hierarchical Transformers for Knowledge Graph Embeddings
HittER is a Hierarchical Transformer model that jointly learns entity-relation composition and relational contextualization. The bottom block extracts features of each entity-relation pair in the local neighborhood of the source entity. The top block aggregates the outputs of the bottom block.
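The sketch below mimics this two-level structure with a single attention step per level; the additive pair composition and the single-head attention are simplifying stand-ins for the stacked Transformer blocks described in the paper.

```python
import numpy as np

def attention_pool(query, keys, values):
    """Single-head scaled dot-product attention, used here as a stand-in for
    a full Transformer block."""
    scores = keys @ query / np.sqrt(query.shape[0])
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ values

def hierarchical_encode(source_emb, neighbor_rel_embs, neighbor_ent_embs):
    """Two-level encoding sketch: a bottom step composes each
    (relation, entity) neighbor pair, a top step aggregates the results."""
    # Bottom block: compose each entity-relation pair into one feature vector.
    # Addition is an illustrative composition; the real model learns it.
    pair_feats = neighbor_rel_embs + neighbor_ent_embs          # (n, d)
    # Top block: contextualize the source entity against the pair features.
    return attention_pool(source_emb, pair_feats, pair_feats)   # (d,)

rng = np.random.default_rng(1)
d, n = 8, 5
out = hierarchical_encode(rng.normal(size=d),
                          rng.normal(size=(n, d)),
                          rng.normal(size=(n, d)))
print(out.shape)  # (8,)
```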
Fri Oct 16 2020
Artificial Intelligence
Decentralized Knowledge Graph Representation Learning
Knowledge graph (KG) representation learning methods have achieved competitive performance in many KG-oriented tasks. In this paper, we present a decentralized KG representation learning approach, decentRL, which encodes each entity from and only from its neighbors.
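A minimal sketch of the decentralized idea, assuming mean pooling as the aggregator; decentRL itself learns the aggregation, so this is only illustrative.

```python
import numpy as np

def decentralized_embed(neighbor_embs):
    """Represent an entity from, and only from, its neighbors' embeddings.
    The entity contributes no embedding of its own; mean pooling stands in
    for the learned aggregator."""
    return neighbor_embs.mean(axis=0)

rng = np.random.default_rng(2)
neighbors = rng.normal(size=(4, 6))          # 4 neighbors, 6-dim embeddings
print(decentralized_embed(neighbors).shape)  # (6,)
```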
Wed Nov 20 2019
Artificial Intelligence
Knowledge Graph Alignment Network with Gated Multi-hop Neighborhood Aggregation
Graph neural networks (GNNs) have emerged as a powerful paradigm for embedding-based entity alignment. However, in real knowledge graphs (KGs), counterpart entities usually have non-isomorphic neighborhood structures. To tackle this problem, we propose a new KG alignment network with gated multi-hop neighborhood aggregation.
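A rough sketch of gated multi-hop aggregation under simplifying assumptions: each hop is summarized by a mean and the two summaries are blended with an element-wise gate; the parameters and shapes are illustrative, not the proposed network.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_multihop_aggregate(one_hop, two_hop, gate_w, gate_b):
    """Blend one-hop and two-hop neighborhood summaries with a learned gate,
    so that distant neighbors can compensate for non-isomorphic one-hop
    neighborhoods (illustrative parameters; the real model is trained
    end to end)."""
    h1 = one_hop.mean(axis=0)            # one-hop neighborhood summary
    h2 = two_hop.mean(axis=0)            # two-hop neighborhood summary
    g = sigmoid(gate_w @ h1 + gate_b)    # element-wise gate in [0, 1]
    return g * h1 + (1.0 - g) * h2       # gated combination

rng = np.random.default_rng(3)
d = 6
out = gated_multihop_aggregate(rng.normal(size=(3, d)),   # 3 one-hop neighbors
                               rng.normal(size=(7, d)),   # 7 two-hop neighbors
                               rng.normal(size=(d, d)),
                               rng.normal(size=d))
print(out.shape)  # (6,)
```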
Fri Jan 22 2021
NLP
Knowledge Graph Completion with Text-aided Regularization
KG completion is the task of expanding a knowledge graph or knowledge base. Traditional approaches focus on using the existing graphical information that is intrinsic to the graph. We think that corpora related to the entities should also contain information that can positively influence the embeddings.