Alleviating Sparsity of Open Knowledge Graphs with Ternary Contrastive Learning
Qian Li, Shafiq Joty, Daling Wang, Shi Feng, and Yifei Zhang. In Findings of the Conference on Empirical Methods in Natural Language Processing (EMNLP'22 Findings), 2022.
The sparsity of formal knowledge and the roughness of non-ontological construction methods make the sparsity problem particularly prominent in Open Knowledge Graphs (OpenKGs). Due to sparse links, learning effective representations for few-shot entities becomes difficult. We hypothesize that a contrastive learning (CL) formulation, which introduces negative samples, could be beneficial in such scenarios. However, existing CL methods model KG triplets as binary objects and are too generic: they ignore the zero-shot, few-shot, and synonymity problems that arise in OpenKGs. To address this, we propose TernaryCL, a CL framework based on ternary propagation patterns among head, relation, and tail. TernaryCL designs Contrastive Entity and Contrastive Relation to mine ternary discriminative features by considering both negative entities and negative relations. It also introduces Contrastive Self to help zero- and few-shot entities learn discriminative features, Contrastive Synonym to account for synonymous entities, and Contrastive Fusion to aggregate graph features from multiple paths. With extensive experiments on benchmark datasets, we demonstrate the superiority of TernaryCL over state-of-the-art models.
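To make the ternary idea concrete, the sketch below shows an InfoNCE-style loss that contrasts a true (head, relation, tail) triplet against both corrupted tails (negative entities) and corrupted relations, rather than only corrupted entities as in binary formulations. This is a minimal illustration under assumed design choices (a translational scoring function and uniformly sampled negatives), not the paper's exact objective or implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

def score(h, r, t):
    # Translational plausibility: higher (less negative) when h + r is close to t.
    return -np.linalg.norm(h + r - t)

def ternary_contrastive_loss(h, r, t, neg_entities, neg_relations, tau=1.0):
    """InfoNCE-style loss over a ternary candidate set: the true triplet is
    contrasted against triplets with corrupted tails (negative entities) and
    corrupted relations (negative relations). Hypothetical sketch only."""
    pos = score(h, r, t) / tau
    neg_e = [score(h, r, t_neg) / tau for t_neg in neg_entities]   # corrupt tail
    neg_r = [score(h, r_neg, t) / tau for r_neg in neg_relations]  # corrupt relation
    logits = np.array([pos] + neg_e + neg_r)
    # Softmax cross-entropy with the true triplet as the positive class.
    return float(-(pos - np.log(np.exp(logits).sum())))

# Toy data: the positive tail is (almost) exactly h + r, negatives are random.
h, r = rng.normal(size=dim), rng.normal(size=dim)
t = h + r + 0.01 * rng.normal(size=dim)
neg_ents = [rng.normal(size=dim) for _ in range(4)]
neg_rels = [rng.normal(size=dim) for _ in range(4)]
loss = ternary_contrastive_loss(h, r, t, neg_ents, neg_rels)
print(loss)
```

Because the positive triplet scores far above the random negatives, the loss is close to zero; a uniform scorer would instead sit near log(1 + #negatives). Adding negative relations alongside negative entities is what makes the candidate set ternary.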