Improving cross-lingual entity alignment via optimal transport

Shichao Pei, Lu Yu, Xiangliang Zhang

Research output: Conference contribution (chapter in book/report/conference proceeding)

40 Scopus citations


Cross-lingual entity alignment identifies entity pairs that share the same meaning but reside in knowledge graphs (KGs) of different languages. This paper addresses two limitations that are widespread in current solutions: 1) alignment loss functions defined at the entity level serve the purpose of aligning labeled entities well, but fail to capture the whole picture of labeled and unlabeled entities across different KGs; 2) the translation from one domain to the other has been considered (e.g., X to Y by M1, or Y to X by M2), but the important duality of alignment between different KGs (X to Y by M1 and Y to X by M2) is ignored. We propose a novel entity alignment framework (OTEA), which dually optimizes the entity-level loss and a group-level loss via optimal transport theory. We also impose a regularizer on the dual translation matrices to mitigate the effect of noise during transformation. Extensive experimental results show that our model consistently outperforms state-of-the-art methods, with significant improvements in alignment accuracy.
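The group-level loss described above measures how well the *distributions* of entity embeddings in two KGs match, rather than only comparing labeled pairs. A minimal sketch of that idea, using entropy-regularized optimal transport (Sinkhorn iterations) over toy embeddings; this is an illustrative assumption, not the paper's actual OTEA implementation, and all names and data here are hypothetical:

```python
import numpy as np

def sinkhorn(cost, eps=0.1, n_iters=200):
    """Entropy-regularized optimal transport via Sinkhorn iterations.

    cost: (n, m) pairwise cost matrix between entities of two KGs.
    Returns a transport plan T with uniform marginals 1/n and 1/m.
    """
    n, m = cost.shape
    a = np.full(n, 1.0 / n)          # uniform mass on KG_X entities
    b = np.full(m, 1.0 / m)          # uniform mass on KG_Y entities
    K = np.exp(-cost / eps)          # Gibbs kernel of the cost
    v = np.ones(m)
    for _ in range(n_iters):
        u = a / (K @ v)              # alternately rescale rows...
        v = b / (K.T @ u)            # ...and columns to match marginals
    return u[:, None] * K * v[None, :]

# Toy "entity embeddings" for two KGs (hypothetical data): KG_Y entities
# are noisy copies of KG_X entities, so the true matching is the identity.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                  # entities in KG_X
Y = X + 0.01 * rng.normal(size=(4, 8))       # counterparts in KG_Y

cost = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)  # squared distances
T = sinkhorn(cost)
group_loss = (T * cost).sum()   # OT distance between the two entity sets
```

The transport plan `T` couples every entity in one KG with entities in the other, so minimizing `group_loss` pulls the two embedding distributions together even for unlabeled entities, which is the intuition behind the group-level objective.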
Original language: English (US)
Title of host publication: Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence
Publisher: International Joint Conferences on Artificial Intelligence Organization
Number of pages: 7
ISBN (Print): 9780999241141
State: Published - Jul 28 2019

Bibliographical note

KAUST Repository Item: Exported on 2020-10-01
Acknowledged KAUST grant number(s): FCC/1/1976-19-01
Acknowledgements: The research reported in this publication was supported by funding from King Abdullah University of Science and Technology (KAUST), under award number FCC/1/1976-19-01, and NSFC No. 61828302.

