LowFER: Low-rank Bilinear Pooling for Link Prediction

Part of the Proceedings of the International Conference on Machine Learning (ICML 2020) pre-proceedings

Authors

Saadullah Amin, Stalin Varanasi, Katherine Ann Dunfield, Günter Neumann

Abstract

Knowledge graphs are incomplete by nature, representing only a limited number of observed facts about world knowledge as relations between entities. An important task in statistical relational learning is link prediction, or knowledge graph completion, which partly addresses this issue. Both linear and non-linear (deep learning based) models have been proposed to solve the problem, with the former being parameter-efficient and interpretable. Bilinear models, while expressive, are prone to overfitting and lead to quadratic growth of parameters in the number of relations. Simpler models, which impose certain constraints on the bilinear maps used as relation parameters, have become more standard. In this work, we propose a factorized bilinear pooling model, commonly used in multi-modal learning, for better fusion of entities and relations, leading to an efficient and constraint-free model. We prove that our model is fully expressive and provide bounds on the entity and relation embedding dimensions and the factorization rank. Our model naturally generalizes the TuckER model (Balazevic et al., 2019), which has been shown to generalize other models as special cases, via an efficient low-rank approximation without compromising much on performance. The model complexity can be controlled by the factorization rank, as opposed to the cubic growth of the core tensor in TuckER when entities and relations share the same space. Empirically, we evaluate on real-world datasets, reaching on-par or state-of-the-art performance. Even at extremely low ranks, the model already outperforms many of the recently proposed methods.
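To make the fusion idea concrete, the following is a minimal NumPy sketch of a factorized bilinear pooling score for a (subject, relation, object) triple: project the subject and relation embeddings with low-rank factor matrices, fuse them by a Hadamard (elementwise) product, sum-pool over the k factors, and match the result against the object embedding. All dimensions, names, and the pooling layout are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

# Hypothetical dimensions (assumptions for illustration only)
d_e, d_r, k = 8, 6, 3  # entity dim, relation dim, factorization rank

rng = np.random.default_rng(0)
# Low-rank factor matrices projecting entities and relations
# into a shared (k * d_e)-dimensional space
U = rng.standard_normal((d_e, k * d_e)) * 0.1
V = rng.standard_normal((d_r, k * d_e)) * 0.1

def lowfer_style_score(e_s, r, e_o):
    """Factorized bilinear pooling score for a triple (e_s, r, e_o).

    Fuses subject and relation via a Hadamard product of their low-rank
    projections, then sum-pools groups of k factors down to a d_e vector
    that is scored against the object embedding by a dot product.
    """
    fused = (e_s @ U) * (r @ V)                  # shape: (k * d_e,)
    pooled = fused.reshape(d_e, k).sum(axis=1)   # k-size sum pooling -> (d_e,)
    return float(pooled @ e_o)

# Example: score one triple with random embeddings
e_s = rng.standard_normal(d_e)
r_vec = rng.standard_normal(d_r)
e_o = rng.standard_normal(d_e)
score = lowfer_style_score(e_s, r_vec, e_o)
```

Note how the rank k directly controls the parameter count of U and V, which is the lever the abstract contrasts with the cubic growth of TuckER's core tensor.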