A Study of LFT Embeddings in the Second Order Clifford Algebra Cl(ℝ^{2,0})

Bibliographic Details
Main Authors: Kossi Amouzouvi, Sahar Vahdati, Bowen Song, Bernard O. Bainson, Nur A. Zarin Nishat, Jens Lehmann
Format: Article
Language:English
Published: IEEE 2024-01-01
Series:IEEE Access
Online Access:https://ieeexplore.ieee.org/document/10786921/
Description
Summary: Knowledge graph embedding models represent entities as vectors in continuous spaces and their relations by geometric transformations, mainly translation and rotation. However, multi-relational knowledge graphs contain complex sub-graph structures which these two families of models alone fail to preserve. The complexity of these sub-graph structures often requires embedding vector spaces with additional structure and composite transformations. An intrinsic algebraic structure on the embedding vector space has the potential to let models preserve complex relational patterns, such as non-commutativity, that are tied to the vector space itself. In this paper, we use linear fractional transformations (LFT) in the noncommutative Clifford algebra of order 2, $Cl(\mathbb{R}^{2,0})$, to embed relations. LFTs can be understood as transformations that generalize Möbius transformations to the framework of Clifford algebras, in a non-commutative setting. Thus, the LFT relation embeddings in our proposed models can exhibit the geometric transformations of translation, rotation, reflection, homothety, inversion, and their composites. Evaluated on the preservation of complex structures over eleven benchmark knowledge graph datasets, our models achieved results superior or competitive to existing state-of-the-art models in learning relational, many-to-many structural, and hierarchical patterns. This work not only advances the performance of knowledge graph embeddings but also emphasizes the importance of robust mathematical foundations, and in particular the potential of Clifford algebras, in embedding models.
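For illustration, the following Python sketch shows the standard form such an LFT takes in Cl(ℝ^{2,0}): a multivector x is mapped to (ax + b)(cx + d)^{-1}, where a, b, c, d are multivector parameters of a relation and the product is the noncommutative geometric product. This is our own minimal illustration under assumed conventions (the 4-component multivector layout, the function names gp/inverse/lft, and the example parameters are hypothetical), not the authors' implementation or parameterization.

# Minimal sketch of an LFT over Cl(R^{2,0}) -- not the paper's implementation.
# A multivector is stored as a 4-vector (scalar, e1, e2, e12),
# with e1^2 = e2^2 = 1, e12 = e1 e2, and e12^2 = -1.
import numpy as np

def gp(a, b):
    """Geometric product of two multivectors in Cl(R^{2,0})."""
    a0, a1, a2, a3 = a
    b0, b1, b2, b3 = b
    return np.array([
        a0*b0 + a1*b1 + a2*b2 - a3*b3,   # scalar part
        a0*b1 + a1*b0 - a2*b3 + a3*b2,   # e1 part
        a0*b2 + a2*b0 + a1*b3 - a3*b1,   # e2 part
        a0*b3 + a3*b0 + a1*b2 - a2*b1,   # e12 part
    ])

def inverse(x):
    """Inverse via the Clifford conjugate: x^{-1} = conj(x) / (x conj(x))."""
    x0, x1, x2, x3 = x
    conj = np.array([x0, -x1, -x2, -x3])
    norm = x0*x0 - x1*x1 - x2*x2 + x3*x3   # x * conj(x) reduces to this scalar
    if abs(norm) < 1e-12:
        raise ZeroDivisionError("multivector is not invertible")
    return conj / norm

def lft(x, a, b, c, d):
    """Linear fractional transformation of x: (a x + b)(c x + d)^{-1}."""
    return gp(gp(a, x) + b, inverse(gp(c, x) + d))

# Hypothetical example: with c = 0 and d = 1 the LFT reduces to the affine
# map a x + b, covering translation (a = 1) and rotation/homothety (b = 0).
e = np.array([0.3, -0.1, 0.7, 0.2])                  # hypothetical entity embedding
a = np.array([np.cos(0.5), 0.0, 0.0, np.sin(0.5)])   # unit even element (acts as a planar rotation)
b = np.array([0.05, 0.0, 0.1, 0.0])                  # translation component
c = np.array([0.0, 0.0, 0.0, 0.0])
d = np.array([1.0, 0.0, 0.0, 0.0])
print(lft(e, a, b, c, d))

Because the geometric product is noncommutative, composing two such transformations in different orders generally yields different maps, which is the property the abstract highlights for preserving non-commutative relational patterns.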
ISSN:2169-3536