Dynamic heterogeneous network representation learning method based on Hawkes process

Bibliographic Details
Main Authors: CHEN Lei, DENG Kun, LIU Xingyan
Format: Article
Language: Chinese (zho)
Published: Beijing Xintong Media Co., Ltd, 2024-08-01
Series: Dianxin kexue (Telecommunications Science)
Online Access: http://www.telecomsci.com/zh/article/doi/10.11959/j.issn.1000-0801.2024195/
Description
Summary: Existing methods for heterogeneous network representation learning mainly focus on static networks and overlook the significant impact of temporal attributes on node representations. Real heterogeneous information networks, however, are highly dynamic, and even minor changes to nodes and edges can affect the structure and semantics of the whole network. To address this, a dynamic heterogeneous network representation learning method based on the Hawkes process was proposed. Firstly, the vector representation of each node was obtained with a relational rotation encoding method and an attention mechanism, through which the attention coefficients of adjacent nodes were learned. Secondly, the optimal weighted combination of different meta-paths was learned to better capture the structural and semantic information of the network. Finally, exploiting the time decay effect, temporal features were introduced into node representations through the formation of neighborhood sequences, yielding the final embedding representation of each node. Experimental results on several benchmark datasets show that the proposed method significantly outperforms the baseline methods: in node classification tasks, the average Macro-F1 score is improved by 0.15% to 3.45%, and in node clustering tasks, the NMI value is improved by 1.08% to 3.57%.
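Background note: the abstract does not state the paper's exact formulas, so the following is a sketch of the standard Hawkes process that the method builds on; the symbols μ, α, and δ are generic notation, not necessarily the authors'. A Hawkes process models the conditional intensity of an event at time t as

λ(t) = μ + Σ_{t_i < t} α · exp(−δ · (t − t_i))

where μ is the base intensity, the sum runs over the timestamps t_i of past events (here, a node's earlier interactions recorded in its neighborhood sequence), α scales the excitation contributed by each past event, and the exponential kernel exp(−δ · (t − t_i)) is the time decay effect mentioned above: the older an interaction, the less it contributes to the node's current representation.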
ISSN: 1000-0801