Graph neural networks with configuration cross-attention for tensor compilers
With the recent popularity of neural networks comes the need for efficient serving of inference workloads. A neural network inference workload can be represented as a computational graph with nodes as operators transforming multidimensional tensors. The tensors can be transposed and/or tiled in a co...
| Main Authors: | Dmitrii Khizbullin, Eduardo Rocha de Andrade, Thanh Hau Nguyen, Matheus Pedroza Ferreira, David R. Pugh |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Frontiers Media S.A., 2025-08-01 |
| Series: | Frontiers in Artificial Intelligence |
| Online Access: | https://www.frontiersin.org/articles/10.3389/frai.2025.1605539/full |
Similar Items
- GNN-MAM: A graph neural network based multiple attention mechanism for regional financial risk prediction
  by: Yuli Ma, et al.
  Published: (2025-08-01)
- Efficient knowledge graph to text powered by LLGM: linear latent graph model
  by: Xiaokang Zhao, et al.
  Published: (2025-06-01)
- LoadSeer: Exploiting Tensor Graph Convolutional Network for Power Load Forecasting With Spatio-Temporal Characteristics
  by: Jiahao Zhang, et al.
  Published: (2024-01-01)
- D3GNN: Double dual dynamic graph neural network for multisource remote sensing data classification
  by: Teng Yang, et al.
  Published: (2025-05-01)
- A graph transformer with optimized attention scores for node classification
  by: Yu Zhang, et al.
  Published: (2025-08-01)