Method of adaptive knowledge distillation from multi-teacher to student deep learning models

Transferring knowledge from multiple teacher models to a compact student model is often hindered by domain shifts between datasets and a scarcity of labeled target data, both of which degrade performance. While existing methods address parts of this problem, a unified framework is lacking. In this work, we im...
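The abstract describes adaptive multi-teacher knowledge distillation under domain shift and label scarcity. As a minimal sketch of the general technique the abstract builds on (not the authors' specific method, which is behind the full text), the following PyTorch snippet combines per-teacher KL-divergence terms with a supervised cross-entropy loss; the function name, `teacher_weights`, `temperature`, and `alpha` are all illustrative assumptions.

```python
# Minimal sketch of a generic multi-teacher knowledge-distillation loss.
# Illustrative only: this is the standard weighted-KL formulation
# (Hinton et al., 2015), NOT the adaptive method proposed in the article.
import torch
import torch.nn.functional as F

def multi_teacher_kd_loss(student_logits, teacher_logits_list, labels,
                          teacher_weights=None, temperature=4.0, alpha=0.7):
    """Weighted KL distillation from several teachers plus supervised CE.

    student_logits:      (batch, classes) raw student outputs
    teacher_logits_list: list of (batch, classes) tensors, one per teacher
    labels:              (batch,) ground-truth class indices (may be scarce)
    teacher_weights:     per-teacher importance; uniform if None
    """
    n = len(teacher_logits_list)
    if teacher_weights is None:
        teacher_weights = torch.full((n,), 1.0 / n)

    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    kd = torch.zeros((), device=student_logits.device)
    for w, t_logits in zip(teacher_weights, teacher_logits_list):
        p_teacher = F.softmax(t_logits / temperature, dim=1)
        # KL(teacher || student), scaled by T^2 to keep gradient
        # magnitudes comparable across temperatures
        kd = kd + w * F.kl_div(log_p_student, p_teacher,
                               reduction="batchmean") * temperature ** 2

    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```

An adaptive variant, as the title suggests, would replace the fixed `teacher_weights` with weights learned or estimated per batch (e.g., from each teacher's reliability on the target domain); the static version above only fixes the interface such a scheme would plug into.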


Bibliographic Details
Main Authors: Oleksandr Chaban, Eduard Manziuk, Pavlo Radiuk
Format: Article
Language: English
Published: Academy of Cognitive and Natural Sciences, 2025-08-01
Series: Journal of Edge Computing
Online Access: https://acnsci.org/journal/index.php/jec/article/view/978