DM-KD: Decoupling Mixed-Images for Efficient Knowledge Distillation
Knowledge distillation (KD) is a model-compression method: it extracts valuable knowledge from a high-performance, high-capacity teacher model and transfers it to a target student model with relatively small capacity. However, we discover that naively applying mixed...
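The abstract describes the standard teacher-to-student transfer that DM-KD builds on. As a point of reference, below is a minimal sketch of conventional soft-label distillation (Hinton et al.-style), not the paper's DM-KD method; the hyperparameter names `temperature` and `alpha` are illustrative assumptions.

```python
# Generic knowledge-distillation loss sketch (PyTorch).
# NOTE: this illustrates plain teacher->student distillation only;
# it is not the DM-KD mixed-image decoupling from the paper.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels,
            temperature=4.0, alpha=0.5):
    """Blend hard-label cross-entropy with softened teacher targets."""
    # Soften both distributions with the same temperature T.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=1)
    # KL divergence, scaled by T^2 so gradient magnitudes stay comparable.
    distill = F.kl_div(log_soft_student, soft_teacher,
                       reduction="batchmean") * temperature ** 2
    # Standard supervised loss on the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1 - alpha) * ce
```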
Main Authors: Jongkyung Im, Younho Jang, Junpyo Lim, Taegoo Kang, Chaoning Zhang, Sung-Ho Bae
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10819346/
Similar Items
- Assessment-Based Optimization of Distillation Parameters
  by: Ludmila N. Krikunova, et al.
  Published: (2023-06-01)
- Current Status and Analysis of Decoupling Mechanism
  by: Qu Shuwei, et al.
  Published: (2022-10-01)
- Effects of different wet distillers' grains ratios on fermentation quality, nitrogen fractions and bacterial communities of total mixed ration silage
  by: Ermei Du, et al.
  Published: (2025-01-01)
- Kinematic and Dynamic Analysis of the Partially Decoupled Parallel Manipulator
  by: Canguo Zhang, et al.
  Published: (2022-08-01)
- Knowledge Distillation in Object Detection for Resource-Constrained Edge Computing
  by: Arief Setyanto, et al.
  Published: (2025-01-01)