Knowledge Distillation for Molecular Property Prediction: A Scalability Analysis

Abstract: Knowledge distillation (KD) is a model compression technique that transfers knowledge from a complex teacher model to a compact student model, reducing computational cost while preserving predictive accuracy. This study investigated KD's efficacy in molecular property predictio...
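The abstract describes the classic teacher-student setup. As a minimal sketch (not the authors' implementation), the standard Hinton-style distillation objective combines a temperature-softened KL term against the teacher's outputs with an ordinary cross-entropy term against the true labels; the temperature `T` and mixing weight `alpha` below are illustrative hyperparameters, not values from the paper:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """KD loss: alpha * soft (teacher) term + (1 - alpha) * hard (label) term."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    # KL(teacher || student) on softened distributions; the T^2 factor keeps
    # gradient magnitudes comparable across temperatures (Hinton et al.).
    soft = (p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12))).sum(axis=-1).mean() * T**2
    # Standard cross-entropy of the student against the ground-truth labels.
    hard = -np.log(softmax(student_logits)[np.arange(len(labels)), labels] + 1e-12).mean()
    return alpha * soft + (1 - alpha) * hard
```

When the student's logits match the teacher's exactly, the soft term vanishes and only the label term remains, which is the sanity check usually run first when wiring up a KD pipeline.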


Bibliographic Details
Main Authors: Rahul Sheshanarayana, Fengqi You
Format: Article
Language: English
Published: Wiley, 2025-06-01
Series: Advanced Science
Online Access: https://doi.org/10.1002/advs.202503271