Knowledge Distillation for Molecular Property Prediction: A Scalability Analysis
| Main Authors: | , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Wiley, 2025-06-01 |
| Series: | Advanced Science |
| Subjects: | |
| Online Access: | https://doi.org/10.1002/advs.202503271 |
| Summary: | Abstract Knowledge distillation (KD) is a powerful model compression technique that transfers knowledge from complex teacher models to compact student models, reducing computational costs while preserving predictive accuracy. This study investigated KD's efficacy in molecular property prediction across domain‐specific and cross‐domain tasks, leveraging state‐of‐the‐art graph neural networks (SchNet, DimeNet++, and TensorNet). In the domain‐specific setting, KD improved regression performance across diverse quantum mechanical properties in the QM9 dataset, with DimeNet++ student models achieving up to a 90% improvement in R² compared with non‐KD baselines. Notably, in certain cases, student models achieved comparable or even superior R² improvements while being 2× smaller, highlighting KD's ability to enhance efficiency without sacrificing predictive performance. Cross‐domain evaluations further demonstrated KD's adaptability: embeddings from QM9‐trained teacher models enhanced predictions for ESOL (logS) and FreeSolv (ΔG_hyd), with SchNet exhibiting the highest gains of ≈65% in logS predictions. Embedding analysis revealed substantial student–teacher alignment gains, with the relative shift in cosine similarity distribution peaks reaching up to 1.0 across student models. These findings highlight KD as a robust strategy for enhancing molecular representation learning, with implications for cheminformatics, materials science, and drug discovery. |
| ISSN: | 2198-3844 |
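
The abstract describes distillation through teacher embeddings rather than logits, but this record carries no implementation details. The following is only a minimal, hypothetical PyTorch sketch of that general idea: a compact student regressor trained with a supervised loss plus a term pulling its embeddings toward a frozen teacher's. The `Encoder` class, the `alpha=0.5` weighting, and the random data are all placeholders, not the paper's actual method or hyperparameters; the real study uses GNN encoders (SchNet, DimeNet++, TensorNet) over molecular structures.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Stand-in for a molecular GNN encoder (e.g., SchNet or DimeNet++).

    A plain MLP over fixed-size descriptors keeps the sketch self-contained;
    the study itself operates on molecular graphs.
    """
    def __init__(self, in_dim, hidden, emb_dim):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.SiLU(), nn.Linear(hidden, emb_dim)
        )
        self.head = nn.Linear(emb_dim, 1)  # scalar property regressor

    def forward(self, x):
        emb = self.body(x)
        return emb, self.head(emb).squeeze(-1)

def kd_loss(s_emb, s_pred, t_emb, target, alpha=0.5):
    """Supervised regression loss plus an embedding-alignment term.

    alpha is a made-up weight, not a value reported in the paper.
    """
    task = F.mse_loss(s_pred, target)
    align = (1.0 - F.cosine_similarity(s_emb, t_emb, dim=-1)).mean()
    return task + alpha * align

torch.manual_seed(0)
teacher = Encoder(in_dim=16, hidden=128, emb_dim=64).eval()  # pretend pretrained
student = Encoder(in_dim=16, hidden=32, emb_dim=64)          # smaller capacity
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

x = torch.randn(256, 16)  # placeholder molecular features
y = torch.randn(256)      # placeholder property targets (e.g., a QM9 property)

for step in range(200):
    with torch.no_grad():
        t_emb, _ = teacher(x)  # frozen teacher embeddings
    s_emb, s_pred = student(x)
    loss = kd_loss(s_emb, s_pred, t_emb, y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Keeping the teacher frozen and distilling only its embeddings is what allows the cross-domain transfer the abstract reports: a QM9-trained teacher can guide students on ESOL or FreeSolv without any shared label space.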
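The abstract also quantifies alignment via the shift in the peak of the student-teacher cosine-similarity distribution. The exact procedure is not given in this record; one plausible way to compute such a peak shift is sketched below, with randomly generated embeddings standing in for pre-/post-KD student encoders.

```python
import torch
import torch.nn.functional as F

def cosine_peak(student_emb, teacher_emb, bins=50):
    """Mode of the per-molecule cosine-similarity histogram in [-1, 1]."""
    sims = F.cosine_similarity(student_emb, teacher_emb, dim=-1)
    hist = torch.histc(sims, bins=bins, min=-1.0, max=1.0)
    width = 2.0 / bins
    centers = torch.arange(bins, dtype=torch.float32) * width - 1.0 + width / 2
    return centers[hist.argmax()].item()

# Placeholder embeddings: an unaligned student vs. one pulled toward the teacher.
torch.manual_seed(0)
t_emb = torch.randn(1000, 64)
pre_kd = torch.randn(1000, 64)                 # independent of the teacher
post_kd = t_emb + 0.3 * torch.randn(1000, 64)  # partially aligned

shift = cosine_peak(post_kd, t_emb) - cosine_peak(pre_kd, t_emb)
print(f"relative peak shift: {shift:.2f}")  # approaches 1.0 as alignment improves
```

Under this reading, a shift of 1.0, the maximum the abstract reports, corresponds to the similarity peak moving from roughly 0 (unaligned embeddings) to roughly 1 (near-duplicate embeddings).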