Probabilistic Automated Model Compression via Representation Mutual Information Optimization

Bibliographic Details
Main Authors: Wenjie Nie, Shengchuan Zhang, Xiawu Zheng
Format: Article
Language: English
Published: MDPI AG, 2024-12-01
Series: Mathematics
Online Access: https://www.mdpi.com/2227-7390/13/1/108
Description
Summary: Deep neural networks, despite their remarkable success in computer vision tasks, often face deployment challenges due to high computational demands and memory usage. Addressing this, we introduce a probabilistic framework for automated model compression (Prob-AMC) that optimizes pruning, quantization, and knowledge distillation simultaneously using information theory. Our approach is grounded in maximizing the mutual information between the original and compressed network representations, ensuring the preservation of essential features under resource constraints. Specifically, we employ layer-wise self-representation mutual information analysis, sampling-based pruning and quantization allocation, and progressive knowledge distillation using the optimal compressed model as a teacher assistant. Through extensive experiments on CIFAR-10 and ImageNet, we demonstrate that Prob-AMC achieves a superior compression ratio of 33.41× on ResNet-18 with only a 1.01% performance degradation, outperforming state-of-the-art methods in terms of both compression efficiency and accuracy. This optimization process is highly practical, requiring merely a few GPU hours, and bridges the gap between theoretical information measures and practical model compression, offering significant insights for efficient deep learning deployment.
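The core recipe the abstract describes (measure how much information compressed representations retain about the originals, then sample per-layer pruning and quantization allocations that maximize that measure under a resource budget) can be sketched in a few lines. The toy below is a minimal illustration under stated assumptions, not the authors' implementation: it substitutes a simple histogram-based mutual information estimator and random search over hypothetical per-layer (prune ratio, bit-width) pairs for the paper's layer-wise self-representation MI analysis.

```python
# Illustrative sketch only: the MI estimator, the compression surrogate, and
# the budget model are assumptions, not the Prob-AMC algorithm itself.
import numpy as np

def mutual_information(x, y, bins=32):
    """Histogram-based estimate (in nats) of I(X; Y) for two 1-D arrays."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y, shape (1, bins)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

def compress(rep, prune_ratio, bits):
    """Toy stand-in for compression: magnitude pruning + uniform quantization."""
    out = rep.copy()
    out[np.abs(out) < np.quantile(np.abs(out), prune_ratio)] = 0.0
    scale = max(np.abs(out).max() / (2 ** (bits - 1) - 1), 1e-12)
    return np.round(out / scale) * scale

rng = np.random.default_rng(0)
layers = [rng.standard_normal(4096) for _ in range(3)]  # stand-in activations

best_cfg, best_mi = None, -np.inf
for _ in range(200):  # sampling-based allocation over per-layer configurations
    cfg = [(rng.uniform(0.3, 0.9), int(rng.choice([2, 4, 8]))) for _ in layers]
    cost = sum(bits * (1.0 - ratio) for ratio, bits in cfg)  # crude size proxy
    if cost > 6.0:  # resource constraint: skip configurations over budget
        continue
    mi = sum(mutual_information(r, compress(r, ratio, bits))
             for r, (ratio, bits) in zip(layers, cfg))
    if mi > best_mi:  # keep the allocation preserving the most information
        best_cfg, best_mi = cfg, mi

print("best (prune ratio, bits) per layer:", best_cfg)
```

In the paper itself, the layer-wise self-representation MI analysis guides the sampling, and progressive knowledge distillation through a teacher assistant recovers accuracy; the random search and binning estimator above merely make the optimization loop concrete.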
ISSN: 2227-7390