26th EAAAI (EANN) 2025, 26 - 29 June 2025, Limassol, Cyprus

Exploring Knowledge Distillation for Model Compression in Edge Environments

Telma Garção, David Belo, Joana Sousa, João Ferreira

Abstract:

  This paper examines knowledge distillation (KD) for model compression in edge environments. As the number of Internet of Things (IoT) devices increases, optimizing machine learning models for resource-constrained settings becomes necessary. Knowledge transfer from larger models (teachers) to smaller models (students) is studied to improve efficiency while maintaining performance. Experiments compare teacher-student pairs, including VGG19 with MobileNetV3_small, ResNet-152 with ShuffleNet V2 x0.5, and ResNet-152 with ResNet-18. Accuracy and computational efficiency are evaluated on the CIFAR-10, CIFAR-100, and STL-10 datasets. Results show that KD improves student model performance, particularly under data corruption. Self-distillation, in which ResNet-18 serves as both teacher and student, is also analyzed. This research examines KD for optimizing deep learning models for edge deployment, addressing accuracy and resource constraints in machine learning applications.
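  For readers unfamiliar with the technique, the following is a minimal PyTorch sketch of the standard soft-target distillation loss (Hinton et al.), illustrated with the ResNet-152 / ResNet-18 teacher-student pair mentioned above. The temperature T, weighting alpha, and untrained dummy models are illustrative assumptions, not the paper's experimental settings.

  import torch
  import torch.nn.functional as F
  from torchvision import models

  def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.7):
      # Soft-target term: KL divergence between temperature-softened
      # teacher and student distributions, scaled by T^2.
      soft = F.kl_div(
          F.log_softmax(student_logits / T, dim=1),
          F.softmax(teacher_logits / T, dim=1),
          reduction="batchmean",
      ) * (T * T)
      # Hard-target term: ordinary cross-entropy with the ground-truth labels.
      hard = F.cross_entropy(student_logits, targets)
      return alpha * soft + (1.0 - alpha) * hard

  # Illustrative teacher-student pair (10 classes, e.g. CIFAR-10); both untrained here.
  teacher = models.resnet152(num_classes=10).eval()
  student = models.resnet18(num_classes=10)

  x = torch.randn(8, 3, 224, 224)        # dummy image batch
  y = torch.randint(0, 10, (8,))         # dummy labels
  with torch.no_grad():                  # teacher provides fixed targets
      t_logits = teacher(x)
  s_logits = student(x)
  loss = distillation_loss(s_logits, t_logits, y)

  In training, only the student's parameters are updated against this combined loss; self-distillation corresponds to using the same architecture (e.g. ResNet-18) for both roles.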
