Applied Sciences, Vol. 14, Pages 3284: Knowledge Distillation Based on Fitting Ground-Truth Distribution of Images


Applied Sciences doi: 10.3390/app14083284

Authors: Jianze Li, Zhenhua Tang, Kai Chen, Zhenlei Cui

Knowledge distillation based on features from the penultimate layer lets the student (a lightweight model) efficiently mimic the internal feature outputs of the teacher (a high-capacity model). However, the training data may not conform to the ground-truth distribution of images in terms of either classes or features. We propose two knowledge distillation algorithms that address this problem from two directions: fitting the ground-truth distribution of classes and fitting the ground-truth distribution of features. The former uses teacher labels instead of dataset labels to supervise the student's classification output, while the latter introduces feature temperature parameters to correct abnormal feature distributions output by the teacher. We conducted knowledge distillation experiments on the ImageNet-2012 and CIFAR-100 datasets using seven sets of homogeneous models and six sets of heterogeneous models. The experimental results show that our proposed algorithms improve the performance of penultimate-layer feature knowledge distillation and outperform existing knowledge distillation methods in classification performance and generalization ability.
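The abstract's two directions can be sketched as two loss terms. The following is a minimal PyTorch sketch based only on the abstract, not the paper's exact formulation: the function name distillation_losses, the temperature values, and the use of a KL divergence over softmax-normalized penultimate-layer features are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_losses(student_logits, teacher_logits,
                        student_feat, teacher_feat,
                        label_T=4.0, feat_T=2.0):
    # Class-distribution fitting: the teacher's softened output replaces
    # the dataset's one-hot labels as the supervision signal.
    cls_loss = F.kl_div(
        F.log_softmax(student_logits / label_T, dim=1),
        F.softmax(teacher_logits / label_T, dim=1),
        reduction="batchmean",
    ) * (label_T ** 2)

    # Feature-distribution fitting: a feature temperature softens the
    # teacher's penultimate-layer features before matching, damping
    # abnormal (outlier) activations; the student then matches the
    # resulting distribution over feature dimensions.
    feat_loss = F.kl_div(
        F.log_softmax(student_feat / feat_T, dim=1),
        F.softmax(teacher_feat / feat_T, dim=1),
        reduction="batchmean",
    ) * (feat_T ** 2)

    return cls_loss, feat_loss

# Example shapes: batch of 8, 100 classes, 512-d penultimate features.
s_logits, t_logits = torch.randn(8, 100), torch.randn(8, 100)
s_feat, t_feat = torch.randn(8, 512), torch.randn(8, 512)
cls_l, feat_l = distillation_losses(s_logits, t_logits, s_feat, t_feat)
total = cls_l + feat_l  # weighting between the two terms is a design choice
```

How the two terms are weighted, and whether the feature temperature is applied to the teacher only or to both models, are details the abstract does not specify; the sketch applies it symmetrically for simplicity.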
