Learning Efficient Detector with Semi-supervised Adaptive Distillation

Shitao Tang (SenseTime Research), Litong Feng (SenseTime Research), Wenqi Shao (The Chinese University of Hong Kong), Zhanghui Kuang (SenseTime Ltd.), Wayne Zhang (SenseTime Research), Zheng Lu (University of Nottingham, Ningbo China)

Abstract
Convolutional Neural Network based object detectors produce accurate results but are often time-consuming, and knowledge distillation has become a popular model compression technique for speeding them up. In this paper, we propose a Semi-supervised Adaptive Distillation (SAD) framework that accelerates single-stage detectors while still improving overall accuracy. We introduce an Adaptive Distillation Loss (ADL) that enables the student model to mimic the teacher's logits adaptively, paying more attention to two types of hard samples: hard-to-learn samples, which the teacher predicts with low certainty, and hard-to-mimic samples, for which there is a large gap between the teacher's and the student's predictions. We then show that the student model can be further improved in the semi-supervised setting with the help of ADL. Our experiments validate that, for distillation on unlabeled data, ADL achieves better performance than existing data distillation using both soft and hard targets. On the COCO dataset, SAD makes a student detector with a ResNet-50 backbone outperform its teacher with a ResNet-101 backbone, while the student has half of the teacher's computational complexity.
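To make the idea concrete, the snippet below is a minimal PyTorch sketch of an ADL-style loss as the abstract describes it: a KL mimicry term whose per-sample weight grows both with teacher uncertainty (hard-to-learn) and with the teacher-student gap (hard-to-mimic). The function name, the focal-style weighting form, and the hyper-parameters beta and gamma are illustrative assumptions for this sketch, not the paper's exact formulation.

import torch
import torch.nn.functional as F

def adaptive_distillation_loss(student_logits, teacher_logits,
                               beta=1.5, gamma=1.0):
    """Hypothetical sketch of an adaptive distillation loss.

    Up-weights hard-to-mimic samples (large teacher-student KL) and
    hard-to-learn samples (high teacher entropy); see the paper for
    the exact ADL definition.
    """
    p = F.softmax(teacher_logits, dim=-1)           # teacher distribution
    log_q = F.log_softmax(student_logits, dim=-1)   # student log-probabilities
    # Per-sample KL(teacher || student): how badly the student mimics the teacher.
    kl = F.kl_div(log_q, p, reduction='none').sum(dim=-1)
    # Teacher entropy: high values mark hard-to-learn (uncertain) samples.
    entropy = -(p * torch.log(p.clamp_min(1e-12))).sum(dim=-1)
    # Focal-style adaptive weight combining both notions of hardness;
    # detaching it is a design choice so the weight acts only as a scale.
    weight = (1.0 - torch.exp(-(kl + beta * entropy))) ** gamma
    return (weight.detach() * kl).mean()

# Toy usage: per-anchor class logits from student and teacher.
# s, t = torch.randn(8, 80), torch.randn(8, 80)
# loss = adaptive_distillation_loss(s, t)

Because the weight vanishes when the student already matches a confident teacher, easy samples contribute little gradient, which is what lets distillation on unlabeled data focus on the informative hard samples.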

DOI
10.5244/C.33.21
https://dx.doi.org/10.5244/C.33.21

Files
Paper (PDF)

BibTeX
@inproceedings{BMVC2019,
title={Learning Efficient Detector with Semi-supervised Adaptive Distillation},
author={Shitao Tang and Litong Feng and Wenqi Shao and Zhanghui Kuang and Wayne Zhang and Zheng Lu},
year={2019},
month={September},
pages={21.1--21.12},
articleno={21},
numpages={12},
booktitle={Proceedings of the British Machine Vision Conference (BMVC)},
publisher={BMVA Press},
editor={Kirill Sidorov and Yulia Hicks},
doi={10.5244/C.33.21},
url={https://dx.doi.org/10.5244/C.33.21}
}