
Anomaly detection based on multi-teacher knowledge distillation

Ye Ma, Xu Jiang*, Nan Guan, Wang Yi

*Corresponding author for this work

Research output: Publication in refereed journal (peer-reviewed)

Abstract

Anomaly detection on high-dimensional data is crucial for real-world industrial applications. Recent works adopt the Knowledge Distillation (KD) technique to improve the accuracy of anomaly-detection Neural Networks (NNs). Most KD-based solutions adopt only a single teacher NN and have not yet fully exploited the distinct advantages of different NN structures. To fill this gap, this paper proposes a novel Multi-teacher Knowledge Distillation approach, which effectively integrates multiple teachers with importance weights to guide the student toward accurate anomaly detection. However, the importance weights are difficult to obtain when training only with normal data. To overcome this challenge, we use an autoencoder-based reconstruction process to update the teacher importance weights; meanwhile, the student model parameters are optimized given a set of teacher importance weights. Anomalies are then detected based on the deviations between the outputs of the teachers and the student, as well as the reconstruction errors through the student network. Our proposed approach is evaluated on both the CIFAR10 and MVTec datasets. The results show good performance on both high-level semantic anomaly detection and low-level pixel anomaly detection. © 2023 Elsevier B.V.
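The scoring scheme described in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes teachers and the student are represented by precomputed feature vectors, that importance weights are derived by a softmax over (negated) reconstruction errors, and that the anomaly score combines the weighted teacher-student feature deviations with the student's reconstruction error. All function names and the exact weighting rule are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def teacher_importance_weights(recon_errors):
    """Assumed weighting rule: teachers associated with lower
    autoencoder reconstruction error receive higher importance."""
    return softmax(-np.asarray(recon_errors, dtype=float))

def anomaly_score(teacher_feats, student_feat, recon_error, weights):
    """Score = importance-weighted deviation between each teacher's
    output features and the student's, plus the reconstruction error
    through the student network (both terms per the abstract)."""
    deviations = [np.linalg.norm(t - student_feat) for t in teacher_feats]
    return float(np.dot(weights, deviations) + recon_error)

# Toy usage: an in-distribution sample (small deviations, small
# reconstruction error) scores lower than an anomalous one.
w = teacher_importance_weights([0.2, 0.5, 0.1])
normal = anomaly_score([np.zeros(4)] * 3, np.zeros(4), 0.01, w)
anomalous = anomaly_score([np.ones(4)] * 3, np.zeros(4), 0.30, w)
```

In this sketch the test-time score needs only forward passes: teacher/student features and one reconstruction, so detection cost grows linearly with the number of teachers.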
Original language: English
Article number: 102861
Journal: Journal of Systems Architecture
Volume: 138
Online published: 23 Mar 2023
Publication status: Published - May 2023

Research Keywords

  • Anomaly detection
  • Knowledge distillation
  • Multi-teacher
  • Normal feature
  • Semantic and pixel anomaly
