No-Reference Image Quality Assessment via Inter-Level Adaptive Knowledge Distillation

Bo Hu, Wenzhi Chen, Jia Zheng, Leida Li, Wen Lu, Xinbo Gao

Research output: Contribution to journal › Journal Article › peer-review

Abstract

Compared with no-reference image quality assessment (IQA), full-reference IQA often achieves higher consistency with human subjective perception because reference information is available for comparison. A natural idea is to let the latter guide the former's learning so as to achieve better performance. However, two important issues remain insufficiently explored: how to construct the reference information and how to transfer the prior knowledge. To this end, a novel method called no-reference IQA via inter-level adaptive knowledge distillation (AKD-IQA) is proposed. The core of AKD-IQA lies in transferring image distribution difference information from the full-reference teacher model to the no-reference student model through inter-level AKD. First, the teacher model is constructed from a multi-level feature discrepancy extractor and a cross-scale feature integrator. Then, it is trained on a large synthetic distortion dataset to establish a comprehensive prior distribution of differences. Finally, an image re-distortion strategy and inter-level AKD are introduced into the student model for effective learning. Experimental results on six standard IQA datasets demonstrate that AKD-IQA achieves state-of-the-art performance, and cross-dataset experiments confirm its superior generalization ability.
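The entry does not include code, so the sketch below is only a rough, non-authoritative illustration of the general idea behind inter-level adaptive knowledge distillation as summarized in the abstract: multi-level student features are aligned with the corresponding (frozen) teacher features, with a learnable weight per level so that more informative levels dominate the loss. All names (e.g., InterLevelAKDLoss, level_logits) and design details are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class InterLevelAKDLoss(nn.Module):
    """Illustrative inter-level distillation loss (assumed design, not the paper's).

    Matches each student feature level to the teacher's corresponding level,
    weighting levels by softmax-normalized learnable logits ("adaptive" weights).
    """

    def __init__(self, num_levels: int):
        super().__init__()
        # One learnable logit per feature level; softmax turns them into weights.
        self.level_logits = nn.Parameter(torch.zeros(num_levels))

    def forward(self, student_feats, teacher_feats):
        weights = torch.softmax(self.level_logits, dim=0)
        loss = 0.0
        for w, fs, ft in zip(weights, student_feats, teacher_feats):
            # Flatten and L2-normalize before matching, a common trick in feature KD.
            fs = F.normalize(fs.flatten(1), dim=1)
            ft = F.normalize(ft.flatten(1), dim=1)
            # Teacher features are detached: gradients flow only into the student.
            loss = loss + w * F.mse_loss(fs, ft.detach())
        return loss


if __name__ == "__main__":
    # Toy multi-level features: batch of 2, three pyramid levels.
    student = [torch.randn(2, 64, 56, 56),
               torch.randn(2, 128, 28, 28),
               torch.randn(2, 256, 14, 14)]
    teacher = [torch.randn_like(f) for f in student]
    akd = InterLevelAKDLoss(num_levels=3)
    print(akd(student, teacher))
```

In practice such a term would be combined with the student's quality-regression loss; the per-level weights here stand in for whatever adaptive weighting mechanism the paper actually uses.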

Original language: English
Pages (from-to): 1-12
Number of pages: 12
Journal: IEEE Transactions on Broadcasting
DOIs
State: Published - 26 03 2025

Bibliographical note

Publisher Copyright:
© 1963-2012 IEEE.

Keywords

  • Adaptation models
  • Deep learning
  • Distortion
  • Faces
  • Feature extraction
  • Frequency-domain analysis
  • Image quality
  • Image quality assessment
  • Predictive models
  • Quality assessment
  • Transformers
  • Full-reference teacher model
  • Inter-level adaptive knowledge distillation
  • No-reference student model
