Zero-Shot Knowledge Distillation Using Label-Free Adversarial Perturbation With Taylor Approximation

Volume: 9, Pages: 45454–45461
Published: Jan 1, 2021
Abstract
Knowledge distillation (KD) is one of the most effective neural network light-weighting techniques when training data is available. However, KD is seldom applicable to an environment where it is difficult or impossible to access training data. To solve this problem, a complete zero-shot KD (C-ZSKD) based on adversarial learning has been recently proposed, but the so-called biased sample generation problem limits the performance of C-ZSKD. To...
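As background for the distillation objective the abstract builds on, below is a minimal sketch of the standard (data-driven) KD loss: the KL divergence between temperature-softened teacher and student distributions. This is generic illustration, not the paper's C-ZSKD method; the function names, the temperature value, and the NumPy-based formulation are all assumptions for the sketch.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a higher T produces softer distributions.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on softened distributions, scaled by T^2
    # so gradients keep a comparable magnitude across temperatures.
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student's softened predictions
    return float((T ** 2) * np.sum(p * (np.log(p) - np.log(q))))
```

When the student matches the teacher exactly, the loss is zero; any divergence yields a positive penalty, which is what drives the student toward the teacher's "dark knowledge". Zero-shot variants such as C-ZSKD must synthesize the inputs fed to both networks instead of drawing them from training data.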