Filter Pruning and Re-Initialization via Latent Space Clustering
Filter pruning is a prevalent approach to pruning-based model compression. Most filter pruning methods suffer from two main issues: 1) the capability of the pruned network depends on that of the source pretrained model, and 2) they do not account for the fact that filter weights follow a normal distribution. To address these issues, we propose a new pruning method that employs both weight re-initialization and latent space clustering. For latent space clustering, we define filters and their feature maps as the vertices and edges of a graph, which is transformed into a latent space by graph convolution; this alleviates the tendency to prune only filters with near-zero weights. In addition, a subset of filters is re-initialized under a constraint that enhances filter diversity, making the pruned model less dependent on the source network. This approach yields more robust accuracy even when pruning from a pretrained model with low accuracy. Extensive experiments show that our method reduces the FLOPs and parameters of VGG16 by 56.6% and 84.6%, respectively, with negligible accuracy loss on CIFAR100, which is state-of-the-art performance. Furthermore, our method achieves pruning results that outperform or are comparable to state-of-the-art models on multiple datasets.
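The pipeline sketched in the abstract — building a graph over filters, embedding it into a latent space via graph convolution, clustering in that space, and re-initializing a subset of the surviving filters — can be illustrated roughly as follows. This is a minimal sketch under stated assumptions, not the paper's exact formulation: all function names are hypothetical, cosine similarity between flattened filter weights stands in for the paper's feature-map-based edges (feature maps are unavailable in a standalone snippet), and a plain k-means step stands in for whatever clustering the paper uses.

```python
import numpy as np

def normalize_adjacency(A):
    # Symmetric normalization D^{-1/2} (A + I) D^{-1/2}, as in a standard
    # graph-convolution layer (self-loops added via the identity).
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return d_inv_sqrt @ A_hat @ d_inv_sqrt

def kmeans(X, k, iters=50, seed=0):
    # Plain Lloyd's k-means on the latent embeddings (illustrative choice).
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def cluster_and_prune(filters, k, reinit_frac=0.25, seed=0):
    # filters: (n_filters, fan_in) array of flattened conv filter weights.
    # Edge weights: cosine similarity between filters -- an assumption;
    # the paper derives edges from the filters' feature maps instead.
    norms = np.linalg.norm(filters, axis=1, keepdims=True)
    A = np.clip((filters @ filters.T) / (norms @ norms.T), 0.0, None)
    np.fill_diagonal(A, 0.0)

    # One graph-convolution step projects the filters into a latent space,
    # so clustering is not driven by weight magnitude alone.
    Z = normalize_adjacency(A) @ filters
    labels = kmeans(Z, k, seed=seed)

    # Keep the filter closest to each cluster centroid in the latent space.
    keep = []
    for j in range(k):
        members = np.where(labels == j)[0]
        if len(members) == 0:
            continue
        centroid = Z[members].mean(axis=0)
        keep.append(members[np.argmin(np.linalg.norm(Z[members] - centroid, axis=1))])
    keep = np.array(sorted(keep))
    pruned = filters[keep].copy()

    # Re-initialize a fraction of the kept filters from a normal distribution
    # (matching the empirical weight scale) to enhance filter diversity.
    rng = np.random.default_rng(seed)
    n_reinit = int(reinit_frac * len(pruned))
    idx = rng.choice(len(pruned), size=n_reinit, replace=False)
    pruned[idx] = rng.normal(0.0, filters.std(), size=(n_reinit, filters.shape[1]))
    return keep, pruned
```

The key design point the sketch tries to capture is that filters are compared in the graph-convolved latent space rather than by raw weight magnitude, so redundant-but-large filters can be pruned while distinctive near-zero filters can survive.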