Abstract:
Channel pruning is one of the main methods of deep model compression. DenseNet is a widely used deep convolutional neural network for image classification. In DenseNet, each layer receives the output feature maps of all preceding convolutional layers as input. However, not every layer needs all the features of the preceding layers, so DenseNet contains considerable redundancy. To address this shortcoming, this paper proposes a method for removing redundant channels in DenseNet through self-learning, yielding a sparse densely concatenated convolutional neural network. First, we propose a method for measuring the contribution of each input feature map to the output feature maps of each convolutional layer; input feature maps with small contributions are regarded as redundant. Second, we introduce a training procedure that removes redundant channels in stages through self-learning. The resulting sparse densely concatenated convolutional neural network prunes the redundant channels, reducing network parameters, storage, and computation. Finally, to demonstrate the effectiveness of the method, we conduct experiments on the CIFAR-10/100 image classification datasets. The results show that model redundancy is reduced without sacrificing accuracy.
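As a rough illustration of the first step described above, the sketch below scores each input channel of a convolutional layer and flags low-scoring channels as redundant. The abstract does not specify the contribution measure, so the L1 norm of the kernel weights reading from each input channel, the keep_ratio parameter, and the function names are all assumptions introduced only for this example.

```python
import torch
import torch.nn as nn

def input_channel_contributions(conv: nn.Conv2d) -> torch.Tensor:
    """Score each input channel by the L1 norm of the kernel weights that
    read from it (an assumed proxy for its contribution to the layer's
    output feature maps)."""
    # conv.weight has shape (out_channels, in_channels, kH, kW);
    # summing over all dims except in_channels gives one score per input map.
    return conv.weight.detach().abs().sum(dim=(0, 2, 3))

def redundant_channels(conv: nn.Conv2d, keep_ratio: float = 0.7) -> torch.Tensor:
    """Return indices of input channels whose contribution falls below the
    (1 - keep_ratio) quantile; these are candidates for pruning."""
    scores = input_channel_contributions(conv)
    threshold = torch.quantile(scores, 1.0 - keep_ratio)
    return torch.nonzero(scores < threshold, as_tuple=False).flatten()

# Example: a 3x3 convolution inside a DenseNet block whose input is the
# concatenation of 64 previously produced feature maps.
conv = nn.Conv2d(in_channels=64, out_channels=32, kernel_size=3, padding=1)
print(redundant_channels(conv, keep_ratio=0.7))
```

In the staged self-learning procedure summarized in the abstract, such a scoring step would presumably be interleaved with retraining, so that channels are removed gradually rather than all at once.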