NIPS 2018
Sun Dec 2nd through Sat the 8th, 2018 at Palais des Congrès de Montréal
Paper ID: 560 Frequency-Domain Dynamic Pruning for Convolutional Neural Networks

### Reviewer 1

In this submission, the authors propose a dynamic pruning scheme in the frequency domain to compress convolutional neural networks. Spatial-domain convolution is converted into frequency-domain multiplication, where weight pruning is performed to compress the model. A band-adaptive pruning scheme is further proposed to vary compression rates across frequency bands. Experimental results show that the proposed method outperforms spatial-domain pruning methods.

The manuscript is well organized and easy to follow. However, the empirical evaluation is not yet convincing (at least to me), and further experiments are required to improve the overall quality:

a) ResNet on CIFAR-10. The reported baseline accuracies are 91.23% for ResNet-20 and 92.81% for ResNet-110. These are considerably lower than the results reported in other papers, e.g. 93.39% for ResNet-110 in He et al.'s ECCV '16 paper.

b) How does the proposed method work on ResNet models on the ILSVRC-12 dataset? This would be more convincing than AlexNet on ILSVRC-12, since ResNet models are more commonly used nowadays.

c) What is the actual speed-up of the proposed method? Only the theoretical speed-up is reported in the manuscript.

d) Is it possible to determine the pruning ratio automatically, for instance by using reinforcement learning as an external controller to choose it?

e) For BA-FDNP, since different frequency bands have different compression rates, is it possible to apply structured pruning to improve the actual speed-up?

f) In Equation (12), the last term should be $1 / \xi$ instead of $\xi$.
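To make the idea under review concrete: the core mechanism the summary describes, transforming convolutional kernels to the frequency domain and pruning small-magnitude coefficients there, can be sketched as below. This is an illustrative approximation only, not the authors' exact algorithm (the paper's dynamic scheme also allows pruned coefficients to recover, and its band-adaptive variant uses per-band ratios); the function name and `keep_ratio` parameter are invented for illustration, and the 2D FFT stands in for whatever transform the paper actually uses.

```python
import numpy as np

def freq_domain_prune(weights, keep_ratio=0.3):
    """Illustrative sketch (not the paper's exact method): transform a bank of
    convolutional kernels to the frequency domain, zero out the smallest-magnitude
    coefficients, and transform back to the spatial domain."""
    W = np.fft.fft2(weights, axes=(-2, -1))        # frequency-domain coefficients
    mags = np.abs(W).ravel()
    k = max(1, int(keep_ratio * mags.size))        # number of coefficients to keep
    thresh = np.sort(mags)[-k]                     # magnitude of the k-th largest coefficient
    mask = np.abs(W) >= thresh                     # pruning mask (recomputed each step => "dynamic")
    pruned_spatial = np.real(np.fft.ifft2(W * mask, axes=(-2, -1)))
    return pruned_spatial, mask

# Usage: prune a bank of 16 random 3x3 kernels, keeping roughly 30% of coefficients.
kernels = np.random.randn(16, 3, 3)
pruned, mask = freq_domain_prune(kernels, keep_ratio=0.3)
print(mask.mean())  # fraction of coefficients kept (approx. 0.3)
```

Because the `>=` comparison keeps or drops conjugate-symmetric coefficient pairs together, the inverse transform of the masked spectrum stays real up to rounding. The BA-FDNP variant discussed in point e) would replace the single global threshold with one threshold per frequency band.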