Title
Relaxed quantization for discretized neural networks
Author
Louizos, C.
Reisser, M.
Blankevoort, T.
Gavves, E.
Welling, M.
Publication year
2019
Abstract
Neural network quantization has become an important research area due to its great impact on the deployment of large models on resource-constrained devices. In order to train networks that can be effectively discretized without loss of performance, we introduce a differentiable quantization procedure. Differentiability is achieved by transforming continuous distributions over the weights and activations of the network into categorical distributions over the quantization grid. These are subsequently relaxed to continuous surrogates that allow for efficient gradient-based optimization. We further show that stochastic rounding can be seen as a special case of the proposed approach and that, under this formulation, the quantization grid itself can also be optimized with gradient descent. We experimentally validate the performance of our method on MNIST, CIFAR-10 and ImageNet classification. © 7th International Conference on Learning Representations, ICLR 2019. All Rights Reserved.
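The abstract describes the core mechanism: place a distribution around each continuous weight or activation, turn it into a categorical distribution over the quantization grid, and relax that categorical so gradients can flow. The following is a minimal sketch, in PyTorch, of one way such a relaxation can be written, assuming a learnable uniform grid, a logistic noise model, and a Gumbel-softmax (Concrete) relaxation; the names `relaxed_quantize`, `grid`, `log_scale`, and `temperature` are illustrative assumptions, not the authors' reference implementation.

```python
# Illustrative sketch of a relaxed quantization step (not the paper's reference code).
import torch
import torch.nn.functional as F

def relaxed_quantize(w, grid, log_scale, temperature=0.5):
    """Map a real-valued tensor `w` onto `grid` with a differentiable relaxation.

    w:         real-valued weights/activations, shape (...,)
    grid:      1-D tensor of quantization grid points (possibly learnable)
    log_scale: log of the scale of the logistic noise placed around w
    """
    scale = log_scale.exp()
    # Bin edges halfway between neighbouring grid points, with +/- "infinity" at the ends.
    edges = (grid[1:] + grid[:-1]) / 2
    edges = torch.cat([grid.new_tensor([-1e30]), edges, grid.new_tensor([1e30])])
    # Probability that w plus logistic noise falls in each bin:
    # differences of the logistic CDF centred at w, giving a categorical over grid points.
    cdf = torch.sigmoid((edges - w.unsqueeze(-1)) / scale)
    probs = (cdf[..., 1:] - cdf[..., :-1]).clamp_min(1e-12)
    # Gumbel-softmax relaxation of the categorical: a soft one-hot over grid points.
    one_hot = F.gumbel_softmax(probs.log(), tau=temperature, hard=False)
    # Relaxed quantized value: soft one-hot combined with the grid.
    return one_hot @ grid

# Usage example: quantize weights onto a learnable 4-bit uniform grid.
alpha = torch.tensor(0.1, requires_grad=True)          # grid step size (learnable)
grid = alpha * torch.arange(-8, 8, dtype=torch.float)  # 16 grid levels
log_scale = torch.tensor(-3.0, requires_grad=True)     # noise scale (learnable)
w = torch.randn(5)
w_q = relaxed_quantize(w, grid, log_scale)
```

Because the grid is constructed from a learnable step size and the noise scale is a parameter, gradients reach both through the soft one-hot, which mirrors the abstract's claim that the quantization grid itself can be optimized with gradient descent.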
Subject
Gradient methods
Stochastic systems
Continuous distribution
Differentiability
Gradient descent
Gradient-based optimization
Large models
Loss of performance
Resource-constrained devices
Optimization
To reference this document use:
http://resolver.tudelft.nl/uuid:26265e99-ed8f-495b-95a7-f6fdbe66bb19
TNO identifier
875978
Publisher
International Conference on Learning Representations, ICLR
Source
7th International Conference on Learning Representations, ICLR 2019, 6 May 2019 through 9 May 2019
Document type
conference paper