
PREPRINT: Deep neural networks are robust to weight binarization and other non-linear distortions

Guest Blog by Paul A. Merolla

My colleagues and I are excited to share a preprint with the community and to seek feedback from fellow scientists.

Title: Deep neural networks are robust to weight binarization and other non-linear distortions

Authors: Paul A. Merolla, Rathinakumar Appuswamy, John V. Arthur, Steve K. Esser, Dharmendra S. Modha 

Abstract: Recent results show that deep neural networks achieve excellent performance even when, during training, weights are quantized and projected to a binary representation. Here, we show that this is just the tip of the iceberg: these same networks, during testing, also exhibit a remarkable robustness to distortions beyond quantization, including additive and multiplicative noise, and a class of non-linear projections where binarization is just a special case. To quantify this robustness, we show that one such network achieves 11% test error on CIFAR-10 even with 0.68 effective bits per weight. Furthermore, we find that a common training heuristic--namely, projecting quantized weights during backpropagation--can be altered (or even removed) and networks still achieve a base level of robustness during testing. Specifically, training with weight projections other than quantization also works, as does simply clipping the weights, both of which have never been reported before. We confirm our results for CIFAR-10 and ImageNet datasets. Finally, drawing from these ideas, we propose a stochastic projection rule that leads to a new state of the art network with 7.64% test error on CIFAR-10 using no data augmentation.
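
To make the training heuristic in the abstract concrete, here is a minimal sketch (not the authors' code) of a linear layer that projects its weights to a binary representation on the forward pass while gradients update the underlying full-precision weights. The layer name and the mean-absolute-value scaling are illustrative assumptions in the style of common binary-weight schemes; the paper's projection may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinarizedLinear(nn.Linear):
    """Hypothetical layer illustrating the projection heuristic:
    weights are binarized on the forward pass, while gradients
    update the underlying full-precision weights."""

    def forward(self, x):
        # Project each weight to sign(w), scaled by the mean |w|
        # (a common choice in binary-weight schemes; assumed here,
        # not taken from the paper).
        scale = self.weight.abs().mean()
        w_bin = scale * torch.sign(self.weight)
        # Straight-through trick: the forward pass uses w_bin, the
        # backward pass sees the identity, so gradients reach the
        # real-valued weights.
        w = self.weight + (w_bin - self.weight).detach()
        return F.linear(x, w, self.bias)
```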

Link: http://arxiv.org/abs/1606.01981
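
As a companion sketch, the test-time robustness experiments the abstract mentions can be approximated by perturbing a trained network's weights in place and re-measuring test accuracy. The helper below is hypothetical (the function name and noise magnitudes are assumptions, not from the paper) and applies the additive and multiplicative Gaussian noise the abstract describes.

```python
import torch

@torch.no_grad()
def distort_weights(model, mult_std=0.1, add_std=0.0):
    """Apply multiplicative and additive Gaussian noise to every
    parameter, mimicking the test-time weight distortions studied
    in the preprint (noise magnitudes here are illustrative)."""
    for p in model.parameters():
        p.mul_(1.0 + mult_std * torch.randn_like(p))
        p.add_(add_std * torch.randn_like(p))
```

Evaluating the distorted model on the held-out test set then quantifies the kind of robustness the abstract reports.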

