
PREPRINT: Structured Convolution Matrices for Energy-efficient Deep learning

Guest Blog by Rathinakumar Appuswamy

My colleagues and I are very excited to share a preprint with the community and to seek feedback from fellow scientists.

Title: Structured Convolution Matrices for Energy-efficient Deep learning

Authors: Rathinakumar Appuswamy, Tapan Nayak, John Arthur, Steven Esser, Paul Merolla, Jeffrey Mckinstry, Timothy Melano, Myron Flickner, Dharmendra S. Modha 

Extended Abstract: We derive a relationship between network representation in energy-efficient neuromorphic architectures and block Toeplitz convolution matrices. Inspired by this connection, we develop deep convolutional networks using a family of structured convolution matrices and achieve a state-of-the-art trade-off between energy efficiency and classification accuracy on well-known image recognition tasks. We also put forward a novel method to train binary convolutional networks by utilizing an existing connection between noisy rectified linear units and binary activations. Further, we report an approach to train deep convolutional networks with structured kernels: specifically, all the convolution kernels are generated by commutative pairs of elements from the symmetric group S4. This particular structure is inspired by the TrueNorth architecture, and we use it to achieve a better accuracy-versus-energy trade-off than we had previously reported. Our work builds on the growing body of literature devoted to developing convolutional networks for low-precision hardware toward energy-efficient deep learning.
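
For readers less familiar with the Toeplitz connection mentioned above: a discrete convolution is a linear map, so it can always be written as a matrix-vector product in which the kernel is replicated along the diagonals, a Toeplitz matrix in 1-D and a block Toeplitz matrix in 2-D. The snippet below is a minimal illustrative sketch (not code from the preprint) that builds such a matrix for a 1-D kernel with NumPy and checks it against an explicit sliding-window computation; the function name conv_toeplitz is our own.

    import numpy as np

    def conv_toeplitz(kernel, n_in):
        # Build the Toeplitz matrix T so that T @ x equals sliding the
        # kernel across x ("valid" cross-correlation, as in a conv layer).
        k = np.asarray(kernel, dtype=float)
        n_out = n_in - len(k) + 1
        T = np.zeros((n_out, n_in))
        for i in range(n_out):
            T[i, i:i + len(k)] = k   # the same kernel on every (shifted) row
        return T

    x = np.arange(8, dtype=float)
    k = np.array([1.0, -2.0, 0.5])
    T = conv_toeplitz(k, len(x))

    # Reference: explicit sliding-window computation.
    ref = np.array([np.dot(k, x[i:i + len(k)]) for i in range(len(x) - len(k) + 1)])
    assert np.allclose(T @ x, ref)

For 2-D images the same construction yields a doubly block Toeplitz matrix, with one Toeplitz block per image row, which is the structure referred to in the abstract.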

Link: http://arxiv.org/abs/1606.02407
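
As a side note on the kernel structure mentioned in the abstract: the symmetric group S4 consists of the 24 permutations of four elements, each representable as a 4x4 permutation matrix. The sketch below simply enumerates the pairs of such matrices that commute; how the preprint turns commutative pairs into convolution kernels is detailed in the paper itself, so this is only an illustration of the underlying group-theoretic objects, with all names our own.

    import numpy as np
    from itertools import permutations

    def perm_matrix(p):
        # 4x4 permutation matrix for a permutation p of (0, 1, 2, 3).
        m = np.zeros((4, 4), dtype=int)
        for row, col in enumerate(p):
            m[row, col] = 1
        return m

    # The 24 elements of the symmetric group S4 as permutation matrices.
    s4 = [perm_matrix(p) for p in permutations(range(4))]

    # Unordered pairs (including an element paired with itself) that commute.
    commuting = [(a, b) for a in range(len(s4)) for b in range(a, len(s4))
                 if np.array_equal(s4[a] @ s4[b], s4[b] @ s4[a])]
    print(len(commuting), "commuting pairs among", len(s4), "elements")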
