
TraNNsformer: Neural Network Transformation for memristive crossbar based neuromorphic system design

Published: 13 November 2017

Abstract

Implementation of neuromorphic systems using post-CMOS (Complementary Metal-Oxide-Semiconductor) Memristive Crossbar Arrays (MCA) has emerged as a promising solution for low-power acceleration of neural networks. However, the recent trend of designing Deep Neural Networks (DNNs) to achieve human-like cognitive abilities poses significant challenges to the scalable design of neuromorphic systems, owing to the increase in computation and storage demands. Network pruning [7] is a powerful technique for removing redundant connections and designing optimally connected (maximally sparse) DNNs. However, such pruning techniques induce irregular connections that are incoherent with the crossbar structure, and consequently yield DNNs with highly inefficient hardware realizations in terms of area and energy. In this work, we propose TraNNsformer - an integrated training framework that transforms DNNs to enable their efficient realization on MCA-based systems. TraNNsformer first prunes the connectivity matrix while forming clusters with the remaining connections. Subsequently, it retrains the network to fine-tune the connections and reinforce the clusters. This is done iteratively to transform the original connectivity into an optimally pruned and maximally clustered mapping. We evaluated the proposed framework by transforming different Multi-Layer Perceptron (MLP) based Spiking Neural Networks (SNNs) on a wide range of datasets (MNIST, SVHN and CIFAR10) and executing them on MCA-based systems to analyze the area and energy benefits. Without accuracy loss, TraNNsformer reduces area (energy) consumption by 28% - 55% (49% - 67%) with respect to the original network. Compared to network pruning, TraNNsformer achieves 28% - 49% (15% - 29%) area (energy) savings. Furthermore, TraNNsformer is a technology-aware framework that allows mapping a given DNN to any MCA size permitted by the memristive technology for reliable operation.
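The iterative prune-cluster-retrain loop described in the abstract can be sketched as follows. This is an illustrative toy, not the authors' implementation: it uses magnitude pruning and a greedy degree-based row/column reordering as a stand-in for the clustering step (the paper builds on spectral clustering [17]), and it omits the retraining step entirely. The function name, parameters, and tile-counting heuristic are all hypothetical.

```python
import numpy as np

def prune_and_cluster(W, crossbar_size=16, prune_frac=0.5, iters=3):
    """Toy sketch: iteratively prune small weights, then permute rows and
    columns so surviving weights gather into dense crossbar-sized tiles."""
    W = W.copy()
    for _ in range(iters):
        # 1. Magnitude pruning: zero out the smallest fraction of the
        #    surviving weights (threshold = quantile of their magnitudes).
        thresh = np.quantile(np.abs(W[W != 0]), prune_frac)
        W[np.abs(W) < thresh] = 0.0
        # 2. Greedy "clustering": sort rows/columns by nonzero count so
        #    dense rows and columns land in the same tiles. (A stand-in
        #    for the spectral clustering used in the paper.)
        row_order = np.argsort(-np.count_nonzero(W, axis=1))
        col_order = np.argsort(-np.count_nonzero(W, axis=0))
        W = W[row_order][:, col_order]
        # 3. A real framework would retrain here to fine-tune the
        #    remaining connections; retraining is omitted in this sketch.
    # Count tiles containing at least one surviving weight: only these
    # would need to be mapped onto physical crossbars.
    n, m = W.shape
    used = sum(
        np.any(W[i:i + crossbar_size, j:j + crossbar_size])
        for i in range(0, n, crossbar_size)
        for j in range(0, m, crossbar_size)
    )
    return W, int(used)
```

The `crossbar_size` parameter mirrors the technology-aware aspect described in the abstract: the same transformation can target any permissible MCA dimension, and the area/energy benefit comes from the reduction in occupied tiles.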

References

[1] F. Akopyan et al. TrueNorth: Design and tool flow of a 65 mW 1 million neuron programmable neurosynaptic chip. IEEE TCAD, 2015.
[2] A. Ankit et al. RESPARC: A reconfigurable and energy-efficient architecture with memristive crossbars for deep spiking neural networks. ACM DAC, 2017.
[3] S. Anwar et al. Structured pruning of deep convolutional neural networks. arXiv preprint arXiv:1512.08571, 2015.
[4] P. Chi et al. PRIME: A novel processing-in-memory architecture for neural network computation in ReRAM-based main memory. ACM ISCA, 2016.
[5] P. U. Diehl et al. Fast-classifying, high-accuracy spiking deep networks through weight and threshold balancing. IEEE IJCNN, 2015.
[6] S. Han et al. EIE: Efficient inference engine on compressed deep neural network. ACM ISCA, 2016.
[7] S. Han et al. Learning both weights and connections for efficient neural network. NIPS, 2015.
[8] S. H. Jo et al. Nanoscale memristor device as synapse in neuromorphic systems. Nano Letters, 2010.
[9] A. Joubert et al. Hardware spiking neurons design: Analog or digital? IEEE IJCNN, 2012.
[10] A. Karpathy et al. Deep visual-semantic alignments for generating image descriptions. CVPR, 2015.
[11] A. Krizhevsky et al. Learning multiple layers of features from tiny images. 2009.
[12] A. Krizhevsky et al. ImageNet classification with deep convolutional neural networks. NIPS, 2012.
[13] Y. LeCun et al. Gradient-based learning applied to document recognition. Proceedings of the IEEE, 1998.
[14] T. Mikolov et al. Recurrent neural network based language model. Interspeech, 2010.
[15] N. Muralimanohar et al. Optimizing NUCA organizations and wiring alternatives for large caches with CACTI 6.0. IEEE MICRO, 2007.
[16] Y. Netzer et al. Reading digits in natural images with unsupervised feature learning. 2011.
[17] A. Y. Ng et al. On spectral clustering: Analysis and an algorithm. NIPS, 2001.
[18] R. B. Palm. Prediction as a candidate for learning deep hierarchical models of data. Technical University of Denmark, 5, 2012.
[19] M. Prezioso et al. Training and operation of an integrated neuromorphic network based on metal-oxide memristors. Nature, 2015.
[20] B. Rajendran et al. Specifications of nanoscale devices and circuits for neuromorphic computational systems. IEEE Transactions on Electron Devices, 2013.
[21] A. Sengupta et al. Proposal for an all-spin artificial neural network: Emulating neural and synaptic functionalities through domain wall motion in ferromagnets. IEEE TBioCAS, 2016.
[22] A. Shafiee et al. ISAAC: A convolutional neural network accelerator with in-situ analog arithmetic in crossbars. ACM ISCA, 2016.
[23] K. Simonyan et al. Very deep convolutional networks for large-scale image recognition. arXiv preprint arXiv:1409.1556, 2014.
[24] M. Sundermeyer et al. LSTM neural networks for language modeling. Interspeech, 2012.
[25] W. Wen et al. Learning structured sparsity in deep neural networks. NIPS, 2016.
[26] W. Wen et al. An EDA framework for large scale hybrid neuromorphic computing systems. ACM DAC, 2015.



Published In

ICCAD '17: Proceedings of the 36th International Conference on Computer-Aided Design, November 2017, 1077 pages.
In-Cooperation: IEEE EDS (Electron Devices Society)
Publisher: IEEE Press

    Author Tags

    1. computer-aided design
    2. energy-efficiency
    3. memristive crossbars
    4. neuromorphic computing
    5. sparsity

Conference

ICCAD '17
Overall Acceptance Rate: 457 of 1,762 submissions, 26%


Cited By

• (2019) Analog/Mixed-Signal Hardware Error Modeling for Deep Learning Inference. Proceedings of the 56th Annual Design Automation Conference, pp. 1-6. DOI: 10.1145/3316781.3317770. Published 2 June 2019.
• (2019) PUMA. Proceedings of the Twenty-Fourth International Conference on Architectural Support for Programming Languages and Operating Systems, pp. 715-731. DOI: 10.1145/3297858.3304049. Published 4 April 2019.
• (2019) Learning the sparsity for ReRAM. Proceedings of the 24th Asia and South Pacific Design Automation Conference, pp. 639-644. DOI: 10.1145/3287624.3287715. Published 21 January 2019.
• (2018) Memristive Crossbar Mapping for Neuromorphic Computing Systems on 3D IC. Proceedings of the 2018 Great Lakes Symposium on VLSI, pp. 451-454. DOI: 10.1145/3194554.3194636. Published 30 May 2018.
