DOI: 10.1145/2967413.2974038
Propagation of Quantification Error Over Convolutional Neural Network layers: PhD Forum

Published: 12 September 2016

Abstract

Deep Neural Networks (DNNs) have become a de facto standard in computer vision, achieving state-of-the-art results in detection, classification, and scene understanding. However, training and accelerating large-scale deep neural networks is often constrained by the available computational resources. Fixed-point arithmetic can ease this constraint, at the price of quantification (quantization) error. This work shows how this error varies at different stages of a neural network for various fixed-point representation schemes.
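The effect described in the abstract can be sketched numerically: quantize weights and activations to a fixed-point grid and compare against a float reference layer by layer. This is an illustrative sketch, not the authors' method; the layer shapes, the choice of 8 fractional bits, and the use of dense layers in place of convolutions are all assumptions made for brevity.

```python
import numpy as np

def quantize(x, frac_bits):
    """Round x onto a fixed-point grid with `frac_bits` fractional bits."""
    scale = 2.0 ** frac_bits
    return np.round(x * scale) / scale

rng = np.random.default_rng(0)
# Three small dense layers stand in for conv layers (illustrative only).
weights = [rng.uniform(-1.0, 1.0, size=(64, 64)) for _ in range(3)]
x = rng.uniform(-1.0, 1.0, size=64)

x_float = x
x_fixed = quantize(x, frac_bits=8)
for i, w in enumerate(weights, start=1):
    # Reference path in floating point.
    x_float = np.tanh(w @ x_float)
    # Fixed-point path: quantize weights, then re-quantize each activation.
    x_fixed = quantize(np.tanh(quantize(w, 8) @ x_fixed), 8)
    err = np.max(np.abs(x_float - x_fixed))
    print(f"layer {i}: max abs error = {err:.2e}")
```

Running the sketch with more or fewer fractional bits, or with different weight ranges, shows how the per-layer error grows or shrinks as it propagates, which is the phenomenon the paper studies.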



      Published In

      ICDSC '16: Proceedings of the 10th International Conference on Distributed Smart Camera
      September 2016
      242 pages
ISBN: 9781450347860
DOI: 10.1145/2967413

      Publisher

      Association for Computing Machinery

      New York, NY, United States



      Qualifiers

      • Short-paper
      • Research
      • Refereed limited

      Conference

      ICDSC '16

      Acceptance Rates

      Overall Acceptance Rate 92 of 117 submissions, 79%
