DOI: 10.1109/ISIT.2019.8849501

On the CVP for the root lattices via folding with deep ReLU neural networks

Published: 07 July 2019

Abstract

Point lattices and their decoding via neural networks are considered in this paper. Lattice decoding in ℝⁿ, known as the closest vector problem (CVP), becomes a classification problem in the fundamental parallelotope, with a piecewise linear function defining the boundary. Theoretical results are obtained by studying root lattices. We show how the number of pieces in the boundary function reduces dramatically with folding, from exponential to linear. This translates into a two-layer ReLU neural network requiring a number of neurons growing exponentially in n to solve the CVP, whereas this complexity becomes polynomial in n for a deep ReLU neural network.
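The folding effect described above can be illustrated with a small NumPy sketch (hypothetical illustration, not code from the paper). The absolute-value fold is exactly representable by two ReLU units, since |x| = ReLU(x) + ReLU(−x); more generally, the classic "tent" fold 2·ReLU(x) − 4·ReLU(x − 1/2) on [0, 1], composed d times, yields a piecewise linear function with 2^d pieces while spending only 2 ReLU neurons per layer — whereas a two-layer network would need a number of neurons proportional to the number of pieces.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def fold(x):
    # Tent ("folding") map on [0, 1], built from two ReLU units:
    # fold(x) = 2*ReLU(x) - 4*ReLU(x - 0.5), i.e. 2x on [0, 1/2] and 2(1-x) on [1/2, 1].
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

def deep_fold(x, depth):
    # Composing the fold `depth` times uses only 2*depth ReLU neurons,
    # yet produces a piecewise linear function with 2**depth pieces.
    for _ in range(depth):
        x = fold(x)
    return x

# Count the linear pieces by counting slope changes on a dyadic grid
# (spacing 1/4096, so all breakpoints k/16 land exactly on grid points
# and all arithmetic is exact in binary floating point).
xs = np.linspace(0.0, 1.0, 4097)
ys = deep_fold(xs, depth=4)
slopes = np.diff(ys) / np.diff(xs)
pieces = 1 + np.count_nonzero(np.diff(slopes) != 0.0)
print(pieces)  # 2**4 = 16 linear pieces from only 8 ReLU neurons
```

This is the depth-versus-width phenomenon the abstract invokes: each fold doubles the number of linear pieces for a constant number of extra neurons, so linearly many folds reach exponentially many pieces.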

The long version of this paper, including an appendix, is available as arXiv preprint arXiv:1902.05146.

Published In

2019 IEEE International Symposium on Information Theory (ISIT), Jul 2019, 5716 pages
Publisher: IEEE Press

            Publication History

            Published: 07 July 2019

            Qualifiers

            • Research-article

            Contributors

            Other Metrics

            Bibliometrics & Citations

            Bibliometrics

            Article Metrics

            • 0
              Total Citations
            • 0
              Total Downloads
            • Downloads (Last 12 months)0
            • Downloads (Last 6 weeks)0
            Reflects downloads up to 14 Dec 2024

            Other Metrics

            Citations

            View Options

            View options

            Media

            Figures

            Other

            Tables

            Share

            Share

            Share this Publication link

            Share on social media