Asymptotic Convergence Rate of Dropout on Shallow Linear Neural Networks
Abstract
We analyze the convergence rate of gradient flows on objective functions induced by Dropout and Dropconnect, when applying them to shallow linear Neural Networks (NNs), which can also be viewed as doing matrix factorization using a particular ...

Published In
SIGMETRICS/PERFORMANCE '22: Abstract Proceedings of the 2022 ACM SIGMETRICS/IFIP PERFORMANCE Joint International Conference on Measurement and Modeling of Computer Systems
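To make the setting of the abstract concrete, the following is a minimal sketch (not the paper's code, and not its gradient-flow analysis) of stochastic gradient descent on a Dropout-induced objective for a shallow linear NN y = U V x, where each hidden unit is kept with probability p and rescaled by 1/p. All names, dimensions, and hyperparameters here are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: SGD on the Dropout objective E_b || U diag(b) V X - Y ||^2
# for a shallow linear network, with b_i ~ Bernoulli(p)/p (inverted scaling).
rng = np.random.default_rng(0)
d, h, m, n = 5, 8, 4, 200               # input dim, hidden width, output dim, samples
X = rng.standard_normal((d, n))
Ytrue = rng.standard_normal((m, d)) @ X  # targets generated by a linear map
U = 0.1 * rng.standard_normal((m, h))    # small random initialization
V = 0.1 * rng.standard_normal((h, d))
p, lr = 0.8, 0.01                        # keep probability, step size

for step in range(2000):
    b = (rng.random(h) < p) / p          # dropout mask with 1/p rescaling
    R = U @ np.diag(b) @ V @ X - Ytrue   # residual under the sampled mask
    gU = R @ (np.diag(b) @ V @ X).T / n  # gradient of the masked loss w.r.t. U
    gV = np.diag(b) @ U.T @ R @ X.T / n  # gradient of the masked loss w.r.t. V
    U -= lr * gU
    V -= lr * gV

# Deterministic (no-mask) reconstruction error after training.
mse = np.mean((U @ V @ X - Ytrue) ** 2)
```

Averaged over the Bernoulli masks, this objective equals the plain factorization loss plus a Dropout-induced regularizer, which is the structure whose gradient flow the paper studies.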
Publisher
Association for Computing Machinery
New York, NY, United States
Qualifiers
- Research-article
Article Metrics
- Total Citations: 0
- Total Downloads: 200
- Downloads (Last 12 months): 96
- Downloads (Last 6 weeks): 12