Global optimisation of neural network models via sequential sampling
Abstract
Index Terms
- Global optimisation of neural network models via sequential sampling
Recommendations
Global sampling for sequential filtering over discrete state space
In many situations, there is a need to approximate a sequence of probability measures over a growing product of finite spaces. Whereas it is in general possible to determine analytic expressions for these probability measures, the number of computations ...
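To make the sequential-sampling idea in this summary concrete, here is a minimal illustrative sketch, not taken from the cited work: a particle approximation of the filtering distribution of a small discrete hidden Markov model. The transition matrix `A`, emission matrix `B`, observation sequence `ys`, and particle count are all assumed for the example.

```python
# Minimal sketch (assumed example, not the cited paper's method):
# sequential importance resampling over a finite state space.
# Particles approximate the filtering distribution p(x_t | y_1:t)
# of a two-state hidden Markov model.
import numpy as np

rng = np.random.default_rng(0)

A = np.array([[0.9, 0.1],          # assumed state-transition probabilities
              [0.2, 0.8]])
B = np.array([[0.7, 0.3],          # assumed observation probabilities per state
              [0.1, 0.9]])
ys = [0, 1, 1, 0, 1]               # assumed observed symbols
n_particles = 500

# initialise particles uniformly over the two states
particles = rng.integers(0, 2, size=n_particles)

for y in ys:
    # propagate each particle through the transition kernel
    particles = np.array([rng.choice(2, p=A[s]) for s in particles])
    # weight each particle by the likelihood of the current observation
    weights = B[particles, y]
    weights /= weights.sum()
    # resample to counter weight degeneracy
    particles = rng.choice(particles, size=n_particles, p=weights)

# empirical filtering distribution after the last observation
print(np.bincount(particles, minlength=2) / n_particles)
```

The resampling step is what keeps the approximation usable as the product space grows: without it, a few particles would carry almost all of the weight after a handful of observations.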
Comparison of Stochastic Global Optimization Methods to Estimate Neural Network Weights
Training a neural network is a difficult optimization problem because of numerous local minima. Many global search algorithms have been used to train neural networks. However, local search algorithms are more efficient with computational resources, and ...
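As a rough illustration of the global-versus-local trade-off described above, and not the specific methods compared in the cited article, the sketch below pits a purely random global search against a simple local hill climber on the weights of an assumed 2-2-1 tanh network fitted to XOR. The architecture, evaluation budget, and step size are illustrative choices.

```python
# Minimal sketch (assumed example): random global search vs. local hill
# climbing for the weights of a tiny one-hidden-layer network on XOR.
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

def loss(w):
    """Mean squared error of a 2-2-1 tanh network with weights packed in w."""
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8], w[8]
    h = np.tanh(X @ W1 + b1)
    out = h @ W2 + b2
    return np.mean((out - y) ** 2)

dim, budget = 9, 5000

# global: sample weight vectors uniformly at random and keep the best
best_global = min(loss(rng.uniform(-3, 3, dim)) for _ in range(budget))

# local: hill climbing by small Gaussian perturbations from one start point
w = rng.uniform(-3, 3, dim)
best_local = loss(w)
for _ in range(budget):
    cand = w + rng.normal(scale=0.1, size=dim)
    if loss(cand) < best_local:
        w, best_local = cand, loss(cand)

print(f"random global search best loss: {best_global:.4f}")
print(f"local hill climbing best loss:  {best_local:.4f}")
```

With the same evaluation budget, the local climber typically refines one basin while the random search covers the space coarsely, which is the efficiency-versus-coverage tension the snippet alludes to.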
Information
Published In
Proceedings of the 1998 conference on Advances in neural information processing systems 11
Publisher
MIT Press
Cambridge, MA, United States
Qualifiers
- Article