Alleviating Catastrophic Interference in Online Learning via Varying Scale of Backward Queried Data
Published by Springer-Verlag, Berlin, Heidelberg.