Abstract
In this paper we define a kernel function that is the dual-space equivalent of infinitely large sparse threshold unit networks. We first explain how to couple a kernel function to an infinite recurrent neural network, and then apply this construction to sparse threshold unit networks. We validate the resulting kernel with a theoretical analysis and an illustrative signal processing task.
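To make the coupling of a kernel function to an infinite recurrent network concrete, below is a minimal sketch of a recurrent kernel recursion for an infinite network of sign (threshold) units, in the spirit of Hermans and Schrauwen's recurrent kernel machines and Williams' kernels for infinite neural networks. The function name recurrent_sign_kernel and the parameters sigma_w2, sigma_v2, and k0 are illustrative choices of our own; the sketch assumes dense i.i.d. Gaussian weights, and the sparse-connectivity treatment that is the subject of the paper is not reproduced here.

import numpy as np

def recurrent_sign_kernel(u, v, sigma_w2=1.0, sigma_v2=1.0, k0=0.0):
    """Recursive kernel between two input sequences u, v of shape [T, d]:
    the large-N limit of (1/N) x_t(u) . x_t(v) for a random recurrent
    network of sign (threshold) units with i.i.d. Gaussian weights.
    Hypothetical sketch; names and parameters are our own."""
    k = k0  # kernel (normalized state inner product) at time t
    for t in range(len(u)):
        # Pre-activation second moments in the infinite-width limit.
        # A sign unit's squared output is always 1, so the normalized
        # state norm q_t = (1/N)|x_t|^2 equals 1 at every step.
        cov = sigma_w2 * k + sigma_v2 * np.dot(u[t], v[t])
        var_u = sigma_w2 * 1.0 + sigma_v2 * np.dot(u[t], u[t])
        var_v = sigma_w2 * 1.0 + sigma_v2 * np.dot(v[t], v[t])
        rho = np.clip(cov / np.sqrt(var_u * var_v), -1.0, 1.0)
        # For jointly Gaussian zero-mean z1, z2 with correlation rho,
        # E[sign(z1) sign(z2)] = (2/pi) arcsin(rho).
        k = (2.0 / np.pi) * np.arcsin(rho)
    return k

A kernel method such as ridge regression can then be run on the Gram matrix obtained by evaluating this recursion over pairs of input sequences, mirroring in the dual space how a readout would be trained on a finite echo state network's states.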
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Hermans, M., Schrauwen, B. (2012). Infinite Sparse Threshold Unit Networks. In: Villa, A.E.P., Duch, W., Érdi, P., Masulli, F., Palm, G. (eds) Artificial Neural Networks and Machine Learning – ICANN 2012. ICANN 2012. Lecture Notes in Computer Science, vol 7552. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33269-2_77
Print ISBN: 978-3-642-33268-5
Online ISBN: 978-3-642-33269-2