Research article · Open access
DOI: 10.1145/3589737.3605975

Context Modulation Enables Multi-tasking and Resource Efficiency in Liquid State Machines

Published: 28 August 2023

Abstract

Memory storage and retrieval are context-sensitive in both humans and animals; memories are more accurately retrieved in the context where they were acquired, and similar stimuli can elicit different responses in different contexts. Researchers have suggested that such effects may be underpinned by mechanisms that modulate the dynamics of neural circuits in a context-dependent fashion. Based on this idea, we design a mechanism for context-dependent modulation of a liquid state machine, a recurrent spiking artificial neural network. We find that context modulation enables a single network to multitask and requires fewer neurons than when several smaller networks are used to perform the tasks individually.
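The paper's mechanism operates on a spiking liquid state machine; the details of that implementation are not reproduced here. As a minimal illustrative sketch only, the following assumes a simplified rate-based reservoir with fixed random weights and a hypothetical per-context multiplicative gain on each neuron, so that the same input sequence yields different reservoir trajectories in different contexts, which a task-specific readout could then separate. All names and parameters here are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

N_RES = 100  # reservoir neurons
N_IN = 4     # input dimensionality
N_CTX = 2    # number of task contexts

# Reservoir-computing convention: input and recurrent weights are
# random and fixed; only a linear readout (not shown) is trained.
W_in = rng.normal(0.0, 0.5, (N_RES, N_IN))
W_res = rng.normal(0.0, 1.0 / np.sqrt(N_RES), (N_RES, N_RES))

# Hypothetical context mechanism: one gain vector per context scales
# each neuron's drive, reshaping the network dynamics per task.
ctx_gain = rng.uniform(0.5, 1.5, (N_CTX, N_RES))

def run_reservoir(inputs, ctx):
    """Drive the reservoir with an input sequence under a given context."""
    x = np.zeros(N_RES)
    states = []
    for u in inputs:
        pre = W_res @ x + W_in @ u
        x = np.tanh(ctx_gain[ctx] * pre)  # context modulates the dynamics
        states.append(x.copy())
    return np.array(states)

seq = rng.normal(size=(20, N_IN))
s0 = run_reservoir(seq, ctx=0)  # same input sequence...
s1 = run_reservoir(seq, ctx=1)  # ...under a different context
# The two trajectories differ, giving per-context readouts
# distinct representations of identical stimuli.
```

In this toy setting the context acts purely multiplicatively; the effect is that one reservoir serves several tasks, rather than allocating a separate (smaller) reservoir per task.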



Published In

ICONS '23: Proceedings of the 2023 International Conference on Neuromorphic Systems
August 2023, 270 pages
ISBN: 9798400701757
DOI: 10.1145/3589737
This work is licensed under a Creative Commons Attribution 4.0 International License.

Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. context modulation
    2. neuromorphic
    3. spiking neural network
    4. liquid state machine


Conference

ICONS '23

Acceptance Rates

Overall acceptance rate: 13 of 22 submissions, 59%

Article Metrics

• 0 total citations
• 231 total downloads
• 155 downloads (last 12 months)
• 18 downloads (last 6 weeks)

Reflects downloads up to 11 Dec 2024.
