In situ parallel training of analog neural network using electrochemical random-access memory

Li et al., 2021

Document ID
4781592410128529947
Authors
Li Y, Xiao T, Bennett C, Isele E, Melianas A, Tao H, Marinella M, Salleo A, Fuller E, Talin A
Publication year
2021
Publication venue
Frontiers in Neuroscience

Snippet

In-memory computing based on non-volatile resistive memory can significantly improve the energy efficiency of artificial neural networks. However, accurate in situ training has been challenging due to the nonlinear and stochastic switching of the resistive memory elements …
Continue reading at www.frontiersin.org (HTML)
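
Note: the following is a minimal, illustrative sketch of the idea the snippet refers to — applying a rank-1 outer-product weight update to an entire analog crossbar in one programming step while each memory cell switches nonlinearly and stochastically. It is not the paper's algorithm or ECRAM device model; the conductance bounds (G_MIN, G_MAX), the write-noise level (NOISE), and the device_update() helper are hypothetical parameters chosen only for demonstration.

import numpy as np

rng = np.random.default_rng(0)

G_MIN, G_MAX = 0.0, 1.0   # hypothetical conductance bounds (arbitrary units)
NOISE = 0.05              # hypothetical cycle-to-cycle write-noise level
LR = 0.1                  # learning rate

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def device_update(G, dG_ideal):
    """Apply an ideal update dG_ideal to conductances G using a toy
    nonlinear, stochastic device model (not the paper's ECRAM model)."""
    # Nonlinearity: increments shrink as a cell approaches its bound.
    headroom = np.where(dG_ideal >= 0, G_MAX - G, G - G_MIN)
    dG_real = dG_ideal * headroom / (G_MAX - G_MIN)
    # Stochasticity: multiplicative write noise on every programmed pulse.
    dG_real *= 1.0 + NOISE * rng.standard_normal(G.shape)
    return np.clip(G + dG_real, G_MIN, G_MAX)

# Toy single-layer classifier on synthetic data; the conductance matrix G
# plays the role of the weight matrix stored in the crossbar.
n_in, n_out, n_samples = 8, 2, 200
X = rng.standard_normal((n_samples, n_in))
true_W = rng.standard_normal((n_out, n_in))
y = (X @ true_W.T).argmax(axis=1)

G = rng.uniform(0.4, 0.6, size=(n_out, n_in))

for epoch in range(20):
    for x, label in zip(X, y):
        out = G @ x                          # the crossbar computes this in one step
        delta = np.eye(n_out)[label] - softmax(out)
        # "Parallel" in situ training: every cell receives its outer-product
        # increment in a single programming operation.
        G = device_update(G, LR * np.outer(delta, x))

pred = (X @ G.T).argmax(axis=1)
print("training accuracy:", (pred == y).mean())

A real array would typically represent signed weights with a differential pair of conductances (G+ minus G-); this sketch keeps a single non-negative matrix purely to stay short.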

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06N COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computer systems based on biological models
    • G06N 3/02 Computer systems based on biological models using neural network models
    • G06N 3/06 Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N 3/063 Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means
    • G06N 3/0635 Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means using analogue means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06N COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 99/00 Subject matter not provided for in other groups of this subclass
    • G06N 99/005 Learning machines, i.e. computer in which a programme is changed according to experience gained by the machine itself during a complete run
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06N COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computer systems based on biological models
    • G06N 3/02 Computer systems based on biological models using neural network models
    • G06N 3/04 Architectures, e.g. interconnection topology
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06N COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 5/00 Computer systems utilising knowledge based models
    • G06N 5/04 Inference methods or devices
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06N COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computer systems based on biological models
    • G06N 3/12 Computer systems based on biological models using genetic models
    • G06N 3/126 Genetic algorithms, i.e. information processing using digital simulations of the genetic system
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06N COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 5/00 Computer systems utilising knowledge based models
    • G06N 5/02 Knowledge representation
    • G06N 5/022 Knowledge engineering, knowledge acquisition
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11C STATIC STORES
    • G11C 13/00 Digital stores characterised by the use of storage elements not covered by groups G11C11/00, G11C23/00 - G11C25/00
    • G11C 13/0002 Digital stores characterised by the use of storage elements not covered by groups G11C11/00, G11C23/00 - G11C25/00 using resistance random access memory [RRAM] elements
    • G11C 13/0009 RRAM elements whose operation depends upon chemical change
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11C STATIC STORES
    • G11C 11/00 Digital stores characterised by the use of particular electric or magnetic storage elements; Storage elements therefor
    • G11C 11/56 Digital stores characterised by the use of particular electric or magnetic storage elements; Storage elements therefor using storage elements with more than two stable states represented by steps, e.g. of voltage, current, phase, frequency
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRICAL DIGITAL DATA PROCESSING
    • G06F 17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06N COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 7/00 Computer systems based on specific mathematical models
    • G06N 7/005 Probabilistic networks

Similar Documents

Li et al. In situ parallel training of analog neural network using electrochemical random-access memory
Li et al. Long short-term memory networks in memristor crossbar arrays
Mahmoodi et al. Versatile stochastic dot product circuits based on nonvolatile memories for high performance neurocomputing and neurooptimization
Dalgaty et al. In situ learning using intrinsic memristor variability via Markov chain Monte Carlo sampling
Wang et al. Reinforcement learning with analogue memristor arrays
Oh et al. Energy-efficient Mott activation neuron for full-hardware implementation of neural networks
Fuller et al. Parallel programming of an ionic floating-gate memory array for scalable neuromorphic computing
Yao et al. Fully hardware-implemented memristor convolutional neural network
Dutta et al. Supervised learning in all FeFET-based spiking neural network: Opportunities and challenges
Nandakumar et al. Mixed-precision deep learning based on computational memory
Sun et al. One-step regression and classification with cross-point resistive memory arrays
Hirtzlin et al. Digital biologically plausible implementation of binarized neural networks with differential hafnium oxide resistive memory arrays
Boybat et al. Neuromorphic computing with multi-memristive synapses
Yao et al. Face classification using electronic synapses
Dutta et al. Neural sampling machine with stochastic synapse allows brain-like learning and inference
John et al. Optogenetics inspired transition metal dichalcogenide neuristors for in-memory deep recurrent neural networks
Abunahla et al. NeuroMem: Analog graphene-based resistive memory for artificial neural networks
Wang et al. A memristive deep belief neural network based on silicon synapses
Zahari et al. Analogue pattern recognition with stochastic switching binary CMOS-integrated memristive devices
Ernoult et al. Using memristors for robust local learning of hardware restricted Boltzmann machines
Fahimi et al. Combinatorial optimization by weight annealing in memristive hopfield networks
Antolini et al. Combined HW/SW drift and variability mitigation for PCM-based analog in-memory computing for neural network applications
Liu et al. Bayesian neural networks using magnetic tunnel junction-based probabilistic in-memory computing
Lin et al. Uncertainty quantification via a memristor Bayesian deep neural network for risk-sensitive reinforcement learning
Park et al. Implementation of convolutional neural networks in memristor crossbar arrays with binary activation and weight quantization