Li et al., 2021 - Google Patents
In situ parallel training of analog neural network using electrochemical random-access memory
- Document ID
- 4781592410128529947
- Author
- Li Y
- Xiao T
- Bennett C
- Isele E
- Melianas A
- Tao H
- Marinella M
- Salleo A
- Fuller E
- Talin A
- Publication year
- 2021
- Publication venue
- Frontiers in Neuroscience
Snippet
In-memory computing based on non-volatile resistive memory can significantly improve the energy efficiency of artificial neural networks. However, accurate in situ training has been challenging due to the nonlinear and stochastic switching of the resistive memory elements …
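The snippet describes the core difficulty: weight updates applied to resistive memory cells are nonlinear and stochastic, so naive in situ gradient training degrades. A minimal simulation sketch of this effect, under assumed device behavior (the `nonideal_update` model, its `nonlinearity` and `noise` parameters, and the toy delta-rule task are all illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def nonideal_update(G, dG_ideal, nonlinearity=0.5, noise=0.05):
    """Apply a requested conductance change dG_ideal to array G, modeling
    two nonidealities of resistive synapses (assumed, illustrative model):
    - nonlinearity: realized change shrinks as G approaches its bounds
    - noise: cycle-to-cycle stochastic variation of each update
    G is normalized to [0, 1]."""
    # Asymmetric nonlinearity: potentiation saturates near G=1, depression near G=0.
    scale = np.where(dG_ideal > 0, (1.0 - G) ** nonlinearity, G ** nonlinearity)
    dG = dG_ideal * scale
    dG += noise * np.abs(dG_ideal) * rng.standard_normal(G.shape)
    return np.clip(G + dG, 0.0, 1.0)

# Toy task: learn y = x @ W_target with one analog layer via the delta rule.
n_in, n_out, lr = 8, 4, 0.2
W_target = rng.uniform(0.2, 0.8, (n_in, n_out))
G = rng.uniform(0.4, 0.6, (n_in, n_out))           # device conductances = weights
rmse0 = np.sqrt(np.mean((G - W_target) ** 2))      # error before training

for step in range(500):
    x = rng.uniform(0, 1, n_in)
    err = x @ W_target - x @ G                     # output error
    dW = lr * np.outer(x, err)                     # rank-1 outer-product update,
    G = nonideal_update(G, dW)                     # applied to all cells at once

rmse = np.sqrt(np.mean((G - W_target) ** 2))
print("weight RMSE before/after:", rmse0, rmse)
```

The rank-1 outer-product update is what makes fully parallel in-array programming possible in principle; raising `nonlinearity` or `noise` in the sketch shows how device nonidealities slow or stall convergence.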
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computer systems based on biological models
- G06N3/02—Computer systems based on biological models using neural network models
- G06N3/06—Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
- G06N3/063—Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means
- G06N3/0635—Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means using analogue means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N99/00—Subject matter not provided for in other groups of this subclass
- G06N99/005—Learning machines, i.e. computer in which a programme is changed according to experience gained by the machine itself during a complete run
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computer systems based on biological models
- G06N3/02—Computer systems based on biological models using neural network models
- G06N3/04—Architectures, e.g. interconnection topology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N5/00—Computer systems utilising knowledge based models
- G06N5/04—Inference methods or devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computer systems based on biological models
- G06N3/12—Computer systems based on biological models using genetic models
- G06N3/126—Genetic algorithms, i.e. information processing using digital simulations of the genetic system
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N5/00—Computer systems utilising knowledge based models
- G06N5/02—Knowledge representation
- G06N5/022—Knowledge engineering, knowledge acquisition
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11C—STATIC STORES
- G11C13/00—Digital stores characterised by the use of storage elements not covered by groups G11C11/00, G11C23/00 - G11C25/00
- G11C13/0002—Digital stores characterised by the use of storage elements not covered by groups G11C11/00, G11C23/00 - G11C25/00 using resistance random access memory [RRAM] elements
- G11C13/0009—RRAM elements whose operation depends upon chemical change
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11C—STATIC STORES
- G11C11/00—Digital stores characterised by the use of particular electric or magnetic storage elements; Storage elements therefor
- G11C11/56—Digital stores characterised by the use of particular electric or magnetic storage elements; Storage elements therefor using storage elements with more than two stable states represented by steps, e.g. of voltage, current, phase, frequency
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N7/00—Computer systems based on specific mathematical models
- G06N7/005—Probabilistic networks
Similar Documents
Publication | Title | Publication Date |
---|---|---|
Li et al. | In situ parallel training of analog neural network using electrochemical random-access memory | |
Li et al. | Long short-term memory networks in memristor crossbar arrays | |
Mahmoodi et al. | Versatile stochastic dot product circuits based on nonvolatile memories for high performance neurocomputing and neurooptimization | |
Dalgaty et al. | In situ learning using intrinsic memristor variability via Markov chain Monte Carlo sampling | |
Wang et al. | Reinforcement learning with analogue memristor arrays | |
Oh et al. | Energy-efficient Mott activation neuron for full-hardware implementation of neural networks | |
Fuller et al. | Parallel programming of an ionic floating-gate memory array for scalable neuromorphic computing | |
Yao et al. | Fully hardware-implemented memristor convolutional neural network | |
Dutta et al. | Supervised learning in all FeFET-based spiking neural network: Opportunities and challenges | |
Nandakumar et al. | Mixed-precision deep learning based on computational memory | |
Sun et al. | One-step regression and classification with cross-point resistive memory arrays | |
Hirtzlin et al. | Digital biologically plausible implementation of binarized neural networks with differential hafnium oxide resistive memory arrays | |
Boybat et al. | Neuromorphic computing with multi-memristive synapses | |
Yao et al. | Face classification using electronic synapses | |
Dutta et al. | Neural sampling machine with stochastic synapse allows brain-like learning and inference | |
John et al. | Optogenetics inspired transition metal dichalcogenide neuristors for in-memory deep recurrent neural networks | |
Abunahla et al. | NeuroMem: Analog graphene-based resistive memory for artificial neural networks | |
Wang et al. | A memristive deep belief neural network based on silicon synapses | |
Zahari et al. | Analogue pattern recognition with stochastic switching binary CMOS-integrated memristive devices | |
Ernoult et al. | Using memristors for robust local learning of hardware restricted Boltzmann machines | |
Fahimi et al. | Combinatorial optimization by weight annealing in memristive hopfield networks | |
Antolini et al. | Combined HW/SW drift and variability mitigation for PCM-based analog in-memory computing for neural network applications | |
Liu et al. | Bayesian neural networks using magnetic tunnel junction-based probabilistic in-memory computing | |
Lin et al. | Uncertainty quantification via a memristor Bayesian deep neural network for risk-sensitive reinforcement learning | |
Park et al. | Implementation of convolutional neural networks in memristor crossbar arrays with binary activation and weight quantization |