Zhou et al., 2021 - Google Patents
Energy‐Efficient Memristive Euclidean Distance Engine for Brain‐Inspired Competitive Learning
- Document ID
- 15831922972509195587
- Author
- Zhou H
- Chen J
- Wang Y
- Liu S
- Li Y
- Li Q
- Liu Q
- Wang Z
- He Y
- Xu H
- Miao X
- Publication year
- 2021
- Publication venue
- Advanced Intelligent Systems
Snippet
Inspired by the competitive rules of nature, competitive learning contributes to the specialization of the human brain and the general creativity of mankind. However, the construction of hardware competitive-learning neural networks still faces great challenges …
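The snippet and title together point to a winner-take-all scheme: prototype vectors compete for each input by Euclidean distance, and only the nearest prototype is updated. Below is a minimal NumPy sketch of that textbook competitive-learning rule, for orientation only; the paper's contribution is executing the distance computation in an energy-efficient memristive array, which this software model does not capture, and all names and parameters here are illustrative assumptions rather than the authors' code.

```python
import numpy as np

def competitive_learning(X, n_units=3, lr=0.1, epochs=20, seed=0):
    """Textbook winner-take-all competitive learning (illustrative sketch).

    Each input is claimed by the prototype with the smallest Euclidean
    distance; only that winner moves toward the input. The paper maps the
    distance computation onto a memristive crossbar, not modeled here.
    """
    rng = np.random.default_rng(seed)
    # Initialize prototypes from randomly chosen training samples.
    W = X[rng.choice(len(X), n_units, replace=False)].astype(float)
    for _ in range(epochs):
        for x in X:
            d = np.linalg.norm(W - x, axis=1)  # Euclidean distances to all prototypes
            win = np.argmin(d)                 # competition: nearest prototype wins
            W[win] += lr * (x - W[win])        # only the winner learns
    return W

# Usage: recover the centers of three Gaussian blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(c, 0.1, size=(50, 2)) for c in ([0, 0], [1, 1], [0, 1])])
print(competitive_learning(X))
```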
Classifications
- G06N3/0635 — Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons, using electronic means, using analogue means
- G06N3/04 — Architectures, e.g. interconnection topology
- G06N3/08 — Learning methods
- G06N99/005 — Learning machines, i.e. computer in which a programme is changed according to experience gained by the machine itself during a complete run
- G06N3/126 — Genetic algorithms, i.e. information processing using digital simulations of the genetic system
- G06K9/6268 — Classification techniques relating to the classification paradigm, e.g. parametric or non-parametric approaches
- G06F17/50 — Computer-aided design
- G11C11/56 — Digital stores using storage elements with more than two stable states represented by steps, e.g. of voltage, current, phase, frequency
- G11C13/0021 — Auxiliary circuits for resistance random access memory [RRAM] elements
- G06K9/6217 — Design or setup of recognition systems and techniques; extraction of features in feature space; clustering techniques; blind source separation
Similar Documents
Publication | Title
---|---
Nandakumar et al. | Mixed-precision deep learning based on computational memory
Dalgaty et al. | In situ learning using intrinsic memristor variability via Markov chain Monte Carlo sampling
Yang et al. | Research progress on memristor: From synapses to computing systems
Chen et al. | Mitigating effects of non-ideal synaptic device characteristics for on-chip learning
Jaiswal et al. | 8T SRAM cell as a multibit dot-product engine for beyond von Neumann computing
Xi et al. | In-memory learning with analog resistive switching memory: A review and perspective
Joshi et al. | Accurate deep neural network inference using computational phase-change memory
Cai et al. | A fully integrated reprogrammable memristor–CMOS system for efficient multiply–accumulate operations
Chen et al. | Multiply accumulate operations in memristor crossbar arrays for analog computing
Wang et al. | Integration and co-design of memristive devices and algorithms for artificial intelligence
Eryilmaz et al. | Training a probabilistic graphical model with resistive switching electronic synapses
Zhou et al. | Energy‐Efficient Memristive Euclidean Distance Engine for Brain‐Inspired Competitive Learning
Li et al. | In situ parallel training of analog neural network using electrochemical random-access memory
Mao et al. | Experimentally validated memristive memory augmented neural network with efficient hashing and similarity search
Dalgaty et al. | Ex situ transfer of Bayesian neural networks to resistive memory‐based inference hardware
Li et al. | Mixed‐Precision Continual Learning Based on Computational Resistance Random Access Memory
Bennett et al. | Contrasting advantages of learning with random weights and backpropagation in non-volatile memory neural networks
Liu et al. | Bayesian neural networks using magnetic tunnel junction-based probabilistic in-memory computing
Lee et al. | Ferroelectric field-effect transistors for binary neural network with 3-D NAND architecture
Yang et al. | Tolerating noise effects in processing‐in‐memory systems for neural networks: a hardware–software codesign perspective
Ravichandran et al. | Artificial neural networks based on memristive devices
Cai et al. | A Fully Integrated System‐on‐Chip Design with Scalable Resistive Random‐Access Memory Tile Design for Analog in‐Memory Computing
Zhou et al. | Low-time-complexity document clustering using memristive dot product engine
Li et al. | Restricted Boltzmann Machines Implemented by Spin–Orbit Torque Magnetic Tunnel Junctions
Cao et al. | Parasitic-aware modelling for neural networks implemented with memristor crossbar array