Search Results (3)

Search Parameters:
Keywords = analog MLP circuit

26 pages, 3383 KiB  
Article
Programmable Energy-Efficient Analog Multilayer Perceptron Architecture Suitable for Future Expansion to Hardware Accelerators
by Jeff Dix, Jeremy Holleman and Benjamin J. Blalock
J. Low Power Electron. Appl. 2023, 13(3), 47; https://doi.org/10.3390/jlpea13030047 - 31 Jul 2023
Cited by 2 | Viewed by 1658
Abstract
A programmable, energy-efficient analog hardware implementation of a multilayer perceptron (MLP) is presented. The system is highly programmable, letting the user create an MLP neural network hardware design within the available framework, and it achieves energy-efficient operation through analog/mixed-signal design. The configurable system comprises 12 neurons and is fabricated in a standard 130 nm CMOS process, occupying approximately 1 mm² of on-chip area. The architecture is analyzed in several different configurations, each achieving a power efficiency greater than 1 tera-operation per watt, with a peak power efficiency of 5.23 tera-operations per watt while consuming considerably less power than comparable digital and analog designs. This work offers an energy-efficient, scalable alternative to digital configurable neural networks: the circuit elements can readily be scaled up at the system level to create a larger neural network architecture capable of standard machine learning applications, such as image and text classification, with improved energy efficiency.
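As a quick sanity check on the units, power efficiency in tera-operations per watt (TOPS/W) is simply throughput divided by power. A minimal sketch follows; the throughput and power numbers are illustrative placeholders, not measurements from the paper, chosen only so the arithmetic lands near the reported 5.23 TOPS/W peak.

```python
def tops_per_watt(ops_per_second: float, power_watts: float) -> float:
    """Power efficiency in tera-operations per watt (TOPS/W)."""
    return ops_per_second / power_watts / 1e12

# Illustrative numbers only (not from the paper): a design performing
# 1.0e9 operations per second while drawing 191 microwatts works out to
# roughly the 5.23 TOPS/W peak the abstract reports.
print(tops_per_watt(1.0e9, 191e-6))  # ~5.24
```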
Figures

Figure 1. General MLP structure.
Figure 2. PCB test board for the MLP chip (see text for color details).
Figure 3. Hardware setup for the first configuration.
Figure 4. First system configuration measurement (no load at 4.07 MHz).
Figure 5. First system configuration measurement for rising-edge propagation delay.
Figure 6. First system configuration measurement for falling-edge propagation delay.
Figure 7. Hardware setup for the seventh configuration.
Figure 8. Seventh system configuration measurement (no load at 4.07 MHz).
Figure 9. Seventh system configuration measurement for rising-edge propagation delay.
Figure 10. Seventh system configuration measurement for falling-edge propagation delay.
Figure 11. First noise analysis run on the system.
Figure 12. Second noise analysis run on the system.
Figure 13. MLP hardware block diagram for the programmable energy-efficient system.
Figure 14. Multiplier cell schematic.
Figure 15. Minch cascode bias cell schematics.
Figure 16. Simulation of the multiplier circuit with an input current of 100 nA and weighting signals of 100 nA.
Figure 17. Simulation of the multiplier circuit with a DC sweep of the input current and weighting signals of 100 nA.
Figure 18. Logistic function (left) and hyperbolic tangent function (right).
Figure 19. Sigmoid circuit schematic.
Figure 20. Sigmoid circuit simulation with current and reference biasing to keep the output current signal similar to the input signal.
Figure 21. Sigmoid circuit simulation with the input current swept from 0 to 500 nA at constant reference and bias currents of 200 nA and 25 nA, respectively.
Figure 22. Winner-take-all block diagram.
Figure 23. Thresholding circuit cell schematic.
Figure 24. Winner-take-all circuit simulation with a 100 nA input current signal and 100 nA weight, saturation, and biasing signals, producing an output voltage that corresponds to the high signal levels of the input.
Figure 25. Four-bit shift register made up of D-type flip-flops.
Figure 26. Master bias input cell with a single current-mirror output.
Figure 27. Neuron/WTA input bias cell with a single current-mirror output.
Figure 28. Neuron block diagram.
Figure 29. Switching structures: (a) C-switch and (b) S-switch.
Figure 30. C-switch matrix.
Figure 31. S-switch matrix.
Figure 32. Final layout view of the MLP architecture, fitting within a 1 mm × 1 mm physical space.
22 pages, 4861 KiB  
Article
Implementation of Analog Perceptron as an Essential Element of Configurable Neural Networks
by Chao Geng, Qingji Sun and Shigetoshi Nakatake
Sensors 2020, 20(15), 4222; https://doi.org/10.3390/s20154222 - 29 Jul 2020
Cited by 4 | Viewed by 4330
Abstract
The perceptron is an essential element in neural network (NN)-based machine learning; however, the effectiveness of circuit implementations is rarely demonstrated through chip testing. This paper presents measured silicon results for analog perceptron circuits fabricated in a 0.6 μm/±2.5 V complementary metal oxide semiconductor (CMOS) process, comprising digital-to-analog converter (DAC)-based multipliers and phase shifters. The measurement results confirm that the implementation attains the correct function and good performance. Furthermore, we propose a multi-layer perceptron (MLP) built from these analog perceptrons, in which the structure, neurons, and weights can all be flexibly configured. As an example, a 2-3-4 MLP circuit with rectified linear unit (ReLU) activation is designed, consisting of 2 input neurons, 3 hidden neurons, and 4 output neurons. Its simulated performance achieves a power dissipation of 200 mW, a working frequency range from 0 to 1 MHz, and an error ratio within 12.7%. Finally, to demonstrate the feasibility and effectiveness of the analog perceptron for configuring an MLP, seven more analog-based MLPs designed with the same approach are analyzed with respect to various specifications, and two cases are compared to digital counterparts with the same structures.
(This article belongs to the Special Issue Advanced Interface Circuits for Sensor Systems)
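For readers who want the 2-3-4 topology in concrete terms, here is a minimal behavioral sketch of the forward pass in plain NumPy. This models only the mathematical function, not the authors' circuit; the random weights, the biases, and the choice to also rectify the output layer are assumptions for illustration.

```python
import numpy as np

# Behavioral sketch of the 2-3-4 MLP described above: 2 input neurons,
# 3 hidden neurons, 4 output neurons, ReLU activation. Weights here are
# random placeholders, standing in for the chip's configurable weights.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 2))   # hidden layer: 3 neurons x 2 inputs
b1 = rng.normal(size=3)
W2 = rng.normal(size=(4, 3))   # output layer: 4 neurons x 3 hidden
b2 = rng.normal(size=4)

def relu(x):
    return np.maximum(0.0, x)

def mlp_2_3_4(x):
    """Forward pass of the 2-3-4 network."""
    h = relu(W1 @ x + b1)       # hidden layer
    return relu(W2 @ h + b2)    # output layer (rectification here is assumed)

print(mlp_2_3_4(np.array([0.5, -0.2])))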
Figures

Figure 1. Comparison between the analog method and the digital method for electroencephalogram (EEG) signal processing with a machine learning mechanism. (a) EEG signal decomposition/reconstruction using Fourier series approximation. (b) The analog-based multi-layer perceptron (MLP) and the field programmable gate array (FPGA)-based MLP in handling the EEG signal.
Figure 2. (a) Configuration of the proposed analog-based MLP and learning algorithm. (b) Contributions of this work.
Figure 3. General architecture of the measuring system for the perceptron chip.
Figure 4. Schematic of a perceptron chip and block diagram of the measurement.
Figure 5. Top-level layout of the perceptron chip.
Figure 6. Measurement results for the perceptron chip.
Figure 7. Top-level schematic of the MLP circuit.
Figure 8. Circuits and simulation for the rectified linear unit (ReLU) activation function. (a) A source follower circuit. (b) The improved source follower circuit. (c) Simulation results for both circuits.
Figure 9. Summator inside the perceptron module.
Figure 10. Impedance issue of cascading. (a) Comparison between the output impedance and the input impedance. (b) The output impedance of the circuit after inserting buffers.
Figure 11. Simulation results for the top-level schematic.
852 KiB  
Article
An Analog Multilayer Perceptron Neural Network for a Portable Electronic Nose
by Chih-Heng Pan, Hung-Yi Hsieh and Kea-Tiong Tang
Sensors 2013, 13(1), 193-207; https://doi.org/10.3390/s130100193 - 24 Dec 2012
Cited by 29 | Viewed by 9014
Abstract
This study examines an analog circuit implementing a multilayer perceptron neural network (MLPNN). It proposes a low-power, small-area analog MLP circuit for use as the classifier in an electronic nose (E-nose), so that the E-nose can be relatively small, power-efficient, and portable. The analog MLP circuit has only four input neurons, four hidden neurons, and one output neuron. The circuit was designed and fabricated in a 0.18 μm standard CMOS process with a 1.8 V supply. The power consumption was 0.553 mW, and the area was approximately 1.36 × 1.36 mm². Chip measurements showed that this MLPNN successfully identified the fruit odors of bananas, lemons, and lychees with 91.7% accuracy.
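The figure list below mentions an activation-function circuit "and its differentiation" together with a Delta block and a weight unit with distinct training and classifying phases, which suggests backpropagation-style on-chip training. The sketch below shows the textbook form of those pieces for a 4-4-1 network: the logistic activation, its derivative, and the delta-rule weight update. The weights, input sample, and learning rate are placeholders, and this is not claimed to match the chip's exact training scheme.

```python
import numpy as np

# Textbook backpropagation pieces for a 4-4-1 MLP with sigmoid activation:
# the activation, its derivative ("differentiation"), and the delta terms.
rng = np.random.default_rng(1)
W1 = rng.normal(scale=0.5, size=(4, 4))   # hidden layer: 4 neurons x 4 inputs
W2 = rng.normal(scale=0.5, size=(1, 4))   # output layer: 1 neuron x 4 hidden

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def dsigmoid(y):
    # derivative expressed in terms of the activation's output value
    return y * (1.0 - y)

x = np.array([0.2, 0.8, 0.1, 0.5])        # one 4-channel sensor sample (placeholder)
target = 1.0                              # desired class output (placeholder)

h = sigmoid(W1 @ x)                       # hidden activations
y = sigmoid(W2 @ h)                       # network output

delta_out = (target - y) * dsigmoid(y)    # output-layer delta ("Delta block")
delta_hid = (W2.T @ delta_out) * dsigmoid(h)

eta = 0.1                                 # learning rate (placeholder)
W2 += eta * np.outer(delta_out, h)        # weight updates (training phase)
W1 += eta * np.outer(delta_hid, x)
```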
Figures

Figure 1. Block diagram of the proposed 4-4-1 MLPNN.
Figure 2. Detailed block diagrams of HS, HN, OS, and ON.
Figure 3. Schematic of Chible's multiplier.
Figure 4. Schematic of Gilbert's multiplier.
Figure 5. The weight unit. (a) Training phase; (b) classifying phase.
Figure 6. Activation function circuit and its differentiation.
Figure 7. Schematic of the Delta block.
Figure 8. Photo of the components in the experiment. (a) Equipment for odor data collection; (b) PCB for bias generation; (c) socket and PCB with the designed chip inside.
Figure 9. Fruit patterns of (a) banana, (b) lemon, and (c) lychee odors.