CN109472348B - LSTM neural network system based on memristor cross array - Google Patents


Info

Publication number
CN109472348B
CN109472348B (application CN201811236611.0A)
Authority
CN
China
Prior art keywords
voltage
memristor
threshold
converter
array
Prior art date
Legal status
Expired - Fee Related
Application number
CN201811236611.0A
Other languages
Chinese (zh)
Other versions
CN109472348A (en
Inventor
温世平
魏华强
黄廷文
曾志刚
Current Assignee
Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Huazhong University of Science and Technology filed Critical Huazhong University of Science and Technology
Priority to CN201811236611.0A priority Critical patent/CN109472348B/en
Publication of CN109472348A publication Critical patent/CN109472348A/en
Application granted granted Critical
Publication of CN109472348B publication Critical patent/CN109472348B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/06 Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N3/063 Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means


Abstract

The invention discloses an LSTM neural network system based on a memristor crossbar array, comprising an input layer, a feature extraction layer, and a classification output layer. The feature extraction layer comprises a data memory, a first memristor crossbar array, a first DA converter, and an AD converter; the classification output layer comprises a second DA converter, a second memristor crossbar array, and a voltage comparator. The first memristor crossbar array performs feature extraction; the second memristor crossbar array performs feature classification, and the voltage comparator compares the resulting plurality of analog voltages. The maximum value among the comparison results is taken as the classification result for the digital signal of the input layer. The system achieves a smaller size and lower power consumption.

Description

LSTM neural network system based on memristor cross array
Technical Field
The invention belongs to the field of neural networks, and particularly relates to an LSTM neural network system based on a memristor cross array.
Background
The development of deep neural networks such as convolutional neural networks (CNN), fully convolutional networks (FCN), and long short-term memory (LSTM) networks has driven research on deep neural network hardware design. In this field, the large volume of input data (such as images or text), complex network structures, and numerous network parameters mean that deep neural networks implemented in conventional CMOS suffer from excessive circuit size, high power consumption, and similar problems.
In 2008, researchers at Hewlett-Packard fabricated nanoscale memristors for the first time. A basic challenge in realizing a neuromorphic computing system is the development of artificial synapses, and the discovery of the memristor provided a well-suited element for designing neuromorphic computing architectures. Owing to their excellent properties of nanoscale size, non-volatility, high density, low power consumption, and compatibility with CMOS technology, memristors have shown broad application prospects in neural network hardware design.
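The multiply-accumulate behaviour that makes a memristor crossbar attractive for neural network hardware can be sketched numerically: each memristor at a row-column intersection stores a conductance G[i][j], input voltages are applied to the rows, and by Ohm's and Kirchhoff's laws each column collects the current I[j] = Σ_i G[i][j]·V[i], i.e. the array performs a matrix-vector multiplication in one analog step. A minimal NumPy sketch with invented conductance and voltage values (purely illustrative, not taken from the patent):

```python
import numpy as np

# Conductance matrix of a 3x2 crossbar (siemens); each entry models
# one memristor at a row-column intersection.
G = np.array([[1e-3, 2e-3],
              [3e-3, 1e-3],
              [2e-3, 4e-3]])

# Input voltages applied to the three rows (volts).
V = np.array([0.5, -0.2, 0.1])

# Kirchhoff's current law: each column current is one dot product,
# so the whole crossbar computes I = G^T . V in a single step.
I = G.T @ V
print(I)  # column currents in amperes
```

The same relation scales to arbitrary array sizes, which is why crossbar dimensions track the weight-matrix dimensions of the network layer they implement.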
However, the LSTM neural network differs greatly from the CNN in structure and operating principle, so hardware circuit structures designed for convolutional neural networks are no longer applicable to the LSTM; meanwhile, the prior art suffers from the technical problems of large size and high power consumption.
Disclosure of Invention
In view of the above defects or improvement needs of the prior art, the present invention provides an LSTM neural network system based on a memristor crossbar array, thereby solving the technical problems of large size and high power consumption in the prior art.
In order to achieve the above object, the present invention provides an LSTM neural network system based on a memristive crossbar array, including: an input layer, a feature extraction layer and a classification output layer,
the feature extraction layer includes: the device comprises a data memory, a first memristor cross array, a first DA converter and an AD converter, wherein the data memory is used for storing a digital signal of an input layer and a digital signal after feature extraction; the first DA converter is used for converting the digital signal of the input layer into a first analog voltage; the first memristor cross array is used for carrying out feature extraction on the first analog voltage to obtain a first current; the AD converter is used for converting the first current into a digital signal after characteristic extraction;
the classification output layer includes: the second DA converter, the second memristor cross array and the voltage comparator; the second DA converter is used for converting the digital signal after the characteristic extraction into a second analog voltage; the second memristor cross array is used for carrying out feature classification on the second analog voltage to obtain a plurality of analog voltages; the voltage comparator is used for comparing the plurality of analog voltages to obtain a comparison result of the plurality of analog voltages; and taking the maximum value in the comparison results as the classification result of the digital signals of the input layer.
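Functionally, the classification output layer described above reduces to a matrix-vector product followed by an argmax: the second crossbar maps the extracted-feature voltages to one analog voltage per class, and the voltage comparator selects the largest. A hedged Python sketch of that data flow (the weight matrix, feature vector, and sizes are invented for illustration):

```python
import numpy as np

def classify(features, weights):
    """Model of the classification output layer: the second memristor
    crossbar produces one analog voltage per class (a matrix-vector
    product), and the voltage comparator picks the largest one."""
    class_voltages = weights @ features    # second crossbar array
    return int(np.argmax(class_voltages))  # comparator: maximum wins

# Illustrative 4-feature input and 3-class weight matrix.
features = np.array([0.2, -0.1, 0.7, 0.05])
weights = np.array([[0.1, 0.3, -0.2, 0.5],
                    [0.4, -0.1, 0.6, 0.2],
                    [-0.3, 0.2, 0.1, 0.0]])
print(classify(features, weights))
```

In the hardware, both steps are analog: the matrix-vector product emerges from the crossbar's current summation, and the argmax is taken directly on voltages by the comparator, so no digital arithmetic is needed at this stage.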
Further, the first memristive crossbar array includes a voltage input port, a threshold memristor, a voltage inverter, an operational amplifier, and a multiplier,
in any two adjacent voltage input ports, one voltage input port is connected with the threshold memristor through a voltage inverter, the other voltage input port is directly connected with the threshold memristor, and the operational amplifier is connected with the threshold memristor in parallel; one end of the voltage phase inverter is connected with the output end of the operational amplifier, and the other end of the voltage phase inverter is connected with the input end of the multiplier.
Further, the operational amplifier is connected in parallel with the threshold memristor and is used for realizing the operational function of the Sigmoid activation function and converting the current signal into the voltage signal.
Further, the operational amplifier is connected in parallel with the threshold memristor and is used for achieving the operational function of the hyperbolic tangent activation function and converting the current signal into the voltage signal.
Further, the second memristor cross array comprises voltage input ports, threshold memristors and voltage inverters, wherein one of any two adjacent voltage input ports is connected with the threshold memristor through the voltage inverter, and the other voltage input port is directly connected with the threshold memristor.
Further, to avoid threshold memristor shorts, no electrical connection is made at the intersection of the first memristive crossbar array and no electrical connection is made at the intersection of the second memristive crossbar array.
In general, compared with the prior art, the above technical solution contemplated by the present invention can achieve the following beneficial effects:
(1) Owing to the nanoscale size of the memristor and the high density of the crossbar array, the hardware circuit is small in size, easy to integrate, and well suited to wide adoption in deep neural network hardware design. In addition, the low-power characteristic of the memristor makes the power consumption of the whole system significantly lower than that of CMOS-based implementations.
(2) Usually one threshold memristor can only express a single positive weight. Here, among any two adjacent voltage input ports, one port is connected to the threshold memristor through a voltage inverter while the other is connected directly, so that two adjacent threshold memristors together express one weight that can take a positive or negative value.
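The inverter trick in (2) can be checked arithmetically: conductance is always positive, but if the input voltage v drives one memristor directly (conductance g_plus) while its inverted copy -v drives the neighbouring memristor (conductance g_minus), the summed current is v·(g_plus − g_minus), an effective weight that can be negative. A small circuit-free sketch under that assumption:

```python
def pair_current(v, g_plus, g_minus):
    """Current contributed by two adjacent threshold memristors when one
    receives the input voltage v directly and the other receives the
    inverted voltage -v (both conductances are necessarily positive)."""
    return v * g_plus + (-v) * g_minus  # = v * (g_plus - g_minus)

# g_plus - g_minus = -1e-3 S: a negative effective weight built
# from two positive conductances.
print(pair_current(0.5, 1e-3, 2e-3))
```

This differential-pair encoding is what lets a crossbar of non-negative devices represent the signed weight matrices an LSTM needs.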
Drawings
FIG. 1 is a schematic diagram of a long-short term memory neural network provided by an embodiment of the present invention;
FIG. 2 is a structural diagram of an LSTM neural network system based on a memristor crossbar array according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a feature extraction layer unit of the long short-term memory neural network according to an embodiment of the present invention;
FIG. 4 is a circuit diagram of a first memristive crossbar array provided by an embodiment of the present disclosure;
FIG. 5 is a circuit diagram of a second memristive crossbar array provided by an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
As shown in fig. 1, a long short-term memory (LSTM) neural network includes an input layer, a feature extraction layer, and a classification output layer. As shown in fig. 2, an LSTM neural network system based on a memristor crossbar array includes: an input layer, a feature extraction layer, and a classification output layer. The feature extraction layer includes: the data memory, a first switch, a first reading circuit, a writing circuit, the first memristor crossbar array, the first DA converter, and the AD converter. The classification output layer includes: a second switch, a second reading circuit, the second DA converter, the second memristor crossbar array, and the voltage comparator. When the first switch is closed and the second switch is open, the system performs feature extraction; when the first switch is open and the second switch is closed, the system performs classification output;
the data memory is used for storing the digital signal of the input layer and the digital signal after the characteristic extraction;
the first reading circuit is used for reading the digital signals of the input layer in the data memory and transmitting the digital signals to the first DA converter;
the first DA converter is used for converting the digital signal of the input layer into a first analog voltage;
the first memristor cross array is used for performing feature extraction on the first analog voltage to obtain a first current;
the AD converter is used for converting the first current into a digital signal after feature extraction;
the writing circuit is used for writing the digital signal after the characteristic extraction into the data memory;
the second reading circuit is used for reading the digital signals after the features in the data memory are extracted and transmitting the digital signals to the second DA converter;
the second DA converter is used for converting the digital signal after the characteristic extraction into a second analog voltage;
the second memristor cross array is used for carrying out feature classification on second analog voltages to obtain a plurality of analog voltages;
the voltage comparator is used for comparing the plurality of analog voltages to obtain a comparison result of the plurality of analog voltages;
and taking the maximum value in the comparison results as the classification result of the digital signals of the input layer.
FIG. 3 shows a schematic diagram of an internal cell (LSTM cell) of the LSTM neural network, in which c_t is the state of the LSTM cell, c_{t-1} is the state of the LSTM cell at the previous time step, h_t is the output of the LSTM cell, and h_{t-1} is the output of the LSTM cell at the previous time step. c_{t-1} has a data width of 1 and c_t has a data width of 64. Sigmoid denotes the Sigmoid activation function and Tanh denotes the hyperbolic tangent activation function. f denotes the output of the forget gate, i the input gate, j the update gate, and o the output gate; f_b denotes a fixed forget offset, b_f the offset of the forget gate, b_i the offset of the input gate, b_j the offset of the update gate, and b_o the offset of the output gate. W_hf denotes the state weight of the forget gate, W_xf the input weight of the forget gate, W_hi the state weight of the input gate, W_xi the input weight of the input gate, W_hj the state weight of the update gate, W_xj the input weight of the update gate, W_ho the state weight of the output gate, and W_xo the input weight of the output gate. x denotes the input of the LSTM cell.
The feature extraction part in the schematic diagram of the LSTM cell is converted into a mathematical expression as follows:
f = sigmoid(x*W_xf + h_{t-1}*W_hf + b_f + f_b)
i = sigmoid(x*W_xi + h_{t-1}*W_hi + b_i)
j = tanh(x*W_xj + h_{t-1}*W_hj + b_j)
o = sigmoid(x*W_xo + h_{t-1}*W_ho + b_o)
the above data calculates the function of the corresponding crossbar array, and the following is a mathematical expression of the completed function of the following auxiliary circuits of the crossbar array:
c_t = c_{t-1}*f + i*j
h_t = o*tanh(c_t)
the data in the above formula is in the form of analog voltages in the circuit.
As shown in fig. 4, the first memristive crossbar array includes a first trunk circuit including a voltage input port 1, a threshold memristor 2, a voltage inverter 3, an operational amplifier 4, and a first auxiliary circuit including an operational amplifier 4, a resistor 5, and a multiplier 6, and no electrical connection is made at the intersection of the first memristive crossbar array.
An LSTM neural network system based on a memristor crossbar array comprises n LSTM cells; that is, the system comprises n first memristor crossbar arrays. The first analog voltage comprises V_X1 to V_Xm and V_h1 to V_hn, and is input to the first trunk circuit through the voltage input ports. One threshold memristor can only express a single positive weight; in the invention, among any two adjacent voltage input ports, one port is connected to the threshold memristor through a voltage inverter while the other is connected directly, so that two adjacent threshold memristors can express one weight with a positive or negative value.
M1 and M2 are both threshold memristors. V_s1+ is a 1 V DC voltage and V_s1- is grounded; V_s2+ is a 1 V DC voltage and V_s2- is a -1 V DC voltage. The resistances of R1, R2, R3, R4, and R5 must be equal, and values from 1 kΩ to 10 kΩ can be used. The operational amplifier connected in parallel with threshold memristor M1 realizes the Sigmoid activation function and converts the current signal into a voltage signal; the operational amplifier connected in parallel with threshold memristor M2 realizes the hyperbolic tangent activation function and converts the current signal into a voltage signal. One end of the voltage inverter is connected to the output of the operational amplifier and the other end to the input of the multiplier, inverting the sign of the voltage.
V_c(t-1) represents the state of the first memristor crossbar array at the previous time step, V_ct represents its current state, and V_ht represents the first current. The function completed by the first auxiliary circuit is expressed as:
c_t = c_{t-1}*f + i*j
h_t = o*tanh(c_t)
as shown in fig. 5, the second memristive crossbar array includes a second trunk circuit and a second auxiliary circuit, the second trunk circuit includes voltage input ports, threshold memristors, and voltage inverters, one voltage input port is connected to the threshold memristor through the voltage inverter among any adjacent voltage input ports, and the other voltage input port is directly connected to the threshold memristor. The intersection point of the second memristor cross array is not electrically connected, the second auxiliary circuit is a voltage comparator, and the voltage comparator is used for comparing a plurality of analog voltages to obtain a comparison result of the plurality of analog voltages; using the maximum value of the comparison results as the classification result V of the digital signal of the input layero
The invention solves the difficult problem of hardware design for the LSTM neural network: while introducing memristors into the circuit design, it provides a hardware implementation method that includes the activation functions, giving a complete implementation scheme for an LSTM neural network based on memristor crossbar arrays. Owing to the nanoscale size of the memristor and the high density of the crossbar array, the hardware circuit of this design is small in size, easy to integrate, and well suited to wide adoption in deep neural network hardware design. In addition, the low-power characteristic of the memristor makes the power consumption of the whole system significantly lower than that of CMOS-based implementations.
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (3)

1. An LSTM neural network system based on a memristor crossbar array, comprising: an input layer, a feature extraction layer, and a classification output layer, characterized in that
the feature extraction layer includes: the device comprises a data memory, a first memristor cross array, a first DA converter and an AD converter, wherein the data memory is used for storing a digital signal of an input layer and a digital signal after feature extraction; the first DA converter is used for converting the digital signal of the input layer into a first analog voltage; the first memristor cross array is used for carrying out feature extraction on the first analog voltage to obtain a first current; the AD converter is used for converting the first current into a digital signal after characteristic extraction;
the classification output layer includes: the second DA converter, the second memristor cross array and the voltage comparator; the second DA converter is used for converting the digital signal after the characteristic extraction into a second analog voltage; the second memristor cross array is used for carrying out feature classification on the second analog voltage to obtain a plurality of analog voltages; the voltage comparator is used for comparing the plurality of analog voltages to obtain a comparison result of the plurality of analog voltages; taking the maximum value in the comparison results as the classification result of the digital signals of the input layer;
the first memristive crossbar array includes a voltage input port, two threshold memristors, a voltage inverter, an operational amplifier, and a multiplier,
in any two adjacent voltage input ports, one voltage input port is connected with the threshold memristor through the voltage inverter, and the other voltage input port is directly connected with the threshold memristor, so that the two adjacent threshold memristors can express a weight with positive and negative values; the operational amplifier is connected with one threshold memristor in parallel and used for achieving the operational function of a Sigmoid activation function and converting a current signal into a voltage signal, and the operational amplifier is connected with the other threshold memristor in parallel and used for achieving the operational function of a hyperbolic tangent activation function and converting the current signal into the voltage signal; one end of the voltage phase inverter is connected with the output end of the operational amplifier, and the other end of the voltage phase inverter is connected with the input end of the multiplier and used for converting the direction of the voltage.
2. The LSTM neural network system based on a memristive crossbar array, according to claim 1, wherein the second memristive crossbar array comprises a voltage input port, a threshold memristor and a voltage inverter, one voltage input port of any two adjacent voltage input ports is connected with the threshold memristor through the voltage inverter, and the other voltage input port is directly connected with the threshold memristor.
3. An LSTM neural network system based on memristive crossbar arrays, according to claim 1, wherein the intersections of the first memristive crossbar array are not electrically connected, and the intersections of the second memristive crossbar array are not electrically connected.
CN201811236611.0A 2018-10-23 2018-10-23 LSTM neural network system based on memristor cross array Expired - Fee Related CN109472348B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811236611.0A CN109472348B (en) 2018-10-23 2018-10-23 LSTM neural network system based on memristor cross array


Publications (2)

Publication Number Publication Date
CN109472348A CN109472348A (en) 2019-03-15
CN109472348B true CN109472348B (en) 2022-02-18

Family

ID=65664136

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811236611.0A Expired - Fee Related CN109472348B (en) 2018-10-23 2018-10-23 LSTM neural network system based on memristor cross array

Country Status (1)

Country Link
CN (1) CN109472348B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10984860B2 (en) 2019-03-26 2021-04-20 Hewlett Packard Enterprise Development Lp Self-healing dot-product engine
CN112749784B (en) * 2019-10-31 2024-05-14 华为技术有限公司 Computing device and acceleration method of neural network
CN111460365B (en) * 2020-03-10 2021-12-03 华中科技大学 Equation set solver based on memristive linear neural network and operation method thereof
CN111680792A (en) * 2020-06-18 2020-09-18 中国人民解放军国防科技大学 Activation function circuit, memristor neural network and control method of memristor neural network
CN112884141B (en) * 2021-04-16 2022-10-21 安徽大学 Memristive coupling Hindmarsh-Rose neuron circuit
CN113988281A (en) * 2021-10-26 2022-01-28 重庆因普乐科技有限公司 Long-short time memory network implementation method based on memristor structure
CN116523011B (en) * 2023-07-03 2023-09-15 中国人民解放军国防科技大学 Memristor-based binary neural network layer circuit and binary neural network training method

Citations (3)

Publication number Priority date Publication date Assignee Title
CN104916313A (en) * 2015-06-16 2015-09-16 清华大学 Neural network synapse structure based on memristive devices and synaptic weight building method
CN106815636A (en) * 2016-12-30 2017-06-09 华中科技大学 A kind of neuron circuit based on memristor
CN107273972A (en) * 2017-05-11 2017-10-20 北京大学 It is a kind of based on resistive device and to adapt to excite the neuromorphic system and implementation method of neuron

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US9152827B2 (en) * 2012-12-19 2015-10-06 The United States Of America As Represented By The Secretary Of The Air Force Apparatus for performing matrix vector multiplication approximation using crossbar arrays of resistive memory devices


Non-Patent Citations (1)

Title
Kamilya Smagulova et al., "Design of CMOS-memristor Circuits for LSTM architecture," 2018 IEEE International Conference on Electron Devices and Solid State Circuits (EDSSC), 30 June 2018, pp. 1-3. *

Also Published As

Publication number Publication date
CN109472348A (en) 2019-03-15

Similar Documents

Publication Publication Date Title
CN109472348B (en) LSTM neural network system based on memristor cross array
Yang et al. Transiently chaotic simulated annealing based on intrinsic nonlinearity of memristors for efficient solution of optimization problems
Kim et al. A digital neuromorphic VLSI architecture with memristor crossbar synaptic array for machine learning
JP2021185479A (en) Apparatus for performing in-memory processing, and computing device including the same
CN113627601B (en) Subunit, MAC array and bit width reconfigurable analog-digital mixed memory internal computing module
CN110543933A (en) Pulse type convolution neural network based on FLASH memory array
CN105160401A (en) WTA neural network based on memristor array and application thereof
CN107545305B (en) CMOS (complementary metal oxide semiconductor) process-based digital-analog mixed charge domain neuron circuit
US20220179658A1 (en) Refactoring Mac Operations
US20230297839A1 (en) Deep learning in bipartite memristive networks
US20200327401A1 (en) Computing circuitry
CN105739944A (en) Multi-system additive operation circuit based on memristors and operation method thereof
CN110751279A (en) Ferroelectric capacitance coupling neural network circuit structure and multiplication method of vector and matrix in neural network
CN115630693A (en) Memristor self-learning circuit based on Elman neural network learning algorithm
CN112734022B (en) Four-character memristor neural network circuit with recognition and sequencing functions
CN112396176B (en) Hardware neural network batch normalization system
CN109711537B (en) Prediction circuit based on memristor neural network
CN110768660A (en) Memristor-based reversible logic circuit and operation method
CN114093394B (en) Rotatable internal computing circuit and implementation method thereof
Zhao et al. Intra-array non-idealities modeling and algorithm optimization for RRAM-based computing-in-memory applications
Guo et al. A multi-conductance states memristor-based cnn circuit using quantization method for digital recognition
Calayir et al. All-magnetic analog associative memory
CN117558320B (en) Read-write circuit based on memristor cross array
CN115857871B (en) Fuzzy logic all-hardware computing circuit and design method thereof
CN109543831B (en) Memristor cross array voltage division equivalent resistance state number expansion structure and related method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20220218
