CN110443356B - Current type neural network based on multi-resistance state memristor - Google Patents

Current type neural network based on multi-resistance state memristor

Info

Publication number
CN110443356B
CN110443356B (application CN201910726323.1A)
Authority
CN
China
Prior art keywords
signal
input
module
neural network
channel
Prior art date
Legal status
Active
Application number
CN201910726323.1A
Other languages
Chinese (zh)
Other versions
CN110443356A (en
Inventor
肖建
张粮
张健
王宇
童祎
郭宇锋
蔡志匡
Current Assignee
Nanjing University of Posts and Telecommunications
Original Assignee
Nanjing University of Posts and Telecommunications
Priority date
Filing date
Publication date
Application filed by Nanjing University of Posts and Telecommunications filed Critical Nanjing University of Posts and Telecommunications
Priority to CN201910726323.1A priority Critical patent/CN110443356B/en
Publication of CN110443356A publication Critical patent/CN110443356A/en
Application granted granted Critical
Publication of CN110443356B publication Critical patent/CN110443356B/en

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/044 Recurrent networks, e.g. Hopfield networks
    • G06N 3/06 Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N 3/063 Physical realisation using electronic means
    • G06N 3/08 Learning methods
    • G06N 3/084 Backpropagation, e.g. using gradient descent

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Neurology (AREA)
  • Semiconductor Integrated Circuits (AREA)
  • Semiconductor Memories (AREA)

Abstract

To meet the particular requirement of using the memristor as the core device of a neural network, the invention provides a current-mode neuron circuit built from a brain-like device, the memristor, combined with conventional devices. The circuit can emulate forward neural-network operation and adopts a 1T1R weight scheme with a memristor device and a MOS transistor as its core, which greatly reduces the on-chip resources consumed by neural-network computation. By combining further electronic devices such as MOS transistors, low-power operational amplifiers and rail-to-rail operational amplifier techniques with digital-circuit and bionic design principles on the analog-circuit side, the invention solves the design problems of signal input, the weight network, current-equivalent adder summation and the activation layers built around 1T1R cells in a neural network, realizes the processing of positive and negative signals and their transmission between network layers, and constructs a corresponding synapse weight matrix model and a multi-layer neuron network circuit.

Description

Current type neural network based on multi-resistance state memristor
Technical Field
The invention relates to the technical field of neural networks, in particular to a current-mode neural network system based on multi-resistance-state memristors, and more particularly to a neural-network system built as a hardware circuit around multi-resistance-state memristors together with low-power operational amplifiers (LP-amp), rail-to-rail (Rail-to-Rail-amp) operational amplifier techniques and register techniques.
Background
The development of hardware artificial neural networks with adjustable parameters is an inevitable trend in modern neural-network technology; over the past decade, academic researchers have been devoted to developing efficient hardware neural networks or the implementation environments they require.
The brain works with very high efficiency: although its neural-network density is very high, its energy consumption is extremely low compared with a computer chip. Experiments show that the local response rate of a single neuron is on the order of 10 Hz, so the brain's fast operation comes mainly from computation in a parallel structure and from the coding dynamics of neurons. This gives an important hint on how to apply large numbers of memristive devices within a reasonable structure.
In biology, action potential signals are transmitted in spikes along the nerve fiber membrane, which has a very similar effect from a microscopic point of view to pulse signals in an electric circuit. In practical circuits, the signals in an artificial neural network are implemented as stable threshold signals or waveforms.
To reduce the complexity of the research, the functions of biological neural networks are screened for their main features and implemented accordingly, i.e. a few effective signal transmissions are used to realize the principal functions such as classification, recognition and memory.
Disclosure of Invention
The purpose of the invention is as follows: to meet the requirement of using the memristor as the core device of a neural network, the invention provides a current-mode neuron circuit built from a brain-like device, the memristor, combined with conventional devices. The circuit can emulate forward neural-network operation and adopts a 1T1R weight scheme with a memristor and a MOS (metal-oxide-semiconductor) transistor as its core, greatly reducing the on-chip resources consumed by neural-network computation.
The technical scheme adopted by the invention is as follows:
a current type neural network based on multi-resistance state memristor is characterized in that: the device comprises a signal input module, a weight network module, a polarity register module, a bias signal module, a current equivalent addition module and an activation function module;
the signal input module is a three-port absolute value circuit and inputs a signal VpluseDecomposed into output modulus signal | VpluseI and a polarity characteristic signal S' are input to the weight network module; the weight network module comprises a plurality of 1T1R memristor weight units; the polarity registerThe module is respectively connected with the weight network module and the bias signal module; the bias signal module inputs the output bias current signal and the current signal output by the weight network module to the current equivalent addition module; the current equivalent addition module calculates the input current and inputs the calculation result to the activation function module; the activation function module activates the operation result and inputs the final result into the next layer of neural network;
the 1T1R memristor weight cell includes: the memristor module circuit Mem, an exclusive-nor gate circuit, a phase inverter, a PMOS (P-channel metal oxide semiconductor) tube and an NMOS (N-channel metal oxide semiconductor) tube are connected in the following 3 ways:
mode 1: two same-type MOS tubes are adopted, the S pole of the MOS tube is connected with a memristor Mem, and an input signal VpluseInput by MOS tube D pole; the input end of the exclusive-nor circuit is connected with the polarity characteristic signal S' and the polarity register module (3), the output end of the exclusive-nor circuit is respectively connected with the G pole of one MOS tube and the input end of the phase inverter, and the output end of the phase inverter is connected with the G pole of the other MOS tube;
mode 2: a P-type MOS tube and an N-type MOS tube are adopted, the S poles of the two MOS tubes are connected with a memristor Mem, and an input signal V is inputpluseInput by MOS tube D pole; the input end of the exclusive OR gate circuit is connected with the polarity characteristic signal S' and the polarity register module (3), and the output end of the exclusive OR gate circuit is respectively connected with the G poles of the two MOS tubes;
mode 3: two same-type MOS tubes are adopted, the S pole of the MOS tube is connected with a memristor Mem, and an input signal VpluseInput by MOS tube D pole; the polarity register module is respectively connected with the G pole of the MOS tube and the input end of the phase inverter, and the output end of the phase inverter is connected with the G pole of the other MOS tube;
the 1T1R memristor weight cell function realization method is as follows:
1) editing the positive and negative of the weight matrix in a polarity register W, and inputting the positive and negative of the weight matrix into a 1T1R unit through a pin;
2) when the circuit connection adopts the mode 1 or the mode 2, the input signal module converts the modulus value signal | VpluseI and polarity characteristic signals S' are input into the 1T1R memristor weight unit through pins, module value signal operation is carried out, and a junction is connectedOutputting the result to a Pos _ channel or a Neg _ channel, wherein the channel for outputting the result is determined by the polarity register module and the polarity characteristic signal S';
3) when the circuit connection adopts the mode 3, the 1T1R memristor weight cell matched modulus signal | VpluseI, calculating and outputting a result to a Pos _ channel or a Neg _ channel, wherein the channel for outputting the result is independently determined by a polarity register module;
4) after the signals input into each 1T1R unit are processed, the signals are all calculated in the mode, and the result is output to a Pos _ channel or a Neg _ channel to realize signal convergence;
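For reference, the channel-selection behaviour of steps 1) to 4) can be summarised in a short behavioural model. The Python sketch below is an illustrative assumption rather than the circuit itself: the simple Ohmic series model, the name weight_cell, the parameter r_on and its default value are all introduced here for illustration only.

```python
def weight_cell(v_modulus, s_prime, w_polarity, g_mem, r_on=100.0):
    """Behavioural sketch of one 1T1R weight cell in mode 1 or mode 2.

    v_modulus  : modulus |V_pulse| of the input signal (assumed in volts)
    s_prime    : polarity characteristic signal S' (1 = positive input, 0 = negative)
    w_polarity : sign bit stored in the polarity register W (1 = positive weight)
    g_mem      : programmed conductance of the memristor Mem (assumed in siemens)
    r_on       : assumed on-resistance of the selecting MOS transistor (ohms)
    Returns (i_pos, i_neg), the contributions to Pos_channel and Neg_channel.
    """
    i_cell = v_modulus / (1.0 / g_mem + r_on)  # series memristor + MOS on-resistance
    if s_prime == w_polarity:                  # XNOR of S' and W: signs agree
        return i_cell, 0.0                     # product is positive -> Pos_channel
    return 0.0, i_cell                         # product is negative -> Neg_channel
```

With this convention, the sum of all i_pos contributions of a row converges on the Pos_channel and the sum of all i_neg contributions on the Neg_channel, matching step 4).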
the system building steps of the current type neural network based on the multi-resistance state memristor are as follows:
l1, after the neural network training of the upper computer is finished, programming the designated position of the memristor synapse weight in the built neural network system according to the data;
l2, programming the memristor units in the weight network, inputting the positive and negative weights in the weight network into the polarity register, and completing the setting of the working voltage;
l3, after the programming of the circuit is completed, transmitting the input signal to a signal input module, and separating the modulus and the polarity of the input signal by the signal input module;
l4, inputting a modulus signal and a polarity characteristic signal into a weight network, wherein the modulus signal is operated through a 1T1R memristor weight unit, an output current signal is controlled by a polarity register module and the polarity of the input signal together, and the output current signal is selected to flow into a Pos _ channel or a Neg _ channel;
and L5, performing signal input on the bias signal module, and inputting the calculated bias current signal into the Pos _ channel and the Neg _ channel.
L6, the current equivalent addition module receives current signals in the channels Pos _ channel and Neg _ channel, the internal current adder node Neronal _ node unit carries out accumulation summation on the current signals in the channels Pos _ channel and Neg _ channel, and finally a result V is outputoutxOutputting the data to the activating function module;
l7, activate functional Unit Pair output result VoutxAnd performing activation processing, and inputting the output signal into an input module in the next layer of neural network.
L8, a plurality of layers of connections of the neural network are realized through the network connection process, and a multilayer neural network system is constructed.
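To make the flow of steps L1 to L8 concrete, the following minimal numerical sketch runs one layer end to end under stated assumptions: the Ohmic 1T1R current model, the single constant k_gain standing in for the op-amp gains of the current-equivalent adder, and the tanh activation are illustrative choices, and the names (layer_forward, r_on, k_gain) are not taken from the patent.

```python
import numpy as np

def layer_forward(v_in, g_mem, w_sign, i_bias_pos=0.0, i_bias_neg=0.0,
                  r_on=100.0, k_gain=1.0, activation=np.tanh):
    """One layer: signal input module -> 1T1R weight network -> bias ->
    current-equivalent addition -> activation (behavioural sketch only).
    v_in   : signed input vector (one entry per input node)
    g_mem  : memristor conductances, shape (n_out, n_in)
    w_sign : +1/-1 weight polarities held in the polarity register, same shape
    """
    v_mod = np.abs(v_in)                          # L3: modulus |V_pulse|
    s_prime = (v_in >= 0)                         # L3: polarity characteristic S'
    i_cell = v_mod / (1.0 / g_mem + r_on)         # L4: per-cell current
    route_pos = (w_sign > 0) == s_prime           # L4: XNOR routing per cell
    i_pos = np.where(route_pos, i_cell, 0.0).sum(axis=1) + i_bias_pos   # L5
    i_neg = np.where(~route_pos, i_cell, 0.0).sum(axis=1) + i_bias_neg  # L5
    v_out = k_gain * (i_pos - i_neg)              # L6: current-equivalent addition
    return activation(v_out)                      # L7: activation function module

# Example: a 6-input, 4-node layer with random conductances (FIG. 10 shape)
rng = np.random.default_rng(0)
y = layer_forward(rng.uniform(-1.0, 1.0, 6),
                  rng.uniform(1e-4, 1e-3, (4, 6)),
                  rng.choice([-1, 1], (4, 6)))
```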
Further, the bias signal module of step L5 comprises an inverter, a memristor and two MOS transistors of the same type. The input level V_bias is applied at the D terminal of the MOS transistors, and the S terminal of each MOS transistor is connected to a memristor cell Mem. The polarity register W provides the gate signal V_gs, one end of which is connected to the input of the inverter and the other to the G terminal of one MOS transistor, while the inverter output is connected to the G terminal of the other MOS transistor;
the output current signal I_bias satisfies the relation
I_bias = V_bias / (r_NM + M)   (1)
where V_bias is the input level of the bias signal unit, r_NM is the on-resistance of the MOS transistor, and M is the resistance of the memristor Mem in series with the MOS transistor;
by controlling the polarity register W, the channel for the bias input signal V_bias is selected, so that the corresponding current signal I_bias is output into the corresponding Pos_channel or Neg_channel.
Further, the current-equivalent addition module of step L6 comprises low-power operational amplifiers LP-amp1, LP-amp2 and LP-amp3; the current signal I+ passing through the Pos_channel is fed to LP-amp1 through a pin, and the current signal I- passing through the Neg_channel is fed to LP-amp2 through a pin; the outputs of LP-amp1 and LP-amp2 are connected to the positive and negative inputs of LP-amp3, respectively, and the negative input is additionally provided with a grounding resistor Rp. The input signals satisfy the following conditions:
I+ = Σ_i I+_i   (2)
I- = Σ_i I-_i   (3)
where I+ is the total current through the Pos_channel, I- is the total current through the Neg_channel, I+_i is the current signal flowing into the Pos_channel from the corresponding Synapse unit, and I-_i is the current signal flowing into the Neg_channel from the corresponding Synapse unit.
After I+ flows through the Pos_channel into the channel pin, the positive input voltage of LP-amp3 is
V+ = -K+ * Rfp * I+'   (4)
where K+ is the gain factor of operational amplifier LP-amp1 and Rfp is its feedback resistor.
After I- flows through the Neg_channel into the channel pin, the negative input voltage of LP-amp3 is
V- = -K- * Rfn * I-'   (5)
where K- is the gain factor of operational amplifier LP-amp2 and Rfn is its feedback resistor.
For the output signal, the feedback resistances Rfp and Rfn are adjusted so that the currents I+', I-' flowing through Rfp and Rfn satisfy:
[Equation (6): relation between the currents I+', I-' through Rfp, Rfn and the channel currents; given as an image in the original]
After this adjustment the output signal satisfies:
[Equation (7): expression for the adjusted output in terms of the gain coefficients; given as an image in the original]
where K+, K- and KA are the gain coefficients of LP-amp1, LP-amp2 and LP-amp3, respectively.
When the following condition is satisfied:
[Equation (8): condition on the gain coefficients and feedback resistances; given as an image in the original]
then, by adjusting the gain factor KA, the final output result Vout of the k-th node is:
[Equation (9): expression for Vout of the k-th node; given as an image in the original]
where Vout is the input signal of the activation function unit.
For each different operation required at a node output there is a different KA; the corresponding output effect Vout can be realized by adjusting how the memristor outputs are set. All nodes of an output layer receive the signals Vout derived from the input-layer signals and feed the final result into the activation layer (an assumed numerical sketch of this adder follows).
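Since equations (6) to (9) are only available as images, the following sketch encodes one plausible reading of the current-equivalent adder: LP-amp1 and LP-amp2 are treated as transimpedance stages implementing relations (4) and (5), LP-amp3 as a difference stage with gain KA, and the condition of equation (8) is assumed to be the symmetry K+*Rfp = K-*Rfn, so that the node output becomes proportional to (I+ - I-). All gains, resistances and the function name are illustrative assumptions, not values from the patent.

```python
def current_adder(i_pos, i_neg, k_pos=1.0, k_neg=1.0,
                  r_fp=10e3, r_fn=10e3, k_a=1.0):
    """Assumed behavioural model of the current-equivalent addition module."""
    v_plus = -k_pos * r_fp * i_pos     # relation (4): output of LP-amp1
    v_minus = -k_neg * r_fn * i_neg    # relation (5): output of LP-amp2
    # LP-amp3 difference stage (assumed): with k_pos*r_fp == k_neg*r_fn the
    # result is proportional to (I+ - I-), i.e. the signed weighted sum.
    return k_a * (v_minus - v_plus)

v_out = current_adder(i_pos=2e-4, i_neg=5e-5)   # I+ > I-  ->  positive V_out
```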
Further, the specific steps of constructing the multilayer neural network system in step L8 are as follows:
M1. when a neural network with multiple node inputs is selected, a signal input module is placed at the input interface of the neural network, which facilitates the input of signals of either polarity and the matrix operation;
M2. for the subsequent neural-network layers, every layer is provided with a signal input module; the polarity of the weight network module is controlled directly by the polarity of the input signal together with the polarity stored in the register, and connection mode 1 or mode 2 is selected for the 1T1R memristor weight cells;
M3. after the input layer of the neural network, the output result is fed into the next neural-network layer containing a signal input module; every subsequent layer performs the same operations, so that a multi-layer neural network with a large number of nodes per layer is realized.
In particular, when the activation function module uses a ReLU or Sigmoid activation function, a simplified design can be adopted; the steps for constructing the multi-layer neural-network system are then as follows (a stacking sketch is given after this list):
S1. when a neural network with multiple node inputs is selected, a signal input module (1) is placed at the input interface of the neural network, which facilitates the input of signals of either polarity and the matrix operation;
S2. for the subsequent neural-network layers the signal input module is removed, the polarity register W alone controls the polarity of the weight network module, and connection mode 3 is selected for the 1T1R memristor weight cells;
S3. after the input layer of the neural network, the output result is fed into the neural-network layers without a signal input module; every subsequent layer performs the same operations, so that a multi-layer neural network with many nodes per layer is realized.
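The point of steps S1 to S3 is that a ReLU (or a Sigmoid mapped onto non-negative levels) never produces a negative output, so only the first layer needs the absolute-value signal input module; deeper layers can be driven directly and let the register W alone steer the channels (mode 3). The sketch below illustrates this stacking; the inner function signed_layer is an assumed behavioural layer model, and none of the names or numeric defaults come from the patent.

```python
import numpy as np

def signed_layer(x, g_mem, w_sign, r_on=100.0, k_gain=1.0):
    # assumed 1T1R layer model: modulus/polarity split, XNOR routing,
    # current-equivalent subtraction of the two channels
    i_cell = np.abs(x) / (1.0 / g_mem + r_on)
    route_pos = (w_sign > 0) == (x >= 0)
    i_pos = np.where(route_pos, i_cell, 0.0).sum(axis=1)
    i_neg = np.where(~route_pos, i_cell, 0.0).sum(axis=1)
    return k_gain * (i_pos - i_neg)

def simplified_network(v_in, layers):
    """layers: list of (g_mem, w_sign) pairs, one per layer (steps S1-S3)."""
    x = v_in                                    # S1: first layer sees signed input
    for g_mem, w_sign in layers:
        x = np.maximum(signed_layer(x, g_mem, w_sign), 0.0)   # ReLU
        # x is now non-negative, so no input module is needed downstream (S2, S3)
    return x
```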
Advantages: the invention provides a current-mode neuron circuit built from a brain-like device, the memristor, combined with conventional devices; it can emulate forward neural-network operation and adopts a 1T1R weight scheme with the memristor device and a MOS transistor as its core, greatly reducing the on-chip resources consumed by neural-network computation. Relying on the excellent switching performance of high-performance MOS transistors and the multi-resistance-state characteristic of the memristor, the following bionic functions and related operations of a neural network are realized:
1. the storage and memory function of synapses;
2. the reception of positive and negative information by the neuron cell;
3. an editable synaptic weight network, which directly reduces some non-dominant signals in the neural network.
According to the invention, a novel hardware neural network is built with memristors, exploiting their excellent programmability to store data information effectively. Combining the memristor's characteristics with the weight storage of the specific circuit effectively improves the computing power and efficiency of conventional artificial neural networks. After performance optimization, the neural-network circuit of the invention is also applicable to various neural-network models and to some common neural-network accelerators, which is of great significance and value for the further development and application of artificial-intelligence technology.
Drawings
FIG. 1 is a circuit model diagram of a signal input module;
FIG. 2a is a diagram of a 1T1R memristor weight cell under an NMOS transistor;
FIG. 2b is a diagram of a 1T1R memristor weight cell under a PMOS transistor;
FIG. 3a is a schematic diagram of a circuit structure of a 1T1R weighting unit under a homotype MOS transistor;
FIG. 3b is a schematic diagram of a circuit structure of a 1T1R weighting unit under different types of MOS transistors;
FIG. 4 is a schematic diagram of an appearance principle of a 1T1R memristor weight cell;
FIG. 5 is a schematic diagram of a circuit structure of a bias signal unit;
FIG. 6 is a schematic diagram of the appearance of the bias signal unit;
FIG. 7 is a schematic diagram of a Synapse unit structure;
FIG. 8 is a schematic diagram of a current equivalent adder circuit;
FIG. 9 is a schematic diagram of the appearance of a current equivalent adder unit;
fig. 10 is a schematic diagram of the current mode neural network structure of 6 x 4;
fig. 11 is a schematic diagram of a 4 x 4 current mode neural network structure (with ReLU activation units in the inner layer);
fig. 12 is a schematic diagram of a current mode neural network structure of 6 x 4 x 2.
Detailed Description
The embodiments of the present invention will be further explained with reference to the drawings.
A current-mode neural network based on multi-resistance-state memristors comprises a signal input module 1, a weight network module 2, a polarity register module 3, a bias signal module 4, a current-equivalent addition module 5 and an activation function module 6.
As shown in FIG. 1, the signal input module 1 is a three-port absolute-value circuit that decomposes the input signal V_pulse into a modulus signal |V_pulse| and a polarity characteristic signal S', which are fed to the weight network module 2 (a behavioural sketch of this decomposition follows). The weight network module 2 comprises a plurality of 1T1R memristor weight cells. The polarity register module 3 is connected to the weight network module 2 and to the bias signal module 4 and controls the polarity of the output current signal. The bias signal module 4 feeds its output bias current signal, together with the current signal output by the weight network module 2, to the current-equivalent addition module 5. The current-equivalent addition module 5 accumulates the input currents and passes the result to the activation function module 6. The activation function module 6 applies the activation to the result and feeds the final output into the next neural-network layer.
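The behaviour of the three-port absolute-value circuit can be summarised in two lines; the sketch below is an illustrative software analogue, and the convention S' = 1 for a non-negative input is assumed rather than taken from FIG. 1.

```python
def signal_input(v_pulse):
    """Split the signed input V_pulse into (|V_pulse|, S')."""
    return abs(v_pulse), 1 if v_pulse >= 0 else 0

print(signal_input(-0.7))   # -> (0.7, 0): modulus 0.7, negative polarity
```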
The 1T1R memristor weight cell comprises a memristor Mem, an exclusive-NOR (XNOR) gate, an inverter, a PMOS transistor and an NMOS transistor, and may be connected in one of the following three ways:
Mode 1: as shown in FIG. 3a, the 1T1R memristor model circuit comprises a memristor Mem, an exclusive-NOR gate (Xnor-gate), an inverter (not-gate) and two MOS transistors of the same type, i.e. both N-type or both P-type. The polarity characteristic signal S' and the polarity signal of the polarity register W are fed to the XNOR gate; its output is connected to the gate G of one MOS transistor and to the input of the inverter, whose output is connected to the gate G of the other MOS transistor, and the S terminal of each MOS transistor is connected to the memristor Mem. The input signal V_pulse is applied at the D terminal. Depending on whether V_gs opens or closes each transistor, the signal is steered into the Pos_channel or the Neg_channel.
Mode 2: as shown in FIG. 3b, the 1T1R memristor model circuit comprises a memristor Mem, an exclusive-NOR gate (Xnor-gate), a PMOS transistor and an NMOS transistor. The polarity characteristic signal S' and the polarity signal output by the polarity register W are fed to the XNOR gate, whose output is connected to the gate G of the PMOS transistor and to the gate G of the NMOS transistor; the S terminals of the two MOS transistors are each connected to the memristor Mem. The input signal V_pulse is applied at the D terminal. Depending on whether V_gs opens or closes each transistor, the signal is steered into the Pos_channel or the Neg_channel.
Mode 3: the 1T1R memristor model circuit comprises two MOS transistors of the same type, a memristor Mem and an inverter (not-gate). The polarity register module is connected to the gate G of one MOS transistor and to the input of the inverter, whose output is connected to the gate G of the other MOS transistor. The S terminals of the two MOS transistors are each connected to the memristor Mem. In this mode the 1T1R memristor model circuit is arranged in the same way as the bias signal module circuit.
The system building steps of the current-mode neural network based on multi-resistance-state memristors are as follows:
L1. after the neural network has been trained on the host computer, the memristor synapse weights at the designated positions in the assembled neural-network system are programmed according to the trained data.
L2. the memristor cells in the weight network are programmed, the signs of the weights are written into the polarity register, and the operating voltages are set.
L3. once the circuit has been programmed, the input signal is delivered to the signal input module, which separates its modulus and polarity.
L4. the modulus signal and the polarity characteristic signal are fed into the weight network; the modulus signal is operated on by the 1T1R memristor weight cells, and the output current signal, controlled jointly by the polarity register module and the polarity of the input signal, is steered into the Pos_channel or the Neg_channel.
The 1T1R memristor model circuit operates as follows:
1) the signs of the weight matrix are written into the polarity register W and fed to each 1T1R cell through a pin;
2) when connection mode 1 or mode 2 is used, the signal input module feeds the modulus signal |V_pulse| and the polarity characteristic signal S' into the 1T1R memristor weight cell through pins; the modulus operation is performed and the result is output into the Pos_channel or Neg_channel, the output channel being determined jointly by the polarity register module and the polarity characteristic signal S'. The relation between the polarity register output signal W' and the polarity characteristic signal S' is given in Table 1:
[Table 1: channel selection as a function of the polarity register output signal W' and the polarity characteristic signal S'; given as an image in the original]
3) when connection mode 3 is used, the 1T1R memristor weight cell operates on the modulus signal |V_pulse| and outputs the result to the Pos_channel or Neg_channel, the output channel being determined by the polarity register output signal W' alone;
4) the signals fed to every 1T1R cell are processed in the manner described above, and the results are output to the Pos_channel or Neg_channel channels, where the signals converge.
L5. the bias signal module receives its input signal, and the resulting bias current is fed into the Pos_channel and Neg_channel channels.
As shown in FIGS. 5 to 7, the bias signal module comprises an inverter (not-gate), a memristor and two MOS transistors of the same type. The input level V_bias is applied at the D terminal of the MOS transistors, and the S terminal of each MOS transistor is connected to a memristor cell Mem. If an NMOS transistor is connected in series with the memristor, the input signal is applied at the D terminal and the S terminal is connected to the memristor series branch, whose output goes to the corresponding channel; if a PMOS transistor is connected in series with the memristor, the input signal is applied at the S terminal and the D terminal is connected to the memristor series branch, whose output goes to the corresponding channel. The polarity register W provides the gate input signal V_gs, one end of which is connected to the input of the inverter and the other to the G terminal of one MOS transistor, while the inverter output is connected to the G terminal of the other MOS transistor.
output current signal IbiasThe following relationship is satisfied:
Figure RE-GDA0002191977160000091
wherein, VbiasTo bias the input level of the signal unit, rNMThe resistance is the resistance when the MOS tube is opened, and M is the resistance of the MOS tube in series connection with the memristor Mem;
implementing input signal b by controlling polarity register modulo WiasSelecting channels to output corresponding current signals IxiasAnd inputting the signals into the corresponding Pos _ channel or Neg _ channel.
Depending on the type of MOS transistor chosen, the corresponding output signal differs; the specific settings are as follows (a numerical sketch of the bias branch follows the tables):
(1) the bias setting mode under NMOS is shown in table 2:
[Table 2: bias settings for the NMOS case; given as an image in the original]
(2) The bias setting mode under PMOS is shown in table 3:
[Table 3: bias settings for the PMOS case; given as an image in the original]
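A minimal numerical sketch of the bias branch, assuming relation (1) above and a simple channel selection by the register bit W; the function name, the routing convention (Pos_channel for W = 1) and the example values are illustrative assumptions, since Tables 2 and 3 are only available as images.

```python
def bias_branch(v_bias, r_nm, m_mem, w_polarity):
    """Bias current per relation (1): I_bias = V_bias / (r_NM + M).
    w_polarity (the register bit W) decides which channel receives it;
    routing to Pos_channel for W = 1 is an assumed convention."""
    i_bias = v_bias / (r_nm + m_mem)
    return (i_bias, 0.0) if w_polarity else (0.0, i_bias)

# Example: 0.5 V bias, 50 ohm MOS on-resistance, 10 kohm memristance
i_pos, i_neg = bias_branch(0.5, 50.0, 10e3, w_polarity=1)
```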
L6. the current-equivalent addition module receives the current signals from the Pos_channel and Neg_channel channels; its internal current-adder node (Neronal_node unit) accumulates and sums them and finally outputs the result V_outx to the activation function module.
The current-equivalent addition module is shown in FIGS. 8-9; its input signals must satisfy the following conditions:
I+ = Σ_i I+_i   (2)
I- = Σ_i I-_i   (3)
where I+ is the total current through the Pos_channel, I- is the total current through the Neg_channel, I+_i is the current signal flowing into the Pos_channel from the corresponding neuron synapse (Synapse1), and I-_i is the current signal flowing into the Neg_channel from the corresponding neuron synapse (Synapse1).
After I+ flows through the Pos_channel to the channel pin, the input voltage at the left end of R+ is
V+ = -K+ * Rfp * I+'   (4)
where K+ is the gain factor of operational amplifier LP-amp1 and Rfp is its feedback resistor.
After I- flows through the Neg_channel to the channel pin, the input voltage at the left end of R- is
V- = -K- * Rfn * I-'   (5)
where K- is the gain factor of operational amplifier LP-amp2 and Rfn is its feedback resistor.
For the output signal, the feedback resistances Rfp and Rfn are adjusted so that the currents I+', I-' flowing through Rfp and Rfn satisfy:
[Equation (6): relation between the currents I+', I-' through Rfp, Rfn and the channel currents; given as an image in the original]
After this adjustment the output signal satisfies:
[Equation (7): expression for the adjusted output in terms of the gain coefficients; given as an image in the original]
where K+, K- and KA are the gain coefficients of LP-amp1, LP-amp2 and LP-amp3, respectively.
When the following condition is satisfied:
[Equation (8): condition on the gain coefficients and feedback resistances; given as an image in the original]
then, by adjusting the gain factor KA, the final output result Vout of the k-th node is:
[Equation (9): expression for Vout of the k-th node; given as an image in the original]
where Vout is the input signal of the activation-function unit (Activation-function).
For each different operation required at a node output there is a different KA; the corresponding output effect Vout can be realized by adjusting how the memristor outputs are set. All nodes of an output layer receive the signals Vout derived from the input-layer signals and feed the final result into the activation layer.
L7. the activation function unit applies the activation to the result V_outx and feeds the output signal into the input module of the next neural-network layer.
L8. the multi-layer connections of the neural network are realized through the above network-connection procedure, building a multi-layer neural-network system.
Two examples of multi-layer neural network connections are provided below:
as shown in fig. 12, when any activation function is selected as the activation function, the specific steps of constructing the multilayer neural network system are as follows:
M1. when a neural network with multiple node inputs is selected, a signal input module (absolute-value circuit module) is placed at the input interface of the neural network, which facilitates the input of signals of either polarity and the matrix operation;
M2. for the subsequent neural-network layers, every layer is provided with a signal input module (absolute-value circuit module); the polarity of the weight network module is controlled directly by the polarity of the input signal together with the weight polarity stored in the register, and connection mode 1 or mode 2 is selected for the 1T1R memristor weight cells;
M3. after the input layer of the neural network, the output result is fed into the next neural-network layer, which again contains a signal input module (absolute-value circuit module); every subsequent layer performs the same operations, so that a multi-layer neural network with a large number of nodes per layer, for example m x n x q, can be realized (a module-count sketch follows).
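As a rough aid to sizing an m x n x q network of the kind built in steps M1 to M3, the bookkeeping below counts the main modules per layer. The assumption of one signal input module per input line and one bias branch and one current-adder node per output node is an illustrative reading of the structure, not a count given in the patent.

```python
def layer_resources(fan_in, fan_out):
    """Assumed per-layer module count for the general (mode 1/2) design."""
    return {
        "1T1R weight cells": fan_in * fan_out,
        "polarity register bits": fan_in * fan_out,
        "signal input modules": fan_in,      # absolute-value circuit per input line
        "bias branches": fan_out,            # one bias per output node (assumed)
        "current-adder nodes": fan_out,
    }

# Example: the 6 x 4 x 2 network of FIG. 12
for fan_in, fan_out in [(6, 4), (4, 2)]:
    print((fan_in, fan_out), layer_resources(fan_in, fan_out))
```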
In particular, when the activation-function unit uses a ReLU or Sigmoid activation function, a simplified design may be adopted, as shown in FIG. 11; the construction steps are as follows:
S1. when a neural network with multiple node inputs is selected, a signal input module (absolute-value circuit module) is placed at the input interface of the neural network, which facilitates the input of signals of either polarity and the matrix operation;
S2. for the subsequent neural-network layers the signal input module (absolute-value circuit module) is removed and the polarity register alone controls the polarity of the weight network module; connection mode 3 is then selected for the 1T1R memristor weight cells, and the implementation can follow the implementation steps of the bias signal module;
S3. after the input layer of the neural network, the output result is fed into the neural-network layers without a signal input module (absolute-value circuit module); every subsequent layer performs the same operations, so that a multi-layer neural network with a large number of nodes per layer, for example m x n x q, can be realized.
The foregoing is a more detailed description of the invention in connection with specific preferred embodiments and it is not intended that the invention be limited to these specific details. It should be noted that, for those skilled in the art, without departing from the principle of the present invention, several improvements and modifications can be made, and these improvements and modifications should also be construed as the protection scope of the present invention.

Claims (5)

1. A current-mode neural network based on multi-resistance-state memristors, characterized in that it comprises a signal input module (1), a weight network module (2), a polarity register module (3), a bias signal module (4), a current-equivalent addition module (5) and an activation function module (6);
the signal input module (1) is a three-port absolute-value circuit that decomposes the input signal V_pulse into a modulus signal |V_pulse| and a polarity characteristic signal S', which are fed to the weight network module (2); the weight network module (2) comprises a plurality of 1T1R memristor weight cells; the polarity register module (3) is connected to the weight network module (2) and to the bias signal module (4); the bias signal module (4) feeds its output bias current signal, together with the current signal output by the weight network module (2), to the current-equivalent addition module (5); the current-equivalent addition module (5) sums the input currents and passes the result to the activation function module (6); the activation function module (6) applies the activation to the result and feeds the final output into the next neural-network layer;
the 1T1R memristor weight cell comprises a memristor Mem, an exclusive-NOR (XNOR) gate, an inverter, a PMOS (P-channel metal-oxide-semiconductor) transistor and an NMOS (N-channel metal-oxide-semiconductor) transistor, connected in one of the following three ways:
mode 1: two MOS transistors of the same type are used; the S terminal of each MOS transistor is connected to a memristor Mem, and the input signal V_pulse is applied at the D terminal; the inputs of the XNOR gate receive the polarity characteristic signal S' and the output of the polarity register module (3), the XNOR output is connected to the G terminal of one MOS transistor and to the input of the inverter, and the inverter output is connected to the G terminal of the other MOS transistor;
mode 2: a P-type MOS transistor and an N-type MOS transistor are used; the S terminals of both transistors are connected to a memristor Mem, and the input signal V_pulse is applied at the D terminals; the inputs of the XNOR gate receive the polarity characteristic signal S' and the output of the polarity register module (3), and the XNOR output is connected to the G terminals of both MOS transistors;
mode 3: two MOS transistors of the same type are used; the S terminal of each MOS transistor is connected to a memristor Mem, and the input signal V_pulse is applied at the D terminal; the polarity register module is connected to the G terminal of one MOS transistor and to the input of the inverter, and the inverter output is connected to the G terminal of the other MOS transistor;
the 1T1R memristor weight cell operates as follows:
1) the signs of the weight matrix are written into the polarity register W and fed to each 1T1R cell through a pin;
2) when connection mode 1 or mode 2 is used, the signal input module feeds the modulus signal |V_pulse| and the polarity characteristic signal S' into the 1T1R memristor weight cell through pins; the modulus operation is performed and the result is output into the Pos_channel or Neg_channel, the output channel being determined jointly by the polarity register module and the polarity characteristic signal S';
3) when connection mode 3 is used, the 1T1R memristor weight cell operates on the modulus signal |V_pulse| and outputs the result to the Pos_channel or Neg_channel, the output channel being determined by the polarity register module alone;
4) the signals fed to every 1T1R cell are processed in this way, and the results are output to the Pos_channel or Neg_channel, where the signals converge;
the current-mode neural network based on multi-resistance-state memristors is built up as follows:
L1. after the neural network has been trained on the host computer, the memristor synapse weights at the designated positions in the assembled neural-network system are programmed according to the trained data;
L2. the memristor cells in the weight network are programmed, the signs of the weights are written into the polarity register, and the operating voltages are set;
L3. once the circuit has been programmed, the input signal is delivered to the signal input module, which separates its modulus and polarity;
L4. the modulus signal and the polarity characteristic signal are fed into the weight network; the modulus signal is operated on by the 1T1R memristor weight cells, and the output current signal, controlled jointly by the polarity register module and the polarity of the input signal, is steered into the Pos_channel or the Neg_channel;
L5. the bias signal module receives its input signal, and the resulting bias current is fed into the Pos_channel and Neg_channel channels;
L6. the current-equivalent addition module receives the current signals from the Pos_channel and Neg_channel channels; its internal current-adder node (Neronal_node unit) accumulates and sums them and finally outputs the result V_outx to the activation function module;
L7. the activation function unit applies the activation to the result V_outx and feeds the output signal into the input module of the next neural-network layer;
L8. the multi-layer connections of the neural network are realized through the above network-connection procedure, building a multi-layer neural-network system.
2. The current-mode neural network based on multi-resistance-state memristors according to claim 1, wherein the bias signal module of step L5 comprises an inverter, a memristor Mem and two MOS transistors of the same type; the input level V_bias is applied at the D terminal of the MOS transistors, and the S terminal of each MOS transistor is connected to a memristor Mem; the polarity register W provides the gate input signal V_gs, one end of which is connected to the input of the inverter and the other to the G terminal of one MOS transistor, while the inverter output is connected to the G terminal of the other MOS transistor;
the output current signal I_bias satisfies the relation
I_bias = V_bias / (r_NM + M)   (1)
where V_bias is the input level of the bias signal unit, r_NM is the on-resistance of the MOS transistor, and M is the resistance of the memristor Mem in series with the MOS transistor;
by controlling the polarity register W, the channel for the bias input signal V_bias is selected, so that the corresponding current signal I_bias is output into the corresponding Pos_channel or Neg_channel.
3. The current-mode neural network based on multi-resistance-state memristors according to claim 1, wherein the current-equivalent addition module of step L6 comprises low-power operational amplifiers LP-amp1, LP-amp2 and LP-amp3; the current signal I+ passing through the Pos_channel is fed to LP-amp1 through a pin, and the current signal I- passing through the Neg_channel is fed to LP-amp2 through a pin; the outputs of LP-amp1 and LP-amp2 are connected to the positive and negative inputs of LP-amp3, respectively, and the negative input is additionally provided with a grounding resistor Rp; the input signals satisfy
I+ = Σ_i I+_i   (2)
I- = Σ_i I-_i   (3)
where I+ is the total current through the Pos_channel, I- is the total current through the Neg_channel, I+_i is the current signal flowing into the Pos_channel from the corresponding Synapse unit, and I-_i is the current signal flowing into the Neg_channel from the corresponding Synapse unit;
after I+ flows through the Pos_channel into the channel pin, the positive input voltage of LP-amp3 is
V+ = -K+ * Rfp * I+'   (4)
where K+ is the gain factor of operational amplifier LP-amp1 and Rfp is its feedback resistor;
after I- flows through the Neg_channel into the channel pin, the negative input voltage of LP-amp3 is
V- = -K- * Rfn * I-'   (5)
where K- is the gain factor of operational amplifier LP-amp2 and Rfn is its feedback resistor;
for the output signal, the feedback resistances Rfp and Rfn are adjusted so that the currents I+', I-' flowing through Rfp and Rfn satisfy:
[Equation (6): relation between the currents I+', I-' through Rfp, Rfn and the channel currents; given as an image in the original]
after this adjustment the output signal satisfies:
[Equation (7): expression for the adjusted output in terms of the gain coefficients; given as an image in the original]
where K+, K- and KA are the gain coefficients of LP-amp1, LP-amp2 and LP-amp3, respectively;
when the following condition is satisfied:
[Equation (8): condition on the gain coefficients and feedback resistances; given as an image in the original]
then, by adjusting the gain factor KA, the final output result Vout of the k-th node is:
[Equation (9): expression for Vout of the k-th node; given as an image in the original]
where Vout is the input signal of the activation function unit;
for each different operation required at a node output there is a different KA; the corresponding output effect Vout can be realized by adjusting how the memristor outputs are set; all nodes of an output layer receive the signals Vout derived from the input-layer signals and feed the final result into the activation layer.
4. The current-mode neural network based on multi-resistance-state memristors according to claim 1, wherein the multi-layer neural-network system of step L8 is constructed as follows:
M1. when a neural network with multiple node inputs is selected, a signal input module (1) is placed at the input interface of the neural network, which facilitates the input of signals of either polarity and the matrix operation;
M2. for the subsequent neural-network layers, every layer is provided with a signal input module (1); the polarity of the weight network module is controlled directly by the polarity of the input signal together with the polarity stored in the register, and connection mode 1 or mode 2 is selected for the 1T1R memristor weight cells;
M3. after the input layer of the neural network, the output result is fed into the next neural-network layer containing a signal input module; every subsequent layer performs the same operations, so that a multi-layer neural network with a large number of nodes per layer is realized.
5. The current-mode neural network based on multi-resistance-state memristors according to claim 4, wherein in step L8 the activation function module uses a ReLU or Sigmoid activation function, and the multi-layer neural-network system is constructed as follows:
S1. when a neural network with multiple node inputs is selected, a signal input module (1) is placed at the input interface of the neural network, which facilitates the input of signals of either polarity and the matrix operation;
S2. for the subsequent neural-network layers the signal input module (1) is removed, the polarity register W alone controls the polarity of the weight network module, and connection mode 3 is selected for the 1T1R memristor weight cells;
S3. after the input layer of the neural network, the output result is fed into the neural-network layers without a signal input module; every subsequent layer performs the same operations, so that a multi-layer neural network with many nodes per layer is realized.
CN201910726323.1A 2019-08-07 2019-08-07 Current type neural network based on multi-resistance state memristor Active CN110443356B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910726323.1A CN110443356B (en) 2019-08-07 2019-08-07 Current type neural network based on multi-resistance state memristor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910726323.1A CN110443356B (en) 2019-08-07 2019-08-07 Current type neural network based on multi-resistance state memristor

Publications (2)

Publication Number Publication Date
CN110443356A CN110443356A (en) 2019-11-12
CN110443356B true CN110443356B (en) 2022-03-25

Family

ID=68433728

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910726323.1A Active CN110443356B (en) 2019-08-07 2019-08-07 Current type neural network based on multi-resistance state memristor

Country Status (1)

Country Link
CN (1) CN110443356B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111460365B (en) * 2020-03-10 2021-12-03 华中科技大学 Equation set solver based on memristive linear neural network and operation method thereof
CN112070220B (en) * 2020-08-06 2023-01-17 北京大学 In-situ self-activated neural network circuit based on nonlinear device and neural network operation method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106815636A (en) * 2016-12-30 2017-06-09 华中科技大学 A kind of neuron circuit based on memristor
CN106845634A (en) * 2016-12-28 2017-06-13 华中科技大学 A kind of neuron circuit based on memory resistor
CN109034379A (en) * 2018-10-12 2018-12-18 南京邮电大学 A kind of neuron and neuron circuit built by class brain device memristor
CN110059816A (en) * 2019-04-09 2019-07-26 南京邮电大学 A kind of neural network element circuit based on memristor

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8832009B2 (en) * 2012-05-15 2014-09-09 The United States Of America As Represented By The Secretary Of The Air Force Electronic charge sharing CMOS-memristor neural circuit

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106845634A (en) * 2016-12-28 2017-06-13 华中科技大学 A kind of neuron circuit based on memory resistor
CN106815636A (en) * 2016-12-30 2017-06-09 华中科技大学 A kind of neuron circuit based on memristor
CN109034379A (en) * 2018-10-12 2018-12-18 南京邮电大学 A kind of neuron and neuron circuit built by class brain device memristor
CN110059816A (en) * 2019-04-09 2019-07-26 南京邮电大学 A kind of neural network element circuit based on memristor

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Design of a neural synapse based on memristors; Gao Chang et al.; Journal of Tianjin University of Technology and Education; 30 June 2019; pp. 33-38 *

Also Published As

Publication number Publication date
CN110443356A (en) 2019-11-12

Similar Documents

Publication Publication Date Title
CN106779059A (en) A kind of Circuit of Artificial Neural Networks of the Pavlov associative memory based on memristor
CN105160401A (en) WTA neural network based on memristor array and application thereof
CN110443356B (en) Current type neural network based on multi-resistance state memristor
CN102610274A (en) Weight adjustment circuit for variable-resistance synapses
CN109165730B (en) State quantization network implementation method in cross array neuromorphic hardware
CN110852429B (en) 1T 1R-based convolutional neural network circuit and operation method thereof
CN210627259U (en) Pulse neural network digital-analog hybrid circuit system for realizing liquid state machine
US20210342678A1 (en) Compute-in-memory architecture for neural networks
CN114723025A (en) Memristor back propagation neural network circuit and control method thereof
CN115630693B (en) Memristor self-learning circuit based on Elman neural network learning algorithm
Gocheva et al. Modeling of Electronic Circuits with Generalized Nets
Meier Special report: Can we copy the brain?-The brain as computer
Zhang et al. The framework and memristive circuit design for multisensory mutual associative memory networks
CN110428049A (en) A kind of voltage-type neural network and its operating method based on polymorphic memristor
Su et al. A 1T2M memristor-based logic circuit and its applications
CN115456157B (en) Multi-sense interconnection memory network circuit based on memristor
Sinha et al. Evolving nanoscale associative memories with memristors
Hasler Special report: Can we copy the brain?-a road map for the artificial brain
Saraswat et al. Hardware-friendly synaptic orders and timescales in liquid state machines for speech classification
Liu et al. A reconfigurable approximate computing architecture with dual-VDD for low-power Binarized weight network deployment
CN109255437B (en) A kind of memristor nerve network circuit of flexibly configurable
Ahmed et al. Introduction to Neuromorphic Computing Systems
CN115857871B (en) Fuzzy logic all-hardware computing circuit and design method thereof
CN116523012B (en) Memristor self-learning circuit based on generation countermeasure neural network
Yammenavar et al. Design and analog VLSI implementation of artificial neural network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant