CN117408312A - Method for constructing a probabilistic binarized neural network using a spin-orbit torque tunnel junction - Google Patents

Method for constructing a probabilistic binarized neural network using a spin-orbit torque tunnel junction

Info

Publication number
CN117408312A
Authority
CN
China
Prior art keywords
mram
probability
sot
neural network
current
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311442028.6A
Other languages
Chinese (zh)
Inventor
寇煦丰
顾钰
黄浦阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ShanghaiTech University
Original Assignee
ShanghaiTech University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ShanghaiTech University
Priority to CN202311442028.6A
Publication of CN117408312A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/0464 Convolutional networks [CNN, ConvNet]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/047 Probabilistic or stochastic networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Mram Or Spin Memory Techniques (AREA)
  • Hall/Mr Elements (AREA)

Abstract

The technical scheme of the invention provides a method for constructing a probabilistic binarized neural network (PBNN) using spin-orbit torque tunnel junctions, characterized in that SOT-MRAM is used instead of SRAM to realize the probability generation and sampling in the network, and the function of the whole network is realized on chip with the assistance of a digital-to-analog converter and a comparator. The invention further provides a method for applying the network so constructed to classification problems. The invention exploits a spin-orbit torque magnetic random access memory with tunable switching probability, combined with a digital-to-analog converter and a comparator, to realize the function of a probabilistic binarized neural network on chip. At the same time, the precise probability control of the device, its picosecond switching, and its small switching voltage ensure the high accuracy and low energy consumption of the network.

Description

Method for constructing a probabilistic binarized neural network using a spin-orbit torque tunnel junction
Technical Field
The invention relates to a method for constructing a probabilistic binarized neural network using a spin-orbit torque tunnel junction, belonging to the technical field of semiconductors and novel computing.
Background
Ever since the concept of artificial neural networks was first proposed, their potential for emulating the complex computational processes of the human brain has been widely recognized. Neural network models are built from neurons connected into a complex network structure. Through training, these networks learn to capture intricate patterns in input data and can thereby perform a wide variety of tasks, such as image recognition and natural language processing.
Against this background, the probabilistic binarized neural network (PBNN) was developed: a special neural network in which both the weights and the activation values are binarized. This approach greatly increases the training speed of the neural network and offers excellent noise immunity. Despite these advantages, however, PBNNs still face challenges in implementing the randomness they require.
At present, many systems use complementary metal-oxide-semiconductor (CMOS) circuits or static random access memory (SRAM) to realize the randomness of a PBNN. Such random-number-generation circuits typically work by amplifying thermal noise or by introducing competing signals. These solutions suffer from several problems. First, the power consumption of both CMOS and SRAM implementations is quite high; although they excel in processing speed and stability, they are far from ideal in energy efficiency, which is a serious drawback for neural network applications that demand heavy computation over long run times. Second, the large area of CMOS and SRAM circuits imposes physical limits when large neural networks must be built, so a sufficiently complex computational model cannot be realized in a limited space. Finally, the quality of the randomness is also problematic: circuit asymmetry, signal noise, and other non-idealities can severely distort the probability distribution of the random signals, and the imperfect randomness degrades PBNN performance to some extent.
To solve these problems, researchers have begun to explore magnetoresistive random access memory (MRAM) as an alternative. MRAM offers many advantages, including lower power consumption, greater read/write endurance, and an intrinsically stochastic switching behavior, which make it an attractive candidate for implementing PBNNs.
In recent years, research on implementing PBNNs with MRAM has achieved breakthroughs, especially with the three-terminal SOT-MRAM device. SOT-MRAM exploits the spin-orbit torque (SOT) phenomenon to switch the magnetic moment at low energy cost, making it well suited to PBNN implementation. However, how to design read/write circuits appropriate for SOT-MRAM and realize the functions of a PBNN through suitable interconnection remains to be solved.
In summary, neural networks and PBNNs have made significant progress, but achieving high-quality randomness remains a challenge. A PBNN based on SOT-MRAM may be the solution, provided that read/write circuits suited to the SOT-MRAM are designed and the functions of the PBNN are realized through suitable interconnection.
Disclosure of Invention
The purpose of the invention is to provide a method that uses the non-volatile memory SOT-MRAM to build a probabilistic binarized neural network for classification problems.
In order to achieve the above object, the present invention provides a method for constructing a probabilistic binarized neural network using spin-orbit torque tunnel junctions, characterized in that the probability-generation and sampling part of the network is implemented with SOT-MRAM instead of SRAM, and the function of the whole network is realized on chip with the assistance of a digital-to-analog converter and a comparator; the scheme is further divided into a part built from SOT-MRAM and a part in which other circuits assist the operation, wherein:
The SOT-MRAM part comprises an MRAM array cell for probability generation and an MRAM dummy cell for reference-current generation. A binary result with a specified probability is generated from the SOT-MRAM as follows: the high- and low-resistance states of the MRAM correspond to 0 and 1 of the binarization, respectively; the tunneling resistance is obtained by applying a small read voltage between the top electrode V_TE and the bottom electrodes V_BE,L, V_BE,R and dividing it by the measured tunneling current; the switching probability is determined by the amplitude and width of the voltage pulse driven through the bottom electrodes V_BE,L, V_BE,R; and the relation between pulse and switching probability is obtained by sampling the switching outcome of the same pulse on the device many times, which provides the reference for the write pulses applied to the SOT-MRAM later in the probabilistic binarized neural network (a behavioral sketch of this calibration is given after the next paragraph);
other circuit auxiliary operations include inter-module connections and control of digital to analog converters: the weight corresponding to each layer of random binary neural network node is written into and stored in the MRAM array unit of the SOT-MRAM, and the input is used as the bit line BL of the MRAM array unit i The read voltages on the row i=0 to n, and the read results of each row are accumulated on the sub-lines to realize multiplication of the input vector and the weight matrix, wherein the trained weight is mapped to [0,1 ] as an accuracy value]In the interval, as the inversion probability of the corresponding SOT-MRAM device, before each matrix multiplication operation, the mapping relation between the writing voltage and the inversion probability of the SOT-MRAM device is referred to for random inversion, and then the accurate weight inversion probability is expressed as a binary weight in a single sampling and is stored in the corresponding magnetic tunnel junction; the result of the previous layer is used as the input of the current layer, the current layer is changed into a multi-bit digital signal through quantization, the digital signal generates analog voltage through a digital-to-analog converter and is used as the analog input signal of the current layer to be applied to a magnetic tunnel junction, the result of the accumulation of the read current of a plurality of columns of magnetic tunnel junctions on a sub-line, namely, a new random number obtained by the weighted accumulation of a plurality of binary random numbers distributed by Bernoulli is subjected to normal distribution on the premise that the binary numbers are enough according to the central limit theorem; probability binarizing the normally distributed random number as the output of the current layer, i.e. calculating the normal distribution to the fixed integral of zero, to obtain the probability that the number is less than or equal to zero, the probability binarizing operating on the circuit, i.e. corresponding to the accumulated result on the sub-lines of the MRAM array cellsComparing the accumulated result with an MRAM Dummy cell in the SOT-MRAM device; if the current layer is the last layer of the random binary neural network, selecting one with highest probability from a plurality of output results of the MRAM array units as the object type identified by the current random binary neural network.
Preferably, in the part where other circuits assist the operation: the rows to be written are selected by the SWL decoder; the write-voltage amplitude and pulse width applied to the columns of the MRAM array cell and the MRAM dummy cell are controlled by the digital-to-analog converter; the current signals output by the MRAM array cell and the MRAM dummy cell are converted into voltage signals by a transimpedance amplifier (TIA) and fed to the comparator and the RWL decoder; the RWL decoder selects the row of the MRAM array cell to be compared with the reference voltage output by the MRAM dummy cell; and the current accumulated by the MRAM array cell and the reference current generated by the MRAM dummy cell are compared in real time by the comparator, the firing probability being obtained by counting the comparison results.
Preferably, the number of columns of the MRAM dummy cell corresponds to the number of columns of the MRAM array cell; the MRAM dummy cell has two rows, namely one row of high-resistance MRAM and one row of low-resistance MRAM.
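One plausible reading of this two-row reference, not spelled out explicitly in the patent, is that the dummy cell supplies the midpoint current between the all-low-resistance and all-high-resistance cases: with per-column read voltages $V_j$ and the two conductance states $G_{\mathrm{LRS}}$, $G_{\mathrm{HRS}}$,

$$I_{\mathrm{array}}=\sum_{j}V_j\,G_j,\quad G_j\in\{G_{\mathrm{LRS}},G_{\mathrm{HRS}}\},\qquad I_{\mathrm{ref}}=\frac{1}{2}\sum_{j}V_j\left(G_{\mathrm{LRS}}+G_{\mathrm{HRS}}\right),$$

so the comparator would threshold the accumulated current at its expected value for weights with $p=\tfrac{1}{2}$ (the factor of $\tfrac{1}{2}$ being absorbed, for instance, in the TIA gains).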
The invention also provides a method for applying the probabilistic binarized neural network constructed by the above method to classification problems, characterized by comprising the following steps (a note on sampling accuracy and a behavioral sketch of the flow are given after step eight):
step one: applying reset voltage of 0-99V along the positive direction of the bottom electrode of the SOT-MRAM, and writing the tunnel junction resistance into a low-resistance state, namely resetting operation, based on the spin Hall effect of the spin orbit moment layer, wherein the spin orbit moment acts on the magnetic moment of the magnetic layer below the tunnel junction and changes the magnetization direction of the magnetic moment; controlling the length of input pulse in the range of 400 ps-999 ms and the size of input pulse in the range of 0-99V, applying writing voltage in the negative direction of the bottom electrode of the SOT-MRAM, and changing the magnetic moment state with a certain specific probability; in the range of 0-99V, at the bottom electrode V of SOT-MRAM BE,L 、V BE,R Applying smaller reading voltage at two ends, reading current at two ends, calculating the resistance of the magnetic tunnel junction, and determining whether overturning occurs or not;
repeating the steps for 10-999 times to obtain the relation between the turnover probability and the pulse;
step two: finding out the corresponding pulse width and size according to the obtained turnover probability, the pulse relation and the trained target probability result;
step three: controlling the right end V of the bottom electrode of the SOT-MRAM BE,R Grounding in the range of 0-99V at the left end V of the bottom electrode of the SOT-MRAM BE,L To a sufficiently large V reset A signal to ensure that all MRAM in the SOT-MRAM are turned over to a low resistance state, and simultaneously, two rows of MRAM Dummy cells in the SOT-MRAM are respectively written into the low resistance state and the high resistance state;
step four: control access SOT-MRAM bottom electrode left end V BE,L V of (2) reset Signal grounding, selecting the row in the MRAM array unit to be written according to the pulse width and the size obtained in the second step, and applying respective pulses to the MRAM of different columns in the MRAM array unit;
step five: the digital-to-analog converter converts the probability signal transmitted by the upper network into an analog signal and applies the analog signal to the right end V of the bottom electrode of the SOT-MRAM BE,R The analog signal flows through the MRAM in the SOT-MRAM, and after the MRAM information of all columns in the MRAM array cells is overlapped by a current accumulation mode, the analog signal is transmitted from the top electrode V TE Outward transfer; at the same time, the MRAMDummy unit also transmits an accumulated current as a reference current according to the same analog signal given by the digital-to-analog converter, and the accumulated current flows out of the MRAMDummy unit through the top electrode of the MRAMDummy unit; after the reference current given by the MRAMDummy unit and the self-accumulated current given by the MRAM array unit are converted into voltage signals, the voltage signals are compared through a comparator, so that the operation of Activation is realized;
step six: repeating the steps three to five for at least ten times to ensure that the read probability information has reliability;
step seven: obtaining a probability value between 0 and 1 according to the statistical result of the step six;
step eight: if the current layer is not the last layer of the random binarization neural network, the result of the step seven is transferred into a digital-to-analog converter to be used as the input of the next layer; otherwise, comparing the results obtained in the step seven of different rows of the MRAM array units, finding the maximum value, and using the maximum value as the identification result of the random binarization neural network to finish classification.
The invention exploits a spin-orbit torque magnetic random access memory with tunable switching probability, combined with a digital-to-analog converter and a comparator, to realize the function of a probabilistic binarized neural network on chip. At the same time, the precise probability control of the device, its picosecond switching, and its small switching voltage ensure the high accuracy and low energy consumption of the network.
Drawings
FIG. 1 is a block diagram of a PBNN network;
FIG. 2 is a PBNN flow chart;
FIG. 3 is a diagram of the SOT-MRAM structure and its switching-probability curve;
FIG. 4 is a circuit diagram of an MRAM array and MRAM reference array;
FIG. 5 is a PBNN on-chip architecture diagram.
Detailed Description
The invention will be further illustrated with reference to specific examples. It is to be understood that these examples are illustrative of the present invention and are not intended to limit the scope of the present invention. Further, it is understood that various changes and modifications may be made by those skilled in the art after reading the teachings of the present invention, and such equivalents are intended to fall within the scope of the claims appended hereto.
The structure of the PBNN is shown in FIG. 1: it consists of an input layer, convolutional layers, pooling layers, fully connected layers, and an output layer. MRAM can serve as the convolution kernel in the convolutional layers; its non-volatility makes the kernel easy to reuse, which speeds up the convolution. In the fully connected layers, MRAM acts as the weight carrier and its intrinsic stochasticity is used to realize the vector-matrix multiplication (VMM). In the output layer, MRAM generates the reference signal, and the comparator turns the MRAM array output into the final result, realizing the Activation. In the PBNN flow of FIG. 2, the input signal of the first layer comes from the data set and must be normalized to between 0 and 1; for any later layer, the input is the probability value passed down by the preceding layer. The value is fed into the network through the DAC as an analog voltage signal, and the VMM is performed with the tunneling resistance of the SOT-MRAM as the weight. The high or low resistance state of the tunneling resistance is a Bernoulli random number regulated by the write voltage, and, by the central limit theorem, the new random number obtained by the weighted accumulation of many Bernoulli variables symmetric about zero obeys a normal distribution. Calculating the integral of the normal probability density function up to zero then yields a probability value between 0 and 1, namely the probability (Activation) that the accumulated random number is less than zero. Since the weights and outputs of every layer are Bernoulli random numbers, what is needed is the expectation of the samples, i.e., the parameter of the corresponding Bernoulli distribution. If the layer is not the last, this probability is passed to the next layer; if it is the output layer, the outputs of the different rows are compared and the largest probability is taken as the current classification result (Classification). Throughout the process, the MRAM works mainly at the positions of the gray circles in the figure, i.e., it realizes the VMM as the weight carrier and generates the reference currents used for comparison in the Activation operation.
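To make the Activation computation concrete: with the weights taken as Bernoulli variables mapped to ±1 (symmetric about zero, as stated above), the weighted sum is approximately normal by the central limit theorem, and the Activation probability is the normal CDF evaluated at zero. A minimal sketch, in which the explicit ±1 mapping is the only assumption beyond the text:

```python
from math import erf, sqrt

def activation_probability(x, p, theta=0.0):
    """P(z <= theta) for z = sum_i x_i * w_i, where each w_i equals +1 with
    probability p_i and -1 otherwise, using the normal approximation from
    the central limit theorem."""
    mu = sum(xi * (2.0 * pi_ - 1.0) for xi, pi_ in zip(x, p))    # E[w_i] = 2p_i - 1
    var = sum(xi * xi * 4.0 * pi_ * (1.0 - pi_)
              for xi, pi_ in zip(x, p))                          # Var[w_i] = 4p_i(1-p_i)
    if var == 0.0:
        return 1.0 if mu <= theta else 0.0
    return 0.5 * (1.0 + erf((theta - mu) / sqrt(2.0 * var)))     # normal CDF at theta

print(activation_probability(x=[0.2, 0.9, 0.5], p=[0.1, 0.8, 0.5]))
```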
As shown in FIG. 3, the bottom electrode is the spin-orbit torque layer; when current flows through it, the generated spin-orbit torque acts on the magnetic tunnel junction and switches it between the high- and low-resistance states. Varying the voltage amplitude produces different switching probabilities.
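For reference, this voltage dependence is commonly described in the spintronics literature (not in this patent, which measures the curve directly) by a thermally activated switching model: for a pulse of width $t$ and amplitude $V$,

$$P_{\mathrm{sw}}(V,t)=1-\exp\!\left(-\frac{t}{\tau(V)}\right),\qquad \tau(V)=\tau_0\,\exp\!\left[\Delta\!\left(1-\frac{V}{V_c}\right)\right],$$

where $\tau_0$ is the attempt time (on the order of 1 ns), $\Delta$ the thermal stability factor, and $V_c$ the critical switching voltage; this reproduces the sigmoid-like probability-versus-voltage curve of FIG. 3.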
The object-recognition workflow of the SOT-MRAM within the PBNN is as follows:
step one: v along FIG. 3 BE,L 、V BE,R A reset voltage is applied in the direction, and due to the spin hall effect of the spin-orbit torque layer, the spin-orbit torque will act on the magnetic moment of the magnetic layer under the tunnel junction and change its magnetization direction, writing the tunnel junction resistance into a low resistance state, i.e., a reset operation. Controlling the length and size of the input pulse, at V BE,R 、V BE,L A write voltage is applied in a direction to change the state of the magnetic moment with a certain probability. At V TE 、V BE,R Applying small reading voltage at two ends, reading current at two ends, calculating magnetic tunnel junction resistance, and determining whether to performFlipping occurs. Repeating the steps for five hundred times to obtain the relation between the turnover probability and the pulse.
Step two: from the measured relation between switching probability and pulse and from the trained target probabilities, find the corresponding pulse width and amplitude.
Step three: ground BL_i (i = 0 to n), apply a sufficiently large positive V_reset, and drive SWL_i (i = 0 to n) high so that all MRAM switch to the low-resistance state (FIG. 4); at the same time, the two rows of the MRAM dummy cell are written to the low-resistance and high-resistance states, respectively.
Step four: ground the V_reset signal, apply to the MRAM of the different columns their respective pulses with the widths and amplitudes obtained in step two, and select the row to be written through SWL_i.
Step five: float all SWL_i. The probability signal passed down from the previous layer is converted by the DAC into an analog signal applied to BL_i; the voltage drives a current through the MRAM, and after the information of all columns is superposed by current accumulation, the result is transmitted out through RWL_i. At the same time, the MRAM dummy cell, driven by the same BL_i signal, delivers an accumulated current that flows out through RWL_D. The two are compared by the comparator, which performs the Activation (FIG. 5).
Step six: repeat steps three to five at least ten times to ensure that the read probability information is reliable.
Step seven: average the statistics of step six to obtain a sampled probability value between 0 and 1.
Step eight: if the layer is not the last layer, pass the result of step seven to the DAC as the input of the next layer. If the layer is the output layer, compare the step-seven results of the different rows and take the maximum as the current recognition result.
According to the actual measurement results of the invention between the last two layers of the network, the PBNN component achieves 90% recognition accuracy in digit-recognition training on the MNIST data set.

Claims (4)

1. A method for constructing a probabilistic binarized neural network using spin-orbit torque tunnel junctions, characterized in that SOT-MRAM is used instead of SRAM to realize the probability-generation and sampling part of the probabilistic binarized neural network, and the function of the whole network is realized on chip with the assistance of a digital-to-analog converter and a comparator, the scheme being further divided into a part built from SOT-MRAM and a part in which other circuits assist the operation, wherein:
the SOT-MRAM is divided into an MRAM array cell for probability generation and an MRAM dummy cell for reference-current generation; in the part built from SOT-MRAM, the SOT-MRAM is used to generate a binary result with a specified probability, i.e., the high- and low-resistance states of the MRAM correspond to 0 and 1 of the binarization, respectively; the tunneling resistance is obtained by applying a small read voltage between the top electrode V_TE and the bottom electrodes V_BE,L, V_BE,R and dividing it by the measured tunneling current; the switching probability is determined by the amplitude and width of the voltage pulse driven through the bottom electrodes V_BE,L, V_BE,R; and the relation between pulse and switching probability is obtained by sampling the switching outcome of the same pulse on the device many times, which provides the reference for the write pulses applied to the SOT-MRAM later in the probabilistic binarized neural network;
other circuit auxiliary operations include inter-module connections and control of digital to analog converters: the weight corresponding to each layer of random binary neural network node is written into and stored in the MRAM array unit of the SOT-MRAM, and the input is used as the bit line BL of the MRAM array unit i The read voltages on the row i=0 to n, and the read results of each row are accumulated on the sub-lines to realize multiplication of the input vector and the weight matrix, wherein the trained weight is mapped to [0,1 ] as an accuracy value]In the interval, as the inversion probability of the corresponding SOT-MRAM device, before each matrix multiplication operation, the mapping relation between the writing voltage and the inversion probability of the SOT-MRAM device is referred to for random inversion, and then the accurate weight inversion probability is expressed as a binary weight in a single sampling and is stored in the corresponding magnetic tunnel junction; results of the previous layerAs the input of the current layer, the digital signal is quantized into a multi-bit digital signal, the digital signal generates analog voltage through a digital-to-analog converter and is applied to a magnetic tunnel junction as the analog input signal of the current layer, and the result of accumulating the read current of a plurality of columns of magnetic tunnel junctions on a sub-line, namely, the new random number obtained by weighting and accumulating a plurality of binary random numbers distributed by Bernoulli, is subjected to normal distribution on the premise that the binary numbers are enough according to the central limit theorem; performing probability binarization on the random number of the normal distribution as the output of the current layer, namely, calculating the constant integral of the zero point of the normal distribution to obtain the probability that the number is smaller than or equal to zero, wherein the probability binarization is operated on a circuit, namely, the operation of the probability binarization is corresponding to the comparison of the accumulated result on the sub-lines of the MRAM array unit with the accumulated result of the MRAM Dummy unit in the SOT-MRAM device; if the current layer is the last layer of the random binary neural network, selecting one with highest probability from a plurality of output results of the MRAM array units as the object type identified by the current random binary neural network.
2. The method of constructing a probabilistic binarized neural network using spin-orbit torque tunnel junctions according to claim 1, characterized in that, in the part where other circuits assist the operation: the rows to be written are selected by the SWL decoder; the write-voltage amplitude and pulse width applied to the columns of the MRAM array cell and the MRAM dummy cell are controlled by the digital-to-analog converter; the current signals output by the MRAM array cell and the MRAM dummy cell are converted into voltage signals by a transimpedance amplifier (TIA) and fed to the comparator and the RWL decoder; the RWL decoder selects the row of the MRAM array cell to be compared with the reference voltage output by the MRAM dummy cell; and the current accumulated by the MRAM array cell and the reference current generated by the MRAM dummy cell are compared in real time by the comparator, the firing probability being obtained by counting the comparison results.
3. The method of constructing a probabilistic binarized neural network using spin-orbit torque tunnel junctions according to claim 1, characterized in that the number of columns of the MRAM dummy cell corresponds to the number of columns of the MRAM array cell; the MRAM dummy cell has two rows, namely one row of high-resistance MRAM and one row of low-resistance MRAM.
4. A method for applying the probabilistic binarized neural network constructed by the method of claim 1 to classification problems, characterized by comprising the following steps:
step one: applying reset voltage of 0-99V along the positive direction of the bottom electrode of the SOT-MRAM, and writing the tunnel junction resistance into a low-resistance state, namely resetting operation, based on the spin Hall effect of the spin orbit moment layer, wherein the spin orbit moment acts on the magnetic moment of the magnetic layer below the tunnel junction and changes the magnetization direction of the magnetic moment; controlling the length of input pulse in the range of 400 ps-999 ms and the size of input pulse in the range of 0-99V, applying writing voltage in the negative direction of the bottom electrode of the SOT-MRAM, and changing the magnetic moment state with a certain specific probability; in the range of 0-99V, at the bottom electrode V of SOT-MRAM BE,L 、V BE,R Applying smaller reading voltage at two ends, reading current at two ends, calculating the resistance of the magnetic tunnel junction, and determining whether overturning occurs or not;
repeating the steps for 10-999 times to obtain the relation between the turnover probability and the pulse;
step two: finding out the corresponding pulse width and size according to the obtained turnover probability, the pulse relation and the trained target probability result;
step three: controlling the right end V of the bottom electrode of the SOT-MRAM BE,R Grounding in the range of 0-99V at the left end V of the bottom electrode of the SOT-MRAM BE,L To a sufficiently large V reset A signal to ensure that all MRAM in the SOT-MRAM are turned over to a low resistance state, and simultaneously, two rows of MRAM Dummy cells in the SOT-MRAM are respectively written into the low resistance state and the high resistance state;
step four: control access SOT-MRAM bottom electrode left end V BE,L V of (2) reset Signal grounding, selecting the row in the MRAM array unit to be written according to the pulse width and the size obtained in the second step, and applying respective pulses to the MRAM of different columns in the MRAM array unit;
step five: the D/A converter will be onThe probability signal transmitted by a layer of network is converted into an analog signal to be applied to the right end V of the bottom electrode of the SOT-MRAM BE,R The analog signal flows through the MRAM in the SOT-MRAM, and after the MRAM information of all columns in the MRAM array cells is overlapped by a current accumulation mode, the analog signal is transmitted from the top electrode V TE Outward transfer; at the same time, the MRAMDummy unit also transmits an accumulated current as a reference current according to the same analog signal given by the digital-to-analog converter, and the accumulated current flows out of the MRAMDummy unit through the top electrode of the MRAMDummy unit; after the reference current given by the MRAMDummy unit and the self-accumulated current given by the MRAM array unit are converted into voltage signals, the voltage signals are compared through a comparator, so that the operation of Activation is realized;
step six: repeating the steps three to five for at least ten times to ensure that the read probability information has reliability;
step seven: obtaining a probability value between 0 and 1 according to the statistical result of the step six;
step eight: if the current layer is not the last layer of the random binarization neural network, the result of the step seven is transferred into a digital-to-analog converter to be used as the input of the next layer; otherwise, comparing the results obtained in the step seven of different rows of the MRAM array units, finding the maximum value, and using the maximum value as the identification result of the random binarization neural network to finish classification.
CN202311442028.6A 2023-10-31 2023-10-31 Method for constructing a probabilistic binarized neural network using a spin-orbit torque tunnel junction Pending CN117408312A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311442028.6A CN117408312A (en) Method for constructing a probabilistic binarized neural network using a spin-orbit torque tunnel junction


Publications (1)

Publication Number Publication Date
CN117408312A 2024-01-16

Family

ID=89490618

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311442028.6A Method for constructing a probabilistic binarized neural network using a spin-orbit torque tunnel junction

Country Status (1)

Country Link
CN (1) CN117408312A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination