CN109359734B - Neural network synaptic structure based on memristor unit and adjusting method thereof

Neural network synaptic structure based on memristor unit and adjusting method thereof

Info

Publication number
CN109359734B
CN109359734B (application CN201811245921.9A)
Authority
CN
China
Prior art keywords
memristor
round
weight
neural network
cascade
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811245921.9A
Other languages
Chinese (zh)
Other versions
CN109359734A (en)
Inventor
帅垚
乔石珺
吴传贵
罗文博
王韬
张万里
彭赟
潘忻强
梁翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CN201811245921.9A priority Critical patent/CN109359734B/en
Publication of CN109359734A publication Critical patent/CN109359734A/en
Application granted granted Critical
Publication of CN109359734B publication Critical patent/CN109359734B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/06 - Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N 3/063 - Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Theoretical Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Neurology (AREA)
  • Feedback Control In General (AREA)

Abstract

The invention relates to the technical field of computers and electronic information, and in particular to a neural network synaptic structure based on memristor units and an adjusting method thereof. Several memristor units are cascaded to form one electronic synapse (a cascade memristor unit group), a weighted mapping rule maps the resistance values of the memristor units onto a synaptic weight of the neural network, and a full-adjustment or step-by-step adjustment mode avoids the disturbance to the neural network caused by the fact that a memristor cannot be adjusted precisely during learning, so that various neural networks can run normally on a memristor-based system. The invention solves the problem of mapping resistance values to synaptic weights and the problem of the uncertain change of a memristor unit's resistance value.

Description

Neural network synaptic structure based on memristor unit and adjusting method thereof
Technical Field
The invention relates to the technical field of computers and electronic information, and in particular to a neural network synaptic structure based on memristor units and an adjusting method thereof.
Background
Synapses are the intermediate structures that connect different neurons in a neural network; their weights are continuously updated by the corresponding neural network algorithm, which provides the basis of the network's information processing. The memristor, a new component whose resistance can be tuned, is regarded as the fourth basic circuit element alongside the resistor, the inductor and the capacitor.
Researchers have found that the way synaptic connection strength changes closely resembles the electrical behaviour of a memristor (whose conductance is modulated by the applied electrical signal), so a single memristor can emulate the function of a synapse. Compared with running a neural network program on an electronic computer (i.e. a traditional CMOS transistor circuit), memristor units can greatly reduce energy consumption, increase read and write speed, and at the same time lower the complexity of the integrated circuit. However, the memristor also has inherent drawbacks: as an analog device, its resistance changes nonlinearly under voltage control and, unlike a digital device, it cannot be set precisely to the desired weight value. At present, most researchers map memristor resistance values one-to-one onto the weights of a neural network; a network built on such a mapping cannot adjust its weights accurately, and how to effectively map the resistance values of the memristors in a memristor circuit to the corresponding synaptic weights of the neural network matrix, so as to obtain the best trade-off between operation precision and power consumption, is a critical problem.
Synapse structures based on memristive devices in the prior art mainly include the following. Prezioso, M. et al. [1] define each weight of the neural network as the difference between two memristive cells; during adjustment the excitation voltage is applied only to the former cell while the resistance of the latter cell is kept unchanged, so that the weights of all synapses share the same reference value. The update adopts the Manhattan weight update rule, namely
Δw_ij = η · sgn(∂E/∂w_ij), where E is the network error and η is the learning rate (only the sign of the error gradient determines the direction of each update),
from which the polarity of the excitation voltage to be applied to each memristive cell that needs adjustment is obtained. Ligang Gao et al. [2] instead use program-controlled excitation voltages (with programmed pulse width and amplitude) together with a verification step added in the program, repeating until the resistance of the memristor is close or equal to the target resistance, so that the weight at each step is updated to the desired value and the neural network can operate normally.
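For illustration only, the following sketch contrasts the two prior-art update schemes on a toy memristor model; the model, step size and noise level are assumptions made for this example, not values from the cited works.

```python
import random

class SimMemristor:
    """Toy memristor: each voltage pulse shifts the normalized conductance
    by a nominal step plus random device-level noise (assumed values)."""
    def __init__(self, g=0.5, step=0.02, noise=0.005):
        self.g, self.step, self.noise = g, step, noise

    def pulse(self, polarity):
        # polarity is +1 (potentiate) or -1 (depress)
        self.g = min(max(self.g + polarity * self.step
                         + random.gauss(0.0, self.noise), 0.0), 1.0)

def manhattan_update(cell, grad):
    """Manhattan rule: only the sign of the error gradient is used, so each
    update is a single fixed-amplitude pulse of the appropriate polarity."""
    if grad != 0:
        cell.pulse(-1 if grad > 0 else +1)

def write_verify(cell, target, tol=0.01, max_pulses=100):
    """Write-verify: keep pulsing and re-reading until the conductance is
    within tol of the target, or the pulse budget is exhausted."""
    for _ in range(max_pulses):
        if abs(cell.g - target) <= tol:
            return True
        cell.pulse(+1 if cell.g < target else -1)
    return False
```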
Representing a synaptic weight by the difference of two memristive units and updating it with the Manhattan rule does not fundamentally resolve the inaccuracy of synaptic weight adjustment, so only simple image recognition can be achieved and the functions of more complex neural networks cannot be completed. Adding a write-verify step increases the complexity of program control and turns weight adjustment, which could originally be finished quickly, into a time-consuming process of repeated verification.
Disclosure of Invention
To address the above problems and defects, the invention provides a neural network synaptic structure based on memristor units and an adjusting method thereof, aiming to solve the problem of mapping resistance values to synaptic weights and the problem of the uncertain change of a memristor unit's resistance under an applied excitation voltage.
A neural network synaptic structure based on memristor units is a cascade memristor unit group formed by cascading n memristor units.
The number n of memristor units is determined by the synaptic-weight precision required for normal operation of the neural network. At the physical level the units are connected in cascade, and the resistance value of each memristor unit is preprocessed according to the following formula:
w_i = (R_i - R_mid) / (R_max - R_min)
wherein i denotes the i-th cascaded memristor unit in the cascade memristor unit group, R_i is the real-time resistance value of the i-th unit, R_mid is the median of the resistance variation range of each unit, R_min and R_max are the minimum and maximum values of that range, and w_i is the value obtained by mapping the resistance of the i-th cascaded memristor unit. All memristor units are identical, so R_mid, R_min and R_max are constants.
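As a minimal sketch (assuming only that the resistance range [R_min, R_max] of the identical units is known and that R_mid is its midpoint), the preprocessing can be expressed as:

```python
def preprocess(resistances, r_min, r_max):
    """Map each real-time resistance R_i to w_i = (R_i - R_mid) / (R_max - R_min),
    with R_mid the midpoint of the resistance range, so every w_i lies in [-1, 1]."""
    r_mid = (r_min + r_max) / 2.0
    return [(r - r_mid) / (r_max - r_min) for r in resistances]
```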
And organizing the numerical values obtained by the preprocessing in an algorithm level according to the following formula:
W = round(w_X, Y)·10^(X-1) + round(w_(X-1), Y)·10^(X-2) + … + round(w_i, Y)·10^(i-1) + … + round(w_1, Y)·10^0 + round(w_(-1), Y-1)·10^(-1) + round(w_(-2), Y-2)·10^(-2) + … + round(w_(-i), Y-i)·10^(-i) + … + round(w_(-Y), 0)·10^(-Y)
or
W = round(w_X, 0)·10^(X-1) + round(w_(X-1), 0)·10^(X-2) + … + round(w_i, 0)·10^(i-1) + … + round(w_1, 0)·10^0 + round(w_(-1), 0)·10^(-1) + round(w_(-2), 0)·10^(-2) + … + round(w_(-i), 0)·10^(-i) + … + round(w_(-Y), 0)·10^(-Y)
wherein W is the weight of the cascade memristor unit group (i.e. of one neural network electronic synapse), X and Y are respectively the numbers of units representing the integer part and the decimal part of the weight, Y is also the precision of the represented weight, and round(w_i, Y-i) denotes rounding w_i so that Y-i decimal places are retained.
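A sketch of this combination step follows; it assumes the cell values are supplied most-significant-first and uses Python's built-in round() as a stand-in for the round(·,·) operation above.

```python
def combine(w_int, w_frac, variant=1):
    """Combine normalized cell values into one synaptic weight W.

    w_int  : [w_X, ..., w_1]   -- cells representing the integer digits
    w_frac : [w_-1, ..., w_-Y] -- cells representing the decimal digits
    variant 1 follows the first formula (terms keep extra decimals),
    variant 2 follows the second (every term rounded to an integer)."""
    X, Y = len(w_int), len(w_frac)
    W = 0.0
    for k, w in enumerate(w_int):                  # w_X down to w_1
        digits = Y if variant == 1 else 0
        W += round(w, digits) * 10 ** (X - 1 - k)
    for k, w in enumerate(w_frac):                 # w_-1 down to w_-Y
        digits = (Y - (k + 1)) if variant == 1 else 0
        W += round(w, digits) * 10 ** (-(k + 1))
    return W
```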
The analog synapses organized in the above manner have their synaptic weights updated using one of two adjustment modes.
The first is full adjustment: all cascaded memristive units participate in every weight update, that is, the resistance values of all cascaded memristive units are changed by applying electrical pulses.
The second is step-by-step adjustment: which memristor units need to be adjusted is decided from how the error changes while the weights are being updated. Specifically, in the early stage of a program run the weight changes are large, so only the higher-significance positions in the above representation are adjusted; in the later stage each weight update is already small, i.e. the learning process is nearly complete, and electrical pulses are again applied to all units for fine adjustment so that the desired weight is reached as quickly as possible. When the error is more than 5-20 times err_good the run is considered to be in the early stage, otherwise in the later stage; err_good is the error threshold set in the program for terminating the run, and the learning process ends once the current error falls below this value.
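The choice of which cells to pulse in a given update cycle can be sketched as follows (a simplification: the most significant cell stands in for "the higher-significance positions", and k is the 5-20x factor mentioned above):

```python
def cells_to_pulse(current_error, err_good, n_cells, mode="stepwise", k=10):
    """Return the indices of the cascaded cells to pulse in this update.
    mode "full": every cell, every update.
    mode "stepwise": only the most significant cell while the error is
    still large (> k * err_good); all cells once the error is small."""
    if mode == "full" or current_error <= k * err_good:
        return list(range(n_cells))   # fine adjustment: pulse all cells
    return [0]                        # coarse adjustment: head cell only
```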
In this method, several memristor units are cascaded to serve as one electronic synapse (the cascade memristor unit group), a weighted mapping rule maps the resistance values of the memristor units onto the synaptic weight of the neural network, and the full-adjustment or step-by-step adjustment mode avoids the disturbance to the neural network caused by the fact that a memristor cannot be adjusted precisely during learning, so that various neural networks can run normally on a memristor-based system.
In summary, the invention solves the problem of mapping resistance values to synaptic weights and the problem of the uncertain change of a memristive unit's resistance value.
Drawings
FIG. 1 is a diagram of synapse structure of a neural network based on memristive devices in the prior art;
FIG. 2 is a test and control system used in an embodiment;
FIG. 3 is a diagram of an embodiment of a neural network synapse structure based on five memristive cells;
FIG. 4 is a flow chart of the modulation principle of the synapse structure of the invention;
FIG. 5 is a tolerance test of the neural network of the embodiment to errors during the tuning process in three different learning modes.
Detailed Description
The invention is explained in more detail below with reference to the drawings and the examples.
Fig. 2 shows the test and control system used in the embodiment. The upper part of the figure is a 3706A system switch, which selects the memristive unit to be operated on by opening and closing its internal circuits. The 2400 interface connects a Keithley 2400 SourceMeter to read the resistance of the memristive unit, and a function generator interface supplies the excitation voltage, so that the resistance of the memristive unit is updated by the pulse voltage produced by the function generator. The DUT in the figure is the memristive unit under control; only one unit is shown for illustration.
Fig. 3 shows an example memristive cell array. Each cross-point is a memristive unit whose two terminals are wired to the control circuitry and to the system switch; a row of five memristive units serves as one analog electronic synapse, and a write voltage V_write and a read voltage V_read are applied under program control. The five memristor units inside the dashed box form the neural network synapse structure of this embodiment.
The specific weight representation and update details are shown in Fig. 4. A computer running a BP neural network on the Iris dataset determines that the weight representation requires a total of five digits of precision, the integer digit included, so a cascade of five memristive units is needed to represent one synaptic weight.
After all external control units are connected to the memristor array and every unit has been initialized, the program run can start. The resistance values of all memristor units are read in sequence by the meter, returned to the computer, and recorded as R_1, R_2, ..., R_5. They are then preprocessed according to the formula w_i = (R_i - R_mid)/(R_max - R_min), which maps all resistance values into the range [-1, 1]. The weights are then combined according to:
W = w_1·1 + w_2·0.1 + w_3·0.01 + w_4·0.001 + w_5·0.0001
which gives the actual weight of the whole synapse.
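Under the same assumptions as the preprocessing sketch above, the embodiment's read-and-combine step for one five-cell synapse could look like this (preprocess() is the hypothetical helper defined in the earlier sketch; the instrument read itself is not shown):

```python
def synapse_weight(resistances, r_min, r_max):
    """Five-cell embodiment: w1 carries the integer digit and w2..w5 the four
    decimal digits, so W = w1*1 + w2*0.1 + w3*0.01 + w4*0.001 + w5*0.0001."""
    w = preprocess(resistances, r_min, r_max)              # values in [-1, 1]
    return sum(wi * 10 ** (-i) for i, wi in enumerate(w))  # weighted combination
```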
For the subsequent weight updates, this embodiment adopts the step-by-step adjustment method. During the program run, when the neural network error computed by the computer requires the weight of a certain synapse to be adjusted during back-propagation, the program-controlled instrument applies rectangular voltage pulses of a specified number and pulse width to the corresponding synapse units; under these adjusting pulses, carrier migration occurs inside each memristive unit, so the conductance of the unit increases or decreases.
In this example, when the program judges the current error to be greater than 10 times err_good (err_good being the error threshold set in the program for ending the run; the learning process stops once the current error falls below it), the weight is still far from the target weight to be learned, so only the head position, which has the largest influence on the weight, is adjusted. After a number of learning cycles, once the current error drops below 10 times err_good, every memristor unit is adjusted simultaneously, and this fine adjustment continues until the err_good requirement is met.
The specific operation is described in detail using a group of five memristor units as an example of a cascade memristor unit group. In this example, according to the accuracy requirements of the neural network, the weight is chosen to have one integer digit and four decimal digits, so a group of five memristor units is used as the cascade memristor unit group.
In the initial state, the resistance values of all units are reset to the middle of the resistance variation range, and the weight of a synapse unit is assumed to be
W = 0.4728·1 - 0.231·0.1 + 0.80·0.01 + 0.2·0.001 - 0·0.0001 = 0.4579
It can be seen that in this representation of the synaptic weight the value of the first memristor unit plays the main role, while the later units allow the weight to be tuned finely enough to meet the accuracy requirement. Suppose that after the first loop the weight needs to be adjusted to W = 1.0482. A specific number of voltage pulses are applied to the first memristor unit under program control while the later units are left unchanged. After several rounds of learning the weight changes become gradually smaller, i.e. the learning process is nearly complete; at this point pulses are applied to all units under program control, the error gradually decreases into the allowed range, the learning process ends, and the weight update is complete.
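The following toy simulation replays this worked example (start near W = 0.4579, learn W = 1.0482). The device model, pulse step sizes, noise levels and the use of the per-weight error in place of the network error are all illustrative assumptions, not the patent's measured behaviour.

```python
import random

def adjust_synapse(w, w_target, err_good=0.01, k=10, max_cycles=10_000):
    """Step-by-step adjustment of one cascaded synapse.
    w holds the normalized cell values [w1..w5]; pulse() is a toy stand-in
    for the device response (nominal step plus noise, clipped to [-1, 1])."""
    def pulse(i, polarity, step, noise):
        w[i] = min(max(w[i] + polarity * step + random.gauss(0.0, noise), -1.0), 1.0)

    def W():
        return sum(wi * 10 ** (-i) for i, wi in enumerate(w))

    for _ in range(max_cycles):
        err = W() - w_target
        if abs(err) < err_good:
            break                                        # learning finished
        if abs(err) > k * err_good:                      # coarse phase:
            pulse(0, -1 if err > 0 else +1, 0.05, 0.01)  #   head cell only
        else:                                            # fine phase:
            for i in range(len(w)):                      #   small pulses to all cells
                pulse(i, -1 if err > 0 else +1, 0.001, 0.0002)
    return W()

w0 = [0.4728, -0.231, 0.80, 0.2, 0.0]   # initial state, W is about 0.4579
print(adjust_synapse(w0, 1.0482))       # ends within err_good of 1.0482
```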
Fig. 5 shows the results of computer simulations of the above processing method: the change of accuracy under different error levels in the full-adjustment, step-by-step adjustment and non-cascaded states, respectively. The abscissa is the error added to the simulated synaptic weight changes, i.e. the injected random error; the ordinate is the accuracy at the end of the program run. It can be seen that full adjustment tolerates the largest order of magnitude of error, while step-by-step adjustment is less tolerant of error but can achieve higher accuracy than full adjustment.

Claims (2)

1. A neural network synapse structure based on memristor units, comprising:
the cascade memristor unit group is formed by cascading n memristor units, wherein n is determined by the accuracy of synaptic weights required by the normal operation of the neural network; the resistance value of each memristor unit is preprocessed according to the following formula:
w_i = (R_i - R_mid) / (R_max - R_min)
wherein i denotes the i-th cascaded memristor unit in the cascade memristor unit group, R_i is the real-time resistance value of the i-th cascaded memristor unit, R_mid is the median of the resistance variation range of each unit in the group, R_min and R_max are the minimum and maximum values of that range, and w_i is the value obtained by mapping the resistance of the i-th cascaded memristor unit; all memristor units are identical, so R_mid, R_min and R_max are constants;
and organizing the numerical values obtained by the preprocessing in an algorithm level according to the following formula:
W = round(w_X, Y)·10^(X-1) + round(w_(X-1), Y)·10^(X-2) + … + round(w_i, Y)·10^(i-1) + … + round(w_1, Y)·10^0 + round(w_(-1), Y-1)·10^(-1) + round(w_(-2), Y-2)·10^(-2) + … + round(w_(-i), Y-i)·10^(-i) + … + round(w_(-Y), 0)·10^(-Y)
or
W = round(w_X, 0)·10^(X-1) + round(w_(X-1), 0)·10^(X-2) + … + round(w_i, 0)·10^(i-1) + … + round(w_1, 0)·10^0 + round(w_(-1), 0)·10^(-1) + round(w_(-2), 0)·10^(-2) + … + round(w_(-i), 0)·10^(-i) + … + round(w_(-Y), 0)·10^(-Y)
wherein W is the weight of the cascade memristor unit group, X and Y are respectively the numbers of units representing the integer part and the decimal part of the weight in the group, Y is also the precision of the represented weight, and round(w_i, Y-i) denotes rounding w_i so that Y-i decimal places are retained.
2. The synaptic structure of claim 1, wherein the synaptic weights of said neural network are updated by:
the first is full adjustment: all cascaded memristive units participate in the adjustment during weight updates, and the resistance values of all cascaded memristive units are changed by applying electrical pulses;
the second is step-by-step adjustment: which memristor units need to be adjusted is judged from how the error changes during the weight update;
specifically, only the higher-significance positions are adjusted in the early stage of the program run, while in the later stage electrical pulses are applied to all units for fine adjustment so that the expected weight is reached as soon as possible; when the error is more than 5-20 times err_good the run is considered to be in the early stage, otherwise in the later stage, where err_good is the error threshold set in the program for terminating the run, and the learning process ends when the current error is less than err_good.
CN201811245921.9A 2018-10-24 2018-10-24 Neural network synaptic structure based on memristor unit and adjusting method thereof Active CN109359734B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811245921.9A CN109359734B (en) 2018-10-24 2018-10-24 Neural network synaptic structure based on memristor unit and adjusting method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811245921.9A CN109359734B (en) 2018-10-24 2018-10-24 Neural network synaptic structure based on memristor unit and adjusting method thereof

Publications (2)

Publication Number Publication Date
CN109359734A CN109359734A (en) 2019-02-19
CN109359734B true CN109359734B (en) 2021-10-26

Family

ID=65346390

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811245921.9A Active CN109359734B (en) 2018-10-24 2018-10-24 Neural network synaptic structure based on memristor unit and adjusting method thereof

Country Status (1)

Country Link
CN (1) CN109359734B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110364232B (en) * 2019-07-08 2021-06-11 河海大学 High-performance concrete strength prediction method based on memristor-gradient descent method neural network
WO2023059265A1 (en) * 2021-10-08 2023-04-13 Agency For Science, Technology And Research Neural processing core, method of programming a synaptic memory array thereof and method of performing neural network inference thereon


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9798751B2 (en) * 2013-10-16 2017-10-24 University Of Tennessee Research Foundation Method and apparatus for constructing a neuroscience-inspired artificial neural network

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106650922A (en) * 2016-09-29 2017-05-10 清华大学 Hardware neural network conversion method, computing device, compiling method and neural network software and hardware collaboration system
CN107578014A (en) * 2017-09-06 2018-01-12 上海寒武纪信息科技有限公司 Information processor and method
CN108009640A (en) * 2017-12-25 2018-05-08 清华大学 The training device and its training method of neutral net based on memristor

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Exploiting Memristive BiFeO3 Bilayer Structures for Compact Sequential Logics; Tiangui You et al.; Advanced Functional Materials; 2014-02-24; Vol. 24, No. 22; 3357-3365 *
Research on coupling behavior and synaptic circuits based on memristors; Wang Yan; China Master's Theses Full-text Database, Information Science and Technology; 2018-02-15 (No. 02); I135-658 *
Research on cellular neural networks based on memory elements; Jia Zhenzhen; China Master's Theses Full-text Database, Information Science and Technology; 2018-01-15 (No. 01); I140-180 *

Also Published As

Publication number Publication date
CN109359734A (en) 2019-02-19

Similar Documents

Publication Publication Date Title
Yang et al. Research progress on memristor: From synapses to computing systems
CN111279366B (en) Training of artificial neural networks
Xia et al. Fault-tolerant training with on-line fault detection for RRAM-based neural computing systems
US10902317B2 (en) Neural network processing system
US9934463B2 (en) Neuromorphic computational system(s) using resistive synaptic devices
US20150170025A1 (en) Method and apparatus for performing close-loop programming of resistive memory devices in crossbar array based hardware circuits and systems
CN110852429B (en) 1T 1R-based convolutional neural network circuit and operation method thereof
JP6293963B1 (en) Array control device including neuromorphic element, discretization step size calculation method and program
Milo et al. Optimized programming algorithms for multilevel RRAM in hardware neural networks
Fu et al. Mitigating nonlinear effect of memristive synaptic device for neuromorphic computing
WO2023000586A1 (en) Storage and computation integrated apparatus and calibration method therefor
CN109359734B (en) Neural network synaptic structure based on memristor unit and adjusting method thereof
Ma et al. Go unary: A novel synapse coding and mapping scheme for reliable ReRAM-based neuromorphic computing
Lepri et al. Modeling and compensation of IR drop in crosspoint accelerators of neural networks
US11321608B2 (en) Synapse memory cell driver
Sun et al. Unary coding and variation-aware optimal mapping scheme for reliable ReRAM-based neuromorphic computing
CN112199234A (en) Neural network fault tolerance method based on memristor
Antolini et al. Combined HW/SW drift and variability mitigation for PCM-based analog in-memory computing for neural network applications
Zhang et al. An efficient programming framework for memristor-based neuromorphic computing
Sun et al. Quaternary synapses network for memristor-based spiking convolutional neural networks
CN117151176A (en) Synaptic array, operation circuit and operation method for neural network learning
CN115796252A (en) Weight writing method and device, electronic equipment and storage medium
CN114186667A (en) Method for mapping recurrent neural network weight matrix to memristor array
Zhang et al. A pulse-width modulation neuron with continuous activation for processing-in-memory engines
Lin et al. A High-Speed and High-Efficiency Diverse Error Margin Write-Verify Scheme for an RRAM-Based Neuromorphic Hardware Accelerator

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant