CN109359734B - Neural network synaptic structure based on memristor unit and adjusting method thereof - Google Patents


Info

Publication number
CN109359734B
CN109359734B (application CN201811245921.9A)
Authority
CN
China
Prior art keywords
round
memristor
unit
cascaded
neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811245921.9A
Other languages
Chinese (zh)
Other versions
CN109359734A (en)
Inventor
帅垚
乔石珺
吴传贵
罗文博
王韬
张万里
彭赟
潘忻强
梁翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China
Priority to CN201811245921.9A
Publication of CN109359734A
Application granted
Publication of CN109359734B
Active legal status (Current)
Anticipated expiration legal status

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/06Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N3/063Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Theoretical Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Neurology (AREA)
  • Feedback Control In General (AREA)

Abstract

The invention relates to the field of computer and electronic information technology, and in particular to a neural network synapse structure based on memristor units and a method for adjusting it. Multiple memristor units are cascaded together to form one electronic synapse (a cascaded memristor unit group); a weighted mapping rule maps the group's resistance values to a synaptic weight in the neural network, and either full adjustment or step-by-step adjustment is used to avoid the impact of the memristors' imprecise tuning on the learning process, so that various neural networks can run correctly on memristor-based systems. The invention solves the problem of mapping resistance values to synaptic weights and the problem of uncertainty in the resistance change of a memristive unit.

Description

Neural network synaptic structure based on memristor unit and adjusting method thereof
Technical Field
The invention relates to the technical field of computers and electronic information, in particular to a neural network synaptic structure based on memristor units and a regulating method thereof.
Background
Synapses are the intermediate structures connecting different neurons in a neural network; their weights are continuously updated by the corresponding neural network algorithm, which forms the basis of the network's information processing. The memristor, a new component with a tunable resistance, is regarded as the fourth basic circuit element alongside the resistor, the inductor, and the capacitor.
Researchers have found that the way synaptic connection strength changes closely resembles the electrical behavior of a memristor (whose conductance is regulated by an applied electrical signal), so a single memristor can emulate the function of a synapse. Compared with running a neural network program on an electronic computer (i.e., a conventional CMOS transistor circuit), memristor units can greatly reduce energy consumption, increase read/write speed, and reduce the complexity of the integrated circuit. However, memristors also have inherent drawbacks. As analog devices, their resistance changes nonlinearly and continuously under voltage control and cannot be set exactly to a desired weight value as in a digital device. At present most researchers map memristor resistance values to neural network weights one-to-one; a neural network based on this mapping cannot adjust its weights precisely. How to effectively map the resistance values of the memristors in a memristor circuit to the corresponding synaptic weights in the neural network matrix, so as to obtain the best trade-off between computational accuracy and power consumption, is therefore a critical problem.
Existing synapse structures based on memristive devices in the prior art mainly include the following. Prezioso, M. et al. [1] use the difference between two memristive cells to define a weight in the neural network; during adjustment, the excitation voltage is applied only to the first memristive cell while the resistance of the second cell is kept unchanged, so that all synaptic weights share the same reference value. The weights are updated with the Manhattan update rule, namely
Δw_ij = sgn(Δ_ij), where Δ_ij is the weight update computed by the learning algorithm and only its sign is retained,
from which the sign (positive or negative) and amplitude of the excitation voltage to be applied to each memristive cell that needs adjustment are obtained. Ligang Gao et al. [2] instead use program-controlled excitation voltages (with specified pulse width and amplitude) combined with a program-added write-verify step, repeated until the resistance of the memristor is close or equal to the target value, so that each weight update reaches the desired value and the neural network can operate normally.
Representing a synaptic weight by the difference of two memristive cells and updating it with the Manhattan rule does not fundamentally solve the inaccuracy of synaptic weight adjustment, so only simple image recognition can be achieved and more complex neural network functions cannot be realized. Adding the write-verify step increases the complexity of program control and turns what was originally a fast weight adjustment into a time-consuming repeated verification process.
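For reference, the write-verify scheme discussed above can be sketched as follows. This is a minimal illustration only, not code from the cited work; read_resistance and apply_pulse are hypothetical stand-ins for the program-controlled instrument operations, and the pulse-polarity convention is an assumption.

def write_verify(read_resistance, apply_pulse, target_R, tolerance, max_steps=1000):
    # Pulse a memristive cell and read it back until its resistance is within
    # `tolerance` of `target_R` (the repeated verification loop described above).
    for _ in range(max_steps):
        R = read_resistance()
        if abs(R - target_R) <= tolerance:
            return R                                  # target reached
        # Polarity chosen from the sign of the remaining error (device-dependent convention).
        apply_pulse(increase_resistance=(R < target_R))
    return read_resistance()                          # give up after max_steps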
Disclosure of Invention
To address the above problems and deficiencies, the invention provides a neural network synapse structure based on memristor units and an adjustment method thereof, which solve the problem of mapping resistance values to synaptic weights and the problem of uncertainty in the resistance change of a memristor unit under an excitation voltage.
A neural network synaptic structure based on memristor units is a cascade memristor unit group formed by cascading n memristor units.
The number n of memristor units is determined by the precision of the synaptic weights required for the neural network to operate normally. At the physical level the units are organized in a cascade connection, and the resistance of each memristor unit is preprocessed according to the following formula:
w_i = (R_i - R_mid) / (R_max - R_min)
where i denotes the i-th cascaded memristor unit in the cascaded memristor unit group, R_i is the real-time resistance of the i-th unit, R_mid is the midpoint of the resistance variation range of each unit, R_min is the minimum of the resistance variation range of each unit (R_max being the maximum), and w_i is the value obtained by mapping the resistance of the i-th cascaded unit. All memristor units are identical, so R_mid and R_min are constants.
The values obtained by this preprocessing are organized at the algorithm level according to the following formula:
W = round(w_X, Y)*10^(X-1) + round(w_(X-1), Y)*10^(X-2) + ... + round(w_i, Y)*10^(i-1) + ... + round(w_1, Y)*10^0 + round(w_-1, Y-1)*10^(-1) + round(w_-2, Y-2)*10^(-2) + ... + round(w_-i, Y-i)*10^(-i) + ... + round(w_-Y, 0)*10^(-Y)
or
W = round(w_X, 0)*10^(X-1) + round(w_(X-1), 0)*10^(X-2) + ... + round(w_i, 0)*10^(i-1) + ... + round(w_1, 0)*10^0 + round(w_-1, 0)*10^(-1) + round(w_-2, 0)*10^(-2) + ... + round(w_-i, 0)*10^(-i) + ... + round(w_-Y, 0)*10^(-Y)
where W is the weight of the cascaded memristor unit group (i.e., of the neural network electronic synapse), X and Y are respectively the numbers of units representing the integer part and the decimal part of the weight, Y is also the precision of the represented weight, and round(w_i, Y-i) denotes rounding w_i to retain Y-i decimal places.
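As an illustration of this mapping, the following Python sketch (not part of the patent text; the function names, the assumed ordering of cells from most- to least-significant digit, and the choice of the first, digit-wise-rounded weighting rule are assumptions made here for clarity) combines the resistances of one cascaded group into a single weight W:

def preprocess(R, R_min, R_max):
    # Normalize one cell's resistance: w = (R - R_mid) / (R_max - R_min)
    R_mid = (R_min + R_max) / 2.0
    return (R - R_mid) / (R_max - R_min)

def group_weight(resistances, R_min, R_max, X, Y):
    # Combine n = X + Y cascaded cells into one synaptic weight W.
    # The first X cells carry the integer digits (place values 10^(X-1) .. 10^0),
    # the remaining Y cells the decimal digits (place values 10^-1 .. 10^-Y).
    assert len(resistances) == X + Y
    w = [preprocess(R, R_min, R_max) for R in resistances]
    W = 0.0
    for k in range(X):                      # integer-part cells w_X .. w_1, rounded to Y decimals
        W += round(w[k], Y) * 10 ** (X - 1 - k)
    for i in range(1, Y + 1):               # decimal-part cell w_-i, rounded to Y - i decimals
        W += round(w[X + i - 1], Y - i) * 10 ** (-i)
    return W

With X = 1 and Y = 4 this uses five cells with place values 1, 0.1, 0.01, 0.001 and 0.0001, which matches the five-cell embodiment described below.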
The synaptic weights of simulated synapses organized in this way are updated using two adjustment modes.
The first is full adjustment: during a weight update all cascaded memristive cells participate in the adjustment, i.e., the resistances of all cascaded cells are changed by applying electrical pulses.
The second is step-by-step adjustment: the memristor units that need adjusting are selected according to how the error changes during the weight update. Specifically, in the early stage of a program run the weight changes are large, so only the units representing the higher place values in the representation above are adjusted; in the later stage, each weight update is already small, i.e., the learning process is nearly complete, and electrical pulses are again applied to all units for fine adjustment so that the desired weight is reached as quickly as possible. When the error is more than 5-20 times err_goal the run is considered to be in the early stage, otherwise in the later stage; err_goal is the error threshold set in the program for terminating the run, and the learning process ends when the current error falls below this value.
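A minimal sketch of this mode-selection rule follows (illustrative only; the helper name, the choice of a single head cell in the early stage, and the factor k = 10 used later in the embodiment are assumptions made here for concreteness):

def cells_to_adjust(current_error, err_goal, n_cells, k=10):
    # Step-by-step adjustment: pick which cascaded cells receive pulses in this update.
    # Early stage (error >  k * err_goal): adjust only the most significant cell.
    # Later stage (error <= k * err_goal): adjust all cells for fine tuning.
    # k is the 5-20x factor from the description (10 in the embodiment).
    if current_error > k * err_goal:
        return [0]                       # head cell only (highest place value)
    return list(range(n_cells))          # full adjustment of every cascaded cell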
In this method, multiple memristor units are cascaded together to serve as one electronic synapse (i.e., a cascaded memristor unit group), a weighted mapping rule maps their resistance values to a synaptic weight in the neural network, and full adjustment or step-by-step adjustment is used to avoid the impact on the neural network of the memristors' inability to be adjusted precisely during learning, so that various neural networks can run normally on a memristor-based system.
In summary, the invention solves the problem of mapping the resistance value to the synaptic weight and the problem of uncertainty of change of the resistance value of the memristive cell.
Drawings
FIG. 1 is a diagram of synapse structure of a neural network based on memristive devices in the prior art;
FIG. 2 is a test and control system used in an embodiment;
FIG. 3 is a diagram of an embodiment of a neural network synapse structure based on five memristive cells;
FIG. 4 is a flow chart of the modulation principle of the synapse structure of the invention;
FIG. 5 shows the error tolerance of the neural network of the embodiment during tuning under three different learning modes.
Detailed Description
The invention is explained in more detail below with reference to the drawings and the embodiments.
Fig. 2 shows the test and control system used in the embodiment. The upper part of the figure is a 3706A system control switch, which selects the memristive cell to be operated on by connecting and disconnecting internal circuit paths. The 2400 interface connects a Keithley 2400 SourceMeter digital multimeter that reads the resistance of the memristive unit; a function generator interface is also connected to generate the excitation voltage, so that the resistance of the memristive unit is updated by pulse voltages from the function generator. The DUT in the figure is the memristive cell under control; only one cell is shown for illustration.
FIG. 3 shows an example of a memristive cell array, where each cross-point is a memristive cell. The two ends of each cell are connected to its control circuitry and to the system control switch, and a row of five memristive cells is used as one analog electronic synapse, to which a write voltage V_write and a read voltage V_read are applied under program control. The five memristor units in the dashed box form the neural network synapse structure of this embodiment.
For the specific weight representation and update details, refer to Fig. 4. A computer running a BP neural network on the Iris dataset requires a precision of five digits in total (including the integer digit), so five cascaded memristive cells are needed to represent one synaptic weight.
After all external control units are connected to the memristor array and every unit has been initialized, the program can be started. The resistances of the memristor units, read in sequence by the digital multimeter and returned to the computer, are denoted R_1, R_2, ..., R_5, and are then preprocessed according to the formula w_i = (R_i - R_mid)/(R_max - R_min), which brings all resistance values into the range [-1, 1]. Applying the weighting scheme
W = w_1·1 + w_2·0.1 + w_3·0.01 + w_4·0.001 + w_5·0.0001
gives the actual weight of the whole synapse.
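A short numerical sketch of this read-out and weighting step is given below; the resistance range and the five example readings are invented for illustration (they are not given in the text), and only the arithmetic follows the description.

R_min, R_max = 1.0e3, 1.0e5          # assumed resistance range of one cell (not stated in the text)
R_mid = (R_min + R_max) / 2.0

# Example readings R1..R5 returned by the digital multimeter (illustrative values only)
R = [7.5e4, 4.0e4, 9.0e4, 5.5e4, 5.05e4]

w = [(Ri - R_mid) / (R_max - R_min) for Ri in R]                     # preprocessing step
W = sum(wi * p for wi, p in zip(w, [1, 0.1, 0.01, 0.001, 0.0001]))   # weighted combination
print(W)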
In the subsequent weight update process, this embodiment uses the step-by-step adjustment method. During program operation, when the error computed by the computer requires the weight of a certain synapse to be adjusted during back-propagation, the program-controlled instrument applies a specified number of rectangular voltage pulses of specified pulse width to the corresponding synapse units; under this excitation, carrier migration occurs in each memristive unit, increasing or decreasing its conductance.
In this example, when the program judges the current error to be greater than 10 times err_goal (err_goal is the error threshold set in the program for terminating the run; the learning process ends when the current error falls below it), the weight is still far from the target weight to be learned, so only the head unit, which has the largest influence on the weight, is adjusted. After a number of learning cycles, once the current error falls below 10 times err_goal, all memristor units are adjusted simultaneously for fine tuning until the err_goal requirement is met.
The specific operation is described in detail using a group of five memristor units as an example of a cascaded memristor unit group: here, based on the accuracy requirement of the neural network, W is represented with one integer digit and four decimal digits, so a set of five memristor cells is used as one cascaded memristor cell group.
In the initial state, the resistances of all units are reset to the middle of their variation range. Suppose the weight of a synapse unit is
W = 0.4728·1 - 0.231·0.1 + 0.80·0.01 + 0.2·0.001 - 0·0.0001 = 0.4579
It can be seen that in this representation of the synaptic weight, the value of the first memristor unit plays the dominant role, while the later units fine-tune the weight to meet the accuracy requirement. Suppose the weight needs to be adjusted to W = 1.0482 after the first loop: a specific number of voltage pulses are applied to the first memristor cell under program control while the later cells are kept unchanged. After several such adjustments, the change in the weight gradually slows down, meaning the learning process is nearly complete; at this point pulses are applied to all cells under program control, the error gradually decreases into the allowable range, the learning process ends, and the weight update is complete.
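Putting the pieces together, the embodiment's update flow might be sketched as follows. This is a hypothetical skeleton only: compute_error_and_updates and apply_pulses stand in for the BP error computation and the program-controlled pulse application, and none of these names come from the patent.

def run_learning(synapses, compute_error_and_updates, apply_pulses,
                 err_goal, n_cells=5, k=10, max_cycles=100000):
    # Step-by-step training loop: coarse-tune the head cell while the error is large,
    # then fine-tune all cells until the error drops below err_goal.
    error = float("inf")
    for _ in range(max_cycles):
        error, deltas = compute_error_and_updates()   # network error and per-synapse desired changes
        if error < err_goal:
            break                                     # learning finished
        coarse = error > k * err_goal                 # early stage vs. later stage
        for syn, delta in zip(synapses, deltas):
            cells = [0] if coarse else list(range(n_cells))
            apply_pulses(syn, cells, delta)           # pulses change the selected cells' resistance
    return error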
FIG. 5 shows the results of an actual computer simulation of the above processing method: the change in accuracy under different error levels in the full-adjustment, step-by-step-adjustment, and non-cascaded cases, respectively. The abscissa is the error applied to the synaptic weight changes in the simulation, i.e., the added random error. The vertical axis is the accuracy at the end of the program run. It can be seen that full adjustment tolerates the largest order of magnitude of error, while step-by-step adjustment is less tolerant of error but can achieve higher accuracy than full adjustment.

Claims (2)

1. A neural network synapse structure based on memristor units, characterized in that:
it is a cascaded memristor unit group formed by cascading n memristor units, where n is determined by the precision of the synaptic weights required for the normal operation of the neural network; the resistance of each memristive unit is preprocessed according to the following formula:

w_i = (R_i - R_mid) / (R_max - R_min)

where i denotes the i-th cascaded memristor unit in the cascaded memristor unit group, R_i is the real-time resistance of the i-th cascaded memristor unit, R_mid is the midpoint of the resistance variation range of each unit in the group, R_min is the minimum of the resistance variation range of each unit in the group, and w_i is the value obtained by mapping the resistance of the i-th cascaded memristor unit; all memristor units are identical, and R_mid and R_min are constants;

the values obtained by the above preprocessing are organized at the algorithm level according to the following formula:

W = round(w_X, Y)*10^(X-1) + round(w_(X-1), Y)*10^(X-2) + ... + round(w_i, Y)*10^(i-1) + ... + round(w_1, Y)*10^0 + round(w_-1, Y-1)*10^(-1) + round(w_-2, Y-2)*10^(-2) + ... + round(w_-i, Y-i)*10^(-i) + ... + round(w_-Y, 0)*10^(-Y)

or

W = round(w_X, 0)*10^(X-1) + round(w_(X-1), 0)*10^(X-2) + ... + round(w_i, 0)*10^(i-1) + ... + round(w_1, 0)*10^0 + round(w_-1, 0)*10^(-1) + round(w_-2, 0)*10^(-2) + ... + round(w_-i, 0)*10^(-i) + ... + round(w_-Y, 0)*10^(-Y)

where W is the weight of the cascaded memristor unit group, X and Y are respectively the numbers of units in the group representing the integer part and the decimal part of the weight, Y is also the precision of the represented weight, and round(w_i, Y-i) denotes rounding w_i to retain Y-i decimal places.

2. The neural network synapse structure based on memristor units according to claim 1, wherein its synaptic weight is updated as follows:
the first mode is full adjustment, in which all cascaded memristive units participate in the adjustment during a weight update, and the resistance of each cascaded memristive unit is changed by applying electrical pulses;
the second mode is step-by-step adjustment, in which the memristive units that need adjusting are determined according to how the error changes during the weight update;
specifically, only the units representing higher place values are adjusted in the early stage of a program run, and only in the later stage are electrical pulses applied to all units for fine adjustment so as to reach the desired weight as quickly as possible; when the error is greater than 5-20 times err_goal the run is considered to be in the early stage, otherwise in the later stage; err_goal is the error threshold set in the program for terminating the run, and the learning process ends when the current error is less than this err_goal value.
CN201811245921.9A 2018-10-24 2018-10-24 Neural network synaptic structure based on memristor unit and adjusting method thereof Active CN109359734B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811245921.9A CN109359734B (en) 2018-10-24 2018-10-24 Neural network synaptic structure based on memristor unit and adjusting method thereof


Publications (2)

Publication Number Publication Date
CN109359734A CN109359734A (en) 2019-02-19
CN109359734B (en) 2021-10-26

Family

ID=65346390

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811245921.9A Active CN109359734B (en) 2018-10-24 2018-10-24 Neural network synaptic structure based on memristor unit and adjusting method thereof

Country Status (1)

Country Link
CN (1) CN109359734B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110364232B (en) * 2019-07-08 2021-06-11 河海大学 High-performance concrete strength prediction method based on memristor-gradient descent method neural network
WO2023059265A1 (en) * 2021-10-08 2023-04-13 Agency For Science, Technology And Research Neural processing core, method of programming a synaptic memory array thereof and method of performing neural network inference thereon
CN118428429B (en) * 2024-07-05 2024-09-13 中国人民解放军国防科技大学 Memristive synapse, memristive crossover array circuit and conductance updating method


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10095718B2 (en) * 2013-10-16 2018-10-09 University Of Tennessee Research Foundation Method and apparatus for constructing a dynamic adaptive neural network array (DANNA)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106650922A (en) * 2016-09-29 2017-05-10 清华大学 Hardware neural network conversion method, computing device, compiling method and neural network software and hardware collaboration system
CN107578014A (en) * 2017-09-06 2018-01-12 上海寒武纪信息科技有限公司 Information processor and method
CN108009640A (en) * 2017-12-25 2018-05-08 清华大学 The training device and its training method of neutral net based on memristor

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Exploiting Memristive BiFeO3 Bilayer Structures for Compact Sequential Logics; Tiangui You et al.; Advanced Functional Materials; 2014-02-24; Vol. 24, No. 22; 3357-3365 *
Research on Memristor-Based Coupling Behavior and Synaptic Circuits; Wang Yan; China Masters' Theses Full-text Database, Information Science and Technology; 2018-02-15 (No. 02); I135-658 *
Research on Cellular Neural Networks Based on Memory Elements; Jia Zhenzhen; China Masters' Theses Full-text Database, Information Science and Technology; 2018-01-15 (No. 01); I140-180 *

Also Published As

Publication number Publication date
CN109359734A (en) 2019-02-19

Similar Documents

Publication Publication Date Title
Yang et al. Research progress on memristor: From synapses to computing systems
CN111279366B (en) Training of artificial neural networks
Xia et al. Fault-tolerant training with on-line fault detection for RRAM-based neural computing systems
US10902317B2 (en) Neural network processing system
CN108009640B (en) Memristor-based neural network training device and training method
US9715655B2 (en) Method and apparatus for performing close-loop programming of resistive memory devices in crossbar array based hardware circuits and systems
US9934463B2 (en) Neuromorphic computational system(s) using resistive synaptic devices
US11620505B2 (en) Neuromorphic package devices and neuromorphic computing systems
CN110852429B (en) 1T 1R-based convolutional neural network circuit and operation method thereof
CN109359734B (en) Neural network synaptic structure based on memristor unit and adjusting method thereof
JP6293963B1 (en) Array control device including neuromorphic element, discretization step size calculation method and program
Milo et al. Optimized programming algorithms for multilevel RRAM in hardware neural networks
Fu et al. Mitigating nonlinear effect of memristive synaptic device for neuromorphic computing
KR20210050966A (en) Stacked neuromorphic devices and neuromorphic computing devices
Lepri et al. Modeling and compensation of IR drop in crosspoint accelerators of neural networks
Ma et al. Go unary: A novel synapse coding and mapping scheme for reliable ReRAM-based neuromorphic computing
US11321608B2 (en) Synapse memory cell driver
CN114186667A (en) A Mapping Method of Recurrent Neural Network Weight Matrix to Memristive Array
CN113971981A (en) Method and apparatus for operating memory cells and memory arrays
Sun et al. Quaternary synapses network for memristor-based spiking convolutional neural networks
CN115605026B (en) A passive memristor crossbar array device that can directly realize weight differentiation
Lin et al. A high-speed and high-efficiency diverse error margin write-verify scheme for an RRAM-based neuromorphic hardware accelerator
WO2020129204A1 (en) Neuromorphic circuit, neuromorphic array learning method and program
KR20200024419A (en) Neuromorphic device using 3d crossbar memory
US12229680B2 (en) Neural network accelerators resilient to conductance drift

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant