CN109472348A - LSTM neural network system based on a memristor crossbar array

LSTM neural network system based on a memristor crossbar array

Info

Publication number
CN109472348A
CN109472348A (application number CN201811236611.0A)
Authority
CN
China
Prior art keywords
memristor
voltage
crossed array
threshold value
feature extraction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811236611.0A
Other languages
Chinese (zh)
Other versions
CN109472348B (en)
Inventor
温世平
魏华强
黄廷文
曾志刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huazhong University of Science and Technology filed Critical Huazhong University of Science and Technology
Priority to CN201811236611.0A priority Critical patent/CN109472348B/en
Publication of CN109472348A publication Critical patent/CN109472348A/en
Application granted granted Critical
Publication of CN109472348B publication Critical patent/CN109472348B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology
    • G06N 3/045: Combinations of networks
    • G06N 3/06: Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N 3/063: Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons, using electronic means

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Neurology (AREA)
  • Semiconductor Memories (AREA)
  • Semiconductor Integrated Circuits (AREA)
  • Thin Film Transistor (AREA)

Abstract

The invention discloses an LSTM neural network system based on memristor crossbar arrays, comprising an input layer, a feature extraction layer and a classification output layer. The feature extraction layer comprises a data memory, a first memristor crossbar array, a first DA converter and an AD converter; the classification output layer comprises a second DA converter, a second memristor crossbar array and a voltage comparator. The first memristor crossbar array performs feature extraction; the second memristor crossbar array performs feature classification; the voltage comparator compares the resulting analog voltages, and the maximum value of the comparison result is taken as the classification result of the digital signal presented to the input layer. The system is small in size and low in power consumption.

Description

LSTM neural network system based on a memristor crossbar array
Technical field
The invention belongs to the field of neural networks, and more particularly relates to an LSTM neural network system based on memristor crossbar arrays.
Background art
The development of deep neural networks such as convolutional neural networks (CNN), fully convolutional networks (FCN) and long short-term memory (LSTM) networks has driven research on deep neural network hardware design. In this field, the large volume of input data (such as images or text), the complexity of the network structure and the large number of network parameters mean that deep neural networks built with conventional CMOS circuits suffer from excessive circuit size and high power consumption.
In 2008, researchers at Hewlett-Packard fabricated the first nanoscale memristor. A basic challenge for neuromorphic computing systems is the development of artificial synapses, and the memristor has proved to be an element well suited to the design of neuromorphic computing architectures. Owing to its nanoscale size, high density, low power consumption, non-volatility and compatibility with CMOS technology, the memristor shows broad application prospects in neural network hardware design.
However, because the LSTM neural network differs greatly from the CNN in structure and operation logic, hardware circuits based on convolutional neural networks are no longer applicable to the LSTM as a whole; at the same time, the prior art suffers from the technical problems of large size and high power consumption.
Summary of the invention
In view of the above defects or improvement needs of the prior art, the present invention provides an LSTM neural network system based on memristor crossbar arrays, thereby solving the technical problems of large size and high power consumption in the prior art.
To achieve the above object, the present invention provides an LSTM neural network system based on memristor crossbar arrays, comprising an input layer, a feature extraction layer and a classification output layer.
The feature extraction layer comprises a data memory, a first memristor crossbar array, a first DA converter and an AD converter. The data memory stores the digital signal of the input layer and the digital signal after feature extraction; the first DA converter converts the digital signal of the input layer into a first analog voltage; the first memristor crossbar array performs feature extraction on the first analog voltage to obtain a first current; the AD converter converts the first current into the digital signal after feature extraction.
The classification output layer comprises a second DA converter, a second memristor crossbar array and a voltage comparator. The second DA converter converts the digital signal after feature extraction into a second analog voltage; the second memristor crossbar array performs feature classification on the second analog voltage to obtain multiple analog voltages; the voltage comparator compares the multiple analog voltages to obtain a comparison result, and the maximum value of the comparison result is taken as the classification result of the digital signal of the input layer.
Further, the first memristor crossbar array comprises voltage input ports, threshold memristors, voltage inverters, operational amplifiers and multipliers.
Of any two adjacent voltage input ports, one is connected to its threshold memristor through a voltage inverter and the other is connected to its threshold memristor directly; an operational amplifier is connected in parallel with a threshold memristor; one end of a voltage inverter is connected to the output of the operational amplifier and the other end is connected to the input of a multiplier.
Further, an operational amplifier connected in parallel with a threshold memristor realizes the calculation of the Sigmoid activation function and converts the current signal into a voltage signal.
Further, an operational amplifier connected in parallel with a threshold memristor realizes the calculation of the tanh activation function and converts the current signal into a voltage signal.
Further, the second memristor crossbar array comprises voltage input ports, threshold memristors and voltage inverters; of any two adjacent voltage input ports, one is connected to its threshold memristor through a voltage inverter and the other is connected to its threshold memristor directly.
Further, in order to avoid short-circuiting the threshold memristors, the cross points of the first memristor crossbar array are not electrically connected, and the cross points of the second memristor crossbar array are not electrically connected.
In general, compared with the prior art, the above technical solution conceived by the present invention achieves the following beneficial effects:
(1) Owing to the nanoscale size of the memristor and the high density of the crossbar array, the hardware circuit of the invention is small in size, easy to integrate and easy to popularize on a large scale in deep neural network hardware design. In addition, the low power consumption of the memristor greatly reduces the power consumption of the whole system compared with CMOS-based implementations.
(2) A single threshold memristor can usually express only a positive weight. In the present invention, of any two adjacent voltage input ports, one is connected to the threshold memristor through a voltage inverter and the other is connected to the threshold memristor directly, so that two adjacent threshold memristors designed in this way can together express one weight that may be positive or negative.
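To make the signed-weight scheme of effect (2) concrete, the following is a minimal behavioral sketch in Python/NumPy; it is not part of the patent disclosure. It assumes each signed weight is mapped onto a pair of threshold-memristor conductances (G_pos, G_neg) and that the voltage inverter simply feeds the negated input voltage to the second device, so the summed column current is V*G_pos - V*G_neg. All names and the conductance range are illustrative assumptions.

```python
import numpy as np

G_MIN, G_MAX = 1e-6, 1e-4  # assumed programmable conductance range of a threshold memristor (S)

def weights_to_conductance_pairs(W):
    """Map a signed weight matrix onto two non-negative conductance matrices.

    Positive weights go to G_pos, negative weights to G_neg; each device
    conductance is kept inside the assumed programmable range."""
    scale = (G_MAX - G_MIN) / np.max(np.abs(W))
    G_pos = np.clip(np.maximum(W, 0.0) * scale + G_MIN, G_MIN, G_MAX)
    G_neg = np.clip(np.maximum(-W, 0.0) * scale + G_MIN, G_MIN, G_MAX)
    return G_pos, G_neg, scale

def crossbar_column_currents(V, G_pos, G_neg):
    """Column currents of the paired-column crossbar.

    The direct port applies +V to G_pos and the inverter applies -V to G_neg,
    so the summed column current represents a signed weighted sum."""
    return V @ G_pos + (-V) @ G_neg

# usage example
W = np.array([[0.5, -0.3], [-0.8, 0.2]])       # signed weights
V = np.array([0.7, -0.2])                      # input voltages (V)
G_pos, G_neg, scale = weights_to_conductance_pairs(W)
I = crossbar_column_currents(V, G_pos, G_neg)  # column currents (A)
print(I / scale)  # recovers V @ W; the G_MIN offsets cancel between the paired columns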
Brief description of the drawings
Fig. 1 is a schematic diagram of a long short-term memory neural network provided by an embodiment of the invention;
Fig. 2 is a structural diagram of an LSTM neural network system based on memristor crossbar arrays provided by an embodiment of the invention;
Fig. 3 is a schematic diagram of a feature extraction layer unit (LSTM cell) of the long short-term memory neural network provided by an embodiment of the invention;
Fig. 4 is a circuit diagram of the first memristor crossbar array provided by an embodiment of the invention;
Fig. 5 is a circuit diagram of the second memristor crossbar array provided by an embodiment of the invention.
Detailed description of the embodiments
To make the objectives, technical solutions and advantages of the present invention clearer, the invention is further described below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the invention and not to limit it. In addition, the technical features involved in the various embodiments described below can be combined with one another as long as they do not conflict.
As shown in Fig. 1, the long short-term memory (LSTM) neural network comprises an input layer, a feature extraction layer and a classification output layer. As shown in Fig. 2, an LSTM neural network system based on memristor crossbar arrays comprises an input layer, a feature extraction layer and a classification output layer. The feature extraction layer comprises a data memory, a first switch, a first read circuit, a write circuit, a first memristor crossbar array, a first DA converter and an AD converter; the classification output layer comprises a second switch, a second read circuit, a second DA converter, a second memristor crossbar array and a voltage comparator. When the first switch is closed and the second switch is open, the system performs feature extraction; when the first switch is open and the second switch is closed, the system performs classification output.
The data memory stores the digital signal of the input layer and the digital signal after feature extraction.
The first read circuit reads the digital signal of the input layer from the data memory and transmits it to the first DA converter.
The first DA converter converts the digital signal of the input layer into a first analog voltage.
The first memristor crossbar array performs feature extraction on the first analog voltage to obtain a first current.
The AD converter converts the first current into the digital signal after feature extraction.
The write circuit writes the digital signal after feature extraction into the data memory.
The second read circuit reads the digital signal after feature extraction from the data memory and transmits it to the second DA converter.
The second DA converter converts the digital signal after feature extraction into a second analog voltage.
The second memristor crossbar array performs feature classification on the second analog voltage to obtain multiple analog voltages.
The voltage comparator compares the multiple analog voltages to obtain a comparison result.
The maximum value of the comparison result is taken as the classification result of the digital signal of the input layer.
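The data path just described can be summarised with a small behavioral simulation. The sketch below is illustrative only and not from the patent: the DA/AD converters are modeled as ideal scaling, the feature-extraction crossbar is reduced to a single weighted-sum-plus-sigmoid stage (the full LSTM cell mathematics appears in a later sketch), the classification crossbar is a matrix-vector product, and the voltage comparator is a winner-take-all selection. All function names, bit widths and weight matrices are assumptions.

```python
import numpy as np

def dac(digital, full_scale=1.0, bits=8):
    """Ideal DA converter: digital code -> analog voltage."""
    return digital / (2**bits - 1) * full_scale

def adc(analog, full_scale=1.0, bits=8):
    """Ideal AD converter: analog voltage -> digital code."""
    return np.round(np.clip(analog, 0.0, full_scale) / full_scale * (2**bits - 1))

def feature_extraction_crossbar(v_in, W_feat):
    """First crossbar: weighted sums of the input voltages, then the op-amp stage."""
    i_out = v_in @ W_feat                  # column currents
    return 1.0 / (1.0 + np.exp(-i_out))    # sigmoid activation with current-to-voltage conversion

def classification_crossbar(v_feat, W_cls):
    """Second crossbar: weighted sums producing one analog voltage per class."""
    return v_feat @ W_cls

def voltage_comparator(voltages):
    """Comparator: the index of the largest analog voltage is the class label."""
    return int(np.argmax(voltages))

# usage example (random weights stand in for the programmed conductances)
rng = np.random.default_rng(0)
W_feat = rng.normal(size=(16, 8))
W_cls = rng.normal(size=(8, 4))

x_digital = rng.integers(0, 256, size=16)                      # digital input from the input layer
v1 = dac(x_digital)                                            # first DA converter
feat_digital = adc(feature_extraction_crossbar(v1, W_feat))    # AD converter, written back to the data memory
v2 = dac(feat_digital)                                         # second DA converter
label = voltage_comparator(classification_crossbar(v2, W_cls))
print("predicted class:", label)
```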
Fig. 3 is a schematic diagram of an internal unit of the LSTM neural network (the LSTM cell). In the figure, c_t is the state of the LSTM cell, c_{t-1} is the state of the cell at the previous time step, h_t is the output of the cell, and h_{t-1} is the output of the cell at the previous time step. The data width of c_{t-1} is 1 and the data width of c_t is 64. Sigmoid denotes the Sigmoid activation function and Tanh denotes the tanh activation function. f denotes the output of the forget gate, i the input gate, j the update gate and o the output gate; f_b denotes the forgetting bias, b_f the bias of the forget gate, b_i the bias of the input gate, b_j the bias of the update gate and b_o the bias of the output gate. W_hf and W_xf denote the state weight and input weight of the forget gate, W_hi and W_xi those of the input gate, W_hj and W_xj those of the update gate, and W_ho and W_xo those of the output gate. x denotes the input of the LSTM cell.
The feature extraction part in the schematic diagram of the LSTM cell can be written as the following mathematical expressions:
f = sigmoid(x*W_xf + h_{t-1}*W_hf + b_f + f_b)
i = sigmoid(x*W_xi + h_{t-1}*W_hi + b_i)
j = tanh(x*W_xj + h_{t-1}*W_hj + b_j)
o = sigmoid(x*W_xo + h_{t-1}*W_ho + b_o)
The above calculations correspond to the function of the crossbar array. The following expressions describe the function completed by the auxiliary circuit below the crossbar array:
c_t = c_{t-1}*f + i*j
h_t = o*tanh(c_t)
In the circuit, the data in the above formulas exist in the form of analog voltages.
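For reference, the gate equations above can be checked with a straightforward NumPy implementation. This is a software sketch of the same mathematics, not a description of the circuit; the vector dimensions in the example are assumptions, and the placement of the forgetting bias f_b follows the expressions as written.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_cell_step(x, h_prev, c_prev, W, b, f_b=1.0):
    """One LSTM cell step using the gate equations of the feature extraction part.

    W and b are dicts holding the input weights (W_x*), state weights (W_h*)
    and biases (b_*) of the forget, input, update and output gates."""
    f = sigmoid(x @ W["xf"] + h_prev @ W["hf"] + b["f"] + f_b)  # forget gate
    i = sigmoid(x @ W["xi"] + h_prev @ W["hi"] + b["i"])        # input gate
    j = np.tanh(x @ W["xj"] + h_prev @ W["hj"] + b["j"])        # update gate
    o = sigmoid(x @ W["xo"] + h_prev @ W["ho"] + b["o"])        # output gate
    c = c_prev * f + i * j          # auxiliary circuit: new cell state
    h = o * np.tanh(c)              # auxiliary circuit: cell output
    return h, c

# usage example with assumed dimensions (input size 4, hidden size 3)
rng = np.random.default_rng(1)
W = {k: rng.normal(size=(4, 3)) for k in ("xf", "xi", "xj", "xo")}
W.update({k: rng.normal(size=(3, 3)) for k in ("hf", "hi", "hj", "ho")})
b = {k: np.zeros(3) for k in ("f", "i", "j", "o")}
h, c = lstm_cell_step(rng.normal(size=4), np.zeros(3), np.zeros(3), W, b)
```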
As shown in Fig. 4, the first memristor crossbar array comprises a first main circuit and a first auxiliary circuit. The first main circuit comprises voltage input ports 1, threshold memristors 2, voltage inverters 3 and operational amplifiers 4; the first auxiliary circuit comprises operational amplifiers 4, resistors 5 and multipliers 6. The cross points of the first memristor crossbar array are not electrically connected.
The LSTM neural network system based on memristor crossbar arrays contains n LSTM cells, i.e. n first memristor crossbar arrays. The first analog voltage comprises V_X1 to V_Xm and V_h1 to V_hn, and is applied to the first main circuit through the voltage input ports. A single threshold memristor can only express a positive weight; in the present invention, of any two adjacent voltage input ports, one is connected to the threshold memristor through a voltage inverter and the other is connected to the threshold memristor directly, so that two adjacent threshold memristors designed in this way can together express one weight that may be positive or negative.
M1 and M2 are threshold memristors. V_s1+ is a 1 V DC voltage and V_s1- is grounded; V_s2+ is a 1 V DC voltage and V_s2- is a -1 V DC voltage. The resistors R1, R2, R3, R4 and R5 must have identical resistance values, which may be chosen between 1 kΩ and 10 kΩ. The operational amplifier connected in parallel with threshold memristor M1 realizes the Sigmoid activation function and converts the current signal into a voltage signal; the operational amplifier connected in parallel with threshold memristor M2 realizes the tanh activation function and converts the current signal into a voltage signal. One end of the voltage inverter is connected to the output of the operational amplifier and the other end is connected to the input of the multiplier; it is used to invert the sign of the voltage.
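A behavioral view of the op-amp/threshold-memristor stage may help here. The sketch below is an assumption for illustration, not a circuit-accurate model: it treats the stage as a transimpedance conversion with an assumed feedback resistance R_f followed by the stated nonlinearity, so a summed column current becomes a bounded output voltage. The value of R_f and the exact transfer curve of a real threshold memristor would have to be fitted to the device.

```python
import numpy as np

def activation_stage(i_column, kind="sigmoid", r_f=5e3):
    """Behavioral model: current-to-voltage conversion followed by the activation
    the op-amp/threshold-memristor pair is stated to realize."""
    v = i_column * r_f                       # transimpedance conversion (V = I * R_f)
    if kind == "sigmoid":
        return 1.0 / (1.0 + np.exp(-v))      # stage built around memristor M1
    elif kind == "tanh":
        return np.tanh(v)                    # stage built around memristor M2
    raise ValueError(kind)

# usage example: 0.2 mA column current through an assumed 5 kOhm feedback resistance
print(activation_stage(np.array([2e-4]), kind="tanh"))
```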
V_c(t-1) denotes the state of the first memristor crossbar array at the previous time step, V_ct denotes the state of the first memristor crossbar array, and V_ht corresponds to the first current. The function completed by the first auxiliary circuit is expressed as:
c_t = c_{t-1}*f + i*j
h_t = o*tanh(c_t)
As shown in Fig. 5, the second memristor crossbar array comprises a second main circuit and a second auxiliary circuit. The second main circuit comprises voltage input ports, threshold memristors and voltage inverters; of any two adjacent voltage input ports, one is connected to the threshold memristor through a voltage inverter and the other is connected to the threshold memristor directly. The cross points of the second memristor crossbar array are not electrically connected. The second auxiliary circuit is a voltage comparator, which compares the multiple analog voltages to obtain a comparison result; the maximum value of the comparison result is taken as the classification result V_o of the digital signal of the input layer.
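The selection performed by the voltage comparator amounts to keeping the larger of two voltages at each comparison until only the largest remains. The following sketch, an illustrative assumption with names not taken from the patent, models the comparator as such a chain of two-input comparisons over the per-class output voltages.

```python
import numpy as np

def winner_take_all(voltages):
    """Chain of two-input voltage comparisons: keep the index of the larger voltage at each step."""
    best = 0
    for k in range(1, len(voltages)):
        if voltages[k] > voltages[best]:   # one two-input comparator decision
            best = k
    return best

# usage example: the index of the largest per-class voltage gives the classification result V_o
v_class = np.array([0.12, 0.47, 0.31])
print("classification result:", winner_take_all(v_class))
```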
The present invention solves the problem of implementing the LSTM neural network in hardware. By introducing memristors into the circuit design, it also provides a hardware implementation of the activation functions, and it gives a complete implementation scheme for an LSTM neural network based on memristor crossbar arrays. Owing to the nanoscale size of the memristor and the high density of the crossbar array, the hardware circuit of this design is small in size, easy to integrate and easy to popularize on a large scale in deep neural network hardware design. In addition, the low power consumption of the memristor greatly reduces the power consumption of the whole system compared with CMOS-based implementations.
It will be readily understood by those skilled in the art that the above description covers only preferred embodiments of the present invention and is not intended to limit the invention; any modifications, equivalent substitutions and improvements made within the spirit and principles of the invention shall fall within the scope of protection of the invention.

Claims (6)

1. An LSTM neural network system based on memristor crossbar arrays, comprising an input layer, a feature extraction layer and a classification output layer, characterized in that:
the feature extraction layer comprises a data memory, a first memristor crossbar array, a first DA converter and an AD converter, wherein the data memory stores the digital signal of the input layer and the digital signal after feature extraction; the first DA converter converts the digital signal of the input layer into a first analog voltage; the first memristor crossbar array performs feature extraction on the first analog voltage to obtain a first current; and the AD converter converts the first current into the digital signal after feature extraction;
the classification output layer comprises a second DA converter, a second memristor crossbar array and a voltage comparator, wherein the second DA converter converts the digital signal after feature extraction into a second analog voltage; the second memristor crossbar array performs feature classification on the second analog voltage to obtain multiple analog voltages; the voltage comparator compares the multiple analog voltages to obtain a comparison result; and the maximum value of the comparison result is taken as the classification result of the digital signal of the input layer.
2. The LSTM neural network system based on memristor crossbar arrays according to claim 1, characterized in that the first memristor crossbar array comprises voltage input ports, threshold memristors, voltage inverters, operational amplifiers and multipliers;
of any two adjacent voltage input ports, one is connected to its threshold memristor through a voltage inverter and the other is connected to its threshold memristor directly; an operational amplifier is connected in parallel with a threshold memristor; one end of a voltage inverter is connected to the output of the operational amplifier and the other end is connected to the input of a multiplier.
3. The LSTM neural network system based on memristor crossbar arrays according to claim 2, characterized in that the operational amplifier connected in parallel with the threshold memristor realizes the calculation of the Sigmoid activation function and converts the current signal into a voltage signal.
4. The LSTM neural network system based on memristor crossbar arrays according to claim 2, characterized in that the operational amplifier connected in parallel with the threshold memristor realizes the calculation of the tanh activation function and converts the current signal into a voltage signal.
5. The LSTM neural network system based on memristor crossbar arrays according to claim 1, characterized in that the second memristor crossbar array comprises voltage input ports, threshold memristors and voltage inverters; of any two adjacent voltage input ports, one is connected to its threshold memristor through a voltage inverter and the other is connected to its threshold memristor directly.
6. The LSTM neural network system based on memristor crossbar arrays according to any one of claims 1 to 5, characterized in that the cross points of the first memristor crossbar array are not electrically connected, and the cross points of the second memristor crossbar array are not electrically connected.
CN201811236611.0A 2018-10-23 2018-10-23 LSTM neural network system based on memristor cross array Active CN109472348B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811236611.0A CN109472348B (en) 2018-10-23 2018-10-23 LSTM neural network system based on memristor cross array

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811236611.0A CN109472348B (en) 2018-10-23 2018-10-23 LSTM neural network system based on memristor cross array

Publications (2)

Publication Number Publication Date
CN109472348A (en) 2019-03-15
CN109472348B (en) 2022-02-18

Family

ID=65664136

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811236611.0A Active CN109472348B (en) 2018-10-23 2018-10-23 LSTM neural network system based on memristor cross array

Country Status (1)

Country Link
CN (1) CN109472348B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140172937A1 (en) * 2012-12-19 2014-06-19 United States Of America As Represented By The Secretary Of The Air Force Apparatus for performing matrix vector multiplication approximation using crossbar arrays of resistive memory devices
CN104916313A (en) * 2015-06-16 2015-09-16 清华大学 Neural network synapse structure based on memristive devices and synaptic weight building method
CN106815636A (en) * 2016-12-30 2017-06-09 华中科技大学 A kind of neuron circuit based on memristor
CN107273972A (en) * 2017-05-11 2017-10-20 北京大学 It is a kind of based on resistive device and to adapt to excite the neuromorphic system and implementation method of neuron

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
KAMILYA SMAGULOVA et al.: "Design of CMOS-memristor Circuits for LSTM architecture", 2018 IEEE International Conference on Electron Devices and Solid State Circuits (EDSSC) *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111755062A (en) * 2019-03-26 2020-10-09 慧与发展有限责任合伙企业 Self-repairing dot product engine
US11532356B2 (en) 2019-03-26 2022-12-20 Hewlett Packard Enterprise Development Lp Self-healing dot-product engine
CN112749784A (en) * 2019-10-31 2021-05-04 华为技术有限公司 Computing device and neural network acceleration method
CN112749784B (en) * 2019-10-31 2024-05-14 华为技术有限公司 Computing device and acceleration method of neural network
CN111460365A (en) * 2020-03-10 2020-07-28 华中科技大学 Equation set solver based on memristive linear neural network and operation method thereof
CN111460365B (en) * 2020-03-10 2021-12-03 华中科技大学 Equation set solver based on memristive linear neural network and operation method thereof
CN111680792A (en) * 2020-06-18 2020-09-18 中国人民解放军国防科技大学 Activation function circuit, memristor neural network and control method of memristor neural network
CN112884141A (en) * 2021-04-16 2021-06-01 安徽大学 Memristive coupling Hindmarsh-Rose neuron circuit
CN112884141B (en) * 2021-04-16 2022-10-21 安徽大学 Memristive coupling Hindmarsh-Rose neuron circuit
CN116523011A (en) * 2023-07-03 2023-08-01 中国人民解放军国防科技大学 Memristor-based binary neural network layer circuit and binary neural network training method
CN116523011B (en) * 2023-07-03 2023-09-15 中国人民解放军国防科技大学 Memristor-based binary neural network layer circuit and binary neural network training method

Also Published As

Publication number Publication date
CN109472348B (en) 2022-02-18

Similar Documents

Publication Publication Date Title
CN109472348A (en) A kind of LSTM nerve network system based on memristor crossed array
Biswas et al. Conv-RAM: An energy-efficient SRAM with embedded convolution computation for low-power CNN-based machine learning applications
CN110352436B (en) Resistance processing unit with hysteresis update for neural network training
Park et al. RRAM-based synapse for neuromorphic system with pattern recognition function
CN106374912B (en) A kind of logical operation circuit and operating method
CN106971372B (en) Coding type flash memory system and method for realizing image convolution
Tang et al. A hardware friendly unsupervised memristive neural network with weight sharing mechanism
Pisarev et al. 3D memory matrix based on a composite memristor-diode crossbar for a neuromorphic processor
CN110007895B (en) Analog multiplication circuit, analog multiplication method and application thereof
KR101912881B1 (en) Neuromorphic memristor crossbar circuit
KR102196523B1 (en) Floating gate memristor and neuromorphic device
CN107545305A (en) A kind of neuron circuit based on CMOS technology, numerical model analysis, charge-domain
CN110651330A (en) Deep learning in a two-memristive network
Jarollahi et al. Reduced-complexity binary-weight-coded associative memories
CN114365078A (en) Reconstructing MAC operations
CN112465134A (en) Pulse neural network neuron circuit based on LIF model
WO2019096660A1 (en) Competitive machine learning accuracy on neuromorphic arrays with capacitor memory devices
CN109524039B (en) Structure and related method for expanding resistance state number of memristor
CN105741870B (en) A kind of non-volatile d type flip flop circuit based on memristor
Felder et al. Reminding forgetful organic neuromorphic device networks
CN114298297A (en) Memory computing device, chip and electronic equipment
DeSalvo et al. Emerging resistive memories for low power embedded applications and neuromorphic systems
Khorami et al. One‐dimensional adiabatic circuits with inherent charge recycling
Van Nguyen et al. Comparative study on quantization-aware training of memristor crossbars for reducing inference power of neural networks at the edge
CN110991628B (en) Neuron circuit based on charge pump

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant