CN109165730A - Method for implementing a state-quantized network in crossbar-array neuromorphic hardware - Google Patents
Method for implementing a state-quantized network in crossbar-array neuromorphic hardware
- Publication number
- CN109165730A CN109165730A CN201811029532.2A CN201811029532A CN109165730A CN 109165730 A CN109165730 A CN 109165730A CN 201811029532 A CN201811029532 A CN 201811029532A CN 109165730 A CN109165730 A CN 109165730A
- Authority
- CN
- China
- Prior art keywords
- crossed array
- neuromorphic hardware
- quantization
- crossed
- state
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/06—Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
- G06N3/061—Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using biological neurons, e.g. biological neurons connected to an integrated circuit
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/06—Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
- G06N3/063—Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means
Abstract
The invention belongs to the field of neural-network technology and relates to a method for implementing a state-quantized network in crossbar-array neuromorphic hardware. In the method, the parameters of an artificial neural network (weights, thresholds, leakage constants, set-voltage values, refractory-period durations, synaptic delay durations, and the like) are first quantized; the quantized parameters are then mapped into the crossbar-array neuromorphic hardware; finally, preprocessed input data are fed into the hardware, thereby realizing the state-quantized network. State quantization effectively reduces the hardware's requirements on the number of storage cells, the number of storage levels, reliability, and so on.
Description
Technical field
The invention belongs to the field of neural-network technology and relates to a method for implementing a state-quantized network in crossbar-array neuromorphic hardware.
Background
Neuromorphic hardware (neuromorphic computing) refers to brain-inspired computers, devices, and models that stand in sharp contrast to the pervasive von Neumann architecture. This biomimetic approach creates densely connected synthetic neurons and synapses, which can be used to model theories in neuroscience and to solve machine-learning problems.
A neuromorphic circuit is one physical realization of a neural-network model. It uses hardware to abstract and emulate a biological nervous system at a high level and with high efficiency, so that low power consumption, high adaptability, and other desirable characteristics can be achieved while reproducing the information-processing capability of the nervous system.
The crossbar array (crossbar) is an important structure that uses memristors for data storage, parallel computation, and as neural-network nodes, and that builds them into large-scale integrated computing circuits. With an orthogonal crossbar array, a large number of memristors connected in parallel form a memristor matrix. By reading and changing the values of the memristors under different control voltages, a weight matrix can be read out and programmed. Crossbar arrays are widely used in data storage and in neural-network learning.
Besides memristors, the cell at each crossbar intersection can also be built from other devices, such as capacitors, transistors, or variable resistors; these can likewise form an array and be used for data storage or incorporated into crossbar-array neuromorphic hardware.
The prior art has at least the following problem:
In the crossbar-array neuromorphic hardware realized to date, the synaptic weights and the various neuron parameters, such as the threshold, leakage constant, set voltage, refractory-period duration, and synaptic delay duration, occupy a large amount of system storage. As circuit scale expands rapidly while storage resources remain relatively scarce, this inevitably becomes a significant bottleneck for neuromorphic hardware.
Summary of the invention
In view of the above problems, the invention proposes a method for implementing a state-quantized network in crossbar-array neuromorphic hardware. The various parameters in the crossbar-array neuromorphic hardware are state-quantized, which effectively reduces the hardware's requirements on the number of storage cells, the number of storage levels, reliability, and so on, and can effectively promote the application of crossbar-array neuromorphic hardware.
The technical solution of the invention is as follows:
S1: Select the parameters and quantize them. Parameter quantization can be carried out either after neural-network training is complete or during neural-network training.
A: Quantization after training is complete
An artificial neural network (e.g., MLP, CNN, RNN, LSTM) is trained for a particular task under particular conditions to obtain its parameters (weights, thresholds, leakage constants, set-voltage values, refractory-period durations, synaptic delay durations, etc.).
The artificial-neural-network parameters obtained in S1 are then quantized in a spiking neural network: at least one of the parameters obtained from training is quantized, i.e., all values of that parameter are replaced by a small number of quantization states. The quantized parameters in the spiking neural network are adjusted repeatedly until the quantized network reaches the predetermined function and performance, at which point parameter quantization is complete.
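As a minimal sketch of step A (the particular state values and the "nearest state" rule are assumptions for illustration; the description only requires replacing all values of a parameter with a small set of quantization states), trained weights can be snapped to the closest member of a small state set:

```python
import numpy as np

def quantize_to_states(values, states):
    """Replace every trained value with the nearest quantization state."""
    states = np.asarray(states, dtype=float)
    values = np.asarray(values, dtype=float)
    # Index of the closest state for each value (broadcast distance table).
    idx = np.abs(values[..., None] - states).argmin(axis=-1)
    return states[idx]

# Example: trained weights snapped to five states (values are illustrative).
trained_w = np.array([0.93, -0.37, 0.02, -0.81, 0.45])
states = [-1.0, -0.4, 0.0, 0.4, 1.0]
print(quantize_to_states(trained_w, states))  # [ 1.  -0.4  0.  -1.   0.4]
```

After this snapping step, the network would be re-evaluated and the state set adjusted, as the description prescribes, until the predetermined performance is reached.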
B: Quantization during training
The values of the parameters to be quantized (e.g., the weights) are quantized while the artificial neural network is being trained; for example, the weight quantization values may be set to -1, -0.4, 0, 0.4, and 1, after which the artificial neural network is trained again. When training is complete, the parameters are mapped into the corresponding spiking neural network, and the quantized parameters obtained from training are adjusted, or new quantization values are chosen and the network retrained, until the spiking neural network reaches the predetermined function and performance, at which point parameter quantization is complete.
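A toy sketch of step B, under stated assumptions: the task (a small linear regression) and the "shadow weight" update rule (full-precision weights updated with gradients computed through the quantized weights, a straight-through-style projection) are illustrative choices, not the description's prescription. The quantization grid is the one named above:

```python
import numpy as np

rng = np.random.default_rng(0)
STATES = np.array([-1.0, -0.4, 0.0, 0.4, 1.0])

def project(w):
    # Snap each weight to its nearest allowed state.
    return STATES[np.abs(w[:, None] - STATES).argmin(axis=1)]

# Toy task (an assumption): recover a target weight vector that
# already lies on the quantization grid.
w_true = np.array([1.0, -0.4, 0.4])
X = rng.normal(size=(200, 3))
y = X @ w_true

w = np.zeros(3)            # full-precision "shadow" weights
lr = 0.05
for _ in range(300):
    w_q = project(w)       # quantized weights used in the forward pass
    err = X @ w_q - y
    grad = X.T @ err / len(X)
    w -= lr * grad         # straight-through update of the shadow weights

print(project(w))          # -> [ 1.  -0.4  0.4]
```

Once the quantized weights match the target, the residual vanishes and the shadow weights stop moving, which is the "training ends when the requirement is met" condition of the method.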
S2: Map the spiking-neural-network parameters quantized in S1 into the crossbar-array neuromorphic hardware
The quantized parameters of the trained spiking network are mapped into the crossbar-array neuromorphic hardware, each parameter to its corresponding control section. For example, the quantized weights are mapped to the crossbar array in the hardware; the quantized threshold is mapped to the neuron threshold control section; the quantized leakage constant to the neuron leakage-constant control section; the quantized set-voltage value to the neuron set-voltage control section; the quantized refractory-period duration to the neuron refractory-period control section; and the quantized synaptic delay duration to the synaptic-delay control section.
S3: Preprocess the input data, e.g., convert it into pulse inputs and encode it; the preprocessed input data are then fed into the crossbar-array neuromorphic hardware, thereby realizing the state-quantized network.
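The description leaves the pulse conversion in S3 open; one common concrete choice (an assumption here) is rate coding, where each analog input becomes a train of binary pulses whose firing probability is proportional to its value:

```python
import numpy as np

def rate_encode(x, n_steps, max_rate=0.8, seed=0):
    """Encode analog values in [0, 1] as Bernoulli spike trains.

    Each input value becomes a train of n_steps binary pulses whose
    per-step firing probability is proportional to the value.
    """
    rng = np.random.default_rng(seed)
    x = np.clip(np.asarray(x, dtype=float), 0.0, 1.0)
    probs = x * max_rate                       # per-step firing probability
    return (rng.random((n_steps,) + x.shape) < probs).astype(np.uint8)

pixels = [0.0, 0.25, 1.0]                      # e.g., normalized pixel values
spikes = rate_encode(pixels, n_steps=100)
print(spikes.shape)                            # (100, 3)
print(spikes.mean(axis=0))                     # empirical rates, roughly [0, 0.2, 0.8]
```

The resulting binary trains are what would be applied, step by step, as the voltage pulses driving the crossbar rows.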
Further, the crossbar array in the above crossbar-array neuromorphic hardware admits many implementations. Specifically, the cell at a crossbar intersection can be realized by:
- a single transistor (an N-type transistor, P-type transistor, floating-gate transistor, synaptic transistor, etc.);
- a semiconductor memory cell (e.g., a 6-transistor SRAM cell);
- a capacitor;
- a selection transistor plus a capacitor;
- a memristor;
- a selection transistor plus a variable resistor;
- a rectifier diode plus a variable resistor.
The beneficial effect of the invention is that, on the basis of converting an artificial neural network into a spiking neural network, one or more parameters can be quantized, and the quantized parameters can further be mapped to the corresponding control sections in crossbar-array neuromorphic hardware, thereby realizing a state-quantized network in the hardware. Realizing state quantization at the hardware level effectively reduces the hardware's requirements on the number of storage cells, the number of storage levels, reliability, and so on.
Description of the drawings
Fig. 1 is a flow chart of the method for implementing a state-quantized network in crossbar-array neuromorphic hardware provided by an embodiment of the invention;
Fig. 2 is a system structure diagram for realizing a state-quantized network in crossbar-array neuromorphic hardware provided by an embodiment of the invention;
Fig. 3 shows a crossbar-array form realized with capacitors;
Fig. 4 shows a crossbar-array form realized with memristors;
Fig. 5 shows a crossbar-array form realized with transistors plus variable resistors;
Fig. 6 shows a neuron model employed in an embodiment of the invention.
Detailed description of the embodiments
To make the objects, technical solutions, and advantages of the invention clearer, embodiments of the invention are described in further detail below with reference to the accompanying drawings.
Referring to Fig. 1, an embodiment of the invention provides a method for implementing a state-quantized network in crossbar-array neuromorphic hardware, comprising the following steps:
S1: Select the parameters and quantize them. Parameter quantization can be carried out either after neural-network training is complete or during neural-network training.
A: Quantization after training is complete
An artificial neural network (e.g., MLP, CNN, RNN, LSTM) is trained for a particular task under particular conditions to obtain its parameters (weights, thresholds, leakage constants, set-voltage values, refractory-period durations, synaptic delay durations, etc.).
The artificial-neural-network parameters obtained in S1 are quantized in a spiking neural network: at least one of the parameters obtained from training is quantized, i.e., all of its values are replaced by a small number of quantization states. The quantized parameters in the spiking neural network are adjusted repeatedly until the quantized network attains function and performance comparable to the original artificial neural network; parameter quantization is then complete.
B: Quantization during training
The values of the parameters to be quantized (e.g., the weights) are quantized while the artificial neural network is being trained; for example, the weight quantization values may be set to -1, -0.4, 0, 0.4, and 1, after which the network is trained again. When training is complete, the parameters are mapped into the corresponding spiking neural network, and the quantized parameters obtained from training are adjusted until the spiking neural network attains the same function and comparable performance as the original artificial neural network; parameter quantization is then complete.
S2: Map the spiking-neural-network parameters quantized in S1 into the crossbar-array neuromorphic hardware
The quantized parameters of the trained spiking network are mapped into the crossbar-array neuromorphic hardware; different parameters are mapped to different locations in the hardware. For example, the quantized weights are mapped to the crossbar array; the quantized threshold to the neuron threshold control section; the quantized leakage constant to the neuron leakage-constant control section; the quantized set-voltage value to the neuron set-voltage control section; the quantized refractory-period duration to the neuron refractory-period control section; and the quantized synaptic delay duration to the synaptic-delay control section.
S3: Preprocess the input data by converting the raw input into pulse inputs and encoding it; the preprocessed input data are then fed into the crossbar-array neuromorphic hardware, thereby realizing the state-quantized network.
Weight quantization is described in detail below as an example.
For a particular task, a corresponding artificial neural network is established; the network can be of any model, such as MLP, CNN, RNN, or LSTM.
Quantization after training is complete: the network is trained under the given conditions using a conventional artificial-neural-network training method to obtain the trained parameters. The trained parameters are then mapped into a spiking neural network with the same topology, the weight parameters to be quantized are chosen, and all previous weights are replaced in the spiking neural network by a limited set of weight states, thereby realizing weight quantization. The spiking neural network is tested with the quantized weights, and its performance is compared with that of the corresponding artificial neural network. If the spiking neural network reaches the performance target, weight quantization ends; otherwise the weight state values are re-quantized and the spiking neural network is tested again, repeatedly, until the target is reached.
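The description does not fix how the limited set of weight states is chosen or re-chosen between test rounds; one assumed concrete choice is a Lloyd (one-dimensional k-means) update that picks states minimizing the mean-squared quantization error of the trained weights:

```python
import numpy as np

def fit_weight_states(weights, n_states, n_iter=50):
    """Choose a limited set of weight states by a Lloyd (1-D k-means) update.

    Minimizing mean-squared quantization error is one assumed way to pick
    and re-adjust the states; the description only requires replacing all
    weights with a few states and iterating until performance is met.
    """
    w = np.asarray(weights, dtype=float).ravel()
    # Initialize states spread over the weight range.
    states = np.linspace(w.min(), w.max(), n_states)
    for _ in range(n_iter):
        assign = np.abs(w[:, None] - states).argmin(axis=1)
        for k in range(n_states):
            if np.any(assign == k):
                states[k] = w[assign == k].mean()
    return np.sort(states), states[assign]

rng = np.random.default_rng(1)
# Synthetic trained weights clustered near -0.5 and +0.5 (illustrative).
w = np.concatenate([rng.normal(-0.5, 0.05, 100), rng.normal(0.5, 0.05, 100)])
states, w_q = fit_weight_states(w, n_states=2)
print(np.round(states, 2))   # roughly [-0.5, 0.5]
```

In the iterative procedure above, each candidate state set would then be installed in the spiking network and tested against the performance target.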
Quantization during training: before the artificial neural network is trained, several value states for the weights are chosen, and the network is trained. If the requirement can be met, training ends; otherwise the weight value states are changed and the network retrained, repeatedly, until the network meets the requirement and training ends. The trained quantized weights and the other parameters are then mapped into a spiking neural network with the same topology as the artificial neural network, the spiking neural network is tested, and its performance is compared with that of the corresponding artificial neural network. If the spiking neural network reaches the performance target, weight quantization ends; otherwise the quantized weight values are adjusted, and the artificial-neural-network training and the mapping to the spiking neural network are carried out again, repeatedly, until the target is reached.
The trained spiking neural network is mapped to the weight control sections in the crossbar-array neuromorphic hardware, i.e., into the crossbar cells. The relevant parameters of the crossbar cells are adjusted so that the weight values in the hardware take the limited weight states obtained from the spiking-neural-network training.
The input data are preprocessed by converting them into pulse inputs and encoding them, and are then fed into the crossbar-array neuromorphic hardware; the hardware thus realizes the weight-quantized neural-network function.
Fig. 2 shows the system structure diagram of crossbar-array neuromorphic hardware realizing a state-quantized network in an embodiment of the invention. As shown, the positive weights and the negative weights are realized by separate crossbar arrays, and the inputs are split into positive inputs +V_in,1, +V_in,2, +V_in,3, ..., +V_in,n and negative inputs -V_in,1, -V_in,2, -V_in,3, ..., -V_in,n (each equal in magnitude to the corresponding positive input). In the positive-weight crossbar, the original positive weights are retained and the negative weights are set to zero; in the negative-weight crossbar, the original negative weights are retained with their absolute values taken, and the positive weights are set to zero. Each input is turned into a weighted input by the state of the corresponding crossbar cell (i.e., the corresponding weight magnitude); each column of positive inputs and the corresponding column of negative inputs connect respectively to the positive and negative inputs of the corresponding neuron, and the subtraction of positive and negative inputs is realized by the differencing circuitry inside the neuron. Specifically, taking positive inputs 1 and 2 and negative inputs 9 and 10 as an example, the input/output behaviour of neuron 19 is analyzed; the input/output behaviour of the other neurons can be analyzed in the same way. Positive input 1 produces a corresponding weighted current input via cell 5 of the crossbar; positive input 2 via cell 6; negative input 9 via cell 13; and negative input 10 via cell 14. The column of the positive-weight crossbar corresponding to neuron 19 produces the total positive input 17, and similarly the column of the negative-weight crossbar corresponding to neuron 19 produces the total negative input 18. Neuron 19 processes total positive input 17 and total negative input 18, subtracting the negative input from the positive input, performs the subsequent processing, and finally produces the corresponding output.
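The positive/negative weight split described above can be sketched as follows (the in-neuron differencing is modeled as a plain subtraction; matrix sizes and values are illustrative):

```python
import numpy as np

def split_weights(W):
    """Split a signed weight matrix into two non-negative crossbar arrays."""
    W = np.asarray(W, dtype=float)
    G_pos = np.where(W > 0, W, 0.0)        # positive weights; negatives zeroed
    G_neg = np.where(W < 0, -W, 0.0)       # |negative weights|; positives zeroed
    return G_pos, G_neg

def neuron_net_input(v, W):
    """Net input to each neuron: positive-column current minus negative."""
    G_pos, G_neg = split_weights(W)
    return v @ G_pos - v @ G_neg           # differencing inside the neuron

W = np.array([[ 1.0, -0.4],
              [-1.0,  0.4]])
v = np.array([0.5, 0.25])
print(neuron_net_input(v, W))              # [ 0.25 -0.1 ]
```

Because G_pos - G_neg reconstructs W exactly, the two non-negative arrays together realize the full signed weight matrix even though each physical cell only encodes a magnitude.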
The quantized weights of the spiking neural network are mapped to the weight control sections in the crossbar-array neuromorphic hardware; specifically, the quantized weights are mapped to the crossbar arrays in the system structure shown in Fig. 2. By adjusting the control section of each crossbar cell, the state of the cell is changed, thereby realizing weight quantization.
How weight quantization is realized with different kinds of crossbar cells is illustrated below.
Fig. 3 shows a crossbar array built from capacitors. Each cell of the crossbar controls the connection strength, i.e., the weight, between its input and the neuron. The amount of charge on the capacitor of a cell represents the weight magnitude: the more charge, the larger the current the cell's input produces. When the quantized weights of the spiking neural network are mapped to the crossbar-array neuromorphic hardware, the control unit of each capacitor in the crossbar is adjusted according to the value state of the quantized weight, so that the charge states on the capacitors correspond one-to-one with the quantized weight states; a crossbar array built from capacitors can thus realize weight quantization in crossbar-array neuromorphic hardware.
Fig. 4 shows a crossbar array built from memristors. Here the weight magnitude is represented by the resistance state of the memristor, which controls the current produced by the cell's input. When the quantized weights of the spiking neural network are mapped to the crossbar-array neuromorphic hardware, the control unit of each memristor in the crossbar is adjusted according to the value state of the quantized weight, so that the memristor resistance states correspond one-to-one with the quantized weight states; a crossbar array built from memristors can thus realize weight quantization in crossbar-array neuromorphic hardware.
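The one-to-one correspondence between quantized weight states and device states can be sketched as a simple lookup table (the state set and the conductance values in siemens are assumptions for illustration, not device data from the description):

```python
import numpy as np

# Assumed one-to-one table: quantized weight state -> programmed conductance (S).
STATE_TO_CONDUCTANCE = {
    0.0: 1e-6,    # high-resistance state, representing weight 0
    0.4: 40e-6,
    1.0: 100e-6,  # low-resistance state, representing weight 1
}

def program_crossbar(W_q):
    """Translate a quantized (non-negative) weight matrix into conductances."""
    lookup = np.vectorize(STATE_TO_CONDUCTANCE.__getitem__)
    return lookup(np.asarray(W_q))

W_q = np.array([[1.0, 0.0],
                [0.4, 1.0]])
G = program_crossbar(W_q)
print(G)   # conductance matrix in siemens
```

Because only a few discrete conductance levels are needed, the device does not have to store or reliably retain a continuum of analog states, which is the storage saving the method targets.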
Fig. 5 shows a crossbar array in which each cell is a transistor plus a variable resistor. Changing the resistance of the variable resistor controls the current produced by the cell's input. When the quantized weights of the spiking neural network are mapped to the crossbar-array neuromorphic hardware, the control unit of each variable resistor in the crossbar is adjusted according to the value state of the quantized weight, so that the variable-resistor resistance states correspond one-to-one with the quantized weight states; a crossbar array built from transistors plus variable resistors can thus realize weight quantization in crossbar-array neuromorphic hardware.
Fig. 6 shows a neuron structure model employed in an embodiment of the invention. The neuron combines the positive and negative inputs and performs subsequent processing on the resulting net input. As shown, the neuron has a positive input and a negative input, which pass through a weight-processing section composed of resistors R1, R2, R3, R4, R5 and amplifiers 1 and 2; this section subtracts the negative input from the positive input to produce the net input, which is then processed by the subsequent units to produce the corresponding output. When the quantized spiking-neural-network parameters are mapped to the neuromorphic hardware, each parameter is mapped to the corresponding control section of the neuron. The quantized threshold is mapped to the threshold control section of the neuron, i.e., the variable voltage source in the figure. The variable voltage source can produce multiple voltage values, corresponding to multiple neuron thresholds; adjusting the voltage produced by the source according to the quantized threshold of the spiking neural network changes the neuron threshold, thus realizing the mapping of the quantized threshold. The quantized leakage constant is mapped to resistor R8 of the neuron in the figure: adjusting the resistance of R8 changes the leakage rate of the charge on capacitor C2, i.e., the leakage constant of the neuron. When the quantized leakage constant of the spiking neural network is mapped to the crossbar neuromorphic hardware, the resistance of R8 is adjusted accordingly, realizing the mapping of the quantized leakage constant. The quantized refractory-period duration is mapped to the refractory-period control unit of the neuron, i.e., the selection switch S in the figure. The on-state and hold time of switch S are adjustable; during the hold time in which switch S keeps the neuron discharged, the neuron cannot react to external input, i.e., it is in its refractory period. Adjusting the hold time of switch S after each neuron firing, according to the quantized refractory-period duration of the spiking neural network, adjusts the refractory period of the neuron and thus realizes the mapping of the quantized refractory-period duration.
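The behaviour of such a neuron can be sketched as a discrete-time leaky integrate-and-fire model (the update equations, time step, and parameter values are assumptions; the embodiment describes the analog circuit, not these equations). The threshold, leak factor, post-spike set voltage, and refractory steps play the roles of the quantized parameters mapped into the hardware:

```python
def lif_run(currents, threshold=1.0, leak=0.9, v_set=0.0, refractory=2):
    """Discrete-time leaky integrate-and-fire neuron (illustrative model)."""
    v, wait, spikes = 0.0, 0, []
    for i_in in currents:
        if wait > 0:                 # refractory period: ignore external input
            wait -= 1
            spikes.append(0)
            continue
        v = leak * v + i_in          # leak, then integrate the net input
        if v >= threshold:
            spikes.append(1)
            v = v_set                # set voltage applied after a spike
            wait = refractory        # start the refractory period
        else:
            spikes.append(0)
    return spikes

out = lif_run([0.6] * 10)
print(out)   # [0, 1, 0, 0, 0, 1, 0, 0, 0, 1]
```

Quantizing these four parameters to a few states means each analog control section (variable voltage source, R8, switch S) only needs to be settable to a correspondingly small number of positions.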
Claims (10)
1. A method for implementing a state-quantized network in crossbar-array neuromorphic hardware, characterized by comprising the following steps:
S1: selecting target parameters and quantizing them;
S2: mapping the spiking-neural-network parameters quantized in S1 into the crossbar-array neuromorphic hardware: the quantized parameters of the trained spiking network are mapped to the corresponding parameter control sections of the crossbar-array neuromorphic hardware;
S3: preprocessing the input data and feeding the preprocessed input data into the crossbar-array neuromorphic hardware, thereby realizing the state-quantized network, wherein the preprocessing comprises conversion into pulse inputs and encoding.
2. The method for implementing a state-quantized network in crossbar-array neuromorphic hardware according to claim 1, characterized in that the target parameters are one or more of weight, threshold, leakage constant, set-voltage value, refractory-period duration, and synaptic delay duration.
3. The method for implementing a state-quantized network in crossbar-array neuromorphic hardware according to claim 2, characterized in that step S1 is carried out after neural-network training is complete: the artificial neural network is trained under the set target conditions to obtain the corresponding target parameters, the target parameters are mapped into a spiking neural network, and the selected parameters are quantized.
4. The method for implementing a state-quantized network in crossbar-array neuromorphic hardware according to claim 2, characterized in that step S1 is carried out during neural-network training: the target parameters to be quantized are quantized while the artificial neural network is being trained; after training is complete, the target parameters are mapped into the corresponding spiking neural network, and the quantized parameters obtained from training are adjusted as required to obtain the final quantized parameters.
5. The method for implementing a state-quantized network in crossbar-array neuromorphic hardware according to claim 3 or 4, characterized in that each cell of the crossbar array in the crossbar-array neuromorphic hardware is composed of a capacitor.
6. The method for implementing a state-quantized network in crossbar-array neuromorphic hardware according to claim 3 or 4, characterized in that each cell of the crossbar array in the crossbar-array neuromorphic hardware is composed of a memristor.
7. The method for implementing a state-quantized network in crossbar-array neuromorphic hardware according to claim 3 or 4, characterized in that each cell of the crossbar array in the crossbar-array neuromorphic hardware is composed of a selection transistor plus a variable resistor.
8. The method for implementing a state-quantized network in crossbar-array neuromorphic hardware according to claim 3 or 4, characterized in that each cell of the crossbar array in the crossbar-array neuromorphic hardware is composed of a transistor.
9. The method for implementing a state-quantized network in crossbar-array neuromorphic hardware according to claim 3 or 4, characterized in that each cell of the crossbar array in the crossbar-array neuromorphic hardware is composed of a selection transistor plus a capacitor.
10. The method for implementing a state-quantized network in crossbar-array neuromorphic hardware according to claim 3 or 4, characterized in that each cell of the crossbar array in the crossbar-array neuromorphic hardware is composed of a rectifier diode plus a variable resistor.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811029532.2A CN109165730B (en) | 2018-09-05 | 2018-09-05 | State quantization network implementation method in cross array neuromorphic hardware |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109165730A true CN109165730A (en) | 2019-01-08 |
CN109165730B CN109165730B (en) | 2022-04-26 |
Family
ID=64893970
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811029532.2A Active CN109165730B (en) | 2018-09-05 | 2018-09-05 | State quantization network implementation method in cross array neuromorphic hardware |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109165730B (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109800872A (en) * | 2019-01-28 | 2019-05-24 | 电子科技大学 | A kind of neuromorphic processor shared based on segmentation multiplexing and parameter quantization |
CN111490162A (en) * | 2020-04-14 | 2020-08-04 | 中国科学院重庆绿色智能技术研究院 | Flexible artificial afferent nervous system based on micro-nano structure force-sensitive film and preparation method thereof |
CN111598237A (en) * | 2020-05-21 | 2020-08-28 | 上海商汤智能科技有限公司 | Quantization training method, image processing device, and storage medium |
CN112163673A (en) * | 2020-09-28 | 2021-01-01 | 复旦大学 | Population routing method for large-scale brain-like computing network |
CN112183734A (en) * | 2019-07-03 | 2021-01-05 | 财团法人工业技术研究院 | Neuron circuit |
CN112199234A (en) * | 2020-09-29 | 2021-01-08 | 中国科学院上海微系统与信息技术研究所 | Neural network fault tolerance method based on memristor |
WO2022057222A1 (en) | 2020-09-15 | 2022-03-24 | 深圳市九天睿芯科技有限公司 | In-memory spiking neural network based on current integration |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009026181A (en) * | 2007-07-23 | 2009-02-05 | Ryukoku Univ | Neural network |
CN103201610A (en) * | 2010-10-29 | 2013-07-10 | 国际商业机器公司 | Neuromorphic and synaptronic spiking neural network with synaptic weights learned using simulation |
CN105390520A (en) * | 2015-10-21 | 2016-03-09 | 清华大学 | Parameter configuration method for memristor intersection array |
CN106971372A (en) * | 2017-02-24 | 2017-07-21 | 北京大学 | A kind of code-shaped flash memory system and method for realizing image convolution |
US20170228345A1 (en) * | 2016-02-08 | 2017-08-10 | Spero Devices, Inc. | Analog Co-Processor |
CN108009640A (en) * | 2017-12-25 | 2018-05-08 | 清华大学 | The training device and its training method of neutral net based on memristor |
CN108304922A (en) * | 2017-01-13 | 2018-07-20 | 华为技术有限公司 | Computing device and computational methods for neural computing |
- 2018-09-05: Application CN201811029532.2A filed; granted as patent CN109165730B (status: Active)
Non-Patent Citations (3)
Title |
---|
MIAO HU et al.: "Memristor Crossbar-Based Neuromorphic Computing System: A Case Study", IEEE Transactions on Neural Networks and Learning Systems * |
XINJIANG ZHANG et al.: "Neuromorphic Computing with Memristor Crossbar", Physica Status Solidi (A) * |
HU Fei et al.: "Circuit design of convolutional neural networks based on memristor crossbar arrays", Journal of Computer Research and Development * |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109800872A (en) * | 2019-01-28 | 2019-05-24 | University of Electronic Science and Technology of China | Neuromorphic processor based on segmented multiplexing and parameter quantization sharing |
CN109800872B (en) * | 2019-01-28 | 2022-12-16 | University of Electronic Science and Technology of China | Neuromorphic processor based on segmented multiplexing and parameter quantization sharing |
CN112183734A (en) * | 2019-07-03 | 2021-01-05 | Industrial Technology Research Institute | Neuron circuit |
CN111490162A (en) * | 2020-04-14 | 2020-08-04 | Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences | Flexible artificial afferent nervous system based on a micro/nano-structured force-sensitive film and preparation method thereof |
CN111490162B (en) * | 2020-04-14 | 2023-05-05 | Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences | Flexible artificial afferent nervous system based on a micro/nano-structured force-sensitive film and preparation method thereof |
CN111598237A (en) * | 2020-05-21 | 2020-08-28 | Shanghai SenseTime Intelligent Technology Co., Ltd. | Quantization training method, image processing method and device, and storage medium |
WO2021233069A1 (en) * | 2020-05-21 | 2021-11-25 | Shanghai SenseTime Intelligent Technology Co., Ltd. | Quantization training and image processing methods and devices, and storage medium |
CN111598237B (en) * | 2020-05-21 | 2024-06-11 | Shanghai SenseTime Intelligent Technology Co., Ltd. | Quantization training method, image processing method and device, and storage medium |
WO2022057222A1 (en) | 2020-09-15 | 2022-03-24 | Shenzhen Jiutian Ruixin Technology Co., Ltd. | In-memory spiking neural network based on current integration |
CN112163673A (en) * | 2020-09-28 | 2021-01-01 | Fudan University | Population routing method for large-scale brain-like computing networks |
CN112163673B (en) * | 2020-09-28 | 2023-04-07 | Fudan University | Population routing method for large-scale brain-like computing networks |
CN112199234A (en) * | 2020-09-29 | 2021-01-08 | Shanghai Institute of Microsystem and Information Technology, Chinese Academy of Sciences | Memristor-based neural network fault-tolerance method |
Also Published As
Publication number | Publication date |
---|---|
CN109165730B (en) | 2022-04-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109165730A (en) | Method for implementing a state-quantized network in crossbar-array neuromorphic hardware | |
US9330355B2 (en) | Computed synapses for neuromorphic systems | |
US20200342301A1 (en) | Convolutional neural network on-chip learning system based on non-volatile memory | |
US10339041B2 (en) | Shared memory architecture for a neural simulator | |
CN109816026B (en) | Fusion device and method of convolutional neural network and impulse neural network | |
US20150269480A1 (en) | Implementing a neural-network processor | |
US10140573B2 (en) | Neural network adaptation to current computational resources | |
CN110852429B (en) | 1T 1R-based convolutional neural network circuit and operation method thereof | |
Carlson et al. | Biologically plausible models of homeostasis and STDP: stability and learning in spiking neural networks | |
CA2926098A1 (en) | Causal saliency time inference | |
US20210049448A1 (en) | Neural network and its information processing method, information processing system | |
US9361545B2 (en) | Methods and apparatus for estimating angular movement with a single two dimensional device | |
US20150317557A1 (en) | Temporal spike encoding for temporal learning | |
US10552734B2 (en) | Dynamic spatial target selection | |
CN104915195B (en) | A kind of method that neural computing is realized based on field programmable gate array | |
CN105122278B (en) | Neural network and method of programming | |
CN105701540A (en) | Self-generated neural network construction method | |
CN105913119A (en) | Row-column interconnection heterogeneous multi-core brain-like chip and usage method for the same | |
US9536189B2 (en) | Phase-coding for coordinate transformation | |
KR20210152244A (en) | Apparatus for implementing neural network and operation method thereof | |
Zhang et al. | The framework and memristive circuit design for multisensory mutual associative memory networks | |
Schuman et al. | Dynamic adaptive neural network arrays: a neuromorphic architecture | |
US20150278683A1 (en) | Plastic synapse management | |
Hossain et al. | Reservoir computing system using biomolecular memristor | |
US20140365413A1 (en) | Efficient implementation of neural population diversity in neural system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||