CN110766130A - BP neural network learning circuit - Google Patents

BP neural network learning circuit

Info

Publication number
CN110766130A
Authority
CN
China
Prior art keywords
voltage
weight
circuit
learning
circuit module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810850056.4A
Other languages
Chinese (zh)
Other versions
CN110766130B (en)
Inventor
胡作启
章志强
刘普昌
李阳
祝捷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huazhong University of Science and Technology
Priority to CN201810850056.4A
Publication of CN110766130A
Application granted
Publication of CN110766130B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/044 Recurrent networks, e.g. Hopfield networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/049 Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/06 Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N3/063 Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G06N3/084 Backpropagation, e.g. using gradient descent

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • General Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Neurology (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a BP neural network learning circuit. The circuit mainly comprises a difference circuit module, a summation circuit module, a weight storage circuit module, and a weight refresh circuit module. The difference circuit computes the error, the summation circuit accumulates the weight voltage, the weight storage circuit stores the current weight voltage across a capacitor, and the weight refresh circuit updates the current weight voltage through the charging and discharging of the capacitor. The main purpose of the circuit is to allow the weights of a BP neural network to be adjusted continuously, with the number of learning iterations controlled by pulsed switches. When several samples are learned, the pulsed switches give every sample the same learning proportion, so a BP neural network that incorporates this learning circuit learns the features of each sample without bias.

Description

BP neural network learning circuit
Technical Field
The invention belongs to the field of BP (back-propagation) neural networks, relates to BP neural networks realized with hardware circuits, and in particular relates to a learning circuit.
Background
Since the advent of the computer, all kinds of complex scientific calculations and information processing can be completed without the human brain: given a well-defined computational procedure, a computer calculates results quickly, and the human brain cannot match even a tenth of its speed. When information is ambiguous or imprecise, however, the computer is helpless, and it is especially limited on problems that require reasoning; this motivated the development of artificial neural networks. The mathematical model of the neuron was proposed as early as 1943 by Warren McCulloch and the mathematician Walter Pitts, and the perceptron, equivalent to a binary classifier, followed in 1958; yet artificial neural networks remained far from widely applicable until 1986, when the concept of the BP neural network was put forward in the book "Parallel Distributed Processing" by the research group led by Rumelhart and McClelland. The hallmark of the BP neural network is the back-propagation algorithm, which remains the principal learning algorithm for neural networks to this day. BP neural networks are now mature and very widely applied, chiefly to function fitting and prediction, pattern recognition, classification, and data compression for storage.
However, the traditional learning algorithm of the BP neural network, gradient descent, has some obvious drawbacks: it easily falls into local minima during learning, and it converges slowly when the inputs are very small or when the input data lie at the saturated ends of the activation function. Most researchers at home and abroad therefore focus on improving the BP algorithm, either by refining the traditional algorithm itself, for example by adding a momentum term, adapting the learning rate, or introducing a steepness factor, or by replacing it, for example with Newton's method, the LMBP algorithm, or genetic algorithms. There is also much research on applications of the BP neural network, such as air-quality index prediction, face recognition, and data compression for storage. Most of this work is implemented in MATLAB, C++, or other software programming languages. These languages execute instructions sequentially, so during a learning pass the stored network weights must be modified one by one; the computation remains serial in nature, and genuinely parallel processing requires dedicated hardware circuits.
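As an illustration of the momentum-term improvement mentioned above, here is a minimal sketch; the function name, the learning rate lr, and the momentum coefficient mu are assumed example values, not taken from the patent:

    # Sketch of gradient descent with a momentum term (illustrative only).
    def momentum_step(w, grad, velocity, lr=0.1, mu=0.9):
        # The term mu * velocity carries over part of the previous update,
        # damping oscillation and helping the search pass shallow local minima.
        velocity = [mu * v - lr * g for v, g in zip(velocity, grad)]
        w = [wi + v for wi, v in zip(w, velocity)]
        return w, velocity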
At present there is little research at home or abroad on hardware implementations of BP neural networks; the main difficulties lie in weight storage and dynamic refreshing, which may be resolved in the near future by the phase-change memory now being studied intensively worldwide. Research on BP neural network hardware is nevertheless already worthwhile: setting weight storage aside for the moment, an on-chip learning function can be realized today. Some work on BP neural network hardware has been carried out at Shandong University of Science and Technology, Tianjin University, Xi'an Jiaotong University, and elsewhere. A typical result is a hardware BP neural network that realizes the XOR function: given the inputs and outputs of a group of data, the system learns continuously until its output approaches the given output. For such a hardware BP neural network system to be applied to function fitting, pattern recognition, and the like, it also needs the ability to process multiple groups of data.
Disclosure of Invention
The aim of the invention is to make the number of learning iterations of a BP neural network controllable through a circuit, so that a BP neural network system gains the ability to process multiple training samples, and to provide a method for realizing function fitting, pattern recognition, and similar functions with BP neural network hardware.
The technical scheme adopted by the invention is as follows:
A BP neural network learning circuit with a controllable number of learning iterations comprises:
a ground terminal;
a difference circuit module, comprising a preset voltage terminal, an output voltage terminal fed back through an output feedback line, and an error back-propagation voltage terminal, wherein the preset voltage terminal carries the preset voltage U used to preset the circuit, the fed-back output voltage terminal carries the current weight voltage, and the error back-propagation voltage terminal outputs the difference between the preset voltage and the current weight voltage; the difference circuit module computes the error;
a summation circuit module, connected to the error back-propagation voltage terminal of the difference circuit module through switch S1 and to the fed-back output voltage terminal through switch S2, and comprising a weight-accumulation voltage terminal that outputs the inverted sum of the error back-propagation voltage and the current weight voltage; the summation circuit module performs the voltage summation;
a voltage inverter U4, connected to the weight-accumulation voltage terminal of the summation circuit module, which inverts the weight-accumulation voltage and outputs the positive sum of the error back-propagation voltage and the current weight voltage;
a weight refresh circuit module, in which switches S3 and S4 connect the inverted weight-accumulation voltage terminal to the weight-voltage storage terminal, the current weight voltage being refreshed through these switches;
a weight storage circuit module, connected to the ground terminal and the weight-voltage storage terminal, which stores the voltage across a capacitor C1; and
a voltage follower U2, connected to the weight-voltage storage terminal and the current-weight-voltage terminal, which outputs the current weight voltage and slows the leakage of capacitor C1.
When several samples are input for learning, the learning circuit keeps the learning proportion of each sample the same, so that when it is applied in a BP neural network circuit it learns the features of every sample without bias. The trained BP neural network system can then recognize the several samples, a pattern-recognition function; moreover, as the samples are learned in turn the weight voltage is refreshed continuously, so the system can ultimately fit the function the samples represent, a function-fitting function.
The weight refresh circuit module refreshes the current weight voltage under the control of pulsed switches.
The number of learning iterations is made controllable by synchronizing the input of the samples with the pulses that control learning, so that the number of learning iterations of each sample is controllable.
The pattern-recognition function is image recognition or face recognition: one image or one face picture serves as one sample, multiple sample inputs are the inputs of multiple pictures, and the features contained in each picture are converted into corresponding data and fed into the BP neural network system.
Alternatively, the function-fitting function fits an arbitrary function.
The invention also covers the application of the learning circuit to image recognition or face recognition.
The invention further covers the application of the learning circuit to function fitting.
The learning circuit of the invention has the following advantages:
The learning circuit mainly performs the addition of the current weight voltage and the back-propagated error voltage through a summation circuit; the design is simple and easy to understand, the learning rate can be changed merely by changing a few resistor values, and the circuit is convenient and flexible. After the weight-voltage summation, the weight voltage is refreshed through a specific closing and opening sequence of the four switches; because the four switches are driven by different pulses of the same period, the number and the speed of the learning iterations can be controlled through the pulses.
Drawings
FIG. 1 is a schematic diagram of the learning circuit of embodiment 1 of the present invention
FIG. 2 shows the pulse waveforms required by the learning circuit of the present invention
FIG. 3 is the pulse-generation circuit of the present invention
FIG. 4 is a block diagram of the overall circuit of the present invention applied to function fitting
Detailed Description
The present invention is explained in detail below with reference to the drawings. The examples, which form a part of the present application, are provided to aid understanding of the invention and do not limit it.
Example 1:
As shown in FIG. 1, the learning circuit of the invention mainly comprises a difference circuit module, a summation circuit module, a weight refresh circuit module, a weight storage circuit module, a voltage inverter U4, a voltage follower U2, a ground terminal, and four switches used solely to realize the cyclic learning function.
The weight storage circuit module is connected to the ground terminal and the weight-voltage storage terminal and consists mainly of capacitor C1, across which the current weight voltage is stored. The weight-voltage storage terminal is connected to the non-inverting input of voltage follower U2, whose output voltage equals its input voltage; the output voltage fed back through the output feedback line is therefore the current weight voltage, which serves as the output voltage of the whole learning circuit.
One input of the difference circuit module is connected to the output of voltage follower U2 and the other to the preset voltage terminal; the module computes the difference between the current weight voltage and the preset voltage U, and its output is the error back-propagation voltage terminal.
One input of the summation circuit module is connected to the error back-propagation voltage terminal of the difference circuit module through the error back-propagation switch S1, and the other to the output of voltage follower U2 through the current-weight switch S2; S1 and S2 gate the error back-propagation voltage and the current weight voltage, respectively. The output of the summation circuit module is the inverted sum of the error back-propagation voltage and the current weight voltage, so it is connected to voltage inverter U4, which inverts it. When S1 and S2 are closed simultaneously, U4 outputs the learned weight-accumulation voltage, which is then passed to the weight refresh circuit module.
The weight refresh circuit module connects the output of voltage inverter U4 to the weight-voltage storage terminal. The weight-accumulation switch S3 is first closed; the error back-propagation switch S1 and the current-weight switch S2 are then opened, and S3 is opened next, so that the weight-accumulation voltage is temporarily held across capacitor C2. The learning switch S4 is then closed, placing C1 in parallel with C2 so that the voltage across C1 is rapidly refreshed; once S4 opens, the circuit has learned once. Let the error back-propagation voltage be U1 and the current weight voltage be U2; after one refresh, the new current weight voltage U2' is:
$U_2' = U_2 + \frac{1}{2}U_1$
Each learning step therefore adds one half of the error back-propagation voltage to the current weight voltage; this factor of one half acts as the learning rate, and changing the learning rate only requires changing the resistor values in the circuit. As this working sequence shows, the weight refresh is governed by the four switches, which can be driven by pulses. Following the closing and opening order described above, the pulse waveforms shown in FIG. 2 can be designed. As shown in FIG. 3, a square-wave pulse is generated by an NE555 and then delayed through the charging and discharging time of a capacitor to produce the required waveforms: the waveform output at Vo1 controls the error back-propagation switch S1 and the current-weight switch S2, the waveform at Vo3 controls the weight-accumulation switch S3, and the waveform at Vo4 controls the learning switch S4.
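The refresh rule can be checked with a short behavioral sketch. This is a simulation under stated assumptions, not the patented circuit itself: it assumes C1 = C2, which is consistent with the stated learning rate of one half, since paralleling equal capacitors averages their voltages.

    # Behavioral sketch of one learning cycle, assuming C1 = C2.
    def learn_step(u_weight, u_preset, c1=1.0, c2=1.0):
        u_error = u_preset - u_weight  # difference circuit: U1 = U - U2
        u_sum = u_error + u_weight     # summer plus inverter U4: U1 + U2, held on C2
        # S4 closes: C1 (at U2) and C2 (at U1 + U2) share charge in parallel,
        # giving U2' = U2 + U1 * C2 / (C1 + C2), i.e. U2 + U1/2 for equal capacitors.
        return (c1 * u_weight + c2 * u_sum) / (c1 + c2)

    u = 0.0                            # initial weight voltage
    for cycle in range(8):             # one pulse period per learning cycle
        u = learn_step(u, u_preset=1.0)
        print(f"cycle {cycle + 1}: weight voltage = {u:.4f}")

Under these assumptions the error halves on every cycle, so the weight voltage converges geometrically to the preset voltage U.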
The operation of the learning circuit can be understood in detail from the description above. FIG. 4 shows the overall circuit block diagram of an embodiment of the invention, which realizes the application of the BP neural network to function fitting in hardware. The complete circuit comprises a forward-propagation network and a back-propagation network. The forward network, which corresponds to the body of the fitted function, mainly comprises a weighted-summation circuit, an I/V conversion circuit, and an activation-function circuit; the backward network, the functional core of the BP neural network, mainly comprises a difference circuit, a multiplication circuit, and the learning circuit. Applied in the BP neural network circuit, the learning circuit directs the global training and gives the circuit the ability to process multiple sample data: the pulse-generation circuit that controls the learning circuit simultaneously controls the cyclic input of the samples, synchronizing training with sample input, so each sample is learned once per round and the trained system is biased toward no sample.
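The pulse-synchronized rotation of samples can be pictured with the sketch below. The single-layer structure, the tanh activation, and all names are illustrative assumptions; the patent does not fix them.

    import math

    def forward(x, w):
        # forward network: weighted summation followed by an activation function
        return math.tanh(sum(wi * xi for wi, xi in zip(w, x)))

    def train_round(w, samples, eta=0.5):
        # one full round: each sample is learned exactly once, mirroring the
        # pulse-synchronized cyclic sample input described above
        for x, target in samples:
            y = forward(x, w)
            err = target - y           # difference circuit
            grad = 1.0 - y * y         # multiplication circuit: derivative of tanh
            w = [wi + eta * err * grad * xi for wi, xi in zip(w, x)]
        return w

    w = [0.0, 0.0]
    samples = [([1.0, 0.0], 0.8), ([0.0, 1.0], -0.3)]
    for _ in range(50):                # repeated rounds give every sample an equal share
        w = train_round(w, samples)

Here eta = 0.5 mirrors the learning rate of one half set by the equal-capacitor refresh in the hardware.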
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the invention and is not intended to limit it; any modification, equivalent replacement, or improvement made within the spirit and principles of the invention shall fall within its scope of protection.

Claims (5)

1. A BP neural network learning circuit, comprising:
a ground terminal;
a difference circuit module, comprising a preset voltage terminal, an output voltage terminal fed back through an output feedback line, and an error back-propagation voltage terminal, wherein the preset voltage terminal carries the preset voltage U used to preset the circuit, the fed-back output voltage terminal carries the current weight voltage, and the error back-propagation voltage terminal outputs the error, the error being the difference between the preset voltage and the current weight voltage; the difference circuit module computes the error;
a summation circuit module, connected to the error back-propagation voltage terminal of the difference circuit module through switch S1 and to the fed-back output voltage terminal through switch S2, and comprising a weight-accumulation voltage terminal that outputs the inverted sum of the error back-propagation voltage and the current weight voltage; the summation circuit module performs the voltage summation;
a voltage inverter U4, connected to the weight-accumulation voltage terminal of the summation circuit module, which inverts the weight-accumulation voltage and outputs the positive sum of the error back-propagation voltage and the current weight voltage;
a weight refresh circuit module, in which switches S3 and S4 connect the inverted weight-accumulation voltage terminal to the weight-voltage storage terminal, the current weight voltage being refreshed through these switches;
a weight storage circuit module, connected to the ground terminal and the weight-voltage storage terminal, which stores the voltage across a capacitor C1; and
a voltage follower U2, connected to the weight-voltage storage terminal and the current-weight-voltage terminal, which outputs the current weight voltage and slows the leakage of capacitor C1.
2. The learning circuit of claim 1, wherein the weight refresh circuit module refreshes the current weight voltage under the control of a pulsed switch.
3. The learning circuit of claim 1, wherein the number of learning iterations is controlled by synchronizing the input of the samples with the pulse that controls learning, so that the number of learning iterations of each sample is controllable.
4. Use of the learning circuit of any one of claims 1-3 in image recognition or face recognition.
5. Use of a learning circuit as claimed in any one of claims 1 to 3 in function fitting.
CN201810850056.4A 2018-07-28 2018-07-28 BP neural network learning circuit Active CN110766130B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201810850056.4A | 2018-07-28 | 2018-07-28 | BP neural network learning circuit (granted as CN110766130B)

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201810850056.4A | 2018-07-28 | 2018-07-28 | BP neural network learning circuit (granted as CN110766130B)

Publications (2)

Publication Number | Publication Date
CN110766130A | 2020-02-07
CN110766130B | 2022-06-14

Family

ID=69328832

Family Applications (1)

Application Number | Priority Date | Filing Date | Title | Status
CN201810850056.4A | 2018-07-28 | 2018-07-28 | BP neural network learning circuit (granted as CN110766130B) | Active

Country Status (1)

Country Link
CN (1) CN110766130B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102610274A * 2012-04-06 2012-07-25 University of Electronic Science and Technology of China Weight adjustment circuit for variable-resistance synapses
WO2013111200A1 * 2012-01-23 2013-08-01 Panasonic Corporation Neural network circuit learning method
CN103430186A * 2012-01-20 2013-12-04 Panasonic Corporation Learning method for neural network circuit
WO2015001697A1 * 2013-07-04 2015-01-08 Panasonic Intellectual Property Management Co., Ltd. Neural network circuit and learning method thereof
US20150269483A1 * 2014-03-18 2015-09-24 Panasonic Intellectual Property Management Co., Ltd. Neural network circuit and learning method for neural network circuit
CN105701541A * 2016-01-13 2016-06-22 Harbin Institute of Technology Shenzhen Graduate School Circuit structure based on a memristive spiking neural network
CN107480782A * 2017-08-14 2017-12-15 University of Electronic Science and Technology of China On-chip learning neural network processor
CN108053029A * 2017-12-27 2018-05-18 Ningbo Shanqiu Electronic Technology Co., Ltd. Training method for a neural network based on a memory array

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103430186A * 2012-01-20 2013-12-04 Panasonic Corporation Learning method for neural network circuit
WO2013111200A1 * 2012-01-23 2013-08-01 Panasonic Corporation Neural network circuit learning method
CN102610274A * 2012-04-06 2012-07-25 University of Electronic Science and Technology of China Weight adjustment circuit for variable-resistance synapses
WO2015001697A1 * 2013-07-04 2015-01-08 Panasonic Intellectual Property Management Co., Ltd. Neural network circuit and learning method thereof
US20150178619A1 * 2013-07-04 2015-06-25 Panasonic Intellectual Property Management Co., Ltd. Neural network circuit and learning method thereof
US20150269483A1 * 2014-03-18 2015-09-24 Panasonic Intellectual Property Management Co., Ltd. Neural network circuit and learning method for neural network circuit
CN105701541A * 2016-01-13 2016-06-22 Harbin Institute of Technology Shenzhen Graduate School Circuit structure based on a memristive spiking neural network
CN107480782A * 2017-08-14 2017-12-15 University of Electronic Science and Technology of China On-chip learning neural network processor
CN108053029A * 2017-12-27 2018-05-18 Ningbo Shanqiu Electronic Technology Co., Ltd. Training method for a neural network based on a memory array

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
TOMOHARU YOKOYAMA et al.: "Study on simplification of processing elements in neural networks using circuit simulation", 2016 IEEE International Meeting for Future of Electron Devices, Kansai (IMFEDK) *
LI Anxin: "Research on BP Neural Networks and Their Hardware Implementation", China Masters' Theses Full-text Database, Information Science and Technology series *

Also Published As

Publication number Publication date
CN110766130B (en) 2022-06-14

Similar Documents

Publication Publication Date Title
Zhu et al. Speech emotion recognition model based on Bi-GRU and Focal Loss
CN109816026B (en) Fusion device and method of convolutional neural network and impulse neural network
Brandt et al. Adaptive interaction and its application to neural networks
JP4620943B2 (en) Product-sum operation circuit and method thereof
Alspector et al. Performance of a stochastic learning microchip
CN114049513A (en) Knowledge distillation method and system based on multi-student discussion
JPH0380379A (en) Integrated circuit device with learning function
CN109344964A (en) A kind of multiply-add calculation method and counting circuit suitable for neural network
Gupta et al. FPGA implementation of simplified spiking neural network
CN110766130B (en) BP neural network learning circuit
US5485548A (en) Signal processing apparatus using a hierarchical neural network
Morita Smooth recollection of a pattern sequence by nonmonotone analog neural networks
Florian Biologically inspired neural networks for the control of embodied agents
Hossain et al. Reservoir computing system using biomolecular memristor
Liu et al. A neural network with a single recurrent unit for associative memories based on linear optimization
Komkov et al. The recurrent processing unit: Hardware for high speed machine learning
Nomura et al. An Energy Efficient Stochastic+ Spiking Neural Network
Zheng et al. Spiking neural encoding and hardware implementations for neuromorphic computing
EP4386629A1 (en) Monostable multivibrators-based spiking neural network training method
JPH03260785A (en) Signal processing method and its network
JP3338713B2 (en) Signal processing device
Morita et al. A model of context‐dependent association using selective desensitization of nonmonotonic neural elements
JP2517662B2 (en) Chain controller
JPH04250557A (en) Learning method of signal processor
Araki et al. Supervised STDP Learning with Weight Decay for Spiking Neural Networks

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant