CN110874629A - Structure optimization method of reserve pool network based on excitability and inhibition STDP - Google Patents
- Publication number
- CN110874629A (application number CN201910911676.9A)
- Authority
- CN
- China
- Prior art keywords
- neuron
- synaptic
- inhibitory
- excitatory
- stdp
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications (all under G—Physics › G06—Computing › G06N—Computing arrangements based on specific computational models › G06N3/00—Computing arrangements based on biological models › G06N3/02—Neural networks)
- G06N3/088 — Non-supervised learning, e.g. competitive learning (under G06N3/08—Learning methods)
- G06N3/044 — Recurrent networks, e.g. Hopfield networks (under G06N3/04—Architecture, e.g. interconnection topology)
- G06N3/063 — Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons, using electronic means (under G06N3/06—Physical realisation)
Abstract
The invention provides a structure optimization method for a reserve pool network based on excitatory and inhibitory STDP. The reserve pool network is formed by connecting excitatory neurons (E) and inhibitory neurons (I) and has four connection types: E-E, E-I, I-E and I-I; the E-E connection weights are adjusted by excitatory STDP, and the I-E connection weights by inhibitory STDP. Through the unsupervised learning rule formed by the interaction of excitatory and inhibitory STDP, the invention achieves a balance between excitation and inhibition and forms a heterogeneous network structure whose long-tailed weight distribution matches that of biological neural networks; the network has a large information capacity and optimal performance.
Description
Technical Field
The invention belongs to the technical field of artificial neural networks, and particularly relates to a structure optimization method for a reserve pool network based on excitatory and inhibitory STDP.
Background
Echo state networks (ESNs) are a relatively new neural network model. Macroscopically, an ESN is a multi-layer feedforward neural network; its structure is shown in FIG. 1, where the locally recurrent neural network in the middle of the drawing is the reserve pool (reservoir computing) module, the key part of the echo state network.
The reserve pool module is a recurrent neural network whose neurons are randomly and sparsely connected; the connection weights have a decisive influence on system performance. In the prior art, the connection weights of the reserve pool network are mainly set by experience. This requires experienced practitioners, the subjectivity involved strongly affects network performance, the resulting performance is not necessarily optimal, and the process consumes a great deal of time. Moreover, in domains where empirical data are limited, it is difficult to obtain a network structure with good performance.
Disclosure of Invention
In view of this, the present invention aims to provide a structure optimization method for a reserve pool network based on excitatory and inhibitory STDP, in which the structure is obtained through learning rather than human experience and achieves optimal performance and a large information capacity.
In order to achieve the purpose, the technical scheme of the invention is realized as follows:
In the structure optimization method of the reserve pool network based on excitatory and inhibitory STDP, the reserve pool network is formed by connecting excitatory neurons E and inhibitory neurons I, and the network has four connection types: E-E, E-I, I-E and I-I; the E-E connection weights are adjusted by excitatory STDP, and the I-E connection weights by inhibitory STDP.
Further, inhibitory synaptic plasticity acts on the I-E synapses and is modeled as spike-timing-dependent plasticity (STDP): approximately synchronous firing of the pre- and post-synaptic neurons strengthens the inhibitory synaptic weight, while isolated firing of a single neuron weakens it. The regulatory function of the inhibitory synaptic plasticity is a function of the spike-time difference

Δt = t_j − t_i

where t_j and t_i are the spike times of the pre- and post-synaptic neurons, respectively.
Further, excitatory synaptic plasticity acts on the E-E synapses and is modeled as spike-timing-dependent plasticity (STDP): the synaptic weight is strengthened when the post-synaptic neuron i fires after the pre-synaptic neuron j, and weakened when neuron i fires before neuron j. That is, whenever neuron i or j fires, the weight is updated as W_ji + ΔW_i → W_ji, with the regulatory function

ΔW_i = A_+(W_ji) · exp(−(t_i − t_j)/τ_+)   if t_i ≥ t_j,
ΔW_i = −A_−(W_ji) · exp((t_i − t_j)/τ_−)   if t_i < t_j,

where W_ji is the synaptic weight; t_j and t_i are the most recent firing times of neurons j and i; τ_+ and τ_− set the extent of the pre-post spike interval over which enhancement or reduction occurs; and A_+(W_ji) and A_−(W_ji) are weight-dependent functions parameterized by the learning rate η and the maximum weight W_max.
Further, the change in inhibitory synaptic strength is determined by trace tracking, as follows:

A trace x_i is created for each neuron i. Between spikes it decays as

τ_STDP · dx_i/dt = −x_i,

where τ_STDP is the synaptic plasticity time constant, and whenever neuron i fires: x_i → x_i + 1.

The inhibitory synaptic weight W_ij from inhibitory neuron j to excitatory neuron i then changes with the firing activity of the connected neurons:

W_ij → W_ij + η·(x_i − α)   when the pre-synaptic neuron j fires,
W_ij → W_ij + η·x_j   when the post-synaptic neuron i fires,

where x_i and x_j are the synaptic traces of the neurons, η is the learning rate, and α = 2·ρ_0·τ_STDP is the attenuation (depression) factor, with ρ_0 the target firing rate, a constant.
Furthermore, the ratio of the number of excitatory to inhibitory neurons in the reserve pool network is 4:1; the initial structure is randomly connected, with connection probability 0.2.
Further, the neurons in the reserve pool network adopt the LIF neuron model, and the sub-threshold membrane potential dynamics of post-synaptic neuron i are expressed as

τ_m · dV_i/dt = (V_rest − V_i) + (g_i^E/g_leak)·(V_E − V_i) + (g_i^I/g_leak)·(V_I − V_i),
if V_i ≥ V_th, then V_i = V_rest,

where τ_m is the membrane time constant; V_rest the resting potential of the neuron; V_th the membrane potential threshold for firing; g_leak the leak conductance; V_i the sub-threshold membrane potential of neuron i; V_E and V_I the excitatory and inhibitory reversal potentials; and g_i^E and g_i^I the excitatory and inhibitory conductances of the neuron.
Furthermore, the external input to the reserve pool consists of spike times from the encoding neurons, acting on the neurons through excitatory synaptic conductance; the connection probability between the encoding-layer neurons and the excitatory neurons of the reserve pool is 0.2. When external stimulation fails to raise the membrane potential V_i of neuron i to the firing threshold V_th, the membrane potential gradually decays to the resting potential; when external stimulation drives V_i above V_th, the neuron generates an action potential, after which the membrane potential returns to the resting potential and the neuron does not respond to external stimuli during an absolute refractory period τ_ref.
Further, the reserve pool network uses a conductance-based synapse model, in which the post-synaptic output current is approximately linear in the membrane potential:

I_syn = g(t)·(V_m(t) − V_X)

where I_syn is the synaptic output current; g the synaptic conductance; V_m the membrane potential of the post-synaptic neuron; and V_X the reversal potential, which is the excitatory reversal potential V_E for excitatory synapses and the inhibitory reversal potential V_I for inhibitory synapses.

Synaptic conductance is divided into excitatory and inhibitory conductance according to the stimulus received. When neuron i receives a spike from an excitatory pre-synaptic neuron j, its excitatory conductance increases and otherwise decays; when neuron i receives a spike from an inhibitory pre-synaptic neuron j, its inhibitory conductance increases and otherwise decays:

τ_E · dg_i^E/dt = −g_i^E + ĝ_E · Σ_j W_ij·S_j(t),
τ_I · dg_i^I/dt = −g_i^I + ĝ_I · Σ_j W_ij·S_j(t),

where τ_E is the excitatory synaptic time constant; τ_I the inhibitory synaptic time constant; the summation on the right runs over the firing sequences S_j(t) of the pre-synaptic neurons j; W_ij is the synaptic weight from neuron j to neuron i; and ĝ_E and ĝ_I are the unit excitatory and inhibitory conductances.
Compared with the prior art, the invention has the following advantages:
(1) The invention achieves a balance between excitation and inhibition through the unsupervised learning rule formed by the interaction of excitatory and inhibitory STDP, and forms a heterogeneous network structure whose long-tailed weight distribution matches that of biological neural networks; the network has a large information capacity and optimal performance.
(2) The traditional reserve pool network structure is randomly connected, with connection weights approximately normally distributed; the network structure of the present application evolves through learning toward an optimal structure whose weight distribution approaches a power law (heterogeneous).
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate an embodiment of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a schematic diagram of a structural model of a spiking neural network;
FIG. 2 is a graphical representation of the function of inhibitory synaptic plasticity according to an embodiment of the present invention;
FIG. 3 is a graphical representation of the functional response of excitatory synaptic plasticity of an embodiment of the present invention;
FIG. 4 is a diagram illustrating a synaptic weight distribution of a pool network structure according to an embodiment of the present invention;
FIG. 5 is a diagram illustrating principal component analysis results from two angles of a network structure of a reserve pool according to an embodiment of the present invention;
FIG. 6 is a diagram illustrating a synaptic weight distribution of a pool network structure according to a first conventional network model;
FIG. 7 is a diagram illustrating principal component analysis results of two angles of a network structure of a reserve pool in a first conventional network model;
FIG. 8 is a diagram illustrating a synapse weight distribution of a pool network structure in a second conventional network model;
fig. 9 is a diagram illustrating principal component analysis results of two angles of a network structure of a reserve pool in the second conventional network model.
Detailed Description
It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict.
The present invention will be described in detail below with reference to the embodiments and the attached drawings.
The reserve pool module of the invention adopts a spiking neural network (SNN). SNNs are constructed by imitating the structural characteristics of brain neuronal networks and, compared with traditional artificial neural network models, offer the advantages of local learning and high computational efficiency. In recent years the spiking neural network, as a typical form of brain-like neural network (brain-like artificial intelligence), has received much attention and has been called the new generation of neural networks.
Using principles from neuroscience, the application provides an optimal design method for the reserve pool neural network structure based on the interaction of excitatory and inhibitory STDP in a spiking neural network model. The core is as follows: the reserve pool network is added between the input and output layers of the spiking neural network model and is a complex network formed by connecting a large number of excitatory (E) and inhibitory (I) neurons. The network has four connection types: E-E, E-I, I-E and I-I; the E-E connection weights are adjusted by the excitatory STDP learning rule, and the I-E connection weights by the inhibitory STDP learning rule. The interaction between the two is embodied in their jointly adjusting the connection weights and optimizing the structure of the network; both are unsupervised learning rules.
Core points of the optimization result: on the one hand, the magnitudes of the network's connection weights are optimized (to suitable sizes); on the other hand, the distribution of the connection weights is optimized (toward heterogeneity). The application thus forms a heterogeneous network structure with the long-tailed distribution characteristic of biological neural networks.
Specifically, the numbers of excitatory and inhibitory neurons in the reserve pool network of the present application are 800 and 200 respectively (both may be changed); the initial structure is randomly connected, with connection probability 0.2.
Four types of synaptic connection are formed between neurons according to the pre- and post-synaptic neuron types: excitatory connections onto excitatory neurons (E-E), excitatory connections onto inhibitory neurons (E-I), inhibitory connections onto excitatory neurons (I-E), and inhibitory connections onto inhibitory neurons (I-I). The 4:1 ratio of excitatory to inhibitory neurons follows the actual biological nervous system. Table 1 below gives the parameter meanings and values of the neural network structure of this embodiment:
TABLE 1 parameter significance and value of neural network architecture
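As a concrete illustration of the wiring just described (800 excitatory and 200 inhibitory neurons, random connections with probability 0.2), the initial reserve pool can be sketched as follows; all names and the initial weight value are illustrative assumptions, not taken from Table 1:

```python
import numpy as np

def build_reservoir(N_E=800, N_I=200, p_conn=0.2, w_init=0.1, seed=0):
    """Random initial reserve pool: N_E excitatory + N_I inhibitory neurons."""
    rng = np.random.default_rng(seed)
    N = N_E + N_I
    # Connection mask: True where a synapse exists; no self-connections.
    mask = rng.random((N, N)) < p_conn
    np.fill_diagonal(mask, False)
    # All existing synapses start at a common initial weight.
    W = np.where(mask, w_init, 0.0)
    return W, mask

W, mask = build_reservoir()
print(W.shape)                          # (1000, 1000)
print(abs(mask.mean() - 0.2) < 0.01)   # empirical density close to p_conn: True
```

By convention here the first N_E indices are excitatory and the rest inhibitory; the sign of each connection's effect is handled by the conductance type, not by the weight itself.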
The neuron model in the reserve pool network adopts the leaky integrate-and-fire (LIF) model, which has a small computational cost, accurately describes the physiological characteristics of biological neurons, and is widely used in computational neuroscience. The sub-threshold membrane potential dynamics of post-synaptic neuron i can be expressed as

τ_m · dV_i/dt = (V_rest − V_i) + (g_i^E/g_leak)·(V_E − V_i) + (g_i^I/g_leak)·(V_I − V_i),
if V_i ≥ V_th, then V_i = V_rest,

where τ_m is the membrane time constant; V_rest the resting potential of the neuron; V_th the membrane potential threshold for firing; g_leak the leak conductance; V_i the sub-threshold membrane potential of neuron i; V_E and V_I the excitatory and inhibitory reversal potentials; and g_i^E and g_i^I the excitatory and inhibitory conductances of the neuron.
In the application, the external input to the reserve pool consists of spike times from the encoding neurons, acting on the neurons through excitatory synaptic conductance; the connection probability between the encoding-layer neurons and the excitatory neurons of the reserve pool is 0.2. When external stimulation fails to raise the membrane potential V_i of neuron i to the firing threshold V_th, the membrane potential gradually decays to the resting potential; when external stimulation drives V_i above V_th, the neuron generates an action potential, after which the membrane potential returns to the resting potential and the neuron does not respond to external stimuli during an absolute refractory period τ_ref. Typical values for all parameters of the neuron model are shown in Table 2:
TABLE 2 neuron model parameter values
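The LIF dynamics and refractory behavior described above can be sketched as a single Euler update step; the parameter defaults below are illustrative placeholders, not the values of Table 2:

```python
import numpy as np

def lif_step(V, g_E, g_I, refrac, dt=0.1, tau_m=20.0, V_rest=-60.0,
             V_th=-50.0, V_E=0.0, V_I=-80.0, g_leak=1.0, t_ref=5.0):
    """One Euler step of
    tau_m * dV/dt = (V_rest - V) + (g_E/g_leak)*(V_E - V) + (g_I/g_leak)*(V_I - V).
    Neurons in their absolute refractory period are clamped to V_rest."""
    dV = ((V_rest - V) + (g_E / g_leak) * (V_E - V)
          + (g_I / g_leak) * (V_I - V)) * (dt / tau_m)
    V = np.where(refrac > 0.0, V_rest, V + dV)
    spiked = V >= V_th                  # threshold crossing
    V = np.where(spiked, V_rest, V)    # reset to the resting potential
    refrac = np.where(spiked, t_ref, np.maximum(refrac - dt, 0.0))
    return V, spiked, refrac
```

All arrays are per-neuron vectors, so one call advances the whole reserve pool by one time step.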
In the reserve pool, the present application uses a conductance-based synapse model, in which the post-synaptic output current is approximately linear in the membrane potential:

I_syn = g(t)·(V_m(t) − V_X)

where I_syn is the synaptic output current; g the synaptic conductance; V_m the membrane potential of the post-synaptic neuron; and V_X the reversal potential, which is the excitatory reversal potential V_E for excitatory synapses and the inhibitory reversal potential V_I for inhibitory synapses.

Synaptic conductance is divided into excitatory and inhibitory conductance according to the stimulus received. When neuron i receives a spike from an excitatory pre-synaptic neuron j, its excitatory conductance increases and otherwise decays; when neuron i receives a spike from an inhibitory pre-synaptic neuron j, its inhibitory conductance increases and otherwise decays:

τ_E · dg_i^E/dt = −g_i^E + ĝ_E · Σ_j W_ij·S_j(t),
τ_I · dg_i^I/dt = −g_i^I + ĝ_I · Σ_j W_ij·S_j(t),

where τ_E is the excitatory synaptic time constant; τ_I the inhibitory synaptic time constant; the summation on the right runs over the firing sequences S_j(t) of the pre-synaptic neurons j; W_ij is the synaptic weight from neuron j to neuron i; and ĝ_E and ĝ_I are the unit excitatory and inhibitory conductances. The detailed parameters of the synapse model are shown in Table 3:
TABLE 3 synaptic model parameter values
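The conductance dynamics described above (exponential decay between spikes, incremented by the weighted spikes of pre-synaptic neurons) can be sketched as follows; parameter values are illustrative, not those of Table 3:

```python
import numpy as np

def update_conductances(g_E, g_I, W, spikes, is_exc, dt=0.1,
                        tau_E=5.0, tau_I=10.0, g_hat_E=1.0, g_hat_I=1.0):
    """Exponential decay of both conductances, then an increment of
    g_hat * W_ij for every pre-synaptic neuron j that spiked this step.
    W has shape (n_post, n_pre); is_exc marks excitatory pre-neurons."""
    g_E = g_E * np.exp(-dt / tau_E)
    g_I = g_I * np.exp(-dt / tau_I)
    g_E = g_E + g_hat_E * W[:, is_exc & spikes].sum(axis=1)
    g_I = g_I + g_hat_I * W[:, (~is_exc) & spikes].sum(axis=1)
    return g_E, g_I
```

Between calls the conductances only decay, which reproduces the "otherwise it decays" behavior of the text.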
In neuroscience, synaptic plasticity causes synaptic strength to increase or decrease with the electrical activity of the neural network on timescales of seconds to days, thereby changing the network structure; it is recognized as the basis of learning, memory and cognition. At the same time, synaptic plasticity provides homeostatic control of circuits in the central nervous system, which is critical to the stability of the nervous system.
The application applies inhibitory and excitatory synaptic plasticity to the reserve pool simultaneously, adjusting the excitatory-inhibitory balance of the network and improving the stability and information transmission of the system.
The application lets inhibitory synaptic plasticity act on the I-E synapses; the selected inhibitory plasticity model is spike-timing-dependent plasticity (STDP), in which approximately synchronous firing of the pre- and post-synaptic neurons strengthens the inhibitory synaptic weight while isolated firing of a single neuron weakens it. This accords with Hebbian theory: in a group of interconnected neurons, repeated and persistent co-activation improves the efficiency of the connecting synapses. The regulatory function curve of inhibitory synaptic plasticity is shown in FIG. 2, as a function of the spike-time difference

Δt = t_j − t_i

where t_j and t_i are the spike times of the pre- and post-synaptic neurons, respectively.
To further specify the change in inhibitory synaptic strength, the application establishes a trace x_i for each neuron i. Between spikes the trace decays as

τ_STDP · dx_i/dt = −x_i,

where τ_STDP is the synaptic plasticity time constant, and whenever neuron i fires: x_i → x_i + 1.

The inhibitory synaptic weight W_ij from inhibitory neuron j to excitatory neuron i then changes with the firing activity of the connected neurons:

W_ij → W_ij + η·(x_i − α)   when the pre-synaptic neuron j fires,
W_ij → W_ij + η·x_j   when the post-synaptic neuron i fires,

where x_i and x_j are the synaptic traces of the neurons, η is the learning rate, and α = 2·ρ_0·τ_STDP is the attenuation (depression) factor, with ρ_0 the target firing rate, a constant. The meanings and values of the inhibitory synaptic plasticity parameters are shown in Table 4.
TABLE 4 significance and value of parameters for inhibitory synaptic plasticity
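The trace-based inhibitory rule above can be sketched as follows. The update on pre- and post-synaptic spikes follows the commonly used symmetric trace formulation consistent with the description (potentiation by the partner neuron's trace, depression of lone pre-synaptic spikes through the factor α = 2·ρ_0·τ_STDP); the parameter values are illustrative assumptions, not those of Table 4:

```python
import numpy as np

def inhib_stdp_step(x_pre, x_post, W, pre_spk, post_spk, dt=0.1,
                    tau_stdp=20.0, eta=1e-3, rho0=0.003):
    """One step of trace-based inhibitory STDP on I->E weights W
    (shape: n_post x n_pre). pre_spk / post_spk are 0/1 spike vectors."""
    alpha = 2.0 * rho0 * tau_stdp            # depression factor 2*rho0*tau_STDP
    # Trace decay, then x -> x + 1 on each spike.
    x_pre = x_pre * np.exp(-dt / tau_stdp) + pre_spk
    x_post = x_post * np.exp(-dt / tau_stdp) + post_spk
    # Pre-synaptic (inhibitory) spike: weight changes by post trace minus alpha.
    W = W + eta * np.outer(x_post - alpha, pre_spk)
    # Post-synaptic (excitatory) spike: weight changes by pre trace.
    W = W + eta * np.outer(post_spk, x_pre)
    return x_pre, x_post, np.clip(W, 0.0, None)  # weights stay non-negative
```

Because the pre-spike term is negative whenever the post trace is below α, the rule pushes the excitatory neurons toward the target rate ρ_0.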
Meanwhile, excitatory synaptic plasticity acts on the E-E synapses; the selected excitatory plasticity model is spike-timing-dependent plasticity (STDP), in which the synaptic weight is strengthened when the post-synaptic neuron i fires after the pre-synaptic neuron j and weakened when neuron i fires before neuron j. That is, whenever neuron i or j fires, the weight is updated as W_ji + ΔW_i → W_ji. The curve of the regulatory function is shown in FIG. 3:

ΔW_i = A_+(W_ji) · exp(−(t_i − t_j)/τ_+)   if t_i ≥ t_j,
ΔW_i = −A_−(W_ji) · exp((t_i − t_j)/τ_−)   if t_i < t_j,

where t_j and t_i are the most recent firing times of neurons j and i; τ_+ and τ_− set the extent of the pre-post spike interval over which enhancement or reduction occurs; and A_+(W_ji) and A_−(W_ji) are weight-dependent functions parameterized by the learning rate η and the maximum weight W_max.
TABLE 5 significance and value of parameters for excitatory synaptic plasticity
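The pair-based excitatory STDP above can be sketched for a single synapse as follows. The soft-bound choices A_+(W) = η·(W_max − W) and A_−(W) = η·W are one common way to realize the weight-dependent functions mentioned in the text and are an assumption here, as are the parameter values:

```python
import math

def exc_stdp_dw(W, t_pre, t_post, tau_plus=20.0, tau_minus=20.0,
                eta=0.01, W_max=1.0):
    """Weight change for one pre/post spike pairing (dt = t_post - t_pre)."""
    dt = t_post - t_pre
    if dt >= 0:   # post fires after pre: potentiation, soft-bounded by W_max
        return eta * (W_max - W) * math.exp(-dt / tau_plus)
    else:         # post fires before pre: depression, proportional to W
        return -eta * W * math.exp(dt / tau_minus)
```

With these soft bounds, weights near W_max potentiate only weakly and weights near zero depress only weakly, which keeps the distribution inside [0, W_max] without hard clipping.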
The evaluation index for the structure optimization of the reserve pool network is as follows: the information capacity of the network is measured through a principal component analysis (PCA) plot; a large information capacity indicates an optimal network structure.
The traditional reserve pool network structure is randomly connected (homogeneous), with connection weights following a normal distribution. The reserve pool network structure of the present application, optimized through the interaction of excitatory and inhibitory STDP, approaches a power-law distribution (heterogeneous). PCA analysis shows that, for the same average connection strength, heterogeneous networks have a greater information capacity than homogeneous ones.
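The PCA-based capacity comparison can be sketched with the participation ratio of the covariance eigenvalue spectrum, a standard proxy for effective dimensionality (this specific metric is an illustrative choice, not stated in the text):

```python
import numpy as np

def participation_ratio(states):
    """states: (T, N) matrix of reservoir activity. Returns the participation
    ratio of the covariance eigenvalues, between 1 (one dominant PC) and N."""
    X = states - states.mean(axis=0)
    cov = X.T @ X / max(len(X) - 1, 1)
    lam = np.clip(np.linalg.eigvalsh(cov), 0.0, None)
    return lam.sum() ** 2 / (lam ** 2).sum()

rng = np.random.default_rng(0)
iso = rng.standard_normal((500, 50))                                 # high-dimensional activity
low = rng.standard_normal((500, 1)) @ rng.standard_normal((1, 50))   # rank-1 activity
print(participation_ratio(iso) > participation_ratio(low))  # True
```

A flatter eigenvalue spectrum gives a higher participation ratio, matching the intuition that the PCA plots of FIG. 5 indicate a larger information capacity than those of FIG. 7 and FIG. 9.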
The optimized reserve pool neural network structure is compared below with two traditional network structures to illustrate the optimization result of the method:
FIG. 4 shows the synaptic weight distribution of the optimized reserve pool network structure: the network connections W_ij are strong, and the synaptic weights are non-uniformly distributed, approximately following a power law. FIG. 5 shows the principal component analysis (PCA) results from two angles, indicating a large information capacity.

FIG. 6 shows the synaptic weight distribution of the first traditional network model: the network connections W_ij are strong, but the synaptic weights are uniformly distributed. FIG. 7 shows the PCA results from two angles, indicating a small information capacity.

FIG. 8 shows the synaptic weight distribution of the second traditional network model: the network connections W_ij are weak, and the synaptic weights are uniformly distributed. FIG. 9 shows the PCA results from two angles, indicating a small information capacity.
Application example:
Learning in deep neural networks requires massive amounts of data, whereas medical data are inherently small-sample, difficult to acquire, and costly. The spiking neural network therefore has good application prospects in the intelligent analysis of medical images, for example in the identification of lung nodules (tumors).
The above description covers only preferred embodiments of the present invention and is not intended to limit it; any modifications, equivalent replacements, improvements and the like made within the spirit and principles of the present invention are intended to be included within its scope of protection.
Claims (8)
1. A structure optimization method of a reserve pool network based on excitatory and inhibitory STDP, characterized in that: the reserve pool network is a network formed by connecting excitatory neurons E and inhibitory neurons I, and four connection types exist in the network: E-E, E-I, I-E and I-I, wherein the E-E connection weights are adjusted by excitatory STDP and the I-E connection weights by inhibitory STDP.
2. The method of claim 1 for structural optimization of an excitatory and inhibitory STDP-based pool network, wherein: inhibitory synaptic plasticity acts on the I-E synapses and is modeled as spike-timing-dependent plasticity STDP, in which approximately synchronous firing of the pre- and post-synaptic neurons strengthens the inhibitory synaptic weight while isolated firing of a single neuron weakens it; the regulatory function of the inhibitory synaptic plasticity is a function of the spike-time difference

Δt = t_j − t_i

where t_j and t_i are the spike times of the pre- and post-synaptic neurons, respectively.
3. The method of claim 1 for structural optimization of an excitatory and inhibitory STDP-based pool network, wherein: excitatory synaptic plasticity acts on the E-E synapses and is modeled as spike-timing-dependent plasticity STDP, in which the synaptic weight is strengthened when the post-synaptic neuron i fires after the pre-synaptic neuron j and weakened when neuron i fires before neuron j; that is, whenever neuron i or j fires, the weight is updated as W_ji + ΔW_i → W_ji, with the regulatory function

ΔW_i = A_+(W_ji) · exp(−(t_i − t_j)/τ_+)   if t_i ≥ t_j,
ΔW_i = −A_−(W_ji) · exp((t_i − t_j)/τ_−)   if t_i < t_j,

where W_ji is the synaptic weight; t_j and t_i are the most recent firing times of neurons j and i; τ_+ and τ_− set the extent of the pre-post spike interval over which enhancement or reduction occurs; and A_+(W_ji) and A_−(W_ji) are weight-dependent functions parameterized by the learning rate η and the maximum weight W_max.
4. The method of claim 1 for structural optimization of an excitatory and inhibitory STDP-based pool network, wherein: the change in inhibitory synaptic strength is determined by trace tracking, as follows:

a trace x_i is created for each neuron i; between spikes it decays as

τ_STDP · dx_i/dt = −x_i,

where τ_STDP is the synaptic plasticity time constant, and whenever neuron i fires: x_i → x_i + 1;

the inhibitory synaptic weight W_ij from inhibitory neuron j to excitatory neuron i then changes with the firing activity of the connected neurons:

W_ij → W_ij + η·(x_i − α)   when the pre-synaptic neuron j fires,
W_ij → W_ij + η·x_j   when the post-synaptic neuron i fires,

where x_i and x_j are the synaptic traces of the neurons, η is the learning rate, and α = 2·ρ_0·τ_STDP is the attenuation (depression) factor, with ρ_0 the target firing rate, a constant.
5. The method of claim 1 for structural optimization of an excitatory and inhibitory STDP-based pool network, wherein: the ratio of the number of excitatory to inhibitory neurons in the reserve pool network is 4:1; the initial structure is randomly connected, with connection probability 0.2.
6. The method of claim 1 for structural optimization of an excitatory and inhibitory STDP-based pool network, wherein: the neuron model in the reserve pool network adopts the LIF neuron model, and the sub-threshold membrane potential dynamics of post-synaptic neuron i are expressed as

τ_m · dV_i/dt = (V_rest − V_i) + (g_i^E/g_leak)·(V_E − V_i) + (g_i^I/g_leak)·(V_I − V_i),
if V_i ≥ V_th, then V_i = V_rest,

where τ_m is the membrane time constant; V_rest the resting potential of the neuron; V_th the membrane potential threshold for firing; g_leak the leak conductance; V_i the sub-threshold membrane potential of neuron i; V_E and V_I the excitatory and inhibitory reversal potentials; and g_i^E and g_i^I the excitatory and inhibitory conductances of the neuron.
7. The method of claim 1 for structural optimization of an excitatory and inhibitory STDP-based pool network, wherein: the external input to the reserve pool consists of spike times from encoding neurons, which act on the reservoir neurons as excitatory synaptic conductance; the connection probability between encoding-layer neurons and the excitatory neurons of the reserve pool is 0.2. When external stimulation fails to raise the membrane potential Vi of neuron i to the firing threshold Vth, the membrane potential gradually decays back to the resting potential; when external stimulation drives Vi above Vth, the neuron generates an action potential, after which the membrane potential returns to the resting potential and the neuron does not respond to external stimuli for an absolute refractory period τref.
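A forward-Euler sketch of the membrane dynamics and refractory behavior of claims 6–7; the equation form follows the conductance-based LIF model described there, and every numeric parameter value is an illustrative assumption.

```python
TAU_M = 20.0         # ms, membrane time constant (illustrative)
V_REST = -60.0       # mV, resting potential (illustrative)
V_TH = -50.0         # mV, firing threshold (illustrative)
V_E, V_I = 0.0, -80.0  # mV, excitatory / inhibitory reversal potentials
G_LEAK = 10.0        # nS, leak conductance (illustrative)
TAU_REF = 5.0        # ms, absolute refractory period (illustrative)

def lif_step(v, g_e, g_i, ref_left, dt=0.1):
    """Advance one neuron by dt (ms); returns (v, ref_left, spiked)."""
    if ref_left > 0:             # inside refractory period: ignore input
        return V_REST, ref_left - dt, False
    dv = (V_REST - v) + (g_e / G_LEAK) * (V_E - v) + (g_i / G_LEAK) * (V_I - v)
    v = v + dt * dv / TAU_M
    if v >= V_TH:                # threshold crossed: spike, reset, refract
        return V_REST, TAU_REF, True
    return v, 0.0, False
```

With zero input conductance the potential relaxes toward Vrest; a strong excitatory conductance drives it past Vth, producing a spike followed by a reset and a τref-long silent period, exactly the behavior claim 7 describes.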
8. The method of claim 1 for structural optimization of an excitatory and inhibitory STDP-based pool network, wherein: the reserve pool network uses a conductance-based synapse model in which the postsynaptic output current depends approximately linearly on the membrane potential, as follows:
Isyn = g(t)·(Vm(t) − VX)
where Isyn is the synaptic output current; g is the synaptic conductance; Vm is the membrane potential of the postsynaptic neuron; and VX is the reversal potential: for excitatory synapses VX is the excitatory reversal potential VE, while for inhibitory synapses it is the inhibitory reversal potential VI.
Synaptic conductance is divided into excitatory and inhibitory conductance according to the type of presynaptic input. When neuron i receives a spike from excitatory presynaptic neuron j, its excitatory conductance increases and otherwise decays; when neuron i receives a spike from inhibitory presynaptic neuron j, its inhibitory conductance increases and otherwise decays:

τE·dgiE/dt = −giE + ḡE·Σj Wij·Sj(t)
τI·dgiI/dt = −giI + ḡI·Σj Wij·Sj(t)

where τE is the excitatory synaptic time constant; τI is the inhibitory synaptic time constant; the summation on the right-hand side runs over the firing sequences Sj(t) of the presynaptic neurons j; Wij is the synaptic weight from neuron j to neuron i; ḡE is the unit excitatory conductance; and ḡI is the unit inhibitory conductance.
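A discrete-time sketch of the conductance dynamics in claim 8: between spikes the conductance decays exponentially, and each presynaptic spike increments it by the unit conductance scaled by the synaptic weight. Parameter values are illustrative assumptions.

```python
import numpy as np

TAU_E = 5.0      # ms, excitatory synaptic time constant (illustrative)
UNIT_G_E = 1.0   # unit excitatory conductance (illustrative)

def update_g(g, dt, spiking_weights=()):
    """Decay g over dt, then add unit conductance per presynaptic spike."""
    g = g * np.exp(-dt / TAU_E)
    for w_ij in spiking_weights:   # W_ij for each presynaptic j that fired
        g += UNIT_G_E * w_ij
    return g

def syn_current(g, v_m, v_rev):
    """Claim 8's conductance-based current: I_syn = g * (V_m - V_X)."""
    return g * (v_m - v_rev)
```

The inhibitory conductance follows the same scheme with its own time constant τI and unit conductance ḡI.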
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910911676.9A CN110874629A (en) | 2019-09-25 | 2019-09-25 | Structure optimization method of reserve pool network based on excitability and inhibition STDP |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110874629A true CN110874629A (en) | 2020-03-10 |
Family
ID=69718061
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910911676.9A Pending CN110874629A (en) | 2019-09-25 | 2019-09-25 | Structure optimization method of reserve pool network based on excitability and inhibition STDP |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110874629A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112308221A (en) * | 2020-10-14 | 2021-02-02 | 成都市深思创芯科技有限公司 | Working memory hardware implementation method based on reserve pool calculation |
CN112308221B (en) * | 2020-10-14 | 2024-02-27 | 成都市深思创芯科技有限公司 | Working memory hardware implementation method based on reserve pool calculation |
CN116312810A (en) * | 2023-05-16 | 2023-06-23 | 中国石油大学(华东) | Method for predicting human brain excitability and inhibitory connection based on hypergraph nerve field |
CN116312810B (en) * | 2023-05-16 | 2023-08-15 | 中国石油大学(华东) | Method for predicting human brain excitability and inhibitory connection based on hypergraph nerve field |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Diehl et al. | Unsupervised learning of digit recognition using spike-timing-dependent plasticity | |
Pham et al. | Control chart pattern recognition using spiking neural networks | |
KR101793011B1 (en) | Efficient hardware implementation of spiking networks | |
US9418331B2 (en) | Methods and apparatus for tagging classes using supervised learning | |
CN112101535B (en) | Signal processing method of impulse neuron and related device | |
Lin et al. | Relative ordering learning in spiking neural network for pattern recognition | |
Shrestha et al. | Stable spike-timing dependent plasticity rule for multilayer unsupervised and supervised learning | |
US20100076916A1 (en) | Autonomous Learning Dynamic Artificial Neural Computing Device and Brain Inspired System | |
US20110145179A1 (en) | Framework for the organization of neural assemblies | |
WO2015112262A1 (en) | Configuring sparse neuronal networks | |
US9959499B2 (en) | Methods and apparatus for implementation of group tags for neural models | |
KR20160125967A (en) | Method and apparatus for efficient implementation of common neuron models | |
CN110874629A (en) | Structure optimization method of reserve pool network based on excitability and inhibition STDP | |
CN114266351A (en) | Pulse neural network training method and system based on unsupervised learning time coding | |
Humaidi et al. | Spiking versus traditional neural networks for character recognition on FPGA platform | |
Sboev et al. | Solving a classification task by spiking neurons with STDP and temporal coding | |
CN115222026A (en) | Hardware circuit of impulse neural network | |
CN116796207A (en) | Self-organizing mapping clustering method based on impulse neural network | |
Zhou et al. | Improved integrate-and-fire neuron models for inference acceleration of spiking neural networks | |
Zhang et al. | The framework and memristive circuit design for multisensory mutual associative memory networks | |
Deng et al. | Stdp and competition learning in spiking neural networks and its application to image classification | |
Sun et al. | Simplified spike-timing dependent plasticity learning rule of spiking neural networks for unsupervised clustering | |
CN114528984A (en) | Multi-input neuron circuit for impulse neural network | |
LI et al. | Research on learning algorithm of spiking neural network | |
Sahni et al. | Implementation of boolean and and or logic gates with biologically reasonable time constants in spiking neural networks |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
CB03 | Change of inventor or designer information | ||
Inventor after: Wang Junsong Inventor after: Yao Yang Inventor after: Ma Zengguang Inventor before: Wang Junsong Inventor before: Yao Yang |
|
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20200310 |