CN110874629A - Structure optimization method of reserve pool network based on excitatory and inhibitory STDP - Google Patents


Info

Publication number
CN110874629A
CN110874629A (application CN201910911676.9A)
Authority
CN
China
Prior art keywords
neuron
synaptic
inhibitory
excitatory
stdp
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910911676.9A
Other languages
Chinese (zh)
Inventor
王俊松
姚洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin Medical University
Shaanxi University of Science and Technology
Original Assignee
Shaanxi University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shaanxi University of Science and Technology filed Critical Shaanxi University of Science and Technology
Priority to CN201910911676.9A priority Critical patent/CN110874629A/en
Publication of CN110874629A publication Critical patent/CN110874629A/en
Pending legal-status Critical Current


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/08: Learning methods
    • G06N3/088: Non-supervised learning, e.g. competitive learning
    • G06N3/04: Architecture, e.g. interconnection topology
    • G06N3/044: Recurrent networks, e.g. Hopfield networks
    • G06N3/06: Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N3/063: Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means

Abstract

The invention provides a structure optimization method for a reserve pool network based on excitatory and inhibitory STDP. The reserve pool network is formed by connecting excitatory neurons E and inhibitory neurons I, and the network has four connection types: E-E, E-I, I-E, I-I, wherein the E-E connection weights are adjusted by excitatory STDP and the I-E connection weights by inhibitory STDP. Through the unsupervised learning rule of interacting excitatory and inhibitory STDP, the invention achieves a balance of excitation and inhibition and forms a heterogeneous network structure with the long-tailed weight distribution characteristic of biological neural networks; the network information capacity is large and the network performance is optimal.

Description

Structure optimization method of reserve pool network based on excitatory and inhibitory STDP
Technical Field
The invention belongs to the technical field of artificial neural networks, and particularly relates to a structure optimization method for a reserve pool network based on excitatory and inhibitory STDP.
Background
Echo state networks (ESNs) are a relatively new neural network model. Macroscopically, an ESN is a multi-layer feedforward structure, as shown in FIG. 1; the locally recurrent neural network in the middle of the figure is the reserve pool (reservoir computing) module, the key component of the echo state network.
The reserve pool module is a recurrent neural network whose neurons are randomly and sparsely connected, and its connection weights have a decisive influence on system performance. In the prior art, the reserve pool connection weights are mainly set by experience: this requires experienced practitioners, is strongly affected by human subjectivity, does not guarantee optimal performance, and consumes a great deal of time. Moreover, in areas where empirical data are limited, it is difficult to obtain a network structure with good performance.
Disclosure of Invention
In view of this, the present invention provides a structure optimization method for a reserve pool network based on excitatory and inhibitory STDP; the structure is obtained through learning, is not affected by human experience, and offers optimal performance and large information capacity.
In order to achieve the purpose, the technical scheme of the invention is realized as follows:
the structure optimization method of the reserve pool network based on the excitability and the inhibitory STDP is characterized in that the reserve pool network is formed by connecting an excitability neuron E and an inhibitory neuron I, and the network has four connections: E-E, E-I, I-E, I-I, wherein the connection weight of E-E is adjusted by excitatory STDP, and the connection weight of I-E is adjusted by inhibitory STDP.
Further, inhibitory synaptic plasticity acts on the I-E synapses and is modeled as spike-timing-dependent plasticity STDP: approximately synchronous pre- and post-synaptic firing strengthens the inhibitory synaptic weight, while isolated firing of a single neuron weakens it. The timing variable of the regulatory function is:

Δt = tj − ti

wherein tj and ti are the spike times of the pre- and post-synaptic neurons, respectively.
Furthermore, excitatory synaptic plasticity acts on the E-E synapses and is likewise modeled as spike-timing-dependent plasticity STDP: the synaptic weight is strengthened when the postsynaptic neuron i fires after the presynaptic neuron j, and weakened when i fires before j; that is, whenever neuron i or j fires, Wji + ΔW → Wji. The regulatory function is:

ΔW = A+(Wji)·exp(−(ti − tj)/τ+),  if ti > tj
ΔW = −A−(Wji)·exp(−(tj − ti)/τ−),  if ti < tj

wherein Wji is the synaptic weight, tj and ti are the most recent firing times of neurons j and i, τ+ and τ− set the extent of the spike interval over which synaptic potentiation or depression occurs, and A+(Wji) and A−(Wji) are weight-dependent functions:

A+(Wji) = η·(Wmax − Wji)
A−(Wji) = η·Wji

wherein η is the learning rate and Wmax the maximum weight.
Further, the change in inhibitory synaptic strength is determined through synaptic traces, as follows:

A trace xi is created for each neuron i and evolves as follows:
when neuron i fires: xi → xi + 1;
when neuron i does not fire, the trace decays: τSTDP·dxi/dt = −xi

wherein τSTDP represents the synaptic plasticity time constant.

Then, the inhibitory synaptic weight Wij from inhibitory neuron j to excitatory neuron i changes with the firing activity on each side of the synapse as follows:
when the presynaptic neuron fires: Wij → Wij + η·(xi − α);
when the postsynaptic neuron fires: Wij → Wij + η·xj

wherein xi and xj denote the synaptic traces of the neurons, η denotes the learning rate, and α = 2·ρ0·τSTDP is a depression factor, with ρ0 the constant target firing rate.
Furthermore, the ratio of the number of excitatory neurons to the number of inhibitory neurons in the reserve pool network is 4:1; the initial structure is randomly connected, with connection probability 0.2.
Further, the neurons in the reservoir network adopt the LIF neuron model, and the subthreshold membrane potential dynamics of the postsynaptic neuron i are expressed by the following formula:

τm·dVi/dt = (Vrest − Vi) + (giE(t)/gleak)·(VE − Vi) + (giI(t)/gleak)·(VI − Vi)
if Vi ≥ Vth, then Vi = Vrest

wherein τm represents the membrane time constant; Vrest the resting potential of the neuron; Vth the membrane potential threshold for firing; gleak the leak conductance; Vi the subthreshold membrane potential of neuron i; VE and VI the excitatory and inhibitory reversal potentials; and giE(t) and giI(t) the excitatory and inhibitory conductances of the neuron, respectively.
Furthermore, the external input of the reserve pool consists of spike times from coding neurons acting on the neurons through excitatory synaptic conductances; the connection probability between coding-layer neurons and the excitatory neurons of the reserve pool is 0.2. When external stimulation fails to raise the membrane potential Vi of neuron i to the firing threshold Vth, the membrane potential gradually decays back to the resting potential; when external stimulation drives Vi above Vth, the neuron generates an action potential, after which the membrane potential returns to the resting potential and the neuron does not respond to external stimuli during an absolute refractory period τref.
Further, the reserve pool network uses a conductance-based synapse model in which the postsynaptic output current is approximately linear in the input voltage:

Isyn = g(t)·(Vm(t) − VX)

wherein Isyn represents the synaptic output current; g the synaptic conductance; Vm the membrane potential of the postsynaptic neuron; and VX the reversal potential, which is the excitatory reversal potential VE for excitatory synapses and the inhibitory reversal potential VI for inhibitory synapses.

Synaptic conductance is divided into excitatory and inhibitory conductance according to the input. When neuron i receives a spike from an excitatory presynaptic neuron j, its excitatory conductance increases and otherwise decays; when it receives a spike from an inhibitory presynaptic neuron j, its inhibitory conductance increases and otherwise decays:

τE·dgiE(t)/dt = −giE(t) + ḡE·Σj Wij·Sj(t)
τI·dgiI(t)/dt = −giI(t) + ḡI·Σj Wij·Sj(t)

wherein τE and τI represent the excitatory and inhibitory synaptic time constants; Sj(t) is the firing sequence of presynaptic neuron j, over which the summation on the right-hand side runs; Wij represents the synaptic weight from neuron j to neuron i; and ḡE and ḡI represent the unit excitatory and inhibitory conductances.
Compared with the prior art, the invention has the following advantages:
(1) Through the unsupervised learning rule of interacting excitatory and inhibitory STDP, the invention achieves a balance of excitation and inhibition and forms a heterogeneous network structure with the long-tailed weight distribution characteristic of biological neural networks; the network information capacity is large and the network performance is optimal.
(2) The traditional reserve pool network structure is randomly connected, with connection weights approximately normally distributed; the network structure of the present application is plastic and evolves toward an optimal structure whose weight distribution approaches a power law (heterogeneous).
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate an embodiment of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a schematic diagram of a structural model of a spiking neural network;
FIG. 2 is a graphical representation of the function of inhibitory synaptic plasticity according to an embodiment of the present invention;
FIG. 3 is a graphical representation of the functional response of excitatory synaptic plasticity of an embodiment of the present invention;
FIG. 4 is a diagram illustrating a synaptic weight distribution of a pool network structure according to an embodiment of the present invention;
FIG. 5 is a diagram illustrating principal component analysis results from two angles of a network structure of a reserve pool according to an embodiment of the present invention;
FIG. 6 is a diagram illustrating a synaptic weight distribution of a pool network structure according to a first conventional network model;
FIG. 7 is a diagram illustrating principal component analysis results of two angles of a network structure of a reserve pool in a first conventional network model;
FIG. 8 is a diagram illustrating a synapse weight distribution of a pool network structure in a second conventional network model;
fig. 9 is a diagram illustrating principal component analysis results of two angles of a network structure of a reserve pool in the second conventional network model.
Detailed Description
It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict.
The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
The reserve pool module of the invention adopts a spiking neural network (SNN). SNNs are constructed by imitating the structural characteristics of brain neuronal networks and, compared with traditional artificial neural network models, offer local learning and high computational efficiency. In recent years, the spiking neural network, as a typical form of brain-like neural network (brain-like artificial intelligence), has received much attention and is called the new generation of neural networks.
Drawing on principles of neuroscience, the application provides an optimal design method for the reserve pool neural network structure based on the interaction of excitatory and inhibitory STDP in a spiking neural network model. The core is as follows: a reserve pool network is placed between the input layer and the output layer of the spiking neural network model; it is a complex network formed by connecting a large number of excitatory (E) and inhibitory (I) neurons. The network has four connection types: E-E, E-I, I-E, I-I; the E-E connection weights are adjusted by the excitatory STDP learning rule, and the I-E connection weights by the inhibitory STDP learning rule. The interaction of excitatory and inhibitory STDP lies in their jointly adjusting the connection weights of the network and optimizing its structure; both are unsupervised learning rules.
Core points of the optimization result: on the one hand, the magnitudes of the network's connection weights are optimized (to a suitable size); on the other hand, the distribution of the connection weights is optimized (heterogeneous). The application forms a heterogeneous network structure with the long-tailed distribution characteristic of biological neural networks.
Specifically, the numbers of excitatory and inhibitory neurons in the reserve pool network of the present application are 800 and 200 respectively (both may be changed); the initial structure is randomly connected, with connection probability 0.2.
Depending on the pre- and post-synaptic neuron types, four kinds of synaptic connections are formed between neurons: excitatory synapses onto excitatory neurons (E-E), excitatory synapses onto inhibitory neurons (E-I), inhibitory synapses onto excitatory neurons (I-E), and inhibitory synapses onto inhibitory neurons (I-I). Following the actual biological nervous system, the ratio of excitatory to inhibitory neurons is 4:1. Table 1 below gives the parameter meanings and values of the neural network structure of the embodiment of the present application:
TABLE 1 parameter significance and value of neural network architecture
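As a concrete illustration of the initial reserve pool structure described above (800 excitatory and 200 inhibitory neurons, randomly connected with probability 0.2), the following Python sketch builds the four weight matrices. The initial weight value is an illustrative assumption, since the parameter table of the original publication is reproduced only as an image:

```python
import numpy as np

rng = np.random.default_rng(0)

N_E, N_I = 800, 200   # 4:1 excitatory-to-inhibitory ratio (from the text)
p_conn = 0.2          # initial random connection probability (from the text)
w_init = 0.5          # initial synaptic weight: illustrative assumption

def random_weights(n_post, n_pre):
    """Sparse random weight matrix; entry (i, j) connects presynaptic j to postsynaptic i."""
    mask = rng.random((n_post, n_pre)) < p_conn
    return mask * w_init

# The four connection types of the reserve pool.
W_EE = random_weights(N_E, N_E)  # E-E: adjusted by excitatory STDP
W_IE = random_weights(N_E, N_I)  # I-E: adjusted by inhibitory STDP
W_EI = random_weights(N_I, N_E)  # E-I
W_II = random_weights(N_I, N_I)  # I-I
```

Each matrix entry (i, j) holds the weight of the connection from presynaptic neuron j to postsynaptic neuron i, matching the Wij convention used later in the text.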
The neuron model in the reservoir network adopts the leaky integrate-and-fire (LIF) model, which is computationally cheap, accurately describes the physiological characteristics of biological neurons, and is widely used in computational neuroscience. The subthreshold membrane potential dynamics of postsynaptic neuron i can be expressed by the following formula:

τm·dVi/dt = (Vrest − Vi) + (giE(t)/gleak)·(VE − Vi) + (giI(t)/gleak)·(VI − Vi)
if Vi ≥ Vth, then Vi = Vrest

wherein τm represents the membrane time constant; Vrest the resting potential of the neuron; Vth the membrane potential threshold for firing; gleak the leak conductance; Vi the subthreshold membrane potential of neuron i; VE and VI the excitatory and inhibitory reversal potentials; and giE(t) and giI(t) the excitatory and inhibitory conductances of the neuron, respectively.
In the application, the external input of the reserve pool consists of spike times from coding neurons acting on the neurons through excitatory synaptic conductances; the connection probability between coding-layer neurons and the excitatory neurons of the reserve pool is 0.2. When external stimulation fails to raise the membrane potential Vi of neuron i to the firing threshold Vth, the membrane potential gradually decays back to the resting potential; when external stimulation drives Vi above Vth, the neuron generates an action potential, after which the membrane potential returns to the resting potential and the neuron does not respond to external stimuli during an absolute refractory period τref. Typical values for all parameters in the neuron model are shown in Table 2:
TABLE 2 neuron model parameter values
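The LIF dynamics described above can be sketched as a simple Euler integration in Python. The numeric parameter values below are illustrative placeholders (Table 2 of the original is reproduced only as an image) taken from typical conductance-based LIF models, not from the patent; the update itself follows the subthreshold equation and reset rule given in the text, with the refractory period omitted for brevity:

```python
import numpy as np

# Illustrative parameter values (assumptions, not the patent's Table 2).
tau_m   = 20e-3    # membrane time constant (s)
V_rest  = -60e-3   # resting potential (V)
V_th    = -50e-3   # firing threshold (V)
V_E, V_I = 0e-3, -80e-3   # excitatory / inhibitory reversal potentials (V)
g_leak  = 10e-9    # leak conductance (S)
dt      = 0.1e-3   # integration step (s)

def lif_step(V, gE, gI):
    """One Euler step of tau_m dV/dt = (V_rest - V) + gE/g_leak (V_E - V) + gI/g_leak (V_I - V).
    Returns updated potentials and a boolean spike vector; spiking neurons reset to V_rest."""
    dV = ((V_rest - V) + (gE / g_leak) * (V_E - V)
          + (gI / g_leak) * (V_I - V)) * (dt / tau_m)
    V = V + dV
    spikes = V >= V_th
    V[spikes] = V_rest          # reset rule: if V >= V_th then V = V_rest
    return V, spikes
```

With zero conductances the potential stays at rest; a sufficiently strong excitatory conductance pulls the potential above threshold and produces spikes.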
In the reservoir, the present application uses a conductance-based synapse model. The postsynaptic output current is approximately linear in the input voltage:

Isyn = g(t)·(Vm(t) − VX)

wherein Isyn represents the synaptic output current; g the synaptic conductance; Vm the membrane potential of the postsynaptic neuron; and VX the reversal potential, which is the excitatory reversal potential VE for excitatory synapses and the inhibitory reversal potential VI for inhibitory synapses.

Synaptic conductance is divided into excitatory and inhibitory conductance according to the input. When neuron i receives a spike from an excitatory presynaptic neuron j, its excitatory conductance increases and otherwise decays; when it receives a spike from an inhibitory presynaptic neuron j, its inhibitory conductance increases and otherwise decays:

τE·dgiE(t)/dt = −giE(t) + ḡE·Σj Wij·Sj(t)
τI·dgiI(t)/dt = −giI(t) + ḡI·Σj Wij·Sj(t)

wherein τE and τI represent the excitatory and inhibitory synaptic time constants; Sj(t) is the firing sequence of presynaptic neuron j, over which the summation on the right-hand side runs; Wij represents the synaptic weight from neuron j to neuron i; and ḡE and ḡI represent the unit excitatory and inhibitory conductances. The detailed parameters of the synapse model are shown in Table 3:
TABLE 3 synaptic model parameter values
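A minimal sketch of the conductance update implied by the equations above: each conductance decays exponentially with its synaptic time constant, and jumps by the unit conductance times the summed weighted presynaptic spikes. The step size and the example values passed to the function are assumptions (Table 3 of the original is an image):

```python
import numpy as np

dt = 0.1e-3   # integration step (s); illustrative assumption

def conductance_step(g, W, pre_spikes, tau_syn, g_unit):
    """One Euler step of  tau_syn dg_i/dt = -g_i + g_unit * sum_j W_ij S_j(t):
    exponential decay toward zero plus jumps driven by this step's presynaptic spikes."""
    g = g * (1.0 - dt / tau_syn)          # leak toward zero
    g = g + g_unit * (W @ pre_spikes)     # weighted spike input
    return g
```

In a full simulation this would be called once per time step for the excitatory conductances (with τE and ḡE) and once for the inhibitory ones (with τI and ḡI).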
In neuroscience, synaptic plasticity causes synaptic strength to increase or decrease with the electrical activity of the neural network, on timescales of seconds to days, thereby changing the network structure; it is recognized as the basis of learning, memory and cognition. At the same time, synaptic plasticity provides homeostatic control of circuits in the central nervous system, which is critical to the stability of the nervous system.
The application applies inhibitory and excitatory synaptic plasticity to the reserve pool simultaneously, adjusting the excitatory-inhibitory balance of the network and improving the stability and information transmission of the nervous system.
The application applies inhibitory synaptic plasticity to the I-E synapses. The selected inhibitory plasticity model is spike-timing-dependent plasticity (STDP): approximately synchronous pre- and post-synaptic firing strengthens the inhibitory synaptic weight, while isolated firing of a single neuron weakens it. This accords with Hebbian theory: in a group of interconnected neurons, repeated and persistent co-activation increases the efficacy of the connecting synapses. The regulatory function curve of inhibitory synaptic plasticity is shown in FIG. 2, with timing variable

Δt = tj − ti

wherein tj and ti are the spike times of the pre- and post-synaptic neurons, respectively.
To further specify the change in inhibitory synaptic strength, the application establishes a trace xi for each neuron i, which evolves as follows:
when neuron i fires: xi → xi + 1;
when neuron i does not fire, the trace decays: τSTDP·dxi/dt = −xi

wherein τSTDP represents the synaptic plasticity time constant.

Then, the inhibitory synaptic weight Wij from inhibitory neuron j to excitatory neuron i changes with the firing activity on each side of the synapse as follows:
when the presynaptic neuron fires: Wij → Wij + η·(xi − α);
when the postsynaptic neuron fires: Wij → Wij + η·xj

wherein xi and xj denote the synaptic traces of the neurons, η denotes the learning rate, and α = 2·ρ0·τSTDP is a depression factor, with ρ0 the constant target firing rate. The meanings and values of the inhibitory synaptic plasticity parameters are shown in Table 4:
TABLE 4 significance and value of parameters for inhibitory synaptic plasticity
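The trace-based inhibitory STDP rule above can be sketched as follows. The update mirrors the equations in the text (trace decay and increment, the pre-spike update η(xi − α), and the post-spike update η·xj); the numeric parameter values and the non-negativity clipping of the weights are our assumptions, since Table 4 of the original is reproduced only as an image:

```python
import numpy as np

# Illustrative parameters (assumptions, not the patent's Table 4).
dt       = 0.1e-3            # integration step (s)
tau_stdp = 20e-3             # synaptic plasticity time constant (s)
eta      = 1e-3              # learning rate
rho0     = 3.0               # target firing rate (Hz)
alpha    = 2 * rho0 * tau_stdp   # depression factor alpha = 2*rho0*tau_STDP

def istdp_step(W, x_post, x_pre, post_spikes, pre_spikes):
    """One step of the inhibitory STDP rule on I->E weights W (shape: post x pre).
    Traces decay as tau_stdp dx/dt = -x and jump by 1 on a spike."""
    x_post = x_post * (1 - dt / tau_stdp) + post_spikes
    x_pre  = x_pre  * (1 - dt / tau_stdp) + pre_spikes
    # Presynaptic spike: W_ij -> W_ij + eta * (x_i - alpha)
    W = W + eta * np.outer(x_post - alpha, pre_spikes)
    # Postsynaptic spike: W_ij -> W_ij + eta * x_j
    W = W + eta * np.outer(post_spikes, x_pre)
    return np.clip(W, 0.0, None), x_post, x_pre   # keep weights non-negative (assumption)
```

Near-synchronous pre/post firing increases the weight (both update terms are positive), while an isolated presynaptic spike with no recent postsynaptic activity decreases it (xi − α < 0), as the text describes.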
Meanwhile, excitatory synaptic plasticity acts on the E-E synapses. The selected excitatory plasticity model is spike-timing-dependent plasticity (STDP): the synaptic weight is strengthened when the postsynaptic neuron i fires after the presynaptic neuron j, and weakened when i fires before j; that is, whenever neuron i or j fires, Wji + ΔW → Wji. The curve of the regulatory function is shown in FIG. 3:

ΔW = A+(Wji)·exp(−(ti − tj)/τ+),  if ti > tj
ΔW = −A−(Wji)·exp(−(tj − ti)/τ−),  if ti < tj

wherein tj and ti are the most recent firing times of neurons j and i, τ+ and τ− set the extent of the spike interval over which synaptic potentiation or depression occurs, and A+(Wji) and A−(Wji) are weight-dependent functions:

A+(Wji) = η·(Wmax − Wji),  A−(Wji) = η·Wji
TABLE 5 significance and value of parameters for excitatory synaptic plasticity
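A sketch of the excitatory STDP rule for a single E-E synapse pair. The exponential timing windows follow the description; the soft-bound amplitudes A+(W) = η(Wmax − W) and A−(W) = ηW are an assumption consistent with the statement that η is the learning rate and Wmax the maximum weight (Table 5 of the original is an image):

```python
import numpy as np

# Illustrative parameters (assumptions, not the patent's Table 5).
eta, W_max = 1e-2, 1.0
tau_plus, tau_minus = 20e-3, 20e-3   # potentiation / depression time windows (s)

def estdp_dw(W, t_post, t_pre):
    """Pairwise weight change for an E-E synapse given the latest spike times.
    Post-after-pre potentiates; post-before-pre depresses."""
    dt_pair = t_post - t_pre
    if dt_pair > 0:     # postsynaptic neuron fired after presynaptic: potentiation
        return eta * (W_max - W) * np.exp(-dt_pair / tau_plus)
    elif dt_pair < 0:   # postsynaptic neuron fired before presynaptic: depression
        return -eta * W * np.exp(dt_pair / tau_minus)
    return 0.0
```

The soft bound makes potentiation vanish as W approaches Wmax and depression vanish as W approaches zero, which keeps weights in [0, Wmax] without explicit clipping.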
The evaluation index for the structure optimization of the reserve pool network is as follows:
The information capacity of the network is measured by a principal component analysis (PCA) graph; a large information capacity indicates that the network structure is optimal.
The traditional reserve pool network structure is randomly connected (homogeneous), with connection weights following a normal distribution.
The reserve pool network structure of the present application, optimized by the interaction of excitatory and inhibitory STDP, approaches a power-law distribution (heterogeneous). PCA analysis shows that, for the same average connection strength, heterogeneous networks have a greater information capacity than homogeneous networks.
The optimized reserve pool neural network structure is compared with two other, traditional network structures to illustrate the optimization result of the method:
FIG. 4 shows the synaptic weight distribution of the optimized reserve pool network structure: the network connections Wij are strong, and the synaptic weights are non-uniformly distributed, approximately following a power law. FIG. 5 shows the principal component analysis (PCA) results from two angles; the information capacity is large.
FIG. 6 shows the synaptic weight distribution of conventional network model one: the network connections Wij are strong, but the synaptic weights are uniformly distributed. FIG. 7 shows the PCA results from two angles; the information capacity is small.
FIG. 8 shows the synaptic weight distribution of conventional network model two: the network connections Wij are weak, and the synaptic weights are uniformly distributed. FIG. 9 shows the PCA results from two angles; the information capacity is small.
Application example:
The learning of deep neural networks requires massive data, whereas medical data are inherently small-sample, difficult to acquire, and costly. The spiking neural network therefore has good application prospects in the intelligent analysis of medical images, such as the identification of lung nodules (tumors).
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (8)

1. A structure optimization method for a reserve pool network based on excitatory and inhibitory STDP, characterized in that: the reserve pool network is a network formed by connecting excitatory neurons E and inhibitory neurons I, and four connection types exist in the network: E-E, E-I, I-E, I-I, wherein the E-E connection weights are adjusted by excitatory STDP and the I-E connection weights are adjusted by inhibitory STDP.
2. The method of claim 1 for structural optimization of an excitatory and inhibitory STDP-based pool network, wherein: inhibitory synaptic plasticity acts on the I-E synapses and is modeled as spike-timing-dependent plasticity STDP: approximately synchronous pre- and post-synaptic firing strengthens the inhibitory synaptic weight, while isolated firing of a single neuron weakens it; the timing variable of the regulatory function is:

Δt = tj − ti

wherein tj and ti are the spike times of the pre- and post-synaptic neurons, respectively.
3. The method of claim 1 for structural optimization of an excitatory and inhibitory STDP-based pool network, wherein: excitatory synaptic plasticity acts on the E-E synapses and is modeled as spike-timing-dependent plasticity STDP: the synaptic weight is strengthened when the postsynaptic neuron i fires after the presynaptic neuron j, and weakened when i fires before j; that is, whenever neuron i or j fires, Wji + ΔW → Wji, with regulatory function:

ΔW = A+(Wji)·exp(−(ti − tj)/τ+),  if ti > tj
ΔW = −A−(Wji)·exp(−(tj − ti)/τ−),  if ti < tj

wherein Wji is the synaptic weight, tj and ti are the most recent firing times of neurons j and i, τ+ and τ− set the extent of the spike interval over which synaptic potentiation or depression occurs, and A+(Wji) and A−(Wji) are weight-dependent functions:

A+(Wji) = η·(Wmax − Wji)
A−(Wji) = η·Wji

wherein η is the learning rate and Wmax the maximum weight.
4. The method of claim 1 for structural optimization of an excitatory and inhibitory STDP-based pool network, wherein: the change in inhibitory synaptic strength is determined through synaptic traces, as follows:

a trace xi is created for each neuron i and evolves as follows:
when neuron i fires: xi → xi + 1;
when neuron i does not fire, the trace decays: τSTDP·dxi/dt = −xi

wherein τSTDP represents the synaptic plasticity time constant;

then the inhibitory synaptic weight Wij from inhibitory neuron j to excitatory neuron i changes with the firing activity on each side of the synapse as follows:
when the presynaptic neuron fires: Wij → Wij + η·(xi − α);
when the postsynaptic neuron fires: Wij → Wij + η·xj

wherein xi and xj denote the synaptic traces of the neurons, η denotes the learning rate, and α = 2·ρ0·τSTDP is a depression factor, with ρ0 the constant target firing rate.
5. The method of claim 1 for structural optimization of an excitatory and inhibitory STDP-based pool network, wherein: the ratio of the number of excitatory neurons to the number of inhibitory neurons in the reserve pool network is 4:1; the initial structure is randomly connected, with connection probability 0.2.
6. The method of claim 1 for structural optimization of an excitatory and inhibitory STDP-based pool network, wherein: the neuron model in the reserve pool network adopts the LIF neuron model, and the subthreshold membrane potential dynamics of the postsynaptic neuron i are expressed by the following formula:

τm·dVi/dt = (Vrest − Vi) + (giE(t)/gleak)·(VE − Vi) + (giI(t)/gleak)·(VI − Vi)
if Vi ≥ Vth, then Vi = Vrest

wherein τm represents the membrane time constant; Vrest the resting potential of the neuron; Vth the membrane potential threshold for firing; gleak the leak conductance; Vi the subthreshold membrane potential of neuron i; VE and VI the excitatory and inhibitory reversal potentials; and giE(t) and giI(t) the excitatory and inhibitory conductances of the neuron, respectively.
7. The method of claim 1 for structural optimization of an excitatory and inhibitory STDP-based pool network, wherein: the external input of the reserve pool consists of spike times from coding neurons acting on the neurons through excitatory synaptic conductances; the connection probability between coding-layer neurons and the excitatory neurons of the reserve pool is 0.2; when external stimulation fails to raise the membrane potential Vi of neuron i to the firing threshold Vth, the membrane potential gradually decays back to the resting potential; when external stimulation drives Vi above Vth, the neuron generates an action potential, after which the membrane potential returns to the resting potential and the neuron does not respond to external stimuli during an absolute refractory period τref.
8. The method of claim 1 for structural optimization of an excitatory and inhibitory STDP-based pool network, wherein: the reserve pool network uses a conductance-based synapse model in which the post-synaptic output current is approximately linear in the membrane potential, as given by the following formula:
I_syn = g(t)(V_m(t) − V_X)
where I_syn represents the synaptic output current; g the synaptic conductance; V_m the membrane potential of the postsynaptic neuron; and V_X the reversal potential, which is the excitatory reversal potential V_E for excitatory synapses and the inhibitory reversal potential V_I for inhibitory synapses.
Synaptic conductance is divided into excitatory and inhibitory conductance according to the type of presynaptic input. When neuron i receives a spike from excitatory presynaptic neuron j, its excitatory conductance increases and otherwise decays; when neuron i receives a spike from inhibitory presynaptic neuron j, its inhibitory conductance increases and otherwise decays:
τ_E dg_i^E(t)/dt = −g_i^E(t) + ḡ^E Σ_j W_ij S_j(t)

τ_I dg_i^I(t)/dt = −g_i^I(t) + ḡ^I Σ_j W_ij S_j(t)

where τ_E represents the excitatory synaptic time constant; τ_I the inhibitory synaptic time constant; the summation on the right-hand side runs over the spike trains S_j(t) of the presynaptic neurons j; W_ij the synaptic weight from neuron j to neuron i; ḡ^E the unit excitatory conductance; and ḡ^I the unit inhibitory conductance.
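The conductance dynamics and synaptic current of claim 8 can be sketched together: each conductance decays exponentially with its time constant and jumps by the weighted spike input, and the current follows I_syn = g(t)(V_m − V_X). Parameter values are illustrative assumptions:

```python
# Sketch of the claim-8 conductance-based synapse model with forward Euler.
# Parameter values are illustrative assumptions only.
TAU_EXC, TAU_INH = 5.0, 10.0   # synaptic time constants tau_E, tau_I (ms)
G_UNIT_E, G_UNIT_I = 1.0, 4.0  # unit excitatory / inhibitory conductances
DT = 0.1                       # integration step (ms)

def conductance_step(g, tau, unit_g, weighted_spikes):
    """One Euler step of tau * dg/dt = -g + unit_g * sum_j W_ij * S_j(t)."""
    return g + DT * (-g + unit_g * weighted_spikes) / tau

def synaptic_current(g, v_m, v_reversal):
    """I_syn = g(t) * (V_m(t) - V_X)."""
    return g * (v_m - v_reversal)
```

With V_X = V_E = 0 mV and a membrane potential below zero, the excitatory current is negative (inward, depolarizing in the usual sign convention), while an inhibitory synapse with V_X = V_I below V_m yields a current of the opposite sign.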
CN201910911676.9A 2019-09-25 2019-09-25 Structure optimization method of reserve pool network based on excitability and inhibition STDP Pending CN110874629A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910911676.9A CN110874629A (en) 2019-09-25 2019-09-25 Structure optimization method of reserve pool network based on excitability and inhibition STDP

Publications (1)

Publication Number Publication Date
CN110874629A true CN110874629A (en) 2020-03-10

Family

ID=69718061

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910911676.9A Pending CN110874629A (en) 2019-09-25 2019-09-25 Structure optimization method of reserve pool network based on excitability and inhibition STDP

Country Status (1)

Country Link
CN (1) CN110874629A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112308221A (en) * 2020-10-14 2021-02-02 成都市深思创芯科技有限公司 Working memory hardware implementation method based on reserve pool calculation
CN112308221B (en) * 2020-10-14 2024-02-27 成都市深思创芯科技有限公司 Working memory hardware implementation method based on reserve pool calculation
CN116312810A (en) * 2023-05-16 2023-06-23 中国石油大学(华东) Method for predicting human brain excitability and inhibitory connection based on hypergraph nerve field
CN116312810B (en) * 2023-05-16 2023-08-15 中国石油大学(华东) Method for predicting human brain excitability and inhibitory connection based on hypergraph nerve field

Similar Documents

Publication Publication Date Title
Diehl et al. Unsupervised learning of digit recognition using spike-timing-dependent plasticity
Pham et al. Control chart pattern recognition using spiking neural networks
KR101793011B1 (en) Efficient hardware implementation of spiking networks
US9418331B2 (en) Methods and apparatus for tagging classes using supervised learning
CN112101535B (en) Signal processing method of impulse neuron and related device
Lin et al. Relative ordering learning in spiking neural network for pattern recognition
Shrestha et al. Stable spike-timing dependent plasticity rule for multilayer unsupervised and supervised learning
US20100076916A1 (en) Autonomous Learning Dynamic Artificial Neural Computing Device and Brain Inspired System
US20110145179A1 (en) Framework for the organization of neural assemblies
WO2015112262A1 (en) Configuring sparse neuronal networks
US9959499B2 (en) Methods and apparatus for implementation of group tags for neural models
KR20160125967A (en) Method and apparatus for efficient implementation of common neuron models
CN110874629A (en) Structure optimization method of reserve pool network based on excitability and inhibition STDP
CN114266351A (en) Pulse neural network training method and system based on unsupervised learning time coding
Humaidi et al. Spiking versus traditional neural networks for character recognition on FPGA platform
Sboev et al. Solving a classification task by spiking neurons with STDP and temporal coding
CN115222026A (en) Hardware circuit of impulse neural network
CN116796207A (en) Self-organizing mapping clustering method based on impulse neural network
Zhou et al. Improved integrate-and-fire neuron models for inference acceleration of spiking neural networks
Zhang et al. The framework and memristive circuit design for multisensory mutual associative memory networks
Deng et al. Stdp and competition learning in spiking neural networks and its application to image classification
Sun et al. Simplified spike-timing dependent plasticity learning rule of spiking neural networks for unsupervised clustering
CN114528984A (en) Multi-input neuron circuit for impulse neural network
LI et al. Research on learning algorithm of spiking neural network
Sahni et al. Implementation of boolean and and or logic gates with biologically reasonable time constants in spiking neural networks

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Wang Junsong

Inventor after: Yao Yang

Inventor after: Ma Zengguang

Inventor before: Wang Junsong

Inventor before: Yao Yang

RJ01 Rejection of invention patent application after publication

Application publication date: 20200310