WO2022123781A1 - Neural network device, generation device, information processing method, generation method and recording medium - Google Patents
Neural network device, generation device, information processing method, generation method and recording medium
- Publication number
- WO2022123781A1 (PCT/JP2020/046331)
- Authority
- WIPO (PCT)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/049—Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
- G06N3/06—Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
- G06N3/063—Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means
Definitions
- the present invention relates to a neural network device, a generation device, an information processing method, a generation method, and a recording medium.
- One of the neural networks is a spiking neural network (see, for example, Patent Document 1).
- a neuron model has an internal state called a membrane potential and outputs a signal called a spike based on the time evolution of the membrane potential.
- a spiking neural network is expected to consume less power during computation than a neural network built from neuron models that have no internal state and perform computations with no time element (such a network is called an artificial neural network).
- if a spiking neural network equivalent to an artificial neural network can be generated, it becomes possible to first perform high-precision learning with the artificial neural network and then generate an equivalent spiking neural network. As a result, it is possible to achieve both high-precision processing and reduced power consumption.
- it is desirable that the configuration of the spiking neural network be as simple as possible.
- One of the objects of the present invention is to provide a neural network device, a generation device, an information processing method, a generation method, and a recording medium capable of solving the above-mentioned problems.
- the neural network device includes a time-based spiking neuron model means that outputs a signal when an internal state quantity, which evolves over time according to signal input times, exceeds a threshold value, and a delay means that outputs a signal in which the spike time indicated by the output signal of the time-based spiking neuron model means, as a relative time with respect to a reference time, is shifted by a set time.
- the generation device includes a base network generation means that generates a neural network including a time-based spiking neuron model means, which outputs a signal when an internal state quantity that evolves over time according to signal input times exceeds a threshold value, and a delay means, which outputs a signal in which the spike time indicated by the output signal of the time-based spiking neuron model means, as a relative time with respect to a reference time, is shifted by a set time.
- the generation device further includes a weight setting means that sets the weight of the input signal to the time-based spiking neuron model means based on equations stating that the input time of the input signal is the time obtained by reversing the sign of the numerical value indicated by the input signal, and that the output time of the output signal of the delay means is the time obtained by reversing the sign of the numerical value indicated by the output signal, and a delay setting means that sets the set time in the delay means based on the same equations.
- the information processing method includes outputting a first signal when an internal state quantity that evolves over time according to signal input times becomes equal to or greater than a threshold value, and outputting a second signal in which the spike time indicated by the first signal, as a relative time with respect to a reference time, is shifted by a set time.
- the method for generating a neural network includes generating a neural network including a time-based spiking neuron model means that outputs a signal when an internal state quantity that evolves over time according to signal input times exceeds a threshold value, and a delay means that outputs a signal in which the spike time indicated by the output signal of the time-based spiking neuron model means, as a relative time with respect to a reference time, is shifted by a set time.
- the method further includes setting the weight of the input signal to the time-based spiking neuron model means based on equations stating that the input time of the input signal is the time obtained by reversing the sign of the numerical value indicated by the input signal, and that the output time of the output signal of the delay means is the time obtained by reversing the sign of the numerical value indicated by the output signal, and setting the set time in the delay means based on the same equations.
- the recording medium records a program that causes a computer to execute outputting a first signal when an internal state quantity that evolves over time according to signal input times becomes equal to or greater than a threshold value, and outputting a second signal in which the spike time indicated by the first signal, as a relative time with respect to a reference time, is shifted by a set time.
- the recording medium records a program that causes a computer to execute generating a neural network including a time-based spiking neuron model means that outputs a signal when an internal state quantity that evolves over time according to signal input times exceeds a threshold value, and a delay means that outputs a signal in which the spike time indicated by the output signal of the time-based spiking neuron model means, as a relative time with respect to a reference time, is shifted by a set time.
- the program further causes the computer to execute setting the weight of the input signal to the time-based spiking neuron model means based on equations stating that the input time of the input signal is the time obtained by reversing the sign of the numerical value indicated by the input signal, and that the output time of the output signal of the delay means is the time obtained by reversing the sign of the numerical value indicated by the output signal, and setting the set time in the delay means based on the same equations.
- the configuration of a spiking neural network equivalent to an artificial neural network can be made into a relatively simple configuration.
- FIG. 1 is a schematic block diagram showing a configuration example of a neural network generation device according to an embodiment.
- the neural network generation device 10 includes a base network generation unit 11, a weight setting unit 12, and a delay setting unit 13.
- the neural network generator is also simply referred to as a generator.
- the neural network generation device 10 acquires the configuration information of the non-memory type neural network 110 and generates a spiking neural network (SNN) 120 equivalent to the non-memory type neural network.
- the neural network generation device 10 may generate a spiking neural network device as a spiking neural network 120.
- the fact that the two neural networks are equivalent here means that the two neural networks can be regarded as outputting the same information for the input of the same information.
- the term "regarded" here means that the form in which the information is expressed may differ.
- a neuron model that calculates the weighted sum of the input values, inputs the calculated sum, or the value obtained by adding a bias value to the sum, into an activation function, and outputs the obtained function value is called a non-memory type neuron model (Non-memory type Neuron Model).
- a neural network constructed using a non-memory type neuron model is referred to as a non-memory type neural network.
- a non-memory type neural network is also referred to as an artificial neural network (ANN).
- the non-memory type neuron model is also called an artificial neuron model.
- the spiking neural network referred to here is a neural network constructed using a spiking neuron model (Spiking Neuron Model).
- the spiking neuron model referred to here is a general term for neuron models that output binary signals called spikes based on the internal state that evolves over time.
- the binary signal referred to here may be an on / off signal.
- the internal state in the spiking neuron model is called the membrane potential.
- the spiking neural network 120 generated by the neural network generation device 10 has a structure for making a neural network equivalent to the artificial neural network 110 as described later.
- a spiking neural network 120 having a structure for forming a neural network equivalent to the artificial neural network 110 is also referred to as a time adjustment type spiking neural network.
- the neuron model used in the time-adjusted spiking neural network is also called a time-adjusted spiking neuron model (Time Adjustment Type Spiking Neuron Model).
- the neural network device referred to here is a device on which a neural network is implemented.
- the neural network may be implemented by using dedicated hardware, or may be implemented by software using a computer or the like.
- a device on which an artificial neural network is mounted is also referred to as an artificial neural network device.
- a device on which a spiking neural network is implemented is also referred to as a spiking neural network device.
- the neural network generator 10 handles a feedforward neural network. That is, the neural network generation device 10 acquires the configuration information of the forward propagation type artificial neural network 110 and generates the forward propagation type spiking neural network 120 equivalent to the artificial neural network 110.
- the forward propagation type is one of the forms of the network, and is a one-way network in which information is transmitted from layer to layer.
- Each layer of a forward-propagating neural network is composed of one or more neuron models, and there is no connection between neuron models in the same layer.
- FIG. 2 is a diagram showing an example of the hierarchical structure of a forward propagation neural network.
- the forward propagation type neural network 200 is configured in a hierarchical structure, receives data input, and outputs a calculation result.
- the calculation result output by the neural network is also called a predicted value or a prediction.
- the first layer of the forward propagation neural network is called the input layer, and the last layer is called the output layer.
- the layer between the input layer and the output layer is called the hidden layer.
- the number of hidden layers may be 0 or more. Therefore, the forward propagation neural network does not have to have a hidden layer.
- the forward propagation neural network 200 includes an input layer 211, a hidden layer 212, and an output layer 213.
- the input layer 211, the hidden layer 212, and the output layer 213 are collectively referred to as a layer 210.
- FIG. 3 is a diagram showing a configuration example of the forward propagation type neural network 200.
- FIG. 3 shows an example in which the forward propagation neural network 200 includes four layers 210 including an input layer 211, two hidden layers 212, and an output layer 213.
- of the two hidden layers 212, the upper one is referred to as hidden layer 212-1 and the lower one as hidden layer 212-2.
- the upper layer here is the side closer to the input layer 211.
- the lower layer here is the side closer to the output layer 213.
- FIG. 3 shows an example in which these four layers 210 each have three nodes 220.
- the number of nodes 220 included in the forward propagation neural network 200 is not limited to a specific number, and each layer 210 may include one or more nodes 220.
- Each layer 210 may have the same number of nodes 220, or may have a different number of nodes 220 depending on the layer 210.
- the nodes of two adjacent layers 210 are connected by an edge 230.
- the edge 230 transmits a signal from an upper layer node to a lower layer node. Between two adjacent layers 210, all combinations of upper layer nodes 220 and lower layer nodes 220 may be connected by edges 230, or there may be combinations that are not connected by an edge 230.
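- as an illustrative sketch (not part of the patent), the forward propagation through the layered structure described above can be written as follows; the layer sizes, weights, and the use of a ReLU activation are hypothetical example values:

```python
# Minimal sketch of forward propagation: signals flow one way, layer
# to layer, with no connections between nodes in the same layer.
# All numbers below are made-up illustrative values.

def relu(x):
    return max(0.0, x)

def forward(layers, inputs):
    """Propagate `inputs` through `layers`.

    Each layer is a list of neurons; each neuron is a (weights, bias)
    pair, where `weights` has one entry per upper-layer node.
    """
    signal = inputs
    for layer in layers:
        signal = [relu(sum(w * s for w, s in zip(weights, signal)) + bias)
                  for weights, bias in layer]
    return signal

# A toy network: 3 inputs -> hidden layer of 3 -> output layer of 2.
hidden = [([0.2, -0.5, 0.1], 0.0),
          ([0.4, 0.3, -0.2], 0.1),
          ([-0.1, 0.2, 0.6], -0.3)]
output = [([1.0, -1.0, 0.5], 0.0),
          ([0.3, 0.3, 0.3], 0.2)]

print(forward([hidden, output], [1.0, 2.0, 3.0]))
```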
- the node 220 of the input layer 211 distributes the input signal to the node 220 of the next layer 210.
- the node 220 of the input layer 211 is also referred to as an input node 221.
- a neuron model is used for both the node 220 of the hidden layer 212 and the node 220 of the output layer 213.
- the node 220 of the hidden layer 212 and the node 220 of the output layer 213 are collectively referred to as a neuron model node 222.
- the hidden layer 212 and the output layer 213 are collectively referred to as a neuron model layer 214.
- an artificial neuron model is used as the neuron model node 222.
- a spiking neuron model is used as the neuron model node 222.
- the node 220 is treated as a neuron model.
- from the viewpoint of the neuron model constituting a lower layer node 220, the way a signal is received from an upper layer node 220 is the same regardless of whether that upper layer node 220 is an input node 221 or a neuron model node 222, so there is no need to distinguish between the two in the explanation. In such cases, the upper layer node 220 is also described as if it were a neuron model.
- the artificial neural network 110 is a neural network constructed by using an artificial neuron model.
- the artificial neuron model is a neuron model that calculates a weighted total of input values, inputs the calculated total value or a value obtained by adding a bias value to the total value, to an activation function, and outputs the obtained function value.
- the output value of the artificial neuron model is shown by Eq. (1).
- x (l) i represents the output of the i-th artificial neuron model of the l-th layer.
- w (l) ij is a coefficient indicating the strength of the connection from the j-th artificial neuron model of the l-1 layer to the i-th artificial neuron model of the l-th layer, and is called a weight.
- “Σ j w (l) ij x (l-1) j ” represents the weighted sum of the above input values.
- b (l) i is called a bias term and indicates the above bias value.
- f represents the activation function.
- the value of the weight w (l) ij and the value of the bias term b (l) i are subject to update by learning.
- the part of the transformation shown in equation (1) excluding the activation function can be regarded as an affine transformation, and is expressed as equation (2).
- x (l-1) represents the vector of the output of the artificial neuron model of the l-1 layer (x (l-1) 1 , ..., X (l-1) N (l-1) ).
- N (l-1) indicates the number of artificial neuron models in layer l-1.
- "Affine” is an affine function (a function indicating an affine transformation). In the following, a case where a Ramp Function is used as the activation function f will be described as an example. The ramp function is also called a Rectified Linear Function.
- FIG. 4 is a diagram showing a ramp function.
- the horizontal axis of the graph in FIG. 4 shows the input to the ramp function, that is, the argument value of the ramp function.
- the vertical axis shows the output of the ramp function, that is, the function value of the ramp function.
- the ramp function is represented by ReLU.
- the input to the ramp function is represented by x.
- when x ≥ 0, ReLU (x) = x.
- when x < 0, ReLU (x) = 0.
- the ramp function ReLU is expressed by Eq. (3).
- Max is a function that outputs the maximum value among the arguments.
- the activation function in the artificial neuron model is not limited to the ramp function.
- various functions that can be expressed by the time method as described later can be used.
- FIG. 5 is a schematic block diagram showing a configuration example of an artificial neuron model.
- the artificial neuron model 111 includes an affine transformation unit 112 and an activation function unit 113.
- the affine transformation unit 112 calculates the weighted sum of the inputs to the artificial neuron model 111 and adds the bias value to the obtained sum. For example, the affine transformation unit 112 calculates Affine (x (l-1) ) based on the above equation (2).
- the activation function unit 113 calculates the activation function in the artificial neuron model 111. For example, the activation function unit 113 calculates the function value ReLU (Affine (x (l-1))) in which the Affine (x (l-1) ) calculated by the affine transformation unit 112 is input to the ramp function ReLU.
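- as a hedged illustration, the artificial neuron model 111 of FIG. 5 can be sketched as an affine transformation unit followed by an activation function unit, in the manner of equations (1)-(3); the weights, bias, and inputs below are made-up example values, not taken from the patent:

```python
# Sketch of the artificial neuron model 111: an affine transformation
# unit (112) followed by an activation function unit (113).

def affine(weights, bias, x):
    """Affine(x) = sum_j w_j * x_j + b  (the affine transformation unit)."""
    return sum(w_j * x_j for w_j, x_j in zip(weights, x)) + bias

def relu(u):
    """ReLU(u) = max(0, u)  (the ramp function of equation (3))."""
    return max(0.0, u)

weights = [0.5, -0.3, 0.2]        # illustrative weights w_j
bias = 0.1                        # illustrative bias term b
x_prev = [1.0, 2.0, 3.0]          # outputs of the previous layer

u = affine(weights, bias, x_prev) # 0.5 - 0.6 + 0.6 + 0.1 = 0.6
print(relu(u))                    # the neuron's output
```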
- a spiking neural network is a neural network constructed using a spiking neuron model.
- the spiking neuron model outputs a binary signal called a spike based on an internal state that evolves over time.
- the spiking neuron model simulates signal integration and spike generation (firing) by the cell body of biological neurons.
- the membrane potential evolves over time according to a differential equation such as equation (4).
- v (n) i indicates the membrane potential in the i-th spiking neuron model of the nth layer.
- α leak is a constant coefficient indicating the magnitude of the leak in the leaky integrate-and-fire model.
- I (n) i indicates the postsynaptic current in the i-th spiking neuron model of layer n.
- w' (n) ij is a coefficient indicating the strength of the connection from the j-th spiking neuron model of the n-1st layer to the i-th spiking neuron model of the nth layer, and is called a weight.
- t indicates the time.
- t (n-1) j indicates the firing timing (firing time) of the jth neuron in the n-1th layer.
- r (・) is a function indicating the effect of spikes transmitted from the previous layer on the postsynaptic current.
- when the membrane potential reaches the threshold value V th , the spiking neuron model outputs a spike.
- the threshold value V th is a threshold value that simulates an action potential.
- spikes are binary signals output by the spiking neuron model.
- the output of spikes by the spiking neuron model is also called firing.
- the membrane potential may return to the reset value V reset after firing.
- the spikes output by the spiking neuron model are transmitted to the lower layer spiking neuron model that is connected to the spiking neuron model.
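- a minimal simulation sketch of these dynamics (assuming the leaky integrate-and-fire form of equation (4) with a step function as r; all constants below are illustrative, not from the patent):

```python
# Euler-method sketch of leaky integrate-and-fire dynamics:
#   dv/dt = -alpha_leak * v + sum_j w_j * theta(t - t_j)
# i.e. each presynaptic spike contributes a constant current w_j once
# its firing time t_j has passed. In the time method each neuron fires
# at most once, so the reset to V_reset is noted but not simulated.

def simulate_lif(weights, spike_times, v_th=1.0,
                 alpha_leak=0.5, dt=1e-4, t_end=10.0):
    """Return (firing_time, trace); firing_time is None if no spike."""
    v, t, trace = 0.0, 0.0, []
    while t < t_end:
        current = sum(w for w, t_j in zip(weights, spike_times) if t >= t_j)
        v += dt * (-alpha_leak * v + current)
        trace.append((t, v))
        if v >= v_th:
            return t, trace      # spike (membrane would reset to V_reset)
        t += dt
    return None, trace           # membrane never reached the threshold

firing_time, _ = simulate_lif(weights=[0.8, 0.6],
                              spike_times=[0.5, 1.0])
print(firing_time)
```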
- FIG. 6 is a diagram showing an example of the time evolution of the membrane potential of a spiking neuron.
- the horizontal axis of the graph of FIG. 6 indicates the time, and the vertical axis indicates the membrane potential.
- FIG. 6 shows an example of the time evolution of the membrane potential of the i-th spiking neuron in the nth layer, and the membrane potential is expressed as v (n) i .
- Vth indicates the threshold value of the membrane potential.
- V reset indicates the reset value of the membrane potential.
- t (n-1) 1 indicates the firing timing of the first neuron in the n-1 layer.
- t (n-1) 2 indicates the firing timing of the second neuron in the n-1 layer.
- t (n-1) 3 indicates the firing timing of the third neuron in the n-1 layer.
- Spiking neural networks are expected to consume less power than deep learning models when they are made into hardware using CMOS (Complementary MOS).
- the human brain is a low-power computing medium, consuming the equivalent of about 30 watts (W), and spiking neural networks can mimic the activity of such a low-power brain.
- the power consumption of the signal can be reduced as compared with the case where an analog signal is used in an artificial neural network.
- the neural network generation device 10 uses a time method as the information transmission method in the spiking neural network 120. In the time method, information is conveyed by firing timing.
- FIG. 7 is a diagram showing an example of spikes in the time system.
- the horizontal axis of FIG. 7 indicates the time. Specifically, the horizontal axis indicates a relative time with respect to the time "0" which is the reference time.
- the vertical axis shows the signal value.
- three examples are shown: an example of firing at time "1", an example of firing at time "3", and an example of firing at time "5".
- the spiking neuron model can provide quantitative information by firing time.
- for example, the spike that fires at the time "3" may indicate the numerical value "3"; that is, the spike may indicate the numerical value of its firing time. In other words, the spiking neuron model may indicate a numerical value by the length of time between the reference time and the firing time.
- the firing time is not limited to an integer time, and the spiking neuron model may indicate a real value depending on the firing time.
- FIG. 7 shows an example in which a step signal is used as a spike, but the present invention is not limited to this.
- Spikes of various shapes that can indicate the firing time can be used.
- a pulse signal that falls after a certain period of time from the rise may be used as a spike. In this case, by turning off the signal, it is expected that the power consumption of the signal can be further reduced as compared with the case of the step signal.
- each of the spiking neuron models may fire at most once.
- the power consumption due to the signal can be reduced as compared with the case of the frequency method in which the numerical value is indicated by the number of spikes.
- a spiking neural network that uses the time method is also referred to as a time-based spiking neural network.
- likewise, a spiking neuron model that uses the time method is also referred to as a time-based spiking neuron model.
- the neural network generation device 10 may generate a spiking neural network 120 equivalent to the trained artificial neural network 110.
- the neural network generation device 10 may generate a spiking neural network 120 equivalent to the trained artificial neural network 110 as a spiking neural network device.
- it is possible to achieve both high-precision calculation execution and reduction of power consumption.
- FIG. 8 is a diagram showing an example of the correspondence between the artificial neural network 110 and the spiking neural network 120.
- one layer of the neural network is further subdivided into a plurality of layers.
- FIG. 8 shows a configuration example of one layer of the neuron model layer 214 in the forward propagation neural network 200.
- the neuron model layer 214 is a hidden layer 212 or an output layer 213.
- generating a spiking neural network 120 equivalent to the artificial neural network 110 is also referred to as converting the artificial neural network 110 into the spiking neural network 120.
- Generating a portion of a spiking neural network 120 that corresponds to a portion of the artificial neural network 110 is also referred to as transforming.
- one neuron model layer 214 in the artificial neural network 110 is referred to as a target layer, and a case where the neural network generation device 10 converts the target layer into the neuron model layer 214 of the spiking neural network 120 will be described.
- the neural network generation device 10 converts each of all the neuron model layers 214 included in the artificial neural network 110 into the neuron model layer 214 of the spiking neural network 120.
- one neuron model layer 214 can be further regarded as a combination of the affine transformation layer 231 and the ReLU layer 232.
- the affine transformation unit 112 of all the artificial neuron models 111 included in one neuron model layer 214 is collectively regarded as a layer, which corresponds to the example of the affine transformation layer 231.
- the activation function unit 113 of all the artificial neuron models 111 included in one neuron model layer 214 is collectively regarded as a layer, which corresponds to the example of the ReLU layer 232.
- the neural network generation device 10 provides a setting spike generator 241, a time-based spiking neuron layer 242, and a delay layer 243 corresponding to the affine transformation layer 231. Further, the neural network generation device 10 provides a t-ReLU layer 244 corresponding to the ReLU layer 232.
- the time-based spiking neuron layer 242 includes a time-based spiking neuron model.
- the neural network generation device 10 generates a time-based spiking neuron layer 242 including the same number of time-based spiking neuron models as the number of artificial neuron models 111 included in the target layer.
- the number of neuron models in the l-th layer, which is the target layer, is one, and the number of neuron models in the (l-1)-th layer is N. The notation l indicating the layer number and the notation i indicating the number of the neuron model in the l-th layer are then omitted, and the number of a neuron model in the (l-1)-th layer is denoted by i instead of by j as above.
- the neural network generation device 10 may perform the process described below for each artificial neuron model in the target layer. Further, when the number of neuron model layers 214 of the artificial neural network 110 is two or more, the neural network generation device 10 may perform the process described below with each of the neuron model layers 214 as the target layer.
- the neural network generation device 10 uses, for the time-based spiking neuron layer 242, a time-based spiking neuron model in which the magnitude of the leak (α leak ) is 0, and modifies equation (4) into equation (5).
- t indicates the time.
- v indicates the membrane potential.
- w' i represents the weight of the connection from the i-th spiking neuron in the upper layer.
- the step function θ is used as the function r in equation (4).
- the step function θ is expressed by equation (6).
- each of the time-based spiking neuron models of the time-based spiking neuron layer 242 receives the input of a setting spike from the setting spike generator 241 in addition to the spikes from the upper layer.
- t 1 , ..., t N indicate the input times of the spikes from the upper layer.
- t 0 indicates the input time of the setting spike.
- the setting spike generator 241 outputs a setting spike at a set time (time t 0 ) that does not depend on the value of the input data to the spiking neural network 120.
- the set time may be updated by learning or the like.
- Equation (7) can be obtained by integrating equation (5).
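- although the patent's numbered equations are not reproduced in this text, the integration proceeds, in outline, as follows (a sketch under the assumption that the neuron fires only after all input spikes have arrived; the symbols follow the definitions above):

```latex
% Leak-free dynamics (Eq. (5)): the membrane potential integrates a
% constant current w'_i from each input spike after its arrival time t_i.
\frac{dv}{dt} = \sum_{i=0}^{N} w'_i\,\theta(t - t_i)
% Integrating with v(0) = 0 gives a piecewise-linear membrane potential:
\quad\Longrightarrow\quad
v(t) = \sum_{i=0}^{N} w'_i\,(t - t_i)\,\theta(t - t_i).
% If the neuron fires at t_f \ge \max_i t_i (all inputs arrived),
% v(t_f) = V_{th} yields the firing time
t_f = \frac{V_{th} + \sum_{i=0}^{N} w'_i\,t_i}{\sum_{i=0}^{N} w'_i}.
```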
- the delay layer 243 delays the output spikes of the time-based spiking neurons. Assuming that the delay time in the delay layer 243 is δ, the spike output time of the delay layer 243 is expressed by equation (9).
- the equation (10) corresponds to the equation in which the notation of l and i indicating the number of the layer and the number of the spiking neuron is omitted in the equation (2) and j in the equation (2) is expressed as i.
- the output value of the affine transformation layer 231 of the artificial neural network 110 is shown by the spike output time of the delay layer 243 of the spiking neural network 120.
- that is, equation (11) expresses the condition that the spike output time of the delay layer 243 represents the affine output Σ i w i x i + b (as a sign-reversed time, following the time-method convention).
- for equation (11) to hold regardless of the value of x i , the condition on the weights w' i is expressed by equation (12). Under this condition, equation (11) holds regardless of the value of x i .
- the delay time δ in equation (13) can take any value, and by adjusting the value of the delay time δ, it is easy to satisfy equation (13). Further, when equation (12) is rearranged, equation (14) is obtained.
- the matrix A is defined as the equation (16).
- the weights w̄ 1 , w̄ 2 , ..., w̄ N in the time-based spiking neuron layer 242 can be calculated as in Eq. (17).
- the matrix A -1 is the inverse matrix of the matrix A.
- the delay time τ can be calculated by substituting the value of the weight w̄ i calculated by the equation (17) into the equation (18).
- the delay time τ can be a negative value.
- the information in each layer of the spiking neural network is carried by the timing differences among the neurons. That is, even if all the neurons in a certain layer are delayed by the same time, the output of the entire network is merely delayed by that amount, and the same information is expressed regardless of the magnitude of the delay time. Accordingly, a delay amount common to a layer can be added arbitrarily. That is, when the delay time τ calculated by the equation (18) is negative, a delay amount common to the target layer may be added so that the delay time becomes positive.
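The common-shift argument above can be sketched as follows, assuming the computed per-neuron delay times of one layer are given as a list (the helper name is ours):

```python
# Spike-time codes are invariant under a delay common to all neurons of a
# layer, so if some computed delay times are negative, a shared offset can
# be added to make every delay realizable (non-negative).

def make_delays_nonneg(taus):
    offset = max(0.0, -min(taus))   # smallest common shift that suffices
    return [t + offset for t in taus], offset
```

For example, delays [-2.0, 1.0] become [0.0, 3.0] after a common shift of 2.0; the relative timing, which carries the information, is unchanged.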
- the neural network generation device 10 may perform processing for each artificial neuron model in the target layer. Further, when the number of the neuron model layers 214 of the artificial neural network 110 is two or more, the neural network generation device 10 may perform processing with each of the neuron model layers 214 as a target layer.
- the target layer is represented as the l-th layer
- the number of the neuron model in the target layer is represented by i
- the number of the neuron model in the upper layer of the target layer is represented by j
- in this notation, the equation (17) is expressed as the equation (19).
- w̄ (l) ij indicates the weight of the connection from the j-th spiking neuron model of the (l-1)-th layer to the i-th spiking neuron model of the l-th layer. Here, w̄ (l) i0 indicates the weight for the set spike.
- τ (l) i indicates the delay time applied by the delay layer 243 to the output spike of the i-th time-based spiking neuron model of the time-based spiking neuron layer 242 of the l-th layer.
- t (l-1) 0 indicates the spike input time from the set spike generator 241 to the i-th time-based spiking neuron model of the l-th layer time-based spiking neuron layer 242.
- V th (l) (i) indicates the threshold value of the membrane potential in the i-th time-based spiking neuron model of the l-th layer time-based spiking neuron layer 242. As mentioned above, this threshold simulates action potentials.
- FIG. 9 is a diagram showing a configuration example of the t-ReLU layer 244.
- the number of time-based spiking neurons contained in the time-based spiking neuron layer 242 is represented by M.
- M spikes are input from the delay layer 243 to the t-ReLU layer 244 at times t 1 , t 2 , ..., t M , respectively.
- a setting spike is input at time t ReLU .
- the set spike in the t-ReLU layer 244 is a separate spike from the set spike in the time-based spiking neuron layer 242.
- the setting spike in the t-ReLU layer 244 is also referred to as a t-ReLU spike.
- the spike output times of the t-ReLU layer are t' 1 , t' 2 , ..., t' M .
- each of the input spikes to the t-ReLU layer and the set spike are ORed.
- the t-ReLU layer outputs a spike at the earlier of the spike input time t i to the t-ReLU layer and the set spike time t ReLU .
- FIG. 10 is a diagram showing an example of spike input / output times in the t-ReLU layer 244.
- the horizontal axis of the graph of FIG. 10 indicates the time, and the vertical axis indicates the identification number i of the node in the upper layer.
- the identification number of the node in the upper layer is also used as the identification number of the spike.
- FIG. 10 shows a case where the number of input spikes to the t-ReLU layer 244 is three.
- the upper graph of FIG. 10 shows the input times t 1 , t 2 , and t 3 of the spikes to the t-ReLU layer 244.
- the lower graph of FIG. 10 shows the output times t' 1 , t' 2 , and t' 3 of the spikes from the t-ReLU layer 244.
- the t-ReLU layer 244 outputs a spike at time t' i , which is the earlier of time t i and time t ReLU .
- in the example of FIG. 10, time t 3 is later than time t ReLU , so t' 3 = t ReLU .
- the right side of the equation (21) becomes the same function as the ramp function (see equation (3)) except that the signs of the input and output are reversed. In this respect, it can be said that the t-ReLU layer 244 applies a ramp function to the spike output time.
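The t-ReLU operation described above can be sketched as follows; under the sign convention that a value x is encoded as the time t = -x, taking the earlier of t i and a t-ReLU time of 0 reproduces the ramp function (a minimal sketch; the function name is ours):

```python
# t-ReLU: each output spike time is the earlier of the input spike time t_i
# and the fixed t-ReLU spike time t_relu_time.

def t_relu(input_times, t_relu_time=0.0):
    return [min(t, t_relu_time) for t in input_times]

# With t = -x, min(-x, 0) equals -max(x, 0), i.e. a ramp function on x
# with the input/output signs reversed, as stated for equation (21).
```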
- the neuron model of the spiking neural network 120 may be provided for each artificial neuron model.
- the neuron model in this case is also referred to as a time-adjustable spiking neuron model.
- FIG. 11 is a schematic block diagram showing a configuration example of a time-adjustable spiking neuron model.
- the time-adjustable spiking neuron model 131 includes a time-based spiking neuron model 132, a first setting spike supply unit 135, a delay unit 136, a t-ReLU unit 137, and a second setting spike supply unit 138.
- the time-based spiking neuron model 132 includes a membrane potential calculation unit 133 and a spike generation unit 134.
- the time-based spiking neuron model 132 corresponds to the above-mentioned example of the time-based spiking neuron model in the time-based spiking neuron layer 242.
- the membrane potential calculation unit 133 calculates the membrane potential in the time-based spiking neuron model 132 based on the equation (5).
- the spike generation unit 134 compares the membrane potential calculated by the membrane potential calculation unit 133 with the threshold value. When it is determined that the membrane potential is equal to or higher than the threshold value, the spike generation unit 134 outputs a spike.
- FIG. 12 is a diagram showing a configuration example of the time-based spiking neuron model 132.
- the time-based spiking neuron model 132 includes N synaptic circuits 141-1 to 141-N, a capacitor 142, a threshold power supply 143, and a comparator 144.
- Synaptic circuits 141-1 to 141-N are collectively referred to as synaptic circuits 141.
- the synaptic circuit 141 switches the current on / off (On / Off) to the capacitor 142 in response to spikes from the time-adjustable spiking neuron model 131 in the upper layer. Further, the synaptic circuit 141 weights the current to the capacitor 142.
- FIG. 13 is a schematic configuration diagram showing a configuration example of the synapse circuit 141.
- the synapse circuit 141 includes a switching element 151 and a variable resistor 152. Further, the synapse circuit 141 is connected to the power supply 160.
- the power supply 160, the variable resistor 152, and the switching element 151 are connected in series.
- the power supply 160 supplies a current for the output current of the synaptic circuit 141.
- the switching element 151 switches the output current of the synapse circuit 141 on and off according to the input signal.
- the input signal referred to here is a spike from the time-adjustable spiking neuron model 131 in the upper layer.
- the input signal to the synaptic circuit 141-j (j is an integer with 1 ≤ j ≤ N) is represented by a voltage V in (j) .
- the output current from the synaptic circuit 141-j is represented by I j .
- a step signal may be used as a spike. Then, the switching element 151 may turn on the current when the spike is input and turn off the current when the spike is not input.
- turning the current on may mean passing a current (energizing).
- turning the current off may mean passing no current (not energizing).
- the variable resistor 152 adjusts the flow rate of the current when the switching element 151 turns on the current.
- the potential of the power supply 160 may be constant, and a current inversely proportional to the resistance value of the variable resistor 152 may be output from the synaptic circuit 141 according to Ohm's law.
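The current switching and weighting described above can be sketched as follows (a minimal sketch under the stated assumptions of a constant supply potential and Ohm's law; the function name and arguments are ours):

```python
# The switching element gates the current while a spike is present, and the
# variable resistor sets its magnitude as I = V / R (Ohm's law), so the
# effective synaptic weight corresponds to the reciprocal of the resistance.

def synapse_current(spike_present, v_supply, r_ohms):
    return v_supply / r_ohms if spike_present else 0.0
```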
- the capacitor 142 generates an electric potential by storing the output current of the synaptic circuit 141.
- the potential of the capacitor is represented by V m .
- This potential V m represents the membrane potential.
- the threshold power supply 143 supplies a threshold potential to be compared with the membrane potential (potential V m ).
- the comparator 144 changes the output signal when the membrane potential reaches the threshold potential. Specifically, the comparator 144 outputs a spike by changing the output voltage of the comparator 144 itself when the membrane potential becomes equal to or higher than the threshold potential. In FIG. 12, the output signal of the comparator 144 is shown by the voltage V out .
- the combination of the synapse circuit 141 and the capacitor 142 corresponds to the example of the membrane potential calculation unit 133.
- the combination of the threshold power supply 143 and the comparator 144 corresponds to the example of the spike generation unit 134.
- the implementation method of the time-based spiking neuron model 132 is not limited to a specific method.
- the neural network generator 10 may implement the time-based spiking neuron model 132 on a computer in software.
- the delay unit 136 delays the output spike of the time-based spiking neuron model 132 by a set delay time. As a result, the delay unit 136 executes the delay of the spike in the delay layer 243.
- the spiking neural network 120 may be provided with a second setting spike supply unit 138, one for each layer.
- the first setting spike supply unit 135 corresponds to the example of the setting spike generator 241.
- the time-based spiking neuron models 132 of all the time-adjustable spiking neuron models 131 included in one neuron model layer 214, regarded collectively as a layer, correspond to the example of the time-based spiking neuron layer 242.
- the delay unit 136 of all the time-adjustable spiking neuron models 131 included in one neuron model layer 214 is collectively regarded as a layer, which corresponds to the example of the delay layer 243.
- the t-ReLU unit 137 of all the time-adjustable spiking neuron models 131 included in one neuron model layer 214 is collectively regarded as a layer, which corresponds to the example of the t-ReLU layer 244.
- the base network generation unit 11 generates a spiking neural network 120 in a state in which the weight in the time-based spiking neuron layer 242 and the delay time in the delay layer 243 can be set.
- the spiking neural network 120 in this state is also referred to as a base network.
- the weight setting unit 12 sets the weight in the time-based spiking neuron layer 242 of the spiking neural network 120 generated by the base network generation unit 11. For example, the weight setting unit 12 calculates the weight w̄ (l) ij based on the equation (19), and sets the calculated weight w̄ (l) ij in the time-based spiking neuron layer 242.
- the delay setting unit 13 sets the delay time in the delay layer 243 of the spiking neural network 120 generated by the base network generation unit 11. For example, the delay setting unit 13 calculates the delay time τ (l) i based on the equation (20), and sets the calculated delay time τ (l) i in the delay layer 243.
- FIG. 14 is a flowchart showing an example of a processing procedure in which the neural network generation device 10 converts the artificial neural network 110 into a spiking neural network 120.
- the base network generation unit 11 generates a spiking neural network 120 in a state in which the weight in the time-based spiking neuron layer 242 and the delay time in the delay layer 243 can be set (step S11).
- the weight setting unit 12 sets the weight in the time-based spiking neuron layer 242 of the spiking neural network 120 generated by the base network generation unit 11 (step S12).
- the delay setting unit 13 sets the delay time in the delay layer 243 of the spiking neural network 120 generated by the base network generation unit 11 (step S13).
- the neural network generator 10 ends the process of FIG.
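The procedure of steps S11 to S13 can be sketched as follows. This is a schematic outline, not the device's implementation; the class and the injected weight/delay computations (standing in for equations (19) and (20), which are not reproduced here) are ours:

```python
# Step S11: generate a base network whose weights and delays are settable;
# Step S12: set the weights; Step S13: set the delay times.

class BaseSpikingNetwork:
    def __init__(self):
        self.weights = None   # settable weights of the spiking neuron layer
        self.delays = None    # settable delay times of the delay layer

def convert(ann_params, compute_weights, compute_delays):
    net = BaseSpikingNetwork()                 # step S11
    net.weights = compute_weights(ann_params)  # step S12 (cf. Eq. (19))
    net.delays = compute_delays(ann_params)    # step S13 (cf. Eq. (20))
    return net
```

Any functions computing the weights and delays from the trained artificial network's parameters can be supplied; the three steps are otherwise independent of their details.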
- FIG. 15 is a schematic block diagram showing a configuration example of the neuron model generator according to the embodiment.
- the neuron model generation device 20 includes a base model generation unit 21, a weight setting unit 22, and a delay setting unit 23.
- the neuron model generation device 20 acquires the configuration information of the artificial neuron model 111 and generates a time-adjustable spiking neuron model 131 equivalent to the artificial neuron model 111.
- the neuron model generation device 20 may generate a time-adjustable spiking neuron model device as the time-adjustable spiking neuron model 131.
- the base model generation unit 21 generates a time-adjustable spiking neuron model 131 in which the weight in the time-based spiking neuron model 132 and the delay time in the delay unit 136 can be set.
- the time-adjustable spiking neuron model 131 in this state is also referred to as a base model.
- the weight setting unit 22 sets the weight in the time-based spiking neuron model 132 of the time-adjustable spiking neuron model 131 generated by the base model generation unit 21. For example, the weight setting unit 22 calculates the weight w̄ (l) ij based on the equation (19), and sets the calculated weight w̄ (l) ij in the time-based spiking neuron model 132.
- the delay setting unit 23 sets the delay time in the delay unit 136 of the time-based spiking neuron model 132 generated by the base model generation unit 21. For example, the delay setting unit 23 calculates the delay time τ (l) i based on the equation (20), and sets the calculated delay time τ (l) i in the delay unit 136.
- FIG. 16 is a flowchart showing an example of a processing procedure in which the neuron model generator 20 converts the artificial neuron model 111 into a time-adjustable spiking neuron model 131.
- the base model generation unit 21 generates a time-adjustable spiking neuron model 131 in which the weight in the time-based spiking neuron model 132 and the delay time in the delay unit 136 can be set (step S21).
- the weight setting unit 22 sets the weight in the time-based spiking neuron model 132 of the time-adjustable spiking neuron model 131 generated by the base model generation unit 21 (step S22).
- the delay setting unit 23 sets the delay time in the delay unit 136 of the time-based spiking neuron model 132 generated by the base model generation unit 21 (step S23).
- the neuron model generator 20 ends the process of FIG.
- the neural network generation device 10 generates a time-based spiking neural network 120 equivalent to the artificial neural network 110. According to the neural network generation device 10, the same calculation as that of the artificial neural network 110 can be performed while reducing the power consumption by using the time-based spiking neural network 120.
- All or part of the neural network generator 10 may be implemented in dedicated hardware. All or part of the neural network generator 10 may be mounted on an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array).
- ASIC Application Specific Integrated Circuit
- FPGA Field-Programmable Gate Array
- All or part of the spiking neural network 120 generated by the neural network generator 10 may be implemented in dedicated hardware. All or part of the spiking neural network 120 may be implemented in an ASIC or FPGA.
- the time-based spiking neuron layer 242 corresponds to an example of the time-based spiking neuron model means, and is configured by using the time-based spiking neuron model.
- the delay layer 243 corresponds to an example of the delay means, and outputs a spike at a time changed from the spike output time of the time-based spiking neuron layer 242 by a set time.
- the t-ReLU layer 244 corresponds to an example of the time-based ramp function means, and outputs a signal at the earlier of the spike output time or the reference time of the delay layer 243.
- the set spike generator 241 corresponds to an example of the set spike supply means, and outputs a spike to the time type spiking neuron layer 242 at a time independent of an input signal to the time type spiking neuron layer 242.
- the time-based spiking neuron model 132 corresponds to an example of the time-based spiking neuron model means, and is configured by using the time-based spiking neuron model.
- the delay unit 136 corresponds to an example of the delay means, and outputs a spike at a time changed from the spike output time of the time-based spiking neuron model 132 by a set time.
- the t-ReLU unit 137 corresponds to an example of the time-based ramp function means, and outputs a signal at the earlier of the spike output time or the reference time of the delay unit 136.
- the first set spike supply unit 135 corresponds to an example of the set spike supply means, and outputs a spike to the time-based spiking neuron model 132 at a time that does not depend on an input signal to the time-based spiking neuron model 132.
- the base network generation unit 11 corresponds to an example of the base network generation means, and generates a spiking neural network 120 including a time-based spiking neuron layer 242 and a delay layer 243.
- the weight setting unit 12 corresponds to an example of the weight setting means, and sets the weight of the input spike to the time-based spiking neuron layer 242 to the weight w̄ i or w̄ (l) ij based on the equation (17) or the equation (19).
- the input time t i or t (l-1) j of the input spike to the time-based spiking neuron layer 242 is the time obtained by reversing the sign of the numerical value indicated by the input spike.
- the delay setting unit 13 corresponds to an example of the delay setting means, and sets the set time in the delay layer 243 to the delay time τ or τ (l) i based on the equation (18) or the equation (20).
- the base model generation unit 21 corresponds to an example of the base model generation means, and generates a time-adjustable spiking neuron model 131 including a time-based spiking neuron model 132 and a delay unit 136.
- the weight setting unit 22 corresponds to an example of the weight setting means, and sets the weight of the input spike to the time-based spiking neuron model 132 to the weight w̄ i or w̄ (l) ij based on the equation (17) or the equation (19).
- the delay setting unit 23 corresponds to an example of the delay setting means, and sets the set time in the delay unit 136 to the delay time τ or τ (l) i based on the equation (18) or the equation (20).
- the artificial neural network 110 to be converted has a 4-3-3 configuration. That is, the input data is four-dimensional (there are four input values), the hidden layer has three neuron models, and the output layer has three neuron models.
- the artificial neural network 110 was trained on the Iris dataset by a standard deep-learning method. The dataset, used for the iris-classification task, consists of 150 four-dimensional vectors and label data associated with the vectors in a one-to-one correspondence. The trained artificial neural network 110 was then converted into a spiking neural network 120.
- FIG. 17 is a diagram showing an example of spike output time in the spiking neural network 120.
- FIG. 17 shows the output times of spikes of the input layer, the hidden layer, and the output layer in the processing for the same input data to the spiking neural network 120.
- the horizontal axis of each graph in FIG. 17 indicates the time.
- the vertical axis indicates the spike identification number i. As described above, the identification number of the node in the upper layer is also used as the identification number of the spike.
- FIG. 18 is a diagram showing an example of spike output times in the artificial neural network 110 and the spiking neural network 120.
- FIG. 18 shows the output values of the hidden layer and the output layer in the processing for the input data in the example of FIG. 17 for each of the artificial neural network 110 and the spiking neural network 120.
- “# 1”, “# 2”, and “# 3” in FIG. 18 indicate spike identification numbers 1, 2, and 3, respectively.
- the output values of the neuron model in the hidden layer of the artificial neural network 110 are 3.794, 0, 1.772 in order from the first neuron model.
- the spike output times of the neuron models in the hidden layer of the spiking neural network 120 are -3.794, 0, and -1.772 in order from the first neuron model.
- the output values of the neuron model in the output layer of the artificial neural network 110 are 12.263, 5.605, and 18.239 in order from the first neuron model.
- the spike output times of the neuron models in the output layer of the spiking neural network 120 are -12.263, -5.605, and -18.239 in order from the first neuron model.
- the value obtained by reversing the sign of the output value of the artificial neural network 110 is the spike output time of the spiking neural network 120.
- the output value of the neuron model in the spiking neural network 120 is equivalent to the output value of the neuron model in the artificial neural network 110.
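The equivalence stated above can be checked numerically with the hidden-layer values from FIG. 18 (the values are taken from the text; the check itself is ours):

```python
# The spike output times of the spiking neural network 120 equal the output
# values of the artificial neural network 110 with the sign reversed.

ann_outputs = [3.794, 0.0, 1.772]     # hidden layer of the ANN 110
spike_times = [-3.794, 0.0, -1.772]   # hidden layer of the SNN 120

assert all(abs(t + x) < 1e-9 for t, x in zip(spike_times, ann_outputs))
```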
- FIG. 19 is a diagram showing an example of the time evolution of the membrane potential of the neuron model of the hidden layer when the delay time is reflected.
- FIG. 19 shows an example of the time evolution of the membrane potential in the processing of the input data in the example of FIG.
- the horizontal axis of the graph in FIG. 19 indicates the time.
- the vertical axis shows the membrane potential.
- the membrane potential value of 1.0 is set as the firing threshold.
- the time shown in FIG. 19 reflects the delay time τ or τ (l) i shown in the equation (18) or the equation (20).
- Line L111 shows an example of the time evolution of the membrane potential in the first neuron model.
- the membrane potential in the first neuron model reaches the threshold at time -3.794 shown in FIG. 19.
- Line L112 shows an example of the time evolution of the membrane potential in the second neuron model.
- the membrane potential in the second neuron model reaches the threshold at a time after time 0 shown in FIG. 19.
- Line L113 shows an example of the time evolution of the membrane potential in the third neuron model.
- the membrane potential in the third neuron model reaches the threshold at time -1.772 shown in FIG. 19.
- the firing time in the case of FIG. 19 is the output value of the artificial neuron model with its sign reversed.
- FIG. 20 is a diagram showing an example of the time evolution of the membrane potential of the neuron model of the hidden layer when the delay time is not reflected.
- FIG. 20 shows an example of the time evolution of the membrane potential in the processing of the input data in the example of FIG.
- the horizontal axis of the graph in FIG. 20 indicates the time.
- the vertical axis shows the membrane potential.
- the membrane potential value of 1.0 is set as the firing threshold. Further, the time shown in FIG. 20 does not reflect the delay time τ or τ (l) i shown in the equation (18) or the equation (20).
- Line L121 shows an example of the time evolution of the membrane potential in the first neuron model.
- Line L122 shows an example of the time evolution of the membrane potential in the second neuron model.
- Line L123 shows an example of the time evolution of the membrane potential in the third neuron model.
- the firing time in FIG. 20 is different from the firing time in FIG. 19. Therefore, the firing time in FIG. 20 is not the output value of the artificial neuron model with its sign reversed. As described above, the comparison of FIG. 19 and FIG. 20 shows that providing the delay layer 243 or the delay unit 136 is effective for generating a spiking neural network 120 equivalent to the artificial neural network 110.
- FIG. 21 is a diagram showing an example of the time evolution of the membrane potential of the neuron model of the output layer when the delay time is reflected.
- FIG. 21 shows an example of the time evolution of the membrane potential in the processing of the input data in the example of FIG.
- the horizontal axis of the graph in FIG. 21 indicates the time.
- the vertical axis shows the membrane potential.
- the membrane potential value of 1.0 is set as the firing threshold.
- the time shown in FIG. 21 reflects the delay time τ or τ (l) i shown in the equation (18) or the equation (20).
- Line L211 shows an example of the time evolution of the membrane potential in the first neuron model.
- the membrane potential in the first neuron model reaches the threshold at time -12.263 shown in FIG. 21.
- Line L212 shows an example of the time evolution of the membrane potential in the second neuron model.
- the membrane potential in the second neuron model reaches the threshold at time -5.605 shown in FIG. 21.
- Line L213 shows an example of the time evolution of the membrane potential in the third neuron model.
- the membrane potential in the third neuron model reaches the threshold at time -18.239 shown in FIG. 21.
- the firing time in the case of FIG. 21 is obtained by reversing the sign of the output value of the artificial neuron model.
- FIG. 22 is a diagram showing an example of the time evolution of the membrane potential of the neuron model of the output layer when the delay time is not reflected.
- FIG. 22 shows an example of the time evolution of the membrane potential in the processing of the input data in the example of FIG.
- the horizontal axis of the graph in FIG. 22 indicates the time.
- the vertical axis shows the membrane potential.
- the membrane potential value of 1.0 is set as the firing threshold. Further, the time shown in FIG. 22 does not reflect the delay time τ or τ (l) i shown in the equation (18) or the equation (20).
- Line L221 shows an example of the time evolution of the membrane potential in the first neuron model.
- Line L222 shows an example of the time evolution of the membrane potential in the second neuron model.
- Line L223 shows an example of the time evolution of the membrane potential in the third neuron model.
- the firing time in FIG. 22 is different from the firing time in FIG. 21. Therefore, the firing time in FIG. 22 is not the output value of the artificial neuron model with its sign reversed. As described above, the comparison of FIG. 21 and FIG. 22 shows that providing the delay layer 243 or the delay unit 136 is effective for generating a spiking neural network 120 equivalent to the artificial neural network 110.
- the time-based spiking neuron layer 242 outputs a signal when the amount of internal state that evolves over time according to the signal input time exceeds the threshold value.
- the delay layer 243 outputs a signal obtained by changing the spiking time indicated by the output signal of the time-based spiking neuron layer 242 as a relative time with respect to the reference time by a set time.
- the same processing as that of the artificial neural network 110 can be performed by the spiking neural network method.
- For example, after performing high-precision learning using the artificial neural network 110, implementing an equivalent spiking neural network 120 in hardware can achieve both high-precision processing and reduced power consumption.
- the spiking neural network 120 equivalent to the artificial neural network 110 is configured by using the same number of time-adjustable spiking neuron models 131 as the number of artificial neuron models 111 included in the artificial neural network 110.
- the configuration of the spiking neural network 120 equivalent to the artificial neural network 110 can be made relatively simple.
- the configuration of the spiking neural network 120 is simple, the power consumption of the spiking neural network 120 can be reduced and the spiking neural network 120 can be made compact.
- the weight of the input spike to the time-based spiking neuron layer 242 is set to a weight based on the equation such that, when the spike input time to the time-based spiking neuron layer 242 is the time obtained by reversing the sign of the numerical value indicated by the input spike, the spike output time of the delay layer 243 is the time obtained by reversing the sign of the numerical value indicated by the output spike.
- a positive numerical value can be represented by a negative time. Therefore, in the spiking neural network 120, if a lower limit on the numerical value is set by, for example, a ramp function, the maximum waiting time can be bounded, and the delay due to waiting for input signals can be reduced.
- the t-ReLU layer 244 outputs a signal at the earlier of the spike output time or the reference time of the delay layer 243.
- the setting spike generator 241 outputs a signal to the time-based spiking neuron layer 242 at a time independent of the input spike to the time-based spiking neuron layer 242.
- the weight in the time-based spiking neuron layer 242 and the delay time in the delay layer 243 can be expressed by relatively simple equations such as equations (17) to (20).
- a spiking neural network equivalent to the artificial neural network 110 can be obtained relatively easily.
- time-based spiking neuron model 132 outputs a signal when the amount of internal state that evolves over time according to the signal input time exceeds the threshold value.
- the delay unit 136 outputs a signal obtained by changing the spiking time indicated by the output signal of the time-based spiking neuron model 132 as a relative time to the reference time by a set time.
- the same processing as that of the artificial neuron model 111 can be performed by the time-adjustable spiking neuron model 131.
- the artificial neural network 110 can be constructed by using the artificial neuron model 111.
- a spiking neural network 120 can be constructed using the time-adjustable spiking neuron model 131. For example, after performing high-precision learning using the artificial neural network 110, implementing an equivalent spiking neural network 120 in hardware can achieve both high-precision processing and reduced power consumption.
- the spiking neural network 120 equivalent to the artificial neural network 110 is configured by using the same number of time-adjustable spiking neuron models 131 as the number of artificial neuron models 111 included in the artificial neural network 110.
- the configuration of the spiking neural network 120 equivalent to the artificial neural network 110 can be made relatively simple.
- the configuration of the spiking neural network 120 is simple, it is expected that the power consumption of the spiking neural network 120 can be reduced and the spiking neural network 120 can be made compact.
- the weight of the input spike to the time-based spiking neuron model 132 is set to a weight based on the equation such that, when the spike input time to the time-based spiking neuron model 132 is the time obtained by reversing the sign of the numerical value indicated by the input spike, the spike output time of the delay unit 136 is the time obtained by reversing the sign of the numerical value indicated by the output spike.
- a positive numerical value can be represented by a negative time. Therefore, in the time-adjustable spiking neuron model 131, if a lower limit on the numerical value is set by, for example, a ramp function, the maximum waiting time can be bounded, and the delay due to waiting for input signals can be reduced.
- the t-ReLU unit 137 outputs a signal at the earlier of the spike output time of the delay unit 136 and the reference time.
- the ramp function serving as the activation function in the artificial neuron model 111 can thus be simulated, and the delay caused by waiting for input signals can be reduced as described above.
- the first setting spike supply unit 135 outputs a signal to the time-based spiking neuron model 132 at a time that does not depend on the input spike to the time-based spiking neuron model 132.
- the weight in the time-based spiking neuron model 132 and the delay time in the delay portion 136 can be expressed by relatively simple equations such as equations (17) to (20).
- a spiking neuron model equivalent to the artificial neuron model 111 can be obtained relatively easily.
- the base network generation unit 11 generates a spiking neural network 120 including a time-based spiking neuron layer 242 and a delay layer 243.
- the weight setting unit 12 sets the weight of the input signal to the time-based spiking neuron layer 242 to the time based on the equation (17) or the equation (19).
- the delay setting unit 13 sets the delay time in the delay layer 243 to the time based on the equation (18) or the equation (20).
- a spiking neural network 120 equivalent to the artificial neural network 110 can be obtained.
- for example, after performing high-precision learning using the artificial neural network 110, a spiking neural network 120 equivalent to it can be implemented in hardware, achieving both high-precision processing and reduced power consumption.
- the spiking neural network 120 equivalent to the artificial neural network 110 is configured by using the same number of time-adjustable spiking neuron models 131 as the number of artificial neuron models 111 included in the artificial neural network 110.
- the configuration of the spiking neural network 120 equivalent to the artificial neural network 110 can be made relatively simple.
- the configuration of the spiking neural network 120 is simple, the power consumption of the spiking neural network 120 can be reduced and the spiking neural network 120 can be made compact.
- the base model generation unit 21 generates a time-adjustable spiking neuron model 131 including a time-based spiking neuron model 132 and a delay unit 136.
- the weight setting unit 22 sets the weight of the input signal to the time-based spiking neuron model 132 to the time based on the equation (17) or the equation (19).
- the delay setting unit 23 sets the delay time in the delay unit 136 to the time based on the equation (18) or the equation (20).
- a time-adjustable spiking neuron model 131 equivalent to the artificial neuron model 111 can be obtained.
- the artificial neural network 110 can be constructed by using the artificial neuron model 111.
- a spiking neural network 120 can be constructed using the time-adjustable spiking neuron model 131. For example, after performing high-precision learning using the artificial neural network 110, a spiking neural network 120 equivalent to it can be implemented in hardware, achieving both high-precision processing and reduced power consumption.
- a spiking neural network 120 equivalent to the artificial neural network 110 is configured by using the same number of time-adjustable spiking neuron models 131 as the number of artificial neuron models 111 included in the artificial neural network 110.
- the configuration of the spiking neural network 120 equivalent to the artificial neural network 110 can be made relatively simple.
- the configuration of the spiking neural network 120 is simple, it is expected that the power consumption of the spiking neural network 120 can be reduced and the spiking neural network 120 can be made compact.
- FIG. 23 is a diagram showing a configuration example of the neural network device according to the embodiment.
- the neural network device 610 includes a time-based spiking neuron model 611 and a delay unit 612.
- the time-based spiking neuron model 611 outputs a signal when the amount of internal state that evolves over time according to the signal input time becomes equal to or greater than the threshold value.
- the delay unit 612 outputs a signal obtained by changing, by a set time, the spike time that the output signal of the time-based spiking neuron model 611 indicates as a relative time with respect to the reference time.
- processing equivalent to that of an artificial neural network can be performed in a spiking neural network manner. For example, after performing high-precision learning using an artificial neural network, implementing an equivalent neural network device 610 in hardware achieves both high-precision processing and reduced power consumption.
- a neural network device 610 equivalent to a neural network is configured using the same number of neuron models as the number of artificial neuron models included in the artificial neural network.
- the configuration of a spiking neural network equivalent to that of an artificial neural network can be made relatively simple. Since the configuration of the neural network device 610 is simple, the power consumption of the neural network device 610 can be reduced and the neural network device 610 can be made compact.
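The behavior of the neural network device 610 can be sketched in a few lines of code. The following is an illustrative model only, not the patent's implementation: it assumes a step-function synaptic kernel, so the internal state rises piecewise-linearly after each input spike (a closed-form firing time then exists within each inter-spike interval), and the weights and threshold are arbitrary example values.

```python
# Illustrative sketch only (not the patent's implementation): a time-based
# spiking neuron whose internal state follows a step-function kernel,
# V(t) = sum_i w_i * max(0, t - t_i), paired with a delay unit that shifts
# the output spike time by a set amount.

def fire_time(spike_times, weights, v_th):
    """Return the earliest time t at which V(t) reaches the threshold v_th."""
    events = sorted(zip(spike_times, weights))
    w_sum = 0.0   # total drive from spikes received so far
    wt_sum = 0.0  # sum of w_i * t_i over received spikes
    for k, (t_i, w_i) in enumerate(events):
        w_sum += w_i
        wt_sum += w_i * t_i
        if w_sum <= 0.0:
            continue  # potential not rising in this interval
        # Solve sum w_i * (t - t_i) = v_th within this inter-spike interval.
        t = (v_th + wt_sum) / w_sum
        next_t = events[k + 1][0] if k + 1 < len(events) else float("inf")
        if t_i <= t <= next_t:
            return t
    raise ValueError("threshold never reached")

def delay_unit(t_spike, tau):
    """Delay unit: shift the spike time, relative to the reference, by tau."""
    return t_spike + tau
```

With one input spike at time 0, weight 1, and threshold 2, this neuron fires at time 2, and a delay of 0.5 moves the output spike to time 2.5.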
- FIG. 24 is a diagram showing a configuration example of the generator according to the embodiment.
- the generation device 620 includes a base network generation unit 621, a weight setting unit 622, and a delay setting unit 623.
- the base network generation unit 621 generates a neural network including a time-based spiking neuron model that outputs a signal when the amount of internal state that evolves over time according to the signal input times becomes equal to or greater than the threshold value, and a delay unit that outputs a signal obtained by changing, by a set time, the spike time that the output signal of the time-based spiking neuron model indicates as a relative time with respect to the reference time.
- the weight setting unit 622 sets the weight of the input signal to the time-based spiking neuron model based on the equation stating that, when the input time of the input signal is the time obtained by reversing the sign of the numerical value indicated by the input signal, the output time of the output signal of the delay unit is the time obtained by reversing the sign of the numerical value indicated by the output signal.
- the delay setting unit 623 sets the set time in the delay unit based on the equation stating that, when the input time of the input signal to the time-based spiking neuron model is the time obtained by reversing the sign of the numerical value indicated by the input signal, the output time of the output signal of the delay unit is the time obtained by reversing the sign of the numerical value indicated by the output signal.
- a neural network device of the spiking neural network type equivalent to an artificial neural network can thus be obtained. For example, by performing high-precision learning using an artificial neural network and then implementing an equivalent neural network device in hardware, both high-precision processing and reduced power consumption can be achieved.
- a neural network device equivalent to an artificial neural network is configured using the same number of spiking neuron models as the number of artificial neuron models included in the artificial neural network.
- the configuration of a spiking neural network equivalent to an artificial neural network can be made relatively simple. Since the configuration of the neural network device is simple, it is expected that the power consumption of the neural network device can be reduced and that the spiking neural network 120 can be made compact.
- FIG. 25 is a diagram showing an example of a processing procedure in the information processing method according to the embodiment.
- the method shown in FIG. 25 includes outputting a first signal (step S611) and outputting a second signal (step S612).
- in step S611, the first signal is output when the amount of internal state that evolves over time according to the signal input times becomes equal to or greater than the threshold value.
- in step S612, the second signal is output, obtained by changing, by a set time, the spike time that the first signal indicates as a relative time with respect to the reference time.
- processing equivalent to that of an artificial neural network can be performed by a spiking neural network method.
- after performing high-precision learning using an artificial neural network, a spiking neural network equivalent to it is implemented in hardware and the processing shown in FIG. 25 is performed, achieving both high-precision processing and reduced power consumption.
- FIG. 26 is a diagram showing an example of a processing procedure in the generation method according to the embodiment.
- the method shown in FIG. 26 includes generating a neural network (step S621), setting weights (step S622), and setting delays (step S623).
- in step S621, a neural network is generated that includes a time-based spiking neuron model means that outputs a signal when the amount of internal state that evolves over time according to the signal input times becomes equal to or greater than the threshold value, and a delay means that outputs a signal obtained by changing, by a set time, the spike time that the output signal of the time-based spiking neuron model means indicates as a relative time with respect to the reference time.
- in step S622, the weight of the input signal to the time-based spiking neuron model means is set based on the equation stating that, when the input time of the input signal is the time obtained by reversing the sign of the numerical value indicated by the input signal, the output time of the output signal of the delay means is the time obtained by reversing the sign of the numerical value indicated by the output signal.
- in step S623, the set time in the delay means is set based on the equation stating that, when the input time of the input signal to the time-based spiking neuron model means is the time obtained by reversing the sign of the numerical value indicated by the input signal, the output time of the output signal of the delay means is the time obtained by reversing the sign of the numerical value indicated by the output signal.
- a spiking neural network equivalent to an artificial neural network can thus be obtained. For example, after performing high-precision learning using an artificial neural network, implementing an equivalent spiking neural network in hardware achieves both high-precision processing and reduced power consumption.
- a spiking neural network equivalent to an artificial neural network is configured using the same number of spiking neuron models as the number of artificial neuron models included in the artificial neural network.
- the configuration of a spiking neural network equivalent to that of an artificial neural network can be made relatively simple.
- the configuration of the spiking neural network is simple, the power consumption of the spiking neural network can be reduced and the spiking neural network can be made compact.
- FIG. 27 is a schematic block diagram showing the configuration of a computer according to at least one embodiment.
- the computer 700 includes a CPU 710, a main storage device 720, an auxiliary storage device 730, and an interface 740.
- the operation of each of the above-mentioned processing units is stored in the auxiliary storage device 730 in the form of a program.
- the CPU 710 reads the program from the auxiliary storage device 730, expands it to the main storage device 720, and executes the above processing according to the program. Further, the CPU 710 secures a storage area corresponding to each of the above-mentioned storage units in the main storage device 720 according to the program. Communication between each device and other devices is executed by the interface 740 having a communication function and performing communication according to the control of the CPU 710.
- the operations of the base network generation unit 11, the weight setting unit 12, and the delay setting unit 13 are stored in the auxiliary storage device 730 in the form of a program.
- the CPU 710 reads the program from the auxiliary storage device 730, expands it to the main storage device 720, and executes the above processing according to the program.
- the CPU 710 secures, in the main storage device 720, a storage area for the processing performed by the neural network generation device 10, according to the program. Communication between the neural network generation device 10 and other devices is executed, for example, by the interface 740, which has a communication function and operates under the control of the CPU 710.
- interaction between the neural network generation device 10 and the user is executed, for example, by the interface 740 having a display screen that displays various images under the control of the CPU 710 and an input device, such as a keyboard, that accepts user operations.
- when the spiking neural network 120 is implemented in the computer 700, the operations of the setting spike generator 241, the time-based spiking neuron layer 242, the delay layer 243, and the t-ReLU layer 244 are stored in the auxiliary storage device 730 in the form of a program.
- the CPU 710 reads the program from the auxiliary storage device 730, expands it to the main storage device 720, and executes the above processing according to the program.
- the CPU 710 secures, in the main storage device 720, a storage area for the processing performed by the spiking neural network 120, according to the program. Communication between the spiking neural network 120 and other devices is executed, for example, by the interface 740, which has a communication function and operates under the control of the CPU 710.
- interaction between the spiking neural network 120 and the user is executed, for example, by the interface 740 having a display screen that displays various images under the control of the CPU 710 and an input device, such as a keyboard, that accepts user operations.
- when the time-adjustable spiking neuron model 131 is implemented in the computer 700, the operations of the time-based spiking neuron model 132, the first setting spike supply unit 135, the delay unit 136, the t-ReLU unit 137, and the second setting spike supply unit 138 are stored in the auxiliary storage device 730 in the form of a program.
- the CPU 710 reads the program from the auxiliary storage device 730, expands it to the main storage device 720, and executes the above processing according to the program.
- the CPU 710 secures, in the main storage device 720, a storage area for the processing performed by the time-adjustable spiking neuron model 131, according to the program. Communication between the time-adjustable spiking neuron model 131 and other devices is executed, for example, by the interface 740, which has a communication function and operates under the control of the CPU 710.
- interaction between the time-adjustable spiking neuron model 131 and the user is executed, for example, by the interface 740 having a display screen that displays various images under the control of the CPU 710 and an input device, such as a keyboard, that accepts user operations.
- the operations of the base model generation unit 21, the weight setting unit 22, and the delay setting unit 23 are stored in the auxiliary storage device 730 in the form of a program.
- the CPU 710 reads the program from the auxiliary storage device 730, expands it to the main storage device 720, and executes the above processing according to the program.
- the CPU 710 secures, in the main storage device 720, a storage area for the processing performed by the neuron model generation device 20, according to the program. Communication between the neuron model generation device 20 and other devices is executed, for example, by the interface 740, which has a communication function and operates under the control of the CPU 710.
- interaction between the neuron model generation device 20 and the user is executed, for example, by the interface 740 having a display screen that displays various images under the control of the CPU 710 and an input device, such as a keyboard, that accepts user operations.
- the operations of the time-based spiking neuron model 611 and the delay unit 612 are each stored in the auxiliary storage device 730 in the form of a program.
- the CPU 710 reads the program from the auxiliary storage device 730, expands it to the main storage device 720, and executes the above processing according to the program.
- the CPU 710 secures, in the main storage device 720, a storage area for the processing performed by the neural network device 610, according to the program. Communication between the neural network device 610 and other devices is executed, for example, by the interface 740, which has a communication function and operates under the control of the CPU 710.
- interaction between the neural network device 610 and the user is executed, for example, by the interface 740 having a display screen that displays various images under the control of the CPU 710 and an input device, such as a keyboard, that accepts user operations.
- the operations of the base network generation unit 621, the weight setting unit 622, and the delay setting unit 623 are stored in the auxiliary storage device 730 in the form of a program.
- the CPU 710 reads the program from the auxiliary storage device 730, expands it to the main storage device 720, and executes the above processing according to the program.
- the CPU 710 secures, in the main storage device 720, a storage area for the processing performed by the generation device 620, according to the program. Communication between the generation device 620 and other devices is executed, for example, by the interface 740, which has a communication function and operates under the control of the CPU 710.
- interaction between the generation device 620 and the user is executed, for example, by the interface 740 having a display screen that displays various images under the control of the CPU 710 and an input device, such as a keyboard, that accepts user operations.
- all or part of the processing of the neural network generation device 10 may be performed by recording a program for that processing on a computer-readable recording medium, and reading and executing the program recorded on the recording medium in a computer system.
- the term "computer system” as used herein includes hardware such as an OS and peripheral devices.
- the "computer-readable recording medium” includes a flexible disk, a photomagnetic disk, a portable medium such as a ROM (Read Only Memory) and a CD-ROM (Compact Disc Read Only Memory), and a hard disk built in a computer system.
- the above-mentioned program may be one for realizing a part of the above-mentioned functions, and may also be one that realizes those functions in combination with a program already recorded in the computer system.
- the embodiment of the present invention may be applied to a neural network device, a generation device, an information processing method, a generation method, and a recording medium.
- 10 Neural network generation device
- 11, 621 Base network generation unit
- 12, 22, 622 Weight setting unit
- 13, 23, 623 Delay setting unit
- 20 Neuron model generation device
- 21 Base model generation unit
- 110 Artificial neural network
- 120 Spiking neural network
- 121 Spiking neuron model
- 131 Time-adjustable spiking neuron model
- 132, 611 Time-based spiking neuron model
- 133 Membrane potential calculation unit
- 134 Spike generation unit
- 135 First setting spike supply unit
- 136, 612 Delay unit
- 137 t-ReLU unit
- 138 Second setting spike supply unit
- 241 Setting spike generator
- 242 Time-based spiking neuron layer
- 243 Delay layer
- 244 t-ReLU layer
- 610 Neural network device
- 620 Generation device
Abstract
Description
For example, when a neural network is implemented in hardware, a spiking neural network is expected to consume less power during computation than a neural network using neuron models that have no internal state and perform computations containing no time element (referred to as an artificial neural network).
When generating a spiking neural network equivalent to an artificial neural network, it is preferable that the configuration of the spiking neural network be as simple as possible.
FIG. 1 is a schematic block diagram showing a configuration example of the neural network generation device according to the embodiment. In the configuration shown in FIG. 1, the neural network generation device 10 includes a base network generation unit 11, a weight setting unit 12, and a delay setting unit 13.
The neural network generation device is also referred to simply as a generation device.
A non-memory-type neural network is also referred to as an artificial neural network (ANN). A non-memory-type neuron model is also referred to as an artificial neuron model.
The neural network generation device 10 handles feedforward neural networks. That is, the neural network generation device 10 acquires configuration information of a feedforward artificial neural network 110 and generates a feedforward spiking neural network 120 equivalent to that artificial neural network 110.
In the example of FIG. 2, the feedforward neural network 200 includes an input layer 211, a hidden layer 212, and an output layer 213. The input layer 211, the hidden layer 212, and the output layer 213 are collectively referred to as layers 210.
Between two adjacent layers 210, all combinations of a node 220 in the upper layer and a node 220 in the lower layer may be connected by edges 230. Alternatively, there may be combinations not connected by an edge 230.
On the other hand, a neuron model is used for each of the nodes 220 of the hidden layer 212 and the nodes 220 of the output layer 213. The nodes 220 of the hidden layer 212 and the nodes 220 of the output layer 213 are collectively referred to as neuron model nodes 222. The hidden layer 212 and the output layer 213 are collectively referred to as neuron model layers 214.
As described above, the artificial neural network 110 is a neural network configured using artificial neuron models. An artificial neuron model is a neuron model that calculates a weighted sum of input values, inputs the calculated sum, or the sum plus a bias value, to an activation function, and outputs the resulting function value.
The output value of the artificial neuron model is expressed as in equation (1).
"Σ_j w^(l)_ij x^(l-1)_j" represents the weighted sum of the input values described above. b^(l)_i is called the bias term and indicates the bias value described above. f represents the activation function.
The portion of the transformation shown in equation (1) excluding the activation function can be understood as an affine transformation, and is expressed as in equation (2).
"Affine" is the affine function (a function representing the affine transformation).
In the following, a case in which a ramp function is used as the activation function f will be described as an example. The ramp function is also called the rectified linear function.
As shown in FIG. 4, when the input x satisfies x ≥ 0, ReLU(x) = x. On the other hand, when x < 0, ReLU(x) = 0. The ramp function ReLU is expressed as in equation (3).
However, the activation function in the artificial neuron model is not limited to the ramp function. Various functions that can be expressed in the time-based scheme described later can be used as the activation function in the artificial neuron model.
The affine transformation unit 112 calculates the weighted sum of the inputs to the artificial neuron model 111 and adds the bias value to the obtained sum. For example, the affine transformation unit 112 calculates Affine(x^(l-1)) based on equation (2) above.
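The artificial neuron just described can be illustrated in a few lines of code. This is a generic sketch of equations (1) to (3); the inputs, weights, and bias used below are arbitrary example values, not taken from the document.

```python
# Generic sketch of equations (1)-(3): weighted sum plus bias (the affine
# part), followed by the ramp activation function.

def relu(x):
    """Ramp function (equation (3)): x for x >= 0, otherwise 0."""
    return x if x > 0 else 0.0

def artificial_neuron(inputs, weights, bias):
    """Affine part (equation (2)): weighted sum of inputs plus bias,
    then the activation function, as in equation (1)."""
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return relu(s)
```

For inputs (1, 2) with weights (0.5, 0.25) and bias 0.1 the output is 1.1; a negative pre-activation (for example, one input 1 with weight -2 and bias 0.5) is clipped to 0 by the ramp function.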
As described above, a spiking neural network is a neural network configured using spiking neuron models. A spiking neuron model outputs binary signals called spikes based on an internal state that evolves over time.
For example, in the leaky integrate-and-fire neuron model, which is one type of spiking neuron model, the membrane potential evolves over time according to a differential equation such as equation (4).
t denotes time. t^(n-1)_j denotes the firing timing (firing time) of the j-th neuron in the (n-1)-th layer. r(·) is a function representing the influence that a spike transmitted from the preceding layer exerts on the postsynaptic current.
A spike output by a spiking neuron model is transmitted to the spiking neuron models in the lower layer that are connected to that spiking neuron model.
For example, since a spiking neural network uses binary signals, the power consumed by signaling can be reduced compared with the case where analog signals are used in an artificial neural network.
The neural network generation device 10 uses the time-based scheme as the information transmission scheme in the spiking neural network 120. In the time-based scheme, information is transmitted by firing timing.
FIG. 7 shows three examples: firing at time "1", firing at time "3", and firing at time "5". In the time-based scheme, a spiking neuron model can indicate quantitative information by its firing time. For example, a spike may indicate the numerical value of its firing time, such as a spike fired at time "3" indicating the numerical value "3". That is, the spiking neuron model may indicate a numerical value by the length of time between a reference time and the firing time.
The firing time is not limited to integer times; the spiking neuron model may indicate a real value by its firing time.
A spiking neural network of the time-based scheme is also referred to as a time-based spiking neural network. A spiking neuron of the time-based scheme is also referred to as a time-based spiking neuron.
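The membrane dynamics and the time-based coding described above can be illustrated with a small numerical simulation. The sketch below is a toy under stated assumptions: it uses a step-function kernel as in equation (6) with no leak term, Euler integration with an arbitrary step size, and example weights and threshold that are not taken from the document.

```python
# Toy illustration (assumptions: step-function kernel, no leak term, Euler
# integration with an arbitrary step size; weights and threshold are
# example values, not taken from the document).

def simulate_fire_time(spike_times, weights, v_th, dt=1e-3, t_max=10.0):
    """First time the membrane potential reaches v_th, or None if it never does."""
    v = 0.0
    t = min(spike_times)
    while t < t_max:
        # dv/dt = sum_i w_i * theta(t - t_i): each spike already received
        # contributes a constant drive w_i.
        dvdt = sum(w for w, t_i in zip(weights, spike_times) if t >= t_i)
        v += dvdt * dt
        t += dt
        if v >= v_th:
            return t  # in the time-based scheme, this time carries the value
    return None
```

With one input spike at time 0, weight 1, and threshold 2, the potential rises linearly and the model fires near time 2; under the time-based scheme, that firing time itself is the transmitted numerical value.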
As described above, the spiking neural network 120 is expected to consume less power than the artificial neural network 110. On the other hand, the artificial neural network 110 is considered to allow high-precision learning more easily than the spiking neural network 120.
FIG. 8 is a diagram showing an example of the correspondence between the artificial neural network 110 and the spiking neural network 120. To generate a spiking neural network 120 equivalent to the artificial neural network 110, one layer of the neural network is further subdivided into a plurality of layers.
FIG. 8 shows a configuration example for one neuron model layer 214 in the feedforward neural network 200. As described above, a neuron model layer 214 is the hidden layer 212 or the output layer 213.
The affine transformation units 112 of all the artificial neuron models 111 included in one neuron model layer 214, taken together as a layer, correspond to an example of the affine transformation layer 231.
The activation function units 113 of all the artificial neuron models 111 included in one neuron model layer 214, taken together as a layer, correspond to an example of the ReLU layer 232.
The time-based spiking neuron layer 242 includes time-based spiking neuron models. The neural network generation device 10 generates a time-based spiking neuron layer 242 that includes the same number of time-based spiking neuron models as the number of artificial neuron models 111 included in the target layer.
In equation (5), the step function θ is used as the function r of equation (4). The step function θ is expressed as in equation (6).
In equation (5), spikes from the upper layer are denoted by i = 1, ..., N, and the setting spike is denoted by i = 0. Specifically, t_1, ..., t_N denote the input times of the spikes from the upper layer, and t_0 denotes the input time of the setting spike.
The setting spike generator 241 outputs a setting spike at a set time (time t_0) that does not depend on the value of the input data to the spiking neural network 120. The set time may be updatable by learning or the like.
The delay layer 243 delays the output spikes of the time-based spiking neurons. If the delay time in the delay layer 243 is denoted by τ, the spike output time of the delay layer 243 is expressed as in equation (9).
Further, regarding the affine transformation layer 231 of the artificial neural network 110, if the input to the affine function "Affine" of equation (2) is written as x = (x_1, x_2, ..., x_N), the output is expressed as in equation (10).
As a spiking neural network 120 equivalent to the artificial neural network 110, consider representing the output value of the affine transformation layer 231 of the artificial neural network 110 by the spike output time of the delay layer 243 of the spiking neural network 120.
Specifically, when the spike input times to the spiking neuron model are -x = (-x_1, -x_2, ..., -x_N), the spike output time is made to be -Σ_{i=1}^{N} w_i x_i - b. Setting t_i = -x_i (i = 1, 2, ..., N) and t = -Σ_{i=1}^{N} w_i x_i - b in equation (9) yields equation (11).
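To make this correspondence concrete, the sketch below derives one consistent choice of spiking weights and delay time for the step-kernel neuron V(t) = Σ_i ŵ_i·max(0, t − t_i). Equations (11) to (20) are not reproduced in this text, so the formulas for `w_hat` and `tau` here are an illustrative reconstruction under that kernel assumption, not the patent's exact equations (17) and (18); `v_th`, `t0` (the set spike time), and `scale` are free example parameters, and it is assumed that `v_th` is large enough that every input arrives before the neuron fires.

```python
# Hedged reconstruction (not the patent's exact equations (17)-(20)): for
# the step-kernel neuron V(t) = sum_i w_hat_i * max(0, t - t_i), one
# consistent choice of spiking weights and delay realizing the mapping:
# inputs at times t_i = -x_i produce a delayed output spike at
# -(sum_i w_i * x_i + b).

def convert(weights, bias, v_th=10.0, t0=0.0, scale=1.0):
    """Map ANN weights/bias to spiking weights (index 0 = set spike) and a
    delay time tau."""
    W = sum(weights)
    w_hat = [scale * (1.0 - W)] + [scale * w for w in weights]
    tau = -bias - v_th / scale - (1.0 - W) * t0
    return w_hat, tau

def spiking_affine(x, weights, bias, v_th=10.0, t0=0.0, scale=1.0):
    """Delayed spike output time for value inputs x coded as times -x."""
    w_hat, tau = convert(weights, bias, v_th, t0, scale)
    times = [t0] + [-xi for xi in x]   # time coding: value x -> spike time -x
    total = sum(w_hat)                 # equals `scale` by construction
    # Closed-form firing time, valid when all inputs precede the crossing:
    t_fire = (v_th + sum(w * t for w, t in zip(w_hat, times))) / total
    return t_fire + tau                # the delay unit shifts the spike by tau
```

For x = (2, 4), weights (0.5, -0.25), and bias 0.1, the affine value is 0.5*2 - 0.25*4 + 0.1 = 0.1, and the delayed output spike appears at time -0.1, its sign-reversed value, matching the relation t = -Σ_{i} w_i x_i - b above.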
The delay time τ in equation (13) can take an arbitrary value. By adjusting the value of the delay time τ, equation (13) can easily be satisfied.
Transforming equation (12) yields equation (14).
The delay time τ can be calculated by substituting the value of the weight ŵ_i calculated by equation (17) into equation (18).
Denoting the target layer as the l-th layer, the index of a neuron model within the target layer by i, and the index of a neuron model within the upper layer of the target layer by j, equation (17) is expressed as in equation (19).
The matrix (A^(l)_i)^-1 is the inverse of the matrix A^(l)_i obtained by writing the weights w_j (j = 1, 2, ..., N) on the left-hand side of equation (16) as w^(l)_ij.
Equation (18) is expressed as in equation (20).
t^(l-1)_0 denotes the spike input time from the setting spike generator 241 to the i-th time-based spiking neuron model of the time-based spiking neuron layer 242 of the l-th layer.
V_th^(l)(i) denotes the membrane potential threshold in the i-th time-based spiking neuron model of the time-based spiking neuron layer 242 of the l-th layer. As described above, this threshold simulates the action potential.
In FIG. 9, the number of time-based spiking neurons included in the time-based spiking neuron layer 242 is denoted by M.
M spikes are input from the delay layer 243 to the t-ReLU layer 244 at times t_1, t_2, ..., t_M, respectively. In addition, a setting spike is input at time t_ReLU. The setting spike in the t-ReLU layer 244 is a different spike from the setting spike in the time-based spiking neuron layer 242. The setting spike in the t-ReLU layer 244 is also referred to as the t-ReLU spike.
The spike output times of the t-ReLU layer are t′_1, t′_2, ..., t′_M.
The spike output time t′_i (i = 1, 2, ..., M) is expressed as in equation (21).
FIG. 10 is a diagram showing an example of spike input and output times in the t-ReLU layer 244. The horizontal axis of the graph in FIG. 10 indicates time, and the vertical axis indicates the identification number i of a node in the upper layer. The identification number of an upper-layer node is also used as the identification number of a spike.
FIG. 10 shows a case in which the number of input spikes to the t-ReLU layer 244 is three.
As described above, the t-ReLU layer 244 outputs a spike at time t′_i, which is the earlier of time t_i and time t_ReLU. In the example of FIG. 10, times t_1 and t_2 are both earlier than time t_ReLU, so t′_1 = t_1 and t′_2 = t_2.
Here, let t_ReLU = 0. That is, the t-ReLU spike is input to the t-ReLU layer 244 at the reference time t = 0. As a result, the right-hand side of equation (21) becomes the same function as the ramp function (see equation (3)), except that the signs of the input and output are reversed. In this respect, it can be said that the t-ReLU layer 244 applies the ramp function to the spike output times.
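The t-ReLU operation of equation (21) thus reduces to taking the earlier of two spike times. A minimal sketch (function names here are illustrative) showing that, with t_ReLU = 0 and values coded as sign-reversed times, taking the earlier spike reproduces the ramp function of equation (3):

```python
def t_relu(t_spike, t_relu_spike=0.0):
    # Equation (21) with t_ReLU = 0: output at the earlier of the input
    # spike time and the t-ReLU set spike time.
    return min(t_spike, t_relu_spike)

def relu(x):
    # Ramp function of equation (3).
    return x if x > 0 else 0.0

# A value x is coded as the spike time -x, so t_relu(-x) equals -relu(x):
# the sign-reversed ramp function applied in the time domain.
```

For x = 3 the input spike at time -3 passes through unchanged (-ReLU(3) = -3); for x = -2 the input spike at time 2 is replaced by the set spike at the reference time 0 (-ReLU(-2) = 0).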
The membrane potential calculation unit 133 calculates the membrane potential in the time-based spiking neuron model 132 based on equation (5).
The spike generation unit 134 compares the membrane potential calculated by the membrane potential calculation unit 133 with the threshold value. When it determines that the membrane potential is equal to or greater than the threshold value, the spike generation unit 134 outputs a spike.
The synapse circuits 141-1 to 141-N are collectively referred to as synapse circuits 141.
In the example of FIG. 13, the power supply 160, the variable resistor 152, and the switching element 151 are connected in series.
The power supply 160 supplies the current for the output current of the synapse circuit 141.
In FIG. 13, the input signal to the synapse circuit 141-j (j is an integer with 0 ≤ j ≤ N) is denoted by the voltage V_in(j). The output current from the synapse circuit 141-j is denoted by I_j.
The threshold power supply 143 supplies the threshold potential to be compared with the membrane potential (potential V_m).
However, the implementation method of the time-based spiking neuron model 132 is not limited to a specific method. For example, the neural network generation device 10 may implement the time-based spiking neuron model 132 in software on a computer.
Thereby, the t-ReLU unit 137 performs the application of the ramp function to the spike output times in the t-ReLU layer 244.
In this case, the first setting spike supply unit 135 corresponds to an example of the setting spike generator 241.
Further, the t-ReLU units 137 of all the time-adjustable spiking neuron models 131 included in one neuron model layer 214, taken together as a layer, correspond to an example of the t-ReLU layer 244.
In the processing of FIG. 14, the base network generation unit 11 generates the spiking neural network 120 in a state in which the weights in the time-based spiking neuron layer 242 and the delay times in the delay layer 243 can be set (step S11).
Then, the delay setting unit 13 sets the delay times in the delay layer 243 of the spiking neural network 120 generated by the base network generation unit 11 (step S13).
After step S13, the neural network generation device 10 ends the processing of FIG. 14.
FIG. 15 is a schematic block diagram showing a configuration example of the neuron model generation device according to the embodiment. In the configuration shown in FIG. 15, the neuron model generation device 20 includes a base model generation unit 21, a weight setting unit 22, and a delay setting unit 23.
In the processing of FIG. 16, the base model generation unit 21 generates the time-adjustable spiking neuron model 131 in a state in which the weights in the time-based spiking neuron model 132 and the delay time in the delay unit 136 can be set (step S21).
Then, the delay setting unit 23 sets the delay time in the delay unit 136 of the time-based spiking neuron model 132 generated by the base model generation unit 21 (step S23).
After step S23, the neuron model generation device 20 ends the processing of FIG. 16.
According to the neural network generation device 10, by using the time-based spiking neural network 120, computations equivalent to those of the artificial neural network 110 can be performed while reducing power consumption.
All or part of the neural network generation device 10 may be implemented in an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array).
All or part of the spiking neural network 120 may be implemented in an ASIC or an FPGA.
The delay layer 243 corresponds to an example of the delay means, and outputs a spike at a time shifted by a set time from the spike output time of the time-based spiking neuron layer 242.
The setting spike generator 241 corresponds to an example of the setting spike supply means, and outputs a spike to the time-based spiking neuron layer 242 at a time that does not depend on the input signals to the time-based spiking neuron layer 242.
The delay unit 136 corresponds to an example of the delay means, and outputs a spike at a time shifted by a set time from the spike output time of the time-based spiking neuron model 132.
The first setting spike supply unit 135 corresponds to an example of the setting spike supply means, and outputs a spike to the time-based spiking neuron model 132 at a time that does not depend on the input signals to the time-based spiking neuron model 132.
The weight setting unit 12 corresponds to an example of the weight setting means, and sets the weights of the input spikes to the time-based spiking neuron layer 242 to weights ŵ_i or ŵ^(l)_ij based on equation (17) or equation (19).
Equation (18) or equation (20) corresponds to an example of an equation stating that, when the input time t_i or t^(l-1)_j of an input spike to the time-based spiking neuron layer 242 is the time -x_i or -x^(l-1)_j obtained by reversing the sign of the numerical value indicated by that input spike, the output time t or t^(l)_i of the output spike of the delay layer 243 is the time -Σ_{i=1}^{N} w_i x_i - b or -Σ_{j=1}^{N} w^(l)_ij x^(l-1)_j - b^(l)_i obtained by reversing the sign of the numerical value indicated by that output spike.
The weight setting unit 22 corresponds to an example of the weight setting means, and sets the weights of the input spikes to the time-based spiking neuron model 132 to weights ŵ_i or ŵ^(l)_ij based on equation (17) or equation (19).
The delay setting unit 23 corresponds to an example of the delay setting means, and sets the set time in the delay unit 136 to the delay time τ or τ^(l)_i based on equation (18) or equation (20).
Then, the trained artificial neural network 110 was converted into the spiking neural network 120.
In this respect, it is shown that the output values of the neuron models in the spiking neural network 120 are equivalent to the output values of the neuron models in the artificial neural network 110.
The horizontal axis of the graph in FIG. 19 indicates time, and the vertical axis indicates the membrane potential. A membrane potential value of 1.0 is set as the firing threshold. The times shown in FIG. 19 reflect the delay time τ or τ^(l)_i shown in equation (18) or equation (20).
Line L112 shows an example of the time evolution of the membrane potential in the second neuron model. The membrane potential in the second neuron model reaches the threshold at a time later than time 0 shown in FIG. 18.
Thus, the firing times in the case of FIG. 19 are the output values of the artificial neuron models with their signs reversed.
The horizontal axis of the graph in FIG. 20 indicates time, and the vertical axis indicates the membrane potential. A membrane potential value of 1.0 is set as the firing threshold. The times shown in FIG. 20 do not reflect the delay time τ or τ^(l)_i shown in equation (18) or equation (20).
Thus, comparing FIG. 19 with FIG. 20 shows that providing the delay layer 243 or the delay unit 136 is effective for generating a spiking neural network 120 equivalent to the artificial neural network 110.
The horizontal axis of the graph in FIG. 21 indicates time, and the vertical axis indicates the membrane potential. A membrane potential value of 1.0 is set as the firing threshold. The times shown in FIG. 21 reflect the delay time τ or τ^(l)_i shown in equation (18) or equation (20).
Line L212 shows an example of the time evolution of the membrane potential in the second neuron model. The membrane potential in the second neuron model reaches the threshold at time -5.605 shown in FIG. 18.
Thus, the firing times in the case of FIG. 21 are the output values of the artificial neuron models with their signs reversed.
The horizontal axis of the graph in FIG. 22 indicates time, and the vertical axis indicates the membrane potential. A membrane potential value of 1.0 is set as the firing threshold. The times shown in FIG. 22 do not reflect the delay time τ or τ^(l)_i shown in equation (18) or equation (20).
Thus, comparing FIG. 21 with FIG. 22 shows that providing the delay layer 243 or the delay unit 136 is effective for generating a spiking neural network 120 equivalent to the artificial neural network 110.
Thereby, the spiking neural network 120 can simulate the ramp function serving as the activation function in the artificial neural network 110, and, as described above, the delay caused by waiting for input signals can be reduced.
Thereby, the weights in the time-based spiking neuron layer 242 and the delay times in the delay layer 243 can be expressed by relatively simple equations such as equations (17) to (20).
In this respect, according to the spiking neural network 120, a spiking neural network equivalent to the artificial neural network 110 can be obtained relatively easily.
For example, after performing high-precision learning using the artificial neural network 110, implementing an equivalent spiking neural network 120 in hardware achieves both high-precision processing and reduced power consumption.
Thereby, the time-adjustable spiking neuron model 131 can simulate the ramp function serving as the activation function in the artificial neuron model 111, and, as described above, the delay caused by waiting for input signals can be reduced.
Thereby, the weights in the time-based spiking neuron model 132 and the delay time in the delay unit 136 can be expressed by relatively simple equations such as equations (17) to (20).
In this respect, according to the time-adjustable spiking neuron model 131, a spiking neuron model equivalent to the artificial neuron model 111 can be obtained relatively easily.
For example, after performing high-precision learning using the artificial neural network 110, implementing an equivalent spiking neural network 120 in hardware achieves both high-precision processing and reduced power consumption.
With this configuration, the time-based spiking neuron model 611 outputs a signal when the amount of internal state that evolves over time according to the signal input times becomes equal to or greater than the threshold value. The delay unit 612 outputs a signal obtained by changing, by a set time, the spike time that the output signal of the time-based spiking neuron model 611 indicates as a relative time with respect to the reference time.
Since the configuration of the neural network device 610 is simple, the power consumption of the neural network device 610 can be reduced and the neural network device 610 can be made compact.
With this configuration, the base network generation unit 621 generates a neural network including a time-based spiking neuron model that outputs a signal when the amount of internal state that evolves over time according to the signal input times becomes equal to or greater than the threshold value, and a delay unit that outputs a signal obtained by changing, by a set time, the spike time that the output signal of the time-based spiking neuron model means indicates as a relative time with respect to the reference time.
The delay setting unit 623 sets the set time in the delay unit to a time based on the equation stating that, when the input time of an input signal to the time-based spiking neuron model means is the time obtained by reversing the sign of the numerical value indicated by that input signal, the output time of the output signal of the delay unit is the time obtained by reversing the sign of the numerical value indicated by that output signal.
Since the configuration of the neural network device is simple, it is expected that the power consumption of the neural network device can be reduced and that the spiking neural network 120 can be made compact.
In outputting the first signal (step S611), the first signal is output when the amount of internal state that evolves over time according to the signal input times becomes equal to or greater than the threshold value. In outputting the second signal (step S612), the second signal is output, obtained by changing, by a set time, the spike time that the first signal indicates as a relative time with respect to the reference time.
In the configuration shown in FIG. 27, the computer 700 includes a CPU 710, a main storage device 720, an auxiliary storage device 730, and an interface 740.
Communication between the neural network generation device 10 and other devices is executed, for example, by the interface 740, which has a communication function and operates under the control of the CPU 710.
Communication between the spiking neural network 120 and other devices is executed, for example, by the interface 740, which has a communication function and operates under the control of the CPU 710.
Communication between the time-adjustable spiking neuron model 131 and other devices is executed, for example, by the interface 740, which has a communication function and operates under the control of the CPU 710.
Communication between the neuron model generation device 20 and other devices is executed, for example, by the interface 740, which has a communication function and operates under the control of the CPU 710.
Communication between the neural network device 610 and other devices is executed, for example, by the interface 740, which has a communication function and operates under the control of the CPU 710.
Communication between the generation device 620 and other devices is executed, for example, by the interface 740, which has a communication function and operates under the control of the CPU 710.
The "computer-readable recording medium" refers to portable media such as flexible disks, magneto-optical disks, ROMs (Read Only Memory), and CD-ROMs (Compact Disc Read Only Memory), and to storage devices such as hard disks built into computer systems. The above program may be one for realizing a part of the functions described above, and may also be one that realizes those functions in combination with a program already recorded in the computer system.
11, 621 Base network generation unit
12, 22, 622 Weight setting unit
13, 23, 623 Delay setting unit
20 Neuron model generation device
21 Base model generation unit
110 Artificial neural network
120 Spiking neural network
121 Spiking neuron model
131 Time-adjustable spiking neuron model
132, 611 Time-based spiking neuron model
133 Membrane potential calculation unit
134 Spike generation unit
135 First setting spike supply unit
136, 612 Delay unit
137 t-ReLU unit
138 Second setting spike supply unit
241 Setting spike generator
242 Time-based spiking neuron layer
243 Delay layer
244 t-ReLU layer
610 Neural network device
620 Generation device
Claims (10)
- A neural network device comprising: a time-based spiking neuron model means for outputting a signal when an internal state quantity, which evolves in time according to a signal input time, becomes equal to or greater than a threshold; and a delay means for outputting a signal in which a spike time, indicated by an output signal of the time-based spiking neuron model means as a relative time with respect to a reference time, is shifted by a set time.
- The neural network device according to claim 1, wherein a weight of an input signal to the time-based spiking neuron model means is set to a weight based on an equation stipulating that, when the input time of the input signal is the time obtained by inverting the sign of the numerical value indicated by the input signal, the output time of the output signal of the delay means is the time obtained by inverting the sign of the numerical value indicated by the output signal.
- The neural network device according to claim 2, further comprising a time-based ramp function means for outputting a signal at whichever is earlier, the output time of the output signal of the delay means or the reference time.
- The neural network device according to any one of claims 1 to 3, further comprising a set-spike supply means for outputting a signal to the time-based spiking neuron model means at a time that does not depend on the input signal to the time-based spiking neuron model means.
- The neural network device according to claim 1, wherein at least one of the time-based spiking neuron model means and the delay means is configured using an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array).
- A generation device comprising: a base network generation means for generating a neural network comprising a time-based spiking neuron model means for outputting a signal when an internal state quantity, which evolves in time according to a signal input time, becomes equal to or greater than a threshold, and a delay means for outputting a signal in which a spike time, indicated by an output signal of the time-based spiking neuron model means as a relative time with respect to a reference time, is shifted by a set time; a weight setting means for setting a weight of an input signal to the time-based spiking neuron model means to a weight based on an equation stipulating that, when the input time of the input signal is the time obtained by inverting the sign of the numerical value indicated by the input signal, the output time of the output signal of the delay means is the time obtained by inverting the sign of the numerical value indicated by the output signal; and a delay setting means for setting the set time in the delay means to a time based on an equation stipulating that, when the input time of an input signal to the time-based spiking neuron model means is the time obtained by inverting the sign of the numerical value indicated by the input signal, the output time of the output signal of the delay means is the time obtained by inverting the sign of the numerical value indicated by the output signal.
- An information processing method comprising: outputting a first signal when an internal state quantity, which evolves in time according to a signal input time, becomes equal to or greater than a threshold; and outputting a second signal in which a spike time, indicated by the first signal as a relative time with respect to a reference time, is shifted by a set time.
- A neural network generation method comprising: generating a neural network comprising a time-based spiking neuron model means for outputting a signal when an internal state quantity, which evolves in time according to a signal input time, becomes equal to or greater than a threshold, and a delay means for outputting a signal in which a spike time, indicated by an output signal of the time-based spiking neuron model means as a relative time with respect to a reference time, is shifted by a set time; setting a weight of an input signal to the time-based spiking neuron model means to a weight based on an equation stipulating that, when the input time of the input signal is the time obtained by inverting the sign of the numerical value indicated by the input signal, the output time of the output signal of the delay means is the time obtained by inverting the sign of the numerical value indicated by the output signal; and setting the set time in the delay means to a time based on an equation stipulating that, when the input time of an input signal to the time-based spiking neuron model means is the time obtained by inverting the sign of the numerical value indicated by the input signal, the output time of the output signal of the delay means is the time obtained by inverting the sign of the numerical value indicated by the output signal.
- A recording medium storing a program for causing a computer to execute: outputting a first signal when an internal state quantity, which evolves in time according to a signal input time, becomes equal to or greater than a threshold; and outputting a second signal in which a spike time, indicated by the first signal as a relative time with respect to a reference time, is shifted by a set time.
- A recording medium storing a program for causing a computer to execute: generating a neural network comprising a time-based spiking neuron model means for outputting a signal when an internal state quantity, which evolves in time according to a signal input time, becomes equal to or greater than a threshold, and a delay means for outputting a signal in which a spike time, indicated by an output signal of the time-based spiking neuron model means as a relative time with respect to a reference time, is shifted by a set time; setting a weight of an input signal to the time-based spiking neuron model means to a weight based on an equation stipulating that, when the input time of the input signal is the time obtained by inverting the sign of the numerical value indicated by the input signal, the output time of the output signal of the delay means is the time obtained by inverting the sign of the numerical value indicated by the output signal; and setting the set time in the delay means to a time based on an equation stipulating that, when the input time of an input signal to the time-based spiking neuron model means is the time obtained by inverting the sign of the numerical value indicated by the input signal, the output time of the output signal of the delay means is the time obtained by inverting the sign of the numerical value indicated by the output signal.
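Claim 3's time-based ramp function means (the t-ReLU unit 137 in the reference numerals) outputs a signal at whichever is earlier, the output time of the delay means or the reference time. The sketch below is a rough illustration only; it assumes a time coding in which a numeric value x is represented by a spike at t_ref − x, so that earlier spikes encode larger values. That coding, and the names `t_relu` and `decode`, are assumptions of this sketch; the claims themselves do not fix the coding.

```python
def t_relu(t_out, t_ref):
    """Time-based ramp function: emit a spike at whichever is earlier,
    the delayed neuron output time t_out or the reference time t_ref.

    Under the assumed coding (value x <-> spike time t_ref - x, earlier
    spike = larger value), taking the minimum in the time domain equals
    max(x, 0), i.e. a ReLU in the value domain.
    """
    if t_out is None:       # the neuron never fired: fall back to the reference time
        return t_ref
    return min(t_out, t_ref)


def decode(t, t_ref):
    """Inverse of the assumed coding: map a spike time back to a numeric value."""
    return t_ref - t
```

With t_ref = 10.0, a value of 3 arrives as a spike at t = 7.0 and passes through unchanged, while a value of −2 arrives as a spike at t = 12.0 and is clamped to the reference time, decoding to 0.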
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/039,939 US20240070443A1 (en) | 2020-12-11 | 2020-12-11 | Neural network device, generation device, information processing method, generation method, and recording medium |
PCT/JP2020/046331 WO2022123781A1 (ja) | 2020-12-11 | 2020-12-11 | Neural network device, generation device, information processing method, generation method, and recording medium |
JP2022568020A JPWO2022123781A5 (ja) | 2020-12-11 | Neural network device, generation device, information processing method, generation method, and program |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/046331 WO2022123781A1 (ja) | 2020-12-11 | 2020-12-11 | Neural network device, generation device, information processing method, generation method, and recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022123781A1 true WO2022123781A1 (ja) | 2022-06-16 |
Family
ID=81974287
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/046331 WO2022123781A1 (ja) | 2020-12-11 | 2020-12-11 | ニューラルネットワーク装置、生成装置、情報処理方法、生成方法および記録媒体 |
Country Status (2)
Country | Link |
---|---|
US (1) | US20240070443A1 (ja) |
WO (1) | WO2022123781A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116056285A (zh) * | 2023-03-23 | 2023-05-02 | 浙江芯源交通电子有限公司 | Traffic signal control system based on neuron circuits, and electronic device |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001043205A (ja) * | 1999-05-27 | 2001-02-16 | Denso Corp | Neuron, hierarchical neural network configured using the neuron, and multiplication circuit used for multiplication processing inside the neuron |
JP2010146514A (ja) * | 2008-12-22 | 2010-07-01 | Sharp Corp | Information processing device and neural network circuit using the same |
WO2018168580A1 (ja) * | 2017-03-13 | 2018-09-20 | 日本電気株式会社 | Relationship search system, information processing device, method, and program |
2020
- 2020-12-11 WO PCT/JP2020/046331 patent/WO2022123781A1/ja active Application Filing
- 2020-12-11 US US18/039,939 patent/US20240070443A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US20240070443A1 (en) | 2024-02-29 |
JPWO2022123781A1 (ja) | 2022-06-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9779355B1 (en) | Back propagation gates and storage capacitor for neural networks | |
US20200053299A1 (en) | Image sensor with analog sample and hold circuit control for analog neural networks | |
CN107077637B (zh) | Differential encoding in neural networks | |
Matsubara et al. | A generalized rotate-and-fire digital spiking neuron model and its on-FPGA learning | |
CN112633497A (zh) | Training method for convolutional spiking neural networks based on reweighted membrane voltage | |
Tanaka et al. | A CMOS spiking neural network circuit with symmetric/asymmetric STDP function | |
Stewart et al. | Online few-shot gesture learning on a neuromorphic processor | |
US20150269480A1 (en) | Implementing a neural-network processor | |
Burney et al. | Levenberg-Marquardt algorithm for Karachi Stock Exchange share rates forecasting | |
US20150046382A1 (en) | Computed synapses for neuromorphic systems | |
US20210064367A1 (en) | Method and computing device with a multiplier-accumulator circuit | |
US20190325291A1 (en) | Resistive processing unit with multiple weight readers | |
CN105981055A (zh) | Adaptation of neural networks to currently available computational resources | |
US10552734B2 (en) | Dynamic spatial target selection | |
US20150317557A1 (en) | Temporal spike encoding for temporal learning | |
US20170243108A1 (en) | Current Mirror Scheme for An Integrating Neuron Circuit | |
Sun et al. | Memristor-based Hopfield network circuit for recognition and sequencing application | |
WO2022123781A1 (ja) | Neural network device, generation device, information processing method, generation method, and recording medium | |
Zhao et al. | Novel spike based reservoir node design with high performance spike delay loop | |
Yan et al. | Multilayer memristive neural network circuit based on online learning for license plate detection | |
WO2019033084A1 (en) | PULSE MODULATION MULTIPLIER DURATION | |
Oh et al. | A 57 mW 12.5 µJ/Epoch embedded mixed-mode neuro-fuzzy processor for mobile real-time object recognition | |
KR101825933B1 (ko) | Phase coding for coordinate transformation | |
Loyez et al. | Subthreshold neuromorphic devices for spiking neural networks applied to embedded ai | |
CN106133763A (zh) | Plastic synapse management | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20965162 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022568020 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18039939 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20965162 Country of ref document: EP Kind code of ref document: A1 |