EP4292017A1 - Neuromorphic circuit and associated training method - Google Patents
Neuromorphic circuit and associated training method
- Publication number
- EP4292017A1 (application EP22710296.9A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- neuron
- neurons
- synapse
- pulse
- neuromorphic circuit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000012549 training Methods 0.000 title claims abstract description 30
- 238000000034 method Methods 0.000 title claims description 11
- 210000002569 neuron Anatomy 0.000 claims abstract description 222
- 210000000225 synapse Anatomy 0.000 claims abstract description 96
- 238000013528 artificial neural network Methods 0.000 claims abstract description 45
- 230000002457 bidirectional effect Effects 0.000 claims abstract description 12
- 230000008859 change Effects 0.000 claims description 8
- 230000003111 delayed effect Effects 0.000 claims description 6
- 239000011159 matrix material Substances 0.000 claims description 5
- 230000004048 modification Effects 0.000 claims description 5
- 238000012986 modification Methods 0.000 claims description 5
- 238000012421 spiking Methods 0.000 abstract description 5
- 230000002123 temporal effect Effects 0.000 abstract 1
- 239000012528 membrane Substances 0.000 description 12
- 230000004913 activation Effects 0.000 description 11
- 238000001994 activation Methods 0.000 description 11
- 230000006870 function Effects 0.000 description 11
- 230000000946 synaptic effect Effects 0.000 description 10
- 230000015654 memory Effects 0.000 description 9
- 238000005516 engineering process Methods 0.000 description 7
- 230000006399 behavior Effects 0.000 description 5
- 238000011144 upstream manufacturing Methods 0.000 description 5
- 230000003287 optical effect Effects 0.000 description 4
- 230000001242 postsynaptic effect Effects 0.000 description 4
- 238000012545 processing Methods 0.000 description 4
- 239000004065 semiconductor Substances 0.000 description 4
- 238000004519 manufacturing process Methods 0.000 description 3
- 230000003518 presynaptic effect Effects 0.000 description 3
- 230000036279 refractory period Effects 0.000 description 3
- 230000000284 resting effect Effects 0.000 description 3
- 230000000638 stimulation Effects 0.000 description 3
- 230000005540 biological transmission Effects 0.000 description 2
- 238000004364 calculation method Methods 0.000 description 2
- 239000003990 capacitor Substances 0.000 description 2
- 230000000295 complement effect Effects 0.000 description 2
- 238000001514 detection method Methods 0.000 description 2
- 238000010586 diagram Methods 0.000 description 2
- 239000012212 insulator Substances 0.000 description 2
- 230000000670 limiting effect Effects 0.000 description 2
- 239000000463 material Substances 0.000 description 2
- 229910044991 metal oxide Inorganic materials 0.000 description 2
- 150000004706 metal oxides Chemical class 0.000 description 2
- 210000005036 nerve Anatomy 0.000 description 2
- 229910052710 silicon Inorganic materials 0.000 description 2
- 239000010703 silicon Substances 0.000 description 2
- 230000001360 synchronised effect Effects 0.000 description 2
- 230000003044 adaptive effect Effects 0.000 description 1
- 238000004458 analytical method Methods 0.000 description 1
- 230000008901 benefit Effects 0.000 description 1
- 238000013529 biological neural network Methods 0.000 description 1
- 210000004027 cell Anatomy 0.000 description 1
- 238000004891 communication Methods 0.000 description 1
- 230000007423 decrease Effects 0.000 description 1
- 230000003247 decreasing effect Effects 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 230000002964 excitative effect Effects 0.000 description 1
- 238000009472 formulation Methods 0.000 description 1
- 230000002401 inhibitory effect Effects 0.000 description 1
- 210000002364 input neuron Anatomy 0.000 description 1
- 230000010354 integration Effects 0.000 description 1
- 238000002955 isolation Methods 0.000 description 1
- 238000010801 machine learning Methods 0.000 description 1
- 238000007620 mathematical function Methods 0.000 description 1
- 230000007246 mechanism Effects 0.000 description 1
- 239000000203 mixture Substances 0.000 description 1
- 238000012544 monitoring process Methods 0.000 description 1
- 210000000653 nervous system Anatomy 0.000 description 1
- 238000005457 optimization Methods 0.000 description 1
- 230000010355 oscillation Effects 0.000 description 1
- 238000003909 pattern recognition Methods 0.000 description 1
- 239000012782 phase change material Substances 0.000 description 1
- 230000001766 physiological effect Effects 0.000 description 1
- 230000009467 reduction Effects 0.000 description 1
- 230000004044 response Effects 0.000 description 1
- 230000005062 synaptic transmission Effects 0.000 description 1
- 238000013519 translation Methods 0.000 description 1
- 230000001960 triggered effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/049—Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/06—Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
- G06N3/063—Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means
- G06N3/065—Analogue means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/088—Non-supervised learning, e.g. competitive learning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B82—NANOTECHNOLOGY
- B82Y—SPECIFIC USES OR APPLICATIONS OF NANOSTRUCTURES; MEASUREMENT OR ANALYSIS OF NANOSTRUCTURES; MANUFACTURE OR TREATMENT OF NANOSTRUCTURES
- B82Y10/00—Nanotechnology for information processing, storage or transmission, e.g. quantum computing or single electron logic
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/048—Activation functions
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11C—STATIC STORES
- G11C11/00—Digital stores characterised by the use of particular electric or magnetic storage elements; Storage elements therefor
- G11C11/54—Digital stores characterised by the use of particular electric or magnetic storage elements; Storage elements therefor using elements simulating biological cells, e.g. neuron
Definitions
- the present invention relates to a neuromorphic circuit capable of implementing a neural network and also relates to a method for training the neural network of the neuromorphic circuit.
- a CPU is a processor, the acronym CPU standing for “Central Processing Unit”, while a GPU is a graphics processor, the acronym GPU standing for “Graphics Processing Unit”.
- a neural network is generally made up of a succession of layers of neurons, each of which takes its inputs from the outputs of the previous layer. More precisely, each layer comprises neurons taking their inputs from the outputs of the neurons of the previous layer. Each layer is connected by a plurality of synapses. A synaptic weight is associated with each synapse. It is a real number, which takes both positive and negative values. For each layer, the input of a neuron is the weighted sum of the outputs of the neurons of the previous layer, the weighting being made by the synaptic weights.
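The weighted sum described in the bullet above can be sketched as follows. This is a minimal illustration, not circuitry from the patent; the array shapes and values are arbitrary.

```python
import numpy as np

# For each layer, the input of a neuron is the weighted sum of the
# outputs of the previous layer's neurons; weights are signed reals.
rng = np.random.default_rng(0)

prev_outputs = np.array([0.2, 0.7, 0.1])       # outputs of the previous layer
weights = rng.uniform(-1.0, 1.0, size=(4, 3))  # signed synaptic weights, 4 neurons

inputs = weights @ prev_outputs                # weighted sum seen by each neuron
assert inputs.shape == (4,)
```

Each row of `weights` plays the role of the synapses feeding one downstream neuron.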
- a Von Neumann bottleneck problem arises from the fact that implementing a deep neural network (more than three layers, and up to several tens) involves using both the memory or memories and the processor, while these elements are spatially separated.
- CMOS: complementary metal-oxide-semiconductor
- a neural network based on technologies of the optical type is also known.
- also known are CMOS neural networks in which the synapses use memristors.
- memristor or memristance
- the name is a portmanteau word formed from the two English words memory and resistor.
- a memristor is a non-volatile memory component, the value of its electrical resistance changing with the application of a voltage for a certain time and remaining at this value in the absence of voltage.
- each neuron occupies several tens of micrometers per side.
- each synapse also occupies several tens of micrometers per side.
- the number of neurons and synapses which can be integrated is limited, which results in a reduction in the performance of the neural network.
- for impulse (spiking) neural networks, in addition to the aforementioned issues, there is also the difficulty of implementing training, since the gradient backpropagation technique, which is currently the most effective, is not directly usable.
- the description describes a neuromorphic circuit suitable for implementing a network of impulse neurons, the neuromorphic circuit comprising synapses produced by a set of memristors arranged in the form of a matrix network, each synapse having a value.
- the neuromorphic circuit comprises neurons, each neuron being capable of emitting impulses at a variable rate, each neuron being connected to one or more neurons via a synapse, the neurons being arranged in successive layers of neurons, the layers of neurons comprising a layer input, at least one hidden layer of neurons and an output layer, the synapses being bidirectional for the neurons of the at least one hidden layer of neurons and of the output layer.
- the neuromorphic circuit comprises a neural network training module, for at least one bidirectional synapse connecting a first neuron to a second neuron, the training module comprising, for the first neuron and the second neuron, an estimation unit suitable for obtaining an estimate of the time derivative of the rate of pulses emitted by the neuron.
- the training module also comprises an interconnection between the synapse and each neuron, the interconnection having at least two positions, and a controller suitable for sending a control signal to the interconnection when the first neuron has emitted an impulse, the control signal modifying the position of the interconnection so that the estimation unit of the second neuron is connected to the synapse.
- the neuromorphic circuit has one or more of the following characteristics, taken in isolation or according to all the technically possible combinations:
- the controller is also capable of synchronizing the two neurons so that the two neurons emit control signals modifying the value of the synapse according to the estimation of the time derivative of the rate of pulses emitted by the second neuron.
- each memristor has a non-zero conductance for a voltage above a positive threshold and for a voltage below a negative threshold
- the first neuron also being capable of emitting as a control signal a pulse whose amplitude is at each instant equal to one of the positive threshold and the negative threshold, the pulse preferably comprising a single change in amplitude.
- the second neuron is capable of emitting as a control signal a pulse proportional to the estimate of the time derivative of the pulse rate obtained by the estimation unit.
- the controller controls the two neurons so that the two control signals are emitted simultaneously.
- the interconnection comprises a sub-circuit for each neuron to which the synapse is connected, each sub-circuit comprising two switches.
- the estimation unit comprises a sub-unit for obtaining the rate of pulses emitted by said neuron, the obtaining sub-unit encoding the pulse rate in an output signal (the pulse-rate obtaining sub-unit preferably being a leaky integrator circuit); the estimation unit also comprises a retarder delaying the output signal of the obtaining sub-unit, to obtain a delayed signal, and a subtractor of the output signal of the obtaining sub-unit and the delayed signal of the retarder, to obtain a difference signal.
- the neuromorphic circuit further comprises a filter at the output of the subtractor, the filter preferably being a low-pass filter.
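The estimation unit described in the two bullets above (leaky integrator, retarder, subtractor, low-pass filter) can be sketched in discrete time. All time constants and the spike-train statistics below are illustrative assumptions, not values from the patent.

```python
import numpy as np

dt, leak, delay_steps, lp = 1e-3, 0.05, 50, 0.02

# Spike train whose emission rate ramps up over one second.
t = np.arange(0, 1, dt)
rate = 20 + 80 * t  # Hz, increasing
spikes = (np.random.default_rng(1).random(t.size) < rate * dt).astype(float)

# Leaky integrator: its output V_u encodes the pulse rate.
v_u = np.zeros(t.size)
for k in range(1, t.size):
    v_u[k] = (1 - leak) * v_u[k - 1] + spikes[k]

# Retarder: delayed copy of V_u; subtractor: difference signal.
v_delayed = np.concatenate([np.zeros(delay_steps), v_u[:-delay_steps]])
v_d = v_u - v_delayed  # approximates the derivative of the rate

# Low-pass filter attenuating the variations of the difference signal.
v_f = np.zeros(t.size)
for k in range(1, t.size):
    v_f[k] = (1 - lp) * v_f[k - 1] + lp * v_d[k]

# A rising rate should yield a positive derivative estimate on average.
assert v_f[t.size // 2:].mean() > 0
```

The delayed subtraction is a finite-difference approximation of the time derivative, which the filter then smooths.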
- the neuron is produced by a pulse relaxation oscillator.
- the description also describes a method for training a network of impulse neurons that a neuromorphic circuit is capable of implementing, the neuromorphic circuit comprising synapses produced by a set of memristors arranged in the form of a matrix network, each synapse having a value, the neuromorphic circuit comprising neurons, each neuron being able to emit impulses at a variable rate, each neuron being connected to one or more neurons via a synapse.
- the neurons are arranged in layers of successive neurons, the layers of neurons comprising an input layer, at least one hidden layer of neurons and an output layer, the synapses being bidirectional for the neurons of the at least one hidden layer of neurons and of the output layer.
- the neuromorphic circuit also comprises a neural network training module, for at least one bidirectional synapse connecting a first neuron to a second neuron, the training module comprising, for each neuron, an estimation unit capable of obtaining an estimation of the time derivative of the rate of pulses emitted by the neuron, an interconnection between the synapse and each neuron, the interconnection having at least two positions, and a controller.
- the training method includes the steps of the controller sending a control signal to the interconnection when the first neuron has emitted a pulse, and modifying the position of the interconnection so that the estimation unit of the second neuron is connected to the synapse.
- FIG. 1 is a schematic representation of an example of part of a neuromorphic circuit comprising a plurality of neurons and synapses
- FIG. 2 is a schematic representation of an example of a neural network
- FIG. 3 is a graphical representation of the behavior of a memristor
- FIG. 4 is a schematic representation of an example of a synapse and two neurons
- FIG. 5 is a schematic representation of an example of an operating timing diagram for updating the synapse of Figure 4.
- a portion of a neuromorphic circuit 10 implementing a neural network is shown in Figure 1.
- the neuromorphic circuit 10 is a single chip. This means that all the components which will be described later are located on the same chip.
- the neuromorphic circuit 10 is suitable for implementing a neural network 12 as shown schematically in Figure 2.
- the neural network 12 described is a network comprising an ordered succession of layers 14 of neurons 16 each of which takes its inputs from the outputs of the previous layer 14.
- a neuron or a nerve cell
- Neurons ensure the transmission of a bioelectrical signal called nerve impulses.
- Neurons have two physiological properties: excitability, i.e. the ability to respond to stimulation and convert it into nerve impulses, and conductivity, i.e. the ability to transmit the impulses.
- the activation is a mathematical function, which has the property of being non-linear (to be able to transform the input in a useful way) and preferably of being differentiable (to allow learning by gradient backpropagation).
- the activation function in this case can be represented as a function giving the variation of the average pulse emission frequency with the input current.
- a neuron 16 is a component performing a function equivalent to these latter models.
- each layer 14 comprises neurons 16 taking their inputs from the outputs of the neurons 16 of the previous layer 14.
- the neural network 12 described is a network comprising a single hidden layer of neurons 18.
- this number of hidden layers of neurons is not limiting.
- the uniqueness of the hidden layer of neurons 18 means that the neural network 12 has an input layer 20 followed by the hidden layer of neurons 18, itself followed by an output layer 22.
- the layers are indexable by an integer index i, the first layer corresponding to the input layer 20 and the last to the output layer 22.
- Each layer 14 is connected by a plurality of synapses 24.
- the synapse designates a functional contact zone which is established between two neurons 16. Depending on its behavior, the biological synapse can excite or even inhibit the downstream neuron in response to the upstream neuron.
- a positive synaptic weight corresponds to an excitatory synapse while a negative synaptic weight corresponds to an inhibitory synapse.
- Biological neural networks learn by altering synaptic transmissions throughout the network.
- formal neural networks can be trained to perform tasks by modifying synaptic weights according to a learning rule.
- a synapse 24 is a component performing a function equivalent to a synaptic weight of modifiable value.
- a synaptic weight is therefore associated with each synapse 24.
- it is a real number, which takes both positive and negative values.
- the input of a neuron 16 is the weighted sum of the outputs of the neurons 16 of the previous layer 14, the weighting being made by the synaptic weights.
- each layer 14 of neurons 16 is fully connected.
- a fully connected neuron layer is a layer in which the neurons in the layer are each connected to all the neurons in the previous layer. Such a type of layer is more often called according to the English term “fully connected”.
- the neural network 12 is a pulsed (spiking) neural network.
- a spiking neural network is often referred to by the acronym SNN which refers to the English denomination of “Spiking Neural Network”.
- a neuron is a time-varying dynamic element as described above and characterized here by its spiking frequency.
- a neuron 16 called pre-synaptic, upstream, emits an impulse
- synapse 24 weights this impulse and transmits it to neuron 16, called post-synaptic, downstream, which eventually emits an impulse in turn.
- the stimulation transmitted by the synapse 24 is a stimulation of a part of the neuron 16 downstream, called membrane and presenting a potential. If this membrane potential charges beyond a so-called activation threshold, neuron 16 emits an impulse. Specifically, synapse 24 performs a multiplication between weight and input activation. The input activation of neuron 16 downstream is the output signal sent by neuron 16 upstream.
- the downstream neuron 16 increases its membrane potential, compares it to a threshold and emits an output pulse when the membrane potential exceeds this threshold.
- a neuron 16 upstream is permanently activated (like an input neuron) in order to add biases to the membrane potential of the neuron 16 downstream which enrich the expressiveness of the function learned by the neural network 12.
- Such a neuron 16 is called “bias neuron” in the following description.
- neurons 16 are connected by a synapse 24 which is bidirectional.
- in the following, a circuit physically implementing a synapse 24 will simply be called synapse 24, and a circuit physically implementing a neuron 16 will simply be called neuron 16.
- part of neuromorphic circuit 10 represented in figure 1 corresponds to a set of two layers 14 of neurons 16 connected by synapses 24.
- neurons 16 aligned vertically form a first layer 14 and neurons 16 aligned horizontally form a second layer 14.
- the neuromorphic circuit 10 comprises neurons 16, synapses 24 and a training module 26.
- Each neuron 16 is made with a leaky integrate-and-fire circuit.
- LIF: Leaky Integrate-and-Fire
- each neuron 16 integrates the current received as input.
- Each neuron 16 emits an impulse when the membrane potential reaches a threshold.
- neuron 16 resets the membrane potential and does not integrate the current received at the input for a period.
- the period of non-integration is called the refractory period.
- after the refractory period, the neuron 16 again integrates the current received as input and the same mechanism begins again.
- the neuron 16 thus emits pulses at a variable rate over time.
- the pulse emission rate is p.
- the set of pulses is usually called “pulse train”.
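The LIF behaviour described above (integration, threshold crossing, reset, refractory period) can be sketched in a few lines. All constants are illustrative, and `simulate_lif` is a hypothetical name, not a function from the patent.

```python
# Minimal leaky integrate-and-fire sketch: integrate the input current,
# emit a pulse at a threshold, reset, then ignore input for a refractory
# period, so the pulse rate varies with the input current.
def simulate_lif(current, threshold=1.0, leak=0.02, refractory=5):
    v, spikes, blocked = 0.0, [], 0
    for i in current:
        if blocked > 0:          # refractory period: input not integrated
            blocked -= 1
            spikes.append(0)
            continue
        v = (1 - leak) * v + i   # leaky integration of the input current
        if v >= threshold:       # threshold crossing: emit a pulse
            spikes.append(1)
            v = 0.0              # reset the membrane potential
            blocked = refractory
        else:
            spikes.append(0)
    return spikes

low = sum(simulate_lif([0.05] * 1000))
high = sum(simulate_lif([0.20] * 1000))
assert high > low  # a stronger input current gives a higher pulse rate
```

This reproduces the qualitative point of the surrounding bullets: the emission rate is a function of the input current.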
- each neuron 16 is a pulse relaxation oscillator.
- each neuron 16 is a CMOS circuit comprising as main elements: a membrane capable of charging by integrating the currents generated by the neurons 16 of the preceding layers 14, a comparator, a pulse generator and an element producing the leak.
- the membrane of neuron 16 has a capacitor as its main element.
- the leakage resistor can be made with an NMOS transistor in saturation mode (MOS standing for “Metal-Oxide-Semiconductor”, the N indicating an N-type MOS transistor).
- a second NMOS transistor mounted in parallel with the membrane capacitor makes it possible to produce a switch across the terminals of the capacitor and to discharge it.
- the comparator is used to detect whether the voltage across the terminals of the capacitor exceeds the activation threshold of neuron 16.
- the neuron 16 can also include additional modules, in particular to control the adaptability of the thresholds, the pattern of emission of the pulses or the shape of the pulses.
- a neuron 16 may be produced in a technology node ranging from 28 nanometers (nm) to 180 nm.
- a technology of the FDSOI type could be used.
- the abbreviation FDSOI refers to the English name “Fully Depleted Silicon On Insulator”.
- a neuron 16 is a neuristor, i.e. a component based on materials with volatile resistive switching.
- a neuron 16 is composed of two active memristors.
- the memristors may be based on resistive materials of the NbO2−x or VOx type (where x is between 0 and 2), with negative differential resistance.
- Neurons 16 have an oscillation frequency which depends on the input current.
- a module which adapts the shape of the impulses is added so that it is possible to modify the synapses, as will be explained later.
- Each synapse 24 is made by a set of memristors 30.
- a memristor 30 is a component whose electrical resistance changes permanently when a current is applied. A datum can thus be recorded and rewritten by a control current. Such behavior is notably observed in phase-change materials, ferroelectric tunnel junctions or oxide-based redox memories such as HfOx or TiO2−x.
- the change in conductance of a memristor 30 depends on the amplitude and duration of the voltage pulses applied across the memristor 30.
- Such a memristor 30 exhibits behavior that can be modeled, for an applied voltage V greater than the positive threshold voltage V_th⁺, by a conductance change of the form ΔG = s⁺ · (V − V_th⁺) · τ_pulse · (1 − h⁺ · (G − G_min)/(G_max − G_min)), where:
- • s⁺ is the slope of the conductance of the memristor 30 as a function of the voltage, for an applied voltage greater than the positive threshold voltage V_th⁺,
- • G_min is the minimum conductance that can be reached by the memristor 30,
- • G_max is the maximum conductance that can be reached by the memristor 30,
- • h⁺ is a non-linearity coefficient corresponding to the fact that the amplitude of change in the conductance can depend on the conductance state of the memristor 30,
- • τ_pulse is the duration of the voltage pulse applied to the memristor 30.
- the set of the two memristors 30 makes it possible to produce a weight of any sign.
- one of the memristors 30 is kept fixed at an intermediate value and the conductance of the other memristor 30 varies according to the voltage pulse applied to the memristor.
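The signed weight built from a memristor pair, and the thresholded conductance update, can be sketched as below. This is a generic illustration with assumed constants and a simplified update law (the non-linearity coefficient is omitted); it is not the patent's exact model.

```python
# A signed weight from two memristors: one held at an intermediate value,
# the other modified by voltage pulses exceeding a threshold.
G_MIN, G_MAX, V_TH = 1e-6, 1e-4, 0.8  # siemens, siemens, volts (illustrative)

def update(g, v, s=5e-5, tau=1e-3):
    """Conductance changes only when |v| exceeds the threshold voltage."""
    if v > V_TH:
        g += s * (v - V_TH) * tau
    elif v < -V_TH:
        g -= s * (-v - V_TH) * tau
    return min(max(g, G_MIN), G_MAX)  # clamp to the attainable range

g_var = 5e-5                  # memristor whose conductance is varied
g_ref = 5e-5                  # memristor kept at an intermediate value

assert g_var - g_ref == 0.0   # the pair initially encodes a zero weight

g_var = update(g_var, 1.5)    # write pulse above the positive threshold
assert g_var - g_ref > 0      # the weight of the pair became positive

g_var = update(g_var, 0.5)    # sub-threshold voltage: no change
```

The difference of the two conductances thus yields a weight of any sign, as the bullet above describes.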
- the memristors 30 are arranged in the form of a matrix network.
- the second neuron 16 of the second layer 14 will have an incident current I corresponding to: I = Σᵢ wᵢ · aᵢ,
- where the index i gives the number of the synapse 24 concerned, wᵢ the weight of that synapse 24 and aᵢ the activation of the neuron 16 concerned.
- the training module 26 is a module capable of training the neural network 12. Training a neural network 12 consists in determining the appropriate weights for the neural network 12 to perform the desired task.
- Training module 26 is capable of operating at an adjustable learning rate.
- the neuromorphic circuit 10 comprises a dedicated adjustment component, such as a resistor.
- the resistor can be positioned at the output of the filter 42, which will be described later.
- the training module 26 includes an estimation unit 32 for each neuron 16 and a controller 34.
- the estimation unit 32 is able to obtain an estimation of the time derivative of the rate of pulses emitted by the neuron 16.
- the estimation unit 32 is capable of obtaining an estimation of the derivative of the rate of pulses at a given instant.
- the estimation unit 32 takes as input the train of pulses emitted by the neuron 16 and outputs a signal encoding the derivative of the rate of emission of the pulses.
- the signal is an impulse signal proportional to the estimated derivative of the emission rate.
- the estimation unit 32 comprises an obtaining sub-unit 36, a retarder 38 and a subtractor 40.
- Obtaining subunit 36 is a subunit for obtaining the rate of pulses emitted by neuron 16.
- the obtaining subunit 36 takes as input the train of pulses emitted by the neuron 16 to which the obtaining subunit 36 is connected and emits at its output an output signal encoding the rate of emitted pulses.
- the pulse rate obtaining subunit 36 is a leaky integrator circuit.
- the leakage coefficient of the leaky integrator circuit is denoted γ_u.
- the obtaining sub-unit 36 is then able to generate a voltage V_u varying in proportion to the emission rate p of the pulse train.
- the retarder 38 is adapted to delay the signal transmitted by the obtaining sub-unit 36. More precisely, the retarder 38 is capable of introducing a delay of duration τ into the signal of the obtaining sub-unit 36.
- the retarder 38 thus outputs a delayed signal.
- Such a signal can be written mathematically as V_u(t − τ).
- Subtractor 40 subtracts the delayed signal of the retarder 38 from the output signal of the obtaining sub-unit 36.
- the inputs of the subtractor 40 are thus connected to the outputs of the obtaining subunit 36 and of the retarder 38.
- the signal at the output of the subtractor 40 is called the difference signal.
- the difference signal V_D verifies the following relation: V_D(t) = V_u(t) − V_u(t − τ) ≈ τ · dV_u(t)/dt.
- the difference signal V_D is thus a signal encoding the derivative of the pulse emission rate.
- more precisely, the difference signal V_D is a signal having an amplitude proportional to the derivative of the emission rate of the pulses.
- the estimation unit 32 also comprises a filter 42 and a multiplication sub-unit 44.
- Filter 42 is positioned at the output of subtractor 40.
- the filter 42 is thus a filter of the difference signal serving to attenuate the variations thereof.
- filter 42 is a low-pass filter.
- the multiplication sub-unit 44 is capable of multiplying an incident signal by a coefficient which is here the given learning rate of the training module 26.
- Multiplication subunit 44 is placed at the output of filter 42.
- Each of the subunits of the estimation unit 32 is implemented as a CMOS component.
- the output of the estimation unit 32 is thus a signal proportional to the time derivative of the rate of transmitted pulses p.
- the proportionality coefficient is denoted e in the following.
- the controller 34 is able to send a control signal to all the synapses 24 to which the neuron 16 is connected.
- the controller 34 is able to send the control signal when the first neuron 16 has emitted a pulse. The controller 34 sends the control signal to the interconnection 46, which is also part of the training module 26.
- the interconnection 46 has several positions, and in particular a position in which the estimation unit 32 of the second neuron 16 is connected to the synapse 24 and another position in which the estimation unit 32 of the second neuron 16 is not connected to the synapse 24.
- the control signal causes the position of the interconnection 46 to be modified so that the estimation unit 32 of the second neuron 16 is connected to the synapse 24.
- the interconnection 46 comprises a sub-circuit 54 for each neuron 16 to which the synapse 24 is connected.
- Synapse 24 is connected at one end to sub-circuit 54 of the first neuron 16 and at the other end to sub-circuit 54 of the second neuron 16.
- Each sub-circuit 54 includes two switches 56 and 58.
- Each position of switches 56 and 58 corresponds to a state of switches 56 and 58.
- each switch 56 and 58 is a transistor.
- four positions are possible: on state for the two transistors 56 and 58 (first position), off state for the two transistors 56 and 58 (second position), and on state for one of the transistors 56 and 58 and off state for the other (third and fourth positions).
- transistors 56 and 58 are driven in opposition, so that only two positions are possible, namely on state for one of the transistors 56 and 58 and off state for the other.
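The two useful positions of the sub-circuit 54 can be sketched as a routing choice between the two transistor paths. The function name and mode labels below are illustrative, not from the patent.

```python
# Sketch of the two-position interconnection: transistors 56 and 58 are
# driven with complementary gate signals, so exactly one path to the
# synapse is closed at any time.
def route(learn_mode):
    t56_on = learn_mode          # transistor 56: estimation unit -> synapse
    t58_on = not learn_mode      # transistor 58: current conveyor -> synapse
    assert t56_on != t58_on      # complementary drive: one closed path only
    return "estimation_unit" if t56_on else "current_conveyor"

assert route(False) == "current_conveyor"  # normal inference operation
assert route(True) == "estimation_unit"    # position set by the controller
```

The controller's control signal corresponds to toggling `learn_mode` for the sub-circuit 54 of the second neuron 16.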
- Each of these transistors 56 and 58 has a drain D, a gate G and a source S.
- the training module 26 also comprises, for each neuron 16, a current conveyor 60 ensuring the flow of current between a neuron 16 and other neurons 16.
- the current conveyor 60 is a tripole which ensures that the signals emitted by the neurons 16 propagate bidirectionally.
- the current conveyor 60 makes it possible both to send an impulse to the neurons 16 of the next layer 14 and to impose a value on the synapse 24 connected upstream.
- the current conveyor 60 copies the post-synaptic resting potential on one of the terminals of the synapse 24, the other terminal of the synapse 24 having the presynaptic resting potential.
- the current generated, which depends on the resistance of the synapse 24, arrives at one of the two input terminals of the current conveyor 60 and is copied to the output terminal in order to charge the membrane capacitance of the post-synaptic neuron 16.
- when the activation threshold is exceeded, the post-synaptic potential generated is sent to the next layer 14 as well as to the second input of the current conveyor 60, in order to be able to write on the synapse 24.
- the drain D of the first transistor 56 is connected to the output of the estimation unit 32, the gate G of the first transistor 56 is connected to the controller 34 and the source S of the first transistor 56 is connected to the synapse 24.
- the drain D of the second transistor 58 is connected to the output of the current conveyor 60, the gate G of the second transistor 58 is connected to the controller 34 and the source S of the second transistor 58 is connected to the synapse 24.
- The controller 34 is also capable of synchronizing the two neurons 16 so that the two neurons 16 emit control signals modifying the value of the synapse 24 according to the estimate of the time derivative of the rate of pulses emitted by the second neuron 16.
- The control signals emitted by each neuron 16 differ, as explained later in this description.
- The control signals are synchronized so that the voltage applied to the synapse 24 updates it to the desired value.
- The operation of the neuromorphic circuit 10, and more specifically of the training module 26, is now described with reference to a method of training the neural network 12.
- The first neuron 16 emits a pulse.
- The controller 34 sends an activation signal to all the synapses 24 to which the neuron 16 is connected.
- The controller 34 then forces the first transistor 56 of the sub-circuit 54 of the second neuron 16 into the on state.
- The controller 34 thus ensures that the sub-circuits 54 of the neurons 16 are in the appropriate state to emit a control signal at a time t0.
- The control signals emitted are shown in the timing diagram of Figure 5.
- The first neuron 16 then emits a first control signal whose amplitude at each instant is equal to either the positive threshold voltage Vth+ of the memristor 30 to be updated or the negative threshold voltage Vth− of the memristor 30 to be updated.
- The first control signal comprises a single amplitude change, so that it consists of a pulse of positive amplitude followed by a pulse of negative amplitude.
- The positive pulse has as its amplitude the positive threshold voltage Vth+ of the memristor 30 to be updated, while the negative pulse has as its amplitude the negative threshold voltage Vth− of the memristor 30 to be updated.
- The positive pulse and the negative pulse have the same duration, denoted Δtimp.
- The estimation unit 32 of each second neuron 16 is triggered and emits a voltage pulse whose amplitude is proportional to the time derivative of the rate of pulses emitted, the pulse having a duration of 2·Δtimp.
- This voltage pulse corresponds to the second control signal.
- Thanks to the controller 34, the second control signal is synchronized with the first control signal, the pulse emitted by the estimation unit 32 temporally covering the first control signal (simultaneous emission).
- The resulting modification signal is such that the voltage applied across the memristor 30 is first, between time t0 and time t0 + Δtimp, a voltage of amplitude Vth+ + ε, and then, between time t0 + Δtimp and time t0 + 2·Δtimp, a voltage of amplitude Vth− + ε, where ε is the amplitude of the second control signal.
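The combined effect of the two control signals can be checked numerically. The sketch below uses illustrative values for the threshold voltages, ε and Δtimp (none taken from the patent), and assumes a memristor whose state only changes above its positive threshold or below its negative one:

```python
# Hypothetical values: thresholds, epsilon and pulse duration are free
# parameters chosen for illustration, not taken from the patent text.
V_TH_POS, V_TH_NEG = 1.0, -1.0   # memristor programming thresholds
DT_IMP = 1.0                     # duration of each half of the first signal

def applied_voltage(t, eps):
    """Voltage across the memristor between t0 and t0 + 2*dt_imp.

    First control signal: +Vth+ for dt_imp, then Vth- for dt_imp.
    Second control signal: a constant offset eps (proportional to the
    time derivative of the post-synaptic pulse rate) over the window.
    """
    first = V_TH_POS if t < DT_IMP else V_TH_NEG
    return first + eps

def net_update_sign(eps, samples=400):
    """Sign of the conductance change produced by the combined pulses."""
    pot = dep = 0
    for k in range(samples):
        v = applied_voltage(2 * DT_IMP * k / samples, eps)
        if v > V_TH_POS:     # above the positive threshold: potentiation
            pot += 1
        elif v < V_TH_NEG:   # below the negative threshold: depression
            dep += 1
    return (pot > dep) - (dep > pot)   # +1, 0 or -1
```

With a positive ε only the first half-pulse crosses a threshold (potentiation), with a negative ε only the second half does (depression), and with ε = 0 nothing is written: the sign of the derivative estimate selects the direction of the update.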
- The update is thus performed according to the value of the time derivative of the rate of pulses emitted by the second neuron 16.
- The update is carried out on the one hand according to the value of the aforementioned time derivative and on the other hand according to the value of the rate of pulses emitted by the first neuron 16.
- The rate of change of the value of the synapse 24 is the product of the rate of pulses emitted by the first neuron 16 and the time derivative of the rate of pulses emitted by the second neuron 16.
- The update of the same synapse 24 is also performed according to the value of the time derivative of the rate of pulses emitted by the first neuron 16.
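The rate-product rule above can be written as a short discrete-time sketch; the function name, learning rate and finite-difference derivative estimate are illustrative assumptions:

```python
def weight_update(w, rate_pre, post_rates, dt, lr=0.01):
    """One step of the rule above: the rate of change of the synapse
    value is the product of the pre-synaptic pulse rate and the time
    derivative of the post-synaptic pulse rate (estimated here by a
    finite difference over the last two rate samples)."""
    d_rate_post = (post_rates[-1] - post_rates[-2]) / dt
    return w + lr * rate_pre * d_rate_post * dt

# Post-synaptic rate rising from 4 Hz to 6 Hz over dt = 0.1 s while the
# pre-synaptic neuron fires at 10 Hz: the synapse is potentiated.
w_new = weight_update(0.5, rate_pre=10.0, post_rates=[4.0, 6.0], dt=0.1)
```

A falling post-synaptic rate would make the finite difference negative and depress the synapse by the same mechanism.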
- The training module 26 thus implements a modification of a learning rule called equilibrium propagation ("Equilibrium Propagation"), which allows the weights to be modified continuously, depending on the local dynamics of the neurons 16, and which encodes the derivative of the error in the changes in pulse frequencies during the training phase of the neural network 12.
- This modification of the learning rule allows an implementation with memristors whose functional performance is equivalent to backpropagation of the error gradient, while adding only a training module 26 that consumes few resources.
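For reference, the base learning rule being modified here, equilibrium propagation, contrasts a free equilibrium of the network with a weakly "nudged" one. A one-weight toy sketch of that original (non-spiking) rule is given below; it is an illustration of the general technique, not the patent's spike-based variant, and all values are hypothetical:

```python
def free_equilibrium(w, x):
    # Free phase: minimize E(s) = 0.5*s**2 - w*x*s  ->  s = w*x
    return w * x

def nudged_equilibrium(w, x, y, beta):
    # Nudged phase: minimize E(s) + (beta/2)*(s - y)**2
    return (w * x + beta * y) / (1.0 + beta)

def ep_step(w, x, y, beta=0.5, lr=0.5):
    """Contrast the two equilibria to estimate the error gradient."""
    s_free = free_equilibrium(w, x)
    s_nudged = nudged_equilibrium(w, x, y, beta)
    return w + (lr / beta) * (s_nudged * x - s_free * x)

# Train the single weight so that the network output w*x reaches y.
w = 0.0
for _ in range(100):
    w = ep_step(w, x=1.0, y=2.0)
# w converges toward 2.0, i.e. toward w*x == y
```

The update only uses locally available equilibrium states, which is what makes the rule attractive for on-chip learning with memristor synapses.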
- Indeed, the training module 26 comprises, for each neuron 16, a set of units that are inexpensive in terms of hardware, instead of external memories that are costly in area and in execution time, and instead of specific circuits modifying each memristor as a function of the explicit error gradient.
- This results in a training module 26 that is relatively easy to implement, since it suffices to use a controller 34 (depending on the case, a central controller or one controller per pair of neurons 16), an estimation unit 32 for each neuron 16, and an interconnection 46 for each synapse 24.
- Moreover, since the training module 26 can be made with CMOS components, the training module 26 has good efficiency.
- The training module 26 thus allows a network of spiking neurons to learn with good precision while maintaining a high integrability of the elements of the neuromorphic circuit 10.
- Such a neuromorphic circuit 10 is in particular suitable for performing image classification with local supervised learning.
- The neuromorphic circuit 10 is then a chip capable of performing adaptive classification at high speed and at low power. In particular, such a neuromorphic circuit 10 can learn during use.
- The fault robustness of the neuromorphic circuit 10 can be improved by using monolithic one-transistor, one-memristor elements per synapse (1T1R).
- This structure allows a finer adjustment of the conductances of the memristors 30 while keeping an on-chip footprint below that of all-CMOS synapses.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR2101311A FR3119696B1 (fr) | 2021-02-11 | 2021-02-11 | Neuromorphic circuit and associated training method |
PCT/EP2022/053026 WO2022171632A1 (fr) | 2021-02-11 | 2022-02-08 | Neuromorphic circuit and associated training method |
Publications (1)
Publication Number | Publication Date |
---|---|
EP4292017A1 true EP4292017A1 (de) | 2023-12-20 |
Family
ID=76730604
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP22710296.9A Pending EP4292017A1 (de) | Neuromorphic circuit and associated training method |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP4292017A1 (de) |
FR (1) | FR3119696B1 (de) |
WO (1) | WO2022171632A1 (de) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115169547B (zh) * | 2022-09-09 | 2022-11-29 | 深圳时识科技有限公司 | Neuromorphic chip and electronic device |
- 2021-02-11: FR application FR2101311A filed (published as FR3119696B1); status: Active
- 2022-02-08: EP application EP22710296.9A filed (published as EP4292017A1); status: Pending
- 2022-02-08: international application PCT/EP2022/053026 filed (published as WO2022171632A1); status: Application Filing
Also Published As
Publication number | Publication date |
---|---|
FR3119696B1 (fr) | 2024-02-09 |
WO2022171632A1 (fr) | 2022-08-18 |
FR3119696A1 (fr) | 2022-08-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- EP3663988B1 (de) | Artificial neuron for a neuromorphic chip with resistive synapses | |
- EP2713318B1 (de) | Neuromorphic system exploiting the intrinsic characteristics of memory cells | |
- EP2965269B1 (de) | Artificial neuron and memristor | |
- EP0546624B1 (de) | Data processing system with a shared non-linear function | |
- WO2017178352A1 (fr) | Artificial neuron | |
- EP3186752A1 (de) | Convolutional neural network | |
- KR20170008747A (ko) | Update of classifiers over common features | |
- EP3449423B1 (de) | Device and method for computing the convolution in a convolutional neural network | |
- WO2013000940A1 (fr) | Artificial neural network based on complementary memristive devices | |
- US20150248609A1 (en) | Neural network adaptation to current computational resources | |
- EP3660749A1 (de) | Neural circuit capable of executing a synaptic learning process | |
- FR3087560A1 (fr) | Spike-based backpropagation of errors in a spiking neural network | |
- WO2017009543A1 (fr) | Data-processing device with representation of values by time intervals between events | |
- EP4292017A1 (de) | Neuromorphic circuit and associated training method | |
- WO2019020384A9 (fr) | Computer for a spiking neural network with maximum aggregation | |
- FR3126252A1 (fr) | Neuromorphic circuit based on 2T2R RRAM cells | |
- EP3594863B1 (de) | Spiking neuromorphic circuit comprising an artificial neuron | |
- EP3549070B1 (de) | Modulation device and method, artificial synapse comprising said modulation device, and short-term plasticity method in an artificial neural network comprising said artificial synapse | |
- US11977982B2 (en) | Training of oscillatory neural networks | |
- FR3114718A1 (fr) | Device for compensating the movement of an event-based sensor, and associated observation system and method | |
- FR3135562A1 (fr) | Memory cell, electronic circuit comprising such cells, and associated programming and multiply-accumulate methods | |
- FR3124621A1 (fr) | Method for training a set of models of an artificial neural network, and associated data-processing method and electronic circuit | |
- FR3105659A1 (fr) | Method and device for binary coding of signals for implementing digital MAC operations with dynamic precision | |
- EP4195061A1 (de) | Algorithm computer realized in mixed-technology memories | |
- FR2691004A1 (fr) | Connectionist device for tracking the evolution of information between at least two given values, and applications | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: UNKNOWN |
 | STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
 | PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012 |
 | STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
2023-08-09 | 17P | Request for examination filed | Effective date: 20230809 |
 | AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
 | DAV | Request for validation of the european patent (deleted) | |
 | DAX | Request for extension of the european patent (deleted) | |