CN115062583A - Hopfield network hardware circuit for solving optimization problem and operation method - Google Patents

Hopfield network hardware circuit for solving optimization problem and operation method

Info

Publication number
CN115062583A
CN115062583A
Authority
CN
China
Prior art keywords
module
bias
output
signal
annealing
Prior art date
Legal status
Granted
Application number
CN202210675728.9A
Other languages
Chinese (zh)
Other versions
CN115062583B (en)
Inventor
李祎
任鹏宇
包涵
缪向水
Current Assignee
Huazhong University of Science and Technology
Hubei Jiangcheng Laboratory
Original Assignee
Huazhong University of Science and Technology
Hubei Jiangcheng Laboratory
Priority date
Filing date
Publication date
Application filed by Huazhong University of Science and Technology, Hubei Jiangcheng Laboratory
Priority to CN202210675728.9A
Publication of CN115062583A
Application granted
Publication of CN115062583B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 Computer-aided design [CAD]
    • G06F30/30 Circuit design
    • G06F30/39 Circuit design at the physical level
    • G06F30/398 Design verification or optimisation, e.g. using design rule check [DRC], layout versus schematics [LVS] or finite element methods [FEM]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/004 Artificial life, i.e. computing arrangements simulating life
    • G06N3/006 Artificial life, i.e. computing arrangements simulating life based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N7/00 Computing arrangements based on specific mathematical models
    • G06N7/08 Computing arrangements based on specific mathematical models using chaos models or non-linear system models
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 Reducing energy consumption in communication networks
    • Y02D30/70 Reducing energy consumption in communication networks in wireless communication networks


Abstract

The invention discloses a Hopfield network hardware circuit for solving an optimization problem and an operation method thereof. The circuit comprises a synapse module, an annealing module, a neuron module and an activation function module. The synapse module comprises a bias unit and a weight unit; the bias unit outputs fixed bias current signals, and the weight unit comprises an electronic synapse device array whose conductances encode the weight information; under the action of the voltage signals output by the activation function module, the array generates output currents that are sent to the neuron module. The annealing module generates output currents under the action of the transient annealing signal and the voltage signals output by the activation function module, and sends them to the neuron module. The neuron module comprises resistors, capacitors and operational amplifiers; it receives the currents from the synapse module and the annealing module, generates output voltages, and sends them to the activation function module for the nonlinear activation function operation. The invention can effectively reduce peripheral circuitry while improving the running speed and the convergence effect of solving combinatorial optimization problems.

Description

Hopfield network hardware circuit for solving optimization problem and operation method
Technical Field
The invention belongs to the technical field of solving combinatorial optimization problems, and particularly relates to a Hopfield network hardware circuit for solving optimization problems and an operation method.
Background
Combinatorial optimization problems are common in daily life but are very difficult to solve: they are usually NP-hard, so they can hardly be handled by conventional analytical methods. The Hopfield network can rapidly solve a combinatorial optimization problem by mapping the solving target onto its own energy function, and is therefore widely applied to the traveling salesman problem, the max-cut problem, and the like.
The Hopfield network is a typical recurrent neural network. After a certain number of iterations, the network energy converges to a minimum, and the neuron states then correspond to a solution of the problem. However, without perturbation the Hopfield network is more likely to converge to a local minimum than to the global minimum corresponding to the optimal solution, and thus yields less-than-ideal results. Therefore, many annealing algorithms have been developed to address this issue, such as simulated annealing, transient chaotic annealing and adiabatic annealing. However, complex annealing algorithms require additional control over the neuron states or the operation of the network.
On the other hand, the operation of the Hopfield network involves a large number of matrix-vector multiplications and can therefore be well accelerated by the in-memory computing paradigm based on both traditional and emerging memories. Several studies have accelerated Hopfield networks with this paradigm and obtained very good energy efficiency and speed; another study used the intrinsic noise of the memory devices to assist annealing. However, in each iteration the neuron outputs must be sampled by complex peripheral circuits and an analog-to-digital converter (ADC) and, after regulation, converted back into voltage signals by a digital-to-analog converter (DAC), or a clock signal is needed to control the iterations of the network, which introduces additional control and hardware overhead. Yet one property of the Hopfield network is that the output of each neuron can be taken directly as the input of the next iteration.
Therefore, a circuit implementation and operation method for the Hopfield network is urgently needed to overcome the difficulty that existing in-memory-computing implementations of the Hopfield network can hardly reduce peripheral-circuit overhead, improve operation efficiency, realize an efficient annealing algorithm and improve the network convergence effect at the same time.
Disclosure of Invention
Aiming at the defects of the prior art, the invention aims to provide a Hopfield network hardware circuit for solving an optimization problem and an operation method thereof, which can effectively reduce the overhead of peripheral circuits, particularly ADCs and DACs, and can improve the operation speed and the convergence effect of solving combinatorial optimization problems.
In order to achieve the above object, in a first aspect, the present invention provides a Hopfield network hardware circuit for solving an optimization problem, including a synapse module, an annealing module, a neuron module, and an activation function module;
a synapse module comprising a bias unit and a weight unit, the bias unit being used for outputting m fixed bias current signals I_bias,j, where m represents the scale of the combinatorial optimization problem; the weight unit comprises an electronic synapse device array, the conductance G_xy of each electronic synapse device encodes the weight information, and the electronic synapse devices in each row of the array output a total current signal I_j under the action of the voltage signals v_i output by the activation function module; G_xy represents the conductance of the electronic synapse device in row x and column y of the array, x ∈ [1, m], y ∈ [1, m]; the subscripts i and j correspond to the i-th and j-th output signals, i ∈ [1, m], j ∈ [1, m];
a neuron module comprising a plurality of neuron units, each neuron unit comprising an operational amplifier and a resistor and a capacitor respectively connected between the output terminal and the inverting input terminal of the operational amplifier; each neuron unit is used for correspondingly receiving the fixed bias current signal I_bias,j output by the bias unit, the total current signal I_j, and the current signal I_tca,j output by the annealing module, and for outputting a voltage signal u_j;
an activation function module comprising multi-stage operational amplifier circuits, used for receiving the voltage signal u_j output by each neuron unit and outputting m voltage signals v_j after the operation of a nonlinear activation function f;
an annealing module comprising a plurality of field effect transistors, wherein the source of each field effect transistor and the electronic synapse devices in each row of the electronic synapse device array are loaded with a brief random initial excitation V_in,j at the beginning of the iteration of the hardware circuit, and the gates of the field effect transistors are then used to generate m current signals I_tca,j under the action of the transient annealing signal v_tca and the corresponding voltage signals v_i; the transient annealing signal v_tca takes a signal form that decreases with the iteration time t, so that the hardware circuit iterates spontaneously until convergence, and the m voltage signals v_j output by the activation function module at convergence are the solution of the combinatorial optimization problem.
According to the Hopfield network hardware circuit for solving an optimization problem described above, the operation function of the Hopfield network is constructed from electronic devices such as the electronic synapse array, the operational amplifiers and the multi-stage operational amplifier circuits, so that the energy efficiency of solving combinatorial optimization problems can be improved. Compared with existing computing-in-memory architectures, peripheral circuits such as ADCs and DACs are not needed, which effectively reduces the overhead, and no external clock signal is needed for control, which effectively improves the operation efficiency and realizes self-iteration. Meanwhile, the added annealing module comprising a plurality of field effect transistors realizes an efficient transient chaotic annealing algorithm and improves the convergence effect of the network.
In one embodiment, in the annealing module, the signal form of the transient annealing signal v_tca is linear, exponential or logarithmic.
In one embodiment, the voltage signal u_j output by each neuron unit satisfies the following relation:

C_n · du_j/dt + u_j/R_n = I_j + I_bias,j + I_tca,j

and the total current signal I_j is:

I_j = Σ_{i=1}^{m} G_xy · v_i

where C_n represents the capacitance of the capacitor in the n-th neuron unit, R_n represents the resistance of the resistor in the n-th neuron unit, n ∈ [1, m], and y = i, x = j.
In one embodiment, the bias unit comprises m electronic synapse devices arranged in a column array, the conductances of which are encoded to fixed values G_bias,j; under the action of a bias voltage v_bias, m fixed bias current signals I_bias,j = v_bias · G_bias,j are generated and correspondingly output to the inverting input terminals of the operational amplifiers in the neuron units.
In one embodiment, the electronic synapse device array employs a memristor, a resistive random access memory, a phase change memory, a magnetic random access memory, or a Flash memory.
In one embodiment, the bias unit comprises m current sources arranged in a column array for outputting m fixed bias current signals I_bias,j to the inverting input terminals of the operational amplifiers in the neuron units.
In one embodiment, each field effect transistor in the annealing module is an NMOS transistor; the transient annealing signal v_tca is loaded on the gate of each NMOS transistor, the initial excitation signals are loaded on the sources of the NMOS transistors at the initial stage of the iteration, and the voltage signals v_i output by the operational amplifier circuits of each stage in the activation function module are then correspondingly sent to the sources of the NMOS transistors;
the current signal output by each NMOS transistor is I_tca,j = g(v_i, v_tca) with i = j, where g represents the electrical characteristic equation of the NMOS transistor.
In one embodiment, each initial excitation signal is generated by a switch and a source of random initial excitation pulses.
In one embodiment, in the activation function module, the nonlinear activation function adopts a sigmoid, hard-sigmoid, tanh, hard-tanh or sign function.
In a second aspect, the present invention provides a method for operating a Hopfield network hardware circuit to solve an optimization problem, comprising the following steps:
(1) constructing an energy function according to the constraint conditions and the objective function of the combinatorial optimization problem, and calculating the fixed bias current signals I_bias,j and the weight information according to the energy function;
(2) correspondingly writing the bias current signals I_bias,j and the weight information into the bias unit and the weight unit of the synapse module;
(3) initializing the random initial excitation V_in,j and the transient annealing signal v_tca; the hardware circuit starts the iteration, and a brief random initial excitation V_in,j is then applied to the annealing module and the weight unit;
(4) loading the transient annealing signal v_tca on the annealing module; the hardware circuit iterates spontaneously until convergence, and the voltage signals output by the activation function module at convergence are the solution of the combinatorial optimization problem.
According to the operation method of the Hopfield network hardware circuit for solving an optimization problem described above, the operation function of the Hopfield network is constructed from electronic devices such as the electronic synapse array, the operational amplifiers and the multi-stage operational amplifier circuits, so that the energy efficiency of solving combinatorial optimization problems can be improved. Compared with existing computing-in-memory architectures, peripheral circuits such as ADCs and DACs are not needed, which effectively reduces the overhead, and no external clock signal is needed for control, which effectively improves the operation efficiency and realizes self-iteration. Meanwhile, the added annealing module comprising a plurality of field effect transistors realizes an efficient transient chaotic annealing algorithm and improves the convergence effect of the network.
Drawings
FIG. 1 is a block diagram of a Hopfield network hardware circuit for solving an optimization problem according to an embodiment of the present invention;
FIG. 2 is a schematic circuit diagram of the Hopfield network hardware circuit of FIG. 1;
FIGS. 3(a) and 3(b) are signal waveform diagrams of the linear and exponential transient annealing signals v_tca, respectively, provided by an embodiment of the present invention;
FIG. 4 is a waveform diagram of the output signals of the activation function module during the iterative process of solving a combinatorial optimization problem according to an embodiment of the present invention;
FIG. 5 is a diagram illustrating the results of a combinatorial optimization problem solved according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
It should be noted that the operation function used by the Hopfield network to solve the combinatorial optimization problem is as follows:

C · du_j/dt + u_j/R = Σ_i w_ji · v_i + I_j

v_j = f(u_j)

where du_j/dt represents the state change rule of the j-th neuron; u_j represents the state of the j-th neuron; Σ_i w_ji · v_i represents the information that the j-th neuron receives from the other neurons, realizing the interconnection among the neurons; and I_j represents the bias information of the j-th neuron itself.
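As a point of reference, the following is a minimal software sketch of this operation function, assuming simple Euler integration, a sigmoid activation f, and illustrative toy values for the weights w_ji and biases I_j; it illustrates the dynamics only, not the circuit implementation described below.

```python
import numpy as np

def hopfield_step(u, W, I, C=1.0, R=1.0, dt=1e-3,
                  f=lambda x: 1.0 / (1.0 + np.exp(-x))):
    """One Euler step of the continuous Hopfield dynamics
    C*du_j/dt + u_j/R = sum_i w_ji*v_i + I_j, with v_j = f(u_j)."""
    v = f(u)
    du = (W @ v + I - u / R) / C
    return u + dt * du, v

# toy usage: a 4-neuron network relaxing from a small random state
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)          # symmetric weights, no self-connection
I = rng.normal(size=4)
u = 0.1 * rng.normal(size=4)
for _ in range(5000):
    u, v = hopfield_step(u, W, I)
print(np.round(v, 3))             # converged neuron outputs v_j = f(u_j)
```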
In order to improve the energy efficiency of solving a combinatorial optimization problem by using a Hopfield network, the invention provides a hardware circuit implementation mode based on the operation function of the network, and as shown in FIGS. 1 and 2, the Hopfield network hardware circuit provided by the invention comprises a synapse module 10, an annealing module 20, a neuron module 30 and an activation function module 40.
The synapse module 10 provided in the present embodiment includes a bias unit 12 and a weight unit 14.
The bias unit 12 is used for emulating the generation of the bias signal I_j required by the neurons of the Hopfield network, i.e., for generating the m fixed bias current signals I_bias,j of this embodiment, where m represents the scale of the combinatorial optimization problem. Specifically, the bias unit 12 provided in this embodiment may adopt N electronic synapse devices arranged in a column array, with N > m; when a combinatorial optimization problem needs to be solved, m electronic synapse devices are selected from the bias unit 12 and their conductances are encoded to fixed values G_bias,j, and under the action of a bias voltage v_bias, m fixed bias current signals I_bias,j = v_bias · G_bias,j are generated and output to the neuron module 30. Preferably, the electronic synapse devices may be memristors, resistive random access memories, phase change memories, magnetic random access memories or Flash memories. Of course, to simplify the circuit of the bias unit 12, the bias unit 12 may also directly adopt N current sources arranged in a column array, m of which are selected to directly generate the fixed bias current signals I_bias,j.
The weight unit 14 is used for emulating the interconnection between the neurons in the Hopfield network. Specifically, the weight unit 14 provided in this embodiment comprises an electronic synapse device array; the device conductance G_xy of each electronic synapse device in the array stores the weight information of the Hopfield network, where the weights are calculated from the energy function of the network and the energy function is determined by the combinatorial optimization problem to be solved. The electronic synapse devices in each row of the array output a total current signal I_j under the action of the voltage signals v_i output by the activation function module 40, thereby performing the matrix-vector multiplication, and the result is output to the neuron module 30.
The subscripts i and j correspond to the i-th and j-th output signals, and the above process satisfies the following relation:

I_j = Σ_{i=1}^{m} G_xy · v_i

where G_xy represents the conductance of the electronic synapse device in row x and column y of the array, x ∈ [1, m], y ∈ [1, m], i ∈ [1, m], j ∈ [1, m], with y = i and x = j.
The neuron module 30 provided by this embodiment is used for realizing the function of the neurons in the Hopfield network. Specifically, the neuron module 30 comprises a plurality of neuron units, each comprising an operational amplifier and a resistor R and a capacitor C respectively connected between its output terminal and its inverting input terminal. Each neuron unit correspondingly receives the fixed bias current signal I_bias,j output by the bias unit 12, the total current signal I_j output by the electronic synapse devices in the corresponding row of the array, and the current signal I_tca,j output by the annealing module, and outputs a voltage signal u_j to the activation function module 40.
According to the virtual-short and virtual-open principles of the operational amplifier and Kirchhoff's current law, this process satisfies the following relation:

C_n · du_j/dt + u_j/R_n = I_j + I_bias,j + I_tca,j

where C_n represents the capacitance of the capacitor C in the n-th neuron unit and R_n represents the resistance of the resistor R in the n-th neuron unit.
The activation function module 40 provided in this embodiment is used for realizing the nonlinear activation function. It comprises multi-stage operational amplifier circuits commonly used in the art, receives the voltage signal u_j output by each neuron unit and, after the nonlinear activation function operation, outputs m voltage signals v_j. Specifically, the nonlinear activation function includes, but is not limited to, the sigmoid, hard-sigmoid, tanh, hard-tanh or sign function.
Denoting the nonlinear activation function by f, the foregoing process satisfies the following expression:

v_j = f(u_j)
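For illustration, common software definitions of the activation functions listed above are sketched below; the exact gains and clipping thresholds realized by the operational amplifier stages are design choices not fixed in the text, so the constants used here are assumptions.

```python
import numpy as np

# Illustrative software counterparts of the candidate activation functions;
# slopes and thresholds follow common textbook conventions, not circuit values.
def sigmoid(u):      return 1.0 / (1.0 + np.exp(-u))
def hard_sigmoid(u): return np.clip(0.5 * u + 0.5, 0.0, 1.0)
def tanh(u):         return np.tanh(u)
def hard_tanh(u):    return np.clip(u, -1.0, 1.0)
def sign(u):         return np.where(u >= 0, 1.0, -1.0)  # {-1,+1} convention

print(hard_sigmoid(np.array([-2.0, 0.0, 0.3, 2.0])))  # -> [0.  0.5 0.65 1. ]
```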
the annealing module 40 provided in this embodiment is used to implement a transient chaotic annealing algorithm in a Hopfield network, so as to improve the convergence effect thereof. Specifically, the annealing module 40 provided in this embodiment includes a plurality of field effect transistors, and the source of each field effect transistor and the electronic synapse device in each row of the array of electronic synapse devices are used to load a short random initial stimulus V at the beginning of the hardware circuit iteration in,j Giving neuron module 30 a randomized initial state in the hardware circuit prevents the circuit from converging on an illegal solution in spontaneous iterations. The gates of the subsequent FETs are used to anneal signal v in the transient state tca And corresponding voltage signal v i Under the action of the voltage source, m current signals I are generated tca,j And output to the neuron module 30. In particular, each random initial excitation V in,j Can be generated by a switch and a source of random initial excitation pulses.
Wherein the transient annealing signal v tca In the form of a signal that decreases with iteration time t, including but not limited to linear, exponential, or log, etc., as shown in FIGS. 3(a) and (b), such that the hardware circuit spontaneously iterates until the transient annealing signal v tca When the value decreases to a certain value, the annealing module 40 is closed, convergence is completed,at the moment, m voltage signals v output by the activation function module j Which is the solution solved by the combinatorial optimization problem.
Preferably, each field effect transistor in the annealing module 20 is an NMOS transistor; the transient annealing signal v_tca is loaded on the gate of each NMOS transistor, the initial excitation signals are loaded on the sources of the NMOS transistors at the initial stage of the iteration, and the voltage signals v_i output by the operational amplifier circuits of each stage in the activation function module are then correspondingly sent to the sources of the NMOS transistors. Denoting the electrical characteristic equation of each NMOS transistor by g, the current signal output by the drain of each NMOS transistor is:

I_tca,j = g(v_i, v_tca), with i = j
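As a rough behavioral sketch of this annealing branch, the snippet below assumes an exponentially decaying v_tca on the gate and a simplified long-channel square-law model for g; the threshold voltage, transconductance factor and time constant are illustrative assumptions rather than device values from the circuit.

```python
import numpy as np

def v_tca(t, v0=1.5, tau=2e-3):
    """Exponentially decaying transient annealing signal (one of the allowed forms)."""
    return v0 * np.exp(-t / tau)

def i_tca(v_i, v_gate, k=1e-4, v_th=0.5):
    """Simplified square-law NMOS drain current g(v_i, v_tca) in saturation,
    with the source tied to the neuron output v_i and the gate driven by v_tca."""
    v_gs = v_gate - v_i
    v_ov = np.maximum(v_gs - v_th, 0.0)   # transistor turns off once v_gs < v_th
    return k * v_ov ** 2

t = np.linspace(0.0, 10e-3, 5)
print(i_tca(v_i=0.2, v_gate=v_tca(t)))    # annealing current shrinking toward zero
```

As the gate signal decays, the overdrive voltage collapses and the branch current vanishes, which is the behavior the circuit relies on to switch the annealing contribution off.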
In the Hopfield network hardware circuit for solving an optimization problem described above, the operation function of the Hopfield network is constructed from electronic devices such as the electronic synapse array, the operational amplifiers and the multi-stage operational amplifier circuits, so that the energy efficiency of solving combinatorial optimization problems can be improved. Compared with existing computing-in-memory architectures, peripheral circuits such as ADCs and DACs are not needed, which effectively reduces the overhead, and no external clock signal is needed for control, which effectively improves the operation efficiency and realizes self-iteration. Meanwhile, the added annealing module comprising a plurality of field effect transistors realizes an efficient transient chaotic annealing algorithm and improves the convergence effect of the network.
An embodiment of the present invention further provides an operation method of the above Hopfield network hardware circuit for solving an optimization problem, which comprises steps S10 to S40, detailed as follows.
S10, constructing an energy function according to the constraint conditions and the objective function of the combinatorial optimization problem, and calculating the fixed bias current signals I_bias,j and the weight information according to the energy function.
In step S10, the energy function takes the standard quadratic Hopfield form:

E(v) = -(1/2) · Σ_j Σ_i w_ji · v_i · v_j - Σ_j I_j · v_j

From the energy function E(v_i, v_j), the fixed bias current signals are calculated as

I_bias,j = -∂E/∂v_j evaluated at v = 0

and the weight information is calculated as

w_ji = -∂²E/(∂v_i ∂v_j)
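The mapping from an energy function to the network parameters can be illustrated with the short sketch below, which recovers w_ji = -∂²E/(∂v_i ∂v_j) and I_bias,j = -∂E/∂v_j at v = 0 by numerical differentiation; the quadratic toy energy used for the check is an assumption for demonstration only.

```python
import numpy as np

def hopfield_params_from_energy(E, m, eps=1e-4):
    """Numerically extract Hopfield weights and biases from a quadratic
    energy function E(v): W[j, i] = -d2E/dvi dvj, I[j] = -dE/dvj at v = 0."""
    v0 = np.zeros(m)
    I = np.zeros(m)
    W = np.zeros((m, m))
    for j in range(m):
        e_j = np.eye(m)[j]
        I[j] = -(E(v0 + eps * e_j) - E(v0 - eps * e_j)) / (2 * eps)
        for i in range(m):
            e_i = np.eye(m)[i]
            d2 = (E(v0 + eps*e_i + eps*e_j) - E(v0 + eps*e_i - eps*e_j)
                  - E(v0 - eps*e_i + eps*e_j) + E(v0 - eps*e_i - eps*e_j)) / (4 * eps**2)
            W[j, i] = -d2
    return W, I

# toy check on E(v) = -0.5 v^T Q v - b^T v: recovers W = Q and I = b
Q = np.array([[0.0, 2.0], [2.0, 0.0]])
b = np.array([1.0, -1.0])
E = lambda v: -0.5 * v @ Q @ v - b @ v
W, I = hopfield_params_from_energy(E, 2)
print(np.round(W, 3), np.round(I, 3))
```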
S20, correspondingly writing the calculated bias current signals I_bias,j and the weight information into the bias unit and the weight unit of the synapse module.
In step S20, the calculated weight information may be directly written into an electronic synapse device array of the required size in the weight unit; the required array size is m × m, where m represents the scale of the combinatorial optimization problem.
For the bias current signals I_bias,j, when the bias unit 12 is a column array of electronic synapse devices, a bias voltage v_bias may be selected and the calculated bias current signals I_bias,j divided by the bias voltage v_bias to obtain the conductance values G_bias,j to be encoded, which are then written into electronic synapse devices of the required array size in the bias unit; the required array size is m × 1. When the bias unit 12 adopts current sources, the calculated bias current signals I_bias,j are used directly to select the corresponding current sources.
S30, initializing the random initial excitation V_in,j and the transient annealing signal v_tca; the hardware circuit starts the iteration, and a brief random initial excitation V_in,j is then applied to the annealing module and the weight unit.
S40, loading the transient annealing signal v_tca on the annealing module; the hardware circuit iterates spontaneously until convergence, and the voltage signals output by the activation function module at convergence are the solution of the combinatorial optimization problem.
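Steps S30 and S40 can be mimicked in software as a behavioral simulation, sketched below under simplifying assumptions: ideal devices, a sigmoid-like activation, an exponentially decaying annealing strength standing in for the transistor branch, and illustrative component values. It demonstrates the spontaneous-iteration flow rather than the actual circuit.

```python
import numpy as np

def solve_with_annealing(W, I_bias, steps=10000, dt=1e-6, C=1e-8, R=1e4,
                         beta=20.0, k_anneal=5e-4, tau=2e-3, seed=0):
    """Behavioral simulation of steps S30-S40: a brief random initial excitation,
    then spontaneous iteration while a decaying self-feedback term (a simplified
    stand-in for the annealing transistors) shrinks to zero."""
    rng = np.random.default_rng(seed)
    u = 1e-3 * rng.normal(size=len(I_bias))           # S30: random initial excitation
    f = lambda x: 1.0 / (1.0 + np.exp(-beta * x))     # assumed sigmoid-like activation
    for n in range(steps):                            # S40: spontaneous iteration
        z = k_anneal * np.exp(-n * dt / tau)          # decaying annealing strength
        v = f(u)
        i_tca = -z * (v - 0.5)                        # simplified transient chaotic feedback
        du = (W @ v + I_bias + i_tca - u / R) / C
        u = u + dt * du
    return f(u)                                       # readout at convergence

# usage: a 2-neuron toy with mutual inhibition; W and I_bias would come from step S10
W = np.array([[0.0, -2e-4], [-2e-4, 0.0]])
I_bias = np.array([1e-4, 1e-4])
print(np.round(solve_with_annealing(W, I_bias), 2))   # settles to [1, 0] or [0, 1]
```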
In the operation method of the Hopfield network hardware circuit for solving an optimization problem provided by this embodiment, the operation function of the Hopfield network is constructed from electronic devices such as the electronic synapse array, the operational amplifiers and the multi-stage operational amplifier circuits, so that the energy efficiency of solving combinatorial optimization problems can be improved. Compared with existing computing-in-memory architectures, peripheral circuits such as ADCs and DACs are not needed, which effectively reduces the overhead, and no external clock signal is needed for control, which effectively improves the operation efficiency and realizes self-iteration. Meanwhile, the added annealing module comprising a plurality of field effect transistors realizes an efficient transient chaotic annealing algorithm and improves the convergence effect of the network.
In order to illustrate the invention more clearly, a specific example is given below.
Take the 4-city traveling salesman problem as an example. The solution is represented by whether city x (or y) is visited at order p (or q), with p, q, r, s ≤ 4, so the problem scale is m = 16. d_rs represents the distance between city r and city s. The constraints of the traveling salesman problem are: only one city can be visited at a time, each city can be visited only once, and every city must be visited. The objective function is that the total path through all the cities is the shortest. Let A, B, C, D be four hyper-parameters representing the strengths of the constraint conditions and of the objective function; the Hopfield energy function corresponding to this combinatorial optimization problem can then be written in the standard Hopfield-Tank form:
E = (A/2) · Σ_x Σ_p Σ_{q≠p} v_xp · v_xq + (B/2) · Σ_p Σ_x Σ_{y≠x} v_xp · v_yp + (C/2) · (Σ_x Σ_p v_xp - 4)² + (D/2) · Σ_x Σ_{y≠x} Σ_p d_xy · v_xp · (v_y,p+1 + v_y,p-1)

Thus, according to the general formulas above, the weight information is calculated as

w_(xp),(yq) = -A · δ_xy · (1 - δ_pq) - B · δ_pq · (1 - δ_xy) - C - D · d_xy · (δ_q,p+1 + δ_q,p-1)

where δ is the Kronecker delta, and this weight matrix is mapped proportionally into the conductances of the electronic synapse array in the weight unit. The bias signal is

I_bias,(xp) = 4C

and, when the bias unit adopts a current source array, the bias current signals are set directly according to this value.
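For completeness, the standard Hopfield-Tank construction of the 16-neuron weight matrix and bias vector used in this example can be sketched as follows; the flattening convention index(x, p) = x·n + p, the distance matrix and the A, B, C, D values are illustrative assumptions.

```python
import numpy as np

def tsp_hopfield_params(d, A, B, C, D):
    """Standard Hopfield-Tank mapping of an n-city TSP onto weights and biases
    for m = n*n neurons, flattened as index(x, p) = x*n + p (city x, order p)."""
    n = d.shape[0]
    m = n * n
    W = np.zeros((m, m))
    for x in range(n):
        for p in range(n):
            for y in range(n):
                for q in range(n):
                    w = -C
                    if x == y and p != q:
                        w += -A                      # one order position per city
                    if p == q and x != y:
                        w += -B                      # one city per order position
                    if x != y and (q == (p + 1) % n or q == (p - 1) % n):
                        w += -D * d[x, y]            # tour-length term (cyclic tour)
                    W[x * n + p, y * n + q] = w
    I_bias = C * n * np.ones(m)
    return W, I_bias

# usage: 4 cities with an assumed distance matrix and hyper-parameters
d = np.array([[0, 1, 2, 1],
              [1, 0, 1, 2],
              [2, 1, 0, 1],
              [1, 2, 1, 0]], dtype=float)
W, I_bias = tsp_hopfield_params(d, A=1.0, B=1.0, C=1.0, D=0.5)
print(W.shape, I_bias[:4])   # (16, 16) weight matrix and 4*C biases
```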
The transient annealing signal v_tca is chosen to be of the exponential type, and the nonlinear activation function in the activation function module is of the hard-sigmoid type. The waveforms of the output signals of the activation function module during the iteration for solving the combinatorial optimization problem, after the random initial excitation, are shown in FIG. 4: v_2, v_7, v_12 and v_13 converge to the high level and the rest converge to the low level, where "1" corresponds to the high level and "0" corresponds to the low level. The corresponding city visiting order is therefore: city 4 is visited first (corresponding to v_13), city 1 second (corresponding to v_2), city 3 third (corresponding to v_12), and city 2 fourth (corresponding to v_7). The coordinates of the cities and the visiting order are shown in FIG. 5, from which it is easy to see that the obtained solution is the optimal solution.
It will be understood by those skilled in the art that the foregoing is only an exemplary embodiment of the present invention, and is not intended to limit the invention to the particular forms disclosed, since various modifications, substitutions and improvements within the spirit and scope of the invention are possible and within the scope of the appended claims.

Claims (10)

1. A Hopfield network hardware circuit for solving an optimization problem is characterized by comprising a synapse module, an annealing module, a neuron module and an activation function module;
a synapse module comprising a bias unit and a weight unit, the bias unit being used for outputting m fixed bias current signals I_bias,j, where m represents the scale of the combinatorial optimization problem; the weight unit comprises an electronic synapse device array, the conductance G_xy of each electronic synapse device being obtained by calculation from an energy function determined by the combinatorial optimization problem, and the electronic synapse devices in each row of the array outputting a total current signal I_j under the action of the voltage signals v_i output by the activation function module; G_xy represents the conductance of the electronic synapse device in row x and column y of the array, x ∈ [1, m], y ∈ [1, m]; the subscripts i and j correspond to the i-th and j-th output signals, i ∈ [1, m], j ∈ [1, m];
a neuron module comprising a plurality of neuron units, each neuron unit comprising an operational amplifier and a resistor and a capacitor respectively connected between the output terminal and the inverting input terminal of the operational amplifier; each neuron unit is used for correspondingly receiving the fixed bias current signal I_bias,j output by the bias unit, the total current signal I_j, and the current signal I_tca,j output by the annealing module, and for outputting a voltage signal u_j;
an activation function module comprising multi-stage operational amplifier circuits, used for receiving the voltage signal u_j output by each neuron unit and outputting m voltage signals v_j after the operation of a nonlinear activation function f;
an annealing module comprising a plurality of field effect transistors, wherein the source of each field effect transistor and the electronic synapse devices in each row of the electronic synapse device array are loaded with a brief random initial excitation V_in,j at the beginning of the iteration of the hardware circuit, and the gates of the field effect transistors are then used to generate m current signals I_tca,j under the action of the transient annealing signal v_tca and the corresponding voltage signals v_i; the transient annealing signal v_tca takes a signal form that decreases with the iteration time t, so that the hardware circuit iterates spontaneously until convergence, and the m voltage signals v_j output by the activation function module at convergence are the solution of the combinatorial optimization problem.
2. The Hopfield network hardware circuit for solving an optimization problem of claim 1, wherein, in the annealing module, the signal form of the transient annealing signal v_tca is linear, exponential or logarithmic.
3. The Hopfield network hardware circuit for solving an optimization problem of claim 1 or 2, wherein the voltage signal u_j output by each neuron unit satisfies the following relation:

C_n · du_j/dt + u_j/R_n = I_j + I_bias,j + I_tca,j

and the total current signal I_j is:

I_j = Σ_{i=1}^{m} G_xy · v_i

where C_n represents the capacitance of the capacitor in the n-th neuron unit, R_n represents the resistance of the resistor in the n-th neuron unit, n ∈ [1, m], and y = i, x = j.
4. The Hopfield network hardware circuit for solving an optimization problem of claim 1, wherein the bias unit comprises m electronic synapse devices arranged in a column array, the conductances of which are encoded to fixed values G_bias,j; under the action of a bias voltage v_bias, m fixed bias current signals I_bias,j = v_bias · G_bias,j are generated and correspondingly output to the inverting input terminals of the operational amplifiers in the neuron units.
5. The Hopfield network hardware circuit for solving the optimization problem of claim 1 or 4, wherein the electronic synapse device array employs memristors, resistive random access memories, phase change memories, magnetic random access memories or Flash memories.
6. The Hopfield network hardware circuit for solving an optimization problem of claim 1, wherein the bias unit comprises m current sources arranged in a column array for outputting m fixed bias current signals I_bias,j to the inverting input terminals of the operational amplifiers in the neuron units.
7. The Hopfield network hardware circuit for solving an optimization problem of claim 1, wherein each field effect transistor in the annealing module is an NMOS transistor; the transient annealing signal v_tca is loaded on the gate of each NMOS transistor, the initial excitation signals are loaded on the sources of the NMOS transistors at the initial stage of the iteration, and the voltage signals v_i output by the operational amplifier circuits of each stage in the activation function module are then correspondingly sent to the sources of the NMOS transistors;
wherein the current signal output by each NMOS transistor is I_tca,j = g(v_i, v_tca) with i = j, and g represents the electrical characteristic equation of the NMOS transistor.
8. The Hopfield network hardware circuit for solving an optimization problem of claim 1 or 7, wherein each initial excitation signal is generated by a switch and a source of random initial excitation pulses.
9. The Hopfield network hardware circuit of claim 1, wherein in the activation function module, the nonlinear activation function is a sigmoid, hard-sigmoid, tanh, hard-tanh, or sign function.
10. A method of operating a Hopfield network hardware circuit to solve an optimization problem according to any one of claims 1 to 9, comprising the steps of:
(1) constructing an energy function according to the constraint conditions and the objective function of the combinatorial optimization problem, and calculating the fixed bias current signals I_bias,j and the weight information according to the energy function;
(2) correspondingly writing the bias current signals I_bias,j and the weight information into the bias unit and the weight unit of the synapse module;
(3) initializing the random initial excitation V_in,j and the transient annealing signal v_tca, the hardware circuit starting the iteration, and a brief random initial excitation V_in,j then being applied to the annealing module and the weight unit;
(4) loading the transient annealing signal v_tca on the annealing module, the hardware circuit iterating spontaneously until convergence, the voltage signals output by the activation function module at convergence being the solution of the combinatorial optimization problem.
CN202210675728.9A 2022-06-15 2022-06-15 Hopfield network hardware circuit for solving optimization problem and operation method Active CN115062583B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210675728.9A CN115062583B (en) 2022-06-15 2022-06-15 Hopfield network hardware circuit for solving optimization problem and operation method


Publications (2)

Publication Number Publication Date
CN115062583A true CN115062583A (en) 2022-09-16
CN115062583B CN115062583B (en) 2024-05-31

Family

ID=83200794

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210675728.9A Active CN115062583B (en) 2022-06-15 2022-06-15 Hopfield network hardware circuit for solving optimization problem and operation method

Country Status (1)

Country Link
CN (1) CN115062583B (en)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200192598A1 (en) * 2018-12-18 2020-06-18 Hewlett Packard Enterprise Development Lp Adiabatic Annealing Scheme and System for Edge Computing
DE102019134370A1 (en) * 2018-12-18 2020-06-18 Hewlett Packard Enterprise Development Lp ADIABATIC GLOWING DIAGRAM AND SYSTEM FOR EDGE COMPUTING
CN109977470A (en) * 2019-02-20 2019-07-05 华中科技大学 A kind of circuit and its operating method based on memristor Hopfield neural fusion sparse coding
CN110097182A (en) * 2019-04-10 2019-08-06 常州大学 Circuit is realized with the three-dimensional Hopfield neural network model of neuron activation gradient λ control
CN112396176A (en) * 2020-11-11 2021-02-23 华中科技大学 Hardware neural network batch normalization system
CN113469334A (en) * 2021-06-29 2021-10-01 中国地质大学(武汉) Memristor recurrent neural network circuit
CN114386593A (en) * 2021-12-17 2022-04-22 上海工程技术大学 Method for processing TSP problem based on improved particle swarm optimization and dynamic step size neural network

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Hu Yingchun; Li Shangping; Liao Tinghua: "Research on Structural Optimization Algorithms Based on Hopfield Neural Networks", China Mechanical Engineering, no. 12, 25 June 2007 (2007-06-25) *
Xu Nan; Xu Yaoqun: "Simulated Annealing Strategy of Inverse Trigonometric Function Chaotic Neural Networks", Journal of Harbin University of Commerce (Natural Science Edition), no. 06, 15 December 2011 (2011-12-15) *

Also Published As

Publication number Publication date
CN115062583B (en) 2024-05-31

Similar Documents

Publication Publication Date Title
CN112183739B (en) Hardware architecture of memristor-based low-power-consumption pulse convolution neural network
US9646243B1 (en) Convolutional neural networks using resistive processing unit array
US9779355B1 (en) Back propagation gates and storage capacitor for neural networks
AU2020274862B2 (en) Training of artificial neural networks
US20200293855A1 (en) Training of artificial neural networks
CN117636945B (en) 5-bit signed bit AND OR accumulation operation circuit and CIM circuit
Greenberg-Toledo et al. Supporting the momentum training algorithm using a memristor-based synapse
Jing et al. VSDCA: A voltage sensing differential column architecture based on 1T2R RRAM array for computing-in-memory accelerators
Zheng et al. Hardware-friendly actor-critic reinforcement learning through modulation of spike-timing-dependent plasticity
Zhang et al. Memristive circuit design of quantized convolutional auto-encoder
US11556770B2 (en) Auto weight scaling for RPUs
Bhattacharya et al. Computing High-Degree Polynomial Gradients in Memory
CN115983358A (en) Hardware implementation method of Bellman equation based on strategy iteration
CN115062583A (en) Hopfield network hardware circuit for solving optimization problem and operation method
CN114093394B (en) Rotatable internal computing circuit and implementation method thereof
Wei et al. Neuromorphic computing systems with emerging devices
Bo et al. A circuit architecture for analog on-chip back propagation learning with local learning rate adaptation
Narayanan et al. Neuromorphic technologies for next-generation cognitive computing
Le et al. CIMulator: a comprehensive simulation platform for computing-in-memory circuit macros with low bit-width and real memory materials
Skolota et al. Overview of technical means of implementation of neuro-fuzzy-algorithms for obtaining the quality factor of electric power
Soures et al. Enabling on-device learning with deep spiking neural networks for speech recognition
Tan et al. Enhancing in-situ updates of quantized memristor neural networks: a Siamese network learning approach
Zhang et al. A 28nm 15.09 nJ/inference Neuromorphic Processor with SRAM-Based Charge Domain in-Memory-Computing
Uenohara et al. A Trainable Synapse Circuit Using a Time-Domain Digital-to-Analog Converter
US20230419092A1 (en) Crossbar arrays implementing truth tables

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant