WO2018180499A1 - Neural network structure, electronic circuit, information processing system, method, and program - Google Patents

Neural network structure, electronic circuit, information processing system, method, and program Download PDF

Info

Publication number
WO2018180499A1
Authority
WO
WIPO (PCT)
Prior art keywords
neuron
unit
neural network
network structure
synapse
Prior art date
Application number
PCT/JP2018/010011
Other languages
French (fr)
Japanese (ja)
Inventor
正之 廣口
Original Assignee
株式会社日本人工知能研究開発機構 (Japan Artificial Intelligence Research and Development Organization, Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日本人工知能研究開発機構 (Japan Artificial Intelligence Research and Development Organization, Inc.)
Priority to JP2019509227A priority Critical patent/JP6712399B2/en
Publication of WO2018180499A1 publication Critical patent/WO2018180499A1/en

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N5/00 Computing arrangements using knowledge-based models
    • G06N5/02 Knowledge representation; Symbolic representation

Definitions

  • the present invention relates to a neural network structure, an electronic circuit, an information processing system, a method, and a program.
  • nerve cells in the human brain (hereinafter referred to as neurons) carry out functions such as human recognition, judgment, and memory.
  • a neuron memorizes an event that is input repeatedly, recognizes whether a new input is the same event, and outputs a signal in reaction when it is.
  • the more often information is input, the more easily it is memorized, and the more readily the neuron reacts and outputs a signal when that information is input again.
  • a neuron is connected to many other neurons via synapses, and transmission from neuron to neuron occurs in response to input information.
  • when input information matches stored information, the neuron outputs a signal to other neurons.
  • the connection between a given neuron and its neighbors becomes stronger with repetition. For this reason, when information that has been input repeatedly is input again, transmission from the given neuron occurs more easily, and the neuron reacts and outputs signals to other neurons more readily.
  • a neural network is a model of the function of a neuron using mathematical formulas and circuits.
  • a neuron has the function of emitting a pulse (firing) when the potential applied by input signals exceeds a threshold value.
  • such a function is realized using a nonlinear function, a capacitor, or the like. The synapses connecting neurons have differing temporal and spatial propagation efficiencies, and the connections between neurons and their propagation efficiencies change according to the results of recognition, judgment, and memory.
  • this can be modeled by weighting the output value of the nonlinear function, or the filling rate of the capacitor, with the coupling strength (weighting coefficient), and substituting the result into the nonlinear function of the subsequent neuron or adding it to its filling rate.
  • a hierarchical neural network is a typical model.
  • the hierarchical neural network includes an input layer of input-layer neurons that receive input signals from the outside, an output layer of output-layer neurons that transmit output signals to the outside, and one or more intermediate layers of intermediate-layer neurons between the input-layer and output-layer neurons.
  • Patent Document 1 discloses a learning system and method for a hierarchical neural network that realizes accurate and high-speed learning of a neural network.
  • This hierarchical neural network learning system selects, from among the routes connected to each neuron of each map constituting each layer, the route with the highest degree of matching between the propagating signal and the coupling-strength value, and treats the other routes as sparse in units of the map. The input signal given to the hierarchical neural network is propagated in the forward direction to obtain an output signal, the resulting output signal is compared with the teaching signal paired with the input signal, and the degree of matching between the signal propagating on the selected route and the coupling-strength value is adjusted according to the degree of coincidence.
  • the back-propagation method is known as a learning method for synaptic weighting coefficients that achieves a high rate of correct answers.
  • since it is necessary to learn from a large amount of data, it is difficult and time-consuming to apply a hierarchical neural network to a new application field.
  • the application areas of the hierarchical neural network are therefore limited to specific domains that can be handled by individual algorithms (so-called weak AI), such as games, image recognition, automated driving, and investment judgment.
  • weak AI is suited to arithmetic and mathematical operations.
  • one neuron can be associated with a concept, but that neuron cannot be given a so-called meaning, such as the shape, properties, features, and name related to the concept (the symbol grounding problem).
  • the excited state is also called firing or stimulation.
  • the excited state can propagate one after another from the input to the output of the entire system, but once the final output is reached, no further results can be obtained.
  • sequential execution, in which subsequent steps are selected according to whether a procedure has been executed and whether its execution result conforms to a series of timed procedures, is not possible. That is, procedural processing can be executed by a program but cannot be executed by a conventional neural network.
  • a neural network structure simulating the mechanism of neural circuits in the human brain, an electronic circuit including the neural network structure, and an information processing system, method, and program are provided.
  • a plurality of neuron units and synapse units connecting the plurality of neuron units are provided, and another neuron unit connected to one neuron unit via a synapse unit fires according to the sum obtained by adding, one or more times, the product of the firing or non-firing output value of the one neuron unit and the weighting coefficient of the synapse unit.
  • a neural network structure is provided that has a loop structure in which the output value produced by the firing of one neuron unit propagates back to that neuron unit itself via other neuron units and synapse units.
  • when a neuron unit included in the loop structure receives an output value, that is, a stimulus produced by firing, the stimulus propagates around the loop structure one after another. The structure is thus made to function so that the stimulus continues to propagate within the loop, and each neuron unit in the loop is repeatedly excited at a fixed time interval every time the stimulus comes around.
  • a neural network structure that can function as described above can be provided.
  • the neural network structure described above may be combined with another neural network structure that receives, as an input value, the output value emitted when a neuron unit fires as the stimulus propagates around the loop structure. In the other neural network structure, which receives stimuli from a neuron unit in the loop but is not itself part of the loop, the excitation of its neuron units can be raised each time the stimulus is repeatedly received.
  • a series of neuron units may be characterized by receiving input from firing output values and expressing a predetermined function when the sum reaches or exceeds a threshold value. In other neural network structures outside the loop that receive stimuli from a neuron unit in the loop, the excitation of the neuron units gradually increases as stimuli are repeatedly received, the number of neuron units whose excitation exceeds the threshold and fire increases, a series of neuron units through which the stimulus propagates over a wider range can be formed, and a predetermined function can be expressed.
  • a series of connected neuron units is formed in the neural network structure and expresses a predetermined function, so that a neuron unit related to a concept becomes connected to neuron units related to concepts associated with that concept.
  • this group of related concepts can give a so-called "meaning" within the neural network structure.
  • a logical product (AND) calculation unit that calculates the logical product of one neuron unit and one or more other neuron units may be provided.
  • a logical sum (OR) calculation unit that calculates the logical sum of one neuron unit and one or more other neuron units may be provided.
  • a logical negation (NOT) calculation unit that calculates the logical negation of one neuron unit may be provided.
  • a series of neuron units expressing higher-order functions may be formed by any one, or a combination, of the logical product, logical sum, and logical negation calculation units. In this way, by operating on neuron units, a series of neuron units expressing more complicated functions can be formed.
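As an illustrative sketch (not taken from the patent itself), the logical product, logical sum, and logical negation calculation units above can be mimicked with simple threshold neurons. The threshold of 8 and the weight values are assumptions chosen to match the firing condition ("the sum is 8 or more") described later in the document:

```python
THRESHOLD = 8  # assumed firing threshold, matching the example later in the text

def neuron(inputs_and_weights, bias=0):
    """Fire (return 1) when the weighted input sum reaches THRESHOLD."""
    s = bias + sum(x * w for x, w in inputs_and_weights)
    return 1 if s >= THRESHOLD else 0

def AND(a, b):
    # each input alone (weight 4) stays below the threshold; both together reach 8
    return neuron([(a, 4), (b, 4)])

def OR(a, b):
    # either input alone (weight 9) reaches the threshold
    return neuron([(a, 9), (b, 9)])

def NOT(a):
    # a constant bias keeps the neuron firing unless the input suppresses it
    return neuron([(a, -9)], bias=8)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b), "NOT a:", NOT(a))
```

The same pattern extends to combinations of these units, as the text suggests, by feeding one unit's output into another's input list.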
  • a series of neuron units expressing one function may be connected in advance to one or more series of neuron units expressing other functions, and may fire when the sum reaches or exceeds a threshold value. This makes it easy to fire a higher-order "meaning", that is, a broader concept group, by firing the "meanings" that are its related concept groups.
  • a first neuron unit, a second neuron unit, and other neuron units may be provided, together with an oscillating neuron unit: a structure in which the output value produced by the firing of the second neuron unit propagates back to the second neuron unit itself via the other neuron units.
  • the synapse unit connected in the direction from the first neuron unit to the second neuron unit has a firing weighting coefficient that fires the second neuron unit in a single summation.
  • the synapse units connecting the neuron units within the oscillating neuron unit likewise have firing weighting coefficients that fire each neuron unit in a single summation, while the synapse unit connected in the direction from a neuron unit in the oscillating neuron unit to the first neuron unit
  • has a non-firing weighting coefficient that, being below the threshold, does not fire the first neuron unit; together these may form an oscillating unit. In this way, the first neuron unit, once excited by a stimulus, can maintain a potential excitation state for a certain period of time.
  • a long-term memory unit composed of a series of neuron units and a short-term memory unit composed of a loop structure and one or more oscillating units may be provided. In this way, long-term and short-term memory can be held by neuron units.
  • a first neural network structure including a plurality of neuron units, a second neural network structure including a plurality of neuron units, and a synapse unit that connects a neuron unit included in the first neural network structure
  • with a neuron unit included in the second neural network structure may further be provided. By providing synapse units that connect neuron units included in different neural network structures holding long-term or short-term memory, it becomes possible to link neural network structures of different systems.
  • the first neural network structure may be a master having a function of controlling one or more second neural network structures,
  • and the second neural network structure may be a slave functioning under the control of the first neural network structure.
  • one main neural network structure can thereby control other neural network structures, forming a larger neural network structure governed by the hierarchical structure.
  • the first neural network structure may have a general-purpose or planning function,
  • and the second neural network structure may have an individual or execution function.
  • individual or execution functions can then be expressed under a neural network structure that expresses general-purpose or planning functions.
  • the weighting coefficient of the synapse unit connecting a neuron unit included in the first neural network structure with a neuron unit included in the second neural network structure may increase when the neuron units at both ends of the synapse unit fire simultaneously. This promotes strong connections between different series of neuron units or between different neural network structures, making it easier to form a series of neuron units that expresses a predetermined function.
  • a series of neuron units may receive input from the firing output values of neuron units belonging to neural network structures other than the one to which it belongs,
  • the sum may reach or exceed a threshold value, and a predetermined function may be expressed.
  • output information can thus be obtained by giving related information as an additional stimulus to a series of neuron units in a quasi-excited state, needing only a little more input to produce output.
  • a series of neuron units whose sum reaches the threshold by receiving input from the firing output values of neuron units belonging to neural network structures other than its own
  • can be compared with the neuron units that fire in the series when input is received only from the firing output values of neuron units belonging to its own neural network structure,
  • and evaluation is performed by determining whether the coincidence ratio of the fired neuron units is at least a predetermined value. In this way, the validity of the output information produced by the associative function can be ensured.
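A minimal sketch of the coincidence-ratio evaluation described above, assuming fired neuron units are represented as sets of identifiers and taking 0.8 as the predetermined value (the representation and the numeric value are assumptions; the patent does not fix them):

```python
def coincidence_ratio(fired_self, fired_with_external):
    """Fraction of the self-only fired units that also fire when
    external input from another neural network structure is received.

    fired_self: set of unit ids fired with input from the own structure only.
    fired_with_external: set of unit ids fired when external input is added.
    """
    if not fired_self:
        return 0.0
    return len(fired_self & fired_with_external) / len(fired_self)

def is_valid(fired_self, fired_with_external, required=0.8):
    # hypothetical acceptance rule: the associative output is accepted
    # when the coincidence ratio is at least the predetermined value
    return coincidence_ratio(fired_self, fired_with_external) >= required

print(is_valid({1, 2, 3, 4, 5}, {1, 2, 3, 4, 9}))  # 4/5 = 0.8 -> True
```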
  • when at least one neuron unit in a first series of neuron units expressing a predetermined function and at least one neuron unit in a second series of neuron units expressing a different predetermined function fire simultaneously,
  • the weighting coefficient of the synapse unit connecting the first series and the second series may increase. This promotes strong connections between different series of neuron units or between different neural network structures, making it easier to relate the neural network structures to one another.
  • an electronic circuit including the above neural network structure includes arithmetic elements constituting the neuron units and synapse units, storage elements storing the weighting coefficients, and adder circuits calculating the sums.
  • an electronic circuit capable of forming a neural network structure including a loop structure can thus be provided.
  • the memory element may be implemented as a memristor element.
  • a storage element that can take not only the digital values 0 and 1 but also multiple values can be formed from a single memristor element, a passive element that stores its value as an electrical resistance.
  • the stored value can be increased or decreased according to the accumulated current that has flowed through the element, and since the memristor's memory is non-volatile, the value is retained even when the power is turned off.
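The behaviour attributed here to the memristor element, a multi-valued store whose value shifts with accumulated current and persists without power, can be sketched with a toy model. The value range and linear update rule are illustrative assumptions, not a physical device model:

```python
class Memristor:
    """Toy model of a memristor-like storage element.

    The stored value rises or falls with the accumulated charge
    (current x time) that has flowed through the element, clamped to a
    fixed range; persistence stands in for non-volatility here.
    """
    def __init__(self, value=0.5, lo=0.0, hi=1.0):
        self.value = value
        self.lo, self.hi = lo, hi

    def apply_current(self, current, dt=1.0):
        # accumulated charge shifts the stored value; reversing the
        # current direction decreases it instead of increasing it
        self.value = min(self.hi, max(self.lo, self.value + current * dt))
        return self.value

m = Memristor()
m.apply_current(+0.3)     # strengthen: 0.5 -> 0.8
m.apply_current(-0.1)     # weaken: 0.8 -> 0.7
print(round(m.value, 2))  # 0.7
```

In the circuit described by the patent, such an element would hold a synapse unit's weighting coefficient; the multi-valued range is what lets one element store a graded coupling strength rather than a single bit.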
  • the electronic circuit may include an input unit that receives optical, acoustic, electrical, or physical-operation input from the outside and converts it into an electrical signal,
  • and an output unit that converts a signal into optical, acoustic, electrical, or physical-operation output; the input unit inputs the electrical signal to a neuron unit included in the neural network structure.
  • a so-called conceptual neuron unit having visual, auditory, conceptual, or physical meaning corresponding to various external inputs is formed, and neuron units related to this neuron unit are connected.
  • this gives the conceptual neuron unit a broader and richer meaning, realizes a neural network structure grounded in the meanings of the real world, and provides an information processing system in which the output of the neural network structure can act on the outside world.
  • a method is provided for propagating firing through neuron units and synapse units: the first neuron unit fires and its firing propagates to the second neuron unit via a synapse unit; the (N-1)th neuron unit fires when the firing from the (N-2)th neuron unit propagates to it, and propagates the firing to the Nth neuron unit via a synapse unit; and the Nth neuron unit fires when the firing from the (N-1)th neuron unit propagates to it (N being a natural number of 3 or more), and propagates the firing to the first neuron unit via a synapse unit.
  • this provides a method in which the stimulus continues to propagate, and each neuron unit in the neural network structure (loop structure) can be made to fire repeatedly at a fixed time interval every time the continuously propagating stimulus comes around.
  • a program can be provided for executing a simulation of the neural network structure on a computer, comprising a storage step that stores the weighting coefficients, an addition step that calculates the sums, and a processing step that simulates, based on the stored weighting coefficients and the calculated sums, whether one neuron unit propagates or does not propagate its firing to another neuron unit via a synapse unit. In this way, a program can be provided that simulates a neural network structure including a loop structure on an ordinary von Neumann computer.
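The three steps of the program described above (storage, addition, processing) might be sketched as follows. The threshold of 8 and the weight values are assumptions consistent with the firing condition given later in the document, and the tick-based update is one possible simulation scheme, not the patent's specification:

```python
from collections import defaultdict

class Network:
    """Sketch of the described simulation program: a storage step
    (weighting coefficients), an addition step (sums), and a processing
    step (propagate firing when a unit's sum reaches the threshold)."""

    def __init__(self, threshold=8):
        self.weights = {}                  # storage step: (src, dst) -> weight
        self.potential = defaultdict(int)  # accumulated sum per neuron unit
        self.threshold = threshold

    def connect(self, src, dst, weight):
        self.weights[(src, dst)] = weight

    def step(self, fired):
        """Processing step: given the set of units fired this tick,
        return the set of units that fire on the next tick."""
        for (src, dst), w in self.weights.items():
            if src in fired:
                self.potential[dst] += w   # addition step
        nxt = {n for n, p in self.potential.items() if p >= self.threshold}
        for n in nxt:
            self.potential[n] = 0          # reset the internal potential on firing
        return nxt

net = Network()
net.connect("A", "B", 9)   # firing weight: a single summation fires B
net.connect("B", "C", 4)   # sub-threshold weight: C needs repeated input
print(net.step({"A"}))     # {'B'}
print(net.step({"B"}))     # set() -- C at 4, still below threshold
print(net.step({"B"}))     # {'C'} -- 4 + 4 = 8 reaches the threshold
```

The distinction between the "A to B" and "B to C" connections mirrors the patent's firing versus non-firing weighting coefficients: the latter only fires its target through repeated summation.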
  • Provided are a neural network structure having a plurality of neuron units and synapse units connecting the plurality of neuron units, in which another neuron unit connected to one neuron unit via a synapse unit fires according to the sum obtained by adding, one or more times, the product of the firing or non-firing output value of the one neuron unit and the weighting coefficient of the synapse unit;
  • thereby, a neural network structure that mimics the mechanism of neural circuits in the human brain, and an electronic circuit including the neural network structure, an information processing system, a method, and a program are provided.
  • FIG. 4 is a schematic diagram showing (C) a logical negation structure (NOT circuit), (D) a latch structure (latch circuit), and (E) a flip-flop structure (flip-flop circuit).
  • A schematic diagram showing an example of the latch structure in the neural network structure of the second embodiment of the present invention.
  • A schematic diagram showing an example of reinforcement in the neural network structure of the third embodiment of the present invention.
  • A schematic diagram showing the learning function in the neural network structure of the fourth embodiment of the present invention, and a schematic diagram showing the associative function in the neural network structure of the fourth embodiment.
  • A schematic diagram showing the learning function in the neural network structure of the fourth embodiment of the present invention, taking cooperation between regions of the brain as an example.
  • the term neural network structure refers to an architecture configured to simulate biological neurons. A neural network structure means a structure that is approximately functionally equivalent to the neurons and synapses of a biological brain, that is, to the connections between elements. Therefore, the electronic circuit and information processing system including the neural network structure according to the embodiment of the present invention can include various elements and electronic circuits that model biological neurons. Further, a computer including a neural network structure according to an embodiment of the present invention can include various processing elements and algorithms (including computer simulations) that model biological neurons.
  • the neural network structure is an engineering model of the information processing function of the brain, and has a structure in which a number of elements corresponding to nerve cells called neuron portions are connected.
  • the neural network structure includes a plurality of neuron units and a plurality of synapse units. Note that the figure shows two neuron units connected via a synapse unit for ease of viewing, but in reality a large number of neuron units are interconnected via a large number of synapse units.
  • one neuron unit is depicted with inputs from three synapse units and outputs to three synapse units, but it goes without saying that the present invention is not limited to this.
  • when one neuron unit receives inputs In1 to In3 (also referred to as stimuli) from other neuron units via synapse units, it accumulates a potential internally.
  • the inputs In1 to In3 are given asynchronously from a plurality of other neuron units.
  • when the accumulated potential exceeds the threshold, the neuron unit fires and generates outputs Out1 to Out3.
  • the neuron unit that receives output Out2 as an input (input In2 in the figure) accumulates a potential internally and can further propagate the stimulus to the neuron units connected to it.
  • this neuron unit includes, for example, a capacitor and an operational amplifier (not shown) constituting an integration circuit, and stores a potential internally by charging the capacitor with the current output from the synapse units.
  • when the internal potential exceeds a predetermined threshold value, the neuron unit fires, producing outputs Out1 to Out3, and the internal potential is reset or attenuates over time.
  • when the neuron unit frequently receives inputs In1 to In3, its internal potential remains close to the threshold value; such a state fires easily even with a small additional input, and can therefore be called easily excited (a potential excitation state).
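The integrate, decay, and near-threshold ("potential excitation") behaviour described above can be sketched with a leaky integrator. The decay factor, stimulus sizes, and threshold below are illustrative assumptions, not values from the patent:

```python
class LeakyNeuron:
    """Inputs charge an internal potential, the potential attenuates over
    time, and frequent input keeps the unit near threshold so that a
    small additional stimulus is enough to fire it."""

    def __init__(self, threshold=8, decay=0.9):
        self.v = 0.0
        self.threshold = threshold
        self.decay = decay

    def tick(self, stimulus=0.0):
        self.v = self.v * self.decay + stimulus
        if self.v >= self.threshold:
            self.v = 0.0   # reset the internal potential after firing
            return 1
        return 0

n = LeakyNeuron()
for _ in range(20):   # frequent sub-threshold input ...
    n.tick(0.8)       # ... drives the potential toward (but not past) threshold
print(n.v > 6)        # near threshold: the "potential excitation state"
print(n.tick(2.0))    # a small additional stimulus now fires the unit
```

With stimulus 0.8 and decay 0.9 the potential approaches 8 from below but never crosses it on its own, which is exactly the easily-excited state the text describes.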
  • a similar integration circuit corresponding to each synapse unit may be provided in the synapse unit itself or at the neuron unit's receiving port from the synapse unit. These function as the coupling strength of the synapse unit (also referred to as the weighting coefficient or weighting), depending on the capacitance of the integration circuit and the level of its filling rate. The product of the output value of a neuron unit and the weighting coefficient of the synapse unit is then repeatedly added and summed, affecting the degree and speed of accumulation of the neuron unit's internal potential.
  • the neural network structure according to the present invention has a plurality of neuron units and synapse units connecting the plurality of neuron units,
  • and is a neural network structure in which another neuron unit connected to one neuron unit via a synapse unit fires according to the sum obtained by adding, one or more times, the product of the firing or non-firing output value of the one neuron unit and the weighting coefficient of the synapse unit.
  • the neuron units and synapse units can also be realized as processing elements in an algorithm of software executed on a computer.
  • the function F indicating the internal potential of the neuron unit can be expressed by Formula (1).
  • F = In1 * w1 + In2 * w2 + In3 * w3   … Formula (1)  (where In1 to In3 each take the value 0 or 1)
  • Software including such an algorithm can be executed on an ordinary von Neumann computer.
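Formula (1) translates directly into such an algorithm; the weight values below are illustrative:

```python
def internal_potential(inputs, weights):
    """Formula (1): F = In1*w1 + In2*w2 + In3*w3, with each input 0 or 1."""
    assert all(x in (0, 1) for x in inputs)
    return sum(x * w for x, w in zip(inputs, weights))

# inputs In1..In3 with hypothetical weights w1..w3
print(internal_potential([1, 0, 1], [3, 5, 4]))  # 3 + 0 + 4 = 7
```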
  • This neural network structure has a loop structure.
  • the output value produced by firing (1) of neuron unit 1 propagates back to neuron unit 1 itself through the other neuron units 2 to 10 and their synapse units.
  • the loop structure here is composed of ten neuron units, but the number is not limited in the present invention. These ten neuron units are connected in a loop via synapse units.
  • neuron unit 1 enters an excited state by some kind of trigger and performs firing (1).
  • neuron unit 2, which receives firing (1) as input, enters an excited state and performs firing (2). Thereafter, similarly, neuron unit N enters an excited state and performs firing (N).
  • firing (10) is performed in neuron unit 10,
  • and firing (10) is input to neuron unit 1.
  • neuron unit 1, which receives firing (10) as input, becomes excited again and performs firing (1). Therefore, once any one of the neuron units in the loop structure enters an excited state, the excited state propagates one after another and repeats, so the excited state is maintained.
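The circulation of firing (1) through firing (10) and back can be sketched as a ring in which excitation advances one unit per tick, assuming, as stated above, that every weight in the loop is a firing weight that always excites the next unit:

```python
N = 10  # number of neuron units in the loop (the text uses ten as an example)

def run_loop(ticks):
    """Propagate firing around a ring: unit i fires one tick after unit i-1.
    Returns (tick, firing neuron unit) pairs for illustration."""
    fired_at = []
    active = 0               # neuron unit 1 (index 0) fires by some trigger
    for t in range(ticks):
        fired_at.append((t, active + 1))
        active = (active + 1) % N   # excitation passes to the next unit
    return fired_at

history = run_loop(12)
print(history[0])    # (0, 1)  -- firing (1)
print(history[9])    # (9, 10) -- firing (10)
print(history[10])   # (10, 1) -- neuron unit 1 fires again: the loop repeats
```

This is what gives each unit its fixed re-excitation interval: with N units and one propagation step per tick, every unit fires once every N ticks.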
  • the human brain is known to have four loops: a motor loop, an oculomotor loop, a limbic loop, and a prefrontal loop.
  • the loop structure in the neural network structure of the present invention is considered to correspond to the prefrontal loop. If the prefrontal loop corresponds to the beta wave of the electroencephalogram, the loop circulates about 20 times per second; with a transmission time of 2 milliseconds per neuron, this corresponds to about 25 neurons.
  • the weighting coefficient of each synapse unit between neuron units included in the loop structure is set to a firing weighting coefficient (the value "9" shown in the figure) so that the subsequent neuron unit always enters an excited state and fires.
  • the neuron unit becomes excited when the sum of the products of output values and synapse weighting coefficients, added one or more times, is 8 or more.
  • a certain neuron unit in the loop structure (neuron unit 1 in figure (A)) is also connected to neuron units not included in the loop structure; when it becomes excited and fires, the firing is also transmitted to the neuron units outside the loop.
  • neuron unit 1 propagates firing (1), second-lap firing (11), third-lap firing (21), and so on, both to neuron unit 2 and to the neuron units outside the loop structure. Therefore, by having the loop structure, firing can be propagated to other neuron units not included in the loop so as to excite them repeatedly at fixed time intervals.
  • on the first lap, propagation outside the loop may have stopped after firing (1) and firing (2); on the second lap, firing (3) and firing (4) are performed, and on the third lap, firing (5) and firing (6); the number of neuron units in an excited state thus increases, and the firing spreads one after another.
  • the number of neuron units in an excited state, or in a state in which they are easily excited, increases, and a series of neuron units expressing a predetermined function can easily be formed.
  • the connections between neuron units become strong, and it becomes easy to form a series of neuron units expressing a predetermined function.
  • when a neural network structure includes a loop structure and a neuron unit in the loop receives an output value, that is, a stimulus produced by firing, the stimulus propagates around the loop one after another. The structure thus functions so that the stimulus continues to propagate within the loop and each neuron unit in the loop is repeatedly excited at a fixed time interval every time the stimulus comes around. A neural network structure capable of functioning in this way can be provided.
  • the neural network structure in this embodiment includes the loop-structured neural network structure described above and another neural network structure that receives, as an input value, the output value emitted when a neuron unit fires as the stimulus propagates around the loop.
  • by repeatedly receiving stimuli at a fixed time interval, the number of neuron units in an excited or easily excited state increases, and it becomes easy to express a predetermined function.
  • the excitation of the neuron units can be raised each time the stimulus is repeatedly received.
  • the series of neuron units receives input from firing output values, and the sum reaches or exceeds the threshold value.
  • a predetermined function expressed when the sum reaches the threshold is, for example, that the neuron units whose sums reach the threshold fire without fail, entering a so-called connected state and forming a group of concepts.
  • a series of neuron units in this fired (connected) state has "meaning".
  • a "zebra" neuron unit is formed.
  • the "zebra" neuron unit becomes connected to the Japanese-name "word" neuron unit "shimauma", in which the Japanese "sound" neuron units "shi", "ma", "u", and "ma" are connected in series.
  • the Japanese-name "word" neuron unit "shimauma" is thus connected to the "zebra" neuron unit.
  • the "stripes" neuron unit becomes connected to the Japanese-name "word" neuron unit "shimashima", in which the Japanese "sound" neuron units "shi" and "ma" are connected in series.
  • the "zebra" neuron unit also becomes connected to the "animal" neuron unit holding the concept "animal". And if it has been taught more than a certain number of times that concepts such as "lion" and "giraffe" are animals, the "animal" neuron unit holding the concept "animal" becomes connected to the "lion" neuron unit and the "giraffe" neuron unit.
  • the "animal" neuron unit becomes connected to the Japanese-name "word" neuron unit "doubutsu", in which the Japanese "sound" neuron units "do", "u", "bu", and "tsu" are connected in series.
  • what the "zebra" neuron unit means is determined by the other neuron units connected to the "zebra" neuron unit.
  • concept and meaning are given by connections to such neuron units, and the efficiency is dramatically improved compared with the prior art. For example, to achieve the same thing with a conventional program on a von Neumann computer, one would have to check whether each possible split of a word is meaningful, or store in advance the features of concepts such as "pattern" and "shape" and search through them. Since there are many possible ways to split a word, the amount of computation is enormous, and a large storage area is required to store the features comprehensively. In either case, it is not very efficient.
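The idea that a neuron unit's "meaning" is given by the units connected to it can be sketched as a small graph. The node names follow the zebra example above, while the dictionary representation and the traversal depth are illustrative assumptions:

```python
# A neuron unit's meaning is taken here to be the set of units reachable
# from it through its connections.
connections = {
    "zebra":   ["shimauma (word)", "stripes", "animal"],
    "stripes": ["shimashima (word)"],
    "animal":  ["doubutsu (word)", "lion", "giraffe"],
}

def meaning(concept, depth=2):
    """Collect the related concept group that gives a neuron unit its meaning."""
    found = set()
    frontier = [concept]
    for _ in range(depth):
        frontier = [n for c in frontier for n in connections.get(c, [])]
        found.update(frontier)
    return found

print(sorted(meaning("zebra")))
```

Looking up "what does zebra mean" is then just graph traversal from the "zebra" node, rather than the exhaustive word-splitting and feature search the text contrasts it with.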
  • the excitation of a neuron unit gradually increases as it repeatedly receives stimulation, until the excitation reaches the threshold.
  • as the number of neuron units that have fired on exceeding the threshold increases, a series of neuron units in which the stimulus propagates over a wider range can be formed, and a predetermined function can be expressed.
  • when a series of connected neuron units is formed in the neural network structure and a predetermined function corresponding to that series is expressed, a neuron unit relating to a concept becomes linked to neuron units relating to concepts related to that concept.
  • this group of related concepts gives the neural network structure a so-called “meaning”.
  • a series of neuron units in which a stimulus propagates over a wider range of the neural network structure can be formed, so that a so-called “meaning” can be provided in the neural network structure.
  • the long-disputed symbol grounding problem can thereby be solved.
  • as shown in FIG. 5A, in a hierarchical neural network one neuron unit can be associated with a certain concept, but the shape, properties, features, and name associated with that concept cannot be attached to that neuron unit; this is said to be the symbol grounding problem of being unable to give a so-called meaning.
  • the neural network structure of the present embodiment includes the neural network structure of the above embodiment.
  • the neural network structure of this embodiment has a logical product structure for calculating the logical product of two neuron units, a logical sum structure for calculating the logical sum of two neuron units, a logical negation structure of a neuron unit, a latch structure that can hold a state, and a flip-flop neuron unit. These structures are also expressions of a predetermined function by a series of neuron units whose sum is equal to or greater than a threshold value.
  • FIG. 6A shows a logical product structure (AND circuit).
  • the neuron unit 1 and the neuron unit 3, and the neuron unit 2 and the neuron unit 3, are connected by synapse units having the weighting coefficient 4; the neuron unit 3 fires only when both the neuron unit 1 and the neuron unit 2 are in the excited state. When either one does not fire, the neuron unit 3 does not fire.
  • FIG. 6B shows a logical OR structure (OR circuit).
  • the neuron unit 1 and the neuron unit 3, and the neuron unit 2 and the neuron unit 3, are connected by synapse units having the weighting coefficient 9; when either the neuron unit 1 or the neuron unit 2 becomes excited and fires, the neuron unit 3 fires. If neither fires, the neuron unit 3 does not fire.
  • FIG. 6C shows a logical negation structure (NOT circuit).
  • the neuron unit 1 and the neuron unit 2 are connected by a synapse unit having the weighting coefficient -9; when the neuron unit 1 becomes excited and fires, the neuron unit 2 hardly fires even if it receives a stimulus from another neuron unit. If the neuron unit 1 does not fire, the neuron unit 2 may fire when it receives a stimulus from another neuron unit.
  • FIGS. 6D and 6E show a latch structure (latch circuit) and a flip-flop structure (flip-flop circuit) for holding values and states.
  • in the latch structure, even if the neuron unit 1 fires, the neuron unit 2 does not fire unless a firing is also transmitted to the neuron unit 2 from elsewhere.
  • in the flip-flop structure, the excited state of the neuron unit 3 is maintained when the neuron unit 1 fires, and the excited state of the neuron unit 4 is maintained when the neuron unit 2 fires.
  • since the neural network structure has the logical product structure, the logical sum structure, and the logical negation structure, structures equivalent to the latch structure and the flip-flop structure can be formed from them.
  • with the logical product operation unit and the logical sum operation unit between neuron units, and the logical negation operation unit of a neuron unit,
  • all arithmetic processing of the neuron units can be performed.
  • combinations of the logical product operation unit, the logical sum operation unit, and the logical negation operation unit can form series of neuron units that express complex logical structures and higher-order functions such as latch structures and flip-flop structures.
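The logic structures described above can be sketched as simple threshold units. The weights 4, 9, and -9 are the weighting coefficients given in the text; the firing threshold of 5 is an assumption (the text implies only a value the weight 4 alone cannot reach while 8 or 9 can), so this is an illustrative sketch, not the embodiment itself.

```python
# Sketch of the FIG. 6 logic structures as threshold units.
# ASSUMPTION: firing threshold = 5; the weights 4, 9, -9 are from the text.
THRESHOLD = 5

def fires(weighted_inputs):
    """A neuron unit fires when the sum of its weighted inputs reaches the threshold."""
    return sum(weighted_inputs) >= THRESHOLD

def logical_product(n1, n2):          # AND: weighting coefficient 4 on both synapse units
    return fires([4 * n1, 4 * n2])    # only 4 + 4 = 8 reaches the threshold

def logical_sum(n1, n2):              # OR: weighting coefficient 9 on both synapse units
    return fires([9 * n1, 9 * n2])    # either input alone reaches the threshold

def logical_negation(n1, other=9):    # NOT: weight -9 suppresses another stimulus
    return fires([-9 * n1, other])

assert logical_product(1, 1) and not logical_product(1, 0)
assert logical_sum(1, 0) and not logical_sum(0, 0)
assert logical_negation(0) and not logical_negation(1)
```

Any threshold between 5 and 8 reproduces the same firing behavior, which is why the text can speak of the weighting coefficients alone.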
  • FIG. 7A illustrates an example of a logical product structure.
  • when the neuron unit A having the concept of “shima” (stripes) and the neuron unit B having the concept of “uma” (horse) are both input to the logical product structure, the neuron unit C having the concept of “zebra” is excited.
  • the “zebra” neuron unit is not excited by the “shima” neuron unit alone or by the “uma” neuron unit alone.
  • FIG. 7B is an example of the logical sum structure.
  • when either input neuron unit fires, the neuron unit C having the concept of “zebra” is excited.
  • FIG. 7C is an example of the logical negation structure.
  • when the neuron unit C having the concept of “zebra” fires, the neuron unit F expressing that the object is a real horse is not excited.
  • FIG. 7D is an example of the flip-flop structure.
  • when the neuron unit D having the sound of “zebra” is input, the excitation of the neuron unit C having the concept of “zebra” is sustained, and when the neuron unit G having the sound of “horse” is input, the excitation of the neuron unit B having the concept of “horse” is sustained.
  • FIG. 8 shows an example of the latch structure, illustrating the addition of two numbers.
  • the neuron unit 11 corresponding to the word “2”, the neuron unit 12 corresponding to the word “3”, and the neuron unit 13 corresponding to the word “add (sum)” excite, respectively,
  • the neuron unit 21 of the number concept “2”, the neuron unit 22 of the number concept “3”, and the neuron unit 23 of the calculation concept “add”; a timing signal is sent at the timing when all of these neuron units are excited, and the neuron unit 3 having the concept of the number “5” in the subsequent stage is excited.
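The timing-gated behavior of this latch can be caricatured in a few lines. The function name, boolean arguments, and the timing flag below are illustrative assumptions; only the behavior, that the number-concept “5” fires when the “2”, “3”, and “add” concept neuron units are all excited at the moment the timing signal arrives, comes from the text.

```python
# Hypothetical sketch of the FIG. 8 latch-based addition of two numbers.
def latched_sum(two_excited, three_excited, add_excited, timing_signal):
    """The "5" neuron unit fires only when all three concept neuron units
    are excited at the moment the timing signal arrives."""
    if timing_signal and two_excited and three_excited and add_excited:
        return "5"          # the neuron unit holding the number concept "5" fires
    return None             # otherwise the latch keeps holding its inputs

assert latched_sum(True, True, True, True) == "5"
assert latched_sum(True, True, True, False) is None   # no timing signal yet
```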
  • a series of neuron units expressing a certain function may be connected in advance to one or more series of neuron units expressing other functions, and may fire when the sum exceeds the threshold value.
  • for example, when the “word” neuron unit “shima”, consisting of the “sound” neuron units “shi” and “ma”, and the “word” neuron unit “uma”, consisting of the “sound” neuron units “u” and “ma”, are input to a logical structure, a new “word” neuron unit with the sound “shimauma” (zebra) can be formed. In this way, by firing a “meaning” that is a group of related concepts, a situation can be formed that facilitates the firing of a higher-order “meaning” as a wider group of concepts.
  • the neural network structure of the present embodiment may further include an oscillation circuit (oscillation unit) shown in FIG.
  • the oscillation circuit is composed of a first neuron unit 1, a second neuron unit 2, and another neuron unit 3.
  • although one other neuron unit 3 is shown in this figure, the number is not limited to this; there may be one or more such neuron units.
  • the oscillation circuit includes the neuron unit 1 and an oscillation neuron unit that consists of the neuron unit 2 and the neuron unit 3 and has a structure in which the output value generated by the firing of the second neuron unit 2 propagates back to the neuron unit 2 itself via the neuron unit 3.
  • the synapse unit connected in the direction from the neuron unit 1 to the neuron unit 2 has the firing weighting coefficient 9, a connection strength at which the neuron unit 2 fires from a single summation.
  • the synapse units between the neuron units within the oscillation neuron unit, that is, those connecting the neuron unit 2 and the neuron unit 3, likewise have the firing weighting coefficient 9, a connection strength at which each neuron unit fires from a single summation.
  • the synapse units connected in the direction from the neuron unit 2 and the neuron unit 3 toward the neuron unit 1 have the unfired weighting coefficient 4, a connection strength at which the first neuron unit does not fire because a single summation falls below the threshold.
  • when the neuron unit 1 receives a stimulus (1) from the outside, it always fires (2) toward the neuron unit 2; it does not fire toward the neuron unit 3.
  • the neuron unit 2, having received the firing (2), that is, the stimulus (2), fires in turn, because the synapse unit connected in the direction from the neuron unit 1 to the neuron unit 2 has the weighting coefficient 9, at which a single summation causes firing.
  • this firing (3) also reaches the neuron unit 1, but the synapse unit linked back to the neuron unit 1 has the unfired weighting coefficient 4, at which a single summation does not cause firing;
  • therefore, even when the firing (3) occurs, the neuron unit 1 does not enter the excited state by this stimulus alone.
  • the neuron unit 3, having received the firing (3), that is, the stimulus (3),
  • fires (4) in turn.
  • the neuron unit 2, having received the stimulus (4), fires (5) toward the neuron unit 1 and the neuron unit 3, and the neuron unit 1, having received that stimulus, again does not enter the excited state by it alone; the firing thus keeps circulating between the neuron unit 2 and the neuron unit 3.
  • in other words, this neural network structure includes the first neuron unit 1 and an oscillation neuron unit composed of the second neuron unit 2 and the other neuron unit 3, with a structure in which the output value from the firing of the second neuron unit propagates through the other neuron unit
  • back to the second neuron unit itself; the synapse unit connected in the direction from the first neuron unit to the second neuron unit
  • has a firing weighting coefficient at which the second neuron unit fires from a single summation, as do the synapse units connecting the neuron units within the oscillation neuron unit,
  • while the synapse units connected in the direction toward the first neuron unit have an unfired weighting coefficient at which the first neuron unit does not fire because the sum falls below the threshold. The structure thus includes an oscillation unit. According to this, the potential excitement state, that is, the state of being easily excited, can be maintained for a certain period.
  • this neural network structure can therefore be provided with a short-term memory unit that stores events and states in neuron units for a certain period. Further, by combining the oscillation circuit with a loop structure, short-term memory lasting several tens of seconds becomes possible through repetition of the period over which the oscillation circuit oscillates, and this can be connected to the thinking described later.
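The oscillation circuit above can be traced step by step in a small discrete-time simulation. The weights 9 and 4 are from the text; the firing threshold of 5 is an assumption (any value the weight 4 cannot reach but 9 can gives the same behavior), so this is a sketch of the mechanism, not the embodiment itself.

```python
# Discrete-time sketch of the oscillation circuit.
# ASSUMPTION: firing threshold = 5; weights 9 (firing) and 4 (unfired) are from the text.
THRESHOLD = 5

def simulate(steps):
    # step 0: the external stimulus (1) makes the neuron unit 1 fire
    fired = {1: True, 2: False, 3: False}
    history = []
    for _ in range(steps):
        inputs = {
            1: 4 * fired[2] + 4 * fired[3],   # unfired weight 4: always sub-threshold
            2: 9 * fired[1] + 9 * fired[3],   # firing weight 9: fires from one summation
            3: 9 * fired[2],                  # firing weight 9
        }
        fired = {n: inputs[n] >= THRESHOLD for n in inputs}
        history.append(sorted(n for n in fired if fired[n]))
    return history

# Firing alternates between the neuron units 2 and 3 while the neuron unit 1
# only ever receives the sub-threshold weight-4 stimulus (potential excitement).
assert simulate(6) == [[2], [3], [2], [3], [2], [3]]
```

The sustained sub-threshold input to the neuron unit 1 is exactly the "potential excitement" the text describes: it never fires from the oscillation alone, but a small extra stimulus would push it over the threshold.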
  • the neural network structure of the present embodiment may further include a mechanism for strengthening the coupling strength of the synapse part.
  • this strengthening mechanism includes a connection-strength conversion table; when the input-side neuron unit 1 and the output-side neuron unit 2 are excited simultaneously within a predetermined time, it strengthens the connection strength of the synapse unit between those neuron units.
  • neurons in the human brain memorize an event when it is input repeatedly, then recognize whether a new input is the same event and, if it is, react to it and output a signal.
  • this strengthening mechanism models that function in the brain.
  • the more often a certain event is input repeatedly, the easier it becomes to memorize.
  • becoming easier to memorize as the same information is repeated corresponds to a state in which the connection strength is strengthened, so that the neuron units become excited and fire more easily.
  • long-term memory can be formed by a series of neuron units whose synapse units have thus been strengthened in connection strength and which are therefore prone to excitation.
  • it is said that long-term memory is formed by approximately seven stimulations.
  • in this embodiment, long-term memory is set to be formed by three stimulations.
  • the setting here is that long-term memory is formed after three stimulations, but the number is not limited to three.
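A minimal sketch of the strengthening mechanism with its connection-strength conversion table: only the idea of the table and the three-stimulation setting come from the text; the intermediate weight values 2 and 4 are assumptions chosen so that three co-stimulations reach the firing strength 9.

```python
# Sketch of the strengthening mechanism (connection-strength conversion table).
# ASSUMPTION: the intermediate weights 2 and 4 are illustrative; the table idea
# and the three-stimulation setting are from the text.
CONVERSION_TABLE = {0: 2, 2: 4, 4: 9}   # current weight -> strengthened weight

class Synapse:
    def __init__(self):
        self.weight = 0                  # initially too weak to fire anything

    def co_stimulate(self):
        """Input- and output-side neuron units excited within the predetermined
        time: step the connection strength up via the conversion table."""
        self.weight = CONVERSION_TABLE.get(self.weight, self.weight)

s = Synapse()
for _ in range(3):                       # three simultaneous stimulations
    s.co_stimulate()
assert s.weight == 9                     # fires in one summation: long-term memory
```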
  • this neural network structure includes a long-term memory unit composed of a series of neuron units.
  • the neural network structure includes a short-term memory unit including a loop structure and one or more oscillation circuits. According to this, long-term and short-term memory can be held by the neuron unit. It is said that the human brain has sensory memory in addition to short-term memory and long-term memory.
  • this sensory memory can be realized as an input buffer in an input unit for inputting information to the neural network structure.
  • the neural network structure shown in FIG. 11 includes a loop structure and another neural network structure that receives, as an input value, the output value that a neuron unit in the loop structure fires and outputs as the stimulus propagates around the loop.
  • the synapse units in the loop structure have weighting coefficients of 8 or more, at which a single summation produces the excited state, and all of its neuron units have strong connection strength (strong coupling).
  • the neuron units 1 to 6 in the other neural network structure are each provided with the oscillation circuit described above.
  • the neuron units in the loop structure and the neuron units 1 to 6 in the other neural network structure are weakly coupled by synapse units whose weighting coefficients produce the excited state only by summation over a plurality of firings. Further, it is assumed that the neuron unit 1 and the neuron unit 2, the neuron unit 2 and the neuron unit 3, the neuron unit 4 and the neuron unit 5, and the neuron unit 6 and the neuron unit 7 are strongly coupled, and that the other pairs are weakly coupled.
  • first, a strong stimulus above the threshold is given to the neuron unit 1 of the other neural network structure through a strongly coupled synapse unit. The neuron unit 1 then becomes excited and fires toward the neuron unit 2; the neuron unit 2 also fires, and this first stimulus is propagated to the neuron unit 3. However, since the firing timings of the neuron unit 1 and the neuron unit 3 are shifted, the neuron unit 4 does not fire. Note that the neuron units 1 to 3 each have the oscillation circuit and therefore hold the potential excited state for a certain period.
  • next, a weak stimulus below the threshold is received from the neuron units in the loop structure via the weakly coupled synapse units.
  • this stimulus is transmitted to the neuron units 1 to 6, and the neuron units 1 to 3 are being kept in the potential excited state; therefore, even though the stimulus is weak, the neuron unit 1 and the neuron unit 3 fire simultaneously.
  • although the neuron unit 4 is connected only by weakly coupled synapse units, it is stimulated simultaneously by the neuron unit in the loop structure, the neuron unit 1, and the neuron unit 3, and therefore becomes excited and fires.
  • when the neuron unit 4 fires, the firing propagates to the neuron unit 5.
  • however, since the firing timings of the neuron unit 3 and the neuron unit 5 are shifted, the neuron unit 6 does not fire. Since the neuron units 1 to 6 have the oscillation circuit, they hold the potential excited state for a certain period.
  • again, a weak stimulus below the threshold is received from the neuron units in the loop structure via the weakly coupled synapse units.
  • this stimulus is transmitted to the neuron units 1 to 6, and the neuron units 1 to 5 are being kept in the potential excited state; therefore, even though the stimulus is weak, the neuron unit 3 and the neuron unit 5 fire simultaneously.
  • although the neuron unit 6 is connected only by weakly coupled synapse units, it is stimulated simultaneously by the neuron unit in the loop structure, the neuron unit 3, and the neuron unit 5, and therefore becomes excited and fires.
  • when the neuron unit 6 fires, the firing propagates to the neuron unit 7, the last neuron unit in the propagation through the other neural network structure.
  • in this way, when a neural network structure includes a loop structure and a neuron unit included in the loop structure receives an output value, that is, a stimulus from a firing, the stimulus propagates around the loop structure one neuron unit after another; the structure can therefore function so that the stimulus keeps propagating continuously around the loop, and the neuron units in the loop structure are repeatedly excited at fixed time intervals each time the circulating stimulus passes. Moreover, each time this stimulus is input to a certain neuron unit, that unit fires and passes on an output value, that is, a stimulus.
  • as a result, the series of neuron units in the excited state or the potential excited state grows, and a predetermined function becomes easier to express.
  • this models the brain's information-processing property that a conclusion is more easily reached as thinking deepens; the structure can also be regarded as thinking in the same way that humans do.
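The walkthrough above can be condensed into a small simulation. The coupling pairs follow FIG. 11 as described in the text; the threshold of 5 and the weak-coupling weight of 2 are assumptions, and the oscillation circuits are idealized as keeping every fired neuron unit potentially excited indefinitely, so this is a simplified sketch of the timing argument rather than the embodiment.

```python
# Simplified sketch of the FIG. 11 loop-structure walkthrough.
# ASSUMPTIONS: threshold = 5, weak-coupling weight = 2, and potential
# excitement never decays. Strong/weak pairs follow the text.
THRESHOLD, WEAK = 5, 2
strong_next = {1: 2, 2: 3, 4: 5, 6: 7}     # strongly coupled pairs: fire alone
weak_sources = {4: {1, 3}, 6: {3, 5}}      # weakly coupled: need simultaneity

def cascade(start, already_fired):
    """Strong coupling propagates one neuron unit per time step, so the
    firing timings along a chain are shifted relative to one another."""
    firing_now, fired = {start}, set(already_fired)
    while firing_now:
        fired |= firing_now
        firing_now = {strong_next[n] for n in firing_now if n in strong_next} - fired
    return fired

def loop_stimulus(potential):
    """One pass of the loop's weak stimulus: every potentially excited neuron
    unit re-fires in the SAME step, so a weakly coupled target whose sources
    are all potentially excited can now sum past the threshold."""
    fired = set(potential)
    for target, sources in weak_sources.items():
        if WEAK + WEAK * len(sources & potential) >= THRESHOLD:
            fired |= cascade(target, fired)
    return fired

state = cascade(1, set())        # initial strong stimulus: {1, 2, 3}; 4 stays silent
state = loop_stimulus(state)     # first loop pass: neuron unit 4 fires -> {1..5}
state = loop_stimulus(state)     # second loop pass: neuron unit 6 fires -> {1..7}
assert state == {1, 2, 3, 4, 5, 6, 7}
```

Each circulation of the loop synchronizes the potentially excited units, which is what lets the propagation advance one weakly coupled junction per pass, mirroring "thinking deepening" toward the final neuron unit 7.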
  • the neural network structure of this embodiment includes the neural network structure of the above embodiment.
  • the neural network structure of the present embodiment includes a first neural network structure and a second neural network structure, and is provided with a synapse unit that connects a neuron unit included in the first neural network structure with a neuron unit included in the second neural network structure.
  • that is, the present neural network structure includes two or more different neural network structures and has synapse units that connect those neural network structures.
  • the neural network structure shown in FIG. 12 includes a loop structure, a first neural network structure that receives as an input value the output value fired and output by a neuron unit as the stimulus propagates around the loop structure, and
  • a second neural network structure including neuron units connected by synapse units to neuron units in the first neural network structure.
  • the synapse units in the loop structure have weighting coefficients of 8 or more, at which a single summation along the loop produces the excited state, and every neuron unit forming the loop is connected to its adjacent neuron units with strong connection strength (strong coupling).
  • the neuron units 1 to 10 in the first neural network structure each include the oscillation circuit described above, and the neuron units in the loop structure and the neuron units 2 to 9 in the first neural network structure are weakly coupled by synapse units whose weighting coefficients produce the excited state only by summation over a plurality of firings.
  • the second neural network structure includes procedures 1 to 4.
  • the procedures 1 to 4 each have a structure in which their several neuron units are all strongly coupled. The procedures 1 to 4 are therefore states in which a series of neuron units expressing a predetermined function has been formed, that is, states holding the memory and meaning of the respective procedure.
  • first, a stimulus above the threshold is applied through a strongly coupled synapse unit to the neuron unit 1 of the first neural network structure. The neuron unit 1 then becomes excited and fires toward the neuron unit 2, and this first stimulus is propagated to the neuron units of the procedure 1 in the second neural network structure. Since all the neuron units of the procedure 1 are strongly coupled, the stimulus propagates back to the neuron unit 3 in the first neural network structure. However, since the firing timings of the neuron unit 2 and the neuron unit 3 are shifted, the neuron unit 4 does not fire. Note that the neuron unit 2 and the neuron unit 3 have the oscillation circuit and therefore hold the potential excited state for a certain period.
  • next, a weak stimulus below the threshold is received from the neuron units in the loop structure via the weakly coupled synapse units.
  • this stimulus is transmitted to the neuron units 2 to 9, and the neuron unit 2 and the neuron unit 3 are being kept in the potential excited state; therefore, even though the stimulus is weak, they fire simultaneously.
  • although the neuron unit 4 is connected only by weakly coupled synapse units, it is stimulated simultaneously by the neuron unit in the loop structure, the neuron unit 2, and the neuron unit 3, and therefore becomes excited and fires.
  • the neuron unit 4 fires, it propagates to the neuron unit in the procedure 2 in the second neural network structure. Since all the neuron parts in the procedure 2 are strongly coupled, they propagate to the neuron part 5 in the first neural network structure. However, since the firing timings of the neuron unit 4 and the neuron unit 5 are shifted, the neuron unit 6 does not fire. Since the neuron units 2 to 9 have the oscillation circuit, they hold a potential excited state for a certain period.
  • again, a weak stimulus below the threshold is received from the neuron units in the loop structure via the weakly coupled synapse units.
  • this stimulus is transmitted to the neuron units 2 to 9, and the neuron unit 4 and the neuron unit 5 are being kept in the potential excited state; therefore, even though the stimulus is weak, they fire simultaneously.
  • although the neuron unit 6 is connected only by weakly coupled synapse units, it is stimulated simultaneously by the neuron unit in the loop structure, the neuron unit 4, and the neuron unit 5, and therefore becomes excited and fires.
  • the neuron unit 6 fires, it propagates to the neuron unit in the procedure 3 in the second neural network structure. Since all the neuron parts in the procedure 3 are strongly coupled, they propagate to the neuron part 7 in the first neural network structure. However, since the firing timings of the neuron unit 6 and the neuron unit 7 are shifted, the neuron unit 8 does not fire. Since the neuron units 2 to 9 have the oscillation circuit, they hold a potential excited state for a certain period.
  • once more, a weak stimulus below the threshold is received from the neuron units in the loop structure via the weakly coupled synapse units.
  • this stimulus is transmitted to the neuron units 2 to 9, and the neuron unit 6 and the neuron unit 7 are being kept in the potential excited state; therefore, even though the stimulus is weak, they fire simultaneously.
  • although the neuron unit 8 is connected only by weakly coupled synapse units, it is stimulated simultaneously by the neuron unit in the loop structure, the neuron unit 6, and the neuron unit 7, and therefore becomes excited and fires.
  • the neuron unit 8 fires, it propagates to the neuron unit in the procedure 4 in the second neural network structure. Since all the neuron parts in the procedure 4 are strongly coupled, they propagate to the neuron part 9 in the first neural network structure. Then, the neuron unit 9 also fires and propagates to the neuron unit 10 that propagates last in the first neural network structure.
  • in this way, when a neural network structure includes a loop structure and a neuron unit included in the loop structure receives an output value, that is, a stimulus from a firing, the stimulus propagates around the loop one neuron unit after another, so the structure can function so that the stimulus keeps propagating continuously around the loop.
  • by providing synapse units that connect neuron units included in different neural network structures in which long-term memory, short-term memory, and the like are held, cooperation between neural network structures of different systems becomes possible.
  • this neural network structure is therefore suitable for executing procedural processing.
  • the first neural network structure is called a plan level neural network structure
  • the second neural network structure is called an execution level neural network structure.
  • the neural network structure shown in the figure can execute detailed actions (procedures) based on a large plan.
  • firing (1) is performed in the first neuron section at the planning level.
  • at the execution level, firings (2), (3), and (4) are performed in the neuron units to search for algorithms that can be conceived for solving the puzzle:
  • firing (2) is directed to the neuron unit having the procedure “raise straight up”,
  • firing (3) is directed to the neuron unit having the procedure “twist”, and
  • firing (4) is directed to the neuron unit having the procedure “rotate”.
  • a weak stimulus is given from the neuron unit at the planning level to the neuron units at the execution level over weakly coupled connections.
  • among the connection strengths back to the planning-level neuron unit, the connection strength from the neuron unit having the procedure “rotate” is the highest (strong coupling).
  • firing (6) is performed from the execution level to the planning level, and an algorithm of “rotate” is returned.
  • the most promising algorithm for solving the puzzle is thus found to be “rotate”.
  • the stimulus propagates to the “execute” neuron unit, and the “execute” neuron unit fires toward the “execute rotation” neuron unit of the execution level.
  • the predetermined procedure is then executed at the execution level, and the result (“disconnected” in the figure) is propagated back to the neuron unit at the planning level.
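The selection step in this exchange reduces to picking the candidate procedure with the strongest connection back to the planning level. The function name and the numeric strengths below are hypothetical illustrations; only the outcome, that “rotate” wins because its connection strength is highest, comes from the text.

```python
# Hypothetical sketch of the planning-level / execution-level selection in
# the puzzle example. The numeric connection strengths are ASSUMED values.
def select_algorithm(connection_strengths):
    """Return the procedure whose connection strength back to the planning
    level is the strongest (the strongly coupled candidate)."""
    return max(connection_strengths, key=connection_strengths.get)

strengths = {"raise straight up": 3, "twist": 4, "rotate": 9}
assert select_algorithm(strengths) == "rotate"
```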
  • the first neural network structure expresses a general-purpose or plan-level function, while the second neural network structure expresses an individual or executable function.
  • the neural network structure at the planning level can select an algorithm from the neural network structure at the execution level.
  • the conventional information processing system has a frame problem in which an algorithm suitable for the current situation cannot be selected from a plurality of algorithms and executed.
  • because a conventional information processing system would take an infinite amount of time if it considered everything that could happen, a frame is attached to a specific theme or range and processing is performed only within that frame.
  • the first neural network structure may be a master having a function of controlling one or more second neural network structures,
  • and the second neural network structure may be a slave that functions under the control of the first neural network structure.
  • by having one master neural network structure control other neural network structures, a larger neural network structure controlled through the hierarchical structure can be obtained.
  • a dual master-slave configuration may also be employed in which each of two neural network structures recognizes itself as a master and the partner as a slave. According to this, each can achieve the predetermined purpose of its own neural network structure while the two mutually control each other's neuron units.
  • one of the master slaves may be a conventional program instead of the neural network structure.
  • the program provides instinctive “unconscious” behavior as a built-in function, while a series of neuron units expressing a predetermined function provides “conscious” behavior as a function.
  • the learning function is a function for increasing the connection strength of synapse units in the neural network structure; that is, it increases the weighting coefficients of the synapse units that connect neuron units,
  • promoting strong coupling between neuron units or between neural network structures and making it easier to form a series of neuron units that expresses a predetermined function.
  • the associative function is a function that, when a series of neuron units cannot be formed and an output cannot be produced from the input alone, makes the output easier to obtain by receiving a stimulus (additional stimulus) from a different neural network structure or from a different series of neuron units.
  • the evaluation function is a function for evaluating the validity of the output information obtained by the associative function.
  • the learning function in this embodiment is that, even between two unrelated neural network structures, if there are neuron units that are excited (fired) at the same time, some relation between them is assumed, a synapse unit connecting both neuron units is formed, and the weighting coefficient of that synapse unit is increased. In this way, by connecting neuron units that happened to be excited simultaneously, it becomes possible to output what could not be output by one neural network structure alone.
  • when at least one neuron unit in a first series of neuron units expressing a predetermined function and at least one neuron unit in a second series of neuron units expressing a different predetermined function fire simultaneously,
  • the weighting coefficient of the synapse unit that connects the first series of neuron units and the second series of neuron units increases. Such a learning function operates in the same way between a neuron unit included in one neural network structure described in the above embodiment and a neuron unit included in a different neural network structure.
  • that is, the weighting coefficient of the synapse unit that connects a neuron unit included in a first neural network structure and a neuron unit included in a second, different neural network structure may be increased when the neuron units at both ends of the synapse unit fire simultaneously.
  • such a learning function promotes strong coupling between different series of neuron units or between different neural network structures, making it easy to form a series of neuron units that expresses a predetermined function and to tie neural network structures together. For example, in the human cerebrum, when an input-side neuron and an output-side neuron are excited simultaneously, the synaptic connection between them is said to be strengthened. By doing this, a new “discovery” becomes possible even where one system alone could not reach understanding; for example, knowledge about the physical relationship between the earth and the moon becomes related to the phenomenon of an apple falling.
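The cross-structure learning function described above can be sketched as follows. The step size 3 and cap 9 are assumptions chosen so that three co-firings reach the firing strength 9 (matching the three-stimulation setting earlier in the text); the class and unit names are illustrative, not from the source.

```python
# Sketch of the learning function between two unrelated structures: when
# neuron units in each fire within the same time window, a connecting
# synapse unit is created or strengthened.
# ASSUMPTIONS: step size 3 and cap 9 are illustrative values.
from collections import defaultdict

class CrossLearning:
    def __init__(self, step=3, cap=9):
        self.weights = defaultdict(int)       # (unit_a, unit_b) -> weight
        self.step, self.cap = step, cap

    def observe(self, fired_a, fired_b):
        """fired_a / fired_b: ids of neuron units that fired simultaneously
        in each of two otherwise unrelated neural network structures."""
        for a in fired_a:
            for b in fired_b:
                self.weights[a, b] = min(self.cap, self.weights[a, b] + self.step)

net = CrossLearning()
for _ in range(3):                            # repeated simultaneous excitation
    net.observe({"apple_falls"}, {"moon_orbits"})
assert net.weights["apple_falls", "moon_orbits"] == 9   # strong coupling formed
```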
  • the screen used in this verification and shown in this figure was created on an integrated development environment called Unity.
  • the neuron units in the concept network that joins the visual cortex and the auditory cortex regard neuron units that fire at the same time as representing the same concept, and integrate the attributes attached to those neuron units.
  • the upper part of the screen shows the visual cortex; from the left, “shape” neuron units corresponding to apple, mandarin orange, melon, persimmon, banana, persimmon, persimmon, carrot, radish, and eggplant are already formed and displayed.
  • the auditory cortex is also shown; “word” neuron units corresponding to apple, mandarin orange, melon, persimmon, banana, persimmon, persimmon, carrot, and radish are already formed and displayed.
  • many neuron units of the concept network that connects the visual cortex and the auditory cortex are displayed between the two.
  • the concept network prepares unconnected intermediate neuron units for linking equivalent or different concepts; triggered by the simultaneous firing of a plurality of neuron units belonging to different neural network structures, it is a mechanism that links those neuron units to form new concepts and promote learning.
  • in step 1, the infant is made to hear the word banana while being shown a banana, so that the neuron unit corresponding to the shape of the banana in the visual cortex and the neuron unit corresponding to the word banana in the auditory cortex are simultaneously excited (fired).
  • in step 2, one of the intermediate neuron units in the concept network has changed into a neuron unit corresponding to the concept of banana.
  • the input neurons are surrounded by an ellipse, and the output neurons are surrounded by a rectangular dotted line.
  • also in step 2, to teach the concept that the taste of a banana is sweet, the neuron unit corresponding to the shape of the banana in the visual cortex and the neuron unit corresponding to the concept of sweet, under the concept of taste, are simultaneously excited while the infant is shown a banana.
  • as a result, another of the intermediate neuron units in the concept network changes into a neuron unit corresponding to the concept of a sweet taste, and a synapse unit is formed between the neuron unit corresponding to the concept of a sweet taste and the neuron unit corresponding to the concept of banana.
  • the concept of banana is thus linked to the concept of a sweet taste in both directions.
  • In step 3, the infant is asked what a banana tastes like.
  • The neuron part corresponding to the word "banana" in the auditory cortex and the neuron part corresponding to taste are stimulated (excited).
  • Because a series of neuron parts connected by synapse parts, expressing that the taste of banana is sweet, has already been formed through the earlier stimulation of the visual cortex, the stimulus travels from the neuron part for the word "banana" in the auditory cortex, through the neuron part for the concept of banana, to the concept of sweet taste, and the answer that the taste of banana is sweet is obtained.
  • FIG. 19 shows the flow of the associative function.
  • The associative function receives, in addition to the input to one neural network structure, an input (additional stimulus) from a different neural network structure or series of neuron parts, making it easier to obtain an output. For example, given an image of a banana among food images, features such as a long, slightly curved, yellow-green food can be extracted; but if the series of neuron parts is only semi-excited and cannot yet output the concept of banana, inputting the concept of sweet taste as an additional stimulus completes a series of neuron parts that outputs the concept of banana.
  • Part (A) of this figure shows that, in addition to the input information given to a certain neural network structure, a series of neuron parts is about to be excited.
  • In the associative function, a series of neuron parts receives input from the output value produced by the firing of neuron parts belonging to a neural network structure other than the one to which the series itself belongs; the sum of the products of the output values and the weight coefficients of the synapse parts then reaches or exceeds the threshold, and a predetermined function is expressed.
  • Output information can thus be obtained by giving related information as an additional stimulus to a series of neuron parts in a semi-excited state that needs only a little more input to produce an output.
  • The evaluation function is a function for collating the output information obtained by the associative function with the input information, obtaining the accuracy of the output information, and extracting the grounds on which the output information was determined.
  • The excitation pattern of the neuron parts excited by receiving the input information, as in the neural network structure on the left side of this figure, is compared with the excitation pattern of the neuron parts traced in the reverse direction from the neuron part corresponding to the output information, as in the neural network structure on the right side, where the output was produced after receiving the additional stimulus. In the evaluation function these two patterns are compared, and the higher the degree of coincidence between the excited neuron parts, the higher the accuracy assigned to the output information.
  • The evaluation function applies to a series of neuron parts whose sum has reached the threshold by receiving input from the output value of the firing of neuron parts belonging to a neural network structure other than the one to which the series belongs.
  • The evaluation function compares the neuron parts that fire in the series when input is received only from the firing of neuron parts belonging to the structure to which the series itself belongs with the neuron parts that fire when input is also received from the firing of neuron parts belonging to another structure, and judges whether the proportion of coinciding fired neuron parts is equal to or greater than a predetermined value. With such an evaluation function, the validity of the output information produced by the associative function can be ensured.
  • All the neural network structures described above can be realized in broadly two ways.
  • One is to build a simulator in software on a von Neumann computer.
  • The other is to build hardware in which a neuron is modeled as an electronic circuit and a large number of such neurons are mounted.
  • The former is suitable for small-scale, experimental neural network structures; the latter for large-scale, full-scale ones.
  • An electronic circuit for realizing the above-described neural network structure in hardware includes arithmetic elements that form the neuron parts and synapse parts, storage elements that store the weight coefficients, and adder circuits that compute the sums.
  • The arithmetic elements constituting the neuron parts and synapse parts are multi-input, single-output semiconductor elements, and a large number of them are integrated to constitute the neural network structure.
  • The adder circuit that computes the sum is a circuit that adds up the weight coefficients of the synapse parts when the corresponding neuron parts fire.
  • The circuit can be configured from known circuits that add digital values expressed in binary 0s and 1s.
  • The storage element that stores a weight coefficient is a semiconductor element with the function of holding a value, and at least one exists for each synapse part.
  • This storage element may be implemented as a memristor element.
  • A storage element that can take not only the digital values 0 and 1 but a plurality of values can be formed from a single memristor element, which is a passive element, with the stored value held as an electrical resistance.
  • The stored value can be increased or decreased according to the accumulated current that has flowed through the element, and because the memristor element is non-volatile, the stored value is retained even when power is turned off.
  • An electronic circuit including the neural network structure can be provided by configuring a neural network structure including the loop structure described above in part of the many arithmetic elements constituting the neuron parts and synapse parts.
  • The electronic circuit may be a so-called neurochip: a circuit in which the neural network structure is mounted as a large number of electronic elements and wirings on a substrate of a semiconductor such as silicon.
  • The information processing system 100 includes: a neurochip 10, which is the above-described electronic circuit; a neuroboard 20 on which a plurality of neurochips 10 are mounted; an area functional unit 30, a logical grouping of a plurality of neuroboards 20; a back panel 60 that lets the plurality of area functional units 30 function in an integrated manner; an input unit 40 that accepts optical, acoustic, electrical, or physical-operation input from the outside and converts it into electric signals; and an output unit 50 that converts electric signals into optical, acoustic, electrical, or physical-operation output to the outside.
  • The area functional unit 30 is a logical unit comprising a plurality of neuroboards 20 for realizing a function.
  • Examples include areas with a speech analysis function or an image analysis function, field areas for words or shapes, an area for concepts, an area for grammar functions, and an area for decision making.
  • Since each area requires a large number of neurochips 10, it is realized with a plurality of neuroboards 20.
  • The input unit 40 is a device that receives as input any stimulus, such as light, sound, or an electrical or physical stimulus, given from outside the information processing system 100, and converts it into an electric signal.
  • The output unit 50 converts the electrical output from the neurochips 10 into light, sound, electrical, or physical output.
  • The area functional units 30 operate in response to external stimuli input from the input unit 40, and the result is expressed through the output unit 50.
  • Because of the loop structures formed in the neural network structure, the neuron parts in the information processing system 100 are easily excited and stimuli propagate easily between neuron parts, so that the system operates in an integrated manner.
  • The neural network structure according to the present invention can also be realized as a software simulator on a von Neumann computer.
  • In this simulator software, for a neural network structure having a plurality of neuron parts and synapse parts connecting them, a method of propagating firing through the neuron parts and synapse parts is implemented.
  • The method includes: causing the first neuron part to fire and propagating the firing to the second neuron part through a synapse part; the (N-1)th neuron part firing when the firing from the (N-2)th neuron part is propagated, and propagating to the Nth neuron part via a synapse part; and the Nth neuron part firing when the firing from the (N-1)th neuron part is propagated (N being a natural number of 3 or more), and propagating to the first neuron part via a synapse part. According to this, when a stimulus is received it is transmitted back to the same neuron parts repeatedly, so the stimulus continues to propagate, and simulator software and a method can be provided in which each neuron part in the neural network structure (loop structure) is re-excited at a certain time interval each time the continuously propagated stimulus reaches it.
  • This program may be a program for executing a simulation of the neural network structure on a computer.
  • This program includes a memory enhancement/degradation step of increasing or decreasing the weight coefficients of the synapse parts.
  • 10 Neurochip (electronic circuit)
  • 20 Neuroboard
  • 30 Area functional unit
  • 40 Input unit
  • 50 Output unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Mathematical Physics (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Image Analysis (AREA)

Abstract

The purpose of the present invention is to provide a neural network structure, and the like, that models the structure of the neural circuits of the human brain. Provided is a neural network structure which includes a plurality of neuron parts and a synapse part which links the plurality of neuron parts, and in which a second neuron part connected to a first neuron part via the synapse part fires in accordance with a sum found by adding, at least one time, the product of the output value of the firing or non-firing of the first neuron part and a weighting factor of the synapse part, wherein the neural network structure is provided with a loop structure in which an output value according to the firing of the first neuron part is propagated back to that first neuron part via the second neuron part and the synapse part.

Description

Neural network structure, electronic circuit, information processing system, method, and program
 The present invention relates to a neural network structure, an electronic circuit, an information processing system, a method, and a program.
 Nerve cells of the human brain (hereinafter, neurons) carry the functions by which humans recognize, judge, and remember. For example, when an event is input repeatedly, a neuron memorizes it, thereafter recognizes whether a new input is the same event, and, if it is, reacts by outputting a signal. Moreover, the more often the same information is input, the more easily it is memorized, the more readily the neuron reacts when that information is input again, and the more easily a signal is output.
 This is thought to be possible because of the structure of neurons. A neuron is connected to many other neurons via synapses; transmission from neuron to neuron corresponding to the input information takes place, and when the input information matches stored information, a signal is output to other neurons. The more often the same information is input, the stronger the connection between the particular neurons involved becomes. For this reason, when information that has been input repeatedly is input again, transmission from the relevant neurons occurs more easily, those neurons react, and signals are output to other neurons more readily.
 Attempts to realize human acts of recognition, judgment, and memory on a computer have long been made, and techniques for constructing neural networks on computers are known. A neural network models the function of neurons with mathematical formulas or circuits. A neuron has the function of emitting a pulse (firing) when the potential applied by input signals exceeds a threshold value. In a neural network, this function is realized with a nonlinear function, a capacitor, or the like. The synapses connecting neurons have propagation efficiencies that differ in time and space, and the connection relations between neurons and the propagation efficiency between them change through recognition, judgment, and memory. A neural network models this by weighting the output value of the nonlinear function, or the charge level of the capacitor, with a coupling strength (weight coefficient) and substituting it into the nonlinear function of the following neuron or adding it to its charge level.
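 As a concrete illustration of the threshold model just described, a minimal sketch (not taken from the patent; the function name, weights, and threshold are illustrative assumptions) of a neuron that fires when the weighted sum of its inputs reaches a threshold might look like this:

```python
def fires(inputs, weights, threshold):
    """Return 1 (fire) if the weighted sum of the input values reaches
    the threshold, else 0 (no fire).
    inputs:  0/1 output values of upstream neurons
    weights: coupling strengths (weight coefficients) of the synapses"""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Two active upstream neurons with weights 0.6 and 0.5 exceed a threshold of 1.0;
# a single one does not.
print(fires([1, 1, 0], [0.6, 0.5, 0.9], 1.0))  # -> 1
print(fires([1, 0, 0], [0.6, 0.5, 0.9], 1.0))  # -> 0
```

 Repeated learning would then correspond to adjusting the weights so that a previously sub-threshold input pattern comes to exceed the threshold.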
 Various models of neural networks have been proposed. For example, as described in Non-Patent Document 1, G. Edelman et al. devised an artificial brain system incorporating learning by Neural Darwinism. A representative model among current neural networks is the hierarchical (layered) neural network. A hierarchical neural network has an input layer containing a plurality of input-layer neurons that receive input signals from outside, an output layer containing a plurality of output-layer neurons that send output signals to the outside, and one or more intermediate layers, each containing a plurality of intermediate-layer neurons, between the input-layer and output-layer neurons.
 For example, Patent Document 1 discloses a learning system and method for a hierarchical neural network that realizes accurate and fast learning. This system selects, among the paths connected to each neuron of each map constituting each layer of the hierarchical neural network, the path with the highest degree of fit between the propagated signal and the coupling strength value, treats the other paths as sparse in units of the map, propagates the input signal through the hierarchical neural network in the forward direction to obtain an output signal, compares the obtained output signal with the target signal paired with the input signal, and adjusts the degree of fit between the signal propagated along the selected path and the coupling strength value up or down according to their degree of coincidence.
JP 2015-210747 A
 In the hierarchical neural network described above, the back-propagation method is known as a way of learning the weight coefficients of synaptic connections so as to obtain a high rate of correct answers. However, since arriving at correct answers with high accuracy requires training on a large amount of data, applying hierarchical neural networks to new fields is difficult and time-consuming.
 Moreover, even if a hierarchical neural network is trained and an algorithm is thereby formed, it cannot handle cases that fall outside that algorithm at all. The application domain of hierarchical neural networks is therefore limited to specific areas that individual algorithms can cover, such as games, image recognition, automated driving, and investment decisions (so-called weak AI). It is not possible to select and execute, from among multiple algorithms, the one suited to the situation at hand (the frame problem).
 Furthermore, in the hierarchical neural network described above, one neuron can be made to correspond to a concept, but that neuron cannot be given so-called meaning, such as the shape, properties, features, and name related to the concept (the symbol grounding problem).
 Furthermore, in the hierarchical neural network described above, excitation states (also called firing or stimulation) can be propagated one after another from the input of the whole system toward its output, but when the final output is not reached, no further result can be obtained.
 Furthermore, the hierarchical neural network described above cannot execute a series of procedures sequentially, coordinating timing and selecting subsequent steps according to whether each step was executed and what its result was. That is, procedural processing can be executed by a program but not by a conventional neural network.
 In view of these circumstances, the present invention has been devised with so-called "strong AI" in mind, and provides a neural network structure modeled on the mechanism of the neural circuits of the human brain, as well as an electronic circuit, an information processing system, a method, and a program including this neural network structure.
 In order to solve the above problem, there is provided a neural network structure having a plurality of neuron parts and synapse parts that connect the plurality of neuron parts, in which another neuron part connected to one neuron part via a synapse part fires according to a sum obtained by adding, one or more times, the product of the fired or unfired output value of the one neuron part and the weight coefficient of the synapse part, the neural network structure comprising a loop structure in which the output value produced by the firing of the one neuron part propagates back to the one neuron part itself via the other neuron part and a synapse part.
 According to this, because the neural network structure includes a loop structure, once a neuron part in the loop receives an output value, that is, a stimulus, by firing, the stimulus propagates around the loop one part after another. The loop can therefore be made to function so that the stimulus keeps propagating continuously within it, and so that each neuron part in the loop is excited repeatedly, at a fixed time interval, each time the circulating stimulus reaches it.
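 The circulating behavior of such a loop structure can be sketched as follows. This is an illustrative toy model, not the patent's implementation; it assumes every synapse weight in the loop is large enough that one upstream firing makes the downstream neuron fire on the next step.

```python
def simulate_loop(n_neurons, steps, start=0):
    """Propagate a firing stimulus around a ring of neuron parts.
    Returns, per step, the index of the neuron part that fired."""
    fired = start
    history = [fired]
    for _ in range(steps):
        fired = (fired + 1) % n_neurons  # stimulus moves to the next neuron in the loop
        history.append(fired)
    return history

# With 3 neuron parts, neuron 0 is re-excited every 3 steps:
print(simulate_loop(3, 6))  # -> [0, 1, 2, 0, 1, 2, 0]
```

 The fixed re-excitation interval (here, every three steps) is exactly the periodic behavior that the loop structure is intended to provide.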
 Furthermore, the structure may comprise the above neural network structure and another neural network structure that receives, as an input value, the output value produced when one neuron part fires as the stimulus propagates around the loop structure.
 According to this, in another neural network structure that is not part of the loop but receives stimuli from a neuron part in the loop, the excitement of its neuron parts can be raised each time the stimulus is received repeatedly.
 Furthermore, in the other neural network structure, which receives one or more inputs as propagation around the loop structure occurs one or more times, a series of neuron parts may be such that, by receiving inputs from the output values of firing, their sums reach the threshold and a predetermined function is expressed.
 According to this, in the other neural network structure, receiving the stimulus repeatedly gradually raises the excitement of the neuron parts; the number of neuron parts whose excitement exceeds the threshold and which fire increases, a series of neuron parts through which the stimulus propagates more widely is formed, and a predetermined function can be expressed. When a connected series of neuron parts is formed in the neural network structure and the predetermined function corresponding to it is expressed, a neuron part for a given concept is connected with neuron parts for related concepts, and this group of related concepts gives the neural network structure so-called "meaning".
 Furthermore, the structure may comprise a logical-product (AND) calculation unit that computes the logical product of one neuron part and one or more other neuron parts, a logical-sum (OR) calculation unit that computes the logical sum of one neuron part and one or more other neuron parts, and a logical-negation (NOT) calculation unit that computes the logical negation of one neuron part.
 According to this, providing AND and OR calculation units between neuron parts and a NOT calculation unit for a neuron part makes every logical operation on neuron parts possible.
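 One conventional way to realize such logical operations with threshold neurons is to choose weights and thresholds so that a neuron fires only under the desired input combination. The following is a textbook-style sketch offered purely for illustration (the function names and the particular weights are assumptions, not the patent's design):

```python
def threshold_neuron(inputs, weights, threshold):
    """Fire (1) when the weighted sum of 0/1 inputs reaches the threshold."""
    return 1 if sum(x * w for x, w in zip(inputs, weights)) >= threshold else 0

def and_part(a, b):
    # Both inputs are needed: 1 + 1 >= 2.
    return threshold_neuron([a, b], [1, 1], 2)

def or_part(a, b):
    # Either input suffices: 1 >= 1.
    return threshold_neuron([a, b], [1, 1], 1)

def not_part(a):
    # An inhibitory (negative) weight: fires only when the input neuron is silent.
    return threshold_neuron([a], [-1], 0)

print([and_part(a, b) for a in (0, 1) for b in (0, 1)])  # -> [0, 0, 0, 1]
print([or_part(a, b) for a in (0, 1) for b in (0, 1)])   # -> [0, 1, 1, 1]
print([not_part(a) for a in (0, 1)])                     # -> [1, 0]
```

 Since AND, OR, and NOT together are functionally complete, combining such units suffices to build any Boolean function of neuron outputs, which is the basis of the claim that all logical processing on neuron parts becomes possible.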
 Furthermore, a series of neuron parts expressing a higher-order function may be formed by any one, or a combination, of the AND, OR, and NOT calculation units.
 According to this, by operating on neuron parts, a series of neuron parts expressing more complex functions can be formed.
 Furthermore, a series of neuron parts expressing one function may be connected in advance to one or more series of neuron parts expressing other functions, and may fire when its sum reaches or exceeds the threshold.
 According to this, the firing of a "meaning", that is, a group of related concepts, creates a situation in which a higher-order "meaning", a wider group of concepts, fires more easily.
 Furthermore, the structure may comprise an oscillation unit having a first neuron part and an oscillation neuron part, the oscillation neuron part consisting of a second neuron part and a plurality of other neuron parts arranged so that the output value produced by the firing of the second neuron part propagates back to the second neuron part itself via the other neuron parts. The synapse part connecting the first neuron part to the second neuron part has a firing weight coefficient with which the second neuron part fires after a single summation; the synapse parts connecting the neuron parts within the oscillation neuron part have firing weight coefficients with which each neuron part fires after a single summation; and the synapse part connecting a neuron part within the oscillation neuron part back to the first neuron part has an unfired weight coefficient that is below the threshold, so that the first neuron part does not fire.
 According to this, a first neuron part that has once been excited by a stimulus can have its latent excitement sustained for a certain period.
 Furthermore, the structure may comprise a long-term memory unit composed of a series of neuron parts, and a short-term memory unit composed of a loop structure and one or more oscillation units.
 According to this, both long-term and short-term memory can be held by neuron parts.
 Furthermore, the structure may include a first neural network structure having a plurality of neuron parts as described above and a second neural network structure having a plurality of neuron parts as described above, and may comprise synapse parts connecting neuron parts included in the first neural network structure with neuron parts included in the second neural network structure.
 According to this, by providing synapse parts that connect neuron parts in different neural network structures, such as those holding long-term or short-term memory, cooperation between neural network structures of different systems becomes possible.
 Furthermore, the first neural network structure may be a master having the function of controlling one or more second neural network structures, and the second neural network structure may be a slave functioning under the control of the first neural network structure.
 According to this, one main neural network structure controls other neural network structures, making possible a larger neural network structure governed through a hierarchical organization.
 Furthermore, the first neural network structure may have general-purpose or planning functions, and the second neural network structure may have individual or executive functions.
 According to this, individual or executive functions can be expressed under a neural network structure that expresses general-purpose or planning functions.
 Furthermore, the weight coefficient of a synapse part connecting a neuron part in the first neural network structure with a neuron part in the second neural network structure may increase when the neuron parts at both of its ends fire simultaneously.
 According to this, strong coupling between different series of neuron parts and between different neural network structures is promoted, making it easier to form a series of neuron parts that expresses a predetermined function.
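 This rule, in which a synapse's weight coefficient grows when the neuron parts at both of its ends fire at the same time, is Hebbian in character. A minimal sketch might look like this (the learning rate and the upper cap are illustrative assumptions, not values given in the patent):

```python
def hebbian_update(weight, pre_fired, post_fired, rate=0.1, w_max=1.0):
    """Increase the synapse weight when both connected neuron parts
    fire at the same time; otherwise leave it unchanged."""
    if pre_fired and post_fired:
        weight = min(weight + rate, w_max)  # cap keeps the weight bounded
    return weight

w = 0.2
# Two simultaneous firings strengthen the link; a lone firing does not.
w = hebbian_update(w, True, True)
w = hebbian_update(w, True, False)
w = hebbian_update(w, True, True)
print(round(w, 2))  # -> 0.4
```

 Repeated co-firing thus gradually raises the weight until a single firing of one side can drive the other, which is how the structure "forms a series of neuron parts that expresses a predetermined function".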
 Furthermore, in the other neural network structure, which receives one or more inputs as propagation around the loop structure occurs one or more times, a series of neuron parts may reach its threshold sum, and express a predetermined function, by receiving input from the output value of the firing of neuron parts belonging to a neural network structure other than the one to which the series itself belongs.
 According to this, output information can be obtained by giving related information as an additional stimulus to a series of neuron parts in a semi-excited state that needs only a little more input to produce an output.
Furthermore, a series of neuron units whose sum has reached the threshold by receiving input from the output values produced by the firing of neuron units belonging to another neural network structure may be evaluated by comparing the neuron units that fired within the series when input was received only from the output values of firing neuron units belonging to its own neural network structure with the neuron units that fired within the series when input was also received from the output values of firing neuron units belonging to the other neural network structure, and determining whether the proportion of matching fired neuron units is equal to or greater than a predetermined value.
According to this, the validity of the output information produced by the associative function can be ensured.
Further, in another neural network structure, when at least one neuron unit in a first series of neuron units expressing a predetermined function and at least one neuron unit in a second series of neuron units expressing a different predetermined function fire simultaneously, the weight coefficient of the synapse unit connecting the first series of neuron units and the second series of neuron units may increase.
According to this, strong coupling between different series of neuron units and between different neural network structures is promoted, making it easier to link related neural network structures.
In order to solve the above problem, an electronic circuit is provided that includes the above neural network structure and comprises arithmetic elements constituting the neuron units and synapse units, storage elements storing the weight coefficients, and adder circuits computing the sums.
According to this, an electronic circuit capable of forming a neural network structure including a loop structure can be provided.
Further, the storage element may be implemented as a memristor element.
According to this, a storage element that can hold not merely a binary 0 or 1 but multiple values can be formed from a single memristor, which is a passive element. Moreover, the stored value can be used as an electrical resistance, and can be increased or decreased according to the accumulated current that has flowed through the element; and because a memristor's storage is non-volatile, the stored value is retained even when the power is turned off.
In order to solve the above problem, an information processing system is provided that comprises the above electronic circuit; an input unit that receives optical, acoustic, electrical, or physical-motion input from the outside and converts it into an electrical signal; and an output unit that converts an electrical signal into optical, acoustic, electrical, or physical-motion output, wherein the input unit inputs the electrical signal to a neuron unit included in the neural network structure and the output unit receives an electrical signal from a neuron unit included in the neural network structure.
According to this, so-called concept neuron units having visual, auditory, conceptual, or physical meanings corresponding to various external inputs are formed, and by connecting these neuron units with related neuron units, the concept neuron units acquire broader and richer meanings. This realizes a neural network structure grounded in real-world meaning and provides an information processing system whose neural network output can act on the outside world.
In order to solve the above problem, a method can be provided for propagating firing via neuron units and synapse units in a neural network structure having a plurality of neuron units and synapse units connecting the plurality of neuron units, the method including: a step in which a first neuron unit fires and propagates the firing to a second neuron unit via a synapse unit; a step in which an (N-1)-th neuron unit fires when the firing from the (N-2)-th neuron unit is propagated to it, and propagates the firing to an N-th neuron unit via a synapse unit; and a step in which the N-th neuron unit fires when the firing from the (N-1)-th neuron unit is propagated to it (N being a natural number of 3 or more), and propagates the firing to the first neuron unit via a synapse unit.
According to this, when a stimulus is received it is repeatedly transmitted back to the structure itself, so the method can make a stimulus continue to propagate within the neural network structure (loop structure), and can make the neuron units within the neural network structure (loop structure) become excited repeatedly at regular time intervals each time the continuously propagating stimulus arrives.
In order to solve the above problem, a program executed on a computer can be provided that includes instructions for executing all the steps of the above method.
According to this, a program can be provided that executes, on an ordinary von Neumann computer, a method capable of making a neural network structure including a loop structure function.
In order to solve the above problem, a program can be provided that executes a simulation of the above neural network structure on a computer, the program executing a storage step of storing weight coefficients, an addition step of computing sums, and a processing step of simulating, based on the stored weight coefficients and the computed sums, whether or not one neuron unit propagates firing to another neuron unit via a synapse unit.
According to this, a program can be provided that simulates a neural network structure including a loop structure on an ordinary von Neumann computer.
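As a hedged illustration of such a simulation program, the storage step, addition step, and processing step might be sketched as follows. The data layout, names, and values here are assumptions for illustration only, not the claimed implementation.

```python
# Sketch of one tick of the simulation: weights are the storage step,
# accumulating sums is the addition step, and the threshold comparison is
# the processing step deciding whether firing propagates. The threshold of 8
# follows the convention used in this specification; all else is assumed.

THRESHOLD = 8

def simulate_step(weights, fired):
    """weights: dict mapping (src, dst) -> weight coefficient (storage step).
    fired: set of neuron units that fired on the previous tick.
    Returns the set of neuron units that fire on this tick."""
    sums = {}
    for (src, dst), w in weights.items():         # addition step
        if src in fired:
            sums[dst] = sums.get(dst, 0) + 1 * w  # output value 1 times weight
    # processing step: a unit fires when its sum reaches the threshold
    return {dst for dst, s in sums.items() if s >= THRESHOLD}

# A three-unit loop: firing travels 1 -> 2 -> 3 -> 1
weights = {(1, 2): 9, (2, 3): 9, (3, 1): 9}
state = {1}
for _ in range(3):
    state = simulate_step(weights, state)
print(state)  # after three ticks the firing has returned to unit 1: {1}
```

The same step function also covers non-loop topologies: firing simply dies out when no downstream sum reaches the threshold.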
In order to solve the above problem, a neural network structure can be provided that has a plurality of neuron units and synapse units connecting the plurality of neuron units, in which another neuron unit connected to one neuron unit via a synapse unit fires according to a sum obtained by adding, one or more times, the product of the fired or unfired output value of the one neuron unit and the weight coefficient of the synapse unit, and in which, when an input is given to one neuron unit, that neuron unit fires and the output value of that firing propagates, via other neuron units and synapse units, back to the one neuron unit itself, causing it to fire.
According to this, a neural network structure can be provided in which a neuron unit fires upon receiving input, and the output value of that firing, that is, the stimulus, is repeatedly transmitted back to the neuron unit itself, so that the stimulus can propagate continuously.
As described above, according to the present invention, it is possible to provide a neural network structure that mimics the mechanism of the neural circuits of the human brain, as well as an electronic circuit including the neural network structure, an information processing system, a method, and a program.
(A) An explanatory diagram showing the brain's information-processing function by neurons and synapses; (B) a model diagram of a neural network structure.
In the neural network structure of the first embodiment according to the present invention: (A) a schematic diagram showing a loop structure; (B) a table showing the coupling strengths of the synapse units in the loop structure.
(A) A schematic diagram showing the propagation of excitation in a neural network structure without a loop structure; (B) a schematic diagram showing the propagation of excitation when a loop structure is present in the neural network structure of the first embodiment according to the present invention.
A schematic diagram explaining how meaning is formed by the connections of neuron units in the neural network structure of the first embodiment according to the present invention.
A schematic diagram explaining how the symbol grounding problem is solved in the neural network structure of the first embodiment according to the present invention.
In the neural network structure of the second embodiment according to the present invention: (A) a schematic diagram showing a logical product structure (AND circuit); (B) a logical sum structure (OR circuit); (C) a logical negation structure (NOT circuit); (D) a latch structure (Latch circuit); (E) a flip-flop structure (Flip-Flop circuit).
In the neural network structure of the second embodiment according to the present invention: (A) an example of a logical product structure; (B) an example of a logical sum structure; (C) an example of a logical negation structure; (D) an example of a flip-flop structure.
A schematic diagram showing an example of a latch structure in the neural network structure of the second embodiment according to the present invention.
In the neural network structure of the second embodiment according to the present invention: (A) a schematic diagram showing an example of an oscillation circuit; (B) a table showing the coupling strengths of the synapse units in the oscillation circuit.
A schematic diagram showing an example of a mechanism for strengthening the coupling strengths of synapse units in the neural network structure of the second embodiment according to the present invention.
A schematic diagram showing the propagation of excitation when a loop structure is present in the neural network structure of the second embodiment according to the present invention.
A schematic diagram showing the propagation of excitation in the neural network structure of the third embodiment according to the present invention.
A schematic diagram showing an example of solving a puzzle in the neural network structure of the third embodiment according to the present invention.
A schematic diagram showing a master-slave relationship in the neural network structure of the third embodiment according to the present invention.
A schematic diagram showing the configuration of the information processing system according to the present invention.
A schematic diagram showing the cooperation of area function units in the information processing system according to the present invention.
A schematic diagram showing functions in the neural network structure of the fourth embodiment according to the present invention.
A schematic diagram showing a learning function in the neural network structure of the fourth embodiment according to the present invention.
A schematic diagram showing an associative function in the neural network structure of the fourth embodiment according to the present invention.
A schematic diagram showing an evaluation function in the neural network structure of the fourth embodiment according to the present invention.
A schematic diagram showing the learning function in the neural network structure of the fourth embodiment according to the present invention, taking cooperation between brain areas as an example.
As used herein, the term "neural network structure" refers to an architecture configured to simulate biological neurons. A neural network structure means a structure that is roughly functionally equivalent to the neurons and synapses of a biological brain, that is, elements and the connections between elements. Therefore, electronic circuits and information processing systems including the neural network structure according to embodiments of the present invention can include various elements and electronic circuits that model biological neurons. Further, a computer including a neural network structure according to an embodiment of the present invention can include various processing elements and algorithms (including computer simulations) that model biological neurons.
As shown in FIG. 1, the neural network structure is an engineering model of the information-processing function of the brain, and has a structure in which many elements corresponding to nerve cells, called neuron units, are connected. The neural network structure includes a plurality of neuron units and a plurality of synapse units. Note that, for ease of viewing, the figure shows two neuron units connected to each other via synapse units; in reality, a large number of neuron units are interconnected via a large number of synapse units. Also, for convenience, one neuron unit is shown with inputs from three synapse units and outputs to three synapse units, but it goes without saying that the invention is not limited to this.
When one neuron unit receives inputs In1 to In3 (also called stimuli) from other neuron units via synapse units, it accumulates an electric potential inside itself. These inputs In1 to In3 are given asynchronously from a plurality of other neuron units. When the internal potential of the neuron unit exceeds a predetermined threshold, the neuron unit fires and generates outputs Out1 to Out3. A neuron unit that receives output Out2 as an input (input In2 in the figure) accumulates a potential internally, and the stimulus can then propagate further to the neuron units connected to it.
This neuron unit includes, for example, a capacitor and an operational amplifier forming an integrating circuit (not shown), and accumulates an internal potential by charging the capacitor with the current output from the synapse units. One neuron unit has many synapse units (three in the figure), and the sum of the potentials received from these many synapse units becomes the internal potential of the neuron unit. When the internal potential exceeds a predetermined threshold, the neuron unit outputs (fires) Out1 to Out3, and its internal potential is reset or decays over time. When a neuron unit frequently receives inputs In1 to In3, its internal potential stays close to the threshold; in such a state the unit fires easily even with a small input, so it can be said to be in an easily excitable state (a latent excitation state).
A similar integrating circuit corresponding to each synapse unit may be provided in the synapse unit itself, or at the neuron unit's receiving port from the synapse unit. Depending on the capacitance of this integrating circuit and its degree of charge, these circuits function as the coupling strength of the synapse unit (also called the weight coefficient or weighting). The product of a neuron unit's output value and a synapse unit's weight coefficient is then repeatedly added and summed, thereby affecting the degree and speed of accumulation of the receiving neuron unit's internal potential.
As described above, the neural network structure according to the present invention has a plurality of neuron units and synapse units connecting the plurality of neuron units, and another neuron unit connected to one neuron unit via a synapse unit fires according to a sum obtained by adding, one or more times, the product of the fired or unfired output value of the one neuron unit and the weight coefficient of the synapse unit. Moreover, a person skilled in the art could implement a structure including such processing elements as a software algorithm executed on a computer. For example, if the synapse units have weight coefficients w1 to w3, the function F indicating the internal potential of a neuron unit can be expressed by equation (1):
  F = In1*w1 + In2*w2 + In3*w3   ... (1)
    (where In1 to In3 are each 0 or 1)
Software including such an algorithm can be executed on a von Neumann computer.
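Equation (1) and the firing threshold can be sketched in a few lines of Python. This is only an illustrative sketch: the function names and example weights are assumptions, and the threshold of 8 follows the convention adopted later in this specification.

```python
# Illustrative sketch of equation (1). The neuron unit's internal potential F
# is the weighted sum of binary inputs; the unit fires when F reaches a
# threshold. Function names and example weights are assumptions.

def internal_potential(inputs, weights):
    """F = In1*w1 + In2*w2 + In3*w3, with each input 0 or 1."""
    return sum(i * w for i, w in zip(inputs, weights))

def fires(inputs, weights, threshold=8):
    """Return 1 (fired) when the internal potential reaches the threshold."""
    return 1 if internal_potential(inputs, weights) >= threshold else 0

print(fires([1, 1, 0], [4, 5, 3]))  # 4 + 5 = 9 >= 8, so it fires: 1
print(fires([1, 0, 0], [4, 5, 3]))  # 4 < 8, so it does not fire: 0
```

Repeated accumulation over time, as with the integrating circuits described above, would correspond to calling `internal_potential` on successive inputs and summing the results.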
<First embodiment>
The neural network structure of this embodiment will be described with reference to FIG. 2. This neural network structure includes a loop structure. A loop structure, as shown in the figure, is a structure in which the output value from the firing (1) of neuron unit 1 propagates back to neuron unit 1 itself via the other neuron units 2 to 10 and their synapse units. In this embodiment the loop structure is composed of ten neuron units, but the number is not limited in the present invention. These ten neuron units are connected in a loop via synapse units.
Neuron unit 1 becomes excited by some trigger, and firing (1) occurs. Neuron unit 2, receiving that firing (1), becomes excited, and firing (2) occurs. Similarly thereafter, neuron unit N becomes excited and firing (N) occurs. When firing (10) occurs in neuron unit 10, that firing (10) is input to neuron unit 1. Neuron unit 1, receiving firing (10), again becomes excited, and firing (1) occurs. Therefore, once any neuron unit in the loop structure enters an excited state, the excitation propagates from one unit to the next and repeats, so the excited state is maintained.
The human brain is known to have four loops: a motor loop, an oculomotor loop, a limbic loop, and a prefrontal loop. The loop structure in the neural network structure of the present invention is considered to correspond to the prefrontal loop. If the prefrontal loop corresponds to the beta wave in the electroencephalogram, it circulates about 20 times per second; since each neuron requires a transmission time of about 2 milliseconds, this is considered to correspond to about 25 neurons.
Note that, as shown in FIG. 2(B), the weight coefficient of each synapse unit of a neuron unit included in the loop structure is set, for the immediately following neuron unit only, to a weight coefficient ("9" in the table) that always causes that unit to become excited and fire. In this specification, a neuron unit is assumed to become excited when the sum obtained by adding, one or more times, the product of an output value and a synapse unit's weight coefficient reaches 8 or more.
Also, one neuron unit in the loop structure (neuron unit 1 in FIG. 2(A)) is connected to neuron units not included in the loop structure as well, and when it becomes excited and fires, the firing is also propagated to those neuron units outside the loop structure. Neuron unit 1 propagates firing (1) and the second-lap firing (11) to neuron unit 2, and likewise propagates firing (1), the second-lap firing (11), the third-lap firing (21), and so on, to the neuron units outside the loop structure. Therefore, the presence of the loop structure makes it possible to propagate firing to other neuron units not included in the loop structure so that they are excited repeatedly at regular time intervals.
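The behavior described above can be illustrated with a small sketch, assuming the ten loop units of FIG. 2, an inter-unit weight of 9 against the firing threshold of 8 (so each firing always drives the next unit), and an external stimulus emitted each time unit 1 fires. The function names are illustrative, not part of the specification.

```python
# Sketch of the ten-unit loop: once started, firing circulates indefinitely,
# and unit 1 delivers a stimulus outside the loop once per lap. The weight of
# 9 (>= threshold 8) guarantees that each successor fires; all names assumed.

LOOP = list(range(1, 11))  # neuron units 1..10 connected in a loop

def run_loop(start_unit, ticks):
    """Return per-unit firing counts and the number of stimuli that unit 1
    sent to neuron units outside the loop."""
    fire_counts = {u: 0 for u in LOOP}
    external_stimuli = 0
    current = start_unit
    for _ in range(ticks):
        fire_counts[current] += 1
        if current == 1:
            external_stimuli += 1     # firing also leaves the loop via unit 1
        current = LOOP[current % 10]  # weight 9 >= 8, so the next unit fires
    return fire_counts, external_stimuli

counts, external = run_loop(start_unit=1, ticks=30)
print(counts[1], external)  # three laps: unit 1 fired 3 times, 3 external stimuli
```

The regular interval between external stimuli (one per lap) is what lets neuron units outside the loop be excited repeatedly, as described above.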
As shown in FIG. 3(A), in a neural network structure without a loop structure, even if a neuron unit receives a stimulus once, the stimulus does not necessarily propagate to the subsequent neuron units in that structure, and firing may cease along the way. The figure shows that firing (1) and firing (2) occurred but propagation stopped there. In that case, since many neuron units never enter the excited state, this neural network structure has difficulty forming a series of neuron units in which an excited state, or a state close to it, is maintained.
On the other hand, as shown in FIG. 3(B), in a neural network structure with a loop structure, once a neuron unit in the loop becomes excited, it is excited repeatedly at regular time intervals, so firing can be propagated repeatedly, at regular intervals, to neuron units outside the loop structure. Although on the first lap the neuron units outside the loop may have stopped at firing (1) and firing (2), on the second lap firing (3) and firing (4) occur, and on the third lap firing (5) and firing (6) occur; the number of excited neuron units grows, and firing comes to propagate from one unit to the next. Accordingly, the number of neuron units in an excited state or an easily excitable state (latent excitation state) increases, making it easier to form a series of neuron units that expresses a predetermined function. Moreover, receiving such repeated stimulation strengthens the connections between neuron units, which also makes it easier to form a series of neuron units that expresses a predetermined function.
Thus, by including a loop structure in a neural network structure, when a neuron unit in the loop receives an output value from firing, that is, a stimulus, the stimulus propagates through the loop from one unit to the next. This provides a neural network structure that can function so that a stimulus keeps propagating continuously within the loop structure, and so that the neuron units within the loop are excited repeatedly at regular time intervals each time the stimulus in the loop is propagated.
The neural network structure of this embodiment also includes, in addition to the loop-structured neural network structure described above, another neural network structure that receives, as input values, the output values produced when a neuron unit fires as propagation travels around the loop. As shown in FIG. 3(B), in the neural network structure outside the loop, receiving repeated stimulation at regular time intervals increases the number of neuron units in an excited or easily excitable state, making it easier to express a predetermined function. In this way, in another neural network structure that is not included in the loop but receives stimuli from a neuron unit within it, the excitation of the neuron units can be raised each time the stimulus is received.
The increase in the number of neuron units in an excited or latent excitation state, making a predetermined function easier to express, is synonymous, in terms of brain information processing, with a conclusion becoming easier to derive as thinking deepens. In a neural network structure outside the loop structure, in which propagation around the loop has occurred one or more times and one or more inputs have been received, a series of neuron units receives input from the output values of firing, whereby the sum reaches or exceeds the threshold and a predetermined function is expressed.
The predetermined function of a series of neuron units whose sums have reached the threshold is, for example, that the neuron units, whose sums reach the threshold so that they fire without fail, enter a so-called connected state and form a group of concepts. With reference to FIG. 4, it will be explained that a series of neuron units in a firing (connected) state possesses "meaning." For example, in the neural network structure according to the present invention, when the concept "zebra" is stimulated a certain number of times or more, a "zebra" neuron unit is formed.
When the question "What is the pattern of a zebra?" and its answer are taught, the "zebra" and "stripes" neuron units, and likewise the "pattern" and "stripes" neuron units, become excited almost simultaneously, so the connection strengths of the synapses linking them increase. When this simultaneous excitation is repeated a number of times, the "stripes" neuron unit enters a connected state with the "zebra" and "pattern" neuron units. The concept of a "striped" pattern is thereby associated with the concept of "zebra" held by the "zebra" neuron unit and the concept of "pattern" held by the "pattern" neuron unit.
Similarly, when the question "What is the shape of a zebra?" and its answer are taught, the "zebra" and "horse shape" neuron units, and likewise the "shape" and "horse shape" neuron units, become excited almost simultaneously, so the connection strengths of the synapses linking them increase. When this simultaneous excitation is repeated a number of times, the "horse shape" neuron unit enters a connected state with the "zebra" and "shape" neuron units. The concept of "horse shape" is thereby associated with the concept of "zebra" held by the "zebra" neuron unit and the concept of "shape" held by the "shape" neuron unit.
The "zebra" neuron unit also enters a connected state with the "shimauma" neuron unit, a "word" neuron unit for the Japanese name formed by linking in sequence the Japanese "sound" neuron units for "shi", "ma", "u", and "ma". Likewise, the "stripes" neuron unit enters a connected state with the "shima" neuron unit, a "word" neuron unit for the Japanese name formed by linking the "sound" neuron units for "shi" and "ma".
Further, when it is taught more than a certain number of times that the superordinate concept of the "zebra" neuron unit is the concept "animal", the "zebra" neuron unit enters a connected state with the "animal" neuron unit holding the concept "animal". And when it is taught more than a certain number of times that subordinate concepts of the "animal" neuron unit include concepts such as "lion" and "giraffe", the "animal" neuron unit enters a connected state with the "lion" and "giraffe" neuron units. The "animal" neuron unit also enters a connected state with the "doubutsu" neuron unit, a "word" neuron unit for the Japanese name formed by linking in sequence the Japanese "sound" neuron units for "do", "u", "bu", and "tsu".
Thus, what the "zebra" neuron unit means is determined by the other neuron units with which it is in a connected state. Giving concepts and meanings through such connections between neuron units makes a neural network structure dramatically more efficient than the prior art. For example, to achieve the same thing with a conventional program on a von Neumann computer, one would need to check whether a word segmented at some midpoint is meaningful, or to store features such as each concept's "pattern" and "shape" in advance and search them. However, since there are many ways to segment words, the amount of computation becomes enormous, and storing features exhaustively requires a large storage area. Neither approach is very efficient.
In this way, in another neural network structure that is not part of the loop structure but receives stimuli from a neuron unit within it, repeated stimulation gradually raises the excitation of its neuron units, increases the number of neuron units whose excitation exceeds the threshold and which fire, and forms a series of neuron units through which the stimulus propagates more widely, so that a predetermined function can be expressed. Moreover, since a series of connected neuron units is formed in the neural network structure and a predetermined function corresponding to that series is expressed, a neuron unit for one concept becomes connected to neuron units for related concepts, and the resulting group of related concepts gives the neural network structure a so-called "meaning".
As described above, because a series of neuron units through which stimuli propagate widely can be formed and so-called "meaning" can be held within the neural network structure, the symbol grounding problem long debated with respect to "strong AI" can be solved. As shown in FIG. 5(A), a hierarchical neural network can associate one neuron unit with a concept, but is said to suffer from the symbol grounding problem: it cannot give that neuron unit so-called meaning, such as the shape, properties, features, and name associated with the concept.
Given a symbol, it must be linked to what it signifies, but conventional information processing systems have not solved this. For example, a human can understand that a zebra is a horse with stripes, but a conventional information processing system cannot. The neural network structure according to the present invention takes the position, as shown in FIG. 4 and FIG. 5(B), that the meaning of a neuron unit is determined by what it is connected to: if connections are formed between a neuron unit for a concept and the neuron units for its name, its shape, and other concepts, those connections themselves constitute the meaning of that concept neuron. As a result, by understanding the concepts and meanings of things, it becomes possible to build a strong artificial intelligence capable of truly natural conversation and of handling the things humans routinely deal with.
<Second embodiment>
The neural network structure of this embodiment will be described with reference to FIG. 6. The neural network structure of this embodiment is assumed to include the neural network structure of the above embodiment. It comprises neuron units forming a logical product (AND) structure that computes the AND of two neuron units, a logical sum (OR) structure that computes the OR of two neuron units, a logical negation (NOT) structure for a neuron unit, and latch and flip-flop structures capable of holding state. These structures are likewise expressions of predetermined functions by series of neuron units whose weighted sums reach the threshold.
FIG. 6(A) shows the logical product structure (AND circuit). In this structure, neuron unit 1 and neuron unit 3, and neuron unit 2 and neuron unit 3, are each connected by a synapse unit with a weight coefficient of 4; neuron unit 3 fires only when both neuron unit 1 and neuron unit 2 enter an excited state and fire. If either one does not fire, neuron unit 3 does not fire.
FIG. 6(B) shows the logical sum structure (OR circuit). Here the same pairs are connected by synapse units with a weight coefficient of 9; neuron unit 3 fires when either neuron unit 1 or neuron unit 2 enters an excited state and fires. If neither fires, neuron unit 3 does not fire.
FIG. 6(C) shows the logical negation structure (NOT circuit). In this structure, neuron unit 1 and neuron unit 2 are connected by a synapse unit with a weight coefficient of -9; when neuron unit 1 becomes excited and fires, neuron unit 2 becomes difficult to fire even if it receives stimuli from other neuron units. When neuron unit 1 does not fire, neuron unit 2 may fire if stimulated by other neuron units.
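The three logic structures of FIGS. 6(A) to 6(C) can be sketched with the threshold-neuron model. This is a hedged illustration: the weight coefficients (4 per synapse for AND, 9 for OR, -9 for NOT) follow the text, while the threshold of 8 and the auxiliary stimulus in the NOT case are assumptions consistent with them.

```python
# Threshold-neuron versions of the AND, OR, and NOT structures of FIG. 6.
# THRESHOLD = 8 is assumed; the weight coefficients follow the text.

THRESHOLD = 8

def neuron(weighted_inputs, threshold=THRESHOLD):
    """Fire when the sum over active weighted inputs reaches the threshold."""
    return sum(w for active, w in weighted_inputs if active) >= threshold

def and_structure(n1, n2):
    # FIG. 6(A): both weight-4 synapses must be active to reach 8.
    return neuron([(n1, 4), (n2, 4)])

def or_structure(n1, n2):
    # FIG. 6(B): either weight-9 synapse alone reaches 8.
    return neuron([(n1, 9), (n2, 9)])

def not_structure(n1, other_weight=9):
    # FIG. 6(C): neuron 2 receives weight -9 from neuron 1 plus some other
    # stimulus (assumed weight 9 here); a firing neuron 1 suppresses it.
    return neuron([(n1, -9), (True, other_weight)])

assert and_structure(True, True) and not and_structure(True, False)
assert or_structure(True, False) and not or_structure(False, False)
assert not_structure(False) and not not_structure(True)
```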
FIGS. 6(D) and 6(E) show a latch structure (latch circuit) and a flip-flop structure (flip-flop circuit) for holding values and states. In the latch structure, even if neuron unit 1 fires, neuron unit 2 does not fire unless firing is also transmitted to it from elsewhere. In the flip-flop structure, when neuron unit 1 fires, the excited state of neuron unit 3 is maintained, and when neuron unit 2 fires, the excited state of neuron unit 4 is maintained. Note that a neural network structure having the logical product, logical sum, and logical negation structures can form structures equivalent to the latch and flip-flop structures.
Thus, by providing logical product and logical sum operation units between neuron units and a logical negation operation unit for a neuron unit, all logical operations on neuron units become possible. Furthermore, combinations of the logical product, logical sum, and logical negation units can form series of neuron units that realize complex logical structures and higher-order functions, such as the latch and flip-flop structures.
FIGS. 7 and 8 show usage examples of each structure. FIG. 7(A) is an example of the logical product structure. When neuron unit A holding the concept "stripes" and neuron unit B holding the concept "horse" are input to the logical product structure, neuron unit C holding the concept "zebra" becomes excited. The "stripes" neuron unit alone or the "horse" neuron unit alone does not excite the "zebra" neuron unit.
FIG. 7(B) is an example of the logical sum structure. When either neuron unit D holding the sound "zebra" or neuron unit E holding the image of a zebra is input to the logical sum structure, neuron unit C holding the concept "zebra" becomes excited. Naturally, neuron unit C also becomes excited when both neuron unit D and neuron unit E are input.
FIG. 7(C) is an example of the logical negation structure. When neuron unit C holding the concept "zebra" is input, neuron unit F holding the concept "not a real horse" is not excited.
FIG. 7(D) is an example of the flip-flop structure. When neuron unit D holding the sound "zebra" is input, the excitation of neuron unit C holding the concept "zebra" is sustained; when neuron unit G holding the sound "horse" is input, the excitation of neuron unit B holding the concept "horse" is sustained.
FIG. 8 is an example of the latch structure, illustrating the addition of two numbers. Neuron unit 11 corresponding to the words "two and", neuron unit 12 corresponding to "three", and neuron unit 13 corresponding to "add (sum)" excite, respectively, neuron unit 21 for the numeric concept "2", neuron unit 22 for the numeric concept "3", and neuron unit 23 for the operation concept "add". When a timing signal is sent at the moment all of these neurons are excited together, neuron unit 3 in the subsequent stage, holding the numeric concept "5", becomes excited.
A series of neuron units expressing one function may also be connected in advance to one or more series of neuron units expressing other functions, and fire when the weighted sum reaches the threshold. For example, inputting into a logical product structure the "sound" neuron unit holding the sound "shima" (a series consisting of the "sound" neuron units "shi" and "ma") and the "sound" neuron unit holding the sound "uma" (a series consisting of the "sound" neuron units "u" and "ma") causes firing that can form a new "sound" neuron unit holding the sound "shimauma" (zebra). In this way, the firing of a "meaning", i.e., a group of related concepts, can create conditions under which a higher-order "meaning", a broader concept group, fires more easily.
The neural network structure of this embodiment may further include the oscillation circuit (oscillation unit) shown in FIG. 9. The oscillation circuit is composed of a first neuron unit 1, a second neuron unit 2, and another neuron unit 3. Although one other neuron unit 3 is shown in the figure, this is not limiting; there may be one or more. The oscillation circuit comprises neuron unit 1 and an oscillating neuron unit, consisting of neuron units 2 and 3, structured so that the output value produced when the second neuron unit 2 fires propagates back to neuron unit 2 itself via neuron unit 3.
The synapse unit linking neuron unit 1 to neuron unit 2 has a firing weight coefficient of 9, a connection strength at which neuron unit 2 fires from a single summation. The synapse units linking the neuron units within the oscillating neuron unit, i.e., neuron units 2 and 3, likewise have a firing weight coefficient of 9, so that each fires from a single summation. The synapse units linking neuron units 2 and 3 within the oscillating neuron unit back to neuron unit 1 have a non-firing weight coefficient of 4, a connection strength at which a single summation remains below the threshold and the first neuron unit does not fire.
When neuron unit 1 receives a stimulus (1) from outside, it always fires (2) toward neuron unit 2 and does not fire toward neuron unit 3. Neuron unit 2, having received firing (2), i.e., stimulus (2), fires (3) toward neuron unit 3, since the synapse linking it in that direction has the weight coefficient 9 that causes firing from a single summation. It also fires (3) toward neuron unit 1, but because the synapse toward neuron unit 1 carries the non-firing weight coefficient 4, whose single summation does not cause firing, this alone does not put neuron unit 1 into an excited state.
Neuron unit 3, having received firing (3), i.e., stimulus (3), fires (4) toward neuron unit 2 with weight coefficient 9 and toward neuron unit 1 with weight coefficient 4. Neuron unit 2, receiving stimulus (4), then fires (5) toward neuron units 1 and 3, while neuron unit 1, receiving stimulus (4), is still not excited by this alone.
Thereafter, firings (6), (7), (8), and so on proceed between the neuron units in the same way. Consequently, once a stimulus reaches neuron unit 1, neuron units 2 and 3 fire at and stimulate each other repeatedly, so that one of them is always in an excited state. And since neuron units 2 and 3 stimulate neuron unit 1 with the non-firing weight coefficient, neuron unit 1, while never reaching an excited state, is kept in an easily excited state (a potential excited state).
That is, this neural network structure includes an oscillation unit comprising a first neuron unit and an oscillating neuron unit, the latter consisting of a second neuron unit and one or more other neuron units and structured so that the output value from the firing of the second neuron unit propagates back to the second neuron unit itself via the other neuron units. The synapse unit from the first to the second neuron unit has a firing weight coefficient at which the second neuron unit fires from a single summation; the synapse units linking the neuron units within the oscillating neuron unit have firing weight coefficients at which they fire from a single summation; and the synapse units from the neuron units within the oscillating neuron unit back to the first neuron unit have a non-firing weight coefficient whose summation remains below the threshold, so the first neuron unit does not fire. This makes it possible to keep the first neuron unit, once excited by a stimulus, in a potential excited state, i.e., an easily excited state, for a certain period.
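The oscillation dynamics described above can be sketched as a discrete-time simulation. The update rule below is an illustrative assumption, not part of the patent text; the weights follow the description (9 on the synapses 1→2, 2→3, and 3→2; 4 on the synapses 2→1 and 3→1), and the threshold of 8 is assumed from the surrounding embodiments.

```python
# Time-stepped sketch of the oscillation circuit of FIG. 9 (assumed model).

THRESHOLD = 8

def step(f1, f2, f3):
    """One discrete time step: each neuron fires if its weighted input sum
    from the neurons that fired on the previous step reaches the threshold."""
    n1 = 4 * f2 + 4 * f3 >= THRESHOLD  # only weak (weight-4) inputs
    n2 = 9 * f1 + 9 * f3 >= THRESHOLD
    n3 = 9 * f2 >= THRESHOLD
    return n1, n2, n3

state = (True, False, False)  # an external stimulus fires neuron unit 1 at t=0
trace = []
for _ in range(6):
    state = step(*state)
    trace.append(state)

# Neuron units 2 and 3 now fire alternately, while neuron unit 1 never
# reaches the threshold again (it only ever receives one weight-4 input per
# step) and so stays in the "potential excited state" described above.
```

Note that if neuron units 2 and 3 ever fired on the same step, their two weight-4 inputs would sum to 8 and fire neuron unit 1; in this circuit they alternate, which is what keeps neuron unit 1 merely potentially excited.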
By keeping a neuron unit in an easily excited state for a certain period in this way, the neural network structure can provide a short-term memory unit capable of storing events and states in its neuron units for that period. Furthermore, combining the oscillation circuit with a loop structure allows the fixed period over which the oscillation circuit oscillates to be repeated, enabling short-term memory lasting several tens of seconds and leading to the thinking described later.
The neural network structure of this embodiment may further include a mechanism for strengthening the connection strength of synapse units. As shown in FIG. 10, this strengthening mechanism has a connection-strength conversion table, and when the input-side neuron unit 1 and the output-side neuron unit 2 are excited simultaneously within a certain time, it strengthens the synaptic connection between them. A neuron in the human brain, when an event is input repeatedly, memorizes that event, thereafter recognizes whether an input is the same event, and outputs a signal in response when it is. And the more the same information is repeated, the easier it becomes to memorize.
This strengthening mechanism models that brain function: the input-side neuron unit 1 and the output-side neuron unit 2 being excited simultaneously within a certain time corresponds to an event being input repeatedly, and memorization becoming easier as the same information repeats corresponds to the connection strength being reinforced so that the units become excited and fire more easily. Long-term memory can thus be formed by a series of neuron units whose synapse units carry such reinforced connection strengths and which readily enter the excited state. Whereas in human brain information processing long-term memory is formed after roughly seven stimulations, the conversion table in this example is set so that long-term memory forms after three. With a larger count, long-term memory forms less readily, but once formed is hard to forget, and the probability of forming a memory in error decreases. With a smaller count, even a formed memory is easily forgotten and erroneous memories become more likely, but long-term memory forms easily. The setting here of three is not limiting.
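The strengthening mechanism of FIG. 10 can be sketched as follows. The text states only that simultaneous excitation within a certain time strengthens the synapse and that three repetitions form a long-term memory in this example; the concrete conversion-table values and the time window below are assumptions for illustration.

```python
# Hedged sketch of the FIG. 10 strengthening mechanism (assumed values).

WINDOW = 1.0                      # co-excitation window (assumed units)
TABLE = {0: 4, 1: 5, 2: 7, 3: 9}  # co-excitation count -> weight (assumed)

class Synapse:
    def __init__(self):
        self.count = 0            # number of observed co-excitations

    def co_excite(self, t_pre, t_post):
        """Strengthen when the input- and output-side neuron units are
        excited within the window of each other."""
        if abs(t_pre - t_post) <= WINDOW:
            self.count = min(self.count + 1, 3)

    @property
    def weight(self):
        return TABLE[self.count]

s = Synapse()
for t in (0.0, 10.0, 20.0):       # three repetitions of the same event
    s.co_excite(t, t + 0.5)
assert s.weight == 9              # long-term: fires the post-neuron in one sum
```

Raising the number of table entries before the maximum weight models the "larger count" regime discussed above, in which long-term memory forms less readily but more reliably.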
Thus, this neural network structure includes a long-term memory unit composed of a series of neuron units, and a short-term memory unit composed of a loop structure and one or more oscillation circuits, allowing the neuron units to hold both long-term and short-term memories. The human brain is said to possess sensory memory in addition to short-term and long-term memory; in this neural network structure, sensory memory can be realized as an input buffer in the input unit through which information enters the neural network structure.
With reference to FIG. 11, the content described for FIG. 3(B) is now explained in more detail. The neural network structure shown in FIG. 11 comprises a loop structure and another neural network structure that receives as input values the output values fired by neuron units as signals propagate around the loop. The synapse units within the loop structure have weight coefficients of 8 or more, producing an excited state from a single summation, so all of the loop's neuron units are strongly coupled (strong connections). Neuron units 1 to 6 in the other neural network structure each include the oscillation circuit described above. The neuron units in the loop structure and neuron units 1 to 6 in the other neural network structure are weakly coupled by synapse units whose weight coefficients require multiple summations to produce an excited state. It is further assumed that neuron units 1 and 2, 2 and 3, 4 and 5, and 6 and 7 are strongly coupled, and the rest weakly coupled.
First, a strong stimulus at or above the threshold is applied to neuron unit 1 of the other neural network structure through a strongly coupled synapse. Neuron unit 1 then becomes excited and fires toward neuron unit 2, which also fires, and the initial stimulus propagates as far as neuron unit 3. However, because the firing timings of neuron units 1 and 3 do not coincide, neuron unit 4 does not fire. Since neuron units 1 to 3 have oscillation circuits, they hold a potential excited state for a certain period.
Meanwhile, a weak stimulus below the threshold arrives from a neuron unit in the loop structure through the weakly coupled synapses. This stimulus is transmitted to neuron units 1 to 6; because neuron units 1 to 3 are maintaining a potential excited state, even a weak stimulus makes neuron units 1 and 3 fire simultaneously. Neuron unit 4, although connected only by weakly coupled synapse units, is then stimulated simultaneously by the loop-structure neuron unit, neuron unit 1, and neuron unit 3, so it becomes excited and fires, and the firing propagates as far as neuron unit 5. However, because the firing timings of neuron units 3 and 5 do not coincide, neuron unit 6 does not fire. Since neuron units 1 to 6 have oscillation circuits, they hold a potential excited state for a certain period.
Again during that period, a weak stimulus below the threshold arrives from a neuron unit in the loop structure through the weakly coupled synapses. The stimulus is transmitted to neuron units 1 to 6; because neuron units 1 to 5 are maintaining a potential excited state, even a weak stimulus makes neuron units 3 and 5 fire simultaneously. Neuron unit 6, although connected only by weakly coupled synapse units, is then stimulated simultaneously by the loop-structure neuron unit, neuron unit 3, and neuron unit 5, so it becomes excited and fires, and the firing propagates to neuron unit 7, the last neuron unit reached in the other neural network structure.
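The coincidence effect just described can be sketched minimally: a neuron unit connected only by weak synapses cannot be fired by any single weak stimulus, but simultaneous weak stimuli (here from the loop structure, neuron unit 1, and neuron unit 3) sum past the threshold. The weak weight of 3 and the threshold of 8 are assumptions consistent with the weight coefficients used elsewhere in the embodiments.

```python
# Minimal coincidence sketch (assumed weak weight and threshold).

THRESHOLD = 8
WEAK = 3  # assumed sub-threshold weight of one weakly coupled synapse

def input_sum(active_sources, weight=WEAK):
    """Weighted input to a neuron unit from simultaneously firing sources."""
    return len(active_sources) * weight

assert input_sum(["loop"]) < THRESHOLD               # one weak input: no firing
assert input_sum(["loop", "n1", "n3"]) >= THRESHOLD  # coincidence: fires
```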
Thus, without the loop structure, the stimulus would stop at neuron units 1 and 2, and a series of neuron units expressing a predetermined function either could not be formed or would take a long time to form; with the loop structure, or with the loop structure combined with the oscillation circuits, such a series forms far more readily. Moreover, with a loop structure, giving a single input to a neuron unit causes it to fire, and the output value from that firing propagates via other neuron units and synapse units back to that neuron unit itself, causing it to fire again.
 In this way, because the neural network structure contains a loop structure, once a neuron unit in the loop receives an output value produced by firing, that is, a stimulus, the stimulus propagates through the loop from one unit to the next. The loop can therefore be made to function so that the stimulus keeps propagating continuously within it, and so that each neuron unit in the loop is excited repeatedly, at regular time intervals, every time the circulating stimulus passes. Likewise, an input applied to one neuron unit makes it fire, and because the output value of that firing, that is, the stimulus, is transmitted back to the unit itself repeatedly, the stimulus can be propagated continuously.
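 The circulating propagation described above can be illustrated with a minimal sketch: a small ring of units whose strong couplings (weight equal to the firing threshold) fire the next unit in a single summation, so one initial stimulus keeps traveling around the loop. The unit count, weight, and threshold values here are illustrative assumptions, not taken from the specification.

```python
# Toy model of a loop structure: each neuron unit fires when its weighted
# input reaches the threshold, and its output stimulates the next unit in
# the ring, so a single initial firing keeps circulating indefinitely.
THRESHOLD = 8          # assumed firing threshold
LOOP_WEIGHT = 8        # strong coupling: one summation reaches the threshold

def step(fired, n_units):
    """Advance the ring one time step: each fired unit stimulates its successor."""
    nxt = set()
    for i in fired:
        if LOOP_WEIGHT >= THRESHOLD:      # strong coupling fires in one step
            nxt.add((i + 1) % n_units)
    return nxt

fired = {0}                               # one initial stimulus to unit 0
history = []
for _ in range(6):
    history.append(sorted(fired))
    fired = step(fired, n_units=4)

print(history)   # the stimulus travels around the 4-unit loop without dying out
```

Each circulation of the stimulus corresponds to the periodic re-excitation of the loop's neuron units described in the text.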
 In the neural network structure, the growth of the series of neuron units in the excited or latently excited state, which makes a predetermined function easier to express, corresponds in brain information processing to the way a conclusion becomes easier to reach as thought deepens; the structure can therefore be regarded as thinking in a manner similar to a human being.
<Third embodiment>
 With reference to FIG. 12, the neural network structure of this embodiment is described; it includes the neural network structure of the above embodiment. The structure of this embodiment comprises a first neural network structure, a second neural network structure, and a synapse unit that links a neuron unit included in the first neural network structure with a neuron unit included in the second neural network structure. That is, the structure contains two or more different neural network structures together with synapse units that connect them.
 The neural network structure shown in FIG. 12 comprises a loop structure, a first neural network structure that receives as an input value the output value emitted when a neuron unit fires as the stimulus propagates around the loop, and a second neural network structure containing neuron units connected by synapse units to neuron units in the first neural network structure.
 The synapse units within the loop structure have weight coefficients of 8 or more, so that a single summation along the loop is enough to reach the excited state, and all the neuron units forming the loop are connected to their neighbors with strong coupling strength (strong coupling). Neuron units 1 to 10 in the first neural network structure each include the oscillation circuit described above. The neuron unit in the loop structure and neuron units 2 to 9 in the first neural network structure are weakly coupled through synapse units whose weight coefficients require the summation of several inputs to reach the excited state.
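 The distinction between strong and weak coupling can be sketched as follows, using the threshold of 8 mentioned above; the weak weight of 3 is an illustrative assumption. A strongly coupled synapse reaches the threshold in a single summation, while weakly coupled inputs fire the unit only when several arrive simultaneously.

```python
# Illustrative sketch: a neuron unit fires when the sum of
# (input output value x synapse weight) reaches its threshold.
THRESHOLD = 8

def fires(weighted_inputs):
    """Return True if the summed stimulus reaches the firing threshold."""
    return sum(weighted_inputs) >= THRESHOLD

# Strong coupling: a single synapse with weight >= 8 fires the unit at once.
print(fires([8]))          # True

# Weak coupling (assumed weight 3): one stimulus alone is sub-threshold...
print(fires([3]))          # False

# ...but three weakly coupled units firing simultaneously (e.g. the loop unit
# plus two latently excited neighbours) push the sum over the threshold.
print(fires([3, 3, 3]))    # True
```

This is exactly the coincidence mechanism used in the walkthrough below: a weak stimulus fires a unit only when it arrives together with firings from latently excited neighbours.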
 It is further assumed that neuron units 1 and 2, and neuron units 9 and 10, are strongly coupled, while the remaining connections are weak. The neuron units in the first neural network structure and those in the second neural network structure are strongly coupled. The second neural network structure contains procedures 1 to 4, each consisting of several neuron units that are all strongly coupled. Procedures 1 to 4 are therefore already-formed series of neuron units expressing predetermined functions, and each carries a memory or meaning.
 First, a strongly coupled stimulus is applied to neuron unit 1 of the first neural network structure. Neuron unit 1 then becomes excited and fires toward neuron unit 2, and the initial stimulus propagates as far as the neuron units in procedure 1 of the second neural network structure. Because the neuron units in procedure 1 are all strongly coupled, the stimulus propagates onward to neuron unit 3 in the first neural network structure. However, neuron unit 2 and neuron unit 3 fire at different times, so neuron unit 4 does not fire. Since neuron units 2 and 3 include oscillation circuits, they hold a latent excited state for a fixed period.
 In the meantime, weakly coupled stimuli arrive from the neuron unit in the loop structure. They are transmitted to neuron units 2 to 9, and because neuron units 2 and 3 remain in a latent excited state, they fire simultaneously even though the stimulus is weakly coupled. Neuron unit 4, although connected only by weakly coupled synapse units, then receives stimuli from the loop-structure neuron unit, neuron unit 2, and neuron unit 3 at the same time, so it becomes excited and fires. When neuron unit 4 fires, the stimulus propagates to the neuron units in procedure 2 of the second neural network structure, and because those units are all strongly coupled, it propagates onward to neuron unit 5 in the first neural network structure. However, neuron unit 4 and neuron unit 5 fire at different times, so neuron unit 6 does not fire. Since neuron units 2 to 9 include oscillation circuits, they hold a latent excited state for a fixed period.
 During that period, further weakly coupled stimuli arrive from the neuron unit in the loop structure. They are transmitted to neuron units 2 to 9, and because neuron units 4 and 5 remain in a latent excited state, they fire simultaneously even though the stimulus is weakly coupled. Neuron unit 6, although connected only by weakly coupled synapse units, then receives stimuli from the loop-structure neuron unit, neuron unit 4, and neuron unit 5 at the same time, so it becomes excited and fires. When neuron unit 6 fires, the stimulus propagates to the neuron units in procedure 3 of the second neural network structure, and because those units are all strongly coupled, it propagates onward to neuron unit 7 in the first neural network structure. However, neuron unit 6 and neuron unit 7 fire at different times, so neuron unit 8 does not fire. Since neuron units 2 to 9 include oscillation circuits, they hold a latent excited state for a fixed period.
 During that period, further weakly coupled stimuli arrive from the neuron unit in the loop structure. They are transmitted to neuron units 2 to 9, and because neuron units 6 and 7 remain in a latent excited state, they fire simultaneously even though the stimulus is weakly coupled. Neuron unit 8, although connected only by weakly coupled synapse units, then receives stimuli from the loop-structure neuron unit, neuron unit 6, and neuron unit 7 at the same time, so it becomes excited and fires. When neuron unit 8 fires, the stimulus propagates to the neuron units in procedure 4 of the second neural network structure, and because those units are all strongly coupled, it propagates onward to neuron unit 9 in the first neural network structure. Neuron unit 9 then fires as well, and the stimulus propagates to neuron unit 10, the last unit reached in the first neural network structure.
 In this way, because the neural network structure contains a loop structure, once a neuron unit in the loop receives an output value produced by firing, that is, a stimulus, the stimulus propagates through the loop from one unit to the next, so the loop can be made to function so that the stimulus keeps propagating continuously within it. Furthermore, by providing synapse units that connect neuron units belonging to different neural network structures in which long-term memory, short-term memory, and the like are held, cooperation between neural network structures of different systems becomes possible.
 For example, this neural network structure is well suited to executing procedural processing. In FIG. 12 the first neural network structure is called the plan-level neural network structure and the second the execution-level neural network structure; with the structure shown in the figure, detailed actions (procedures) can be carried out while following a larger plan.
 With reference to FIG. 13, procedural processing is described using the example of solving a puzzle; the loop structure is omitted from the figure. First, in the first plan-level neuron unit, the intention to solve the puzzle produces firing (1). When firing (1) occurs, firings (2), (3), and (4) are performed to search among the execution-level neuron units for algorithms that might solve the puzzle. For example, firing (2) is directed to a neuron unit holding the procedure "lift straight up", firing (3) to a neuron unit holding the procedure "twist at an angle", and firing (4) to a neuron unit holding the procedure "rotate".
 Weakly coupled stimuli are given from the plan-level neuron units to the execution-level neuron units. However, among the return connections from the "lift straight up", "twist at an angle", and "rotate" neuron units back to the plan-level response neuron unit, the coupling strength from the "rotate" neuron unit is the highest (strong coupling). The plan-level response neuron unit is therefore most likely to receive its input from the "rotate" neuron unit. Accordingly, firing (6) is performed from the execution level to the plan level, and the algorithm "rotate" is returned as the answer. The plan-level neuron unit has thereby selected "rotate" as the most promising algorithm for solving the puzzle.
 Thereafter, at the plan level, the stimulus propagates to the "execute" neuron unit, and the "execute" neuron unit fires toward the execution-level "perform rotate" neuron unit. A predetermined procedure is then carried out at the execution level, and the result ("it came off" in the figure) propagates back to the plan-level neuron unit.
 In this way, the first neural network structure has a generic or planning function and the second neural network structure has a specific or executive function, so that specific or executive functions can be expressed under a neural network structure that expresses generic or planning functions.
 This example showed that the plan-level neural network structure can select an algorithm from the execution-level neural network structure. Conventional information processing systems are said to suffer from the frame problem: they cannot select, from among multiple algorithms, the one suited to the situation at hand and execute it. Because considering everything that could possibly happen would take unlimited time, a conventional system fits a frame around a specific theme or scope and processes only within that frame.
 However, when attempting to cope with every event in the real world, the possibilities that must be sifted through are innumerable, so the extraction stage itself takes unlimited time. The present invention starts from the view that even humans do not act only after considering everything: multiple frames are prepared in advance, and an associative neural network structure is used to search for the frame that best matches the real-world situation, so that the appropriate frame can be extracted almost instantaneously. With this neural network structure, an information processing system can be produced that answers within realistic time and has strong artificial-intelligence functionality suitable for practical use.
 As shown in FIG. 14(A), the first neural network structure may be a master having the function of controlling one or more second neural network structures, and each second neural network structure may be a slave functioning under the control of the first. In this arrangement, one main neural network structure controls the others, yielding a larger neural network structure governed through a hierarchical arrangement.
 As shown in FIG. 14(B), a dual master-slave configuration is also possible, in which each of two neural network structures regards itself as the master and the other as the slave. Each structure can then achieve the predetermined purpose of its own neural network structure while controlling the neuron units of the other.
 As shown in FIG. 14(C), one side of the master-slave pair may be a conventional program rather than a neural network structure. The program provides instinctive "unconscious" behavior built in as code, while a series of neuron units expressing a predetermined function provides innate "conscious" behavior as a function.
<Fourth embodiment>
 With reference to FIGS. 17 to 21, functions that use the neural network structure of the present invention (a learning function, an associative function, and an evaluation function) are described. As shown in FIG. 17, the learning function is a function for strengthening the coupling of synapse units in the neural network structure: it increases the weight coefficients of the synapse units linking neuron units, promotes strong coupling between neuron units or between neural network structures, and thereby makes it easier to form a series of neuron units expressing a predetermined function. The associative function is a function for making an output easier to obtain when the input to a neural network structure cannot, by itself, form a series of neuron units and produce an output: a stimulus (an additional stimulus) is received from a different neural network structure or from a different series of neuron units. The evaluation function is a function for evaluating the validity of the output information obtained by the associative function.
 For example, as shown in FIG. 18, under the conventional approach to learning, two unrelated neural networks never become connected. With the learning function of this embodiment, however, even for two unrelated neural networks, if a neuron unit in each becomes excited (fires) at the same time, the two units are regarded as somehow related: a synapse unit linking them is formed and its weight coefficient is increased. By linking neuron units that happen to fire simultaneously, outputs become possible that a single neural network structure could not have produced on its own. Strict simultaneity of firing is not required; it is sufficient that at least one neuron firing in succession within a series of neuron units expressing a predetermined function and at least one neuron firing in succession within a different series of neuron units become excited in the same period.
 That is, in another neural network structure, when at least one neuron unit in a first series of neuron units expressing a predetermined function and at least one neuron unit in a second series of neuron units expressing a different predetermined function fire simultaneously, the weight coefficient of the synapse unit linking the first series and the second series increases. This learning function applies equally between a neuron unit contained in one neural network structure and a neuron unit contained in a different one, as described in the above embodiments: the weight coefficient of a synapse unit linking a neuron unit in a first neural network structure with a neuron unit in a second, different neural network structure may be increased when the neuron units at both ends of that synapse unit fire simultaneously.
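 The simultaneous-firing rule above resembles a Hebbian update and can be sketched as follows; the initial weight, the increment, and the firing log are illustrative assumptions, not values from the specification.

```python
# Hebbian-style sketch of the learning function: when the neuron units at both
# ends of a synapse fire in the same period, the synapse weight is increased.

def update_weight(weight, pre_fired, post_fired, increment=1):
    """Increase the coupling weight only on simultaneous firing."""
    if pre_fired and post_fired:
        return weight + increment
    return weight

w = 0                                   # initially no effective coupling
firing_log = [(True, True), (True, False), (True, True), (False, True)]
for pre, post in firing_log:
    w = update_weight(w, pre, post)

print(w)   # the weight grew only on the two coincident firings
```

Repeated coincidences thus drive an initially weak (or absent) connection toward the strong coupling needed to form a function-expressing series.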
 Such a learning function promotes strong coupling between different series of neuron units and between different neural network structures, making it easier to form a series of neuron units expressing a predetermined function and easier to establish relations between neural network structures. In the human cerebrum, for example, it is said that when an input-side neuron and an output-side neuron are excited simultaneously, the synaptic connection between them is strengthened. By tying together pieces of knowledge previously thought to be unrelated, new "discoveries" become possible that could not be reached within a single system, much as knowledge of the physical relationship between the earth and the moon came to be associated with the event of an apple falling.
 In this connection, verification was carried out using cooperation between areas of the human brain as the subject (see FIG. 21). The screen used for this verification and shown in the figure was created on the Unity integrated development environment. This example demonstrates that, by treating units that fire simultaneously as representing the same concept, the neuron units of a concept network joined to the visual cortex and the auditory cortex integrate the attributes attached to the neuron units within the concept network. The upper part of the screen shows the visual cortex, in which shape neuron units corresponding, from the left, to apple, mandarin orange, melon, strawberry, banana, grape, lemon, carrot, daikon radish, and eggplant have already been formed and are displayed. Similarly, the lower part of the screen shows the auditory cortex, in which word neuron units corresponding, from the left, to the same ten items have already been formed and are displayed. Many neuron units of the concept network joined to the visual and auditory cortices are displayed between the two.
 A concept network is a mechanism in which unconnected intermediate neuron units are prepared in advance for tying together equivalent or different concepts; when multiple neuron units belonging to different neural network structures fire simultaneously, these neuron units are linked, forming a new concept and promoting learning.
 First, procedure 1, indicated by the thick short-dashed line in the figure, is described. In procedure 1, just as an infant is shown a banana while hearing the word "banana", the visual-cortex neuron unit corresponding to the shape of a banana and the auditory-cortex neuron unit corresponding to the word "banana" are excited (fired) simultaneously. One of the intermediate neuron units in the concept network then changes into a neuron unit corresponding to the concept "banana". In the figure, input neurons are enclosed in ellipses and output neurons in dotted rectangles.
 Next, procedure 2, indicated by the thick long-dashed line in the figure, is described. In procedure 2, just as an infant is shown a banana while being taught the concept that bananas taste sweet, the visual-cortex neuron unit corresponding to the shape of a banana and the neuron unit corresponding to the concept "sweet" under the concept "taste" are excited (fired) simultaneously. One of the intermediate neuron units in the concept network then changes into a neuron unit corresponding to the concept "sweet taste", a synapse unit is formed between the "sweet taste" concept unit and the "banana" concept unit, and the concept "banana" is linked bidirectionally to the concept "sweet taste".
 Next, procedure 3, indicated by the thick dash-dotted line in the figure, is described. In procedure 3, just as an infant is asked "What does a banana taste like?", the auditory-cortex neuron unit corresponding to the word "banana" and the neuron unit corresponding to "taste" are stimulated and excited (fired). Because the visual stimulation has already formed a series of neuron units linked by synapse units expressing "a banana tastes sweet", the path runs from the auditory-cortex "banana" word unit through the "banana" concept unit to the concept "sweet taste", and the answer that a banana tastes sweet is obtained.
 Thus, this verification confirmed that, in the concept area, synapse units are formed between shape neuron units and word neuron units, connecting them so that concept neurons can be formed. It also confirmed that, by treating simultaneous firing as representing the same concept, the attributes attached to the neuron units of the concept network connected to the visual and auditory cortices are integrated (attribute integration).
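 As a rough sketch of the concept-network behavior verified above, simultaneous firing can be modeled as creating a bidirectional association, so that procedure 3 amounts to following stored links. The data structure and labels are assumptions for illustration, not the actual Unity implementation.

```python
# Toy model of the concept network: simultaneous firing of two units is taken
# to mean "same or related concept", so the two are linked bidirectionally.
links = {}

def co_fire(a, b):
    """Record a bidirectional association between two simultaneously firing units."""
    links.setdefault(a, set()).add(b)
    links.setdefault(b, set()).add(a)

# Procedure 1: banana shape (visual) fires with the word "banana" (auditory);
# an intermediate unit becomes the concept "banana" linked to both.
co_fire("shape:banana", "concept:banana")
co_fire("word:banana", "concept:banana")

# Procedure 2: the banana shape fires with "sweet", attaching the attribute.
co_fire("concept:banana", "concept:sweet")

# Procedure 3: from the word "banana", follow the links to answer the
# question "what does a banana taste like?".
reachable = links["word:banana"]
answer = any("concept:sweet" in links[c] for c in reachable)
print(answer)   # True: the path word -> concept -> sweet exists
```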
 FIG. 19 shows the flow of the associative function. In addition to the input given to one neural network structure, the associative function receives an input (an additional stimulus) from a different neural network structure or a different series of neuron units, making an output easier to obtain. For example, suppose that, given the image of a banana among food images, features such as "an elongated, slightly curved, yellow-green food" can be extracted, but the series of neuron units remains in a semi-excited state and cannot output the concept "banana". By additionally inputting the concept "sweet taste" as a stimulus, a series of neuron units that outputs the concept "banana" is formed. Panel (A) of the figure shows input information attempting to excite a series of neuron units in a neural network structure. Panel (B) shows that although the input stimulus propagates through the neuron units to some extent, it does not reach the neuron unit corresponding to the output information, which does not become excited, so no output information is obtained. Panel (C) shows that when a neuron unit within such a series receives an additional stimulus from a neuron unit outside the system, the stimulus does propagate as far as the neuron unit corresponding to the output information, and the output information is obtained.
 In this way, under the associative function, a series of neuron units receives input from the output values produced by the firing of neuron units belonging to a neural network structure other than the one to which the series itself belongs, so that the sum of the products of the output values and the synapse-unit weight coefficients reaches the threshold and the predetermined function is expressed. Output information can thus be obtained from a series of neuron units in a semi-excited state, just short of producing an output, by supplying related information as an additional stimulus.
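 A minimal sketch of this threshold mechanism follows; the threshold and the individual weights are illustrative assumptions. The in-structure inputs alone sum below the threshold, and the additional stimulus from another structure completes the sum.

```python
# Associative-function sketch: an output unit expresses its function only when
# the sum of (output value x synapse weight) over its inputs reaches the threshold.
THRESHOLD = 8

def output_obtained(stimuli):
    """stimuli: list of (output_value, weight) pairs arriving at the output unit."""
    return sum(v * w for v, w in stimuli) >= THRESHOLD

internal = [(1, 3), (1, 4)]            # inputs from the unit's own structure: sum 7
print(output_obtained(internal))       # False: semi-excited, no output yet

# An additional stimulus (e.g. a "sweet taste" unit of another structure)
# contributes 1 x 2 and pushes the total to 9, over the threshold.
assisted = internal + [(1, 2)]
print(output_obtained(assisted))       # True: the output information is obtained
```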
 However, it is necessary to evaluate whether the accuracy of the output information obtained in this way is high, and the evaluation function is used for that purpose. As shown in FIG. 20, the evaluation function checks the output information obtained by the associative function against the input information, determines the accuracy of the output information, and extracts the reason the output was judged correct. For example, the excitation pattern of the neuron units excited by the input information (the neural network structure on the left of the figure) is compared with the excitation pattern obtained by tracing backwards from the neuron unit corresponding to the output information produced under the additional stimulus (the structure on the right of the figure). The higher the degree of coincidence between the excited neuron units, the higher the evaluated accuracy of the output information.
 That is, the evaluation function operates on a series of neuron units whose sum has reached the threshold through the associative function, i.e. by receiving input from the output values of firing neuron units belonging to another neural network structure. It compares the neuron units that fired within the series when input was received only from the firing of neuron units belonging to the series' own neural network structure with the neuron units that fired when input was also received from the firing of neuron units belonging to a structure to which the series does not belong, and evaluates the result by judging whether the match ratio of fired neuron units is equal to or greater than a predetermined value. Having such an evaluation function ensures the validity of the output information produced by the associative function.
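The comparison step can be sketched as a set operation. This is a hedged illustration under assumed semantics: the feature labels, the 0.75 acceptance value, and the direction of the ratio (traced-back firings confirmed by the input-only pattern) are all assumptions for the sake of the example.

```python
def match_ratio(fired_by_input, fired_traced_back):
    """Fraction of traced-back firings that also fired from the input alone."""
    if not fired_traced_back:
        return 0.0
    return len(fired_by_input & fired_traced_back) / len(fired_traced_back)

def evaluate(fired_by_input, fired_traced_back, required=0.75):
    """Accept the output only if the match ratio reaches the predetermined value."""
    return match_ratio(fired_by_input, fired_traced_back) >= required

# Excitation pattern produced by the input information alone ...
input_pattern = {"elongated", "curved", "yellow-green"}
# ... versus the pattern traced backwards from the "banana" output neuron.
traced_pattern = {"elongated", "curved", "yellow-green", "sweet"}

assert match_ratio(input_pattern, traced_pattern) == 0.75
assert evaluate(input_pattern, traced_pattern)
```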
 All the neural network structures described above can be realized in two broad ways. One is to build a simulator in software on a von Neumann computer; the other is to model the neuron as an electronic circuit and build hardware on which a large number of such circuits are mounted. The former is suited to small-scale, experimental neural network structures, the latter to large-scale, full-fledged ones.
 An electronic circuit for realizing the above neural network structure in hardware comprises arithmetic elements forming the neuron units and synapse units, storage elements storing the weight coefficients, and an adder circuit computing the sum. The arithmetic elements forming the neuron and synapse units are multi-input, single-output semiconductor elements, and integrating a large number of them constitutes a neural network structure. The adder circuit adds the weight coefficients of the synapse units when a neuron unit fires, and can be built from known circuits that add digital values expressed in binary (0 and 1).
 A storage element storing a weight coefficient is a semiconductor element capable of holding a value; at least one exists for each synapse unit. This storage element may be implemented as a memristor. With a memristor, a storage element that can take not merely the digital values 0 and 1 but multiple values can be formed from a single passive element. The stored value can be used as an electrical resistance, it can be increased or decreased according to the accumulated current passed through the element, and, because a memristor's storage is non-volatile, the value is retained even when power is cut off.
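The behavior attributed to the memristor-backed weight store can be sketched abstractly. This is a behavioral model only, not a device equation: the class name, the conductance range, and the linear charge-to-value mapping are illustrative assumptions.

```python
class MemristorWeight:
    """Behavioral sketch of a memristor weight: an analog value shifted by
    accumulated current, clamped to the device range, retained without power."""

    def __init__(self, value=0.5, lo=0.0, hi=1.0):
        self.value = value          # multi-valued, not just 0 or 1
        self.lo, self.hi = lo, hi   # physical conductance limits

    def apply_current(self, charge):
        """Accumulated charge shifts the stored value up or down, with saturation."""
        self.value = min(self.hi, max(self.lo, self.value + charge))

w = MemristorWeight(0.5)
w.apply_current(+0.2)   # strengthening current pulse
w.apply_current(-0.1)   # weakening pulse
assert abs(w.value - 0.6) < 1e-9
w.apply_current(+1.0)   # saturates at the device maximum
assert w.value == 1.0
```

Non-volatility is implicit in the model: the value persists until another current pulse changes it, mirroring how the stored resistance survives power-down.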
 By configuring neural network structures, including the loop structure described above, in part of the many arithmetic elements forming the neuron and synapse units, an electronic circuit containing such a neural network structure can be provided. The electronic circuit may be a so-called neurochip: a circuit in which a neural network structure is implemented by many electronic elements and wires on a substrate made of a semiconductor such as silicon.
 As shown in FIG. 15, the information processing system 100 comprises: neurochips 10, which are the electronic circuits described above; neuroboards 20, each carrying a plurality of neurochips 10; area function units 30, each a logical grouping of a plurality of neuroboards 20; a back panel 60 that lets the plurality of area function units 30 work together in an integrated manner; an input unit 40 that accepts optical, acoustic, electrical, or physical-motion input from outside and converts it into electrical signals; and an output unit 50 that converts electrical signals into optical, acoustic, electrical, or physical-motion output to the outside.
 One human level is said to be roughly 10 billion neurons per system, and the neuroboard 20 is used to further raise the degree of integration of the neurochips 10 forming the neuron units. An area function unit 30 is a logical grouping of a plurality of neuroboards 20 that realizes a particular function. As shown in FIG. 16, for speech and images, for example, there are various areas: areas with speech analysis or image analysis functions, areas for words and shapes, areas for concepts, areas for grammatical functions, and areas for decision-making functions. Since each area requires a large number of neurochips 10, it is realized using a plurality of neuroboards 20.
 The input unit 40 is a device that receives any stimulus applied from outside the information processing system 100, whether optical, acoustic, electrical, or physical, and converts it into an electrical signal; examples include a camera, a microphone, and a sensor. Conversely, the output unit 50 converts the signals output electrically by the neurochips 10 into optical, acoustic, electrical, or physical output; examples include a display, a speaker, communication equipment, and an actuator.
 In the information processing system 100, the relevant area function units 30 operate in response to external stimuli entering through the input unit 40, and the system reacts through the output unit 50. Throughout these processes, the system is integrated by the loop structures formed in the neural network structure, so the neuron units within the information processing system 100 are easily excited and stimuli propagate readily between them.
 In this way, so-called concept neuron units carrying visual, auditory, concrete, abstract, or physical meaning corresponding to various external inputs are formed, and by connecting these neuron units to related neuron units, the concept neuron units acquire broader, richer meaning. It is thus possible to provide an information processing system 100 that realizes a neural network structure grounded in real-world meaning and lets the output of the neural network structure act on the outside world.
 It is also possible to build the neural network structure according to the present invention as a software simulator on a von Neumann computer. In a neural network structure having a plurality of neuron units and synapse units connecting them, this simulator software implements a method of propagating firing through the neuron units and synapse units. Specifically, the simulator software includes:
 a step in which the first neuron unit fires and propagates the firing to the second neuron unit via a synapse unit;
 a step in which the (N-1)-th neuron unit fires when the firing from the (N-2)-th neuron unit is propagated to it, and propagates the firing to the N-th neuron unit via a synapse unit; and
 a step in which the N-th neuron unit fires when the firing from the (N-1)-th neuron unit is propagated to it (N being a natural number of 3 or more), and propagates the firing to the first neuron unit via a synapse unit.
 With this arrangement, a received stimulus is repeatedly transmitted back to the structure itself, so the simulator software and method can keep a stimulus propagating continuously within the neural network structure (loop structure) and make the neuron units within it fire repeatedly, at fixed time intervals, each time the circulating stimulus reaches them.
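The three propagation steps above can be sketched as a ring of N neuron units in which firing the first unit travels through each synapse in turn and arrives back at the first unit itself. The discrete time-step model and function names are illustrative assumptions.

```python
def propagate_ring(n, steps):
    """Simulate firing in a ring of n neuron units; return the 0-based index
    of the unit excited at each time step."""
    fired = []
    current = 0                      # the first neuron unit fires
    for _ in range(steps):
        fired.append(current)
        current = (current + 1) % n  # synapse to the next unit; unit N wraps back to unit 1
    return fired

# With N = 3, the stimulus keeps circulating: unit 1 -> 2 -> 3 -> 1 -> ...
assert propagate_ring(3, 7) == [0, 1, 2, 0, 1, 2, 0]
```

Each unit fires once per lap of the ring, which is the "fixed time interval" behavior the loop structure is described as producing.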
 It is also possible to provide a program, executed on a computer, containing instructions for carrying out the above steps. This makes it possible to operate a neural network structure containing a loop structure on an ordinary von Neumann computer.
 This program may also be a program that runs a simulation of the neural network structure on a computer. Such a program executes:
 a memory strengthening/decay step that increases or decreases the weight coefficient of a synapse unit;
 an addition step that computes the sum obtained by adding, one or more times, the product of the firing or non-firing output value of one neuron unit and the weight coefficient of a synapse unit; and
 a processing step that, based on the stored weight coefficients and the computed sum, simulates whether one neuron unit does or does not propagate firing to another neuron unit via a synapse unit.
 This makes it possible to provide a program that simulates a neural network structure containing a loop structure on an ordinary von Neumann computer.
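The three program steps can be sketched as follows. The function names, the weight range, and the threshold are assumptions made for illustration; the sketch shows only the division of work between the strengthening/decay step, the addition step, and the processing step.

```python
def strengthen_or_decay(weight, delta, lo=0.0, hi=1.0):
    """Memory strengthening (delta > 0) or decay (delta < 0) of a synapse weight."""
    return min(hi, max(lo, weight + delta))

def weighted_sum(outputs, weights):
    """Addition step: sum of firing (1) / non-firing (0) outputs times weights."""
    return sum(o * w for o, w in zip(outputs, weights))

def propagates(outputs, weights, threshold=1.0):
    """Processing step: does the downstream neuron unit fire?"""
    return weighted_sum(outputs, weights) >= threshold

weights = [0.5, 0.4]
assert not propagates([1, 1], weights)               # 0.9 < 1.0: no propagation
weights[0] = strengthen_or_decay(weights[0], +0.2)   # strengthening raises the weight
assert propagates([1, 1], weights)                   # 1.1 >= 1.0: firing propagates
```

Running the strengthening step before re-evaluating the sum reproduces, in miniature, how repeated use of a synapse makes propagation easier over time.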
 The present invention is not limited to the illustrated embodiments and can be implemented in any configuration that does not depart from the matters recited in the claims. That is, although the present invention has been particularly shown and described with respect to specific embodiments, those skilled in the art can make various modifications to the embodiments described above, in quantity and in other details of configuration, without departing from the technical spirit and scope of the present invention.
 100  Information processing system
 10   Neurochip (electronic circuit)
 20   Neuroboard
 30   Area function unit
 40   Input unit
 50   Output unit

Claims (22)

  1.  A neural network structure comprising a plurality of neuron units and synapse units connecting the plurality of neuron units, in which another neuron unit connected to one neuron unit via a synapse unit fires according to a sum obtained by adding, one or more times, the product of the firing or non-firing output value of the one neuron unit and the weight coefficient of the synapse unit,
      the neural network structure comprising a loop structure in which the output value produced by the firing of the one neuron unit propagates back to the one neuron unit itself via the other neuron unit and the synapse units.
  2.  A neural network structure comprising:
      the neural network structure according to claim 1; and
      another neural network structure that receives, as an input value, the output value output when the one neuron unit fires as a result of propagation through the loop structure.
  3.  The neural network structure according to claim 2, wherein, in the other neural network structure, which receives one or more inputs as propagation through the loop structure occurs one or more times, a series of the neuron units receives the inputs from output values produced by firing so that the sum becomes equal to or greater than a threshold, and expresses a predetermined function.
  4.  The neural network structure according to claim 3, comprising:
      a logical-product calculation unit that calculates the logical product of one neuron unit and one or more other neuron units;
      a logical-sum calculation unit that calculates the logical sum of one neuron unit and one or more other neuron units; and
      a logical-negation calculation unit that calculates the logical negation of one neuron unit.
  5.  The neural network structure according to claim 4, wherein a series of the neuron units expressing a higher-order function is formed by any one or a combination of the logical-product calculation unit, the logical-sum calculation unit, and the logical-negation calculation unit.
  6.  The neural network structure according to claim 3 or 5, wherein the series of neuron units expressing one function is connected to one or more series of neuron units expressing other functions, and fires when the sum becomes equal to or greater than the threshold.
  7.  The neural network structure according to claim 6, comprising an oscillation unit having a first neuron unit and an oscillating neuron unit, the oscillating neuron unit being composed of a second neuron unit and a plurality of other neuron units and structured so that the output value produced by the firing of the second neuron unit propagates back to the second neuron unit itself via the plurality of other neuron units, wherein:
      the synapse unit connecting the first neuron unit to the second neuron unit has a firing weight coefficient with which the second neuron unit fires on a single sum;
      the synapse units connecting the neuron units within the oscillating neuron unit to one another have firing weight coefficients with which those neuron units fire on a single sum; and
      the synapse unit connecting a neuron unit within the oscillating neuron unit back to the first neuron unit has a non-firing weight coefficient, below the threshold, with which the first neuron unit does not fire.
  8.  The neural network structure according to claim 7, comprising:
      a long-term memory unit composed of the series of neuron units; and
      a short-term memory unit composed of the loop structure and one or more of the oscillation units.
  9.  A neural network structure comprising a first neural network structure according to claim 8 having a plurality of the neuron units and a second neural network structure according to claim 8 having a plurality of the neuron units,
      the neural network structure comprising a synapse unit connecting a neuron unit included in the first neural network structure with a neuron unit included in the second neural network structure.
  10.  The neural network structure according to claim 9, wherein the first neural network structure is a master having a function of controlling one or more of the second neural network structures, and
      the second neural network structure is a slave functioning under the control of the first neural network structure.
  11.  The neural network structure according to claim 10, wherein the first neural network structure has a general-purpose or planning function, and
      the second neural network structure has an individual or executive function.
  12.  The neural network structure according to any one of claims 9 to 11, wherein the weight coefficient of the synapse unit connecting a neuron unit included in the first neural network structure with a neuron unit included in the second neural network structure increases when the neuron units at both ends connected by that synapse unit fire simultaneously.
  13.  The neural network structure according to any one of claims 2 to 12, wherein, in the other neural network structure, which receives one or more inputs as propagation through the loop structure occurs one or more times, a series of the neuron units receives input from output values produced by the firing of neuron units belonging to a neural network structure other than the one to which the series belongs, so that the sum becomes equal to or greater than a threshold and a predetermined function is expressed.
  14.  The neural network structure according to claim 13, wherein the series of neuron units whose sum has become equal to or greater than the threshold by receiving input from output values produced by the firing of neuron units belonging to a neural network structure other than the one to which the series belongs is evaluated by comparing:
      the neuron units that fired within the series when input was received only from output values produced by the firing of neuron units belonging to the neural network structure to which the series belongs, with
      the neuron units that fired within the series when input was received from output values produced by the firing of neuron units belonging to the neural network structure to which the series does not belong,
      and judging whether the match ratio of the fired neuron units is equal to or greater than a predetermined value.
  15.  The neural network structure according to any one of claims 3 to 14, wherein, in the other neural network structure, when at least one neuron unit in a first series of neuron units expressing a predetermined function and at least one neuron unit in a second series of neuron units expressing a predetermined function different from that of the first series fire simultaneously, the weight coefficient of the synapse unit connecting the first series of neuron units with the second series of neuron units increases.
  16.  An electronic circuit containing the neural network structure according to any one of claims 1 to 15, comprising:
      arithmetic elements constituting the neuron units and the synapse units;
      storage elements storing the weight coefficients; and
      an adder circuit computing the sum.
  17.  The electronic circuit according to claim 16, wherein the storage elements are implemented as memristors.
  18.  An information processing system comprising:
      the electronic circuit according to claim 16 or 17;
      an input unit that accepts optical, acoustic, electrical, or physical-motion input from outside and converts it into electrical signals; and
      an output unit that converts electrical signals into optical, acoustic, electrical, or physical-motion output to the outside,
      wherein the input unit inputs electrical signals to the neuron units included in the neural network structure, and
      the output unit receives electrical signals from the neuron units included in the neural network structure.
  19.  A method of propagating firing via the neuron units and the synapse units in a neural network structure having a plurality of neuron units and synapse units connecting the plurality of neuron units, the method comprising:
      a step in which a first neuron unit fires and propagates the firing to a second neuron unit via a synapse unit;
      a step in which the (N-1)-th neuron unit fires when the firing from the (N-2)-th neuron unit is propagated to it, and propagates the firing to the N-th neuron unit via a synapse unit; and
      a step in which the N-th neuron unit fires when the firing from the (N-1)-th neuron unit is propagated to it (N being a natural number of 3 or more), and propagates the firing to the first neuron unit via a synapse unit.
  20.  A program executed on a computer, containing instructions for performing all the steps of the method according to claim 19.
  21.  A program that runs on a computer a simulation of the neural network structure according to any one of claims 1 to 15, the program executing:
      a memory strengthening/decay step that increases or decreases the weight coefficients;
      an addition step that computes the sum; and
      a processing step that, based on the stored weight coefficients and the computed sum, simulates whether one neuron unit does or does not propagate firing to another neuron unit via a synapse unit.
  22.  A neural network structure comprising a plurality of neuron units and synapse units connecting the plurality of neuron units, in which another neuron unit connected to one neuron unit via a synapse unit fires according to a sum obtained by adding, one or more times, the product of the firing or non-firing output value of the one neuron unit and the weight coefficient of the synapse unit,
      wherein, when one input is given to the one neuron unit, the one neuron unit fires, and the output value produced by that firing propagates back to the one neuron unit itself via the other neuron unit and the synapse units, causing it to fire.
PCT/JP2018/010011 2017-03-28 2018-03-14 Neural network structure, electronic circuit, information processing system, method, and program WO2018180499A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019509227A JP6712399B2 (en) 2017-03-28 2018-03-14 Neural network structure, electronic circuit, information processing system, method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-062748 2017-03-28
JP2017062748 2017-03-28

Publications (1)

Publication Number Publication Date
WO2018180499A1 true WO2018180499A1 (en) 2018-10-04

Family

ID=63675435

Country Status (2)

Country Link
JP (1) JP6712399B2 (en)
WO (1) WO2018180499A1 (en)


Also Published As

Publication number Publication date
JP6712399B2 (en) 2020-06-24
JPWO2018180499A1 (en) 2020-02-13

Similar Documents

Publication Publication Date Title
US11055609B2 (en) Single router shared by a plurality of chip structures
US8793205B1 (en) Robotic learning and evolution apparatus
Bak et al. Adaptive learning by extremal dynamics and negative feedback
WO2018180499A1 (en) Neural network structure, electronic circuit, information processing system, method, and program
Sher Handbook of neuroevolution through Erlang
Taiji et al. Dynamics of internal models in game players
Graupe Principles of artificial neural networks: basic designs to deep learning
TW201535277A (en) Monitoring neural networks with shadow networks
US11514327B2 (en) Apparatus and method for utilizing a parameter genome characterizing neural network connections as a building block to construct a neural network with feedforward and feedback paths
JPH07114524A (en) Signal processor
Chivers How to train an all-purpose robot: DeepMind is tackling one of the hardest problems for AI
Goldstone Becoming cognitive science
Johannessen et al. Evidence-Based Innovation Leadership
Lukyanova et al. Neuronal topology as set of braids: information processing, transformation and dynamics
CN112352248A (en) Apparatus and method for constructing a neural network with feedforward and feedback paths using a parametric genome characterizing neural network connections as building blocks
Ikegami et al. Joint attention and dynamics repertoire in coupled dynamical recognizers
Bergman Self-organization by simulated evolution
Lisovskaya et al. Processing of Neural System Information with the Use of Artificial Spiking Neural Networks
Durkin History and applications
Teuscher Turing’s connectionism
Cabessa et al. Neural computation with spiking neural networks composed of synfire rings
De Garis Artificial brains: an evolved neural net module approach
Yu et al. Intelligence in Machines
de Garis Artificial brains
Gehlmann Fake it'till you make it: How advancing AI Imitation Learning could contribute to the development of Artificial General Intelligence

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18777758

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019509227

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18777758

Country of ref document: EP

Kind code of ref document: A1