CN105701540A - Self-generated neural network construction method - Google Patents

Self-generated neural network construction method

Info

Publication number
CN105701540A
CN105701540A
Authority
CN
China
Prior art keywords
neuron
output
network
neuronic
self
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610015589.1A
Other languages
Chinese (zh)
Other versions
CN105701540B (en)
Inventor
何虎
许志恒
马海林
王玉哲
王旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiamen Semiconductor Industry Technology Research and Development Co., Ltd.
Original Assignee
Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University filed Critical Tsinghua University
Priority to CN201610015589.1A priority Critical patent/CN105701540B/en
Publication of CN105701540A publication Critical patent/CN105701540A/en
Application granted granted Critical
Publication of CN105701540B publication Critical patent/CN105701540B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/04 - Architecture, e.g. interconnection topology
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/06 - Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N 3/063 - Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means
    • G06N 3/065 - Analogue means
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/08 - Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Neurology (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Image Analysis (AREA)

Abstract

A self-generated neural network construction method comprises the following steps: 1, adding a stimulation signal; 2, evaluating the output intensity of each neuron, determining the direction in which the neuron connects, continuously forming network connections, and finally generating an initial network; 3, calculating the positions of, and the probabilities of connecting to, the target neurons; 4, determining whether the current network generation process has stopped; if yes, proceeding to step 5, otherwise returning to step 2; 5, optimizing the network connections with an optimization algorithm; 6, determining whether further stimulation needs to be added; if not, ending, otherwise returning to step 1. The network provided by the present invention is self-generated, so the influence of subjective human factors can be effectively reduced. In addition, because the generation process of the network is based on biological neural theory, it opens the possibility of further exploring the neural mechanisms of the brain and thereby realizing genuine intelligence.

Description

A self-generating neural network construction method
Technical field
The invention belongs to the technical field of neural network computation and relates to a software-oriented modelling method and a hardware-implemented neural network self-generation method, in particular to a self-generating neural network construction method.
Background art
Up to the present day, computers have improved substantially in both computing capability and power consumption. However, as people's demands diversify, the problems exposed by the current computer architecture have become increasingly apparent. The information-processing problems found in nature are of two kinds: structured and unstructured. A structured problem is one that can be described precisely in mathematical language, turned into an algorithm, mapped to a computer program, and then processed instruction by instruction. In solving such problems the ability of a traditional von Neumann machine far exceeds that of a human. An unstructured problem, in contrast, cannot be described algorithmically; people find it difficult to translate their own understanding into machine instructions, or can only do so very roughly. Consequently, when carrying out tasks such as image processing and scene analysis, speech recognition and understanding, or intelligent robot control, a traditional computer falls far short of human ability, and is sometimes helpless.
The development of neural networks can be broadly divided into three stages. From 1947 to 1969, the initial stage, scientists proposed many neuron models and learning rules, such as the MP model, the Hebbian learning rule and the perceptron. From 1970 to 1986, the transition period, Professor Hopfield introduced the concept of an energy function into the network, gave a stability criterion for the network, and proposed approaches for associative memory and optimization computation. From 1987 to the present, the expansion period, neural network research has reached a new peak; representative models include the CNN (convolutional neural network) and the RNN (recurrent neural network). Artificial neural networks have achieved great success in fields such as image recognition and speech recognition, but their structure determines that they cannot achieve the same kind of intelligence as a living organism.
In the design of modern artificial neural networks (ANNs) and spiking neural networks, one of the biggest drawbacks is that the network cannot self-organize or grow by itself; it requires intervention by an external system to change its own state, for example the connection relations between neurons or the adjustment of weights. This dooms such networks to being unable, like biological neurons, to grow and repair themselves according to external stimulation, and therefore unable to achieve a self-learning function. To thoroughly overcome the limitations of the ANN, it must be given the ability to self-organize and grow on its own.
Known biological neural network models include the following. The Kohonen neural network model: every neuron has the same structure, but the connection weights produced after stimulation differ; under environmental stimulation the network continuously receives signals and continuously performs clustering, thereby forming experience. The Hebbian learning rule: the learning process of a neural network ultimately takes place at the synapses between neurons; the strength of a synaptic connection changes with the activity of the pre- and post-synaptic neurons, and the amount of change is proportional to the combined activity of the two neurons. The generation rules of a neural network can be divided into two stages. Growth stage: under the guidance of stimulation signals, the weak axon of an immature nerve cell heuristically navigates within the developing brain; once the end of the axon has found its correct habitat, it begins to establish synaptic contacts with surrounding nerve cells as widely as possible in order to conduct information. Maturation stage: one characteristic of early development after a vertebrate is born is the elimination of neural connections; because of the extensive connections established during the growth stage, not all connections are efficient, and under long-term stimulation by external signals the connections that transmit information correctly become firmer, while the others atrophy and detach.
Summary of the invention
To overcome the above shortcomings of the prior art, the object of the present invention is to provide a self-generating neural network construction method that maps the characteristics of biological neural models and suitable heuristic algorithms onto silicon-based electronic circuits, so that a neural network can generate and organize itself through this model, avoiding the shortcomings of artificial neural networks.
To achieve this object, the technical solution adopted by the present invention is:
A self-generating neural network construction method, comprising the following steps:
Step 1, adding a stimulation signal;
Step 2, assessing the output intensity of each neuron, determining the direction in which the neuron connects, continuously forming network connections, and finally generating an initial network;
Step 3, calculating the position of, and the probability of connecting to, the target neuron;
Step 4, judging whether the current network generation process has stopped; if yes, proceeding to step 5, otherwise returning to step 2;
Step 5, optimizing the network connections through an optimization algorithm;
Step 6, judging whether further stimulation needs to be added; if not, ending, otherwise returning to step 1.
The neurons in the neural network have the function of performing an operation on their inputs and then producing an output; signals can be transmitted between neurons through their connections; when a neuron receives an input signal, it performs an operation on the input signal and stores the relevant information. The neurons are arranged in a regular two-dimensional layout, and each neuron is given a definite coordinate value describing its position in the spatial layout.
The code is modified so that a neuron has a threshold-judgement and pulse-emitting function, i.e. the function of performing an operation on its input and then producing an output; this part of the code is a black box whose computation mode can be changed.
The input signal refers to the output signals that the current neuron receives from other neurons; the operation is a threshold judgement, i.e. if the value exceeds a certain range the output is 1. Storing relevant information refers to storing each neuron's input/output signals and threshold-judgement value, so that each neuron is regarded as a unit that can both compute and store.
In the present invention, the initial number of neurons in the network may be set to 49, and the final scale is set to 10000.
The stimulation signal is static 0/1 information. An input conversion module converts the static 0/1 input into a dynamic pulse signal: when the input is 1, pulses are emitted continuously, and when it is 0, no pulse is produced, so that the generation process of the whole network is converted from a static process into a dynamic one.
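As an illustration of the input conversion module described above, the following is a minimal C++ sketch (the function and variable names are assumptions, not taken from the patent): a static 0/1 input vector is expanded into a pulse train in which an input of 1 emits a pulse on every clock cycle and an input of 0 emits none.

```cpp
#include <vector>

// Turn a static 0/1 input vector into a pulse train, one entry per clock
// cycle: an input of 1 emits a pulse every cycle, an input of 0 stays silent.
std::vector<std::vector<bool>> toPulseTrain(const std::vector<int>& staticInput,
                                            int numCycles) {
    std::vector<std::vector<bool>> pulses(
        numCycles, std::vector<bool>(staticInput.size(), false));
    for (int t = 0; t < numCycles; ++t)
        for (std::size_t i = 0; i < staticInput.size(); ++i)
            pulses[t][i] = (staticInput[i] == 1);  // 1 -> pulse each cycle
    return pulses;
}
```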
In said step 2, the rule for determining the direction in which a neuron connects according to its output intensity is as follows:
The change in the neuron's output is counted once every 20 clock cycles. Suppose the neuron's output has changed M times within these 20 clock cycles, where 0 ≤ M ≤ 20. If M is 0, no connection is produced; if M is 1 to 5, a connection is made to a neuron within a peripheral distance of 1 from this neuron; if M is 6 to 10, within a distance of 2; if M is 11 to 15, within a distance of 3; if M is 16 to 20, within a distance of 4.
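The distance rule above can be condensed into a short C++ sketch (the function name is an assumption): it maps the change count M observed in a 20-cycle window to the connection radius used when wiring to neighbouring neurons, returning 0 when no connection should be made.

```cpp
// Map the number of output changes M in a 20-cycle window to a connection radius.
int connectionRadius(int M) {
    if (M <= 0)  return 0;   // no change: no new connection
    if (M <= 5)  return 1;   // 1..5  -> neighbours within distance 1
    if (M <= 10) return 2;   // 6..10 -> distance 2
    if (M <= 15) return 3;   // 11..15 -> distance 3
    return 4;                // 16..20 -> distance 4
}
```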
Within the connection region, the probability that this neuron connects to a candidate neuron is as follows:
P = (k_ij + i) / (Σ k_ij + Σ i)
where i and j are the abscissa and ordinate of the candidate neuron, k_ij is the number of wires already connected to that neuron, and the summations add up the wire counts and abscissa values, respectively, of all candidate neurons in the connection region.
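The C++ sketch below illustrates this probability under the assumption of a simple candidate structure (the struct and names are not from the patent): for every candidate neuron in the connection region it evaluates P = (k_ij + i) / (Σ k_ij + Σ i). Note how the i term biases selection toward candidates with larger abscissa values, consistent with the preference for connecting toward the output neurons described later in the description.

```cpp
#include <vector>

struct Candidate {
    int i;      // abscissa (x coordinate) of the candidate neuron
    int j;      // ordinate (y coordinate) of the candidate neuron
    int wires;  // k_ij: number of wires already connected to it
};

// Returns one probability per candidate; assumes the region is non-empty and
// the denominator is non-zero, as it is for any on-grid candidate set.
std::vector<double> connectionProbabilities(const std::vector<Candidate>& region) {
    double wireSum = 0.0, xSum = 0.0;
    for (const auto& c : region) { wireSum += c.wires; xSum += c.i; }

    std::vector<double> p;
    p.reserve(region.size());
    for (const auto& c : region)
        p.push_back((c.wires + c.i) / (wireSum + xSum));  // favours larger x
    return p;
}
```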
In said step 4, whether the signal has reached the output is judged in either of the following ways:
Mode one: judging whether the output of an output neuron changes within 20 clock cycles; if it changes, the signal has reached the output;
Mode two: observing the output neurons and checking whether any wire is connected to them; if so, the signal has reached the output.
When, among all the output neurons, more than a set number of output neurons produce pulse outputs, the network is considered to have been generated.
In said step 5, the optimization algorithm is a genetic algorithm, a traversal algorithm or a simulated annealing algorithm; the output of an arbitrarily chosen neuron is taken as the reference quantity, and the algorithm removes the redundant parts of the network. Specifically, the output of neuron A is taken as the reference quantity; when the output of neuron B is detected to be changing, network generation is stopped and the wire-pruning stage begins. Starting from the beginning, every connection wire is traversed; each time one is deleted, the output of neuron A is observed. If the output changes, the wire is not deleted; if it remains unchanged, the wire is considered to have no influence on the system, i.e. it is redundant, and it is deleted. This process continues until all wires have been traversed, at which point the computation is complete; a pruning sketch is given below.
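A minimal C++ pruning sketch follows, written against an assumed network interface (the method names are illustrative, not from the patent): every wire is visited once, temporarily removed, and restored only if removing it changes the reference neuron's output.

```cpp
// Traversal-style pruning. `Net` is any type exposing the assumed interface:
//   int  wireCount() const;         // number of connection wires
//   void disableWire(int w);        // tentatively remove wire w
//   void enableWire(int w);         // restore wire w
//   bool referenceOutputChanged();  // re-simulate; did neuron A's output change?
template <typename Net>
void pruneRedundantWires(Net& net) {
    for (int w = 0; w < net.wireCount(); ++w) {
        net.disableWire(w);              // tentatively delete this wire
        if (net.referenceOutputChanged())
            net.enableWire(w);           // the wire matters: restore it
        // otherwise the wire is redundant and stays deleted
    }
}
```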
In said step 6, the concrete basis for judging whether further stimulation needs to be added is whether the number of output neurons whose outputs have changed has reached the set value.
Compared with traditional neural networks in which only the neuron weights can be changed, the present invention proposes a mode in which the whole network is variable: the connection pattern, the neuron computing units and the connection weights are all variable. The network proposed by the present invention is self-generated, which is entirely different from the traditional approach in which everything is set manually, so the influence of subjective human factors can be effectively reduced. In addition, because the generation process of this network is based on biological neural theory, it opens the possibility of further exploring the neural mechanisms of the brain and thereby realizing genuine intelligence.
Brief description of the drawings
Fig. 1 shows, for a central neuron whose output value is 8 to 11, the positions of the peripheral neurons to which it may connect; 25 neurons are shown in the figure.
Fig. 2 is the flow chart of network construction.
Detailed description of the invention
Embodiments of the present invention are described in detail below with reference to the drawings and examples.
Biological neural networks have gone through a very long process of evolution and have formed a highly complex, very large-scale nervous system. Biologists have been studying biological neural networks for over a century, yet the working principles of biological neural networks are still only poorly understood, to say nothing of understanding the principles by which the brain's neural network achieves cognition, learning and innovation. Faced with such a complex neural network, the research approach must be adjusted: one cannot continue along the research line of the ANN; instead, self-organizing, self-growing silicon-based neural networks should be studied from a bionics perspective. What needs to be stated clearly here is that there is no attempt to replicate a silicon-based neural network identical or similar to a biological one; rather, the question studied is how to let a silicon-based neural network, like a biological neural network, self-organize, grow by itself and learn by itself when receiving external stimulation.
The characteristic of a silicon-based neural network is that it already has an abundant number of neurons; the problem is that it cannot organize itself effectively, grow by itself and finally achieve a self-learning function. The present application therefore designs a mechanism to solve this problem. First, the self-generation rules of the neuron network are designed:
1. A single neuron has input/output capability and can perform certain processing on the input signal and then produce an output, i.e. the computing unit inside the neuron can be customized according to demand.
In this verification, the integrate-and-fire model is adopted as the neuron's internal computation mode. This model is a threshold-judgement model: when the input exceeds a certain threshold, the output becomes high level.
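The following is a minimal C++ sketch of an integrate-and-fire style threshold neuron in the spirit of the description above; the class name, the accumulation of input over cycles, and the reset-to-zero behaviour after firing are assumptions, since the patent only states that the output goes high when the input exceeds a threshold.

```cpp
// Accumulate weighted input each cycle; emit a pulse (output high) once the
// accumulated value crosses the threshold, then reset the accumulator.
class IntegrateFireNeuron {
public:
    explicit IntegrateFireNeuron(double threshold) : threshold_(threshold) {}

    // Called once per clock cycle with the summed input for this cycle.
    bool step(double input) {
        potential_ += input;
        if (potential_ >= threshold_) {
            potential_ = 0.0;   // reset after firing (assumed behaviour)
            return true;        // pulse: output is high this cycle
        }
        return false;           // below threshold: output stays low
    }

private:
    double threshold_;
    double potential_ = 0.0;
};
```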
2. The network can gather statistics on the intensity of an output signal over time (intensity here means the magnitude of the output signal within a unit of time). If a neuron's output intensity exceeds a threshold, a new connection to a peripheral neuron is generated and the output is passed on.
3. How far away the target neurons are selected is decided according to the output intensity: if the output intensity is high, the connection is made nearby; if the output intensity is low, the connection is made farther away.
4. After a neuron emits a pulse, the strength of the pulse transmission channel must be determined; this is determined by the weight on the wire.
5. The present invention arranges the neurons in a regular two-dimensional layout, each neuron having its corresponding coordinate value, as shown in Fig. 1. Input neuron coordinates are placed at smaller positions and output neuron coordinates at larger positions. When a neuron makes connections, priorities can be set: neurons with larger coordinates are preferentially connected, i.e. a neuron preferentially transmits signals in the direction of the output neurons.
6. Because the input consists of static high and low levels, a conversion module converts the input into a time-varying pulse signal. Initially only a few input neurons in the network are activated, and the remaining non-activated neurons can be regarded as temporarily absent. The outside world gives the input neurons of the neural network a continuous electrical stimulation signal. According to its basic function, a neuron grows connections to adjacent neurons and passes on the electrical signal. When the electrical signal has been delivered to the designated neurons, the electrical stimulation applied at the input neurons is stopped or reduced. In this way a neural network grows and learns by itself.
The above are the generation rules of the neural network. After that, a self-learning mode must also be designed, so that by adding a heuristic algorithm the network can learn to obtain specific capabilities.
1. For the neural network obtained in the above generation steps, the redundant parts of the network are deleted by a heuristic algorithm until the network obtains the intended stable output; the first round of optimization is then complete.
2. On the basis of the stabilized network, the above six steps are repeated, while the input of the network is a stimulation signal different from the previous one. After a network is obtained, the redundancy is reduced again, and the iteration loops until the network obtains a stable output.
The verification of the present invention first uses a software simulation platform to verify the functionality. The simulator is written in C++ and the simulation is accelerated on a GPU. In order to simulate the effects of various kinds of neurons, the present invention simulates several neuron computation modes, and corresponding neuron arithmetic units can also be developed according to user requirements. After a definite network structure is obtained, the structure of the network is then realized in hardware.
The above are the basic rules and principles of the present invention. The concrete generation steps of the present invention are described below with reference to Fig. 2.
The development of a biological neural network is a progression from simple to complex, and during evolution the growth pattern of the network is influenced by environmental stimulation. By simulating the self-organization process of a biological neural network, the present invention proposes a method of self-generating a neural network on silicon-based electronic circuits under the stimulation of external signals; the concrete implementation flow is shown in Fig. 2. The present invention represents a 28*28 black-and-white picture with the digits 0 and 1: each pixel position takes the value 0 or 1, where 0 is black and 1 is white. The picture is converted into a one-dimensional array, i.e. an array of 784 inputs, and this array is then used as the input of the network.
Every picture is made up of pixels; the picture of the digit 1 used in the present invention is as follows:
0 represents a dark pixel and 1 represents a bright pixel.
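As a small illustrative C++ sketch (the function name and row-by-row ordering are assumptions), the 28*28 picture can be flattened into the 784-element 0/1 array that serves as the network input:

```cpp
#include <array>

// Flatten a 28*28 black-and-white picture into the 784-element input array,
// row by row; 0 = dark pixel, 1 = bright pixel.
std::array<int, 784> flattenImage(const std::array<std::array<int, 28>, 28>& image) {
    std::array<int, 784> input{};
    for (int row = 0; row < 28; ++row)
        for (int col = 0; col < 28; ++col)
            input[row * 28 + col] = image[row][col];
    return input;
}
```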
The neural network generation method comprises the following steps:
1. A simulator platform is built using C++. The neuron computing units of the simulator can be reconfigured through a configuration file, and the connections between neurons can also be modified through the configuration information. The modification mode is as described above.
In order to fully simulate the connections of a biological neural network, the present invention sets a maximum number m of connections per neuron (this value can be modified through the configuration information; in the present invention the maximum number of connections per neuron is set to 4, i.e. a neuron can connect to at most 4 other neurons; in addition each input is represented by 4 input lines, i.e. one input is represented by 4 lines, giving an input value range of 0 to 15). The neuron positions in the simulator are arranged in a regular two-dimensional layout, and each neuron is represented by its corresponding coordinates.
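The patent states that each input is carried on 4 input lines and can take a value from 0 to 15, but does not spell out how the value is distributed over the lines; the C++ sketch below assumes a plain binary encoding, one bit per line, purely for illustration.

```cpp
#include <array>

// Assumed encoding: an input value in 0..15 is represented on 4 binary
// input lines, line k carrying bit k of the value.
std::array<bool, 4> encodeOnFourLines(int value) {
    std::array<bool, 4> lines{};
    for (int bit = 0; bit < 4; ++bit)
        lines[bit] = (value >> bit) & 1;
    return lines;
}
```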
2. Initially the size of the network to be built must be determined, i.e. the numbers of neurons horizontally and vertically. In the present invention the initial values are set to 1000 neurons horizontally and 100 neurons vertically.
3. Several neurons somewhere in the middle are selected as the initial network, and the input stimulation signal (the pulse signal converted by the conversion module) is added at the same time. The network is allowed to self-organize; the initial number of neurons of the network of the invention may be set to 49, each neuron having 16 input signals. The concrete extension rules are as follows:
(1) The change in each neuron's output is counted once every 20 clock cycles. Suppose the neuron's output has changed M times within these 20 clock cycles (M ranges from 0 to 20). If M is 0, no connection is produced; if M is 1 to 5, a connection is made to a neuron within a peripheral distance of 1 from this neuron; similarly, if M is 6 to 10, the connection is made within a distance of 2; if M is 11 to 15, within a distance of 3; and if M is 16 to 20, within a distance of 4. The specific meanings of the distances 1, 2, 3 and 4 can be seen with reference to Fig. 1, which shows the case where the neuron's output change count M is between 11 and 15, i.e. the candidate neurons lie within a vertical and horizontal distance of 3. Which neuron is actually connected to is decided by a probability event, so the network generated may differ each time.
(2) Since each neuron makes only one wire connection at a time, which peripheral neuron it actually connects to must be decided according to probability. Within the connection region, the probability that this neuron connects to a candidate neuron is as follows:
P = (k_ij + i) / (Σ k_ij + Σ i)
where i and j are the abscissa and ordinate of the candidate neuron, k_ij is the number of wires already connected to that neuron, and the summations add up the wire counts and abscissa values, respectively, of all candidate neurons in the connection region.
Fig. 1 shows the positions of the neurons to which the central neuron may connect when its output is 8 to 11; the other cases are analogous.
4. Whether the signal has reached the output is judged. Specifically, the neurons at positions with abscissa value 1000 are designated as output neurons; when ten of these neurons produce output signals, the stimulation signal is withdrawn, and finally a neural network structure whose structure no longer changes is obtained. Whether the signal has reached the output can be judged in either of the following ways:
Mode one: judging whether the output of an output neuron changes within 20 clock cycles; if it changes, the signal has reached the output;
Mode two: observing the output neurons and checking whether any wire is connected to them; if so, the signal has reached the output.
When, among all the output neurons, more than a set number of output neurons produce pulse outputs, the network is considered to have been generated.
5. A self-generated neural network has been obtained through step 4; afterwards, the redundant parts of the network are deleted by a search algorithm to obtain a stable output. The optimization algorithm is a genetic algorithm, a traversal algorithm or a simulated annealing algorithm; the output of an arbitrarily chosen neuron is taken as the reference quantity, and the algorithm removes the redundant parts of the network. Specifically, the output of neuron A is taken as the reference quantity; when the output of neuron B is detected to be changing, network generation is stopped and the wire-pruning stage begins. Starting from the beginning, every connection wire is traversed; each time one is deleted, the output of neuron A is observed. If the output changes, the wire is not deleted; if it remains unchanged, the wire is considered to have no influence on the system, i.e. it is redundant, and it is deleted. This process continues until all wires have been traversed, at which point the computation is complete.
6. On the basis of the network generated in step 5, new stimulation is added and the above five steps are repeated. When the number of output neurons whose outputs are changing reaches the set value, adding new stimulation is stopped. Through repeated cycles of training, several intended output modes can finally be obtained.
The above is only a preferred embodiment of the present invention, but the protection scope of the present invention is not limited thereto. Any change or substitution that can readily be conceived by a person skilled in the art within the technical scope disclosed by the invention should be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention should be determined by the protection scope defined by the claims.

Claims (10)

1. A self-generating neural network construction method, characterized by comprising the following steps:
Step 1, adding a stimulation signal;
Step 2, assessing the output intensity of each neuron, determining the direction in which the neuron connects, continuously forming network connections, and finally generating an initial network;
Step 3, calculating the position of, and the probability of connecting to, the target neuron;
Step 4, judging whether the current network generation process has stopped; if yes, proceeding to step 5, otherwise returning to step 2;
Step 5, optimizing the network connections through an optimization algorithm;
Step 6, judging whether further stimulation needs to be added; if not, ending, otherwise returning to step 1.
2. The self-generating neural network construction method according to claim 1, characterized in that the neurons in the neural network have the function of performing an operation on their inputs and then producing an output; signals can be transmitted between neurons through their connections; when a neuron receives an input signal, it performs an operation on the input signal and stores the relevant information; the neurons are arranged in a regular two-dimensional layout, and each neuron is given a definite coordinate value describing its position in the spatial layout.
3. The self-generating neural network construction method according to claim 2, characterized in that
the code is modified so that a neuron has a threshold-judgement and pulse-emitting function, i.e. the function of performing an operation on its input and then producing an output, this part of the code being a black box whose computation mode can be changed; the input signal refers to the output signals that the current neuron receives from other neurons; the operation is a threshold judgement, i.e. if the value exceeds a certain range the output is 1; storing relevant information refers to storing each neuron's input/output signals and threshold-judgement value, each neuron being regarded as a unit that can both compute and store.
4. The self-generating neural network construction method according to claim 1, characterized in that in said step 1 the stimulation signal is static 0/1 information, and an input conversion module converts the static 0/1 input into a dynamic pulse signal: when the input is 1, pulses are emitted continuously, and when it is 0, no pulse is produced, so that the generation process of the whole network is converted from a static process into a dynamic one.
5. The self-generating neural network construction method according to claim 1, characterized in that in said step 2 the rule for determining the direction in which a neuron connects is as follows:
the change in the neuron's output is counted once every 20 clock cycles; suppose the neuron's output has changed M times within these 20 clock cycles, where 0 ≤ M ≤ 20; if M is 0, no connection is produced; if M is 1 to 5, a connection is made to a neuron within a peripheral distance of 1 from this neuron; if M is 6 to 10, within a distance of 2; if M is 11 to 15, within a distance of 3; if M is 16 to 20, within a distance of 4.
6. The self-generating neural network construction method according to claim 5, characterized in that, within the connection region, the probability that this neuron connects to a candidate neuron is as follows:
P = (k_ij + i) / (Σ k_ij + Σ i)
where i and j are the abscissa and ordinate of the candidate neuron, k_ij is the number of wires already connected to that neuron, and the summations add up the wire counts and abscissa values, respectively, of all candidate neurons in the connection region.
7. The self-generating neural network construction method according to claim 1, characterized in that in said step 4 whether the signal has reached the output is judged in either of the following ways:
mode one: judging whether the output of an output neuron changes within 20 clock cycles; if it changes, the signal has reached the output;
mode two: observing the output neurons and checking whether any wire is connected to them; if so, the signal has reached the output;
when, among all the output neurons, more than a set number of output neurons produce pulse outputs, the network is considered to have been generated.
8. The self-generating neural network construction method according to claim 1, characterized in that in said step 5 the optimization algorithm is a genetic algorithm, a traversal algorithm or a simulated annealing algorithm, the output of an arbitrarily chosen neuron is taken as the reference quantity, and the algorithm removes the redundant parts of the network.
9. The self-generating neural network construction method according to claim 1, characterized in that the output of neuron A is taken as the reference quantity; when the output of neuron B is detected to be changing, network generation is stopped and the wire-pruning stage begins; starting from the beginning, every connection wire is traversed; each time one is deleted, the output of neuron A is observed; if the output changes, the wire is not deleted; if it remains unchanged, the wire is considered to have no influence on the system, i.e. it is redundant, and it is deleted; this process continues until all wires have been traversed, at which point the computation is complete.
10. The self-generating neural network construction method according to claim 1, characterized in that in said step 6 the concrete basis for judging whether further stimulation needs to be added is whether the number of output neurons whose outputs have changed has reached the set value.
CN201610015589.1A 2016-01-11 2016-01-11 A self-generating neural network construction method Active CN105701540B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610015589.1A CN105701540B (en) 2016-01-11 2016-01-11 A self-generating neural network construction method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610015589.1A CN105701540B (en) 2016-01-11 2016-01-11 A self-generating neural network construction method

Publications (2)

Publication Number Publication Date
CN105701540A true CN105701540A (en) 2016-06-22
CN105701540B CN105701540B (en) 2017-12-19

Family

ID=56227129

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610015589.1A Active CN105701540B (en) 2016-01-11 2016-01-11 A self-generating neural network construction method

Country Status (1)

Country Link
CN (1) CN105701540B (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018112892A1 (en) * 2016-12-23 2018-06-28 北京中科寒武纪科技有限公司 Device and method for supporting fast artificial neural network operation
WO2018113790A1 (en) * 2016-12-23 2018-06-28 北京中科寒武纪科技有限公司 Operation apparatus and method for artificial neural network
CN109034372A (en) * 2018-06-28 2018-12-18 浙江大学 Neural network pruning method based on probability
CN109376853A (en) * 2018-10-26 2019-02-22 电子科技大学 Echo state network output axon circuit
CN110059809A (en) * 2018-10-10 2019-07-26 北京中科寒武纪科技有限公司 Computing device and related product
CN111611893A (en) * 2020-05-14 2020-09-01 青岛翰林汇力科技有限公司 Intelligent measuring and judging method applying neural network deep learning
US11270190B2 (en) 2017-08-18 2022-03-08 Beijing Sensetime Technology Development Co., Ltd. Method and apparatus for generating target neural network structure, electronic device, and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104584037A (en) * 2012-08-23 2015-04-29 高通股份有限公司 Neural system of adaptive behavior
CN104915195A (en) * 2015-05-20 2015-09-16 清华大学 Method for achieving neural network calculation based on field-programmable gate array

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104584037A (en) * 2012-08-23 2015-04-29 高通股份有限公司 Neural system of adaptive behavior
CN104915195A (en) * 2015-05-20 2015-09-16 清华大学 Method for achieving neural network calculation based on field-programmable gate array

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
KHURODAE R: "Development of the learning process for the neuron and neural network for pattern recognition", American Journal of Intelligent Systems *
SHAYANI H ET AL.: "Hardware implementation of a bio-plausible neuron model for evolution and growth of spiking neural networks on FPGA", Conference on Adaptive Hardware and Systems *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018112892A1 (en) * 2016-12-23 2018-06-28 北京中科寒武纪科技有限公司 Device and method for supporting fast artificial neural network operation
WO2018113790A1 (en) * 2016-12-23 2018-06-28 北京中科寒武纪科技有限公司 Operation apparatus and method for artificial neural network
US11270190B2 (en) 2017-08-18 2022-03-08 Beijing Sensetime Technology Development Co., Ltd. Method and apparatus for generating target neural network structure, electronic device, and storage medium
CN109034372A (en) * 2018-06-28 2018-12-18 浙江大学 Neural network pruning method based on probability
CN109034372B (en) * 2018-06-28 2020-10-16 浙江大学 Neural network pruning method based on probability
CN110059809A (en) * 2018-10-10 2019-07-26 北京中科寒武纪科技有限公司 Computing device and related product
CN110059809B (en) * 2018-10-10 2020-01-17 中科寒武纪科技股份有限公司 Computing device and related product
CN109376853A (en) * 2018-10-26 2019-02-22 电子科技大学 Echo state network output axon circuit
CN109376853B (en) * 2018-10-26 2021-09-24 电子科技大学 Echo state neural network output axon circuit
CN111611893A (en) * 2020-05-14 2020-09-01 青岛翰林汇力科技有限公司 Intelligent measuring and judging method applying neural network deep learning
CN111611893B (en) * 2020-05-14 2024-03-19 龙立强人工智能科技(苏州)有限公司 Intelligent measuring and judging method applying neural network deep learning

Also Published As

Publication number Publication date
CN105701540B (en) 2017-12-19

Similar Documents

Publication Publication Date Title
CN105701540B (en) A self-generating neural network construction method
Hunsberger et al. Spiking deep networks with LIF neurons
CN109829541A (en) Deep neural network incremental training method and system based on learning automaton
Li et al. Using a million cell simulation of the cerebellum: network scaling and task generality
Guerra-Hernandez et al. A FPGA-based neuromorphic locomotion system for multi-legged robots
CN110223785A (en) A kind of infectious disease transmission network reconstruction method based on deep learning
WO1989009457A1 (en) Processing of high-order information with neuron network and minimum and maximum value searching method therefor
US20150379397A1 (en) Secure voice signature communications system
CN111382840B (en) HTM design method based on cyclic learning unit and oriented to natural language processing
CN105122278B (en) Neural network and method of programming
Ding et al. College English online teaching model based on deep learning
Kulkarni et al. Learning and real-time classification of hand-written digits with spiking neural networks
Christophe et al. Pattern recognition with spiking neural networks: a simple training method.
CN109635942B (en) Brain excitation state and inhibition state imitation working state neural network circuit structure and method
Maliavko et al. Towards development of self-learning and self-modification spiking neural network as model of brain
CN114548239A (en) Image identification and classification method based on artificial neural network of mammal-like retina structure
CN112819143A (en) Work memory computing system and method based on graph neural network
Vaila et al. Spiking CNNs with PYNN and NEURON
Hourdakis et al. Computational modeling of cortical pathways involved in action execution and action observation
Gavrilov et al. A model of spike neuron oriented to hardware implementation
Sun et al. A Deep Learning Method for Intelligent Analysis of Sports Training Postures
Wu Research on the Development of Integration of Neuroscience and Artificial Intelligence
Yashchenko Neural-like growing networks in the development of general intelligence. Neural-like growing networks (P. II)
Davies et al. Spike-based learning of transfer functions with the SpiNNaker neuromimetic simulator
Rückauer Event-based vision processing in deep neural networks

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20190710

Address after: 361022 Unit 0206, Unit 109, 62 Chengyi North Street, Xiamen Software Park Phase III, Fujian Province

Patentee after: Xiamen Semiconductor Industry Technology Research and Development Co., Ltd.

Address before: 100084 Beijing Haidian District 100084 box 82 box, Tsinghua University Patent Office

Patentee before: Tsinghua University