CN105701540B - A self-generating neural network construction method - Google Patents

A self-generating neural network construction method

Info

Publication number
CN105701540B
CN105701540B (application CN201610015589.1A)
Authority
CN
China
Prior art keywords
neuron
output
network
self
signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610015589.1A
Other languages
Chinese (zh)
Other versions
CN105701540A (en)
Inventor
何虎
许志恒
马海林
王玉哲
王旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiamen Semiconductor Industry Technology Research and Development Co., Ltd.
Original Assignee
Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University
Priority to CN201610015589.1A
Publication of CN105701540A
Application granted
Publication of CN105701540B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/04: Architecture, e.g. interconnection topology
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/06: Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N3/063: Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means
    • G06N3/065: Analogue means
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/08: Learning methods

Abstract

A self-generating neural network construction method comprises the following steps. Step 1: apply a stimulus signal. Step 2: evaluate each neuron's output intensity, determine the neuron's wiring, and progressively form network connections, ultimately generating an initial network. Step 3: calculate the position of the target neuron to connect to, and the connection probability. Step 4: judge whether the current network-generation process has stopped; if so, go to step 5, otherwise return to step 2. Step 5: optimize the network connections with an optimization algorithm. Step 6: judge whether further stimuli need to be added; if not, terminate, otherwise return to step 1. The network proposed by the present invention is self-generated, which effectively reduces the influence of subjective human factors. Moreover, because its generation process is grounded in biological neural theory, it offers a path toward further exploring the brain's neural mechanisms and makes truly intelligent systems possible.

Description

A self-generating neural network construction method
Technical field
The invention belongs to the technical field of neural-network computing and relates to software-oriented modeling methods and hardware-implemented neural-network self-generation methods, in particular to a self-generating neural network construction method.
Background technology
Computers have improved substantially to this day, in both computing capability and power efficiency. Yet as people's demands diversify, the problems of today's computer architectures are increasingly exposed. The information-processing problems posed by the natural world fall into two kinds: structured and unstructured. A structured problem is one that can be understood and rigorously described in mathematical language, turned into an algorithm, mapped to a computer program, and then processed instruction by instruction; in solving such problems, the capability of the traditional von Neumann machine far exceeds that of humans. An unstructured problem, by contrast, is one that cannot be described algorithmically: people find it difficult, or can manage only very crudely, to translate their own understanding into machine instructions. Consequently, on tasks such as image processing with scene analysis, speech recognition and understanding, and intelligent robot control, traditional computers fall far short of human ability, or are even helpless.
The development of neural networks can be broadly divided into three stages. 1947-1969 was the initial stage, during which scientists proposed many neuron models and learning rules, such as the MP model, the Hebb learning rule, and the perceptron. 1970-1986 was a transitional period, during which Professor Hopfield introduced the concept of an energy function into networks, gave a stability criterion for networks, and proposed approaches to associative memory and optimization computation. From 1987 to the present is a period of expansion, forming a climax in neural-network development, with CNNs (convolutional neural networks) and RNNs (recurrent neural networks) as representatives. Today, artificial neural networks have achieved great success in directions such as image recognition and speech recognition, but their structure determines that they cannot realize the kind of intelligence living organisms have.
A great drawback of modern artificial neural network (ANN) and spiking neural network designs is that the network cannot self-organize or self-grow; it needs external system intervention to change its own state, for example the connections between neurons or the adjustment of weights. This dooms such designs: they cannot grow or repair themselves in response to external stimuli the way biological neurons do, and so cannot realize self-learning. Therefore, to overcome the limitations of ANNs thoroughly, ANNs must be given the ability to self-organize and self-grow.
Existing known biological neural network models include the following. The Kohonen neural network model: every neuron has the same structure, but after stimulation the connection weights differ; under environmental stimuli the network continuously receives signals and repeatedly performs clustering, forming experience. The Hebb learning rule: learning in a neural network ultimately happens at the synapses between neurons; a synapse's connection strength changes with the activity of the pre- and post-synaptic neurons, and the amount of change is proportional to the combined activity of the two neurons. Neural-network generation rules can be divided into two stages. The growth stage: guided by stimulus signals, the weak axons of immature nerve cells navigate heuristically within the developing brain; once the tip of an axon has found its correct home, it begins to establish synaptic contacts with surrounding nerve cells as extensively as possible so as to conduct information. The maturation stage: a feature of early vertebrate development after birth is the elimination of nerve connections; because the connections established in the growth stage are extensive, not all of them are efficient. Under prolonged stimulation by external signals, correct information transmission makes a connection more solid, while the opposite causes it to atrophy and detach.
Summary of the invention
To overcome the shortcomings of the above prior art, the object of the present invention is to provide a self-generating neural network construction method that maps the characteristics of biological neural models onto silicon-based electronic circuits with appropriate heuristic algorithms, letting the neural network model self-generate and self-organize and thereby avoiding the shortcomings of artificial neural networks.
To achieve this object, the technical solution adopted by the present invention is:
A self-generating neural network construction method, comprising the following steps:
Step 1: apply a stimulus signal;
Step 2: evaluate the neurons' output intensity, determine each neuron's wiring, progressively form network connections, and ultimately generate an initial network;
Step 3: calculate the position of the target neuron to connect to, and the connection probability;
Step 4: judge whether the current network-generation process has stopped; if so, go to step 5, otherwise return to step 2 and continue;
Step 5: optimize the network connections with an optimization algorithm;
Step 6: judge whether further stimuli need to be added; if not, terminate, otherwise return to step 1.
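The six-step loop above can be sketched in code. This is an illustrative outline only: the growth, pruning, and stopping rules below are toy stand-ins for the patent's actual rules (which are detailed later in the description), included just to show the control flow of steps 1-6.

```python
# Illustrative skeleton of the six-step generation loop. The growth rule,
# pruning rule, and stopping criteria here are toy placeholders, not the
# patent's actual rules; they only demonstrate the control flow.

def build_network(stimuli, target_size=5):
    network = []                                   # list of connection tuples
    for stimulus in stimuli:                       # Step 1: apply a stimulus
        while len(network) < target_size:          # Step 4: has generation stopped?
            # Steps 2-3: grow one connection (toy rule: connect to next index)
            network.append((stimulus, len(network)))
        # Step 5: prune (toy rule: keep every other connection)
        network = [c for i, c in enumerate(network) if i % 2 == 0]
        if len(network) >= target_size // 2:       # Step 6: enough stimuli?
            break
    return network

net = build_network([1, 0, 1])
```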
Each neuron in the neural network has the ability to perform an operation on its input and then produce an output; neurons can transmit signals to one another over connections; on receiving an input signal, a neuron operates on the input signal and stores the related information. The neurons are laid out in a regular two-dimensional arrangement in space, and each neuron is given a definite coordinate value.
By modifying code, a neuron is given threshold judgment and pulse emission, i.e. the ability to operate on its input and then output; this piece of code is a black box whose computation mode can be changed.
The input signal is the output signal that the current neuron receives from other neurons, and the operation is a threshold judgment, i.e. if the computed value exceeds a certain range, the output is 1. The stored information is each neuron's input/output signals and threshold-judgment value; each neuron is thus regarded as a unit that can both compute and store.
In the present invention, the initial number of neurons in the network may be set to 49, and the final scale to 10,000.
The stimulus signal is static 0/1 information. An input conversion module turns the static 0/1 information into a dynamic pulse signal: when the input is 1, pulses are emitted continuously; when it is 0, no pulse is produced. The generation of the whole network is thereby converted from a static process into a dynamic one.
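A minimal sketch of this conversion, under the assumption that the pulse train is emitted over a fixed-length window of clock cycles (the window length is an assumption, chosen here to match the 20-cycle statistics window used below):

```python
def to_pulse_train(bit, cycles=20):
    """Convert a static 0/1 input into a dynamic pulse signal:
    an input of 1 emits a pulse on every clock cycle of the window,
    an input of 0 emits no pulses at all."""
    if bit not in (0, 1):
        raise ValueError("input must be 0 or 1")
    return [bit] * cycles

train = to_pulse_train(1, cycles=5)
```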
In step 2, the rule for determining a neuron's wiring from its output intensity is as follows:
Every 20 clock cycles, count how many times the neuron's output changed; let the output change M times within the 20 clock cycles, 0 ≤ M ≤ 20. If M is 0, no line is produced. If M is 1 to 5, the neuron connects to neurons within distance 1 of itself; if M is 6 to 10, to neurons within distance 2; if M is 11 to 15, to neurons within distance 3; if M is 16 to 20, to neurons within distance 4.
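The M-to-distance rule can be written compactly; this sketch simply encodes the bands stated above:

```python
def connection_radius(m):
    """Map the number of output changes M (per 20-cycle window) to the
    connection distance given by the rule above: 0 -> no connection,
    1-5 -> distance 1, 6-10 -> 2, 11-15 -> 3, 16-20 -> 4."""
    if not 0 <= m <= 20:
        raise ValueError("M must lie in [0, 20]")
    if m == 0:
        return 0                  # no line is produced
    return (m - 1) // 5 + 1       # bands of five changes per distance step

radius = connection_radius(13)    # M = 13 falls in the 11-15 band
```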
Within the connection region, the probability that the neuron connects to a candidate neuron is as follows:
P = (k_ij + i) / (Σ k_ij + Σ i)
where i and j are the abscissa and ordinate of the candidate neuron, k_ij is the number of connections already attached to the candidate neuron, and the summations run over the connection counts and abscissa values of all candidate neurons in the connection region.
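Assuming the formula as reconstructed from claim 3, P = (k_ij + i) / (Σ k_ij + Σ i), a sketch of the probability computation over a connection region might look like this; the `(i, j, k_ij)` tuple representation of a candidate is an assumption made for illustration:

```python
def connection_probabilities(candidates):
    """Compute P = (k_ij + i) / (sum k_ij + sum i) for each candidate
    neuron, where each candidate is a tuple (i, j, k_ij): abscissa,
    ordinate, and number of connections already attached to it."""
    total_k = sum(k for _, _, k in candidates)
    total_i = sum(i for i, _, _ in candidates)
    denom = total_k + total_i
    return [(k + i) / denom for i, _, k in candidates]

# Two candidates, at abscissas 1 and 3, with 0 and 2 existing connections:
probs = connection_probabilities([(1, 0, 0), (3, 0, 2)])
```

Note that the probabilities over the region sum to 1, and that the formula favors candidates with more existing connections and larger abscissas (i.e. closer to the output side), consistent with the output-direction priority described later in the embodiment.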
In step 4, whether the signal has reached the output is judged in either of the following ways:
Way 1: check whether the output of an output neuron changes within 20 clock cycles; if it changes, the signal has reached the output;
Way 2: observe the output neurons and check whether any line has been connected to them; if so, the signal has reached the output.
When more than a set number of the output neurons have pulse output, network generation is complete.
In step 5, the optimization algorithm is a genetic algorithm, a traversal algorithm, or an annealing algorithm, which removes the redundant parts of the network while using the output of an arbitrarily chosen neuron as the reference quantity. Concretely: take the output of some neuron A as the reference quantity; when the output of a neuron B is detected to be changing, stop the generation of the network and enter the line-pruning stage. Traverse every connection line from the beginning; delete one at a time and observe whether neuron A's output changes. If it changes, the line is not deleted; if it stays unchanged, the line segment is considered to have no effect on the system, i.e. it is redundant, and it is deleted. This process continues until all lines have been traversed, at which point the computation is complete.
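A sketch of the traversal-based pruning just described, with the "observe neuron A's output" step abstracted as a callback (the callback representation is an assumption made for illustration):

```python
def prune_redundant(connections, reference_output):
    """Traverse every connection once; tentatively delete each one and
    keep the deletion only if the reference neuron's output is unchanged
    (i.e. the connection is redundant). `reference_output` maps a list
    of connections to the observed output of reference neuron A."""
    kept = list(connections)
    baseline = reference_output(kept)
    idx = 0
    while idx < len(kept):
        trial = kept[:idx] + kept[idx + 1:]
        if reference_output(trial) == baseline:
            kept = trial              # redundant: delete, stay at same index
        else:
            idx += 1                  # needed: keep it and move on
    return kept

# Toy reference: neuron A's output is 1 iff some line into node 9 exists.
ref = lambda conns: int(any(dst == 9 for _, dst in conns))
pruned = prune_redundant([(0, 1), (1, 9), (2, 3)], ref)
```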
In step 6, the concrete criterion for judging whether further stimuli need to be added is: the number of output neurons whose output has changed reaches a set value.
Compared with traditional neural networks, which change only the neurons' weights, the present invention proposes a network that is variable as a whole: the connection pattern, the neurons' computing units, and the connection weights can all change. The network proposed by the present invention is self-generated, entirely unlike traditional neural networks in which everything is set manually, so it can effectively reduce the influence of subjective human factors. Moreover, because the generation process of this network is grounded in biological neural theory, it offers a path toward further exploring the brain's neural mechanisms and makes truly intelligent systems possible.
Brief description of the drawings
Fig. 1 shows the positions of the surrounding neurons that the central neuron may connect to when its output value is 8 to 11; 25 neurons are shown in the figure.
Fig. 2 is the flow chart of network construction.
Detailed description of the embodiments
Embodiments of the present invention are described in detail below with reference to the accompanying drawings and examples.
Biological neural networks have gone through a very long evolutionary process to form systems of enormous structural complexity and scale. Biologists have been studying biological neural networks for over a century, yet even the working principles of biological neural nets are still little understood, to say nothing of the principles of the brain's neural net, with its capacity for cognition, learning, and innovation. Facing such complex neural networks, the research method must be adjusted: rather than continuing along the ANN research line, we should study self-organizing, self-growing silicon-based neural nets from a bionics perspective. To state this clearly, the aim is not to replicate a silicon-based neural net identical or similar to a biological neural net, but to study how a silicon-based neural net, when receiving external stimuli, can self-organize, self-grow, and self-learn the way a biological neural net does.
The characteristic of a silicon-based neural net is that it has enough neurons; the problem is how to organize them effectively so that self-growth finally yields self-learning. The present application therefore designs a mechanism to solve this problem. First, the self-generation rules of the neuron network are designed:
1. A single neuron has input/output capability and can apply some processing to its input signal before producing an output; that is, the neuron's internal computing unit can be customized on demand.
In this verification, the integrate-and-fire model is used as the neuron's internal computation mode. The model is a threshold-judgment model: when the input exceeds a certain threshold, the output goes high.
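A toy integrate-and-fire neuron consistent with this description; the leak and reset-after-firing details are assumptions, since the text specifies only that the output goes high when the input exceeds a threshold:

```python
def integrate_and_fire(inputs, threshold=4, leak=1):
    """Leaky integrate-and-fire sketch: accumulate the input, leak a
    little each cycle, and emit a pulse (output 1) whenever the
    accumulated potential reaches the threshold, resetting afterwards.
    The leak and reset behavior are illustrative assumptions."""
    potential, spikes = 0, []
    for x in inputs:
        potential = max(0, potential + x - leak)
        if potential >= threshold:
            spikes.append(1)
            potential = 0          # reset after firing
        else:
            spikes.append(0)
    return spikes

out = integrate_and_fire([3, 3, 3, 0, 3, 3, 3])
```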
2. The network performs intensity statistics on output signals over time (intensity here means the magnitude of output-signal change per unit time). If a neuron's output intensity exceeds a threshold, a new connection to a surrounding neuron is generated, and the output is passed on.
3. How distant a neuron to select is determined by the output intensity: if the output intensity is high, a distant neuron is connected; if it is low, a nearby one.
4. After a neuron emits a pulse, the strength of the pulse transmission channel must be determined; this is decided by the weight on the line.
5. The present invention arranges the neurons in a regular two-dimensional grid, each neuron with its corresponding coordinate value, as shown in Fig. 1. Input neurons are assigned smaller coordinates and output neurons larger ones. When connecting toward the output neurons, a priority can be set: neurons with larger coordinates are preferentially connected, i.e. neurons preferentially transmit signals toward the output neurons.
6. Because the input consists of static high/low levels, a conversion module converts the input into a time-varying pulse signal. At the start, only a few input neurons in the network are activated; the remaining, non-activated neurons can be regarded as temporarily absent. The outside world applies a continuous electrical stimulus signal to the input neurons of the neural net. Those neurons grow connections toward adjacent neurons according to their basic function and pass the electrical signal on. When the electrical signal is delivered to the designated neurons, the electrical stimulus applied to the input neurons is stopped or reduced. In this way a neural network self-grows and self-learns.
The above are the generation rules of the neural network. A self-learning scheme must also be designed for what follows: heuristic algorithms are added so that the network learns to acquire a specific capability.
1. Take the neural network obtained in the generation steps above and delete the redundant parts of the network with a heuristic algorithm until the network produces the expected stable output; the first round of optimization is then complete.
2. On the basis of the stabilized network, repeat the six steps above, this time with a stimulus signal different from the previous one. After the network is obtained, reduce the redundancy again, and iterate in a loop until the network's output stabilizes.
For verification of the present invention, functional verification is first carried out on a software simulation platform. The simulator is written in C++, with simulation acceleration on a GPU. To simulate the effects of various neurons, the computation modes of several neuron types are simulated in the present invention; neuron computing units can additionally be developed to order according to user demand. Once the network structure is determined, hardware is then used to realize the structure of the network.
The above are the basic rules and principles of the present invention; the concrete generation steps of the invention are explained below with reference to Fig. 2.
The development of biological neural networks is a progression from simple to complex, and in evolution the network's mode of growth is shaped by environmental stimuli. By simulating the self-organization process of biological neural networks, the present invention proposes a method for self-generating a neural network on silicon-based electronic circuits under stimulation by external signals; the specific implementation flow is shown in Fig. 2. In the present invention, a black-and-white picture of size 28*28 is represented by the digits 0 and 1, one 0/1 value per pixel position, 0 for black and 1 for white. The picture is converted into a one-dimensional array, i.e. an array of 784 inputs, and that array is then used as the input of the network.
Every picture is made up of pixels; the picture of the digit 1 used in the present invention is as follows:
0 represents a dark point, and 1 represents a bright point.
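However the bitmap itself is printed, the conversion described above, a 28*28 image of 0/1 pixels flattened into a 784-element input array, can be sketched as follows:

```python
def image_to_input_array(image):
    """Flatten a 28x28 black-and-white image (0 = dark point,
    1 = bright point) into the 784-element 0/1 input array
    that the network consumes."""
    if len(image) != 28 or any(len(row) != 28 for row in image):
        raise ValueError("expected a 28x28 image")
    return [pixel for row in image for pixel in row]

# An all-dark image with a single bright pixel at row 0, column 1:
img = [[0] * 28 for _ in range(28)]
img[0][1] = 1
arr = image_to_input_array(img)
```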
The neural network generation method comprises the following steps:
1. Build the simulator platform in C++. The simulator's neuron computing units can be reconfigured through a configuration file, and the connections between neurons can likewise be modified through configuration information. The modification mode is as described above.
To fully simulate the connectivity of biological neural networks, the present invention sets a maximum m on the number of connections of each neuron (the value can be modified through the configuration information). The maximum connection value of each neuron in the present invention is set to 4, i.e. a neuron can connect to at most 4 other neurons; each input, however, is represented by 4 input lines, i.e. 4 lines represent one input, so the range of the input value is 0 to 15. The positions of the neurons in the simulator are arranged in a regular two-dimensional grid, each neuron represented by its corresponding coordinate.
2. First, the size of the network to be built must be determined, i.e. the number of neurons horizontally and vertically. The present invention sets the initial value to 1000 neurons horizontally and 100 neurons vertically.
3. Select several neurons somewhere in the middle as the initial network, and apply the input stimulus signal (the stimulus signal is the pulse signal produced by the conversion module). Let the network self-organize. The initial number of neurons in the network of the invention may be set to 49, each neuron having 16 input signals. The concrete extension rules are as follows:
(1) Every 20 clock cycles, count how many times each neuron's output changed. Suppose the output of the neuron changed M times within the 20 clock cycles (M ranges from 0 to 20). If M is 0, no line is produced; if M is 1 to 5, the neuron connects to neurons within distance 1 of itself; likewise, M of 6 to 10 connects within distance 2, M of 11 to 15 within distance 3, and M of 16 to 20 within distance 4. For the specific meaning of distances 1, 2, 3, 4, refer to Fig. 1, which shows a neuron whose output count M lies between 11 and 15 connecting to neurons at distance 3 in every direction. Which neuron is actually connected is decided by a probabilistic event, so the network generated may differ from run to run.
(2) Because each neuron draws only a single line at a time, which surrounding neuron it actually connects to must be decided by probability. Within the connection region, the probability that the neuron connects to a candidate neuron is as follows:
P = (k_ij + i) / (Σ k_ij + Σ i)
where i and j are the abscissa and ordinate of the candidate neuron, k_ij is the number of connections already attached to the candidate neuron, and the summations run over the connection counts and abscissa values of all candidate neurons in the connection region.
Fig. 1 shows the surrounding neuron positions that the central neuron may connect to when its output is 8 to 11; the other cases are analogous.
4. Judge whether the signal has reached the output. Concretely, the neurons at the positions with abscissa 1000 are the output neurons; when ten of them produce output signals, the stimulus signal is withdrawn, and a neural network whose structure no longer changes is finally obtained. Whether the signal has reached the output can be judged in either of the following ways:
Way 1: check whether the output of an output neuron changes within 20 clock cycles; if it changes, the signal has reached the output;
Way 2: observe the output neurons and check whether any line has been connected to them; if so, the signal has reached the output.
When more than a set number of the output neurons have pulse output, network generation is complete.
5. Step 4 yields a self-generated neural network; afterwards, a search algorithm deletes the redundant parts of the network to obtain a stable output. The optimization algorithm is a genetic algorithm, a traversal algorithm, or an annealing algorithm, removing the redundant parts of the network while using the output of an arbitrarily chosen neuron as the reference quantity. Concretely: take the output of some neuron A as the reference quantity; when the output of a neuron B is detected to be changing, stop the generation of the network and enter the line-pruning stage. Traverse every connection line from the beginning; delete one at a time and observe whether neuron A's output changes. If it changes, the line is not deleted; if it stays unchanged, the line segment is considered to have no effect on the system, i.e. it is redundant, and it is deleted. This process continues until all lines have been traversed, at which point the computation is complete.
6. On the basis of the network generated in step 5, add a new stimulus and repeat the five steps above in a loop. When the number of output neurons whose output changes reaches the set value, stop adding new stimuli. Through repeated cycles of training, several expected output patterns are finally obtained.
The above are merely preferred embodiments of the present invention, but the protection scope of the present invention is not limited to them. Any change or replacement that a person skilled in the art could readily conceive within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be defined by the protection scope of the claims.

Claims (7)

1. A self-generating neural network construction method, characterized by comprising the following steps:
Step 1: apply a stimulus signal. The stimulus signal is static 0/1 information; an input conversion module turns the static 0/1 information into a dynamic pulse signal, i.e. when the input is 1, pulses are emitted continuously, and when it is 0, no pulse is produced, so that the generation of the whole network is converted from a static process into a dynamic one;
Step 2: evaluate the neurons' output intensity, determine each neuron's wiring, progressively form network connections, and ultimately generate an initial network;
the neuron output intensity is the magnitude of output-signal change per unit time; intensity statistics are performed on the output signals over time, and if a neuron's output intensity exceeds a threshold, a new connection to a surrounding neuron is generated and the output is passed on;
the rule for determining the neuron's wiring is as follows: every 20 clock cycles, count how many times the neuron's output changed; let the output change M times within the 20 clock cycles, 0 ≤ M ≤ 20; if M is 0, no line is produced; if M is 1 to 5, the neuron connects to neurons within distance 1 of itself; if M is 6 to 10, to neurons within distance 2; if M is 11 to 15, to neurons within distance 3; if M is 16 to 20, to neurons within distance 4;
Step 3: calculate the position of the target neuron to connect to, and the connection probability;
Step 4: judge whether the current network-generation process has stopped; if so, go to step 5, otherwise return to step 2 and continue;
Step 5: optimize the network connections with an optimization algorithm;
Step 6: judge whether further stimuli need to be added; if not, terminate, otherwise return to step 1.
2. The self-generating neural network construction method according to claim 1, characterized in that each neuron in the neural network has the ability to operate on its input and then output; neurons can transmit signals to one another over connections; on receiving an input signal, a neuron operates on the input signal and stores the related information; the neurons are laid out in a regular two-dimensional arrangement in space, each neuron being given a definite coordinate value;
wherein the method by which the neuron operates on the input signal is:
by modifying code, the neuron is given threshold judgment and pulse emission, i.e. the ability to operate on the input and then output; the code is a black box whose computation mode can be changed; the input signal is the output signal that the current neuron receives from other neurons; the operation is a threshold judgment, i.e. if the computed value exceeds a certain range, the output is 1; the stored information is each neuron's input/output signals and threshold-judgment value, each neuron being regarded as a unit that can both compute and store.
3. The self-generating neural network construction method according to claim 1, characterized in that, within the connection region, the probability that the neuron connects to a candidate neuron is as follows:
P = (k_ij + i) / (Σ k_ij + Σ i)
where i and j are the abscissa and ordinate of the candidate neuron, k_ij is the number of connections already attached to the candidate neuron, and the summations run over the connection counts and abscissa values of all candidate neurons in the connection region.
4. The self-generating neural network construction method according to claim 1, characterized in that, in step 4, whether the signal has reached the output is judged in either of the following ways:
Way 1: check whether the output of an output neuron changes within 20 clock cycles; if it changes, the signal has reached the output;
Way 2: observe the output neurons and check whether any line has been connected to them; if so, the signal has reached the output;
when more than a set number of the output neurons have pulse output, network generation is complete.
5. The self-generating neural network construction method according to claim 1, characterized in that, in step 5, the optimization algorithm is a genetic algorithm, a traversal algorithm, or an annealing algorithm, which removes the redundant parts of the network while using the output of an arbitrarily chosen neuron as the reference quantity.
6. The self-generating neural network construction method according to claim 1, characterized in that the output of some neuron A is taken as the reference quantity; when the output of a neuron B is detected to be changing, the generation of the network is stopped and the line-pruning stage is entered; every connection line is traversed from the beginning, one is deleted at a time, and it is observed whether neuron A's output changes; if it changes, the line is not deleted; if it stays unchanged, the line segment is considered to have no effect on the system, i.e. it is redundant, and it is deleted; this process continues until all lines have been traversed, at which point the computation is complete.
7. The self-generating neural network construction method according to claim 1, characterised in that in step 6 the concrete basis for judging whether further stimulation needs to be added is whether the number of output neurons whose output has changed reaches a set value.
CN201610015589.1A 2016-01-11 2016-01-11 A kind of self-generating neutral net construction method Active CN105701540B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610015589.1A CN105701540B (en) 2016-01-11 2016-01-11 A kind of self-generating neutral net construction method

Publications (2)

Publication Number Publication Date
CN105701540A CN105701540A (en) 2016-06-22
CN105701540B true CN105701540B (en) 2017-12-19

Family

ID=56227129

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610015589.1A Active CN105701540B (en) 2016-01-11 2016-01-11 A kind of self-generating neutral net construction method

Country Status (1)

Country Link
CN (1) CN105701540B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018112892A1 (en) * 2016-12-23 2018-06-28 北京中科寒武纪科技有限公司 Device and method for supporting fast artificial neural network operation
EP3561732A4 (en) * 2016-12-23 2020-04-01 Cambricon Technologies Corporation Limited Operation apparatus and method for artificial neural network
CN108229647A (en) 2017-08-18 2018-06-29 北京市商汤科技开发有限公司 The generation method and device of neural network structure, electronic equipment, storage medium
CN109034372B (en) * 2018-06-28 2020-10-16 浙江大学 Neural network pruning method based on probability
CN110059809B (en) * 2018-10-10 2020-01-17 中科寒武纪科技股份有限公司 Computing device and related product
CN109376853B (en) * 2018-10-26 2021-09-24 电子科技大学 Echo state neural network output axon circuit
CN111611893B (en) * 2020-05-14 2024-03-19 龙立强人工智能科技(苏州)有限公司 Intelligent measuring and judging method applying neural network deep learning

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104584037A (en) * 2012-08-23 2015-04-29 高通股份有限公司 Neural system of adaptive behavior
CN104915195A (en) * 2015-05-20 2015-09-16 清华大学 Method for achieving neural network calculation based on field-programmable gate array

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Development of the learning process for the neuron and neural network for pattern recognition; Khurodae R; American Journal of Intelligent Systems; 2015-12-31; Vol. 5, No. 1; pp. 34-41 *
Hardware implementation of a bio-plausible neuron model for evolution and growth of spiking neural networks on FPGA; Shayani H et al.; Conference on Adaptive Hardware and Systems; 2008-12-31; pp. 236-243 *

Also Published As

Publication number Publication date
CN105701540A (en) 2016-06-22

Similar Documents

Publication Publication Date Title
CN105701540B (en) A kind of self-generating neutral net construction method
CN109829541A (en) Deep neural network incremental training method and system based on learning automaton
Javanshir et al. Advancements in algorithms and neuromorphic hardware for spiking neural networks
CN110223785A (en) A kind of infectious disease transmission network reconstruction method based on deep learning
Wang et al. Echo state networks regulated by local intrinsic plasticity rules for regression
Wang et al. Motivated optimal developmental learning for sequential tasks without using rigid time-discounts
CN109165730A (en) State quantifies network implementation approach in crossed array neuromorphic hardware
CN106951960A (en) A kind of learning method of neutral net and the neutral net
CN112405542B (en) Musculoskeletal robot control method and system based on brain inspiring multitask learning
CN112364988A (en) Hierarchical heterogeneous brain computing system based on FPGA
CN105427241A (en) Distortion correction method for large-field-of-view display device
CN111382840B (en) HTM design method based on cyclic learning unit and oriented to natural language processing
Wang et al. Complex dynamic neurons improved spiking transformer network for efficient automatic speech recognition
CN105122278B (en) Neural network and method of programming
Ji et al. Accuracy versus simplification in an approximate logic neural model
Li et al. Adaptive dropout method based on biological principles
Ding et al. College English online teaching model based on deep learning
Zhou et al. Surrogate-assisted cooperative co-evolutionary reservoir architecture search for liquid state machines
Hassan On simulation of adaptive learner control considering students' cognitive styles using artificial neural networks (ANNs)
WO2009067582A1 (en) Prediction by single neurons and networks
CN109635942B (en) Brain excitation state and inhibition state imitation working state neural network circuit structure and method
CN114548239A (en) Image identification and classification method based on artificial neural network of mammal-like retina structure
CN109615069B (en) Circuit structure of neural network with asynchronous transmission characteristic
CN107341543A (en) A kind of cerebellar model modeling method based on intensified learning
Knight Plasticity in large-scale neuromorphic models of the neocortex

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20190710

Address after: 361022 Unit 0206, Unit 109, 62 Chengyi North Street, Xiamen Software Park Phase III, Fujian Province

Patentee after: Xiamen Semiconductor Industry Technology Research and Development Co., Ltd.

Address before: 100084 Beijing Haidian District 100084 box 82 box, Tsinghua University Patent Office

Patentee before: Tsinghua University

TR01 Transfer of patent right