CN105740646A - BP neural network based protein secondary structure prediction method - Google Patents

BP neural network based protein secondary structure prediction method

Info

Publication number
CN105740646A
CN105740646A CN201610020567.4A CN201610020567A CN 105740646 A
Authority
CN
China
Prior art keywords
output
error
neural network
prediction method
delta
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610020567.4A
Other languages
Chinese (zh)
Inventor
傅娟
汤达祺
汤德佑
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Hunan University of Technology
Original Assignee
South China University of Technology SCUT
Hunan University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT, Hunan University of Technology filed Critical South China University of Technology SCUT
Priority to CN201610020567.4A priority Critical patent/CN105740646A/en
Publication of CN105740646A publication Critical patent/CN105740646A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G06N 3/084 Backpropagation, e.g. using gradient descent
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16B BIOINFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR GENETIC OR PROTEIN-RELATED DATA PROCESSING IN COMPUTATIONAL MOLECULAR BIOLOGY
    • G16B 15/00 ICT specially adapted for analysing two-dimensional or three-dimensional molecular structures, e.g. structural or functional relations or structure alignment

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Health & Medical Sciences (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Artificial Intelligence (AREA)
  • Computational Linguistics (AREA)
  • Biotechnology (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Medical Informatics (AREA)
  • Chemical & Material Sciences (AREA)
  • Crystallography & Structural Chemistry (AREA)
  • Biomedical Technology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Investigating Or Analysing Biological Materials (AREA)

Abstract

The invention belongs to the field of protein secondary structure prediction and provides a BP neural network training and prediction method that addresses the poor accuracy of existing secondary structure prediction. The method first selects from the PDB a training sample set in which α-helix, β-sheet, and coil structures occur in normal proportions, encodes the amino acid sequence of each protein as the network input, and uses the secondary structure of the corresponding residue as the network output. Training is optimized with a gradient method, introducing learning rules with a momentum term and an adaptive learning rate to avoid oscillation and prevent being trapped in local minima. The input layer uses a six-bit encoding scheme together with a sliding-window technique, the hidden layer is sized from an empirical formula and the sliding window size, and the output layer predicts the protein secondary structure classes defined by the DSSP algorithm.

Description

A protein secondary structure prediction method based on a BP neural network
Technical field
The present invention relates to the field of bioinformatics, and in particular to protein secondary structure prediction methods.
Background technology
The higher-order structure of a protein determines its biological function. Protein secondary structure is determined by features of the amino acid sequence and is the basis for predicting higher-order structure. Secondary structure prediction mainly uses amino acid sequence information, analyzed experimentally or predicted by statistical methods; when prediction accuracy exceeds 80%, the spatial structure of a protein can be described fairly accurately. With the development of DNA analysis and sequencing technology, large amounts of protein sequence information have been obtained by deriving and analyzing DNA, but comparatively few protein structures have been resolved by traditional experimental and statistical analysis, so protein structure databases are updated slowly. This hinders research on protein structure and function, and the current emphasis of research is gradually shifting toward finding new structure prediction methods.
An artificial neural network (ANN) abstracts and simulates the characteristics of biological neural networks from the perspective of information analysis and processing, forming networks through different rules and connection modes. The BP neural network algorithm is one of the most widely used neural network models today and has been successfully applied in fields such as information science, biology, and medicine. The introduction of the BP neural network algorithm has opened a new avenue for protein structure prediction.
Summary of the invention
Aiming at the low accuracy of protein secondary structure prediction and the known defects of the BP neural network, the present invention improves the network's learning process and provides a BP neural network training and prediction method for protein secondary structure prediction.
The present invention is achieved by the following scheme: a protein secondary structure prediction method based on a BP neural network, realized by the following steps:
Step 1: select from the PDB a training sample set in which the three structure classes (α-helix, β-sheet, and coil) occur in normal proportions;
Step 2: encode the amino acid sequence information of the protein with the six-bit input encoding and select the input range with a sliding window;
Step 3: initialize the BP neural network training parameters and the sliding window position;
Step 4-1: hidden layer output calculation: select a training sample, input it to the network, and compute the hidden layer output;
Step 4-2: output layer output calculation: use the hidden layer output to compute the output layer output;
Step 4-3: output layer weight error calculation: compare the network's actual output with the expected output and compute the output layer weight error;
Step 4-4: hidden layer weight error calculation: use the output layer correction error and the hidden layer output to compute the hidden layer weight error;
Step 4-5: reverse adjustment of network weights: compute weight adjustments from the errors and adjust each neuron's weights and thresholds;
Step 4-6: move the sliding window one position to the right and jump to step 4-1, until the whole protein sequence has been processed;
Step 4-7: select the next training sample and jump to step 4-1, until all sequences have been processed;
Step 4-8: check whether the network error is below the expected error; if so, end training; otherwise check whether the iteration count has reached the maximum number of learning epochs; if not, jump to step 4-1, otherwise end training;
Step 5: input test data and determine the secondary structure corresponding to the protein sequence.
The present invention predicts protein secondary structure with the BP neural network algorithm and improves the network's learning process to address the defects of BP networks. During learning, a batch learning model is adopted, exploiting the parallelism of the calculation and improving convergence speed. The optimization is gradient-based, with learning rules that add a momentum term and an adaptive learning rate to avoid oscillation and being trapped in local minima. For the network structure and related techniques, the input layer uses the six-bit encoding scheme and sliding-window technique, the hidden layer is configured from an empirical formula and the sliding window size, and the output layer predicts the protein secondary structure classes according to the DSSP algorithm.
Brief description of the drawings
Fig. 1 is a sliding window schematic. The window size is 2n+1, with n positions on each side; the residue to be predicted is the residue I at the window center, the neighboring residues input to the network are YFQSMSVKGR and YSILKQIG, and the expected network output is T. The window then moves one position to the right to predict the structure of the next residue, so the residue to be predicted becomes Y and the expected output becomes E.
Fig. 2 is an example of the six-bit encoding scheme: the alanine (A) at the window center is encoded as 00001 followed by the offset value 1 (i.e. 000011), and the isoleucine (I) on the right side of the window is encoded as 01000 followed by the offset value 0.125.
Fig. 3 is the BP neural network model for protein secondary structure: the input layer has N neurons, the hidden layer has T neurons, and the output layer has M neurons.
Fig. 4 is the BP neural network training flow chart: the training sample set is learned repeatedly until the network gradually converges.
Detailed description of the invention
A protein secondary structure prediction method based on a BP neural network is implemented as follows:
Embodiment 1: the present embodiment is described below with reference to Fig. 1, Fig. 2, and Fig. 3.
Step 1: select from the PDB a training sample set in which α-helix, β-sheet, and coil structures occur in normal proportions, each of the three classes accounting for between 20% and 40%, with coil structures slightly more frequent;
Step 2-1: encode the amino acid sequence information of the protein with the six-bit input encoding. Five binary bits represent the amino acid type, and the remaining value represents the residue's offset from the window center. The nearer a residue is to the center, the greater its influence on the structure to be predicted and the larger this value; the farther away, the smaller the value. For an offset of n from the center, the value is 2^-n, as shown in Fig. 1.
When predicting the structure of the first few residues of a sequence, part of the window contains no residue information; this is the null-value problem encountered during encoding. In this case, the null positions are zero-filled: with the six-bit encoding, a null position is encoded as 000000. Zero-filling passes the null position into the network as a particular value and reduces its influence on network learning.
Step 2-2: select the input range with a sliding window. The sliding-window technique moves a window along the protein sequence to be predicted, selecting the range of data fed to the input layer. The window center is the residue to be predicted; the two sides of the window are symmetric and of equal length, and contain the residue information adjacent to the position to be predicted. When predicting a residue's structure, all residue information in the window is input to the network, and the network's output layer then gives the secondary structure of the center residue, as shown in Fig. 2.
A window that is too large or too small degrades both prediction accuracy and network training efficiency; after study and testing, 17 was selected as the sliding window size.
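A sliding window with null padding, as described in steps 2-1 and 2-2, might look like this minimal sketch (the generator and its names are illustrative, not from the patent):

```python
def sliding_windows(sequence, window_size=17):
    """Yield (window, center_index) pairs over a protein sequence.

    Positions falling outside the sequence are padded with None, matching
    the null-value handling described above. window_size is assumed odd.
    """
    n = window_size // 2
    padded = [None] * n + list(sequence) + [None] * n
    for center in range(len(sequence)):
        yield padded[center:center + window_size], center
```

Each yielded window would then be encoded residue by residue and concatenated into the network input vector.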
Step 3: initialize the BP neural network training parameters and the sliding window position. In Fig. 3 the network parameters are set as follows:
Sample input vectors: I_a = (i_1, i_2, i_3, ..., i_N), a = 1, 2, 3, ..., Q
Input layer input vectors: X_a = (x_1, x_2, x_3, ..., x_N), a = 1, 2, 3, ..., Q
Hidden layer output vectors: H_a = (h_1, h_2, h_3, ..., h_T), a = 1, 2, 3, ..., Q
Output layer actual output vectors: O_a = (o_1, o_2, o_3, ..., o_M), a = 1, 2, 3, ..., Q
Output layer expected output vectors: P_a = (p_1, p_2, p_3, ..., p_M), a = 1, 2, 3, ..., Q
Input-to-hidden connection weights: ω_ij, the weight from input neuron x_i to hidden neuron h_j, i = 1, 2, 3, ..., N, j = 1, 2, 3, ..., T
Hidden-to-output connection weights: ε_jk, the weight from hidden neuron h_j to output neuron o_k, j = 1, 2, 3, ..., T, k = 1, 2, 3
Hidden layer neuron thresholds: θ_j, the threshold of hidden neuron h_j, j = 1, 2, 3, ..., T
Output layer neuron thresholds: μ_k, the threshold of output neuron o_k, k = 1, 2, 3
Learning rate for weight adjustment: δ
The activation function is the Sigmoid function: f(x) = 1 / (1 + e^(-x)).
Step 3-1: initialize the BP network parameters. The connection weights ω_ij, ε_jk and thresholds θ_j, μ_k are assigned random values. Because the Sigmoid function is used as the transfer function, it compresses a wide input range into a small output range; when the input grows large, the change of the Sigmoid approaches 0 and parameter changes have very little effect, so the random values are drawn from the interval (-1, 1);
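Step 3-1 can be sketched as follows; the dictionary layout, function name, and seeding are illustrative assumptions, not the patent's implementation:

```python
import random

def init_network(n_inputs, n_hidden, n_outputs, seed=0):
    """Randomly initialize all weights and thresholds in (-1, 1), as in step 3-1."""
    rng = random.Random(seed)
    u = lambda: rng.uniform(-1.0, 1.0)
    return {
        "w_ih": [[u() for _ in range(n_hidden)] for _ in range(n_inputs)],   # ω_ij
        "theta": [u() for _ in range(n_hidden)],                             # θ_j
        "w_ho": [[u() for _ in range(n_outputs)] for _ in range(n_hidden)],  # ε_jk
        "mu": [u() for _ in range(n_outputs)],                               # μ_k
    }
```

With the 17-residue window and six-value encoding, the input layer would have 102 neurons and the output layer 3 (helix, sheet, coil).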
Step 3-2: initialize the sliding window position by aligning the window with the first residue;
The BP neural network training flow is shown in Fig. 4:
Step 4-1: hidden layer output calculation. Select a training sample I_a from the training sample set and convert it into the network input X_a and expected output P_a. Using the input X_a, the hidden layer connection weights ω_ij, and thresholds θ_j, compute the hidden layer intermediate values and, after transformation by the activation function, obtain the hidden neuron outputs h_j:

h_j = f(∑_{i=1}^{N} x_i ω_ij - θ_j)
Step 4-2: output layer output calculation. Using the hidden neuron outputs h_j, the output layer connection weights ε_jk, and thresholds μ_k, compute the output layer intermediate values and, after the activation function, obtain the network's actual outputs o_k:

o_k = f(∑_{j=1}^{T} h_j ε_jk - μ_k)
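The two forward formulas of steps 4-1 and 4-2 translate directly into code. A minimal sketch, with the network stored as a dictionary of weight matrices and threshold vectors (the layout is an assumption for illustration):

```python
import math

def sigmoid(x):
    """Sigmoid activation f(x) = 1 / (1 + e^-x)."""
    return 1.0 / (1.0 + math.exp(-x))

def forward(net, x):
    """Forward pass: h_j = f(sum_i x_i w_ij - theta_j), o_k = f(sum_j h_j e_jk - mu_k)."""
    h = [sigmoid(sum(xi * wij[j] for xi, wij in zip(x, net["w_ih"])) - net["theta"][j])
         for j in range(len(net["theta"]))]
    o = [sigmoid(sum(hj * wjk[k] for hj, wjk in zip(h, net["w_ho"])) - net["mu"][k])
         for k in range(len(net["mu"]))]
    return h, o
```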
Step 4-3: output layer weight error calculation. Compare the network's actual output o_k with the expected output p_k and, based on the minimum mean-square-error criterion, compute the correction error d_k^a of each output neuron; then, using the hidden layer output h_j and the correction error d_k^a, adjust the weights and thresholds to obtain the new output layer weights ε'_jk and thresholds μ'_k:

d_k^a = o_k (p_k - o_k)(1 - o_k)

ε'_jk = ε_jk + δ · d_k^a · h_j

μ'_k = μ_k + δ · d_k^a

where d_k^a is the correction error of output neuron o_k for input sample a.
Step 4-4: hidden layer weight error calculation. Using the output layer correction errors d_k^a, compute the correction error e_j^a of each hidden neuron; then, based on the hidden layer output h_j, the output layer connection weights ε_jk, and the input layer input x_i, adjust the weights to obtain the new hidden layer weights ω'_ij and thresholds θ'_j:

e_j^a = h_j (1 - h_j) ∑_{k=1}^{M} d_k^a ε_jk

ω'_ij = ω_ij + δ · e_j^a · x_i

θ'_j = θ_j + δ · e_j^a

where e_j^a is the correction error of hidden neuron h_j for input sample a.
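The four update formulas of steps 4-3 and 4-4 can be combined into one training step. This sketch follows the patent's formulas literally, including the plus sign in the threshold updates; the function name and network layout are assumptions:

```python
def backprop_step(net, x, h, o, target, lr=0.1):
    """One weight update following the correction-error formulas above.

    d_k = o_k (p_k - o_k)(1 - o_k);  e_j = h_j (1 - h_j) sum_k d_k e_jk.
    Updates are applied in place with learning rate `lr` (delta in the text).
    """
    d = [ok * (pk - ok) * (1 - ok) for ok, pk in zip(o, target)]
    e = [hj * (1 - hj) * sum(dk * net["w_ho"][j][k] for k, dk in enumerate(d))
         for j, hj in enumerate(h)]                 # uses pre-update weights
    for j, hj in enumerate(h):                      # e'_jk = e_jk + lr * d_k * h_j
        for k, dk in enumerate(d):
            net["w_ho"][j][k] += lr * dk * hj
    for k, dk in enumerate(d):                      # mu'_k = mu_k + lr * d_k
        net["mu"][k] += lr * dk
    for i, xi in enumerate(x):                      # w'_ij = w_ij + lr * e_j * x_i
        for j, ej in enumerate(e):
            net["w_ih"][i][j] += lr * ej * xi
    for j, ej in enumerate(e):                      # theta'_j = theta_j + lr * e_j
        net["theta"][j] += lr * ej
    return d, e
```

Note that the hidden errors e_j are computed against the output weights before they are updated, matching the order of steps 4-3 and 4-4.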
Step 4-5: reverse adjustment of network weights: compute weight adjustments from the errors and adjust each neuron's weights and thresholds;
Step 4-6: move the sliding window one position to the right and jump to step 4-1, until the whole protein sequence has been processed;
Step 4-7: select the next training sample and jump to step 4-1, until all sequences have been processed;
Step 4-8: check whether the network error is below the expected error; if so, end training; otherwise check whether the iteration count has reached the maximum number of learning epochs; if not, jump to step 4-1, otherwise end training;
Step 5: input the protein data to be tested and determine the secondary structure corresponding to the protein sequence.
According to hydrogen-bonding patterns, eight secondary structure conformations are recognized; based on their identification and structural features, they are grouped into three major classes: (1) helical structures, including the 3_10 helix (G), α-helix (H), and π-helix (I); (2) sheet structures, covering the two hydrogen-bond pairing types of β-sheets, parallel and antiparallel bridges, where an isolated bridge is labeled B (β-bridge) and a sheet containing β-bulges is labeled E; (3) coil structures, including all remaining ordinary conformations (T, S, L, etc.).
When the neural network predicts secondary structure, the network's actual outputs are propensity parameters for the three structure classes of the residue; the larger the parameter for a class, the higher the probability that the residue belongs to it. If the network's actual outputs are a, b, and c, the residue is predicted as a helix when a is largest, a sheet when b is largest, and a coil when c is largest.
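Picking the largest of the three propensity outputs is a one-liner; the label strings here are illustrative:

```python
def classify(outputs):
    """Map the three network outputs (helix, sheet, coil propensities) to a label."""
    labels = ("helix", "sheet", "coil")
    return labels[max(range(3), key=lambda k: outputs[k])]
```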
The overall accuracy evaluation method adopted is C_all = (C_helix + C_sheet + C_coil) / N, where N is the total number of residues to be predicted, C_all is the overall accuracy, and C_helix, C_sheet, and C_coil are the numbers of residues whose helix, sheet, and coil structures, respectively, were correctly predicted.
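The overall (three-state) accuracy measure reduces to counting matching labels; a minimal sketch (function name assumed):

```python
def q3_accuracy(predicted, actual):
    """Overall accuracy: correctly predicted residues over total residues."""
    assert len(predicted) == len(actual) and actual
    correct = sum(p == a for p, a in zip(predicted, actual))
    return correct / len(actual)
```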

Claims (15)

1. A protein secondary structure prediction method based on a BP neural network, characterized in that it adopts the following steps:
Step 1: select from the PDB a training sample set in which α-helix, β-sheet, and coil structures occur in normal proportions, each class accounting for between 20% and 40%, with coil structures slightly more frequent;
Step 2: encode the amino acid sequence information of the protein with the six-bit encoding as the input of the BP neural network, and select the input range with a sliding window;
Step 3: initialize the BP neural network training parameters and the sliding window position;
Step 4: set the maximum number of iterations and the minimum expected error, input the samples of the sample set one by one, and train the BP neural network used for prediction;
In step 4, training the BP neural network comprises the following steps:
Step 4-1: select a training sample from the training sample set, convert it into network input, and compute the hidden layer output;
Step 4-2: use the hidden layer output to compute the output layer output;
Step 4-3: compare the network's actual output with the expected output and compute the output layer weight error;
Step 4-4: use the output layer correction error and the hidden layer output to compute the hidden layer weight error;
Step 4-5: compute weight adjustments from the errors and reversely adjust the weights and thresholds of the hidden and output layer neurons;
Step 4-6: move the sliding window one position to the right and jump to step 4-1, until the whole protein sequence has been processed;
Step 4-7: select the next training sample and jump to step 4-1, until all sequences have been processed;
Step 4-8: check whether the network error is below the expected error; if so, end training; otherwise check whether the iteration count has reached the maximum number of learning epochs; if not, jump to step 4-1, otherwise end training;
Step 5: input the protein data to be tested and determine the secondary structure corresponding to the protein sequence.
2. The BP neural network based protein secondary structure prediction method according to claim 1, characterized in that: in step 2, the amino acid sequence information of the protein is encoded with the six-bit input encoding, in which five bits represent the amino acid type and one value represents the residue's offset from the window center; for an offset of n from the center, the value is 2^-n.
3. The BP neural network based protein secondary structure prediction method according to claim 1, characterized in that: in step 2, the input range is selected with a sliding window whose center is the residue to be predicted; the two sides of the window are symmetric and of equal length, and contain the residue information adjacent to the position to be predicted.
4. The BP neural network based protein secondary structure prediction method according to claim 1, characterized in that: in step 3, the initialized connection weights ω_ij, ε_jk and thresholds θ_j, μ_k are assigned random values drawn from the interval (-1, 1).
5. The BP neural network based protein secondary structure prediction method according to claim 1, characterized in that: in step 4, the Sigmoid function is used as the neuron transfer function.
6. The BP neural network based protein secondary structure prediction method according to claims 1 and 5, characterized in that: step 4-1 computes the hidden layer output with formula 1, where ω_ij and θ_j are the hidden layer connection weights and thresholds and x_i is the i-th input of the input layer:

Formula 1: h_j = f(∑_{i=1}^{N} x_i ω_ij - θ_j).
7. The BP neural network based protein secondary structure prediction method according to claims 1 and 5, characterized in that: step 4-2 computes the output layer output with formula 2, where h_j is the hidden layer output and ε_jk and μ_k are the output layer connection weights and thresholds:

Formula 2: o_k = f(∑_{j=1}^{T} h_j ε_jk - μ_k).
8. The BP neural network based protein secondary structure prediction method according to claim 1, characterized in that: in step 4-3, based on the minimum mean-square-error criterion, formula 3 computes the correction error d_k^a of each output neuron, and, using the hidden layer output h_j and the correction error d_k^a, formulas 4 and 5 compute the new output layer weights ε'_jk and thresholds μ'_k:

Formula 3: d_k^a = o_k (p_k - o_k)(1 - o_k)

Formula 4: ε'_jk = ε_jk + δ · d_k^a · h_j

Formula 5: μ'_k = μ_k + δ · d_k^a.
9. The BP neural network based protein secondary structure prediction method according to claim 1, characterized in that: in step 4-4, the network's actual output o_k is compared with the expected output p_k; using the output layer correction errors d_k^a, formula 6 computes the correction error e_j^a of each hidden neuron, and, based on the hidden layer output h_j, the output layer connection weights ε_jk, and the input layer input x_i, formulas 7 and 8 compute the new hidden layer weights ω'_ij and thresholds θ'_j:

Formula 6: e_j^a = h_j (1 - h_j) ∑_{k=1}^{M} d_k^a ε_jk

Formula 7: ω'_ij = ω_ij + δ · e_j^a · x_i

Formula 8: θ'_j = θ_j + δ · e_j^a.
10. The BP neural network based protein secondary structure prediction method according to claim 1, characterized in that: step 4-5 introduces learning rules with a momentum term, and the weight adjustment becomes:

Δω_ij(t) = δ (1 - α) e_j^a x_i + α · Δω_ij(t - 1)

where α is the momentum coefficient, with value in (0, 1).
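Claim 10's momentum rule can be written as a small helper; the default α here is an assumption (the patent only requires 0 < α < 1):

```python
def momentum_update(delta, e_j, x_i, prev_dw, alpha=0.9):
    """Weight increment with momentum: dw(t) = delta*(1-alpha)*e_j*x_i + alpha*dw(t-1)."""
    return delta * (1 - alpha) * e_j * x_i + alpha * prev_dw
```

Blending in the previous increment smooths successive updates, which is what damps the oscillation mentioned in the description.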
11. The BP neural network based protein secondary structure prediction method according to claim 1, characterized in that: step 4-8 first checks whether the network error is below the expected error; if so, training ends, otherwise iteration continues until the maximum number of learning epochs is reached.
12. The BP neural network based protein secondary structure prediction method according to claim 1, characterized in that: in step 5, when test data are input and the secondary structure corresponding to the protein sequence is determined, the overall accuracy evaluation method C_all = (C_helix + C_sheet + C_coil) / N is adopted, where N is the total number of residues to be predicted, C_all is the overall accuracy, and C_helix, C_sheet, and C_coil are the numbers of residues whose helix, sheet, and coil structures, respectively, were correctly predicted.
13. The protein encoding method according to claim 2, wherein a null position is encoded as 000000 and passed into the network as a particular value, reducing the influence of null values on network learning.
14. The sliding window setting method according to claim 3, wherein initially the window center is the first residue, with n null positions padded on the left; when the window is aligned with the n-th residue from the end, one null position is padded on the right, and so on, until n null positions are padded on the right when the window is aligned with the last residue.
15. The BP neural network based protein secondary structure prediction method according to claims 1, 8, and 9, characterized in that: the learning rate δ is set dynamically according to the change of the network error, so that the network maintains learning speed while remaining stable. Let ΔE denote the difference between the total errors of two successive learning passes, and introduce parameters β1 and β2 to adjust the learning rate; the learning rate changes according to:

δ' = β1^k · δ when ΔE > 0; δ' = δ when ΔE = 0; δ' = β2^k · δ when ΔE < 0

where β1 and β2 are adjustment factors with β1 < 1 and β2 > 1, and k is the number of consecutive occurrences.
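Claim 15's learning-rate schedule, read with β1^k and β2^k as the scaling factors, can be sketched as follows; the default β1 and β2 values are assumptions (the patent only requires β1 < 1 and β2 > 1):

```python
def adapt_learning_rate(delta, dE, k, beta1=0.7, beta2=1.05):
    """Adjust the learning rate delta from the sign of the error change dE.

    beta1 < 1 shrinks the rate after k consecutive error increases;
    beta2 > 1 grows it after k consecutive decreases; dE == 0 leaves it unchanged.
    """
    if dE > 0:
        return (beta1 ** k) * delta
    if dE < 0:
        return (beta2 ** k) * delta
    return delta
```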
CN201610020567.4A 2016-01-13 2016-01-13 BP neural network based protein secondary structure prediction method Pending CN105740646A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610020567.4A CN105740646A (en) 2016-01-13 2016-01-13 BP neural network based protein secondary structure prediction method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610020567.4A CN105740646A (en) 2016-01-13 2016-01-13 BP neural network based protein secondary structure prediction method

Publications (1)

Publication Number Publication Date
CN105740646A true CN105740646A (en) 2016-07-06

Family

ID=56246265

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610020567.4A Pending CN105740646A (en) 2016-01-13 2016-01-13 BP neural network based protein secondary structure prediction method

Country Status (1)

Country Link
CN (1) CN105740646A (en)


Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106295242B (en) * 2016-08-04 2019-03-26 上海交通大学 Protein domain detection method and system based on cost-sensitive LSTM network
CN106295242A (en) * 2016-08-04 2017-01-04 上海交通大学 Protein domain detection method based on cost-sensitive LSTM network
CN106338489A (en) * 2016-10-17 2017-01-18 武汉市农业科学技术研究院林业果树科学研究所 Method for rapidly identifying oxidation degree of peony seeds and secondary protein structures
CN106951736A (en) * 2017-03-14 2017-07-14 齐鲁工业大学 A kind of secondary protein structure prediction method based on multiple evolution matrix
CN106951736B (en) * 2017-03-14 2019-02-26 齐鲁工业大学 A kind of secondary protein structure prediction method based on multiple evolution matrix
CN108364262A (en) * 2018-01-11 2018-08-03 深圳大学 A kind of restored method of blurred picture, device, equipment and storage medium
CN108427867A (en) * 2018-01-22 2018-08-21 中国科学院合肥物质科学研究院 One kind being based on Grey BP Neural Network interactions between protein Relationship Prediction method
CN108549794B (en) * 2018-03-29 2021-05-25 中国林业科学研究院资源昆虫研究所 Protein secondary structure prediction method
CN108549794A (en) * 2018-03-29 2018-09-18 中国林业科学研究院资源昆虫研究所 A kind of secondary protein structure prediction method
CN110021340A (en) * 2018-07-30 2019-07-16 吉林大学 A kind of RNA secondary structure generator and its prediction technique based on convolutional neural networks and planning dynamic algorithm
CN110021340B (en) * 2018-07-30 2021-04-02 吉林大学 RNA secondary structure generator based on convolutional neural network and planning dynamic algorithm and prediction method thereof
CN112585686A (en) * 2018-09-21 2021-03-30 渊慧科技有限公司 Machine learning to determine protein structure
CN112585685A (en) * 2018-09-21 2021-03-30 渊慧科技有限公司 Machine learning to determine protein structure
CN110390995A (en) * 2019-07-01 2019-10-29 上海交通大学 α spiral transmembrane protein topological structure prediction technique and device
CN110390995B (en) * 2019-07-01 2022-03-11 上海交通大学 Alpha spiral transmembrane protein topological structure prediction method and device
CN110310698A (en) * 2019-07-05 2019-10-08 齐鲁工业大学 Classification model construction method and system based on protein length and DCNN
CN111477271A (en) * 2019-12-13 2020-07-31 南京理工大学 MicroRNA prediction method based on supervised self-organizing mapping neural network
CN111477271B (en) * 2019-12-13 2022-09-30 南京理工大学 MicroRNA prediction method based on supervised self-organizing mapping neural network
CN115910220A (en) * 2023-01-03 2023-04-04 北京中科弧光量子软件技术有限公司 Quantum computer-based protein amino acid property encoding method and system


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20160706
