CN108345457A - A method of to program source code automatic generation function descriptive notes - Google Patents
A method of to program source code automatic generation function descriptive notes Download PDFInfo
- Publication number
- CN108345457A CN108345457A CN201810070002.6A CN201810070002A CN108345457A CN 108345457 A CN108345457 A CN 108345457A CN 201810070002 A CN201810070002 A CN 201810070002A CN 108345457 A CN108345457 A CN 108345457A
- Authority
- CN
- China
- Prior art keywords
- source code
- feature
- word
- vector
- automatic generation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/40—Transformation of program code
- G06F8/41—Compilation
- G06F8/42—Syntactic analysis
- G06F8/427—Parsing
Abstract
The present invention provides a kind of methods to program source code automatic generation function descriptive notes, including:Step 1:Extract the feature of source code;Step 2:The feature extracted is analyzed, will be stated about the feature of function from natural language is employed.The present invention can be that a variety of different programming languages automatically generate corresponding function descriptive notes, express the action and function of source code well, can effectively promote the maintenance and exploitation of software.Compared with prior art, the present invention has following advantageous effect:1, the present invention handle simultaneously extraction feature by the analytic tree to source code, compared to the characteristic that other methods can more fully characterize source code, rather than just the Partial Feature of part;2, fixed form is eliminated the reliance on when the present invention generates annotation, more flexible compared to other methods various, the annotation of generation can also state more functions, and description is also more comprehensive;3, the present invention is suitable for all kinds of programming languages, has generality.
Description
Technical field
The present invention relates to field of computer technology, and in particular, to a kind of to be described to program source code automatic generation function
Property annotation method.
Background technology
With the high speed development of internet, software development is also increasingly burning hoter and with there are a large amount of Open Source Code libraries.
It is read in software development process and understands that others' source code is the work taken time and effort, if source code can have
Corresponding function descriptive notes will greatly simplify this process, accelerate the speed of software development.Annotation perfect simultaneously
The maintainability of software systems can be improved.But the code under present case only less than 20 percent has corresponding note
It releases, so numerous application developers can not only accelerate to understand there is an urgent need for a kind of method of automatic generation function descriptive notes
The speed of other people codes, additionally it is possible to which auxiliary development maintains easily.
McBurney is (referring to McBurney, P.W., and McMi llan, C.2014.Automatic
documentation generat ion via source code summarization of method context.In
ICPC, 279-290.ACM.) propose a kind of method of the generation annotation based on template.This method is equal to source code and annotation
It is handled by template, the annotation effect that the method generates if encountering the code for not being inconsistent shuttering and annotation is too poor, fits
It is too narrow with range, simultaneously as be the annotation by template generation, so the annotation generated between different code is similar, it can
It is insufficient with property.A kind of " source code annotation automatic generation method based on data mining " this patent proposes a kind of based on data
Excavate the annotation automatic generation method with Rule Extraction.But this method is only limitted to automatically generate the note of linux kernel function
It releases, generality is not strong.Because this method has equally used Rule Extraction, some defects of all methods based on template also to deposit
It is in this method.Iyer is (referring to Iyer, S.;Konstas,I.;Cheung,A.;and Zettlemoyer,
L.2016.Summariz ing source code us ing a neural attention model.In ACL,2073–
2083) a kind of method generating annotation using the method training data of neural network is proposed, but in this method still
It is so that source code has been treated as to common text, does not extract any programming language exclusive feature and feature,
This existing defects of source code feature extraction.
Invention content
For the defects in the prior art, program source code automatic generation function is retouched the object of the present invention is to provide a kind of
The method of the property stated annotation.
According to a kind of method to program source code automatic generation function descriptive notes provided by the invention, including:
Step 1:Extract the feature of source code;
Step 2:The feature extracted is analyzed, will be stated about the feature of function from natural language is employed.
Preferably, the step 1 includes:
Step 1.1:According to the structural information of source code analytic tree corresponding with code statement structure;
Step 1.2:Semantic parsing is carried out to the analytic tree that structure obtains;
Step 1.3:According to the analytic tree after semantic parse, feature extraction is Down-Up used since all leaf nodes
Formula is traversed to root node, the i.e. required final feature extracted of the feature that final root node extracts.
Preferably, the step 1.2 includes:
Step 1.2.1:In the analytic tree of structure, different words is splitted out according to pre-defined rule;
Step 1.2.2:By the etymology of the information searching of source code context to the word splitted out, and pass through etymology
Carry out variable name polishing.
Preferably, the feature extraction formula includes:
In formula:V is indicated by the vector of the subtree of root node of N;VnodeIt is the vector for indicating N own types;C is N
The set of all child nodes;VcIt is to indicate to indicate by the vector of the subtree of root node of a child node c of N;F is RELU activation
Function;N is the child node number of node N;B is offset.
Preferably, the step 2 includes:
Step 2.1:Increase a selection door in GRU neural networks to be used to select about work(in feature vector
Can and it carry out annotation generation with the relevant feature of current state;
Step 2.2:Generate annotation.
Preferably, the selection formula of the selection door in the step 2.1 includes:
ct=σ (Wc·[ht-1,xt])
In formula:ctBe select pupil at filter, value is between 0 to 1;ht-1It is the state vector of last moment;xt
It is the word of this moment input;WcIt is weight matrix;σ is sigmoid functions.
Preferably, the step 2.2 includes:
Step 2.2.1:The word occurred in all annotations is integrated into dictionary and is numbered, then according to each word
Number generate its exclusive one-hot vector, i.e., it is equal that the value that corresponding dimension values are 1 other dimensions is numbered in this vector
It is 0;
Step 2.2.2:The step 2.2.1 one-hot vectors generated are obtained using word generating probability calculation formula
The probability of each word, and the word that the word of maximum probability should be generated as current time are generated under to current state.
Preferably, the probability for generating each word in the step 2.2.2 includes:
zt=σ (Wz·[ht-1,xt])
rt=σ (Wr·[ht-1,xt])
ct=σ (Wc·[ht-1,xt])
yt=softmax (Wohht+bo)
In formula:VmIt is the vector expression of source code;ht-1It is the state vector of last moment;xtIt is this moment input word
One-hot vectors;ztIt is the output for updating door;rtIt is the output for resetting door;ctIt is the output for selecting door;It is current hidden layer
Output, as one of door input of update;WohIt is that weight matrix optimizes in training;boIt is the offset of output layer
Amount;ytIt is that the output of softmax saves the generating probability of each word;H is the output for updating door, is current output shape
State.
Preferably, the step 2.2.2 is a unit of RNN neural networks, the unit is put into RNN neural networks
It is modified in the middle by the method for chain rule and stochastic gradient descent.
It is indicated preferably, the feature that the step 1 extracts is integrated into a multi-C vector;The step 2
The word of generation is stitched together according to genesis sequence as final function descriptive notes.
The present invention can be that a variety of different programming languages automatically generate corresponding function descriptive notes, well
The action and function for expressing source code can effectively promote the maintenance and exploitation of software.Compared with prior art, of the invention
With following advantageous effect:
1, the present invention handle simultaneously extraction feature by the analytic tree to source code, can be more complete compared to other methods
The characteristic of the characterization source code in face, rather than just the Partial Feature of part;
2, fixed form is eliminated the reliance on when the present invention generates annotation, it is more flexible compared to other methods various, generation
Annotation can also state more functions, and description is also more comprehensive;
3, the present invention is suitable for all kinds of programming languages, has generality.
Description of the drawings
Upon reading the detailed description of non-limiting embodiments with reference to the following drawings, other feature of the invention,
Objects and advantages will become more apparent upon:
Fig. 1 is the flow chart of the present invention;
Fig. 2 is analytic tree schematic diagram provided in an embodiment of the present invention;
Fig. 3 is the schematic diagram of a unit of RNN neural networks in the embodiment of the present invention;
Fig. 4 is that present invention annotation generates schematic diagram.
Specific implementation mode
With reference to specific embodiment, the present invention is described in detail.Following embodiment will be helpful to the technology of this field
Personnel further understand the present invention, but the invention is not limited in any way.It should be pointed out that the ordinary skill of this field
For personnel, without departing from the inventive concept of the premise, several changes and improvements can also be made.These belong to the present invention
Protection domain.
As shown in Figure 1, a kind of method to program source code automatic generation function descriptive notes provided by the invention, packet
It includes:
Step 1:Extract the feature of source code;
Step 2:The feature extracted is analyzed, will be stated about the feature of function from natural language is employed.
Step 1 includes:
Step 1.1:It, can by analytic tree according to the structural information of source code analytic tree corresponding with code statement structure
The structural information and semantic information of source code are indicated simultaneously.
Step 1.2:Variable name in the analytic tree obtained to structure etc. carries out semantic parsing.It specifically includes:
Step 1.2.1:Variable name is split:In the analytic tree of structure, different words is split out according to pre-defined rule
Come.Node is write in the analytic tree that step 1.1 is built to be spliced by multiple words, according to the lead-in of each word
Mother, which such as can capitalize, can be attached by underscore between various words at the rule, to split out different words, make semanteme
Information is definitely.And new node type " CombineName " is increased in analytic tree to indicate that the son of this node saves
Sliceable point is variable name.
Step 1.2.2:Variable name polishing:Pass through the word of the information searching of source code context to the word splitted out
Source, and variable name polishing is carried out by etymology, improve semantic information.Then the semantic information after improving is updated to analytic tree and is worked as
In.
Step 1.3:According to the analytic tree after semantic parse, feature extraction is Down-Up used since all leaf nodes
Formula is traversed to root node, the i.e. required final feature extracted of the feature that final root node extracts.Feature extraction formula includes:
In formula:V is indicated by the vector of the subtree of root node of N;VnodeIt is the vector for indicating N own types;C is N
The set of all child nodes;VcIt is to indicate to indicate by the vector of the subtree of root node of a child node c of N;F is RELU activation
Function;N is the child node number of node N;B is offset.
The final feature being drawn into step 1 is integrated into a multi-C vector and is indicated.
Step 2 includes:
Step 2.1:Increase a selection door in GRU neural networks to be used to select about work(in feature vector
Can and annotation generation be carried out with the relevant feature of current state, different moments select the selection result of door can be variant.
Selection door selection formula include:
ct=σ (Wc·[ht-1,xt])
In formula:ctBe select pupil at filter, value is between 0 to 1;ht-1It is the state vector of last moment;xt
It is the word of this moment input;WcIt is weight matrix;σ is sigmoid functions.
It will select the output result c of doortCorrespondence takes feature vector and can be obtained required source code feature.
Step 2.2:Generate annotation.
Step 2.2.1:The word occurred in all annotations is integrated into dictionary and is numbered, then according to each word
Number generate its exclusive one-hot vector, i.e., it is equal that the value that corresponding dimension values are 1 other dimensions is numbered in this vector
It is 0;
Step 2.2.2:The one-hot vectors that step 2.2.1 is generated are worked as using word generating probability calculation formula
The probability of each word, and the word that the word of maximum probability should be generated as current time are generated under preceding state.
The probability for generating each word includes:
zt=σ (Wz·[ht-1,xt])
rt=σ (Wr·[ht-1,xt])
ct=σ (Wc·[ht-1,xt])
yt=softmax (Wohht+bo)
In formula:VmIt is the vector expression of source code;ht-1It is the state vector of last moment;xtIt is this moment input word
One-hot vectors;ztIt is the output for updating door;rtIt is the output for resetting door;ctIt is the output for selecting door;It is current hidden layer
Output, as one of door input of update;WohIt is that weight matrix optimizes in training;boIt is the offset of output layer
Amount;ytIt is that the output of softmax saves the generating probability of each word;H is the output for updating door, is current output shape
State.
Step 2.2.2 is a unit of RNN neural networks, as shown in figure 3, unit is put into RNN neural networks
It is modified by the method for chain rule and stochastic gradient descent.
The word that step 2 generates is stitched together according to genesis sequence as final function descriptive notes.
Specifically, one section of source code S is given as input, and output can state the descriptive notes C of its function.S, C are such as
Under:
S:if(!Found) { allFound=false;}if(allFound){return true;}
C:find out whether find it all
We build the analytic tree of source code S first, and the results are shown in Figure 2.Then start to begin stepping through from leaf node
Whole analytic tree.By taking " all " and " found " in leaf node as an example, obtained in ergodic process by following equation
The vector of " CombineName " node indicates:
In formula:VCombineName, Vall, VfoundThree vectors indicate " CombineName " respectively, " all ", " found " institute
Exclusive vector is because three character strings have change in different location.VC'ombineNameIt represents " CombineName " and is in and work as
The vector of this character string indicates when front position and context, when the vector for the father's node for generating this " CombineName " indicates
When need use VC'ombineNameRather than VCombineName。
Analytic tree is traversed from bottom to top since leaf node using feature extraction formula and once can be obtained root node
The feature vector at place indicates, namely the feature vector of entire source code indicates.
Characteristic extraction procedure many of work as parameter be need it is modified, so this method is from Google Code Jam
The function for having crawled a large amount of specific functions in the middle carries out classification based training.By classification based training by parameters revision a to stable state
Stable parameter can be used to carry out the formal feature extraction for carrying out source code afterwards.
Next annotation generates process as shown in figure 4, input feature value and start symbol " START " and then logical first
The output word at this moment is calculated in the method for crossing the step 2.2.2, then by the output word at this moment as next
The input word at moment obtains subsequent word successively.Finally by these obtained words according to the sequential concatenation of generation to one piece
It can be obtained the annotation " find out whether find it all " ultimately generated.
Specific embodiments of the present invention are described above.It is to be appreciated that the invention is not limited in above-mentioned
Particular implementation, those skilled in the art can make a variety of changes or change within the scope of the claims, this not shadow
Ring the substantive content of the present invention.In the absence of conflict, the feature in embodiments herein and embodiment can arbitrary phase
Mutually combination.
Claims (10)
1. a kind of method to program source code automatic generation function descriptive notes, which is characterized in that including:
Step 1:Extract the feature of source code;
Step 2:The feature extracted is analyzed, will be stated about the feature of function from natural language is employed.
2. the method according to claim 1 to program source code automatic generation function descriptive notes, which is characterized in that
The step 1 includes:
Step 1.1:According to the structural information of source code analytic tree corresponding with code statement structure;
Step 1.2:Semantic parsing is carried out to the analytic tree that structure obtains;
Step 1.3:According to the analytic tree after semantic parse, feature extraction formula is Down-Up used since all leaf nodes
Traversal is to root node, the i.e. required final feature extracted of the feature that final root node extracts.
3. the method according to claim 2 to program source code automatic generation function descriptive notes, which is characterized in that
The step 1.2 includes:
Step 1.2.1:In the analytic tree of structure, different words is splitted out according to pre-defined rule;
Step 1.2.2:By the etymology of the information searching of source code context to the word splitted out, and carried out by etymology
Variable name polishing.
4. the method according to claim 1 to program source code automatic generation function descriptive notes, which is characterized in that
The feature extraction formula includes:
In formula:V is indicated by the vector of the subtree of root node of N;VnodeIt is the vector for indicating N own types;C is all sons of N
The set of node;VcIt is to indicate to indicate by the vector of the subtree of root node of a child node c of N;F is RELU activation primitives;n
It is the child node number of node N;B is offset.
5. the method according to claim 1 to program source code automatic generation function descriptive notes, which is characterized in that
The step 2 includes:
Step 2.1:Increase a selection door in GRU neural networks to be used to select about function simultaneously in feature vector
And carry out annotation generation with the relevant feature of current state;
Step 2.2:Generate annotation.
6. the method according to claim 5 to program source code automatic generation function descriptive notes, which is characterized in that
The selection formula of the selection door in the step 2.1 includes:
ct=σ (Wc·[ht-1,xt])
In formula:ctBe select pupil at filter, value is between 0 to 1;ht-1It is the state vector of last moment;xtIt is this
The word of moment input;WcIt is weight matrix;σ is sigmoid functions.
7. the method according to claim 5 to program source code automatic generation function descriptive notes, which is characterized in that
The step 2.2 includes:
Step 2.2.1:The word occurred in all annotations is integrated into dictionary and is numbered, then according to the volume of each word
Its exclusive one-hot vector number is generated, i.e., it is 0 that the value that corresponding dimension values are 1 other dimensions is numbered in this vector;
Step 2.2.2:The step 2.2.1 one-hot vectors generated are worked as using word generating probability calculation formula
The probability of each word, and the word that the word of maximum probability should be generated as current time are generated under preceding state.
8. the method according to claim 7 to program source code automatic generation function descriptive notes, which is characterized in that
The probability that each word is generated in the step 2.2.2 includes:
zt=σ (Wz·[ht-1,xt])
rt=σ (Wr·[ht-1,xt])
ct=σ (Wc·[ht-1,xt])
yt=softmax (Wohht+bo)
In formula:VmIt is the vector expression of source code;ht-1It is the state vector of last moment;xtIt is the one- of this moment input word
Hot vectors;ztIt is the output for updating door;rtIt is the output for resetting door;ctIt is the output for selecting door;It is the defeated of current hidden layer
Go out, an input as update door;WohIt is that weight matrix optimizes in training;boIt is the offset of output layer;yt
It is that the output of softmax saves the generating probability of each word;H is the output for updating door, is current output state.
9. the method according to claim 7 to program source code automatic generation function descriptive notes, which is characterized in that
The step 2.2.2 is a unit of RNN neural networks, and the unit is put into RNN neural networks and passes through chain method
Then it is modified with the method for stochastic gradient descent.
10. the method according to claim 7 to program source code automatic generation function descriptive notes, feature exists
In the feature that the step 1 extracts is integrated into a multi-C vector and is indicated;The word that the step 2 generates is pressed
It is stitched together as final function descriptive notes according to genesis sequence.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810070002.6A CN108345457B (en) | 2018-01-24 | 2018-01-24 | Method for automatically generating functional descriptive annotation for program source code |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810070002.6A CN108345457B (en) | 2018-01-24 | 2018-01-24 | Method for automatically generating functional descriptive annotation for program source code |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108345457A true CN108345457A (en) | 2018-07-31 |
CN108345457B CN108345457B (en) | 2021-03-09 |
Family
ID=62961677
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810070002.6A Expired - Fee Related CN108345457B (en) | 2018-01-24 | 2018-01-24 | Method for automatically generating functional descriptive annotation for program source code |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108345457B (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109739483A (en) * | 2018-12-28 | 2019-05-10 | 北京百度网讯科技有限公司 | Method and apparatus for generated statement |
CN109783079A (en) * | 2018-12-21 | 2019-05-21 | 南京航空航天大学 | A kind of code annotation generation method based on program analysis and Recognition with Recurrent Neural Network |
CN109960506A (en) * | 2018-12-03 | 2019-07-02 | 复旦大学 | A kind of code annotation generation method based on structure perception |
CN110162297A (en) * | 2019-05-07 | 2019-08-23 | 山东师范大学 | A kind of source code fragment natural language description automatic generation method and system |
CN110851175A (en) * | 2019-09-03 | 2020-02-28 | 东南大学 | Comment classification method based on decision tree |
CN111881028A (en) * | 2020-07-23 | 2020-11-03 | 深圳慕智科技有限公司 | Neural network automatic generation method based on model code syntactic analysis |
CN112433754A (en) * | 2021-01-13 | 2021-03-02 | 南京大学 | Java function annotation automatic generation method based on program analysis |
CN112559035A (en) * | 2021-01-22 | 2021-03-26 | 支付宝(杭州)信息技术有限公司 | Method and apparatus for managing notes of code text |
CN113312084A (en) * | 2021-05-26 | 2021-08-27 | 合肥移瑞通信技术有限公司 | AT framework code automatic generation method and device, electronic equipment and storage medium |
WO2022226716A1 (en) * | 2021-04-25 | 2022-11-03 | 南京大学 | Deep learning-based java program internal annotation generation method and system |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106681708A (en) * | 2016-11-16 | 2017-05-17 | 中国科学院软件研究所 | Automatic source code annotation generation method based on data mining |
-
2018
- 2018-01-24 CN CN201810070002.6A patent/CN108345457B/en not_active Expired - Fee Related
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106681708A (en) * | 2016-11-16 | 2017-05-17 | 中国科学院软件研究所 | Automatic source code annotation generation method based on data mining |
Non-Patent Citations (2)
Title |
---|
K. A. DAWOOD等: "Source code analysis extractive approach to generate textual summary", 《JOURNAL OF THEORETICAL & APPLIED INFORMATION TECHNOLOGY》 * |
SRINIVASAN IYER等: "Summarizing Source Code using a Neural Attention Model", 《ACL》 * |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109960506A (en) * | 2018-12-03 | 2019-07-02 | 复旦大学 | A kind of code annotation generation method based on structure perception |
CN109783079A (en) * | 2018-12-21 | 2019-05-21 | 南京航空航天大学 | A kind of code annotation generation method based on program analysis and Recognition with Recurrent Neural Network |
CN109739483B (en) * | 2018-12-28 | 2022-02-01 | 北京百度网讯科技有限公司 | Method and device for generating statement |
CN109739483A (en) * | 2018-12-28 | 2019-05-10 | 北京百度网讯科技有限公司 | Method and apparatus for generated statement |
CN110162297A (en) * | 2019-05-07 | 2019-08-23 | 山东师范大学 | A kind of source code fragment natural language description automatic generation method and system |
CN110851175A (en) * | 2019-09-03 | 2020-02-28 | 东南大学 | Comment classification method based on decision tree |
CN110851175B (en) * | 2019-09-03 | 2023-10-31 | 东南大学 | Comment classification method based on decision tree |
CN111881028A (en) * | 2020-07-23 | 2020-11-03 | 深圳慕智科技有限公司 | Neural network automatic generation method based on model code syntactic analysis |
CN112433754A (en) * | 2021-01-13 | 2021-03-02 | 南京大学 | Java function annotation automatic generation method based on program analysis |
CN112433754B (en) * | 2021-01-13 | 2022-05-31 | 南京大学 | Java function annotation automatic generation method based on program analysis |
CN112559035A (en) * | 2021-01-22 | 2021-03-26 | 支付宝(杭州)信息技术有限公司 | Method and apparatus for managing notes of code text |
WO2022226716A1 (en) * | 2021-04-25 | 2022-11-03 | 南京大学 | Deep learning-based java program internal annotation generation method and system |
CN113312084A (en) * | 2021-05-26 | 2021-08-27 | 合肥移瑞通信技术有限公司 | AT framework code automatic generation method and device, electronic equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN108345457B (en) | 2021-03-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108345457A (en) | A method of to program source code automatic generation function descriptive notes | |
CN108446540B (en) | Program code plagiarism type detection method and system based on source code multi-label graph neural network | |
Hui et al. | Linguistic structure guided context modeling for referring image segmentation | |
CN107885999B (en) | Vulnerability detection method and system based on deep learning | |
CN108388425B (en) | Method for automatically completing codes based on LSTM | |
CN113360915B (en) | Intelligent contract multi-vulnerability detection method and system based on source code diagram representation learning | |
CN101777042B (en) | Neural network and tag library-based statement similarity algorithm | |
CN108874878A (en) | A kind of building system and method for knowledge mapping | |
CN106537333A (en) | Systems and methods for a database of software artifacts | |
US10210249B2 (en) | Method and system of text synthesis based on extracted information in the form of an RDF graph making use of templates | |
CN108563703A (en) | A kind of determination method of charge, device and computer equipment, storage medium | |
CN113761893B (en) | Relation extraction method based on mode pre-training | |
CN112579469A (en) | Source code defect detection method and device | |
CN104391837A (en) | Intelligent grammatical analysis method based on case semantics | |
JP6263858B2 (en) | Method, apparatus and computer program for processing knowledge and information | |
CN115687563A (en) | Interpretable intelligent judgment method and device, electronic equipment and storage medium | |
Zeng et al. | EtherGIS: a vulnerability detection framework for ethereum smart contracts based on graph learning features | |
CN112580331A (en) | Method and system for establishing knowledge graph of policy text | |
CN107590119A (en) | Character attribute information extraction method and device | |
CN111898337B (en) | Automatic generation method of single sentence abstract defect report title based on deep learning | |
CN110989991B (en) | Method and system for detecting source code clone open source software in application program | |
CA3166556A1 (en) | Method and device for generating target advertorial based on deep learning | |
Meng | An intelligent code search approach using hybrid encoders | |
CN104750484A (en) | Code abstract generation method based on maximum entropy model | |
CN114911933A (en) | False news detection method and system based on intra-graph and inter-graph joint information propagation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20210309 Termination date: 20220124 |