CN112162775A - Java code annotation automatic generation method based on Transformer and mixed code expression - Google Patents
Java code annotation automatic generation method based on Transformer and mixed code expression
- Publication number
- CN112162775A (application CN202011129802.4A)
- Authority
- CN
- China
- Prior art keywords
- code
- encoder
- sbt
- java
- words
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/70—Software maintenance or management
- G06F8/73—Program documentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/40—Transformation of program code
- G06F8/41—Compilation
- G06F8/42—Syntactic analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/40—Transformation of program code
- G06F8/41—Compilation
- G06F8/42—Syntactic analysis
- G06F8/425—Lexical analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/40—Transformation of program code
- G06F8/41—Compilation
- G06F8/42—Syntactic analysis
- G06F8/427—Parsing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/40—Transformation of program code
- G06F8/41—Compilation
- G06F8/43—Checking; Contextual analysis
- G06F8/436—Semantic checking
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Software Systems (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computational Linguistics (AREA)
- Library & Information Science (AREA)
- Devices For Executing Special Programs (AREA)
Abstract
The invention provides a method for the automatic generation of Java code annotations based on a Transformer and mixed code representation, comprising the following steps: S1, downloading Java projects and constructing a code library; S2, at the serialization layer, converting the code into a code token vector and an SBT vector based on AST traversal; S3, at the encoding layer, using a Code encoder and an SBT encoder, where the Code encoder extracts lexical information from the source code and the SBT encoder obtains the structural information of the code; S4, at the decoding layer, decoding the semantic information to generate a comment. The beneficial effects of the invention are: the method generates code annotations by encoding, at the encoding layer, both the code and the AST-based SBT traversal sequence, and merging the semantic information learned from each to capture the semantic information of the source code.
Description
Technical Field
The invention relates to the technical field of computer applications, and in particular to a method for automatically generating Java code annotations based on a Transformer and mixed code representation.
Background
During software development and maintenance, the comments accompanying code are often missing, inadequate, or inconsistent with what the code actually does. Writing code comments manually, however, is time-consuming and laborious for developers, and the quality of such comments is hard to guarantee; an effective method for automatically generating code comments is therefore urgently needed.
Code annotation generation aims to produce a natural-language description of source code, which helps developers understand a program and thereby reduces the time cost of software maintenance. Most recent state-of-the-art techniques use a Seq2Seq model based on an RNN (recurrent neural network) or a CNN (convolutional neural network). These approaches have notable drawbacks: CNNs cannot directly process variable-length sequence samples, while RNNs cannot be parallelized and are therefore inefficient.
Disclosure of Invention
The invention aims to provide a method for automatically generating Java code annotations based on a Transformer and mixed code representation, addressing the problems in the prior art of poor program readability and understandability and the increased software development and maintenance cost caused by missing code annotations. The method encodes both the code and the AST-based SBT traversal sequence at the encoding layer and combines the semantic information learned from each to capture the semantic information of the source code. The invention automates code annotation generation, produces concise and accurate annotations for code, improves the readability and understandability of code, reduces code development and maintenance cost, and improves development and maintenance efficiency.
The invention is realized by the following measures: a method for automatically generating Java code annotations based on a Transformer and mixed code representation, comprising the following steps:
S1, downloading Java projects and constructing a code library;
S2, at the serialization layer, converting the code into a code token vector and an SBT vector based on AST traversal;
to address the out-of-vocabulary problem, identifiers from the code tokens and AST nodes are split into words according to the camelCase naming convention;
S3, at the encoding layer, using a Code encoder and an SBT encoder, wherein the Code encoder extracts lexical information from the source code and the SBT encoder obtains the structural information of the code;
S4, at the decoding layer, decoding the semantic information to generate a comment.
As a further refinement of the method for automatically generating Java code annotations based on a Transformer and mixed code representation, step S2 generates two input sequences, one for the Code encoder and one for the SBT encoder; generating the input sequence of the Code encoder comprises the following steps:
S201, decomposing identifier names in the source code into multiple words using the camelCase naming convention;
S202, converting the decomposed words uniformly to lower case;
S203, replacing specific numbers and strings with "<NUM>" and "<STR>" tags respectively to alleviate the OOV (Out-Of-Vocabulary) problem, in which rare words, derived words, plurals, and other compound words cannot be represented by the existing word-vector model, thereby obtaining the input sequence of the Code encoder;
S204, parsing each Java method into an abstract syntax tree (AST) using the Eclipse JDT compiler, and traversing the tree with the SBT traversal method to obtain the input sequence of the SBT encoder.
As a further refinement of the method for automatically generating Java code annotations based on a Transformer and mixed code representation, a Code encoder and an SBT encoder are used in step S3, and the information obtained by the two encoders is merged into the output of the encoding-layer sequence.
Compared with the prior art, the invention has the following beneficial effects: it is a new code annotation generation method based on mixed code representation and the Transformer, which achieves better performance than the traditional Seq2Seq (Sequence to Sequence) model. The code and the AST-based SBT traversal sequence are encoded at the encoding layer, and the semantic information learned from each is merged to obtain the semantic information of the source code.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention.
FIG. 1 is a general flow diagram of the present invention.
FIG. 2 is a graph illustrating a comparison result of different encoder configurations according to the present invention.
FIG. 3 is a graph illustrating a comparison result curve of different encoder configurations according to the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. Of course, the specific embodiments described herein are merely illustrative of the invention and are not intended to be limiting.
Example 1
Referring to fig. 1 to 3, the technical solution provided by the present invention is a Java code annotation automatic generation method based on a Transformer and mixed code representation, wherein the method includes the following steps:
S1, downloading Java projects and constructing a code library;
S2, at the serialization layer, converting the code into a code token vector and an SBT vector based on AST traversal;
to address the out-of-vocabulary problem, identifiers from the code tokens and AST nodes are split into words according to the camelCase naming convention;
S3, at the encoding layer, using a Code encoder and an SBT encoder, wherein the Code encoder extracts lexical information from the source code and the SBT encoder obtains the structural information of the code;
S4, at the decoding layer, decoding the semantic information to generate a comment.
As a further refinement of the method for automatically generating Java code annotations based on a Transformer and mixed code representation, step S2 generates two input sequences, one for the Code encoder and one for the SBT encoder; generating the input sequence of the Code encoder comprises the following steps:
S201, decomposing identifier names in the source code into multiple words using the camelCase naming convention;
S202, converting the decomposed words uniformly to lower case;
S203, replacing specific numbers and strings with "<NUM>" and "<STR>" tags respectively to alleviate the OOV (Out-Of-Vocabulary) problem, in which rare words, derived words, plurals, and other compound words cannot be represented by the existing word-vector model, thereby obtaining the input sequence of the Code encoder;
S204, parsing each Java method into an abstract syntax tree (AST) using the Eclipse JDT compiler, and traversing the tree with the SBT traversal method to obtain the input sequence of the SBT encoder.
As a further refinement of the method for automatically generating Java code annotations based on a Transformer and mixed code representation, a Code encoder and an SBT encoder are used in step S3, and the information obtained by the two encoders is merged into the output of the encoding-layer sequence.
Interpretation of terms:
Abstract Syntax Tree (AST): also known as a syntax tree, an abstract tree representation of the syntactic structure of code, in which each node of the tree represents a construct occurring in the code.
Referring to fig. 1, the new code annotation generation method based on mixed code representation and the Transformer proceeds as follows:
1. gathering the code-annotation corpus
1.1 collect Java method code and the corresponding JavaDoc comments from GitHub using a crawler;
1.2 remove Java methods that do not contain JavaDoc comments;
1.3 following the JavaDoc recommendations, select the first sentence of the JavaDoc comment as the annotation corresponding to the Java method;
1.4 remove Java methods whose comments contain only a single word;
1.5 remove setter/getter, builder, and test methods, because annotations for these method types are trivial to generate;
1.6 remove methods annotated with @SmallTest, @LargeTest, or @MediumTest, as well as overloaded methods;
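The corpus-filtering rules in steps 1.2 to 1.6 can be sketched as follows. This is an illustrative Python sketch only; the function names (`keep_method`, `first_sentence`) and the exact heuristics (for example, matching method-name prefixes to detect setters, getters, builders, and tests) are assumptions, since the patent does not specify an implementation.

```python
import re

def first_sentence(javadoc):
    """Per the JavaDoc convention, the first sentence summarizes the method."""
    text = javadoc.strip()
    m = re.search(r"\.(\s|$)", text)
    return text[:m.end()].strip() if m else text

def keep_method(name, javadoc, annotations):
    """Apply the corpus-filtering rules: drop methods without JavaDoc,
    single-word comments, setters/getters, builders, tests, and methods
    tagged @SmallTest, @MediumTest, or @LargeTest."""
    if not javadoc:
        return False                      # rule 1.2: no JavaDoc comment
    comment = first_sentence(javadoc)     # rule 1.3: first sentence only
    if len(comment.split()) <= 1:
        return False                      # rule 1.4: single-word comment
    if name.lower().startswith(("set", "get", "build", "test")):
        return False                      # rule 1.5 (assumed prefix heuristic)
    if {"SmallTest", "MediumTest", "LargeTest"} & set(annotations):
        return False                      # rule 1.6: test-size annotations
    return True
```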
2. building the mixed representation of each Java method in the corpus;
2.1 from the lexical perspective of the code, converting the code into a word sequence Seqcode;
2.1.1 many tokens in the code are identifiers (such as class names, method names, and variable names); to better learn the information they carry, each identifier is further split into multiple words according to the camelCase naming convention;
2.1.2 converting all words to lower case;
2.1.3 representing numeric and string literals in the code with special symbols, for example <NUM> for numbers and <STR> for strings, to alleviate the OOV (Out-Of-Vocabulary) problem, thereby obtaining the input sequence of the Code encoder;
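Steps 2.1.1 to 2.1.3 amount to a small preprocessing pipeline. The sketch below is an assumed Python implementation; the regular expression used for camelCase splitting and the literal-detection rules are illustrative choices, not taken from the patent.

```python
import re

def split_camel(identifier):
    """Split a camelCase / PascalCase identifier into its component words."""
    return re.findall(r"[A-Z]+(?![a-z])|[A-Z]?[a-z]+|\d+", identifier)

def tokenize_code(tokens):
    """Steps 2.1.1-2.1.3: split identifiers, lowercase everything, and
    replace numeric/string literals with <NUM>/<STR> to ease OOV."""
    out = []
    for tok in tokens:
        if re.fullmatch(r"\d+(\.\d+)?", tok):
            out.append("<NUM>")
        elif tok.startswith('"') and tok.endswith('"'):
            out.append("<STR>")
        else:
            # non-identifier tokens (operators etc.) pass through unchanged
            out.extend(w.lower() for w in (split_camel(tok) or [tok]))
    return out
```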
2.2 from the syntactic perspective of the code, converting the code into the SBT input to obtain the sequence SeqSBT;
2.2.1 using the Eclipse development tool JDT, the Java code is converted into an abstract syntax tree. An Abstract Syntax Tree (AST), also called a syntax tree, is an abstract tree representation of the code's syntactic structure, in which each node represents a construct in the code.
2.2.2 a flat code sequence is then generated by traversing the abstract syntax tree with the SBT (Structure-Based Traversal) method. This traversal preserves the structure of the tree well: the subtree under each node is enclosed in a pair of brackets that mark the AST structure, so a sequence generated with SBT can be accurately restored to the original abstract syntax tree;
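A minimal sketch of the SBT idea on a toy tree, assuming each AST node is represented as a `(label, children)` pair; the bracket-plus-label scheme shown is the one described above, which lets the flat sequence be mapped back to the tree unambiguously.

```python
def sbt(node):
    """Structure-Based Traversal: emit '(' + label, recurse into the
    children, then ')' + label, so every subtree is bracketed and the
    tree can be recovered from the flat sequence."""
    label, children = node
    seq = ["(", label]
    for child in children:
        seq += sbt(child)
    seq += [")", label]
    return seq

# Toy AST for the statement: return a + b;  (node names are illustrative)
ast = ("ReturnStatement",
       [("InfixExpression",
         [("SimpleName_a", []), ("SimpleName_b", [])])])
```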
3. constructing the automatic annotation-generation model with the Transformer, based on the mixed representation of the Java code: the model is trained on <code, annotation> pairs, with the special markers <sos> and <eos> added to each training sequence as start and end markers;
3.1 the encoding layer uses two encoders: the Code encoder learns the lexical information of the Java code from the sequence Seqcode, and the SBT encoder learns its syntactic information from the sequence SeqSBT; together, the two encoders effectively learn the semantic information of the code. The equally sized matrices produced by the two encoders are combined, the resulting vectors are compressed back to the size of the source matrix, the tanh (hyperbolic tangent) activation function adds non-linearity to the network, and the output is fed to the decoder;
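The merging step 3.1 can be illustrated as follows. In the patent the compression back to the source-matrix size is presumably a learned projection; the fixed 50/50 linear mix used here (parameter `w`) is a deliberately simple stand-in for that projection, and the plain-list representation of the matrices is for clarity only.

```python
import math

def merge_encoders(code_out, sbt_out, w=0.5):
    """Combine two equally sized encoder output matrices into one matrix
    of the same size and apply tanh, as in step 3.1. `w` stands in for
    the learned compression (an assumption, not the patent's weights)."""
    assert len(code_out) == len(sbt_out)
    merged = []
    for row_c, row_s in zip(code_out, sbt_out):
        merged.append([math.tanh(w * c + (1 - w) * s)
                       for c, s in zip(row_c, row_s)])
    return merged
```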
3.2 the decoder layer uses positional encoding, combined with the scaled target token embeddings by element-wise summation, followed by Dropout (in deep-learning training, Dropout removes units from the network with a certain probability). The combined embedding, together with the encoded source, the source mask, and the target mask, passes through 2N decoder layers to obtain the predicted tokens of the annotation Y^ corresponding to the Java code;
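The patent does not spell out which positional-encoding scheme is used; assuming the standard sinusoidal encoding from the original Transformer, it can be computed as:

```python
import math

def positional_encoding(max_len, d_model):
    """Sinusoidal positional encoding (assumed, not specified in the
    patent): PE[pos, 2i] = sin(pos / 10000^(2i/d_model)) and
    PE[pos, 2i+1] = cos(pos / 10000^(2i/d_model))."""
    pe = [[0.0] * d_model for _ in range(max_len)]
    for pos in range(max_len):
        for i in range(0, d_model, 2):
            angle = pos / (10000 ** (i / d_model))
            pe[pos][i] = math.sin(angle)
            if i + 1 < d_model:
                pe[pos][i + 1] = math.cos(angle)
    return pe
```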
3.3 Y^ is compared with the actual tokens of the target annotation Y to compute the loss, which is used to calculate the parameter gradients; the adaptive moment estimation (Adam) optimizer then updates the weights to improve the performance of the trained model;
4. application of models
4.1 to predict an annotation for a new Java code, the code is first processed with the method of step 2 to obtain the corresponding initial Seqcode and SeqSBT sequences. The two initial sequences are then truncated or padded: if the length exceeds the preset value, the excess part is cut off; if it falls short, <pad> tags are used to fill it, yielding the final Seqcode and SeqSBT sequences;
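Truncating or padding a sequence to a preset length, as in step 4.1, is straightforward:

```python
def pad_or_truncate(seq, max_len, pad="<pad>"):
    """Step 4.1: cut sequences longer than the preset length and pad
    shorter ones with <pad> tags, so every sequence has length max_len."""
    if len(seq) >= max_len:
        return seq[:max_len]
    return seq + [pad] * (max_len - len(seq))
```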
4.2 the two sequences are fed into their respective encoders, passed through the standard embedding layer, and summed element-wise with the positional embeddings to obtain vectors containing both the token information and each token's position in the sequence. Before the addition, the token embeddings are multiplied by a scaling factor of √d_model, where d_model is the hidden-layer dimension; this scaling effectively reduces the variance of the embeddings. Dropout is then applied to the summed embeddings to avoid overfitting.
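The scaling-and-summation of step 4.2 can be sketched as follows, with the embeddings represented as plain lists of floats for clarity:

```python
import math

def embed_with_position(token_embeddings, positional, d_model):
    """Step 4.2: scale the token embeddings by sqrt(d_model) and then add
    the positional embeddings element-wise; the scaling keeps the token
    signal from being swamped and reduces embedding variance."""
    scale = math.sqrt(d_model)
    return [[scale * t + p for t, p in zip(tok_row, pos_row)]
            for tok_row, pos_row in zip(token_embeddings, positional)]
```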
4.3 the combined embedding, together with the encoded source, the source mask, and the target mask, passes through 2N decoder layers to produce the target annotation.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.
Claims (3)
1. A Java code annotation automatic generation method based on Transformer and mixed code representation is characterized by comprising the following steps:
S1, downloading Java projects and constructing a code library;
S2, at the serialization layer, converting the code into a code token vector and an SBT vector based on AST traversal; splitting identifiers from the code tokens and AST nodes into multiple words according to the camelCase naming convention;
S3, at the encoding layer, using a Code encoder and an SBT encoder, wherein the Code encoder extracts lexical information from the source code and the SBT encoder obtains the structural information of the code;
S4, at the decoding layer, decoding the semantic information to generate a comment.
2. The method for automatically generating Java code annotations based on a Transformer and mixed code representation according to claim 1, wherein in step S2 two input sequences are generated, one for the Code encoder and one for the SBT encoder, and generating the input sequence of the Code encoder comprises the following steps:
S201, decomposing identifier names in the source code into multiple words using the camelCase naming convention;
S202, converting the decomposed words uniformly to lower case;
S203, replacing specific numbers and strings with "<NUM>" and "<STR>" tags respectively to alleviate the OOV (Out-Of-Vocabulary) problem, in which rare words, derived words, plurals, and other compound words cannot be represented by the existing word-vector model, thereby obtaining the input sequence of the Code encoder;
S204, parsing each Java method into an abstract syntax tree (AST) using the Eclipse JDT compiler, and traversing the tree with the SBT traversal method to obtain the input sequence of the SBT encoder.
3. The method for automatically generating Java code annotations based on a Transformer and mixed code representation according to claim 1 or 2, wherein a Code encoder and an SBT encoder are used in step S3, and the information obtained by the two encoders is merged into the output of the encoding-layer sequence.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011129802.4A CN112162775A (en) | 2020-10-21 | 2020-10-21 | Java code annotation automatic generation method based on Transformer and mixed code expression |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011129802.4A CN112162775A (en) | 2020-10-21 | 2020-10-21 | Java code annotation automatic generation method based on Transformer and mixed code expression |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112162775A true CN112162775A (en) | 2021-01-01 |
Family
ID=73867716
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011129802.4A Pending CN112162775A (en) | 2020-10-21 | 2020-10-21 | Java code annotation automatic generation method based on Transformer and mixed code expression |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112162775A (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109101235A (en) * | 2018-06-05 | 2018-12-28 | 北京航空航天大学 | A kind of intelligently parsing method of software program |
CN109960506A (en) * | 2018-12-03 | 2019-07-02 | 复旦大学 | A kind of code annotation generation method based on structure perception |
CN110018820A (en) * | 2019-04-08 | 2019-07-16 | 浙江大学滨海产业技术研究院 | A method of the Graph2Seq based on deeply study automatically generates Java code annotation |
CN110399162A (en) * | 2019-07-09 | 2019-11-01 | 北京航空航天大学 | A kind of source code annotation automatic generation method |
CN111090461A (en) * | 2019-11-18 | 2020-05-01 | 中山大学 | Code annotation generation method based on machine translation model |
CN111459491A (en) * | 2020-03-17 | 2020-07-28 | 南京航空航天大学 | Code recommendation method based on tree neural network |
CN111522581A (en) * | 2020-04-22 | 2020-08-11 | 山东师范大学 | Enhanced code annotation automatic generation method and system |
CN111625276A (en) * | 2020-05-09 | 2020-09-04 | 山东师范大学 | Code abstract generation method and system based on semantic and syntactic information fusion |
CN111651198A (en) * | 2020-04-20 | 2020-09-11 | 北京大学 | Automatic code abstract generation method and device |
- 2020-10-21: Application CN202011129802.4A filed (CN); status Pending
Non-Patent Citations (2)
Title |
---|
WASI UDDIN AHMAD,SAIKAT CHAKRABORTY,BAISHAKHI RAY,KAI-WEI CHANG: "A Transformer-based Approach for Source Code Summarization", ARXIV, 1 May 2020 (2020-05-01), pages 1 - 10 * |
XING HU,GE LI, XIN XIA,DAVID LO,ZHI JIN: "Deep code comment generation with hybrid lexical and syntactical information", SPRINGER, vol. 25, 31 May 2020 (2020-05-31), pages 2179 - 2217, XP037113771, DOI: 10.1007/s10664-019-09730-9 * |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112836477A (en) * | 2021-01-15 | 2021-05-25 | 亿企赢网络科技有限公司 | Code annotation document generation method and device, electronic equipment and storage medium |
CN112836477B (en) * | 2021-01-15 | 2024-02-09 | 亿企赢网络科技有限公司 | Method and device for generating code annotation document, electronic equipment and storage medium |
CN112947930A (en) * | 2021-01-29 | 2021-06-11 | 南通大学 | Method for automatically generating Python pseudo code based on Transformer |
CN112947930B (en) * | 2021-01-29 | 2024-05-17 | 南通大学 | Automatic generation method of Python pseudo code based on transducer |
CN113238798A (en) * | 2021-04-19 | 2021-08-10 | 山东师范大学 | Code abstract generation method, system, equipment and storage medium |
CN113961237A (en) * | 2021-10-20 | 2022-01-21 | 南通大学 | Bash code annotation generation method based on dual information retrieval |
CN117407051A (en) * | 2023-12-12 | 2024-01-16 | 武汉大学 | Code automatic abstracting method based on structure position sensing |
CN117407051B (en) * | 2023-12-12 | 2024-03-08 | 武汉大学 | Code automatic abstracting method based on structure position sensing |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112162775A (en) | Java code annotation automatic generation method based on Transformer and mixed code expression | |
CN110018820B (en) | Method for automatically generating Java code annotation based on Graph2Seq of deep reinforcement learning | |
WO2021231045A1 (en) | Transfer learning system for automated software engineering tasks | |
CN110489102B (en) | Method for automatically generating Python code from natural language | |
CN110442880B (en) | Translation method, device and storage medium for machine translation | |
CN112417897B (en) | Method, system, device and medium for training word alignment model and processing text | |
CN113190219A (en) | Code annotation generation method based on recurrent neural network model | |
CN116151132B (en) | Intelligent code completion method, system and storage medium for programming learning scene | |
CN115033896B (en) | Method, device, system and medium for detecting Ethernet intelligent contract vulnerability | |
CN115543437B (en) | Code annotation generation method and system | |
CN115048141A (en) | Automatic Transformer model code annotation generation method based on graph guidance | |
CN112507337A (en) | Implementation method of malicious JavaScript code detection model based on semantic analysis | |
CN115906815B (en) | Error correction method and device for modifying one or more types of error sentences | |
CN113076133A (en) | Method and system for generating Java program internal annotation based on deep learning | |
CN116661805B (en) | Code representation generation method and device, storage medium and electronic equipment | |
CN115268868B (en) | Intelligent source code conversion method based on supervised learning | |
CN111291175A (en) | Method for automatically generating submitted demand abstract based on strategy gradient algorithm | |
CN116700780A (en) | Code completion method based on abstract syntax tree code representation | |
CN115168402A (en) | Method and device for generating model by training sequence | |
CN112148879B (en) | Computer readable storage medium for automatically labeling code with data structure | |
CN115826988A (en) | Java method annotation instant automatic updating method based on data flow analysis and attention mechanism | |
CN114185595B (en) | Code structure guidance-based method name generation method | |
CN117289938A (en) | Intelligent auxiliary system for software development | |
CN115203236A (en) | text-to-SQL generation method based on template retrieval | |
CN115270792A (en) | Medical entity identification method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||