CN109739494B - Tree-LSTM-based API (application program interface) use code generation type recommendation method - Google Patents

Info

Publication number
CN109739494B
CN109739494B (application CN201811501452.2A)
Authority
CN
China
Prior art keywords
code, api, nodes, abstract, node
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811501452.2A
Other languages
Chinese (zh)
Other versions
CN109739494A (en)
Inventor
Xin Peng
Chi Chen
Wenyun Zhao
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fudan University
Original Assignee
Fudan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fudan University
Priority to CN201811501452.2A
Publication of CN109739494A
Application granted
Publication of CN109739494B
Legal status: Active

Classifications

    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Stored Programmes (AREA)

Abstract

The invention belongs to the technical field of software engineering and intelligent software development, and specifically relates to a Tree-LSTM-based generative recommendation method for API (application program interface) usage code. The invention constructs a large number of training samples by analyzing a large number of source codes containing target APIs and, on that basis, trains an API usage code prediction model with deep learning and a statistical model. The code prediction model has two parts: statement prediction based on an abstract code tree structure representation and deep learning, and API call parameter prediction based on data dependency analysis and a statistical model; an abstract tree structure representation of code suitable for Tree-LSTM model processing is designed. The invention provides software developers with intelligent, context-based recommendation of API usage code, recommending line by line the API usage code a developer is likely to use next, given the code already written. The recommended code covers API object creation/method calls/attribute accesses, control statements, variable declarations, and the like, and related context variables can also be recommended as parameters for API method calls.

Description

Tree-LSTM-based API (application program interface) use code generation type recommendation method
Technical Field
The invention belongs to the technical field of software engineering and intelligent software development, in particular to intelligent recommendation and coding-assistance technology in software development, and specifically relates to a generative recommendation method for API (application program interface) usage code.
Background
Software developers often rely on various general-purpose APIs (Application Programming Interfaces), such as those in the JDK or Android, to complete their development tasks. The number of such APIs is large, and it is difficult for developers to know the functions of all APIs and the scenarios in which to use them. How to select an appropriate API in a specific code context to complete a development task is often a difficult problem for developers. In addition, many APIs carry usage requirements, such as combinations of APIs that must be used together, calling orders, and corresponding control structures (e.g., condition checks, loops); violating these requirements can introduce code defects. Thus, how to use a selected API in the correct manner is also a challenge for the developer.
An effective way to address this problem is to provide intelligent code recommendation capabilities in development tools such as integrated development environments. By analyzing the code context the developer is writing and predicting the API usage code (e.g., API method calls) likely to appear at the current location, such intelligent recommendation helps the developer complete development tasks efficiently and with high quality.
Software code contains various control structures such as branches and loops, so the corresponding code data representation needs to reflect these control structures and thereby preserve the structural information in the code. The invention therefore adopts Tree-LSTM, a deep learning network that supports tree structures, to train the API usage code prediction model, realizing generative intelligent recommendation. The Tree-LSTM is a recursive neural network with a tree topology; it takes tree-structured data as input and effectively encodes the structural information in the input data. The structure of the Tree-LSTM is shown in FIG. 1. (Tree-LSTM reference: Kai Sheng Tai, Richard Socher, Christopher D. Manning: Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks. ACL (1) 2015: 1556-1566.)
Disclosure of Invention
The invention aims to provide an intelligent recommendation method for API (application program interface) use codes based on code contexts for software developers.
The API usage code generative recommendation method provided by the invention is based on Tree-LSTM. Specifically, a large number of training samples are constructed by analyzing a large number of source codes (open-source or enterprise code) containing target APIs (such as APIs in the JDK and Android), and an API usage code prediction model is then trained using deep learning and a statistical model. The code prediction model in the invention comprises two parts: statement prediction based on an abstract code tree structure representation and deep learning, and API call parameter prediction based on data dependency analysis and a statistical model.
In the invention, an abstract tree structure representation of code suitable for Tree-LSTM model processing is designed, as shown in Fig. 2 (for the corresponding source code, see the appendix). Nodes represent abstracted API usage statements, variable declaration/assignment statements, or control structures; edges represent the control-flow relations between them. This representation abstracts away the variables and constants in the code, keeping only API object creation/method calls/attribute accesses, control nodes (if, while, etc.), variable declarations, and the like. An API usage statement node abstracts an API usage statement in the code into a complete method signature; for example, sig.initSign(pk) in the code shown in the appendix is abstracted into java.security.Signature.initSign(java.security.PrivateKey). A variable declaration/assignment statement node abstracts a variable declaration/assignment statement into a representation that ignores the variable name and assigned constant; for example, String signMode = null in the appendix is abstracted into java.lang.String. The control structure nodes If, ElseIf, Else, While, DoWhile, For, Foreach, Try, Catch, Finally, Switch, Case, and Default denote the corresponding control structures. API usage statement nodes and variable declaration/assignment statement nodes have at most one child node. This child node is the abstract representation of the statement that follows the API usage statement or variable declaration/assignment statement represented by its parent, so the parent-child relation of these two nodes expresses a sequential control-flow relation; for example, the node for String signMode = null in the appendix becomes a child of the node for byte[] messageByte = message.getBytes().
A control structure node has several child nodes, which represent the code in the different control flows of that control structure. For an If node, the children are: the condition-part code (a corresponding node is generated if the condition contains an API call; if the condition contains no API call, no node is generated and added to the abstract tree representation); the code of the branch taken when the condition is true; an ElseIf node (if an ElseIf branch exists); an Else node (if an Else branch exists); and the code after the end of the entire If. When analysis of the condition part finishes, a special ConditionEnd node is added to mark the end of the condition part; when analysis of the whole control structure finishes, a special ControlEnd node is added to mark the end of the whole control structure. For example, the control structure If in the code shown in the appendix contains four child nodes: a ConditionEnd node (here, since the If condition contains no API call, the ConditionEnd node directly marks the end of the condition part); a java.security.PrivateKey node, the abstraction of the assignment in the branch taken when the condition is true; an Else node; and a java.security.Signature.getInstance(java.lang.String) node representing the code after the entire If control structure. Given a piece of code, parsing starts from the first line of code and iteratively yields the abstract tree structure representation of the code.
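The representation described above can be sketched with a minimal node class. This is an illustrative assumption, not the patent's implementation: the class name, the ConditionEnd/Else labels, and the helper that rebuilds the appendix fragment are all hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the abstract tree representation: a node carries an
// abstracted label (a full method signature, a type for a declaration, or a
// control-structure keyword) and an ordered child list. Statement nodes get
// one child (the next statement); control nodes get several.
public class AbstractNode {
    public final String label;
    public final List<AbstractNode> children = new ArrayList<>();

    public AbstractNode(String label) { this.label = label; }

    public AbstractNode addChild(String childLabel) {
        AbstractNode c = new AbstractNode(childLabel);
        children.add(c);
        return c;
    }

    // Rebuilds the fragment of the appendix example around the If statement:
    // byte[] messageByte = message.getBytes();  ->  String signMode = null;
    // ->  If with four children (condition end, true branch, Else, code after).
    public static AbstractNode appendixFragment() {
        AbstractNode getBytes = new AbstractNode("java.lang.String.getBytes()");
        AbstractNode decl = getBytes.addChild("java.lang.String"); // String signMode = null;
        AbstractNode ifNode = decl.addChild("If");
        ifNode.addChild("ConditionEnd"); // condition contains no API call
        ifNode.addChild("java.security.PrivateKey"); // abstraction of the true-branch assignment
        ifNode.addChild("Else");
        ifNode.addChild("java.security.Signature.getInstance(java.lang.String)"); // after the If
        return getBytes;
    }
}
```

Note how the ordered child list matters: for an If node, the position of a child determines whether it belongs to the condition, a branch, or the code after the control structure.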
The Tree-LSTM-based API usage code generative recommendation method comprises the following specific steps, as shown in Fig. 3:
(I) Statement model training, comprising the following substeps:
(1) Perform static code analysis method by method on the training-data source code to obtain the corresponding abstract tree structure representations;
(2) For the abstract tree structure representation of each method, iteratively traverse from the root node, remove the N nodes after the currently traversed node (N represents the scale of the code to be completed), and replace the removed N nodes with a single Hole node representing the code to be completed, thereby obtaining abstract tree structure representations with holes; each holed abstract tree structure representation together with the first removed node forms a training sample;
(3) Map the abstracted API represented by each node in the holed abstract tree structure representations of all training samples to a vector, then feed all training samples into a Tree-LSTM network for training to obtain the statement model.
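Substep (2), the hole-based sample construction, can be sketched for the simple case of a purely sequential method (a chain of statement labels, one child per node). All class and field names here are hypothetical illustrations, not the patent's code.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of training-sample construction: at each position, cut out the next
// N statements, stand a "Hole" marker in their place, and keep the first
// removed label as the prediction target.
public class SampleBuilder {
    public static class Sample {
        public final List<String> holedChain; // context with the hole
        public final String target;           // first removed node: the label to predict
        Sample(List<String> holedChain, String target) {
            this.holedChain = holedChain;
            this.target = target;
        }
    }

    public static List<Sample> build(List<String> chain, int n) {
        List<Sample> samples = new ArrayList<>();
        for (int i = 0; i + 1 < chain.size(); i++) {
            List<String> holed = new ArrayList<>(chain.subList(0, i + 1));
            holed.add("Hole"); // stands in for the N removed nodes
            int end = Math.min(chain.size(), i + 1 + n);
            holed.addAll(chain.subList(end, chain.size())); // trailing context survives
            samples.add(new Sample(holed, chain.get(i + 1)));
        }
        return samples;
    }
}
```

For a chain of four statements and N = 2, this yields three samples, each pairing a holed context with the first removed statement as the target.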
(II) Parameter model training, comprising the following substeps:
(1) Carry out data dependency analysis on the code abstract tree structure representations in the training samples obtained in step (I), combined with the corresponding source code; add data dependency edges between nodes with data-flow dependences, and add the corresponding variable name to each node representing a variable declaration or assignment, thereby obtaining the abstract tree structure representation with data dependences added. As shown in Fig. 4 (where solid lines represent control-flow edges and dashed lines represent data-flow edges), this representation differs from the original abstract tree structure representation only in the added data-flow edges between nodes and the variable names attached to nodes; all other structure is identical. For example, the variable produced by the java.lang.String.getBytes() node is messageByte, so the variable name messageByte is added to that node; meanwhile, the java.lang.String.getBytes() node has a data dependence with the java.security.Signature.update(byte[]) node, so a data-flow edge, drawn as a dashed line, connects the two nodes;
(2) From the abstract tree structure representation with data dependences obtained in substep (1), extract all paths of length greater than 2, and for each extracted path count, for every node on the path, the number of times it has a data dependence on the path's end node;
(3) Based on the counts from substep (2), given a candidate API recommendation and the current abstract tree structure representation with data dependences, compute for each node the number of times it has a data dependence on the candidate API recommendation node over all paths of length greater than 2 that reach the candidate node;
(4) From the counts in substep (3), compute the probability that each node has a data dependence on the candidate API recommendation node; the higher this probability, the more likely the variable represented by the node becomes a parameter of the candidate API recommendation;
(5) From the counts in substep (3), compute how tightly each candidate API recommendation combines with the current code: the higher the value, the stronger the data dependence between the candidate and the current code and the tighter the combination. Combine this with the probability the statement model assigns to the candidate to compute a final probability, and finally rerank all candidate API recommendations.
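Substeps (2) through (4) amount to maintaining co-occurrence counts and turning them into a probability. The following is a minimal sketch under stated assumptions: class and method names are hypothetical, counts are keyed by node label only, and no smoothing is applied.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the statistical parameter model: for each context-node label,
// track how often it appears on a qualifying path to a candidate API node
// and how often it actually has a data dependence on that endpoint; the
// ratio estimates the probability that the node's variable becomes a
// parameter of the candidate API.
public class DependenceStats {
    private final Map<String, Integer> depCount = new HashMap<>();  // times dependent on the endpoint
    private final Map<String, Integer> pathCount = new HashMap<>(); // times seen on a qualifying path

    public void observe(String nodeLabel, boolean hasDependence) {
        pathCount.merge(nodeLabel, 1, Integer::sum);
        if (hasDependence) depCount.merge(nodeLabel, 1, Integer::sum);
    }

    /** Estimated probability that this node's variable is a parameter of the candidate API. */
    public double dependenceProbability(String nodeLabel) {
        int seen = pathCount.getOrDefault(nodeLabel, 0);
        if (seen == 0) return 0.0;
        return depCount.getOrDefault(nodeLabel, 0) / (double) seen;
    }
}
```

In substep (5), a score of this kind would then be combined with the statement model's probability for the candidate before reranking; how the two are weighted is not specified here.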
(III) Code recommendation, comprising the following substeps:
(1) The user inputs a program containing code to be completed (a hole);
(2) According to the user's input, run the statement model and the parameter model to produce API recommendation results;
(3) The user makes a selection from the API recommendation results;
(4) The program input by the current user is updated according to the user's selection.
In the invention, steps (I) and (III) use the Tree-LSTM model for statement prediction model training and prediction, together with the abstract tree structure representation reflecting API statements, control units, and the control-flow relations between them.
According to the method, API usage code that the developer is likely to use can be recommended line by line based on the code already written, including API object creation/method calls/attribute accesses, control statements (if, while, etc.), variable declarations, and the like; related context variables can also be recommended as parameters for API method calls.
The Tree-LSTM-based API usage code generative recommendation method has the following features:
1. An abstract tree structure representation of code suitable for Tree-LSTM model processing is designed, in which nodes represent abstracted API usage statements and edges represent the control-flow relations between them. The representation abstracts away variables and constants in the code and keeps only API object creation/method calls/attribute accesses, control nodes (if, while, etc.), variable declarations, and the like;
2. The method combines the characteristics of the Child-Sum Tree-LSTM and the N-ary Tree-LSTM into an improved deep learning network suited to learning API code. In the Child-Sum Tree-LSTM, each node may have any number of child nodes, but the children are unordered; in the N-ary Tree-LSTM, each node has at most N child nodes, but the children are ordered. The combination lets each node have any number of child nodes while keeping the children ordered;
3. Based on data-flow analysis, data dependences are added to the abstract tree structure representation and all paths of length greater than 2 are extracted, so that the number of times each node on a path has a data dependence on the path's end node can be counted and used for parameter recommendation.
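For reference, the transition equations of the two Tree-LSTM variants being combined (from Tai et al., 2015; notation follows that paper) make the ordered/unordered distinction in feature 2 concrete. In the Child-Sum variant every child contributes through the same matrices via a sum, so children are interchangeable; in the N-ary variant each child position has its own matrices, so children are ordered:

```latex
% Child-Sum Tree-LSTM: the children C(j) of node j enter through an
% unordered sum of their hidden states.
\tilde{h}_j = \sum_{k \in C(j)} h_k \\
i_j = \sigma\!\left(W^{(i)} x_j + U^{(i)} \tilde{h}_j + b^{(i)}\right) \\
f_{jk} = \sigma\!\left(W^{(f)} x_j + U^{(f)} h_k + b^{(f)}\right) \\
o_j = \sigma\!\left(W^{(o)} x_j + U^{(o)} \tilde{h}_j + b^{(o)}\right) \\
u_j = \tanh\!\left(W^{(u)} x_j + U^{(u)} \tilde{h}_j + b^{(u)}\right) \\
c_j = i_j \odot u_j + \sum_{k \in C(j)} f_{jk} \odot c_k \\
h_j = o_j \odot \tanh(c_j)

% N-ary Tree-LSTM: each child position \ell has its own parameter matrix
% U_\ell, so the gates depend on child order (input gate shown; the other
% gates are analogous).
i_j = \sigma\!\left(W^{(i)} x_j + \sum_{\ell=1}^{N} U^{(i)}_{\ell}\, h_{j\ell} + b^{(i)}\right)
```

The patent's modification, as described in the embodiment, adapts the N-ary-style transition so that the number of children is not bounded by a fixed N.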
Drawings
Fig. 1 is a block diagram of a Tree-LSTM network used in the present invention.
FIG. 2 is an example of a code abstract tree structure representation used by the present invention.
Fig. 3 is a general flow chart of the present invention.
FIG. 4 is an example of a code abstract tree structure representation incorporating data dependencies in the present invention.
Detailed Description
One embodiment for a Java program and JDK API is as follows.
(1) Detailed description of statement model training. Java code is parsed with JavaParser to obtain an AST (abstract syntax tree). Starting from the AST root node, all nodes are traversed with the visitor pattern to extract API usage code; a complete list of APIs obtained via the Java reflection mechanism is used to resolve each API's complete method signature, which is added as a node to the current abstract tree structure representation. For each abstract tree structure representation, traverse from the root node, remove the N nodes after the currently traversed node (N represents the scale of the code to be completed), and replace them with a Hole node representing the code to be completed; training samples can thus be constructed iteratively. The deep learning network used for training is written against the APIs provided by the TensorFlow and TensorFlow Fold frameworks. The overall network structure is implemented on the basis of the Child-Sum Tree-LSTM, while the transition equations are adapted from the N-ary Tree-LSTM (the original N-ary Tree-LSTM is modified so that each node may have any number of child nodes) to fit the Child-Sum implementation.
(2) Detailed description of parameter model training. Java code is parsed with JavaParser to obtain the AST, and the uses of all variables and objects in the code are analyzed on the AST to obtain the data dependency relations. Variable and object names are added to the corresponding nodes, and data-flow edges are added between nodes with data dependency relations, yielding the abstract tree structure representation with data dependences added. From this representation, all paths of length greater than 2 are extracted from the root node by depth-first traversal. For each such path, starting from its first node, every node on the path is checked for a data dependence on the path's end node; if one exists, the count of times that node has a data dependence on the end node is incremented by 1 and updated in a database.
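The depth-first path extraction just described can be sketched as follows. The Node class and method names are illustrative assumptions; paths are taken to be root-anchored downward paths, as the embodiment's description suggests.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of extracting all downward paths of length greater than 2 from the
// root of the abstract tree, by depth-first traversal over the tree's
// control-flow edges.
public class PathExtractor {
    public static class Node {
        final String label;
        final List<Node> children = new ArrayList<>();
        public Node(String label) { this.label = label; }
        public Node child(String l) { Node c = new Node(l); children.add(c); return c; }
    }

    public static List<List<String>> pathsLongerThanTwo(Node root) {
        List<List<String>> out = new ArrayList<>();
        dfs(root, new ArrayList<>(), out);
        return out;
    }

    private static void dfs(Node n, List<String> prefix, List<List<String>> out) {
        prefix.add(n.label);
        if (prefix.size() > 2) out.add(new ArrayList<>(prefix)); // emit every path > 2 nodes
        for (Node c : n.children) dfs(c, prefix, out);
        prefix.remove(prefix.size() - 1); // backtrack
    }
}
```

Each emitted path would then be scanned node by node against its end node to update the data-dependence counts described above.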
(3) Code recommendation embodiment. The user places the cursor at the location in the code editor where API code recommendation is needed and clicks the recommendation button; a recommendation list containing N results is then displayed in the right column of the code editor. The user can select a result from the list, and it is automatically filled in at the cursor position.
Appendix:
public byte[] sign(String message, String digestAlgorithm, PrivateKey pk)
        throws GeneralSecurityException {
    byte[] messageByte = message.getBytes();
    String signMode = null;
    if (pk == null) {
        pk = getPrivateKey("RSA");
        String encryptionAlgorithm = pk.getAlgorithm();
        signMode = combine(digestAlgorithm, encryptionAlgorithm);
    } else {
        String encryptionAlgorithm = pk.getAlgorithm();
        signMode = combine(digestAlgorithm, encryptionAlgorithm);
    }
    Signature sig = Signature.getInstance(signMode);
    sig.initSign(pk);
    sig.update(messageByte);
    return sig.sign();
}

Claims (3)

1. A Tree-LSTM-based API usage code generative recommendation method, characterized in that a large number of training samples are constructed by analyzing a large number of source codes containing target APIs, and on this basis deep learning and a statistical model are used to train an API usage code prediction model; the code prediction model has two parts: statement prediction based on an abstract code tree structure representation and deep learning, and API call parameter prediction based on data dependency analysis and a statistical model;
wherein an abstract tree structure representation of code suitable for Tree-LSTM model processing is designed, in which nodes represent abstracted API usage statements, variable declaration/assignment statements, or control structures, and edges represent the control-flow relations between them; the representation abstracts away variables and constants in the code and keeps only API object creation/method calls/attribute accesses, control nodes, and variable declarations; an API usage statement node abstracts an API usage statement in the code into a complete method signature; a variable declaration/assignment statement node abstracts a variable declaration/assignment statement into a representation that ignores variable names and assigned constants; the control structure nodes If, ElseIf, Else, While, DoWhile, For, Foreach, Try, Catch, Finally, Switch, Case, and Default denote the corresponding control structures; API usage statement nodes and variable declaration/assignment statement nodes have at most one child node; this child node is the abstract representation of the statement following the API usage statement or variable declaration/assignment statement represented by its parent, so the parent-child relation of the two nodes represents a sequential control-flow relation; a control structure node has several child nodes, which represent the code in its different control flows; when analysis of the condition part finishes, a special ConditionEnd node is added to mark the end of the condition part; when analysis of the whole control structure finishes, a special ControlEnd node is added to mark the end of the whole control structure; given a piece of code, parsing starts from the first line and iteratively yields the abstract tree structure representation of the code.
2. The Tree-LSTM-based API usage code generative recommendation method of claim 1, comprising the following specific steps:
(I) Statement model training, comprising the sub-steps of:
(1) Perform static code analysis method by method on the training-data source code to obtain the corresponding abstract tree structure representations;
(2) For the abstract tree structure representation of each method, iteratively traverse from the root node and remove the N nodes after the currently traversed node, where N represents the scale of the code to be completed; replace the removed N nodes with a Hole node representing the code to be completed, so as to obtain code abstract tree structure representations with holes, and take each holed abstract tree structure representation together with the first removed node as a training sample;
(3) Map the abstracted API represented by each node in the holed abstract tree structure representations of all training samples to a vector, then input all training samples into a Tree-LSTM network for training to obtain the statement model;
(II) Parameter model training, comprising the following substeps:
(1a) Carry out data dependency analysis on the code abstract tree structure representations in the training samples obtained in step (I), combined with the corresponding source code; add data dependency edges between nodes with data-flow dependences and add the corresponding variable names to nodes representing variable declarations and assignments, thereby obtaining the abstract tree structure representation with data dependences added; that is, compared with the original abstract tree structure representation, it additionally has data-flow edges between some nodes and the variable names represented by the nodes;
(2a) From the abstract tree structure representation with data dependences obtained in substep (1a), extract all paths of length greater than 2, and for each extracted path count the number of times each node on the path has a data dependence on the path's end node;
(3a) Based on the counts in substep (2a), given a candidate API recommendation and the current abstract tree structure representation with data dependences, compute the number of times each node has a data dependence on the candidate API recommendation node over all paths of length greater than 2 that reach the candidate node;
(4a) According to the counts in substep (3a), compute the probability that each node has a data dependence on the candidate API recommendation node; the higher the probability, the more likely the variable represented by the node becomes a parameter of the candidate API recommendation;
(5a) According to the counts in substep (3a), compute how tightly each candidate API recommendation combines with the current code; the higher the value, the stronger the data dependence between the current candidate API recommendation and the current code and the tighter the combination; then combine it with the probability the statement model gives the current candidate API recommendation to compute a final probability, and finally rerank all candidate API recommendations;
(III) Code recommendation, comprising the substeps of:
(1b) The user inputs a program containing code to be completed, i.e., a hole;
(2b) According to the user's input, run the statement model and the parameter model to produce API recommendation results;
(3b) The user makes a selection from the API recommendation results;
(4b) The program input by the current user is updated according to the user's selection.
3. The Tree-LSTM-based API usage code generative recommendation method of claim 2, wherein steps (I) and (III) employ a Tree-LSTM model for statement prediction model training and prediction, together with an abstract tree structure representation reflecting API statements, control units, and the control-flow relations between them.
CN201811501452.2A 2018-12-10 2018-12-10 Tree-LSTM-based API (application program interface) use code generation type recommendation method Active CN109739494B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811501452.2A CN109739494B (en) 2018-12-10 2018-12-10 Tree-LSTM-based API (application program interface) use code generation type recommendation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811501452.2A CN109739494B (en) 2018-12-10 2018-12-10 Tree-LSTM-based API (application program interface) use code generation type recommendation method

Publications (2)

Publication Number Publication Date
CN109739494A CN109739494A (en) 2019-05-10
CN109739494B (en) 2023-05-02

Family

ID=66359217

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811501452.2A Active CN109739494B (en) 2018-12-10 2018-12-10 Tree-LSTM-based API (application program interface) use code generation type recommendation method

Country Status (1)

Country Link
CN (1) CN109739494B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112230781B * 2019-07-15 2023-07-25 Tencent Technology (Shenzhen) Co., Ltd. Character recommendation method, device and storage medium
US11150877B2 * 2019-11-06 2021-10-19 Google Llc Automatically generating machine learning models for software tools that operate on source code
CN111459491B * 2020-03-17 2021-11-05 Nanjing University of Aeronautics and Astronautics Code recommendation method based on tree neural network
CN111723192B * 2020-06-19 2024-02-02 Nankai University Code recommendation method and device
CN111966817B * 2020-07-24 2022-05-20 Fudan University API recommendation method based on deep learning and code context structure and text information
CN111966818B * 2020-07-26 2024-03-08 Fudan University Interactive API code snippet recommendation method based on deep learning
CN112162746B * 2020-10-29 2022-07-05 National University of Defense Technology Automatic program construction method based on network knowledge convergence and iterative search
CN113076089B * 2021-04-15 2023-11-21 Nanjing University API completion method based on object type

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106104521A * 2014-01-10 2016-11-09 Cluep Inc. System, device and method for automatically detecting emotion in text
CN107924579A * 2015-08-14 2018-04-17 Metail Limited Method for generating a personalized 3D head model or 3D body model

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105335352A * 2015-11-30 2016-02-17 Wuhan University Entity recognition method based on Weibo sentiment
RU2635275C1 (en) * 2016-07-29 2017-11-09 Акционерное общество "Лаборатория Касперского" System and method of identifying user's suspicious activity in user's interaction with various banking services
US11200265B2 (en) * 2017-05-09 2021-12-14 Accenture Global Solutions Limited Automated generation of narrative responses to data queries


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhang Fengyi; Peng Xin; Chen Chi; Zhao Wenyun. A survey of deep-learning-based code analysis research. Computer Applications and Software, No. 06, full text. *

Also Published As

Publication number Publication date
CN109739494A (en) 2019-05-10

Similar Documents

Publication Publication Date Title
CN109739494B (en) Tree-LSTM-based API (application program interface) use code generation type recommendation method
CN111708539B (en) Application program code conversion method and device, electronic equipment and storage medium
CN108388425B (en) Method for automatically completing codes based on LSTM
US11561772B2 (en) Low-code development platform
Heckel et al. Graph Transformation for Software Engineers
Ghamarian et al. Modelling and analysis using GROOVE
CN103164249B (en) Extension mechanism for script compiler
CA3050159C (en) Artificial intelligence (ai) based automatic rule generation
CN113360915A (en) Intelligent contract multi-vulnerability detection method and system based on source code graph representation learning
CN105378658B (en) Automatic source code generates
US20200210158A1 (en) Automated or machine-enhanced source code debugging
US11275625B2 (en) System and method for automated application programming interface generation
CN116406459A (en) Code processing method, device, equipment and medium
US20220075710A1 (en) System and method for improved unit test creation
Mushtaq et al. Multilingual source code analysis: State of the art and challenges
CN109857458B (en) ANTLR-based AltaRica3.0 flattening transformation method
CN116755669A (en) Low code development method and tool based on DSL language operation model
CN116610558A (en) Code detection method, device, electronic equipment and computer readable storage medium
Grechanik et al. Differencing graphical user interfaces
Bieber et al. A library for representing python programs as graphs for machine learning
CN113885844A (en) Business service arranging method and related device
Ameedeen A model driven approach to analysis and synthesis of sequence diagrams
CN111538504A (en) Syntax information extraction method, equipment and storage medium based on solid intelligent contract
Lu et al. DATAM: A model‐based tool for dependability analysis
KR101974804B1 (en) Flow diagram generation method and apparatus performing the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant