CN109614103A - Character-based code completion method and system - Google Patents
Character-based code completion method and system
- Publication number
- CN109614103A (application CN201811223489.3A)
- Authority
- CN
- China
- Prior art keywords
- code
- completion
- character
- model
- source code
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/40—Transformation of program code
- G06F8/41—Compilation
- G06F8/44—Encoding
- G06F8/447—Target code generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Abstract
The present invention provides a character-based code completion method and system, comprising: a source code processing step, which analyzes source code using an abstract syntax tree and an identifier tokenization tool; a model training step, which trains a language model using an LSTM model; and a prediction completion step, which completes code according to the trained language model. The invention uses a recurrent neural network (RNN) from deep learning to build a language model over a large-scale code corpus and predict subsequent code. The invention proposes a restricted character-level recurrent neural network and applies it to method-call completion in Java. During programming, the invention can complete not only program statements but also individual keywords. The technical solution is simple and fast, and improves both the accuracy and the efficiency of code recommendation.
Description
Technical field
The present invention relates to the field of computer software engineering, and in particular to a character-based code completion method and system.
Background technique
Automatic code generation is one of the research hotspots in software engineering in recent years. It greatly reduces programmers' workload and improves development efficiency. With the growth of open-source communities, code can be generated by analyzing large amounts of existing code. One major difficulty of automatic code generation is that source code itself carries many constraints and restrictions. In recent years, building on earlier program-synthesis research based on combinatorial optimization, new program-generation methods based on machine learning techniques have emerged.
Depending on the techniques used and the application scenario, current program-generation methods fall into two classes: generation based on program input/output examples, and generation based on the linguistic properties of program code. Input/output-based program synthesis mainly relies on machine learning: a training set is built from the correspondence between program inputs and outputs, and a model is trained on it so that it mimics the program's input/output behavior; methods based on deep neural networks are representative of this class. Generation based on programming-language models instead exploits the statistical regularities of the programming language itself: a machine learning model of the language is learned from a large body of program code, and new code is generated by auto-completing existing code with this model.
LSTM (Long Short-Term Memory) is a type of recurrent neural network suited to processing and predicting time series with relatively long intervals and delays between important events. LSTM has many applications in science and technology: LSTM-based systems can learn tasks such as language translation, robot control, image analysis, document summarization, speech recognition, image recognition, handwriting recognition, chatbot control, disease prediction, click-through-rate and stock prediction, and music composition.
Chinese patent application 201710687197.4 describes a code recommendation method based on long short-term memory networks (LSTM). To address the generally low accuracy and efficiency of existing code recommendation techniques, that invention first extracts API sequences from source code, builds a code recommendation model with an LSTM to learn the relationships between API calls, and then performs code recommendation. Dropout is used to prevent model overfitting, and the ReLU function replaces traditional saturating activations to mitigate the vanishing-gradient problem, speed up model convergence, and improve model performance, giving full play to the advantages of neural networks.
However, what that patent actually performs is API recommendation, which still falls well short of code-level recommendation or auto-completion: it cannot recommend arbitrary code at arbitrary positions.
Most programmers reuse code through framework or library APIs during software development, but it is practically impossible to remember every API, given how many exist. Code auto-completion has therefore become an indispensable part of modern integrated development environments (IDEs). Statistics show that code completion is among the ten commands developers use most often. A completion mechanism tries to complete the remainder of the program as the programmer types; intelligent code completion accelerates the software development process by eliminating typing errors and recommending suitable APIs.
Summary of the invention
To solve the above problems, the present invention uses a recurrent neural network (RNN) from deep learning to build a language model over a large-scale code corpus and predict subsequent code. The invention proposes a restricted character-level recurrent neural network and applies it to method-call completion in Java.
Specifically, the present invention provides a character-based code completion method, comprising:
a source code processing step, which analyzes source code using an abstract syntax tree and an identifier tokenization tool;
a model training step, which trains a language model using an LSTM model;
a prediction completion step, which completes code according to the trained language model.
Preferably, in the source code processing step, the source code is parsed into different forms to obtain the classes, method lists, and identifiers of the code.
Preferably, the LSTM model is a stacked LSTM model located at the hidden layer of the RNN model.
Preferably, in the prediction completion step, a partial code fragment is fed to the trained language model, which outputs recommended code elements according to the context.
Preferably, the method further comprises: traversing the abstract syntax tree of the partial program to extract the object on which a method is called and the class it belongs to.
Preferably, the method further comprises: obtaining all member methods of that class by static analysis.
According to another aspect of the present invention, a character-based code completion system is also provided, comprising the following sequentially connected modules:
a source code processing module, which analyzes source code using an abstract syntax tree and an identifier tokenization tool;
a model training module, which trains a language model using an LSTM model;
a prediction completion module, which completes code according to the trained language model.
Preferably, the source code processing module parses the source code into different forms to obtain the classes, method lists, and identifiers of the code.
During programming, the invention can complete not only program statements but also individual keywords. The technical solution is simple and fast, and improves both the accuracy and the efficiency of code recommendation.
Detailed description of the invention
Various other advantages and benefits will become clear to those of ordinary skill in the art from the following detailed description of the preferred embodiments. The drawings serve only to illustrate the preferred embodiments and are not to be considered limiting of the invention. Throughout the drawings, the same reference numerals denote the same parts. In the drawings:
Fig. 1 is a flow diagram of the character-based code completion method of the present invention.
Fig. 2 is a structural diagram of the character-based code completion system of the present invention.
Specific embodiment
Illustrative embodiments of the present invention are described in more detail below with reference to the drawings. Although the drawings show illustrative embodiments of the invention, it should be understood that the invention may be realized in various forms and is not limited to the embodiments set forth here; rather, these embodiments are provided so that the invention can be understood thoroughly and its scope conveyed fully to those skilled in the art.
The present invention uses a recurrent neural network (RNN) from deep learning to build a language model over a large-scale code corpus and predict subsequent code. A neural language model generally receives as input words from a fixed-size vocabulary and maps these words to vector representations in a fixed-dimensional space, commonly called word vectors. The word vectors usually serve as the parameters of the network's first layer, so the larger the vocabulary, the more parameters the network needs. Unlike natural language, programming languages contain a large number of user-defined identifiers; such an identifier may be a concatenation of several meaningful words, or merely isolated letters with no semantics.
Reducing the vocabulary size of the corpus may therefore improve the model's performance. On this basis, the present invention proposes a restricted character-level recurrent neural network and applies it to method-call completion in Java.
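The vocabulary argument above can be illustrated with a small sketch (the random identifiers and sizes are illustrative, not taken from the patent): as a corpus grows, a token-level vocabulary gains an entry for every new user-defined identifier, while a character-level vocabulary stays bounded by the character set.

```python
import random
import string

random.seed(0)

# Simulate a corpus of user-defined identifiers (illustrative only).
identifiers = ["".join(random.choices(string.ascii_letters, k=8))
               for _ in range(5000)]
corpus = " ".join(identifiers)

token_vocab = set(identifiers)   # one entry per distinct identifier
char_vocab = set(corpus)         # bounded by the character set

# The token vocabulary tracks the corpus size; the character vocabulary
# saturates at (here) the 52 letters plus the space character.
print(len(token_vocab), len(char_vocab))
```

This is the effect the patent exploits: the token-level model in the experiments needs a 20,798-entry vocabulary, while the character-level model needs only 97 entries.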
Fig. 1 is a flow diagram of the character-based code completion method of the present invention, comprising the following steps:
S1, source code processing step: analyze the source code using an abstract syntax tree and a tokenization tool. In this step, the source code is parsed into different forms to obtain the classes, method lists, identifiers, and other elements of the code.
An abstract syntax tree (AST), or syntax tree, is a tree representation of the abstract syntactic structure of source code, in particular source code of a programming language. Its counterpart is the concrete syntax tree, commonly called the parse tree. In general, a parser creates the parse tree during the translation and compilation of source code. Once the AST is created, later processing stages, such as the semantic analysis stage, may add information to it.
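The patent parses Java source, but the shape of the processing step can be sketched with Python's standard `ast` module as a stand-in for a Java parser (the class and calls below are invented for illustration): traverse the tree to collect class names, member-method lists, and the receiver object of each method call.

```python
import ast

source = """
class Greeter:
    def greet(self, name):
        return "Hello, " + name

g = Greeter()
g.greet("world")
"""

tree = ast.parse(source)

# Walk the AST to collect class names, method lists, and call receivers --
# the same kinds of artifacts the source code processing step extracts.
classes, methods, call_receivers = [], [], []
for node in ast.walk(tree):
    if isinstance(node, ast.ClassDef):
        classes.append(node.name)
        methods.extend(n.name for n in node.body
                       if isinstance(n, ast.FunctionDef))
    elif isinstance(node, ast.Call) and isinstance(node.func, ast.Attribute):
        # obj.method(...) -- record the receiver object and the method name.
        if isinstance(node.func.value, ast.Name):
            call_receivers.append((node.func.value.id, node.func.attr))

print(classes)         # ['Greeter']
print(methods)         # ['greet']
print(call_receivers)  # [('g', 'greet')]
```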
S2, model training step: train a language model using an LSTM model. The training corpus obtained by parsing in step S1 is used to train a recurrent neural network language model based on long short-term memory (Long Short-Term Memory, LSTM). The LSTM model is a stacked LSTM model located at the hidden layer of the RNN model.
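The stacked character-level LSTM can be sketched as a minimal NumPy forward pass. This is a sketch under assumed dimensions (the hidden size of 128 is an assumption; the patent reports only the 97-character vocabulary), not the Keras implementation the patent actually used:

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB, HIDDEN = 97, 128   # 97-character vocabulary as in the experiments

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step: input, forget, candidate, and output gates."""
    z = W @ x + U @ h + b
    n = h.shape[0]
    i = 1.0 / (1.0 + np.exp(-z[:n]))          # input gate
    f = 1.0 / (1.0 + np.exp(-z[n:2 * n]))     # forget gate
    g = np.tanh(z[2 * n:3 * n])               # candidate cell state
    o = 1.0 / (1.0 + np.exp(-z[3 * n:]))      # output gate
    c = f * c + i * g                         # long-term memory
    h = o * np.tanh(c)                        # hidden state for the next layer
    return h, c

# Two stacked LSTM layers: layer 1 reads one-hot characters,
# layer 2 reads layer 1's hidden state.
layers = []
for in_dim in (VOCAB, HIDDEN):
    layers.append((rng.normal(0, 0.1, (4 * HIDDEN, in_dim)),
                   rng.normal(0, 0.1, (4 * HIDDEN, HIDDEN)),
                   np.zeros(4 * HIDDEN)))
W_out = rng.normal(0, 0.1, (VOCAB, HIDDEN))

def next_char_distribution(char_ids):
    """Run the stacked LSTM over a character sequence; return P(next char)."""
    h = [np.zeros(HIDDEN) for _ in layers]
    c = [np.zeros(HIDDEN) for _ in layers]
    for t in char_ids:
        x = np.zeros(VOCAB)
        x[t] = 1.0                            # one-hot character input
        for k, (W, U, b) in enumerate(layers):
            h[k], c[k] = lstm_step(x, h[k], c[k], W, U, b)
            x = h[k]
    logits = W_out @ h[-1]
    e = np.exp(logits - logits.max())
    return e / e.sum()                        # softmax over the 97 characters

probs = next_char_distribution([5, 17, 42])
```

In training, the cross-entropy between this distribution and the actual next character of the corpus would be minimized; with random weights the output is simply a valid probability distribution over the vocabulary.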
S3, prediction completion step: complete code according to the trained language model. In this step, a partial code fragment is fed to the trained language model, which outputs recommended code elements according to the context.
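The prediction step can be illustrated with a toy stand-in for the trained model (the training text is invented, and a longest-suffix n-gram table plays the role of the LSTM): feed the typed prefix, repeatedly take the most probable next character, and stop at a delimiter.

```python
from collections import Counter, defaultdict

# Toy "training corpus" standing in for the large-scale code corpus.
train = "import java.io.File;\nimport java.util.List;\n"

# Longest-suffix n-gram model: for each context of length 1..N,
# count which character follows it.
N = 4
model = defaultdict(Counter)
for i in range(len(train) - 1):
    for k in range(1, N + 1):
        if i - k + 1 >= 0:
            model[train[i - k + 1:i + 1]][train[i + 1]] += 1

def next_char(context):
    """Back off from the longest matching suffix to shorter ones."""
    for k in range(min(N, len(context)), 0, -1):
        dist = model.get(context[-k:])
        if dist:
            return dist.most_common(1)[0][0]
    return "\n"

def complete(prefix, stop=".;\n", max_len=30):
    """Greedily extend the typed prefix until a delimiter appears."""
    out = prefix
    while len(out) < max_len and out[-1] not in stop:
        out += next_char(out)
    return out

print(complete("im"))  # -> "import java."
```

A trained character-level LSTM replaces the n-gram table, but the decoding loop, which extends the context one most-probable character at a time, is the same.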
As shown in Fig. 2, according to another aspect of the present invention, a character-based code completion system 100 is also provided, comprising the following sequentially connected modules:
a source code processing module 110, which analyzes source code using an abstract syntax tree and an identifier tokenization tool; preferably, the source code processing module parses the source code into different forms to obtain the classes, method lists, and identifiers of the code;
a model training module 120, which trains a language model using an LSTM model;
a prediction completion module 130, which completes code according to the trained language model.
The working principle of the present invention is as follows. When generating a method name character by character, the invention applies a restriction to the generation process of the recurrent neural network. First, the invention traverses the abstract syntax tree (AST) of the partial program to extract the object on which the method is called and the class it belongs to. It then obtains all member methods of that class by static analysis. Using this information as a constraint, the generation space is limited to these possible methods.
During programming, the invention can complete not only program statements but also individual keywords. For example, after `import` is typed, the original method can generate `java`, then `io`, then `iostream`, and so on; with character-level completion, typing `im` can auto-complete to `import`, typing `j` can auto-complete to `java`, and typing `ios` can auto-complete to `iostream`.
The technical solution of the present invention is simple and fast, and improves both the accuracy and the efficiency of code recommendation.
Experiments and results
The present invention uses the source code of Eclipse 3.5.2 as the data set and focuses on API calls into the Standard Widget Toolkit (SWT) library. As described above, the experiments use the restricted character-level LSTM model, with the number of partial-code input lines ranging from 1 to 12. The vocabulary size of the model is 97, comprising uppercase letters, lowercase letters, digits, and the other characters used in code.
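The reported 97-character vocabulary is consistent with the ASCII character classes named above plus whitespace; one plausible reconstruction (the exact composition is not given in the patent) is:

```python
import string

# Upper/lower-case letters, digits, ASCII punctuation, and three whitespace
# characters: 52 + 10 + 32 + 3 = 97 entries, matching the reported size.
chars = sorted(set(string.ascii_letters + string.digits +
                   string.punctuation + " \n\t"))
char_to_id = {ch: i for i, ch in enumerate(chars)}
print(len(char_to_id))  # 97
```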
To verify the model's effectiveness, an unrestricted character-level LSTM model and a token-level LSTM model were also run as controls. The unrestricted character-level LSTM experiment uses the same settings as above; for the token-level LSTM, the number of partial-code input lines is 5, 10, 15, 20, or 25, the vocabulary size of the model is 20,798, and the other hyperparameters match the experiments above.
For implementation, the present invention uses Keras, an open-source deep learning framework built on the Python language. The models were trained on one Nvidia GTX 1080 GPU; training ran for 50 epochs and took about 10 hours.
The present invention measures the accuracy of completing SWT API calls in Eclipse 3.5.2, reporting accuracy at Top-1, Top-3, and Top-10. Table 1 shows the accuracy, recall, and F-measure achieved by three completion mechanisms: a token-level LSTM, an unrestricted character-level LSTM, and the restricted LSTM. The LOC (lines of code) value in parentheses indicates the input length at which the model achieves its highest accuracy; for example, the restricted character-level LSTM performs best with 12 lines of code as input.
Table 1. Results of completing SWT API calls in Eclipse 3.5.2
For 77.9% of method calls, the restricted character-level LSTM of the present invention directly generates an accurate recommendation, and for 94.5% of method calls the correct method appears in the Top-10 of the recommendation list. The token-level LSTM performs somewhat worse, with Top-1 accuracy 8.1% lower, which we attribute to its vocabulary of 20,798 entries. Compared with the unrestricted character-level model, adding the restriction improves Top-1 accuracy by 11%. For the method completion task, these results show that the size of the generation space matters more than the vocabulary size.
It should be understood that the algorithms and displays provided herein are not inherently tied to any particular computer, virtual device, or other equipment. Various general-purpose devices may also be used with the teachings herein; the structure required to construct such devices is apparent from the description above. Moreover, the present invention is not directed to any particular programming language: various programming languages may be used to implement the invention described herein, and the description of any specific language above serves to disclose the best mode of the invention.
In the specification provided here, numerous specific details are set forth. It should be understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures, and techniques are not shown in detail so as not to obscure the understanding of this specification.
Similarly, it should be understood that, to streamline the disclosure and aid understanding of one or more of the various inventive aspects, features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof in the description of exemplary embodiments above. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. The claims following the detailed description are hereby expressly incorporated into that description, with each claim standing on its own as a separate embodiment of the invention.
Those skilled in the art will understand that the modules in the devices of an embodiment may be adaptively changed and arranged in one or more devices different from that embodiment. Modules, units, or components of an embodiment may be combined into one module, unit, or component, and may furthermore be divided into multiple sub-modules, sub-units, or sub-components. Except where at least some of such features and/or processes or units are mutually exclusive, all features disclosed in this specification (including the accompanying claims, abstract, and drawings) and all processes or units of any method or apparatus so disclosed may be combined in any combination. Unless expressly stated otherwise, each feature disclosed in this specification (including the accompanying claims, abstract, and drawings) may be replaced by an alternative feature serving the same, equivalent, or similar purpose.
Furthermore, those skilled in the art will appreciate that although some embodiments described herein include certain features that are included in other embodiments but not others, combinations of features of different embodiments fall within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will understand that a microprocessor or digital signal processor (DSP) may in practice be used to realize some or all of the functions of some or all of the components of the apparatus according to embodiments of the invention. The invention may also be implemented as a device or apparatus program (for example, a computer program and a computer program product) for performing some or all of the methods described herein. Such a program implementing the invention may be stored on a computer-readable medium or take the form of one or more signals; such signals may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above embodiments illustrate rather than limit the invention, and those skilled in the art may design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements and by means of a suitably programmed computer. In a unit claim enumerating several devices, several of these devices may be embodied by one and the same item of hardware. The use of the words first, second, and third does not indicate any ordering; these words may be interpreted as names.
The foregoing is only a preferred embodiment of the present invention, but the protection scope of the invention is not limited thereto. Any changes or substitutions readily conceivable by anyone skilled in the art within the technical scope disclosed by the invention shall be covered by its protection scope. The protection scope of the invention shall therefore be determined by the protection scope of the claims.
Claims (8)
1. A character-based code completion method, characterized by comprising:
a source code processing step, which analyzes source code using an abstract syntax tree and an identifier tokenization tool;
a model training step, which trains a language model using an LSTM model;
a prediction completion step, which completes code according to the trained language model.
2. The character-based code completion method according to claim 1, characterized in that:
in the source code processing step, the source code is parsed into different forms to obtain the classes, method lists, and identifiers of the code.
3. The character-based code completion method according to claim 1 or 2, characterized in that:
the LSTM model is a stacked LSTM model located at the hidden layer of the RNN model.
4. The character-based code completion method according to claim 1, characterized in that:
in the prediction completion step, a partial code fragment is fed to the trained language model, which outputs recommended code elements according to the context.
5. The character-based code completion method according to claim 1, characterized in that:
the method further comprises: traversing the abstract syntax tree of the partial program to extract the object on which a method is called and the class it belongs to.
6. The character-based code completion method according to claim 5, characterized in that:
the method further comprises: obtaining all member methods of that class by static analysis.
7. A character-based code completion system, characterized by comprising the following sequentially connected modules:
a source code processing module, which analyzes source code using an abstract syntax tree and an identifier tokenization tool;
a model training module, which trains a language model using an LSTM model;
a prediction completion module, which completes code according to the trained language model.
8. The character-based code completion system according to claim 7, characterized in that:
the source code processing module parses the source code into different forms to obtain the classes, method lists, and identifiers of the code.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811223489.3A CN109614103A (en) | 2018-10-19 | 2018-10-19 | A kind of code completion method and system based on character |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811223489.3A CN109614103A (en) | 2018-10-19 | 2018-10-19 | A kind of code completion method and system based on character |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109614103A true CN109614103A (en) | 2019-04-12 |
Family
ID=66002914
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811223489.3A Pending CN109614103A (en) | 2018-10-19 | 2018-10-19 | A kind of code completion method and system based on character |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109614103A (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111459491A (en) * | 2020-03-17 | 2020-07-28 | 南京航空航天大学 | Code recommendation method based on tree neural network |
CN112230781A (en) * | 2019-07-15 | 2021-01-15 | 腾讯科技(深圳)有限公司 | Character recommendation method and device and storage medium |
CN112306497A (en) * | 2020-11-03 | 2021-02-02 | 高炼 | Method and system for converting natural language into program code |
CN113821198A (en) * | 2021-09-14 | 2021-12-21 | 中南大学 | Code completion method, system, storage medium and computer program product |
WO2022126909A1 (en) * | 2020-12-18 | 2022-06-23 | 平安科技(深圳)有限公司 | Code completion method and apparatus, and related device |
CN117573096A (en) * | 2024-01-17 | 2024-02-20 | 合肥综合性国家科学中心人工智能研究院(安徽省人工智能实验室) | Intelligent code completion method integrating abstract syntax tree structure information |
CN112306497B (en) * | 2020-11-03 | 2024-04-26 | 高炼 | Method and system for converting natural language into program code |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070277163A1 (en) * | 2006-05-24 | 2007-11-29 | Syver, Llc | Method and tool for automatic verification of software protocols |
CN108388425A (en) * | 2018-03-20 | 2018-08-10 | 北京大学 | A method of based on LSTM auto-complete codes |
CN108563433A (en) * | 2018-03-20 | 2018-09-21 | 北京大学 | A kind of device based on LSTM auto-complete codes |
CN108595165A (en) * | 2018-04-25 | 2018-09-28 | 清华大学 | A kind of code completion method, apparatus and storage medium based on code intermediate representation |
- 2018-10-19: application CN201811223489.3A filed; published as CN109614103A (status: pending)
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070277163A1 (en) * | 2006-05-24 | 2007-11-29 | Syver, Llc | Method and tool for automatic verification of software protocols |
CN108388425A (en) * | 2018-03-20 | 2018-08-10 | 北京大学 | A method of based on LSTM auto-complete codes |
CN108563433A (en) * | 2018-03-20 | 2018-09-21 | 北京大学 | A kind of device based on LSTM auto-complete codes |
CN108595165A (en) * | 2018-04-25 | 2018-09-28 | 清华大学 | A kind of code completion method, apparatus and storage medium based on code intermediate representation |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112230781A (en) * | 2019-07-15 | 2021-01-15 | 腾讯科技(深圳)有限公司 | Character recommendation method and device and storage medium |
CN112230781B (en) * | 2019-07-15 | 2023-07-25 | 腾讯科技(深圳)有限公司 | Character recommendation method, device and storage medium |
CN111459491A (en) * | 2020-03-17 | 2020-07-28 | 南京航空航天大学 | Code recommendation method based on tree neural network |
CN111459491B (en) * | 2020-03-17 | 2021-11-05 | 南京航空航天大学 | Code recommendation method based on tree neural network |
CN112306497A (en) * | 2020-11-03 | 2021-02-02 | 高炼 | Method and system for converting natural language into program code |
CN112306497B (en) * | 2020-11-03 | 2024-04-26 | 高炼 | Method and system for converting natural language into program code |
WO2022126909A1 (en) * | 2020-12-18 | 2022-06-23 | 平安科技(深圳)有限公司 | Code completion method and apparatus, and related device |
CN113821198A (en) * | 2021-09-14 | 2021-12-21 | 中南大学 | Code completion method, system, storage medium and computer program product |
CN113821198B (en) * | 2021-09-14 | 2023-10-24 | 中南大学 | Code complement method, system, storage medium and computer program product |
CN117573096A (en) * | 2024-01-17 | 2024-02-20 | 合肥综合性国家科学中心人工智能研究院(安徽省人工智能实验室) | Intelligent code completion method integrating abstract syntax tree structure information |
CN117573096B (en) * | 2024-01-17 | 2024-04-09 | 合肥综合性国家科学中心人工智能研究院(安徽省人工智能实验室) | Intelligent code completion method integrating abstract syntax tree structure information |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108388425B (en) | Method for automatically completing codes based on LSTM | |
CN109614103A (en) | A kind of code completion method and system based on character | |
Chakraborty et al. | On multi-modal learning of editing source code | |
CN108563433A (en) | A kind of device based on LSTM auto-complete codes | |
CN109582352A (en) | A kind of code completion method and system based on double AST sequences | |
CN109739494B (en) | Tree-LSTM-based API (application program interface) use code generation type recommendation method | |
US11455150B2 (en) | Accelerating application modernization | |
CN106775913B (en) | A kind of object code controlling stream graph generation method | |
WO2009108647A1 (en) | Evaluating software programming skills | |
US5949993A (en) | Method for the generation of ISA simulators and assemblers from a machine description | |
CN108563561B (en) | Program implicit constraint extraction method and system | |
CN107153535A (en) | A kind of operation ElasticSearch method and device | |
CN114911711A (en) | Code defect analysis method and device, electronic equipment and storage medium | |
CN112416806A (en) | JS engine fuzzy test method based on standard document analysis | |
CN106648818B (en) | A kind of object code controlling stream graph generation system | |
CN115238045A (en) | Method, system and storage medium for extracting generation type event argument | |
US20220075710A1 (en) | System and method for improved unit test creation | |
Huang et al. | Codecot and beyond: Learning to program and test like a developer | |
Del Castillo | The ASM Workbench: A Tool Environment for Computer-Aided Analysis and Validation of Abstract State Machine Models: Tool Demonstration | |
CN117795474A (en) | Source code for domain specific language synthesized from natural language text | |
CN111694738B (en) | Method for generating SQL test script | |
JP2017522639A5 (en) | ||
CN106126225B (en) | A kind of object code reverse engineering approach based on program evolution model | |
US7543274B2 (en) | System and method for deriving a process-based specification | |
Murawski et al. | A contextual equivalence checker for IMJ |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||