CN118051429A - Test processing method and system - Google Patents


Publication number
CN118051429A
Authority
CN
China
Prior art keywords
code
tested
unit test
test
updated
Prior art date
Legal status
Pending
Application number
CN202410039294.2A
Other languages
Chinese (zh)
Inventor
王朔
戈伟
Current Assignee
Wuzhou Online E Commerce Beijing Co ltd
Original Assignee
Wuzhou Online E Commerce Beijing Co ltd
Priority date
Filing date
Publication date
Application filed by Wuzhou Online E Commerce Beijing Co ltd
Priority to CN202410039294.2A
Publication of CN118051429A
Legal status: Pending

Landscapes

  • Debugging And Monitoring (AREA)

Abstract

Embodiments of this specification provide a test processing method and a test processing system. The test processing method includes: parsing the project code to obtain code related to the initial code to be tested, and assembling the related code with the initial code to be tested to obtain updated code to be tested; inputting the updated code to be tested into a large model, and generating a unit test of the updated code to be tested using the large model; detecting whether the unit test contains errors using a preset detection strategy; if errors exist, determining the error information corresponding to the errors; and inputting the updated code to be tested, the unit test, and the error information into the large model, and generating a repaired unit test using the large model.

Description

Test processing method and system
Technical Field
The embodiment of the specification relates to the technical field of computers, in particular to a test processing method and a test processing system.
Background
Software development spans an entire development pipeline, and unit testing, which verifies the functionality of the software, is an important link in that pipeline. A unit test, also called a module test, checks the correctness of a program module and can improve the quality of software development.
Some unit test generation tools, including deep-learning-based ones, can already generate unit tests. However, the quality of the tests these tools generate is often uneven, which degrades the quality of the software code and affects project progress and development efficiency. Meanwhile, current software development demands both high speed and high quality. How to process unit tests so as to meet these speed and quality requirements is therefore an important problem to be solved.
Disclosure of Invention
In view of this, embodiments of this specification provide a test processing method. One or more embodiments of this specification also relate to a test processing system, a computing device, a computer-readable storage medium, and a computer program, which address the technical deficiencies of the prior art.
According to a first aspect of embodiments of this specification, a test processing method is provided, including: parsing the project code to obtain code related to the initial code to be tested, and assembling the related code with the initial code to be tested to obtain updated code to be tested; inputting the updated code to be tested into a large model, and generating a unit test of the updated code to be tested using the large model; detecting whether the unit test contains errors using a preset detection strategy; if errors exist, determining the error information corresponding to the errors; and inputting the updated code to be tested, the unit test, and the error information into the large model, and generating a repaired unit test using the large model.
According to a second aspect of embodiments of this specification, a test processing method applied to a cloud-side device is provided, including: receiving a unit test generation request sent by an end-side device, and inputting the updated code to be tested carried by the request into a large model to generate the corresponding unit test, where the updated code to be tested is obtained by parsing the project code for code related to the initial code to be tested and assembling the related code with the initial code to be tested; returning the unit test to the end-side device, so that the end-side device detects whether the unit test contains errors and, if so, determines the corresponding error information and sends the updated code to be tested, the unit test, and the error information back to the cloud-side device; inputting unit test repair prompt information into the large model and repairing the unit test using the large model to obtain a repaired unit test, where the repair prompt information includes the updated code to be tested, the unit test, and the error information; and sending the repaired unit test to the end-side device.
According to a third aspect of embodiments of the present specification, there is provided a test processing system comprising: the end side device to which the test processing method according to any embodiment of the present specification is applied, and the cloud side device to which the test processing method according to any embodiment of the present specification is applied.
According to a fourth aspect of embodiments of the present specification, there is provided a computing device comprising: a memory and a processor; the memory is configured to store computer-executable instructions that, when executed by the processor, perform the steps of the test processing method of any of the embodiments of the present specification.
According to a fifth aspect of embodiments of the present specification, there is provided a computer readable storage medium storing computer executable instructions which, when executed by a processor, implement the steps of the test processing method of any embodiment of the present specification.
According to a sixth aspect of the embodiments of the present specification, there is provided a computer program, wherein the computer program, when executed in a computer, causes the computer to perform the steps of the test processing method described above.
With the above method, the project code is parsed to obtain code related to the initial code to be tested; the related code and the initial code to be tested are assembled into updated code to be tested; the updated code to be tested is input into a large model, which generates its unit test; a preset detection strategy detects whether the unit test contains errors; if errors exist, the corresponding error information is determined; and the updated code to be tested, the unit test, and the error information are input into the large model, which generates a repaired unit test.
Drawings
FIG. 1 is a block diagram of a test processing system provided in one embodiment of the present disclosure;
FIG. 2 is a flow chart of a test processing method provided in one embodiment of the present disclosure;
FIG. 3 is a process flow diagram of a test processing method according to one embodiment of the present disclosure;
fig. 4 is a signaling interaction schematic diagram of a test processing method according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a test handler according to one embodiment of the present disclosure;
FIG. 6 is a flow chart of a test processing method according to another embodiment of the present disclosure;
FIG. 7 is a block diagram of a test processing system provided in one embodiment of the present disclosure;
FIG. 8 is a block diagram of a computing device provided in one embodiment of the present description.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of this specification. This specification may, however, be embodied in many forms other than those described here, and those skilled in the art can make similar generalizations without departing from its spirit; this specification is therefore not limited by the specific implementations disclosed below.
The terminology used in the one or more embodiments of the specification is for the purpose of describing particular embodiments only and is not intended to be limiting of the one or more embodiments of the specification. As used in this specification, one or more embodiments and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in one or more embodiments of the present specification refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that, although the terms first, second, etc. may be used in one or more embodiments of this specification to describe various information, the information should not be limited by these terms, which are only used to distinguish one type of information from another. For example, without departing from the scope of one or more embodiments of this specification, a "first" may also be referred to as a "second", and vice versa. The word "if", as used herein, may be interpreted as "when", "upon", or "in response to a determination", depending on the context.
Furthermore, it should be noted that, user information (including, but not limited to, user equipment information, user personal information, etc.) and data (including, but not limited to, data for analysis, stored data, presented data, etc.) according to one or more embodiments of the present disclosure are information and data authorized by a user or sufficiently authorized by each party, and the collection, use, and processing of relevant data is required to comply with relevant laws and regulations and standards of relevant countries and regions, and is provided with corresponding operation entries for the user to select authorization or denial.
The large model involved in the software development method provided in one or more embodiments of this specification is a deep machine learning model with large-scale parameters, typically hundreds of millions, billions, hundreds of billions, or even trillions of model parameters, with specialized code understanding and code generation capabilities obtained by training on hundreds of millions of items of specialized code data. A large model, also called a foundation model, is pre-trained on a large-scale unlabeled corpus to produce a model with more than a hundred million parameters; such a model adapts to a wide range of downstream tasks and has good generalization capability — examples include the large language model (LLM) and the multi-modal pre-trained model. For example, the large model involved in the method provided in the embodiments of this specification may be a neural network with billions of parameters based on the GPT (Generative Pre-trained Transformer) architecture.
In practical applications, a pre-trained large model can be adapted to different tasks with only slight fine-tuning on a small number of samples. Large models are widely used in fields such as natural language processing (NLP) and computer vision: in computer vision, for tasks such as visual question answering (VQA), image captioning (IC), and image generation; in natural language processing, for tasks such as text-based sentiment classification, text summarization, and machine translation. Main application scenarios of large models include digital assistants, intelligent robots, search, online education, office software, e-commerce, and intelligent design.
Currently, some unit test generation tools, including deep-learning-based ones, can generate unit tests. However, the quality of the tests these tools generate is often uneven, which degrades the quality of the software code and affects project progress and development efficiency. Meanwhile, current software development demands both high speed and high quality. How to process unit tests so as to meet these speed and quality requirements is therefore an important problem to be solved.
In view of this, the method provided in the embodiments of this specification is built on large-model-based processing logic for generating, detecting, and repairing unit tests; it can automatically improve the quality of unit tests in the software development process, thereby improving software development speed and quality while reducing labor cost.
Specifically, this specification provides a test processing method, and further relates to a test processing apparatus, a test processing system, a computing device, and a computer-readable storage medium, which are described in detail one by one in the following embodiments.
Referring to fig. 1, fig. 1 illustrates an architecture diagram of a test processing system provided in one embodiment of this specification; the system may include a cloud-side device and end-side devices.
When there are multiple end-side devices, communication connections among them can be established through the cloud-side device. In a software development scenario, the cloud-side device provides the test processing service to the end-side devices, each of which can act as a sender or a receiver and communicate through the cloud-side device.
The end-side device is configured to parse the project code to obtain the code related to the initial code to be tested, assemble the related code with the initial code to be tested to obtain the updated code to be tested, and send a unit test generation request carrying the updated code to be tested to the cloud-side device.
The cloud-side device is configured to receive the unit test generation request sent by the end-side device, input the updated code to be tested corresponding to the request into the large model, generate the unit test corresponding to the updated code to be tested, and return the unit test to the end-side device.
The end-side device is further configured to receive the unit test, detect whether the unit test contains errors using a preset detection strategy, and, if errors exist, determine the error information corresponding to the errors and send the updated code to be tested, the unit test, and the error information to the cloud-side device.
The cloud-side device is further configured to input unit test repair prompt information, comprising the updated code to be tested, the unit test, and the error information, into the large model, generate a repaired unit test using the large model, and send the repaired unit test to the end-side device.
For the repaired unit test, the end-side device can continue to detect whether it still contains errors; if so, repair continues until the repaired unit test is error-free or the number of repairs reaches a preset stop condition.
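The detect-and-repair loop described above can be sketched as follows (Python for illustration; the `detect` and `repair` helpers are hypothetical stand-ins for the preset detection strategy and the large-model repair call, not part of the patent):

```python
def repair_until_clean(code, unit_test, detect, repair, max_rounds=3):
    """Repeat detection and repair until the unit test is error-free
    or the number of repairs reaches a preset stop condition."""
    for _ in range(max_rounds):
        errors = detect(unit_test)      # apply the preset detection strategy
        if not errors:
            break                       # no errors: the test is acceptable
        # feed code, failing test, and error info back to the large model
        unit_test = repair(code, unit_test, errors)
    return unit_test
```

The cap `max_rounds` plays the role of the preset stop condition, so a test the model cannot fix does not loop forever.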
The end side device and the cloud side device can be connected through a network. The network provides a medium for a communication link between the end-side device and the cloud-side device. The network may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others. The data transmitted by the end-side device may need to be encoded, transcoded, compressed, etc. before being distributed to the cloud-side device.
The end-side device may host a browser, an APP (Application), a web application such as an H5 (HyperText Markup Language version 5) application, a light application (also called an applet, a lightweight application), or a cloud application, etc. The application on the end-side device may be developed on the basis of a software development kit (SDK) for the corresponding service provided by the server side, such as an SDK for real-time communication (RTC). The end-side device may be an electronic device, or may run as an APP on such a device. The electronic device may, for example, have a display screen and support information browsing, and may be a personal mobile terminal such as a mobile phone, a tablet computer, or a personal computer. Various other types of applications are also commonly deployed on electronic devices, such as human-machine dialogue applications, model training applications, text processing applications, web browsers, shopping applications, search applications, instant messaging tools, mailbox clients, and social platform software.
The cloud-side device may include servers that provide various services, such as servers providing communication services for multiple clients, background training servers supporting the models used on clients, and servers processing data sent by clients. The cloud-side device may be implemented as a distributed server cluster composed of multiple servers or as a single server. The server may also be a server of a distributed system or a server incorporating a blockchain; it may further be a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communications, middleware services, domain name services, security services, content delivery network (CDN) services, big data, and artificial intelligence platforms, or an intelligent cloud computing server or intelligent cloud host with artificial intelligence technology.
It should be noted that the test processing method provided in the embodiments of this specification may be executed by the end-side device; in other embodiments, the cloud-side device may have functions similar to the end-side device and execute the method itself; in still other embodiments, the method may be executed jointly by the cloud-side device and the end-side device. In addition, the method can be applied to unit testing in various programming languages, including but not limited to Java.
The test processing system, built on large-model-based logic for generating, detecting, and repairing unit tests, can automatically improve the quality of unit tests in the software development process, thereby improving software development speed and quality while reducing labor cost.
Referring to fig. 2, fig. 2 shows a flowchart of a test processing method according to an embodiment of the first aspect of the present disclosure, which specifically includes the following steps.
Step 202: parse the project code to obtain the code related to the initial code to be tested, and assemble the related code with the initial code to be tested to obtain the updated code to be tested.
The project code may be the code base of a project. The initial code to be tested may be a section of the project code that needs testing. For example, if segment C in code file B of project A needs to be tested, segment C is the initial code to be tested.
Typically, code has dependencies, such as methods and functions invoked along the call links of the code to be tested. Therefore, so that the generated unit test can test more comprehensively and achieve higher coverage, the method provided in the embodiments of this specification parses the project code to obtain the code related to the initial code to be tested, and assembles the related code with the initial code to be tested to obtain the updated code to be tested.
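A minimal sketch of this assembly step (Python for illustration; placing related code before the code under test is an assumption, not mandated by the method):

```python
def assemble_code_under_test(initial_code, related_snippets):
    """Assemble related code (e.g. methods on the call links) with the
    initial code to be tested into the updated code to be tested."""
    parts = list(related_snippets)      # related code first, as context
    parts.append(initial_code)          # then the code under test itself
    return "\n\n".join(parts)
```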
Step 204: input the updated code to be tested into a large model, and generate a unit test of the updated code to be tested using the large model.
For example: generate unit test generation prompt information from the updated code to be tested, input the prompt information into the large model, and generate the unit test corresponding to the updated code to be tested.
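The generation-prompt assembly might look like the following sketch (the prompt wording is illustrative, not taken from the patent):

```python
def build_generation_prompt(updated_code):
    """Render unit test generation prompt information around the
    updated code to be tested."""
    return (
        "Generate unit tests for the following code. "
        "Cover normal paths, boundary values, and error cases.\n\n"
        + updated_code
    )
```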
Step 206: and detecting whether the unit test has errors or not by using a preset detection strategy.
A detection strategy is a strategy for checking whether the unit test can run correctly; its specific content can be set according to scenario requirements and may include, for example, the detection rules, detection steps, and detection tools to be executed. The method provided in the embodiments of this specification may preset any one or more detection strategies. For example, whether the unit test contains errors may be detected through code parsing, code compilation, and/or test execution.
Step 208: if an error exists, determining error information corresponding to the error.
Error information is information describing the erroneous content of the unit test.
For example: whether the unit test has syntax errors can be determined through code parsing; whether it lacks dependent resources can be determined through code compilation; and whether it has runtime errors can be determined through test execution.
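A sketch of such a three-stage detection strategy, written for a Python unit test for illustration (in Python the compile stage largely repeats the parse stage; it is kept only to mirror the three modes described above, and a Java implementation would use a parser, `javac`, and a test runner instead):

```python
import ast

def detect_errors(test_source):
    """Detect errors in a unit test via parsing, compiling, and executing."""
    try:
        ast.parse(test_source)          # code parsing: catches syntax errors
    except SyntaxError as e:
        return [f"syntax error: {e.msg}"]
    try:
        compiled = compile(test_source, "<unit test>", "exec")  # compilation
    except Exception as e:
        return [f"compile error: {e}"]
    try:
        exec(compiled, {})              # test execution: catches runtime errors
    except Exception as e:
        return [f"runtime error: {e}"]
    return []                           # no errors found
```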
Step 210: and inputting the updated code to be tested, the unit test and the error information into the large model, and generating a repaired unit test by using the large model.
For example: generate unit test repair prompt information from the updated code to be tested, the unit test, and the error information; input the repair prompt information into the large model; and generate the repaired unit test.
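The repair prompt could be assembled from the three inputs like this (wording illustrative, not from the patent):

```python
def build_repair_prompt(updated_code, unit_test, error_info):
    """Assemble unit test repair prompt information from the updated
    code to be tested, the failing unit test, and the error information."""
    return (
        "The following unit test has errors.\n\n"
        "Code under test:\n" + updated_code + "\n\n"
        "Unit test:\n" + unit_test + "\n\n"
        "Errors:\n" + error_info + "\n\n"
        "Return a repaired unit test."
    )
```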
With the above method, the project code is parsed to obtain the code related to the initial code to be tested, and the related code is assembled with the initial code to be tested into the updated code to be tested, so that the subsequently generated unit test tests more comprehensively and covers more of the code. Based on the large model's processing logic of generating, detecting, and repairing the unit test of the updated code to be tested, the quality of unit tests in the software development process can be improved automatically, which in turn improves software development speed and quality and reduces labor cost.
To make the method provided in the embodiments of this specification easier to understand, the three stages of unit test generation, unit test detection, and unit test repair are illustrated below in turn.
In the unit test generation stage, an initial code to be tested needs to be determined from the project codes.
The way the initial code to be tested is determined is not limited. For example, a method or function may be extracted from the project code as the code to be tested. As another example, an adaptive focus context may be generated from the project code as the initial code to be tested, depending on whether the project code is stock code or incrementally developed code. An adaptive focus context is a code context to be tested, extracted in a manner suited to the code's state in either case. Under stock code, the code state is relatively complete and comprehensive, and code modules such as classes, functions, and methods can be disassembled directly to obtain the initial code to be tested; under incremental development, the code usually consists only of some method/class definitions and may be incomplete, so the initial code to be tested is extracted by pulling key information from the class/method definitions.
Because a large model limits the length of its prompt, when the initial code to be tested is extracted, it must also be verified that the code satisfies the generation condition for unit test generation prompt information. This condition may include an information-length threshold range (such as a maximum prompt token limit): when the length of the initial code to be tested is within the threshold range, the extracted code can be used for unit test generation; otherwise it cannot, because the initial code to be tested cannot be shortened, and missing context information would lead to invalid unit tests.
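The length check can be approximated as follows (the 4-characters-per-token heuristic and the limit value are assumptions for illustration, not values from the patent):

```python
def fits_prompt_budget(code, max_prompt_tokens=4096, chars_per_token=4):
    """Return True if the extracted code stays within the model's
    maximum prompt token limit, using a rough character heuristic."""
    estimated_tokens = len(code) / chars_per_token
    return estimated_tokens <= max_prompt_tokens
```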
In one or more embodiments of this specification, to give the generated unit test higher code coverage, the related code of the initial code to be tested may be determined in order of importance, adding more information while verifying that each new addition stays within the maximum prompt token limit. Specifically, parsing the project code to obtain the related code of the initial code to be tested and assembling the related code with the initial code to be tested to obtain the updated code to be tested includes:
determining, by parsing the project code, whether supplemental information for the initial code to be tested exists, where the supplemental information is related code of the initial code to be tested, determined based on its degree of correlation with the code to be tested;
if it exists, determining whether the supplemental information satisfies the generation condition of the unit test generation prompt information;
and if it does, assembling the supplemental information with the initial code to be tested to obtain the updated code to be tested.
In the above embodiment, the initial code to be tested is assembled with supplemental information selected by degree of correlation, so the updated code to be tested carries more code; and because the supplemental information is added according to its correlation with the initial code to be tested, the unit test generated by the large model covers the code more comprehensively and tests it better.
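The importance-ordered assembly can be sketched as follows (Python for illustration; a character count stands in for the token count, and the limit value is assumed):

```python
def add_supplements(initial_code, supplements_by_relevance, max_chars=8000):
    """Append supplemental code in descending relevance order, verifying
    that each new addition keeps the result within the prompt limit."""
    assembled = initial_code
    for snippet in supplements_by_relevance:   # most relevant first
        candidate = assembled + "\n" + snippet
        if len(candidate) <= max_chars:        # each addition must fit
            assembled = candidate
    return assembled
```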
In addition, to give the model a better understanding of the code to be tested, so that the generated unit test can instantiate dependency classes, invoke their methods without producing compilation errors, and use return type information, in one or more embodiments of this specification, parsing the project code to obtain the related code of the initial code to be tested and assembling the related code with the initial code to be tested to obtain the updated code to be tested includes:
determining, by parsing the project code, whether the initial code to be tested requires dependent resources;
and if so, assembling the initial code to be tested with the dependent resources to obtain the updated code to be tested.
For example, in combination with the above embodiment, the initial code to be tested, the supplemental information and the dependent resource may be assembled to obtain the updated code to be tested.
Dependent resources may include, for example, resources expressed as signatures, such as dependency class signatures, dependency class constructors, and/or invoked method signatures. When splicing dependent resources, method signatures may be considered first, since they provide parameter-type and return-type information. If the token limit allows, all method signatures invoked by the initial code to be tested within the project code can be assembled with the initial code to be tested to form the updated code to be tested, so that the large model can understand the role and purpose of the updated code to be tested.
In the above embodiment, if the initial code to be tested requires dependent resources, the signatures of those resources are assembled with the initial code to be tested to obtain the updated code to be tested, which helps ensure that a valid and comprehensive unit test is generated.
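Extracting method signatures from Java-style source could be sketched as below (the regex is a simplification for illustration; it will miss annotations, nested generics, and other constructs a real parser handles):

```python
import re

# visibility modifier, return type, method name, parameter list
SIGNATURE_RE = re.compile(
    r"(?:public|protected|private)\s+[\w<>\[\]]+\s+\w+\s*\([^)]*\)"
)

def extract_method_signatures(source):
    """Collect method signatures so the prompt carries parameter-type
    and return-type information without method bodies."""
    return SIGNATURE_RE.findall(source)
```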
When the method provided in the embodiments of this specification is applied to an end-side device, the end-side device generates a unit test generation request from the code to be tested and sends it to the cloud-side device, which invokes the large model to generate the unit test. Specifically,
inputting the updated code to be tested into the large model and generating the unit test of the updated code to be tested using the large model includes:
generating a unit test generation request carrying the updated code to be tested;
sending the unit test generation request to the cloud-side device, so that the cloud-side device determines whether the updated code to be tested satisfies the generation condition of the unit test generation prompt information and, if so, generates the prompt information from the updated code to be tested, inputs it into the unit test generation model, and generates the unit test corresponding to the updated code to be tested;
and receiving the unit test returned by the cloud-side device.
For example, in the embodiments of this specification, a prompt information generation template may be preset. The template may include key hints, which help the large model understand the meaning of the information they introduce in the input prompt. When the prompt is generated, the template can be used to render the unit test generation prompt information corresponding to the updated code to be tested, producing prompt information that conforms to the template.
In the above embodiment, by judging whether the updated code to be tested meets the generation condition of the unit test generation prompt information, and assembling it into the prompt only when the condition is met, the prompt length is guaranteed not to exceed the maximum prompt token limit. This leaves the large model enough space to generate a complete reply and effectively avoids generating invalid unit tests.
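The template-rendering and token-budget check described above can be sketched as follows. This is a hypothetical illustration: the template wording, the characters-per-token estimate, and the 4096-token limit are assumptions, not values from the specification.

```java
// Hypothetical sketch: render a unit-test-generation prompt from a preset
// template and check it against a maximum-prompt-token budget before use.
public class PromptBuilder {
    // Preset template; "Code under test:" plays the role of a key hint.
    private static final String TEMPLATE =
        "Generate JUnit unit tests for the following Java code.\n"
        + "Code under test:\n%s\n";
    private static final int MAX_PROMPT_TOKENS = 4096; // assumed limit

    // Rough estimate (~4 characters per token); a real system would use
    // the model's own tokenizer.
    static int estimateTokens(String text) {
        return (text.length() + 3) / 4;
    }

    // Returns the rendered prompt, or null when the code under test would
    // push the prompt past the token budget (generation condition not met).
    public static String render(String codeUnderTest) {
        String prompt = String.format(TEMPLATE, codeUnderTest);
        return estimateTokens(prompt) <= MAX_PROMPT_TOKENS ? prompt : null;
    }
}
```

Rejecting over-budget code here, rather than truncating it, mirrors the embodiment's choice to integrate the code only when the generation condition holds.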
After the unit test is generated by using the large model, a unit test detection stage is entered to detect the unit test.
In the unit test detection stage, the valid test code in the unit test may be extracted first. A unit test generated by a large model may contain information irrelevant to the test; to enable effective detection and repair, the valid test code is extracted first, and the extracted code forms the unit test to be detected. For example, in a large-model-generated unit test, the test code may be delimited by three backticks, or the reply may be pure test code without delimiters. In the former case, a regular expression can identify the strings surrounded by three backticks and filter out code segments lacking test code markers, thereby obtaining the valid test code. In the latter case, a line containing a code keyword such as "public" or "@Test" can be located, and the test boundary can be determined by verifying that the first character of each line matches the code syntax.
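The extraction step can be sketched as below. The regular expression, the keyword list, and the fallback behaviour for replies without delimiters are illustrative assumptions.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch of the valid-test-code extraction: pull code fenced by three
// backticks out of a raw model reply; if no fence is present, fall back to
// keeping everything from the first test-like line onward.
public class TestCodeExtractor {
    // Lazy match of a (possibly ```java-tagged) fenced block.
    private static final Pattern FENCE =
        Pattern.compile("```(?:java)?\\s*(.*?)```", Pattern.DOTALL);

    public static String extract(String modelReply) {
        Matcher m = FENCE.matcher(modelReply);
        if (m.find()) {
            return m.group(1).trim();  // fenced reply: keep only the code body
        }
        // Plain reply: locate a keyword line and keep from there onward.
        StringBuilder out = new StringBuilder();
        boolean inCode = false;
        for (String line : modelReply.split("\n")) {
            if (!inCode && (line.contains("@Test") || line.trim().startsWith("public"))) {
                inCode = true;
            }
            if (inCode) out.append(line).append("\n");
        }
        return out.toString().trim();
    }
}
```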
After the valid test code is extracted to form the unit test to be detected, whether the unit test has errors can be detected through the code parsing mode, the code compiling mode and/or the test execution mode.
The code parsing mode refers to parsing the code of the unit test with a code parser to detect whether a syntax error exists.
The code compiling mode refers to compiling the code of the unit test with a code compiler to detect whether an error that prevents compilation exists.
The test execution mode refers to executing the code of the unit test with a code executor to detect whether the unit test has a running error.
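The code compiling mode, for instance, can be illustrated with the JDK's built-in compiler API (javax.tools). This is a minimal sketch of the idea, not the specification's implementation; a real pipeline would also supply the project's classpath and dependencies.

```java
import javax.tools.*;
import java.net.URI;
import java.util.List;

// Sketch of the code compiling mode: compile a unit test source string in
// memory with the system Java compiler and report whether it compiled.
public class CompileChecker {
    // Wraps a source string as a compilation unit.
    static class StringSource extends SimpleJavaFileObject {
        private final String code;
        StringSource(String className, String code) {
            super(URI.create("string:///" + className + ".java"), Kind.SOURCE);
            this.code = code;
        }
        @Override
        public CharSequence getCharContent(boolean ignoreEncodingErrors) {
            return code;
        }
    }

    // Returns true when the source compiles without errors; the collected
    // diagnostics could be inspected for the error details used in repair.
    public static boolean compiles(String className, String source) {
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        DiagnosticCollector<JavaFileObject> diagnostics = new DiagnosticCollector<>();
        JavaCompiler.CompilationTask task = compiler.getTask(
            null, null, diagnostics,
            List.of("-d", System.getProperty("java.io.tmpdir")),  // keep .class files out of cwd
            null, List.of(new StringSource(className, source)));
        return task.call();
    }
}
```

Note that `ToolProvider.getSystemJavaCompiler()` requires running on a JDK rather than a bare JRE.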
In the above embodiment, errors of the unit test are detected through the code parsing mode, the code compiling mode and the test execution mode, so that various types of errors, such as syntax errors and missing dependent resources, can be detected and repaired in a targeted manner, improving repair precision and efficiency.
After an error is detected in the above manner, errors in the unit test can be repaired according to the error condition. To improve repair efficiency, one or more embodiments of the present disclosure first attempt repair based on preset grammar repair rules and by adding dependent resources, and only then repair with the large model. In this way, whenever the grammar repair rules or the added dependent resources are sufficient, a call to the large model is avoided and repair efficiency improves. Specifically, before inputting the code to be tested, the unit test and the error information into the large model and generating the repaired unit test with the large model, the method further comprises:
if it is determined based on the code parsing mode that the unit test has a syntax error, repairing the syntax of the unit test based on a preset grammar repair rule;
if it is determined based on the code compiling mode that the unit test lacks a dependent resource, adding the missing dependent resource to the unit test;
and if it is determined based on the test execution mode that the unit test has a running error, entering the step of inputting the updated code to be tested, the unit test and the error information into the large model and generating the repaired unit test by using the large model.
The preset grammar repair rules may include repair rules set according to the grammar of any one or more programming languages. For example, the preset grammar repair rules may include any one or more of the following:
Bracket balancing rule: identify the last semicolon or right bracket and append the right brackets needed to balance the number of left brackets;
Deleting erroneous code rule: delete any one or more potentially erroneous test code structures from the unit test. For example: locate a marker of the test code, such as "@Test", truncate the code at the marker (keeping the code before it), and then append a right bracket; if this repair fails, or no test code structure remains after the deletion, terminate the repair process. A marker of the test code is a symbol used to mark the test code.
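The two rules above can be sketched as follows; the method names are hypothetical, and the truncation rule assumes "@Test" as the test code marker.

```java
// Illustrative sketch of the two preset grammar repair rules:
// (1) bracket balancing — append closing braces until '{' and '}' counts
// match; (2) deleting erroneous code — truncate at the last "@Test" marker
// (dropping a possibly broken trailing test method) and re-balance.
public class SyntaxRepair {
    // Rule 1: append the closing braces needed to balance the opening ones.
    public static String balanceBraces(String code) {
        int open = 0, close = 0;
        for (char c : code.toCharArray()) {
            if (c == '{') open++;
            else if (c == '}') close++;
        }
        StringBuilder repaired = new StringBuilder(code);
        for (int i = close; i < open; i++) repaired.append("\n}");
        return repaired.toString();
    }

    // Rule 2: cut off the last (possibly truncated) @Test method, then
    // re-balance; returns null when no test code structure remains.
    public static String dropLastTest(String code) {
        int marker = code.lastIndexOf("@Test");
        if (marker <= 0) return null;  // nothing left to keep — terminate repair
        return balanceBraces(code.substring(0, marker));
    }
}
```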
When errors of the unit test are repaired by adding dependent resources, the specific manner of determining which dependent resources the unit test lacks is not limited. For example, a unit test obtained from a large model may trigger compilation errors such as "... cannot be found". From this message, the missing dependent resource can be determined, and the corresponding dependent resource can then be added to the unit test to repair the error.
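One hedged way to implement this repair is to parse the compiler message for the missing symbol and look it up in a map of known imports. The message pattern and the symbol-to-import map below are illustrative assumptions, not part of the specification.

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch of the dependency-repair step: extract the missing
// class name from a "cannot find symbol" style message and prepend the
// corresponding import to the unit test.
public class DependencyRepair {
    private static final Pattern MISSING =
        Pattern.compile("cannot find symbol\\s+symbol:\\s+class\\s+(\\w+)");
    // Assumed mapping from symbol to fully qualified import.
    private static final Map<String, String> KNOWN_IMPORTS = Map.of(
        "Test", "org.junit.jupiter.api.Test",
        "List", "java.util.List");

    // Returns the unit test with the missing import prepended, or the
    // original test when the error is not a recognised missing dependency.
    public static String addMissingImport(String unitTest, String compilerError) {
        Matcher m = MISSING.matcher(compilerError);
        if (!m.find()) return unitTest;
        String importPath = KNOWN_IMPORTS.get(m.group(1));
        if (importPath == null) return unitTest;
        return "import " + importPath + ";\n" + unitTest;
    }
}
```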
The prompt information input to the code error repair model may likewise be generated based on a preset prompt information generation template. For example, suppose the updated code to be tested corresponding to unit test ZZ is XX, the error type is an unsupported element, and the error information is that module A does not support element YY in the code. The prompt information generation template has the format: code to be tested: blank to be filled; error type: blank to be filled; error information: blank to be filled; unit test: blank to be filled. The generated prompt information can then be expressed as: code to be tested: "XX"; error type: "unsupported element"; error information: "module A does not support element YY"; unit test: "ZZ".
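Rendering this repair prompt can be sketched as a simple template fill, using the XX/YY/ZZ example values above; the exact template wording is illustrative.

```java
// Sketch of rendering the unit-test-repair prompt from the preset template
// ("code to be tested / error type / error information / unit test", each
// with a blank to fill).
public class RepairPromptBuilder {
    private static final String TEMPLATE =
        "Code to be tested: \"%s\"\n"
        + "Error type: \"%s\"\n"
        + "Error information: \"%s\"\n"
        + "Unit test: \"%s\"\n";

    public static String render(String codeUnderTest, String errorType,
                                String errorInfo, String unitTest) {
        return String.format(TEMPLATE, codeUnderTest, errorType, errorInfo, unitTest);
    }
}
```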
In the above embodiment, simple syntax errors, such as missing symbols or missing import statements, are resolved by the grammar rule repair; compilation errors are repaired by adding dependent resources; and if neither the grammar repair nor the added dependent resources resolve the errors of the unit test, the large model is used for repair, so that the repair of the unit test can be completed faster and more accurately.
It can be understood that the method provided in the embodiments of the present disclosure may be implemented by an end-side device or a cloud-side device alone, or by the end-side device and the cloud-side device jointly. In one or more embodiments of the present disclosure, inputting the updated code to be tested, the unit test and the error information into the large model and generating a repaired unit test by using the large model comprises the following steps:
sending the code to be tested, the unit test and the error information to the cloud-side device, so that the cloud-side device inputs unit test repair prompt information into the large model and repairs the unit test with the large model to obtain the repaired unit test, wherein the unit test repair prompt information comprises the code to be tested, the unit test and the error information;
And receiving the repaired unit test returned by the cloud side equipment.
The error information may include an error type and error details. The error type may be determined based on a classification of historical error information; for example, error types may include data set configuration errors, data attribute errors, unsupported element errors, and the like. The error details may be the information describing the error content that is returned during syntax detection, compilation, or test code execution.
In the above embodiment, a unit test repair prompt containing the error information, the error type, the unit test and the code to be tested is constructed and input into a large model, such as an LLM trained on a massive corpus of professional code and therefore capable of professional code understanding and code generation. The repaired unit test can thus be obtained from the large model quickly, effectively improving test processing efficiency.
In one or more embodiments of the present disclosure, considering that a single repair may not completely eliminate the errors in a unit test, the repair effect is improved through multiple rounds of detection and repair. Specifically, the method further comprises:
re-entering, for the repaired unit test, the step of detecting whether the unit test has errors by using the preset detection strategy, until the repaired unit test has no errors or the number of repairs reaches a preset stop condition.
The preset stop condition may include a repair count threshold; when the number of repairs exceeds the threshold, the preset stop condition is considered reached.
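The multi-round detect-and-repair loop with its stop condition can be sketched as below; the functional interfaces standing in for the detector and the large-model repairer, and the parameterized repair limit, are illustrative assumptions.

```java
import java.util.function.Predicate;
import java.util.function.UnaryOperator;

// Minimal sketch of the multi-round detect-and-repair loop: re-detect after
// each repair and stop when the test is error-free or the repair count
// reaches the preset threshold.
public class RepairLoop {
    public static String repairUntilClean(String unitTest,
                                          Predicate<String> hasError,      // preset detection strategy
                                          UnaryOperator<String> repair,    // e.g. a large-model repair call
                                          int maxRepairs) {                // preset stop condition
        int repairs = 0;
        while (hasError.test(unitTest) && repairs < maxRepairs) {
            unitTest = repair.apply(unitTest);
            repairs++;
        }
        return unitTest;
    }
}
```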
As can be seen from one or more embodiments of the present disclosure, a large model can generate a large number of unit tests based on the specific needs and scenarios of a project. These unit tests can cover different code paths and boundary conditions, thereby improving unit test coverage. Errors and defects in the generated unit tests can be found during detection and repaired through the large model, improving code quality and stability. The generate-verify-repair processing logic of the large model can greatly improve the quality and coverage of automatically generated unit tests and reduce the workload of manually writing them. It can also save development time and resources, improve development efficiency, and accelerate the release of software products. In addition, because the method provided in the embodiments of the present disclosure generates and repairs unit tests with a large model, it can adapt to various software development projects and has high universality and adaptability.
To make the method provided in the embodiments of the present disclosure easier to understand, the following describes, with reference to fig. 3, an exemplary application of the test processing method, combining the above embodiments, to the unit testing of JAVA language code. As shown in fig. 3, before generating a unit test, a single-test generation tool implemented according to the method of the embodiments performs data preprocessing and prompt information processing for two situations, stock code and incremental code, to obtain unit test generation prompt information containing the code to be tested. The prompt information is input into the LLM model to obtain the unit test generated by the model. The unit test first undergoes grammar rule and dependent resource detection and repair; the repaired unit test is then executed, and if it runs without errors the detection passes, while if errors occur the error information is recalled and repaired through the LLM model to obtain the repaired unit test. Specifically, fig. 3 shows a process flow chart of a test processing method according to an embodiment of the present disclosure, which includes the following steps.
Step 302: for the stock code, the original JAVA class is extracted.
Step 304: the original JAVA class is disassembled according to the method, so that the data preprocessing of the JAVA class is realized.
As shown in fig. 3, the original JAVA class is broken down into individual methods, i.e., the initial codes to be tested.
Step 306: aiming at target class information in the method, the code to be tested is updated by splicing in a method definition method source code method introduction method.
Step 308: aiming at the dependency class information, the codes to be tested are assembled and formed in a method definition and method input/output mode.
Step 310: and generating prompt information by using the code construction unit to be tested.
Based on the above steps, the prompt information for unit test generation can be constructed automatically, containing code content such as the method dependencies, the method body and the source class.
Step 312: class/method definitions are extracted for the delta code.
The extracted class/method definitions are the initial code to be tested.
Step 314: and the data preprocessing of class/method definition is realized by a key information extraction mode.
As shown in FIG. 3, method dependencies and method descriptions may be extracted from class/method definitions.
Step 316: aiming at the extracted target class information, the codes to be tested are assembled and formed in a method definition and function description mode.
Step 318: aiming at the extracted dependency information, the code to be tested is updated by splicing and forming a method definition and a method input/output mode.
Step 320: and generating prompt information by using the updating to-be-tested code construction unit for testing.
Based on the above steps, the prompt information for unit test generation can be constructed automatically, containing code content such as the method dependencies and the method description.
Step 322: the unit test generation hint information is input to the LLM model to generate the unit test.
For example, a test method 1 file, a test method 2 file, and other unit test files are generated.
Step 324: and testing whether errors exist or not by the code analysis mode, the code compiling mode and the test execution mode detection unit.
Syntax errors detected by the code parsing mode are repaired with the preset grammar repair rules; missing dependent resource errors detected by the code compiling mode are repaired by adding the dependent resources.
Step 326: and judging whether the repair times reach the upper limit of the repair times through detection or repair times.
The unit test first undergoes detection by the code parsing mode and the code compiling mode, and if errors exist, it is repaired based on the preset grammar repair rules and by adding dependent resources. The repaired unit test then undergoes test execution: if it runs without errors, the detection passes; if running errors occur, the detection fails.
Step 328: if the code to be tested is failed and the repair times are smaller than the upper limit of the repair times, the unit test repair prompt information containing updated codes to be tested, unit tests and error information is input into the large model for repair.
In the embodiment of the present specification, the unit test generation model and the code error repair model may be implemented by using the same large model, which is not limited in this specification.
The judgment of whether the number of repairs is less than the upper limit is made for each individual unit test.
Step 330: if the unit Test has no error or the repair times of the unit Test reach the upper limit of the repair times, merging and de-duplicating the unit tests of the same type to obtain the merged unit Test, such as Test class.
This step serves to simplify the unit tests.
The merged and de-duplicated unit tests can then be checked for reasonableness, for example manually.
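The merge-and-deduplicate step can be sketched as follows, assuming verbatim (whitespace-normalised) equality as the duplicate criterion; the class and method names are hypothetical.

```java
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

// Sketch of step 330: collect generated test methods that target the same
// class, drop duplicates, and wrap the survivors in one Test class.
public class TestMerger {
    public static String merge(String testClassName, List<String> testMethods) {
        Set<String> unique = new LinkedHashSet<>();  // preserves first-seen order
        for (String method : testMethods) {
            unique.add(method.strip());  // normalise so verbatim duplicates collapse
        }
        StringBuilder merged = new StringBuilder("public class " + testClassName + " {\n");
        for (String method : unique) merged.append(method).append("\n");
        return merged.append("}\n").toString();
    }
}
```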
The unit test repaired by the model may re-enter step 324 for error detection, until the unit test has no errors or the number of repairs reaches the preset stop condition.
In this embodiment, the processing of unit tests is realized based on a generate-verify-repair framework, improving unit test coverage and quality. A unit test can be defined per method; the code to be tested can be determined based on an adaptive focus context generation mechanism; the unit test generation prompt information is generated under the maximum prompt token limit, reserving enough space for the unit test generation model to produce a complete unit test; the unit test is verified through the code parsing mode, the code compiling mode and the test execution mode; and the unit test is repaired based on the preset grammar repair rules, by adding dependent resources, and with the large model. Simple errors, such as syntax errors and missing import statements, are thus repaired based on the preset grammar repair rules, and if the rule-based repair fails, the model repair strategy is adopted, effectively repairing the errors of the unit test. If a test still has errors after multiple rounds of verification and repair, it may be marked as discarded. The final unit test has thus been detected and repaired, and its quality is effectively improved.
The following describes, with reference to fig. 4, a schematic signaling interaction of applying the test processing method provided in the present disclosure, combining the above embodiments, to the unit testing of JAVA language code. As shown in fig. 4, the signaling interaction procedure includes:
Step 402: the end side device performs an environmental check.
The environmental check may be used to analyze the environment of the software project, determine the model and data set required for the project, and configure the corresponding environment.
Step 404: and the terminal side equipment extracts the class structure information to obtain updated codes to be tested.
When the class structure information is extracted from the code base of the software project, it may be extracted in batches by directory, or from a single file.
Step 406: the end-side device sends a unit test generation request to the cloud-side device, where the unit test generation request may carry updated code to be tested (e.g., code or signature of the class and method to be tested).
Step 408: and the cloud side equipment tests and generates prompt information by using the code generating unit to be tested.
Step 410: and the cloud side equipment sends the unit test generation prompt information to the LLM model gateway so as to request the LLM model to generate the unit test.
The unit test generation prompt information is thus input into the LLM model, and the LLM model generates the unit test corresponding to the code to be tested.
Step 412: the cloud side device obtains unit tests generated by the LLM model.
Step 414: and the cloud side equipment extracts the test code from the unit test.
Step 416: the cloud side device detects grammar errors in the test codes in a code analysis mode.
Step 418: and the cloud side equipment repairs the grammar of the test code based on a preset grammar repair rule and repacks the grammar into unit tests.
Step 420: and the cloud side equipment sends the packaged unit test to the end side equipment.
Step 422: and the terminal side equipment detects whether the unit test has a compiling error or not in a code compiling mode.
If it is determined from the compilation error that the unit test lacks a dependent resource, the unit test is repaired by adding the dependent resource.
Step 424: and the terminal equipment detects whether the unit test has operation errors or not in a test execution mode.
Step 426: and if the running errors exist, determining error information of the unit test, and sending the updated code to be tested, the unit test and the error information to cloud side equipment.
When the updated code to be tested, the unit test and the error information are sent, they may be sent via the session ID (session identification) corresponding to the unit test generation request.
Step 428: the cloud side device sends unit test repair prompt information to the LLM model gateway so as to request the LLM model for repair unit test.
Step 430: and the cloud side equipment receives the repaired unit test returned by the LLM model.
Step 432: the cloud side device detects grammar errors of the repaired unit test in a code analysis mode, repairs grammar of the repaired unit test based on a preset grammar repair rule, and repacks the repaired unit test.
Of course, this step also includes a step of extracting the test code before the detection, which will not be described in detail here.
Step 434: and the terminal side equipment receives the repaired unit test returned by the cloud side equipment.
Step 422 is then re-entered to detect the repaired unit test, until no errors exist in the repaired unit test or the number of repairs exceeds the upper limit.
After the generation, detection and repair of the unit test are completed for the newly extracted updated code to be tested, step 404 may be re-entered to extract new class structure information from the code base of the software project, and the generation, detection and repair of unit tests are performed for it, until the code in the code base of the software project has been completely extracted.
Corresponding to the method embodiment, the present disclosure further provides an embodiment of a test processing device, and fig. 5 shows a schematic structural diagram of a test processing device provided in one embodiment of the present disclosure. As shown in fig. 5, the apparatus includes:
the code determining module 502 is configured to obtain a relevant code of an initial code to be tested by analyzing the project code, and assemble the relevant code and the initial code to be tested to obtain an updated code to be tested;
a single test generation module 504 configured to input an updated code to be tested into a large model, and generate a unit test of the code to be tested using the large model;
a single test detection module 506 configured to detect whether the unit test has an error;
an error determination module 508 configured to determine error information corresponding to an error, if the error exists;
and the repair execution module 510 is configured to input the updated code to be tested, the unit test and the error information into the large model, and generate a repaired unit test by using the large model.
In one or more embodiments of the present disclosure, the single test generation module includes:
A single test request generation sub-module configured to generate a unit test generation request carrying the updated code to be tested;
a single-test request sending submodule configured to send the unit test generation request to the cloud-side device, so that the cloud-side device judges whether the updated code to be tested meets the generation condition of the unit test generation prompt information, and if so, generates the unit test generation prompt information from the updated code to be tested, inputs it into the unit test generation model, and generates the unit test corresponding to the code to be tested;
And the single-test receiving sub-module is configured to receive the unit test returned by the cloud side equipment.
In one or more embodiments of the present disclosure, the apparatus further includes:
the supplementary information judging module is configured to judge whether supplementary information of the initial code to be tested exists or not by analyzing the project code, wherein the supplementary information is related code of the initial code to be tested and is determined based on the degree of correlation with the code to be tested;
A prompt information judgment module configured to judge whether the supplemental information satisfies a generation condition of the unit test generation prompt information if the supplemental information judgment module determines yes;
and the supplementary information adding module is configured to assemble the supplementary information and the initial code to be tested to obtain updated code to be tested if the prompt information judging module determines that the supplementary information and the initial code to be tested are met.
In one or more embodiments of the present disclosure, the apparatus further includes:
And the dependent resource adding module is configured to judge whether the initial code to be tested needs dependent resources or not by analyzing the project codes, and if so, the initial code to be tested and the dependent resources are spliced to obtain updated code to be tested.
In one or more embodiments of the present disclosure, the single test detection module is configured to detect whether the unit test has an error by using a code parsing manner, a code compiling manner, and/or a test execution manner.
In one or more embodiments of the present disclosure, the apparatus further includes:
a pre-model repair module configured to, before the code to be tested, the unit test and the error information are input into the large model and the repaired unit test is generated with the large model: repair the syntax of the unit test based on the preset grammar repair rules if the unit test is determined to have syntax errors based on the code parsing mode; add the missing dependent resources to the unit test if the unit test is determined to lack dependent resources based on the code compiling mode; and trigger the repair execution module to enter the step of inputting the updated code to be tested, the unit test and the error information into the large model and generating the repaired unit test with the large model if the unit test is determined to have running errors based on the test execution mode.
In one or more embodiments of the present disclosure, the repair execution module is configured to send the code to be tested, the unit test and the error information to the cloud-side device, so that the cloud-side device inputs the unit test repair prompt information into the large model and repairs the unit test with the large model to obtain the repaired unit test, wherein the unit test repair prompt information comprises the code to be tested, the unit test and the error information; and to receive the repaired unit test returned by the cloud-side device.
The above is a schematic solution of the test processing device of this embodiment. It should be noted that the technical solution of the test processing device and the technical solution of the test processing method belong to the same conception, and for details of the technical solution of the test processing device that are not described in detail, reference may be made to the description of the technical solution of the test processing method.
Corresponding to the above method embodiments, in the second aspect of the present disclosure, a test processing method embodiment applied to a cloud side device is further provided, and fig. 6 shows a flowchart of a test processing method provided in one embodiment of the present disclosure. As shown in fig. 6, the method includes:
Step 602: receiving a unit test generation request sent by an end-side device, inputting the updated code to be tested carried in the unit test generation request into a large model, and generating the unit test corresponding to the code to be tested, wherein the updated code to be tested is obtained by parsing the project code to obtain the related code of the initial code to be tested and assembling the related code with the initial code to be tested;
Step 604: returning the unit test to the end side equipment, enabling the end side equipment to detect whether the unit test has errors or not, if so, determining error information corresponding to the errors, and sending the updated code to be tested, the unit test and the error information to cloud side equipment;
Step 606: inputting unit test repair prompt information into the large model, repairing the unit test by using the large model to obtain a repaired unit test, wherein the unit test repair prompt information comprises: the code to be tested, the unit test and the error information are updated;
step 608: and sending the repaired unit test to the end-side equipment.
Corresponding to the method embodiment, the present disclosure further provides an embodiment of a test processing system, and fig. 7 shows a schematic structural diagram of a test processing system provided in one embodiment of the present disclosure. As shown in fig. 7, the test processing system includes: an end-side device 702 to which the test processing method according to any embodiment of the first aspect of the present specification is applied, and a cloud-side device 704 to which the test processing method according to any embodiment of the second aspect of the present specification is applied.
Fig. 8 illustrates a block diagram of a computing device 800 provided in accordance with one embodiment of the present description. The components of computing device 800 include, but are not limited to, memory 810 and processor 820. Processor 820 is coupled to memory 810 through bus 830 and database 850 is used to hold data.
Computing device 800 also includes an access device 840 that enables computing device 800 to communicate via one or more networks 860. Examples of such networks include the public switched telephone network (PSTN), a local area network (LAN), a wide area network (WAN), a personal area network (PAN), or a combination of communication networks such as the Internet. The access device 840 may include one or more of any type of network interface, wired or wireless, such as a network interface card (NIC), an IEEE 802.11 wireless local area network (WLAN) wireless interface, a worldwide interoperability for microwave access (WiMAX) interface, an Ethernet interface, a universal serial bus (USB) interface, a cellular network interface, a Bluetooth interface, or a near field communication (NFC) interface.
In one embodiment of the present description, the above-described components of computing device 800, as well as other components not shown in FIG. 8, may also be connected to each other, such as by a bus. It should be understood that the block diagram of the computing device illustrated in FIG. 8 is for exemplary purposes only and is not intended to limit the scope of the present description. Those skilled in the art may add or replace other components as desired.
Computing device 800 may be any type of stationary or mobile computing device, including a mobile computer or mobile computing device (e.g., tablet, personal digital assistant, laptop, notebook, netbook, etc.), a mobile phone (e.g., smart phone), a wearable computing device (e.g., smart watch, smart glasses, etc.), or other type of mobile device, or a stationary computing device such as a desktop computer or personal computer (PC). Computing device 800 may also be a mobile or stationary server.
Wherein the processor 820 is configured to execute computer-executable instructions that, when executed by the processor, perform the steps of the test processing method described above.
The foregoing is a schematic illustration of a computing device of this embodiment. It should be noted that, the technical solution of the computing device and the technical solution of the test processing method belong to the same concept, and details of the technical solution of the computing device, which are not described in detail, can be referred to the description of the technical solution of the test processing method.
An embodiment of the present disclosure also provides a computer-readable storage medium storing computer-executable instructions that, when executed by a processor, implement the steps of the test processing method described above.
The above is an exemplary description of the computer-readable storage medium of this embodiment. It should be noted that the technical solution of the storage medium and that of the test processing method belong to the same concept; for details of the storage medium not described here, refer to the description of the test processing method.
An embodiment of the present disclosure also provides a computer program, where the computer program, when executed in a computer, causes the computer to perform the steps of the test processing method described above.
The above is an exemplary description of the computer program of this embodiment. It should be noted that the technical solution of the computer program and that of the test processing method belong to the same concept; for details of the computer program not described here, refer to the description of the test processing method.
The foregoing describes specific embodiments of the present disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
The computer instructions include computer program code, which may be in source code form, object code form, an executable file, some intermediate form, and so on. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so on. It should be noted that the content of the computer-readable medium may be expanded or restricted as required by legislative and patent practice; for example, in some jurisdictions, the computer-readable medium does not include electrical carrier signals and telecommunications signals.
It should be noted that, for simplicity of description, the foregoing method embodiments are described as a series of action combinations. Those skilled in the art will appreciate, however, that the embodiments are not limited by the order of the actions described, as some steps may be performed in another order or simultaneously. Further, those skilled in the art will appreciate that the embodiments described in the specification are all preferred embodiments, and that the actions and modules involved are not necessarily required by every embodiment.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
The preferred embodiments of the present specification disclosed above are intended only to help clarify the specification. Alternative embodiments are not exhaustive, and the invention is not limited to the precise forms disclosed. Obviously, many modifications and variations are possible in light of the teaching of the embodiments. The embodiments were chosen and described in order to best explain the principles of the embodiments and their practical application, thereby enabling others skilled in the art to best understand and utilize the invention. This specification is to be limited only by the claims and their full scope and equivalents.

Claims (12)

1. A test processing method, comprising:
analyzing project code to obtain related code of initial code to be tested, and assembling the related code and the initial code to be tested to obtain updated code to be tested;
inputting the updated code to be tested into a large model, and generating a unit test of the updated code to be tested by using the large model;
detecting whether the unit test has an error by using a preset detection strategy;
if an error exists, determining error information corresponding to the error; and
inputting the updated code to be tested, the unit test, and the error information into the large model, and generating a repaired unit test by using the large model.
2. The method of claim 1, wherein the inputting the updated code to be tested into a large model and generating a unit test of the updated code to be tested by using the large model comprises:
generating a unit test generation request carrying the updated code to be tested;
sending the unit test generation request to a cloud-side device, so that the cloud-side device judges whether the updated code to be tested meets a generation condition for unit test generation prompt information; if so, the cloud-side device generates the unit test generation prompt information from the updated code to be tested, inputs the unit test generation prompt information into a unit test generation model, and generates the unit test corresponding to the updated code to be tested; and
receiving the unit test returned by the cloud-side device.
3. The method of claim 2, wherein the analyzing the project code to obtain the related code of the initial code to be tested, and assembling the related code and the initial code to be tested to obtain the updated code to be tested, comprises:
judging, by analyzing the project code, whether supplemental information of the initial code to be tested exists, wherein the supplemental information is related code of the initial code to be tested and is determined based on its degree of correlation with the code to be tested;
if so, judging whether the supplemental information meets the generation condition for the unit test generation prompt information; and
if the condition is met, assembling the supplemental information and the initial code to be tested to obtain the updated code to be tested.
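The checks in claim 3 can be sketched as follows. The correlation measure (shared-identifier overlap) and the generation condition (a prompt-length budget) are assumptions for illustration; the patent does not specify either:

```python
import re

MAX_PROMPT_CHARS = 4000  # hypothetical generation condition: prompt budget

def find_supplemental(project_code, code_under_test, top_k=3):
    """Pick the project snippets most correlated with the code under test,
    approximated here by counting shared identifiers."""
    idents = set(re.findall(r"\w+", code_under_test))
    scored = [(len(idents & set(re.findall(r"\w+", snippet))), snippet)
              for snippet in project_code]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [s for score, s in scored[:top_k] if score > 0]

def assemble(project_code, code_under_test):
    supplemental = find_supplemental(project_code, code_under_test)
    if not supplemental:                   # no supplemental information exists
        return code_under_test
    candidate = "\n".join(supplemental + [code_under_test])
    if len(candidate) > MAX_PROMPT_CHARS:  # generation condition not met
        return code_under_test
    return candidate                       # updated code to be tested
```

A real implementation would likely score correlation with call-graph or embedding analysis rather than raw token overlap.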
4. The method of claim 2, wherein the analyzing the project code to obtain the related code of the initial code to be tested, and assembling the related code and the initial code to be tested to obtain the updated code to be tested, comprises:
judging, by analyzing the project code, whether the initial code to be tested depends on resources; and
if so, assembling the initial code to be tested and the dependent resources to obtain the updated code to be tested.
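One plausible reading of claim 4, sketched with Python's `ast` module: scan the code under test for imports and bundle the matching project resources. The `resources` registry mapping module names to source text is a hypothetical stand-in for the project's dependent resources:

```python
import ast

def collect_dependencies(code_under_test, resources):
    """Return the source of resources the code under test imports."""
    tree = ast.parse(code_under_test)
    needed = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            needed.update(alias.name for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            needed.add(node.module)
    return [resources[name] for name in sorted(needed) if name in resources]

def assemble_with_dependencies(code_under_test, resources):
    deps = collect_dependencies(code_under_test, resources)
    if not deps:               # no dependent resources are needed
        return code_under_test
    return "\n".join(deps + [code_under_test])
```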
5. The method of claim 1, wherein the detecting whether the unit test has an error by using a preset detection strategy comprises:
detecting whether the unit test has an error through code analysis, code compilation, and/or test execution.
6. The method of claim 5, further comprising, prior to inputting the updated code to be tested, the unit test, and the error information into the large model and generating the repaired unit test using the large model:
if it is determined through code analysis that the unit test has a syntax error, repairing the syntax of the unit test based on a preset syntax repair rule;
if it is determined through code compilation that the unit test has a missing-dependent-resource error, adding the missing dependent resource to the unit test; and
if it is determined through test execution that the unit test has a runtime error, entering the step of inputting the updated code to be tested, the unit test, and the error information into the large model and generating the repaired unit test using the large model.
7. The method of claim 1 or 6, wherein the inputting the updated code to be tested, the unit test, and the error information into the large model and generating the repaired unit test using the large model comprises:
sending the updated code to be tested, the unit test, and the error information to a cloud-side device, so that the cloud-side device inputs unit test repair prompt information into the large model and repairs the unit test by using the large model to obtain the repaired unit test, wherein the unit test repair prompt information comprises the updated code to be tested, the unit test, and the error information; and
receiving the repaired unit test returned by the cloud-side device.
8. The method of claim 1, further comprising:
re-entering, for the repaired unit test, the step of detecting whether the unit test has an error by using the preset detection strategy, until the repaired unit test has no error or the number of repairs reaches a preset stop condition.
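The re-entry of claim 8 amounts to a bounded repair loop: stop when the test is clean or a preset attempt limit is reached. A sketch, with `detect` and `repair` as hypothetical callables and the limit chosen arbitrarily:

```python
MAX_REPAIRS = 3  # hypothetical preset stop condition

def repair_until_clean(unit_test, detect, repair, max_repairs=MAX_REPAIRS):
    """Re-run detection on each repaired test until it passes or the
    repair count reaches the stop condition. Returns (test, attempts)."""
    for attempt in range(max_repairs + 1):
        error = detect(unit_test)
        if error is None:
            return unit_test, attempt
        if attempt == max_repairs:
            break                       # stop condition reached
        unit_test = repair(unit_test, error)
    return unit_test, max_repairs
```

The bound matters because a model-driven repair loop is not guaranteed to converge; without it, a stubbornly broken test would loop forever.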
9. A test processing method, applied to a cloud-side device, comprising:
receiving a unit test generation request sent by an end-side device, inputting the updated code to be tested corresponding to the unit test generation request into a large model, and generating a unit test corresponding to the code to be tested, wherein the updated code to be tested is obtained by analyzing project code to obtain related code of initial code to be tested and assembling the related code and the initial code to be tested;
returning the unit test to the end-side device, so that the end-side device detects whether the unit test has an error and, if an error exists, determines error information corresponding to the error and sends the updated code to be tested, the unit test, and the error information to the cloud-side device;
inputting unit test repair prompt information into the large model, and repairing the unit test by using the large model to obtain a repaired unit test, wherein the unit test repair prompt information comprises the updated code to be tested, the unit test, and the error information; and
sending the repaired unit test to the end-side device.
10. A test processing system, comprising: an end-side device to which the test processing method of any one of claims 1 to 8 is applied, and a cloud-side device to which the test processing method of claim 9 is applied.
11. A computing device, comprising:
a memory and a processor;
wherein the memory is configured to store computer-executable instructions, and the processor is configured to execute the computer-executable instructions, which, when executed by the processor, implement the steps of the test processing method of any one of claims 1 to 9.
12. A computer-readable storage medium storing computer-executable instructions which, when executed by a processor, implement the steps of the test processing method of any one of claims 1 to 9.
CN202410039294.2A 2024-01-09 2024-01-09 Test processing method and system Pending CN118051429A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410039294.2A CN118051429A (en) 2024-01-09 2024-01-09 Test processing method and system


Publications (1)

Publication Number Publication Date
CN118051429A true CN118051429A (en) 2024-05-17

Family

ID=91047445

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410039294.2A Pending CN118051429A (en) 2024-01-09 2024-01-09 Test processing method and system

Country Status (1)

Country Link
CN (1) CN118051429A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination