CN117093484A - Test case generation method, device, equipment and storage medium - Google Patents

Test case generation method, device, equipment and storage medium

Info

Publication number
CN117093484A
CN117093484A · Application CN202311061883.2A
Authority
CN
China
Prior art keywords
test case
case
target
target test
test
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311061883.2A
Other languages
Chinese (zh)
Inventor
袁琳 (Yuan Lin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bank of China Ltd
Original Assignee
Bank of China Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bank of China Ltd filed Critical Bank of China Ltd
Priority to CN202311061883.2A
Publication of CN117093484A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3684 Test management for test design, e.g. generating new test cases
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites
    • G06F 11/3692 Test management for test results analysis
    • G06N 20/00 Machine learning
    • Y02D 10/00 Energy efficient computing, e.g. low power processors, power management or thermal management


Abstract

The application provides a test case generation method, device, equipment and storage medium, which can be used in the field of artificial intelligence. The method comprises the following steps: acquiring input data for generating a target test case; inputting the input data into a trained test case generation model and determining the target test case with the model; verifying the target test case to obtain a corresponding verification result; if the verification result is a pass, determining the case category to which the target test case belongs; and storing the case category and the target test case correspondingly in a case repository. Because the generated target test case is verified, only test cases that pass verification are stored, which improves the stability of the test cases in the case repository and reduces problematic test results caused by non-conforming test cases.

Description

Test case generation method, device, equipment and storage medium
Technical Field
The present application relates to the field of artificial intelligence, and in particular, to a test case generating method, apparatus, device, and storage medium.
Background
In the development, operation and maintenance of financial software, automated testing is an important stage. Typically, the financial software is tested automatically with test cases so that the various possible abnormal business scenarios are covered by tests, improving the quality of the financial software.
In the related art, test engineers typically write test cases manually according to the business scenarios of the financial software; however, writing test cases manually is costly and inefficient. With the continuous development of machine learning technology, generating test cases with machine learning has become a new trend: a machine learning model is trained on a large number of sample test cases so that it learns their grammar and structure and can then generate test cases automatically.
However, machine learning models have limitations, so the generated test cases may contain errors and cannot be used directly for automated testing.
Disclosure of Invention
The application provides a test case generation method, device, equipment and storage medium, which improve the stability of the test cases in the case repository and reduce problems caused by non-conforming test cases.
In a first aspect, the present application provides a test case generating method, including:
acquiring input data for generating a target test case; the input data comprises requirement class data, interface class data, service class data and definition class data corresponding to target software, and the target test case is used for automatically testing the target software;
inputting the input data into a trained test case generation model and determining the target test case with the model, the test case generation model being obtained by training with historical test cases;
verifying the target test case to obtain a verification result corresponding to the target test case, the verification result being a pass or a fail;
if the verification result is a pass, determining the case category to which the target test case belongs;
storing the case category and the target test case correspondingly in a case repository; the test cases stored in the case repository are used to perform test tasks that automatically test the target software.
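The five steps above can be sketched as a small pipeline. All names below (`CaseRepository`, `generate_and_store` and the callback parameters) are illustrative assumptions, not identifiers from the application:

```python
class CaseRepository:
    """Minimal in-memory stand-in for the case repository described above."""

    def __init__(self):
        self.cases = {}  # case category -> list of stored test cases

    def store(self, category, case):
        self.cases.setdefault(category, []).append(case)


def generate_and_store(input_data, generate, verify, classify, repository):
    """Generate the target test case, verify it, classify it, and store it."""
    case = generate(input_data)        # the trained model produces the case
    if not verify(case):               # only cases that pass verification survive
        return None
    category = classify(input_data)    # general or special category
    repository.store(category, case)   # category and case stored together
    return case
```

Keeping verification in front of storage is what gives the repository its stability property: a failed case never reaches the test tasks.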
Optionally, verifying the target test case to obtain the verification result corresponding to the target test case includes:
verifying the target test case with each of a plurality of preset verification modes, to obtain a verification result corresponding to each preset verification mode;
if the verification results corresponding to all the preset verification modes are passes, determining that the verification result corresponding to the target test case is a pass; otherwise, determining that it is a fail.
Optionally, verifying the target test case with each of the plurality of preset verification modes, to obtain the verification result corresponding to each preset verification mode, includes:
when the preset verification mode is the naming check, verifying whether the name of the target test case conforms to a preset naming convention; if so, determining that the verification result corresponding to the naming check is a pass, otherwise a fail;
when the preset verification mode is the field non-empty check, checking whether the required fields in the target test case are non-empty; if so, determining that the verification result corresponding to the field non-empty check is a pass, otherwise a fail;
when the preset verification mode is the parameter usage check, verifying whether each parameter in the target test case is annotated with its corresponding usage; if so, determining that the verification result corresponding to the parameter usage check is a pass, otherwise a fail.
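A minimal sketch of the three preset verification modes, assuming a dictionary-shaped test case; the naming pattern, the required-field list and the key names are illustrative assumptions rather than conventions from the application:

```python
import re

NAMING_PATTERN = re.compile(r"^tc_[a-z0-9_]+$")    # assumed naming convention
REQUIRED_FIELDS = ("name", "steps", "expected")    # assumed required fields


def naming_check(case):
    """Pass when the case name conforms to the preset naming convention."""
    return bool(NAMING_PATTERN.match(case.get("name", "")))


def field_non_empty_check(case):
    """Pass when every required field is present and non-empty."""
    return all(case.get(field) for field in REQUIRED_FIELDS)


def parameter_usage_check(case):
    """Pass when every parameter is annotated with its usage."""
    return all(param.get("usage") for param in case.get("parameters", []))


def verify(case):
    """Overall result is a pass only when every preset check passes."""
    checks = (naming_check, field_non_empty_check, parameter_usage_check)
    return all(check(case) for check in checks)
```

Because the checks have equal priority, `all()` over the tuple mirrors the claim: one failing mode fails the whole verification.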
Optionally, determining the case category to which the target test case belongs includes:
if the service type included in the service class data is null, determining that the case category to which the target test case belongs is the general category;
if the service type included in the service class data is not null, determining that the case category to which the target test case belongs is the special category.
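The classification rule reduces to a single conditional; the key name `service_type` is an assumed stand-in for however the service class data encodes the service type:

```python
def classify_case(service_class_data):
    """General category when the service type is empty or absent, special otherwise."""
    return "special" if service_class_data.get("service_type") else "general"
```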
Optionally, after storing the case category and the target test case in the case repository, the method further includes:
when the current time reaches the test time corresponding to the test task, acquiring the target test case corresponding to the test task from the case repository;
according to the test frequency corresponding to the test task, executing the target test case on a thread corresponding to the test task to obtain a test result corresponding to the test task;
and writing the test result into a test report corresponding to the target software.
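A sketch of executing a stored case on its own thread, one execution per unit of test frequency. The task shape (`case_id`, `runs`) and the placeholder `execute` are assumptions; a real implementation would drive the target software and write into the test report:

```python
import threading


def execute(case):
    # placeholder: a real implementation would exercise the target software
    return {"case": case["name"], "passed": True}


def run_test_task(task, repository, report):
    """Run the task's target test case on a dedicated thread and collect results."""
    def worker():
        case = repository[task["case_id"]]
        for _ in range(task["runs"]):      # test frequency: number of executions
            report.append(execute(case))

    thread = threading.Thread(target=worker)
    thread.start()
    return thread
```

Returning the thread lets the scheduler join it before writing the collected results into the report for the target software.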
Optionally, the method further comprises:
if the verification result is a fail, sending the verification result and the reason for the failure to the user terminal, so that the user terminal displays prompt information, the prompt information including information prompting the user that the target test case failed verification and the reason for the failure.
Optionally, the method further comprises:
receiving a correction request for the target test case sent by the user terminal, the correction request being sent by the user terminal when a correction button is triggered;
sending the target test case to the user terminal, so that the user terminal displays the target test case and obtains a corrected target test case produced by the user correcting it;
receiving the corrected target test case sent by the user terminal;
and verifying the corrected target test case to obtain a verification result corresponding to the corrected target test case.
In a second aspect, the present application provides a test case generating device, including:
the acquisition module is used for acquiring input data for generating a target test case; the input data comprises requirement class data, interface class data, service class data and definition class data corresponding to target software, and the target test case is used for automatically testing the target software;
the first determining module is used for inputting the input data into a trained test case generation model and determining the target test case with the model, the test case generation model being obtained by training with historical test cases;
the verification module is used for verifying the target test case to obtain a verification result corresponding to the target test case, the verification result being a pass or a fail;
the second determining module is used for determining the case category to which the target test case belongs if the verification result is a pass;
the storage module is used for storing the case category and the target test case correspondingly in a case repository; the test cases stored in the case repository are used to perform test tasks that automatically test the target software.
In a third aspect, the present application provides an electronic device comprising: a processor and a memory and a transceiver communicatively coupled to the processor, respectively;
the memory stores computer-executable instructions; the transceiver is used for receiving and transmitting data with the user terminal;
the processor executes computer-executable instructions stored in the memory to implement the test case generation method according to any one of the first aspects.
In a fourth aspect, the present application provides a computer-readable storage medium having stored therein computer-executable instructions for implementing the test case generation method of any one of the first aspects when executed by a processor.
In a fifth aspect, the present application provides a computer program product comprising computer-executable instructions which, when executed by a processor, implement the test case generation method of any of the first aspects.
According to the test case generation method, device, equipment and storage medium of the application, input data corresponding to target software is input into the test case generation model so that the model generates the target test case. The target test case is then verified to check whether it meets the conditions, yielding a verification result. If the verification result is a pass, the target test case meets the conditions and can be used for automated testing, so the case category to which it belongs is determined and the category and case are stored correspondingly in the case repository. Because the generated target test case is verified, only test cases that pass verification are stored, which improves the stability of the test cases in the case repository and reduces problematic test results caused by non-conforming test cases.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
FIG. 1 is a schematic diagram of an application scenario shown in accordance with an exemplary embodiment;
FIG. 2 is a flow diagram illustrating a test case generation method according to an example embodiment;
FIG. 3 is a flowchart illustrating a test case generation method according to another exemplary embodiment;
FIG. 4 is a schematic diagram illustrating a test case generation flow according to an example embodiment;
FIG. 5 is a schematic diagram of a test case generating device according to an example embodiment;
fig. 6 is a schematic diagram of an electronic device according to an exemplary embodiment.
Specific embodiments of the present application have been shown by way of the above drawings and will be described in more detail below. The drawings and the written description are not intended to limit the scope of the inventive concepts in any way, but rather to illustrate the inventive concepts to those skilled in the art by reference to the specific embodiments.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the application. Rather, they are merely examples of apparatus and methods consistent with aspects of the application as detailed in the accompanying claims.
In the following description of the embodiments, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
It should be noted that, the user information (including but not limited to user equipment information, user personal information, etc.) and the data (including but not limited to data for analysis, stored data, presented data, etc.) related to the present application are information and data authorized by the user or fully authorized by each party, and the collection, use and processing of the related data need to comply with related laws and regulations and standards, and provide corresponding operation entries for the user to select authorization or rejection.
It should be noted that the test case generating method, device, equipment and storage medium of the present application may be used in the field of artificial intelligence, and may also be used in any field other than the field of artificial intelligence.
For a clear understanding of the technical solutions of the present application, the prior art solutions will be described in detail first.
In the development, operation and maintenance of financial software, automated testing is an important stage. Typically, the financial software is tested automatically with test cases so that the various possible abnormal business scenarios are covered by tests, improving the quality of the financial software.
In the related art, a test engineer usually writes test cases manually according to the business scenarios of the financial software; however, writing test cases manually takes considerable manpower, carries high labor costs and is inefficient. It is also prone to problems such as incomplete coverage of test points, long test flows and slow test progress. With the continuous development of machine learning technology, generating test cases with machine learning has become a new trend: a machine learning model is trained on a large number of sample test cases so that it learns their grammar and structure and can then generate test cases automatically.
However, machine learning models have limitations, so the generated test cases may contain errors and cannot be used directly for automated testing.
In view of the prior-art problem that test cases generated by a machine learning model may contain errors and so cannot be used directly for automated testing, the inventor found in research that, after the machine learning model generates a test case, the test case can be verified to check whether it meets the conditions, and used for automated testing only if it does; this greatly reduces problematic automated-test results caused by problematic test cases. Specifically, input data corresponding to the target software is input into the test case generation model so that the model generates the target test case. The target test case is then verified to check whether it meets the conditions, yielding a verification result. If the verification result is a pass, the target test case meets the conditions and can be used for automated testing, so the case category to which it belongs is determined and the category and case are stored correspondingly in the case repository.
The application scenario of the test case generation method provided by the embodiment of the application is introduced below.
Fig. 1 is a schematic diagram of an application scenario according to an example embodiment. As shown in fig. 1, the application scenario includes a user terminal 1 and an electronic device 2, connected through a network.
The user terminal 1 is a device used by a tester or a developer of software, and the electronic device 2 is used for providing background services for the user terminal 1. In this embodiment, the electronic device 2 is used to generate test cases, and the test cases are used to automatically test software. Alternatively, the electronic device 2 is a server or a cluster of servers.
In this embodiment, a user enters input data for generating a target test case through the user terminal 1, which triggers the user terminal 1 to send a test case generation request carrying that input data to the electronic device 2. The electronic device 2 accordingly obtains the input data, inputs it into the trained test case generation model, determines the target test case with the model, and verifies the target test case to obtain the corresponding verification result. If the verification result is a pass, the electronic device 2 determines the case category to which the target test case belongs, stores the category and the case in the case repository, and then sends a case generation result to the user terminal 1 for display, so that the user can conveniently check it; the case generation result indicates that the target test case has been generated and stored in the case repository. In this way, the electronic device 2 can later fetch the target test case from the case repository to automatically test the target software, and the user can trigger the user terminal 1 to fetch the target test case from the electronic device 2 in order to view it.
It should be noted that, although the application is described using the development and operation scenario of financial software as an example, the application is not limited to this scenario; it may also be applied to the development and operation of other types of software, such as social software, game software or office software.
The following describes the technical scheme of the present application and how the technical scheme of the present application solves the above technical problems in detail with specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
FIG. 2 is a flow diagram illustrating a test case generation method according to an example embodiment. The method is executed by a test case generating device integrated in an electronic device. As shown in FIG. 2, the test case generation method provided in this embodiment includes the following steps:
step S101, obtaining input data for generating a target test case; the input data comprise demand class data, interface class data, service class data and definition class data corresponding to the target software, and the target test case is used for automatically testing the target software.
The target test case is the test case to be generated, and the target software is any software to be tested, such as financial software, social software or game software. In this embodiment, the electronic device is connected to the user terminal through a network; the user terminal is a device used by a tester or developer of the target software, and the electronic device provides background services for the user terminal, including generating the test cases used to automatically test the target software. The user enters the input data for generating the target test case through the user terminal, triggering the user terminal to send a test case generation request carrying that input data to the electronic device; the electronic device accordingly obtains the input data. Optionally, the user triggers the user terminal to display a test case generation interface showing a test case generation button for each of at least one piece of software, the at least one piece of software including the target software. When the user triggers the button corresponding to the target software, the user terminal detects this, acquires the input data for generating the target test case, and sends the test case generation request to the electronic device. The style of the test case generation button may be set as needed, which is not limited in this embodiment.
The input data comprises requirement class data, interface class data, service class data and definition class data corresponding to the target software. The requirement class data comprises data representing test requirements, such as user stories (requirement cards), requirement documents (PRD documents) and requirement specifications. The interface class data includes data describing interfaces, such as interface documents. The service class data comprises data representing the services to be tested, such as the service model, service object and service type, and the definition class data comprises data definition descriptions or other custom descriptions. A data definition description refers to a definition of the data used in the target test case. The input data may further include other types of data set according to user requirements, which is not limited in this embodiment.
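The four input-data classes can be pictured as one container; the field names and types below are illustrative assumptions, not the application's data model:

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class CaseGenerationInput:
    """Input data for generating a target test case (assumed field names)."""
    requirement_data: List[str] = field(default_factory=list)  # user stories, PRD documents, specs
    interface_data: List[str] = field(default_factory=list)    # interface documents
    service_data: Dict[str, str] = field(default_factory=dict) # service model, object, type
    definition_data: Dict[str, str] = field(default_factory=dict)  # data definitions, custom descriptions
```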
Step S102, inputting the input data into a trained test case generation model and determining the target test case with the model, the test case generation model being trained with historical test cases.
In this embodiment, a trained test case generation model is deployed in the electronic device. After acquiring the input data, the electronic device calls the test case generation model, inputs the data into it, and the model outputs the target test case.
The test case generation model is obtained by training with historical test cases; for the specific training process, refer to the training processes for test case generation models in the related art, which are not described here. The test case generation model is a machine learning model, for example a Knowledge-Enhanced Large Language Model (KELLM) or another machine learning model. Taking a large language model as an example, the model performs steps such as description normalization, dialogue generation and case divergence on the input data to generate a textual test case, and then generates a unit test case on the basis of the textual test case.
Step S103, verifying the target test case to obtain a verification result corresponding to the target test case; the verification result is pass or fail.
In this embodiment, after the target test case is generated, it is verified to check whether it meets the conditions, obtaining a verification result corresponding to the target test case. For example, the verification result may be represented by 0 or 1, where 0 indicates a fail and 1 indicates a pass.
Step S104, if the verification result is a pass, determining the case category to which the target test case belongs.
In this embodiment, if the verification result is a pass, the target test case passed verification, and the case category to which it belongs is then determined, classifying the target test case. The case category is either the general category or the special category. The general category indicates the case is independent of any business scenario: a test case in the general category can be used to automatically test any business scenario. The special category indicates the case is tied to a business scenario: a test case in the special category can only automatically test the business scenario corresponding to that category.
Step S105, storing the case category and the target test case correspondingly in a case repository; the test cases stored in the case repository are used to perform test tasks that automatically test the target software.
The case repository is a database for storing test cases. Optionally, the case repository is deployed on the electronic device, or it is deployed separately and connected to the electronic device through a network. In this embodiment, after the case category to which the target test case belongs is determined, the category and the case are stored correspondingly in the case repository, implementing management of the test case, where management covers verification, classification and warehousing.
Optionally, after step S105, the electronic device may further send a case generation result to the user terminal, the case generation result indicating that the target test case has been generated and stored in the case repository. The user terminal displays the case generation result for the user to check. The user can also trigger the user terminal to obtain the target test case from the electronic device in order to view it: when displaying the case generation result, the user terminal also displays a view button; when the user triggers it, the user terminal sends an acquisition request to the electronic device, and the electronic device, in response, acquires the target test case and sends it to the user terminal, which receives and displays it for the user.
In this embodiment, input data corresponding to the target software is input into the test case generation model so that the model generates the target test case. The target test case is then verified to check whether it meets the conditions, yielding a verification result. If the verification result is a pass, the target test case meets the conditions and can be used for automated testing, so the case category to which it belongs is determined and the category and case are stored correspondingly in the case repository.
Fig. 3 is a flowchart of a test case generation method according to another exemplary embodiment. As shown in fig. 3, the method provided in this embodiment refines steps S103 and S104 of the method provided in the previous embodiment. It includes the following steps:
step S201, obtaining input data for generating a target test case, where the input data includes requirement class data, interface class data, service class data, and definition class data corresponding to the target software, and the target test case is used to automatically test the target software.
Step S202, inputting the input data into a trained test case generation model and determining the target test case using the test case generation model, where the test case generation model is obtained by training with historical test cases.
In this embodiment, the implementation manner of steps S201 to S202 is the same as that of steps S101 to S102 in the previous embodiment, and will not be described here again.
In this embodiment, after the target test case is generated, the electronic device performs verification on the target test case to obtain a verification result corresponding to the target test case. In an alternative embodiment, the implementation of this step includes the following steps S203-S204.
Step S203, verifying the target test case using a plurality of preset verification modes, respectively, to obtain a verification result corresponding to each preset verification mode.
A preset verification mode is a verification mode configured in advance. The preset verification modes have the same priority, so the target test case may be verified using the preset verification modes in parallel, or sequentially in a preset order. The preset order may be set as needed, which is not limited in this embodiment; for example, if the plurality of preset verification modes includes mode 1, mode 2, and mode 3, the preset order may be mode 1, mode 2, mode 3.
Optionally, the plurality of preset verification modes includes a naming check, a field non-empty check, and a parameter use check. Correspondingly, verifying the target test case using the plurality of preset verification modes to obtain the verification result corresponding to each preset verification mode is implemented as follows:
when the preset verification mode is the naming check, verify whether the name of the target test case conforms to a preset naming specification; if so, determine that the verification result corresponding to the naming check is a pass, otherwise a fail. The electronic device stores the preset naming specification in advance, and the specification may be set as needed, which is not limited in this embodiment; for example, the name may not contain non-character content, or the name must contain certain keywords. In this embodiment, the electronic device queries test cases by case name, so a name that conforms to the preset naming specification helps the electronic device locate the test case quickly.
When the preset verification mode is the field non-empty check, verify whether the required fields in the target test case are non-empty; if so, determine that the verification result corresponding to the field non-empty check is a pass, otherwise a fail. The electronic device stores the required fields in advance; they may be set as needed, which is not limited in this embodiment, and include fields such as a case description field and a test tool field. The field non-empty check thus verifies the integrity of the target test case. When there are multiple required fields, the verification result corresponding to the field non-empty check is a pass only if all of the required fields are non-empty; otherwise it is a fail.
When the preset verification mode is the parameter use check, verify whether each parameter in the target test case is annotated with its corresponding use; if so, determine that the verification result corresponding to the parameter use check is a pass, otherwise a fail. The target test case includes a plurality of parameters, and each parameter needs a description of its use so that the user can understand it; this also helps the user, when investigating why a case failed to execute, to quickly determine whether the case's parameters meet expectations.
It should be noted that, in addition to the above three preset verification methods, the user may set other preset verification methods as required, which is not limited in this embodiment.
In this embodiment, three preset verification modes are provided, and each is used to verify the target test case to obtain a corresponding verification result, thereby verifying the target test case from multiple aspects and angles and improving the comprehensiveness of the verification.
Step S204, if the verification results corresponding to all of the preset verification modes are passes, determining that the verification result corresponding to the target test case is a pass; otherwise, determining that it is a fail.
When the verification results corresponding to all of the preset verification modes are passes, the target test case is qualified from every angle targeted by the preset verification modes, and may therefore be considered to have passed verification. If the verification result corresponding to any one preset verification mode is a fail, the target test case is unqualified from the angle targeted by that mode, and may be considered to have failed verification.
It should be noted that even if the verification result obtained by one preset verification mode is a fail, the remaining preset verification modes must still be applied to the target test case; in other words, every preset verification mode is applied to the target test case before the verification result corresponding to the target test case is determined.
This embodiment thus provides a verification scheme for the target test case: a plurality of preset verification modes verify the target test case respectively, and only when the verification results corresponding to all of the preset verification modes are passes is the verification result corresponding to the target test case determined to be a pass. The verification result is therefore determined comprehensively and accurately.
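The three preset verification modes and their aggregation can be sketched as below. The naming pattern, the set of required fields, and the parameter format are illustrative assumptions; the sketch does follow the note above in running every check even after one fails.

```python
import re

# Assumed naming specification and required fields -- not the patent's.
NAME_PATTERN = re.compile(r"^TC_[A-Za-z0-9_]+$")
REQUIRED_FIELDS = ("description", "tool")

def naming_check(case):
    # Pass only if the case name matches the assumed naming specification.
    return bool(NAME_PATTERN.match(case.get("name", "")))

def non_empty_check(case):
    # Pass only if every required field is present and non-empty.
    return all(case.get(field) for field in REQUIRED_FIELDS)

def parameter_use_check(case):
    # Pass only if every parameter carries a non-empty usage description.
    return all(desc for desc in case.get("params", {}).values())

def verify_case(case):
    # Every check runs even when an earlier one fails; the case passes
    # only when all individual results are passes.
    results = {
        "naming": naming_check(case),
        "field_non_empty": non_empty_check(case),
        "parameter_use": parameter_use_check(case),
    }
    return all(results.values()), results

ok_case = {"name": "TC_login_001", "description": "login", "tool": "http",
           "params": {"user": "account under test"}}
bad_case = {"name": "login!", "description": "", "tool": "http",
            "params": {"user": ""}}
passed, _ = verify_case(ok_case)
failed, detail = verify_case(bad_case)
```

Returning the per-mode results alongside the overall verdict also supplies the failure reason later sent to the user terminal.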
In this embodiment, if the verification result is a pass, the electronic device performs steps S205 to S206.
Step S205, if the verification result is a pass, determining the case category to which the target test case belongs.
In this embodiment, if the verification result is a pass, the target test case has passed verification, and the case category to which the target test case belongs is determined, thereby classifying the target test case. The case category is either a general category or a special category. The general category indicates that the case is independent of any service scenario: a test case belonging to the general category can automatically test any service scenario. The special category indicates that the case is tied to a service scenario: a test case belonging to the special category can only automatically test the service scenario corresponding to that special category.
Optionally, determining the case category to which the target test case belongs is implemented as follows: if the service type included in the service class data is null, determine that the case category to which the target test case belongs is the general category; if the service type included in the service class data is not null, determine that the case category to which the target test case belongs is the special category.
A null service type indicates that there is no service scenario targeted by the target test case, so the case category to which the target test case belongs may be considered the general category; a non-null service type indicates that there is a service scenario targeted by the target test case, so the case category is determined to be the special category.
In this embodiment, by classifying the target test case according to the service type included in the service class data, it can be determined whether the target test case belongs to the general category or the special category, thereby achieving classification of test cases.
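The classification rule is small enough to state directly; in this sketch the field name `service_type` and the dictionary shape of the service class data are assumptions for illustration.

```python
# Minimal sketch of the rule above: a null or empty service type marks a
# general case; a non-empty service type marks a special case.

def classify(service_class_data):
    return "special" if service_class_data.get("service_type") else "general"

category_general = classify({"service_type": None})
category_special = classify({"service_type": "cross-border payment"})
```

Treating a missing field the same as a null one keeps the rule total over any service class data.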
Step S206, storing the case categories and the target test cases in a case warehouse correspondingly; the test cases stored in the case repository are used to perform test tasks to automatically test the target software.
In this embodiment, after determining the case category to which the target test case belongs, the case category is stored in the case repository in correspondence with the target test case.
Optionally, after the case category and the target test case are stored correspondingly in the case repository, the test case generation method provided by the present application further includes the following steps: when the current time reaches the test time corresponding to a test task, obtaining the target test case corresponding to the test task from the case repository; executing the target test case using a thread corresponding to the test task, at the test frequency corresponding to the test task, to obtain a test result corresponding to the test task; and writing the test result into a test report corresponding to the target software.
The electronic device configures the test task in advance, together with the test time and test frequency corresponding to the test task. The test time is the time at which execution of the test task starts, and the test frequency is the frequency at which the test task is executed; an execution node corresponding to the test task may also be configured. When the current time reaches the test time corresponding to the test task, the target test case corresponding to the test task is obtained from the case repository and executed by the thread corresponding to the test task. When the case repository contains a large number of test cases, a plurality of test tasks may be configured, and the electronic device executes the test cases in parallel on multiple threads, for example by building a test task thread pool and executing the test cases with the threads in the pool. After obtaining the test report, the electronic device sends it to the user terminal so that the user can view the test report through the user terminal.
This embodiment thus provides a test case execution scheme: the target test case is executed by a thread in the form of a test task, achieving orderly execution of test cases and ensuring the stability of the test process.
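The scheduling just described might be sketched as follows. The repository layout, the task API, and the fixed repetition count standing in for "test frequency" are all assumptions; the patent does not prescribe this implementation.

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Assumed repository: test cases keyed by the task that owns them.
case_repository = {
    "task1": ["case_a", "case_b"],
    "task2": ["case_c"],
}

def execute_case(case):
    # Stand-in for actually driving the target software.
    return f"{case}: pass"

def run_task(task_id, repetitions=2):
    # "repetitions" plays the role of the task's configured test frequency.
    results = []
    for _ in range(repetitions):
        for case in case_repository[task_id]:
            results.append(execute_case(case))
    return results

def schedule(tasks, start_time):
    # Wait until the configured test time, then run the tasks on a thread
    # pool so test cases execute in parallel, one worker per task.
    time.sleep(max(0.0, start_time - time.time()))
    with ThreadPoolExecutor(max_workers=len(tasks)) as pool:
        futures = {task: pool.submit(run_task, task) for task in tasks}
        return {task: future.result() for task, future in futures.items()}

report = schedule(["task1", "task2"], start_time=time.time())
```

Collecting each future's result into one dictionary mirrors the unified analysis that produces the test report.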
For example, fig. 4 is a schematic diagram of a test case generation flow according to an exemplary embodiment. As shown in fig. 4, the electronic device obtains input data including a user story, a requirement document, an interface document, a custom description, and the like, and inputs the input data into the test case generation model. The test case generation model generates a plurality of test cases "Case1, Case2, Case3" through description normalization, dialog generation, and case divergence. The test cases are then verified, classified, and stored, and subsequently executed by the threads "Test Task1, Test Task2, Test Task3" in a test task thread pool to obtain test results. Configuration and scheduling of the test tasks are performed in advance to obtain the test task thread pool.
The above embodiment is described taking a passing verification result as an example. In other embodiments the verification result is a fail, and correspondingly the test case generation method provided by the present application further includes the following step: if the verification result is a fail, sending the verification result and the failure reason to the user terminal, so that the user terminal displays prompt information, where the prompt information includes information prompting the user that the test case failed verification, together with the failure reason. The failure reason may be the preset verification mode that failed; for example, if the naming check failed, the failure reason is that the naming check was not passed.
In this embodiment, when the verification result is a fail, data interaction with the user terminal enables the user terminal to display prompt information, so that the user is informed of the verification result of the target test case and the reason for the failure.
Optionally, the test case generation method provided by the present application further includes the following steps: when displaying the prompt information, the user terminal also displays a correction button, which is used to trigger correction of the test case; in response to the correction button being triggered, the user terminal sends a correction request for the target test case to the electronic device; accordingly, the electronic device receives the correction request sent by the user terminal and sends the target test case to the user terminal; the user terminal receives and displays the target test case so that the user can correct it; the user terminal obtains the corrected target test case and sends it to the electronic device; accordingly, the electronic device receives the corrected target test case and verifies it to obtain a verification result corresponding to the corrected target test case.
The style of the correction button may be set as needed, which is not limited in this embodiment. Verification of the corrected target test case to obtain its corresponding verification result is performed as in step S204 and is not repeated here. If the verification result is a pass, steps S205 to S206 above are then performed.
In this embodiment, after viewing the prompt information the user may need to correct the target test case. Displaying the correction button gives the user a way to trigger the correction, and displaying the target test case lets the user correct it directly. After the corrected target test case is obtained, verification is again used to determine whether it meets the requirements, so the target test case is perfected through manual correction.
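The correct-then-reverify cycle can be sketched as a loop in which a correction callback stands in for the user editing the case at the terminal. The verification function, the callback, and the bounded retry count are illustrative assumptions.

```python
# Hedged sketch: a failed case is corrected (here by a callback standing in
# for the user) and re-verified until it passes or attempts run out.

def verify(case):
    # Simplified stand-in for the full preset verification modes.
    return bool(case.get("name")) and bool(case.get("description"))

def correct_until_valid(case, correct_fn, max_rounds=3):
    for _ in range(max_rounds):
        if verify(case):
            return case, True
        case = correct_fn(case)  # user-driven correction step
    return case, verify(case)

def user_fix(case):
    # Hypothetical correction: fill in the missing description.
    fixed = dict(case)
    fixed["description"] = fixed.get("description") or "login scenario"
    return fixed

broken = {"name": "TC_login_001", "description": ""}
repaired, ok = correct_until_valid(broken, user_fix)
```

Bounding the number of rounds is a design choice for the sketch; the embodiment itself simply re-verifies each corrected case as it arrives.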
The present application provides a scheme for automatically generating test cases, freeing manpower from repetitive test case writing and test execution. A powerful machine learning model automatically generates test cases with comprehensive coverage, avoiding manual omissions; automated testing helps verify the correctness of the code quickly, and the generated test results give a fast, intuitive view of code quality. To ensure the accuracy of the cases generated by the model, the application provides multi-dimensional data input and classifies the test cases output by the model into general cases, which cover comprehensive scenarios, and special cases, which are closely tied to the service; a verification link and a manual correction link are also added to ensure high availability of the cases. Test tasks are executed automatically in a flexible, configurable, and easy-to-use manner; multi-thread scheduling yields test results faster; and finally the test results are analyzed and processed uniformly to generate a test report. Code functionality can thus be verified conveniently and quickly, and problems found rapidly.
Fig. 5 is a schematic structural diagram of a test case generating device according to an exemplary embodiment, and as shown in fig. 5, in this embodiment, the test case generating device 300 may be disposed in an electronic device, and the test case generating device 300 includes:
an obtaining module 301, configured to obtain input data for generating a target test case, where the input data includes requirement class data, interface class data, service class data, and definition class data corresponding to the target software, and the target test case is used to automatically test the target software;
the first determining module 302, configured to input the input data into a trained test case generation model and determine the target test case using the test case generation model, where the test case generation model is obtained by training with historical test cases;
the verification module 303, configured to verify the target test case to obtain a verification result corresponding to the target test case, where the verification result is a pass or a fail;
the second determining module 304 is configured to determine a case class to which the target test case belongs if the verification result is passed;
a storage module 305, configured to store the case type and the target test case in a case repository; the test cases stored in the case repository are used to perform test tasks to automatically test the target software.
Optionally, the verification module 303 is configured to:
respectively checking the target test cases by adopting a plurality of preset checking modes to obtain checking results respectively corresponding to the preset checking modes;
if the verification results corresponding to the preset verification modes are all passed, determining that the verification result corresponding to the target test case is passed, otherwise, determining that the verification result does not pass.
Optionally, the verification module 303 is configured to:
when the preset verification mode is the naming check, verify whether the name of the target test case conforms to the preset naming specification; if so, determine that the verification result corresponding to the naming check is a pass, otherwise a fail;
when the preset verification mode is the field non-empty check, verify whether the required fields in the target test case are non-empty; if so, determine that the verification result corresponding to the field non-empty check is a pass, otherwise a fail;
when the preset verification mode is the parameter use check, verify whether each parameter in the target test case is annotated with its corresponding use; if so, determine that the verification result corresponding to the parameter use check is a pass, otherwise a fail.
Optionally, the second determining module 304 is configured to:
if the service type included in the service class data is null, determine that the case category to which the target test case belongs is the general category;
if the service type included in the service class data is not null, determine that the case category to which the target test case belongs is the special category.
Optionally, the apparatus 300 further includes a testing module, configured to:
when the current time reaches the test time corresponding to the test task, acquiring a target test case corresponding to the test task from a case warehouse;
according to the test frequency corresponding to the test task, executing the target test case by adopting a thread corresponding to the test task so as to obtain a test result corresponding to the test task;
and writing the test result into a test report corresponding to the target software.
Optionally, the apparatus 300 further comprises:
and a sending module, configured to send the verification result and the failure reason to the user terminal if the verification result is a fail, so that the user terminal displays prompt information, where the prompt information includes information prompting the user that the test case failed verification, together with the failure reason.
Optionally, the apparatus 300 further comprises:
a receiving module, configured to receive a correction request for the target test case sent by the user terminal, where the correction request is sent by the user terminal when the correction button is triggered;
the sending module is further configured to send the target test case to the user terminal, so that the user terminal displays the target test case and obtains the corrected target test case produced by the user correcting it;
the receiving module is further configured to receive the corrected target test case sent by the user terminal;
and the verification module 303 is configured to verify the corrected target test case to obtain a verification result corresponding to the corrected target test case.
The test case generating device provided in this embodiment may execute the technical scheme of the corresponding method embodiment, and its implementation principle and technical effects are similar to those of the corresponding method embodiment, and are not described herein again.
The embodiment of the present application also provides an electronic device. The electronic device may take various forms of servers, such as a single server, a server cluster, a distributed server, or another type of server.
Fig. 6 is a schematic diagram of an electronic device according to an exemplary embodiment. As shown in fig. 6, the electronic device 400 includes: a processor 401, and a memory 402 and a transceiver 403 communicatively coupled to the processor 401, respectively.
The memory 402 stores computer-executable instructions, and the transceiver 403 is used for transmitting and receiving data with the user terminal.
The processor 401 executes computer-executable instructions stored in the memory 402 to implement the test case generating method provided by the present application.
In this embodiment of the present application, the memory 402 and the transceiver 403 are each connected to the processor 401 through a bus. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. Buses may be divided into address buses, data buses, control buses, and so on. For ease of illustration, only one bold line is shown in the figures, but this does not mean there is only one bus or one type of bus.
The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the applications described and/or claimed herein. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired.
In an exemplary embodiment, there is also provided a computer-readable storage medium having stored therein computer-executable instructions for implementing the test case generating method provided by the present application when executed by a processor.
In an exemplary embodiment, a computer program product is also provided, comprising computer-executable instructions for implementing the test case generating method provided by the application when the computer-executable instructions in the computer program product are executed by a processor.
It should be noted that, for simplicity of description, the foregoing method embodiments are all described as a series of acts, but it should be understood by those skilled in the art that the present application is not limited by the order of acts described, as some steps may be performed in other orders or concurrently in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are alternative embodiments, and that the acts and modules referred to are not necessarily required for the present application.
It should be further noted that, although the steps in the flowchart are sequentially shown as indicated by arrows, the steps are not necessarily sequentially performed in the order indicated by the arrows. The steps are not strictly limited to the order of execution unless explicitly stated herein, and the steps may be executed in other orders. Moreover, at least a portion of the steps in the flowcharts may include a plurality of sub-steps or stages that are not necessarily performed at the same time, but may be performed at different times, the order in which the sub-steps or stages are performed is not necessarily sequential, and may be performed in turn or alternately with at least a portion of the sub-steps or stages of other steps or other steps.
It will be appreciated that the device embodiments described above are merely illustrative and that the device of the application may be implemented in other ways. For example, the division of the units/modules in the above embodiments is merely a logic function division, and there may be another division manner in actual implementation. For example, multiple units, modules, or components may be combined, or may be integrated into another system, or some features may be omitted or not performed.
In addition, each functional unit/module in each embodiment of the present application may be integrated into one unit/module, or each unit/module may exist alone physically, or two or more units/modules may be integrated together, unless otherwise specified. The integrated units/modules described above may be implemented either in hardware or in software program modules.
The integrated units/modules, if implemented in hardware, may be digital circuits, analog circuits, etc. Physical implementations of hardware structures include, but are not limited to, transistors, memristors, and the like. Unless otherwise indicated, the processor may be any suitable hardware processor, such as a central processing unit (Central Processing Unit, CPU), a graphics processor (Graphics Processing Unit, GPU), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a digital signal processor (Digital Signal Processor, DSP), a programmable logic device (Programmable Logic Device, PLD), a field programmable gate array (Field Programmable Gate Array, FPGA), a controller, microcontroller, microprocessor, or other electronic element. Unless specifically stated otherwise, the Memory may be implemented by any type of volatile or non-volatile Memory device or combination thereof, such as a U disk, random-Access Memory (RAM), static Random-Access Memory (SRAM), dynamic Random-Access Memory (DRAM), enhanced Dynamic Random-Access Memory (Enhanced Dynamic Random-Access Memory, EDRAM), electrically erasable programmable Read-Only Memory (Electrically Erasable Programmable Read-Only Memory, EEPROM), erasable programmable Read-Only Memory (Erasable Programmable Read-Only Memory, EPROM), programmable Read-Only Memory (Programmable Read-Only Memory, PROM), read-Only Memory (ROM), resistive Memory (Resistive Random Access Memory, RRAM), high Bandwidth Memory (High Bandwidth Memory), hybrid Memory cube HMC (Hybrid Memory Cube), magnetic hard disk, flash Memory, optical disk, removable disk, or various other programmable code storage media.
The integrated units/modules may be stored in a computer readable memory if implemented in the form of software program modules and sold or used as a stand-alone product. Based on this understanding, the technical solution of the present application may be embodied essentially or partly in the form of a software product or all or part of the technical solution, which is stored in a memory, comprising instructions for causing an electronic device to perform all or part of the steps of the method of the various embodiments of the application.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments. The technical features of the foregoing embodiments may be arbitrarily combined, and for brevity, all of the possible combinations of the technical features of the foregoing embodiments are not described, however, all of the combinations of the technical features should be considered as being within the scope of the disclosure.
Other embodiments of the application will be apparent to those skilled in the art from consideration of the specification and practice of the application disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It is to be understood that the application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (10)

1. A test case generation method, comprising:
acquiring input data for generating a target test case; wherein the input data comprises requirement class data, interface class data, service class data, and definition class data corresponding to target software, and the target test case is used for automatically testing the target software;
inputting the input data into a trained test case generation model, and determining the target test case using the test case generation model, wherein the test case generation model is obtained by training with historical test cases;
verifying the target test case to obtain a verification result corresponding to the target test case, wherein the verification result is a pass or a fail;
if the verification result is passed, determining the class of the case to which the target test case belongs;
storing the case categories and the target test cases in a case warehouse correspondingly; the test cases stored in the case repository are used to perform test tasks to automatically test the target software.
2. The method of claim 1, wherein the verifying the target test case to obtain the verification result corresponding to the target test case comprises:
a plurality of preset verification modes are adopted to respectively verify the target test cases so as to obtain verification results respectively corresponding to the preset verification modes;
if the verification results corresponding to the preset verification modes are all passed, determining that the verification result corresponding to the target test case is passed, otherwise, determining that the verification result does not pass.
3. The method of claim 2, wherein verifying the target test case by the plurality of preset verification modes to obtain the verification result corresponding to each preset verification mode comprises:
when the preset verification mode is naming verification, verifying whether the name of the target test case conforms to a preset naming specification; if so, determining that the verification result corresponding to the naming verification is passed, and otherwise determining that it is failed;
when the preset verification mode is field non-empty verification, verifying whether the required fields in the target test case are non-empty; if so, determining that the verification result corresponding to the field non-empty verification is passed, and otherwise determining that it is failed;
when the preset verification mode is parameter-purpose verification, verifying whether each parameter in the target test case is labeled with a corresponding purpose; if so, determining that the verification result corresponding to the parameter-purpose verification is passed, and otherwise determining that it is failed.
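The three verification modes of claim 3, combined per claim 2 (all must pass), could look like the sketch below. The naming pattern, required-field list, and `purpose` key are assumptions chosen for illustration; the patent does not specify them.

```python
import re

# Assumed naming specification and required fields -- not from the patent.
NAME_PATTERN = re.compile(r"^tc_[a-z0-9_]+$")
REQUIRED_FIELDS = ("name", "steps", "expected_result")

def check_naming(case):
    """Naming verification: the case name must match the preset naming specification."""
    return bool(NAME_PATTERN.match(case.get("name", "")))

def check_required_fields(case):
    """Field non-empty verification: every required field must be present and non-empty."""
    return all(case.get(f) for f in REQUIRED_FIELDS)

def check_parameter_purpose(case):
    """Parameter-purpose verification: each parameter must be labeled with its purpose."""
    return all("purpose" in p for p in case.get("parameters", []))

def validate(case):
    """Claim 2: the target test case passes only if every individual check passes."""
    checks = (check_naming, check_required_fields, check_parameter_purpose)
    return all(check(case) for check in checks)
```

Any single failing mode fails the case as a whole, matching the "all passed, otherwise failed" rule of claim 2.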
4. The method according to any one of claims 1-3, wherein determining the case category to which the target test case belongs comprises:
if the service type included in the service-class data is null, determining that the case category to which the target test case belongs is a general category; and
if the service type included in the service-class data is not null, determining that the case category to which the target test case belongs is a special category.
5. The method according to any one of claims 1-3, wherein after storing the case category and the target test case correspondingly in the case repository, the method further comprises:
when the current time reaches the test time corresponding to a test task, acquiring the target test case corresponding to the test task from the case repository;
executing the target test case on a thread corresponding to the test task, according to the test frequency corresponding to the test task, to obtain a test result corresponding to the test task; and
writing the test result into a test report corresponding to the target software.
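Claim 5's scheduled execution — run the stored case on a task-specific thread at the task's frequency and record the results — can be sketched as below. This is a hedged reading, not the patented implementation: the function names, the `repeat`/`interval` parameters standing in for "test frequency", and the blocking `join` (a real scheduler would likely not block) are all assumptions.

```python
import threading
import time

def run_test_task(case, runner, report, repeat=1, interval=0.0):
    """Execute `case` `repeat` times on a dedicated thread, appending each result to `report`."""
    def worker():
        for _ in range(repeat):
            report.append(runner(case))   # result is written into the test report
            if interval:
                time.sleep(interval)      # crude stand-in for the task's test frequency
    t = threading.Thread(target=worker)
    t.start()
    t.join()  # simplification: wait so the report is complete before returning
    return report
```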
6. The method according to any one of claims 1-3, further comprising:
if the verification result is failed, sending the verification result and the reason for the failure to a user terminal, so that the user terminal displays prompt information, wherein the prompt information comprises information prompting the user that the target test case failed verification and the reason for the failure.
7. The method of claim 6, further comprising:
receiving a correction request for the target test case sent by the user terminal, wherein the correction request is sent by the user terminal when a correction button is triggered;
sending the target test case to the user terminal, so that the user terminal displays the target test case and obtains a corrected target test case produced by the user correcting the target test case;
receiving the corrected target test case sent by the user terminal; and
verifying the corrected target test case to obtain a verification result corresponding to the corrected target test case.
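The correction round-trip of claims 6-7 — report the failure, let the user correct the case on the terminal, then re-verify — can be condensed into a loop. A minimal sketch, assuming a `validate` predicate and a `correct_on_terminal` callable that stands in for the user editing the case; the `max_rounds` cap is an added safeguard, not part of the claims.

```python
def correction_round(case, validate, correct_on_terminal, max_rounds=3):
    """Re-verify user-corrected versions of a failed case until it passes or rounds run out."""
    for _ in range(max_rounds):
        if validate(case):
            return case, True
        case = correct_on_terminal(case)   # user edits the displayed case and returns it
    return case, validate(case)
```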
8. A test case generation apparatus, comprising:
an acquisition module, configured to acquire input data for generating a target test case, wherein the input data comprises requirement-class data, interface-class data, service-class data and definition-class data corresponding to target software, and the target test case is used for automatically testing the target software;
a first determination module, configured to input the input data into a trained test case generation model and determine the target test case by using the test case generation model, wherein the test case generation model is obtained by training on historical test cases;
a verification module, configured to verify the target test case to obtain a verification result corresponding to the target test case, the verification result being either passed or failed;
a second determination module, configured to determine the case category to which the target test case belongs if the verification result is passed; and
a storage module, configured to store the case category and the target test case correspondingly in a case repository, wherein the test cases stored in the case repository are used to execute test tasks that automatically test the target software.
9. An electronic device, comprising: a processor, and a memory and a transceiver each communicatively coupled to the processor;
wherein the memory stores computer-executable instructions, and the transceiver is configured to transmit data to and receive data from a user terminal; and
the processor executes the computer-executable instructions stored in the memory to implement the test case generation method of any one of claims 1 to 7.
10. A computer-readable storage medium having computer-executable instructions stored therein, wherein the computer-executable instructions, when executed by a processor, implement the test case generation method of any one of claims 1 to 7.
CN202311061883.2A 2023-08-22 2023-08-22 Test case generation method, device, equipment and storage medium Pending CN117093484A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311061883.2A CN117093484A (en) 2023-08-22 2023-08-22 Test case generation method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311061883.2A CN117093484A (en) 2023-08-22 2023-08-22 Test case generation method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117093484A true CN117093484A (en) 2023-11-21

Family

ID=88772940

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311061883.2A Pending CN117093484A (en) 2023-08-22 2023-08-22 Test case generation method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117093484A (en)

Similar Documents

Publication Publication Date Title
CN108073519B (en) Test case generation method and device
CN108427632B (en) Automatic test method and device
CN108021505B (en) Data online method and device and computer equipment
CN111026670B (en) Test case generation method, test case generation device and storage medium
WO2006007588A2 (en) Method and system for test case generation
US10664644B1 (en) Method and apparatus for schematic verification of electronic circuits
US20210048999A1 (en) Automated generation of status chains for software updates
CN110688111A (en) Configuration method, device, server and storage medium of business process
CN114443503A (en) Test case generation method and device, computer equipment and storage medium
CN110750434A (en) Interface testing method and device, electronic equipment and computer readable storage medium
CN112395182A (en) Automatic testing method, device, equipment and computer readable storage medium
CN112052157B (en) Method, device and system for constructing test message
US20120310849A1 (en) System and method for validating design of an electronic product
CN113778878A (en) Interface testing method and device, electronic equipment and storage medium
CN112860587A (en) UI automatic test method and device
CN117093484A (en) Test case generation method, device, equipment and storage medium
CN116150020A (en) Test case conversion method and device
CN116204396A (en) Test method and device for performance of analytical database
CN112242177A (en) Memory testing method and device, computer readable storage medium and electronic equipment
CN115712571A (en) Interactive service test device, interactive service test device, computer equipment and storage medium
CN115248783A (en) Software testing method, system, readable storage medium and computer equipment
CN111679924B (en) Reliability simulation method and device for componentized software system and electronic equipment
CN111666301B (en) Service interface testing method, computer device and storage medium
CN116401177B (en) DDL correctness detection method, device and medium
CN116991706B (en) Vehicle automatic test method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination