CN117435513B - Test case generation method, terminal equipment and computer readable storage medium - Google Patents


Info

Publication number
CN117435513B
Authority
CN
China
Prior art keywords
test
test case
data
request
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311766114.2A
Other languages
Chinese (zh)
Other versions
CN117435513A (en)
Inventor
章丽
丁会燕
Current Assignee
Shenzhen Zhicheng Software Technology Service Co ltd
Shenzhen Smart City Technology Development Group Co ltd
Original Assignee
Shenzhen Zhicheng Software Technology Service Co ltd
Shenzhen Smart City Technology Development Group Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Zhicheng Software Technology Service Co ltd, Shenzhen Smart City Technology Development Group Co ltd filed Critical Shenzhen Zhicheng Software Technology Service Co ltd
Priority to CN202311766114.2A priority Critical patent/CN117435513B/en
Publication of CN117435513A publication Critical patent/CN117435513A/en
Application granted granted Critical
Publication of CN117435513B publication Critical patent/CN117435513B/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00: Error detection; Error correction; Monitoring
    • G06F11/36: Preventing errors by testing or debugging software
    • G06F11/3668: Software testing
    • G06F11/3672: Test management
    • G06F11/3684: Test management for test design, e.g. generating new test cases
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00: Error detection; Error correction; Monitoring
    • G06F11/36: Preventing errors by testing or debugging software
    • G06F11/3668: Software testing
    • G06F11/3696: Methods or tools to render software testable
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The application belongs to the technical field of software testing and discloses a test case generation method, a terminal device, and a computer-readable storage medium. The method comprises the following steps: acquiring interface request data, and cleaning the interface request data according to a preset cleaning rule to obtain interaction data corresponding to that rule; determining the corresponding parameterized script options based on the environment configuration associated with the interface request data; and generating a migratable test case from the interaction data and the parameterized script options. The method solves the problem that existing interface testing technology cannot automatically generate test cases for an interface in a specified environment or on a specified path, thereby achieving automatic test case generation and improved testing efficiency.

Description

Test case generation method, terminal equipment and computer readable storage medium
Technical Field
The present disclosure relates to the field of software testing technologies, and in particular, to a test case generating method, a terminal device, and a computer readable storage medium.
Background
Interface testing technology is relatively mature and widely applied in software development and testing across industries, especially the Internet, finance, and e-commerce. It is commonly used to verify the stability, security, and performance of systems.
During interface testing, a developer manually writes a test case containing a test message and feeds it into a test framework for execution. As system complexity and the number of interfaces grow, and because every update of the test data forces the developer to rewrite the test case, current practice cannot automatically generate a test case for an interface in a specified environment or on a specified path.
Disclosure of Invention
By providing a test case generation method, a terminal device, and a computer-readable storage medium, the embodiments of the present application solve the problem that interface testing technology cannot automatically generate a test case for an interface in a specified environment or on a specified path, thereby achieving automatic test case generation and improved testing efficiency.
The embodiment of the application provides a test case generation method, which comprises the following steps:
acquiring interface request data, and cleaning the interface request data based on a preset cleaning rule to obtain interaction data corresponding to the preset cleaning rule;
determining corresponding parameterized script options based on environment configuration corresponding to the interface request data;
and generating a migratable test case based on the interaction data and the parameterized script options.
Optionally, the step of generating the migratable test cases based on the interaction data and the parameterized script options includes:
determining the testing dimension of the interaction data, and acquiring testing data corresponding to the testing dimension;
determining parameters corresponding to the parameterized script options in the test data;
and generating a migratable test case based on the test data and the parameters.
Optionally, after the step of generating the migratable test cases based on the interaction data and the parameterized script options, the method further includes:
if a test request is received, acquiring target environment configuration of a test environment corresponding to the test request;
configuring parameter values of the parameterized script options based on the target environment configuration to configure the migratable test cases as target test cases matching the target environment configuration;
and running the target test case to respond to the test request and outputting a corresponding test result.
Optionally, the step of running the target test case to respond to the test request and outputting a corresponding test result includes:
acquiring a test field in the test request, and acquiring a reference result of the migratable test case based on the test field;
running the target test case, and acquiring an initial test result corresponding to the target test case based on the test field;
and outputting a corresponding test result based on the reference result and the initial test result.
Optionally, the step of outputting the corresponding test result based on the reference result and the initial test result includes:
acquiring a time field of the target test case, and judging whether the target test case is the latest test case or not based on the time field;
and outputting a corresponding test result based on the judgment result, the reference result and the initial test result.
Optionally, the step of outputting the corresponding test result based on the reference result and the initial test result includes:
acquiring a difference field and a value corresponding to the difference field based on the reference result and the initial test result;
and generating a difference table according to the difference field and the value corresponding to the difference field, and outputting a test result with the difference table.
Optionally, if the test request is received, the step of obtaining the target environment configuration of the test environment corresponding to the test request includes:
if a test request is received, judging whether a target test case corresponding to the test request exists in a pre-stored test case library;
if not, executing the step of acquiring the target environment configuration of the test environment corresponding to the test request, and adding the target test case into the pre-stored test case library after the target test case is obtained.
Optionally, the step of cleaning the interface request data based on a preset cleaning rule to obtain the interaction data corresponding to the preset cleaning rule includes:
determining a corresponding specified path or specified environment based on a preset cleaning rule;
and analyzing the interface request data based on the specified path or the specified environment to acquire the interaction data corresponding to the preset cleaning rule.
In addition, in order to achieve the above object, an embodiment of the present invention further provides a terminal device, including a memory, a processor, and a test case generating program stored in the memory and capable of running on the processor, where the processor implements the method as described above when executing the test case generating program.
In addition, in order to achieve the above object, an embodiment of the present invention further provides a computer-readable storage medium having stored thereon a test case generating program that, when executed by a processor, implements the method as described above.
One or more technical solutions provided in the embodiments of the present application at least have the following technical effects or advantages:
when interface request data is received, it is automatically cleaned according to a preset cleaning rule to obtain the interaction data corresponding to that rule. The environment configuration corresponding to the interface request data is acquired, and the corresponding parameterized script options are determined. A migratable test case is then generated from the interaction data and the parameterized script options. Because the parameterized script options can receive parameters for different environments, test cases for different environments can be constructed from the same case. This solves the problem that interface testing technology cannot automatically generate a test case for a specified environment or specified path, and improves interface testing efficiency.
Drawings
FIG. 1 is a schematic flow chart of a first embodiment of a test case generation method of the present application;
FIG. 2 is a schematic flow chart of a second embodiment of a test case generation method of the present application;
FIG. 3 is a schematic diagram of a test flow chart of an embodiment of a test case generation method of the present application;
fig. 4 is a schematic diagram of a terminal structure of a hardware running environment according to an embodiment of the present application.
Detailed Description
In interface testing, a test platform generally cannot automatically generate test cases after cleaning the test data, and the generated test cases cannot be applied to a specified environment or a specified path, so testing efficiency is low. To solve this problem, the present application provides a test case generation method: after interface request data is acquired, it is automatically cleaned based on a preset cleaning rule to obtain interaction data; parameterized script options are determined from the environment configuration information; and a migratable test case is generated from the interaction data and the parameterized script options. In actual use, different environment configurations can be selected by passing different parameter values into the parameterized script options, so that the same test case can be applied to different environments or different paths. A test case for a specified environment or path therefore does not need to be regenerated for every interface test, which improves interface testing efficiency.
In order to better understand the above technical solution, exemplary embodiments of the present application will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present application are shown in the drawings, it should be understood that the present application may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
In order to better understand the above technical solutions, the following detailed description will refer to the accompanying drawings and specific embodiments.
Example 1
In this embodiment, a test case generating method is provided.
Referring to fig. 1, the test case generation method of the present embodiment includes the steps of:
step S100: acquiring interface request data, and cleaning the interface request data based on a preset cleaning rule to obtain interaction data corresponding to the preset cleaning rule;
in this embodiment, interface request data refers to data sent to an interface to trigger a specific behavior or operation of that interface; it includes the request path, request parameters, request method, preset response parameters, and so on. Interaction data refers to the request data sent to the interface, including information such as the request method, URL, request headers, and request body.
As an optional implementation, a capture tool may capture the interface request information, store it in an information file, and then automatically parse the data in that file to obtain the interface request data. Storing the request information in a file makes it convenient to inspect and reuse the request data at any time. When a test requirement needs the cooperation of multiple parties, the information files can also be shared, improving working and collaboration efficiency.
The request information may be captured by different tools. With the API testing tool Postman, for example, a request can be sent, selected from the request history, and exported as a JSON (or other format) information file. The HTTP capture tool Fiddler can record the details of HTTP requests and responses, including request headers, request body, response headers, and response body, and save them to an SAZ (or other format) file. The HTTP proxy tool Charles can capture the details of HTTP and HTTPS requests and save them as a CHLS (or other format) file. After the request information is stored, a preset program automatically parses the information file to obtain the interface request data used for subsequent data cleaning, test case generation, or other operations, which improves working efficiency and reduces manual operation and errors.
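As a minimal sketch of this parsing step, the snippet below walks a tiny JSON information file of the kind such tools can export. The field names (`requests`, `method`, `url`, and so on) and the example URL are illustrative assumptions, not any tool's actual export schema.

```python
import json

# Hypothetical excerpt of an exported information file; real Postman/Fiddler
# exports are richer and use their own field names.
exported = """
{
  "requests": [
    {"method": "POST",
     "url": "https://dev.example.com/api/order",
     "headers": {"Content-Type": "application/json"},
     "body": {"item_id": 42, "count": 1}}
  ]
}
"""

def parse_requests(raw: str) -> list[dict]:
    """Parse the saved information file into interface request records."""
    doc = json.loads(raw)
    records = []
    for req in doc.get("requests", []):
        records.append({
            "method": req["method"],
            "url": req["url"],
            "headers": req.get("headers", {}),
            "body": req.get("body"),
        })
    return records

records = parse_requests(exported)
print(records[0]["method"], records[0]["url"])
```

Each record can then be fed into the cleaning and case-generation steps described below.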
As another optional implementation, the data may be cleaned directly in the information file. A corresponding specified path or specified environment is determined based on the preset cleaning rule, and the interface request data is parsed according to that path or environment to obtain the interaction data corresponding to the rule. The interaction data includes request data and preset response data; after the test case is executed, the actual response data can be compared with the preset response data.
For example, when cleaning interface request data, a configuration file is first created to store the cleaning rules under different paths or environments; a JSON file or INI file may be used. Different rules can be defined as needed, covering the request method, URL, request headers, response status codes, response headers, and so on. In the test code, the corresponding cleaning rule is read according to the specified path or environment, and the interface request data is parsed to obtain the corresponding interaction data. Assuming the configuration file defines multiple sections, each corresponding to the cleaning rule for one path and environment, the section name is constructed from the specified path or environment and the matching rule is read from the configuration file.
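A sketch of the INI variant, using Python's standard `configparser`: the section-naming scheme (`<environment>.<path>`) and the rule keys are illustrative assumptions, not the patent's concrete format.

```python
import configparser

# Cleaning rules keyed by "<environment>.<path>" sections (assumed scheme).
RULES_INI = """
[dev./api/order]
methods = POST,PUT
keep_headers = Content-Type,Authorization
status_codes = 200,201

[prod./api/order]
methods = POST
keep_headers = Content-Type
status_codes = 200
"""

def load_rule(env: str, path: str) -> dict:
    """Read the cleaning rule for one environment and path from the config."""
    parser = configparser.ConfigParser()
    parser.read_string(RULES_INI)
    section = f"{env}.{path}"          # build the section name from env + path
    raw = parser[section]
    return {key: raw[key].split(",") for key in raw}

rule = load_rule("dev", "/api/order")
print(rule["methods"])  # ['POST', 'PUT']
```

Requests whose method, headers, or status codes fall outside the loaded rule would then be filtered out during cleaning.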
Step S200: determining corresponding parameterized script options based on environment configuration corresponding to the interface request data;
in this embodiment, interface testing involves a development environment, a test environment, a pre-production environment, a production environment, and so on. Each environment is built on a server with a corresponding IP address, so environment configuration can be completed from the IP address and a specified path. A parameterized script option is a parameter item set in the test case, where the parameter has a mapping relationship to a server. When the test case needs to run in different environments, the parameters of the corresponding environment can be passed directly into the parameter item, so one test case can be applied in different test environments.
As an optional implementation, after data cleaning produces the interaction data, the environment configuration information is received and passed as parameters into the parameterized script options. The environment configuration information may include various parameters and settings related to the test environment, such as the target web address under test, database connection information, authentication credentials, the paths of files needed during testing, and the addresses and ports of proxy servers. Setting parameterized script options ensures the portability and flexibility of the cases and makes test work more efficient and reliable.
Illustratively, the server address is passed as a parameter, with different parameter values set in different environments. When the test case is applied to a different environment, only the parameter values need to change; the script file itself is not modified. The test case then targets the configured server and executes the corresponding operations, whether in the development, test, pre-production, or production environment. For example, if a test case is generated in the development environment, the parameter item holds the development server's address; to apply the case in production, the production server's address is obtained and passed in as the parameterized script option instead.
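The idea can be sketched as follows; the environment names and server addresses are made-up placeholders, and the case structure is an assumption for illustration.

```python
# Environment configurations mapped to server addresses (placeholder values).
ENVIRONMENTS = {
    "dev":  {"server": "https://dev.example.com"},
    "prod": {"server": "https://prod.example.com"},
}

# The migratable part of the case: everything except the environment.
CASE = {"path": "/api/order", "method": "POST"}

def build_url(case: dict, env: str) -> str:
    """Fill the parameterized server option for the chosen environment."""
    return ENVIRONMENTS[env]["server"] + case["path"]

print(build_url(CASE, "dev"))   # https://dev.example.com/api/order
print(build_url(CASE, "prod"))  # https://prod.example.com/api/order
```

Switching environments changes only the parameter value looked up in `ENVIRONMENTS`; the case definition itself stays untouched, which is what makes it migratable.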
As another optional implementation, at least one corresponding environment may be determined from the interface request data. When multiple environments must be configured at once, they can be configured simultaneously by means of multithreaded testing or containerized testing. The interface request data is analyzed to learn the interface's function, requirements, and dependency on the environment; the environment to be matched is determined accordingly, and its address is used as the value of the parameterized script option.
For example, when multiple environments need to be tested at the same time, multiple threads can be created in one test script, each responsible for testing one environment and running independently, with the server address of each environment passed into that thread's parameterized script options. This improves the efficiency and quality of the test.
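A minimal sketch of the multithreaded variant with the standard `threading` module; `run_case` is a stand-in for actually executing the parameterized case against a server, and the addresses are placeholders.

```python
import threading

# Placeholder server addresses, one per environment.
ENV_SERVERS = {
    "dev": "https://dev.example.com",
    "test": "https://test.example.com",
    "prod": "https://prod.example.com",
}

results = {}
lock = threading.Lock()

def run_case(env: str, server: str) -> None:
    # Stand-in for sending the parameterized case to `server`.
    outcome = f"ran /api/order against {server}"
    with lock:                       # results dict is shared across threads
        results[env] = outcome

threads = [threading.Thread(target=run_case, args=(e, s))
           for e, s in ENV_SERVERS.items()]
for t in threads:
    t.start()
for t in threads:
    t.join()                         # wait for every environment to finish

print(sorted(results))  # ['dev', 'prod', 'test']
```

Each thread receives its own server address as the parameterized option, so the same case body runs in all environments concurrently.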
Step S300: and generating a migratable test case based on the interaction data and the parameterized script options.
In this embodiment, the migratable test case is a general-purpose test case that can be applied in different environments.
As an optional implementation, when generating the migratable test case, the test dimension of the interaction data is determined first, the test data corresponding to that dimension is acquired, and the parameters corresponding to the parameterized script options are determined within the test data. The migratable test case is then generated from the test data and the parameters. That is, different test data are prepared for different test dimensions and combined with the parameters of the parameterized script options in different environments to generate a migratable test case.
For example, when testing the interaction data, different test data must be prepared for different test dimensions. For the functional test dimension, test data covering normal functions, exception handling, boundary conditions, and so on is prepared to ensure the system works correctly in all situations. For the performance test dimension, test data under different load conditions is prepared to measure performance under varying loads. For the security test dimension, test data covering authentication, data encryption, access control, and so on is prepared to probe the system's security and protection capabilities. Preparing a suitable test data set ensures the comprehensiveness and accuracy of the test and improves its efficiency and quality.
As another optional implementation, when generating test cases, cases for boundary-condition testing and abnormal-situation testing may be generated to probe the system's handling of, and fault tolerance toward, abnormal and special situations.
For example, the migratable test case can be generated from each boundary value of the interaction data to test how the same boundary value behaves in different environments, such as maximum values, minimum values, and null values. Abnormal data can also be added to the interaction data. For a shopping-cart function, for instance, test cases can cover adding more items than are in stock or entering an invalid coupon code, and the test cases can be adjusted according to the execution results.
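Boundary-value generation can be sketched as below; the field name, range, and `expect` labels are illustrative assumptions.

```python
def boundary_cases(field: str, minimum: int, maximum: int) -> list[dict]:
    """Generate boundary-value inputs for one numeric field (illustrative)."""
    return [
        {field: minimum - 1, "expect": "reject"},  # just below the lower bound
        {field: minimum,     "expect": "accept"},  # lower bound itself
        {field: maximum,     "expect": "accept"},  # upper bound itself
        {field: maximum + 1, "expect": "reject"},  # just above, e.g. over stock
        {field: None,        "expect": "reject"},  # null value
    ]

# e.g. a shopping-cart quantity allowed to range from 1 to 99
cases = boundary_cases("quantity", 1, 99)
print(len(cases))  # 5
```

Running the same generated set in every configured environment checks that each boundary value is handled identically across environments.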
In this embodiment, after the interface request is obtained, the interface request data is parsed and cleaned according to the preset cleaning rule to obtain the interaction data. The environment configuration of the environment to be tested is then derived from the received interface request data, the parameterized script options are determined from that configuration, and the migratable test case is generated from the interaction data and the parameterized script options. By passing different parameters into the parameterized script options, the same test case can be applied to different environments, i.e. it becomes a migratable test case. The test case no longer has to be rebuilt in each test environment, which improves testing efficiency.
Example two
Based on the first embodiment, another embodiment of the present application is presented, referring to fig. 2, after the step of generating a migratable test case based on the interaction data and the parameterized script options, the method includes the following steps:
step S400: if a test request is received, acquiring target environment configuration of a test environment corresponding to the test request;
step S500: configuring parameter values of the parameterized script options based on the target environment configuration to configure the migratable test cases as target test cases matching the target environment configuration;
step S600: and running the target test case to respond to the test request and outputting a corresponding test result.
In this embodiment, after the migratable test case is generated, if a test request is received, the target environment is determined from the request. The corresponding parameters are then passed into the parameterized script options according to the target environment, completing the target environment configuration of the migratable test case.
As an optional implementation, after a test request is received, it is first checked whether a target test case corresponding to the request exists in a pre-stored test case library. If it does, the normal environment-configuration and case-execution steps proceed. If it does not, the step of acquiring the target environment configuration of the test environment corresponding to the test request is executed, and once the target test case is obtained, it is added to the pre-stored test case library. Continuously refining and updating the library in this way improves the quality and coverage of the test cases, better supporting system testing and deployment.
For example, after the test request is received, key information in it, such as the functional module, test type, and parameter values, can be used to decide whether a matching target test case exists in the pre-stored library. If not, the test environment corresponding to the request is determined from the environment identifier or configuration information carried in the request, and the target environment's configuration is acquired, including server addresses, database configuration, and network settings. A test case suited to the target environment is then generated from the predefined parameterized script options and interaction data combined with that configuration, and the generated target test case is added to the library for subsequent execution and management.
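A sketch of the library lookup, assuming the library is keyed on (functional module, test type); the key choice and the `build` callback are illustrative assumptions.

```python
# In-memory test case library keyed by (module, test type) - assumed scheme.
case_library: dict[tuple, dict] = {}

def get_or_create(request: dict, build) -> dict:
    """Reuse a stored target case, or build one and add it to the library."""
    key = (request["module"], request["test_type"])
    if key in case_library:          # case already exists: reuse it
        return case_library[key]
    case = build(request)            # otherwise generate it from the request
    case_library[key] = case         # and store it for later requests
    return case

built = []
def build(req):
    built.append(req)                # record how often generation runs
    return {"case_for": req["module"]}

req = {"module": "order", "test_type": "functional"}
get_or_create(req, build)
get_or_create(req, build)            # second identical request hits the library
print(len(built))  # 1
```

The second identical request skips generation entirely, which is the efficiency gain the embodiment describes.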
As another optional implementation, after the target test case is generated for a new test request, the test field in the request is acquired, and the reference result of the migratable test case is obtained based on that field. The target test case is then run, an initial test result is generated based on the test field, and the reference result is compared with the initial test result to output the final test result.
Illustratively, the target test case contains the test field corresponding to the test request as well as default test fields. Default fields do not need to be matched, so when the test case is executed only the test field corresponding to the request needs to be obtained. The reference result for that field, i.e. the expected result, is acquired first; the target test case is then run and the generated initial test result is compared with the expected result. Any differing values, together with the fields they belong to, are collected into a difference table, which is output so that testers can review the test result.
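The difference table can be sketched as a field-by-field comparison; the field names and values below are illustrative assumptions.

```python
def diff_table(reference: dict, actual: dict) -> list[dict]:
    """List every field whose value differs between expected and actual."""
    rows = []
    for field in sorted(set(reference) | set(actual)):
        ref, act = reference.get(field), actual.get(field)
        if ref != act:
            rows.append({"field": field, "expected": ref, "actual": act})
    return rows

# Hypothetical expected vs. actual responses for one test field.
reference = {"status": 200, "total": 100, "currency": "CNY"}
actual    = {"status": 200, "total": 90,  "currency": "CNY"}

print(diff_table(reference, actual))
# [{'field': 'total', 'expected': 100, 'actual': 90}]
```

An empty table means the initial result matched the reference; a non-empty one is output directly as the failing test fields.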
As a further optional implementation, after the initial test result is compared with the reference result, if no difference fields or difference values exist, the time field in the target test case is acquired and used to judge whether the case is the latest test case. If it is, the test is finished and the initial test result is the test result; if it is not, the test has not truly completed, and the value of the target test case's time field is output. Because the execution result of a test case may change over time, an outdated result may no longer reflect the current system state and cannot serve as a valid reference.
For example, the execution result of a test case may be affected by external factors such as network conditions or hardware state. An outdated result no longer reflects the current network conditions or hardware state and cannot serve as a valid reference, whereas an up-to-date result reflects them better. Judging whether the test result is the latest via the time field therefore ensures the validity and accuracy of the test result.
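A minimal sketch of the freshness check, assuming the comparison is "the case ran at or after the cleaned request's time" (the patent does not fix the exact rule):

```python
from datetime import datetime, timedelta

def is_latest(case_time: datetime, request_time: datetime) -> bool:
    """A case counts as current only if it ran at or after the cleaned request."""
    return case_time >= request_time

now = datetime(2024, 1, 1, 12, 0)    # illustrative fixed timestamps

# Case executed after the request data was cleaned: result is usable.
assert is_latest(now, now - timedelta(hours=1))
# Case executed two days before the request: stale, output its time field.
assert not is_latest(now - timedelta(days=2), now)
print("freshness check ok")
```

When the check fails, the flow in fig. 3 outputs the case's time field instead of a pass verdict, signalling that the case must be re-run on current data.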
As yet another optional implementation, referring to fig. 3, after the test request is received, the request data is automatically filtered and cleaned, and the cleaned request data is migrated so that it can be invoked at any time. After the environment configuration information is received, the environment is configured and the test case corresponding to the test request is generated. Before the test case is executed, its custom fields are removed to reduce the amount of data to process; custom fields are the parts unique to a test case and therefore do not need comparison.
Illustratively, all requests are captured first, the request data is automatically filtered and cleaned, and the useful data is acquired before data migration begins. During data migration, the cleaned request data is exported; the request data includes data such as the interface path, the input parameters, and the output parameters, test cases can be generated from it, and it is stored in a data file, namely an Excel file, for subsequent invocation. After data migration is completed, environment configuration, namely test environment configuration, is performed: one set of data test cases can be executed in different environments, so environment configuration is required for the data test cases. After the environment configuration is completed, test case execution begins, and the custom fields in the test case need to be removed, that is, the comparison of custom fields is skipped, so that the amount of data operated on is reduced. After the test case is executed, the output parameter corresponding to the current test request is called from the data file and compared with the execution result of the test case. If the execution result is inconsistent with the output parameter, the differing test fields are output directly, indicating that the test is not passed. If the execution result is consistent with the output parameter, it must also be determined whether the data used by the current test case is the latest data; if so, the test is passed, and if not, the test is not passed and the time field in the current test case is output. When judging whether the data used by the test case is the latest data, the execution time of the test case can be compared with the request time in the cleaned request data.
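The capture-clean-migrate-compare flow above can be sketched as follows. The field names, the path-prefix cleaning predicate, the CSV data file (standing in for the Excel file), and the excluded custom field are all assumptions for illustration, not the patent's actual implementation:

```python
import csv

def clean_requests(raw_requests, path_prefix="/api/"):
    # Cleaning step: filter the captured requests down to the useful ones
    # (here, an assumed rule that keeps only requests under an interface
    # path prefix).
    return [r for r in raw_requests if r["interface_path"].startswith(path_prefix)]

def migrate(requests, data_file="requests.csv"):
    # Data migration step: export the cleaned request data (interface path,
    # input parameters, output parameters, request time) to a data file so
    # it can be invoked at any time.
    fieldnames = ["interface_path", "params_in", "params_out", "request_time"]
    with open(data_file, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(requests)

def compare(execution_result: dict, params_out: dict, custom_fields=("case_id",)):
    # Comparison step: skip the custom fields to reduce the data operated on,
    # then report every field whose execution value differs from the stored
    # output parameter.
    return {k: (execution_result.get(k), v)
            for k, v in params_out.items()
            if k not in custom_fields and execution_result.get(k) != v}
```

An empty dictionary from `compare` means no difference field exists, so the flow would proceed to the freshness check; a non-empty one lists the differing test fields to output.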
In this embodiment, after a new test request is received, if a test case corresponding to the test request does not exist in the pre-stored test case library, a subsequent test operation may be executed after a target test case is generated, and the target test case is stored in the pre-stored test case library, so that when the same test request is received again, the test can be completed quickly, and the test efficiency is improved.
Example III
In an embodiment of the present application, a test case generating device is provided.
Referring to fig. 4, fig. 4 is a schematic diagram of a terminal structure of a hardware running environment according to an embodiment of the present application.
As shown in fig. 4, the control terminal may include: a processor 1001 such as a CPU, a network interface 1003, a memory 1004, and a communication bus 1002. The communication bus 1002 is used to enable connected communication between these components. The network interface 1003 may optionally include a standard wired interface or a wireless interface (e.g., a WI-FI interface). The memory 1004 may be a high-speed RAM or a non-volatile memory, such as a disk memory. The memory 1004 may also optionally be a storage device separate from the processor 1001 described above.
It will be appreciated by those skilled in the art that the terminal structure shown in fig. 4 is not limiting of the terminal and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components.
As shown in fig. 4, an operating system, a network communication module, and a test case generation program may be included in the memory 1004, which is one type of computer storage medium.
In the test case generating device hardware structure shown in fig. 4, the processor 1001 may call the test case generating program stored in the memory 1004 and perform the following operations:
acquiring interface request data, and cleaning the interface request data based on a preset cleaning rule to obtain interaction data corresponding to the preset cleaning rule;
determining corresponding parameterized script options based on environment configuration corresponding to the interface request data;
and generating a migratable test case based on the interaction data and the parameterized script options.
Optionally, the processor 1001 may call the test case generation program stored in the memory 1004, and further perform the following operations:
determining the testing dimension of the interaction data, and acquiring testing data corresponding to the testing dimension;
determining parameters corresponding to the parameterized script options in the test data;
and generating a migratable test case based on the test data and the parameters.
Optionally, the processor 1001 may call the test case generation program stored in the memory 1004, and further perform the following operations:
if a test request is received, acquiring target environment configuration of a test environment corresponding to the test request;
configuring parameter values of the parameterized script options based on the target environment configuration to configure the migratable test cases as target test cases matching the target environment configuration;
and running the target test case to respond to the test request and outputting a corresponding test result.
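One way to read the parameterized-script steps above is sketched below: the migratable test case keeps a placeholder for the server address, and the target environment configuration fills it in, so one set of cases runs in different environments. The environment names, URLs, and dictionary layout are hypothetical:

```python
ENV_CONFIG = {
    # Hypothetical per-environment parameter values; in practice these would
    # come from the received target environment configuration.
    "dev": {"server_address": "http://dev.example.local"},
    "prod": {"server_address": "http://prod.example.local"},
}

def make_migratable_case(interaction: dict) -> dict:
    # The migratable test case carries a placeholder instead of a concrete
    # server address, so it is not tied to any single environment.
    return {"path": interaction["interface_path"],
            "params_in": interaction["params_in"],
            "server_address": "{server_address}"}

def configure_for_env(case: dict, env_name: str) -> dict:
    # Configure the parameter value of the parameterized script option from
    # the target environment configuration, turning the migratable case into
    # a target test case matching that environment.
    target = dict(case)
    target["server_address"] = ENV_CONFIG[env_name]["server_address"]
    return target
```

Because `configure_for_env` copies the case before filling in the value, the same migratable case can be configured for several environments in turn, or handed to one thread per environment.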
Optionally, the processor 1001 may call the test case generation program stored in the memory 1004, and further perform the following operations:
acquiring a test field in the test request, and acquiring a reference result of the migratable test case based on the test field;
operating the target test case, and acquiring an initial test result corresponding to the target test case based on the test field;
and outputting a corresponding test result based on the reference result and the initial test result.
Optionally, the processor 1001 may call the test case generation program stored in the memory 1004, and further perform the following operations:
acquiring a time field of the target test case, and judging whether the target test case is the latest test case or not based on the time field;
and outputting a corresponding test result based on the judgment result, the reference result and the initial test result.
Optionally, the processor 1001 may call the test case generation program stored in the memory 1004, and further perform the following operations:
acquiring a difference field and a value corresponding to the difference field based on the reference result and the initial test result;
and generating a difference table according to the difference field and the value corresponding to the difference field, and outputting a test result with the difference table.
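The difference-table step might look like the following sketch; the column names and plain-text layout are assumptions for illustration:

```python
def diff_table(reference: dict, initial: dict):
    # Collect each difference field together with the value it holds in the
    # reference result and in the initial test result.
    rows = [(field, reference[field], initial.get(field))
            for field in reference
            if initial.get(field) != reference[field]]
    # Render a small plain-text difference table for the output test result.
    lines = ["field | reference | initial"]
    for field, ref_val, init_val in rows:
        lines.append(f"{field} | {ref_val} | {init_val}")
    return rows, "\n".join(lines)
```

When `rows` is empty the results agree and no difference table needs to accompany the test result.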
Optionally, the processor 1001 may call the test case generation program stored in the memory 1004, and further perform the following operations:
if a test request is received, judging whether a target test case corresponding to the test request exists in a pre-stored test case library;
if not, executing the step of acquiring the target environment configuration of the test environment corresponding to the test request, and adding the target test case into the pre-stored test case library after the target test case is obtained.
Optionally, the processor 1001 may call the test case generation program stored in the memory 1004, and further perform the following operations:
determining a corresponding specified path or specified environment based on a preset cleaning rule;
and analyzing the interface request data based on the specified path or the specified environment to acquire the interaction data corresponding to the preset cleaning rule.
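A hedged sketch of such a cleaning rule, where either a specified path or a specified environment selects the interaction data from the interface request data (the `path` and `env` field names are assumed):

```python
def apply_cleaning_rule(interface_requests, specified_path=None, specified_env=None):
    # Keep a request when it matches the specified path, or when it was
    # captured in the specified environment; both criteria are optional,
    # mirroring the "specified path or specified environment" rule.
    def matches(req):
        if specified_path is not None and req.get("path") == specified_path:
            return True
        if specified_env is not None and req.get("env") == specified_env:
            return True
        return False
    return [r for r in interface_requests if matches(r)]
```

With neither criterion supplied the rule selects nothing, so a preset cleaning rule would always name at least one of the two.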
In addition, in order to achieve the above object, an embodiment of the present invention further provides a terminal device, including a memory, a processor, and a test case generating program stored in the memory and capable of running on the processor, where the processor implements the test case generating method as described above when executing the test case generating program.
In addition, in order to achieve the above object, an embodiment of the present invention provides a computer-readable storage medium having stored thereon a test case generation program that, when executed by a processor, implements the test case generation method described above.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be noted that in the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names.
While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present application without departing from the spirit or scope of the invention. Thus, if such modifications and variations of the present application fall within the scope of the claims and the equivalents thereof, the present application is intended to cover such modifications and variations.

Claims (5)

1. The test case generation method is characterized by comprising the following steps of:
capturing interface request information, and storing the interface request information in an information file;
analyzing the data information in the information file to obtain interface request data, and determining a corresponding specified path or specified environment based on a preset cleaning rule;
analyzing the interface request data based on the specified path or the specified environment to obtain interaction data corresponding to the preset cleaning rule;
determining corresponding parameterized script options based on environment configuration corresponding to the interface request data, wherein environment configuration information is received and transferred as a parameter to the parameterized script options, a server address is transferred as a parameter, and different parameter values are set for different environments;
based on the interaction data and the parameterized script options, a migratable test case is generated, wherein,
determining the testing dimension of the interaction data, and acquiring testing data corresponding to the testing dimension;
determining parameters corresponding to the parameterized script options in the test data;
generating the migratable test case based on the test data and the parameters;
if a test request is received, acquiring target environment configuration of a test environment corresponding to the test request;
if a test request is received, judging whether a target test case corresponding to the test request exists in a pre-stored test case library;
if not, executing the step of acquiring the target environment configuration of the test environment corresponding to the test request, and adding the target test case into the pre-stored test case library after the target test case is obtained;
configuring parameter values of the parameterized script options based on the target environment configuration to configure the migratable test cases as target test cases matching the target environment configuration;
acquiring a test field in the test request, and acquiring a reference result of the migratable test case based on the test field;
operating the target test case, and acquiring an initial test result corresponding to the target test case based on the test field;
outputting a corresponding test result based on the reference result and the initial test result;
when multiple environments need to be tested simultaneously, multiple threads are created in one test script in a multithreaded test mode, each thread is responsible for testing one environment and runs independently, and the server addresses of the different environments are transferred to the parameterized script options of the respective threads.
2. The test case generating method as claimed in claim 1, wherein the step of outputting the corresponding test result based on the reference result and the initial test result comprises:
acquiring a time field of the target test case, and judging whether the target test case is the latest test case or not based on the time field;
and outputting a corresponding test result based on the judgment result, the reference result and the initial test result.
3. The test case generating method as claimed in claim 1, wherein the step of outputting the corresponding test result based on the reference result and the initial test result comprises:
acquiring a difference field and a value corresponding to the difference field based on the reference result and the initial test result;
and generating a difference table according to the difference field and the value corresponding to the difference field, and outputting a test result with the difference table.
4. A terminal device comprising a memory, a processor and a test case creation program stored on the memory and executable on the processor, the processor implementing the method of any of claims 1-3 when executing the test case creation program.
5. A computer readable storage medium, wherein a test case creation program is stored on the computer readable storage medium, which when executed by a processor, implements the method of any of claims 1-3.
Publications (2)

Publication Number Publication Date
CN117435513A CN117435513A (en) 2024-01-23
CN117435513B (en) 2024-04-02


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108984405A (en) * 2018-07-12 2018-12-11 郑州云海信息技术有限公司 A kind of performance test methods, device and computer readable storage medium
CN110232016A (en) * 2019-03-08 2019-09-13 上海蔚来汽车有限公司 Interface testing case generation method, device and controller and medium
CN113377661A (en) * 2021-06-23 2021-09-10 深圳平安智汇企业信息管理有限公司 Interface testing method and device, electronic equipment and storage medium
CN114116522A (en) * 2022-01-27 2022-03-01 四川野马科技有限公司 Swagger-based method for automatically testing interface
CN116028377A (en) * 2023-03-28 2023-04-28 之江实验室 Automatic test method, device and medium based on jenkins

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IN2013CH04617A (en) * 2013-10-14 2015-04-24 Cognizant Technology Solutions India Pvt Ltd
US9424290B2 (en) * 2014-03-11 2016-08-23 Wipro Limited System and method for data validation


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
面向云测试的并行测试用例自动生成方法;刘晓强等;计算机应用;20150410(第04期);全文 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant