CN112148610A - Test case execution method and device, computer equipment and storage medium

Info

Publication number
CN112148610A
Authority
CN
China
Prior art keywords
test case
execution
executed
test
data
Prior art date
Legal status
Pending
Application number
CN202011038044.5A
Other languages
Chinese (zh)
Inventor
刘芳
Current Assignee
OneConnect Smart Technology Co Ltd
OneConnect Financial Technology Co Ltd Shanghai
Original Assignee
OneConnect Financial Technology Co Ltd Shanghai
Priority date
Filing date
Publication date
Application filed by OneConnect Financial Technology Co Ltd Shanghai
Priority to CN202011038044.5A
Publication of CN112148610A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3688 Test management for test execution, e.g. scheduling of test suites
    • G06F11/3684 Test management for test design, e.g. generating new test cases

Abstract

The present application relates to the technical field of test management, and provides a test case execution method, a test case execution apparatus, a computer device and a storage medium. The method comprises the following steps: obtaining test script file data to be tested; analyzing the test script file data to obtain test case data; classifying and pushing the test case data according to script type; and, when a test case execution request carrying test case execution task request parameters is received, obtaining the test cases to be executed according to the test case execution task request parameters and calling the preset execution engine corresponding to the type of the test cases to be executed to execute them, so as to obtain a test case execution result. The method solves the problem of executing different types of test cases, is compatible with the execution of test scripts from different test frameworks, and improves test case execution efficiency.

Description

Test case execution method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of test management technologies, and in particular, to a test case execution method, apparatus, computer device, and storage medium.
Background
Testing is an important stage of project research and development and is key to ensuring product quality; the core tasks of testing work are the development, management and execution of test cases.
In order to save test case execution time, schemes that execute test cases in batches using a test suite or a test framework are increasingly widely applied, for example Python's unittest, Java's TestNG and JUnit, script files for the JMeter tool, gurit for C++, and so on.
However, script files written for different frameworks can only be executed in a language environment that supports them. If script files from multiple frameworks need to be executed, a tester has to place each test script file into a language environment that supports it and execute it there. This process is repetitive and tedious and consumes labor and time, so test case execution efficiency is low.
Disclosure of Invention
In view of the foregoing, it is desirable to provide an efficient test case execution method, apparatus, computer device and storage medium for solving the above technical problems.
A test case execution method, the method comprising:
acquiring test script file data to be tested;
analyzing the test script file data to obtain test case data;
classifying and pushing the test case data according to the script type;
when a test case execution request carrying the test case execution task request parameter is received, the test case to be executed is obtained according to the test case execution task request parameter, a preset execution engine corresponding to the type of the test case to be executed is called to execute the test case to be executed, and a test case execution result is obtained.
In one embodiment, the test case execution task request parameters include a test case execution task identifier, a set of test case identifiers, a file name, a file path, and a script type.
In one embodiment, obtaining the test case to be executed according to the test case execution task parameter includes:
searching out corresponding test case file data to be executed according to the file name and the file path;
extracting a test case identifier in the file data of the test case to be executed according to a preset key field;
and when the test case identification exists in the test case identification set, acquiring the test case to be executed according to the test case identification.
In one embodiment, according to the test case execution task request parameter, invoking a preset execution engine to execute the test case to be executed, and obtaining the test case execution result includes:
transmitting the test case to be executed to a corresponding preset execution engine according to the script type;
and calling a preset execution engine to execute the test case to be executed according to a preset execution command to obtain a test case execution result.
In one embodiment, after obtaining the test case to be executed and calling the preset execution engine corresponding to the type of the test case to be executed to execute the test case to be executed and obtain the execution result of the test case, the method further includes:
and sending a data write-back instruction to the preset execution engine, wherein the data write-back instruction is used for enabling the preset execution engine to call a preset callback function to write back data, and updating the execution state of the test case to be executed.
In one embodiment, after sending the data write-back instruction to the predetermined execution engine, the method further includes:
monitoring data write back of a preset execution engine;
if the write-back record of the data of the preset execution engine is not monitored after the preset time, recording abnormal information of the preset execution engine.
In one embodiment, after obtaining the test case to be executed and calling the preset execution engine corresponding to the type of the test case to be executed to execute the test case to be executed and obtain the execution result of the test case, the method further includes:
and generating a test case execution report according to the test case execution result.
A test case execution apparatus, the apparatus comprising:
the data acquisition module is used for acquiring test script file data to be tested;
the data analysis module is used for analyzing the test script file data to obtain test case data;
the test case classification module is used for classifying and pushing test case data according to the script type;
and the test case execution module is used for acquiring the test case to be executed according to the test case execution task request parameter and calling a preset execution engine corresponding to the type of the test case to be executed to execute the test case to be executed to obtain a test case execution result when receiving the test case execution request carrying the test case execution task request parameter.
A computer device comprising a memory and a processor, the memory storing a computer program, the processor implementing the following steps when executing the computer program:
acquiring test script file data to be tested;
analyzing the test script file data to obtain test case data;
classifying and pushing the test case data according to the script type;
when a test case execution request carrying the test case execution task request parameter is received, the test case to be executed is obtained according to the test case execution task request parameter, a preset execution engine corresponding to the type of the test case to be executed is called to execute the test case to be executed, and a test case execution result is obtained.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
acquiring test script file data to be tested;
analyzing the test script file data to obtain test case data;
classifying and pushing the test case data according to the script type;
when a test case execution request carrying the test case execution task request parameter is received, the test case to be executed is obtained according to the test case execution task request parameter, a preset execution engine corresponding to the type of the test case to be executed is called to execute the test case to be executed, and a test case execution result is obtained.
According to the above test case execution method, test case execution apparatus, computer device and storage medium, test script file data to be tested is obtained and analyzed to obtain test case data, the test case data is classified and pushed according to script type, and, when a test case execution request carrying test case execution task request parameters is received, the test cases to be executed are obtained according to the test case execution task request parameters and the preset execution engine corresponding to the test cases to be executed is called to execute them, so as to obtain a test case execution result. In this process, test case data of the same script type is grouped into one class and pushed. When a test case execution request carrying test case execution task request parameters is received, the preset execution engine corresponding to the type of the test cases to be executed is called, according to the test case execution task request parameters, to execute them and obtain the test case execution result. This solves the problem of executing different types of test cases (test cases from different test frameworks), makes the execution of test scripts from different test frameworks compatible, and improves test case execution efficiency.
Drawings
FIG. 1 is a diagram of an application environment for a test case execution method in one embodiment;
FIG. 2 is a flowchart illustrating a method for test case execution according to an embodiment;
FIG. 3 is a flowchart illustrating steps of obtaining test cases to be executed in one embodiment;
FIG. 4 is a detailed flowchart of a test case execution method according to another embodiment;
FIG. 5 is a block diagram showing the structure of a test case execution apparatus according to an embodiment;
FIG. 6 is a block diagram of a test case execution apparatus according to another embodiment;
FIG. 7 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The test case execution method provided by the present application can be applied in the application environment shown in fig. 1, in which the terminal 102 communicates with the server 104 via a network. Specifically, a user uploads test script file data containing test case data to the server 104 through the terminal 102, and then sends a test case data processing request to the server 104 through the terminal 102. In response to the request, the server 104 obtains the test script file data to be tested, analyzes it to obtain test case data, and classifies and pushes the test case data according to script type. When a test case execution request carrying test case execution task request parameters is received, the server 104 obtains the test cases to be executed according to the test case execution task request parameters and calls the preset execution engine corresponding to the test cases to be executed to execute them, so as to obtain a test case execution result. The terminal 102 may be, but is not limited to, a personal computer, a notebook computer, a smartphone, a tablet computer or a portable wearable device, and the server 104 may be implemented as an independent server or as a server cluster composed of multiple servers.
In one embodiment, as shown in fig. 2, a test case execution method is provided, which is described by taking the application of the method to the server in fig. 1 as an example, and includes the following steps:
Step 202: obtain test script file data to be tested.
In this embodiment, the method runs on a server on which a universal test framework platform, independently developed by the applicant's team, is deployed. Specifically, the universal test framework platform uses a front-end/back-end separated architecture: the front end adopts a layui + js + html technology stack and the back end adopts the Spring Boot framework. In practical application, a tester may open a local folder through the entry on the front-end page, select the folder in which the test script data to be executed is located, and then import the test script files. When a test case data processing request is received, the imported test script file data is obtained; it comprises all files of the folder and of its subfolders, together with the test script data, file name data and so on contained in those files.
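A minimal Java sketch of collecting all files under the imported folder and its subfolders; the class name and folder path are illustrative assumptions, not taken from the platform:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class ScriptFileCollector {

    // Walk the imported folder recursively and keep every regular file,
    // i.e. all files of the folder and of its subfolders.
    public static List<Path> collectScriptFiles(String folder) throws IOException {
        try (Stream<Path> paths = Files.walk(Paths.get(folder))) {
            return paths.filter(Files::isRegularFile).collect(Collectors.toList());
        }
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical path; in practice it comes from the front-end import request.
        collectScriptFiles("/data/test-scripts")
                .forEach(p -> System.out.println(p.getFileName() + " -> " + p));
    }
}
```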
Step 204: analyze the test script file data to obtain test case data.
After the test script file data to be tested is obtained, all test script files under the folder to be tested are polled and the content of each file is analyzed to obtain test case data; the test case data is, in essence, script data. Specifically, the test script files are classified according to script type and may include postman script files, JMeter script files, python script files, java script files and so on. Taking a postman file as an example, the file has a json suffix; the content of its requests field is analyzed. The requests field is an array, and each element is iterated over to obtain the ID (a unique identifier), name and description of the test case. Taking a JMeter script file as an example, it is in jmx format (which can be opened with a notepad and is actually an xml file); its HTTPSamplerProxy nodes are polled to obtain the testname and description data. Other java files and python files can be uniformly analyzed as suite files to obtain test case data.
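For illustration, iterating the requests array of a postman collection file could be sketched with Jackson as follows; the Jackson dependency and the field names are assumptions based on the layout described above:

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.io.File;
import java.io.IOException;

public class PostmanCaseParser {

    // Iterate the "requests" array of a postman collection and print the
    // ID (unique identifier), name and description of each test case.
    public static void parse(File collectionFile) throws IOException {
        ObjectMapper mapper = new ObjectMapper();
        JsonNode root = mapper.readTree(collectionFile);
        for (JsonNode request : root.path("requests")) {
            System.out.printf("case id=%s, name=%s, description=%s%n",
                    request.path("id").asText(),
                    request.path("name").asText(),
                    request.path("description").asText());
        }
    }
}
```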
Step 206: classify and push the test case data according to script type.
Specifically, after the test case data has been analyzed, the script type to which each piece of test case data belongs can be determined, and test case data of the same script type is grouped into one class. For example, test case data whose file suffix indicates a java class script can be classified as the java script type, and test case data whose suffix indicates a python script as the python script type. The grouped test case data is then pushed to the front end for display, so that the user can view the test cases and dynamically select the test cases to be executed. In this embodiment, different types of script files can be displayed uniformly on a front-end page by case name, case description, file name, file path and script type.
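A simple suffix-to-type mapping in Java might look like the sketch below; the exact suffixes and type labels are assumptions consistent with the types mentioned above:

```java
import java.util.Locale;

public class ScriptTypeClassifier {

    // Map a file name to a script type by its suffix so that test case data
    // of the same script type can be grouped together and pushed to the front end.
    public static String classify(String fileName) {
        String lower = fileName.toLowerCase(Locale.ROOT);
        if (lower.endsWith(".json")) return "postman";
        if (lower.endsWith(".jmx"))  return "jmeter";
        if (lower.endsWith(".py"))   return "python";
        if (lower.endsWith(".java") || lower.endsWith(".class")) return "java";
        return "unknown";
    }
}
```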
Step 208: when a test case execution request carrying test case execution task request parameters is received, obtain the test cases to be executed according to the test case execution task request parameters, and call the preset execution engine corresponding to the type of the test cases to be executed to execute them, so as to obtain a test case execution result.
The test case execution task request parameters comprise a test case execution task identifier (task ID), a test case identifier set (test case ID set), a file name, a file path and a script type. The preset execution engines in this embodiment include execution engines of various types and can support, in a compatible way, the execution of test cases of the python, java, postman, jmeter and other types. In practical application, not all of the analyzed test case data needs to be executed; the user can select the test cases to be executed so as to filter out the test cases that do not need to run, thereby saving system resources. Specifically, the user may select the test cases to be executed by single selection, multiple selection or full selection. After selecting the test cases to be executed, the user can click the "screen test cases" button on the front-end page; the corresponding interface is called to process the cases, the test case data to be executed is screened out according to the target test case data selected by the user, and the test cases to be executed are then displayed at the front end so that the user can choose to execute the corresponding test cases. After the test cases to be executed have been screened and displayed according to the user's selection, the user can click the "execute test cases" button on the page, which sends a test case execution request (including test case execution task request parameters such as the task ID, the test case ID set, the file path and the script type). The back end then calls, through an interface, the preset execution engine (hereinafter referred to as the execution engine) corresponding to the type of the test cases to be executed and passes it the test case execution task parameter, namely the task ID; the execution engine can execute the test cases in batches according to the task ID to obtain the test case execution results. If the type of a test case to be executed is python, for example, the corresponding python execution engine is called.
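As a rough sketch of how the back end could pick the engine matching the script type and hand it the task ID; the interface and class names are hypothetical:

```java
import java.util.HashMap;
import java.util.Map;

public class EngineDispatcher {

    // One engine per script type; each engine batch-executes the cases recorded under a task ID.
    public interface ExecutionEngine {
        void executeTask(String taskId);
    }

    private final Map<String, ExecutionEngine> engines = new HashMap<>();

    public void register(String scriptType, ExecutionEngine engine) {
        engines.put(scriptType, engine);
    }

    // Look up the preset execution engine corresponding to the script type of the
    // test cases to be executed and pass it the task ID.
    public void dispatch(String scriptType, String taskId) {
        ExecutionEngine engine = engines.get(scriptType);
        if (engine == null) {
            throw new IllegalArgumentException("No execution engine registered for type: " + scriptType);
        }
        engine.executeTask(taskId);
    }
}
```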
According to the above test case execution method, test script file data to be tested is obtained and analyzed to obtain test case data, the test case data is classified and pushed according to script type, and, when a test case execution request carrying test case execution task request parameters is received, the test cases to be executed are obtained according to the test case execution task request parameters and the preset execution engine corresponding to the test cases to be executed is called to execute them, so as to obtain a test case execution result. In this process, test case data of the same script type is grouped into one class and pushed, and when a test case execution request carrying test case execution task request parameters is received, the preset execution engine corresponding to the type of the test cases to be executed is called, according to those parameters, to execute them and obtain the test case execution result.
As shown in fig. 3, in one embodiment, obtaining the test case to be executed according to the test case execution task parameter includes:
Step 228: find the corresponding test case file data to be executed according to the file name and the file path.
Step 248: extract the test case identifiers in the test case file data to be executed according to a preset key field.
Step 268: when a test case identifier exists in the test case identifier set, obtain the test case to be executed according to that test case identifier.
In a specific implementation, the obtained test cases to be executed may include test cases of multiple types. Taking a postman script file as an example, the received test case execution request carries a test case ID set, a file name, a file path and a script type. The corresponding test case file is found according to the file name and path and is deep-copied; after copying, the requests in the file are polled and the test case ID of each is extracted. The extracted test case ID is then looked up in the set of test case IDs requested for execution: if it is present, no processing is done and the corresponding test case can be obtained according to the test case identifier; if it is not present, the child json node is deleted. Taking a JMeter script as an example, since JMeter can control whether a test case is executed through enabling and disabling, the selected test cases can be set to enabled and the unselected test cases to disabled. For java and python type scripts, the unselected test cases can be excluded; the test cases to be executed are screened out again in this way. Further, a result (success or failure, together with the task ID) is returned to the front end, and a task ID record is saved in the database, recording the script file ID, the script name, the script path (of the deep-copied script file on the server) and the script type under that ID. In this embodiment, the test case data to be executed is screened according to the user's selection, which improves the user experience, reduces unnecessary test case execution and avoids wasting system resources.
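For the postman case, the deep copy and deletion of unselected child nodes could be sketched with Jackson roughly as follows; the field names and the Jackson dependency are assumptions:

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ArrayNode;
import com.fasterxml.jackson.databind.node.ObjectNode;

import java.io.File;
import java.io.IOException;
import java.util.Set;

public class PostmanCaseFilter {

    // Deep-copy the collection, then delete every child json node whose test case ID
    // is not in the set of IDs requested for execution.
    public static ObjectNode filter(File collectionFile, Set<String> selectedIds) throws IOException {
        ObjectMapper mapper = new ObjectMapper();
        ObjectNode copy = ((ObjectNode) mapper.readTree(collectionFile)).deepCopy();
        ArrayNode requests = (ArrayNode) copy.path("requests");
        for (int i = requests.size() - 1; i >= 0; i--) {
            String caseId = requests.get(i).path("id").asText();
            if (!selectedIds.contains(caseId)) {
                requests.remove(i); // unselected test case: drop the node
            }
        }
        return copy;
    }
}
```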
In one embodiment, according to the test case execution task request parameter, invoking a preset execution engine to execute the test case to be executed, and obtaining the test case execution result includes:
transmitting the test case to be executed to a corresponding preset execution engine according to the script type;
and calling a preset execution engine to execute the test case to be executed according to a preset execution command to obtain a test case execution result.
In a specific implementation, a corresponding configuration mechanism can be provided to register cloud servers, that is, to register and deploy test machine resources for the corresponding environments (such as a python runtime environment). The machine IP (Internet Protocol) address, port, file storage location and supported runtime environments are configured in a configuration file. After obtaining the task ID, the execution engine sets the execution state of the task to "executing", queries the database for the task's scripts, script types and script paths, and looks up (in the configuration file) a test machine that can execute script files of that type. It then reads the data stream from the script path, transmits it to the file storage location of the test machine, and calls the corresponding pre-written execution command to run the script file; the script types under the task ID are polled, and the corresponding data is transmitted to the test machine and executed in turn. When multiple test machines are available, one may be selected at random. Specifically, the execution commands for the different types of script files may include: for a postman script, newman run /users/data/postman/xxxx.json -r html; for a JMeter script, jmeter -n -t [jmx script path] -l [test output file path]; for a Java TestNG script, java org.testng.TestNG %LIB%\suite.xml; and so on. These script execution commands may each be written into an executable file (a .sh file on Linux); the file path (file name plus path) is passed as a parameter when the .sh file is executed, and the path is known when the file is transmitted to the test machine. The file transmission and invocation for each type of script are completed through asynchronous threads. On this basis, multiple threads can be started to call different execution engines in parallel to execute different types of test cases, improving test case execution efficiency.
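A condensed sketch of building those commands and invoking them on asynchronous worker threads; the paths, the thread-pool size and the fallback wrapper script are assumptions:

```java
import java.io.IOException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class CommandRunner {

    private final ExecutorService pool = Executors.newFixedThreadPool(4);

    // Build the execution command for the given script type; real paths come from
    // the task record and the test machine's configured file storage location.
    static String[] commandFor(String scriptType, String scriptPath, String outputPath) {
        switch (scriptType) {
            case "postman":
                return new String[]{"newman", "run", scriptPath, "-r", "html"};
            case "jmeter":
                return new String[]{"jmeter", "-n", "-t", scriptPath, "-l", outputPath};
            default:
                // java/python suites: a wrapper .sh file that receives the file path as a parameter.
                return new String[]{"sh", "run_suite.sh", scriptPath};
        }
    }

    // Submit the call to a worker thread so different script types can run in parallel.
    public void runAsync(String scriptType, String scriptPath, String outputPath) {
        pool.submit(() -> {
            try {
                Process process = new ProcessBuilder(commandFor(scriptType, scriptPath, outputPath))
                        .inheritIO()
                        .start();
                process.waitFor();
            } catch (IOException e) {
                e.printStackTrace();
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
    }
}
```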
As shown in fig. 4, in one embodiment, after the test case to be executed is obtained and the preset execution engine corresponding to the type of the test case to be executed is called to execute it and obtain the test case execution result, the method further includes step 210: sending a data write-back instruction to the preset execution engine, wherein the data write-back instruction is used for causing the preset execution engine to call a preset callback function to write back data, so as to update the execution state of the test case to be executed.
In a specific implementation, after it is detected that the execution engine has executed the test case data, a data write-back instruction is sent to the execution engine, so that the execution engine calls a preset callback function to write back data; the preset callback function is used to update the execution state of the test cases and to parse the test case execution results. Having the preset callback function called to write back data keeps the execution progress of the scripts up to date, so that testers can check the execution progress of the test cases at any time.
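A minimal sketch of such a write-back handler on the platform side, with a map standing in for the database; class and method names are hypothetical:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ExecutionStateStore {

    private final Map<String, String> caseStatus = new ConcurrentHashMap<>();

    // Called back by the execution engine after a case finishes so that the
    // execution state of the test case can be updated and its result recorded.
    public void writeBack(String caseId, String status, String resultDetail) {
        caseStatus.put(caseId, status);
        System.out.printf("case %s -> %s (%s)%n", caseId, status, resultDetail);
    }

    // Lets testers query the current execution progress of a case at any time.
    public String statusOf(String caseId) {
        return caseStatus.getOrDefault(caseId, "pending");
    }
}
```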
As shown in fig. 4, in one embodiment, after the data write-back instruction is sent to the preset execution engine, the method further includes step 212: monitoring the data write-back of the preset execution engine, and if no data write-back record of the preset execution engine is observed after a preset time, recording abnormal information of the preset execution engine.
In practical application, due to objective factors such as a network failure, the execution engine may fail to write back data and the execution progress of some test cases may not be updated. In order to discover such anomalies in time, a timeout mechanism may be set to monitor the data write-back of each execution engine; if the updated execution state of the test cases to be executed is not observed from an execution engine within a preset time (for example, within 3 minutes), the abnormal information of the test cases executed by that execution engine is recorded. Recording abnormal information makes it possible to find problems with abnormal test case execution in time and to troubleshoot them.
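One way to sketch such a timeout mechanism in Java uses a scheduled check per dispatched task; the 3-minute window follows the example above, and the rest is assumed:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class WriteBackMonitor {

    private static final long TIMEOUT_MINUTES = 3;

    private final Map<String, Boolean> writtenBack = new ConcurrentHashMap<>();
    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

    // Start the timeout clock when a task is handed to an execution engine.
    public void onDispatched(String taskId) {
        writtenBack.put(taskId, false);
        scheduler.schedule(() -> check(taskId), TIMEOUT_MINUTES, TimeUnit.MINUTES);
    }

    // Record that the engine wrote data back for this task.
    public void onWriteBack(String taskId) {
        writtenBack.put(taskId, true);
    }

    private void check(String taskId) {
        if (Boolean.FALSE.equals(writtenBack.get(taskId))) {
            // In a real platform this would persist abnormal information for the engine.
            System.err.println("No data write-back for task " + taskId + " within " + TIMEOUT_MINUTES + " minutes");
        }
    }
}
```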
In one embodiment, after the test case to be executed is obtained and the preset execution engine corresponding to its type is called to execute it and obtain the test case execution result, the method further includes step 214: generating a test case execution report according to the test case execution result.
In a specific implementation, the test case execution results generated for each type of test case are files in html format. The html content generated for each type can be parsed to obtain the execution results of all test cases, which are stored in the database. Specifically, after it is detected that every preset execution engine has finished executing its test cases, all test case execution results are read, and a test case execution report in a uniform format is generated according to a preset format and stored in the database. The test case execution report may include data such as the test case ID, test case name, execution result, error reason and associated task ID. In this embodiment, organizing the test case execution results into a uniformly formatted test case execution report improves the readability of the results and makes it convenient for testers to check them.
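The uniform report row described above could be represented with a small Java class like this sketch; the tab-separated rendering is an assumption, while the fields follow the list above:

```java
import java.util.List;

public class ReportRow {

    public final String caseId;
    public final String caseName;
    public final String result;
    public final String errorReason;
    public final String taskId;

    public ReportRow(String caseId, String caseName, String result, String errorReason, String taskId) {
        this.caseId = caseId;
        this.caseName = caseName;
        this.result = result;
        this.errorReason = errorReason;
        this.taskId = taskId;
    }

    // Render all rows into one uniformly formatted report; the platform would
    // store the aggregated result in its database rather than return a string.
    public static String render(List<ReportRow> rows) {
        StringBuilder sb = new StringBuilder("caseId\tcaseName\tresult\terrorReason\ttaskId\n");
        for (ReportRow r : rows) {
            sb.append(String.join("\t", r.caseId, r.caseName, r.result, r.errorReason, r.taskId)).append('\n');
        }
        return sb.toString();
    }
}
```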
In one embodiment, after generating the test case execution report according to the test case execution result, the method further includes: and sending the test case execution report in a mail form.
In practical application, the test case execution report and/or the test results may be sent to a specified mailbox in the form of an email through a preset mail sending mechanism, so that testers are informed of the test case execution results in time.
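Assuming the Spring Boot back end pulls in spring-boot-starter-mail, the mail step could be sketched as follows; the recipient and subject are placeholders:

```java
import org.springframework.mail.SimpleMailMessage;
import org.springframework.mail.javamail.JavaMailSender;

public class ReportMailer {

    private final JavaMailSender mailSender;

    public ReportMailer(JavaMailSender mailSender) {
        this.mailSender = mailSender;
    }

    // Send the generated execution report to the specified mailbox so testers
    // are informed of the results in time.
    public void sendReport(String to, String reportText) {
        SimpleMailMessage message = new SimpleMailMessage();
        message.setTo(to);
        message.setSubject("Test case execution report");
        message.setText(reportText);
        mailSender.send(message);
    }
}
```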
It should be understood that, although the steps in the flowcharts of fig. 2-4 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, there is no strict restriction on the order in which these steps are performed, and they may be performed in other orders. Moreover, at least some of the steps in fig. 2-4 may include multiple sub-steps or stages; these are not necessarily completed at the same time but may be performed at different times, and they are not necessarily performed in sequence but may be performed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 5, a test case execution device is provided, which includes: a data acquisition module 510, a data analysis module 520, a test case classification module 530, and a test case execution module 540, wherein:
and a data obtaining module 510, configured to obtain test script file data to be tested.
And the data analysis module 520 is configured to analyze the test script file data to obtain test case data.
And the test case classification module 530 is used for classifying and pushing the test case data according to the script type.
The test case execution module 540 is configured to, when a test case execution request carrying the test case execution task request parameter is received, obtain a test case to be executed according to the test case execution task request parameter, and invoke a preset execution engine corresponding to the type of the test case to be executed to execute the test case to be executed, so as to obtain a test case execution result.
In one embodiment, the test case execution module 540 is further configured to find out corresponding test case file data to be executed according to the file name and the file path, extract a test case identifier in the test case file data to be executed according to a preset key field, and obtain the test case to be executed according to the test case identifier when the test case identifier is detected to exist in the test case identifier set.
In one embodiment, the test case execution module 540 is further configured to transmit the test case to be executed to a corresponding preset execution engine according to the script type, and invoke the preset execution engine to execute the test case to be executed according to a preset execution command, so as to obtain a test case execution result.
As shown in fig. 6, in an embodiment, the apparatus further includes an execution state updating module 550, configured to send a data write-back instruction to the preset execution engine, where the data write-back instruction is used to enable the preset execution engine to call the preset callback function to write back data, so as to update the execution state of the test case to be executed.
As shown in fig. 6, in one embodiment, the apparatus further includes an exception information recording module 560, configured to monitor the write-back of the data of the test case to be executed by the predetermined execution engine, and record exception information of the predetermined execution engine if the write-back record of the data of the predetermined execution engine is not monitored after a predetermined time.
As shown in fig. 6, in an embodiment, the apparatus further includes an execution report generation module 570, configured to generate a test case execution report according to the test case execution result.
For specific limitations of the test case execution apparatus, reference may be made to the above limitations of the test case execution method, which is not described herein again. The modules in the test case execution apparatus may be implemented in whole or in part by software, hardware, and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a server, the internal structure of which may be as shown in fig. 7. The computer device includes a processor, a memory, and a network interface connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The database of the computer device is used for storing test case file data, test case execution result data and the like. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a test case execution method.
Those skilled in the art will appreciate that the architecture shown in fig. 7 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects apply, as particular computing devices may include more or less components than those shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory having a computer program stored therein, the processor implementing the following steps when executing the computer program: the method comprises the steps of obtaining test script file data to be tested, analyzing the test script file data to obtain test case data, classifying and pushing the test case data according to script types, obtaining a test case to be executed according to the test case execution task request parameters when a test case execution request carrying the test case execution task request parameters is received, calling a preset execution engine corresponding to the type of the test case to be executed to execute the test case to be executed, and obtaining a test case execution result.
In one embodiment, the processor, when executing the computer program, performs the steps of: and searching corresponding test case file data to be executed according to the file name and the file path, extracting a test case identifier in the test case file data to be executed according to a preset key field, and acquiring the test case to be executed according to the test case identifier when the test case identifier exists in the test case identifier set.
In one embodiment, the processor, when executing the computer program, performs the steps of: and transmitting the test case to be executed to a corresponding preset execution engine according to the script type, and calling the preset execution engine to execute the test case to be executed according to a preset execution command to obtain a test case execution result.
In one embodiment, the processor, when executing the computer program, performs the steps of: and sending a data write-back instruction to the preset execution engine, wherein the data write-back instruction is used for enabling the preset execution engine to call a preset callback function to write back data, and updating the execution state of the test case to be executed.
In one embodiment, the processor, when executing the computer program, performs the steps of: and monitoring the data write-back of the test case to be executed by the preset execution engine, and recording the abnormal information of the preset execution engine if the data write-back record of the preset execution engine is not monitored after the preset time.
In one embodiment, the processor, when executing the computer program, performs the steps of: and generating a test case execution report according to the test case execution result.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored, which when executed by a processor performs the steps of: the method comprises the steps of obtaining test script file data to be tested, analyzing the test script file data to obtain test case data, classifying and pushing the test case data according to script types, obtaining a test case to be executed according to the test case execution task request parameters when a test case execution request carrying the test case execution task request parameters is received, calling a preset execution engine corresponding to the type of the test case to be executed to execute the test case to be executed, and obtaining a test case execution result.
In one embodiment, the computer program when executed by the processor further performs the steps of: and searching corresponding test case file data to be executed according to the file name and the file path, extracting a test case identifier in the test case file data to be executed according to a preset key field, and acquiring the test case to be executed according to the test case identifier when the test case identifier exists in the test case identifier set.
In one embodiment, the computer program when executed by the processor further performs the steps of: and transmitting the test case to be executed to a corresponding preset execution engine according to the script type, and calling the preset execution engine to execute the test case to be executed according to a preset execution command to obtain a test case execution result.
In one embodiment, the computer program when executed by the processor further performs the steps of: and sending a data write-back instruction to the preset execution engine, wherein the data write-back instruction is used for enabling the preset execution engine to call a preset callback function to write back data, and updating the execution state of the test case to be executed.
In one embodiment, the computer program when executed by the processor further performs the steps of: and monitoring the data write-back of the test case to be executed by the preset execution engine, and recording the abnormal information of the preset execution engine if the data write-back record of the preset execution engine is not monitored after the preset time.
In one embodiment, the computer program when executed by the processor further performs the steps of: and generating a test case execution report according to the test case execution result.
Those skilled in the art will understand that all or part of the processes of the methods of the above embodiments can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database or another medium used in the embodiments provided herein can include at least one of non-volatile and volatile memory. Non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical storage and the like. Volatile memory can include Random Access Memory (RAM) or an external cache memory. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM).
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (10)

1. A test case execution method, the method comprising:
acquiring test script file data to be tested;
analyzing the test script file data to obtain test case data;
classifying and pushing the test case data according to script types;
when a test case execution request carrying the test case execution task request parameter is received, obtaining a test case to be executed according to the test case execution task request parameter, and calling a preset execution engine corresponding to the type of the test case to be executed to execute the test case to be executed, so as to obtain a test case execution result.
2. The method of claim 1, wherein the test case execution task request parameters comprise a test case execution task identifier, a set of test case identifiers, a file name, a file path, and a script type.
3. The method according to claim 2, wherein the obtaining the test case to be executed according to the test case execution task parameter comprises:
searching out corresponding test case file data to be executed according to the file name and the file path;
extracting a test case identifier in the test case file data to be executed according to a preset key field;
and when the test case identification is detected to exist in the test case identification set, acquiring the test case to be executed according to the test case identification.
4. The method according to claim 2, wherein the step of calling a preset execution engine to execute the test case to be executed according to the test case execution task request parameter to obtain a test case execution result comprises:
transmitting the test case to be executed to the corresponding preset execution engine according to the script type;
and calling a preset execution engine to execute the test case to be executed according to a preset execution command to obtain a test case execution result.
5. The method according to claim 1, wherein the obtaining of the test case to be executed and the calling of the preset execution engine corresponding to the type of the test case to be executed to execute the test case to be executed, after obtaining the execution result of the test case, further comprises:
and sending a data write-back instruction to the preset execution engine, wherein the data write-back instruction is used for enabling the preset execution engine to call a preset callback function to write back data, and updating the execution state of the test case to be executed.
6. The method of claim 5, wherein after sending the data write-back instruction to the pre-configured execution engine, further comprising:
monitoring data write back of the preset execution engine;
if the write-back record of the data of the preset execution engine is not monitored after the preset time, recording the abnormal information of the preset execution engine.
7. The method according to any one of claims 1 to 6, wherein the obtaining of the test case to be executed and the calling of the preset execution engine corresponding to the type of the test case to be executed to execute the test case to be executed, after obtaining the execution result of the test case, further comprises:
and generating a test case execution report according to the test case execution result.
8. A test case execution apparatus, the apparatus comprising:
the data acquisition module is used for acquiring test script file data to be tested;
the data analysis module is used for analyzing the test script file data to obtain test case data;
the test case classification module is used for classifying and pushing the test case data according to the script type;
the test case execution module is used for acquiring a test case to be executed according to the test case execution task request parameter when receiving a test case execution request carrying the test case execution task request parameter, and calling a preset execution engine corresponding to the type of the test case to be executed to execute the test case to be executed, so as to obtain a test case execution result.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN202011038044.5A 2020-09-28 2020-09-28 Test case execution method and device, computer equipment and storage medium Pending CN112148610A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011038044.5A CN112148610A (en) 2020-09-28 2020-09-28 Test case execution method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011038044.5A CN112148610A (en) 2020-09-28 2020-09-28 Test case execution method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112148610A true CN112148610A (en) 2020-12-29

Family

ID=73894949

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011038044.5A Pending CN112148610A (en) 2020-09-28 2020-09-28 Test case execution method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112148610A (en)


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112799936B (en) * 2021-01-08 2024-03-12 合肥美昂兴电子技术有限公司 Embedded kernel engine algorithm of test measurement system
CN112799936A (en) * 2021-01-08 2021-05-14 合肥美昂兴电子技术有限公司 Embedded kernel engine algorithm for testing measurement system
CN112860542A (en) * 2021-01-20 2021-05-28 北京神州数字科技有限公司 UI automation test method
CN112988596A (en) * 2021-04-27 2021-06-18 北京全路通信信号研究设计院集团有限公司 Automatic testing method and system based on state transition
CN112988596B (en) * 2021-04-27 2021-08-03 北京全路通信信号研究设计院集团有限公司 Automatic testing method and system based on state transition
CN113157266A (en) * 2021-04-30 2021-07-23 武汉众邦银行股份有限公司 Script operation implementation method of Jmeter tool
CN113220595A (en) * 2021-06-11 2021-08-06 中国农业银行股份有限公司 Test method and device
CN113220595B (en) * 2021-06-11 2023-10-03 中国农业银行股份有限公司 Test method and test equipment
CN113434395B (en) * 2021-06-22 2024-04-30 中国平安人寿保险股份有限公司 Automatic generation method, device, equipment and medium for test cases
CN113641747A (en) * 2021-10-15 2021-11-12 北京新氧科技有限公司 Method, device and system for accessing postman tool to database
CN113641747B (en) * 2021-10-15 2022-03-18 北京新氧科技有限公司 Method, device and system for accessing postman tool to database
CN117056241B (en) * 2023-10-13 2024-01-26 彩讯科技股份有限公司 Application program testing method and device for mobile terminal
CN117056241A (en) * 2023-10-13 2023-11-14 彩讯科技股份有限公司 Application program testing method and device for mobile terminal


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination