CN117931632A - Automatic integrated test method, device, equipment and storage medium - Google Patents


Info

Publication number
CN117931632A
CN117931632A (application CN202311723133.7A)
Authority
CN
China
Prior art keywords
test
instruction
description file
execution
test case
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311723133.7A
Other languages
Chinese (zh)
Inventor
何继光
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Sanqi Jichuang Network Technology Co ltd
Original Assignee
Guangzhou Sanqi Jichuang Network Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Sanqi Jichuang Network Technology Co ltd filed Critical Guangzhou Sanqi Jichuang Network Technology Co ltd
Priority to CN202311723133.7A priority Critical patent/CN117931632A/en
Publication of CN117931632A publication Critical patent/CN117931632A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3684 Test management for test design, e.g. generating new test cases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The embodiment of the application discloses an automatic integrated test method, device, equipment and storage medium. The method includes the following steps: obtaining a test case and receiving a description file generation instruction, and generating a description file of the test case according to the type of the description file generation instruction, where the type of the description file generation instruction includes at least one of a writing instruction, a nested call instruction, or a recording instruction; calling the test case through different test interfaces respectively, generating a test instruction corresponding to the test case based on the description file, determining an execution thread parameter and an execution account parameter of the test instruction, and determining an execution strategy according to the execution thread parameter and the execution account parameter; and executing the test instruction corresponding to the test case according to the determined execution strategy and outputting a visual test result. This solves the problems of low test efficiency and low test coverage rate, and improves both the test efficiency and the test coverage rate.

Description

Automatic integrated test method, device, equipment and storage medium
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to an automatic integrated test method, an automatic integrated test device, automatic integrated test equipment and a storage medium.
Background
Testing is an important link in project development: each function is verified before the project is deployed to the production environment, to confirm that it meets the development requirements and that the project runs normally. If errors in the project, or deviations from the actual requirements, are not found in time, maintenance costs increase once the project is put into use, and the user experience suffers.
In the related art, a general-purpose automated test framework, or an automated test framework aimed at web applications, is usually adopted. Such a framework cannot be tightly combined with the program under test, and no test description script format tailored to the program under test is adopted, so the test coverage rate is low. In addition, the test case description files for the program under test are usually written manually, so the test efficiency is low and the project development cost rises correspondingly.
Disclosure of Invention
The embodiment of the application provides an automatic integrated test method, an automatic integrated test device, automatic integrated test equipment and a storage medium, which solve the problems of low test efficiency and low test coverage rate and thereby improve both.
In a first aspect, an embodiment of the present application provides an automated integrated testing method, including:
acquiring a test case and receiving a description file generation instruction, and generating a description file of the test case according to the type of the description file generation instruction, wherein the type of the description file generation instruction comprises at least one of a writing instruction, a nested call instruction, or a recording instruction;
Respectively calling the test cases by connecting different test interfaces, generating a test instruction corresponding to the test case based on the description file, determining an execution thread parameter and an execution account parameter of the test instruction, and determining an execution strategy according to the execution thread parameter and the execution account parameter;
executing the test instruction corresponding to the test case according to the determined execution strategy, and outputting a visual test result.
Optionally, in the case that the type of the description file generation instruction is a writing instruction, before the generating the description file of the test case according to the type of the description file generation instruction, the method further includes:
Invoking description standard information of the description file, wherein the description standard information comprises a designated description tag and a description file writing format;
Correspondingly, the generating the description file of the test case according to the type of the description file generating instruction includes:
and receiving an editing instruction through a set file editing interface, and generating a description file of the test case according to the description standard information and the editing instruction.
Optionally, in the case that the description file generation instruction is a nested call instruction, the generating the description file of the test case according to the type of the description file generation instruction includes:
Calling a nested call tag, calling a plurality of description files in a description file library through the nested call tag, and combining the plurality of description files to generate the description file of the test case;
In the case that the description file generation instruction is a recording instruction, the generating the description file of the test case according to the type of the description file generation instruction includes:
And under the condition of receiving the recording instruction, a recording interface is called, the operation behavior is recorded through the recording interface, and the description file of the test case is generated.
Optionally, the generating, by connecting different test interfaces, the test instruction corresponding to the test case based on the description file includes:
Calling the test case through connecting a first test interface, determining protocol parameters of the test case based on the description file, and generating a first test instruction according to the protocol parameters, wherein the protocol parameters comprise monitored protocol identification parameters and protocol operation mode parameters;
and calling the test case through connecting a second test interface, determining instruction parameters of the test case based on the description file, and generating a second test instruction according to the instruction parameters, wherein the instruction parameters comprise at least one of acquisition parameters, movement parameters, or fight parameters, and the second test instruction comprises a chain test instruction or a tree test instruction.
Optionally, the generating a first test instruction according to the protocol parameter includes:
assembling the monitored protocol identification parameter and the protocol operation mode parameter to generate the first test instruction;
and the generating a second test instruction according to the instruction parameters includes:
and determining an instruction execution sequence according to the instruction parameters, and generating a chain test instruction or a tree test instruction based on the instruction execution sequence.
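As an illustrative sketch (not the patent's actual implementation; all function and field names are assumptions), ordering the instruction parameters into a chain or a tree structure could look like the following:

```python
# Hypothetical sketch: assemble a chained or tree-shaped test instruction
# from an ordered list of instruction parameters. Names are illustrative.

def build_chain(params):
    """Link instructions so each step points at the next one in sequence."""
    chain = []
    for i, p in enumerate(params):
        nxt = i + 1 if i + 1 < len(params) else None
        chain.append({"step": i, "action": p, "next": nxt})
    return chain

def build_tree(root, branches):
    """Attach branch instructions under a root instruction."""
    return {
        "action": root,
        "children": [{"action": b, "children": []} for b in branches],
    }

# Example: a movement/acquisition/combat sequence as a chain,
# and a dungeon entry with two parallel branches as a tree.
chain = build_chain(["move", "acquire", "combat"])
tree = build_tree("enter_dungeon", ["move", "combat"])
```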
Optionally, before the obtaining the test case, the method further includes:
Sending a test case acquisition request, and in the case that the acquisition request cannot be responded to, invoking a project configuration table and generating a description file of the test case according to the type of the received description file generation instruction;
Connecting a data simulation interface, simulating a target input object in the data simulation interface according to the project configuration table and the description file, and assembling the simulated target input object to generate a simulation test instruction;
determining an execution thread parameter and an execution account parameter of the simulation test instruction, and determining a single-thread execution strategy or a multi-thread execution strategy according to the execution thread parameter and the execution account parameter;
And executing the simulation test instruction according to the single-thread execution strategy or the multi-thread execution strategy, and outputting a visual test result.
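The simulation flow above could be sketched as follows (a minimal illustration; the configuration-table fields, the merge rule, and all names are assumptions, not the patent's implementation):

```python
# Hypothetical sketch: when no real test case can be fetched, build a
# simulated target input object from the project configuration table
# (defaults) overridden by the description file, then assemble it into
# a simulation test instruction. All field names are assumptions.

def simulate_input(config_table, description):
    """Fill each configured field from the description, else its default."""
    return {field: description.get(field, default)
            for field, default in config_table.items()}

def assemble_simulated_instruction(config_table, description):
    target = simulate_input(config_table, description)
    return {"type": "simulated", "input": target}

instr = assemble_simulated_instruction(
    {"account": "test_account", "level": 1},  # project configuration table
    {"level": 10},                            # description file override
)
```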
Optionally, after the generating the visual test result, the method further includes:
And counting the test coverage rate of the test result through a coverage rate plug-in, generating a coverage rate report, and optimizing a program script for uncovered code branches according to the coverage rate report.
In a second aspect, an embodiment of the present application provides an automated integrated test apparatus, comprising:
a description file generation module, configured to acquire a test case and receive a description file generation instruction, and generate a description file of the test case according to the type of the description file generation instruction, wherein the type of the description file generation instruction comprises at least one of a writing instruction, a nested call instruction, or a recording instruction;
the test case calling module is used for respectively calling the test cases by connecting different test interfaces;
The test instruction generation module is used for generating a test instruction corresponding to the test case based on the description file and determining an execution thread parameter and an execution account parameter of the test instruction;
The execution policy determining module is used for determining a single-thread execution policy or a multi-thread execution policy according to the execution thread parameters and the execution account parameters;
The test module is used for executing the test instruction corresponding to the test case according to the single-thread execution strategy or the multi-thread execution strategy;
And the test result output module is used for outputting a visual test result.
In a third aspect, embodiments of the present application provide automated integrated test equipment, including: one or more processors; and a storage device configured to store one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the automated integrated test method of the first aspect.
In a fourth aspect, embodiments of the present application provide a storage medium containing computer executable instructions which, when executed by a computer processor, are used to perform an automated integrated test method as described in the first aspect.
According to the embodiment of the application, a test case is obtained and a description file generation instruction is received; a description file of the test case is generated according to the type of the description file generation instruction, where the type includes at least one of a writing instruction, a nested call instruction, or a recording instruction; the test case is called through different test interfaces respectively; a test instruction corresponding to the test case is generated based on the description file; the execution thread parameter and the execution account parameter of the test instruction are determined; and an execution strategy is determined according to these parameters. The corresponding interface can be called according to different test requirements, and the execution strategy matched with it adopted, which ensures a tight correlation between the test method and the test project and improves the test coverage rate. The description file of the test case can be generated in various ways according to specific test requirements, so that a test description file closely related to the test project is obtained; this both ensures the efficiency of generating the description file and improves the test efficiency of the whole project.
Drawings
FIG. 1 is a flow chart of an automated integrated test method provided by an embodiment of the present application;
FIG. 2 is a flowchart of a method for generating a description file according to an embodiment of the present application;
FIG. 3 is a flowchart of a method for generating test instructions according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a chain structure and tree structure according to an embodiment of the present application;
FIG. 5 is a flow chart of another automated integrated test method provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of an automated integrated test apparatus according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an automated integrated test apparatus according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the following detailed description of specific embodiments of the present application is given with reference to the accompanying drawings. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the matters related to the present application are shown in the accompanying drawings. Before discussing exemplary embodiments in more detail, it should be mentioned that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart depicts operations (or steps) as a sequential process, many of the operations can be performed in parallel, concurrently, or at the same time. Furthermore, the order of the operations may be rearranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figures. The processes may correspond to methods, functions, procedures, subroutines, and the like.
The technical solutions of the embodiments of the present application will be clearly described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which are obtained by a person skilled in the art based on the embodiments of the present application, fall within the scope of protection of the present application.
The terms "first", "second" and the like in the description and in the claims are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged, where appropriate, so that embodiments of the present application may be implemented in sequences other than those illustrated or described herein. The objects identified by "first", "second", etc. are generally of one type, and the number of objects is not limited; for example, the first object may be one object or multiple objects. Furthermore, in the description and claims, "and/or" means at least one of the connected objects, and the character "/" generally means that the associated objects are in an "or" relationship.
The method, the device, the equipment and the storage medium for automated integrated testing provided by the embodiment of the application are described in detail below through specific embodiments and application scenes thereof with reference to the accompanying drawings.
Fig. 1 is a flowchart of an automated integrated testing method according to an embodiment of the present application. The embodiment may be used to test various functions, parameters, game scenarios, etc. before a game project is put into use. Based on this usage scenario, it can be appreciated that the execution subject of the present solution may be an automated test system.
An automated integrated test method is described below as an example. Referring to fig. 1, the method includes:
step S101, a test case is obtained, a description file generation instruction is received, a description file of the test case is generated according to the type of the description file generation instruction, and the type of the description file generation instruction comprises at least one or more of a writing instruction, a nested calling instruction or a recording instruction.
In one embodiment, a test case may be a set of documents designed and written for performing software system testing, consisting essentially of test inputs, execution conditions, and expected results. The test case is an important basis for executing the test, and has the characteristics of effectiveness, repeatability, ease of organization, clarity, simplicity, maintainability, and the like. The description file generation instruction may be an instruction issued by a tester according to the actual test project or actual test requirement; the instruction may be generated by the tester clicking a button for the corresponding function on the screen of the terminal device. The type of the description file generation instruction may include at least one of a writing instruction, a nested call instruction, or a recording instruction. The writing instruction is used to control the automated test system to automatically write the test case description file according to certain rules. The nested call instruction may be an instruction that controls the automated test system to call existing description files in the database. The recording instruction may be an instruction that controls the automated test system to record the manual operations of the tester. The description file of the test case may be a file describing the test behavior of the test case through specified tags, such as defining the number of execution threads through a thread tag. The test case is loaded according to a test case loading path provided by the automated test system, the description file generation instruction sent by the tester is received, and the method for generating the test case description file is determined according to that instruction.
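The three generation paths described above could be dispatched as in the following sketch (all function names and return shapes are assumptions, not the patent's implementation):

```python
# Hypothetical sketch: route a description file generation instruction
# to the matching generation path. The three stub generators below stand
# in for editor-based authoring, nested reuse, and operation recording.

def write_description(text):
    """Author a description file from editor input (stub)."""
    return {"source": "write", "body": text}

def combine_descriptions(files):
    """Combine existing description files from the library (stub)."""
    return {"source": "nested", "body": "\n".join(files)}

def record_description(ops):
    """Build a description file from recorded operations (stub)."""
    return {"source": "record", "body": list(ops)}

def generate_description(instruction_type, payload):
    if instruction_type == "write":
        return write_description(payload)
    if instruction_type == "nested":
        return combine_descriptions(payload)
    if instruction_type == "record":
        return record_description(payload)
    raise ValueError(f"unknown instruction type: {instruction_type}")
```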
Step S102, respectively calling the test cases by connecting different test interfaces, generating test instructions corresponding to the test cases based on the description files, determining execution thread parameters and execution account parameters of the test instructions, and determining an execution strategy according to the execution thread parameters and the execution account parameters.
The test interfaces may be interfaces set according to the different objects under test, and the automated test system can realize the corresponding tests by calling the corresponding test interfaces. The test interfaces may include a data simulation interface, a protocol listening interface, a behavior instruction interface, and the like. The test instructions corresponding to the test cases can be specific test behaviors in the game process, such as generated special effects or the rotation behavior of a character. The execution thread parameter may be the number of execution threads defined by a tag in the description file, such as a thread tag. The execution account parameter is the range and number of specified accounts defined by tags in the description file, such as the range and number specified by an account tag and its num attribute. The execution strategy may represent specific test behavior, and may include repeated instruction calls of a single thread and single account, chained instruction calls of a single thread and single account, repeated instruction calls of multiple threads and multiple accounts, chained instruction calls of multiple threads and multiple accounts, and the like. To avoid thread-safety problems, a single game account acts on a fixed single thread; and to simulate the actual game environment, the automated test system by default executes the test behavior of a single game account on a single thread only, so a single-threaded test case can execute the expected automated test without specifying an additional running strategy.
In some large instanced-dungeon gameplay, a large number of players often compete in the same scene, and the resources of the instance are contended for by these players. That is, when the description file specifies multiple accounts and multiple threads, the same resources are contended for by multiple threads, and a slight carelessness in development may then leave latent thread-safety hazards. Therefore, by formulating corresponding execution strategies in the testing process, the correctness and stability of multi-player instanced gameplay can be ensured. Optionally, after the specific execution strategy is determined, the number of executions may be set correspondingly, so as to ensure the coverage rate of the test.
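The strategy selection described above can be sketched as follows (a hedged illustration using Python's standard thread pool; the one-worker-per-account pool only approximates the fixed account-to-thread binding described in the text, and all names are assumptions):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical sketch: pick a single- or multi-thread execution strategy
# from the execution thread and account parameters, then run the test
# instruction the configured number of times.

def choose_strategy(num_threads, num_accounts):
    return "multi" if num_threads > 1 and num_accounts > 1 else "single"

def run(instruction, accounts, num_threads, repeat=1):
    strategy = choose_strategy(num_threads, len(accounts))
    results = []
    if strategy == "single":
        for _ in range(repeat):
            results.append(instruction(accounts[0]))
    else:
        # One worker per account, approximating the rule that a single
        # game account never executes on two threads at once.
        with ThreadPoolExecutor(max_workers=len(accounts)) as pool:
            for _ in range(repeat):
                results.extend(pool.map(instruction, accounts))
    return strategy, results

# Example usage with a trivial stand-in for a test instruction.
strategy, results = run(lambda a: "ok:" + a, ["acc1", "acc2"], num_threads=2)
```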
By connecting the corresponding test interface and calling the test case through that interface, the automated test system determines the target parameters of the test case from the specified contents of its description file, obtains the test instruction corresponding to the test case by processing the target parameters, and determines the execution thread parameter and the execution account parameter of the test instruction according to the description file, so as to determine the corresponding specific execution strategy.
Step S103, executing the test instruction corresponding to the test case according to the determined execution strategy, and outputting a visual test result.
The test instruction corresponding to the test case is executed according to the determined execution strategy and the set number of executions, and a detailed test result is output. The test results include the data output and assertion results shown on a console, in a log file, or on a front-end web page, which the tester uses to evaluate whether the functions meet expectations: total execution time, per-run time, the share of execution time of each game behavior instruction, error logs, and so on. The tester can find functional abnormalities by comparing the time consumption and complexity of the instructions. Key information from the execution of the game project can be stored in a log file; the automated test system can extract the specific game logs generated during execution through anchors or in a specified manner, so the tester can exclude irrelevant logs and quickly check, through the extracted key logs, whether the tested function meets expectations.
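The anchor-based log extraction and per-instruction time-share statistics described above might be sketched as follows (the anchor markers and field names are assumptions, not the patent's actual format):

```python
# Hypothetical sketch: keep only the log lines between anchor markers,
# and compute each instruction's share of the total execution time.
# The anchor strings "[TEST-BEGIN]"/"[TEST-END]" are invented examples.

def extract_key_logs(lines, start_anchor="[TEST-BEGIN]", end_anchor="[TEST-END]"):
    keep, inside = [], False
    for line in lines:
        if start_anchor in line:
            inside = True
            continue
        if end_anchor in line:
            inside = False
            continue
        if inside:
            keep.append(line)
    return keep

def time_share(timings):
    """Fraction of total time spent in each game behavior instruction."""
    total = sum(timings.values())
    return {name: t / total for name, t in timings.items()}
```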
According to the embodiment of the application, a test case is obtained and a description file generation instruction is received; a description file of the test case is generated according to the type of the description file generation instruction, where the type includes at least one of a writing instruction, a nested call instruction, or a recording instruction; the test case is called through different test interfaces respectively; a test instruction corresponding to the test case is generated based on the description file; and the execution thread parameter and the execution account parameter of the test instruction are determined and used to determine the execution strategy. The corresponding interface can be called according to different test requirements, and the execution strategy matched with it adopted, which ensures a tight correlation between the test method and the test project and improves the test coverage rate. The description file of the test case can be generated in various ways according to specific test requirements, so that a test description file closely related to the test project is obtained; this both ensures the efficiency of generating the description file and improves the test efficiency of the whole project.
In one embodiment, the method further includes, after generating the visual test result: counting the test coverage rate of the test result through a coverage plug-in, generating a coverage report, and optimizing the program script for uncovered code branches according to the coverage report.
The coverage plug-in can calculate how well the test code covers the project and locate the code parts not covered by the test; at the same time it can check for dead code and unreasonable logic in the program, which helps improve code quality. The coverage report can show the coverage of each branch in the code management system and calculate the overall branch coverage. For program script optimization, the relevant parameters of the uncovered branches can be written into the description file through the specified tag parameters, completing the test case description file so that the automated test becomes more accurate and comprehensive. After the visual test result is generated, the automated test system invokes the coverage plug-in, which detects whether all branches are covered and generates a coverage report. The tester can then send a description file writing request according to the coverage; the writing request includes a writing requirement and a writing instruction. On receiving them, the automated test system rewrites the description file of the test case and tests the test case again based on the newly generated description file, making the automated test more accurate and comprehensive.
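The bookkeeping such a coverage plug-in performs can be illustrated with the following minimal sketch (a real project would use a dedicated tool such as coverage.py or JaCoCo; the branch names here are invented):

```python
# Hypothetical sketch: compare the set of executed branches against all
# known branches, report the coverage rate, and list the uncovered
# branches whose parameters should be fed back into the description file.

def coverage_report(all_branches, executed):
    covered = set(all_branches) & set(executed)
    missing = sorted(set(all_branches) - covered)
    rate = len(covered) / len(all_branches) if all_branches else 1.0
    return {"rate": rate, "uncovered": missing}

report = coverage_report(
    all_branches={"login", "move", "combat", "trade"},
    executed={"login", "move", "combat"},
)
# report["uncovered"] names the branches to target in the next test round
```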
Fig. 2 is a flowchart of a method for generating a description file according to an embodiment of the present application, as shown in the drawings, specifically includes:
Step S201, a test case is obtained and a description file generation instruction is received.
Step S202, in the case that the type of the description file generation instruction is a writing instruction, description standard information of the description file is called, where the description standard information includes a specified description tag and a description file writing format.
In one embodiment, the description standard information may be a set of semantic, normative description standards developed for and adapted to the application scenario of the embodiments of the present application. The description standard information includes various tags, such as account, login, fight, skill, if, else, when, then and where, to describe "which account is used", "under which condition and precondition", "with which parameters", "which combat", "which skill", and so on, normalizing the writing of test case description files at the semantic level. A test case description file written using the specified format and tags can be loaded and translated by the automated test system to generate execution instructions, and all execution instructions are executed in the manner set by the execution strategy. In the case that the type of the description file generation instruction is a writing instruction, the pre-written normative description tags and writing format are called.
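Assuming the description file is tag-based XML (the actual format is not specified in the text), a minimal translator from tags such as account, login and fight into an ordered list of execution instructions might look like:

```python
import xml.etree.ElementTree as ET

# Hypothetical sketch: the description file format below is an assumed
# XML shape, not the patent's actual schema. Each child tag becomes one
# execution instruction carrying its attributes.

DESCRIPTION = """
<testcase>
  <account num="2"/>
  <login/>
  <fight skill="fireball"/>
</testcase>
"""

def translate(xml_text):
    root = ET.fromstring(xml_text)
    return [{"op": tag.tag, **tag.attrib} for tag in root]

instructions = translate(DESCRIPTION)
```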
Step S203, an editing instruction is received through a set file editing interface, and the description file of the test case is generated according to the description standard information and the editing instruction.
The file editing interface may be the entry point of a single-function request protocol for text editing. The editing operation on the description file corresponding to the test case can be realized through this interface. In the case that the type of the description file generation instruction is a writing instruction, the pre-written description standard information is called, and the description file corresponding to the test case is written according to the normative description tags and writing format indicated in the description standard information. Optionally, after the description file of the test case is generated, it may be stored in a description file library; the stored description files can be classified and managed by the library and later retrieved from the description file records.
According to the embodiment of the application, in the case that the type of the description file generation instruction is a writing instruction, the description standard information of the description file is called, the description standard information including the specified description tags and the description file writing format; an editing instruction is received through the set file editing interface, and the description file of the test case is generated according to the description standard information and the editing instruction. Calling the pre-written description standard information provides a semantically normalized description standard, so that the test system can quickly determine the required test commands and those commands better match the scenario of the actual test project; calling the file editing interface allows the description file of the test case to be generated automatically, which greatly improves the efficiency of generating description files and, in turn, the efficiency of testing the whole project.
In one embodiment, in the case that the description file generation instruction is a nested call instruction, generating the description file of the test case according to the type of the description file generation instruction includes:
And calling a nested call tag, calling a plurality of description files in the description file library through the nested call tag, and combining the plurality of description files to generate the description file of the test case.
The nested call tag may be a tool used to call description files. The description file library may be a database dedicated to storing description files. When the test case is a complex, multi-function test case, a tester can generate its description file by sending a nested call instruction. After the automated test system receives the nested call request, it searches the description file library by classification for the required description files, obtains the search results, calls the corresponding description files according to those results, and freely combines all the called description files to generate the description file of the more complex test case.
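A minimal sketch of this nested-call combination, assuming the description file library can be modeled as a name-to-content mapping and that combining simply concatenates the called files under one root; the file names and contents are invented for illustration.

```python
# Hypothetical description file library: names mapped to stored fragments.
DESCRIPTION_LIBRARY = {
    "login": "<message>LoginReq</message>",
    "enter_scene": "<message>EnterSceneReq</message>",
}

def expand_nested_calls(call_names):
    """Look up each named description file in the library and combine
    them into the description file of a compound test case."""
    parts = []
    for name in call_names:
        if name not in DESCRIPTION_LIBRARY:
            raise KeyError(f"description file '{name}' not found in library")
        parts.append(DESCRIPTION_LIBRARY[name])
    return "<case>" + "".join(parts) + "</case>"

# A nested call instruction naming two existing cases yields a combined case.
combined = expand_nested_calls(["login", "enter_scene"])
```

Because the already-tested fragments are reused verbatim, only the combination, not the fragments themselves, has to be written for the new case.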
In the case that the description file generation instruction is a recording instruction, generating a description file of the test case according to the type of the description file generation instruction, including:
And under the condition of receiving the recording instruction, a recording interface is called, the operation behavior is recorded through the recording interface, and a description file of the test case is generated.
The recording interface may be an interface that provides a recording function. After logging in to a game, a tester can use a game GM (Game Master) command to call the recording interface provided by the automated test system and start recording. The automated test system then records all operations of the tester, or only operations within a designated range; calling the corresponding recording interface again stops the recording. The instructions recorded in this process constitute code, and the test parameters and test counts can be modified to quickly obtain the required test case description file.
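The start/record/stop cycle can be sketched as below. The class and method names, and the shape of a recorded operation, are assumptions; the patent only fixes the behavior: recording is toggled through the interface, and the recorded instructions can be edited afterwards.

```python
class Recorder:
    """Sketch of the recording interface called via an in-game GM command."""

    def __init__(self):
        self._recording = False
        self._ops = []

    def start(self):
        # First call to the recording interface: begin capturing operations.
        self._recording = True
        self._ops.clear()

    def capture(self, op, **params):
        # Every tester operation (or only those in a designated range)
        # passes through here while recording is active.
        if self._recording:
            self._ops.append((op, dict(params)))

    def stop(self):
        # Second call to the recording interface: stop and return the
        # recorded instructions, which form the case description file.
        self._recording = False
        return list(self._ops)

rec = Recorder()
rec.start()
rec.capture("move", direction="north", distance=10)
rec.capture("attack", target="dummy")
script = rec.stop()

# Test parameters can be modified after recording to derive a new case.
script[0] = ("move", {"direction": "north", "distance": 50})
```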
According to the embodiment of the application, the nested call tag is called, a plurality of description files are called from the description file library through the nested call tag, and the plurality of description files are combined to generate the description file of the test case. This provides a test behavior arrangement function: existing description files can be freely combined, so that description files for already-tested functions need not be rewritten and test cases are reused. In addition, in the case that a recording instruction is received, the recording interface is called, the operation behavior is recorded through the recording interface, and the description file of the test case is generated, realizing a powerful automated test function and further improving the test efficiency of the project.
Fig. 3 is a flowchart of a test instruction generating method according to an embodiment of the present application, where as shown in fig. 3, the method specifically includes:
Step S301, a test case is obtained and a description file generation instruction is received, and a description file of the test case is generated according to the type of the description file generation instruction, where the type of the description file generation instruction includes one or more of a writing instruction, a nested calling instruction, or a recording instruction.
Step S302, a test case is called through connection with a first test interface, protocol parameters of the test case are determined based on the description file, a first test instruction is generated according to the protocol parameters, and the protocol parameters comprise monitored protocol identification parameters and protocol operation mode parameters.
In one embodiment, the first test interface may be an automated test interface for protocol monitoring, i.e., a protocol monitoring interface. The protocol parameters may be all data in the description file concerning the monitored protocol. Optionally, the protocol parameters include a protocol identification parameter and a protocol operation mode parameter. The protocol identification parameter may indicate the protocol to be monitored, and the protocol operation mode parameter may identify the specific operation to be performed on the monitored protocol; for example, the protocol to be monitored may be specified by a message tag, the operation to be performed after the protocol is monitored may be specified by a write tag, and the protocol content may be intercepted and modified through a modification attribute. The first test instruction may represent a test of a specific behavior of the monitored protocol. Since the description file only describes the corresponding behaviors through tags and cannot express how each behavior is executed, the specific execution mode is obtained by processing the protocol parameters read from the description file, yielding the first test instruction.
In one embodiment, generating the first test instruction according to the protocol parameter includes: and assembling the monitored protocol identification parameter and the protocol operation mode parameter to generate a first test instruction.
The test case is called by connecting the protocol monitoring interface; the protocol to be monitored and the operation to be executed after the protocol is monitored are determined according to the tag-specified parameters in the description file, or the protocol content is intercepted and modified according to the actually received instruction. The specific execution mode for the protocol, i.e., the first test instruction, is obtained by assembling the obtained protocol identification parameter and protocol operation mode parameter.
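The assembly step can be sketched as follows; the field names (`listen`, `on_match`) and the dictionary representation of an instruction are assumptions, since the embodiment does not define a concrete instruction format.

```python
def assemble_first_instruction(protocol_id, operation_mode):
    """Combine the monitored protocol identification parameter with the
    protocol operation mode parameter into one executable first test
    instruction."""
    return {
        "listen": protocol_id,       # which protocol to monitor
        "on_match": operation_mode,  # what to do once it is seen
    }

# Example: once LoginAck is observed, intercept it and modify one field.
instr = assemble_first_instruction(
    "LoginAck",
    {"action": "modify", "field": "level", "value": 99},
)
```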
Step S303, calling the test case by connecting a second test interface, determining instruction parameters of the test case based on the description file, and generating a second test instruction according to the instruction parameters, wherein the instruction parameters comprise at least one or more of acquisition parameters, movement parameters or fight parameters, and the second test instruction comprises a chain test instruction or a tree test instruction.
The second test interface may be an automated test interface for behavior instructions, i.e., a behavior instruction interface. The instruction parameters may include at least one of an acquisition parameter, a movement parameter, or a combat parameter. The acquisition parameters may include the number of items collected in the game scene; the movement parameters may include the movement direction, movement distance, and the like of the character in the game; the combat parameters may include the blood volume of the character, the status of the character, and the like. An acquisition instruction is written according to the acquisition parameters specified in the description file, a movement instruction is written through the specified movement parameters, and a combat instruction is written through the specified combat parameters; the execution order of at least one of the acquisition instruction, the movement instruction, or the combat instruction is set by writing a chain test instruction or a tree test instruction. Fig. 4 is a schematic diagram of a chain structure and a tree structure provided in an embodiment of the present application. As shown in Fig. 4, a movement instruction is written so that the character in the game can move in the scene, and the user can control behavior parameters such as the movement direction and movement distance of the character through custom definition; after the movement instruction is executed, the next behavior instruction, such as a combat instruction or an acquisition instruction, can be found along the chain structure. The tree structure can additionally carry instruction conditions: after the previous instruction is completed, a condition judgment realizes a certain degree of AI simulation.
For example, after the movement instruction is executed, the objects existing nearby are judged: the acquisition instruction is executed if a collectible object exists, and the combat instruction is executed otherwise.
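The two structures can be sketched as below. The node layout is an assumption; the patent only fixes the semantics, namely that a chain runs instructions strictly in sequence while a tree adds a condition that selects the next branch.

```python
def run_chain(instructions, world):
    """Chain test instruction: execute behavior instructions in order."""
    for instruction in instructions:
        instruction(world)

def run_tree(node, world):
    """Tree test instruction. A node is assumed to be a tuple of
    (instruction, condition, true_branch, false_branch)."""
    instruction, condition, on_true, on_false = node
    instruction(world)
    branch = on_true if condition(world) else on_false
    if branch is not None:
        run_tree(branch, world)

# Illustrative behavior instructions acting on a tiny game-world state.
def move(world):    world["position"] += 10
def gather(world):  world["gathered"] += 1
def fight(world):   world["fights"] += 1

world = {"position": 0, "gathered": 0, "fights": 0, "nearby": "collectible"}

# Chain: move, then gather, executed strictly in order.
run_chain([move, gather], world)

# Tree: after moving, gather if a collectible is nearby, otherwise fight.
tree = (move,
        lambda w: w["nearby"] == "collectible",
        (gather, lambda w: False, None, None),
        (fight, lambda w: False, None, None))
run_tree(tree, world)
```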
Step S304, executing the test instruction corresponding to the test case according to the determined execution strategy, and outputting a visual test result.
According to the embodiment of the application, the test case is called by connecting the first test interface, the protocol parameters of the test case are determined based on the description file, and the first test instruction is generated according to the protocol parameters; the test case is called by connecting the second test interface, the instruction parameters of the test case are determined based on the description file, and the second test instruction is generated according to the instruction parameters. The test instruction of the target test object can thus be obtained directly by calling the corresponding interface. Providing an instruction interface makes it convenient to realize various in-game behaviors without controlling them through the protocol. A plurality of behavior instructions can be executed sequentially through a chain structure, while a tree structure adapts to conditional branch execution. The tree test instruction is also suitable for stress testing of multiple dungeon instances, simulating the real operations of different players in an online environment to find whether an instance has abnormal CPU resource consumption points, concurrency problems, and the like.
Fig. 5 is a flowchart of another automated integrated testing method according to an embodiment of the present application, as shown in fig. 5, specifically including:
Step S501, a test case acquisition request is sent; in the case that the acquisition request cannot be responded to, a project configuration table is called and a description file of the test case is generated according to the type of the received description file generation instruction.
In one embodiment, the case acquisition request may be a request sent by the automated test system to the development management system to acquire a test case. In game project testing, the previous operation often needs to be completed, or completed to a certain extent, before the next operation can proceed; if real data is used, a large number of prerequisite front-end operations must be finished first, so when those operations are not completed, the development management system cannot respond to the test case acquisition request sent by the automated test system. The project configuration table may be a configuration table commonly used in game projects and can describe the game logic to be executed, such as an account table defining the game accounts to be loaded and executed, a message table defining the game interaction protocols to be simulated, and an action table defining the various test behaviors to be executed. When the real test case cannot be acquired, the configuration table can be called, and the description file of the test case can be generated according to the type of the received description file generation instruction. The specific implementation of generating the description file according to the type of the received description file generation instruction is the same as that of step S101 and is not repeated here.
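An illustrative project configuration table, modeled here as plain dictionaries. The three tables follow the examples named in the text (account, message, action); every field name and value is an assumption added for the sketch.

```python
# Hypothetical project configuration table describing the game logic to run.
PROJECT_CONFIG = {
    "account_table": [{"account": "robot_001", "level": 10}],
    "message_table": [{"proto": "LoginReq"}, {"proto": "EnterSceneReq"}],
    "action_table":  [{"behavior": "move"}, {"behavior": "gather"}],
}

def load_config(table_name):
    """Retrieve one configuration table when the real test case cannot
    be acquired from the development management system."""
    if table_name not in PROJECT_CONFIG:
        raise KeyError(f"no such configuration table: {table_name}")
    return PROJECT_CONFIG[table_name]

accounts = load_config("account_table")
```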
Step S502, a data simulation interface is connected, a target input object is simulated in the data simulation interface according to the project configuration table and the description file, and the simulated target input object is assembled to generate a simulation test instruction.
The data simulation interface can simulate the various target parameter objects that need to be input. The simulated target input object may be a target parameter to be tested, such as an account number data field, the reading range of the field, or a custom input data range. In the case that the development management system cannot respond to the test case acquisition request sent by the automated test system, the project configuration table is called, the data simulation interface is connected, and the description file of the simulation test case is generated; the target input object is read from the configuration table in the data simulation interface according to the description in the description file, and the read target input objects are assembled to obtain the simulation test instruction required by the simulation test case.
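A sketch of this read-override-assemble path; the table contents, the override mechanism for custom input data ranges, and the instruction shape are all assumptions.

```python
# Hypothetical slice of the configuration table holding input fields.
CONFIG_TABLE = {
    "account": {"id": "robot_001", "level_range": (1, 60)},
}

def simulate_inputs(requested_fields, overrides=None):
    """Read each target input object named by the description file from
    the configuration table, letting tester-defined custom ranges win."""
    simulated = {f: dict(CONFIG_TABLE[f]) for f in requested_fields}
    for field, values in (overrides or {}).items():
        simulated[field].update(values)
    return simulated

def assemble_simulated_instruction(simulated):
    """Assemble the simulated target input objects into one instruction."""
    return {"type": "simulated_test", "inputs": simulated}

inputs = simulate_inputs(["account"], {"account": {"level_range": (10, 20)}})
sim_instr = assemble_simulated_instruction(inputs)
```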
Step S503, determining an execution thread parameter and an execution account parameter of the simulation test instruction, and determining a single-thread execution policy or a multi-thread execution policy according to the execution thread parameter and the execution account parameter.
Step S504, executing the simulation test instruction according to the single-thread execution strategy or the multi-thread execution strategy, and outputting a visual test result.
The processing manner of the simulation test instruction in step S503 and step S504 is the same as that of the test instruction in step S102 and step S103, and will not be repeated here.
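The choice between the single-thread and multi-thread execution strategies can be sketched as follows. The decision rule (more than one thread or more than one account selects the multi-thread strategy) is an assumption; the patent only states that the strategy is determined from the execution thread parameter and the execution account parameter.

```python
import threading

def choose_strategy(thread_count, account_count):
    """Pick an execution strategy from the two execution parameters."""
    return "multi" if thread_count > 1 or account_count > 1 else "single"

def execute(instruction, thread_count, accounts, results):
    """Run one (simulated) test instruction for every account under the
    chosen strategy, collecting outcomes into `results`."""
    strategy = choose_strategy(thread_count, len(accounts))
    if strategy == "single":
        for account in accounts:
            results.append(instruction(account))
    else:
        threads = [
            threading.Thread(target=lambda a=acc: results.append(instruction(a)))
            for acc in accounts
        ]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
    return strategy

results = []
strategy = execute(lambda account: f"{account}:ok", thread_count=4,
                   accounts=["robot_001", "robot_002"], results=results)
```

In CPython, `list.append` is atomic, so the shared `results` list needs no extra lock for this sketch; a real system would likely aggregate per-thread results instead.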
According to the embodiment of the application, a test case acquisition request is sent; in the case that the acquisition request cannot be responded to, the project configuration table is called and the description file of the test case is generated according to the type of the received description file generation instruction; the data simulation interface is connected, the target input object is simulated in the data simulation interface according to the project configuration table and the description file, and the simulated target input object is assembled to generate the simulation test instruction. The situation in which the target test case can only be obtained after a large number of front-end operations are completed is thereby avoided, improving test efficiency.
Fig. 6 is a schematic structural diagram of an automated integrated testing device according to an embodiment of the present application, as shown in fig. 6, specifically including:
The description file generation module 61 is configured to obtain a test case and receive a description file generation instruction, and generate a description file of the test case according to a type of the description file generation instruction, where the type of the description file generation instruction includes at least one or more of a writing instruction, a nested calling instruction, or a recording instruction;
The test case retrieving module 62 is configured to separately retrieve the test cases by connecting different test interfaces;
A test instruction generating module 63, configured to generate a test instruction corresponding to the test case based on the description file, and determine an execution thread parameter and an execution account parameter of the test instruction;
An execution policy determining module 64, configured to determine a single-thread execution policy or a multi-thread execution policy according to the execution thread parameter and the execution account parameter;
the test module 65 is configured to execute a test instruction corresponding to the test case according to the single-thread execution policy or the multi-thread execution policy;
the test result output module 66 is configured to output a visual test result.
According to the embodiment of the application, a test case is obtained and a description file generation instruction is received; a description file of the test case is generated according to the type of the description file generation instruction, the type including one or more of a writing instruction, a nested calling instruction, or a recording instruction; the test case is respectively called by connecting different test interfaces; the test instruction corresponding to the test case is generated based on the description file; the execution thread parameter and execution account parameter of the test instruction are determined; and the execution strategy is determined according to these parameters. The corresponding interface can be called according to different test requirements, with a matching execution strategy, ensuring a tight correlation between the test mode and the test project and improving test coverage. The description file of the test case can be generated in various modes according to the specific test requirements, so that a test description file closely related to the test project is obtained; the efficiency of generating description files is ensured, and the test efficiency of the whole project is improved.
In one possible embodiment, the description standard information obtaining module is configured to invoke description standard information of the description file, where the description standard information includes a specified description tag and a description file writing format.
The description file generating module 61 is specifically configured to receive an editing instruction through a set file editing interface, and generate a description file of the test case according to the description standard information and the editing instruction.
In one possible embodiment, the description file generating module 61 is specifically configured to call a nested call tag, call a plurality of description files in the description file library through the nested call tag, and combine the plurality of description files to generate a description file of the test case;
the description file generating module 61 is specifically configured to invoke a recording interface when receiving a recording instruction, record an operation behavior through the recording interface, and generate a description file of the test case.
In one possible embodiment, the test case retrieving module 62 is specifically configured to retrieve the test case by connecting to the first test interface;
The test instruction generating module 63 is specifically configured to determine a protocol parameter of the test case based on the description file, and generate a first test instruction according to the protocol parameter, where the protocol parameter includes a monitored protocol identification parameter and a protocol operation mode parameter;
the test case retrieving module 62 is specifically configured to retrieve the test case by connecting with a second test interface;
The test instruction generating module 63 is specifically configured to determine an instruction parameter of the test case based on the description file, and generate a second test instruction according to the instruction parameter, where the instruction parameter includes at least one or more of an acquisition parameter, a movement parameter, or a combat parameter, and the second test instruction includes a chain test instruction or a tree test instruction.
In a possible embodiment, the test instruction generating module 63 is specifically configured to assemble the monitored protocol identification parameter and the protocol operation mode parameter to generate the first test instruction;
and determining an instruction execution sequence according to the instruction parameters, and generating a chain test instruction or a tree test instruction based on the instruction execution sequence.
In one possible embodiment, the project configuration table retrieving module is configured to send a test case acquisition request, and retrieve the project configuration table if the acquisition request cannot be responded;
the description file generating module 61 is configured to generate a description file of the test case according to the type of the received description file generating instruction;
the test instruction generating module 63 is configured to connect to a data simulation interface, simulate a target input object in the simulation data interface according to the project configuration table and the description file, and assemble the simulated target input object to generate a simulated test instruction;
The execution policy determining module 64 is configured to determine an execution thread parameter and an execution account parameter of the simulated test instruction, and determine a single-thread execution policy or a multi-thread execution policy according to the execution thread parameter and the execution account parameter;
The test module 65 is configured to execute the simulated test instruction according to the single-threaded execution policy or the multi-threaded execution policy;
the test result output module 66 is used for outputting visual test results.
The embodiment of the application also provides automated integrated test equipment, which can integrate the automated integrated test device provided by the embodiment of the application. Fig. 7 is a schematic structural diagram of automated integrated test equipment according to an embodiment of the present application. Referring to Fig. 7, the automated integrated test equipment includes: an input device 73, an output device 74, a memory 72, and one or more processors 71; the memory 72 is used to store one or more programs; and the one or more programs, when executed by the one or more processors 71, cause the one or more processors 71 to implement the automated integrated test method provided by the above embodiments. The input device 73, the output device 74, the memory 72, and the processor 71 may be connected by a bus or otherwise; connection by a bus is taken as the example in Fig. 7.
The memory 72 is a computer readable storage medium that can be used to store software programs, computer executable programs, and modules, such as program instructions/modules corresponding to the automated integrated test method provided by any of the embodiments of the present application. The memory 72 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, at least one application program required for functions; the storage data area may store data created according to the use of the device, etc. In addition, memory 72 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some examples, memory 72 may further include memory located remotely from processor 71, which may be connected to the device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input means 73 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the device. The output device 74 may include a display device such as a display screen.
The processor 71 executes various functional applications of the device and data processing by running software programs, instructions and modules stored in the memory 72, i.e. implements the automated integrated test method described above.
The automated integrated test device and equipment provided by the above embodiments can be used to execute the automated integrated test method provided by any embodiment, and have the corresponding functions and beneficial effects.
The embodiment of the present application also provides a storage medium storing computer executable instructions, which when executed by a computer processor, are configured to perform the automated integrated test method provided in the embodiment, the automated integrated test method comprising: the method comprises the steps of obtaining a test case and receiving a description file generation instruction, and generating a description file of the test case according to the type of the description file generation instruction, wherein the type of the description file generation instruction comprises at least one or more of a writing instruction, a nested calling instruction or a recording instruction; respectively calling test cases by connecting different test interfaces, generating test instructions corresponding to the test cases based on the description files, determining execution thread parameters and execution account parameters of the test instructions, and determining an execution strategy according to the execution thread parameters and the execution account parameters; executing the test instruction corresponding to the test case according to the determined execution strategy, and outputting a visual test result.
Storage medium: any of various types of memory devices or storage devices. The term "storage medium" is intended to include: installation media such as CD-ROM, floppy disk, or tape devices; computer system memory or random access memory such as DRAM, DDR RAM, SRAM, EDO RAM, Rambus RAM, etc.; non-volatile memory such as flash memory or magnetic media (e.g., a hard disk or optical storage); registers or other similar types of memory elements, etc. The storage medium may also include other types of memory or combinations thereof. In addition, the storage medium may be located in a first computer system in which the program is executed, or may be located in a second, different computer system connected to the first computer system through a network such as the internet. The second computer system may provide program instructions to the first computer for execution. The term "storage medium" may include two or more storage media that may reside in different locations (e.g., in different computer systems connected by a network). The storage medium may store program instructions (e.g., embodied as a computer program) executable by one or more processors.
Of course, the storage medium containing the computer executable instructions provided in the embodiments of the present application is not limited to the automated integrated test method described above, and may also perform the related operations in the automated integrated test method provided in any embodiment of the present application.
The automated integrated test apparatus, the device and the storage medium provided in the foregoing embodiments may perform the automated integrated test method provided in any embodiment of the present application, and technical details not described in detail in the foregoing embodiments may be referred to the automated integrated test method provided in any embodiment of the present application.
The foregoing description is only of the preferred embodiments of the application and the technical principles employed. The present application is not limited to the specific embodiments described herein, but is capable of numerous modifications, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the application. Therefore, while the application has been described in connection with the above embodiments, the application is not limited to the embodiments, but may be embodied in many other equivalent forms without departing from the spirit of the application, the scope of which is set forth in the following claims.

Claims (10)

1. An automated integrated test method, comprising:
acquiring a test case and receiving a description file generation instruction, and generating a description file of the test case according to the type of the description file generation instruction, wherein the type of the description file generation instruction comprises at least one or more of a writing instruction, a nested calling instruction or a recording instruction;
Respectively calling the test cases by connecting different test interfaces, generating a test instruction corresponding to the test case based on the description file, determining an execution thread parameter and an execution account parameter of the test instruction, and determining an execution strategy according to the execution thread parameter and the execution account parameter;
executing the test instruction corresponding to the test case according to the determined execution strategy, and outputting a visual test result.
2. The automated integrated test method of claim 1, wherein, in the case where the type of the description file generation instruction is a write instruction, before the generating the description file of the test case according to the type of the description file generation instruction, further comprising:
Invoking description standard information of the description file, wherein the description standard information comprises a designated description tag and a description file writing format;
Correspondingly, the generating the description file of the test case according to the type of the description file generating instruction includes:
and receiving an editing instruction through a set file editing interface, and generating a description file of the test case according to the description standard information and the editing instruction.
3. The automated integrated test method of claim 1, wherein, in the case that the description file generation instruction is a nested call instruction, the generating the description file of the test case according to the type of the description file generation instruction includes:
Calling a nested call tag, calling a plurality of description files in a description file library through the nested call tag, and combining the plurality of description files to generate the description file of the test case;
And generating the description file of the test case according to the type of the description file generation instruction under the condition that the description file generation instruction is a recording instruction, wherein the description file generation instruction comprises the following steps:
And under the condition of receiving the recording instruction, a recording interface is called, the operation behavior is recorded through the recording interface, and the description file of the test case is generated.
4. The automated integrated testing method of claim 1, wherein the generating the test instruction corresponding to the test case based on the description file by respectively calling the test case through connecting different test interfaces includes:
Calling the test case through connecting a first test interface, determining protocol parameters of the test case based on the description file, and generating a first test instruction according to the protocol parameters, wherein the protocol parameters comprise monitored protocol identification parameters and protocol operation mode parameters;
And calling the test case through connecting a second test interface, determining instruction parameters of the test case based on the description file, and generating a second test instruction according to the instruction parameters, wherein the instruction parameters comprise at least one or more of acquisition parameters, movement parameters or fight parameters, and the second test instruction comprises a chain test instruction or a tree test instruction.
5. The automated integrated test method of claim 4, wherein generating the first test instruction according to the protocol parameters comprises:
assembling the monitored protocol identification parameter and the protocol operation mode parameter to generate the first test instruction;
and wherein generating the second test instruction according to the instruction parameters comprises:
determining an instruction execution order according to the instruction parameters, and generating the chain test instruction or the tree test instruction based on the instruction execution order.
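One plausible reading of the chain-instruction branch of claim 5 is sketched below; the parameter fields (`order`, `action`) and the linked-dict representation of the chain are assumptions for illustration only:

```python
# Sketch (not the patented implementation) of building a chain test
# instruction: instruction parameters carry an execution order, and the
# instructions are linked one to the next in that order.

def build_chain(instruction_params: list) -> dict:
    """Sort instructions by their 'order' field and link each one to the
    next, returning the head of the resulting chain test instruction."""
    ordered = sorted(instruction_params, key=lambda p: p["order"])
    head = None
    for param in reversed(ordered):           # build back-to-front
        head = {"action": param["action"], "next": head}
    return head
```

A tree test instruction would generalize this by allowing each node to hold a list of children instead of a single `next` link.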
6. The automated integrated test method of claim 1, further comprising, before the acquiring of the test case:
sending a test case acquisition request and, when the acquisition request receives no response, invoking a project configuration table and generating the description file of the test case according to the type of the received description file generation instruction;
connecting a data simulation interface, simulating a target input object in the data simulation interface according to the project configuration table and the description file, and assembling the simulated target input object to generate a simulation test instruction;
determining an execution thread parameter and an execution account parameter of the simulation test instruction, and determining a single-thread execution strategy or a multi-thread execution strategy according to the execution thread parameter and the execution account parameter; and
executing the simulation test instruction according to the single-thread execution strategy or the multi-thread execution strategy, and outputting a visual test result.
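The strategy selection in claim 6 could be realized along the following lines, using Python's standard thread pool; the selection rule, the result format, and all names are assumptions, not the claimed method:

```python
# Hypothetical sketch: choose a single- or multi-thread execution
# strategy from the execution thread parameter and the number of test
# accounts, then run the simulation test instructions per account.
from concurrent.futures import ThreadPoolExecutor

def execute(instructions: list, thread_count: int, accounts: list) -> list:
    def run(account: str) -> str:
        # Placeholder for actually sending the simulation test instructions.
        return f"{account}:{len(instructions)} instructions ok"

    if thread_count <= 1 or len(accounts) <= 1:
        return [run(a) for a in accounts]            # single-thread strategy
    with ThreadPoolExecutor(max_workers=thread_count) as pool:
        return list(pool.map(run, accounts))         # multi-thread strategy
```

Running one account per worker thread is a natural fit here, since each test account can execute its instruction stream independently.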
7. The automated integrated test method of claim 1, further comprising, after the generating of the visual test result:
computing a test coverage rate of the test result through a coverage plug-in, generating a coverage report, and optimizing the program script for uncovered code branches according to the coverage report.
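A toy stand-in for the coverage step of claim 7 is shown below. A real system would rely on an instrumentation plug-in to collect executed branches; the branch identifiers and report shape here are invented:

```python
# Hypothetical sketch of claim 7's coverage report: compare the set of
# all code branches against those covered by the test run, yielding the
# coverage rate and the uncovered branches to optimize.

def coverage_report(all_branches: list, covered_branches: list) -> dict:
    uncovered = sorted(set(all_branches) - set(covered_branches))
    rate = len(set(covered_branches) & set(all_branches)) / len(set(all_branches))
    return {"rate": rate, "uncovered": uncovered}
```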
8. An automated integrated test apparatus, comprising:
a description file generation module, configured to acquire a test case, receive a description file generation instruction, and generate a description file of the test case according to the type of the description file generation instruction, wherein the type of the description file generation instruction comprises at least one of a writing instruction, a nested call instruction, or a recording instruction;
a test case calling module, configured to invoke the test case through different test interfaces;
a test instruction generation module, configured to generate a test instruction corresponding to the test case based on the description file and determine an execution thread parameter and an execution account parameter of the test instruction;
an execution strategy determining module, configured to determine a single-thread execution strategy or a multi-thread execution strategy according to the execution thread parameter and the execution account parameter;
a test module, configured to execute the test instruction corresponding to the test case according to the single-thread execution strategy or the multi-thread execution strategy; and
a test result output module, configured to output a visual test result.
9. An automated integrated test device, comprising: one or more processors; and a storage apparatus configured to store one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the automated integrated test method of any one of claims 1-7.
10. A storage medium storing computer-executable instructions which, when executed by a computer processor, perform the automated integrated test method of any one of claims 1-7.
CN202311723133.7A 2023-12-13 2023-12-13 Automatic integrated test method, device, equipment and storage medium Pending CN117931632A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311723133.7A CN117931632A (en) 2023-12-13 2023-12-13 Automatic integrated test method, device, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN117931632A true CN117931632A (en) 2024-04-26

Family

ID=90768985


Country Status (1)

Country Link
CN (1) CN117931632A (en)

Similar Documents

Publication Publication Date Title
US9465718B2 (en) Filter generation for load testing managed environments
US6986125B2 (en) Method and apparatus for testing and evaluating a software component using an abstraction matrix
US6941546B2 (en) Method and apparatus for testing a software component using an abstraction matrix
CN108763076A (en) A kind of Software Automatic Testing Method, device, equipment and medium
US20130263090A1 (en) System and method for automated testing
CN110013672B (en) Method, device, apparatus and computer-readable storage medium for automated testing of machine-run games
CN111124919A (en) User interface testing method, device, equipment and storage medium
CN110362490B (en) Automatic testing method and system for integrating iOS and Android mobile applications
CN112433948A (en) Simulation test system and method based on network data analysis
CN115658529A (en) Automatic testing method for user page and related equipment
US8850407B2 (en) Test script generation
CN112860587B (en) UI automatic test method and device
CN114297961A (en) Chip test case processing method and related device
CN117134986A (en) Method, system and device for generating external network honey point based on ChatGPT
CN116719736A (en) Test case generation method and device for testing software interface
CN116185826A (en) Test method, device, equipment and storage medium
CN117931632A (en) Automatic integrated test method, device, equipment and storage medium
CN113986263A (en) Code automation test method, device, electronic equipment and storage medium
CN113986753A (en) Interface test method, device, equipment and storage medium
CN109669868A (en) The method and system of software test
CN114968687B (en) Traversal test method, apparatus, electronic device, program product, and storage medium
CN118227506B (en) UI (user interface) automatic test system and method based on RPA (remote procedure alliance) robot
CN118503077A (en) Game testing method, device, equipment and storage medium
CN118626390A (en) White box component testing method, white box component testing equipment and readable storage medium
CN118227441A (en) Test system and test method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination