CN110765026A - Automatic testing method and device, storage medium and equipment - Google Patents


Info

Publication number
CN110765026A
CN110765026A (application CN201911053638.0A)
Authority
CN
China
Prior art keywords: test, target, test case, target container, cases
Prior art date
Legal status: Granted
Application number
CN201911053638.0A
Other languages
Chinese (zh)
Other versions
CN110765026B (en)
Inventor
张乐源
Current Assignee
Beijing Neusoft Wang Hai Technology Co Ltd
Original Assignee
Beijing Neusoft Wang Hai Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Neusoft Wang Hai Technology Co Ltd
Priority claimed from CN201911053638.0A
Publication of CN110765026A
Application granted; publication of CN110765026B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/36 - Preventing errors by testing or debugging software
    • G06F 11/3668 - Software testing
    • G06F 11/3672 - Test management
    • G06F 11/3688 - Test management for test execution, e.g. scheduling of test suites
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 - Multiprogramming arrangements
    • G06F 9/50 - Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F 9/5083 - Techniques for rebalancing the load in a distributed system
    • G06F 9/5088 - Techniques for rebalancing the load in a distributed system involving task migration
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The present application relates to the field of automated testing, and in particular to an automated testing method, apparatus, storage medium, and device. The method includes: acquiring a plurality of test cases according to preset test requirements; creating, on a preset distributed system, a plurality of target containers for executing the test cases; and matching the data size of each test case with the processing performance of the target containers, then distributing each test case to its matched target container for execution. By constructing multiple target containers and distributing the test cases to the containers matched with them, the scheme executes multiple test cases in parallel on the same host, improving both resource utilization and automated-testing efficiency.

Description

Automatic testing method and device, storage medium and equipment
Technical Field
The present application relates to the field of automated testing technologies, and in particular, to an automated testing method, an automated testing apparatus, a storage medium, and a device.
Background
With the continuous development of the software industry, newly developed software has become increasingly complex and feature-rich, and software testing has grown correspondingly complex. To ensure test quality, a large number of test cases usually have to be designed and written, and the written test cases are then used for functional or performance testing of the software; the traditional approach of running these test cases manually is inefficient.
At present, test cases are commonly run through automated testing, but such testing typically runs single-threaded on one machine: a single process executes the whole test case set, so the test time grows with the number of cases and the complexity of the flow. As a rule of thumb, once an interface test case set exceeds a few hundred cases, the test time exceeds ten minutes. For projects with fast version-iteration cycles, for example agile development projects, the automated cases accumulate continuously, so the number of test cases keeps growing with each round of testing; running all of them under this scheme consumes a large amount of time, and test efficiency is low.
Disclosure of Invention
The application provides an automated testing method, an automated testing apparatus, a computer-readable storage medium, and computer equipment, so as to improve the efficiency of the automated testing process.
The embodiment of the application firstly provides an automatic testing method, which comprises the following steps:
acquiring a plurality of test cases according to preset test requirements;
creating a plurality of target containers for executing the test cases on a preset distributed system;
and matching the data size of the test case with the processing performance of the target container, and distributing the test case to the target container matched with the test case for operation.
In one embodiment, the step of matching the processing performance of the target container according to the data size of the test case includes:
dividing a plurality of test cases into batches according to the number of the target containers, wherein the number of the test cases in each batch is the same as that of the target containers;
acquiring the data size of each test case for each batch, and sequencing the test cases according to the data size;
calling a sequencing result for sequencing the target containers in advance according to the processing performance of the target containers for each batch;
and matching the test cases with the same sequence with the target container for each batch.
In one embodiment, the step of creating a plurality of target containers for executing the test cases on a preset distributed system includes:
acquiring node information of a distributed system where a target container is located;
reading the system resource occupancy rate of each node on the distributed system according to the node information;
determining the processing performance of a target container corresponding to each node according to the system resource occupancy rate of each node;
and generating a configuration file corresponding to each node according to the test requirement and the processing performance, and creating a target container according to the configuration file.
In one embodiment, after the step of allocating the test cases to the target containers matched with the test cases for running, the method further includes:
collecting system resource usage of each node to obtain the current system resource occupancy rate;
and correcting the processing performance of the target container according to the system resource occupancy rate.
In one embodiment, after the step of allocating the test cases to the target containers matched with the test cases for running, the method further includes:
acquiring a log file for recording the running process of the test case;
extracting a test result and a corresponding test case identifier from the log file;
and integrating the test result and the test case identification to generate a test report and sending the test report.
In one embodiment, after the step of generating the test report, the method further includes:
when it is detected that a test result in the test report exceeds a preset standard reference range, determining the test case identifier associated with that test result;
executing the target test case corresponding to the test case identifier a second time to obtain a secondary test result;
and determining, according to the secondary test result, the exception information that caused the test result to be abnormal.
In one embodiment, the step of determining, according to the secondary test result, exception information causing an exception to the test result includes:
and when the secondary test result is also beyond the standard reference range, determining that the test case corresponding to the test result is abnormal.
Correspondingly, this application still provides an automatic testing arrangement, includes:
the test case obtaining module is used for obtaining a plurality of test cases according to preset test requirements;
the system comprises a creating target container module, a testing case executing module and a processing module, wherein the creating target container module is used for creating a plurality of target containers for executing the testing cases on a preset distributed system;
and the distribution operation module is used for matching the data size of the test case with the processing performance of the target container and distributing the test case to the target container matched with the test case for operation.
Further, an embodiment of the present application also provides a computer-readable storage medium storing computer instructions which, when run on a computer, cause the computer to execute the steps of the automated testing method according to any one of the above technical solutions.
Still further, an embodiment of the present application further provides a computer device, where the computer device includes:
one or more processors;
a storage device for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors implement the steps of the automated testing method according to any one of the above-mentioned technical solutions.
Compared with the prior art, the scheme provided by the application at least has the following advantages:
according to the automatic test method, the corresponding test cases are obtained according to test requirements, then a plurality of target containers used for executing the test cases are created on a distributed system, the target containers are used for processing the test cases matched with the processing performance, and compared with the traditional single-thread test case execution scheme, the purpose of executing the test cases on the same host in parallel is achieved by constructing the target containers, and the resource utilization rate and the test efficiency of the test cases are improved.
In addition, matching the data size of each test case with the processing performance of the target containers, so that cases with larger data volumes are assigned to containers with stronger processing performance, improves the execution efficiency of individual test cases and thereby the efficiency of the whole test process.
Drawings
FIG. 1 is a diagram of an exemplary environment for implementing an automated testing method according to an embodiment of the present disclosure;
FIG. 2 is a flow chart of an automated testing method provided by an embodiment of the present application;
FIG. 3 is a flowchart illustrating creating a plurality of target containers for executing the test cases on a predetermined distributed system according to an embodiment of the present application;
FIG. 4 is a flowchart illustrating matching processing performance of a target container according to the data size of the test case according to an embodiment of the present application;
FIG. 5 is a flow chart of data analysis of a test report provided by an embodiment of the present application;
FIG. 6 is a schematic structural diagram of an automated testing apparatus according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary only for the purpose of explaining the present application and are not to be construed as limiting the present application.
Fig. 1 is a diagram of an implementation environment of an automated testing method according to an embodiment, where the implementation environment includes a user terminal and a server, and a distributed system disposed at the user terminal or the server, where the distributed system includes a plurality of nodes, and fig. 1 shows a case where the distributed system is disposed at the server, and the distributed system includes N nodes (N is an integer greater than 1).
With reference to the environment diagram provided in fig. 1, the scheme works as follows when used on the server side: a plurality of test cases meeting the test requirements are obtained according to those requirements, a plurality of target containers for executing the test cases are created on a preset distributed system, the data size of each test case is matched against the processing performance of the target containers, and each test case is distributed to its matched target container for operation. Running the test cases in parallel across the target containers improves the efficiency of the automated test.
The user terminal provided by the application can be a smart phone, a tablet computer, a notebook computer, a desktop computer, and the like, and the server can be any computer device with processing capability, but is not limited thereto. The server and the user terminal may be connected through Bluetooth, USB (Universal Serial Bus), or other communication methods, which are not limited herein.
Fig. 2 is a flowchart of an automated testing method according to an embodiment of the present application, including the following steps:
step S210, obtaining a plurality of test cases according to preset test requirements;
step S220, a plurality of target containers for executing test cases are created on a preset distributed system according to the test requirements;
and step S230, matching the data size of the test case with the processing performance of the target container, and distributing the test case to the target container matched with the test case for operation.
The server receives a test request for an application and parses it to obtain the test requirement. The test requirement targets a software development result in the software development process and can cover the whole program or a part of it, for example functional tests and performance tests. The method preferentially adopts regression testing: each new software version can be tested by regression, which greatly reduces the cost of the system-testing and maintenance/upgrade stages.
A test case is written according to the test requirement, or a test case previously stored for that requirement is called. Calling a previously stored test case comprises: loading the storage address of the test case in the distributed system, i.e. pointing the run address of the test case at the distributed system, so that the distributed system fetches the test case through its storage address.
After the test cases are obtained, they are started on a timer using a configuration file, for example via Jenkins. The configuration file can be set according to the actual situation; with a Jenkins configuration file the start-up of the test cases can be monitored dynamically, and the start period and start time can be set, so that a large number of test cases are started automatically.
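The patent delegates the timed start to Jenkins; as a toy illustration of a periodic trigger (not the Jenkins mechanism itself), the standard-library scheduler can fire the case set at configured times:

```python
import sched
import time

launched = []

def start_test_cases():
    # Stand-in for kicking off the test-case set (Jenkins does this via its
    # cron-style trigger configuration in the patent's setup).
    launched.append(time.time())

scheduler = sched.scheduler(time.time, time.sleep)
# Emulate a periodic start by scheduling two runs a short interval apart.
scheduler.enter(0.01, 1, start_test_cases)
scheduler.enter(0.02, 1, start_test_cases)
scheduler.run()  # blocks until both scheduled starts have fired
```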
The distributed system provided by the application comprises a plurality of nodes, which can be server nodes distributed across different hardware devices. Each server node allocates some data-processing capacity to automated testing and some to other applications, i.e. several applications run on one server node simultaneously, so the processing capacity available on different server nodes differs. For a target container built on a server node, its processing performance can either be preset to a fixed value on that node, or be set to the processing capacity currently available on the node.
The target containers provided by the application virtualize at the operating-system level: multiple target containers directly reuse the operating system of the local host during testing, so multiple test cases execute in parallel on the same host, which raises resource utilization, reduces cost, and eases management, fault tolerance, and disaster recovery.
After the target containers are created in step S220, the method further includes acquiring their processing performance. If a container's processing performance is the capacity currently available on its server node, the currently available system resources are determined from the node's currently occupied resources and its rated resources, i.e. the processing performance is derived from the remaining resource share. If the processing performance is a preset fixed value, different fixed values can be set for different target containers, improving the resource utilization of the different nodes in the distributed system.
The data size of each test case is then acquired and matched against the processing performance of the target containers, and test cases with larger data volumes are distributed to target containers with stronger processing performance for execution, so as to improve test efficiency.
According to the scheme provided by the application, the corresponding test cases are obtained according to the test requirements and their test process is started; a plurality of target containers for executing the test cases are then created on the distributed system, each processing the test cases matched to its processing performance. Compared with the traditional single-threaded execution scheme, constructing the target containers achieves parallel execution of the test cases on the same host and improves resource utilization and test efficiency.
In addition, matching the data size of each test case with the processing performance of the target containers, so that cases with larger data volumes are assigned to containers with stronger processing performance, improves the execution efficiency of individual test cases and thereby the efficiency of the whole test process.
To make the automated testing scheme provided by the present application and its technical effects clearer, its specific embodiments are described in detail below with reference to a number of examples.
In an embodiment, the step of creating a plurality of target containers for executing the test cases on the preset distributed system in step S220 may be performed in the following manner, and a flowchart thereof is shown in fig. 3, and includes:
s310, acquiring node information of a distributed system where the target container is located;
s320, reading the system resource occupancy rate of each node on the distributed system according to the node information;
s330, determining the processing performance of the target container corresponding to each node according to the system resource occupancy rate of each node;
s340, generating configuration files corresponding to the nodes according to the test requirements and the processing performance, and creating a target container according to the configuration files.
Before creating a plurality of target containers for executing test cases on a preset distributed system, a configuration file of the target container needs to be determined according to test requirements and node information of the distributed system.
The node information of the distributed system comprises system resource occupancy rates, available system resources of the target container are determined according to the current system resource occupancy rates of all the nodes, and then the processing performance of the target container is determined according to the available system resources.
The node information of the distributed system includes the number, location, and system resource occupancy of the nodes. The system resource occupancy of a node characterizes the resource share used by each application on the node, or the sum of the shares occupied by all applications. The processing performance of a target container is determined by one or more parameters such as CPU capacity, I/O interfaces, and external systems, and is proportional to the system resources allocated to the container. The container's system resources may be the resources remaining on the current node (the remainder being what is available to the container), or a fixed amount set based on that remainder. For example: if a server node's current resource occupancy is 47%, the container's processing performance is that characterized by the remaining resources (53% of the node's total capacity), or a set share (say 50% of total capacity) is used directly as the container's processing performance.
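The two ways of fixing a container's processing performance described above can be sketched as follows (a hedged illustration; the function name and the percentage-based model are assumptions, not the patent's implementation):

```python
from typing import Optional

def container_performance(occupancy: float, fixed_share: Optional[float] = None) -> float:
    """Share of a node's total capacity granted to its target container.

    occupancy   -- fraction of node resources already in use, e.g. 0.47
    fixed_share -- if given, use this preset share instead of the remainder
    """
    if fixed_share is not None:
        return fixed_share
    return 1.0 - occupancy

# Node at 47% occupancy: the container gets the remaining 53% of capacity ...
remaining = container_performance(0.47)
# ... or a preset fixed share (here 50% of capacity) is used directly.
preset = container_performance(0.47, fixed_share=0.50)
```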
According to the method and the device, the processing performance of the target container is determined according to the system resource occupancy rate, so that the test case is distributed according to the processing performance of the target container, and the execution efficiency of the test case is improved.
A configuration file is generated from the test requirement and the processing performance obtained in the above embodiment, and a target container is created from the configuration file corresponding to each node. The test requirement is quantified, and the configuration parameters in the file, such as the number of target containers, are determined from it; the configuration file of each target container is generated from the test requirement and the container's processing performance, and the container is then generated from the configuration file and a creation function.
The target container provided by the application is a Docker container, whose create-and-run process is: create a corresponding Job instance; configure environment parameters for the Job using the Docker daemon; execute the Job's run function. During Job implementation there are two ways to configure parameters: first, directly initialize the Job's Args attribute with the specified parameters when the Job instance is created; second, add the specified environment variables to the Job after it is created.
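The two parameterisation routes map naturally onto how a `docker run` invocation could be assembled: arguments fixed at creation time versus environment variables added afterwards. The sketch below only builds the command line and never contacts a Docker daemon; all image and variable names are illustrative:

```python
def build_docker_run(image, args=None, env=None):
    """Assemble (but do not execute) a `docker run` command line."""
    cmd = ["docker", "run", "--rm"]
    # Second route: environment variables added to the Job after creation.
    for key, value in (env or {}).items():
        cmd += ["-e", f"{key}={value}"]
    cmd.append(image)
    # First route: Args fixed when the Job instance is created.
    cmd += list(args or [])
    return cmd

cmd = build_docker_run("test-runner:latest",
                       args=["--suite", "regression"],
                       env={"NODE": "n1"})
# cmd could be handed to subprocess.run() on a host with a Docker engine.
```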
The target container created by the application is a Docker container: a software development user can package the test application and its dependencies into a portable Docker container and then deploy that container to any server with a Docker engine installed, so migration and scaling are easy. Moreover, the target container can effectively partition the resources managed by a single operating system into isolated groups, balancing conflicting resource demands between them; test cases run in multiple containers simultaneously, the containers are isolated from and do not interfere with each other, and operating-system resource utilization is high.
Further, to improve the creation efficiency of the target containers, an image repository may be used; reducing the number of layers in the image speeds up image builds, shortens container creation time, and improves the efficiency of the whole test process.
According to the embodiment of the application, the multiple target containers are created on the distributed system, so that the multiple test cases can be executed in parallel by utilizing the multiple target containers in the following process, and the test efficiency is improved.
On this basis, since each node of the distributed system can run a plurality of programs simultaneously, that is, the system resources available at each node are in a changing state, in order to improve the accuracy of the processing performance of the target container, after the step of allocating the test cases to the target container matched with the test cases for running, the following scheme is further provided, including:
A1, collecting system resource usage of each node to obtain the current system resource occupancy rate;
A2, correcting the processing performance of the target container according to the system resource occupancy rate.
According to this embodiment, after the test cases are distributed to the target containers for execution, the processing performance of each target container can subsequently be corrected based on the collected system resource occupancy. The correction can be performed periodically; the correction period can be set according to the actual situation, for example once per processed test case, or the processing performance can be refreshed on a regular update period. Correcting the processing performance against the currently collected occupancy, and matching test cases to target containers based on the corrected values, improves the matching accuracy between cases and containers and thus the test efficiency of the test cases.
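Steps A1 and A2 amount to re-deriving each container's share from freshly collected occupancy figures. A minimal sketch under the remaining-share model assumed earlier (names are hypothetical):

```python
class TargetContainer:
    def __init__(self, name, performance):
        self.name = name
        self.performance = performance  # fraction of node capacity (assumed model)

def correct_performance(container, current_occupancy):
    # A2: re-derive the container's usable share from the occupancy just collected (A1).
    container.performance = 1.0 - current_occupancy

c = TargetContainer("c0", 0.53)
correct_performance(c, 0.60)  # the node got busier, so the container's share shrinks
```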
The step of matching the data size of the test case with the processing performance of the target container in step S230 may be performed in the following manner, and a flowchart thereof is shown in fig. 4, and includes:
s410, dividing a plurality of test cases in batches according to the number of the target containers, wherein the number of the test cases in each batch is the same as that of the target containers;
s420, acquiring the data size of each test case for each batch, and sequencing the test cases according to the data size;
s430, calling a sequencing result for sequencing the target containers in advance according to the processing performance of the target containers for each batch;
s440, aiming at each batch, matching the test cases with the same sequence with the target container.
This embodiment provides a matching scheme between target containers and test cases in which each target container executes one test case at a time, so the test cases are divided into batches for batch testing, the number of cases per batch equaling the number of target containers. The target containers are sorted in advance by processing performance, yielding a performance ranking; the test cases in each batch are sorted by data size, yielding a case ranking. Since the counts are equal, cases and containers with the same rank are matched in a one-to-one correspondence.
If the number of test cases is not evenly divisible by the number of target containers, the size r of the last batch is obtained, the last batch's cases are sorted by data size, the r highest-ranked target containers are called, and matching is performed between container processing performance and case data size, maximizing the processing efficiency of the test cases.
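The batch matching of steps S410 to S440, including the short final batch when the case count is not a multiple of the container count, can be sketched as follows (tuple layouts and names are assumptions):

```python
def match_batches(cases, containers):
    """Rank-match cases to containers batch by batch (S410-S440).

    cases      -- list of (case_id, data_size) tuples
    containers -- list of (container_id, performance) tuples, any order
    Returns a list of (case_id, container_id) assignments.
    """
    ranked = sorted(containers, key=lambda c: c[1], reverse=True)  # S430
    n = len(ranked)
    assignment = []
    for start in range(0, len(cases), n):  # S410: batches of size n
        batch = sorted(cases[start:start + n], key=lambda c: c[1], reverse=True)  # S420
        # S440: same-rank pairing; a short final batch (remainder r != 0)
        # automatically uses only the r strongest containers via zip().
        for (case_id, _), (cont_id, _) in zip(batch, ranked):
            assignment.append((case_id, cont_id))
    return assignment

pairs = match_batches([("a", 5), ("b", 1), ("c", 9), ("d", 2), ("e", 7)],
                      [("slow", 1.0), ("fast", 3.0)])
# "e" forms a short final batch and is matched to the strongest container.
```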
In the foregoing embodiment, a plurality of target containers are set up on a distributed system and each test case is allocated to its matched target container for execution, improving the execution efficiency of the test cases. The test process can be further improved as follows: after the step of allocating the test cases to the matched target containers for running in step S230, the method further includes:
b1, acquiring a log file for recording the running process of the test case;
b2, extracting a test result and a corresponding test case identifier from the log file;
and B3, integrating the test result and the test case identification to generate a test report and sending the test report.
The log file records the whole process of running a test case in a target container, including intermediate process data and the final test result. After the log file of each test case is obtained, the log file is crawled to extract the test result and the test case identifier. The log files of the plurality of test cases are obtained, the test results and their associated test case identifiers are crawled and integrated into a test report, and the test report is sent to the user side, for example by mail.
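As a hedged sketch of steps B1 to B3: the log line format `case_id=... result=...` and the plain-text report layout below are assumptions for illustration; a real suite would crawl whatever format its runner actually writes.

```python
import re

# Assumed log line shape, e.g. "case_id=tc01 result=pass".
RESULT_RE = re.compile(r"case_id=(?P<case_id>\w+)\s+result=(?P<result>\w+)")

def extract_results(log_text):
    """Crawl a run log and return a {test case identifier: result} mapping."""
    return {m.group("case_id"): m.group("result")
            for m in RESULT_RE.finditer(log_text)}

def build_report(results):
    """Integrate the results and case identifiers into a text report."""
    lines = [f"{case_id}: {outcome}"
             for case_id, outcome in sorted(results.items())]
    return "Test report\n" + "\n".join(lines)
```

The report string produced here would then be mailed to the user side by whatever delivery mechanism the deployment uses.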
The process of creating the target containers, running the test cases in them and forming the test report can be executed automatically through a script file. When writing the script file to generate the test report, the run() method of HTMLTestRunner can be used to execute a test case in the target container and generate the test report. HTMLTestRunner is an extension of the Python standard library unittest framework, mainly used to generate an HTML test report, so that the automated test results are displayed in an easily understandable form.
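A minimal sketch of such a scripted run using only the standard library is shown below. SmokeTest is a hypothetical placeholder case; with the third-party HTMLTestRunner module installed, its run() method would take the place of TextTestRunner to emit an HTML report, e.g. `HTMLTestRunner(stream=open("report.html", "wb"), title="Automated test report").run(suite)` (that call is illustrative of that module's commonly documented usage, not of this patent's code).

```python
import io
import unittest

class SmokeTest(unittest.TestCase):
    """Hypothetical stand-in for a real test case run inside a container."""
    def test_addition(self):
        self.assertEqual(1 + 1, 2)

def run_suite():
    """Load the suite, run it, and return the unittest result object."""
    suite = unittest.TestLoader().loadTestsFromTestCase(SmokeTest)
    # Discard console output; a report generator would consume the stream.
    return unittest.TextTestRunner(stream=io.StringIO()).run(suite)
```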
After the test report is obtained through the above embodiment, data analysis is performed on the test report. The data analysis process may proceed as follows, with a flowchart shown in fig. 5:
s510, detecting that a test result in the test report exceeds a preset standard reference range, and calling a test case identifier associated with the test result;
s520, carrying out secondary execution on the target test case corresponding to the test case identification to obtain a secondary test result;
s530, determining abnormal information causing the abnormal test result according to the secondary test result.
A standard reference range is set in advance for each test result. Each test result in the test report is compared with its corresponding standard reference range to judge whether an abnormal condition exists in which the test result exceeds the range. If a test result in the test report is detected to exceed the standard reference range, the test result is abnormal. The abnormality may stem from a problem with the target container during execution, such as an incomplete test caused by insufficient running space or an abnormal running environment, or from an error in the test case itself, such as damaged test case data. The present application provides a scheme in which, for an abnormal test result, the test case identifier associated with the test result is retrieved and the target test case corresponding to that identifier is placed in a target container for secondary execution. To distinguish the containers, the target container that produced the test result on first execution is referred to as the first target container; the container used for secondary execution is preferably a target container other than the first target container. For example, if the abnormal test result was produced in target container 4 and the target containers numbered 1 to 4 operate normally, the test case corresponding to that result can be placed in any of the containers numbered 1 to 3 for re-execution to obtain a secondary test result, and the abnormal information causing the abnormal test result is determined by analyzing the secondary test result, for example by comparing the secondary test result with the result of the first test and determining the abnormal information from the comparison.
The first target container can also be used to execute the target test case a second time, so as to rule out test abnormalities caused by accidental or environmental factors.
The embodiment of the application provides a scheme in which the target test case corresponding to an abnormal test result is executed a second time, and the abnormal information causing the abnormal test result is determined from the test result and the secondary test result. This conveniently and clearly pinpoints the abnormal information; for example, an abnormal test result caused by accidental or environmental factors can be ruled out directly.
On this basis, the present application further provides, in the following embodiments, a step of further pinpointing the cause of an abnormal test result. With reference to fig. 5, determining the abnormal information causing the abnormal test result according to the secondary test result includes:
s540, judging whether the secondary test result exceeds the standard reference range;
and S550, when the secondary test result is also beyond the standard reference range, determining that the test case corresponding to the test result is abnormal.
Since the test result and the secondary test result come from the same test case, the standard reference range preset for that test case is called and the secondary test result is compared with it. If the secondary test result also exceeds the standard reference range, the test case itself is determined to be erroneous; if the secondary test result does not exceed the range, the target container corresponding to the first test result may be abnormal.
This scheme determines whether the problematic test result is caused by an error in the test case, avoids misjudging the test case, and helps pinpoint the precise cause of the problem.
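The decision logic of steps S510 to S550 can be sketched under assumed shapes: here a standard reference range is a (low, high) pair and run_case is a hypothetical callable that executes a case in a given container and returns a numeric result; both are illustrative, not part of the claimed method.

```python
def within(value, ref_range):
    """True if a result falls inside its standard reference range."""
    low, high = ref_range
    return low <= value <= high

def diagnose(case_id, first_result, ref_range, run_case, other_container):
    """Re-run an out-of-range case once and classify the abnormality."""
    if within(first_result, ref_range):
        return "normal"
    second = run_case(case_id, other_container)  # secondary execution
    if within(second, ref_range):
        # Only the first run misbehaved: suspect the first target
        # container or an accidental/environmental factor, not the case.
        return "container-or-environment"
    return "test-case-error"  # both runs out of range: the case is faulty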
Further, after the step of determining that the test case corresponding to the test result is abnormal in step S550, the method further includes: and S560, sending reminding information that the test case is abnormal, and, in response to a deletion instruction associated with the reminding information, executing the operation of deleting the abnormal test case.
After the test case is determined to be abnormal, a reminding message that the test case contains errors is sent. The reminding message carries a deletion option; in response to the deletion instruction, the operation of deleting the abnormal test case is executed. This keeps the test case set accurate, prevents other test processes from testing with the erroneous test case, and helps ensure the accuracy of test results.
Further, after the step of allocating the test cases to the target containers matched with the test cases for running, the method further comprises the following steps: the target container is deleted.
After the test processes of all the test cases are completed, that is, after secondary execution of the test cases whose results exceeded the preset standard reference range, all test cases have finished the test and verification processes and the target containers no longer need to execute anything. The target containers are therefore deleted, eliminating their occupation of system resources.
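As a hedged sketch of this cleanup step, assuming Docker as the container runtime (the patent does not name one) and illustrative container names; the runner parameter exists only so the command execution can be stubbed out.

```python
import subprocess

def delete_containers(names, runner=subprocess.run):
    """Force-remove finished target containers to release system resources."""
    removed = []
    for name in names:
        # `docker rm -f` removes the container even if it is still running.
        proc = runner(["docker", "rm", "-f", name],
                      capture_output=True, text=True)
        if proc.returncode == 0:
            removed.append(name)
    return removed
```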
The above embodiments describe the automated testing method provided in the present application; the following describes embodiments of an automated testing apparatus corresponding to the method.
An embodiment of the present application further provides an automatic testing apparatus, a schematic structural diagram of which is shown in fig. 6, including: the test case obtaining module 610, the target container creating module 620 and the operation allocating module 630 are as follows:
the test case obtaining module 610 is configured to obtain a plurality of test cases according to a preset test requirement;
a create target container module 620, configured to create a plurality of target containers for executing the test cases on a preset distributed system;
and the allocation operation module 630 is configured to match the data size of the test case with the processing performance of the target container, and allocate the test case to the target container matched with the test case for operation.
With regard to the automated testing apparatus in the above embodiment, the specific manner in which each module performs its operations has been described in detail in the embodiments of the method and will not be detailed here.
Further, an embodiment of the present application also provides a computer-readable storage medium on which computer instructions are stored, and the computer instructions, when executed by a processor, implement the steps of the automated testing method described in any one of the above. The storage medium includes, but is not limited to, any type of disk (including floppy disks, hard disks, optical disks, CD-ROMs, and magneto-optical disks), ROMs (Read-Only Memories), RAMs (Random Access Memories), EPROMs (Erasable Programmable Read-Only Memories), EEPROMs (Electrically Erasable Programmable Read-Only Memories), flash memories, magnetic cards, or optical cards. That is, a storage medium includes any medium that stores or transmits information in a form readable by a device (e.g., a computer), such as a read-only memory, a magnetic disk, or an optical disk.
Still further, an embodiment of the present application further provides a computer device, where the computer device includes:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the steps of the automated testing method described above.
FIG. 7 is a block diagram illustrating a computer device 700 according to an example embodiment. For example, the computer device 700 may be provided as a server. Referring to fig. 7, the computer device 700 includes a processing component 722, which further includes one or more processors, and memory resources, represented by memory 732, for storing instructions executable by the processing component 722, such as application programs. The application programs stored in memory 732 may include one or more modules, each corresponding to a set of instructions. Further, the processing component 722 is configured to execute the instructions to perform the steps of the automated testing method described above.
The computer device 700 may also include a power component 726 configured to perform power management of the computer device 700, a wired or wireless network interface 750 configured to connect the computer device 700 to a network, and an input/output (I/O) interface 758. The computer device 700 may operate based on an operating system stored in memory 732, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like. It should be understood that, although the steps in the flowcharts of the figures are shown in an order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the steps are not strictly limited in order and may be performed in other orders. Moreover, at least a portion of the steps in the flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and not necessarily in sequence: they may be performed in turn or alternately with other steps, or with at least a portion of the sub-steps or stages of other steps.
It should be understood that each functional unit in the embodiments of the present application may be integrated into one processing module, each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
The foregoing is only a partial embodiment of the present application. It should be noted that those skilled in the art can make several improvements and refinements without departing from the principle of the present application, and these improvements and refinements should also be regarded as falling within the protection scope of the present application.

Claims (10)

1. An automated testing method, comprising:
acquiring a plurality of test cases according to preset test requirements;
creating a plurality of target containers for executing the test cases on a preset distributed system;
and matching the data size of the test case with the processing performance of the target container, and distributing the test case to the target container matched with the test case for operation.
2. The automated testing method of claim 1, wherein the step of matching the data size of the test case with the processing performance of the target container comprises:
dividing a plurality of test cases into batches according to the number of the target containers, wherein the number of the test cases in each batch is the same as that of the target containers;
acquiring the data size of each test case for each batch, and sequencing the test cases according to the data size;
calling a sequencing result for sequencing the target containers in advance according to the processing performance of the target containers for each batch;
and matching the test cases with the same sequence with the target container for each batch.
3. The automated testing method of claim 1, wherein the step of creating a plurality of target containers for executing the test cases on a predetermined distributed system comprises:
acquiring node information of a distributed system where a target container is located;
reading the system resource occupancy rate of each node on the distributed system according to the node information;
determining the processing performance of a target container corresponding to each node according to the system resource occupancy rate of each node;
and generating a configuration file corresponding to each node according to the test requirement and the processing performance, and creating a target container according to the configuration file.
4. The automated testing method of claim 3, wherein after the step of assigning the test cases to the target containers matched with the test cases, the method further comprises:
re-collecting the system resource occupancy rate of each node to obtain an updated system resource occupancy rate;
and correcting the processing performance of the target container according to the system resource occupancy rate.
5. The automated testing method of claim 1, wherein after the step of assigning the test cases to the target containers matched with the test cases, the method further comprises:
acquiring a log file for recording the running process of the test case;
extracting a test result and a corresponding test case identifier from the log file;
and integrating the test result and the test case identification to generate a test report and sending the test report.
6. The automated testing method of claim 5, further comprising, after the step of generating a test report:
detecting that the test result in the test report exceeds a preset standard reference range, and determining a test case identifier associated with the test result;
performing secondary execution on the target test case corresponding to the test case identifier to obtain a secondary test result;
and determining abnormal information causing the abnormal test result according to the secondary test result.
7. The automated testing method of claim 6, wherein the step of determining from the secondary test results the exception information that caused the test results to be abnormal comprises:
and when the secondary test result is also beyond the standard reference range, determining that the test case corresponding to the test result is abnormal.
8. An automated testing apparatus, comprising:
the test case obtaining module is used for obtaining a plurality of test cases according to preset test requirements;
the target container creating module is used for creating a plurality of target containers for executing the test cases on a preset distributed system;
and the distribution operation module is used for matching the data size of the test case with the processing performance of the target container and distributing the test case to the target container matched with the test case for operation.
9. A computer-readable storage medium for storing computer instructions which, when executed on a computer, cause the computer to perform the steps of the automated testing method of any of claims 1 to 7.
10. A computer device, characterized in that the computer device comprises:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the steps of the automated testing method of any of claims 1-7.
CN201911053638.0A 2019-10-31 2019-10-31 Automatic test method, device, storage medium and equipment Active CN110765026B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911053638.0A CN110765026B (en) 2019-10-31 2019-10-31 Automatic test method, device, storage medium and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911053638.0A CN110765026B (en) 2019-10-31 2019-10-31 Automatic test method, device, storage medium and equipment

Publications (2)

Publication Number Publication Date
CN110765026A true CN110765026A (en) 2020-02-07
CN110765026B CN110765026B (en) 2023-08-01

Family

ID=69335079

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911053638.0A Active CN110765026B (en) 2019-10-31 2019-10-31 Automatic test method, device, storage medium and equipment

Country Status (1)

Country Link
CN (1) CN110765026B (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111506465A (en) * 2020-04-20 2020-08-07 北京易点淘网络技术有限公司 Computer function testing method and device
CN111581085A (en) * 2020-04-28 2020-08-25 广州市百果园信息技术有限公司 Joint debugging test system and method
CN111639022A (en) * 2020-05-16 2020-09-08 中信银行股份有限公司 Transaction testing method and device, storage medium and electronic device
CN112162927A (en) * 2020-10-13 2021-01-01 网易(杭州)网络有限公司 Test method, medium and device of cloud computing platform and computing equipment
CN112346979A (en) * 2020-11-11 2021-02-09 杭州飞致云信息科技有限公司 Software performance testing method, system and readable storage medium
CN112486812A (en) * 2020-11-26 2021-03-12 北京海量数据技术股份有限公司 Distributed framework software testing method and device supporting cloud
CN112596750A (en) * 2020-12-28 2021-04-02 上海安畅网络科技股份有限公司 Application testing method and device, electronic equipment and computer readable storage medium
CN113485905A (en) * 2021-02-26 2021-10-08 杜自然 Test method, device, equipment and computer storage medium in data transaction
CN114039974A (en) * 2021-10-20 2022-02-11 支付宝(杭州)信息技术有限公司 Cloud container generation method and device, storage medium and electronic equipment
CN117234949A (en) * 2023-11-13 2023-12-15 广州品唯软件有限公司 Test data noise reduction method and device, storage medium and computer equipment
CN117520129A (en) * 2023-11-21 2024-02-06 北京东青互联科技有限公司 Data center equipment monitoring method, device, equipment and medium
CN114039974B (en) * 2021-10-20 2024-05-31 支付宝(杭州)信息技术有限公司 Method and device for providing equipment service for user, storage medium and electronic equipment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8063908B1 (en) * 2007-11-08 2011-11-22 Nvidia Corporation System, method, and computer program product for validating a graphics processor design
CN102609352A (en) * 2011-01-19 2012-07-25 阿里巴巴集团控股有限公司 Parallel testing method and parallel testing server
CN108171050A (en) * 2017-12-29 2018-06-15 浙江大学 The fine granularity sandbox strategy method for digging of linux container
CN108959080A (en) * 2018-06-27 2018-12-07 郑州云海信息技术有限公司 A kind of automated testing method executed parallel based on UnitTest
CN109062780A (en) * 2018-06-25 2018-12-21 深圳市远行科技股份有限公司 The development approach and terminal device of automatic test cases
CN110083535A (en) * 2019-04-22 2019-08-02 网宿科技股份有限公司 A kind of method for testing software and device
CN114168429A (en) * 2021-12-17 2022-03-11 平安付科技服务有限公司 Error reporting analysis method and device, computer equipment and storage medium
CN114691494A (en) * 2022-02-28 2022-07-01 网宿科技股份有限公司 Test case execution method and system and test equipment


Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
BAELDUNG: "Docker Test Containers in Java Tests", pages 1, Retrieved from the Internet <URL:https://www.baeldung.com/docker-test-containers> *
Yu Zhe et al. (于哲等): "Automated test platform for electric power secondary intelligent devices based on use-case recording", 《电工技术》, no. 21, pages 75-78 *
Zhu Haiyan et al. (朱海燕等): "Research on test case prioritization", vol. 30, no. 30, pages 79-81 *
Lao_Zhang (老_张): "Performance test cases, strategies and methods", pages 1 *
Zhao Liang et al. (赵亮等): "Research on a reuse framework for J2EE server-side software test cases" *
Zhao Liang et al. (赵亮等): "Research on a reuse framework for J2EE server-side software test cases", 《小型微型计算机系统》, vol. 24, no. 24, pages 663-667 *
Software Testing Training (软件测试培训): "Test design methods: capacity test case design methods", pages 1, Retrieved from the Internet <URL:https://qa.tedu.cn/data/267069.html> *

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111506465A (en) * 2020-04-20 2020-08-07 北京易点淘网络技术有限公司 Computer function testing method and device
CN111581085A (en) * 2020-04-28 2020-08-25 广州市百果园信息技术有限公司 Joint debugging test system and method
CN111639022A (en) * 2020-05-16 2020-09-08 中信银行股份有限公司 Transaction testing method and device, storage medium and electronic device
CN112162927B (en) * 2020-10-13 2024-04-26 网易(杭州)网络有限公司 Testing method, medium, device and computing equipment of cloud computing platform
CN112162927A (en) * 2020-10-13 2021-01-01 网易(杭州)网络有限公司 Test method, medium and device of cloud computing platform and computing equipment
CN112346979A (en) * 2020-11-11 2021-02-09 杭州飞致云信息科技有限公司 Software performance testing method, system and readable storage medium
CN112486812A (en) * 2020-11-26 2021-03-12 北京海量数据技术股份有限公司 Distributed framework software testing method and device supporting cloud
CN112596750A (en) * 2020-12-28 2021-04-02 上海安畅网络科技股份有限公司 Application testing method and device, electronic equipment and computer readable storage medium
CN112596750B (en) * 2020-12-28 2022-04-26 上海安畅网络科技股份有限公司 Application testing method and device, electronic equipment and computer readable storage medium
CN113485905A (en) * 2021-02-26 2021-10-08 杜自然 Test method, device, equipment and computer storage medium in data transaction
CN113485905B (en) * 2021-02-26 2023-09-05 杜自然 Test method, device, equipment and computer storage medium in data transaction
CN114039974A (en) * 2021-10-20 2022-02-11 支付宝(杭州)信息技术有限公司 Cloud container generation method and device, storage medium and electronic equipment
CN114039974B (en) * 2021-10-20 2024-05-31 支付宝(杭州)信息技术有限公司 Method and device for providing equipment service for user, storage medium and electronic equipment
CN117234949B (en) * 2023-11-13 2024-03-19 广州品唯软件有限公司 Test data noise reduction method and device, storage medium and computer equipment
CN117234949A (en) * 2023-11-13 2023-12-15 广州品唯软件有限公司 Test data noise reduction method and device, storage medium and computer equipment
CN117520129A (en) * 2023-11-21 2024-02-06 北京东青互联科技有限公司 Data center equipment monitoring method, device, equipment and medium
CN117520129B (en) * 2023-11-21 2024-05-10 北京东青互联科技有限公司 Data center equipment monitoring method, device, equipment and medium

Also Published As

Publication number Publication date
CN110765026B (en) 2023-08-01

Similar Documents

Publication Publication Date Title
CN110765026B (en) Automatic test method, device, storage medium and equipment
CN110058998B (en) Software testing method and device
CN111258913A (en) Automatic algorithm testing method and device, computer system and readable storage medium
CN111176790A (en) Active maintenance method and device of cloud platform physical host and readable storage medium
CN115935035A (en) RPA flow visualization management method, device, equipment and readable storage medium
CN106708727B (en) Distributed virus characteristic sample verification method and system
CN110618853B (en) Detection method, device and equipment for zombie container
US11656977B2 (en) Automated code checking
CN114064216A (en) Virtual machine initialization method, device, terminal equipment and storage medium
CN113742224A (en) Test system, method, device, computer equipment and storage medium
CN112965895A (en) Desktop application program automatic test method, device, equipment and storage medium
CN112214413A (en) Application program testing method, device, equipment and storage medium
CN116702668A (en) Regression testing method and device, electronic equipment and storage medium
CN111783094A (en) Data analysis method and device, server and readable storage medium
CN111506388A (en) Container performance detection method, container management platform and computer storage medium
CN112346952A (en) Method, equipment and electronic equipment for regression testing of test cases
CN115757172A (en) Test execution method and device, storage medium and computer equipment
CN114860694A (en) Asynchronous collaborative data migration method and device for wind power plant monitoring system
CN107688479B (en) Android system network cluster, construction method thereof, and Android system network cluster data processing method and system
CN112596750B (en) Application testing method and device, electronic equipment and computer readable storage medium
CN106547583B (en) Operating system installation method and device
CN109901998B (en) Resource recovery method, device, system, computer equipment and storage medium
CN113703804A (en) System upgrading method, system, device and storage medium
CN108845932B (en) Unit testing method and device of network library, storage medium and terminal
CN114519477A (en) Large data platform tenant management system, method, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 801-2, floor 8, building 3, No. 22, Ronghua Middle Road, Beijing Economic and Technological Development Zone, Daxing District, Beijing

Applicant after: Wanghai Kangxin (Beijing) Technology Co.,Ltd.

Address before: Room 07, Room 2, Building B, 12 Hongda North Road, Beijing Daxing District, Beijing

Applicant before: BEIJING NEUSOFT VIEWHIGH TECHNOLOGY Co.,Ltd.

GR01 Patent grant