CN114924981A - Test task distribution method based on artificial intelligence and related equipment - Google Patents


Info

Publication number
CN114924981A
CN114924981A
Authority
CN
China
Prior art keywords
test
task
executor
tasks
test task
Prior art date
Legal status
Pending
Application number
CN202210598121.5A
Other languages
Chinese (zh)
Inventor
李慧 (Li Hui)
Current Assignee
Ping An Bank Co Ltd
Original Assignee
Ping An Bank Co Ltd
Priority date
Filing date
Publication date
Application filed by Ping An Bank Co Ltd filed Critical Ping An Bank Co Ltd
Priority to CN202210598121.5A
Publication of CN114924981A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3688 Test management for test execution, e.g. scheduling of test suites
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46 Multiprogramming arrangements
    • G06F9/48 Program initiating; Program switching, e.g. by interrupt
    • G06F9/4806 Task transfer initiation or dispatching
    • G06F9/4843 Task transfer initiation or dispatching by program, e.g. task dispatcher, supervisor, operating system

Abstract

The application provides an artificial-intelligence-based test task distribution method and apparatus, an electronic device, and a storage medium. The method comprises the following steps: constructing a test case database and calling test cases from it according to a user's test requirements to generate test tasks; establishing a heartbeat connection between the server and the test executors to obtain the communication connection state between them; if the connection state is normal, distributing the test tasks to the test executors according to a custom distribution algorithm; executing the test tasks on the test executors to obtain test results; and generating a test report from the results for display. By obtaining the test tasks and distributing them to the test executors under the custom distribution algorithm, system resources are allocated to the test tasks as fairly as possible, which effectively improves the utilization of system resources during software testing.

Description

Test task distribution method based on artificial intelligence and related equipment
Technical Field
The present application relates to the field of artificial intelligence technologies, and in particular, to a method and an apparatus for distributing test tasks based on artificial intelligence, an electronic device, and a storage medium.
Background
As software applications expand and deepen, software systems grow ever more complex; to reduce cost, shorten delivery time, and improve stability, software must be tested frequently.
In the prior art, cloud testing is an automated testing scheme built on cloud computing technology: a user only needs to write a test script and upload it to the cloud platform to run automated tests. However, the cloud test platform is limited by server performance, and as the number of software test tasks grows, insufficient server capacity severely degrades the test service. How to distribute test tasks reasonably, so as to effectively improve the operating efficiency of the cloud test platform, is therefore an urgent problem to be solved.
Disclosure of Invention
In view of the foregoing, there is a need for an artificial-intelligence-based test task distribution method and related devices, namely a test task distribution apparatus, an electronic device, and a storage medium, to solve the technical problem of improving the operating efficiency of a cloud test platform.
The application provides a test task distribution method based on artificial intelligence, which comprises the following steps:
constructing a test case database, and calling test case data from the test case database according to the test requirements of a user to generate a test task;
establishing heartbeat connection between the server and the test executor to acquire a communication connection state between the server and the test executor;
if the communication connection state is normal, distributing the test task to a test executor according to a custom distribution algorithm;
executing the test task based on the test executor to obtain a test result;
and generating a test report based on the test result for displaying.
Therefore, the test tasks are obtained, and are distributed to the test executors according to the self-defined distribution algorithm, so that the system resources of the automatic test system are guaranteed to be distributed to the test tasks as fairly as possible, and the use efficiency of the system resources in the software test process is effectively improved.
In some embodiments, the constructing a test case database, and calling test case data from the test case database according to a test requirement of a user to generate a test task includes:
generating a test case according to the automatic test case generation tool to construct a test case database;
and calling the corresponding test case in the test case database according to the test requirement of the user to generate the test task.
Therefore, the test cases required by the test tasks can be automatically generated in a large batch through the automatic test case generation tool, the test tasks can be generated quickly, and the efficiency of the whole distribution process of the test tasks is improved.
In some embodiments, the establishing a heartbeat connection between the server and the test executor to obtain a communication connection state between the server and the test executor includes:
the test executor sends a heartbeat request to the server at a preset fixed interval to acquire a heartbeat signal;
and judging the communication connection state between the test executor and the server based on the heartbeat signal.
Therefore, the test executor can be guaranteed to be in communication connection with the server all the time, the test task distribution failure caused by connection interruption is prevented, and the smooth proceeding of the test task distribution work is guaranteed.
In some embodiments, the distributing the test task to the test executor according to a custom distribution algorithm if the communication connection state is normal includes:
carrying out priority division on the test tasks to obtain a test task level data set;
sequencing the test tasks in the test task level data set according to the sequence of the priorities from high to low so as to generate a test task queue;
and distributing the test tasks in the test task queue to a test executor according to a custom distribution algorithm.
Therefore, the test tasks are obtained, and the test tasks are distributed to the test executors according to the custom distribution algorithm, so that the system resources can be guaranteed to be distributed to the test tasks as fairly as possible, and the use efficiency of the system resources in the software test process is effectively improved.
In some embodiments, before distributing the test tasks in the test task queue to the test executors according to the custom distribution algorithm, the method further includes:
counting system resources required by each test task in the test task queue to obtain a task resource set;
judging whether the priorities of the test tasks in the test task queue are consistent or not to obtain a judgment result;
if the judgment result is consistent, distributing the test tasks in the test task queue to a test executor according to a user-defined distribution algorithm and the task resource set;
if the judgment result is inconsistent, the priorities of the test tasks in the test task queue are subjected to standardized processing, and then the test tasks in the test task queue are distributed to a test executor according to a user-defined distribution algorithm and the task resource set.
Therefore, the corresponding distribution strategies can be respectively executed according to whether the priorities of the test tasks in the test task queue are consistent, and the system resources are guaranteed to be distributed to the test tasks as fairly as possible.
In some embodiments, said executing said test task based on said test executor to obtain test results comprises:
executing the test cases in the test tasks based on the test executors to obtain test results, wherein the test results comprise test success, test failure and test skipping;
if the test result is successful, continuing to execute a next test case corresponding to the current test case in the test task;
if the test result is test failure, ending the test task corresponding to the current test case;
and if the test result is test skipping, skipping the test task corresponding to the current test case and not continuing to execute.
Therefore, different operations can be performed according to the test result of the test case in the test task, and when the test result is test failure and test skip, a new test task is executed in time, so that the overall execution efficiency of the test task is improved.
In some embodiments, said generating a test report for presentation based on said test results comprises:
the test executor feeds back the test result to generate a test result report;
and storing the test result report to the test case database for displaying.
Therefore, the test results can be stored uniformly and displayed at the front end of the user, and the user can conveniently monitor the test results in real time.
The embodiment of the present application further provides a test task distribution device based on artificial intelligence, the device includes:
the generating unit is used for constructing a test case database and calling test case data from the test case database according to the test requirements of a user to generate a test task;
the connection unit is used for establishing heartbeat connection between the server and the test executor so as to acquire a communication connection state between the server and the test executor;
the distribution unit is used for distributing the test task to the test executor according to a custom distribution algorithm if the communication connection state is normal;
the execution unit is used for executing the test task based on the test executor to obtain a test result;
and the display unit is used for generating a test report based on the test result so as to display the test report.
An embodiment of the present application further provides an electronic device, where the electronic device includes:
a memory storing at least one instruction;
and the processor executes the instructions stored in the memory to realize the artificial intelligence based test task distribution method.
The embodiment of the application also provides a computer-readable storage medium, and at least one instruction is stored in the computer-readable storage medium and executed by a processor in the electronic device to implement the artificial intelligence based test task distribution method.
Drawings
FIG. 1 is a flow chart of a preferred embodiment of an artificial intelligence based test task distribution method to which the present application relates.
Fig. 2 is a flow chart of a preferred embodiment of distributing the test task to the test executor according to a custom distribution algorithm if the communication connection status is normal according to the present application.
FIG. 3 is a functional block diagram of a preferred embodiment of an artificial intelligence based test task distribution apparatus according to the present application.
Fig. 4 is a schematic structural diagram of an electronic device according to a preferred embodiment of the artificial intelligence based test task distribution method according to the present application.
Detailed Description
For a clearer understanding of the objects, features, and advantages of the present application, reference is made to the following detailed description along with the accompanying drawings and specific examples. It should be noted that the embodiments and features of the embodiments of the present application may be combined with each other without conflict. In the following description, numerous specific details are set forth to provide a thorough understanding of the present application; the described embodiments are merely some, not all, of the embodiments of the present application.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features. Thus, features defined as "first" and "second" may explicitly or implicitly include one or more of the described features. In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein in the description of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
The embodiment of the present application provides a test task distribution method based on artificial intelligence, which can be applied to one or more electronic devices. An electronic device is a device capable of automatically performing numerical calculation and/or information processing according to preset or stored instructions; its hardware includes, but is not limited to, a microprocessor, an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), an embedded device, and the like.
The electronic device may be any electronic product capable of performing human-computer interaction with a client, for example, a Personal computer, a tablet computer, a smart phone, a Personal Digital Assistant (PDA), a game machine, an Internet Protocol Television (IPTV), an intelligent wearable device, and the like.
The electronic device may also include a network device and/or a client device. The network device includes, but is not limited to, a single network server, a server group consisting of a plurality of network servers, or a Cloud Computing (Cloud Computing) based Cloud consisting of a large number of hosts or network servers.
The Network where the electronic device is located includes, but is not limited to, the internet, a wide area Network, a metropolitan area Network, a local area Network, a Virtual Private Network (VPN), and the like.
Fig. 1 is a flowchart of a preferred embodiment of the artificial intelligence based test task distribution method according to the present application. The order of the steps in the flow chart may be changed and some steps may be omitted according to different needs.
S10, constructing a test case database, and calling test case data from the test case database according to the test requirements of the user to generate a test task.
In an optional embodiment, the constructing a test case database, and calling test case data from the test case database according to a test requirement of a user to generate a test task includes:
s101, generating a test case according to the automatic test case generation tool to construct a test case database.
In the optional embodiment, a corresponding test case can be generated for each test task according to the automatic test case generation tool, and all the test cases are stored in the database to form the test case database.
S102, calling the corresponding test case in the test case database according to the test requirement of the user to generate the test task.
In this optional embodiment, a coding tag may be set for the user's test requirement and the corresponding test case; the tag is used to directly call the corresponding test case in the test case database according to the test requirement the user inputs at the client, so as to generate the test task. The coding tags may be numbers, symbols, letters, and the like; this scheme imposes no specific restriction on their form.
In this optional embodiment, when the user has a test requirement, the electronic device may receive the task information to be tested through the client's user interface, call the corresponding test cases from the test case database according to that information, and send all the obtained test cases together with their corresponding test tasks to the server. The client is essentially the graphical user interface between the automated test system and the testers; through it, testers can write test tasks, monitor execution states, view reports, and so on. One client can establish a connection with only one server, but a single client can run multiple test tasks at the same time, and each test task can call multiple test cases. For example, when testing the functions of the Baidu home page, the test cases may be a series of automated cases such as testing login, testing page jumps, and testing search.
In this optional embodiment, the test task may be stored in an xml file. Each peer directory in the xml file corresponds to a logical node, and each logical node carries two ID identifiers: one serves as its own identifier, and the other is the ID of the peer logical node to be scheduled next. Each logical node supports scheduling rules such as round robin and concurrency, assisted by auxiliary scheduling rules such as delay and assignment.
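To make the two-ID logical-node layout concrete, the sketch below builds a small hypothetical task XML and follows the next-node chain. The tag and attribute names (`node`, `id`, `next`, `rule`) are assumptions for illustration only; the patent does not specify the actual schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical layout of a test-task xml file: each <node> carries two IDs,
# its own identifier ("id") and the ID of the peer node to schedule next
# ("next"). An empty "next" ends the chain.
TASK_XML = """
<task name="demo">
  <node id="login"  next="jump"   rule="serial"/>
  <node id="jump"   next="search" rule="serial"/>
  <node id="search" next=""       rule="serial"/>
</task>
"""

def schedule_order(xml_text, start_id):
    """Follow the next-ID chain to produce the execution order of the nodes."""
    nodes = {n.get("id"): n for n in ET.fromstring(xml_text).findall("node")}
    order, cur = [], start_id
    while cur:
        order.append(cur)
        cur = nodes[cur].get("next")
    return order
```

Starting from the `login` node, `schedule_order(TASK_XML, "login")` yields `["login", "jump", "search"]`, i.e. the peer-node IDs drive the scheduling order.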
Therefore, the test cases required by the test tasks can be automatically generated in a large batch through the automatic test case generation tool, the test tasks can be generated quickly, and the efficiency of the whole distribution process of the test tasks is improved.
And S11, establishing a heartbeat connection between the server and the test executor to acquire the communication connection state between the server and the test executor.
In an optional embodiment, the establishing a heartbeat connection between the server and the test executor to obtain a communication connection state between the server and the test executor includes:
and S111, the test executor sends a heartbeat request to the server every other preset fixed period to acquire a heartbeat signal.
In this alternative embodiment, the test executor is a pre-written code program for executing a test task distributed by the test system.
In this alternative embodiment, a heartbeat signal is a mechanism in which one party periodically sends a small data packet to the other party of the connection and judges, from whether the other party replies, whether the communication link between the two parties has been broken.
In this alternative embodiment, the test executor sends a heartbeat request to the server at a preset fixed interval to acquire a heartbeat signal; this is referred to as a "heartbeat", and the preset interval may be, for example, 3 s.
And S112, judging the communication connection state between the test executor and the server based on the heartbeat signal.
In this alternative embodiment, the connection state between the test executor and the server may be determined from the heartbeat signal. When the server receives a heartbeat request, it replies with a heartbeat signal to the test executor, and the server retains only the information of test executors with a live heartbeat.
In this optional embodiment, the heartbeat signal guarantees that the server retains only the information of normally operating test executors and that the information of unavailable executors is deleted in time, ensuring that every executor receiving a distributed task is actually available.
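The server-side bookkeeping of S111 and S112 (keeping information only for executors with a live heartbeat and pruning stale entries) can be sketched roughly as follows. The class and method names, and the two-missed-beats timeout, are assumptions, not taken from the source:

```python
import time

HEARTBEAT_PERIOD = 3.0            # seconds between heartbeats (3 s per the text)
TIMEOUT = 2 * HEARTBEAT_PERIOD    # assume: dead after two missed beats

class ExecutorRegistry:
    """Server-side view: retain information only for executors whose
    heartbeat has arrived recently; delete unavailable ones in time."""
    def __init__(self, clock=time.monotonic):
        self._clock = clock
        self._last_beat = {}      # executor id -> time of last heartbeat

    def heartbeat(self, executor_id):
        # called when a heartbeat request arrives from an executor
        self._last_beat[executor_id] = self._clock()

    def prune(self):
        """Delete executors whose heartbeat has not arrived within TIMEOUT."""
        now = self._clock()
        stale = [e for e, t in self._last_beat.items() if now - t > TIMEOUT]
        for executor_id in stale:
            del self._last_beat[executor_id]

    def alive(self):
        """Executors currently considered available for task distribution."""
        self.prune()
        return sorted(self._last_beat)
```

Passing a fake clock makes the timeout behavior easy to verify without real waiting; in production the default monotonic clock would be used.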
And S12, if the communication connection state is normal, distributing the test task to a test executor according to a custom distribution algorithm.
Referring to fig. 2, in an optional embodiment, if the communication connection state is normal, the distributing the test task to the test executor according to the custom distribution algorithm includes:
and S121, carrying out priority division on the test tasks to obtain a test task level data set.
In this optional embodiment, priority division may be performed on each test task in advance, for example, according to the time sequence of the test tasks, the earlier generated test task has higher priority; or according to the importance degree of the test tasks, the more important test tasks have higher priority, and all the test tasks after priority division are used as the task level data set in the scheme.
And S122, sequencing the test tasks in the test task level data set according to the sequence from high priority to low priority to generate a test task queue.
In this optional embodiment, after receiving the test task sent by the client, the server sequentially adds the test tasks in the test task level data set to the task queue according to the priority of the test task to wait for the test executor to perform scheduling. The test executor is a physical machine, a virtual machine and the like which are specially configured to execute tasks including test cases under the environment of a cloud test platform, and all test tasks added into a task queue according to a priority order are used as the test task queue.
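The queueing of S121 and S122 can be sketched minimally with a heap: tasks are popped from highest priority to lowest, with earlier submission breaking ties. The function name and the `(priority, name)` pair representation are illustrative assumptions:

```python
import heapq

def build_task_queue(tasks):
    """tasks: list of (priority, name) pairs; returns names ordered from
    highest priority to lowest, earlier submission breaking ties."""
    # negate priority so the min-heap pops the highest priority first;
    # the submission index keeps equal-priority tasks in arrival order
    heap = [(-priority, index, name)
            for index, (priority, name) in enumerate(tasks)]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[2] for _ in range(len(heap))]
```

For example, tasks submitted as C (priority 1), A (priority 3), B (priority 2) come out in the order A, B, C.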
And S123, distributing the test tasks in the test task queue to a test executor according to a custom distribution algorithm.
In this optional embodiment, before distributing the test tasks in the test task queue to the test executor according to the customized distribution algorithm, it is necessary to count system resources required by each test task in the test task queue to obtain a task resource set, and determine whether the priorities of each test task in the test task queue are consistent to obtain a determination result.
In this optional embodiment, the custom distribution algorithm allocates the system resources of the current automated test system fairly and reasonably: the total resources are allocated in order of increasing demand, no test task receives more resources than it actually requires, and any excess is reclaimed and shared equally among the test tasks whose demands are not yet met. Because multiple test tasks, each with its own priority, coexist in the automated test system, the custom distribution algorithm covers two cases: the task priorities are consistent, or they are inconsistent.
In this optional embodiment, if the priorities are consistent, suppose there are m test tasks in total whose required system resources, sorted from small to large according to the task resource set, satisfy e1 < e2 < ... < em, with total resource amount E. The custom allocation then proceeds as follows:
a. allocate E/m resources to test task 1, the task with the smallest demand; this is likely to exceed that task's requirement;
b. reclaim the excess and evenly redistribute (E - e1)/(m - 1) resources among the remaining test tasks;
c. repeat step b until the share offered to some test task no longer meets its requirement. When the k-th test task is reached and its allocated share cannot satisfy its demand, the remaining resources are divided evenly among all test tasks that have not yet obtained resources, and the allocation is complete.
For example, suppose there are four test tasks A, B, C, and D whose resource demands on the automated test system are 1, 4, 4, and 10, and the total resource amount E is 16. The task with the smallest demand, A, is allocated 16/4 = 4 resources, which exceeds its demand by 3, so the excess is reclaimed and the remaining resources are divided evenly among B, C, and D, each receiving 5. B and C also exceed their required resources, so their excess is reclaimed and given to D. Finally, A, B, C, and D obtain 1, 4, 4, and 7 resources, respectively.
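The equal-priority steps a through c above amount to progressive filling in the style of max-min fair sharing. A minimal sketch, with the function name assumed:

```python
def maxmin_allocate(demands, total):
    """Progressively satisfy the smallest demands first; any excess over a
    task's own demand is reclaimed and shared among the remaining tasks."""
    alloc = {}
    remaining = dict(demands)      # task -> demand not yet satisfied
    budget = total
    while remaining:
        share = budget / len(remaining)
        smallest = min(remaining, key=remaining.get)
        if remaining[smallest] <= share:
            # the equal share covers the smallest demand:
            # satisfy it fully and reclaim the excess for the others
            alloc[smallest] = remaining[smallest]
            budget -= remaining[smallest]
            del remaining[smallest]
        else:
            # even the smallest demand exceeds the equal share:
            # split the remaining budget evenly and stop (step c)
            for task in remaining:
                alloc[task] = share
            break
    return alloc
```

Running it on the worked example (demands 1, 4, 4, 10 with E = 16) reproduces the final allocation 1, 4, 4, 7.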
In this optional embodiment, if the judgment result is inconsistent, the priorities of the test tasks in the test task queue are standardized first. For ease of understanding, the allocation under the custom distribution algorithm proceeds as follows:
suppose there are four test tasks whose resource demands on the automated test system are 2, 4, 4, and 10, with corresponding priorities 4, 2.5, 1, and 0.5, and the total resource amount of the automated test system is 16;
first the priorities are standardized, i.e. the minimum priority is set to 1, giving priorities 8, 5, 2, and 1, which sum to 16; the total system resources are divided into 16 equal shares, and the four test tasks receive 8, 5, 2, and 1 shares respectively;
at this point test task 1 has 6 shares more than it needs and test task 2 has 1 share more, while test tasks 3 and 4 remain unsatisfied; the surplus 7 shares are therefore redistributed to tasks 3 and 4 in proportion to priority, so tasks 3 and 4 receive 7 × 2/3 and 7 × 1/3 shares of system resources respectively;
since test task 3 requires only 4 shares and its allocation now exceeds that demand, the excess is finally reallocated to test task 4, completing the allocation.
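The priority-weighted case can be sketched similarly: shares are proportional to normalized priority, and a satisfied task's excess is redistributed among tasks whose demands are not yet met. The iteration below reaches the same final allocation as the worked example, though it redistributes round by round rather than reclaiming all excess at once; the function name and the numeric tolerance are assumptions:

```python
def weighted_allocate(demands, priorities, total):
    """Priority-weighted fair allocation: each round, the remaining budget
    is split among unmet tasks in proportion to normalized priority, capped
    at each task's outstanding demand; reclaimed excess rolls into the
    next round."""
    pmin = min(priorities.values())
    weight = {t: p / pmin for t, p in priorities.items()}  # min priority -> 1
    alloc = {t: 0.0 for t in demands}
    unmet = set(demands)
    budget = float(total)
    while unmet and budget > 1e-9:
        wsum = sum(weight[t] for t in unmet)
        spent = 0.0
        for t in list(unmet):
            # priority-proportional share, capped at the outstanding demand
            give = min(budget * weight[t] / wsum, demands[t] - alloc[t])
            alloc[t] += give
            spent += give
            if demands[t] - alloc[t] < 1e-9:
                unmet.discard(t)   # demand met: drop from future rounds
        budget -= spent
        if spent < 1e-9:
            break
    return alloc
```

On the worked example (demands 2, 4, 4, 10; priorities 4, 2.5, 1, 0.5; total 16) the final allocation is 2, 4, 4, and 6, matching the narrative above.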
Therefore, the test tasks are obtained, and the test tasks are distributed to the test executors according to the custom distribution algorithm, so that the system resources of the automatic test system can be guaranteed to be distributed to the test tasks as fairly as possible, and the use efficiency of the system resources in the software test process is effectively improved.
S13, executing the test task based on the test executor to obtain a test result.
In an optional embodiment, each test executor feeds back a test result after executing a test case in the test task; the result is one of three types: test success, test failure, or test skip.
In this optional embodiment, if the test result is success, then because each logical node carries two ID identifiers (one as its own identifier, the other as the ID of the peer logical node to be run next), the next test case corresponding to the current test case is executed according to these identifiers. If the result of any test case in a test task is failure, the test task corresponding to the current test case ends. If the result of any test case is skip, the test task corresponding to the current test case is skipped and not executed further.
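The three-outcome control flow can be sketched as follows; the enum values, function names, and the returned log format are illustrative assumptions:

```python
from enum import Enum

class Result(Enum):
    SUCCESS = "success"
    FAILURE = "failure"
    SKIP = "skip"

def run_task(cases, execute):
    """Run a task's cases in order. `execute` maps a case to a Result.
    Returns (log, finished_normally): the task continues only while each
    case succeeds, and ends on the first failure or skip."""
    log = []
    for case in cases:
        result = execute(case)
        log.append((case, result.value))
        if result is Result.FAILURE:
            return log, False   # failure ends the whole task
        if result is Result.SKIP:
            return log, False   # skip abandons the task; do not continue
    return log, True            # every case succeeded
```

Ending the task promptly on failure or skip is what frees the executor to pick up a new test task, which is the efficiency point made in the text.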
Therefore, different operations can be performed according to the test result of the test case in the test task, and when the test result is test failure and test skip, a new test task is executed in time, so that the overall execution efficiency of the test task is improved.
And S14, generating a test report for display based on the test result.
In an optional embodiment, the generating a test report for presentation based on the test result comprises:
and S141, the test executor feeds back the test result to generate a test result report.
In this optional embodiment, the server sends multiple test tasks to the test executors for concurrent processing, and each executor feeds back the test result obtained after execution, forming the test result report.
And S142, storing the test result report to the test case database for displaying.
In this optional embodiment, the test result report is stored in the database; the user front end may call a result-report API to display it, so that the user can monitor results in real time and can also download the report in the background.
In this optional embodiment, once the automated test system obtains the test results and forms the test result report, execution of the test task is complete. In the next test round, if the test content is unchanged, the test case information of the previous task need not be deleted; test cases only need to be removed from or added to the original set. This makes it convenient to extend security test cases dynamically without modifying the system structure or redeveloping it.
Therefore, the test results can be stored in a unified mode and displayed at the front end of the user, and the user can monitor the test results in real time conveniently.
Referring to fig. 3, fig. 3 is a functional block diagram of a preferred embodiment of the artificial intelligence based test task distributing apparatus according to the present invention. The artificial intelligence based test task distributing device 11 comprises a generating unit 110, a connecting unit 111, a distributing unit 112, an executing unit 113 and a displaying unit 114. A module/unit as referred to herein is a series of computer readable instruction segments capable of being executed by the processor 13 and performing a fixed function, and is stored in the memory 12. In the present embodiment, the functions of the modules/units will be described in detail in the following embodiments.
In an optional embodiment, the generating unit 110 is configured to build a test case database, and call test case data from the test case database according to a test requirement of a user to generate a test task.
In an optional embodiment, the constructing a test case database, and calling test case data from the test case database according to a test requirement of a user to generate a test task includes:
generating a test case according to the automatic test case generation tool to construct a test case database;
and calling the corresponding test case in the test case database according to the test requirement of the user to generate the test task.
In this optional embodiment, a corresponding test case can be generated for each test task using the automated test case generation tool, and all the test cases are stored in the database to form the test case database.
In this optional embodiment, an encoding tag may be set for the test requirement of the user and the corresponding test case, and is used to directly invoke the corresponding test case in the test case database according to the test requirement input by the user at the client, so as to generate the test task. The encoding tags can be numbers, symbols, letters and the like; this scheme imposes no specific restriction on their format.
In this optional embodiment, when the user has a test requirement, the electronic device may receive the task information to be tested through the user interface of the client, call the corresponding test cases from the test case database according to the task information, and send all the obtained test cases and corresponding test tasks to the server. The client is mainly the graphical user interface between the automated test system and the testers, through which testers can write test tasks, monitor execution states, check reports, and the like. One client can establish a connection with only one server; one client can run multiple test tasks simultaneously, and each test task can call multiple test cases. For example, when testing the functions of the Baidu home page, the test cases may be a series of automated test cases such as test login, test jump, and test search.
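The tag-to-case lookup described above can be sketched as follows; the dictionary `CASE_DB`, the function `build_task`, and the tag names are illustrative assumptions, not defined by the patent.

```python
# Hypothetical sketch: coded tags map user test requirements to stored cases.
CASE_DB = {
    "login": ["case_login_ok", "case_login_bad_pw"],
    "jump": ["case_redirect"],
    "search": ["case_search_basic"],
}

def build_task(requirements):
    """Collect the test cases matching each coded tag in the user's requirement."""
    cases = []
    for tag in requirements:
        cases.extend(CASE_DB.get(tag, []))  # unknown tags simply match nothing
    return {"cases": cases}

task = build_task(["login", "search"])
```

A client requirement such as `["login", "search"]` thus resolves directly to the stored cases without any manual selection step.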
In this optional embodiment, the test task may be stored in an xml file. Each peer directory in the xml file corresponds to a logical node, and each logical node has two ID identifiers: one serves as its own identifier, and the other specifies the ID of the peer logical node to be scheduled next. Each logical node supports scheduling rules such as round robin and concurrency, assisted by auxiliary rules such as delay and assignment.
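A minimal sketch of the two-ID logical-node layout, assuming hypothetical element and attribute names (`node`, `id`, `next`, `rule`), since the patent does not fix a schema:

```python
import xml.etree.ElementTree as ET

# Illustrative task file: each node carries its own id and the id of the
# peer node to be scheduled next; an empty "next" ends the chain.
TASK_XML = """
<task>
  <node id="n1" next="n2" rule="round_robin">test_login</node>
  <node id="n2" next="n3" rule="concurrent">test_jump</node>
  <node id="n3" next="" rule="round_robin">test_search</node>
</task>
"""

def schedule_order(xml_text):
    """Follow each node's 'next' ID to recover the scheduling order."""
    nodes = {n.get("id"): n for n in ET.fromstring(xml_text)}
    order, nid = [], next(iter(nodes))  # start at the first declared node
    while nid:
        order.append(nodes[nid].text)
        nid = nodes[nid].get("next")
    return order
```

Here the second ID plays the role described in the text: it chains the nodes into the order in which they will be scheduled.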
In an alternative embodiment, the connection unit 111 is configured to establish a heartbeat connection between the server and the test executor to obtain a communication connection status between the server and the test executor.
In an optional embodiment, the establishing a heartbeat connection between the server and the test executor to obtain a communication connection state between the server and the test executor includes:
the test executor sends a heartbeat request to the server every other preset fixed period to acquire a heartbeat signal;
and judging the communication connection state between the test executor and the server based on the heartbeat signal.
In this alternative embodiment, the heartbeat signal is a mechanism in which each party periodically sends a small data packet to the other, and whether the communication link between the two parties has been disconnected is determined from the other party's reply.
In this alternative embodiment, the test executor sends a heartbeat request to the server at a preset fixed period to obtain a heartbeat signal; this is referred to as a "heartbeat", and the preset fixed period may be, for example, 3 s.
In this alternative embodiment, the connection state between the test executor and the server may be determined based on the heartbeat signal. If the server receives the heartbeat request, it replies with a heartbeat signal to the test executor, and the server stores only the relevant information of test executors with a live heartbeat.
In this optional embodiment, the heartbeat signal ensures that the server retains only the relevant information of normally operating test executors and deletes the relevant information of unavailable test executors in time, so that all the test executors receiving distributed tasks are normally available.
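The server-side bookkeeping implied by this scheme might look like the following sketch; the registry structure and the two-period timeout are assumptions for illustration, not details fixed by the patent.

```python
HEARTBEAT_PERIOD = 3.0           # seconds between heartbeat requests (example)
TIMEOUT = 2 * HEARTBEAT_PERIOD   # assumed grace before an executor is dropped

class ExecutorRegistry:
    """Keeps only executors whose heartbeat is recent, as described above."""

    def __init__(self):
        self._last_seen = {}

    def heartbeat(self, executor_id, now):
        """Record a heartbeat request; the server replies and keeps the executor."""
        self._last_seen[executor_id] = now

    def alive(self, now):
        """Drop executors whose last heartbeat is stale; return the live ones."""
        self._last_seen = {e: t for e, t in self._last_seen.items()
                           if now - t <= TIMEOUT}
        return sorted(self._last_seen)
```

Only executors returned by `alive` would be considered for task distribution, matching the text's guarantee that allocated executors are available.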
In an optional embodiment, the distributing unit 112 is configured to distribute the test task to the test executor according to a custom distribution algorithm if the communication connection state is normal.
In an optional embodiment, if the communication connection state is normal, distributing the test task to the test executor according to a custom distribution algorithm includes:
carrying out priority division on the test tasks to obtain a test task level data set;
sequencing the test tasks in the test task level data set according to the sequence of the priorities from high to low so as to generate a test task queue;
and distributing the test tasks in the test task queue to a test executor according to a custom distribution algorithm.
In this optional embodiment, each test task may be prioritized in advance: for example, by generation time, where an earlier generated test task has a higher priority; or by importance, where a more important test task has a higher priority. All the prioritized test tasks constitute the task level data set in this scheme.
In this optional embodiment, after receiving the test tasks sent by the client, the server adds the test tasks in the test task level data set to the task queue in order of priority, to await scheduling by a test executor. A test executor is a physical machine, virtual machine, or the like that is specially configured to execute tasks containing test cases in a cloud test platform environment. All the test tasks added to the task queue in priority order constitute the test task queue.
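Building the priority-ordered queue can be sketched with a heap; breaking ties by submission order follows the earlier-task-first example above, and all names here are illustrative.

```python
import heapq
import itertools

def build_queue(tasks):
    """tasks: list of (name, priority); returns names ordered for dispatch,
    highest priority first, earlier submission breaking ties."""
    counter = itertools.count()  # submission order as tie-breaker
    heap = [(-prio, next(counter), name) for name, prio in tasks]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[2] for _ in range(len(heap))]
```

Negating the priority turns Python's min-heap into the max-priority ordering the queue needs.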
In this optional embodiment, before distributing the test tasks in the test task queue to the test executor according to the customized distribution algorithm, it is necessary to count system resources of the automated test system required by each test task in the test task queue to obtain a task resource set, and determine whether priorities of each test task in the test task queue are consistent to obtain a determination result.
In this optional embodiment, the scheme achieves fair and reasonable allocation of the system resources of the automated test system through a custom distribution algorithm: total system resources are allocated, as far as possible, in order of increasing demand, so that no test task receives more resources than it requires, while test tasks whose demands are not yet met share the remaining resources equally. Because multiple test tasks exist in the automated test system at the same time and each has a corresponding priority, the custom distribution algorithm covers two cases: the test task priorities are consistent, or they are inconsistent.
In this optional embodiment, if the priorities are consistent, suppose there are m test tasks in total, and the system resources they require, sorted in ascending order according to the task resource set, satisfy E1 < E2 < … < Em, with total resource amount E. The specific allocation process of the custom allocation algorithm is as follows:
a. allocate the resource E/m to task 1, which has the smallest demand; this may exceed that task's demand;
b. reclaim the excess and evenly redistribute (E − E1)/(m − 1) resources among the remaining tasks;
c. repeat step b until, at some step, the allocated share no longer meets a task's demand. When the share allocated to the k-th task cannot satisfy its demand, the remaining resources are evenly divided among all tasks that have not yet received resources, and the allocation ends.
For example, suppose there are four test tasks A, B, C, D whose resource demands on the automated test system are 1, 4, 4 and 10 respectively, and the total resource amount E is 16. The resource 16/4 = 4 is allocated to test task A, which has the smallest demand; the system resource allocated to task A exceeds its demand by 3, so the excess is reclaimed and the remaining resources are evenly allocated to B, C and D, each receiving 5. B and C also exceed their required resources, so their excess is reclaimed and allocated to D. Finally, the resource amounts obtained by A, B, C and D are 1, 4, 4 and 7 respectively.
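The equal-priority steps a-c amount to progressive filling (max-min fairness). A sketch, using the assumed four-task demands 1, 4, 4 and 10 with total 16; the function name is illustrative:

```python
def allocate_fair(demands, total):
    """Equal-priority custom allocation: fill demands from smallest to
    largest, reclaiming any share that exceeds a task's demand."""
    order = sorted(range(len(demands)), key=lambda i: demands[i])
    alloc = [0.0] * len(demands)
    remaining, left = total, len(demands)
    for i in order:
        share = remaining / left            # equal split among unserved tasks
        alloc[i] = min(share, demands[i])   # step b: reclaim any excess
        remaining -= alloc[i]
        left -= 1
    return alloc
```

Processing tasks in ascending order of demand makes each round's reclaimed excess flow automatically to the larger demands, reproducing the repeated redistribution of steps a-c.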
In this optional embodiment, if the determination result is that the priorities are inconsistent, the priorities of the test tasks in the test task queue are first standardized. For ease of understanding, the process of allocating according to the custom allocation algorithm is illustrated as follows:
suppose there are four test tasks whose resource demands on the automated test system are 2, 4, 4 and 10, with corresponding priorities 4, 2.5, 1 and 0.5 respectively, and the total resource amount of the automated test system is 16;
first, the priorities are standardized by scaling the minimum priority to 1, giving priorities 8, 5, 2 and 1, whose sum is 16; the total system resources of the automated test system are divided into 16 equal parts, and the four test tasks obtain 8, 5, 2 and 1 parts of resources respectively;
at this point, test task 1 has 6 parts more than it needs and test task 2 has 1 part more, while test task 3 and test task 4 remain unsatisfied; the surplus 7 parts are therefore redistributed to test task 3 and test task 4 in proportion to priority, giving them an additional 7 × 2/3 and 7 × 1/3 parts of system resources respectively;
since test task 3 requires only 4 parts and its allocation now exceeds that demand, the excess is finally reallocated from test task 3 to test task 4, which ends with 6 parts, thereby completing the allocation.
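The inconsistent-priority case can be sketched as weighted max-min fairness: normalise priorities to weights, split the remainder in proportion each round, and reclaim any excess over a task's demand. The function name and the float tolerance are illustrative assumptions:

```python
def allocate_weighted(demands, priorities, total):
    """Priority-weighted custom allocation: split `total` in proportion to
    standardized priority, capped at each task's demand; excess is
    redistributed among the still-unsatisfied tasks by the same weights."""
    weights = [p / min(priorities) for p in priorities]  # smallest weight = 1
    alloc = [0.0] * len(demands)
    active = set(range(len(demands)))
    remaining = total
    while remaining > 1e-9 and active:
        wsum = sum(weights[i] for i in active)
        finished, spent = set(), 0.0
        for i in active:
            share = remaining * weights[i] / wsum   # proportional share
            give = min(share, demands[i] - alloc[i])  # reclaim any excess
            alloc[i] += give
            spent += give
            if alloc[i] >= demands[i] - 1e-9:
                finished.add(i)
        remaining -= spent
        active -= finished
    return alloc
```

Run on the worked example (demands 2, 4, 4 and 10, priorities 4, 2.5, 1 and 0.5, total 16), the rounds reproduce the text: tasks 1 and 2 are capped at 2 and 4, the surplus 7 parts go to tasks 3 and 4 in ratio 2:1, and task 3's final excess passes to task 4.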
In an alternative embodiment, the execution unit 113 is configured to execute the test task based on the test executor to obtain a test result.
In an optional embodiment, each test executor feeds back a test result after executing the test cases in a test task, where the test result is one of test success, test failure, or test skip.
In this optional embodiment, if the test result is test success, then, because each logical node has two ID identifiers (one as its own identifier, the other as the ID of the peer logical node to be scheduled next), the next test case after the current test case in the test task is executed according to the ID identifiers. If the test result of any test case in the test task is test failure, the test task corresponding to the current test case is ended. If the test result of any test case in the test task is test skip, the test task corresponding to the current test case is skipped and not executed further.
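This success/failure/skip handling can be sketched as a small interpreter over the two-ID node table; the data layout and names are assumptions for illustration:

```python
def run_task(cases, results):
    """cases: {node_id: (case_name, next_id)};
    results: {case_name: 'success' | 'failure' | 'skip'}.
    Returns the (case, outcome) pairs actually executed."""
    executed, nid = [], next(iter(cases))
    while nid:
        name, next_id = cases[nid]
        outcome = results.get(name, "success")
        if outcome == "skip":
            break                      # task skipped, not executed further
        executed.append((name, outcome))
        if outcome == "failure":
            break                      # task ends on the failing case
        nid = next_id                  # success: follow the second ID onward
    return executed
```

A failure thus terminates the chain at the failing case, while success walks the second-ID links exactly as the text describes.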
In an alternative embodiment, the presentation unit 114 is configured to generate a test report for presentation based on the test result.
In an optional embodiment, the generating a test report for presentation based on the test result comprises:
the test executor feeds back the test result to generate a test result report;
and storing the test result report to the test case database for displaying.
In this optional embodiment, the server sends the multiple test tasks to the test executor for concurrent processing, and the test executor feeds back a test result obtained after the test is executed, so as to form the test result report.
In this optional embodiment, the test result report is stored in the database, and the user front end may call a result-report API to display the report, so that the user can monitor it in real time and download the report in the background.
In this optional embodiment, after the automated testing system obtains the test result and forms the test result report, execution of the test task is complete. For the next test, if the test content is unchanged, the test case information of the previous test task need not be deleted; test cases only need to be deleted from or added to the original set. Test cases can thus be safely and dynamically expanded without modifying the structure of the automated test system or redeveloping.
According to the technical scheme, the test tasks can be acquired, and the test tasks are distributed to the test executors according to the user-defined distribution algorithm, so that the system resources of the automatic test system are guaranteed to be distributed to the test tasks as fairly as possible, and the use efficiency of the system resources in the software test process is effectively improved.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device 1 comprises a memory 12 and a processor 13. The memory 12 is used for storing computer readable instructions, and the processor 13 is used for executing the computer readable instructions stored in the memory to implement the artificial intelligence based test task distributing method according to any one of the above embodiments.
In an alternative embodiment, the electronic device 1 further comprises a bus, a computer program stored in said memory 12 and executable on said processor 13, such as an artificial intelligence based test task distribution program.
Fig. 4 only shows the electronic device 1 with the memory 12 and the processor 13, and it will be understood by a person skilled in the art that the structure shown in fig. 4 does not constitute a limitation of the electronic device 1, and may comprise fewer or more components than shown, or a combination of certain components, or a different arrangement of components.
In conjunction with fig. 1, the memory 12 in the electronic device 1 stores a plurality of computer-readable instructions to implement an artificial intelligence based test task distribution method, and the processor 13 can execute the plurality of instructions to implement:
constructing a test case database, and calling test case data from the test case database according to the test requirements of a user to generate a test task;
establishing heartbeat connection between the server and the test executor to acquire a communication connection state between the server and the test executor;
if the communication connection state is normal, distributing the test task to a test executor according to a custom distribution algorithm;
executing the test task based on the test executor to obtain a test result;
and generating a test report based on the test result for displaying.
Specifically, the processor 13 may refer to the description of the relevant steps in the embodiment corresponding to fig. 1 for a specific implementation method of the instruction, which is not described herein again.
It will be appreciated by those skilled in the art that the schematic diagram is merely an example of the electronic device 1, and does not constitute a limitation to the electronic device 1, the electronic device 1 may have a bus-type structure or a star-shaped structure, and the electronic device 1 may further include more or less hardware or software than that shown in the figure, or different component arrangements, for example, the electronic device 1 may further include an input and output device, a network access device, and the like.
It should be noted that the electronic device 1 is only an example; other existing or future electronic products that can be adapted to the present application should also fall within the scope of protection of the present application and are incorporated herein by reference.
Memory 12 includes at least one type of readable storage medium, which may be non-volatile or volatile. The readable storage medium includes flash memory, removable hard disks, multimedia cards, card-type memory (e.g., SD or DX memory), magnetic memory, magnetic disks, optical disks, and the like. The memory 12 may in some embodiments be an internal storage unit of the electronic device 1, for example a hard disk of the electronic device 1. The memory 12 may also, in other embodiments, be an external storage device of the electronic device 1, such as a plug-in mobile hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash memory card (Flash Card) provided on the electronic device 1. The memory 12 may be used not only to store application software installed in the electronic device 1 and various types of data, such as the code of an artificial intelligence based test task distribution program, but also to temporarily store data that has been output or is to be output.
The processor 13 may be composed of an integrated circuit in some embodiments, for example, a single packaged integrated circuit, or may be composed of a plurality of integrated circuits packaged with the same function or different functions, including one or more Central Processing Units (CPUs), microprocessors, digital Processing chips, graphics processors, and combinations of various control chips. The processor 13 is a Control Unit (Control Unit) of the electronic device 1, connects various components of the electronic device 1 by various interfaces and lines, and executes various functions and processes data of the electronic device 1 by running or executing programs or modules stored in the memory 12 (for example, executing an artificial intelligence based test task distribution program and the like) and calling data stored in the memory 12.
The processor 13 executes the operating system of the electronic device 1 and various installed application programs. The processor 13 executes the application program to implement the steps in each of the artificial intelligence based test task distribution method embodiments described above, such as the steps shown in fig. 1-2.
Illustratively, the computer program may be partitioned into one or more modules/units, which are stored in the memory 12 and executed by the processor 13 to accomplish the present application. The one or more modules/units may be a series of computer-readable instruction segments capable of performing certain functions, which are used to describe the execution of the computer program in the electronic device 1. For example, the computer program may be divided into a generating unit 110, a connecting unit 111, a distributing unit 112, an executing unit 113, a presenting unit 114.
The integrated unit implemented in the form of a software functional module may be stored in a computer-readable storage medium. The software functional module is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a computer device, or a network device) or a processor (processor) to execute the portions of the artificial intelligence based test task distribution method according to the embodiments of the present application.
The integrated modules/units of the electronic device 1 may be stored in a computer-readable storage medium if they are implemented in the form of software functional units and sold or used as separate products. Based on such understanding, all or part of the processes in the methods of the embodiments described above may be implemented by a computer program, which may be stored in a computer-readable storage medium and executed by a processor, to implement the steps of the embodiments of the methods described above.
Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a random-access memory, and other memory.
Further, the computer-readable storage medium may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function, and the like; the storage data area may store data created according to the use of the blockchain node, and the like.
The block chain referred by the application is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, a consensus mechanism, an encryption algorithm and the like. A block chain (Blockchain), which is essentially a decentralized database, is a series of data blocks associated by using a cryptographic method, and each data block contains information of a batch of network transactions, so as to verify the validity (anti-counterfeiting) of the information and generate a next block. The blockchain may include a blockchain underlying platform, a platform product service layer, an application service layer, and the like.
The bus may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one arrow is shown in FIG. 4, but this does not indicate only one bus or one type of bus. The bus is arranged to enable connection communication between the memory 12 and at least one processor 13 or the like.
An embodiment of the present application further provides a computer-readable storage medium (not shown), where the computer-readable storage medium stores computer-readable instructions, and the computer-readable instructions are executed by a processor in an electronic device to implement the artificial intelligence based test task distribution method according to any of the above embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is only one logical functional division, and other divisions may be realized in practice.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional module.
Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means stated in the description may also be implemented by one unit or means in software or hardware. The terms first, second, etc. are used to denote names, but not any particular order.
Finally, it should be noted that the above embodiments are only used for illustrating the technical solutions of the present application and not for limiting, and although the present application is described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions can be made on the technical solutions of the present application without departing from the spirit and scope of the technical solutions of the present application.

Claims (10)

1. A test task distribution method based on artificial intelligence is characterized by comprising the following steps:
constructing a test case database, and calling test case data from the test case database according to the test requirements of a user to generate a test task;
establishing heartbeat connection between a server and a test executor to acquire a communication connection state between the server and the test executor;
if the communication connection state is normal, distributing the test task to a test executor according to a custom distribution algorithm;
executing the test task based on the test executor to obtain a test result;
and generating a test report based on the test result for displaying.
2. The artificial intelligence based test task distribution method of claim 1, wherein the constructing a test case database and calling test case data from the test case database according to a test requirement of a user to generate a test task comprises:
generating a test case according to the automatic test case generation tool to construct a test case database;
and calling the corresponding test case in the test case database according to the test requirement of the user to generate the test task.
3. The artificial intelligence based test task distributing method according to claim 1, wherein the establishing a heartbeat connection between the server and the test executor to obtain a communication connection state between the server and the test executor comprises:
the test executor sends a heartbeat request to the server every other preset fixed period to acquire a heartbeat signal;
and judging the communication connection state between the test executor and the server based on the heartbeat signal.
4. The artificial intelligence based test task distribution method of claim 1, wherein if the communication connection state is normal, distributing the test task to a test executor according to a custom distribution algorithm comprises:
the test tasks are subjected to priority division to obtain a test task level data set;
sequencing the test tasks in the test task level data set according to the sequence of the priorities from high to low so as to generate a test task queue;
and distributing the test tasks in the test task queue to a test executor according to a custom distribution algorithm.
5. The artificial intelligence based test task distribution method of claim 4, wherein before distributing the test tasks in the test task queue to the test executors according to the custom distribution algorithm, the method further comprises:
counting system resources required by each test task in the test task queue to obtain a task resource set;
judging whether the priorities of the test tasks in the test task queue are consistent or not to obtain a judgment result;
if the judgment result is consistent, distributing the test tasks in the test task queue to a test executor according to a user-defined distribution algorithm and the task resource set;
if the judgment result is inconsistent, the priorities of the test tasks in the test task queue are subjected to standardized processing, and then the test tasks in the test task queue are distributed to a test executor according to a user-defined distribution algorithm and the task resource set.
6. The artificial intelligence based test task distribution method of claim 1, wherein the executing the test task based on the test executor to obtain a test result comprises:
executing the test case in the test task based on the test executor to obtain a test result, wherein the test result comprises test success, test failure and test skipping;
if the test result is successful, continuing to execute a next test case corresponding to the current test case in the test task;
if the test result is test failure, ending the test task corresponding to the current test case;
and if the test result is test skipping, skipping the test task corresponding to the current test case and not continuing to execute.
7. The artificial intelligence based test task distribution method of claim 1, wherein the generating for presentation a test report based on the test results comprises:
the test executor feeds back the test result to generate a test result report;
and storing the test result report to the test case database for displaying.
8. An artificial intelligence based test task distribution apparatus, the apparatus comprising:
the generating unit is used for constructing a test case database and calling test case data from the test case database according to the test requirements of a user to generate a test task;
the connection unit is used for establishing heartbeat connection between the server and the test executor so as to acquire a communication connection state between the server and the test executor;
the distribution unit is used for distributing the test task to the test executor according to a custom distribution algorithm if the communication connection state is normal;
the execution unit is used for executing the test task based on the test executor to obtain a test result;
and the display unit is used for generating a test report based on the test result so as to display the test report.
9. An electronic device, characterized in that the electronic device comprises:
a memory storing computer readable instructions; and
a processor executing computer readable instructions stored in the memory to implement the artificial intelligence based test task distribution method of any of claims 1-7.
10. A computer-readable storage medium having computer-readable instructions stored thereon which, when executed by a processor, implement the artificial intelligence based test task distribution method of any of claims 1 to 7.
CN202210598121.5A 2022-05-30 2022-05-30 Test task distribution method based on artificial intelligence and related equipment Pending CN114924981A (en)

Publications (1)

Publication Number Publication Date
CN114924981A true CN114924981A (en) 2022-08-19

CN113515495B (en) Data file distribution method and device, intelligent equipment and computer storage medium
CN113515403B (en) Micro-service state checking method, computer device and storage medium
US20080222231A1 (en) Integration Process and Product for Digital Systems

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination