CN113157569B - Automated testing method, apparatus, computer device and storage medium - Google Patents


Info

Publication number: CN113157569B
Application number: CN202110366426.9A
Authority: CN (China)
Prior art keywords: test, task, state, execution machine, executed
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Other versions: CN113157569A
Inventor: 苏淳开
Current assignee: Ifreecomm Technology Co ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Original assignee: Ifreecomm Technology Co ltd
Application filed by Ifreecomm Technology Co ltd
Priority to CN202110366426.9A
Publication of CN113157569A (application), then of CN113157569B upon grant


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites
    • G06F 11/3684 Test management for test design, e.g. generating new test cases


Abstract

The application relates to an automated testing method, apparatus, computer device, and storage medium. The method comprises the following steps: while a test execution machine executes a test task, acquiring the test data table of the test execution machine from a database at a preset time interval; detecting the current running state of the test execution machine from key fields in the test data table; if the current running state is an abnormal state, determining an exception handling scheme that matches the abnormal state; and executing the exception handling scheme so that the test execution machine is restored to a normal state and continues executing the scheduled test tasks. By adopting the method, the efficiency of automated testing can be improved.

Description

Automated testing method, apparatus, computer device and storage medium
Technical Field
The present application relates to the field of automated testing technologies, and in particular, to an automated testing method, an apparatus, a computer device, and a storage medium.
Background
With the development of automation technology, automated testing is widely applied across industries. In existing automated testing methods, the test state of an execution machine executing a test task is monitored manually to ensure that the execution machine remains in a normal test state.
However, manually monitoring the test state of the execution machine requires substantial labor, and after an execution machine has finished its test task, a tester must manually schedule the next test task for each execution machine, which leaves execution machines idle on standby for long periods while waiting to be assigned test tasks. Keeping the execution machine in a normal test state manually therefore reduces the efficiency of automated testing.
Disclosure of Invention
In view of the foregoing, it is desirable to provide an automated testing method, apparatus, computer device, and storage medium that can improve testing efficiency.
An automated testing method, the method comprising:
while a test execution machine executes a test task, acquiring the test data table of the test execution machine from a database at a preset time interval;
detecting the current running state of the test execution machine from key fields in the test data table;
if the current running state is an abnormal state, determining an exception handling scheme that matches the abnormal state; and
executing the exception handling scheme so that the test execution machine is restored to a normal state and continues executing the scheduled test tasks.
In one embodiment, the abnormal state includes a timed-out, unended task state;
executing the exception handling scheme so that the test execution machine is restored to the normal state, and scheduling a test task to the test execution machine in the normal state for execution, comprises:
sending an end-task instruction, the end-task instruction instructing that the timed-out test task on the test execution machine be ended and the corresponding test resources released, so that the machine recovers from the timed-out, unended task state to the normal, awaiting-task state; and
after the test execution machine has recovered to the normal, awaiting-task state, scheduling a test task to it for execution.
In one embodiment, the method further comprises:
if a task end request is received, ending the test task specified by the task end request from among the test tasks scheduled to the test execution machine, and generating a corresponding test result report; and
sending the test result report.
In one embodiment, the abnormal state includes a timed-out, unstarted task state;
executing the exception handling scheme so that the test execution machine is restored to the normal state, and scheduling a test task to the test execution machine in the normal state for execution, comprises:
sending alarm information, the alarm information indicating that the test execution machine should be recovered from the timed-out, unstarted task state to the normal, awaiting-task state; and
after the test execution machine has recovered to the normal, awaiting-task state, scheduling a test task to it for execution.
In one embodiment, the test tasks scheduled to the test execution machine are arranged in a task queue; the method further comprises:
if the current running state is the normal state and a task addition request is received, inserting the new test task specified by the task addition request into the task queue according to its priority, so as to update the execution order of the test tasks in the queue.
In one embodiment, the test tasks scheduled to the test execution machine are arranged in a task queue; the method further comprises:
if the current running state is a state in which a test task has ended ahead of schedule, advancing the start time of the next test task in the queue so that the test execution machine executes it at the modified start time, the modified start time being earlier than the original start time.
In one embodiment, the method further comprises:
if all the test tasks scheduled to the test execution machine for the same test period have been executed and the test period has not yet ended, scheduling secondary test tasks, of lower priority than the executed tasks, to the test execution machine; and
continuing to execute the secondary test tasks on the test execution machine until the test period ends.
An automated testing apparatus, the apparatus comprising:
an acquisition module configured to acquire the test data table of the test execution machine from a database at a preset time interval while the test execution machine executes a test task;
a checking module configured to detect the current running state of the test execution machine from key fields in the test data table;
a determining module configured to, if the current running state is an abnormal state, determine an exception handling scheme that matches the abnormal state; and
an execution module configured to execute the exception handling scheme so that the test execution machine is restored to a normal state and continues executing the scheduled test tasks.
A computer device comprising a memory storing a computer program and a processor that, when executing the computer program, performs the following steps:
while a test execution machine executes a test task, acquiring the test data table of the test execution machine from a database at a preset time interval;
detecting the current running state of the test execution machine from key fields in the test data table;
if the current running state is an abnormal state, determining an exception handling scheme that matches the abnormal state; and
executing the exception handling scheme so that the test execution machine is restored to a normal state and continues executing the scheduled test tasks.
A computer-readable storage medium having stored thereon a computer program that, when executed by a processor, performs the same steps:
while a test execution machine executes a test task, acquiring the test data table of the test execution machine from a database at a preset time interval;
detecting the current running state of the test execution machine from key fields in the test data table;
if the current running state is an abnormal state, determining an exception handling scheme that matches the abnormal state; and
executing the exception handling scheme so that the test execution machine is restored to a normal state and continues executing the scheduled test tasks.
According to the above automated testing method, apparatus, computer device, and storage medium, while the test execution machine executes a test task, the key fields of the executor's test data table in the database are acquired at the preset time interval so as to monitor the running state of the test execution machine during testing. If the test execution machine is in an abnormal state, an exception handling scheme for that abnormal state is determined and executed, restoring the machine to a normal state so that it continues executing the scheduled test tasks. The test execution machine can therefore keep executing scheduled test tasks without manual intervention, which removes cumbersome manual steps and improves the efficiency of automated testing on the test execution machine.
Drawings
FIG. 1 is a diagram of an application environment for an automated test method in one embodiment;
FIG. 2 is a flow diagram of an automated test method in one embodiment;
FIG. 3 is a flow chart of an automated testing method in another embodiment;
FIG. 4 is a flow chart of an automated test method in yet another embodiment;
FIG. 5 is a block diagram of an automated testing apparatus in one embodiment;
FIG. 6 is a block diagram of an automated testing apparatus in another embodiment;
FIG. 7 is an internal block diagram of a computer device in one embodiment;
FIG. 8 is an internal block diagram of a computer device in another embodiment.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
The automated testing method provided by the application can be applied in the application environment shown in FIG. 1, in which the scheduling platform 104, the test execution machine 102, and the database 106 communicate over a network. While the test execution machine 102 executes test tasks, its test data table may be stored in the database 106. During execution of a test task by the test execution machine 102, the scheduling platform 104 may acquire the test data table of the test execution machine 102 from the database 106 at a preset time interval and detect the current running state of the test execution machine 102 from key fields in the table. If the current running state of the test execution machine 102 is an abnormal state, the scheduling platform 104 determines an exception handling scheme that matches the abnormal state, and by executing the scheme restores the test execution machine 102 to a normal state so that it continues executing the scheduled test tasks. The scheduling platform 104 may be implemented as a cloud platform. The test execution machine 102 may be, but is not limited to, a personal computer, a notebook computer, or a tablet computer.
In one embodiment, as shown in FIG. 2, an automated testing method is provided. Taking its application to the scheduling platform of FIG. 1 as an example, the method includes the following steps:
Step 202: while the test execution machine executes a test task, acquire the test data table of the test execution machine from the database at a preset time interval.
Here, the test execution machine is an execution machine that executes test tasks; the preset time interval is a time interval configured in advance; a test task is a task that the scheduling platform schedules to the test execution machine for testing; and the test data table is a table recording the test data produced during the machine's testing.
Specifically, the scheduling platform schedules test tasks to the test execution machine for execution. The test execution machine stores the test data table generated while executing a test task into the database, and while the machine executes the test task, the scheduling platform acquires its test data table from the database at the preset time interval.
In one embodiment, the test executor may upload the generated test data table to a server database for storage. The scheduling platform may read a test data table stored in the server database.
In one embodiment, the test execution machine may store the generated test data table in a local database and generate a corresponding storage address. The test execution machine may send that storage address to the server database, and through the storage address held in the server database the scheduling platform can read the test data table stored in the machine's local database.
In one embodiment, the preset time interval may be every 30 minutes: while the test execution machine executes the test task, the scheduling platform acquires its test data table from the database every 30 minutes.
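The periodic acquisition described above can be sketched as a simple polling loop. This is a minimal illustration, not the patented implementation: the `fetch_table`, `on_table`, and `task_running` callables and the injectable `sleep` parameter are all assumptions introduced for the example.

```python
import time
from typing import Callable

# The 30-minute preset interval mentioned in the text.
PRESET_INTERVAL_SECONDS = 30 * 60

def poll_test_data_table(
    fetch_table: Callable[[], dict],
    on_table: Callable[[dict], None],
    task_running: Callable[[], bool],
    interval_seconds: float = PRESET_INTERVAL_SECONDS,
    sleep: Callable[[float], None] = time.sleep,
) -> int:
    """While the test task runs, fetch the executor's test data table
    from the database at the preset interval and hand it to the state
    checker. Returns the number of polls performed."""
    polls = 0
    while task_running():
        on_table(fetch_table())  # e.g. a SELECT against the server database
        polls += 1
        sleep(interval_seconds)
    return polls
```

In a real deployment, `fetch_table` would query the server database and `on_table` would run the key-field check of step 204; injecting `sleep` merely keeps the loop testable.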
In one embodiment, the scheduling platform, the test execution machines, and the server form a distributed automated testing system, in which the processes associated with test tasks may be implemented in the Python language.
Step 204: detect the current running state of the test execution machine from the key fields in the test data table.
A key field is a field that indicates the running state of the test execution machine, and the current running state is the state the machine is in at present. Running states comprise a normal state and an abnormal state: the normal state is one in which no abnormality occurs while the test execution machine executes the test task; the abnormal state is one in which an abnormality does occur.
Specifically, after acquiring the test data table of the test execution machine from the database at the preset time interval, the scheduling platform can extract the key fields from the table and detect the machine's current running state from them.
In one embodiment, after acquiring the test data table at the preset time interval, the scheduling platform extracts, through field matching, the key fields related to the running state of the test execution machine.
In one embodiment, a key field may be an English character string, a numerical value, or a punctuation mark; this is not limited here.
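One way the key-field check could look in Python, the implementation language the patent names, is shown below. The field names (`task_status`, `deadline`, `scheduled_start`) and the state labels are assumptions for illustration, since the patent does not fix a table schema.

```python
# Hypothetical state labels; the patent only distinguishes normal from
# abnormal states such as "timed out without ending/starting".
NORMAL = "normal"
TIMEOUT_NOT_ENDED = "timeout_not_ended"
TIMEOUT_NOT_STARTED = "timeout_not_started"

def detect_state(table: dict, now: float) -> str:
    """Classify the executor's current running state from key fields of
    the test data table (field names here are assumptions)."""
    status = table.get("task_status")        # e.g. "RUNNING", "PENDING", "DONE"
    deadline = table.get("deadline")         # planned end time of the task
    start_by = table.get("scheduled_start")  # planned start time of the task
    if status == "PENDING" and start_by is not None and now > start_by:
        return TIMEOUT_NOT_STARTED           # should have started but has not
    if status == "RUNNING" and deadline is not None and now > deadline:
        return TIMEOUT_NOT_ENDED             # exceeded its test duration
    return NORMAL
```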
Step 206: if the current running state is an abnormal state, determine an exception handling scheme that matches the abnormal state.
An exception handling scheme is a scheme for handling the abnormal state of the test execution machine.
Specifically, if the key fields in the test data table show that the current running state of the test execution machine is abnormal, the scheduling platform can determine an exception handling scheme that matches that abnormal state.
In one embodiment, the scheduling platform may determine that sending alarm information, that is, alert and reminder information, is the exception handling scheme.
In one embodiment, the scheduling platform may determine that sending an end-task instruction to the test execution machine is the exception handling scheme.
In one embodiment, the scheduling platform may determine that sending alarm information to a staff member's terminal is the exception handling scheme.
In one embodiment, the scheduling platform may send the alarm information to the staff member's terminal through at least one communication channel, such as WeChat or email.
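The matching of an abnormal state to its handling scheme can be sketched as a dispatch table. The state labels, message formats, and channel names below are illustrative assumptions; the patent only requires that each abnormal state have a matching scheme.

```python
def end_task_instruction(executor_id: str) -> dict:
    # Instruct the executor to end the timed-out task and release resources.
    return {"to": executor_id, "action": "end_task_and_release_resources"}

def alarm_information(executor_id: str, channels=("wechat", "email")) -> dict:
    # Alarm pushed to a staff terminal over one or more channels.
    return {"to": "staff_terminal", "channels": list(channels),
            "text": f"executor {executor_id} is in an abnormal state"}

# One handling scheme per abnormal state.
EXCEPTION_SCHEMES = {
    "timeout_not_ended": end_task_instruction,
    "timeout_not_started": alarm_information,
}

def handle_abnormal_state(state: str, executor_id: str) -> dict:
    """Look up and build the handling action that matches the state."""
    return EXCEPTION_SCHEMES[state](executor_id)
```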
Step 208: restore the test execution machine to a normal state by executing the exception handling scheme, so that it continues executing the scheduled test tasks.
Specifically, after the exception handling scheme that the scheduling platform determined to match the abnormal state is executed, the test execution machine recovers from the abnormal state to the normal state, and then continues executing the test tasks scheduled by the scheduling platform.
In one embodiment, after the matching exception handling scheme has been executed, the test execution machine may either continue the test task that was left unexecuted when the exception occurred or leave it unexecuted.
In one embodiment, the test execution machine may store the test data table generated by every execution of a test task, or store only the most recent test data table for a given test task.
In one embodiment, if the key fields in the test data table show that the current running state of the test execution machine is abnormal, the scheduling platform may send an end-task instruction to the test execution machine, which receives and executes it to end the test task it specifies.
In one embodiment, the scheduling platform may send alarm information to a staff member's terminal; after viewing it, the staff member can resolve the abnormality of the test execution machine accordingly.
In one embodiment, the scheduling platform may send the alarm information to the staff member's terminal through at least one communication channel, such as WeChat or email.
In the above automated testing method, while the test execution machine executes a test task, the scheduling platform matches the key fields of the executor's test data table in the database at the preset time interval to monitor whether the test execution machine is in an abnormal state during testing. If it is, the scheduling platform determines an exception handling scheme for that abnormal state and executes it, so that the test execution machine recovers to the normal state and continues executing the scheduled test tasks. The test execution machine can thus keep executing scheduled test tasks without manual intervention, which removes cumbersome manual steps and improves the efficiency of automated testing on the test execution machine.
In one embodiment, the abnormal state includes a timed-out, unended task state. Executing the exception handling scheme so that the test execution machine is restored to the normal state, and scheduling a test task to the machine in the normal state for execution, includes: sending an end-task instruction that instructs the test execution machine to end the timed-out test task and release the corresponding test resources, so that the machine recovers from the timed-out, unended task state to the normal, awaiting-task state; and, after the machine has recovered to that state, scheduling a test task to it for execution.
The timed-out, unended task state is a state in which a test task has exceeded its allotted test duration without ending. The end-task instruction is an instruction to end such a timed-out test task. Test resources are the resources the test execution machine needs to execute a test task.
Specifically, the scheduling platform detects from the test data table in the database that the test execution machine is in the timed-out, unended task state, and may send an end-task instruction for the timed-out test task. On receiving the instruction, the test execution machine ends the timed-out test task and releases the corresponding test resources, recovering to the normal, awaiting-task state, after which the scheduling platform schedules a test task to the machine for execution.
In one embodiment, the scheduling platform may send the end-task instruction to a staff member's terminal. The staff member receives the instruction on the terminal and adjusts the test execution machine accordingly, ending the timed-out test task and releasing its test resources so that the machine recovers to the normal, awaiting-task state.
In one embodiment, the scheduling platform may instead send the end-task instruction directly to the test execution machine, which receives it, ends the timed-out test task, and releases the corresponding test resources to recover to the normal, awaiting-task state.
In one embodiment, the scripts associated with the automated test tasks may be implemented in the Python language and may include at_schedule and at_runner test instructions. The at_schedule instruction implements the scheduling of test tasks, and the at_runner instruction is the instruction by which the test execution machine executes a test task: through at_runner, the machine can apply for the test environment, release test resources, start the test, upload the test data table, and end the test task.
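A sketch of the at_runner lifecycle named above (apply for the environment, start the test, upload the test data table, then release resources and end the task). Only the step order comes from the text; the `env_pool` and `db` objects and their methods are assumptions introduced for the example.

```python
def at_runner(task: dict, env_pool, db) -> dict:
    """Run one test task through the at_runner-style lifecycle.
    `env_pool` and `db` stand in for the environment manager and the
    server database; their interfaces are assumed, not the patent's."""
    env = env_pool.acquire(task["id"])        # apply for the test environment
    try:
        result = env.run(task["cases"])       # start the test
        db.upload_table(task["id"], result)   # upload the test data table
        return result
    finally:
        env_pool.release(env)                 # release resources / end the task
```

The `finally` block guarantees that test resources are released even when a test run raises, which mirrors the end-and-release behavior the end-task instruction is meant to restore.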
In this embodiment, the scheduling platform monitors the running state of the test execution machine and can send an end-task instruction for a machine in the timed-out, unended task state, so that the machine recovers to the normal, awaiting-task state and tasks can again be scheduled to it, improving the efficiency of automated testing on the test execution machine.
In one embodiment, the method further comprises: if a task end request is received, ending the test task specified by the request from among the test tasks scheduled to the test execution machine, generating a corresponding test result report, and sending the report.
A task end request is a request to end a test task that has been scheduled to the test execution machine. A test result report is a report generated by the scheduling platform reflecting the pass rate of the test tasks executed by the machine and the information corresponding to that pass rate.
Specifically, if a task end request is received, the scheduling platform ends the specified test task from among those scheduled to the test execution machine, processes the test results of the executed tasks to generate a corresponding test result report, and may send the generated report to a report library.
In one embodiment, the scheduling platform may also send the generated test result report to the report library and to an enterprise WeChat group on the staff member's terminal.
In one embodiment, when a task end request is received, the test execution machine may be in either a normal or an abnormal state of executing the test task.
In one embodiment, in the period after the scheduling platform last detected, at the preceding preset time interval, that the test execution machine was in a normal test state and before the next interval is reached, the platform may receive a task end request directed at the machine's normal state; on receiving it, the platform ends the test task specified by the request from among those scheduled to the machine and generates a test result report.
In one embodiment, the scheduling platform may likewise receive, in that same period, a task end request directed at an abnormal state of the test execution machine; on receiving it, the platform ends the specified test task and generates a test result report.
In one embodiment, the test tasks may all be scheduled to the test execution machine at once, or only one task may be scheduled at a time, with the next task scheduled only after the current one completes.
In one embodiment, if multiple test tasks are scheduled to the execution machine, the scheduling platform may, on receiving a task end request, end at least one test task specified by the request.
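Ending the requested tasks and producing a pass-rate report could be sketched as follows. The task and report fields (`cases_run`, `cases_passed`, `pass_rate`) are assumptions; the patent only says the report reflects the pass rate and its corresponding information.

```python
def end_requested_tasks(scheduled: list, request_ids: set) -> tuple:
    """Remove the tasks named in the task end request from those
    scheduled to the executor; returns (remaining, ended)."""
    ended = [t for t in scheduled if t["id"] in request_ids]
    remaining = [t for t in scheduled if t["id"] not in request_ids]
    return remaining, ended

def build_result_report(ended: list) -> dict:
    """Summarize the pass rate across the ended tasks (fields assumed)."""
    total = sum(t["cases_run"] for t in ended)
    passed = sum(t["cases_passed"] for t in ended)
    rate = passed / total if total else 0.0
    return {"tasks": [t["id"] for t in ended], "pass_rate": rate}
```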
In this embodiment, on receiving a task end request, the scheduling platform can end the scheduled tasks named in the request, so the execution machine need not run the tasks requested to end and no manual end operation is required, which improves the efficiency of automated testing on the test execution machine.
In one embodiment, the abnormal state includes a timed-out, unstarted task state. Executing the exception handling scheme so that the test execution machine is restored to the normal state, and scheduling a test task to the machine in the normal state for execution, includes: sending alarm information indicating that the test execution machine should be recovered from the timed-out, unstarted task state to the normal, awaiting-task state; and, after the machine has recovered to that state, scheduling a test task to it for execution.
The timed-out, unstarted task state is a state in which a test task has still not started after its scheduled test time has been exceeded.
Specifically, the scheduling platform detects, through the test data table of the test execution machine in the database, that the machine is in the timed-out, unstarted task state and may send alarm information. After the machine recovers to the normal, awaiting-task state, the scheduling platform schedules a test task to it for execution.
In one embodiment, for a machine in the timed-out, unstarted task state, the scheduling platform may send alarm information to a staff member's terminal to inform the staff member. Once informed, the staff member restores the test execution machine to the normal, awaiting-task state; after the scheduling platform detects the recovery, it can schedule a test task to the machine for execution.
In one embodiment, the alarm information may include the test product corresponding to the test task, the test case level within the task, and the execution result data generated by executing the task.
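Composing alarm information carrying the three fields just listed might look like this; the dictionary keys and the message wording are assumptions for illustration.

```python
def build_alarm_information(task: dict, result_data: dict) -> dict:
    """Alarm information with the test product, the test case level,
    and the execution result data for a task that did not start in time
    (all field names are hypothetical)."""
    return {
        "product": task["product"],        # test product for the task
        "case_level": task["case_level"],  # test case level in the task
        "result_data": result_data,        # execution result data so far
        "message": f"task {task['id']} has not started within its time limit",
    }
```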
In this embodiment, the scheduling platform monitors the running state of the test execution machine, and for a machine in the timed-out, unstarted task state, sending the alarm information allows the machine to be recovered to the normal, awaiting-task state so that tasks can be scheduled to it, improving the efficiency of automated testing on the test execution machine.
In one embodiment, the test tasks scheduled onto the test execution machine are arranged in a task queue; the method further comprises the steps of: if the current running state is the normal state and a task adding request is received, inserting the to-be-added test task specified by the task adding request into the task queue according to the priority of the test task, so as to update the execution order of the test tasks in the queue.
A task queue is a queue in which at least one test task is arranged. A task adding request is a request to add a new test task to the task queue.
Specifically, if the scheduling platform detects that the current running state of the test execution machine is the normal state and a task adding request has been received, it inserts the to-be-added test task specified by the request into the task queue according to the priority of the test task, thereby updating the execution order of the test tasks in the queue.
In one embodiment, the scheduling platform may rank the priorities of test tasks by the reuse degree of the test cases in the tasks, by the test durations of the test cases, or by the degree of match between the test cases and the products under test.
In one embodiment, the scheduling platform may receive a plurality of task adding requests; after receiving them, it may insert each specified to-be-added test task into the task queue according to the priority of the test task.
In this embodiment, upon receiving a task adding request, the scheduling platform can insert the to-be-added test task into the task queue and update the execution order of the tasks in the queue without any manual operation, thereby improving the automated testing efficiency of the test execution machine.
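A minimal sketch of the priority-ordered task queue described above, using Python's `heapq`. The priority values and task names are illustrative: the patent does not fix a concrete ranking function, and the FIFO tie-breaker within a priority level is an assumption.

```python
import heapq
import itertools

# Sketch of a priority-ordered task queue; priority numbers are illustrative
# (lower number = higher priority), and the tie-breaking counter is an
# assumption that keeps FIFO order among tasks of equal priority.
_counter = itertools.count()

def add_task(queue, task, priority):
    """Insert a task so the queue always pops the highest-priority task first."""
    heapq.heappush(queue, (priority, next(_counter), task))

def next_task(queue):
    """Remove and return the highest-priority task."""
    return heapq.heappop(queue)[2]
```

A task adding request then reduces to one `add_task` call; the queue's pop order is the updated execution order.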
In one embodiment, the test tasks scheduled onto the test execution machine are arranged in a task queue; the method further comprises the steps of: if the current running state is the early-ended test task state, modifying the start time of the test task that follows the early-ended test task in the task queue, so that the test execution machine executes that next test task at the modified start time; the modified start time is earlier than the start time before modification.
The early-ended test task state is a state in which the test execution machine has finished executing a test task before the full test duration of the task has elapsed.
Specifically, if the scheduling platform detects that the current running state of the test execution machine is the early-ended test task state, it modifies the start time of the next test task in the task queue so that the test execution machine executes that task at the modified start time, the modified start time being earlier than the start time before modification.
In one embodiment, the scheduling platform may set the start time of the task that follows the early-ended test task in the queue to a preset duration after the current time, so that the test execution machine executes the next test task at the modified start time.
For example, the scheduling platform may set the start time of that next test task to 5 minutes after the current time.
In this embodiment, the scheduling platform monitors the running state of the test execution machine, and when a test task ends early it moves the start time of the next test task in the queue forward, so that the test execution machine begins the next task sooner, thereby improving the automated testing efficiency of the test execution machine.
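The early-finish optimization above can be sketched in a few lines. The 5-minute default mirrors the example in the text; the task's field name is an assumption.

```python
from datetime import datetime, timedelta

# Sketch of the early-finish optimization: pull the next task's start time
# forward to a preset delay after "now". The 5-minute delay follows the
# example in the text; the "start_time" field name is an assumption.
def advance_next_start(next_task, now, delay=timedelta(minutes=5)):
    """Move the next task's start time earlier when the previous task ended early."""
    new_start = now + delay
    if new_start < next_task["start_time"]:  # only ever move the start earlier
        next_task["start_time"] = new_start
    return next_task["start_time"]
```

The guard guarantees the invariant stated in the embodiment: the modified start time is never later than the start time before modification.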
In one embodiment, the method further comprises: if the test tasks in the same test period scheduled to the test executor are executed completely and the test period is not finished, scheduling the secondary test tasks with the priority lower than that of the executed test tasks to the test executor; and continuing to execute the secondary test task on the test execution machine until the test period is over.
A test period is a preset duration over which test tasks are run cyclically. A secondary test task is a test task whose priority is lower than that of the already-executed test tasks.
Specifically, the scheduling platform schedules at least one test task to the test executor in advance in each test period. If the test tasks in the same test period scheduled to the test executor are executed and the test period is not ended, the scheduling platform can schedule the secondary test tasks to the test executor so as to continue to execute the secondary test tasks on the test executor until the test period is ended.
In one embodiment, the scheduling platform may schedule a secondary test task, whose priority is lower than that of the executed test tasks, to the test execution machine a preset duration before the test tasks in the same test period are expected to finish.
For example, the scheduling platform may schedule the secondary test task to the test execution machine 5 minutes before the test tasks in the same test period finish executing.
In one embodiment, if a test period ends and all the test tasks scheduled in that period have been executed by the test execution machine, the scheduling platform may create a new test period on the project management system, schedule the corresponding test tasks after creating it, and set information such as the expected start time, the expected end time, and the corresponding test product for each task. The project management system may be a system developed on the basis of JIRA, a tool for tracking software defects.
In one embodiment, the test period may be measured in units of hours, days, months, and so on. When measured in hours, it may be a span of time within a day, such as 7:00 a.m. to 11:00 p.m.
In one embodiment, the test period may run from one time point in the day to another, such as 7:00 a.m. to 11:00 p.m. If the test tasks scheduled to the test execution machine are all completed by 7:00 p.m., the scheduling platform may schedule the secondary test task to the test execution machine and continue executing it there until the test period ends.
In one embodiment, the scheduling platform may pre-schedule a plurality of high-level test tasks, i.e., high-priority test tasks, to the test execution machine in each test period. If all the high-level test tasks in the same test period have been executed and the test period has not yet ended, the scheduling platform may schedule secondary test tasks to the test execution machine and continue executing them there until the test period ends.
For example, suppose the test period is 7:00 a.m. to 11:00 p.m. If, at 7:00 p.m., the primary and secondary test tasks scheduled to the test execution machine have already been executed, the scheduling platform may schedule tertiary test tasks to the test execution machine. If the test execution machine has finished the tertiary test tasks by 11:00 p.m., scheduling of quaternary test tasks is suspended until 7:00 a.m. the next day, when the next test period begins.
In one embodiment, if one test period ends and the test tasks scheduled in the test period are all executed by the test execution machine, the scheduling platform modifies the start time of the test task corresponding to the next test period to be earlier than the start time before modification.
In this embodiment, the scheduling platform monitors the running state of the test execution machine. Within a test period, once the test tasks scheduled to the test execution machine have all been executed, secondary test tasks with lower priority can be sent to it, so that the test execution machine keeps executing test tasks throughout the test period, thereby improving its automated testing efficiency.
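The tier-by-tier scheduling described above can be sketched as a simple loop. The tier structure, the clock callback, and the task runner are illustrative assumptions; the patent only specifies that lower-priority tasks fill the remainder of the period.

```python
# Sketch of filling an idle executor with lower-priority tasks until the
# test period ends (e.g. 7:00 a.m. to 11:00 p.m.). The tier dictionary,
# clock callback, and runner are illustrative assumptions.
def fill_period(tiers, period_end, now_fn, run_task):
    """Run tasks tier by tier (1 = highest priority) until the period ends."""
    for priority in sorted(tiers):
        for task in tiers[priority]:
            if now_fn() >= period_end:
                return  # period over: stop scheduling lower tiers
            run_task(task)
```

With a period ending at 11:00 p.m., tertiary tasks are only dispatched if the clock has not yet passed the period end when the secondary tier drains.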
In one embodiment, as shown in fig. 3, another automated testing method is provided, which specifically includes the steps of:
Step 302, the scheduling platform checks the running state of the test executor through the key field of the test data table of the test executor in the database at intervals of 5 minutes.
Step 304, the scheduling platform checks whether the test execution machine has a daily build task. If not, step 306 is performed; if so, step 308 is performed. A daily build task is a test task within one test period.
In step 306, the scheduling platform creates a daily build task for the test execution machine.
In step 308, the scheduling platform detects whether the test execution machine is in the timeout non-started task state. If so, step 310 is performed; if not, step 312 is performed.
In one embodiment, the scheduling platform may also detect whether a task end request has been received; if so, step 310 is performed, otherwise step 312 is performed. A task end request is a request to end a task; it may be a request to actively end the task or a request to abort it.
In step 310, the scheduling platform issues an alarm to notify the administrator to handle the test execution machine exception.
In one embodiment, the dispatch platform may send the alert information to an administrator's enterprise WeChat.
Step 312, determine whether there is a task add request. If yes, go to step 314. If not, go to step 316.
In step 314, the scheduling platform may update the test task to be newly added specified by the task addition request to the task queue according to the priority of the test task.
In step 316, the scheduling platform may detect whether the test executor is in a timeout unfinished task state. If yes, go to step 318. If not, go to step 320.
In one embodiment, after the test execution machine ends the timed-out task and updates its running state, the scheduling platform may generate a test result report and send it through enterprise WeChat.
In step 318, the scheduling platform may send a task ending instruction to instruct the test executor to end the test task with a timeout and release the test resource.
In step 320, the scheduling platform may detect whether the test executor is in a state to end the test task in advance. If yes, go to step 322. If not, go to step 324.
In step 322, the scheduling platform may modify the start time of the test task that follows the early-ended test task in the task queue, so that the test execution machine executes that next test task at the modified start time.
In step 324, the scheduling platform may detect whether all test tasks in the same test cycle on the test execution machine have been executed. If yes, go to step 326. If not, after the test period is finished, the next test period is entered.
In step 326, the scheduling platform may schedule the secondary test tasks to the test executor.
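Step 318's task-ending path can be sketched as follows. The state names and the resource-pool representation are illustrative assumptions; the patent only requires that the timed-out task be ended, its test resources released, and the executor returned to the task-to-be-executed state.

```python
# Hypothetical sketch of step 318: end a timed-out task, release its test
# resources back to a pool, and return the executor to the ready state.
# State names and the list-based resource pool are illustrative assumptions.
def end_timed_out_task(executor, resource_pool):
    """Apply the task-ending instruction to an executor stuck in the
    timeout unfinished task state."""
    if executor["state"] == "timeout_unfinished":
        task = executor.pop("current_task", None)
        if task is not None:
            resource_pool.extend(task.get("resources", []))  # release resources
        executor["state"] = "ready"  # back to the task-to-be-executed state
    return executor["state"]
```

After this returns `"ready"`, the dispatch loop can schedule the next queued task to the executor.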
In one embodiment, reference is made to the flowchart of another automated test method shown in FIG. 4. The method specifically comprises the following steps:
in step 402, the scheduling platform may obtain, from the database, the situations of executing test tasks by each test executor.
In step 404, the scheduling platform may check whether there is a new daily build task according to the storage address of daily build tasks. If there is a new daily build task, step 406 is performed; if not, step 408 is performed.
In step 406, the scheduling platform may take the new daily build task as a reserved test task for the next cycle of the test execution machine.
Step 408, determining whether a task add request is received. If yes, go to step 410, if no, go to step 412.
Step 410, the scheduling platform updates the test task to be newly added designated by the task adding request to the task queue according to the priority of the test task.
Step 412, determining whether the primary test task and the secondary test task in the same test period scheduled to the test execution machine have been executed, and the test period has not been completed. If yes, go to step 414, if not, go to the next test period after the test period is over.
In step 414, the scheduling platform may schedule the tertiary test tasks to the test executor.
In one embodiment, the scheduling platform may obtain the current time every 5 minutes and compare it with the timestamps recorded in the test data table of the test execution machine in the database, so as to detect the running state of the test execution machine. While the test execution machine executes a daily build task, if a new daily build task exists, the scheduling platform may read the test cases of the new daily build task, form a plurality of test tasks according to the priorities of those test cases, and take the resulting tasks as reserved test tasks for the next cycle of the test execution machine. The scheduling platform may synchronize the newly reserved test tasks to the project management system.
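The timestamp comparison above can be sketched as a small classifier over the executor's test data table row. The thresholds, field names, and state labels are assumptions for illustration; the patent does not fix concrete values.

```python
# Sketch of the polling check: derive the executor's running state by
# comparing the current time with the last-progress timestamp in its test
# data table row. Thresholds, field names, and state labels are assumptions.
POLL_INTERVAL = 5 * 60     # seconds between checks (the text's 5 minutes)
STALL_THRESHOLD = 30 * 60  # no progress for 30 minutes counts as abnormal

def classify_state(row, now):
    """Map a test data table row to normal / timeout_unfinished /
    timeout_not_started, per the abnormal states in the description."""
    idle_for = now - row["last_progress_ts"]
    if row["running"] and idle_for > STALL_THRESHOLD:
        return "timeout_unfinished"   # task started but stopped making progress
    if not row["running"] and row["has_pending_task"] and idle_for > STALL_THRESHOLD:
        return "timeout_not_started"  # task never started within its window
    return "normal"
```

The scheduling platform would call this once per `POLL_INTERVAL` for each executor and dispatch the matching exception handling scheme for any abnormal result.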
It should be understood that, although the steps in the flowcharts of the above embodiments are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the execution of these steps is not strictly limited in order, and they may be performed in other orders. Moreover, at least some of the steps in the flowcharts may include sub-steps or stages that are not necessarily completed at the same moment but may be performed at different moments, and these sub-steps or stages are not necessarily performed sequentially; they may be performed in turn or alternately with other steps or with sub-steps or stages of other steps.
In one embodiment, as shown in FIG. 5, an automated test equipment 500 is provided, comprising: an acquisition module 502, a check module 504, a determination module 506, and an execution module 508, wherein:
the obtaining module 502 is configured to obtain, from the database, a test data table of the test executor according to a preset time interval during execution of the test task by the test executor.
And the checking module 504 is configured to detect a current running state of the test executor according to the key field in the test data table.
A determining module 506, configured to determine an exception handling scheme that matches the exception status if the current running status is the exception status.
And the execution module 508 is used for enabling the test execution machine to be restored to a normal state by executing the exception handling scheme so as to continue executing the scheduled test task.
In one embodiment, the exception state includes a timeout not-ending task state; the execution module 508 is further configured to send a task ending instruction; the task ending instruction is used for indicating ending of the overtime test task on the test execution machine and releasing of the corresponding test resource, so that the test execution machine is restored to the normal state of the task to be executed from the overtime unfinished task state; and after the test execution machine is restored to the normal state of the task to be executed, the test task is scheduled to the test execution machine for execution.
In one embodiment, the apparatus further comprises: the ending module 510 is configured to, if a task ending request is received, end a test task specified by the task ending request from among the test tasks scheduled to the test execution machine, and generate a corresponding test result report; and sending a test result report.
In one embodiment, the exception state includes a timeout inactive task state; the execution module 508 is further configured to send alarm information; the alarm information is used for indicating the test execution machine to recover from the state of the overtime non-started task to the normal state of the task to be executed; and after the test execution machine is restored to the normal state of the task to be executed, the test task is scheduled to the test execution machine for execution.
In one embodiment, test tasks scheduled onto a test executor are arranged in a task queue; the apparatus further comprises: and a new adding module 512, configured to update the test task to be added specified by the task adding request to the task queue according to the priority of the test task if the current running state is a normal state and the task adding request is received, so as to update the execution sequence of each test task in the task queue.
In one embodiment, test tasks scheduled onto a test executor are arranged in a task queue; the apparatus further comprises: the modifying module 514 is configured to modify a start time of a next test task in the task queue for the test execution machine to execute the next test task according to the modified start time if the current running state is the state of ending the test task in advance; wherein the post-modification start-up time is earlier than the pre-modification start-up time.
In one embodiment, as shown in fig. 6, the apparatus further comprises: an end module 510, a new add module 512, a modify module 514, and a schedule module 516, wherein:
the scheduling module 516 is configured to schedule a secondary test task with a priority lower than that of the executed test task to the test executor if the test task in the same test period scheduled to the test executor is executed completely and the test period is not ended; and continuing to execute the secondary test task on the test execution machine until the test period is over.
For specific limitations of the automated test equipment, reference is made to the limitations of the automated test methods described above, and no further description is given here. The various modules in the automated test equipment described above may be implemented in whole or in part in software, hardware, and combinations thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a server, the internal structure of which may be as shown in fig. 7. The computer device includes a processor, a memory, and a network interface connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, computer programs, and a database. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The database of the computer device is for storing automated test data. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program, when executed by a processor, implements an automated test method.
In one embodiment, a computer device is provided, which may be a terminal, and the internal structure thereof may be as shown in fig. 8. The computer device includes a processor, a memory, a communication interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The communication interface of the computer device is used for carrying out wired or wireless communication with an external terminal, and the wireless mode can be realized through WIFI, an operator network, NFC (near field communication) or other technologies. The computer program, when executed by a processor, implements an automated test method. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, can also be keys, a track ball or a touch pad arranged on the shell of the computer equipment, and can also be an external keyboard, a touch pad or a mouse and the like.
It will be appreciated by persons skilled in the art that the structures shown in fig. 7 and 8 are block diagrams of only portions of structures associated with the present inventive arrangements and are not limiting of the computer device to which the present inventive arrangements are applied, and that a particular computer device may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
In an embodiment, there is also provided a computer device comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of the method embodiments described above when the computer program is executed.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, carries out the steps of the method embodiments described above.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in embodiments provided herein may include at least one of non-volatile and volatile memory. The nonvolatile Memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash Memory, optical Memory, or the like. Volatile memory can include random access memory (Random Access Memory, RAM) or external cache memory. By way of illustration, and not limitation, RAM can be in various forms such as static random access memory (Static Random Access Memory, SRAM) or dynamic random access memory (Dynamic Random Access Memory, DRAM), etc.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The above examples illustrate only a few embodiments of the application, which are described in detail and are not to be construed as limiting the scope of the application. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the application, which are all within the scope of the application. Accordingly, the scope of protection of the present application is to be determined by the appended claims.

Claims (10)

1. An automated testing method, the method comprising:
in the process of executing a test task by a test execution machine, acquiring a test data table of the test execution machine from a database according to a preset time interval;
detecting the current running state of the test execution machine according to the key field in the test data table; the current running state is used for representing the running state of the test execution machine;
if the current running state is an abnormal state, determining an abnormal processing scheme matched with the abnormal state; wherein the abnormal state includes at least one of a timeout un-ended task state and a timeout un-started task state;
The abnormal processing scheme is executed, so that the test execution machine is restored to a normal state, and a test task is scheduled to be executed on the test execution machine in the normal state;
In the case that the abnormal state is a task state which is not ended after timeout, the abnormal processing scheme comprises: sending a task ending instruction to the test execution machine; the task ending instruction is used for indicating ending the overtime test task on the test execution machine and releasing corresponding test resources so that the test execution machine can recover from the overtime unfinished task state to the normal state of the task to be executed;
In the case that the abnormal state is a timeout non-started task state, the abnormal processing scheme includes: sending alarm information to the test executor; and the alarm information is used for indicating the test execution machine to recover from the overtime non-started task state to the normal state of the task to be executed.
2. The method according to claim 1, wherein the method further comprises:
if a task end request is received, then
Ending the test task appointed by the task ending request from the test tasks scheduled to the test execution machine, and generating a corresponding test result report;
and sending the test result report.
3. The method of claim 1, wherein the test tasks scheduled onto the test executor are arranged in a task queue; the method further comprises the steps of:
If the current running state is a normal state and a task adding request is received, then
And updating the test tasks to be newly added, which are designated by the task adding request, into the task queue according to the priorities of the test tasks so as to update the execution sequence of each test task in the task queue.
4. The method of claim 1, wherein the test tasks scheduled onto the test executor are arranged in a task queue; the method further comprises the steps of:
If the current running state is the state of ending the test task in advance, then
Modifying the starting time of the next test task in the task queue after the test task is finished in advance, so that the test execution machine executes the next test task according to the modified starting time; wherein the post-modification start-up time is earlier than the pre-modification start-up time.
5. The method according to claim 1, wherein the method further comprises:
If the test tasks in the same test period scheduled to the test executor are executed completely and the test period is not finished, scheduling secondary test tasks with priority lower than that of the executed test tasks to the test executor;
And continuing to execute the secondary test task on the test execution machine until the test period is over.
6. An automated test equipment, the equipment comprising:
the acquisition module is used for acquiring a test data table of the test execution machine from the database according to a preset time interval in the process of executing the test task by the test execution machine;
The checking module is used for detecting the current running state of the test execution machine according to the key fields in the test data table; the current running state is used for representing the running state of the test execution machine;
The determining module is used for determining an abnormal processing scheme matched with the abnormal state if the current running state is the abnormal state; wherein the abnormal state includes at least one of a timeout un-ended task state and a timeout un-started task state;
The execution module is used for enabling the test execution machine to be restored to a normal state by executing the exception handling scheme, and scheduling a test task to be executed on the test execution machine in the normal state; in the case that the abnormal state is a task state which is not ended after timeout, the abnormal processing scheme comprises: sending a task ending instruction to the test execution machine; the task ending instruction is used for indicating ending the overtime test task on the test execution machine and releasing corresponding test resources so that the test execution machine can recover from the overtime unfinished task state to the normal state of the task to be executed; in the case that the abnormal state is a timeout non-started task state, the abnormal processing scheme includes: sending alarm information to the test executor; and the alarm information is used for indicating the test execution machine to recover from the overtime non-started task state to the normal state of the task to be executed.
7. The apparatus of claim 6, wherein the apparatus further comprises:
The ending module is used for ending the test task appointed by the task ending request from the test task scheduled to the test execution machine and generating a corresponding test result report if the task ending request is received; and sends a test result report.
8. The apparatus of claim 6, wherein the test tasks scheduled to the test executor are arranged in a task queue; the apparatus further comprises:
and the new adding module is used for updating the test tasks to be added designated by the task adding request into the task queue according to the priority of the test tasks if the current running state is a normal state and the task adding request is received, so as to update the execution sequence of each test task in the task queue.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any one of claims 1 to 5 when the computer program is executed.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 5.
CN202110366426.9A 2021-04-06 2021-04-06 Automated testing method, apparatus, computer device and storage medium Active CN113157569B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110366426.9A CN113157569B (en) 2021-04-06 2021-04-06 Automated testing method, apparatus, computer device and storage medium

Publications (2)

Publication Number Publication Date
CN113157569A CN113157569A (en) 2021-07-23
CN113157569B true CN113157569B (en) 2024-05-24

Family

ID=76888807

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110366426.9A Active CN113157569B (en) 2021-04-06 2021-04-06 Automated testing method, apparatus, computer device and storage medium

Country Status (1)

Country Link
CN (1) CN113157569B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113918397A (en) * 2021-10-28 2022-01-11 浪潮(山东)计算机科技有限公司 Pressure testing method, device, equipment and storage medium
CN114420189B (en) * 2022-01-18 2024-09-20 浙江芯劢微电子股份有限公司 Method and system for detecting chip damage

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20010048294A (en) * 1999-11-26 2001-06-15 이계철 The Automatic Recovery Method for Circuit Test using DCS
US7165189B1 (en) * 2003-12-19 2007-01-16 Sun Microsystems, Inc. Distributed test framework for clustered systems
CN101727389A (en) * 2009-11-23 2010-06-09 中兴通讯股份有限公司 Automatic test system and method of distributed integrated service
US8464219B1 (en) * 2011-04-27 2013-06-11 Spirent Communications, Inc. Scalable control system for test execution and monitoring utilizing multiple processors
JP2016042338A (en) * 2014-08-19 2016-03-31 キヤノン株式会社 Information processing system, information processing apparatus, control method of information processing apparatus, and program
CN106844198A (en) * 2016-12-27 2017-06-13 浪潮软件集团有限公司 Distributed dispatching automation test platform and method
CN107894950A (en) * 2017-10-30 2018-04-10 北京奇虎科技有限公司 A kind of equipment detection method, device, server and storage medium
US9983988B1 (en) * 2016-06-23 2018-05-29 Amazon Technologies, Inc. Resuming testing after a destructive event
CN112269744A (en) * 2020-10-30 2021-01-26 深圳壹账通智能科技有限公司 System abnormity testing method and device, computer equipment and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10360095B2 (en) * 2016-03-31 2019-07-23 Change Healthcare Holdings, Llc Methods and apparatuses for improving failure recovery in a distributed system
US10261892B2 (en) * 2017-05-24 2019-04-16 Bank Of America Corporation Cloud-based automated test execution factory

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
MAESTRO: Automated test generation framework for high test coverage and reduced human effort in automotive industry; Yunho Kim et al.; Information and Software Technology; 2020-04-29; Vol. 123; pp. 1-17 *
On Subnormal Floating Point and Abnormal Timing; Marc Andrysco et al.; 2015 IEEE Symposium on Security and Privacy; 2015-07-20; pp. 623-639 *
Research and Implementation of Distributed Software Test Automation Technology; Liu Liang; China Excellent Doctoral and Master's Theses Full-text Database (Master's), Information Science and Technology Series; 2004-03-15 (No. 1); I138-109 *
Research on Key Technologies of a Performance Testing Platform for Android Mobile Applications; Wang Xuewen; China Excellent Master's Theses Full-text Database, Information Science and Technology Series; 2016-01-15 (No. 1); I138-255 *


Similar Documents

Publication Publication Date Title
CN107016480B (en) Task scheduling method, device and system
US20200167671A1 (en) Computer system and method for machine learning or inference
CN113157569B (en) Automated testing method, apparatus, computer device and storage medium
CN106201672B (en) Timed task setting system and timed task running method thereof
CN113569987A (en) Model training method and device
US8713578B2 (en) Managing job execution
US20140297355A1 (en) Workflow control apparatus and method therefor
CN111752822A (en) Containerization pressure measurement scheduling method, computer equipment and readable storage medium
CN115964153A (en) Asynchronous task processing method, device, equipment and storage medium
CN113157426B (en) Task scheduling method, system, equipment and storage medium
CN112948096A (en) Batch scheduling method, device and equipment
CN112328602A (en) Method, device and equipment for writing data into Kafka
US20150113420A1 (en) Overloaded schedule detection and notification
CN112052077B (en) Method, device, equipment and medium for managing software tasks
CN116627437A (en) Deployment method and device of Airflow service, storage medium and computer equipment
CN108521524B (en) Agent collaborative task management method and device, computer equipment and storage medium
CN112445549A (en) Operation and maintenance method, operation and maintenance device, electronic equipment and medium
CN115858378A (en) Test system and method
CN108108895B (en) Method, system, equipment and storage medium for dynamically controlling task state
CN105868957A (en) Continuous integration method and device
JP5387083B2 (en) Job management system and method
CN115658248A (en) Task scheduling method and device, electronic equipment and storage medium
CN113127158B (en) Method and device for executing data processing task
CN113419835A (en) Job scheduling method, device, equipment and medium
CN113448561B (en) CI-based differential analysis method for automation demand progress and management server

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant