CN117931620A - Automatic test method for reducing test technical threshold of intelligent terminal system - Google Patents


Info

Publication number
CN117931620A
Authority
CN
China
Prior art keywords: test, unit, module, execution, log
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311580629.3A
Other languages
Chinese (zh)
Inventor
莊敏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiaxing Jiasai Information Technology Co ltd
Original Assignee
Jiaxing Jiasai Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiaxing Jiasai Information Technology Co ltd filed Critical Jiaxing Jiasai Information Technology Co ltd
Priority to CN202311580629.3A
Publication of CN117931620A
Legal status: Pending


Abstract

The invention discloses an automatic test method for reducing the test technical threshold of an intelligent terminal system, which relates to the technical field of automatic testing. It solves the problems of existing automatic test methods: the technical threshold is high, the test scenarios are limited, the client stops executing use cases when the communication connection is disconnected, support for testing the overall functions under multiple systems is lacking, and the methods are not suitable for scenarios in which multiple operating systems are deployed on a mobile device. The method comprises the following steps: step one, system analysis and requirement definition; step two, construction of an automatic adaptation model; step three, test case design and development; step four, automatic script generation; step five, multi-device parallel test support; step six, continuous integration and automatic deployment; step seven, exception handling and report generation; step eight, automatic test execution and monitoring; and step nine, result analysis and feedback optimization.

Description

Automatic test method for reducing test technical threshold of intelligent terminal system
Technical Field
The invention relates to the field of automatic testing, in particular to an automatic testing method for reducing the technical threshold of intelligent terminal system testing.
Background
With the rapid development of intelligent terminal systems such as smart phones and tablet computers, their functions and complexity continue to grow. To ensure the quality and stability of these intelligent terminal systems, testing becomes critical. However, conventional manual testing methods are inefficient and error-prone in the face of a large testing workload. Therefore, automated testing methods have been developed to lower the testing technical threshold and improve testing efficiency.
Against this background, automated testing methods are widely applied. By using test tools and script writing, user operations and various scenarios can be simulated, enabling automated testing of intelligent terminal systems. These test tools can interact with the system, collect and analyze test results, and provide accurate feedback. Meanwhile, existing open-source frameworks and libraries make automated testing more convenient and flexible, providing rich functionality and extensibility.
At present, UI automation testing is popular in the industry. UI automation open-source tools for Android mobile devices include Appium, Espresso, UiAutomator and the like. These tools mainly perform simulated UI tests on a target application, but they require a certain programming ability, the technical threshold is high, and only a small portion of testers can use them. Although other low-code automation tools such as Airtest and Katalon reduce the technical threshold, in test case scenarios involving multi-system switching, restarting, resetting, shell commands, application keep-alive and the like, the client stops executing the use case once the communication connection is disconnected, so the continuity of the automated test cannot be ensured and the test scenarios are highly limited.
In some fields with higher information security requirements, mobile devices are subject to higher customization requirements, and multiple operating systems (e.g., one for work and one for personal life) need to be deployed on one device. In the field of such industry-customized mobile devices, testing is not limited to a single target application but covers the overall functions under multiple systems, and test scenarios such as multi-system switching, restarting, shell commands, network restriction and application keep-alive are ubiquitous, so the above tools cannot meet the testing requirements.
Therefore, in order to solve the problems of the existing automated test methods, namely a high technical threshold, limited test scenarios, clients stopping use-case execution when the communication connection is disconnected, a lack of support for testing the overall functions under multiple systems, and unsuitability for scenarios in which multiple operating systems are deployed on a mobile device, the invention discloses an automated test method for reducing the test technical threshold of an intelligent terminal system.
Disclosure of Invention
Aiming at the defects of the prior art, the invention discloses an automated test method for reducing the test technical threshold of an intelligent terminal system. The invention improves the flexibility and extensibility of the test system by configuring devices of different models through an automatic adaptation model; the test action generating system generates standard basic actions and compound actions by using a boundary value analysis method, which reduces the workload of manually writing scripts; multi-device parallel test support allows multiple devices to be tested simultaneously in a virtual environment, improving test efficiency and coverage. Multiple operating systems are deployed on the mobile device through a virtualization model, so that testing can be performed under different systems, solving the problem of limited test scenarios; the automated test process is integrated with the software development flow through continuous integration and automated deployment. As a result, test execution can interact asynchronously between the server and the application terminal, and even if the communication connection is interrupted, the test can still continue, ensuring the continuity and stability of the test. Parallel execution and real-time monitoring of test cases are realized by a distributed execution framework, which can execute multiple test cases simultaneously and monitors the test progress and results through a test monitoring system to find execution abnormalities and errors. Thus, the overall functions under multiple systems can be tested, improving the comprehensiveness and accuracy of the test.
In order to achieve the technical effects, the invention adopts the following technical scheme:
An automated test method for reducing the test technical threshold of an intelligent terminal system, wherein the method comprises the following steps:
Step one, system analysis and requirement definition: the intelligent terminal system is analyzed and its requirements are defined by means of questionnaire survey, business scenario analysis and system architecture analysis, so as to determine the test targets and requirements;
Step two, construction of an automatic adaptation model: devices of different models are configured through the automatic adaptation model to improve the flexibility and extensibility of the test system; the automatic adaptation model decouples automated use cases from the devices through modularization and an object-oriented programming method, so that the same use-case script can be executed in batches on devices of different models;
Step three, test case design and development: a test case set is designed through a test action generating system, which provides standard basic actions and compound actions through a boundary value analysis method, so as to reduce the technical requirements of developing test scripts;
Step four, automatic script generation: system interface elements and interactions are analyzed and learned through an adaptive script generation algorithm to automatically generate test scripts and reduce the workload of manual script writing;
Step five, multi-device parallel test support: multiple device instances are created in a virtual environment through a virtualization model and tested in parallel, improving test efficiency and coverage;
Step six, continuous integration and automatic deployment: integration of the automated test process with the software development flow is realized through a continuous integration model and an automatic deployment system; the continuous integration model realizes cooperative execution of the automated use cases by the service and the application program through asynchronous interaction between the server and the application terminal, so as to ensure the continuity and stability of the test;
Step seven, exception handling and report generation: abnormal conditions during system operation are captured and test reports are generated through a log analysis method and an anomaly detection algorithm, so as to locate and repair problems;
Step eight, automatic test execution and monitoring: parallel execution and real-time monitoring of test cases are realized through a distributed execution framework, which monitors the test progress and results through a test monitoring system so as to find execution abnormalities and errors and improve the stability and reliability of the test;
Step nine, result analysis and feedback optimization: test results are analyzed through data analysis and visualization methods to realize system optimization and improvement.
As a further technical scheme of the invention, the automatic adaptation model comprises a device adaptation module, a test case adaptation module, a test module, a log management module and a configuration management module; the device adaptation module comprises a device information acquisition unit and a device configuration generation unit; the device information acquisition unit acquires device information through the system debugging bridge, wherein the device information at least comprises the device model and the operating system version; according to the device model and the operating system version, the device configuration generation unit automatically generates a corresponding configuration file through an object-oriented programming method; the test case adaptation module comprises a selection unit and a matching unit; according to the device model and the operating system version, the selection unit selects adapted test cases from the test case library through a reinforcement learning method; according to the characteristics and limitations of the device, the matching unit adapts the selected test cases by modifying input parameters and adjusting operation steps; the test module comprises a scheduling unit, an execution unit and a report generation unit; the scheduling unit schedules and controls the execution sequence of the test cases through an object-oriented programming method according to the output of the device adaptation module and the test case adaptation module; the execution unit executes the adapted test cases through an application program interface; according to the test execution results, the report generation unit generates a test report through a visualization method and a chart library; the log management module comprises a log recording unit and a log output unit; the log recording unit records log information in the test process through a cloud database; the log output unit outputs log information to the cloud console through a data formatting method to facilitate subsequent checking and analysis; the configuration management module comprises a device configuration unit, a test case configuration unit and a test environment configuration unit; the device configuration unit reads the parameters in the configuration file into memory through a parameter identification method to configure test device parameters; the test case configuration unit reads and configures test case files through an object-oriented programming method; the test environment configuration unit reads the test environment configuration file through an object-oriented programming method to configure test environment parameters.
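As an illustration of how the device adaptation module described above might be realized, the following Python sketch queries a connected device over the system debugging bridge (adb) and writes a per-model configuration file. It assumes adb is available on the test host; the class name DeviceAdapter, the property keys and the device_configs directory are illustrative and not taken from the patent.

```python
import json
import subprocess
from pathlib import Path

class DeviceAdapter:
    """Illustrative device adaptation module: collects device info over the
    system debugging bridge (adb) and emits a per-model configuration file."""

    def __init__(self, serial: str):
        self.serial = serial

    def _adb_getprop(self, prop: str) -> str:
        # Query a single system property from the connected device.
        out = subprocess.run(
            ["adb", "-s", self.serial, "shell", "getprop", prop],
            capture_output=True, text=True, check=True)
        return out.stdout.strip()

    def collect_info(self) -> dict:
        # Device info required by the adaptation model: at least model and OS version.
        return {
            "model": self._adb_getprop("ro.product.model"),
            "os_version": self._adb_getprop("ro.build.version.release"),
        }

    def generate_config(self, out_dir: str = "device_configs") -> Path:
        # The configuration generation unit writes one config file per model/version.
        info = self.collect_info()
        Path(out_dir).mkdir(exist_ok=True)
        path = Path(out_dir) / f"{info['model']}_{info['os_version']}.json"
        path.write_text(json.dumps(info, indent=2))
        return path
```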
As a further technical scheme of the invention, the working method steps of the object-oriented programming method are as follows:
S1, defining classes, and defining class attributes and methods through class definition grammar;
S2, creating an object, and creating an object instance through a constructor of the class according to the definition of the class;
S3, attribute and method operation: attributes are accessed and methods are called through the member access operator of the object instance, so as to realize the processing of data and behaviors;
S4, packaging and information hiding, wherein the attribute and the access authority of the method are limited through the access modifier so as to realize information hiding;
S5, inheritance and polymorphism: a child class is created through the inheritance mechanism and inherits the attributes and methods of its parent class, realizing reuse and extension of code; different responses of different objects to the same message are realized by means of method overriding and method overloading;
S6, abstract classes and interfaces are defined through abstract classes and interface keywords, and specifications are provided for subclasses or implementation classes to realize concrete implementation;
And S7, information transfer and collaboration are realized through object method call and parameter transfer and event triggering and monitoring mechanisms.
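The following Python sketch illustrates how steps S1 to S7 support decoupling use cases from devices, as the embodiment described later also explains: an abstract device interface, a concrete device subclass, and a polymorphic use-case function. The class and method names (AbstractDevice, AndroidPhone, run_use_case) are illustrative assumptions.

```python
from abc import ABC, abstractmethod

class AbstractDevice(ABC):
    """Abstract device interface (S1/S6): declares the operations a test case
    relies on, so use cases stay decoupled from any concrete device model."""

    def __init__(self, serial: str):
        self._serial = serial          # encapsulated attribute (S4)

    @abstractmethod
    def unlock_screen(self) -> None: ...

    @abstractmethod
    def tap(self, x: int, y: int) -> None: ...

class AndroidPhone(AbstractDevice):
    """Concrete subclass (S5): inherits from AbstractDevice and overrides its
    methods, so the same use-case script runs on different device models."""

    def unlock_screen(self) -> None:
        print(f"[{self._serial}] swipe up to unlock")

    def tap(self, x: int, y: int) -> None:
        print(f"[{self._serial}] tap at ({x}, {y})")

def run_use_case(device: AbstractDevice) -> None:
    # Polymorphic call site (S7): the use case only knows the abstract interface.
    device.unlock_screen()
    device.tap(120, 480)

if __name__ == "__main__":
    run_use_case(AndroidPhone("emulator-5554"))
```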
As a further technical scheme of the invention, the test action generating system comprises a test case design module, a test environment configuration module, a test action design module, a test script writing module, a result analysis module, a defect management module and a test report generating module; the test case design module converts the test requirement into a specific test case through a boundary value analysis method so as to cover different scenes and functions of the system; the test environment configuration module configures a test environment through a state transition diagram so as to ensure the repeatability and the accuracy of the test; the test environment at least comprises hardware equipment, a software environment and network settings; the test action design module designs execution operation in the test process in a mode of graphical interface operation, interface calling and user input simulation, wherein the execution operation at least comprises input data, operation steps and expected results; the test script writing module converts the test action into an executable automatic script through a script editing library so as to realize automatic execution of the test action and verification of an expected result; the result analysis module analyzes and evaluates the test result through an assertion and log recording method; the defect management module records defects found in the test process through a vulnerability management system, and communicates and cooperates with a development team so as to repair and verify in time; according to the test execution result and the analysis data, the test report generating module generates a structured test report through a chart library and a data visualization method; the structured test report at least comprises test coverage, defect statistics and test passing rate information; the output end of the test case design module is connected with the input end of the test environment configuration module; the output end of the test environment configuration module is connected with the input end of the test action design module; the output end of the test action design module is connected with the input end of the test script writing module; the output end of the test script writing module is connected with the input end of the result analysis module; the output end of the result analysis module is connected with the input end of the defect management module; the output end of the defect management module is connected with the input end of the test report generating module.
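As a minimal illustration of the boundary value analysis method used by the test case design module, the following Python sketch derives test inputs around the boundaries of a documented value range; the field name and the range are hypothetical.

```python
def boundary_values(lo: int, hi: int) -> list:
    """Boundary value analysis: for an input range [lo, hi], test the values at
    and just around each boundary, plus a representative mid value."""
    return sorted({lo - 1, lo, lo + 1, (lo + hi) // 2, hi - 1, hi, hi + 1})

def generate_cases(field: str, lo: int, hi: int) -> list:
    # Each generated case pairs an input value with its expected outcome.
    return [{"field": field,
             "value": v,
             "expected": "accepted" if lo <= v <= hi else "rejected"}
            for v in boundary_values(lo, hi)]

if __name__ == "__main__":
    # Example: a brightness setting documented to accept values from 0 to 255.
    for case in generate_cases("brightness", 0, 255):
        print(case)
```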
As a further technical scheme of the invention, the adaptive script generation algorithm performs feature extraction on screenshots of the system interface through a convolutional neural network, and labels and classifies interface elements in combination with a natural language processing model; the image features and the text description features of the interface elements are input into a deep learning model, and the recognition probability of the interface elements is obtained through an element recognition probability function; the formula expression is:
In formula (1), M represents the interface element recognition probability, N represents the image feature vector, d represents the weight parameter, i.e., the model parameter of the adaptive script generation algorithm, and T represents the text description feature vector; the user's input is modeled and analyzed through a long short-term memory network in the natural language processing model, and the interactive behavior of the system is predicted and recommended in combination with the adaptive script generation algorithm; the adaptive script generation algorithm obtains the recommendation probability of the interaction behavior through an interaction behavior recommendation probability calculation formula; the formula expression is:
In formula (2), Y represents the interaction behavior recommendation probability; b is the user input sequence, representing the historical input sequence of the user in the system; z represents the model parameters of the natural language processing model and the adaptive script generation algorithm; based on the interface element recognition probability and the interaction behavior recommendation probability, the test script automatic generation rule function judges whether to add a step into the test script, so as to realize automatic generation of the test script; the formula expression is:
In formula (3), R represents the test script automatic generation rule; δ represents the threshold value for judging whether to execute a certain step; m represents the classification data of the interaction behavior, and S represents the classification data of the interface elements.
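Since formulas (1) to (3) are not reproduced here, the following Python sketch only illustrates the decision rule described in words for formula (3): a candidate step is added to the generated script when the interface element recognition probability and the interaction behavior recommendation probability both clear the threshold δ. Combining the two probabilities by taking their minimum is an assumption, not the patented rule function.

```python
def should_add_step(element_prob: float, action_prob: float,
                    delta: float = 0.8) -> bool:
    """Decision rule in the spirit of formula (3): add a step to the generated
    script only when both the interface-element recognition probability (M)
    and the interaction-behaviour recommendation probability (Y) clear the
    threshold delta. Taking the minimum of the two is an assumption; the
    patent does not reproduce the exact rule function."""
    return min(element_prob, action_prob) >= delta

def build_script(candidate_steps: list, delta: float = 0.8) -> list:
    # Keep only candidate steps whose probabilities pass the rule above.
    return [s["action"] for s in candidate_steps
            if should_add_step(s["element_prob"], s["action_prob"], delta)]

if __name__ == "__main__":
    candidates = [
        {"action": "tap login_button", "element_prob": 0.95, "action_prob": 0.90},
        {"action": "tap banner_ad",    "element_prob": 0.60, "action_prob": 0.20},
    ]
    print(build_script(candidates))   # -> ['tap login_button']
```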
As a further technical scheme of the invention, the continuous integrated model comprises a version control module, an integrated server module, an automatic test execution module and an exception handling module; the version control module comprises a program management unit, a branch management unit and a record submitting unit; the program management unit manages codes of the application program and the test script through a version library; the branch management unit realizes parallel development and testing through a distributed version control system Git so as to ensure independent management among different version codes; the record submitting unit submits codes to a version library through a change log recorder and adds related comments so as to record the detailed information of the code change; the integrated server module comprises a test triggering unit, a generation test unit and a test deployment unit; the test triggering unit monitors code change through a polling method and triggers an automatic test flow; the generation test unit automatically pulls codes through a constructor and performs construction, compiling and packaging operations to generate a deployable application program; the test deployment unit deploys the constructed application program into a test environment through a publisher so as to perform automatic test; the automatic test execution module executes an automatic test script through a test framework selection, user operation simulation and application program behavior verification method; the abnormality processing module comprises an abnormality monitoring unit and an abnormality processing unit; the abnormality monitoring unit monitors the state and the log of the mobile equipment through the system debugging bridge so as to detect whether abnormal operation occurs to the mobile equipment; when abnormal operation occurs to the mobile equipment, the abnormal processing module sends a request to the server in an asynchronous interaction mode with the server, the test is requested to be suspended, and the test is continuously executed after the equipment is recovered to be normal; when the mobile equipment cannot be restored to a normal state, restarting operation of the equipment is realized through the system debugging bridge, so that the continuity and stability of the test are ensured.
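A minimal sketch of the exception monitoring and handling behaviour described above, assuming the state of the mobile device is checked over adb and that pause_test and resume_test are idempotent callbacks supplied by the test controller; the polling interval and the reboot fallback threshold are illustrative.

```python
import subprocess
import time

def device_is_healthy(serial: str) -> bool:
    # Ask the system debugging bridge (adb) whether the device is online.
    out = subprocess.run(["adb", "devices"], capture_output=True, text=True)
    return any(line.startswith(serial) and line.endswith("device")
               for line in out.stdout.splitlines())

def watch_device(serial: str, pause_test, resume_test,
                 poll_s: int = 5, max_offline_polls: int = 12) -> None:
    """Illustrative exception-monitoring loop: pause the test run while the
    device is abnormal, resume once it recovers, and fall back to an adb
    reboot if it stays offline too long. Runs until the process is stopped."""
    offline_polls = 0
    while True:
        if device_is_healthy(serial):
            offline_polls = 0
            resume_test()
        else:
            offline_polls += 1
            pause_test()
            if offline_polls >= max_offline_polls:
                # Device could not recover on its own: request a reboot over adb.
                subprocess.run(["adb", "-s", serial, "reboot"])
                offline_polls = 0
        time.sleep(poll_s)
```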
As a further technical scheme of the invention, the virtualization model comprises a virtualization model module, a test execution module, a result analysis module and a data display module; the virtualization model module comprises an equipment management unit, a network simulation unit, a data simulation unit and an environment configuration unit; the device management unit manages device instances in the virtual environment through a virtual machine creation method; the network simulation unit simulates a network environment through a network simulation model so as to test the system performance under different network conditions; the data simulation unit generates simulation data through a data generation program to simulate the data interaction condition in a real environment so as to verify the performance and stability of the system under different data loads; the environment configuration unit configures virtual environment parameters through a system command line so as to simulate a real scene for testing; the virtual environment parameters at least comprise operating system version, application program version and configuration of hardware; the test execution module comprises a test case management unit, a parallel test unit and a test report generation unit; the test case management unit realizes the selection, execution and result recording of the test cases through an automatic test script; the automatic test script manages and executes the test cases through the test framework; the parallel test unit realizes parallel execution of a plurality of test tasks through a multi-process standard library so as to improve the test efficiency; according to the test result, the test report generating unit generates a test report through a logic test framework so as to evaluate the quality and stability of the system; the result analysis module comprises a result collection unit, a result comparison unit and an error positioning unit; the result collection unit collects data and logs generated in the test execution process through a log recorder, and at least comprises equipment states, performance indexes and error information; the result comparison unit compares the actual result with the expected result through an interface automatic test frame, and detects the behavior operation condition of the system so as to find potential problems and errors; the error positioning unit analyzes and positions error information generated in the test process through a log analysis method; the data display module generates a data visualization interface of the test progress and the result through a data visualization method so that a tester can know the test progress and the result in real time.
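The parallel test unit above relies on a multi-process standard library; the following sketch shows one way this could look in Python with multiprocessing.Pool. The device identifiers and the dummy suite results are placeholders, not part of the patent.

```python
from multiprocessing import Pool

def run_suite_on_device(device_id: str) -> dict:
    """Placeholder for executing one test suite against one virtual device
    instance; a real implementation would drive an emulator or VM here."""
    passed, failed = 12, 1   # dummy result for illustration only
    return {"device": device_id, "passed": passed, "failed": failed}

def parallel_test(device_ids: list, workers: int = 4) -> list:
    # Run the same suite against every virtual device instance in parallel.
    with Pool(processes=min(workers, len(device_ids))) as pool:
        return pool.map(run_suite_on_device, device_ids)

if __name__ == "__main__":
    devices = ["emu-01", "emu-02", "emu-03", "emu-04"]
    for report in parallel_test(devices):
        print(report)
```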
As a further technical scheme of the invention, the anomaly detection algorithm evaluates the degree of deviation between the observed value and the mean value through an anomaly score function based on a Gaussian mixture model, and measures the severity of the anomaly through the standard deviation; the formula expression is:
In formula (4), D is the anomaly score, indicating the degree of anomaly of the observed value; u represents the weight of the θ-th Gaussian distribution, which determines the contribution of each distribution to the overall model, and θ indexes the Gaussian distributions; n represents the root mean square, used to measure the overall fluctuation of the observed values. The anomaly score is compared with a preset threshold through a threshold judgment formula to decide whether the system is in an abnormal state; if the anomaly score exceeds the threshold, the system is judged to be abnormal; the formula expression is:
In formula (5), θ represents the skewness, describing the degree of skewness of the observed value distribution; ω represents the time window, controlling the time granularity of the anomaly detection algorithm; η represents the anomaly score threshold, determining the threshold for an abnormal condition; and α represents the correlation, used to analyze the correlation between observed values.
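Because formulas (4) and (5) are not reproduced, the sketch below stands in for the described approach with a generic Gaussian-mixture anomaly score: the negative log-likelihood under a mixture fitted to normal behaviour is compared against a threshold η. The use of scikit-learn, the feature layout and the threshold value are assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_normal_model(normal_samples: np.ndarray, components: int = 2) -> GaussianMixture:
    # Fit a Gaussian mixture to metrics collected while the system behaves normally.
    gmm = GaussianMixture(n_components=components, random_state=0)
    gmm.fit(normal_samples)
    return gmm

def anomaly_scores(gmm: GaussianMixture, observations: np.ndarray) -> np.ndarray:
    # Use the negative log-likelihood as the anomaly score D: the less likely
    # an observation is under the mixture, the higher its score.
    return -gmm.score_samples(observations)

def is_anomalous(scores: np.ndarray, eta: float) -> np.ndarray:
    # Threshold judgement: scores above eta are reported as anomalies.
    return scores > eta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two hypothetical metrics, e.g. CPU utilisation and response time, under normal load.
    normal = rng.normal(loc=[50.0, 200.0], scale=[5.0, 20.0], size=(500, 2))
    gmm = fit_normal_model(normal)
    probe = np.array([[52.0, 210.0], [95.0, 900.0]])   # second row is abnormal
    print(is_anomalous(anomaly_scores(gmm, probe), eta=20.0))
```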
As a further technical scheme of the invention, the distributed execution framework comprises a task scheduling module, a test case management module, a resource management module, a data management module and a log management module; the task scheduling module comprises a task queue unit, a task ordering unit and a task distributing unit; the task queue unit realizes asynchronous scheduling and distribution of tasks through a message queue method so as to improve task execution efficiency; the task ordering unit schedules tasks in the task queue in a mode of shortest job priority, polling and priority ordering so as to ensure orderly execution of the tasks; the task allocation unit allocates tasks to the available execution nodes through polling, random and weighted polling methods so as to realize parallel execution of the tasks; the test case management module comprises a case management unit, a case execution unit and a case monitoring unit; the case management unit provides the functions of creating, editing and managing the test case through the application program interface so as to facilitate the user to define and maintain the test case; the case execution unit sends the test case to an execution node through a remote calling method, and an execution result is obtained; the case monitoring unit monitors the state and the execution progress of the execution node in real time through the test monitor and displays the state and the execution progress through a visualization method so as to acquire the execution condition of the test case; the resource management module comprises a resource maintenance unit, a resource scheduling unit and a real-time monitoring unit; the resource maintenance unit manages and maintains the execution node resources through a resource pool so as to ensure that the test task can be supported by the resources; the resource scheduling unit distributes tasks to the execution nodes through a minimum residual resource priority and weight scheduling method so as to realize dynamic distribution and load balancing of resources; the real-time monitoring unit monitors the resource use condition of the executing node through the resource monitor, wherein the resource use condition at least comprises CPU utilization rate and memory occupation so as to provide statistics and alarm functions of the resource use condition; the data management module comprises a data generation unit, a data storage unit and a data analysis unit; the data generation unit generates test data through a random number generation and template filling method so as to meet the requirements of different test scenes; the data storage unit stores and manages test data through a database and a distributed storage system so as to facilitate subsequent data analysis and backtracking; the data analysis unit processes and analyzes the test data through a big data analysis method to extract key indexes and modes and provide support and reference for test results; the log management module comprises a log collecting unit, a log storage unit and a log analysis unit; the log collecting unit collects log information generated by the executing node through a log recorder, wherein the log information at least comprises an error log and a debugging log; the log storage unit stores and manages log information of the execution node through a log database so as to facilitate subsequent query and analysis; the log analysis unit analyzes and visually displays the log of the execution node in real time through a search analysis technology stack so as to help a user locate problems and anomalies.
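A minimal sketch of the task scheduling module's queueing and polling (round-robin) allocation, under the assumption that execution nodes are addressed by name and that the actual remote dispatch happens elsewhere; the class TaskDispatcher and the node names are illustrative.

```python
import itertools
import queue

class TaskDispatcher:
    """Illustrative task scheduling module: test-case tasks are buffered in a
    queue and assigned to execution nodes round-robin; the node names and the
    submit hook are assumptions, not taken from the patent."""

    def __init__(self, nodes: list):
        self._tasks = queue.Queue()
        self._nodes = itertools.cycle(nodes)   # round-robin (polling) allocation

    def submit(self, test_case: str) -> None:
        self._tasks.put(test_case)

    def dispatch_all(self) -> list:
        assignments = []
        while not self._tasks.empty():
            case = self._tasks.get()
            node = next(self._nodes)
            assignments.append((node, case))
        return assignments

if __name__ == "__main__":
    dispatcher = TaskDispatcher(["node-a", "node-b"])
    for case in ["case_wifi_toggle", "case_reboot", "case_system_switch"]:
        dispatcher.submit(case)
    print(dispatcher.dispatch_all())
    # [('node-a', 'case_wifi_toggle'), ('node-b', 'case_reboot'), ('node-a', 'case_system_switch')]
```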
The invention has the following beneficial effects:
According to the invention, the automatic adaptation model is used to configure devices of different models, improving the flexibility and extensibility of the test system; the test action generating system generates standard basic actions and compound actions by using a boundary value analysis method, which reduces the workload of manually writing scripts; multi-device parallel test support allows multiple devices to be tested simultaneously in a virtual environment, improving test efficiency and coverage. Multiple operating systems are deployed on the mobile device through a virtualization model, so that testing can be performed under different systems, solving the problem of limited test scenarios; the automated test process is integrated with the software development flow through continuous integration and automated deployment. As a result, test execution can interact asynchronously between the server and the application terminal, and even if the communication connection is interrupted, the test can still continue, ensuring the continuity and stability of the test. Parallel execution and real-time monitoring of test cases are realized by a distributed execution framework, which can execute multiple test cases simultaneously and monitors the test progress and results through a test monitoring system to find execution abnormalities and errors. Thus, the overall functions under multiple systems can be tested, improving the comprehensiveness and accuracy of the test.
Description of the drawings:
For a clearer description of the embodiments of the invention or of the solutions in the prior art, the drawings needed for describing the embodiments or the prior art are briefly introduced below. It is apparent that the drawings in the following description are only some embodiments of the invention, and that a person skilled in the art can obtain other drawings from them without inventive effort, in which:
FIG. 1 is a schematic diagram of the steps in the process of the present invention;
FIG. 2 is a schematic diagram of the operation of the automatic adaptation model of the present invention;
FIG. 3 is a schematic step diagram illustrating the operation of the object-oriented programming method of the present invention;
FIG. 4 is a schematic diagram of the operation of the test action generating system of the present invention;
Fig. 5 is a schematic diagram of a distributed execution framework of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the invention without inventive effort fall within the scope of the invention.
As shown in fig. 1-5, an automated testing method for reducing a testing technical threshold of an intelligent terminal system includes the following steps:
Step one, system analysis and requirement definition: the intelligent terminal system is analyzed and its requirements are defined by means of questionnaire survey, business scenario analysis and system architecture analysis, so as to determine the test targets and requirements;
Step two, construction of an automatic adaptation model: devices of different models are configured through the automatic adaptation model to improve the flexibility and extensibility of the test system; the automatic adaptation model decouples automated use cases from the devices through modularization and an object-oriented programming method, so that the same use-case script can be executed in batches on devices of different models;
Step three, test case design and development: a test case set is designed through a test action generating system, which provides standard basic actions and compound actions through a boundary value analysis method, so as to reduce the technical requirements of developing test scripts;
Step four, automatic script generation: system interface elements and interactions are analyzed and learned through an adaptive script generation algorithm to automatically generate test scripts and reduce the workload of manual script writing;
Step five, multi-device parallel test support: multiple device instances are created in a virtual environment through a virtualization model and tested in parallel, improving test efficiency and coverage;
Step six, continuous integration and automatic deployment: integration of the automated test process with the software development flow is realized through a continuous integration model and an automatic deployment system; the continuous integration model realizes cooperative execution of the automated use cases by the service and the application program through asynchronous interaction between the server and the application terminal, so as to ensure the continuity and stability of the test;
Step seven, exception handling and report generation: abnormal conditions during system operation are captured and test reports are generated through a log analysis method and an anomaly detection algorithm, so as to locate and repair problems;
Step eight, automatic test execution and monitoring: parallel execution and real-time monitoring of test cases are realized through a distributed execution framework, which monitors the test progress and results through a test monitoring system so as to find execution abnormalities and errors and improve the stability and reliability of the test;
Step nine, result analysis and feedback optimization: test results are analyzed through data analysis and visualization methods to realize system optimization and improvement.
In a specific embodiment, the working principle of the automated test method for reducing the test technical threshold of an intelligent terminal system is as follows:
R1, a Web service is set up on a server; the foreground provides, in Web form, automated use-case management, mobile device management, device parameter maintenance, report management, action library management and device parameter management functions, and the background service maintains communication connections with multiple devices through USB.
R2, an action library is constructed, and all action data information is stored in a database. Each composite action is maintained through the action library management Web page, together with the mapping relationship between composite actions and basic actions. When writing use cases, the tester only needs to select and arrange highly readable composite actions, without paying attention to the basic actions to be executed. A composite action is a highly readable action that matches the tester's conventional way of describing operations, for example: unlock screen, turn on Bluetooth, turn off WLAN, and so on. Basic actions are program actions that the auxiliary test application App on the device (hereinafter referred to as the App) reads and executes, and each carries device parameter information, such as clicking a certain coordinate or clicking a certain control. Basic actions are divided into normal actions and special actions. Normal actions mainly simulate UI operations such as clicking, sliding and dragging; special actions require the cooperation of the background service, such as shell commands, restarting, resetting and system switching.
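The following Python sketch illustrates the kind of mapping the action library maintains between composite actions and basic actions, and how a use case written as composite actions expands into the basic-action sequence the App executes; the concrete action names, parameters and data layout are assumptions.

```python
# Illustrative action-library entries: each composite action (what the tester
# selects) maps to an ordered list of basic actions (what the on-device App
# executes). The concrete action names and parameters are assumptions.
ACTION_LIBRARY = {
    "unlock screen": [
        {"type": "normal", "op": "swipe", "params": {"from": [540, 1800], "to": [540, 600]}},
        {"type": "normal", "op": "input_pin", "params": {"pin": "0000"}},
    ],
    "turn on bluetooth": [
        {"type": "normal", "op": "click_control", "params": {"res_id": "bt_switch"}},
    ],
    "switch system": [
        # Special action: needs the background service, and drops the USB connection.
        {"type": "special", "op": "switch_system", "params": {"target": "work"}},
    ],
}

def expand_use_case(composite_actions: list) -> list:
    """Expand a use case written as composite actions into the basic-action
    sequence recorded in the use-case configuration file."""
    steps = []
    for name in composite_actions:
        steps.extend(ACTION_LIBRARY[name])
    return steps

if __name__ == "__main__":
    print(expand_use_case(["unlock screen", "turn on bluetooth"]))
```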
R3, constructing a device parameter table, and maintaining UI control parameters and coordinates of each function on the device by personnel. Since there may be variability in each device model, a set of parameters needs to be maintained for each device model.
R4, selecting a compound action from the existing action library by the tester through the Web page as a use case step, and combining a plurality of compound actions to form a use case. For example: verifying whether the file secret cabinet can be normally opened, and selecting the following compound actions to be combined in sequence: unlocking a screen, entering a file secret cabinet page, enabling the file secret cabinet, and checking the state of the file secret cabinet. In addition, each composite action has a corresponding system mode requirement, for example, in some scenarios a "bluetooth on" requirement is performed under the working system and a "bluetooth off" requirement is performed in the living system.
R5, a tester distributes the automatic use cases to a plurality of mobile devices (without upper limit in theory) through a Web page, and the same or different use cases are executed simultaneously through a unified button.
R6, the background service installs the auxiliary test application App under both systems of the mobile device. The App is custom-developed based on UiAutomator and supports the direct boot mode; it is mainly used to read each basic action parameter in the use-case configuration file and simulate UI operations. The direct boot mode was introduced in Android 7.0 and covers the period after the device has powered on normally but before the user unlocks it. By default, an application does not run in direct boot mode; if an application needs to perform operations in this mode, the application components that should run in it can be registered accordingly.
R7, the background service reads the corresponding device parameter information from the device parameter table according to the device model, converts each composite action in the use case into basic actions that the test application App can execute, and generates a use-case configuration file in step order. The device parameter table is maintained in the background service and mainly contains the attributes, coordinates and other properties of the controls on each device model; each action in the use-case configuration file carries pre-configured information such as the waiting time before execution and the number of repetitions.
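A minimal sketch of the use-case configuration file the background service might generate in R7, merging the device parameter table with per-step waiting time and repeat count and reserving a progress field for breakpoint resumption; all field names are illustrative assumptions.

```python
import json
from pathlib import Path

def build_case_config(case_name: str, basic_actions: list,
                      device_params: dict, out_dir: str = "case_configs") -> Path:
    """Illustrative use-case configuration file: each basic action is resolved
    against the device parameter table and carries its pre-configured wait
    time and repeat count; the 'progress' field records the last executed
    step so the App can resume from a breakpoint."""
    steps = []
    for idx, action in enumerate(basic_actions):
        steps.append({
            "index": idx,
            "op": action["op"],
            # Merge the action's own parameters with the per-model control
            # attributes and coordinates from the device parameter table.
            "params": {**action.get("params", {}),
                       **device_params.get(action["op"], {})},
            "wait_seconds": action.get("wait_seconds", 1),
            "repeat": action.get("repeat", 1),
        })
    config = {"case": case_name, "progress": -1, "steps": steps}
    Path(out_dir).mkdir(exist_ok=True)
    path = Path(out_dir) / f"{case_name}.json"
    path.write_text(json.dumps(config, indent=2))
    return path
```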
R8, the background service issues the use-case configuration file to the mobile device and starts the App.
R9, the App reads each use-case configuration file in turn and executes the corresponding actions. The progress is recorded in the configuration file each time an action is executed.
R10, in the process of executing basic actions, the App encounters two situations: local actions that the App can execute directly, and actions that require the cooperation of the background service. In the second situation, the App writes the action information into a local message file; after the background service detects the message file, it reads the action information in the file and further judges whether the action will disconnect the communication connection. If it will not (such as a shell command), a result file is generated and pushed to the device after the action has been executed, and the App continues with the next action after reading the result file. If the action disconnects the communication connection (such as switching the system or restarting), the background service temporarily stores the use-case configuration file on the device (which contains the execution progress) in the background service and then executes the action. After the USB communication connection with the mobile device is restored, the background service issues the use-case configuration file to the device again, and after the App restarts, the use case continues to be executed from the breakpoint.
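The breakpoint behaviour in R9 and R10 can be pictured with the following App-side sketch: progress is persisted after every step, so a restarted App resumes from the recorded breakpoint rather than from the beginning. The JSON layout matches the previous sketch, and run_basic_action is a caller-supplied placeholder rather than part of the patent.

```python
import json
from pathlib import Path

def execute_case(config_path: str, run_basic_action) -> None:
    """Illustrative App-side execution loop: actions are executed in order,
    progress is written back after every step, and a restarted App resumes
    from the recorded breakpoint instead of starting over."""
    path = Path(config_path)
    config = json.loads(path.read_text())

    start = config["progress"] + 1          # resume after the last finished step
    for step in config["steps"][start:]:
        run_basic_action(step)              # click/slide locally, or hand off
                                            # special actions to the background service
        config["progress"] = step["index"]  # persist the breakpoint
        path.write_text(json.dumps(config, indent=2))

if __name__ == "__main__":
    # Minimal dry run against a configuration shaped like the previous sketch.
    demo = {"case": "demo", "progress": -1,
            "steps": [{"index": 0, "op": "swipe"}, {"index": 1, "op": "click_control"}]}
    Path("demo.json").write_text(json.dumps(demo))
    execute_case("demo.json", run_basic_action=lambda s: print("executing", s["op"]))
```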
R11, the App generates a corresponding result file after each use case finishes, and generates a message file to notify the background service after all use cases have been executed.
R12, during the execution of the batch of use cases, the background service monitors the information on the mobile devices in real time and presents the execution results and execution progress on the Web page.
In a specific implementation, the test data table of the automated test method of the present invention is shown in table 1:
Table 1 test example table
According to the test data example table, it can be seen that the automatic test method can effectively verify whether each functional module of the intelligent terminal system works normally. By automatically generating the test script, adapting to different devices, executing a plurality of test cases in parallel and other key technologies, the test efficiency is obviously improved, and the test result is more accurate and stable. The automatic test method for reducing the test technical threshold of the intelligent terminal system reduces the technical requirements of testers by means of modularization, automatic script generation and the like, so that more people can participate in the test of the intelligent terminal system. By means of an automatic adaptation model, script automatic generation, multi-equipment parallel test and other methods, test efficiency and coverage are greatly improved. Through continuous integration, exception handling, report generation and other functions, the exception condition in the system can be captured in time and the problem can be positioned and repaired, so that the test quality is improved. Through result analysis and feedback optimization, data analysis and visualization are carried out on test results, so that problems in a system can be found, and optimization and improvement can be carried out. The method can save the test cost, improve the test coverage and the test quality, and has important significance for guaranteeing the stability and the reliability of the intelligent terminal system.
In the above embodiment, the automatic adaptation model includes a device adaptation module, a test case adaptation module, a test module, a log management module and a configuration management module; the device adaptation module comprises a device information acquisition unit and a device configuration generation unit; the device information acquisition unit acquires device information through the system debugging bridge, wherein the device information at least comprises the device model and the operating system version; according to the device model and the operating system version, the device configuration generation unit automatically generates a corresponding configuration file through an object-oriented programming method; the test case adaptation module comprises a selection unit and a matching unit; according to the device model and the operating system version, the selection unit selects adapted test cases from the test case library through a reinforcement learning method; according to the characteristics and limitations of the device, the matching unit adapts the selected test cases by modifying input parameters and adjusting operation steps; the test module comprises a scheduling unit, an execution unit and a report generation unit; the scheduling unit schedules and controls the execution sequence of the test cases through an object-oriented programming method according to the output of the device adaptation module and the test case adaptation module; the execution unit executes the adapted test cases through an application program interface; according to the test execution results, the report generation unit generates a test report through a visualization method and a chart library; the log management module comprises a log recording unit and a log output unit; the log recording unit records log information in the test process through a cloud database; the log output unit outputs log information to the cloud console through a data formatting method to facilitate subsequent checking and analysis; the configuration management module comprises a device configuration unit, a test case configuration unit and a test environment configuration unit; the device configuration unit reads the parameters in the configuration file into memory through a parameter identification method to configure test device parameters; the test case configuration unit reads and configures test case files through an object-oriented programming method; the test environment configuration unit reads the test environment configuration file through an object-oriented programming method to configure test environment parameters.
In a specific embodiment, the automatic adaptation model detects the type and the characteristics of the currently used equipment through the equipment adaptation module, and performs corresponding adaptation operation according to different characteristics of the equipment. For example, if the device screen resolution is small, the module may automatically adjust the test interface layout to accommodate the small screen display. And then judging the functions and the limits supported by the current equipment through the test case adaptation module, and generating the test case suitable for the current equipment according to the requirements of the test case. For example, if a test case needs to be photographed using a camera, but the current device does not have a camera, the module will automatically adapt the case to skip this step. Then, executing the test case through the test module and obtaining a test result. It will select and perform the appropriate test operation based on the output of the auto-adapt model. For example, according to the output of the device adaptation module, the test module may automatically adjust the click position and scroll manner of the interface element. Then, a log management module records logs in the test execution process, including test steps, input and output information and the like. These logs can be used for problem investigation and analysis. Then, various configuration parameters in the test process, such as equipment setting, test case selection, test environment configuration and the like, are managed through the configuration management module. The configuration management module allows a user to configure as required and transmits configuration information to other modules.
In specific implementation, the automatic adaptation model can automatically perform adaptation operations according to the characteristics of different devices and test cases, which reduces the manual adaptation workload of testers and lowers the testing technical threshold. In addition, by automatically adapting and executing test cases, large-scale testing can be performed rapidly, reducing testing time and labor cost. The automatic adaptation model can also avoid human interference and improve the stability and accuracy of the test results. Furthermore, the automatic adaptation model can flexibly adjust the test scheme according to actual conditions, making it suitable for different devices and environments and improving the extensibility and adaptability of testing. Meanwhile, the automatic adaptation model can avoid test failures caused by mismatches between devices and test cases. Finally, through automatic test adaptation and execution, the automatic adaptation model can quickly discover defects and vulnerabilities in the software application, which helps to improve the testing level and software quality.
In the above embodiment, the working method steps of the object-oriented programming method are as follows:
S1, defining classes, and defining class attributes and methods through class definition grammar;
S2, creating an object, and creating an object instance through a constructor of the class according to the definition of the class;
S3, attribute and method operation: attributes are accessed and methods are called through the member access operator of the object instance, so as to realize the processing of data and behaviors;
S4, packaging and information hiding, wherein the attribute and the access authority of the method are limited through the access modifier so as to realize information hiding;
S5, inheritance and polymorphism: a child class is created through the inheritance mechanism and inherits the attributes and methods of its parent class, realizing reuse and extension of code; different responses of different objects to the same message are realized by means of method overriding and method overloading;
S6, abstract classes and interfaces are defined through abstract classes and interface keywords, and specifications are provided for subclasses or implementation classes to realize concrete implementation;
And S7, information transfer and collaboration are realized through object method call and parameter transfer and event triggering and monitoring mechanisms.
In a specific embodiment, the object-oriented programming method first designs an abstract device interface to define the operations and functions required by the device. This interface describes which basic actions the device should have, such as launching an application, clicking a button, entering text, etc. For devices of different models, independent device specific implementation classes can be designed by using an object-oriented programming method. These implementation classes inherit from the abstract device interface and implement corresponding operations and functions according to the characteristics of the specific device. For example, for a specific model of mobile phone, a mobile phone equipment class can be written, so that a method for abstracting an equipment interface is realized. In writing test cases, abstract device interfaces may be used to describe the operations and functions required by the case without having to write for a particular device. In this way, the test case is decoupled from the specific implementation of the device, so that the same case script can be executed on the multi-type device in batches. When the test case is run, a specific device object is instantiated according to actual requirements. Depending on the device model, a corresponding specific device implementation class may be selected to create the device object. In this way, the test cases may operate on device objects through the device interface without concern for implementation details of the particular device.
In the automatic test method for reducing the test technical threshold of the intelligent terminal system, the object-oriented programming method decouples the test case from the equipment, so that the test case can be flexibly executed on a plurality of pieces of equipment. The test cases only need to pay attention to the operation and the function provided by the abstract device interface, and do not need to be written for each specific device, so that reusability of the cases is improved. In addition, through the object-oriented programming method, the specific implementation of the device is encapsulated in the device implementation class, and when new device support needs to be modified or added, only the corresponding implementation class needs to be modified or added, without affecting the logic of the test case. Therefore, the maintainability of codes can be improved, and the complexity of modifying the test cases is reduced. Meanwhile, the object-oriented programming method allows new device implementation classes to be added according to actual needs so as to support new types of devices. Thus, the coverage range of the test can be expanded, and the device is suitable for the continuously updated device market.
In summary, the object-oriented programming method can reduce the testing technical threshold of the intelligent terminal system, improve the reusability, maintainability and extensibility of test cases, enable the same use-case script to be executed in batches on devices of multiple models, and improve testing efficiency and quality.
In the above embodiment, the test action generating system includes a test case design module, a test environment configuration module, a test action design module, a test script writing module, a result analysis module, a defect management module, and a test report generating module; the test case design module converts the test requirement into a specific test case through a boundary value analysis method so as to cover different scenes and functions of the system; the test environment configuration module configures a test environment through a state transition diagram so as to ensure the repeatability and the accuracy of the test; the test environment at least comprises hardware equipment, a software environment and network settings; the test action design module designs execution operation in the test process in a mode of graphical interface operation, interface calling and user input simulation, wherein the execution operation at least comprises input data, operation steps and expected results; the test script writing module converts the test action into an executable automatic script through a script editing library so as to realize automatic execution of the test action and verification of an expected result; the result analysis module analyzes and evaluates the test result through an assertion and log recording method; the defect management module records defects found in the test process through a vulnerability management system, and communicates and cooperates with a development team so as to repair and verify in time; according to the test execution result and the analysis data, the test report generating module generates a structured test report through a chart library and a data visualization method; the structured test report at least comprises test coverage, defect statistics and test passing rate information; the output end of the test case design module is connected with the input end of the test environment configuration module; the output end of the test environment configuration module is connected with the input end of the test action design module; the output end of the test action design module is connected with the input end of the test script writing module; the output end of the test script writing module is connected with the input end of the result analysis module; the output end of the result analysis module is connected with the input end of the defect management module; the output end of the defect management module is connected with the input end of the test report generating module.
In a specific embodiment, the test action generating system creates and edits test cases according to information such as a requirement document, a user scene and the like through a test case design module. The method can provide abundant test case templates and operation options, and help testers to quickly design comprehensive and accurate test cases. The test environment is built and configured by the test environment configuration module, including connecting test equipment, simulating a network environment, preparing test data, and the like. The system can automatically execute the complex configuration processes, and saves the time and effort of manually configuring the environment by a tester. And defining test actions in the test cases, namely operation steps required to be executed in the test process, through a test action design module. It may provide a visualization tool or scripting language that helps the tester accurately describe the operation and expected results of each test action. The test cases are converted into executable test scripts through a test script writing module. It can automatically generate test script code or provide a script editor to enable a tester to write and debug test scripts. The test script writing module can support multiple scripting languages and provide rich libraries and functions to simplify the writing of test scripts. And collecting and analyzing the test result through a result analysis module. The method can automatically execute the test script and record the execution result and the actual output of each test action. The results analysis module may also compare with expected results and generate detailed test reports to help test personnel locate and solve problems quickly. Defects found during the test are collected, tracked and repaired by the defect management module. It can be integrated with other defect management tools, automatically creating defect reports and assigning to relevant responsible persons. Therefore, the defects can be ensured to be solved in time, and the quality of software is improved. And generating a detailed test report by a test report generating module so as to display test results and defect conditions. The test report can contain indexes such as test coverage rate, pass rate, failure rate and the like, and provide charts and statistical data so that project managers and developers can better know test conditions and project progress.
In particular implementations, automated testing methods may reduce the workload and time consumption of manual testing. The test action generating system can automatically execute the test script instead of manual gradual operation, thereby greatly improving the test efficiency. In addition, the test action generating system can ensure consistency and accuracy of a test process. The automatically generated test script can reduce human errors and improve the execution reliability and stability of the test case, thereby improving the test quality. Meanwhile, the test action generating system can help testers to quickly create a large number of test cases and execute comprehensive test actions. Through automatic test, more functions, scenes and boundary conditions can be covered, and the test coverage rate is improved. Secondly, the automated testing method can reduce the workload of the testers and the cost of using the equipment. The test action generating system can automatically execute the test script without manual intervention, thereby saving human resources and equipment resources and reducing test cost. Finally, the test action generation system may automatically generate detailed test reports including test results, defect information, coverage, and the like. These reports can be used for project management, decision making and problem solving, providing reliable basis and reference.
In short, the test action generating system plays an important role in an automatic test method for reducing the test technical threshold of the intelligent terminal system. The test script is automatically executed, rich test case design and configuration functions are provided, a detailed test report is generated, the test efficiency, the test quality and the test coverage range can be improved, and the test cost is reduced.
In the above embodiment, the adaptive script generation algorithm performs feature extraction on the system interface screenshot through a convolutional neural network, and marks and classifies the interface elements by combining with a natural language processing model; inputting the image features and the text description features of the interface elements into a deep learning model, and acquiring the recognition probability of the interface elements through an element recognition probability function; the formula expression is:
In the formula (1), M represents the interface element recognition probability; N represents the image feature vector; d represents the weight parameter, i.e., a model parameter in the adaptive script generation algorithm; T represents the text description feature vector. The input of the user is modeled and analyzed through a long short-term memory network in the natural language processing model, and the interactive behavior of the system is predicted and recommended in combination with the adaptive script generation algorithm; the adaptive script generation algorithm obtains the recommendation probability of the interaction behavior through an interaction behavior recommendation probability calculation formula; the formula expression is:
In the formula (2), Y represents the interaction behavior recommendation probability; b is the user input sequence, representing the historical input sequence of the user in the system; Z represents the model parameters in the natural language processing model and the adaptive script generation algorithm. Based on the interface element recognition probability and the interaction behavior recommendation probability, whether a step is added to the test script is judged through a test script automatic generation rule function, so as to realize automatic generation of the test script; the formula expression is:
In the formula (3), R represents a test script automatic generation rule; delta represents a threshold value for judging whether to execute a certain step; m represents classification data of interaction behavior, and S represents classification data of interface elements.
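The formulas themselves are only given as figures, so the sketch below is just one plausible reading of the rule in formula (3): a candidate step is appended to the script when both the element recognition probability M and the interaction recommendation probability Y clear the threshold δ. Whether the patent combines the two probabilities with an AND rule, a product, or a weighted sum is not stated; the names and numbers here are illustrative.

```python
def should_add_step(element_prob: float,
                    action_prob: float,
                    delta: float = 0.8) -> bool:
    """Add a step to the generated script only when both the interface-element
    recognition probability and the interaction-behaviour recommendation
    probability clear the threshold delta (an assumed AND-style rule)."""
    return element_prob >= delta and action_prob >= delta


# Hypothetical model outputs for one candidate step.
M = 0.93   # element recognition probability (e.g. from CNN image + text features)
Y = 0.87   # interaction recommendation probability (e.g. from an LSTM)
if should_add_step(M, Y):
    print("append step to test script")
```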
In a particular embodiment, the adaptive script generation algorithm runs on a mainstream operating system such as Windows, Linux, or macOS, and is implemented with Python, Java, or another suitable programming language and its development tools. Selenium WebDriver is used as the automated testing framework, and the reliability of the script under different environments is ensured by using stable and accurate element positioning methods, such as unique CSS selectors and XPath expressions. Meanwhile, a processing mechanism for abnormal conditions, such as timeout handling and handling of absent elements, is added to improve the robustness of the script. In addition, the test data is separated from the script, and an external data source (such as an Excel table or a database) is used to manage and supply the test data, making the test script more flexible and easier to maintain.
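A minimal Selenium WebDriver sketch in the spirit of the paragraph above: explicit waits with unique CSS/XPath locators, timeout and missing-element handling, and test data pulled from an external file. The URL, file name, and locators are illustrative assumptions.

```python
import csv

from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.common.exceptions import TimeoutException, NoSuchElementException

driver = webdriver.Chrome()
try:
    # Test data kept outside the script in an external CSV file (hypothetical path).
    with open("login_cases.csv", newline="") as f:
        for row in csv.DictReader(f):
            driver.get("https://example.com/login")
            wait = WebDriverWait(driver, timeout=10)
            # Stable locators: a unique CSS selector and an XPath expression,
            # wrapped in an explicit wait to tolerate slow page loads.
            wait.until(
                EC.presence_of_element_located((By.CSS_SELECTOR, "#username"))
            ).send_keys(row["username"])
            driver.find_element(By.XPATH, "//input[@id='password']").send_keys(row["password"])
            driver.find_element(By.CSS_SELECTOR, "button[type='submit']").click()
except TimeoutException:
    print("element did not appear in time")      # timeout handling
except NoSuchElementException:
    print("expected element is not present")     # element-absent handling
finally:
    driver.quit()
```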
In a specific embodiment, the working mode principle of the adaptive script generation algorithm includes the following steps:
m1, system analysis: firstly, an adaptive script generation algorithm analyzes an intelligent terminal system to identify various interface elements and interactive operations in the system.
M2, learning stage: the algorithm then simulates the system and records the user's operating steps and corresponding interface elements. By analyzing this data, the algorithm can learn the correlations between the different operating steps and constraint conditions.
M3, script generation: based on the learned patterns and rules, the adaptive script generation algorithm may automatically generate test scripts. It combines and generates the corresponding test steps and expected results based on the interface elements and interactions of the system.
M4, executing and verifying: the generated test script may be executed by an automated test framework and verify that the system is operating as intended. In the execution process, the algorithm may be adjusted and optimized according to the actual result, so as to improve the accuracy and efficiency of the test.
In an automated test method for reducing the test technical threshold of an intelligent terminal system, a self-adaptive script generation algorithm automatically generates a test script by analyzing and learning interface elements and interactive operation of the system. Therefore, the workload of manually writing scripts by a tester can be greatly reduced, and the efficiency and accuracy of the test are improved. In addition, the adaptive script generation algorithm can generate comprehensive and accurate test scripts according to interface elements and interactive operations of the system. Compared with the traditional manual script writing method, the method can better cover different functions, scenes and boundary conditions, and improves the coverage of the test. And secondly, the self-adaptive script generation algorithm can dynamically adjust the generation mode and the content of the test script according to the actual situation. The method can automatically adapt to new interface elements and interactive operation according to the change and evolution of the system, and the continuous effectiveness of the test is maintained. Finally, the self-adaptive script generation algorithm can generate a test script with clear structure and easy understanding and maintenance according to the characteristics and the requirements of the system. Thus, the maintenance cost of the test script can be reduced, and the reliability and the sustainability of the test can be improved.
In specific implementation, a data test comparison example table of the adaptive script generation algorithm and the conventional algorithm is shown in table 2:
Table 2 adaptive script generation algorithm test data table
Test index | Adaptive script generation algorithm | Traditional algorithm
Script generation time | 10 minutes | 60 minutes
Script error rate | 5% | 15%
Test coverage | 95% | 80%
Script maintenance cost | Low | High
The comparison shows that, compared with the traditional algorithm, the adaptive script generation algorithm has obvious advantages in script generation time, script error rate, test coverage, script maintenance cost, and the like. The adaptive script generation algorithm can generate test scripts more quickly and accurately, improves the efficiency and quality of testing, and reduces the maintenance cost of the test scripts.
In the above embodiment, the persistent integrated model includes a version control module, an integrated server module, an automatic test execution module, and an exception handling module; the version control module comprises a program management unit, a branch management unit and a record submitting unit; the program management unit manages codes of the application program and the test script through a version library; the branch management unit realizes parallel development and testing through a distributed version control system Git so as to ensure independent management among different version codes; the record submitting unit submits codes to a version library through a change log recorder and adds related comments so as to record the detailed information of the code change; the integrated server module comprises a test triggering unit, a generation test unit and a test deployment unit; the test triggering unit monitors code change through a polling method and triggers an automatic test flow; the generation test unit automatically pulls codes through a constructor and performs construction, compiling and packaging operations to generate a deployable application program; the test deployment unit deploys the constructed application program into a test environment through a publisher so as to perform automatic test; the automatic test execution module executes an automatic test script through a test framework selection, user operation simulation and application program behavior verification method; the abnormality processing module comprises an abnormality monitoring unit and an abnormality processing unit; the abnormality monitoring unit monitors the state and the log of the mobile equipment through the system debugging bridge so as to detect whether abnormal operation occurs to the mobile equipment; when abnormal operation occurs to the mobile equipment, the abnormal processing module sends a request to the server in an asynchronous interaction mode with the server, the test is requested to be suspended, and the test is continuously executed after the equipment is recovered to be normal; when the mobile equipment cannot be restored to a normal state, restarting operation of the equipment is realized through the system debugging bridge, so that the continuity and stability of the test are ensured.
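The exception handling described above relies on the system debugging bridge (adb) to watch device state and to restart a device that cannot recover. Below is a minimal sketch of that pause-wait-reboot policy, assuming adb is on the PATH and the device serial is known; the surrounding suspend/resume of the test run is only simulated.

```python
import subprocess
import time


def device_state(serial: str) -> str:
    """Query the device state through the system debugging bridge (adb)."""
    out = subprocess.run(["adb", "-s", serial, "get-state"],
                         capture_output=True, text=True)
    return out.stdout.strip()  # e.g. "device", "offline", or "" when unreachable


def ensure_device_ready(serial: str, retries: int = 3) -> bool:
    """Pause-and-resume policy: wait for recovery, reboot as a last resort."""
    for _ in range(retries):
        if device_state(serial) == "device":
            return True                    # device is back; resume the suspended test
        time.sleep(10)                     # test stays suspended while we wait
    # Device did not recover: force a restart through adb and wait for it.
    subprocess.run(["adb", "-s", serial, "reboot"])
    subprocess.run(["adb", "-s", serial, "wait-for-device"])
    return device_state(serial) == "device"
```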
In a specific embodiment, the continuous integration model performs version management on test code, configuration files, and the like through the version control module to ensure consistency between the test environment and the code. Automatic construction, deployment, and testing are carried out through the integrated server module: the server monitors code changes in the version control tool and triggers the corresponding build and deployment process. The automatic test execution module automatically pulls the latest code and executes the predefined test cases, which may include different levels and types of testing, such as unit testing, integration testing, and end-to-end testing. If any stage (construction, deployment, or testing) encounters an exception or error, the continuous integration model sends a timely notification to the relevant personnel through the exception handling module and records the exception information for subsequent investigation and repair.
The continuous integration model realizes continuous integration through the cooperation of a version control tool and an integration server, namely, codes of all developers are integrated into a shared code library, and the shared code library is constructed, deployed and tested regularly. The integration server then automatically pulls the latest code and builds and deploys to ensure that each integration is based on the latest code version. The integrated server then executes predefined test cases, including unit testing, integration testing, end-to-end testing, and the like. These test cases may run automatically and generate test reports and results. Finally, if an anomaly or error occurs during the automated build, deployment or test process, the continuous integration model may send timely notifications to relevant personnel to discover and resolve the problem in time.
In the automatic test method for reducing the test technical threshold of the intelligent terminal system, the continuous integrated model can be automatically constructed, deployed and tested, so that the workload of manual operation is reduced, and the efficiency of developers is improved. In addition, the continuous integration model can quickly find and repair problems introduced by code integration, including compiling errors, error configuration and the like, so that the quality of codes is improved. Meanwhile, the continuous integration model can greatly improve the coverage range and depth of the test by automatically executing the test case, and the test comprises different levels of test such as unit test, integration test and the like. And secondly, the continuous integration model can timely discover abnormality or error in construction, deployment or test, and is beneficial to quickly discovering and solving problems and reducing the influence range of faults by sending notification to related personnel. Finally, the continuous integration model can quickly integrate codes of developers into a shared code library and construct and test the codes, so that quick iteration and frequent delivery are supported.
In summary, the continuous integration model plays an important role in the automatic test for reducing the testing technical threshold of the intelligent terminal system. The method improves development efficiency and code quality through automatic construction, deployment and testing, increases test coverage rate, can discover and solve problems in time, and supports rapid iteration and delivery.
In the above embodiment, the virtualization model includes a virtualization model module, a test execution module, a result analysis module, and a data display module; the virtualization model module comprises an equipment management unit, a network simulation unit, a data simulation unit and an environment configuration unit; the device management unit manages device instances in the virtual environment through a virtual machine creation method; the network simulation unit simulates a network environment through a network simulation model so as to test the system performance under different network conditions; the data simulation unit generates simulation data through a data generation program to simulate the data interaction condition in a real environment so as to verify the performance and stability of the system under different data loads; the environment configuration unit configures virtual environment parameters through a system command line so as to simulate a real scene for testing; the virtual environment parameters at least comprise operating system version, application program version and configuration of hardware; the test execution module comprises a test case management unit, a parallel test unit and a test report generation unit; the test case management unit realizes the selection, execution and result recording of the test cases through an automatic test script; the automatic test script manages and executes the test cases through the test framework; the parallel test unit realizes parallel execution of a plurality of test tasks through a multi-process standard library so as to improve the test efficiency; according to the test result, the test report generating unit generates a test report through a logic test framework so as to evaluate the quality and stability of the system; the result analysis module comprises a result collection unit, a result comparison unit and an error positioning unit; the result collection unit collects data and logs generated in the test execution process through a log recorder, and at least comprises equipment states, performance indexes and error information; the result comparison unit compares the actual result with the expected result through an interface automatic test frame, and detects the behavior operation condition of the system so as to find potential problems and errors; the error positioning unit analyzes and positions error information generated in the test process through a log analysis method; the data display module generates a data visualization interface of the test progress and the result through a data visualization method so that a tester can know the test progress and the result in real time.
In particular embodiments, the virtualization model creates a virtualized environment using virtualization techniques (e.g., virtual machines or containers) that simulate the operating environment of the intelligent terminal system. The module is responsible for managing and configuring the virtualized environment. Meanwhile, the virtualization model executes the predefined test cases in the virtualization environment through the test execution module. The test cases can comprise a function test, a performance test, a compatibility test and the like, and are used for verifying various functions and performance indexes of the intelligent terminal system. And analyzing and counting the test result generated by the test execution module through the result analysis module. The module can automatically detect and record the execution results of the test cases, including success, failure or abnormality. Finally, the data generated by the result analysis module is visually displayed through the data display module, so that a user can intuitively know the test result. This may be accomplished by generating reports, charts or graphical interfaces, etc. In specific implementation, the virtualization model creates a virtualization environment through a virtualization technology, and simulates the running environment of the intelligent terminal system. Therefore, a plurality of virtualized instances can be operated on the same hardware platform at the same time, and the resource utilization rate is improved. Each virtualized instance is isolated and independent from each other, having its own operating system and application programs. Thus, mutual interference among test cases can be avoided, and stability of a test environment is ensured. In addition, the virtualization model can realize the repeatable execution of the test case by saving and restoring the state of the virtualization instance. Meanwhile, resources and parameters of the virtualized environment can be flexibly configured to meet different testing requirements.
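The parallel test unit is described as using the multi-process standard library; the sketch below uses Python's multiprocessing.Pool to fan test cases out across virtualized instances. The instance and case names are hypothetical and the virtual machines themselves are only simulated here.

```python
from multiprocessing import Pool


def run_case_on_instance(args):
    """Run one test case inside one virtualized instance (names are hypothetical)."""
    instance_name, case_name = args
    # A real implementation would drive the virtual machine or container here;
    # this sketch only fabricates a deterministic-looking pass/fail result.
    passed = (len(instance_name) + len(case_name)) % 7 != 0
    return instance_name, case_name, "PASS" if passed else "FAIL"


if __name__ == "__main__":
    instances = ["vm-android-12", "vm-android-13", "container-linux"]
    cases = ["login", "payment", "settings"]
    jobs = [(i, c) for i in instances for c in cases]
    with Pool(processes=len(instances)) as pool:  # parallel execution across instances
        for result in pool.map(run_case_on_instance, jobs):
            print(result)
```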
In an automated testing method for reducing the testing technical threshold of an intelligent terminal system, a virtualization model can simulate the real running environment of the intelligent terminal system, and the real running environment comprises an operating system, hardware configuration and the like. This allows for more accurate testing of the functionality and performance of the system. In addition, the virtualization model can simultaneously run a plurality of virtualization instances on the same physical equipment, so that the utilization rate of the test resources is improved, and the test speed and efficiency are improved. Secondly, through the virtualization model, different testing environments can be conveniently configured, including different operating system versions, hardware configurations and the like. This helps cover a wider test requirement. Meanwhile, the virtualization model can save and restore the state of the virtualization instance, so that the repeatable execution of the test case is realized. This is very helpful for debugging problems, reproducing defects, verifying repair effects, etc. Finally, the virtualization model can package the test environment into a virtual machine image or a container image, so that the deployment and configuration processes of the test environment are simplified, and the test convenience is improved.
In the above embodiment, the abnormality detection algorithm evaluates the degree of deviation between the observed value and the average value by an abnormality score function based on a Gaussian mixture model, and normalizes the severity of the abnormal condition by the standard deviation; the formula expression is:
In the formula (4), D is the anomaly score indicating the degree of anomaly of the observed value; u represents the weight of each Gaussian distribution, which determines the contribution of that distribution in the whole model, each component of the model being a Gaussian distribution; n represents the root mean square, used to measure the overall fluctuation of the observed values. The anomaly score is compared with a preset threshold value through a threshold judgment formula to determine whether the system is in an abnormal state; if the anomaly score exceeds the threshold value, the system is judged to have an abnormal condition; the formula expression is:
In the formula (5), θ represents the skewness, used to describe the degree of skewness of the observed value distribution; ω represents the time window, used to control the time granularity of the anomaly detection algorithm; η represents the anomaly score threshold, used to determine the threshold for an abnormal condition; α represents the correlation coefficient, used to analyze the correlation between observed values.
In a specific embodiment, the operating environment of the anomaly detection algorithm is a machine learning or deep learning algorithm library. In this embodiment, the Python programming language and a machine learning framework such as Scikit-learn or TensorFlow are adopted to construct the anomaly detection model and realize the anomaly detection function.
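The patent's exact score function is given only as a figure; as a stand-in, the sketch below uses the negative log-likelihood under a Gaussian mixture fitted with Scikit-learn as the anomaly score and picks the threshold η from the score distribution on normal data. The response-time data and the 99th-percentile threshold are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical training data: response times (ms) collected from normal runs.
rng = np.random.default_rng(0)
normal = np.concatenate([rng.normal(100, 10, 500), rng.normal(250, 20, 200)])
X_train = normal.reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(X_train)


def anomaly_score(x: np.ndarray) -> np.ndarray:
    """Higher score = less likely under the mixture of weighted Gaussians."""
    return -gmm.score_samples(x.reshape(-1, 1))


# Threshold eta chosen from the distribution of scores on normal data.
eta = np.percentile(anomaly_score(normal), 99)
new_obs = np.array([105.0, 240.0, 900.0])
print(anomaly_score(new_obs) > eta)  # e.g. [False False  True]
```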
In an automatic test method for reducing the test technical threshold of an intelligent terminal system, the abnormality detection algorithm collects various log information generated during system operation by embedding a logging function in the intelligent terminal system or by using a dedicated log collection tool; the goal is to obtain critical information and error conditions of system operation. The collected log information is then analyzed by the abnormality detection algorithm, typically using text analysis techniques, regular expressions, keyword extraction, and the like, to extract useful information; the goal of this step is to screen out the information related to anomalies from the vast amount of log data. Next, the abnormality detection algorithm constructs an abnormality detection model from the anomaly-related information obtained by the analysis. Common models include statistical methods (e.g., models based on probability distributions), machine learning methods (e.g., support vector machines and random forests), and deep learning methods (e.g., neural networks); the aim is to judge whether unknown data belongs to an abnormal condition by modeling the abnormal patterns. The abnormality detection algorithm then trains the model using a known training data set, which includes labeled normal data and abnormal data. By learning the known data patterns, the model can automatically adjust its parameters to better identify abnormal situations; the goal is to obtain an accurately classifying model. Then, the abnormality detection algorithm performs abnormality detection on new log data using the constructed abnormality detection model. When new data is input into the model, the model judges the data according to its previous training and outputs an abnormal or normal classification result; the aim is to determine whether the new log data belongs to an abnormal condition according to the judgment of the model. Next, the abnormality detection algorithm generates a corresponding test report according to its output result. The report typically includes information such as the description of the anomaly, the time of occurrence, and the location of the anomaly; the goal is to provide detailed exception information to the developer so that the problem can be quickly located and fixed. Finally, based on the anomalies in the test report and the information provided, problems of the system can be located and repaired. By analyzing the abnormal situation, a developer can determine the reason for the anomaly and take corresponding measures to repair the system; the aim is to improve the stability and reliability of the system.
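As a small example of the log-screening step described above (regular expressions plus keyword extraction), the sketch below filters anomaly-related entries out of raw log lines. The log format and keyword list are hypothetical; a real terminal system would define its own.

```python
import re

# Hypothetical log lines; the real format depends on the terminal system under test.
log_lines = [
    "2024-01-05 10:01:02 INFO  app started",
    "2024-01-05 10:01:09 ERROR NullPointerException in PaymentService",
    "2024-01-05 10:01:15 WARN  response time 5200 ms exceeds budget",
]

pattern = re.compile(r"^(?P<ts>\S+ \S+) (?P<level>ERROR|WARN)\s+(?P<msg>.*)$")
keywords = ("Exception", "timeout", "exceeds")

anomaly_related = []
for line in log_lines:
    m = pattern.match(line)
    if m and (m.group("level") == "ERROR"
              or any(k in m.group("msg") for k in keywords)):
        anomaly_related.append(m.groupdict())

print(anomaly_related)  # entries that would be fed into the anomaly detection model
```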
In specific implementation, the anomaly detection algorithm can automatically analyze system log information, identify the anomaly condition in the system operation, and does not need manual intervention. This greatly improves the efficiency and accuracy of the test. Meanwhile, the abnormality detection algorithm can generate a detailed test report to indicate the position and reason of abnormality occurrence. This helps the developer to quickly locate the problem and repair it in time. In addition, the anomaly detection algorithm can analyze the system log in real time and discover the anomaly condition in time. The method can enable the problem to be found and repaired more timely, and improves the stability and reliability of the system. And secondly, the anomaly detection algorithm can be customized and adjusted according to the log characteristics of different systems, and has certain expandability and adaptability. This allows the algorithm to function in different scenarios. In a specific implementation, the data test table of the anomaly detection algorithm is shown in table 3:
Table 3 data test table for abnormality detection algorithm
Sequence number | Input data | Expected output | Actual output | Abnormality detection result
1 | 10 | Normal | Normal | No abnormality
2 | 15 | Normal | Normal | No abnormality
3 | 20 | Normal | Timeout | Abnormality
4 | 12 | Normal | Normal | No abnormality
5 | 30 | Normal | Normal | No abnormality
6 | 50 | Normal | Normal | No abnormality
7 | 8 | Normal | Abnormal | Abnormality
In this example, the input data is certain parameters or test cases of the intelligent terminal system, and the expected output is a predefined normal output. The actual output is the actual result obtained by running the system. The abnormality detection result is an abnormality in the system operation identified according to the log analysis method and the abnormality detection algorithm.
In the example table, the input data of rows 1, 2, 4, 5, and 6 all produce the expected normal output; no abnormal situation occurs, and the abnormality detection result is "no abnormality". The input data of row 3 causes the system to time out; the actual output is inconsistent with the expected output, and the abnormality detection result is "abnormality". The input data of row 7 also causes an abnormality in the system; the actual output is inconsistent with the expected output, and the abnormality detection result is likewise "abnormality".
Through the example table, it can be seen that the abnormality detection algorithm can accurately capture the abnormal condition in the system operation and generate a corresponding abnormality detection result. The method is favorable for problem positioning and repairing, and improves the stability and reliability of the intelligent terminal system. When testing is performed on a larger-scale test data set, the abnormality detection algorithm can automatically monitor and report the abnormal behavior of the system, reduce manual intervention and improve test efficiency.
In the above embodiment, the distributed execution framework includes a task scheduling module, a test case management module, a resource management module, a data management module, and a log management module; the task scheduling module comprises a task queue unit, a task ordering unit and a task distributing unit; the task queue unit realizes asynchronous scheduling and distribution of tasks through a message queue method so as to improve task execution efficiency; the task ordering unit schedules tasks in the task queue in a mode of shortest job priority, polling and priority ordering so as to ensure orderly execution of the tasks; the task allocation unit allocates tasks to the available execution nodes through polling, random and weighted polling methods so as to realize parallel execution of the tasks; the test case management module comprises a case management unit, a case execution unit and a case monitoring unit; the case management unit provides the functions of creating, editing and managing the test case through the application program interface so as to facilitate the user to define and maintain the test case; the case execution unit sends the test case to an execution node through a remote calling method, and an execution result is obtained; the case monitoring unit monitors the state and the execution progress of the execution node in real time through the test monitor and displays the state and the execution progress through a visualization method so as to acquire the execution condition of the test case; the resource management module comprises a resource maintenance unit, a resource scheduling unit and a real-time monitoring unit; the resource maintenance unit manages and maintains the execution node resources through a resource pool so as to ensure that the test task can be supported by the resources; the resource scheduling unit distributes tasks to the execution nodes through a minimum residual resource priority and weight scheduling method so as to realize dynamic distribution and load balancing of resources; the real-time monitoring unit monitors the resource use condition of the executing node through the resource monitor, wherein the resource use condition at least comprises CPU utilization rate and memory occupation so as to provide statistics and alarm functions of the resource use condition; the data management module comprises a data generation unit, a data storage unit and a data analysis unit; the data generation unit generates test data through a random number generation and template filling method so as to meet the requirements of different test scenes; the data storage unit stores and manages test data through a database and a distributed storage system so as to facilitate subsequent data analysis and backtracking; the data analysis unit processes and analyzes the test data through a big data analysis method to extract key indexes and modes and provide support and reference for test results; the log management module comprises a log collecting unit, a log storage unit and a log analysis unit; the log collecting unit collects log information generated by the executing node through a log recorder, wherein the log information at least comprises an error log and a debugging log; the log storage unit stores and manages log information of the execution node through a log database so as to facilitate subsequent query and analysis; the log analysis unit analyzes and visually displays the log of the execution node in real time through a search analysis technology 
stack so as to help a user locate problems and anomalies.
In a specific embodiment, the distributed execution framework distributes test tasks to each node for execution through a task scheduling module according to the characteristics and requirements of the test cases. This may be achieved by using task scheduling algorithms, such as load balancing or priority based algorithms, to achieve intelligent allocation of tasks. All test cases are managed through the test case management module, including storage, classification, version control and the like of the test cases. By setting the priority and importance of test cases, it can be ensured that critical test cases are executed first. The hardware and software resources of each node are managed by a resource management module. The method can monitor the condition of available resources of the nodes and dynamically allocate the nodes according to the requirements of test tasks. Through effective management of resources, resource contention and waste can be avoided, and efficiency and stability of testing are improved. The data management module is responsible for collecting and managing the data generated in the test process. This includes test results, performance metrics, log files, screenshots, and the like. For large-scale testing tasks, a distributed storage system may be used to manage and store such data for subsequent data analysis and processing. And collecting and managing log information of each node through a log management module. The method can identify potential errors and anomalies by monitoring the log output of the nodes in real time and provide detailed log records for subsequent fault detection and problem positioning.
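As a toy illustration of two of the scheduling strategies named above (priority ordering of the task queue and round-robin allocation to execution nodes), the sketch below uses an in-process heap in place of a real message queue; task names, durations, and node names are hypothetical.

```python
import heapq
from itertools import cycle

# Hypothetical test tasks: (priority, estimated_duration_s, name); lower runs first.
tasks = [(1, 30, "smoke_login"), (2, 300, "full_regression"), (1, 45, "smoke_payment")]
nodes = ["node-a", "node-b"]

# Priority ordering of the task queue (a heap stands in for the message queue).
queue = list(tasks)
heapq.heapify(queue)

# Round-robin allocation of the ordered tasks onto the available execution nodes.
assignment = {}
node_cycle = cycle(nodes)
while queue:
    priority, duration, name = heapq.heappop(queue)
    assignment[name] = next(node_cycle)

print(assignment)
# e.g. {'smoke_login': 'node-a', 'smoke_payment': 'node-b', 'full_regression': 'node-a'}
```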
In a specific implementation, the distributed execution framework can greatly shorten the test time by parallelizing the test tasks. And a plurality of nodes execute test cases at the same time, so that hardware resources are effectively utilized, and the throughput of the whole test is improved. In addition, the distributed execution framework can fully utilize the existing hardware resources, so that extra hardware investment is avoided. This reduces test costs, especially for large-scale and long-term testing, saving expensive hardware procurement and maintenance costs. And secondly, different nodes can test different functional modules or test scenes simultaneously by executing test cases in parallel, so that the test coverage rate is improved. This helps to find more potential problems and defects. Meanwhile, the distributed execution framework can automatically collect and analyze test results, performance indexes and log information. This may help the test team better understand system behavior and performance conditions, identify potential problems and bottlenecks, and take timely action to improve system quality and performance.
In summary, the distributed execution framework reduces the technical threshold of intelligent terminal system testing by parallelizing test tasks, making full use of resources, and automatically managing and analyzing data. It improves testing efficiency, reduces testing cost, enhances test coverage and test quality, and provides a stability and reliability guarantee for the development and deployment of the intelligent terminal system.
While specific embodiments of the present invention have been described above, it will be understood by those skilled in the art that these specific embodiments are by way of example only, and that various omissions, substitutions, and changes in the form and details of the methods and systems described above may be made by those skilled in the art without departing from the spirit and scope of the invention. For example, it is within the scope of the present invention to combine the above-described method steps to perform substantially the same function in substantially the same way to achieve substantially the same result. Accordingly, the scope of the invention is limited only by the following claims.

Claims (9)

1. An automatic test method for reducing the test technical threshold of an intelligent terminal system is characterized by comprising the following steps of: the method comprises the following steps:
firstly, system analysis and demand definition are carried out on an intelligent terminal system in a questionnaire investigation, business scene analysis and system architecture analysis mode so as to determine a test target and demand;
step two, an automatic adaptation model is built, equipment of different types is configured through the automatic adaptation model, so that flexibility and expansibility of a test system are improved, and the automatic adaptation model decouples an automation use case from the equipment through modularization and an object-oriented programming method, so that batch execution of a script of the same use case on the equipment of different types is realized;
Step three, designing and developing test cases, namely generating a system design test case set through a test action, wherein the test action generating system provides standard basic actions and compound actions through a boundary value analysis method so as to reduce the technical requirements of developing test scripts;
Step four, automatically generating a script, analyzing and learning system interface elements and interactions through a self-adaptive script generation algorithm to automatically generate a test script, and reducing the manual script writing workload;
step five, multi-equipment parallel test support, wherein a plurality of equipment instances are created in a virtual environment through a virtualization model and are tested in parallel, so that the test efficiency and coverage are improved;
Step six, continuous integration and automatic deployment, wherein integration of an automatic test process and a software development flow is realized through a continuous integration model and an automatic deployment system; the continuous integration model realizes the cooperative execution of the automatic use case of the service and the application program in an asynchronous interaction mode of the service end and the application end so as to ensure the continuity and the stability of the test;
Step seven, exception handling and report generation, capturing exception conditions in the running of a system and generating a test report by a log analysis method and an exception detection algorithm so as to locate and repair problems;
Step eight, automatic test execution and monitoring are carried out, and parallel execution and real-time monitoring of test cases are realized through a distributed execution framework; the distributed execution framework monitors the test progress and the result through a test monitoring system so as to find out execution abnormality and error and improve the stability and reliability of the test;
and step nine, analyzing the test result through a data analysis and visualization method to realize system optimization and improvement.
2. The automated testing method for reducing the testing technical threshold of the intelligent terminal system according to claim 1, wherein the method comprises the following steps: the automatic adaptation model comprises an equipment adaptation module, a test case adaptation module, a test module, a log management module and a configuration management module; the device adaptation module comprises a device information acquisition unit and a device configuration generation unit; the device information acquisition unit acquires device information through a system debugging bridge, wherein the device information at least comprises a device model and an operating system version; according to the equipment model and the operating system version; the device configuration generating unit automatically generates a corresponding configuration file through an object-oriented programming method; the test case adaptation module comprises a selection unit and a matching unit; according to the equipment model and the operating system version, the selection unit selects an adapted test case from the test case library through a reinforcement learning method; according to the characteristics and the limitation of the equipment, the matching unit adapts the selected test cases in a mode of input parameter modification and operation step adjustment; the test module comprises a scheduling unit, an executing unit and a report generating unit; the scheduling unit schedules and controls the execution sequence of the test cases through an object-oriented programming method according to the output of the equipment adaptation module and the test case adaptation module; the execution unit realizes executing the adapted test case through an application program interface; according to the test execution result, the report generating unit generates a test report through a visualization method and a chart library; the log management module comprises a log recording unit and a log output unit; the log recording unit records log information in the test process through a cloud database; the log output unit outputs log information to the cloud control console through a data formatting method so as to facilitate subsequent checking and analysis; the configuration management module comprises an equipment configuration unit, a test case configuration unit and a test environment configuration unit; the device configuration unit reads parameters in the configuration file into the memory through a parameter identification method so as to configure test device parameters; the test case configuration unit reads and configures a test case file through an object-oriented programming method; the test environment configuration unit reads the test environment configuration file through an object-oriented programming method to configure test environment parameters.
3. The automated testing method for reducing the testing technical threshold of the intelligent terminal system according to claim 1, wherein the method comprises the following steps: the working method of the object-oriented programming method comprises the following steps:
S1, defining classes, and defining class attributes and methods through class definition grammar;
S2, creating an object, and creating an object instance through a constructor of the class according to the definition of the class;
S3, the attribute and method operation is carried out, and the attribute and the calling method are accessed through the member access operator of the object instance so as to realize the processing of data and behaviors;
S4, packaging and information hiding, wherein the attribute and the access authority of the method are limited through the access modifier so as to realize information hiding;
S5, inheriting and polymorphism, creating a child class through an inheritance mechanism, inheriting the attribute and the method of a parent class, and realizing reuse and expansion of codes; different responses of different objects to the same message are realized in a method rewriting and method reloading mode;
S6, abstract classes and interfaces are defined through abstract classes and interface keywords, and specifications are provided for subclasses or implementation classes to realize concrete implementation;
And S7, information transfer and collaboration are realized through object method call and parameter transfer and event triggering and monitoring mechanisms.
4. The automated testing method for reducing the testing technical threshold of the intelligent terminal system according to claim 1, wherein the method comprises the following steps: the test action generating system comprises a test case design module, a test environment configuration module, a test action design module, a test script writing module, a result analysis module, a defect management module and a test report generating module; the test case design module converts the test requirement into a specific test case through a boundary value analysis method so as to cover different scenes and functions of the system; the test environment configuration module configures a test environment through a state transition diagram so as to ensure the repeatability and the accuracy of the test; the test environment at least comprises hardware equipment, a software environment and network settings; the test action design module designs execution operation in the test process in a mode of graphical interface operation, interface calling and user input simulation, wherein the execution operation at least comprises input data, operation steps and expected results; the test script writing module converts the test action into an executable automatic script through a script editing library so as to realize automatic execution of the test action and verification of an expected result; the result analysis module analyzes and evaluates the test result through an assertion and log recording method; the defect management module records defects found in the test process through a vulnerability management system, and communicates and cooperates with a development team so as to repair and verify in time; according to the test execution result and the analysis data, the test report generating module generates a structured test report through a chart library and a data visualization method; the structured test report at least comprises test coverage, defect statistics and test passing rate information; the output end of the test case design module is connected with the input end of the test environment configuration module; the output end of the test environment configuration module is connected with the input end of the test action design module; the output end of the test action design module is connected with the input end of the test script writing module; the output end of the test script writing module is connected with the input end of the result analysis module; the output end of the result analysis module is connected with the input end of the defect management module; the output end of the defect management module is connected with the input end of the test report generating module.
5. The automated testing method for reducing the testing technical threshold of the intelligent terminal system according to claim 1, wherein the method comprises the following steps: the self-adaptive script generation algorithm performs feature extraction on the system interface screenshot through a convolutional neural network, and marks and classifies interface elements by combining a natural language processing model; inputting the image features and the text description features of the interface elements into a deep learning model, and acquiring the recognition probability of the interface elements through an element recognition probability function; the formula expression is:
In the formula (1), M represents the interface element recognition probability; N represents the image feature vector; d represents the weight parameter, i.e., a model parameter in the adaptive script generation algorithm; T represents the text description feature vector; the input of the user is modeled and analyzed through a long short-term memory network in the natural language processing model, and the interactive behavior of the system is predicted and recommended in combination with the adaptive script generation algorithm; the adaptive script generation algorithm obtains the recommendation probability of the interaction behavior through an interaction behavior recommendation probability calculation formula; the formula expression is:
In the formula (2), Y represents an interaction behavior recommendation probability; b is a user input sequence, representing a historical input sequence of a user in the system; z represents model parameters in the natural language processing model and the adaptive script generation algorithm; based on the interface element recognition probability and the interaction behavior recommendation probability, judging whether to add the step into the test script or not through the rule function automatically generated by the test script so as to realize automatic generation of the test script; the formula expression is:
In the formula (3), R represents a test script automatic generation rule; delta represents a threshold value for judging whether to execute a certain step; m represents classification data of interaction behavior, and S represents classification data of interface elements.
6. The automated testing method for reducing the testing technical threshold of the intelligent terminal system according to claim 1, wherein the method comprises the following steps: the continuous integrated model comprises a version control module, an integrated server module, an automatic test execution module and an exception handling module; the version control module comprises a program management unit, a branch management unit and a record submitting unit; the program management unit manages codes of the application program and the test script through a version library; the branch management unit realizes parallel development and testing through a distributed version control system Git so as to ensure independent management among different version codes; the record submitting unit submits codes to a version library through a change log recorder and adds related comments so as to record the detailed information of the code change; the integrated server module comprises a test triggering unit, a generation test unit and a test deployment unit; the test triggering unit monitors code change through a polling method and triggers an automatic test flow; the generation test unit automatically pulls codes through a constructor and performs construction, compiling and packaging operations to generate a deployable application program; the test deployment unit deploys the constructed application program into a test environment through a publisher so as to perform automatic test; the automatic test execution module executes an automatic test script through a test framework selection, user operation simulation and application program behavior verification method; the abnormality processing module comprises an abnormality monitoring unit and an abnormality processing unit; the abnormality monitoring unit monitors the state and the log of the mobile equipment through the system debugging bridge so as to detect whether abnormal operation occurs to the mobile equipment; when abnormal operation occurs to the mobile equipment, the abnormal processing module sends a request to the server in an asynchronous interaction mode with the server, the test is requested to be suspended, and the test is continuously executed after the equipment is recovered to be normal; when the mobile equipment cannot be restored to a normal state, restarting operation of the equipment is realized through the system debugging bridge, so that the continuity and stability of the test are ensured.
7. The automated testing method for reducing the testing technical threshold of the intelligent terminal system according to claim 1, wherein the method comprises the following steps: the virtualization model comprises a virtualization model module, a test execution module, a result analysis module and a data display module; the virtualization model module comprises an equipment management unit, a network simulation unit, a data simulation unit and an environment configuration unit; the device management unit manages device instances in the virtual environment through a virtual machine creation method; the network simulation unit simulates a network environment through a network simulation model so as to test the system performance under different network conditions; the data simulation unit generates simulation data through a data generation program to simulate the data interaction condition in a real environment so as to verify the performance and stability of the system under different data loads; the environment configuration unit configures virtual environment parameters through a system command line so as to simulate a real scene for testing; the virtual environment parameters at least comprise operating system version, application program version and configuration of hardware; the test execution module comprises a test case management unit, a parallel test unit and a test report generation unit; the test case management unit realizes the selection, execution and result recording of the test cases through an automatic test script; the automatic test script manages and executes the test cases through the test framework; the parallel test unit realizes parallel execution of a plurality of test tasks through a multi-process standard library so as to improve the test efficiency; according to the test result, the test report generating unit generates a test report through a logic test framework so as to evaluate the quality and stability of the system; the result analysis module comprises a result collection unit, a result comparison unit and an error positioning unit; the result collection unit collects data and logs generated in the test execution process through a log recorder, and at least comprises equipment states, performance indexes and error information; the result comparison unit compares the actual result with the expected result through an interface automatic test frame, and detects the behavior operation condition of the system so as to find potential problems and errors; the error positioning unit analyzes and positions error information generated in the test process through a log analysis method; the data display module generates a data visualization interface of the test progress and the result through a data visualization method so that a tester can know the test progress and the result in real time.
8. The automated testing method for reducing the testing technical threshold of the intelligent terminal system according to claim 1, wherein the method comprises the following steps: the anomaly detection algorithm evaluates the deviation degree between the observed value and the average value through an anomaly score function based on a Gaussian mixture model, and normalizes the severity of the anomaly condition through standard deviation; the formula expression is:
D = -(1/N) · ln( Σ_{i=1..k} u_i · 𝒩(x; μ_i, σ_i²) )    (4)

In formula (4), D is the anomaly score indicating the degree of anomaly of the observed value x; u_i represents the weight of the i-th Gaussian distribution and determines the contribution of each distribution in the whole model; 𝒩(x; μ_i, σ_i²) represents the i-th Gaussian distribution; N represents the root mean square of the observed values and is used to measure their overall fluctuation; the anomaly score is compared with a preset threshold through a threshold judgment formula to decide whether the system is in an abnormal state, and an abnormal condition is reported if the anomaly score exceeds the threshold; the formula expression is:

A(t) = 1 if (α · θ / ω) · Σ_{τ = t-ω+1..t} D(τ) > η, otherwise A(t) = 0    (5)

In formula (5), A(t) denotes the abnormal-state decision at time t; θ represents the skewness and describes the degree of skewness of the observed-value distribution; ω represents the time window and controls the time granularity of the anomaly detection algorithm; η represents the anomaly score threshold used to decide an abnormal condition; α represents the correlation coefficient used to analyze the correlation between observed values.
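The exact expressions (4) and (5) appear only as images in the original filing, so the sketch below uses an assumed form consistent with the surrounding description: the negative log-likelihood of a Gaussian mixture normalized by the root mean square of the observations, compared against a threshold η. The windowing, skewness and correlation terms of formula (5) are omitted here for brevity, and the mixture parameters, eta and the sample values are illustrative.

```python
# Assumed form of the claim-8 anomaly check: D = -log(sum_i u_i * N(x; mu_i, sigma_i^2)) / N,
# with an anomaly reported when D exceeds a preset threshold eta.
import numpy as np


def gaussian_pdf(x: float, mu: float, sigma: float) -> float:
    """Density of a single one-dimensional Gaussian component."""
    return float(np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi)))


def anomaly_score(x: float, weights, mus, sigmas, observations) -> float:
    """Anomaly score: negative log mixture likelihood normalized by the RMS of the observations."""
    likelihood = sum(u * gaussian_pdf(x, mu, s) for u, mu, s in zip(weights, mus, sigmas))
    rms = float(np.sqrt(np.mean(np.square(observations))))  # N: overall fluctuation of the data
    return float(-np.log(max(likelihood, 1e-12)) / rms)


def is_abnormal(x: float, weights, mus, sigmas, observations, eta: float) -> bool:
    """Threshold judgment: report an abnormal condition when the score exceeds eta."""
    return anomaly_score(x, weights, mus, sigmas, observations) > eta


if __name__ == "__main__":
    obs = np.array([0.9, 1.1, 1.0, 0.95, 1.05])                  # recent observed values
    weights, mus, sigmas = [0.6, 0.4], [1.0, 1.2], [0.1, 0.2]    # fitted mixture (assumed)
    print(is_abnormal(3.0, weights, mus, sigmas, obs, eta=5.0))  # far from the mixture -> True
```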
9. The automated testing method for reducing the testing technical threshold of the intelligent terminal system according to claim 1, characterized in that: the distributed execution framework comprises a task scheduling module, a test case management module, a resource management module, a data management module and a log management module; the task scheduling module comprises a task queue unit, a task ordering unit and a task allocation unit; the task queue unit realizes asynchronous scheduling and distribution of tasks through a message queue method so as to improve task execution efficiency; the task ordering unit schedules the tasks in the task queue through shortest-job-first, round-robin and priority ordering so as to ensure orderly execution of the tasks; the task allocation unit allocates tasks to the available execution nodes through round-robin, random and weighted round-robin methods so as to realize parallel execution of the tasks; the test case management module comprises a case management unit, a case execution unit and a case monitoring unit; the case management unit provides functions for creating, editing and managing test cases through an application program interface so that users can define and maintain test cases; the case execution unit sends test cases to the execution nodes through a remote calling method and obtains the execution results; the case monitoring unit monitors the state and execution progress of the execution nodes in real time through a test monitor and displays them through a visualization method so as to track the execution of the test cases; the resource management module comprises a resource maintenance unit, a resource scheduling unit and a real-time monitoring unit; the resource maintenance unit manages and maintains the execution node resources through a resource pool so as to ensure that test tasks have sufficient resource support; the resource scheduling unit allocates tasks to the execution nodes through minimum-remaining-resource-first and weighted scheduling methods so as to realize dynamic allocation of resources and load balancing; the real-time monitoring unit monitors the resource usage of the execution nodes through a resource monitor, the resource usage at least comprising CPU utilization and memory occupation, so as to provide statistics and alarms on resource usage; the data management module comprises a data generation unit, a data storage unit and a data analysis unit; the data generation unit generates test data through random number generation and template filling methods so as to meet the requirements of different test scenarios; the data storage unit stores and manages the test data through a database and a distributed storage system so as to facilitate subsequent data analysis and backtracking; the data analysis unit processes and analyzes the test data through a big data analysis method to extract key indexes and patterns and to provide support and reference for the test results; the log management module comprises a log collection unit, a log storage unit and a log analysis unit; the log collection unit collects the log information generated by the execution nodes through a logger, the log information at least comprising error logs and debug logs; the log storage unit stores and manages the log information of the execution nodes through a log database so as to facilitate subsequent query and analysis; the log analysis unit analyzes and visually displays the logs of the execution nodes in real time through a search and analysis technology stack so as to help users locate problems and anomalies.
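A minimal sketch of one scheduling policy named in claim 9, "minimum remaining resource priority", interpreted here as best fit: a task is placed on the node with the least spare capacity that can still hold it, with the node weight as a tie-breaker. This interpretation, the Node and Task fields, and the capacity units are assumptions made for illustration, not the patent's framework.

```python
# Best-fit allocation of test tasks to execution nodes, with node weight as tie-breaker.
from dataclasses import dataclass


@dataclass
class Node:
    name: str
    capacity: int        # abstract resource units (e.g. parallel test slots)
    weight: int = 1      # scheduling weight of the execution node
    used: int = 0

    @property
    def remaining(self) -> int:
        return self.capacity - self.used


@dataclass
class Task:
    name: str
    demand: int          # resource units required by the test task


def allocate(task: Task, nodes: list[Node]) -> Node | None:
    """Minimum-remaining-resource-first (best fit); returns None when no node currently fits."""
    candidates = [n for n in nodes if n.remaining >= task.demand]
    if not candidates:
        return None      # no node can run the task right now; it would be requeued
    chosen = min(candidates, key=lambda n: (n.remaining, -n.weight))
    chosen.used += task.demand
    return chosen


if __name__ == "__main__":
    nodes = [Node("node-a", capacity=8, weight=2), Node("node-b", capacity=4, weight=1)]
    for t in [Task("ui_suite", 3), Task("perf_suite", 4), Task("api_suite", 2)]:
        target = allocate(t, nodes)
        print(t.name, "->", target.name if target else "queued")
```

A weighted round-robin or random policy, also named in the claim, could be swapped in by replacing the min() selection with a rotating or randomized choice over the candidate nodes.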
CN202311580629.3A 2023-11-23 2023-11-23 Automatic test method for reducing test technical threshold of intelligent terminal system Pending CN117931620A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311580629.3A CN117931620A (en) 2023-11-23 2023-11-23 Automatic test method for reducing test technical threshold of intelligent terminal system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311580629.3A CN117931620A (en) 2023-11-23 2023-11-23 Automatic test method for reducing test technical threshold of intelligent terminal system

Publications (1)

Publication Number Publication Date
CN117931620A true CN117931620A (en) 2024-04-26

Family

ID=90752594

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311580629.3A Pending CN117931620A (en) 2023-11-23 2023-11-23 Automatic test method for reducing test technical threshold of intelligent terminal system

Country Status (1)

Country Link
CN (1) CN117931620A (en)

Similar Documents

Publication Publication Date Title
US11281570B2 (en) Software testing method, system, apparatus, device medium, and computer program product
CN103150249B (en) A kind of method and system of automatic test
US20210279577A1 (en) Testing of Computing Processes Using Artificial Intelligence
US10572360B2 (en) Functional behaviour test system and method
US7895565B1 (en) Integrated system and method for validating the functionality and performance of software applications
CN110309071B (en) Test code generation method and module, and test method and system
Memon Automatically repairing event sequence-based GUI test suites for regression testing
US10310968B2 (en) Developing software project plans based on developer sensitivity ratings detected from monitoring developer error patterns
CN109656820B (en) Intelligent automatic test system based on CBTC
CN106227654B (en) A kind of test platform
CN109933521A (en) Automated testing method, device, computer equipment and storage medium based on BDD
CN113590454A (en) Test method, test device, computer equipment and storage medium
CN115658529A (en) Automatic testing method for user page and related equipment
CN114968272A (en) Algorithm operation method, device, equipment and storage medium
Sumalatha et al. Uml based automated test case generation technique using activity-sequence diagram
CN116643950B (en) FaaS-based cloud native application automatic operation and maintenance method
CN113626326A (en) Dragging type zero code front end automatic test system based on image recognition
CN117493188A (en) Interface testing method and device, electronic equipment and storage medium
CN112416336A (en) Software architecture design method for aerospace embedded system
Mirza et al. ContextDrive: Towards a functional scenario-based testing framework for context-aware applications
US20220206773A1 (en) Systems and methods for building and deploying machine learning applications
CN117931620A (en) Automatic test method for reducing test technical threshold of intelligent terminal system
Canny et al. Engineering model-based software testing of WIMP interactive applications: a process based on formal models and the SQUAMATA tool
Püschel et al. Testing self-adaptive software: requirement analysis and solution scheme
Xie et al. Design and implementation of bank financial business automation testing framework based on QTP

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination