CN114398283A - Automatic testing method and device for user interface, electronic equipment and storage medium - Google Patents

Automatic testing method and device for user interface, electronic equipment and storage medium

Info

Publication number
CN114398283A
CN114398283A (application CN202210057030.0A)
Authority
CN
China
Prior art keywords
test
test case
execution process
interface
process information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210057030.0A
Other languages
Chinese (zh)
Inventor
向旗
蔡飞
祝文兵
王国强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Health Insurance Company of China Ltd
Original Assignee
Ping An Health Insurance Company of China Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Health Insurance Company of China Ltd filed Critical Ping An Health Insurance Company of China Ltd
Priority to CN202210057030.0A priority Critical patent/CN114398283A/en
Publication of CN114398283A publication Critical patent/CN114398283A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/36 - Preventing errors by testing or debugging software
    • G06F 11/3668 - Software testing
    • G06F 11/3672 - Test management
    • G06F 11/3688 - Test management for test execution, e.g. scheduling of test suites
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/36 - Preventing errors by testing or debugging software
    • G06F 11/3668 - Software testing
    • G06F 11/3672 - Test management
    • G06F 11/3684 - Test management for test design, e.g. generating new test cases
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 - Arrangements for executing specific programs
    • G06F 9/451 - Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Human Computer Interaction (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The application discloses a method and an apparatus for automatically testing a user interface, an electronic device, and a storage medium. The method comprises the following steps: recording and storing the execution process information of a manually run source test case of the interface to be tested; running a test plug-in, and reading the source test case and its execution process information; parsing the read source test case and its execution process information, and intelligently extending the test cases according to the parsing result; generating a test script based on the derived test cases obtained by the extension; and executing the test script to obtain an automated test result for the user interface. Because the data required for the automated test is prepared by intelligent means while the cases are executed, the workload and cost of writing test cases are greatly reduced; when the function of the interface to be tested changes, the test cases are automatically associated and extended, so no costly maintenance of the test cases is needed. The application offers high testing efficiency, low testing cost, and strong practicability.

Description

Automatic testing method and device for user interface, electronic equipment and storage medium
Technical Field
The application relates to the technical field of software testing, in particular to an automatic testing method and device for a user interface, electronic equipment and a storage medium.
Background
Automated testing is the process of converting manually driven page testing into machine execution, and it can greatly reduce the cost of regression testing. When a function with a large local impact is adjusted, or the system is refactored and optimized as a whole, a full regression test of the entire system is required; automated testing saves substantial cost in these scenarios and helps guarantee software quality during iterative releases.
In the industry's existing automated testing flow for the user interface (UI), test cases are first evaluated and the whole test case flow is run manually; after the whole test passes, the function goes online, and the manually run test cases are then written into an automated test script. If the software function changes, these operations must be repeated: the test cases are run manually again, and then the script is maintained. The engineer who writes the script and the engineer who manually runs the test cases are usually not the same person, which greatly increases labor cost and requires both to be familiar with the large number of elements involved in each case, causing a great deal of repeated work. Some tests require a large number of cases to be written, even random combinations of data across hundreds of dimensions; writing such scripts manually is expensive, and when the function changes, the script maintenance cost is high.
Disclosure of Invention
In view of the foregoing problems, embodiments of the present application provide a method and an apparatus for automatically testing a user interface, an electronic device, and a storage medium, so as to overcome or at least partially overcome the disadvantages of the prior art.
In a first aspect, an embodiment of the present application provides an automated testing method for a user interface, including:
recording and storing the execution process information of the source test case of the interface to be tested which is manually operated;
running a test plug-in, and reading the source test case and the execution process information thereof;
analyzing the read source test case and the execution process information thereof, and performing intelligent expansion of the test case according to the analysis result;
generating a test script based on the derived test case obtained by the expansion;
and executing the test script to obtain an automatic test result of the user interface.
In a second aspect, an embodiment of the present application further provides an apparatus for automatically testing a user interface, where the apparatus includes:
the recording unit is used for recording and storing the execution process information of the source test case of the interface to be tested which is manually operated;
the reading unit is used for operating the test plug-in and reading the source test case and the execution process information thereof;
the extension unit is used for parsing the read source test case and the execution process information thereof, and intelligently extending the test cases according to the parsing result;
the scripted unit is used for generating a test script based on the derived test case obtained by expansion;
and the execution unit is used for executing the test script to obtain an automatic test result of the user interface.
In a third aspect, an embodiment of the present application further provides an electronic device, including: a processor; and a memory storing computer-executable instructions that, when executed, cause the processor to perform any of the methods described above.
In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium storing one or more computer programs that, when executed by an electronic device including a plurality of application programs, cause the electronic device to perform any of the methods described above.
The embodiment of the application adopts at least one technical scheme which can achieve the following beneficial effects:
according to the method and the device, the execution process information of the manually executed test cases is recorded, the recorded test cases and the execution process information of the test cases are read and analyzed during automatic testing, more test cases are intelligently expanded according to the analysis result, the test script is generated according to the expanded derived test cases, and finally, the test script is executed, so that the automatic test result of the interface to be tested can be obtained. The method and the system can automatically record the test engineer, execute the whole process of the test case, and facilitate the test engineer to track and record the test process; when the case is executed, the related data of the automatic test is prepared by an intelligent means, and the workload and the cost of compiling the test case are greatly reduced; when the function of the interface to be tested is changed, the test case can be automatically associated and expanded, and the test case does not need to be maintained at high cost; aiming at a test project, the condition of a manual execution case and the condition of automatic test operation can be obtained for comparison, and a comparison result between manual and automatic is formed so as to optimize the test process; and this application efficiency of software testing is high, the test cost is low, and the practicality is strong.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 shows a flow diagram of a method for automated testing of a user interface according to an embodiment of the present application;
FIG. 2 shows a flow diagram of a method for automated testing of a user interface according to another embodiment of the present application;
FIG. 3 illustrates a schematic structural diagram of an automated testing apparatus for a user interface according to one embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
In the industry's existing automated testing flow for the user interface (UI), test cases are first evaluated and the whole test case flow is run manually; after the whole test passes, the function goes online, and the manually run test cases are then written into an automated test script. If the software function changes, these operations must be repeated: the test cases are run manually again, and then the script is maintained. The engineer who writes the script and the engineer who manually runs the test cases are usually not the same person, which greatly increases labor cost and requires both to be familiar with the large number of elements involved in each case, causing a great deal of repeated work. Some tests require a large number of cases to be written, even random combinations of data across hundreds of dimensions; writing such scripts manually is expensive, and when the function changes, the script maintenance cost is high.
To solve the above problems, the present application provides an automated testing method for a user interface: the execution process information of manually executed test cases is recorded; during automated testing, the recorded test cases and their execution process information are read and parsed; more test cases are intelligently extended according to the parsing result; a test script is generated from the extended derived test cases; and finally the test script is executed to obtain the automated test result of the interface to be tested.
Fig. 1 is a flowchart illustrating an automated testing method for a user interface according to an embodiment of the present application, and as can be seen from fig. 1, the present application at least includes steps S110 to S150:
step S110: and recording and storing the execution process information of the source test case of the interface to be tested which is manually operated.
The software passes an acceptance test before the version is on-line to ensure the quality. In the conventional technology, the test of the user interface is performed manually, in order to save labor cost and time cost, the automated test of the user interface is called as a research hotspot, and the automated test is a process of converting a manually-driven page test behavior into machine execution.
The existing automatic test flow is mainly divided into two steps, firstly, a test case is programmed manually and is used for test case evaluation, and the test case flow is completed manually; and then compiling the manually run test cases into an automatic test script, and then carrying out correlation comparison on the test cases and the script to realize the aim of executing the automatic test based on the test cases. When the functions of the software are changed, the operations need to be repeated, namely, the test case is manually run out firstly, then the script is maintained, and a script engineer and an engineer who manually runs the test case are not usually a group of workers, the process can cause that the labor cost is greatly increased, the workers who maintain the script need to be familiar with elements involved in the whole case, so that the repeated working cost is caused, if a larger number of cases are needed to be compiled, data of hundreds of dimensions are randomly combined, and the economic cost and the time cost for manually compiling the script are high. Especially when a new version is released, the test script maintenance costs may increase significantly.
Therefore, the method can automatically record the execution process information of the test case of the interface to be tested in manual operation, intelligent expansion of the test case is carried out in the process of manually operating the test case, automatic generation of the test script is achieved, automatic testing of the user interface is achieved, and the method is high in testing efficiency, low in testing cost and high in practicability.
Firstly, manually running a small number of test cases for an interface to be tested, recording the small number of test use amounts as source test cases in the application, and recording the execution process information of the test cases in the manual operation process when the source test cases are manually run, wherein the execution process information comprises the operation process and the result of the test cases; and storing the test case and the execution process information thereof correspondingly. In some embodiments of the present application, these data may be recorded in a purposely configured database for later step read calls.
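As a minimal sketch (not the patent's actual implementation), the recording described above can be modeled as appending structured steps to a source-case record and serializing the whole run for storage; all class, field, and identifier names here are illustrative assumptions.

```python
from dataclasses import dataclass, field, asdict
from typing import Any, List

@dataclass
class RecordedStep:
    action: str          # e.g. "focus", "input", "click", "response"
    element: str = ""    # path/selector of the element the action targets
    value: Any = None    # input content, or data returned by the back end

@dataclass
class SourceTestCase:
    case_id: str
    interface: str
    steps: List[RecordedStep] = field(default_factory=list)

    def record(self, action: str, element: str = "", value: Any = None) -> None:
        """Append one manual operation to the execution-process record."""
        self.steps.append(RecordedStep(action, element, value))

    def to_record(self) -> dict:
        """Serializable form, suitable for storing in the test database."""
        return {"case_id": self.case_id,
                "interface": self.interface,
                "steps": [asdict(s) for s in self.steps]}

# Record two manual operations on a hypothetical login interface.
case = SourceTestCase("tc-login-001", "login")
case.record("focus", "#username")
case.record("input", "#username", "admin1")
stored = case.to_record()
```

Storing the serialized form in correspondence with the case id mirrors the "stored in correspondence with each other" requirement above.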
Step S120: run the test plug-in, and read the source test case and its execution process information.
When the automated test is performed on the interface to be tested, in response to an automated test instruction, the source test case of the interface to be tested and its execution process information are read.
A test case can be understood as a natural-language description of a specific test operation on the software or interface to be tested, and can be written from the user's perspective by a tester with rich business background knowledge.
The execution process information of a test case mainly refers to the recorded manual operations performed while the test case was executed, such as clicking an element to trigger an event or acquire focus, obtaining the path information of the element, entering content into an input box, and the data returned by the back-end interface triggered by the event.
During reading, both the test case and its execution process information are read; the source test case of the existing interface to be tested and its execution process information can be read from a designated area, such as a database.
The test plug-in is prior art: a software test plug-in, such as a Google Chrome plug-in, which can be obtained by visiting the following web site: http://code.google.com/chrome/extensions/getsorted.html.
Step S130: parse the read source test case and its execution process information, and intelligently extend the test cases according to the parsing result.
The read source test case and its execution process information are parsed to obtain the specific content of the source test case and of the execution process recorded during the manual run. This step forms a template for use in the subsequent intelligent extension of the source test case.
Test cases are usually written in a fixed format, so parsing can proceed in reverse: for example, a parsing template is generated according to the writing format of the test case, and the test case is recognized through the parsing template to obtain the target information.
Because testing one piece of software or one interface requires many test cases, while only a small number of cases are run during the manual stage, the present application intelligently extends the cases on the basis of parsing the source test cases and their execution process information to obtain more test cases. This can be understood as using the source test case as a template and extending more test cases from the template's information.
For example, a project usually requires more than six test cases; in the present application, only one needs to be run during the manual stage, and the rest are extended intelligently.
In the intelligent extension process, the required information can be supplemented automatically and randomly according to the parsed result of the source test case.
Step S140: generate a test script based on the derived test cases obtained by the extension.
After multiple derived test cases are obtained, the test script can be generated automatically. The software or interface to be tested contains many items to be tested; the derived test cases can be grouped by item, and the execution process of each test case is generated to produce the test script.
Script generation can reference the way the Selenium IDE records test scripts. The Selenium IDE (integrated development environment) is an open-source Web automation testing tool in the Selenium suite. Unlike Selenium WebDriver and RC, it requires no programming logic to write its test scripts: it simply records interactions with the browser to create test cases, which can then be rerun using the playback option.
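As one hedged illustration of scripting (the patent does not prescribe a target language or API), recorded steps could be rendered into the text of a Selenium WebDriver script. The step-dictionary shape and the helper name `to_script` are assumptions; the generated calls use the Selenium 4 `find_element(By.CSS_SELECTOR, ...)` style.

```python
def to_script(steps) -> str:
    """Render recorded steps as the source text of a Selenium WebDriver
    script; actions this sketch does not know are simply skipped."""
    lines = ["from selenium import webdriver",
             "from selenium.webdriver.common.by import By",
             "driver = webdriver.Chrome()"]
    for s in steps:
        sel = s.get("element", "")
        if s["action"] == "input":
            lines.append(f'driver.find_element(By.CSS_SELECTOR, {sel!r})'
                         f'.send_keys({s["value"]!r})')
        elif s["action"] == "click":
            lines.append(f'driver.find_element(By.CSS_SELECTOR, {sel!r}).click()')
    return "\n".join(lines)

script = to_script([
    {"action": "input", "element": "#username", "value": "admin1"},
    {"action": "click", "element": "button.sign-button"},
])
```

The result is plain text, so it can be stored per derived case and replayed later, much like a Selenium IDE recording.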
Step S150: execute the test script to obtain the automated test result of the user interface.
After the test script is obtained, it can be executed. Specifically, the derived test cases obtained through extension are stored in one-to-one correspondence with their execution results. After all the test cases have been executed, the automated test result of the user interface is obtained: if the test results of all of a project's test cases pass, the automated test result of the user interface is determined to be a pass; if the test result of at least one test case of a project does not pass, the project needs to be rechecked.
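The pass/recheck rule above can be sketched as a small aggregation, assuming (hypothetically) that each case's result is stored as a boolean keyed by case id:

```python
def project_result(case_results: dict) -> str:
    """A project's result is 'pass' only if every one of its test cases
    passed; otherwise the project must be rechecked."""
    return "pass" if all(case_results.values()) else "recheck"

def interface_result(per_project: dict) -> str:
    """The interface's automated test result passes only if every project
    under it passes."""
    results = {p: project_result(r) for p, r in per_project.items()}
    return "pass" if all(v == "pass" for v in results.values()) else "recheck"
```

This keeps the case-to-result correspondence explicit, which is what allows a failing project to be singled out for recheck.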
It can be seen from the method shown in fig. 1 that, in the present application, the execution process information of manually executed test cases is recorded; during automated testing, the recorded test cases and their execution process information are read and parsed, more test cases are intelligently extended according to the parsing result, a test script is generated from the extended derived test cases, and finally the test script is executed to obtain the automated test result of the interface to be tested. The application can automatically record the whole process in which a test engineer executes a test case, making it convenient for the engineer to track and review the testing process. Because the data required for automated testing is prepared by intelligent means while the cases are executed, the workload and cost of writing test cases are greatly reduced. When the function of the interface to be tested changes, the test cases are automatically associated and extended, avoiding costly case maintenance. For a test project, the manually executed cases and the automated test runs can be compared to form a manual-versus-automated comparison that helps optimize the testing process. The application offers high testing efficiency, low testing cost, and strong practicability.
In some embodiments of the present application, recording and storing the execution process information of the manually run test case includes: determining the manual operation information for each interface component in the interface to be tested, and recording and storing that information. The manual operation information includes, but is not limited to, the operation form of each component, the operation sequence, and the information filled into each component.
The interface to be tested is composed of multiple interface components, such as texts and images, which carry fill-in items, selectable items, and the like. During manual testing, a test engineer adds the information in the test case to the interface to be tested, and the present application records and stores the execution process information of the manually run test case. This information includes both the content specifically added to the interface and the operations performed by the user; for example, while recording a manual test, clicking an element to trigger an event or acquire focus, the xpath information of the element, the content entered into an input box, and the data returned by the back-end interface triggered by the event can all be recorded.
Take the item to be tested as login verification: the test engineer fills in elements such as a user name and a password in the interface to be tested, and this information is recorded. The recorded information includes, but is not limited to: 1. the user-name element (xpath #username) acquires focus; 2. the content admin1 is entered; 3. the password element (xpath #password) acquires focus; 4. the content admin2 is entered; 5. the login button element (xpath button.sign-button) is clicked; 6. the back-end interface returns the content {data: {}, status: success}. In this way, when a test case is executed, the whole process is recorded.
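The six recorded operations above could be encoded, for example, as a list of step dictionaries and serialized to JSON for storage; the field names ("action", "element", "value") are illustrative, not mandated by the patent.

```python
import json

login_trace = [
    {"action": "focus",    "element": "#username"},
    {"action": "input",    "element": "#username", "value": "admin1"},
    {"action": "focus",    "element": "#password"},
    {"action": "input",    "element": "#password", "value": "admin2"},
    {"action": "click",    "element": "button.sign-button"},
    {"action": "response", "value": {"data": {}, "status": "success"}},
]

# Serialize for storage in the database, then restore for later parsing.
stored_trace = json.dumps(login_trace)
restored = json.loads(stored_trace)
```

Keeping the trace order intact preserves the operation sequence, which later steps rely on when replaying or extending the case.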
In some embodiments of the present application, for convenience of subsequent parsing, these data may be stored according to a preset format or data structure.
In some embodiments of the present application, running the test plug-in and reading the source test case and its execution process information includes: determining a plurality of test items of the interface to be tested, searching the database according to the test items to obtain at least one source test case corresponding to the test items, and reading its execution process information; or determining the identity information of the interface to be tested, searching the database according to the identity information to obtain at least one matching source test case, and reading its execution process information.
That is, when the source test case and its execution process information are stored in the database, they can be obtained by searching the database.
A test case may be written for a project within the interface to be tested, or for the interface to be tested as a whole. In the former case, the database can be searched by the project information of the interface to be tested, such as its name or identity code, to determine the source test case and its execution process result; in the latter case, the source test case and its execution process result can be found directly in the database by the identity information of the interface to be tested, such as the software name or identity code.
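The two lookup paths (by test item, or by interface identity) can be sketched against an in-memory database; the table schema and column names here are assumptions for illustration, not the patent's storage design.

```python
import sqlite3

# In-memory stand-in for the test database.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE source_cases
                (case_id TEXT, interface_id TEXT, item TEXT, trace TEXT)""")
conn.executemany("INSERT INTO source_cases VALUES (?, ?, ?, ?)", [
    ("tc-1", "login-ui", "verify-login", "[]"),
    ("tc-2", "login-ui", "verify-logout", "[]"),
])

def cases_for_item(item: str):
    """Look up source cases written for one test item (project)."""
    return conn.execute(
        "SELECT case_id, trace FROM source_cases WHERE item = ?",
        (item,)).fetchall()

def cases_for_interface(interface_id: str):
    """Look up source cases written for the interface as a whole."""
    return conn.execute(
        "SELECT case_id, trace FROM source_cases WHERE interface_id = ?",
        (interface_id,)).fetchall()
```

Either function returns the case id together with its stored execution trace, ready for the parsing step.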
In some embodiments of the present application, the parsing the read source test case and the execution process information thereof includes: and analyzing the execution process information of the source test case according to a preset format to obtain a plurality of elements and corresponding element attribute values.
Still taking the above login verification as an example, the plurality of elements include a user-name element and a password element; the attribute value corresponding to the user-name element is the input content admin1, and the attribute value corresponding to the password element is the input content admin2.
When a test case and its manual run record are stored, they are usually stored in a preset format or data structure. When parsing the read source test case and its execution process information, the same preset format used during storage can be applied in reverse to parse the execution process information, yielding the elements of the test case and the attribute value corresponding to each element.
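Applying the storage format in reverse can be as simple as filtering the stored trace for the steps that carry attribute values; this sketch assumes the illustrative step-dictionary shape used earlier, where input operations hold the element's attribute value.

```python
def parse_elements(trace):
    """Recover {element: attribute value} from a stored execution trace by
    filtering its 'input' steps -- the reverse of the recording format."""
    return {step["element"]: step["value"]
            for step in trace if step["action"] == "input"}

trace = [
    {"action": "focus", "element": "#username"},
    {"action": "input", "element": "#username", "value": "admin1"},
    {"action": "input", "element": "#password", "value": "admin2"},
]
elements = parse_elements(trace)
```

The resulting mapping is exactly the template the extension step needs: the case's elements plus one example attribute value per element.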
Still taking the aforementioned login-verification item as an example, the storage format may be, but is not limited to, the form of Table 1.
Table 1: [rendered as images in the original publication; per the surrounding description, it lists each element of the test case together with the operation performed on it and the corresponding attribute value]
note: table 1 is merely an exemplary illustration, and in practice, it is generally stored in a data structure similar to a programming syntax.
Table 1 records both the test cases and the manual execution process of the test cases, and when one test case is executed, the whole process is recorded.
Of course, the description and storage of test cases are not limited to the manner of Table 1; any description manner can be adopted. However, once a description manner is selected, all descriptions in the test cases follow it. As Table 1 shows, test cases are written and stored according to a preset format, and even non-computer professionals can understand the description language. Because storage follows the preset format, parsing can filter the information in the table in reverse; for example, using the table header of the Table 1 format, the elements of the test case and the attribute value corresponding to each element are obtained, and, if desired, the operations the user performed on each element can also be filtered out.
In some embodiments of the present application, intelligently extending the test case according to the parsing result includes:
randomly generating multiple groups of element attribute values according to the obtained elements and the format of the attribute value corresponding to each element; and writing each group of element attribute values back into the elements of the source test case to obtain derived test cases.
In the process of intelligently extending the test case, it is first determined, according to the above parsing method, how many elements a test case contains; the login-verification process, for example, contains two elements, a user name and a password.
Multiple groups of element attribute values are then generated randomly, each group containing a value for every element of the test case. When generating them, whether an element has a special format requirement is determined from the attribute value corresponding to that element: if not, the value can be generated completely at random; if so, it is generated according to the special format.
Finally, each attribute value in each group is written back as the attribute value of the corresponding element of the test case, yielding a derived test case.
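The generate-and-write-back loop above can be sketched as follows; matching the template value's length stands in, very crudely, for the patent's "special format" check, and all function names are assumptions.

```python
import random
import string

def random_value(template: str, rng: random.Random) -> str:
    """Generate a value with the same length as the template value -- a
    crude stand-in for honoring a special format requirement."""
    alphabet = string.ascii_letters + string.digits
    return "".join(rng.choice(alphabet) for _ in range(len(template)))

def derive_cases(source_elements: dict, n: int, seed: int = 0) -> list:
    """Extend one parsed source case into n derived cases: the same
    elements, with randomly generated attribute values written back."""
    rng = random.Random(seed)
    return [{el: random_value(val, rng) for el, val in source_elements.items()}
            for _ in range(n)]

derived = derive_cases({"#username": "admin1", "#password": "admin2"}, 100)
```

Seeding the generator makes a given expansion reproducible, which helps when a failing derived case must be rerun.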
The input content can be randomly generated and freely combined; for example, one group of test cases has the user name admin and the password 123456, another group has the user name 123456 and the password admin, and so on. Writing these data back into the test case allows hundreds of similar test cases to be extended.
In some embodiments of the present application, the method further comprises: displaying the running status of the test script on a visual interface for the user to view.
During testing, a test engineer sometimes needs to check how each test case is running, and the running status of the automated test can then be viewed through a visual interface. For example, a dashboard, short for a business intelligence dashboard (BI dashboard), is a module commonly included in business intelligence products to realize data visualization; it is a tool that displays measurement information and key performance indicator (KPI) status to an enterprise, and is well suited to automated software testing.
In some embodiments of the present application, before the step of generating a test script based on the derived test cases obtained by the extension, the method further includes: acquiring the number of derived test cases that have been generated; and responding to a quantity intervention instruction by adjusting the number of derived test cases, wherein the intervention instruction is determined manually according to existing test results.
That is, manual optimization may also be performed throughout the automated testing process. The test engineer may manually intervene in the number of test cases according to the number of derived test cases already generated and the test results of the cases that have already run. When a manually issued quantity intervention instruction is received, a reasonable number of test cases is calculated from the instruction, and the number of test cases is increased or decreased accordingly in subsequent testing.
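A minimal sketch of applying such a quantity intervention instruction, assuming a simple dict-shaped instruction issued by the engineer (the field names here are illustrative, not from the patent):

```python
def adjust_case_count(current_count, instruction):
    """Apply a manual quantity-intervention instruction to the derived-case count.

    `instruction` is an assumed shape such as {"op": "increase", "amount": 50},
    issued by the test engineer after reviewing existing test results.
    """
    if instruction["op"] == "increase":
        return current_count + instruction["amount"]
    if instruction["op"] == "decrease":
        return max(0, current_count - instruction["amount"])  # never go negative
    return current_count

print(adjust_case_count(100, {"op": "decrease", "amount": 30}))  # → 70
```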
Fig. 2 is a schematic flow chart of an automated testing method for a user interface according to another embodiment of the present application. As can be seen from Fig. 2, the present embodiment includes:
Running the Google Chrome browser and the browser test plug-in.
Recording the execution process information of a manually run source test case of the interface to be tested, and storing it in a database.
Starting the automated test, and reading the source test case of the interface to be tested and its execution process information from the database.
Parsing the execution process information of the source test case to generate more derived test cases.
Responding to a quantity intervention instruction by adjusting the number of derived test cases.
Scripting the derived test cases, and executing the newly generated scripts to obtain an automated test result of the interface to be tested.
Storing the derived test cases and the test results in the database.
Calling data from the database and visually displaying the test results; in addition, manual intervention can be performed on the test cases stored in the database.
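The steps above can be sketched as one orchestration function. Every injected callable below is an assumed stand-in for a step of the embodiment; none of the names come from the patent itself:

```python
def run_automated_test(record, load, extend, to_script, execute, save, show):
    """Orchestrate the Fig. 2 flow: record → read → extend → script → run → store → show."""
    source = record()                      # record the manual run of the interface
    save("case", source)                   # persist source case + process info
    derived = extend(load(source))         # read back, parse, and intelligently extend
    results = [execute(to_script(c)) for c in derived]  # script and execute each case
    save("results", results)               # store results in the database
    show(results)                          # visualize on the dashboard
    return results
```

With simple stand-in callables (e.g. lambdas for each step), this runs end to end and returns one result per derived case.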
Fig. 3 is a schematic structural diagram of an automated testing apparatus for a user interface according to an embodiment of the present application, and as can be seen from fig. 3, the automated testing apparatus 300 for a user interface includes:
the recording unit 310 is configured to record and store execution process information of a source test case in which an interface to be tested is manually run;
a reading unit 320, configured to run a test plug-in, and read the source test case and the execution process information thereof;
the extension unit 330 is configured to parse the read source test case and the execution process information thereof, and to intelligently extend the test case according to the parsing result;
the scripting unit 340 is used for generating a test script based on the derived test case obtained by extension;
and the execution unit 350 is configured to execute the test script to obtain an automated test result of the user interface.
In some embodiments of the present application, in the above apparatus, the recording unit 310 is configured to determine manual operation information for each interface component in the interface to be tested, and to record and store the manual operation information, wherein the manual operation information includes the manual operation form and operation sequence for each component and the information filled into each component.
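As an illustration of the kind of record the recording unit might store (the field names and component names are assumptions, not the patent's format), each manual operation could be captured as one step with its operation form, filled-in value, and sequence order:

```python
# One recorded step per component operation; all field names are illustrative.
recorded_steps = [
    {"component": "username_input", "action": "fill",  "value": "admin",  "order": 1},
    {"component": "password_input", "action": "fill",  "value": "123456", "order": 2},
    {"component": "login_button",   "action": "click", "value": None,     "order": 3},
]

# The manual operation sequence is preserved by sorting on the recorded order.
replay_order = [s["component"] for s in sorted(recorded_steps, key=lambda s: s["order"])]
print(replay_order)  # → ['username_input', 'password_input', 'login_button']
```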
In some embodiments of the present application, in the above apparatus, the reading unit 320 is configured to determine a number of test items of the interface to be tested; searching in a database according to the test items to obtain at least one source test case corresponding to the test items, and reading the execution process information of the source test case; or, determining the identity information of the interface to be tested; and searching in a database according to the identity information to obtain at least one source test case consistent with the test item, and reading the execution process information of the source test case.
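A minimal sketch of the reading unit's database lookup, using an in-memory SQLite table as a stand-in; the schema, column names, and stored values are assumptions for illustration only:

```python
import sqlite3

# In-memory stand-in for the case database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE cases (id TEXT, test_item TEXT, process_info TEXT)")
db.execute("INSERT INTO cases VALUES ('tc-1', 'login', 'username=admin;password=123456')")

def find_source_cases(test_item):
    """Look up source cases and their execution process info by test item."""
    return db.execute(
        "SELECT id, process_info FROM cases WHERE test_item = ?", (test_item,)
    ).fetchall()

print(find_source_cases("login"))
```

A lookup keyed on the interface's identity information would work the same way, just filtering on a different column.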
In some embodiments of the present application, in the above apparatus, the extending unit 330 is configured to parse the execution process information of the source test case according to a preset format to obtain a plurality of elements and corresponding element attribute values.
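The patent does not disclose the preset format itself, so purely as an illustration, assume a simple `element=value;...` string; parsing it into elements and their corresponding attribute values could then look like:

```python
def parse_process_info(process_info):
    """Parse execution process info in an assumed 'element=value;...' preset format."""
    elements = {}
    for pair in process_info.strip(";").split(";"):
        name, _, value = pair.partition("=")  # split each pair on the first '='
        elements[name] = value
    return elements

parsed = parse_process_info("username=admin;password=123456")
print(parsed)  # → {'username': 'admin', 'password': '123456'}
```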
In some embodiments of the present application, in the above apparatus, the extension unit 330 is configured to randomly generate groups of multi-element attribute values, and to return the element attribute values in each group to the source test case to obtain derived test cases.
In some embodiments of the present application, the apparatus further comprises: and the display unit is used for displaying the running condition of the test script on a visual interface.
In some embodiments of the present application, the apparatus further comprises: a case quantity adjusting unit, configured to acquire the number of derived test cases that have been generated before the step of generating a test script based on the derived test cases obtained by the extension, and to respond to a quantity intervention instruction by adjusting the number of derived test cases, wherein the intervention instruction is determined manually according to existing test results.
It can be understood that the above-mentioned automatic testing apparatus for a user interface can implement each step of the automatic testing method for a user interface provided in the foregoing embodiments, and the related explanations about the automatic testing method for a user interface are applicable to the automatic testing apparatus for a user interface, and are not described herein again.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application. Referring to Fig. 4, at the hardware level, the electronic device includes a processor and, optionally, an internal bus, a network interface, and a memory. The memory may include internal memory, such as random-access memory (RAM), and may further include non-volatile memory, such as at least one disk memory. Of course, the electronic device may also include hardware required for other services.
The processor, the network interface, and the memory may be connected to each other via an internal bus, which may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one double-headed arrow is shown in FIG. 4, but that does not indicate only one bus or one type of bus.
And the memory is used for storing programs. In particular, the program may include program code comprising computer operating instructions. The memory may include both memory and non-volatile storage and provides instructions and data to the processor.
The processor reads the corresponding computer program from the nonvolatile memory into the memory and then runs the computer program to form the automatic test device of the user interface on the logic level. The processor is used for executing the program stored in the memory and is specifically used for executing the following operations:
recording and storing the execution process information of the source test case of the interface to be tested which is manually operated;
running a test plug-in, and reading the source test case and the execution process information thereof;
analyzing the read source test case and the execution process information thereof, and performing intelligent expansion of the test case according to the analysis result;
generating a test script based on the derived test case obtained by the expansion;
and executing the test script to obtain an automatic test result of the user interface.
The method performed by the automated testing apparatus for a user interface disclosed in the embodiment of Fig. 3 of the present application may be implemented in, or by, a processor. The processor may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor or by instructions in the form of software. The processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The various methods, steps, and logic blocks disclosed in the embodiments of the present application may thus be implemented or performed. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EEPROM, or a register. The storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the method in combination with its hardware.
The electronic device may further execute the method executed by the automatic testing apparatus for a user interface in fig. 3, and implement the functions of the automatic testing apparatus for a user interface in the embodiment shown in fig. 3, which are not described herein again in this embodiment of the present application.
An embodiment of the present application further provides a computer-readable storage medium storing one or more programs, where the one or more programs include instructions, which, when executed by an electronic device including a plurality of application programs, enable the electronic device to perform a method performed by an automated testing apparatus of a user interface in the embodiment shown in fig. 3, and are specifically configured to perform:
recording and storing the execution process information of the source test case of the interface to be tested which is manually operated;
running a test plug-in, and reading the source test case and the execution process information thereof;
analyzing the read source test case and the execution process information thereof, and performing intelligent expansion of the test case according to the analysis result;
generating a test script based on the derived test case obtained by the expansion;
and executing the test script to obtain an automatic test result of the user interface.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random-access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (10)

1. A method for automated testing of a user interface, comprising:
recording and storing the execution process information of the source test case of the interface to be tested which is manually operated;
running a test plug-in, and reading the source test case and the execution process information thereof;
analyzing the read source test case and the execution process information thereof, and performing intelligent expansion of the test case according to the analysis result;
generating a test script based on the derived test case obtained by the expansion;
and executing the test script to obtain an automatic test result of the user interface.
2. The method of claim 1, wherein the recording and storing execution process information of a manually run source test case comprises:
determining manual operation information of each interface component in the interface to be tested;
and recording and storing the manual operation information, wherein the manual operation information comprises the operation form and the operation sequence of each component manually and information filled in each component.
3. The method of claim 1, wherein the running of the test plug-in, reading the source test case and the execution procedure information thereof, comprises:
determining a plurality of test items of an interface to be tested; searching in a database according to the test items to obtain at least one source test case corresponding to the test items, and reading the execution process information of the source test case;
or,
determining identity information of an interface to be tested; and searching in a database according to the identity information to obtain at least one source test case consistent with the test item, and reading the execution process information of the source test case.
4. The method of claim 1, wherein analyzing the read source test case and the execution procedure information thereof comprises:
and analyzing the execution process information of the source test case according to a preset format to obtain a plurality of elements and corresponding element attribute values.
5. The method of claim 4, wherein the intelligently extending the test cases according to the parsing result comprises:
randomly generating multi-element attribute values according to the obtained multiple elements and the formats of the attribute values corresponding to the elements;
and returning the attribute values of the elements in each group to the elements of the source test case to obtain a derivative test case.
6. The method of claim 1, further comprising: and displaying the running condition of the test script on a visual interface.
7. The method according to claim 1, wherein before the step of generating a test script based on the derived test case obtained by the extension, the method further comprises:
acquiring the number of generated derived test cases;
and responding to a quantity intervention instruction, and adjusting the quantity of the derived test cases, wherein the adjustment instruction is artificially determined according to the existing test result.
8. An apparatus for automated testing of a user interface, the apparatus comprising:
the recording unit is used for recording and storing the execution process information of the source test case of the interface to be tested which is manually operated;
the reading unit is used for operating the test plug-in and reading the source test case and the execution process information thereof;
the extension unit is used for parsing the read source test case and the execution process information thereof, and for intelligently extending the test case according to the parsing result;
the scripting unit is used for generating a test script based on the derived test cases obtained by the extension;
and the execution unit is used for executing the test script to obtain an automatic test result of the user interface.
9. An electronic device, comprising:
a processor; and
a memory storing computer-executable instructions that, when executed, cause the processor to perform the method of any of claims 1-7.
10. A computer readable storage medium storing one or more computer programs which, when executed by an electronic device comprising a plurality of application programs, cause the electronic device to perform the method of any of claims 1-7.
CN202210057030.0A 2022-01-18 2022-01-18 Automatic testing method and device for user interface, electronic equipment and storage medium Pending CN114398283A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210057030.0A CN114398283A (en) 2022-01-18 2022-01-18 Automatic testing method and device for user interface, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN114398283A true CN114398283A (en) 2022-04-26

Family

ID=81230610

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210057030.0A Pending CN114398283A (en) 2022-01-18 2022-01-18 Automatic testing method and device for user interface, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114398283A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113468050A (en) * 2021-06-30 2021-10-01 杭州群核信息技术有限公司 Canvas-based testing method and device, computer equipment and storage medium


Similar Documents

Publication Publication Date Title
CN108319547B (en) Test case generation method, device and system
CN110309071B (en) Test code generation method and module, and test method and system
CN109614324B (en) Test case generation method and device
CN109542789B (en) Code coverage rate statistical method and device
CN104035863B (en) A kind of browser testing method and device
CN103049371A (en) Testing method and testing device of Android application programs
CN113051155A (en) Control system and control method of automatic test platform
CN108804305A (en) A kind of method and device of automatic test
CN112433948A (en) Simulation test system and method based on network data analysis
CN107480056B (en) Software testing method and device
CN114398283A (en) Automatic testing method and device for user interface, electronic equipment and storage medium
US8850407B2 (en) Test script generation
US9501390B1 (en) Enhancing automated mobile application testing
CN114490413A (en) Test data preparation method and device, storage medium and electronic equipment
EP4261692A1 (en) Self-learning application test automation
CN111198798B (en) Service stability measuring method and device
CN109857665B (en) Test execution method and device for test case
CN113672509A (en) Automatic testing method, device, testing platform and storage medium
CN109446091B (en) Business entity object testing method and device
CN107844484B (en) Method and device for identifying exposure code
CN118503139B (en) Automatic test method, equipment and medium for three-dimensional CAD system
CN118503117A (en) Selenium-based automatic test method, terminal equipment and storage medium
Chen et al. Research on Page Object Generation Approach for Web Application Testing.
CN118445192A (en) Application program testing method, device and medium
CN117827644A (en) Automatic test method and automatic test device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination