CN117762802A - Automatic test method, device, equipment and storage medium - Google Patents


Info

Publication number
CN117762802A
Authority
CN
China
Prior art keywords
target
case
test
test case
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311799563.7A
Other languages
Chinese (zh)
Inventor
袁雪
李广聚
谢继刚
韩建国
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China United Network Communications Group Co Ltd
Unicom Digital Technology Co Ltd
China Unicom Western Innovation Research Institute Co Ltd
Original Assignee
China United Network Communications Group Co Ltd
Unicom Digital Technology Co Ltd
China Unicom Western Innovation Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China United Network Communications Group Co Ltd, Unicom Digital Technology Co Ltd, China Unicom Western Innovation Research Institute Co Ltd filed Critical China United Network Communications Group Co Ltd
Priority to CN202311799563.7A
Publication of CN117762802A


Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Debugging And Monitoring (AREA)

Abstract

The application provides an automated testing method, apparatus, device, and storage medium. The method comprises the following steps: in response to an operation on the automated test platform, acquiring an identifier of a target test case; retrieving target data from a target case database table according to a preset correspondence and the identifier of the target test case, where the correspondence maps identifiers of test cases to data in the target case database table, and the target case database table comprises a keyword database sub-table, an element database sub-table, a case step database sub-table, and a case information database sub-table; constructing a target test case script from the target data; and obtaining a user interface automated test report from the target test case script. The method reduces the time and effort of manually repeating the same test steps, lowers the maintenance cost of user interface automated testing, and improves its efficiency.

Description

Automatic test method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to an automated testing method, apparatus, device, and storage medium.
Background
UI (User Interface) automated testing is a test method that simulates real user operations on page graphics, aiming to check whether a user interface program interacts normally and whether it has defects that hamper user behavior. With user interface automated testing, a user can check how software behaves on different platforms, browsers, and devices, and ensure that it works properly in a variety of environments.
Currently, UI automated testing is commonly performed by writing a test script whenever a test is needed, simulating user operations in an application or website, such as clicking buttons, inputting text, and verifying page elements.
However, performing user interface automated testing by writing test scripts on demand involves much repetitive work: different test frameworks and test scripts are developed for different systems, and testers spend a great deal of time maintaining the scripts they have written. UI automated testing is also constrained by the many different test environments; configuring those environments and simulating scenarios across different browsers and browser versions takes considerable time and effort, which reduces the efficiency of UI automated testing.
Disclosure of Invention
The application provides an automated testing method, apparatus, device, and storage medium to address the high development cost, high maintenance cost, complex test environments, and high deployment cost of user interface automated test scripts.
In a first aspect, the present application provides an automated test method comprising:
in response to an operation on the automated test platform, acquiring an identifier of a target test case;
retrieving target data from a target case database table according to a preset correspondence and the identifier of the target test case, where the correspondence maps identifiers of test cases to data in the target case database table, and the target case database table comprises a keyword database sub-table, an element database sub-table, a case step database sub-table, and a case information database sub-table;
constructing a target test case script from the target data;
and obtaining a user interface automated test report from the target test case script.
In this embodiment of the present application, before the target data in the target case database table is retrieved according to the preset correspondence and the identifier of the target test case, the method further includes:
in response to a user's new-case operation on the automated test platform, obtaining new test case data, where the new test case data includes data newly added to the keyword database sub-table, the element database sub-table, the case step database sub-table, and the case information database sub-table;
and saving the new test case data to an application case database table.
In this embodiment of the present application, the automated test platform displays a case management interface, a keyword management interface, and an element management interface, where the case management interface includes a case step name input area, an element selection area, and a keyword selection area. Obtaining new test case data in response to the user's new-case operation on the automated test platform includes:
in response to the user's input in the case step name input area of the case management interface, obtaining the name and parameter value data of each step of the new test case;
in response to the user's selection in the element selection area of the case management interface, obtaining the element data of the new test case;
in response to the user's selection in the keyword selection area of the case management interface, obtaining the keyword data of the new test case;
displaying keywords and managing elements of the test page according to the keyword management interface and the element management interface;
and obtaining the new test case data and its identifier from the name and parameter value data of each step, the element data, the keyword data, and the browser version data of the new test case.
In an embodiment of the present application, the method further comprises:
when there are two or more pieces of new test case data, the new test case data are aggregated, and the aggregated data are saved to the application case database table.
In this embodiment of the present application, retrieving the target data in the target case database table according to the preset correspondence and the identifier of the target test case includes:
determining application case step information and browser information corresponding to the identifier of the target test case according to the correspondence and the identifier, where the application case step information includes an element identifier and a keyword identifier of the target test case;
determining, from the target case database table, element information and keyword information corresponding to the element identifier and keyword identifier of the target test case;
and determining the browser version corresponding to the target test case according to the browser information.
In this embodiment of the present application, obtaining a user interface automated test report according to the target test case script includes:
determining a target automated test tool and a control instruction sent by a remote browser driver;
starting a target browser according to the target automated test tool and the control instruction sent by the remote browser driver, where the version of the target browser corresponds to the browser version information in the target test case script;
and executing the target test case in the target browser to obtain the user interface automated test report.
In this embodiment of the present application, starting the target browser according to the control instruction sent by the target automated test tool and the remote browser driver includes:
obtaining the browser version information in the target test case script;
obtaining a Docker image of the target browser according to the browser version information in the target test case script, where the Docker image of the target browser corresponds to that browser version information;
starting a Docker container from the Docker image of the target browser;
and starting the target browser in the Docker container.
In this embodiment of the present application, executing the target test case in the target browser to obtain a user interface automated test report includes:
determining a browser driver of the target browser and an output directory for the test report;
running the automated test tool with the browser driver of the target browser;
executing the target test case with the running automated test tool to obtain the user interface automated test report;
and copying the user interface automated test report to the output directory of test reports.
In an embodiment of the present application, the method further includes:
acquiring the screen recording requirement, log requirement, and target directory of the target test case;
obtaining a video file and a log file of the target test case according to its screen recording and log requirements;
and copying the video file and the log file to the target directory.
In a second aspect, the present application provides an automated testing apparatus comprising:
an acquisition module, configured to acquire the identifier of a target test case in response to an operation on the automated test platform;
a retrieval module, configured to retrieve target data in a target case database table according to a preset correspondence and the identifier of the target test case, where the correspondence maps identifiers of test cases to data in the case database table, and the target case database table comprises a keyword database sub-table, an element database sub-table, a case step database sub-table, and a case information database sub-table;
a construction module, configured to construct a target test case script from the target data;
and a determination module, configured to obtain a user interface automated test report from the target test case script.
In a third aspect, the present application provides an electronic device, comprising: a processor and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
the processor executes the computer-executable instructions stored in the memory to implement the automated testing method of the embodiments of the present application.
In a fourth aspect, a computer-readable storage medium has computer-executable instructions stored therein that, when executed by a processor, implement the automated testing method of the embodiments of the present application.
According to the automated testing method, apparatus, device, and storage medium of the present application, target data in a target case database table is retrieved through the acquired identifier of the target test case and the preset correspondence between test case identifiers and data in the case database table, where the target case database table comprises a keyword database sub-table, an element database sub-table, a case step database sub-table, and a case information database sub-table; a target test case script is constructed from the target data, and a user interface automated test report is obtained from the script. This reduces the time and effort of manually repeating the same test steps, lowers the maintenance cost of user interface automated testing, and improves its efficiency.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic flow chart of an automated testing method according to an embodiment of the present application;
FIG. 2 is a schematic diagram illustrating the effect of an automated testing method according to an embodiment of the present disclosure;
FIG. 3 is a flow chart of another automated testing method provided in an embodiment of the present application;
fig. 4 is a schematic structural diagram of an automated testing apparatus according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Specific embodiments thereof have been shown by way of example in the drawings and will herein be described in more detail. These drawings and the written description are not intended to limit the scope of the inventive concepts in any way, but to illustrate the concepts of the present application to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present application as detailed in the accompanying claims.
In the prior art, UI automated testing generally means writing a test script whenever a test must be executed. Different test frameworks and test scripts are developed for different systems, the same test scripts are written repeatedly, and testers spend a great deal of time maintaining the scripts they have written.
In the present application, target data in the target case database table can be retrieved through the acquired identifier of the target test case and the preset correspondence between test case identifiers and data in the case database table, where the target case database table comprises a keyword database sub-table, an element database sub-table, a case step database sub-table, and a case information database sub-table. A target test case script is constructed from the target data, and a user interface automated test report is obtained from the script. In this method, the automated test platform stores the keyword data, element data, case step data, and case information data of test cases in data tables and builds repeatably executable test cases from those tables, which reduces the time and effort of manually repeating the same test steps and lowers the maintenance cost of user interface automated testing. In addition, the method improves the efficiency of user interface automated testing: a tester selects the corresponding cases on the test platform and executes them, without repeatedly writing and executing the same test steps.
The present application provides an automated testing method. The execution subject of the method may be a server that provides an automated test environment and resources, allowing test personnel to run and manage test scripts on it. The implementation of the execution subject is not limited in this embodiment, as long as it can acquire the identifier of a target test case in response to an operation on the automated test platform; retrieve target data in a target case database table according to a preset correspondence and the identifier of the target test case, where the correspondence maps identifiers of test cases to data in the case database table, and the target case database table comprises a keyword database sub-table, an element database sub-table, a case step database sub-table, and a case information database sub-table; construct a target test case script from the target data; and obtain a user interface automated test report from the script.
Fig. 1 is a schematic flow chart of an automated testing method according to an embodiment of the present application. As shown in fig. 1, the method may include:
s101, responding to operation of an automatic test platform, and acquiring an identifier of a target test case.
The automated test platform may be a comprehensive tool or system for managing and performing automated tests. It provides a series of functions and tools that assist a test team in planning, executing, managing, and monitoring automated tests. The platform can include a case management module, a keyword management module, an element management module, and a task management module. It can provide the ability to compile and edit test scripts; configure and manage different test environments, including operating systems, browsers, and devices, so that scripts can be run conveniently; and generate detailed test results and reports, including test coverage, error logs, and screenshots, to help testers analyze and evaluate the results.
The target test case may refer to a set of test steps that are performed automatically to verify whether the software under test behaves as expected. The target test case may be, for example, an interface test case, a login verification test case, a form verification test case, or a database test case.
In this embodiment of the present application, when the user triggers an execute-case operation on the automated test platform, the list of test cases and the identifiers contained in the corresponding test suite can be obtained by querying the test suite on the platform. It should be noted that a test suite represents a combination rule for test cases.
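The suite lookup described above can be sketched as follows. This is an illustrative reconstruction, not code from the patent: the `SUITES` table and its names are hypothetical stand-ins for the platform's test suite storage.

```python
# Hypothetical sketch of step S101: resolving the case identifiers bound to a
# test suite when the user triggers "execute case" on the platform.  The
# SUITES mapping and the identifiers are illustrative, not from the patent.
SUITES = {
    "login-regression": ["TC-001", "TC-002"],
    "form-validation": ["TC-010"],
}

def get_target_case_ids(suite_name: str) -> list[str]:
    """Return the identifiers of the test cases combined in a suite."""
    return SUITES.get(suite_name, [])
```

Each identifier returned here is then used in S102 to look up the case's data in the case database tables.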
S102, according to a preset corresponding relation and an identifier of a target test case, target data in a target case database table is called, wherein the corresponding relation represents the corresponding relation between the identifier of the test case and data in the target case database table, and the target case database table comprises a keyword database sub-table, an element database sub-table, a case step database sub-table and a case information database sub-table.
The target case database table may be a data table storing target test case information in a software development project. In this embodiment, the target case database table includes a keyword database sub-table, an element database sub-table, a case step database sub-table, and a case information database sub-table. The keyword database sub-table defines the keywords used in test cases, storing each keyword and its corresponding operation. A keyword is a predefined operation or step that completes a specific task in a UI test case, such as clicking a button, inputting text, or waiting for a page to load. Keywords are generally defined by the test framework or library in use, and can be used alone or in combination to build more complex test cases while reducing the amount of test code. The keyword database sub-table may include a unique identifier for each keyword; the name or description of the keyword, such as click, input text, or wait for page load; and the actual test operation the keyword performs, such as a simulated button click or text input. Testers can build more complex test cases by packaging and combining these keywords to verify the correctness and reliability of the software or application.
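A keyword-driven design of this kind is conventionally implemented as a registry mapping keyword names to operations. The sketch below is a minimal illustration under assumed names; the keyword names and the `driver` object's `find` method are hypothetical, not the patent's API.

```python
# Illustrative keyword registry: each keyword name maps to a predefined
# operation, so case steps are composed from keywords rather than raw test
# code.  The names and the driver protocol are assumptions for this sketch.
KEYWORDS = {}

def keyword(name):
    """Register a function as the implementation of a keyword."""
    def register(fn):
        KEYWORDS[name] = fn
        return fn
    return register

@keyword("click")
def click(driver, locator, value=None):
    driver.find(locator).click()

@keyword("input_text")
def input_text(driver, locator, value):
    driver.find(locator).send_keys(value)

def run_step(driver, kw_name, locator, value=None):
    """Execute one case step by dispatching to its keyword."""
    KEYWORDS[kw_name](driver, locator, value)
```

With such a registry, a case step stored in the database only needs to carry a keyword identifier, an element locator, and optional input data.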
The element database sub-table may refer to a data table storing page element information: the positioning manner and positioning value of each element used by the automated test script. An element is a page element whose state must be interacted with or verified during UI testing; it is typically an HTML (Hypertext Markup Language) element in a web page or a UI element in a mobile application, such as a button, input box, drop-down list, text, or label. The element database sub-table may include a unique identifier for each element, the name or identifier of the page the element belongs to, the element's name, its positioning manner, and its positioning value. Through the unique identifiers, testers can locate and operate on elements in test cases by identifier; when a page changes, only the corresponding element entry needs updating rather than the whole test case, which reduces maintenance cost.
The case step database sub-table may refer to a data table storing the execution steps of test cases, which may include the name of each step, the element identifier it operates on, the keyword identifier of the operation, and input data such as input text and waiting time.
The case information database sub-table stores information about each test case, including its identifier, name, the browser and browser version it targets, the project module, the creator, and the creation time.
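The four sub-tables just described can be modeled as record types. The field names below are assumptions inferred from the description, not the patent's actual schema:

```python
from dataclasses import dataclass, field

# Sketch of the four case database sub-tables as records.  Field names are
# illustrative assumptions drawn from the description above.
@dataclass
class Keyword:
    kw_id: str
    name: str          # e.g. "click", "input_text", "wait_for_page"

@dataclass
class Element:
    el_id: str
    page: str          # page the element belongs to
    name: str
    locate_by: str     # positioning manner, e.g. "css" or "xpath"
    locate_value: str  # positioning value

@dataclass
class CaseStep:
    name: str
    el_id: str         # element identifier the step operates on
    kw_id: str         # keyword identifier of the operation
    input_data: str = ""

@dataclass
class CaseInfo:
    case_id: str
    name: str
    browser: str
    browser_version: str
    steps: list = field(default_factory=list)
```

Splitting the data this way lets a page change be absorbed by editing one `Element` row instead of every case that touches it.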
In this embodiment of the present application, after the user triggers an execute-case operation on the automated test platform, the target data stored in the target case database table, namely the keyword data, element data, case step data, and case information data in the keyword, element, case step, and case information sub-tables, can be retrieved according to the preset correspondence and the identifier of the target test case obtained in step S101.
In this embodiment of the present application, retrieving the target data in the target case database table according to the preset correspondence and the identifier of the target test case may include:
determining application case step information and browser information corresponding to the identifier of the target test case according to the correspondence and the identifier, where the application case step information includes an element identifier and a keyword identifier of the target test case;
determining, from the target case database table, element information and keyword information corresponding to the element identifier and keyword identifier of the target test case;
and determining the browser version corresponding to the target test case according to the browser information.
In the present application, the step information corresponding to the identifier of the target test case can be determined through the preset correspondence and the identifier. The target case database table is then searched to find the element information and keyword information corresponding to the element identifier and keyword identifier, and further processing is performed on what is found, such as executing the operation the keyword denotes or reading the attributes of the element. Meanwhile, this embodiment can search the case information database sub-table through the preset correspondence and the identifier of the target test case to determine the corresponding browser information, including browser version and browser type, and thus the browser to be used.
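The lookup chain described above (case identifier to step information, then dereferencing element and keyword identifiers, plus browser information from the case information sub-table) can be sketched with in-memory dictionaries standing in for the database tables; all names are illustrative:

```python
# Sketch of retrieving target data for a case identifier.  Plain dicts stand
# in for the database sub-tables; the key names are assumptions, not the
# patent's schema.
def retrieve_target_data(case_id, case_info, case_steps, elements, keywords):
    info = case_info[case_id]
    steps = []
    for step in case_steps[case_id]:
        steps.append({
            "name": step["name"],
            "element": elements[step["el_id"]],   # element info for the step
            "keyword": keywords[step["kw_id"]],   # keyword info for the step
            "input": step.get("input", ""),
        })
    return {
        "browser": info["browser"],
        "version": info["version"],  # browser version for the target case
        "steps": steps,
    }
```

The returned structure contains everything needed to build the target test case script in the next step.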
In this embodiment of the present application, obtaining a user interface automated test report according to the target test case script may include:
determining a target automated test tool and a control instruction sent by a remote browser driver;
starting a target browser according to the target automated test tool and the control instruction sent by the remote browser driver, where the version of the target browser corresponds to the browser version information in the target test case script;
and executing the target test case in the target browser to obtain the user interface automated test report.
The target automated test tool performs automated functional testing of software during development, so that code errors and problems can be found and fixed quickly and accurately. Common automated test tools include Selenium (Selenium WebDriver), Appium, JMeter (Apache JMeter), and Robot Framework. In this embodiment, the target automated test tool may be Selenoid. Selenoid is an open-source, lightweight Selenium test container manager that provides a Docker-containerized Selenium Grid; it can launch multiple instances of different browsers and run automated tests on them. Selenoid also provides utility functions such as browser logs, video recording, and live screen (VNC) support, which help developers build and debug automated test cases more conveniently.
Remote browser driver (Remote WebDriver) may refer to the mechanism of sending web application test code and WebDriver commands to a browser on a remote computer for execution. Using Remote WebDriver, a remote browser can be tested automatically by sending the test code to the remote computer that runs it.
In this embodiment, the server may issue instructions to Selenoid through Remote WebDriver, so that Selenoid connects to the target browser; test instructions are then sent to the target browser by command, the target test case is executed, and a user interface automated test report is generated.
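A client requests a browser session from Selenoid by sending WebDriver capabilities to its hub endpoint. The sketch below only builds the capabilities payload rather than opening a real session; the `selenoid:options` keys `enableVNC` and `enableVideo` are documented Selenoid vendor capabilities, while the hub URL is a hypothetical example:

```python
# Sketch of the capabilities payload a client sends to Selenoid's Remote
# WebDriver hub to request a specific browser version.  The hub URL is a
# hypothetical example; enableVNC/enableVideo are Selenoid capability names.
SELENOID_HUB = "http://selenoid.example:4444/wd/hub"  # hypothetical endpoint

def selenoid_capabilities(browser: str, version: str, record_video: bool = True) -> dict:
    return {
        "browserName": browser,
        "browserVersion": version,
        "selenoid:options": {
            "enableVNC": True,          # live screen for debugging
            "enableVideo": record_video,  # record the session to a video file
        },
    }
```

A Remote WebDriver client would pass this dictionary when creating the session, and Selenoid would launch the matching browser container.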
In this embodiment of the present application, starting the target browser according to the control instruction sent by the target automated test tool and the remote browser driver may include:
obtaining the browser version information in the target test case script;
obtaining a Docker image of the target browser according to the browser version information in the target test case script, where the Docker image of the target browser corresponds to that browser version information;
starting a Docker container from the Docker image of the target browser;
and starting the target browser in the Docker container.
A Docker image is a lightweight, executable software package containing everything needed to run a Docker container, including application code, system tools, libraries, environment variables, and configuration files; it is used to create runtime instances of Docker containers.
A Docker container is a lightweight, portable unit that packages an application together with its dependencies, libraries, and configuration files and can run anywhere the Docker environment is supported. A container isolates resources such as the operating system and applications so that the application runs without interference; developers can package an application and its infrastructure into a fixed container format, enabling cross-platform deployment and execution.
In this embodiment, referring to fig. 2, which is a schematic diagram of the effect of the automated testing method provided herein, after the target test case script is built through the automated test platform and the target case database, the browser version information in the script can be obtained by parsing the parameters of the open-browser statement in the script. According to that version information, the Docker image required to run the script is looked up in an image repository, and the target browser inside the Docker container is started by connecting, through a WebDriver client library, to the Remote WebDriver endpoint the container provides. When multiple test cases are executed simultaneously and each requires a different browser version, Selenoid can automatically invoke the corresponding Docker containers, providing independent test environments for multiple browser versions.
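Selecting the Docker image from the script's open-browser statement might look like the following sketch. The statement syntax (`open_browser('chrome', '119.0')`) is an assumed format, not the patent's; the `selenoid/<browser>:<version>` naming does match the scheme Selenoid's public browser images use.

```python
import re

# Sketch: parse the browser name and version out of an open-browser statement
# in the generated script, and map them to a Selenoid-style Docker image tag.
# The statement format is an assumption for illustration.
def image_for_open_browser(statement: str) -> str:
    m = re.search(r"open_browser\(\s*['\"](\w+)['\"]\s*,\s*['\"]([\d.]+)['\"]", statement)
    if not m:
        raise ValueError("not an open_browser statement")
    browser, version = m.groups()
    return f"selenoid/{browser.lower()}:{version}"
```

The resulting tag is what would be looked up in the image repository before the container is started.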
In this embodiment of the present application, executing the target test case in the target browser to obtain a user interface automated test report may include:
determining a browser driver of the target browser and an output directory for the test report;
running the automated test tool with the browser driver of the target browser;
executing the target test case with the running automated test tool to obtain the user interface automated test report;
and copying the user interface automated test report to the output directory of test reports.
The browser driver and the output directory of the test report are configurable on a case-by-case basis.
The browser driver may be middleware connecting the test framework and the browser. It serves as a bridge between the two, helps the test framework communicate and interact with a specific browser, and is capable of controlling the browser's behavior, performing user-defined test operations, and obtaining test results.
In this embodiment, the corresponding browser driver may be determined according to the type of the target browser. For example, ChromeDriver may be used for the Chrome browser, GeckoDriver for the Firefox browser, and SafariDriver for the Safari browser. After the driver is obtained, it can be placed on the system path, or a driver path can be specified in the test script. The selected browser driver is then used together with the corresponding automated test tool (such as Selenium WebDriver) to execute the automated test: the tool runs the target test case, performs the user interface operations, and obtains the verification results, and the generated user interface automation test report is copied into the configured output directory for subsequent analysis, viewing, and sharing.
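The driver selection and report publishing steps above can be sketched as follows (driver names follow the mapping just described; in a real run the driver executable must exist on the system path or be pointed to in the test script):

```python
import shutil
from pathlib import Path

# Mapping from target browser type to its driver, as described above.
DRIVERS = {"chrome": "chromedriver", "firefox": "geckodriver", "safari": "safaridriver"}

def driver_for(browser: str) -> str:
    """Return the browser driver matching the target browser type."""
    try:
        return DRIVERS[browser.lower()]
    except KeyError:
        raise ValueError(f"unsupported browser: {browser}")

def publish_report(report: Path, output_dir: Path) -> Path:
    """Copy the generated UI automation test report into the configured output directory."""
    output_dir.mkdir(parents=True, exist_ok=True)
    return Path(shutil.copy2(report, output_dir / report.name))
```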
In this embodiment of the present application, executing the target test case with the running automated test tool may further include:
acquiring the screen recording requirement, log requirement, and target directory of the target test case;
obtaining a video file and a log file of the target test case according to its screen recording and log requirements;
copying the video file and the log file to the target directory.
The target directory may be a directory for storing video files and log files.
In this embodiment, the automated test tool, such as Selenium WebDriver, executes the automated test so as to meet the screen recording and log requirements of the target test case. Specifically, according to the determined recording or log requirements, configuration parameters of the automated test tool or of related recording and logging tools are set; these parameters may include the recording resolution, the recording frame rate, and the log level and format. After the test is executed, the screen recording and the log file are retrieved according to the configuration of the tool used, and the resulting video file and log file are copied into the predetermined target directory.
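The configuration step above can be sketched with Selenoid-style session capabilities (the keys `enableVideo`, `videoScreenSize`, `videoFrameRate`, and `enableLog` follow Selenoid's documented options; treating Selenoid as the recording tool, and the `logLevel` key, are assumptions here):

```python
def recording_config(enable_video: bool, enable_log: bool,
                     resolution: str = "1920x1080", frame_rate: int = 12,
                     log_level: str = "INFO") -> dict:
    """Build per-case recording/log options to pass to the test environment."""
    options = {"enableVideo": enable_video, "enableLog": enable_log}
    if enable_video:
        options["videoScreenSize"] = resolution   # recording resolution
        options["videoFrameRate"] = frame_rate    # recording frame rate
    if enable_log:
        options["logLevel"] = log_level           # assumed key for log level/format
    return options
```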
S103, constructing a target test case script according to the target data.
A test case script may be a programming script for executing an automated test; it describes the test steps, expected results, verification logic, and so on, and is generally written on top of a test framework in a programming language. By writing test case scripts, large-scale test automation can be achieved and test efficiency and accuracy improved.
In this embodiment, the test case script structure may be designed using a test framework and programming language (e.g., Selenium WebDriver, Robot Framework, JUnit, Python) together with the target data, and includes importing the necessary libraries, initializing the test environment, defining test case functions, and so on. According to the test steps and parameter data in the target data, the corresponding test case functions are written, and the application programming interface of the test framework is used to launch the browser, locate elements, perform operations, and verify the expected results.
S104, obtaining an automatic test report of the user interface according to the target test case script.
In the embodiment of the application, after the target test case script is constructed, it can be executed by running the execution command of the test framework or by calling the corresponding application programming interface, and a test report is generated or the test results are recorded.
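Executing the constructed script via a framework command could look as follows (using pytest with a JUnit-XML report is one possible choice, not prescribed by this embodiment):

```python
import subprocess
import sys

def pytest_command(script_path: str, report_path: str) -> list:
    """Assemble the framework execution command for the target test case script."""
    return [sys.executable, "-m", "pytest", script_path, f"--junitxml={report_path}"]

def run_case_script(script_path: str, report_path: str) -> "subprocess.CompletedProcess":
    """Run the script and capture the result; the test report lands at report_path."""
    return subprocess.run(pytest_command(script_path, report_path),
                          capture_output=True, text=True)
```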
The target data in the target case database table is retrieved through the obtained identifier of the target test case and the preset correspondence between test case identifiers and data in the case database table, where the target case database table comprises a keyword database sub-table, an element database sub-table, a case step database sub-table, and a case information database sub-table; a target test case script is constructed from the target data, and a user interface automation test report is obtained from the script. In this automated test method, the keyword data, element data, case step data, and case information data of test cases are stored in data tables by the automated test platform, and repeatedly executable test cases are constructed from those tables, which reduces the time and workload of manually repeating the same test steps and lowers the maintenance cost of user interface automated testing. In addition, the method improves the efficiency of user interface automated testing by selecting and executing the corresponding cases on the test platform, without repeatedly writing and executing the same test steps.
Fig. 3 is a flow chart of another automated testing method according to an embodiment of the present application. As shown in fig. 3, the method may include:
S301, responding to a new case operation of a user on the automated test platform to obtain new test case data, where the new test case data comprises data newly added to a keyword database table, an element database table, a case step database table, and a case information database table.
In this embodiment, after a user enters the UI automated test platform, the interface elements of the system under test, the name of each step of the test case, the interface element to be operated, the keyword to be used, the parameter value to be input in each step, and the browser and version to be tested may all be entered through the platform.
In this embodiment of the present application, the automated test platform displays a case management interface, a keyword management interface, and an element management interface, where the case management interface includes a case step name input area, an element selection area, and a keyword selection area, and obtaining the new test case data in response to a user's new case operation on the automated test platform may include:
responding to a user's input operation in the case step name input area of the case management interface to obtain the name and parameter value data of each step of the new test case;
responding to a user's selection operation in the element selection area of the case management interface to obtain the element data of the new test case;
responding to a user's selection operation in the keyword selection area of the case management interface to obtain the keyword data of the new test case;
displaying keywords and managing test page elements according to the keyword management interface and the element management interface;
obtaining the new test case data and a test case data identifier according to the name and parameter value data of each step, the element data, the keyword data, and the browser version data of the new test case.
In this embodiment, the case management interface may be an interface of the automated test platform used for adding, modifying, and deleting cases. The new case page displays a case step name input area, an element selection area, a keyword selection area, and an input value area. The user can enter the name and parameter value data of each step through the case step name input area, select the element data required by the test case through the element selection area, and select the keyword data required by the test case through the keyword selection area. An input box or text area may be added below each step of the case management interface, and when the user selects a keyword and an element, an input value area is displayed beside the selection area; by entering the required values there, the user associates them with the keyword and the element.
The user can enter the name and parameter value data of each step of the test case in the case step input area, add, edit, and delete steps as needed, and set the input parameters required by each step. In the element selection area, the user can select information about the page elements involved in the test case, including the locating mode and selector of each element, which are used to locate and operate on the page elements when the test case is executed. In the keyword selection area, the user can select the keywords used in the test case. A unique test case data identifier is then assigned to the new test case data according to the acquired name and parameter value data of each step, the element data, the keyword data, and the browser version data, for subsequent test execution and management.
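Assigning the unique test case data identifier could, for instance, hash the collected data deterministically (the scheme below is a hypothetical example; the embodiment does not fix an identifier format):

```python
import hashlib
import json

def case_identifier(case_data: dict) -> str:
    """Derive a stable identifier from the step names/values, elements, keywords,
    and browser version; identical case data always yields the same identifier."""
    payload = json.dumps(case_data, sort_keys=True, ensure_ascii=False)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()[:16]
```

Sorting the keys makes the identifier independent of the order in which the platform collected the fields.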
S302, saving the new test case data to the case database table.
In this embodiment, the data newly added in step S301 may be correspondingly stored into the keyword database table, the element database table, the case step database table, or the case information database table.
In this embodiment of the present application, storing the new test case data into the case database table may include:
when there are more than two pieces of new test case data, aggregating the new test case data and storing the aggregated data into the case database table.
After the test case data is added through the UI automated test platform and stored into the corresponding database tables, each component of the test case, including keywords, elements, case steps, and case information, can be stored in a structured way in its corresponding database table. The test cases and their components can then be conveniently queried, edited, and deleted through the case data tables, which improves the reusability of test cases and reduces repeated work during testing.
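A minimal structured-storage sketch with SQLite (the column names are assumptions — the embodiment specifies the four sub-tables but not their schema):

```python
import sqlite3

def init_case_db(conn: sqlite3.Connection) -> None:
    """Create the keyword, element, case step, and case information sub-tables."""
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS keyword (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE IF NOT EXISTS element (id INTEGER PRIMARY KEY, locator TEXT);
        CREATE TABLE IF NOT EXISTS case_step (id INTEGER PRIMARY KEY, case_id TEXT,
            step_no INTEGER, keyword_id INTEGER, element_id INTEGER, value TEXT);
        CREATE TABLE IF NOT EXISTS case_info (case_id TEXT PRIMARY KEY, name TEXT,
            browser TEXT, browser_version TEXT);
    """)

def save_case(conn, case_id, name, browser, version, steps):
    """Store one new test case: its info row plus one row per ordered step."""
    conn.execute("INSERT INTO case_info VALUES (?, ?, ?, ?)",
                 (case_id, name, browser, version))
    for step_no, (keyword_id, element_id, value) in enumerate(steps, start=1):
        conn.execute(
            "INSERT INTO case_step (case_id, step_no, keyword_id, element_id, value) "
            "VALUES (?, ?, ?, ?, ?)",
            (case_id, step_no, keyword_id, element_id, value))
    conn.commit()
```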
Fig. 4 is a schematic structural diagram of an automated testing apparatus according to an embodiment of the present application. As shown in fig. 4, the automated testing apparatus 40 includes: an acquisition module 401, a retrieval module 402, a construction module 403, and a determination module 404. Wherein:
an acquisition module 401, configured to acquire the identifier of the target test case in response to an operation on the automated test platform;
a retrieval module 402, configured to retrieve target data in a target case database table according to a preset correspondence and the identifier of the target test case, where the correspondence represents a correspondence between identifiers of test cases and data in the target case database table, and the target case database table includes a keyword database sub-table, an element database sub-table, a case step database sub-table, and a case information database sub-table;
a building module 403, configured to build a target test case script according to the target data;
and a determination module 404, configured to obtain a user interface automation test report according to the target test case script.
As can be seen from the foregoing, the automated testing apparatus of this embodiment may retrieve the target data in the target case database table through the obtained identifier of the target test case and the preset correspondence between test case identifiers and data in the case database table, where the target case database table includes a keyword database sub-table, an element database sub-table, a case step database sub-table, and a case information database sub-table, and construct a target test case script from the target data, so as to obtain the user interface automation test report from the script. This reduces the time and workload of manually repeating the same test steps, lowers the maintenance cost of user interface automated testing, and improves its efficiency.
In the embodiment of the present application, the obtaining module 401 may also be configured to:
responding to a user's new case operation on the automated test platform to obtain new test case data, where the new test case data comprises data newly added to a keyword database table, an element database table, a case step database table, and a case information database table;
and saving the new test case data to the case database table.
In this embodiment of the present application, the automated test platform displays a case management interface, a keyword management interface, and an element management interface, where the case management interface includes a case step name input area, an element selection area, and a keyword selection area, and the acquisition module 401 may be further configured to:
respond to a user's input operation in the case step name input area of the case management interface to obtain the name and parameter value data of each step of the new test case;
respond to a user's selection operation in the element selection area of the case management interface to obtain the element data of the new test case;
respond to a user's selection operation in the keyword selection area of the case management interface to obtain the keyword data of the new test case;
display keywords and manage test page elements according to the keyword management interface and the element management interface;
and obtain the new test case data and a test case data identifier according to the name and parameter value data of each step, the element data, the keyword data, and the browser version data of the new test case.
In embodiments of the present application, the construction module 403 may also be configured to:
when there are more than two pieces of new test case data, aggregate the new test case data and store the aggregated data into the case database table.
In embodiments of the present application, the determination module 404 may also be configured to:
determine the case step information and browser information corresponding to the identifier of the target test case according to the correspondence and the identifier of the target test case, where the case step information includes the element identifier and keyword identifier of the target test case;
determine, from the target case database table, the element information and keyword information corresponding to the element identifier and keyword identifier of the target test case;
and determine the browser version corresponding to the target test case according to the browser information.
In the present embodiment, the determination module 404 may also be configured to:
determine a target automated test tool and a control instruction sent by a remote browser driver;
start the target browser according to the target automated test tool and the control instruction sent by the remote browser driver, where the version of the target browser corresponds to the browser version information in the target test case script;
and execute the target test case according to the target browser to obtain the user interface automation test report.
In the embodiment of the present application, the obtaining module 401 may also be configured to:
obtain the browser version information in the target test case script;
obtain the Docker image of the target browser according to the browser version information in the target test case script, where the Docker image of the target browser corresponds to that browser version information;
start a Docker container from the Docker image of the target browser;
and start the target browser in the Docker container.
In embodiments of the present application, the determination module 404 may also be configured to:
determine the browser driver of the target browser and the output directory of the test report;
run the automated test tool according to the browser driver of the target browser;
execute the target test case with the running automated test tool to obtain the user interface automation test report;
and copy the user interface automation test report to the output directory of the test report.
In an embodiment of the present application, the acquisition module 401 may also be configured to:
acquire the screen recording requirement, log requirement, and target directory of the target test case;
obtain a video file and a log file of the target test case according to its screen recording and log requirements;
and copy the video file and the log file to the target directory.
As can be seen from the foregoing, in the embodiment of the present application, the target data in the target case database table may be retrieved through the obtained identifier of the target test case and the preset correspondence between test case identifiers and data in the case database table, where the target case database table includes a keyword database sub-table, an element database sub-table, a case step database sub-table, and a case information database sub-table, and a target test case script is constructed from the target data, so that the user interface automation test report is obtained from the script. In this automated test method, the keyword data, element data, case step data, and case information data of test cases are stored in data tables by the automated test platform, and repeatedly executable test cases are constructed from those tables, which reduces the time and workload of manually repeating the same test steps and lowers the maintenance cost of user interface automated testing. In addition, the method improves the efficiency of user interface automated testing by selecting and executing the corresponding cases on the test platform, without repeatedly writing and executing the same test steps.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 5, the electronic device 50 includes:
the electronic device 50 may include one or more processing cores 'processors 501, one or more computer-readable storage media's memory 502, communication components 503, and the like. The processor 501, the memory 502, and the communication unit 503 are connected via a bus 504.
In a particular implementation, at least one processor 501 executes computer-executable instructions stored in memory 502, causing at least one processor 501 to perform an automated test method as described above.
The specific implementation process of the processor 501 may refer to the above-mentioned method embodiment, and its implementation principle and technical effects are similar, and this embodiment will not be described herein again.
In the embodiment shown in fig. 5, it should be understood that the processor may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the present invention may be embodied directly in a hardware processor for execution, or executed by a combination of hardware and software modules in a processor.
The memory may comprise high-speed random access memory (RAM), and may further comprise non-volatile memory (NVM), such as at least one magnetic disk memory.
The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. Buses may be divided into address buses, data buses, control buses, and so on. For ease of illustration, the buses in the drawings of the present application are not limited to only one bus or one type of bus.
In some embodiments, a computer program product is also presented, comprising a computer program or instructions which, when executed by a processor, implement the steps of any of the automated test methods described above.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
Those of ordinary skill in the art will appreciate that all or a portion of the steps of the various methods of the above embodiments may be performed by instructions, or by instructions controlling associated hardware, which may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a computer readable storage medium having stored therein a plurality of instructions capable of being loaded by a processor to perform steps in any of the automated test methods provided by the embodiments of the present application.
Wherein the storage medium may include: a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk, and the like.
According to one aspect of the present application, there is provided a computer program product or computer program comprising computer instructions stored in a computer readable storage medium.
The instructions stored in the storage medium may perform the steps in any of the automated testing methods provided in the embodiments of the present application, and can therefore achieve the beneficial effects of any of those methods; see the previous embodiments for details, which are not repeated here.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It is to be understood that the present application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (12)

1. An automated testing method, the method comprising:
responding to the operation of the automatic test platform, and acquiring an identifier of a target test case;
retrieving target data in a target case database table according to a preset correspondence and the identifier of the target test case, wherein the correspondence represents a correspondence between identifiers of test cases and data in the target case database table, and the target case database table comprises a keyword database sub-table, an element database sub-table, a case step database sub-table, and a case information database sub-table;
constructing a target test case script according to the target data;
and obtaining an automatic test report of the user interface according to the target test case script.
2. The method of claim 1, wherein before the retrieving of the target data in the target case database table according to the preset correspondence and the identifier of the target test case, the method further comprises:
Responding to the new case operation of the automatic test platform by a user, and obtaining new test case data, wherein the new test case data comprises newly added data in a keyword database table, an element database table, a case step database table and a case information database table;
and storing the new test case data into the case database table.
3. The method of claim 2, wherein the automated test platform displays a case management interface, a keyword management interface, and an element management interface, the case management interface includes a case step name input area, an element selection area, and a keyword selection area, and the obtaining of the new test case data in response to the user's new case operation on the automated test platform comprises:
responding to a user's input operation in the case step name input area of the case management interface to obtain the name and parameter value data of each step of the new test case;
responding to a user's selection operation in the element selection area of the case management interface to obtain the element data of the new test case;
responding to a user's selection operation in the keyword selection area of the case management interface to obtain the keyword data of the new test case;
displaying keywords and managing test page elements according to the keyword management interface and the element management interface;
and obtaining the new test case data and a test case data identifier according to the name and parameter value data of each step, the element data, the keyword data, and the browser version data of the new test case.
4. A method according to claim 3, characterized in that the method further comprises:
when there are more than two pieces of new test case data, aggregating the new test case data and storing the aggregated new test case data into the case database table.
5. The method of claim 1, wherein the retrieving of the target data in the target case database table according to the preset correspondence and the identifier of the target test case comprises:
determining case step information and browser information corresponding to the identifier of the target test case according to the correspondence and the identifier of the target test case, wherein the case step information comprises an element identifier and a keyword identifier of the target test case;
determining, from the target case database table, element information and keyword information corresponding to the element identifier and the keyword identifier of the target test case;
and determining the browser version corresponding to the target test case according to the browser information.
6. The method of claim 1, wherein obtaining a user interface automation test report from the target test case script comprises:
determining a target automation test tool and a control instruction sent by a remote browser driver;
starting the target browser according to the target automation testing tool and a control instruction sent by a remote browser driver, wherein the version of the target browser corresponds to browser version information in the target test case script;
and executing the target test case according to the target browser to obtain a user interface automation test report.
7. The method of claim 6, wherein the starting of the target browser according to the target automated test tool and the control instruction sent by the remote browser driver comprises:
browser version information in the target test case script is obtained;
obtaining a Docker image of the target browser according to the browser version information in the target test case script, wherein the Docker image of the target browser corresponds to the browser version information in the target test case script;
starting a Docker container from the Docker image of the target browser;
and starting the target browser in the Docker container.
8. The method of claim 6, wherein the executing the target test case according to the target browser to obtain a user interface automation test report comprises:
determining a browser driver of the target browser and an output directory of the test report;
running an automated test tool according to the browser driver of the target browser;
executing the target test case with the running automated test tool to obtain the user interface automation test report;
and copying the user interface automation test report to the output directory of the test report.
9. The method of claim 8, wherein the method further comprises:
acquiring a screen recording requirement, a log requirement, and a target directory of the target test case;
obtaining a video file and a log file of the target test case according to the screen recording and log requirements of the target test case;
and copying the video file and the log file to the target directory.
10. An automated testing apparatus, comprising:
an acquisition module, configured to acquire the identifier of a target test case in response to an operation on the automated test platform;
a retrieval module, configured to retrieve target data in a target case database table according to a preset correspondence and the identifier of the target test case, wherein the correspondence represents a correspondence between identifiers of test cases and data in the case database table, and the target case database table comprises a keyword database sub-table, an element database sub-table, a case step database sub-table, and a case information database sub-table;
a construction module, configured to construct a target test case script according to the target data;
and a determination module, configured to obtain a user interface automation test report according to the target test case script.
11. An electronic device, comprising: a processor, and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
the processor executes the computer-executable instructions stored in the memory to implement the automated test method of any of claims 1-9.
12. A computer readable storage medium having stored therein computer executable instructions which when executed by a processor are for implementing the automated test method of any of claims 1 to 9.
CN202311799563.7A 2023-12-25 2023-12-25 Automatic test method, device, equipment and storage medium Pending CN117762802A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311799563.7A CN117762802A (en) 2023-12-25 2023-12-25 Automatic test method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311799563.7A CN117762802A (en) 2023-12-25 2023-12-25 Automatic test method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117762802A (en) 2024-03-26

Family

ID=90312340

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311799563.7A Pending CN117762802A (en) 2023-12-25 2023-12-25 Automatic test method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117762802A (en)

Similar Documents

Publication Publication Date Title
US9846638B2 (en) Exposing method related data calls during testing in an event driven, multichannel architecture
US8392886B2 (en) System, program product, and methods to enable visual recording and editing of test automation scenarios for web application
US20160188450A1 (en) Automated application test system
US20080270841A1 (en) Test case manager
US11138097B2 (en) Automated web testing framework for generating and maintaining test scripts
CN103180834A (en) An automated operating system test framework
CN115658529A (en) Automatic testing method for user page and related equipment
CN112231213A (en) Web automatic testing method, system, storage medium and terminal equipment
CN112231206A (en) Script editing method for application program test, computer readable storage medium and test platform
CN117112060A (en) Component library construction method and device, electronic equipment and storage medium
CN113742215B (en) Method and system for automatically configuring and calling test tool to perform test analysis
CN116841543A (en) Development method for dynamically generating cross-platform multi-terminal application based on Flutter
CN116719736A (en) Test case generation method and device for testing software interface
CN116860608A (en) Interface testing method and device, computing equipment and storage medium
CN116893960A (en) Code quality detection method, apparatus, computer device and storage medium
CN117762802A (en) Automatic test method, device, equipment and storage medium
CN113986263A (en) Code automation test method, device, electronic equipment and storage medium
CN113220586A (en) Automatic interface pressure test execution method, device and system
CN112035300A (en) Server BMC automatic test system, method, storage medium and electronic device
US11995146B1 (en) System and method for displaying real-time code of embedded code in a browser-window of a software application
CN115934500A (en) Pipeline construction method and device, computer equipment and storage medium
CN117609040A (en) Test page generation method, device, terminal equipment and storage medium
Horalek et al. Automated Tests Using Selenium Framework
CN115543807A (en) Automatic regression testing method and device, computer equipment and storage medium
CN117909246A (en) Automatic testing method and platform for front-end webpage interface

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination