CN117435510B - Automatic test method, terminal equipment and computer readable storage medium - Google Patents

Automatic test method, terminal equipment and computer readable storage medium

Info

Publication number
CN117435510B
CN117435510B
Authority
CN
China
Prior art keywords
test
engine
container
image
name
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311757111.2A
Other languages
Chinese (zh)
Other versions
CN117435510A (en)
Inventor
冯绍文
伍艳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Zhicheng Software Technology Service Co ltd
Shenzhen Smart City Technology Development Group Co ltd
Original Assignee
Shenzhen Zhicheng Software Technology Service Co ltd
Shenzhen Smart City Technology Development Group Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Zhicheng Software Technology Service Co ltd and Shenzhen Smart City Technology Development Group Co ltd
Priority to CN202311757111.2A
Publication of CN117435510A
Application granted
Publication of CN117435510B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3696 Methods or tools to render software testable

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Test And Diagnosis Of Digital Computers (AREA)

Abstract

The application relates to the field of automated testing, and discloses an automated testing method, a terminal device and a computer-readable storage medium, wherein the method comprises the following steps: receiving a test schedule, and acquiring an engine name of a test engine in the test schedule; retrieving an image address corresponding to the test engine from a preset engine image file according to the engine name; creating a test container corresponding to the test engine according to the image address, and adapting the test container to an interface of the test engine; and calling the interface in the test container to execute the test cases corresponding to the test schedule. This solves the problem that testing on multiple test engines requires cross-platform operation and is inefficient, and achieves the effect of improving test efficiency.

Description

Automatic test method, terminal equipment and computer readable storage medium
Technical Field
The present disclosure relates to the field of automated testing, and in particular, to an automated testing method, a terminal device, and a computer readable storage medium.
Background
In software development, test engines are widely used for automated testing. Test engines come in many varieties, each with its own syntax, rules, and execution model.
Current test platforms typically support only specific types of test engines, which differ in usage scenarios and functions, so users face many limitations when selecting a suitable test engine. If a user needs to use multiple different test engines for verification, the test flows must be executed on different test platforms, which makes testing inefficient.
Disclosure of Invention
Embodiments of the present application provide an automated testing method, a terminal device, and a computer-readable storage medium, which solve the problem that cross-platform operation is needed when different test engines are to be tested, and achieve the effect of testing multiple different test engines on one test platform, thereby improving test efficiency.
An embodiment of the present application provides an automated testing method, a terminal device, and a computer-readable storage medium, wherein the automated testing method comprises the following steps:
receiving a test schedule, and acquiring an engine name of a test engine in the test schedule;
retrieving an image address corresponding to the test engine from a preset engine image file according to the engine name;
creating a test container corresponding to the test engine according to the image address, and adapting the test container to an interface of the test engine;
and calling the interface in the test container to execute the test case corresponding to the test schedule.
Optionally, the step of receiving a test schedule and obtaining an engine name of a test engine in the test schedule includes:
analyzing the test schedule to obtain a test type;
judging whether the test engine is an open source test engine or a commercial test engine according to the test type;
if the engine is the open source test engine, acquiring an engine name corresponding to the open source test engine;
and if the engine is the commercial test engine, acquiring a key corresponding to the commercial test engine and then acquiring the engine name corresponding to the commercial test engine based on the key.
Optionally, after the step of receiving the test schedule and obtaining the engine name of the test engine in the test schedule, the method further includes:
matching the engine name against the engine names in the engine image file;
judging whether an image address corresponding to the engine name exists in the engine image file;
if the image address does not exist, terminating the current test.
Optionally, after the step of terminating the current test if the image address does not exist, the method includes:
after receiving update information of the engine image file, detecting whether an image address corresponding to the engine name exists in the engine image file;
if so, resuming the current test, and executing the step of retrieving the image address corresponding to the test engine from the preset engine image file according to the engine name.
Optionally, the step of creating a test container corresponding to the test engine according to the image address and adapting the test container to an interface of the test engine includes:
downloading the code and data corresponding to the test container according to the image address, and creating the test container from the code and data;
acquiring an interface of the test engine according to the function of the test container;
and performing the corresponding interface configuration in the test container to complete the adaptation of the interface.
Optionally, the step of executing the test case corresponding to the test schedule includes:
acquiring test parameters in the test case and expected results corresponding to the test parameters;
running the test parameters in the test container to obtain a test result;
and judging whether the current test passes or fails according to the test result and the expected result, and generating a test report.
Optionally, after the step of calling the interface in the test container to execute the test case corresponding to the test schedule, the method includes:
stopping running the test container after the test of the test case is completed;
setting a label for the test container according to the attribute of the test container;
and according to the label, defining a storage area of the test container, and storing the test container.
Optionally, the step of creating a test container corresponding to the test engine according to the image address and adapting the test container to an interface of the test engine includes:
if two or more test containers are to be created and the image addresses of the test containers to be created are the same, creating a universal test container according to the image address;
and completing the construction of the test container corresponding to the test engine according to the personalized requirements of each test container to be created and the universal test container, and adapting the test container to the interface of the test engine.
In addition, in order to achieve the above objective, an embodiment of the present invention further provides a terminal device, including a memory, a processor, and an automated test program stored in the memory and capable of running on the processor, where the processor implements the method as described above when executing the automated test program.
In addition, to achieve the above object, an embodiment of the present invention further provides a computer-readable storage medium having stored thereon an automated test program which, when executed by a processor, implements the method as described above.
One or more technical solutions provided in the embodiments of the present application at least have the following technical effects or advantages:
After the test schedule is received, the engine name of the test engine in the test schedule is acquired, and the image address corresponding to the test engine is retrieved from a preset engine image file according to the engine name. A test container corresponding to the test engine is created according to the image address, and the test container is adapted to an interface of the test engine. The interface is called in the test container to execute the test cases corresponding to the test schedule. Because a test container corresponding to the engine to be tested can be established on the test platform and the test cases are executed directly in the test container, multiple test engines can be tested on one test platform; testing no longer needs to be performed across platforms, and test efficiency is improved.
Drawings
FIG. 1 is a schematic flow chart of an embodiment of an automated testing method of the present application;
FIG. 2 is a flow chart of an automated test procedure according to the present application;
FIG. 3 is a schematic flow chart of a second embodiment of an automated test method of the present application;
FIG. 4 is a schematic flow chart of a third embodiment of an automated test method of the present application;
FIG. 5 is a schematic diagram of a terminal structure of a hardware running environment according to an embodiment of the present application.
Detailed Description
Because a test platform usually supports only a specific type of test engine, when multiple different test engines need to be tested simultaneously, the tests must be performed on different test platforms, which makes testing inefficient. To solve this problem, the present application provides an automated testing method. After the test schedule is received, the engine name of the test engine in the test schedule is acquired, and the image address corresponding to the test engine is retrieved from a preset engine image file according to the engine name. A test container corresponding to the test engine is created according to the image address, and the test container is adapted to an interface of the test engine. After the interface is adapted, the interface is called in the test container to execute the test cases corresponding to the test schedule. By establishing test containers corresponding to different test engines on one test platform, multiple test engines can be tested on a single test platform, which improves test efficiency.
In order to better understand the above technical solution, exemplary embodiments of the present application will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present application are shown in the drawings, it should be understood that the present application may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
In order to better understand the above technical solutions, the following detailed description will refer to the accompanying drawings and specific embodiments.
Example 1
In this embodiment, an automated testing method is provided.
Referring to fig. 1, the automated testing method of the present embodiment includes the steps of:
step S100: receiving a test schedule, and acquiring an engine name of a test engine in the test schedule;
in this embodiment, the test platform receives the test schedule and tests the test engine according to the test schedule. Referring to fig. 2, the test platform supports registration and invocation of various test engines, which may include an interface test engine, a UI test engine, an APP test engine, a performance test engine, a security test engine, and the like. The test platform can be accessed in a registration mode only by supporting test engines executed by ssh and OpenAPI calls. The variety of the test engines in the test platform is increased, so that a user can select a proper test engine to execute a test task according to the requirement.
As an alternative implementation, after the test schedule is received, it is parsed to obtain the test type, and the test engine is judged to be an open source test engine or a commercial test engine according to the test type. If it is an open source test engine, the engine name can be acquired directly. If it is a commercial test engine, the corresponding engine name is acquired after the key corresponding to the commercial test engine is acquired. Because a commercial test engine is usually developed inside an enterprise, in order to prevent leakage of enterprise data, the correct engine name is usually released only after the corresponding key has been obtained.
Illustratively, the test schedule contains information such as the test type, test name, and test content. A test schedule may contain test tasks for one or more different test engines. The field corresponding to the test type in the test schedule is parsed to determine whether the test engine for each test task is an open source test engine or a commercial test engine. If it is an open source test engine, the corresponding engine name can be acquired directly. If it is a commercial test engine, the flow jumps to a key-collection interface; after the key entered by the user is received, its correctness is checked, and the engine name is obtained only when the key is correct. A corresponding test container is then established according to the engine name to complete the test tasks in the test schedule.
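The engine-name resolution described above can be illustrated with a minimal sketch in Python. The schedule field names (test_type, engine_name, key) and the key_store lookup are assumptions for illustration, not the patent's actual data format:

```python
# Minimal sketch of engine-name resolution; field names and the
# key_store structure are illustrative assumptions.
def resolve_engine_name(task: dict, key_store: dict) -> str:
    """Return the engine name for one test task in the schedule."""
    if task["test_type"] == "open_source":
        # Open source engines expose their name directly.
        return task["engine_name"]
    if task["test_type"] == "commercial":
        # Commercial engines release the engine name only after the
        # user-supplied key is verified, to prevent data leakage.
        engine_name = key_store.get(task["key"])
        if engine_name is None:
            raise PermissionError("incorrect key for commercial test engine")
        return engine_name
    raise ValueError(f"unknown test type: {task['test_type']}")

schedule = {"tasks": [
    {"test_type": "open_source", "engine_name": "pytest"},
    {"test_type": "commercial", "key": "k-123"},
]}
key_store = {"k-123": "acme-perf-engine"}  # hypothetical key registry
names = [resolve_engine_name(t, key_store) for t in schedule["tasks"]]
```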
As another alternative implementation, after the engine name is acquired, it is matched against the engine names in the engine image file to judge whether an image address corresponding to the engine name exists in the engine image file; if the image address does not exist, the current test is terminated directly. Matching the acquired engine name against the names in the engine image file verifies whether the engine name provided in the test schedule is correct, which helps ensure that the engine name used in the subsequent steps is accurate and valid. In fig. 2, the engine registration management module contains the engine image file, and this module completes the acquisition of the image address.
For example, the test schedule may include two or more test engines, that is, two or more engine names. When at least one of the test engines is detected to be absent from the engine image file, the test is terminated immediately; this prevents an abrupt interruption during execution of the test tasks in the test schedule, which would add useless test steps and reduce test efficiency.
As a further alternative implementation, if the test is terminated because an engine name in the test schedule is not found in the engine image file, whether the engine image file has been updated can be detected for a period of time after the test is terminated. If update information for the engine image file is received, whether image addresses corresponding to all the engine names in the test schedule exist in the engine image file is detected; if they all exist, the current test is resumed and the step of retrieving the image address corresponding to the test engine from the preset engine image file according to the engine name is executed.
For example, the time from a test engine completing registration on the test platform to the corresponding image address appearing in the engine image file is predictable, so whether the engine image file has been updated can be detected within a preset period after the test is terminated; if there is an update, whether the corresponding image addresses exist is detected again according to the engine names in the test schedule.
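One way to express this terminate-and-resume behaviour is the sketch below, assuming the engine image file can be loaded as a name-to-address mapping and polled; the polling interval and timeout values are illustrative:

```python
import time

def lookup_addresses(names, image_file: dict):
    """Return image addresses for every engine name, or None if any
    name is missing (the current test is then terminated)."""
    if all(name in image_file for name in names):
        return {name: image_file[name] for name in names}
    return None

def await_registration(names, load_image_file, timeout_s=600, poll_s=15):
    """After termination, poll the engine image file for a preset
    period; resume once all engine names have image addresses."""
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        addresses = lookup_addresses(names, load_image_file())
        if addresses is not None:
            return addresses  # resume the current test
        time.sleep(poll_s)
    return None  # remain terminated
```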
Step S200: retrieving an image address corresponding to the test engine from a preset engine image file according to the engine name;
as an alternative implementation, the engine image file includes an engine name and at least one image address corresponding to the engine name.
Illustratively, when one engine name corresponds to multiple image addresses, an explicit selection criterion and priority are typically required. If one of the image addresses is marked as the default address, the default address is selected preferentially, which avoids repeated selection and ensures that a stable, verified image is used in most cases.
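A sketch of this default-address priority rule, assuming each engine name maps to a list of address records with an optional default flag (the record layout is an assumption):

```python
def select_image_address(address_records: list) -> str:
    """Prefer the record marked as default; otherwise fall back to
    the first registered address."""
    for record in address_records:
        if record.get("default"):
            return record["address"]
    return address_records[0]["address"]

engine_image_file = {
    "acme-perf-engine": [
        {"address": "registry.example.com/perf:1.0"},
        {"address": "registry.example.com/perf:1.2", "default": True},
    ],
}
addr = select_image_address(engine_image_file["acme-perf-engine"])
```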
Step S300: creating a test container corresponding to the test engine according to the image address, and adapting the test container to an interface of the test engine;
In this embodiment, after a test container corresponding to the test engine is created according to the image address, the test container is adapted to an interface of the test engine, so that the test cases executed in the test container can be adapted to the actual test engine.
As an alternative implementation, the code and data corresponding to the test container are downloaded according to the image address, and the container is created from the code and data. An interface of the test engine is then obtained according to the function of the test container, and the corresponding interface configuration is performed in the test container to complete the adaptation of the interface.
For example, one test engine may have multiple interfaces, each implementing a different function, so the appropriate test engine interface needs to be obtained according to the function of the test container. Once the test container is bound to the interface of the test engine, the test container can call functions of the test engine, such as executing test cases and obtaining test data.
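Using a container runtime, the create-and-adapt step might look like the following sketch. It assumes the Docker SDK for Python (docker-py) and a reachable Docker daemon; passing the engine's interface configuration as environment variables is one plausible way to perform the adaptation, not the patent's prescribed mechanism, and the registry name is hypothetical:

```python
import docker  # Docker SDK for Python (docker-py); assumes a reachable daemon

def create_test_container(image_repo: str, image_tag: str, interface_config: dict):
    """Pull the engine image (its code and data), start a container from it,
    and inject the interface configuration so the container is adapted to
    the matching test engine interface."""
    client = docker.from_env()
    client.images.pull(image_repo, tag=image_tag)  # download code and data
    return client.containers.run(
        f"{image_repo}:{image_tag}",
        detach=True,
        environment=interface_config,  # e.g. engine endpoint and port
    )

# Hypothetical usage: adapt a container to an interface-test engine.
container = create_test_container(
    "registry.example.com/interface-engine", "1.2",
    {"ENGINE_ENDPOINT": "http://engine.internal:8080", "ENGINE_MODE": "api"},
)
```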
Step S400: and calling the interface in the test container to execute the test case corresponding to the test schedule.
In this embodiment, after the test container completes the adaptation with the interface of the test engine, the interface of the test engine may be called when a test task is executed, so as to invoke the test-case execution function of the test engine and acquire and execute the test cases in the test schedule.
As an optional implementation, when a test case is executed, the test parameters in the test case and the expected results corresponding to the test parameters are first acquired, and the test parameters are run in the test container to obtain a test result. The test result is compared with the expected result to judge whether the current test passes or fails, and a test report is generated. Because the test cases are executed in the test container rather than directly on the test engine, their execution is not disturbed by other environmental factors, which ensures the accuracy of the test results.
For example, to further ensure the accuracy of the test results, two identical test containers may be created, the same test cases are executed on both containers at the same time, and the two execution results are compared to confirm that they agree; the execution result is then compared with the expected result to determine whether the current test passed or failed.
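A minimal sketch of this execute-and-judge step, including the optional cross-check on two identical containers; the RunFn callable stands in for whatever mechanism actually runs the parameters inside a container (an assumption for illustration):

```python
from typing import Any, Callable

RunFn = Callable[[dict], Any]  # runs test parameters inside one container

def judge_case(case: dict, run_a: RunFn, run_b: RunFn) -> dict:
    """Run the same case on two identical test containers, require the
    two results to agree, then compare against the expected result."""
    result_a = run_a(case["parameters"])
    result_b = run_b(case["parameters"])
    if result_a != result_b:
        return {"case": case["name"], "passed": False,
                "reason": "identical containers disagree"}
    return {"case": case["name"], "result": result_a,
            "expected": case["expected"],
            "passed": result_a == case["expected"]}

report = [judge_case(
    {"name": "login-ok", "parameters": {"user": "a"}, "expected": "OK"},
    run_a=lambda p: "OK", run_b=lambda p: "OK",  # stand-in executors
)]
```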
As another alternative implementation, executing test cases through test containers rather than directly through the test engines can effectively improve test efficiency. If multiple test engines were to execute on one test platform at the same time, the platform would need capabilities such as a load-balancing mechanism and a task scheduler, which places high requirements on the platform. Executing multiple test containers on the test platform at the same time does not require the platform to have these capabilities.
As another alternative implementation, one test case may need several test containers to complete: after the part of the test case corresponding to the first test container is executed, the second test container is called automatically to execute its part of the test case, and so on until the test case is complete. Because the test case is executed through test containers, the test container, and hence the test engine, can be switched at any time to meet different test requirements, which improves the flexibility of testing.
Illustratively, enterprise A needs to perform both interface testing and performance testing, and accesses in the test schedule a test container corresponding to the interface test and a test container corresponding to the performance test. After the interface test container has executed its test cases, the flow switches to the performance test container to execute the corresponding test cases.
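The container-switching flow for such a multi-container case can be sketched as a simple dispatch loop; the step layout, with each step naming its container, is an illustrative assumption:

```python
def run_chained_case(steps: list, containers: dict) -> list:
    """Execute a test case whose steps span several test containers,
    switching containers (and hence test engines) between steps."""
    results = []
    for step in steps:
        run = containers[step["container"]]  # e.g. "interface", "performance"
        results.append(run(step["parameters"]))
    return results

results = run_chained_case(
    steps=[
        {"container": "interface", "parameters": {"case": "open-account-api"}},
        {"container": "performance", "parameters": {"case": "open-account-load"}},
    ],
    containers={"interface": lambda p: "OK", "performance": lambda p: "OK"},
)
```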
In this embodiment, the engine name of the test engine is determined through the test schedule, and the image address corresponding to the test engine is retrieved from the preset engine image file according to the engine name. A test container is created according to the image address, the successfully created test container is adapted to an interface of the test engine, and the interface is called in the test container to execute the test cases corresponding to the test schedule. The test platform is provided with multiple test containers, and the test containers provide isolated execution environments, so test cases can be executed in parallel in different containers, which improves test efficiency.
Example 2
Based on the first embodiment, another embodiment of the present application is provided. Referring to fig. 3, after the step of calling the interface in the test container to execute the test case corresponding to the test schedule, the method includes the following steps:
step S500: stopping running the test container after the test of the test case is completed;
step S600: setting a label for the test container according to the attribute of the test container;
step S700: and according to the label, defining a storage area of the test container, and storing the test container.
In this embodiment, after the test container completes execution of the test cases, the test container may be stored, preserving the execution environment and related data of the test cases for subsequent use and review. The test container may be regarded as a reusable test environment containing the execution code, dependencies, configuration information, and so on of the test cases. By storing test containers in a suitable storage medium or container repository, they can be used later: when the same test case needs to be run again, it can be loaded and executed directly from the stored test container without recreating and configuring the test environment, which saves time and resources and improves test efficiency.
As an alternative implementation, a label is set for the test container according to the attributes of the test container, a storage area of the test container is defined according to the label, and the test container is stored. The attributes of the test container may be any properties or metadata associated with it, such as the name of the test case, the configuration of the test environment, or the type of the test data.
Illustratively, the attributes of the test container are first acquired, and the test container is labeled according to these attributes; a label may be any identifier or key used to classify and identify test containers. After the label is set, the test container can be classified into the corresponding storage area according to the label, to facilitate subsequent retrieval and management. Test containers stored for the long term also need periodic maintenance and updates to keep them consistent with the latest test code and dependencies. In addition, the stored test containers need to be managed and monitored so that problems are discovered and adjustments are made in time.
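The labelling-and-storage step might be organised as below; the attribute names, label format, and storage-area mapping are all assumptions for illustration:

```python
def label_for(attrs: dict) -> str:
    """Build a label from container attributes such as the test case
    name, environment configuration, and test data type."""
    return "/".join([attrs.get("case_name", "unknown"),
                     attrs.get("environment", "default"),
                     attrs.get("data_type", "generic")])

STORAGE_AREAS = {  # hypothetical mapping from data type to storage area
    "interface": "/containers/interface",
    "performance": "/containers/performance",
}

def store_container(container_id: str, attrs: dict, registry: dict) -> str:
    """Record the stopped container under its label, in the storage
    area selected by the container's data type."""
    label = label_for(attrs)
    area = STORAGE_AREAS.get(attrs.get("data_type"), "/containers/misc")
    registry[label] = {"container": container_id, "area": area}
    return label

registry: dict = {}
store_container("c-42", {"case_name": "login-ok", "environment": "staging",
                         "data_type": "interface"}, registry)
```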
In this embodiment, after the test container completes its test tasks, the test container is stored and managed, so that the next time the same test container is needed it can be called directly without being re-created, which improves test efficiency.
Example 3
Based on the above embodiments, another embodiment of the present application is provided. Referring to fig. 4, the step of creating a test container corresponding to the test engine according to the image address and adapting the test container to an interface of the test engine includes the following steps:
step S310: if the number of the test containers to be created is two or more, and the mirror image addresses of the test containers to be created are the same, creating a universal test container according to the mirror image addresses;
step S320: and completing the construction of the test container corresponding to the test engine according to the personalized requirements of each test container to be created and the universal test container, and adapting the test container to the interface of the test engine.
In this embodiment, when multiple test containers need to be created, a universal test container may be created according to what the test containers have in common and then customized according to the specific requirements of each test container to complete each of them. This saves the cost and time of creating multiple identical base environments and further improves test efficiency.
As an alternative implementation, when two or more test containers need to be created and the image addresses of these test containers are the same, a universal test container may be created from this shared image address and then customized according to the specific requirements of each test container.
Illustratively, a universal test container created from the image address may contain the common environment, dependencies, configuration information, and the like for multiple test cases. After the universal test container is created, each test container acquires its own characteristics and functions through customization steps such as configuring special environment variables and adding extra dependencies. Once a test container is created, the corresponding test cases are loaded and run: test scripts are executed, the system or application under test runs in the container, and test results are collected. The test result of each test container is verified and debugged during or after test execution to ensure that the test cases execute as expected.
Alternatively, when a universal test container is created according to what multiple test containers have in common, the image addresses of the individual test containers are not strictly required to be identical. The universal test container may be created from the shared test environment and dependencies and stored, so that it can be called directly the next time a test container is created.
As another alternative implementation, when a test container is created, the estimated use time of the universal test container can be predicted according to the software to be tested and the test environment, and whether the universal test container needs to be created is judged according to the estimated use time. If the estimated use time is greater than or equal to a preset time, the universal test container is created; if the estimated use time is less than the preset time, there is no need to create a universal test container, which saves storage space. Here, the estimated use time refers to how long the universal test container is expected to be used, and the preset time can be adjusted according to actual conditions. In addition, if the universal test container is predicted to be used many times, it is also created; if it would not be used again after the current test, or not again within the month, there is no need to create it.
Illustratively, assume that for an account-opening business process it is necessary to estimate how many accounts need to be opened during the test and how long each account takes to test. If each account is expected to take 10 minutes to test and 100 accounts will be tested, the estimated use time is 1000 minutes. Whether a universal test container needs to be created can then be judged from the estimated use time: if the preset time is 500 minutes, a universal test container is created for testing the account-opening business process, because the estimated use time exceeds the preset time.
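The decision rule in this example reduces to a single comparison. The sketch below reproduces the account-opening numbers above (10 minutes per account, 100 accounts, 500-minute preset time), which are the embodiment's illustrative figures:

```python
def should_create_universal_container(minutes_per_run: float,
                                      expected_runs: int,
                                      preset_minutes: float = 500) -> bool:
    """Create the universal test container only when the estimated
    use time reaches the preset time."""
    estimated_minutes = minutes_per_run * expected_runs
    return estimated_minutes >= preset_minutes

# Account-opening example: 10 min x 100 accounts = 1000 min >= 500 min.
assert should_create_universal_container(10, 100) is True
assert should_create_universal_container(10, 10) is False  # 100 < 500
```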
In this embodiment, creating the universal test container according to what the test containers to be created have in common reduces the cost and time of creating multiple identical base environments. After the test containers have been used, only the universal test container needs to be saved rather than each specific test container, which reduces the occupation of storage space.
Example 4
In an embodiment of the present application, an automated testing apparatus is provided.
Referring to fig. 5, fig. 5 is a schematic diagram of a terminal structure of a hardware running environment according to an embodiment of the present application.
As shown in fig. 5, the control terminal may include: a processor 1001, such as a CPU, a network interface 1003, a memory 1004, and a communication bus 1002. The communication bus 1002 is used to enable communication between these components. The network interface 1003 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 1004 may be a high-speed RAM memory or a non-volatile memory, such as a disk memory. The memory 1004 may optionally also be a storage device separate from the aforementioned processor 1001.
It will be appreciated by those skilled in the art that the terminal structure shown in fig. 5 is not limiting of the terminal and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components.
As shown in fig. 5, an operating system, a network communication module, and an automated test program may be included in the memory 1004, which is a type of computer storage medium.
In the automated test equipment hardware architecture shown in fig. 5, the processor 1001 may call an automated test program stored in the memory 1004 and perform the following operations:
receiving a test schedule, and acquiring an engine name of a test engine in the test schedule;
retrieving an image address corresponding to the test engine from a preset engine image file according to the engine name;
creating a test container corresponding to the test engine according to the image address, and adapting the test container to an interface of the test engine;
and calling the interface in the test container to execute the test case corresponding to the test schedule.
Optionally, the processor 1001 may call an automated test program stored in the memory 1004, and also perform the following operations:
analyzing the test schedule to obtain a test type;
judging whether the test engine is an open source test engine or a commercial test engine according to the test type;
if the engine is the open source test engine, acquiring an engine name corresponding to the open source test engine;
and if the engine is the commercial test engine, acquiring a key corresponding to the commercial test engine and then acquiring the engine name corresponding to the commercial test engine based on the key.
Optionally, the processor 1001 may call an automated test program stored in the memory 1004, and also perform the following operations:
matching the engine name with the engine name in the engine image file;
judging whether the image address corresponding to the engine name exists in the engine image file or not;
if the image address does not exist, the current test is terminated.
Optionally, the processor 1001 may call an automated test program stored in the memory 1004, and also perform the following operations:
after receiving update information of the engine image file, detecting whether an image address corresponding to the engine name exists in the engine image file;
if so, resuming the current test, and executing the step of retrieving the image address corresponding to the test engine from the preset engine image file according to the engine name.
Optionally, the processor 1001 may call an automated test program stored in the memory 1004, and also perform the following operations:
downloading the code and data corresponding to the test container according to the image address, and creating the test container from the code and data;
acquiring an interface of the test engine according to the function of the test container;
and performing the corresponding interface configuration in the test container to complete the adaptation of the interface.
Optionally, the processor 1001 may call an automated test program stored in the memory 1004, and also perform the following operations:
acquiring test parameters in the test case and expected results corresponding to the test parameters;
running the test parameters in the test container to obtain a test result;
and judging whether the current test passes or fails according to the test result and the expected result, and generating a test report.
Optionally, the processor 1001 may call an automated test program stored in the memory 1004, and also perform the following operations:
stopping running the test container after the test of the test case is completed;
setting a label for the test container according to the attribute of the test container;
and according to the label, defining a storage area of the test container, and storing the test container.
Optionally, the processor 1001 may call an automated test program stored in the memory 1004, and also perform the following operations:
if two or more test containers are to be created and the image addresses of the test containers to be created are the same, creating a universal test container according to the image address;
and completing the construction of the test container corresponding to the test engine according to the personalized requirements of each test container to be created and the universal test container.
In addition, in order to achieve the above objective, an embodiment of the present invention further provides a terminal device, including a memory, a processor, and an automated test program stored in the memory and capable of running on the processor, where the processor implements the automated test method as described above when executing the automated test program.
In addition, in order to achieve the above object, an embodiment of the present invention further provides a computer-readable storage medium having stored thereon an automated test program which, when executed by a processor, implements the automated test method as described above.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be noted that in the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second and third, et cetera do not indicate any ordering. These words may be interpreted as names.
While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present application without departing from the spirit or scope of the invention. Thus, if such modifications and variations of the present application fall within the scope of the claims and the equivalents thereof, the present application is intended to cover such modifications and variations.

Claims (7)

1. An automated test method, comprising the steps of:
receiving a test schedule, and acquiring an engine name of a test engine in the test schedule;
retrieving an image address corresponding to the test engine from a preset engine image file according to the engine name, wherein the engine image file comprises the engine name and at least one image address corresponding to the engine name, and when one engine name corresponds to a plurality of image addresses, taking a default address as the image address corresponding to the test engine;
downloading the code and data corresponding to the test container according to the image address, and creating the test container from the code and data; acquiring an interface of the test engine according to the function of the test container; performing the corresponding interface configuration in the test container to complete the adaptation of the interface;
if two or more test containers are to be created and the image addresses of the test containers to be created are the same, creating a universal test container according to the image address, completing the construction of the test containers corresponding to the test engine according to the personalized requirements of each test container to be created and the universal test container, and adapting the test containers to interfaces of the test engine; wherein, when the universal test container is created according to what a plurality of test containers have in common, the universal test container is created according to a common test environment and dependencies; when a test container is created, the estimated use time of the universal test container is predicted according to the software to be tested and the test environment, and whether the universal test container needs to be created is judged according to the estimated use time: if the estimated use time is greater than or equal to a preset time, the universal test container is created; if the estimated use time is less than the preset time, the universal test container does not need to be created;
calling the interface in the test container to acquire test parameters in the test case and expected results corresponding to the test parameters;
running the test parameters in the test container to obtain a test result;
and judging whether the current test passes or fails according to the test result and the expected result, and generating a test report, wherein two identical test containers are created, identical test cases are executed on the two test containers at the same time, and the test results from the two identical test containers are compared with each other and with the expected result to judge whether the current test passes or fails.
2. The automated test method of claim 1, wherein the step of receiving a test schedule and obtaining an engine name of a test engine in the test schedule comprises:
analyzing the test schedule to obtain a test type;
judging whether the test engine is an open source test engine or a commercial test engine according to the test type;
if the engine is the open source test engine, acquiring an engine name corresponding to the open source test engine;
and if the engine is the commercial test engine, acquiring a key corresponding to the commercial test engine and then acquiring the engine name corresponding to the commercial test engine based on the key.
3. The automated test method of claim 1, wherein after the step of receiving a test schedule and obtaining an engine name for a test engine in the test schedule, further comprising:
matching the engine name with the engine name in the engine image file;
judging whether the image address corresponding to the engine name exists in the engine image file or not;
if the image address does not exist, the current test is terminated.
4. The automated test method of claim 3, wherein after the step of terminating the current test if the image address does not exist, the method comprises:
after receiving update information of the engine image file, detecting whether an image address corresponding to the engine name exists in the engine image file;
if so, resuming the current test, and executing the step of retrieving the image address corresponding to the test engine from the preset engine image file according to the engine name.
5. The automated test method of claim 1, wherein the automated test method further comprises:
stopping running the test container after the test of the test case is completed;
setting a label for the test container according to the attribute of the test container;
and according to the label, defining a storage area of the test container, and storing the test container.
6. A terminal device comprising a memory, a processor and an automated test program stored on the memory and executable on the processor, the processor implementing the method of any of claims 1-5 when executing the automated test program.
7. A computer readable storage medium, characterized in that the computer readable storage medium has stored thereon an automated test program, which when executed by a processor, implements the method of any of claims 1-5.
CN202311757111.2A 2023-12-20 2023-12-20 Automatic test method, terminal equipment and computer readable storage medium Active CN117435510B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311757111.2A CN117435510B (en) 2023-12-20 2023-12-20 Automatic test method, terminal equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311757111.2A CN117435510B (en) 2023-12-20 2023-12-20 Automatic test method, terminal equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN117435510A CN117435510A (en) 2024-01-23
CN117435510B true CN117435510B (en) 2024-04-02

Family

ID=89553889

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311757111.2A Active CN117435510B (en) 2023-12-20 2023-12-20 Automatic test method, terminal equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN117435510B (en)


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220092136A1 (en) * 2020-03-12 2022-03-24 Sony Interactive Entertainment LLC Search engine optimization test tool

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6662312B1 (en) * 2000-06-30 2003-12-09 Qwest Communications International Inc. Software-testing automation system
CN109101269A (en) * 2018-08-30 2018-12-28 长沙软工信息科技有限公司 A kind of Software Delivery method and device, electronic equipment
CN109359045A (en) * 2018-10-16 2019-02-19 武汉斗鱼网络科技有限公司 A kind of test method, device, equipment and storage medium
CN111290936A (en) * 2018-12-07 2020-06-16 北京奇虎科技有限公司 Interface testing method and device
CN110399307A (en) * 2019-07-31 2019-11-01 网宿科技股份有限公司 A kind of test method, test platform and destination server
CN111930361A (en) * 2020-07-16 2020-11-13 东云睿连(武汉)计算技术有限公司 Artificial intelligence operation mirror image templating construction method and device
CN114443114A (en) * 2020-11-02 2022-05-06 浙江省公众信息产业有限公司 Application issuing method, system and storage medium
CN113360825A (en) * 2021-07-26 2021-09-07 深圳平安智汇企业信息管理有限公司 WebUI automatic test method and device, electronic equipment and storage medium
WO2023066245A1 (en) * 2021-10-18 2023-04-27 中兴通讯股份有限公司 Container engine, container engine implementation methods, electronic device and storage medium
CN115994079A (en) * 2021-10-19 2023-04-21 腾讯科技(深圳)有限公司 Test method, test device, electronic apparatus, storage medium, and program product

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Design and Implementation of a Cross-Platform Mobile Automated Testing System; Wang Jun et al.; Techniques of Automation and Applications; 2017-10-25 (No. 10); pp. 72-76 *
Design of a Web Test System for a Docker-Based Time-Difference-Method Flow Monitoring Device; Fu Weijie et al.; Electronic Design Engineering; 2018-02-05 (No. 03); pp. 21-24, 29 *

Also Published As

Publication number Publication date
CN117435510A (en) 2024-01-23


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant