CN115587028A - Interface automation test method, system, medium and terminal - Google Patents


Info

Publication number
CN115587028A
Authority
CN
China
Prior art keywords: test, interface, case, script, performance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211191860.9A
Other languages
Chinese (zh)
Inventor
张伟明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jiufangyun Intelligent Technology Co ltd
Original Assignee
Shanghai Jiufangyun Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jiufangyun Intelligent Technology Co ltd filed Critical Shanghai Jiufangyun Intelligent Technology Co ltd
Priority to CN202211191860.9A
Publication of CN115587028A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3668: Software testing
    • G06F 11/3672: Test management
    • G06F 11/3684: Test management for test design, e.g. generating new test cases
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3668: Software testing
    • G06F 11/3672: Test management
    • G06F 11/3688: Test management for test execution, e.g. scheduling of test suites

Abstract

The invention provides an interface automation test method, system, medium and terminal, comprising the following steps. An interface function test step: before each execution of the test case corresponding to a test case identifier, the parameter information of the case to be tested is acquired according to the associated case information, a test case for the interface to be tested is generated from the acquired parameter information and a case template of the interface to be tested, and the generated test case is used to perform a functional test on the interface that changes in real time. An interface performance test step: a performance test instruction sent by a tester is received, the test case identifier is used together with a performance test template to generate a performance test script, the script is transferred into a container, and the container runs the script to perform the performance test. Whether the interface performance test step needs to be performed is determined by the result of the interface function test. The method is self-driven during the test process and greatly reduces the time spent repeatedly writing test case content and performance test scripts.

Description

Interface automation test method, system, medium and terminal
Technical Field
The invention relates to the field of software interface testing, and in particular to an interface automation test method, system, medium and terminal.
Background
At present, with the continuous development of computer technology and the growing variety of services offered by internet finance companies, system complexity keeps rising, the number of interfaces keeps growing, and the demand for automated testing increases accordingly.
Testing is generally carried out with dedicated test tools. In terms of technical architecture, existing test tools fall into commercial test tools, open-source test frameworks and lightweight test scripts written by testers; in terms of function, they fall into functional test tools, performance test tools and security test tools; in terms of operating platform, they are mainly divided into Windows-based tools, Unix/Linux-based tools and the like; and from the perspective of the system under test, they are classified into web test tools, message test tools and so on.
However, during testing, a given interface sometimes requires a functional test followed by a performance test. For the functional test, as the complexity of the software system keeps increasing, the one-sidedness and hysteresis of traditional functional testing cause the testing cost to rise rapidly and the testing efficiency to drop sharply, and functional testing alone can hardly guarantee the quality and schedule of a project. For the subsequent performance test, the functional test cases cannot be reused, so considerable time is spent writing test cases and performance test scripts, which lowers the efficiency of interface testing.
Chinese patent publication No. CN107797917B discloses a performance test script generation method and apparatus. The method comprises: acquiring code data corresponding to a functional test flow and functional test data generated in the functional test; generating header information of a performance test script template according to the code data corresponding to the functional test flow and preset request system information; generating request information of the performance test script template according to the code data corresponding to the functional test flow and the functional test data generated in the functional test; and generating a performance test script according to the header information and the request information of the performance test script template.
Disclosure of Invention
In view of the defects in the prior art, the object of the present invention is to provide an interface automation test method, system, medium and terminal.
The interface automation test method provided by the invention comprises the following steps:
an interface function test step: before each execution of the test case corresponding to a test case identifier, acquiring the parameter information of the case to be tested according to the associated case information, generating a test case for the interface to be tested from the acquired parameter information and a case template of the interface to be tested, and using the generated test case to perform a functional test on the interface that changes in real time;
an interface performance test step: receiving a performance test instruction, using the test case identifier together with a performance test template to generate a performance test script, transferring the script into a container, and running the script in the container to perform the performance test;
whether the interface performance test step needs to be performed is determined by the result of the interface function test.
Preferably, the interface function testing step comprises the sub-steps of:
step S1.1: creating an interface function test driving instruction;
receiving test case configuration information input by a user through the client, acquiring the unique identifier of the current test case according to the function test case configuration information, extracting and adjusting the associated parameters in the preceding test case, and generating the interface function test driving instruction;
step S1.2: monitoring an interface function test driving instruction;
the generated interface function test driving instruction is sent to a server side through a client side, the interface function test driving instruction is monitored in real time through the server side, and the monitored interface function test driving instruction is stored in a first storage medium;
step S1.3: analyzing and filtering the interface function test driving instruction;
extracting and analyzing data from the interface function test driving instruction through the server, automatically classifying and filtering the interface function test driving instruction, generating a test case and a unique identifier from a case template, and storing the test case and the unique identifier in a second storage medium;
step S1.4: driving a test case to perform interface function test through an interface function test driving instruction;
extracting test information in the test case of the interface to be tested according to the interface function test driving instruction through the server side for packaging, and testing the interface function of the interface to be tested according to the packaged test information;
step S1.5: automatically generating a test report, and performing analysis and statistics;
transmitting the test information to the client through the server in real time, analyzing the test information and compiling statistics, sending the analysis and statistics results to the client after the test is finished, and feeding the test results back to the user through the client.
Preferably, the interface performance testing step comprises the sub-steps of:
step S2.1: creating an interface performance test driving instruction;
acquiring a user instruction through a client to enter an interface performance test configuration page, acquiring equipment configuration performance parameters input by a user, and creating an interface performance test driving instruction;
step S2.2: monitoring an interface performance test driving instruction;
receiving an interface performance test driving instruction through the server, monitoring the interface performance test driving instruction in real time, and automatically binding the monitored interface performance test driving instruction with a test case of the interface to be tested through the server;
step S2.3: automatically generating an interface performance test script;
automatically retrieving, through the server, the interface performance test driving instruction together with the performance configuration and interface test configuration in the test case of the interface to be tested, and calling a built-in method of a performance test script template to obtain the corresponding interface performance test script, wherein the performance test script template describes a performance test business scenario;
step S2.4: performing performance test on the interface to be tested through the containerized interface performance test script;
the generated interface performance test script is uploaded to the bridging end through the server end, the interface performance test script is embedded into the container end through the bridging end, and then the container end automatically runs the interface performance test script to perform performance test on the interface to be tested;
step S2.5: automatically generating a performance test report;
and transmitting the interface performance test information to the server side through the container side to automatically generate a performance test report, and receiving the performance test report through the client side and displaying the performance test report to a user.
Preferably, said step S1.1 comprises the following sub-steps:
step S1.1.1: recording the creation time of the test driving instruction, and recording the creation time in the case configuration information;
step S1.1.2: using the case ID and the creation time as the unique identifier of the preceding test case;
step S1.1.3: judging whether a preceding test case exists, and if so, extracting and adjusting the associated parameters in the preceding test case.
Preferably, said step S1.3 comprises the following sub-steps:
step S1.3.1: analyzing, through the server, the obtained interface function test driving instruction to determine whether the instruction is duplicated, valid and executable;
step S1.3.2: filtering the interface function test driving instruction according to the analysis result by the server side, and filtering out repeated, invalid and unexecutable instructions;
step S1.3.3: and generating a dynamic test case with a unique identifier by using the case template through the server, and storing the dynamic test case into a second storage medium.
Preferably, said step S1.4 comprises the following sub-steps:
step S1.4.1: when the server side monitors that the dynamic test case is stored in a second storage medium, a task ID is generated through the server side, and test information in the test case of the interface to be tested is extracted and packaged after the task ID is generated;
step S1.4.2: extracting case parameters during packaging, converting the parameters into dynamic name spaces, acquiring corresponding values according to the dynamic name spaces, assigning values to the dynamic name spaces, and then executing cases;
step S1.4.3: before executing the case, storing the original case in a history table, and updating the current case to obtain a completely new executable case, which is then executed;
step S1.4.4: after the case is executed, updating the current case state and clearing the consumed instruction and the task ID.
Preferably, said step S2.4 comprises the following sub-steps:
s2.4.1, generating a script master and a script worker through a server, uploading the script master to a bridge terminal, detecting the updating information of the script master through the bridge terminal, generating a master container terminal by the script master and a basic container terminal, and automatically starting the script master in the master container terminal;
s2.4.2, uploading the script worker to a bridge terminal through a server, detecting update information of the script worker through the bridge terminal, generating a worker container terminal by the script worker and a basic container terminal, and automatically starting the script worker in the worker container terminal;
and S2.4.3, automatically popping up a performance test monitoring page through the client to display the interface performance details after all the container ends are ready.
The invention provides an interface automation test system, which comprises:
interface function test module: before each execution of the test case corresponding to a test case identifier, acquiring the parameter information of the case to be tested according to the associated case information, generating a test case for the interface to be tested from the acquired parameter information and a case template of the interface to be tested, and using the generated test case to perform a functional test on the interface that changes in real time;
interface performance test module: receiving a performance test instruction, using the test case identifier together with a performance test template to generate a performance test script, transferring the script into a container, and running the script in the container to perform the performance test;
whether the interface performance test module is executed is determined by the result of the interface function test;
the interface function testing module comprises the following sub-modules:
module M1.1: creating an interface function test driving instruction;
receiving test case configuration information input by a user through the client, acquiring the unique identifier of the current test case according to the function test case configuration information, extracting and adjusting the associated parameters in the preceding test case, and generating the interface function test driving instruction;
module M1.2: monitoring an interface function test driving instruction;
the generated interface function test driving instruction is sent to a server side through a client side, the interface function test driving instruction is monitored in real time through the server side, and the monitored interface function test driving instruction is stored in a first storage medium;
module M1.3: analyzing and filtering the interface function test driving instruction;
extracting and analyzing data from the interface function test driving instruction through the server, automatically classifying and filtering the interface function test driving instruction, generating a test case and a unique identifier from a case template, and storing the test case and the unique identifier in a second storage medium;
module M1.4: driving a test case to perform interface function test through an interface function test driving instruction;
extracting test information in a test case of the interface to be tested according to the interface function test driving instruction through the server side for packaging, and performing interface function test on the interface to be tested according to the packaged test information;
module M1.5: automatically generating a test report, and performing analysis and statistics;
transmitting the test information to the client through the server in real time, analyzing the test information and compiling statistics, sending the analysis and statistics results to the client after the test is finished, and feeding the test results back to the user through the client;
the interface performance testing module comprises the following sub-modules:
module M2.1: creating an interface performance test driving instruction;
acquiring a user instruction through a client to enter an interface performance test configuration page, acquiring equipment configuration performance parameters input by a user, and creating an interface performance test driving instruction;
module M2.2: monitoring a driving instruction for interface performance test;
receiving an interface performance test driving instruction through the server, monitoring the interface performance test driving instruction in real time, and automatically binding the monitored interface performance test driving instruction with a test case of the interface to be tested through the server;
module M2.3: automatically generating an interface performance test script;
automatically retrieving, through the server, the interface performance test driving instruction together with the performance configuration and interface test configuration in the test case of the interface to be tested, and calling a built-in method of a performance test script template to obtain the corresponding interface performance test script, wherein the performance test script template describes a performance test business scenario;
module M2.4: performing performance test on the interface to be tested through the containerized interface performance test script;
the generated interface performance test script is uploaded to a bridging end through the server end, the interface performance test script is embedded into the container end through the bridging end, and then the container end automatically runs the interface performance test script to perform performance test on the interface to be tested;
module M2.5: automatically generating a performance test report;
the interface performance test information is transmitted to the server side through the container side to automatically generate a performance test report, and the performance test report is received through the client side and displayed to a user;
the module M1.1 comprises the following sub-modules:
s1.1.1: recording the creation time of the test driving instruction, and recording the creation time in the case configuration information;
s1.1.2: using the case ID and the creation time as the unique identifier of the preceding test case;
s1.1.3: judging whether a preceding test case exists, and if so, extracting and adjusting the associated parameters in the preceding test case;
the module M1.3 comprises the following sub-modules:
module m1.3.1: analyzing, through the server, the obtained interface function test driving instruction to determine whether the instruction is duplicated, valid and executable;
module m1.3.2: filtering the interface function test driving instruction according to the analysis result by the server side, and filtering out repeated, invalid and unexecutable instructions;
module m1.3.3: generating a dynamic test case with a unique identifier by using a case template through a server, and storing the dynamic test case into a second storage medium;
the module M1.4 comprises the following sub-modules:
module m1.4.1: when the server side monitors that the dynamic test case is stored in a second storage medium, a task ID is generated through the server side, and test information in the test case of the interface to be tested is extracted and packaged after the task ID is generated;
module m1.4.2: extracting case parameters during packaging, converting the parameters into a dynamic name space, acquiring corresponding values according to the dynamic name space, assigning values to the dynamic name space, and then executing a case;
module m1.4.3: before executing the case, storing the original case in a history table, and updating the current case to obtain a completely new executable case, which is then executed;
module m1.4.4: after the case is executed, updating the current case state and then clearing the consumed instruction and the task ID;
the module M2.4 comprises the following sub-modules:
the module M2.4.1 generates a script master and a script worker through the server, uploads the script master to the bridge terminal, detects the update information of the script master through the bridge terminal, generates a master container terminal from the script master and a basic container terminal, and automatically starts the script master in the master container terminal;
a module M2.4.2 uploads the script worker to a bridge terminal through a server terminal, detects update information of the script worker through the bridge terminal, generates a worker container terminal by the script worker and a basic container terminal, and automatically starts the script worker in the worker container terminal;
and the module M2.4.3 automatically pops up a performance test monitoring page through the client to display the interface performance details after all the container ends are ready.
According to the present invention, a computer-readable storage medium is provided, in which a computer program is stored, and the computer program, when executed by a processor, implements the above-mentioned interface automated testing method.
The interface automation test terminal provided by the invention adopts the above interface automation test method when testing the interface to be tested.
Compared with the prior art, the invention has the following beneficial effects:
1. The interface automation test method provided by the invention is self-driven during the test process and greatly reduces the time spent repeatedly writing test case content and performance test scripts, thereby improving the efficiency of automated, test-case-driven testing of the interface to be tested.
2. The invention automatically updates the original case before and after each execution of a test case to obtain a completely new test case, giving the test cases a high degree of reusability.
3. The invention automatically deploys containerized run scripts, which greatly improves test efficiency.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments with reference to the following drawings:
FIG. 1 is a flow chart of an automated interface testing method according to an embodiment of the present invention;
FIG. 2 is a flow chart of an interface automation test platform according to an embodiment of the present invention;
FIG. 3 is a block diagram of an automated interface test system according to an embodiment of the present invention.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will help those skilled in the art further understand the present invention, but are not intended to limit the invention in any manner. It should be noted that variations and modifications can be made by persons skilled in the art without departing from the concept of the invention, and all of these fall within the scope of the invention.
The invention provides an interface automation test method which, with reference to fig. 1 and 2, comprises the following steps. Interface function testing: before each execution of the test case corresponding to a test case identifier, the parameter information of the case to be tested is acquired according to the associated case information, the test case being used to cover the associated case; a test case for the interface to be tested is generated from the acquired parameter information and a case template of the interface to be tested; and the generated test case is used to perform a functional test on the interface that changes in real time.
After the function test of the interface to be tested is completed, the result is analyzed, and a visual test report is generated according to a chart template and fed back to the tester; the result indicates whether an interface performance test is required.
Interface performance testing: a performance test instruction sent by a tester is received; after the performance test instruction is received, the test case identifier is used together with the performance test template to generate a performance test script, the script is transferred into a container by conventional means, and the container runs the script to carry out the performance test. The server monitors data in real time and across multiple dimensions so that testers can check reports in real time, which helps locate performance bottlenecks from every angle. The whole process is completely self-driven and greatly reduces the time spent repeatedly writing test case content and performance test scripts, thereby improving the efficiency of automated, test-case-driven testing of the interface to be tested.
The method for automatically testing the interface introduced by the present invention is further described below.
The interface function testing step comprises the following substeps:
step S1.1: and creating an interface function test driving command.
An interface test configuration page is displayed in the client, and a user can input test case configuration information in the interface test configuration page for generating test cases, batch testing and test case warehousing, or generating necessary parameters of a case template. And clicking a configuration completion button in the interface test configuration page by the user through the input equipment of the client. When detecting that a configuration completion button in an interface test configuration page is clicked, a client acquires test case configuration information input in the interface test configuration page, acquires a unique identifier of a current test case according to the function test case configuration information, extracts and adjusts associated parameters in a front test case, and generates an interface function test driving instruction.
Step S1.1 comprises the following substeps:
s1.1.1: recording the creation time of the test driving instruction, and recording the creation time in the case configuration information;
s1.1.2: using the case ID and the creation time as the unique identifier of the preceding test case;
s1.1.3: judging whether a preceding test case exists, and if so, extracting and adjusting the associated parameters in the preceding test case.
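By way of non-limiting illustration only, step S1.1 may be sketched as follows, assuming a simple in-memory representation; the class name DriveInstruction, the field names and the timestamp format are assumptions of this sketch and are not prescribed by the disclosure.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class DriveInstruction:
    """Illustrative interface function test driving instruction (step S1.1)."""
    case_id: str
    created_at: str
    config: dict
    associated_params: dict = field(default_factory=dict)

    @property
    def unique_id(self) -> str:
        # Case ID plus creation time form the unique identifier (s1.1.2).
        return f"{self.case_id}-{self.created_at}"

def create_drive_instruction(case_config: dict,
                             previous_case: Optional[dict] = None) -> DriveInstruction:
    created_at = datetime.now().strftime("%Y%m%d%H%M%S")   # s1.1.1: record the creation time
    params = {}
    if previous_case is not None:                           # s1.1.3: a preceding test case exists
        # Extract and adjust its associated parameters, e.g. carry a token
        # or record ID produced by the preceding case into the new case.
        params = dict(previous_case.get("output_params", {}))
    return DriveInstruction(case_id=case_config["case_id"],
                            created_at=created_at,
                            config=case_config,
                            associated_params=params)
```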
Step S1.2: monitoring the interface function test driving instruction.
The generated interface function test driving instruction is sent to the server through the client, and the server monitors the interface function test driving instruction in real time; the test driving instruction carries the unique identifier, and the server stores the monitored interface function test driving instruction in the first storage medium.
Step S1.3: analyzing and filtering the interface function test driving instruction.
The interface function test driving instruction comprises test case data (test case parameters) such as the test environment and interface information. The server extracts key data from the interface function test driving instruction and analyzes them intelligently, automatically classifies and filters the instruction, generates a test case and a unique identifier (the test case identifier) from the case template, and stores them in the second storage medium.
Step S1.3 comprises the following substeps:
s1.3.1: analyzing, through the server, the obtained interface function test driving instruction to determine whether the instruction is duplicated, valid and executable;
s1.3.2: filtering the interface function test driving instruction according to the analysis result by the server side, and filtering out repeated, invalid and unexecutable instructions;
s1.3.3: and generating a dynamic test case with a unique identifier by using the case template through the server, and storing the dynamic test case into a second storage medium.
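A minimal sketch of the analysis and filtering of step S1.3 is given below, under the assumption that the set of already-seen identifiers and the second storage medium are represented here by an in-memory set and dictionary; all names are illustrative only.

```python
def analyze_and_filter(instructions, seen_ids: set, case_template: dict, second_storage: dict) -> None:
    """Drop duplicated or invalid instructions and build dynamic test cases (step S1.3)."""
    for instr in instructions:
        if instr.unique_id in seen_ids:              # s1.3.1/s1.3.2: duplicated instruction, drop it
            continue
        if not instr.config.get("interface_url"):    # s1.3.1/s1.3.2: invalid or not executable, drop it
            continue
        seen_ids.add(instr.unique_id)
        # s1.3.3: fill the case template with the data carried by the instruction
        # to obtain a dynamic test case with a unique identifier.
        test_case = dict(case_template)
        test_case.update(instr.config)
        test_case.update(instr.associated_params)
        test_case["case_uid"] = instr.unique_id
        second_storage[instr.unique_id] = test_case
```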
Step S1.4: driving the test case to perform the interface function test through the interface function test driving instruction.
The server extracts the test information in the test case of the interface to be tested according to the interface function test driving instruction and packages it, and the function of the interface to be tested is tested according to the packaged test information.
Step S1.4 comprises the following substeps:
s1.4.1: when the server side monitors that the dynamic test case is stored in a second storage medium, a task ID is generated through the server side, and test information in the test case of the interface to be tested is extracted and packaged after the task ID is generated;
s1.4.2: extracting case parameters during packaging, converting the parameters into a dynamic name space, acquiring corresponding values according to the dynamic name space, assigning values to the dynamic name space, and then executing a case;
s1.4.3: before executing the case, storing the original case in a history table, and updating the current case to obtain a completely new executable case, which is then executed;
s1.4.4: after the case is executed, updating the current case state and clearing the consumed instruction and the task ID.
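The parameter handling of step S1.4 can be sketched as follows, assuming that the dynamic namespace is a dictionary, that placeholders in the case take the hypothetical form ${name}, and that the requests HTTP client is used to call the interface; these choices and the field names are assumptions of this sketch.

```python
import copy
import re

import requests  # assumed HTTP client; any equivalent client would do

def run_case(test_case: dict, namespace: dict, history: list) -> dict:
    """Resolve ${name} placeholders from the dynamic namespace and execute the case (step S1.4)."""
    history.append(copy.deepcopy(test_case))            # s1.4.3: keep the original case in a history table

    def resolve(value):
        if isinstance(value, str):
            # s1.4.2: replace ${name} with the value bound in the dynamic namespace
            return re.sub(r"\$\{(\w+)\}", lambda m: str(namespace.get(m.group(1), "")), value)
        return value

    executable = {key: resolve(val) for key, val in test_case.items()}   # the completely new executable case
    response = requests.request(executable.get("method", "GET"),
                                executable["interface_url"],
                                json=executable.get("body"),
                                timeout=10)
    test_case["status"] = "executed"                     # s1.4.4: update the current case state
    return {"case_uid": executable.get("case_uid"), "http_status": response.status_code}
```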
Step S1.5: automatically generating a test report, and performing analysis and statistics.
The server creates a report object through a built-in static class, and the interface test information is continuously passed to the report object. The report object automatically generates a test report through a preset report model, and the server transmits the test report to the client in real time. The client receives the test report in real time and renders a visual chart in real time; the user is automatically taken to the visual chart page and can observe the details of the interface test process in real time.
During the test, the server continuously transmits the test information to the client while continuously analyzing the results and compiling statistics. After the test is finished, the analysis and statistics results are sent to the client, and the test results are fed back to the user through the report statistics page of the client.
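Purely as an illustrative sketch of the analysis and statistics of step S1.5, and assuming each per-case result carries hypothetical passed and elapsed_ms fields, the figures a report page might show could be aggregated as follows.

```python
from statistics import mean

def summarize(results: list) -> dict:
    """Aggregate per-case results into report-level statistics (step S1.5)."""
    total = len(results)
    passed = sum(1 for r in results if r.get("passed"))
    return {
        "total": total,
        "passed": passed,
        "pass_rate": passed / total if total else 0.0,
        "avg_response_ms": mean(r.get("elapsed_ms", 0) for r in results) if total else 0.0,
    }
```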
In the interface function test step, test efficiency is improved by reusing functional test cases. Although existing technical solutions also reuse test cases, they share the same shortcoming: test cases cannot be reused for interfaces whose input parameters vary greatly, so the existing solutions cannot be applied to interfaces that change in real time. The method disclosed by the invention automatically updates the original case before and after each execution of a test case to obtain a completely new test case, which overcomes this shortcoming; the disclosed solution therefore offers a high degree of test case reusability.
Referring to fig. 3, the interface performance testing step includes the sub-steps of:
step S2.1: and creating an interface performance test driving instruction.
And a report statistical page with interface function test results is displayed in the client, and a user can click an interface performance test configuration button through input equipment of the client in the report statistical page. When the client detects that the interface performance test configuration button is clicked, the client enters an interface performance test configuration page to acquire equipment configuration performance parameters input by a user, wherein the performance parameters refer to concurrency, resources, speed and the like and are used for generating performance instructions and controlling the performance test to run and stop, and the parameters are indexes used for calculating the interface performance in the report. After the configuration is completed, a user can click a performance execution button on the current page, and when the client detects that the performance execution button is clicked, an interface performance test driving instruction is created.
Step S2.2: monitoring the interface performance test driving instruction.
The server receives the interface performance test driving instruction and monitors it in real time; the performance test driving instruction carries a unique identifier, and the server automatically binds the monitored interface performance test driving instruction to the test case of the interface to be tested.
Step S2.3: automatically generating an interface performance test script.
This step integrates the interface performance test driving instruction, the test case of the interface to be tested and the performance test script template, where the test case of the interface to be tested is the test case used to test that interface and the performance test script template describes a performance test business scenario. The server automatically retrieves the interface performance test driving instruction together with the performance configuration and interface test configuration in the test case of the interface to be tested, and then calls a built-in method of the performance test script template to obtain the corresponding interface performance test script.
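The disclosure does not name a particular load-testing tool; purely for illustration, and as an assumption of this sketch, the following code renders a Locust-style performance script from a string template by merging the interface test configuration with the performance configuration, in the spirit of step S2.3. The template fields (host, path, method, body, wait times) are illustrative.

```python
SCRIPT_TEMPLATE = '''\
from locust import HttpUser, task, between

class InterfaceUser(HttpUser):
    host = "{host}"
    wait_time = between({min_wait}, {max_wait})

    @task
    def call_interface(self):
        self.client.request("{method}", "{path}", json={body})
'''

def build_performance_script(test_case: dict, perf_config: dict) -> str:
    """Merge interface test configuration and performance configuration into a script (step S2.3)."""
    return SCRIPT_TEMPLATE.format(
        host=test_case["host"],
        path=test_case["path"],
        method=test_case.get("method", "GET"),
        body=test_case.get("body", {}),
        min_wait=perf_config.get("min_wait", 1),
        max_wait=perf_config.get("max_wait", 2),
    )
```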
Step S2.4: performing a performance test on the interface to be tested through the containerized interface performance test script.
The server uploads the generated interface performance test script to the bridging end, the bridging end embeds the interface performance test script into the container end, and the container end then automatically runs the interface performance test script to perform the performance test on the interface to be tested.
Step S2.4 comprises the following substeps:
s2.4.1, generating a script master and a script worker through a server side, uploading the script master to a bridge terminal, detecting the updating information of the script master through the bridge terminal, generating a master container side by the script master and a basic container side, and automatically starting the script master in the master container side;
s2.4.2, uploading the script worker to a bridge terminal through a server, detecting update information of the script worker through the bridge terminal, generating a worker container terminal by the script worker and a basic container terminal, and automatically starting the script worker in the worker container terminal;
and S2.4.3, automatically popping up a performance test monitoring page through the client to display the interface performance details after all the container ends are ready.
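The disclosure does not specify the container runtime or the load-testing tool; as one possible, non-authoritative sketch of step S2.4, the Docker SDK for Python could start one master container and several worker containers from a base image that runs the generated script. The image name, network name, file paths and command-line flags below are assumptions of this sketch.

```python
import docker  # Docker SDK for Python; the container runtime is an assumption of this sketch

def deploy_master_and_workers(script_dir: str, workers: int = 2,
                              image: str = "locustio/locust",
                              network: str = "perf-test") -> None:
    """Start one master container and N worker containers running the generated script (step S2.4)."""
    client = docker.from_env()
    volumes = {script_dir: {"bind": "/mnt/locust", "mode": "ro"}}
    # Master container: coordinates the load test and exposes the monitoring data.
    client.containers.run(image, "-f /mnt/locust/perf_script.py --master",
                          name="perf-master", detach=True,
                          network=network, volumes=volumes)
    # Worker containers: generate the actual load and report back to the master.
    for i in range(workers):
        client.containers.run(image,
                              "-f /mnt/locust/perf_script.py --worker --master-host perf-master",
                              name=f"perf-worker-{i}", detach=True,
                              network=network, volumes=volumes)
```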
Step S2.5: automatically generating a performance test report.
The container end continuously transmits the interface performance test information to the server. The server creates a performance report object through a built-in static class, and the interface performance test information is continuously passed to the performance report object, which automatically generates a performance test report through a preset performance report model. The server pushes the performance test report to the client; the client receives it in real time and renders a visual chart in real time, and the user is automatically taken to the visual chart page where the details of the interface performance test process can be observed in real time.
In the solution provided by the invention, the deployment and execution of the performance test script require no intervention from the tester; the principle is to automatically deploy containerized run scripts, whereas in existing solutions this part still requires tester intervention, which affects test efficiency to a certain extent.
In the solution provided by the invention, the tester only needs to intervene where the case configuration and the performance parameters are adjusted, and this workload is limited to adjusting parameters and is therefore small. The benefit is that the whole test gains a qualitative return where the results of the functional test and the performance test are fed back, specifically in the generation of test cases and performance reports.
The invention discloses an interface automatic test system, comprising:
interface function test module: before each execution of the test case corresponding to a test case identifier, the parameter information of the case to be tested is acquired according to the associated case information, a test case for the interface to be tested is generated from the acquired parameter information and a case template of the interface to be tested, and the generated test case is used to perform a functional test on the interface that changes in real time.
After the function test of the interface to be tested is completed, the result is analyzed, and a visual test report is generated according to a chart template and fed back to the tester; the result indicates whether an interface performance test is required.
Interface performance test module: receives a performance test instruction sent by a tester, uses the test case identifier together with the performance test template to generate a performance test script, transfers the script into a container, and runs the script in the container to perform the performance test.
The invention discloses a computer-readable storage medium in which a computer program is stored; when executed by a processor, the computer program implements the interface automation test method described above. The computer-readable storage medium may be a USB flash drive, a removable hard disk, and the like.
The invention also discloses an interface automation test terminal. The terminal can be portable and is connected to a computer during testing; the function and performance of the interface under test can be tested by running the program inside the test terminal, and the interface automation test method described above is executed when that program runs.
Those skilled in the art know that, in addition to implementing the system and its devices, modules and units provided by the present invention purely as computer-readable program code, the same functions can be implemented by logically programming the method steps in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. Therefore, the system and its devices, modules and units provided by the present invention can be regarded as hardware components, and the devices, modules and units included therein for implementing various functions can also be regarded as structures within the hardware components; the devices, modules and units for implementing various functions can likewise be regarded both as software modules implementing the method and as structures within the hardware components.
The foregoing description has described specific embodiments of the present invention. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes or modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention. The embodiments and features of the embodiments of the present application may be combined with each other arbitrarily without conflict.

Claims (10)

1. An automated interface testing method, comprising:
an interface function test step: before each execution of the test case corresponding to a test case identifier, acquiring the parameter information of the case to be tested according to the associated case information, generating a test case for the interface to be tested from the acquired parameter information and a case template of the interface to be tested, and using the generated test case to perform a functional test on the interface that changes in real time;
an interface performance test step: receiving a performance test instruction, using the test case identifier together with a performance test template to generate a performance test script, automatically deploying the script into a container, and running the script in the container to perform the performance test;
wherein whether the interface performance test step needs to be performed is determined by the result of the interface function test.
2. The automated interface testing method of claim 1, wherein: the interface function testing step comprises the following substeps:
step S1.1: creating an interface function test driving instruction;
receiving test case configuration information input by a user through the client, acquiring the unique identifier of the current test case according to the function test case configuration information, extracting and adjusting the associated parameters in the preceding test case, and generating the interface function test driving instruction;
step S1.2: monitoring an interface function test driving instruction;
the generated interface function test driving instruction is sent to a server side through a client side, the interface function test driving instruction is monitored in real time through the server side, and the monitored interface function test driving instruction is stored in a first storage medium;
step S1.3: analyzing and filtering the interface function test driving instruction;
extracting and analyzing data from the interface function test driving instruction through the server, automatically classifying and filtering the interface function test driving instruction, generating a test case and a unique identifier from a case template, and storing the test case and the unique identifier in a second storage medium;
step S1.4: driving the test case to perform interface function test through the interface function test driving instruction;
extracting test information in the test case of the interface to be tested according to the interface function test driving instruction through the server side for packaging, and testing the interface function of the interface to be tested according to the packaged test information;
step S1.5: automatically generating a test report, and performing analysis and statistics;
transmitting the test information to the client through the server in real time, analyzing the test information and compiling statistics, sending the analysis and statistics results to the client after the test is finished, and feeding the test results back to the user through the client.
3. The automated interface testing method of claim 1, wherein: the interface performance testing step comprises the following substeps:
step S2.1: creating an interface performance test driving instruction;
acquiring a user instruction through a client to enter an interface performance test configuration page, acquiring equipment configuration performance parameters input by a user, and creating an interface performance test driving instruction;
step S2.2: monitoring a driving instruction for interface performance test;
receiving an interface performance test driving instruction through the server, monitoring the interface performance test driving instruction in real time, and automatically binding the monitored interface performance test driving instruction with a test case of the interface to be tested through the server;
step S2.3: automatically generating an interface performance test script;
automatically retrieving, through the server, the interface performance test driving instruction together with the performance configuration and interface test configuration in the test case of the interface to be tested, and calling a built-in method of a performance test script template to obtain the corresponding interface performance test script, wherein the performance test script template describes a performance test business scenario;
step S2.4: performing performance test on the interface to be tested through the containerization interface performance test script;
automatically uploading the generated interface performance test script to a bridging end through a server end, embedding the interface performance test script into a container end through the bridging end, and automatically operating the interface performance test script through the container end to perform performance test on an interface to be tested;
step S2.5: automatically generating a performance test report;
and transmitting the interface performance test information to the server side through the container side to automatically generate a performance test report, and receiving and displaying the performance test report to a user through the client side.
4. The automated interface testing method of claim 2, wherein: said step S1.1 comprises the following sub-steps:
step S1.1.1: recording the creation time of the test driving instruction, and recording the creation time in the case configuration information;
step S1.1.2: using the case ID and the creation time as the unique identifier of the preceding test case;
step S1.1.3: judging whether a preceding test case exists, and if so, extracting and adjusting the associated parameters in the preceding test case.
5. The automated interface testing method of claim 2, wherein: said step S1.3 comprises the following sub-steps:
step S1.3.1: analyzing, through the server, the obtained interface function test driving instruction to determine whether the instruction is duplicated, valid and executable;
step S1.3.2: filtering the interface function test driving instruction according to the analysis result by the server side, and filtering out repeated, invalid and unexecutable instructions;
step S1.3.3: and generating a dynamic test case with a unique identifier by using the case template through the server, and storing the dynamic test case into a second storage medium.
6. The automated interface testing method of claim 2, wherein: said step S1.4 comprises the following substeps:
step S1.4.1: when the server side monitors that the dynamic test case is stored in a second storage medium, a task ID is generated through the server side, and test information in the test case of the interface to be tested is extracted and packaged after the task ID is generated;
step S1.4.2: extracting case parameters during packaging, converting the parameters into a dynamic name space, acquiring corresponding values according to the dynamic name space, assigning values to the dynamic name space, and then executing a case;
step S1.4.3: before executing the case, storing the original case in a history table, and updating the current case to obtain a completely new executable case, which is then executed;
step S1.4.4: after the case is executed, updating the current case state and clearing the consumed instruction and the task ID.
7. The automated interface testing method of claim 3, wherein: said step S2.4 comprises the following sub-steps:
s2.4.1, generating a script master and a script worker through a server, uploading the script master to a bridge terminal, detecting the updating information of the script master through the bridge terminal, generating a master container terminal by the script master and a basic container terminal, and automatically starting the script master in the master container terminal;
s2.4.2, uploading the script worker to a bridge terminal through a server, detecting update information of the script worker through the bridge terminal, generating a worker container terminal by the script worker and a basic container terminal, and automatically starting the script worker in the worker container terminal;
and S2.4.3, automatically popping up a performance test monitoring page through the client to display the interface performance details after all the container ends are ready.
8. An interface automated test system, comprising:
interface function test module: before each execution of the test case corresponding to a test case identifier, acquiring the parameter information of the case to be tested according to the associated case information, generating a test case for the interface to be tested from the acquired parameter information and a case template of the interface to be tested, and using the generated test case to perform a functional test on the interface that changes in real time;
interface performance test module: receiving a performance test instruction, using the test case identifier together with a performance test template to generate a performance test script, transferring the script into a container, and running the script in the container to perform the performance test;
wherein whether the interface performance test module is executed is determined by the result of the interface function test.
9. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, implements the interface automation test method of any one of claims 1 to 7.
10. An interface automation test terminal, characterized in that the interface automation test terminal adopts the interface automation test method of any one of claims 1 to 7 when testing the interface to be tested.
CN202211191860.9A 2022-09-28 2022-09-28 Interface automation test method, system, medium and terminal Pending CN115587028A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211191860.9A CN115587028A (en) 2022-09-28 2022-09-28 Interface automation test method, system, medium and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211191860.9A CN115587028A (en) 2022-09-28 2022-09-28 Interface automation test method, system, medium and terminal

Publications (1)

Publication Number Publication Date
CN115587028A true CN115587028A (en) 2023-01-10

Family

ID=84778953

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211191860.9A Pending CN115587028A (en) 2022-09-28 2022-09-28 Interface automation test method, system, medium and terminal

Country Status (1)

Country Link
CN (1) CN115587028A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116561014A (en) * 2023-07-07 2023-08-08 国电南瑞科技股份有限公司 Device and method for generating test case of electric power secondary equipment
CN116561014B (en) * 2023-07-07 2023-09-29 国电南瑞科技股份有限公司 Device and method for generating test case of electric power secondary equipment

Similar Documents

Publication Publication Date Title
US7895565B1 (en) Integrated system and method for validating the functionality and performance of software applications
CN107688530B (en) Software testing method and device
US20040153837A1 (en) Automated testing
CN101620564B (en) Method for automatically testing recording playback mode of terminal system
US9703687B2 (en) Monitor usable with continuous deployment
US7299451B2 (en) Remotely driven system for multi-product and multi-platform testing
EP2245551B1 (en) Identification of elements of currently-executing component script
CN102693183A (en) Method and system for realizing automatic software testing
CA2966820A1 (en) Task-based health data monitoring of aircraft components
CN104899132B (en) Application software testing method, apparatus and system
CN106227654A (en) A kind of test platform
CN115686540A (en) RPA control method and system based on Hongmong system
CN112433948A (en) Simulation test system and method based on network data analysis
CN111462811A (en) Automatic testing method and device, storage medium and electronic equipment
CN112650676A (en) Software testing method, device, equipment and storage medium
CN115587028A (en) Interface automation test method, system, medium and terminal
CN111309609B (en) software processing system
CN104123397A (en) Automatic test device and method for Web page
CN110232013B (en) Test method, test device, controller and medium
CN116467188A (en) Universal local reproduction system and method under multi-environment scene
CN112506772B (en) Web automatic test method, device, electronic equipment and storage medium
CN111488264A (en) Deployment scheduling method for interface performance test cluster
CN111163309A (en) Testing method based on behavior simulation, television equipment and storage medium
CN116932413B (en) Defect processing method, defect processing device and storage medium for test task
CN114527962A (en) Flow automation processing device and method and computing equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination