CN112988565B - Interface automatic test method, device, computer equipment and storage medium - Google Patents

Interface automatic test method, device, computer equipment and storage medium Download PDF

Info

Publication number
CN112988565B
CN112988565B (application CN202110099987.7A)
Authority
CN
China
Prior art keywords
test
model
path
service
business
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110099987.7A
Other languages
Chinese (zh)
Other versions
CN112988565A (en)
Inventor
陆星欣
徐克强
欧平均
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Yike Information Technology Co ltd
Original Assignee
Hangzhou Yike Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Yike Information Technology Co ltd filed Critical Hangzhou Yike Information Technology Co ltd
Priority to CN202410020039.3A priority Critical patent/CN117827669A/en
Priority to CN202110099987.7A priority patent/CN112988565B/en
Publication of CN112988565A publication Critical patent/CN112988565A/en
Application granted granted Critical
Publication of CN112988565B publication Critical patent/CN112988565B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3684Test management for test design, e.g. generating new test cases

Abstract

The invention relates to an interface automation testing method, apparatus, computer device, and storage medium. The method comprises: obtaining a test requirement; performing business modeling according to the test requirement to obtain a test model; generating a test path for the test model; converting the test path into an automated test case; and executing the test script of the automated test case to generate a test report. The invention analyzes the test requirement to extract business objects, constructs a test model from them, and performs testing on the basis of that model: GraphWalker generates a test path for the test model, and the path is converted into an automated test case, so testers no longer write test cases by hand. Because a generated test path can exercise different details of the tested process, test coverage and test accuracy improve; the test model copes easily with changes in business logic; software defects are found earlier while the test model is being built; and barriers in business communication are reduced.

Description

Interface automatic test method, device, computer equipment and storage medium
Technical Field
The present invention relates to software testing methods, and more particularly, to an interface automation testing method, apparatus, computer device, and storage medium.
Background
In conventional testing work, testers design mind maps and test cases from the requirements at the design stage and then test the software against them; but as the software system gains functions and some requirements change, those mind maps and test cases must be redesigned or supplemented. During the system's iteration, the priority of test cases also shifts dynamically as business understanding deepens. The old mind maps and test cases therefore gradually lose their effect, and it becomes difficult for testers to analyze the system's latent problems.
From the perspective of the software project management process, testing should run through the whole project: from project start, testers take part in requirement analysis and review, prepare the test plan and test environment, and design the test scheme. This lets testers and developers communicate promptly and fully, understand the project's difficulties and testing risks in time, and find defects as early as possible, significantly reducing project risk. Testers must also write and continually update test cases during the testing process, but hand-written test cases struggle to cover every detail of the whole test, and the accuracy of the test results is low.
Therefore, a new method needs to be designed so that testers no longer have to write test cases by hand, while test coverage and test accuracy are improved.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides an interface automation testing method, apparatus, computer device, and storage medium.
In order to achieve the above purpose, the present invention adopts the following technical scheme: the automatic interface testing method comprises the following steps:
acquiring a test requirement;
performing service modeling according to the test requirements to obtain a test model;
generating a test path for the test model;
converting the test path into an automatic test case;
and executing the test script of the automatic test case to generate a test report.
The further technical scheme is as follows: and performing service modeling according to the test requirement to obtain a test model, wherein the service modeling comprises the following steps:
analyzing the test requirement to obtain a business object;
generating a business flow chart according to the business object;
constructing a test application layer according to the business object and the business flow chart;
packaging the test application layer to obtain a model adaptation layer;
and modifying the business flow chart according to the method related to the model adaptation layer to obtain a test model.
The further technical scheme is as follows: the encapsulating the test application layer to obtain a model adaptation layer comprises the following steps:
setting a service object initialization function;
integrating operation type methods in the test application layer;
integrating a query assertion method in the test application layer;
setting an execution rule to obtain a model adaptation layer.
The further technical scheme is as follows: the generating a test path for the test model includes:
and generating a test path for the test model by using the GraphWalker.
The further technical scheme is as follows: the generating a test path for the test model using GraphWalker includes:
a test path is generated for the test model using a generator and stop condition provided by GraphWalker.
The further technical scheme is as follows: the converting the test path into an automated test case includes:
and generating an automatic test case by combining the test path and the method provided by the model adaptation layer.
The invention also provides an interface automatic testing device, which comprises:
the demand acquisition unit is used for acquiring the test demand;
the service modeling unit is used for carrying out service modeling according to the test requirements so as to obtain a test model;
a path generating unit, configured to generate a test path for the test model;
the case conversion unit is used for converting the test path into an automatic test case;
and the execution unit is used for executing the test script of the automatic test case so as to generate a test report.
The further technical scheme is as follows: the service modeling unit includes:
the analysis subunit is used for analyzing the test requirement to obtain a business object;
a flow chart generating subunit, configured to generate a service flow chart according to the service object;
an application layer construction subunit, configured to construct a test application layer according to the service object and the service flow chart;
the adaptation layer construction subunit is used for packaging the test application layer to obtain a model adaptation layer;
and the modification subunit is used for modifying the business flow chart according to the method related to the model adaptation layer so as to obtain a test model.
The invention also provides a computer device which comprises a memory and a processor, wherein the memory stores a computer program, and the processor realizes the method when executing the computer program.
The present invention also provides a storage medium storing a computer program which, when executed by a processor, performs the above-described method.
Compared with the prior art, the invention has the following beneficial effects. The invention analyzes the test requirement to extract business objects, constructs a test model from them, and performs testing on the basis of that model: GraphWalker generates a test path for the test model, and the path is converted into an automated test case, so testers no longer write test cases by hand. Because a generated test path can exercise different details of the tested process, test coverage and test accuracy improve; the test model copes easily with changes in business logic; software defects are found earlier while the test model is being built; and barriers in business communication are reduced.
The invention is further described below with reference to the drawings and specific embodiments.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic diagram of an application scenario of an automatic interface testing method according to an embodiment of the present invention;
FIG. 2 is a flow chart of an automated interface testing method according to an embodiment of the present invention;
FIG. 3 is a schematic flow chart of an embodiment of an automated interface testing method according to the present invention;
FIG. 4 is a schematic flow chart of an embodiment of an automated interface testing method according to the present invention;
FIG. 5 is a schematic block diagram of an automated interface testing apparatus provided by an embodiment of the present invention;
FIG. 6 is a schematic block diagram of a business modeling unit of an interface automation test device provided by an embodiment of the present invention;
FIG. 7 is a schematic block diagram of an adaptation layer building subunit of an interface automation test device provided in an embodiment of the present invention;
fig. 8 is a schematic block diagram of a computer device according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be understood that the terms "comprises" and "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
Referring to fig. 1 and fig. 2, fig. 1 is a schematic view of an application scenario of an automatic interface testing method according to an embodiment of the present invention. Fig. 2 is a schematic flow chart of an interface automation test method provided by an embodiment of the present invention. The interface automatic test method is applied to the server. The server performs data interaction with the terminal.
Fig. 2 is a flow chart of an automated interface testing method according to an embodiment of the present invention. As shown in fig. 2, the method includes the following steps S110 to S150.
S110, acquiring test requirements.
In this embodiment, the test requirement refers to a requirement of the terminal for software test construction, and the requirement is sent to the server by the terminal.
S120, carrying out service modeling according to the test requirements to obtain a test model.
In the present embodiment, the test model refers to the model used to perform the software test. Testing with a test model is MBT (Model-Based Testing): a testing approach that builds a model from a description of the main functionality of certain aspects of the system under test, automatically generates test cases from that model, and automatically executes the verification process.
In one embodiment, referring to fig. 3, the step S120 may include steps S121 to S125.
S121, analyzing the test requirement to obtain a business object.
In this embodiment, the business object refers to the content of the software to be tested, including the software's performance, attributes, and so on.
S122, generating a business flow chart according to the business object.
In this embodiment, the business flow chart refers to a flow chart formed by a flow which needs to be executed when testing software.
Concretely, the test requirement is analyzed and the business objects involved in the functions under test are abstracted out; the business objects are then described as a business flow chart according to corresponding rules, completing the first step of abstraction. On the basis of the flow chart, deductions are made by logic coverage or basic-path coverage, deepening the understanding of the functions under test, so that the flow chart gradually evolves into a deeper model. This makes the test design scientific and systematic, exposes the system's hidden problems, allows software defects to be found earlier, and reduces barriers in business communication.
In this embodiment, the corresponding rule means that the naming of edges and vertices in the model diagram must follow an agreed convention. When test cases are generated from an analyzed path, each step is mapped, by its name and a mapping rule, to an interface provided in the model service.
When the test model is constructed, code for the business objects must be written and arranged in the order of the business flow chart. The test model is realized by a test service layer: according to the business flow chart and the business objects or rules corresponding to it, and drawing on domain-driven design ideas, the service layer is built in code to mirror the persistent objects, business objects, and rule objects of the system under test. This layer implements only the business logic; it does not touch the system under test's database, API (Application Programming Interface), or similar operations.
S123, constructing a test application layer according to the business object and the business flow chart.
In this embodiment, the test application layer refers to an application entity responsible for interaction with the system under test.
Specifically, each operation/query interface provided by the server is wrapped individually, extending the business objects of the test service layer. Each operation-class method first calls the corresponding server operation interface and throws an exception immediately if the interface returns an error; it then calls the corresponding method on the business object to update the attributes of the test object. For each query-class interface a query-class method is provided separately, and on top of it an assertion method that compares the current attributes of the test business object against the return value of the service interface, achieving full-field verification.
During execution, the test case stores the ID of the object under test in the test suite context. When information is needed, it is fetched from the context and a request is sent to the test service layer for processing.
S124, packaging the test application layer to obtain a model adaptation layer.
In this embodiment, the model adaptation layer is a service for converting the test path generated by GraphWalker into an executable automation use case.
GraphWalker is an open-source model-based testing tool; it supports creating and editing models and verifying a model's correctness and expected behavior. A GraphWalker model contains two basic elements: vertices and edges. An edge represents an action, which may be an API call, a button click, a transition of the system under test from one state to the next, some condition, and so on; a vertex represents a verification, which may check whether an API call returned the correct value, whether clicking the button closed the dialog, or whether the system under test triggered the expected event on reaching the next state.
In one embodiment, referring to fig. 4, the step S124 may include steps S1241 to S1244.
S1241, a service object initializing function is set.
In this embodiment, the model adaptation layer must provide a business object initialization function, so that when a new business object appears, the test model and test cases remain easy to maintain and the tester does not have to rewrite test cases by hand.
S1242, integrating the operation type method in the test application layer.
In this embodiment, the model adaptation layer integrates the already-wrapped operation-class methods and exposes each as the setup (before) hook in mocha. Within the test suite for one operation, this hook acts as the precondition and can itself be regarded as a test case: when the hook fails, the query cases in the corresponding test suite are no longer executed, and the current test run may optionally be terminated. It corresponds to an edge in GraphWalker and drives the execution of the business process.
S1243, integrating the query assertion method in the test application layer.
In this embodiment, the model adaptation layer integrates the query-class assertion methods and exposes each as an it block in mocha; it corresponds to a vertex in GraphWalker and is used to check data. In the test suite for one operation, a single operation may affect several business objects, while one query-class assertion method targets exactly one business object, so multiple query-class assertion methods must be referenced as the case requires.
S1244, an execution rule is set to obtain a model adaptation layer.
In addition, when executing from an edge to a vertex, i.e. one test suite, the model adaptation layer must supply constraint conditions, i.e. execution rules, which check whether the current business object satisfies the requirements for execution. If it does not, the current test line is stopped. For example, in an order system an order can be shipped only after payment succeeds, so before the shipping test suite executes, the order's current payment state is checked; if it is not paid, execution stops.
The design of the test application layer and the model adaptation layer makes it easy to cope with changes in business logic, so the system achieves full-field verification under different test data and different arrangements and combinations of test behaviors, greatly improving test accuracy.
S125, modifying the business flow chart according to the method related to the model adaptation layer to obtain a test model.
The business flow chart is modified using the methods provided by the model adaptation layer: labels in the business flow chart are replaced with labels the model adaptation layer can recognize, and after suitable debugging the corresponding test model is obtained.
S130, generating a test path for the test model.
In this embodiment, the test path specifies where the test is to proceed; like a navigation route, it guides the test.
Specifically, a GraphWalker is used to generate a test path for the test model.
A test path is generated for the test model using a generator and stop condition provided by GraphWalker.
In this embodiment, the test path is generated using a generator and a stop condition provided by GraphWalker. A generator is an algorithm that decides how to traverse the model; different generators produce different test sequences that walk the test model in different ways. The stop condition determines when the test path is complete: the generator keeps producing new steps until the stop condition is met.
S140, converting the test path into an automatic test case.
In this embodiment, the automated test case refers to the executable case data derived from the test model.
Specifically, an automated test case is generated in conjunction with the test path and the method provided by the model adaptation layer.
Once the route required by the test is determined, multiple test cases can be generated from the operation-class methods, query-class assertion methods, and constraint conditions integrated in the model adaptation layer. Testers do not have to write the test cases by hand, and the generated cases take every detail of the test into account better, improving test coverage.
Specifically, a corresponding test path is generated according to the model's traversal algorithm or strategy. When the edges are processed, special edges such as init/reset/end are added; these are defined per project and are used to distinguish or delimit case boundaries. In general, the stretch from an init to a reset or end is understood as one complete automated-process case. Within it, an operation together with its corresponding assertion usually represents one test node and can also be understood as a minimal-unit test case, where the assertion is a set of multiple test points that are independent of one another and verify, each from its own angle, whether the corresponding operation behaved normally.
S150, executing the test script of the automatic test case to generate a test report.
In this embodiment, the test report refers to a test result of the software obtained after the test model is tested.
After the automated test cases are obtained, the corresponding test scripts can be executed to obtain the post-execution report.
According to the interface automation testing method described above, business objects are analyzed from the test requirement and a test model is constructed; testing is performed on the basis of the model: GraphWalker generates a test path for the test model and the path is converted into automated test cases, so testers do not write test cases by hand. The test path can exercise different details of the tested process, improving test coverage and test accuracy; the test model copes easily with changes in business logic; software defects are found earlier while the test model is built; and barriers in business communication are reduced.
Fig. 5 is a schematic block diagram of an interface automation test device 300 according to an embodiment of the present invention. As shown in fig. 5, the present invention further provides an interface automation testing device 300 corresponding to the above interface automation testing method. The interface automation test device 300 includes a unit for performing the interface automation test method described above, and may be configured in a server. Specifically, referring to fig. 5, the interface automation testing device 300 includes a requirement obtaining unit 301, a service modeling unit 302, a path generating unit 303, a use case converting unit 304, and an executing unit 305.
A requirement acquisition unit 301, configured to acquire a test requirement; the service modeling unit 302 is configured to perform service modeling according to the test requirement to obtain a test model; a path generating unit 303, configured to generate a test path for the test model; a case conversion unit 304, configured to convert the test path into an automated test case; and the execution unit 305 is used for executing the test script of the automatic test case to generate a test report.
In an embodiment, as shown in fig. 6, the service modeling unit 302 includes an analysis subunit 3021, a flowchart generation subunit 3022, an application layer construction subunit 3023, an adaptation layer construction subunit 3024, and a modification subunit 3025.
An analysis subunit 3021, configured to analyze the test requirement to obtain a service object; a flow chart generation subunit 3022, configured to generate a service flow chart according to the service object; an application layer construction subunit 3023, configured to construct a test application layer according to the service object and the service flowchart; an adaptation layer construction subunit 3024, configured to encapsulate the test application layer to obtain a model adaptation layer; and the modification subunit 3025 is configured to modify the service flowchart according to the method involved in the model adaptation layer, so as to obtain a test model.
In an embodiment, as shown in fig. 7, the adaptation layer building sub-unit 3024 includes a setting module 30241, a first integration module 30242, a second integration module 30243, and a rule setting module 30244.
A setting module 30241, configured to set a service object initializing function; a first integration module 30242, configured to integrate the operation class method in the test application layer; a second integration module 30243, configured to integrate a query class assertion method in the test application layer; the rule setting module 30244 is configured to set execution rules to obtain a model adaptation layer.
In an embodiment, the path generating unit 303 is configured to generate a test path for the test model using GraphWalker.
Specifically, the path generating unit 303 is configured to generate a test path for the test model using a generator and a stop condition provided by GraphWalker.
In an embodiment, the use case conversion unit 304 is configured to generate an automated test use case in combination with the test path and the method provided by the model adaptation layer.
It should be noted that, as will be clearly understood by those skilled in the art, the specific implementation process of the interface automation testing device 300 and each unit may refer to the corresponding description in the foregoing method embodiment, and for convenience and brevity of description, the description is omitted here.
The interface automation test device 300 described above may be implemented in the form of a computer program that may be run on a computer apparatus as shown in fig. 8.
Referring to fig. 8, fig. 8 is a schematic block diagram of a computer device according to an embodiment of the present application. The computer device 500 may be a server, where the server may be a stand-alone server or may be a server cluster formed by a plurality of servers.
With reference to FIG. 8, the computer device 500 includes a processor 502, memory, and a network interface 505 connected by a system bus 501, where the memory may include a non-volatile storage medium 503 and an internal memory 504.
The non-volatile storage medium 503 may store an operating system 5031 and a computer program 5032. The computer program 5032 includes program instructions that, when executed, cause the processor 502 to perform an interface automation test method.
The processor 502 is used to provide computing and control capabilities to support the operation of the overall computer device 500.
The internal memory 504 provides an environment for the execution of a computer program 5032 in the non-volatile storage medium 503, which computer program 5032, when executed by the processor 502, causes the processor 502 to perform an interface automation test method.
The network interface 505 is used for network communication with other devices. Those skilled in the art will appreciate that the architecture shown in fig. 8 is merely a block diagram of a portion of the architecture in connection with the present application and is not intended to limit the computer device 500 to which the present application is applied, and that a particular computer device 500 may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
Wherein the processor 502 is configured to execute a computer program 5032 stored in a memory to implement the steps of:
acquiring a test requirement; performing service modeling according to the test requirements to obtain a test model; generating a test path for the test model; converting the test path into an automatic test case; and executing the test script of the automatic test case to generate a test report.
In an embodiment, when the processor 502 performs the service modeling according to the test requirement to obtain a test model step, the following steps are specifically implemented:
analyzing the test requirement to obtain a business object; generating a business flow chart according to the business object; constructing a test application layer according to the business object and the business flow chart; packaging the test application layer to obtain a model adaptation layer; and modifying the business flow chart according to the method related to the model adaptation layer to obtain a test model.
In one embodiment, when implementing the step of encapsulating the test application layer to obtain the model adaptation layer, the processor 502 specifically implements the following steps:
setting a service object initialization function; integrating the operation-type methods of the test application layer; integrating the query assertion methods of the test application layer; and setting an execution rule to obtain the model adaptation layer.
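A minimal sketch of such a model adaptation layer, assuming a hypothetical order service (none of these class or method names come from the patent): it sets a business object initialization function, wraps the application layer's operation methods, and exposes query assertion methods. Execution rules (e.g. resetting the shared context between runs) are omitted for brevity.

```python
class TestApplicationLayer:
    # Hypothetical application layer wrapping server-side interfaces.
    def __init__(self):
        self.orders = {}
    def create_order(self, order_id):
        self.orders[order_id] = "created"
    def pay_order(self, order_id):
        self.orders[order_id] = "paid"
    def query_order(self, order_id):
        return self.orders.get(order_id)

class ModelAdaptationLayer:
    """Encapsulates the test application layer for use by model steps."""
    def __init__(self):
        self.app = TestApplicationLayer()
        self.context = {}                  # shared test-suite context
    def init_business_objects(self):       # initialization function
        self.context["order_id"] = "ORD-1"
    def e_create(self):                    # operation-type method
        self.app.create_order(self.context["order_id"])
    def e_pay(self):                       # operation-type method
        self.app.pay_order(self.context["order_id"])
    def v_Created(self):                   # query assertion method
        assert self.app.query_order(self.context["order_id"]) == "created"
    def v_Paid(self):                      # query assertion method
        assert self.app.query_order(self.context["order_id"]) == "paid"

adapter = ModelAdaptationLayer()
adapter.init_business_objects()
adapter.e_create(); adapter.v_Created()
adapter.e_pay(); adapter.v_Paid()
```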
In one embodiment, when the step of generating the test path for the test model is implemented by the processor 502, the following steps are specifically implemented:
generating a test path for the test model using GraphWalker.
In one embodiment, when implementing the step of generating the test path for the test model using GraphWalker, the processor 502 specifically implements the following steps:
generating a test path for the test model using a generator and a stop condition provided by GraphWalker.
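GraphWalker pairs a path generator (e.g. random) with a stop condition (e.g. edge coverage reaching 100%). The sketch below imitates that idea with a plain random walk over a small model; it is not the GraphWalker API, only an illustration of a generator combined with a stop condition.

```python
import random

def random_path(edges, start, edge_coverage=1.0, seed=0):
    """Random-walk generator with an edge-coverage stop condition,
    imitating GraphWalker's random(edge_coverage(100)). Assumes every
    vertex in the model has at least one outgoing edge."""
    rng = random.Random(seed)
    out = {}
    for src, name, dst in edges:
        out.setdefault(src, []).append((name, dst))
    visited, path, node = set(), [start], start
    while len(visited) < edge_coverage * len(edges):
        name, nxt = rng.choice(out[node])   # generator: pick next edge
        visited.add(name)                   # stop condition: edges covered
        path += [name, nxt]
        node = nxt
    return path

# Hypothetical order model with a reset edge so the walk can loop.
edges = [("v_Start", "e_create", "v_Created"),
         ("v_Created", "e_pay", "v_Paid"),
         ("v_Paid", "e_reset", "v_Start")]
path = random_path(edges, "v_Start")
print(path)
```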
In one embodiment, when implementing the step of converting the test path into an automated test case, the processor 502 specifically implements the following steps:
generating an automatic test case by combining the test path with the methods provided by the model adaptation layer.
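Given a naming convention for model elements (the v_/e_ prefixes below follow common GraphWalker practice for vertices and edges), a generated path can be turned into an executable case by dispatching each element to the method of the same name on the adaptation layer. The adapter here is a hypothetical stand-in, not the patent's implementation.

```python
class Adapter:
    # Hypothetical model adaptation layer with one method per bound
    # model element: e_* = operations, v_* = query assertions.
    def __init__(self):
        self.state, self.log = None, []
    def e_create(self):
        self.state = "created"; self.log.append("e_create")
    def v_Created(self):
        assert self.state == "created"; self.log.append("v_Created")

def run_path(adapter, path):
    """Map each path element to an adapter method by name -- the
    'mapping rule' between model naming and the provided interfaces."""
    for element in path:
        step = getattr(adapter, element, None)
        if step is not None:   # skip unbound elements (e.g. v_Start)
            step()

a = Adapter()
run_path(a, ["v_Start", "e_create", "v_Created"])
print(a.log)
```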
It should be appreciated that in embodiments of the present application, the processor 502 may be a central processing unit (Central Processing Unit, CPU); the processor 502 may also be another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or any conventional processor.
Those skilled in the art will appreciate that all or part of the flow of the methods in the above embodiments may be accomplished by a computer program instructing the relevant hardware. The computer program comprises program instructions and may be stored in a storage medium, which is a computer-readable storage medium. The program instructions are executed by at least one processor in the computer system to implement the steps of the method embodiments described above.
Accordingly, the present invention also provides a storage medium. The storage medium may be a computer readable storage medium. The storage medium stores a computer program which, when executed by a processor, causes the processor to perform the steps of:
acquiring a test requirement; performing service modeling according to the test requirement to obtain a test model; generating a test path for the test model; converting the test path into an automatic test case; and executing the test script of the automatic test case to generate a test report.
In one embodiment, when the processor executes the computer program to implement the step of performing service modeling according to the test requirement to obtain a test model, the processor specifically implements the following steps:
analyzing the test requirement to obtain a business object; generating a business flow chart according to the business object; constructing a test application layer according to the business object and the business flow chart; encapsulating the test application layer to obtain a model adaptation layer; and adjusting the business flow chart according to the methods provided by the model adaptation layer to obtain a test model.
In one embodiment, when the processor executes the computer program to implement the step of encapsulating the test application layer to obtain a model adaptation layer, the processor specifically implements the following steps:
setting a service object initialization function; integrating the operation-type methods of the test application layer; integrating the query assertion methods of the test application layer; and setting an execution rule to obtain the model adaptation layer.
In one embodiment, when the processor executes the computer program to implement the step of generating a test path for the test model, the processor specifically implements the following steps:
generating a test path for the test model using GraphWalker.
In one embodiment, when the processor executes the computer program to implement the step of generating a test path for the test model using GraphWalker, the processor specifically implements the following steps:
generating a test path for the test model using a generator and a stop condition provided by GraphWalker.
In one embodiment, when the processor executes the computer program to implement the step of converting the test path into an automated test case, the processor specifically implements the following steps:
generating an automatic test case by combining the test path with the methods provided by the model adaptation layer.
The storage medium may be a USB flash drive, a removable hard disk, a read-only memory (Read-Only Memory, ROM), a magnetic disk, an optical disk, or any other computer-readable storage medium that can store program code.
Those of ordinary skill in the art will appreciate that the elements and algorithm steps described in connection with the embodiments disclosed herein may be embodied in electronic hardware, in computer software, or in a combination of the two, and that the elements and steps of the examples have been generally described in terms of function in the foregoing description to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the several embodiments provided by the present invention, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the device embodiments described above are merely illustrative. For example, the division of each unit is only one logic function division, and there may be another division manner in actual implementation. For example, multiple units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed.
The steps in the method of the embodiment of the invention can be sequentially adjusted, combined and deleted according to actual needs. The units in the device of the embodiment of the invention can be combined, divided and deleted according to actual needs. In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The integrated unit may be stored in a storage medium if implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solution of the present invention is essentially or a part contributing to the prior art, or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a terminal, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention.
While the invention has been described with reference to certain preferred embodiments, it will be understood by those skilled in the art that various changes and substitutions of equivalents may be made and equivalents will be apparent to those skilled in the art without departing from the scope of the invention. Therefore, the protection scope of the invention is subject to the protection scope of the claims.

Claims (6)

1. An interface automation testing method, characterized by comprising the following steps:
acquiring a test requirement;
performing service modeling according to the test requirement to obtain a test model;
generating a test path for the test model;
converting the test path into an automatic test case;
executing the test script of the automatic test case to generate a test report;
wherein the performing service modeling according to the test requirement to obtain a test model comprises:
analyzing the test requirement to obtain a business object;
generating a business flow chart according to the business object, specifically: analyzing the test requirement, abstracting the business objects related to the function under test in the test requirement, and describing them in the form of a business flow chart through corresponding rules, thereby completing the first-step abstraction; deriving from the flow chart according to a logic coverage or basic path coverage mode, and gradually developing it into a deep model; the corresponding rules mean that the naming of the edges and vertices in the model diagram must be agreed upon, so that when a test case is generated from an analyzed path, the test case is mapped, according to the naming, to an interface provided in the model service through a mapping rule; when the test model is constructed, the code is arranged in the order of the business flow chart; the test model construction is realized by a test service layer, namely, drawing on the idea of domain design, the persistent objects, service objects, and rule objects in the system under test are implemented in code according to the business flow chart and the business objects or corresponding rules, so as to construct the service layer;
constructing a test application layer according to the business object and the business flow chart, specifically: extending on the basis of the business objects of the test service layer, and independently encapsulating the operation/query interfaces provided by the server; a test case stores the relevant IDs of the object under test into the context of the test suite, and when this information is needed, the needed information is acquired from the context and a request is sent to the test service layer for processing;
encapsulating the test application layer to obtain a model adaptation layer, specifically: setting a service object initialization function; integrating the operation-type methods of the test application layer; integrating the query assertion methods of the test application layer; and setting an execution rule to obtain the model adaptation layer;
and adjusting the business flow chart according to the methods provided by the model adaptation layer to obtain the test model.
2. The method of automated interface testing of claim 1, wherein the generating a test path for the test model comprises:
generating a test path for the test model using GraphWalker.
3. The method of automated interface testing of claim 2, wherein the generating a test path for the test model using GraphWalker comprises:
generating a test path for the test model using a generator and a stop condition provided by GraphWalker.
4. The method of automated interface testing according to claim 1, wherein converting the test path into an automated test case comprises:
generating an automatic test case by combining the test path with the methods provided by the model adaptation layer.
5. A computer device, characterized in that it comprises a memory on which a computer program is stored and a processor which, when executing the computer program, implements the method according to any of claims 1-4.
6. A storage medium storing a computer program which, when executed by a processor, performs the method of any one of claims 1 to 4.
CN202110099987.7A 2021-01-25 2021-01-25 Interface automatic test method, device, computer equipment and storage medium Active CN112988565B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202410020039.3A CN117827669A (en) 2021-01-25 2021-01-25 Automatic interface testing device
CN202110099987.7A CN112988565B (en) 2021-01-25 2021-01-25 Interface automatic test method, device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110099987.7A CN112988565B (en) 2021-01-25 2021-01-25 Interface automatic test method, device, computer equipment and storage medium

Related Child Applications (2)

Application Number Title Priority Date Filing Date
CN202410020040.6A Division CN117951005A (en) 2021-01-25 Interface automation testing device, computer equipment and storage medium
CN202410020039.3A Division CN117827669A (en) 2021-01-25 2021-01-25 Automatic interface testing device

Publications (2)

Publication Number Publication Date
CN112988565A CN112988565A (en) 2021-06-18
CN112988565B true CN112988565B (en) 2024-01-02

Family

ID=76345395

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110099987.7A Active CN112988565B (en) 2021-01-25 2021-01-25 Interface automatic test method, device, computer equipment and storage medium
CN202410020039.3A Pending CN117827669A (en) 2021-01-25 2021-01-25 Automatic interface testing device

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202410020039.3A Pending CN117827669A (en) 2021-01-25 2021-01-25 Automatic interface testing device

Country Status (1)

Country Link
CN (2) CN112988565B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107783893A (en) * 2016-08-26 2018-03-09 上海计算机软件技术开发中心 A kind of Auto-Test System and its implementation based on scene description
CN107844424A (en) * 2017-11-15 2018-03-27 杭州杉石科技有限公司 Model-based testing system and method
CN110018963A (en) * 2019-04-11 2019-07-16 苏州浪潮智能科技有限公司 A kind of test method, system and electronic equipment and storage medium
CN110196804A (en) * 2018-04-24 2019-09-03 腾讯科技(深圳)有限公司 The test method and device of business, storage medium, electronic device
CN110351161A (en) * 2019-05-22 2019-10-18 口碑(上海)信息技术有限公司 Business end-to-end test method, apparatus, storage medium and computer equipment
CN111459821A (en) * 2020-04-01 2020-07-28 汇通达网络股份有限公司 Software automation unit testing method based on TestNG
CN111581074A (en) * 2020-03-26 2020-08-25 平安普惠企业管理有限公司 Call scene coverage testing method and device, computer equipment and storage medium
CN111737148A (en) * 2020-07-24 2020-10-02 深圳市富之富信息技术有限公司 Automatic regression testing method and device, computer equipment and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7421683B2 (en) * 2003-01-28 2008-09-02 Newmerix Corp£ Method for the use of information in an auxiliary data system in relation to automated testing of graphical user interface based applications
US20060048123A1 (en) * 2004-08-30 2006-03-02 International Business Machines Corporation Modification of swing modulo scheduling to reduce register usage


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Automated Model-Based Test Case Generation for Web User Interfaces (WUI) From Interaction Flow Modeling Language (IFML) Models; N. Yousaf et al.; IEEE Access; Vol. 07; 67331-67354 *
A Software Test Requirement Modeling and Test Case Generation Method; Yang Bo et al.; Chinese Journal of Computers; Vol. 37, No. 03; 522-538 *

Also Published As

Publication number Publication date
CN112988565A (en) 2021-06-18
CN117827669A (en) 2024-04-05

Similar Documents

Publication Publication Date Title
US6385765B1 (en) Specification and verification for concurrent systems with graphical and textual editors
US9465718B2 (en) Filter generation for load testing managed environments
US8676723B2 (en) Automated test system based on three-dimensional application software framework and a method thereof
Tsai et al. Scenario-based functional regression testing
Griebe et al. A model-based approach to test automation for context-aware mobile applications
US20100146340A1 (en) Analyzing Coverage of Code Changes
KR20210149045A (en) artificial intelligence chip verification
WO2003054666A2 (en) System and method for automated test-case generation for software
US7315973B1 (en) Method and apparatus for choosing tests for simulation and associated algorithms and hierarchical bipartite graph data structure
US9058427B2 (en) Iterative generation of symbolic test drivers for object-oriented languages
US7606695B1 (en) Self-checking simulations using dynamic data loading
WO2022128469A1 (en) System testing infrastructure with hidden variable, hidden attribute, and hidden value detection
CN111782207A (en) Method, device and equipment for generating task stream code and storage medium
US9529963B1 (en) Method and system for partitioning a verification testbench
US11132286B1 (en) Dynamic reordering of test case execution
CN112988565B (en) Interface automatic test method, device, computer equipment and storage medium
CN111382065B (en) Verification flow management system and method based on test template
Belli et al. Test generation and minimization with" Basic" statecharts
CN115248783B (en) Software testing method, system, readable storage medium and computer equipment
CN117951005A (en) Interface automation testing device, computer equipment and storage medium
CN112346994A (en) Test information correlation method and device, computer equipment and storage medium
CN113919257A (en) Method, device, equipment and storage medium for compiling simulation file
CN113626342A (en) Model online testing method and device
CN114265776B (en) Unit test code generation method, device, equipment and medium
US9710581B1 (en) VIP assisted method and apparatus to enable SOC integration and software development

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20231009

Address after: Room 901, Floor 9, Building 1, No. 239, Sansheng Street, Qiaosi Street, Linping District, Hangzhou City, Zhejiang Province, 310000

Applicant after: Hangzhou Yike Information Technology Co.,Ltd.

Address before: Room 1601, building 1, Xingqi building, 1916 Jiangling Road, Xixing street, Binjiang District, Hangzhou City, Zhejiang Province 310051

Applicant before: Hangzhou yikeyun Technology Co.,Ltd.

GR01 Patent grant