CN111221743B - Automated test method and system (Google Patents)

Automated test method and system

Info

Publication number: CN111221743B
Authority: CN (China)
Prior art keywords: request, tested, interface, service, service interface
Legal status: Active
Application number: CN202010192525.5A
Other languages: Chinese (zh)
Other versions: CN111221743A (en)
Inventor: 黄华松
Current Assignee: Shishi Tongyun Technology Chengdu Co ltd
Original Assignee: Shishi Tongyun Technology Chengdu Co ltd
Application filed by Shishi Tongyun Technology Chengdu Co ltd
Priority to CN202010192525.5A
Publication of CN111221743A
Application granted
Publication of CN111221743B


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3684 Test management for test design, e.g. generating new test cases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3688 Test management for test execution, e.g. scheduling of test suites

Abstract

The invention discloses an automated test method and system. The method comprises: reading annotation information of a service interface to be tested in a target system, registering the service interface to be tested according to the annotation information, and generating a test case corresponding to the service interface to be tested; intercepting a service call request sent by a client to the target system and acquiring request log data corresponding to the service call request; matching the service interface to be tested corresponding to the test case with the service interface in the request log data to determine the case data of the test case; and injecting the case data into the test case and executing the test case to obtain a test result. With the technical scheme provided by the invention, test cases and case data are maintained automatically, which greatly reduces maintenance cost, improves test case coverage, ensures the timeliness of the case data, effectively improves test accuracy, and makes scenario coverage more comprehensive.

Description

Automated test method and system
Technical Field
The invention relates to the technical field of computers, and in particular to an automated test method and system.
Background
In existing testing practice, white-box testing is usually performed through unit tests, and a standard integration test environment is rarely provided. In integration testing, test cases and case data (or test data) generally need to be maintained manually, which consumes a great deal of labor and time. Test quality depends mainly on the coverage of the test cases and the timeliness of the case data, and test case coverage in turn depends on how completely the testers have collected the interfaces of the system under test. In the daily iterative development process, however, service interfaces to be tested are often added or updated, and with manual maintenance of test cases a newly added service interface is easily missed or a test case easily becomes outdated. In addition, in the prior art the recording and collection of the case data required by test cases is generally done manually; such data is usually not real data generated in the actual production environment but sample data constructed by testers from experience, so it is difficult to ensure that all scenarios are covered comprehensively. Even if real data generated in the production environment is used as the case data of a test case, that data can change at any time in the production environment, so the case data cannot be conveniently kept up to date.
Disclosure of Invention
In view of the foregoing, embodiments of the present invention have been developed to provide an automated testing method and system that overcome, or at least partially solve, the foregoing problems.
According to one aspect of an embodiment of the present invention, there is provided an automated test method comprising:
reading annotation information of a service interface to be tested in a target system, registering the service interface to be tested according to the annotation information, and generating a test case corresponding to the service interface to be tested;
intercepting a service call request sent by a client to a target system, and acquiring request log data corresponding to the service call request;
matching a service interface to be tested corresponding to the test case with a service interface in the request log data to determine case data of the test case;
and injecting the case data into the test case, and executing the test case to obtain a test result.
Further, registering the service interface to be tested according to the annotation information, and generating the test case corresponding to the service interface to be tested further comprises:
constructing a first interface signature of the service interface to be tested according to the annotation information;
registering the service interface to be tested into a test case list, generating a test case corresponding to the service interface to be tested, and storing the test case corresponding to the first interface signature of the service interface to be tested into a test case library.
Further, after generating the test case corresponding to the service interface to be tested, the method further includes:
reading the latest annotation information of the service interface to be tested corresponding to the test case, and constructing a second interface signature of the service interface to be tested according to the latest annotation information;
judging whether the second interface signature is consistent with the first interface signature of the service interface to be tested stored in the test case library; if not, generating expiration prompt information of the test case.
Further, intercepting the service call request sent by the client to the target system, and obtaining the request log data corresponding to the service call request further includes:
intercepting a service call request sent by a client to a target system, recording request log data corresponding to the service call request, and storing the request log data into a database;
the collected request log data is obtained from the database.
Further, intercepting a service call request sent by a client to a target system, recording request log data corresponding to the service call request, and storing the request log data in a database, wherein the method further comprises the steps of:
intercepting a service call request sent by a client to a target system, recording request call log data of the service call request, and sending the service call request to the target system so that the target system calls a corresponding service interface according to the service call request to obtain request response data;
intercepting request response data returned by a target system to a client, recording request response log data of the request response data, sending the request response data to the client, and asynchronously storing the request call log data and the request response log data as request log data corresponding to a service call request into a database.
Further, matching the service interface to be tested corresponding to the test case with the service interface in the request log data, and determining the case data of the test case further includes:
matching annotation information of the service interface to be tested corresponding to the test case with the service interface in the request log data to obtain matched request log data;
and taking the request parameters in the matched request log data as case data of the test case.
Further, after obtaining the test result, the method further comprises:
judging whether the test result is consistent with the response result in the matched request log data; if not, generating abnormal alarm information.
Further, before generating the test case corresponding to the service interface to be tested, the method further includes: verifying the service interface to be tested;
the test case corresponding to the service interface to be tested is specifically generated as follows: if the verification passes, generating the test case corresponding to the service interface to be tested.
Further, the request log data includes: request parameters, service interfaces, methods, request headers, response results, session data, and cookie data.
According to another aspect of an embodiment of the present invention, there is provided an automated test system comprising:
the reading module is suitable for reading annotation information of the service interface to be tested in the target system;
the use case generation module is suitable for registering the service interface to be tested according to the annotation information and generating a test use case corresponding to the service interface to be tested;
the acquisition module is suitable for intercepting a service call request sent to the target system by the client and acquiring request log data corresponding to the service call request;
the matching module is suitable for matching the service interface to be tested corresponding to the test case with the service interface in the request log data to determine the case data of the test case;
and the execution module is suitable for injecting the case data into the test case, executing the test case and obtaining a test result.
Further, the use case generation module is further adapted to:
constructing a first interface signature of the service interface to be tested according to the annotation information;
registering the service interface to be tested into a test case list, generating a test case corresponding to the service interface to be tested, and storing the test case corresponding to the first interface signature of the service interface to be tested into a test case library.
Further, the reading module is further adapted to: reading the latest annotation information of the service interface to be tested corresponding to the test case;
the use case generation module is further adapted to: constructing a second interface signature of the service interface to be tested according to the latest annotation information;
the system further comprises: the signature judging module and the information generating module; the signature judgment module is adapted to: judging whether the second interface signature is consistent with the first interface signature of the service interface to be tested stored in the test case library; the information generation module is adapted to: if the signature judging module judges that the second interface signature is inconsistent with the first interface signature, generating expiration prompt information of the test case.
Further, the acquisition module is further adapted to:
intercepting a service call request sent by a client to a target system, recording request log data corresponding to the service call request, and storing the request log data into a database;
the collected request log data is obtained from the database.
Further, the acquisition module is further adapted to:
intercepting a service call request sent by a client to a target system, recording request call log data of the service call request, and sending the service call request to the target system so that the target system calls a corresponding service interface according to the service call request to obtain request response data;
Intercepting request response data returned by a target system to a client, recording request response log data of the request response data, sending the request response data to the client, and asynchronously storing the request call log data and the request response log data as request log data corresponding to a service call request into a database.
Further, the matching module is further adapted to:
matching annotation information of the service interface to be tested corresponding to the test case with the service interface in the request log data to obtain matched request log data;
and taking the request parameters in the matched request log data as case data of the test case.
Further, the system further comprises:
the result judging module is suitable for judging whether the test result is consistent with the response result in the matched request log data;
the information generation module is suitable for generating abnormal alarm information if the result judgment module judges that the test result is inconsistent with the response result.
Further, the use case generation module is further adapted to: checking a service interface to be tested; and if the verification is passed, generating a test case corresponding to the service interface to be tested.
Further, the request log data includes: request parameters, service interfaces, methods, request headers, response results, session data, and cookie data.
According to yet another aspect of an embodiment of the present invention, there is provided a computing device including: a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface communicate with each other through the communication bus;
the memory is used for storing at least one executable instruction, and the executable instruction causes the processor to perform operations corresponding to the above-described automated test method.
According to still another aspect of the embodiments of the present invention, there is provided a computer storage medium having at least one executable instruction stored therein, the executable instruction causing a processor to perform operations corresponding to the above-described automated test method.
According to the technical scheme provided by the embodiment of the invention, test cases can be automatically generated from the annotation information of the service interfaces to be tested, request log data generated in the production environment is collected by intercepting the service call requests sent by clients to the target system, and the case data is automatically determined based on the collected request log data. Automatic maintenance of test cases and case data is thus realized, which greatly reduces maintenance cost, improves test case coverage, ensures the timeliness of the case data, effectively improves test accuracy, makes scenario coverage more comprehensive, and optimizes the testing approach.
The foregoing is only an overview of the technical solutions of the embodiments of the present invention. It is provided so that the technical means of the embodiments can be more clearly understood and implemented according to the content of the specification, and so that the following detailed description of the embodiments will be more readily apparent.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
FIG. 1 shows a flow chart of an automated test method provided by an embodiment of the present invention;
FIG. 2a shows a flow chart of an automated test method provided by another embodiment of the present invention;
FIG. 2b is a timing diagram of a request log data acquisition method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of an automated test system according to an embodiment of the present invention;
FIG. 4 illustrates a schematic diagram of a computing device, according to an embodiment of the invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present invention are shown in the drawings, it should be understood that the present invention may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
FIG. 1 shows a flowchart of an automated test method according to an embodiment of the present invention. As shown in FIG. 1, the method includes the following steps:
Step S101, annotation information of a service interface to be tested in a target system is read, the service interface to be tested is registered according to the annotation information, and a test case corresponding to the service interface to be tested is generated.
The target system is the system that needs to be tested. The service interface to be tested can be annotated in the target system in advance, for example through the Java annotation mechanism, so as to mark it as a service interface to be tested. The annotation information of the service interface to be tested may specifically include the responsible person, the interface name, the interface description, whether the interface is out of date, the parameters, the returned result, and other information.
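For illustration only, such a marker annotation might look like the minimal sketch below; the annotation name @TestTarget and its fields are assumptions chosen to mirror the annotation information listed above, not names taken from the patent.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Hypothetical marker annotation for a service interface to be tested.
// The fields mirror the annotation information described above.
@Target({ElementType.TYPE, ElementType.METHOD})
@Retention(RetentionPolicy.RUNTIME)
public @interface TestTarget {
    String owner();                      // responsible person
    String name();                       // interface name
    String description() default "";     // interface description
    boolean deprecated() default false;  // whether the interface is out of date
    String[] params() default {};        // parameter descriptions
    String returns() default "";         // description of the returned result
}
```

A controller class or handler method in the target system would then carry such an annotation so that the registration step described below can discover it at runtime.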
In step S101, annotation information of a service interface to be tested in the target system is read, the class and the method marked with the annotation are automatically registered in the automated test system, and then the automated test system can generate a test case corresponding to the service interface to be tested according to the registered service interface to be tested. The test case is an important document in the test process, and information such as a test target, a test environment, input parameters, test steps, expected results, test scripts and the like can be recorded in the test case.
Step S102, intercepting a service call request sent to a target system by a client to acquire request log data corresponding to the service call request.
A Java Filter mechanism, for example, can be used to intercept the service call requests sent by the client to the target system and to obtain the request log data corresponding to each service call request. When the case data needs to be determined, the collected request log data is retrieved. The request log data may include: request parameters, service interface, method, request headers, response results, session data, cookie data, and the like.
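A minimal sketch of such interception in a standard Servlet environment is shown below; the class name and the logging destination are illustrative assumptions, and a real implementation would also capture the request body, headers, session, and cookies rather than printing to the console.

```java
import javax.servlet.*;
import javax.servlet.http.HttpServletRequest;
import java.io.IOException;

// Minimal servlet Filter that intercepts service call requests, records basic
// request log data, and then forwards the call to the target system.
public class RequestLogFilter implements Filter {
    @Override
    public void init(FilterConfig filterConfig) { }

    @Override
    public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest httpReq = (HttpServletRequest) req;
        String serviceInterface = httpReq.getRequestURI();  // service interface (path)
        String method = httpReq.getMethod();                // HTTP method
        String params = httpReq.getQueryString();           // request parameters (query part)
        System.out.printf("request log: %s %s params=%s%n", method, serviceInterface, params);
        chain.doFilter(req, resp);                          // forward to the target system
    }

    @Override
    public void destroy() { }
}
```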
In an actual application scenario, test case generation and request log data collection are two independent processes with no sequential dependency between them: the request log data may be collected while, after, or before the test cases are generated, and no specific limitation is made here.
Step S103, the service interface to be tested corresponding to the test case is matched with the service interface in the request log data, and the case data of the test case is determined.
After the test cases have been generated and the request log data has been collected, the case data required by each test case can be determined from the request log data. Each test case corresponds to a service interface to be tested, and the request log data is generated by requests to the service interfaces of the target system, so it likewise corresponds to those service interfaces. The case data of a test case can therefore be conveniently determined by matching the service interface to be tested corresponding to the test case with the service interfaces in the request log data and then using the matched request log data.
Step S104, the case data is injected into the test case, the test case is executed, and a test result is obtained.
For a given test case, once its case data has been determined, a tool such as JMeter can be used to automatically inject the case data into the test case and execute it, thereby completing the test and obtaining a test result.
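Conceptually, executing a test case with injected case data amounts to replaying the recorded request parameters against the service interface under test and capturing the response. The plain-Java sketch below illustrates only this idea; it is not the JMeter-based implementation, and the class name, URL composition, and JSON body are assumptions.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Illustrative only: replays recorded case data against the service interface
// under test and returns the response body as the test result.
public class CaseExecutor {
    private final HttpClient client = HttpClient.newHttpClient();

    public String execute(String baseUrl, String serviceInterface, String jsonCaseData)
            throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + serviceInterface))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(jsonCaseData)) // inject case data
                .build();
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        return response.body(); // test result, later compared with the recorded response result
    }
}
```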
With the automated test method provided by this embodiment, test cases can be automatically generated from the annotation information of the service interfaces to be tested, request log data generated in the production environment is collected by intercepting the service call requests sent by the client to the target system, and the case data is automatically determined based on the collected request log data. Automatic maintenance of test cases and case data is thus realized, which greatly reduces maintenance cost, improves test case coverage, ensures the timeliness of the case data, effectively improves test accuracy, makes scenario coverage more comprehensive, and optimizes the testing approach.
FIG. 2a shows a flowchart of an automated test method according to another embodiment of the present invention. As shown in FIG. 2a, the method includes the following steps:
Step S201, annotation information of a service interface to be tested in a target system is read, and a first interface signature of the service interface to be tested is constructed according to the annotation information.
In order to automatically register the service interfaces to be tested of the target system in the automated test system, the service interfaces to be tested need to be annotated in the target system in advance. The annotation information of a service interface to be tested may include the responsible person, the interface name, the interface description, whether the interface is out of date, the parameters, the returned result, and other information. Taking the Java annotation mechanism as an example, an annotation can be added to a Controller: the @Controller annotation is marked on a class, and a class so marked is a Spring MVC Controller object. In step S201, a plug-in such as an annotation processing plug-in can be used to read the annotation information of the service interface to be tested in the target system, and a first interface signature of the service interface to be tested is constructed according to the annotation information. For example, the annotation information may be processed with a preset signature algorithm to obtain the first interface signature.
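As one possible illustration of such a signature algorithm (an assumption, not the algorithm specified by the patent), the annotation fields can be concatenated in a fixed order and hashed, so that any change to the annotated interface yields a different signature:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

// Sketch: derive an interface signature from the annotation information.
// If any field changes (name, description, parameters, ...), the signature changes,
// which is how expired test cases are detected later.
public final class InterfaceSignature {
    public static String of(String owner, String name, String description,
                            String[] params, String returns) throws Exception {
        String canonical = owner + "|" + name + "|" + description + "|"
                + String.join(",", params) + "|" + returns;
        byte[] hash = MessageDigest.getInstance("SHA-256")
                .digest(canonical.getBytes(StandardCharsets.UTF_8));
        StringBuilder hex = new StringBuilder();
        for (byte b : hash) hex.append(String.format("%02x", b));
        return hex.toString();
    }
}
```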
Step S202, registering the service interface to be tested into a test case list, generating a test case corresponding to the service interface to be tested, and storing the test case corresponding to the first interface signature of the service interface to be tested into a test case library.
A plug-in such as an annotation processing plug-in can be used to automatically register the classes and methods marked with the annotation into the test case list of the automated test system according to the annotation information, so that the service interface to be tested is registered in the test case list. To ensure that test cases are generated correctly, after the service interface to be tested has been registered in the test case list it can be verified, for example by checking whether its responsible person, interface name, interface description, expiration flag, parameters, and other information are problematic or non-compliant. If the verification passes, a test case corresponding to the service interface to be tested is generated, and the test case is stored in the test case library together with the first interface signature of the service interface to be tested for use in subsequent tests.
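The registration scan itself might be wired up as sketched below in a Spring application; this reuses the hypothetical @TestTarget annotation from the earlier sketch, the component name is an assumption, and proxying concerns are ignored for brevity.

```java
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationListener;
import org.springframework.context.event.ContextRefreshedEvent;
import org.springframework.stereotype.Component;

import java.lang.reflect.Method;
import java.util.Map;

// Hypothetical registration component: once the application context is ready,
// find beans carrying the marker annotation and register each annotated method
// (a service interface to be tested) into the test case list.
@Component
public class TestTargetRegistrar implements ApplicationListener<ContextRefreshedEvent> {
    @Override
    public void onApplicationEvent(ContextRefreshedEvent event) {
        ApplicationContext ctx = event.getApplicationContext();
        Map<String, Object> beans = ctx.getBeansWithAnnotation(TestTarget.class);
        for (Object bean : beans.values()) {
            for (Method m : bean.getClass().getMethods()) {
                TestTarget target = m.getAnnotation(TestTarget.class);
                if (target != null) {
                    // In a real system this would go into the test case list / library.
                    System.out.printf("registering %s#%s (%s)%n",
                            bean.getClass().getSimpleName(), m.getName(), target.name());
                }
            }
        }
    }
}
```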
With this way of generating test cases, when a newly added service interface to be tested appears in the target system, the corresponding test case can be conveniently generated from the annotation information read for that interface, so that no service interface to be tested is missed and test case coverage is effectively improved.
Step S203, the latest annotation information of the service interface to be tested corresponding to the test case is read, and a second interface signature of the service interface to be tested is constructed according to the latest annotation information.
In the daily iterative development process, the service interface to be tested is often updated. If it is updated, the test case previously generated for it in the automated test system becomes invalid along with the update, causing the test to fail. To ensure the validity of the test case during testing, when the test case is about to be used, a plug-in such as an annotation processing plug-in reads from the target system the latest annotation information of the service interface to be tested corresponding to the generated test case, and a second interface signature of the service interface to be tested is constructed according to the latest annotation information. The latest annotation information is the annotation information of the service interface to be tested in the target system at the current moment.
Step S204, judging whether the second interface signature is consistent with the first interface signature of the service interface to be tested stored in the test case library; if yes, step S207 is performed; if not, step S211 is performed.
For a certain service interface to be tested, after a second interface signature is obtained according to the latest annotation information construction, the second interface signature can be compared with the first interface signature of the service interface to be tested stored in the test case library, and whether the second interface signature is consistent with the first interface signature of the service interface to be tested stored in the test case library is judged. If the second interface signature is consistent with the first interface signature, it indicates that the annotation information of the service interface to be tested has not changed, that is, the service interface to be tested is not updated, and step S207 is executed; if the second interface signature is inconsistent with the first interface signature, it indicates that the annotation information of the service interface to be tested has changed, that is, there is an update of the service interface to be tested, step S211 is executed.
Step S205, intercept the service call request sent by the client to the target system, record the request log data corresponding to the service call request, and store the request log data in the database.
In the embodiment of the invention, the request log data generated while the client requests the target system can be collected by log interception, so that request log data generated in the production environment is collected. For example, a log interception component may intercept all service call requests sent by the client to the target system, record the request parameters, service interface, method, request headers, response results, session data, cookie data, and other relevant data corresponding to each service call request, and asynchronously store these records in the database as the request log data corresponding to the service call requests. The database may be chosen by those skilled in the art according to actual needs, for example a NoSQL database.
To reduce the performance overhead that log interception adds to each request, the request log data is preferably stored asynchronously, and the data flow can additionally be decoupled through a high-performance message channel. Request log data generated in the production environment of the target system may be collected in real time, by random sampling, or on a schedule, and the earliest or latest generated request log data may also be collected; those skilled in the art may select a collection policy according to actual needs, which is not specifically limited here.
FIG. 2b is a timing diagram illustrating a method for obtaining request log data according to an embodiment of the present invention. As shown in FIG. 2b, when the client needs to request a service provided by the target system, it sends a service call request. The log interception component intercepts the service call request sent by the client to the target system, records the request call log data of the service call request, and then forwards the service call request to the target system so that the target system calls the corresponding service interface according to the service call request to obtain request response data. While the target system returns the request response data, the log interception component intercepts the request response data returned to the client, records the request response log data of the request response data, and then sends the request response data on to the client for its use. The request call log data and the request response log data are asynchronously stored in the database as the request log data corresponding to the service call request, completing the collection of request log data generated in the production environment.
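The asynchronous handoff described above can be sketched with a plain JDK queue and a background worker; the RequestLog record and the store step below are illustrative assumptions standing in for the message channel and the actual database write.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Sketch of asynchronous log persistence: the log interception component only
// enqueues records, so the user's request is not blocked by database writes;
// a background worker drains the queue and stores the records.
public class AsyncLogWriter {
    // Illustrative record combining request call log data and request response log data.
    public record RequestLog(String serviceInterface, String method,
                             String requestParams, String responseBody) {}

    private final BlockingQueue<RequestLog> queue = new LinkedBlockingQueue<>();

    public AsyncLogWriter() {
        Thread worker = new Thread(() -> {
            try {
                while (true) {
                    RequestLog log = queue.take();  // blocks until a record arrives
                    store(log);                     // e.g. write to a NoSQL database
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }, "request-log-writer");
        worker.setDaemon(true);
        worker.start();
    }

    // Called by the interception component after the response has been sent.
    public void submit(RequestLog log) {
        queue.offer(log);
    }

    private void store(RequestLog log) {
        // Placeholder: a real implementation would persist to the chosen database.
        System.out.println("stored request log for " + log.serviceInterface());
    }
}
```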
Step S206, acquiring the collected request log data from the database.
In the embodiment of the invention, a tool such as JMeter or an intermediate program can be used to connect the automated test system with the request log data, so that when the case data needs to be determined, the collected request log data can be conveniently obtained from the database. The request log data may include: request parameters, service interface, method, request headers, response results, session data, cookie data, and the like.
Step S207, the service interface to be tested corresponding to the test case is matched with the service interface in the request log data, and the case data of the test case is determined.
Each test case corresponds to a service interface to be tested, and the request log data is generated by requests to the service interfaces of the target system, so it likewise corresponds to those service interfaces. The annotation information of the service interface to be tested corresponding to the test case can therefore be matched against the service interfaces in the request log data to obtain the matched request log data, and the request parameters in the matched request log data are then used as the case data of the test case.
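A minimal sketch of this matching step is shown below; the LogRecord fields are the illustrative ones used in the earlier sketches, and matching is simplified here to the interface path and HTTP method.

```java
import java.util.List;
import java.util.stream.Collectors;

// Sketch: select the request log records whose service interface matches the
// interface under test, and use their request parameters as case data.
public class CaseDataMatcher {
    public record LogRecord(String serviceInterface, String method,
                            String requestParams, String responseResult) {}

    public List<String> caseDataFor(String interfaceUnderTest, String httpMethod,
                                    List<LogRecord> requestLogs) {
        return requestLogs.stream()
                .filter(r -> r.serviceInterface().equals(interfaceUnderTest)
                          && r.method().equalsIgnoreCase(httpMethod))
                .map(LogRecord::requestParams)   // request parameters become case data
                .collect(Collectors.toList());
    }
}
```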
Step S208, the case data is injected into the test case, the test case is executed, and a test result is obtained.
When a test needs to be run, a tool such as JMeter or an intermediate program can be used to automatically inject the case data into the test case and execute it, thereby completing the test and obtaining a test result.
Step S209, judging whether the test result is consistent with the response result in the matched request log data; if yes, the method ends; if not, step S210 is performed.
After the test result is obtained, it can be judged whether the test result is consistent with the response result in the matched request log data. If they are consistent, no problem has been found and no further processing is needed, so the method ends; if not, step S210 is performed.
Step S210, generating abnormal alarm information.
When the test result is inconsistent with the response result in the matched request log data, it can first be checked whether the test result is nevertheless the expected result. If it is the expected result, no processing is required; if it is not, abnormal alarm information is generated so that the testers can analyze the cause of the problem according to the abnormal alarm information and resolve it.
Step S211, generating expiration prompt information of the test case.
If it is determined in step S204 that the second interface signature is inconsistent with the first interface signature of the service interface to be tested stored in the test case library, the annotation information of the service interface to be tested has changed, that is, the service interface to be tested has been updated and the previously generated test case corresponding to it has expired. The test case then needs to be marked as expired and invalid, and test case expiration prompt information is generated, so that the service interface to be tested can be verified again according to the prompt information and, once the verification passes, the latest test case corresponding to the service interface to be tested can be generated. In this way, expired test cases can be identified in time, and the maintenance cost of the test cases is greatly reduced.
With the automated test method provided by this embodiment, test cases can be automatically generated from the annotation information of the service interfaces to be tested. When a newly added service interface to be tested appears in the target system, the corresponding test case can be conveniently generated from its annotation information, so that no service interface to be tested is missed and test case coverage is effectively improved; by comparing interface signatures, it can also be conveniently determined whether a service interface to be tested has been updated and whether a test case has expired, which greatly reduces the maintenance cost of the test cases. In addition, based on the request log data collected from the production environment, the case data can be determined automatically, realizing automatic maintenance and automatic updating of the case data, which greatly reduces the maintenance cost of the case data, ensures its timeliness, effectively improves test accuracy, makes scenario coverage more comprehensive, and safeguards the quality of product releases.
FIG. 3 shows a schematic structural diagram of an automated test system according to an embodiment of the present invention. As shown in FIG. 3, the system includes: a reading module 301, a use case generation module 302, an acquisition module 303, a matching module 304, and an execution module 305.
The reading module 301 is adapted to: reading annotation information of the service interface to be tested in the target system.
The use case generation module 302 is adapted to: registering the service interface to be tested according to the annotation information, and generating a test case corresponding to the service interface to be tested.
The acquisition module 303 is adapted to: intercepting a service call request sent by a client to a target system, and acquiring request log data corresponding to the service call request.
Wherein the request log data includes: request parameters, service interfaces, methods, request headers, response results, session data, and cookie data.
The matching module 304 is adapted to: matching the service interface to be tested corresponding to the test case with the service interface in the request log data to determine the case data of the test case.
The execution module 305 is adapted to: injecting the case data into the test case, executing the test case, and obtaining a test result.
Optionally, the use case generation module 302 is further adapted to: constructing a first interface signature of the service interface to be tested according to the annotation information; registering the service interface to be tested into a test case list, generating a test case corresponding to the service interface to be tested, and storing the test case corresponding to the first interface signature of the service interface to be tested into a test case library.
Optionally, the reading module 301 is further adapted to: reading the latest annotation information of the service interface to be tested corresponding to the test case; the use case generation module 302 is further adapted to: and constructing a second interface signature of the service interface to be tested according to the latest annotation information.
The system may further include: a signature judgment module 306 and an information generation module 307. The signature determination module 306 is adapted to: judging whether the second interface signature is consistent with the first interface signature of the service interface to be tested stored in the test case library; the information generation module 307 is adapted to: if the signature judging module judges that the second interface signature is inconsistent with the first interface signature, generating expiration prompt information of the test case.
Optionally, the acquisition module 303 is further adapted to: intercepting a service call request sent by a client to a target system, recording request log data corresponding to the service call request, and storing the request log data into a database; the collected request log data is obtained from the database.
Optionally, the acquisition module 303 is further adapted to: intercepting a service call request sent by a client to a target system, recording request call log data of the service call request, and sending the service call request to the target system so that the target system calls a corresponding service interface according to the service call request to obtain request response data; intercepting request response data returned by a target system to a client, recording request response log data of the request response data, sending the request response data to the client, and asynchronously storing the request call log data and the request response log data as request log data corresponding to a service call request into a database.
Optionally, the matching module 304 is further adapted to: matching annotation information of the service interface to be tested corresponding to the test case with the service interface in the request log data to obtain matched request log data; and taking the request parameters in the matched request log data as case data of the test case.
Optionally, the system may further comprise: a result judgment module 308 and an information generation module 307. The result determination module 308 is adapted to: judging whether the test result is consistent with the response result in the matched request log data; the information generation module 307 is further adapted to: if the result judging module judges that the test result is inconsistent with the response result, abnormal alarm information is generated.
Optionally, the use case generation module 302 is further adapted to: checking a service interface to be tested; and if the verification is passed, generating a test case corresponding to the service interface to be tested.
With the automated test system provided by this embodiment, test cases can be automatically generated from the annotation information of the service interfaces to be tested. When a newly added service interface to be tested appears in the target system, the corresponding test case can be conveniently generated from its annotation information, so that no service interface to be tested is missed and test case coverage is effectively improved; by comparing interface signatures, it can also be conveniently determined whether a service interface to be tested has been updated and whether a test case has expired, which greatly reduces the maintenance cost of the test cases. In addition, based on the request log data collected from the production environment, the case data can be determined automatically, realizing automatic maintenance and automatic updating of the case data, which greatly reduces the maintenance cost of the case data, ensures its timeliness, effectively improves test accuracy, makes scenario coverage more comprehensive, and safeguards the quality of product releases.
Embodiments of the present invention provide a non-volatile computer storage medium having stored thereon at least one executable instruction that is capable of performing the automated test method of any of the method embodiments described above.
FIG. 4 illustrates a schematic diagram of a computing device according to an embodiment of the invention; the specific embodiments of the invention do not limit the specific implementation of the computing device.
As shown in FIG. 4, the computing device may include: a processor 402, a communication interface (Communications Interface) 404, a memory 406, and a communication bus 408.
Wherein:
processor 402, communication interface 404, and memory 406 communicate with each other via communication bus 408.
A communication interface 404 for communicating with network elements of other devices, such as clients or other servers.
The processor 402 is configured to execute the program 410, and may specifically perform relevant steps in the above-described embodiments of the automated test method.
In particular, program 410 may include program code including computer-operating instructions.
The processor 402 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present invention. The one or more processors included in the computing device may be of the same type, such as one or more CPUs, or of different types, such as one or more CPUs and one or more ASICs.
Memory 406 is used for storing the program 410. Memory 406 may comprise high-speed RAM memory and may also include non-volatile memory, such as at least one disk memory.
Program 410 may be specifically configured to cause processor 402 to perform the automated test method of any of the method embodiments described above. For the specific implementation of each step in program 410, reference may be made to the corresponding steps and the corresponding descriptions of the units in the above automated test embodiments, which are not repeated here. Those skilled in the art will clearly understand that, for convenience and brevity of description, the specific working procedures of the apparatus and modules described above may refer to the corresponding process descriptions in the foregoing method embodiments and are likewise not repeated here.
The algorithms and displays presented herein are not inherently related to any particular computer, virtual system, or other apparatus. Various general-purpose systems may also be used with the teachings herein. The required structure for a construction of such a system is apparent from the description above. In addition, embodiments of the present invention are not directed to any particular programming language. It will be appreciated that the teachings of embodiments of the present invention described herein may be implemented in a variety of programming languages, and the above description of specific languages is provided for disclosure of enablement and best mode of the embodiments of the present invention.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, this method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules in the apparatus of the embodiments may be adaptively changed and disposed in one or more apparatuses different from the embodiments. The modules or units or components of the embodiments may be combined into one module or unit or component and, furthermore, they may be divided into a plurality of sub-modules or sub-units or sub-components. Any combination of all features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be used in combination, except insofar as at least some of such features and/or processes or units are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments can be used in any combination.
Various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some or all of the components in accordance with embodiments of the present invention may be implemented in practice using a microprocessor or Digital Signal Processor (DSP). Embodiments of the present invention may also be implemented as a device or apparatus program (e.g., a computer program and a computer program product) for performing a portion or all of the methods described herein. Such a program embodying the embodiments of the present invention may be stored on a computer readable medium, or may have the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. Embodiments of the invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. do not denote any order. These words may be interpreted as names.

Claims (20)

1. An automated testing method, the method comprising:
reading annotation information of a service interface to be tested in a target system, constructing a first interface signature of the service interface to be tested according to the annotation information, registering the service interface to be tested, and generating a test case corresponding to the service interface to be tested; the annotation information comprises responsible person information, interface names, interface descriptions, whether the service interface is out of date, parameters and returned results;
intercepting a service call request sent by a client to the target system, and acquiring request log data corresponding to the service call request;
if a second interface signature constructed according to the latest annotation information of the service interface to be tested is consistent with the first interface signature, matching the service interface to be tested corresponding to the test case with the service interface in the request log data, and determining the case data of the test case;
and injecting the case data into the test case, and executing the test case to obtain a test result.
2. The method of claim 1, wherein the constructing a first interface signature of the service interface under test according to the annotation information, registering the service interface under test, and generating a test case corresponding to the service interface under test further comprises:
registering the service interface to be tested into a test case list, generating a test case corresponding to the service interface to be tested, and storing the test case corresponding to the first interface signature of the service interface to be tested into a test case library.
3. The method of claim 2, wherein after the generating the test case corresponding to the service interface under test, the method further comprises:
reading the latest annotation information of the service interface to be tested corresponding to the test case, and constructing a second interface signature of the service interface to be tested according to the latest annotation information;
judging whether the second interface signature is consistent with the first interface signature of the service interface to be tested stored in the test case library; if not, generating expiration prompt information of the test case.
4. The method of claim 1, wherein intercepting the service call request sent by the client to the target system, and obtaining the request log data corresponding to the service call request further comprises:
intercepting a service call request sent by a client to the target system, recording request log data corresponding to the service call request, and storing the request log data into a database;
and acquiring the collected request log data from the database.
5. The method of claim 4, wherein intercepting the service call request sent by the client to the target system, recording request log data corresponding to the service call request, and storing the request log data in a database further comprises:
intercepting a service call request sent by the client to the target system, recording request call log data of the service call request, and sending the service call request to the target system so that the target system calls a corresponding service interface according to the service call request to obtain request response data;
intercepting the request response data returned by the target system to the client, recording request response log data of the request response data, sending the request response data to the client, and asynchronously storing the request call log data and the request response log data as request log data corresponding to the service call request into a database.
6. The method according to any one of claims 1-5, wherein the matching the service interface to be tested corresponding to the test case with the service interface in the request log data, determining the case data of the test case further includes:
matching annotation information of the service interface to be tested corresponding to the test case with the service interface in the request log data to obtain matched request log data;
and taking the request parameters in the matched request log data as the case data of the test case.
7. The method of any of claims 1-5, wherein after obtaining the test result, the method further comprises:
judging whether the test result is consistent with the response result in the matched request log data; if not, generating abnormal alarm information.
8. The method according to any one of claims 1-5, wherein, before generating the test case corresponding to the service interface under test, the method further comprises: checking the service interface to be tested;
the generating the test case corresponding to the service interface to be tested specifically includes: and if the verification is passed, generating a test case corresponding to the service interface to be tested.
9. The method of any of claims 1-5, wherein the request log data comprises: request parameters, service interfaces, methods, request headers, response results, session data, and cookie data.
10. An automated testing system, the system comprising:
the reading module is suitable for reading annotation information of the service interface to be tested in the target system;
the use case generation module is suitable for constructing a first interface signature of the service interface to be tested according to the annotation information, registering the service interface to be tested and generating a test case corresponding to the service interface to be tested; the annotation information comprises responsible person information, interface names, interface descriptions, whether the service interface is out of date, parameters and returned results;
the acquisition module is suitable for intercepting a service call request sent to the target system by a client to acquire request log data corresponding to the service call request;
the matching module is suitable for matching the service interface to be tested corresponding to the test case with the service interface in the request log data if the second interface signature constructed according to the latest annotation information of the service interface to be tested is consistent with the first interface signature, and determining the case data of the test case;
and the execution module is suitable for injecting the case data into the test case, executing the test case and obtaining a test result.
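An illustrative decomposition of the five modules of claim 10 as Java interfaces; the method signatures are assumptions, since the claim only states each module's responsibility.

```java
import java.util.List;
import java.util.Optional;

/** Illustrative decomposition of the five modules of claim 10; method shapes are assumptions. */
public interface AutomatedTestModules {

    /** Reading module: reads annotation information of the service interface to be tested. */
    interface ReadingModule {
        String readAnnotationInfo(String serviceInterface);
    }

    /** Case generation module: builds the first interface signature and generates the test case. */
    interface CaseGenerationModule {
        String buildFirstSignature(String annotationInfo);
        String generateTestCase(String serviceInterface);
    }

    /** Acquisition module: intercepts service calls and collects the request log data. */
    interface AcquisitionModule {
        List<String> collectRequestLogs();
    }

    /** Matching module: matches the interface under test against the logs to pick case data. */
    interface MatchingModule {
        Optional<String> determineCaseData(String testCase, List<String> requestLogs);
    }

    /** Execution module: injects the case data and executes the test case. */
    interface ExecutionModule {
        String execute(String testCase, String caseData);
    }
}
```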
11. The system of claim 10, wherein the case generation module is further adapted to:
register the service interface to be tested in a test case list, generate the test case corresponding to the service interface to be tested, and store the test case in a test case library in correspondence with the first interface signature of the service interface to be tested.
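A minimal sketch of claim 11's bookkeeping, assuming the test case list and the test case library are in-memory collections keyed as described; a real system would likely persist them.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/** Sketch of claim 11: register interfaces in a case list and key cases by interface signature. */
public class TestCaseLibrary {
    private final List<String> testCaseList = new ArrayList<>();      // registered interfaces
    private final Map<String, String> caseLibrary = new HashMap<>();  // first signature -> test case

    public void register(String interfaceUnderTest, String firstSignature, String testCase) {
        testCaseList.add(interfaceUnderTest);       // register the interface under test
        caseLibrary.put(firstSignature, testCase);  // store the case against its first signature
    }

    public String lookup(String interfaceSignature) {
        return caseLibrary.get(interfaceSignature);
    }
}
```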
12. The system of claim 11, wherein the reading module is further adapted to read the latest annotation information of the service interface to be tested corresponding to the test case;
the case generation module is further adapted to construct a second interface signature of the service interface to be tested according to the latest annotation information;
and the system further comprises a signature judgment module adapted to judge whether the second interface signature is consistent with the first interface signature of the service interface to be tested stored in the test case library, and an information generation module adapted to generate expiration prompt information of the test case if the signature judgment module judges that the second interface signature is inconsistent with the first interface signature.
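A sketch of the expiration check in claim 12; hashing the annotation information with SHA-256 is an assumption, as the claim only requires that the first and second interface signatures be comparable.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.HexFormat;

/** Sketch of claim 12: rebuild the signature from the latest annotation info and compare it. */
public class SignatureChecker {

    /** Builds an interface signature; hashing the annotation text is an assumption of this sketch. */
    public String buildSignature(String annotationInfo) {
        try {
            byte[] digest = MessageDigest.getInstance("SHA-256")
                    .digest(annotationInfo.getBytes(StandardCharsets.UTF_8));
            return HexFormat.of().formatHex(digest);
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException("SHA-256 is required for this sketch", e);
        }
    }

    /** Returns an expiration prompt when the stored and latest signatures differ, else null. */
    public String checkExpiration(String firstSignature, String latestAnnotationInfo) {
        String secondSignature = buildSignature(latestAnnotationInfo);
        return secondSignature.equals(firstSignature)
                ? null
                : "Test case expired: the interface annotation information has changed";
    }
}
```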
13. The system of claim 10, wherein the acquisition module is further adapted to:
intercept a service call request sent by the client to the target system, record request log data corresponding to the service call request, and store the request log data in a database;
and acquire the collected request log data from the database.
14. The system of claim 13, wherein the acquisition module is further adapted to:
intercept a service call request sent by the client to the target system, record request call log data of the service call request, and send the service call request to the target system so that the target system calls a corresponding service interface according to the service call request to obtain request response data;
and intercept the request response data returned by the target system to the client, record request response log data of the request response data, send the request response data to the client, and asynchronously store the request call log data and the request response log data in a database as the request log data corresponding to the service call request.
15. The system of any of claims 10-14, wherein the matching module is further adapted to:
match the annotation information of the service interface to be tested corresponding to the test case with the service interface in the request log data to obtain matched request log data;
and take the request parameters in the matched request log data as the case data of the test case.
16. The system of any of claims 10-14, wherein the system further comprises:
a result judgment module adapted to judge whether the test result is consistent with the response result in the matched request log data;
and an information generation module adapted to generate abnormal alarm information if the result judgment module judges that the test result is inconsistent with the response result.
17. The system of any of claims 10-14, wherein the case generation module is further adapted to check the service interface to be tested and, if the verification passes, generate the test case corresponding to the service interface to be tested.
18. The system of any of claims 10-14, wherein the request log data comprises: request parameters, service interfaces, methods, request headers, response results, session data, and cookie data.
19. A computing device, comprising a processor, a memory, a communication interface, and a communication bus, wherein the processor, the memory, and the communication interface communicate with each other through the communication bus;
the memory is configured to store at least one executable instruction that causes the processor to perform operations corresponding to the automated test method according to any one of claims 1-9.
20. A computer storage medium having stored therein at least one executable instruction for causing a processor to perform operations corresponding to the automated test method of any of claims 1-9.
CN202010192525.5A 2020-03-18 2020-03-18 Automatic test method and system Active CN111221743B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010192525.5A CN111221743B (en) 2020-03-18 2020-03-18 Automatic test method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010192525.5A CN111221743B (en) 2020-03-18 2020-03-18 Automatic test method and system

Publications (2)

Publication Number Publication Date
CN111221743A CN111221743A (en) 2020-06-02
CN111221743B true CN111221743B (en) 2023-07-14

Family

ID=70828472

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010192525.5A Active CN111221743B (en) 2020-03-18 2020-03-18 Automatic test method and system

Country Status (1)

Country Link
CN (1) CN111221743B (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111813662A (en) * 2020-06-16 2020-10-23 上海中通吉网络技术有限公司 User behavior driven sustainable integration test method, device and equipment
CN111885051B (en) * 2020-07-22 2022-10-25 微医云(杭州)控股有限公司 Data verification method and device and electronic equipment
CN113760699A (en) * 2020-07-31 2021-12-07 北京沃东天骏信息技术有限公司 Server performance test system and method
CN114205273B (en) * 2020-08-26 2023-09-15 腾讯科技(深圳)有限公司 System test method, device and equipment and computer storage medium
CN113448836A (en) * 2020-10-13 2021-09-28 北京新氧科技有限公司 Software interface testing method and device, electronic equipment and storage medium
CN112463588A (en) * 2020-11-02 2021-03-09 北京健康之家科技有限公司 Automatic test system and method, storage medium and computing equipment
CN112532490A (en) * 2020-11-30 2021-03-19 武汉悦学帮网络技术有限公司 Regression testing system and method and electronic equipment
CN113760722A (en) * 2021-01-13 2021-12-07 北京京东振世信息技术有限公司 Test system and test method
CN112994976A (en) * 2021-02-23 2021-06-18 北京百度网讯科技有限公司 Gateway testing method and device, electronic equipment and storage medium
CN113138934B (en) * 2021-05-14 2024-01-19 杭州网易云音乐科技有限公司 Automatic test method, medium, device and computing equipment
CN113407444B (en) * 2021-06-03 2022-11-25 中科曙光国际信息产业有限公司 Interface test case generation method, device, equipment and storage medium
CN113342679A (en) * 2021-06-29 2021-09-03 汇付天下有限公司 Interface test method and test device
CN114003451B (en) * 2021-10-27 2023-08-25 苏州浪潮智能科技有限公司 Interface testing method, device, system and medium
CN117493218B (en) * 2023-12-27 2024-03-22 南京翼辉信息技术有限公司 VSOA-based test system and test method

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105005532B (en) * 2015-08-05 2017-11-24 广东欧珀移动通信有限公司 A kind of system and method for automatic test application programming interfaces stability
CN107908540B (en) * 2017-07-26 2021-04-06 平安壹钱包电子商务有限公司 Test case creating method and device, computer equipment and medium
US20190073292A1 (en) * 2017-09-05 2019-03-07 Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America State machine software tester
CN107908549A (en) * 2017-10-24 2018-04-13 北京小米移动软件有限公司 Method for generating test case, device and computer-readable recording medium
CN107894952A (en) * 2017-11-08 2018-04-10 中国平安人寿保险股份有限公司 Generation method, device, equipment and the readable storage medium storing program for executing of interface testing use-case
US10678683B2 (en) * 2018-03-07 2020-06-09 Jpmorgan Chase Bank, N.A. System and method for automated service layer testing and regression
CN108509337A (en) * 2018-03-15 2018-09-07 链家网(北京)科技有限公司 The dynamic testing method and device called based on interface
CN108829587A (en) * 2018-05-29 2018-11-16 平安普惠企业管理有限公司 Test case parameter replacing method, device, computer equipment and storage medium
CN109344056B (en) * 2018-09-07 2021-02-26 武汉达梦数据库股份有限公司 Test method and test device
CN109446071A (en) * 2018-09-26 2019-03-08 深圳壹账通智能科技有限公司 Interface test method, interface test device, electronic equipment and storage medium
CN109308266A (en) * 2018-11-30 2019-02-05 北京微播视界科技有限公司 Construction method, test method, device, equipment and the medium of test case
CN109684209A (en) * 2018-12-17 2019-04-26 北京奇虎科技有限公司 A kind of method for generating test case, device and electronic equipment
CN109582588B (en) * 2018-12-25 2022-02-22 迈普通信技术股份有限公司 Test case generation method and device and electronic equipment
CN109947646A (en) * 2019-03-13 2019-06-28 平安信托有限责任公司 Interface test method, device, computer equipment and storage medium
CN110503297B (en) * 2019-07-16 2023-08-25 创新先进技术有限公司 Service scene acquisition method and device, electronic equipment and medium
CN110765018B (en) * 2019-10-25 2023-06-13 上海中通吉网络技术有限公司 Automatic interface testing method and equipment

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107832217A (en) * 2017-11-09 2018-03-23 政采云有限公司 A kind of automated testing method and device
CN108600048A (en) * 2018-04-12 2018-09-28 平安科技(深圳)有限公司 Interface test method, device, equipment and computer readable storage medium
CN109446065A (en) * 2018-09-18 2019-03-08 深圳壹账通智能科技有限公司 User tag test method, device, computer equipment and storage medium
CN110825619A (en) * 2019-10-12 2020-02-21 深圳壹账通智能科技有限公司 Automatic generation method and device of interface test case and storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Carlos Pacheco et al. Randoop: feedback-directed random testing for Java. OOPSLA '07: Companion to the 22nd ACM SIGPLAN Conference on Object-Oriented Programming Systems and Applications Companion. 2007, 815-816. *
A distributed visual Dubbo interface testing platform; Li Yanli et al.; Journal of East China Normal University (Natural Science), No. 4; 120-132, 143 *
An automatic generation framework for interface test cases based on communication protocols; Liu Luo et al.; Measurement & Control Technology, Vol. 39, No. 1; 46-54 *

Also Published As

Publication number Publication date
CN111221743A (en) 2020-06-02

Similar Documents

Publication Publication Date Title
CN111221743B (en) Automatic test method and system
CN109688202B (en) Interface data processing method and device, computing equipment and storage medium
CN108683562B (en) Anomaly detection positioning method and device, computer equipment and storage medium
CN109271359B (en) Log information processing method and device, electronic equipment and readable storage medium
US9910726B2 (en) System dump analysis
CN114117311B (en) Data access risk detection method and device, computer equipment and storage medium
CN111522728A (en) Method for generating automatic test case, electronic device and readable storage medium
CN112532490A (en) Regression testing system and method and electronic equipment
CN114356785B (en) Data processing method and device, electronic equipment and storage medium
CN114422564A (en) Audit tracing method and device for access data, computer equipment and storage medium
CN110750443A (en) Webpage testing method and device, computer equipment and storage medium
US20120054724A1 (en) Incremental static analysis
US20070245313A1 (en) Failure tagging
CN111324510A (en) Log processing method and device and electronic equipment
CN115048257A (en) System service function verification method and device, computer equipment and storage medium
CN108650123B (en) Fault information recording method, device, equipment and storage medium
CN113742250A (en) Automatic interface testing method and device
CN112416762A (en) API test method and device, equipment and computer readable storage medium
CN112632419A (en) Domain name pre-resolution configuration method and device, computer equipment and storage medium
CN116303320A (en) Real-time task management method, device, equipment and medium based on log file
CN113282496B (en) Automatic interface testing method, device, equipment and storage medium
CN111198798B (en) Service stability measuring method and device
CN114416420A (en) Equipment problem feedback method and system
CN112363944A (en) Method and equipment for comparing return values of multiple environment interfaces
CN113435830A (en) Mail information summarizing method, system, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant