CN111930611A - Statistical method and device for test data - Google Patents

Statistical method and device for test data

Info

Publication number: CN111930611A
Authority: CN (China)
Prior art keywords: interfaces, module, test, tested, modules
Legal status: Granted (the legal status is an assumption, not a legal conclusion)
Application number: CN202010654024.4A
Other languages: Chinese (zh)
Other versions: CN111930611B (en)
Inventors: 李彩新, 张金鑫, 王发明, 杨广奇, 陈元兵, 宋蓓蓓
Current Assignee: Nanjing Leading Technology Co Ltd (the listed assignees may be inaccurate)
Original Assignee: Nanjing Leading Technology Co Ltd
Application filed by Nanjing Leading Technology Co Ltd
Priority to CN202010654024.4A; publication of CN111930611A; application granted; publication of CN111930611B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3668: Software testing
    • G06F 11/3672: Test management

Abstract

An embodiment of the invention provides a statistical method and device for test data. The method comprises: scanning the service management application of a system under test to obtain the number of modules to be tested; screening, from the modules to be tested and according to the automated test cases, the modules newly added in the current period and the modules for which automated test cases were written in the current period; querying, over all interfaces in the modules to be tested, the valid test interfaces newly added in the previous period to obtain the coverage count of interfaces newly added in the previous period; scanning the web service framework of the test environment to obtain the per-module interface counts and the total interface count; and sending the identifiers and number of the modules to be tested, the identifiers of the modules for which automated test cases were written in the current period, the identifiers of the modules newly added in the current period, the coverage count of interfaces newly added in the previous period, the per-module interface counts, and the total interface count to the corresponding user responsible for testing. Embodiments of the invention thereby enable the user responsible for testing to comprehensively grasp the overall automated-testing status of the system under test.

Description

Statistical method and device for test data
Technical Field
The invention relates to the technical field of computers, in particular to a statistical method and a statistical device for test data.
Background
As computers have evolved, more and more products have become networked or intangible products, such as software of various kinds, and software itself has developed from ordinary desktop programs into terminal systems. Before a software product, or a similar product or project, comes to market, a series of indicators such as its performance is typically verified by automated testing to obtain test results. The product undergoing automated testing may be referred to as the system under test.
However, a test result merely indicates whether a given module or interface of the system under test passed the automated test. When the system under test contains many modules or interfaces, the test results alone cannot give a comprehensive picture of its overall automated-testing status.
Disclosure of Invention
In view of the above, embodiments of the present invention are proposed to provide a statistical method and apparatus for test data that overcome, or at least partially solve, the above problems.
To solve the above problem, according to a first aspect of the embodiments of the present invention, a statistical method for test data is disclosed, comprising: scanning the service management application of the system under test to obtain the number of modules to be tested contained in the system under test; screening, from all modules to be tested and according to the automated test cases in the test environment of the system under test, the modules newly added in the current period and the modules for which automated test cases were written in the current period, where a module newly added in the current period denotes a module for which no automated test case has been written in the current period; querying, over all interfaces of all modules to be tested, the valid test interfaces newly added in the previous period to obtain the coverage count of interfaces newly added in the previous period, where a valid test interface denotes an interface for which an automated test case has been written and which has passed testing; scanning the web service framework of the test environment to obtain the per-module interface counts and the total interface count, where the per-module interfaces are the interfaces contained in each module to be tested; and sending the identifiers and number of the modules to be tested, the identifiers of the modules for which automated test cases were written in the current period, the identifiers of the modules newly added in the current period, the coverage count of interfaces newly added in the previous period, the per-module interface counts, and the total interface count to the corresponding user responsible for testing.
Optionally, after the scanning of the web service framework of the test environment, the method further includes: traversing all interfaces of all modules to be tested and querying whether a valid test-interface case has been written for each interface, to obtain the per-module uncovered-interface count and details and the number of interfaces under test debugging, where an uncovered interface of a module denotes an interface of that module for which automated testing has not been implemented; and summing the per-module uncovered-interface counts to obtain the total number of uncovered interfaces.
Optionally, the method further comprises: querying the test-handover dates of all uncovered interfaces, and treating any uncovered interface whose time since handover reaches or exceeds a preset date threshold as an overdue interface; and obtaining the interface data of all modules to be tested in the previous period and the number of interfaces newly added to the test environment in the current period, and counting the interfaces handed over for testing in the current period from the interface data and the newly added interface count.
Optionally, the method further comprises: obtaining the coverage rate of each module from that module's uncovered-interface count and interface count; obtaining the overall coverage rate from the total uncovered-interface count and the total interface count; obtaining first input parameters of the interfaces of all modules to be tested in the web service framework and second input parameters of the interfaces of all modules to be tested in the test environment; and deriving the change in input parameters from the first and second input parameters.
Optionally, deriving the change in input parameters from the first and second input parameters includes: for the same interface, if the parameter types of its first and second input parameters differ, then when the first input parameter is of the newly added type, the first input parameter of that interface is a newly added parameter, and when the first input parameter is of the missing type, the first input parameter of that interface is a deleted parameter.
Optionally, the method further comprises: scanning the test reports of the test environment to obtain the test pass rate of all modules to be tested; scanning all interfaces of the modules to be tested in the development environment and all interfaces of the modules to be tested in the test environment, and treating interfaces present in the development environment but absent from the test environment as interfaces not yet handed over for testing; and treating those not-yet-handed-over interfaces for which valid test-interface cases have been written as debugging interfaces in the development environment, and counting them to obtain the number of debugging interfaces.
According to a second aspect of the embodiments of the present invention, a statistical apparatus for test data is also disclosed, comprising: a scanning module configured to scan the service management application of the system under test and obtain the number of modules to be tested contained in it; a screening module configured to screen, from all modules to be tested and according to the automated test cases in the test environment of the system under test, the modules newly added in the current period and the modules for which automated test cases were written in the current period, where a module newly added in the current period denotes a module for which no automated test case has been written in the current period; a query module configured to query, over all interfaces of all modules to be tested, the valid test interfaces newly added in the previous period to obtain the coverage count of interfaces newly added in the previous period, where a valid test interface denotes an interface for which an automated test case has been written and which has passed testing; the scanning module being further configured to scan the web service framework of the test environment to obtain the per-module interface counts and the total interface count, where the per-module interfaces are the interfaces contained in each module to be tested; and a sending module configured to send the identifiers and number of the modules to be tested, the identifiers of the modules for which automated test cases were written in the current period, the identifiers of the modules newly added in the current period, the coverage count of interfaces newly added in the previous period, the per-module interface counts, and the total interface count to the corresponding user responsible for testing.
Optionally, the query module is further configured, after the scanning module scans the web service framework of the test environment, to traverse all interfaces of all modules to be tested, query whether a valid test-interface case has been written for each interface, and obtain the per-module uncovered-interface count and details and the number of interfaces under test debugging, where an uncovered interface of a module denotes an interface of that module for which automated testing has not been implemented. The apparatus further comprises a summarizing module configured to sum the per-module uncovered-interface counts to obtain the total number of uncovered interfaces.
Optionally, the query module is further configured to query the test-handover dates of all uncovered interfaces and to treat any uncovered interface whose time since handover reaches or exceeds a preset date threshold as an overdue interface. The apparatus further comprises an acquisition module configured to obtain the interface data of all modules to be tested in the previous period and the number of interfaces newly added to the test environment in the current period, and to count the interfaces handed over for testing in the current period from the interface data and the newly added interface count.
Optionally, the apparatus further comprises: a calculation module configured to obtain the coverage rate of each module from that module's uncovered-interface count and interface count, the calculation module being further configured to obtain the overall coverage rate from the total uncovered-interface count and the total interface count; the acquisition module being further configured to obtain first input parameters of the interfaces of all modules to be tested in the web service framework and second input parameters of the interfaces of all modules to be tested in the test environment; and a statistics module configured to derive the change in input parameters from the first and second input parameters.
Optionally, the statistics module is configured, for a same interface whose first and second input parameters differ in parameter type, to treat the first input parameter of that interface as a newly added parameter when it is of the newly added type, and as a deleted parameter when it is of the missing type.
Optionally, the scanning module is further configured to scan the test reports of the test environment to obtain the test pass rates of all modules to be tested, and to scan all interfaces of the modules to be tested in the development environment and all interfaces of the modules to be tested in the test environment, treating interfaces present in the development environment but absent from the test environment as interfaces not yet handed over for testing; the statistics module is further configured to treat those not-yet-handed-over interfaces for which valid test-interface cases have been written as debugging interfaces in the development environment, and to count them to obtain the number of debugging interfaces.
The embodiments of the invention have the following advantages:
According to the statistical method for test data provided by the embodiments of the invention, the service management application of the system under test is scanned to obtain the identifiers and number of all modules to be tested that the system contains, and the automated test cases of the test environment are scanned to obtain the identifiers of those modules, among all modules to be tested, for which automated test cases were written in the current period. The identifiers of the modules newly added in the current period are then screened out from the identifiers of all modules to be tested and of the modules with written cases; a module newly added in the current period is one for which no automated test case has been written in the current period. Next, the valid test interfaces newly added in the previous period are queried, a valid test interface being an interface for which an automated test case has been written and which has passed testing, to obtain the coverage count of interfaces newly added in the previous period. The web service framework of the test environment is also scanned to obtain the per-module interface counts and the total interface count, the per-module interfaces being the interfaces contained in each module to be tested.
Finally, the obtained identifiers and number of the modules to be tested, the identifiers of the modules for which automated test cases were written in the current period, the identifiers of the modules newly added in the current period, the count of interfaces newly added in the previous period, the per-module interface counts, and the total interface count are sent to the corresponding user responsible for testing, so that the user responsible for testing can comprehensively grasp the overall automated-testing status of the system under test.
Drawings
FIG. 1 is a flow chart illustrating the steps of one embodiment of a statistical method of test data according to the present invention;
FIG. 2 is a schematic flow diagram of a scheme of the present invention for automatically checking, tracking, and compiling statistics on interface status;
FIG. 3 is a block diagram of an embodiment of a statistical apparatus for test data according to the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Referring to FIG. 1, a flow chart of the steps of an embodiment of the statistical method for test data of the present invention is shown. The method specifically comprises the following steps:
Step 101, scanning the service management application of the system under test to obtain the number of modules to be tested contained in the system under test.
In the embodiment of the invention, the modules of the system under test can be registered with a service management application and managed through it. In practice, the service management application may be Consul (a service management tool that supports multiple data centers and distributed high availability, with shared configuration), so Consul can be scanned to obtain the identifiers and number of the modules to be tested that the system under test has registered with it. That is, scanning Consul yields the identifiers and number of all modules under test.
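As a minimal sketch (not the patent's implementation), the module list can be derived from the JSON that Consul's catalog endpoint, GET /v1/catalog/services, returns; the service names below are made up for illustration:

```python
import json

def modules_from_consul_catalog(catalog_json: str):
    """Extract registered module identifiers and their count from the JSON
    returned by Consul's catalog endpoint (GET /v1/catalog/services), which
    maps each registered service name to its list of tags."""
    services = json.loads(catalog_json)
    module_ids = sorted(services)  # the service names are the module identifiers
    return module_ids, len(module_ids)

# Sample payload in the shape Consul's catalog API returns.
sample = '{"order-service": ["v1"], "user-service": [], "pay-service": ["grpc"]}'
ids, count = modules_from_consul_catalog(sample)
print(ids, count)  # ['order-service', 'pay-service', 'user-service'] 3
```

In a real deployment the JSON would come from an HTTP GET against the Consul agent rather than a literal string.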
Step 102, screening, from all modules to be tested and according to the automated test cases in the test environment of the system under test, the modules newly added in the current period and the modules for which automated test cases were written in the current period.
In the embodiment of the present invention, the test environment is the operating environment in which the system under test is tested automatically. Scanning the automated test cases of the test environment reveals which of the modules to be tested have had automated test cases written in the current period. That is, by scanning the automated test cases, it can be determined for each module to be tested whether an automated test case has been written for it. In the current period a module therefore either has a written automated test case or does not; a module for which no automated test case has been written in the current period may be called a module newly added in the current period.
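The screening in this step can be sketched as a simple set partition; the module names are illustrative:

```python
def screen_modules(all_modules, modules_with_cases):
    """Split the modules under test into those that already have automated
    test cases this period and those that do not (the newly added modules)."""
    cases = set(modules_with_cases)
    written = sorted(m for m in all_modules if m in cases)
    newly_added = sorted(m for m in all_modules if m not in cases)
    return written, newly_added

written, newly_added = screen_modules(
    ["user-service", "order-service", "pay-service"], {"user-service"}
)
print(written, newly_added)  # ['user-service'] ['order-service', 'pay-service']
```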
The period in the embodiment of the present invention may be a unit of time such as an hour, a day, a week, or a month; the embodiment does not specifically limit the value or unit of the period.
Step 103, querying, over all interfaces of all modules to be tested, the valid test interfaces newly added in the previous period to obtain the coverage count of interfaces newly added in the previous period.
In the embodiment of the invention, a valid test interface is an interface for which an automated test case has been written and which has passed testing. The count of interfaces newly added in the previous period thus represents the number of interfaces that gained a written, passing automated test case in that period, and also reflects the testers' workload during that period.
Step 104, scanning the web service framework of the test environment to obtain the per-module interface counts and the total interface count.
In the embodiment of the invention, the web service framework can be used to generate, describe, invoke, and visualize RESTful-style services. REST is a software architecture style rather than a standard; it merely provides a set of design principles and constraints and is mainly used for client-server interaction software. Software designed in this style tends to be simpler, better layered, and easier to extend with mechanisms such as caching. In practice the web service framework may be Swagger; scanning Swagger yields a detailed interface list together with the parameter information of each interface. The scan result can then be parsed to obtain the per-module interface count, that is, the number of interfaces contained in each module to be tested, and these per-module counts can be summed to obtain the total number of interfaces across all modules to be tested.
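As a hedged sketch of the counting step, assuming the Swagger scan yields a standard OpenAPI document and that the first path segment identifies the module (real deployments might group by tag instead):

```python
import json
from collections import Counter

def interface_counts(openapi_json: str):
    """Count interfaces per module and in total from an OpenAPI/Swagger
    document, treating the first path segment as the module name
    (an assumption made for this sketch)."""
    doc = json.loads(openapi_json)
    per_module = Counter()
    for path, operations in doc.get("paths", {}).items():
        module = path.strip("/").split("/")[0]
        per_module[module] += len(operations)  # one interface per HTTP method
    return dict(per_module), sum(per_module.values())

sample = """{"paths": {"/user/login": {"post": {}}, "/user/info": {"get": {}},
             "/order/create": {"post": {}, "get": {}}}}"""
per_module, total = interface_counts(sample)
print(per_module, total)  # {'user': 2, 'order': 2} 4
```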
It should be noted that steps 101 to 104 may be executed sequentially or in parallel; the embodiment of the present invention does not specifically limit the execution order among steps 101 to 104.
Step 105, sending the identifiers and number of the modules to be tested, the identifiers of the modules for which automated test cases were written in the current period, the identifiers of the modules newly added in the current period, the coverage count of interfaces newly added in the previous period, the per-module interface counts, and the total interface count to the corresponding user responsible for testing.
In the embodiment of the present invention, the test data obtained in steps 101 to 104 are sent to the corresponding user responsible for testing, who can then sort, compare, and analyze them to determine the automated-testing status of the system under test. The test data may include, but are not limited to: the number of modules to be tested, the identifiers of the modules for which automated test cases were written in the current period, the identifiers of the modules newly added in the current period, the coverage count of interfaces newly added in the previous period, the per-module interface counts, and the total interface count.
In an exemplary embodiment of the present invention, after the web service framework of the test environment has been scanned, all interfaces of all modules to be tested may additionally be traversed, querying for each interface whether a valid test-interface case has been written. Within each module to be tested, an interface without a written valid test-interface case is an uncovered interface of that module, and an interface with one is a covered interface of that module. This yields the per-module uncovered-interface count and details, which may include, but are not limited to: the Uniform Resource Locator (URL), input parameters, description information, and notes. An interface for which a valid test-interface case has been written but which fails the test is called an interface under test debugging, so the number of interfaces under test debugging can also be obtained.
After the per-module uncovered-interface counts have been obtained for all modules to be tested, they can be summed to obtain the total number of uncovered interfaces.
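A minimal sketch of the per-module traversal and the summing step, under the assumption that covered interfaces are identified by their URLs:

```python
def uncovered_stats(interfaces_by_module, covered_interfaces):
    """Per-module and total uncovered-interface counts. interfaces_by_module
    maps module name to a list of interface URLs; covered_interfaces is the
    set of URLs that already have a valid (written and passing) test case."""
    covered = set(covered_interfaces)
    per_module = {
        module: [url for url in urls if url not in covered]
        for module, urls in interfaces_by_module.items()
    }
    counts = {module: len(urls) for module, urls in per_module.items()}
    return counts, sum(counts.values())

counts, total = uncovered_stats(
    {"user": ["/user/login", "/user/info"], "order": ["/order/create"]},
    covered_interfaces={"/user/login"},
)
print(counts, total)  # {'user': 1, 'order': 1} 2
```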
In an exemplary embodiment of the present invention, the test-handover dates of all uncovered interfaces may be queried; if the time since an interface's handover date reaches or exceeds a preset date threshold, the interface is treated as an overdue interface. In practice the preset date threshold may be set to 3 days; the embodiment of the present invention does not specifically limit the threshold's value or unit.
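The overdue check can be sketched with the standard datetime module; the 3-day threshold mirrors the example in the text, and the URLs and dates are illustrative:

```python
from datetime import date, timedelta

def overdue_interfaces(handover_dates, today, threshold_days=3):
    """Return the uncovered interfaces whose test-handover date is at least
    threshold_days old relative to today."""
    limit = timedelta(days=threshold_days)
    return sorted(url for url, d in handover_dates.items() if today - d >= limit)

dates = {"/user/login": date(2020, 7, 1), "/order/pay": date(2020, 7, 8)}
print(overdue_interfaces(dates, today=date(2020, 7, 9)))  # ['/user/login']
```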
In an exemplary embodiment of the present invention, the interface data of all modules to be tested in the previous period may be obtained; the interface data may include, but are not limited to: counts, URLs, and description information. The number of interfaces newly added to the test environment in the current period can also be obtained, and the number of interfaces handed over for testing in the current period is then derived from the interface data and the newly added interface count. That is, if an interface did not exist in the previous period but exists in the current period's test environment, it is treated as an interface handed over for testing in the current period.
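Counting the interfaces handed over for testing in the current period then reduces to a set difference over interface URLs (a sketch, not the patent's exact bookkeeping):

```python
def handed_over_this_period(prev_urls, current_urls):
    """Interfaces present in the current period's test environment but absent
    from the previous period's records are counted as handed over now."""
    new = sorted(set(current_urls) - set(prev_urls))
    return new, len(new)

new, n = handed_over_this_period(["/user/login"], ["/user/login", "/order/pay"])
print(new, n)  # ['/order/pay'] 1
```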
In an exemplary embodiment of the present invention, after the per-module uncovered-interface count and interface count have been obtained, the per-module coverage rate can be derived from them: subtracting the uncovered count from the interface count gives the covered count, and dividing the covered count by the interface count gives the module's coverage rate. Similarly, after the total uncovered-interface count and the total interface count have been obtained, the overall coverage rate can be derived as the total covered count divided by the total interface count.
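In formula form, with the covered count taken as the interface count minus the uncovered count:

```python
def coverage(total_interfaces: int, uncovered: int) -> float:
    """Coverage rate of a module (or of the whole system under test): the
    share of interfaces that already have a valid automated test case."""
    if total_interfaces == 0:
        return 0.0
    return (total_interfaces - uncovered) / total_interfaces

print(coverage(20, 5))    # per-module example: 0.75
print(coverage(100, 40))  # overall example:    0.6
```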
In an exemplary embodiment of the present invention, the first input parameters of the interfaces of all modules to be tested in the web service framework and the second input parameters of those interfaces in the test environment may be obtained, and the change in input parameters is then derived from the first and second input parameters. In practice, for a given interface whose first and second input parameters differ in parameter type: when the first input parameter is of the newly added type, it is a newly added parameter of that interface; when it is of the missing type, it is a deleted parameter of that interface.
Moreover, if an interface no longer exists in the web service framework but still exists in the test environment, that interface can be regarded as a deleted interface.
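The parameter diff for a single interface can be sketched as two set differences; the parameter names below are illustrative:

```python
def param_changes(framework_params, testenv_params):
    """Diff an interface's input parameters between the web service framework
    (first input parameters) and the test environment (second input
    parameters): present only in the framework means newly added, present
    only in the test environment means deleted."""
    fw, te = set(framework_params), set(testenv_params)
    return {"added": sorted(fw - te), "deleted": sorted(te - fw)}

print(param_changes(["uid", "token", "lang"], ["uid", "token", "city"]))
# {'added': ['lang'], 'deleted': ['city']}
```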
In an exemplary embodiment of the present invention, the test reports of the test environment may be scanned to obtain the test pass rate of all modules to be tested. The interfaces of all modules to be tested in the development environment and in the test environment may also be scanned, and interfaces present in the development environment but absent from the test environment are treated as interfaces not yet handed over for testing. If an interface exists in both the development environment and the test environment, it is an old interface that was tested earlier, and no statistics are necessary for it. A not-yet-handed-over interface for which a valid test-interface case has been written can be counted as a debugging interface in the development environment, giving the number of debugging interfaces; one for which no valid test-interface case has been written is counted as a not-yet-debugged interface, giving the number of not-yet-debugged interfaces.
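A sketch of the development-versus-test classification, assuming interfaces are compared by URL:

```python
def classify_dev_interfaces(dev_urls, test_urls, urls_with_cases):
    """Classify development-environment interfaces: those absent from the
    test environment are not yet handed over for testing; among those, the
    ones that already have a valid test-interface case are debugging
    interfaces, and the rest are not yet debugged."""
    not_handed_over = set(dev_urls) - set(test_urls)
    debugging = sorted(not_handed_over & set(urls_with_cases))
    not_debugged = sorted(not_handed_over - set(urls_with_cases))
    return debugging, not_debugged

dbg, todo = classify_dev_interfaces(
    dev_urls=["/user/login", "/order/pay", "/cart/add"],
    test_urls=["/user/login"],
    urls_with_cases=["/order/pay"],
)
print(dbg, todo)  # ['/order/pay'] ['/cart/add']
```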
In an exemplary embodiment of the present invention, besides the identifiers and number of the modules to be tested, the identifiers of the modules for which automated test cases were written in the current period, the identifiers of the modules newly added in the current period, the count of interfaces newly added in the previous period, the per-module interface counts, and the total interface count, the following may also be sent to the corresponding user responsible for testing, for example by e-mail or instant-messaging tools: the per-module uncovered-interface counts and details, the number of interfaces under test debugging, the total uncovered-interface count, the identifiers of overdue interfaces, the number of interfaces handed over for testing in the current period, the per-module and overall coverage rates, the input-parameter changes, the test pass rate, the identifiers of the handed-over interfaces, and the number of debugging interfaces.
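Packaging the summary for delivery by e-mail might look like the following sketch using Python's standard email and smtplib modules; the subject line, addresses, and SMTP host are placeholders, not part of the patent:

```python
import smtplib
from email.message import EmailMessage

def build_report(summary: str, sender: str, recipient: str) -> EmailMessage:
    """Package the aggregated statistics as an e-mail for the user
    responsible for testing (addresses here are placeholders)."""
    msg = EmailMessage()
    msg["Subject"] = "Automated-test statistics report"
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content(summary)
    return msg

# Sending would then be a one-liner against the team's SMTP host (assumed):
# with smtplib.SMTP("smtp.example.com") as s:
#     s.send_message(build_report("coverage: 75%", "qa@example.com", "lead@example.com"))
```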
Based on the above description of the statistical method for test data, a scheme for automatically checking, tracking, and compiling statistics on interface status is described below, as shown in FIG. 2.
[1] Scanning the Consul of the system under test:
Scan the Consul of the system under test to obtain the modules under test that the system has registered with Consul, and thereby the total number of modules under test.
[2] Scanning the automated test cases of the test environment:
Compare the modules for which automated test cases have been written in the current test environment with all modules to be tested of the system under test; any module to be tested for which no automated test case has been written is counted as a newly added module.
Query the valid test interfaces newly added yesterday to obtain the coverage count of yesterday's newly added interfaces.
[3] Scanning a test environment swagger of the module to be tested:
and analyzing the scanning result to obtain the total number of the single module interfaces.
And summarizing the total number of the interfaces of all the modules to be tested to obtain the total number of all the interfaces.
And querying whether each interface writes an effective test interface case or not by traversing all the interfaces of the module to be tested, and obtaining the total number of the uncovered interfaces of the single module and the details of the uncovered interfaces. And if the single module does not cover the interface and the interface use case is compiled, but the debugging is not completed, counting the number of the test debugging interfaces.
And (4) counting as an overdue interface if the number of the transfer test data exceeds 3 days by inquiring the transfer test data of the uncovered interface.
And counting the number of the current conversion test interfaces by comparing the earliest statistical result of the module to be tested to the new increment of the test environment interface of the current statistical time. The statistics may include, but are not limited to: URL, input parameters, description information, notes, etc.
The numbers of uncovered interfaces of all modules under test are summed to obtain the total number of all uncovered interfaces.
The coverage rate of a single module is obtained from the total number of uncovered interfaces of the single module and the total number of interfaces of the single module (i.e., the share of the module's interfaces that are covered).
The total coverage rate is obtained from the total number of uncovered interfaces and the total number of interfaces across all modules under test.
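A sketch of the coverage computation (coverage is taken here as the share of interfaces that are covered, with a guard for modules that declare no interfaces):

```python
def coverage_rate(uncovered, total):
    """Share of a module's interfaces covered by effective test cases."""
    if total == 0:
        return 0.0
    return (total - uncovered) / total
```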
Changed interfaces are found by comparing the input parameters of each interface in the Swagger of the module under test with the input parameters of the same interface in the test environment, yielding the input-parameter change details.
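Comparing input parameters between the module's Swagger and the test environment is again a set comparison over parameter names; a sketch matching the added/missing classification described later in this document:

```python
def diff_input_params(swagger_params, test_env_params):
    """Classify parameter changes: present only in the Swagger doc => added,
    present only in the test environment => deleted."""
    added = sorted(set(swagger_params) - set(test_env_params))
    deleted = sorted(set(test_env_params) - set(swagger_params))
    return {"added": added, "deleted": deleted}
```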
[4] Scanning the test report of the test environment:
The test pass rate of each module under test is obtained by traversing the test report.
[5] Scanning the development-environment Swagger of the modules under test:
By traversing all interfaces of the development environment of the module under test, each interface is checked against the interfaces of the test environment. If an interface is not present in the test environment, it is counted as not yet transferred to testing; if an interface case has already been written for such an interface, it is counted as a development-environment debugging interface.
[6] Summarizing the statistics and notifying the responsible testers:
The collected summary information and the detailed statistics of each module are sent to the responsible testers by mail and instant messaging tools.
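A sketch of assembling the summary mail with the standard library (the SMTP host and addresses are placeholders; the send itself would require a reachable server):

```python
import smtplib
from email.message import EmailMessage

def build_summary_mail(stats, sender, recipients):
    """Assemble the per-module coverage summary as a plain-text mail."""
    msg = EmailMessage()
    msg["Subject"] = "Automated test coverage summary"
    msg["From"] = sender
    msg["To"] = ", ".join(recipients)
    body = "\n".join(
        f"{module}: {info['covered']}/{info['total']} interfaces covered"
        for module, info in stats.items()
    )
    msg.set_content(body)
    return msg

def send_summary(msg, smtp_host="smtp.example.com"):  # placeholder host
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)
```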
According to the statistical method for test data provided by the embodiment of the invention, the service management application of the system under test is scanned to obtain the identifiers and number of all modules under test contained in the system, and the automated test cases of the test environment are scanned to obtain the identifiers of the modules for which automated test cases have been written in the current period. The identifiers of the modules newly added in the current period are then screened out from the identifiers of all modules under test and the identifiers of the modules with written cases; a newly added module is one for which no automated test case has been written in the current period. Next, the effective test interfaces newly added in the previous period are queried, where an effective test interface is one for which an automated test case has been written and has passed testing, which yields the number of interfaces newly covered in the previous period. The web service framework of the test environment is then scanned to obtain the number of interfaces of each single module and the number of all interfaces, where the single-module interfaces are the interfaces contained in each module under test.
Finally, the obtained identifiers and number of modules under test, the identifiers of modules with test cases written in the current period, the identifiers of newly added modules in the current period, the number of interfaces newly covered in the previous period, the number of single-module interfaces, and the number of all interfaces are sent to the corresponding responsible testers, so that they can comprehensively grasp the overall state of automated testing of the system under test.
For a tester, the embodiment of the invention automatically obtains the test data the tester needs, such as the modules newly added in the current period, uncovered interfaces, interfaces not yet transferred to testing, and the pass rates of the modules under test. Interfaces to be tested are thus identified automatically, without relying on developers to supply them. Moreover, when an interface changes later, the tester can be notified in real time to update the relevant test cases.
For a test manager, the embodiment of the invention automatically obtains the total number of interfaces, the coverage rate, the pass rate, the number of interfaces newly covered on the previous day, the overdue interfaces, the number of debugging interfaces, and the number of interfaces not yet transferred to testing, providing intuitive statistics on the testers' progress and quality of work.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 3, a block diagram of a structure of an embodiment of a statistical apparatus for test data according to the present invention is shown, where the statistical apparatus for test data specifically includes the following modules:
the scanning module 31 is configured to scan a service management application of a system under test, and obtain the number of modules under test included in the system under test;
the screening module 32 is used for screening a current period newly-added module and a module for compiling the automatic test case in the current period from all the modules to be tested according to the automatic test case in the test environment of the system to be tested, wherein the current period newly-added module represents a module for not compiling the automatic test case in the current period;
the query module 33 is configured to query, for all interfaces in all the modules to be tested, an effective test interface newly added in the previous period to obtain the coverage number of the newly added interface in the previous period, where the effective test interface represents an interface for which an automatic test case has been compiled and passes a test;
the scanning module 31 is further configured to scan a web service framework of the test environment to obtain the number of single module interfaces and the number of all interfaces, where the single module interface represents an interface included in each module to be tested;
a sending module 34, configured to send the identifier and the number of the modules to be tested, the identifier of the module in which the automatic test case has been written in the current period, the identifier of the newly added module in the current period, the number of the newly added interfaces in the previous period, and the number of the single-module interfaces and the number of all the interfaces to the corresponding test responsible users.
In an exemplary embodiment of the present invention, the query module 33 is further configured to traverse all interfaces of all modules under test after the scanning module scans the web service framework of the test environment, query whether an effective test interface case has been written for each interface, and obtain the number and details of the uncovered interfaces of each single module as well as the number of test-debugging interfaces, where the uncovered interfaces of a single module represent the interfaces of that module for which automated testing is not implemented;
the device further comprises:
and the summarizing module is used for summarizing the number of all uncovered interfaces according to the number of the uncovered interfaces of the single module.
In an exemplary embodiment of the present invention, the query module 33 is further configured to query a measurement transfer date of all uncovered interfaces, and use an uncovered interface with the measurement transfer date being greater than or equal to a preset date threshold as an overdue interface;
the device further comprises:
and the acquisition module is used for acquiring all interface data in the module to be tested in the previous period and the number of newly added interfaces in the test environment in the current period, and counting the number of the transfer test interfaces in the current period according to the interface data and the number of the newly added interfaces.
In an exemplary embodiment of the invention, the apparatus further comprises:
the calculation module is used for obtaining the coverage rate of the single module according to the number of the uncovered interfaces of the single module and the number of the interfaces of the single module;
the computing module is further configured to obtain a total coverage rate according to the number of all uncovered interfaces and the number of all modules to be tested;
the acquisition module is further used for acquiring first input parameters of interfaces of all the modules to be tested in the web service framework and second input parameters of interfaces of all the modules to be tested in the test environment;
and the counting module is used for counting according to the first input parameter and the second input parameter to obtain the change condition of the input parameters.
In an exemplary embodiment of the present invention, the statistical module is configured to, for the same interface, when the parameter type of the first input parameter differs from that of the second input parameter: if the parameter type of the first input parameter is an added type, count the first input parameter as an added parameter; and if the parameter type of the first input parameter is a missing type, count the first input parameter as a deleted parameter.
In an exemplary embodiment of the present invention, the scanning module 31 is further configured to scan the test report of the test environment, and obtain the test passing rates of all the modules to be tested;
the scanning module 31 is further configured to scan all interfaces of the module to be tested in the development environment and all interfaces of the module to be tested in the test environment, and use an interface located in the development environment but not located in the test environment as an un-test interface;
the statistical module is further configured to take the untransformed test interfaces with the compiled valid test interface cases as the debugging interfaces in the development environment, and count to obtain the number of the debugging interfaces.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The statistical method and device for test data provided by the present invention are introduced in detail, and the principle and the implementation manner of the present invention are explained by applying specific examples, and the description of the above embodiments is only used to help understanding the method and the core idea of the present invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (10)

1. A statistical method of test data, comprising:
scanning service management application of a tested system to obtain the number of modules to be tested contained in the tested system;
screening a current period newly added module and a module of the current period compiled automatic test case from all the modules to be tested according to the automatic test case of the test environment of the system to be tested, wherein the current period newly added module represents a module of the current period un-compiled automatic test case;
inquiring newly-increased effective test interfaces in the previous period aiming at all the interfaces in all the modules to be tested to obtain the coverage quantity of the newly-increased interfaces in the previous period, wherein the effective test interfaces represent the interfaces which have compiled automatic test cases and pass the test;
scanning a web service framework of the test environment to obtain the number of single module interfaces and the number of all interfaces, wherein the single module interfaces represent the interfaces contained in each module to be tested;
and sending the identification and the number of the modules to be tested, the identification of the modules in which the automatic test cases are compiled in the current period, the identification of the newly added modules in the current period, the coverage number of the newly added interfaces in the previous period, and the number of the single module interfaces and the number of all the interfaces to corresponding test responsible users.
2. The method of claim 1, wherein after the scanning a web service framework of the test environment, the method further comprises:
traversing all the interfaces of all the modules to be tested, inquiring whether each interface writes an effective test interface case or not, and obtaining the number and the detailed information of the uncovered interfaces of the single module and the number of the test debugging interfaces, wherein the uncovered interfaces of the single module represent the interfaces which do not realize the automatic test in each module to be tested;
and summarizing according to the number of the single module uncovered interfaces to obtain the number of all uncovered interfaces.
3. The method of claim 2, further comprising:
inquiring the transfer date of all the uncovered interfaces, and taking the uncovered interfaces with the transfer date being greater than or equal to a preset date threshold value as an overdue interface;
acquiring all interface data in the module to be tested in the previous period and the number of newly added interfaces in the test environment in the current period, and counting the number of the transfer test interfaces in the current period according to the interface data and the number of the newly added interfaces.
4. The method of claim 2, further comprising:
acquiring the coverage rate of the single module according to the number of the uncovered interfaces of the single module and the number of the interfaces of the single module;
obtaining the total coverage rate according to the number of all uncovered interfaces and the number of all modules to be tested;
acquiring first input parameters of interfaces of all the modules to be tested in the web service framework and second input parameters of interfaces of all the modules to be tested in the test environment;
and counting according to the first input parameter and the second input parameter to obtain the change condition of the input parameters.
5. The method according to claim 4, wherein statistically deriving input parameter variation from the first input parameter and the second input parameter comprises:
for the same interface, when the parameter type of the first input parameter differs from that of the second input parameter: if the parameter type of the first input parameter is an added type, counting the first input parameter as an added parameter; and if the parameter type of the first input parameter is a missing type, counting the first input parameter as a deleted parameter.
6. The method of claim 1, further comprising:
scanning the test report of the test environment to obtain the test passing rate of all the modules to be tested;
scanning all interfaces of the module to be tested in a development environment and all interfaces of the module to be tested in a test environment, and taking the interfaces which are positioned in the development environment but not positioned in the test environment as non-test-transfer interfaces;
and taking the untransformed test interfaces with the written effective test interface cases as debugging interfaces in the development environment, and counting to obtain the number of the debugging interfaces.
7. A statistical apparatus for test data, comprising:
the scanning module is used for scanning the service management application of the tested system and obtaining the number of the modules to be tested contained in the tested system;
the screening module is used for screening a current period newly-added module and a module for compiling the automatic test case in the current period from all the modules to be tested according to the automatic test case in the test environment of the system to be tested, wherein the current period newly-added module represents a module for not compiling the automatic test case in the current period;
the query module is used for querying all interfaces in all the modules to be tested for the newly added effective test interfaces in the previous period to obtain the coverage quantity of the newly added interfaces in the previous period, wherein the effective test interfaces represent interfaces which have compiled automatic test cases and pass tests;
the scanning module is further configured to scan a web service framework of the test environment to obtain the number of single module interfaces and the number of all interfaces, where the single module interface represents an interface included in each module to be tested;
and the sending module is used for sending the identification and the number of the modules to be tested, the identification of the modules in which the automatic test cases are compiled in the current period, the identification of the newly added modules in the current period, the number of the newly added interfaces in the previous period, and the number of the single-module interfaces and the number of all the interfaces to corresponding testing responsible users.
8. The apparatus according to claim 7, wherein the query module is further configured to traverse all interfaces of all modules under test after the scanning module scans the web service framework of the test environment, query whether a valid test interface case has been written for each interface, and obtain the number and details of single-module uncovered interfaces as well as the number of test-debugging interfaces, where the single-module uncovered interfaces represent the interfaces of each module under test for which the automated testing is not implemented;
the device further comprises:
and the summarizing module is used for summarizing the number of all uncovered interfaces according to the number of the uncovered interfaces of the single module.
9. The apparatus of claim 8, wherein the query module is further configured to query a transfer date of all uncovered interfaces, and use an uncovered interface with the transfer date being greater than or equal to a preset date threshold as an overdue interface;
the device further comprises:
and the acquisition module is used for acquiring all interface data in the module to be tested in the previous period and the number of newly added interfaces in the test environment in the current period, and counting the number of the transfer test interfaces in the current period according to the interface data and the number of the newly added interfaces.
10. The apparatus of claim 8, further comprising:
the calculation module is used for obtaining the coverage rate of the single module according to the number of the uncovered interfaces of the single module and the number of the interfaces of the single module;
the computing module is further configured to obtain a total coverage rate according to the number of all uncovered interfaces and the number of all modules to be tested;
the acquisition module is further used for acquiring first input parameters of interfaces of all the modules to be tested in the web service framework and second input parameters of interfaces of all the modules to be tested in the test environment;
and the counting module is used for counting according to the first input parameter and the second input parameter to obtain the change condition of the input parameters.
CN202010654024.4A — Statistical method and device for test data — filed 2020-07-08, priority 2020-07-08, Active, granted as CN111930611B

Priority Application (1)
CN202010654024.4A, filed 2020-07-08: Statistical method and device for test data

Publications (2)
CN111930611A (application), published 2020-11-13
CN111930611B (grant), published 2022-06-14

Family ID: 73312688
Country: CN

Citations (3)
* Cited by examiner, † Cited by third party
CN107423211A *, priority 2017-03-15, published 2017-12-01, China Internet Network Information Center: An SDNS interface automated test system and method
CN110795332A *, priority 2018-08-03, published 2020-02-14, Beijing Jingdong Shangke Information Technology Co., Ltd.: Automatic testing method and device
CN109062817A *, priority 2018-10-15, published 2018-12-21, Wangsu Science & Technology Co., Ltd.: Automated testing method and system

Cited By (3)
CN112597001A *, priority 2020-12-07, published 2021-04-02, Changsha Daojia Youxiang Network Technology Co., Ltd.: Interface testing method and device, electronic equipment and storage medium
CN113127352A *, priority 2021-04-20, published 2021-07-16, Chengdu Xinchao Media Group Co., Ltd.: Automatic case statistical method and device and computer readable storage medium
CN113127352B, published 2023-03-14, Chengdu Xinchao Media Group Co., Ltd.: Automatic case statistical method and device and computer readable storage medium


Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant