CN113760712A - Test evaluation method and device - Google Patents
Test evaluation method and device
- Publication number: CN113760712A
- Application number: CN202011090512.3A
- Authority: CN (China)
- Prior art keywords: scene, preset, service identifier, test, matching
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING; G06F—ELECTRIC DIGITAL DATA PROCESSING; G06F11/00—Error detection; error correction; monitoring; G06F11/36—Preventing errors by testing or debugging software; G06F11/3668—Software testing; G06F11/3672—Test management
- G06F11/3676—Test management for coverage analysis
- G06F11/3688—Test management for test execution, e.g. scheduling of test suites
Abstract
The invention discloses a test evaluation method and device, and relates to the technical field of computers. One specific embodiment of the method includes: monitoring a triggered test, calling a dynamic proxy class, and collecting data during the test process; performing scene matching on the data with a preset matching model against a preset scene library to obtain the covered scenes; and calling a preset coverage model to calculate the scene coverage of the test. The method and device thereby address the high cost, low efficiency, opaque data, and lack of persuasiveness of existing test scene coverage assessment.
Description
Technical Field
The invention relates to the technical field of computers, in particular to a test evaluation method and a test evaluation device.
Background
Test coverage is an important index of whether test execution is complete; commonly used forms of test coverage include code coverage, requirement coverage, and defect coverage. Scene coverage is counted from the user's perspective: how many usage scenes the test cases cover, out of how many usage scenes there are in total, expressed as a percentage. Today scene coverage statistics are computed from the tester's record of executed test cases: given a total set of scene test cases, the tester manually marks each case after it has been executed, and finally the scene coverage is calculated as the percentage of actually executed cases over the total number of test cases.
In the process of implementing the invention, the inventor finds that at least the following problems exist in the prior art:
at present, scene coverage statistics is a manual calculation performed by the tester, which is not intelligent enough: the tester may forget which scenes have been tested and need to re-execute test cases, and if cases were executed by other testing colleagues, the data must be gathered from them through extra communication. The existing scene coverage statistical method is therefore costly, inefficient, opaque in data, and unpersuasive.
Disclosure of Invention
In view of this, embodiments of the present invention provide a test evaluation method and apparatus, which can solve the problems of high cost, low efficiency, opaque data, and lack of persuasion in the existing test scenario coverage evaluation.
In order to achieve the above object, according to an aspect of the embodiments of the present invention, a test evaluation method is provided, which includes monitoring a trigger test, calling a dynamic proxy class, and collecting data in a test process; according to a preset scene library, carrying out scene matching on the data based on a preset matching model to obtain a covered scene; and calling a preset coverage model, and calculating to obtain the scene coverage of the test.
Optionally, collecting data in the test process includes:
according to a data request in a test process, acquiring a service field in the data request to generate a corresponding service identifier, and further acquiring a test service identifier set; all methods on each data request call chain correspond to the same service identifier;
according to a preset scene library, carrying out scene matching on the data based on a preset matching model to obtain a covered scene, wherein the scene matching comprises the following steps:
matching to obtain a covered scene based on a preset scene library according to the service identifier set; the scene library comprises a mapping relation between the service identification and the scene information.
Optionally, obtaining a service field in the data request to generate a corresponding service identifier includes:
and acquiring a service field in the data request, judging whether the service field is empty, if so, generating a universal unique identification code as a service identifier according to the data request, and if not, generating a corresponding service identifier according to the service field through a preset algorithm model.
Optionally, obtaining a covered scene based on a preset scene library matching according to the service identifier set includes:
if the scene information in the scene library correspondingly has a single service identifier, matching the single service identifier with the service identifier set to obtain corresponding scene information, and marking the scene information with a preset label; or
And if the scene information in the scene library correspondingly has a plurality of service identifiers, respectively matching the plurality of service identifiers with the service identifier set, and if all the plurality of service identifiers are successfully matched, acquiring corresponding scene information so as to mark the scene information with a preset label.
Optionally, calling a preset coverage model, and calculating to obtain the tested scene coverage, including:
and counting and summing scene information marked with a preset label to obtain the number of covered scenes, and further calculating the proportion of the number of covered scenes to the total number of scenes in a scene library to obtain the tested scene coverage rate.
Optionally, obtaining a covered scene based on a preset scene library matching according to the service identifier set, further comprising:
if the service identifier which is not successfully matched in the scene library exists in the service identifier set, calling a code static analysis method, generating a calling relation directed graph of the data request corresponding to the service identifier, mapping the mark of each node in the calling relation directed graph into a positive integer of a preset interval section through hash coding, and further converting the mapped calling relation directed graph into an adjacent matrix;
and matching the adjacent matrix with the adjacent matrix corresponding to the service identifier in the scene library according to the adjacent matrix to obtain the successfully matched service identifier in the scene library, and further determining the matched coverage scene.
Optionally, after the scene coverage of the test is calculated, the method includes:
and acquiring unmatched scene information according to the scene library, generating an uncovered scene set, and sending an early warning message of insufficient test.
In addition, the invention also provides a test evaluation device which comprises an acquisition module, a test processing module and a test processing module, wherein the acquisition module is used for monitoring the trigger test, calling the dynamic proxy class and collecting the data in the test process; the processing module is used for carrying out scene matching on the data based on a preset matching model according to a preset scene library so as to obtain a covered scene; and calling a preset coverage model, and calculating to obtain the scene coverage of the test.
One embodiment of the above invention has the following advantages or benefits: the method calls the dynamic proxy class by monitoring the trigger test, and collects data in the test process; and carrying out scene matching on the data according to a preset scene library to obtain a covered scene, and further calculating to obtain the tested scene coverage rate. Therefore, the invention realizes the automatic and sufficient evaluation process of the test scene coverage and makes up the defects of manual test statistics.
Further effects of the above alternative embodiments are described below in connection with the specific embodiments.
Drawings
The drawings are included to provide a better understanding of the invention and are not to be construed as unduly limiting the invention. Wherein:
FIG. 1 is a schematic diagram of a main flow of a test evaluation method according to a first embodiment of the present invention;
FIG. 2 is a schematic diagram of the main flow of a test evaluation method according to a second embodiment of the present invention;
FIG. 3 is a schematic view of the main flow of a test evaluation method according to a third embodiment of the present invention;
FIG. 4 is a schematic diagram of the main modules of a test evaluation device according to an embodiment of the present invention;
FIG. 5 is an exemplary system architecture diagram in which embodiments of the present invention may be employed;
fig. 6 is a schematic block diagram of a computer system suitable for use in implementing a terminal device or server of an embodiment of the invention.
Detailed Description
Exemplary embodiments of the present invention are described below with reference to the accompanying drawings, in which various details of embodiments of the invention are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Fig. 1 is a schematic view of a main flow of a test evaluation method according to a first embodiment of the present invention, as shown in fig. 1, the test evaluation method including:
and S101, monitoring a trigger test, calling a dynamic proxy class, and acquiring data in the test process.
Here the dynamic proxy class is the carrier of interface invocations under java.lang.reflect: each method call on the proxied interface is routed through its invoke method, which allows the test data to be collected transparently.
In some embodiments, collecting data during the test may specifically include: according to a data request in the test process, acquiring the service field in the data request to generate a corresponding service identifier, and thereby obtaining the test's service identifier set, where all methods on each data request's call chain correspond to the same service identifier. That is, the information of the executing class is collected through the dynamic proxy class; on each data request a unique service identifier BUID is generated by hashing the service field, and multiple service fields generate multiple BUIDs.
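The patent implements this collection with a Java dynamic proxy plus a thread-shared variable; as a language-neutral sketch, a Python decorator can play the same role, recording every method on a request's call chain under one shared identifier. All names here (`CallCollector`, `place_order`, `BUID-1`) are illustrative, not from the patent.

```python
import functools

class CallCollector:
    """Per-request call recorder standing in for the Java dynamic proxy + ThreadLocal."""

    def __init__(self):
        self.calls = {}              # request identifier -> ordered list of method names
        self.current_request = None  # set before a request runs, like a ThreadLocal

    def traced(self, func):
        """Decorator that records each invocation under the current request identifier."""
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if self.current_request is not None:
                self.calls.setdefault(self.current_request, []).append(func.__name__)
            return func(*args, **kwargs)
        return wrapper

collector = CallCollector()

@collector.traced
def check_stock(sku):
    return True

@collector.traced
def place_order(sku):
    check_stock(sku)  # a nested call lands on the same request's chain
    return "ok"

collector.current_request = "BUID-1"  # normally derived from the request's service field
place_order("A-100")
print(collector.calls["BUID-1"])  # ['place_order', 'check_stock']
```

This mirrors the property stated above: all methods on one data request's call chain end up associated with the same service identifier.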
In a further embodiment, acquiring the service field in the data request to generate the corresponding service identifier may specifically include: acquiring the service field in the data request and judging whether it is empty; if so, generating a Universally Unique Identifier (UUID) from the data request to serve as the service identifier, and if not, generating the corresponding service identifier from the service field through a preset algorithm model.
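As a minimal sketch of this branch: a present service field is hashed into a deterministic BUID, an empty one falls back to a random UUID. The patent only calls the hash a "preset algorithm model", so SHA-256 and the `BUID-` prefix are assumptions; the property that matters is that equal service fields always map to equal identifiers.

```python
import hashlib
import uuid
from typing import Optional

def make_service_identifier(service_field: Optional[str]) -> str:
    """BUID from the service field when present, otherwise a random UUID.

    SHA-256 is an assumed stand-in for the patent's unspecified hash model;
    any stable hash preserves the needed property (same field -> same BUID).
    """
    if not service_field:  # empty field -> universally unique identification code
        return str(uuid.uuid4())
    digest = hashlib.sha256(service_field.encode("utf-8")).hexdigest()
    return "BUID-" + digest[:16]

# Deterministic: repeated requests with the same field match the scene library.
print(make_service_identifier("order.create") == make_service_identifier("order.create"))  # True
```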
And S102, carrying out scene matching on the data based on a preset matching model according to a preset scene library to obtain a covered scene.
In some embodiments, the covered scene may be obtained based on matching a preset scene library according to the service identifier set when step S102 is executed. The scene library comprises a mapping relation between the service identification and the scene information.
In a further embodiment, if a single service identifier exists in the scene information in the scene library correspondingly, matching is performed according to the single service identifier and the service identifier set to obtain corresponding scene information, and then marking of a preset label is performed on the scene information. Or if a plurality of service identifiers exist correspondingly in the scene information in the scene library, respectively matching the scene information with the service identifier set according to the plurality of service identifiers, and if all the plurality of service identifiers are successfully matched, acquiring the corresponding scene information so as to mark the scene information with a preset label.
That is to say, the invention may execute preset valid-equivalence-class decision logic to match the service identifier set against the scene library. If a scene corresponds to a single BUID, the service identifier set is compared against the BUIDs in the library, and on a successful match the scene information (e.g., the scene key) is marked with a coverage label. If a scene corresponds to multiple BUIDs, matching is accumulated over multiple iterations (each BUID is matched in turn); provided the per-scene request threshold N is satisfied, the scene's set of BUIDs in the library is compared with the collected set, the match succeeds if the sets are equal, and the scene information (e.g., the scene key) is marked with a coverage label. A single-identifier match has the highest priority.
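A compact sketch of the matching rule described above, simplified so that both cases reduce to one check: a scene is covered when every one of its required identifiers appears in the observed set (a single-BUID scene is just the one-element case). The scene names and BUIDs are illustrative.

```python
def match_scenes(scene_library: dict, observed: set) -> set:
    """Return the scene keys whose required service identifiers were all observed.

    scene_library maps scene key -> set of required BUIDs. A single-identifier
    scene matches if its one BUID was seen; a multi-identifier scene only if
    every one of its BUIDs was seen (set inclusion).
    """
    covered = set()
    for scene_key, required in scene_library.items():
        if required <= observed:   # all required identifiers observed
            covered.add(scene_key)  # stands in for marking the preset coverage label
    return covered

library = {
    "login":    {"BUID-auth"},              # scene with a single service identifier
    "checkout": {"BUID-cart", "BUID-pay"},  # scene with multiple service identifiers
}
print(match_scenes(library, {"BUID-auth", "BUID-cart"}))  # {'login'}: checkout incomplete
```

Note this simplification compares against the whole observed set; the patent's stricter variant compares the per-scene collected set for equality under the threshold N.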
Preferably, the threshold number of requests may be preset for each scenario, i.e., each scenario includes at most N requests (i.e., the maximum number of requests).
It should be further noted that if the service identifier set contains an identifier with no match in the scene library, a code static analysis method is called. (Static code analysis is a technique that scans program code with lexical analysis, syntactic analysis, control-flow analysis, data-flow analysis and similar techniques, verifying — without running the code — whether it meets indexes such as normalization, security, reliability, and maintainability.) A call-relation directed graph is generated for the data request corresponding to that service identifier, the label of each node in the graph is mapped through hash coding to a positive integer in a preset interval, and the mapped call-relation directed graph is then converted into an adjacency matrix.
Then this adjacency matrix is matched against the adjacency matrices corresponding to the service identifiers in the scene library, and it is judged whether a successfully matching identifier is found. If so, the matched covered scene is determined; if not, the unmatched service identifier and its adjacency matrix are stored into the scene library and a scene is configured for them, thereby updating the scene library.
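The static-analysis fallback can be sketched as follows: each "methodName#parameterType" node label is hashed to a positive integer in a bounded interval, the directed graph becomes an adjacency matrix ordered by those codes, and the matrix is compared against stored matrices, with a miss inserting a new library entry. The MD5 hash, the interval of 1000, and the placeholder identifiers are assumptions; hash collisions are ignored in this sketch.

```python
import hashlib

def node_code(label: str, interval: int = 1000) -> int:
    """Map a 'methodName#parameterType' node label to a positive integer in [1, interval]."""
    return int(hashlib.md5(label.encode("utf-8")).hexdigest(), 16) % interval + 1

def call_graph_matrix(nodes, edges, interval=1000):
    """Convert a call-relation directed graph into an adjacency matrix.

    nodes are 'methodName#parameterType' labels; edges are (caller, callee)
    pairs. Rows/columns follow the sorted hashed codes so structurally
    identical graphs yield identical matrices regardless of input order.
    """
    codes = {n: node_code(n, interval) for n in nodes}
    order = sorted(codes.values())
    pos = {c: i for i, c in enumerate(order)}
    matrix = [[0] * len(order) for _ in order]
    for caller, callee in edges:
        matrix[pos[codes[caller]]][pos[codes[callee]]] = 1
    return matrix

def match_or_update(matrix, matrix_library):
    """Match against stored matrices; on a miss, store the matrix as a new entry."""
    for buid, stored in matrix_library.items():
        if stored == matrix:
            return buid, True                   # matched an existing scene's request shape
    new_id = f"BUID-new-{len(matrix_library)}"  # illustrative placeholder identifier
    matrix_library[new_id] = matrix             # automatic scene-library update
    return new_id, False
```

For example, a two-node chain `create#Order -> save#Order` matches itself on re-analysis, while a structurally different graph is appended to the library as a new scene entry.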
It can be seen that during test execution the invention matches the collected service identifier set against the scene library: a successful match marks the scene as covered, while an unsuccessful match automatically updates the scene library. In addition, on an unsuccessful match abnormal and invalid data can be filtered out, the scene information reconfigured, and the result updated into the scene library.
Preferably, the starting node of the directed graph is computed from the controller layer, with id "startingMethod#parameterType"; every node id in the directed graph takes the form "methodName#parameterType" and is mapped, via its hash code, to a positive integer in a given interval.
And step S103, calling a preset coverage model, and calculating to obtain the tested scene coverage rate.
In some embodiments, when step S103 is executed, the specific implementation process includes: and counting and summing scene information marked with a preset label to obtain the number of covered scenes, and further calculating the proportion of the number of covered scenes to the total number of scenes in a scene library to obtain the tested scene coverage rate. In addition, the present invention may employ coverage calculation tools such as Jacoco.
Preferably, the coverage model for calculating the scene coverage of the test may be:
scene coverage = (number of covered scenes / total number of scenes) × 100%
It should be further noted that after the test scene coverage is obtained through calculation, unmatched scene information may be obtained according to the scene library, an uncovered scene set may be generated, and an insufficient test warning message may be sent. That is to say, the embodiment of the present invention may report the uncovered scene set and provide the early warning of insufficient test. Therefore, the invention can evaluate the scene coverage of the tested case, and report the uncovered scene and warn the insufficient coverage.
As another embodiment, after the tested scene coverage is obtained, it may be determined whether the tested scene coverage is smaller than a preset coverage threshold, if so, a code reconfiguration risk is prompted, and if not, no processing is performed. Therefore, the method and the device can also carry out risk assessment on the test cases.
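The coverage model and the two follow-up checks above can be sketched together. The 80% threshold is illustrative (the patent leaves the preset coverage threshold unspecified), as are the scene names.

```python
def scene_coverage(covered: set, scene_library: dict) -> float:
    """Coverage = number of covered scenes / total scenes in the library, as a percentage."""
    return 100.0 * len(covered) / len(scene_library)

def evaluate(covered: set, scene_library: dict, threshold: float = 80.0):
    """Report the uncovered-scene set plus warnings, mirroring the text above.

    One warning for insufficient testing (any uncovered scene), one for
    code-refactoring risk (coverage below the assumed preset threshold).
    """
    uncovered = set(scene_library) - covered
    warnings = []
    if uncovered:  # early warning message of insufficient testing
        warnings.append("insufficient test: uncovered scenes " + str(sorted(uncovered)))
    if scene_coverage(covered, scene_library) < threshold:
        warnings.append("code refactoring risk: coverage below threshold")
    return uncovered, warnings

library = {"login": set(), "checkout": set(), "refund": set(), "search": set()}
covered = {"login", "checkout", "search"}
print(scene_coverage(covered, library))  # 75.0
```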
Fig. 2 is a schematic diagram of a main flow of a test evaluation method according to a second embodiment of the present invention, as shown in fig. 2, the test evaluation method includes:
step S201, monitoring a trigger test, and calling a dynamic proxy class.
Step S202, according to the data request in the test process, obtaining the service field in the data request to generate the corresponding service identifier, and further obtaining the test service identifier set.
Wherein all methods on each data request call chain correspond to the same service identity.
In an embodiment, after a service field in the data request is obtained, whether the service field is empty or not may be determined, if yes, a universal unique identification code is generated according to the data request to serve as a service identifier, and if not, a corresponding service identifier is generated according to the service field through a preset algorithm model. Preferably, the corresponding unique service identifier is generated through the service field based on a hash algorithm according to the service field.
And step S203, obtaining a covered scene based on preset scene library matching according to the service identifier set.
The scene library comprises a mapping relation between the service identification and the scene information.
In an embodiment, if a single service identifier exists corresponding to the scene information in the scene library, matching is performed according to the single service identifier and the service identifier set to obtain corresponding scene information, and then labeling of a preset label is performed on the scene information. Or if a plurality of service identifiers exist correspondingly in the scene information in the scene library, respectively matching the scene information with the service identifier set according to the plurality of service identifiers, and if all the plurality of service identifiers are successfully matched, acquiring the corresponding scene information so as to mark the scene information with a preset label.
Step S204, determining whether the service identifier set has a service identifier that is not successfully matched in the scene library, if so, performing step S205, and if not, directly performing step S207.
Step S205, a code static analysis method is called, a call relation directed graph of the data request corresponding to the service identifier is generated, a mark of each node in the call relation directed graph is mapped to a positive integer of a preset interval through hash coding, and the mapped call relation directed graph is converted into an adjacency matrix.
And step S206, matching the adjacent matrix corresponding to the service identifier in the scene library according to the adjacent matrix to obtain the successfully matched service identifier in the scene library, and further determining the matched coverage scene.
Step S207, counting and summing scene information marked with a preset label to obtain the number of covered scenes, and further calculating the proportion of the number of covered scenes to the total number of scenes in the scene library to obtain the tested scene coverage rate.
And S208, acquiring unmatched scene information according to the scene library, generating an uncovered scene set, and sending an early warning message of insufficient test.
Fig. 3 is a schematic diagram of a main flow of a test evaluation method according to a third embodiment of the present invention, as shown in fig. 3, the test evaluation method includes:
the test scenario may be identified and input first, and then the test data is collected to generate the scenario library. Preferably, a scenario is preset, a test scenario is executed, a dynamic proxy (for example, an abbreviation of AOP: Aspect Programming, which is a technology for implementing unified maintenance of program functions by a precompilation mode and a dynamic proxy during running) collects a data request in the program execution process, obtains a service field in the data request, processes the service field through a hash algorithm to obtain a unique service identifier of the data request, and further stores a mapping relationship between the service identifier and the preset scenario in a scenario library. From another perspective, the present invention can convert a scene into a representation of a programming language, which can also be referred to as a scene converter. Of course, the scene library may be automatically updated during the iterative process of the later test program execution.
It is worth mentioning that the preset scenario may include one or more service identifiers.
Preferably, the information of the executing class is collected through the dynamic proxy class: a service field is set for every WEB request, a unique service identifier BUID is generated by hashing, and multiple service fields generate multiple BUIDs. If the service field is empty, a UUID is generated. The BUID or UUID is then passed along through a ThreadLocal shared variable; in cases of cross-layer calls and RPC calls, it is set as a hidden parameter for transfer. All methods on a request's call chain are identified with the same service identifier, i.e. <BUID or UUID, {method set}>. Several different web requests are defined as one scene, and a single web request can also serve as one scene. Finally, key-value pairs of scene and service identifiers are generated: <scene key, {BUID, UUID...}>. Also, a request threshold may be preset for each scene, i.e., each scene includes at most N requests (the maximum number of requests).
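The scene-library construction step above can be sketched as building the <scene key, {service identifier}> pairs from recorded test runs, with the per-scene request threshold N enforced. The request paths, scene keys, and the threshold value 5 are illustrative assumptions.

```python
def build_scene_library(recorded_requests: dict, scene_defs: dict, max_requests: int = 5) -> dict:
    """Build <scene key, {service identifier}> mappings from recorded test runs.

    recorded_requests maps request name -> its generated BUID/UUID;
    scene_defs maps scene key -> the web requests making up that scene
    (one or several). max_requests plays the role of the patent's per-scene
    threshold N; the value 5 here is illustrative.
    """
    library = {}
    for scene_key, requests in scene_defs.items():
        if len(requests) > max_requests:
            raise ValueError(f"scene {scene_key!r} exceeds the request threshold N")
        library[scene_key] = {recorded_requests[name] for name in requests}
    return library

recorded = {"/cart/add": "BUID-cart", "/pay/submit": "BUID-pay", "/login": "BUID-auth"}
scenes = {"checkout": ["/cart/add", "/pay/submit"], "login": ["/login"]}
print(build_scene_library(recorded, scenes))
```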
Further, data requests in the program execution process can be collected through the data collector and imported into the scene library.
After the scene library is prepared, during the testing process or online monitoring of the program, the dynamic proxy class acquires the entry and exit of method execution and code instrumentation information, yielding <BUID or UUID, {method set}> for each data request and the test's service identifier set {BUID, UUID...}. If scene information in the scene library corresponds to a single service identifier, that identifier is matched against the service identifier set to obtain the corresponding scene information, which is then marked with a preset label. If scene information corresponds to multiple service identifiers, each is matched against the set in turn, and only if all of them match successfully is the corresponding scene information obtained and marked with the preset label.
And counting and summing scene information marked with a preset label to obtain the number of covered scenes, and further calculating the proportion of the number of covered scenes to the total number of scenes in a scene library to obtain the tested scene coverage rate. Meanwhile, acquiring unmatched scene information according to the scene library, generating an uncovered scene set, and sending an early warning message of insufficient test.
It should be noted that if the service identifier set contains an identifier that fails to match anything in the scene library, the code static analysis method is called to generate the call-relation directed graph of that identifier's data request; the label of each node in the graph is mapped through hash coding to a positive integer in a preset interval, and the mapped graph is converted into an adjacency matrix. This matrix is then matched against the adjacency matrices corresponding to the service identifiers in the scene library, and it is judged whether a successfully matching identifier is found. If so, the matched covered scene is determined; if not, the unmatched service identifier and its adjacency matrix are stored into the scene library with a configured scene key, thereby updating the scene library.
Fig. 4 is a schematic diagram of main modules of a test evaluation device according to an embodiment of the present invention, and as shown in fig. 4, the test evaluation device 400 includes an acquisition module 401 and a processing module 402. The acquisition module 401 monitors a trigger test, calls a dynamic proxy class, and acquires data in the test process; the processing module 402 performs scene matching on the data based on a preset matching model according to a preset scene library to obtain a covered scene; and calling a preset coverage model, and calculating to obtain the scene coverage of the test.
In some embodiments, the obtaining module 401 collects data during the test process, including:
according to a data request in a test process, acquiring a service field in the data request to generate a corresponding service identifier, and further acquiring a test service identifier set; all methods on each data request call chain correspond to the same service identifier;
the processing module 402 performs scene matching on the data based on a preset matching model according to a preset scene library to obtain a covered scene, including:
matching to obtain a covered scene based on a preset scene library according to the service identifier set; the scene library comprises a mapping relation between the service identification and the scene information.
In some embodiments, the obtaining module 401 obtains the service field in the data request to generate the corresponding service identifier, including:
and acquiring a service field in the data request, judging whether the service field is empty, if so, generating a universal unique identification code as a service identifier according to the data request, and if not, generating a corresponding service identifier according to the service field through a preset algorithm model.
In some embodiments, the processing module 402 matches and obtains covered scenes based on a preset scene library according to the service identifier set, including:
if a piece of scene information in the scene library corresponds to a single service identifier, matching that identifier against the service identifier set to obtain the corresponding scene information and marking it with a preset label; or
if a piece of scene information corresponds to a plurality of service identifiers, matching each of them against the service identifier set, and only when all of them are successfully matched, obtaining the corresponding scene information and marking it with the preset label.
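Both cases reduce to one rule — a scene is labelled covered only when every service identifier it requires was observed. A hedged sketch, with a hypothetical scene-library shape (scene name mapped to its required identifiers):

```python
def match_covered_scenes(scene_library: dict, observed_ids: set) -> set:
    """Return the names of scenes whose required identifiers were all observed.

    scene_library maps a scene name to the list of service identifiers it
    requires: one entry for single-identifier scenes, several for
    multi-identifier scenes.
    """
    covered = set()
    for scene, required in scene_library.items():
        # A scene counts as covered only if every required identifier
        # appears in the identifier set collected during the test.
        if required and all(sid in observed_ids for sid in required):
            covered.add(scene)  # the "preset label" of the description
    return covered
```

The single-identifier case is simply the plural case with a one-element list, so no separate code path is needed.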
In some embodiments, the processing module 402 invokes a preset coverage model, and calculates the scene coverage of the test, including:
counting the scene information marked with the preset label to obtain the number of covered scenes, and then calculating the ratio of the number of covered scenes to the total number of scenes in the scene library to obtain the tested scene coverage rate.
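The coverage model described above is a straightforward ratio; a minimal sketch:

```python
def scene_coverage(covered_scenes: set, scene_library: dict) -> float:
    """Coverage rate = labelled (covered) scenes / total scenes in library."""
    if not scene_library:
        return 0.0  # guard against an empty scene library
    return len(covered_scenes) / len(scene_library)
```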
In some embodiments, the processing module 402 matches and obtains covered scenes based on a preset scene library according to the service identifier set, and further includes:
if the service identifier set contains a service identifier that was not successfully matched in the scene library, invoking a static code analysis method to generate a call-relation directed graph for the data request corresponding to that service identifier, mapping the label of each node in the graph to a positive integer within a preset interval through hash coding, and then converting the mapped graph into an adjacency matrix;
matching this adjacency matrix against the adjacency matrices corresponding to the service identifiers in the scene library to find a successfully matched service identifier, and thereby determining the matched covered scene.
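The graph-encoding step can be sketched as below. This is an assumption-laden illustration: MD5 is a stand-in for the unspecified hash coding, the interval bound 10 000 is arbitrary, and the adjacency matrix is represented sparsely as the set of its 1-entry positions rather than a dense array.

```python
import hashlib


def encode_node(label: str, modulus: int = 10_000) -> int:
    """Hash-code a node label to a positive integer in [1, modulus]."""
    return int(hashlib.md5(label.encode("utf-8")).hexdigest(), 16) % modulus + 1


def adjacency(edges) -> frozenset:
    """Express a mapped call graph as the positions of the 1-entries
    in its adjacency matrix: a set of (encoded caller, encoded callee)
    pairs, built from (caller, callee) edge tuples."""
    return frozenset((encode_node(a), encode_node(b)) for a, b in edges)


def graphs_match(edges_a, edges_b) -> bool:
    """Two call graphs match when their encoded adjacency structures coincide."""
    return adjacency(edges_a) == adjacency(edges_b)
```

Because nodes are keyed by hash-coded labels rather than insertion order, the comparison is insensitive to the order in which edges were discovered during static analysis.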
In some embodiments, after calculating the scene coverage of the test, the processing module 402 further:
acquires the unmatched scene information from the scene library, generates an uncovered scene set, and sends an early-warning message indicating insufficient testing.
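The early-warning step amounts to set subtraction plus a notification; a minimal sketch, with `logging.warning` standing in for whatever messaging channel the real system uses:

```python
import logging


def report_uncovered(scene_library: dict, covered_scenes: set) -> set:
    """Collect scenes without the coverage label and warn if any remain."""
    uncovered = set(scene_library) - covered_scenes
    if uncovered:
        # Stand-in for the early-warning message of insufficient testing.
        logging.warning("Insufficient testing; uncovered scenes: %s",
                        sorted(uncovered))
    return uncovered
```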
It should be noted that the test evaluation method and the test evaluation apparatus of the present invention correspond to each other in their specific implementations, so the repeated details are not described again.
Fig. 5 illustrates an exemplary system architecture 500 to which the test evaluation method or the test evaluation apparatus of the embodiments of the present invention may be applied.
As shown in fig. 5, the system architecture 500 may include terminal devices 501, 502, 503, a network 504, and a server 505. The network 504 serves to provide a medium for communication links between the terminal devices 501, 502, 503 and the server 505. Network 504 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user may use the terminal devices 501, 502, 503 to interact with a server 505 over a network 504 to receive or send messages or the like. The terminal devices 501, 502, 503 may have installed thereon various communication client applications, such as shopping-like applications, web browser applications, search-like applications, instant messaging tools, mailbox clients, social platform software, etc. (by way of example only).
The terminal devices 501, 502, 503 may be various electronic devices having display screens and supporting web browsing, including but not limited to smart phones, tablet computers, laptop computers, desktop computers, and the like.
It should be noted that the test evaluation method provided by the embodiment of the present invention is generally executed by the server 505, and accordingly, the test evaluation apparatus is generally disposed in the server 505.
It should be understood that the number of terminal devices, networks, and servers in fig. 5 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Referring now to FIG. 6, a block diagram of a computer system 600 suitable for implementing a terminal device of an embodiment of the present invention is shown. The terminal device shown in fig. 6 is only an example, and should not impose any limitation on the functions and scope of use of the embodiments of the present invention.
As shown in fig. 6, the computer system 600 includes a Central Processing Unit (CPU) 601 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. The RAM 603 also stores various programs and data necessary for the operation of the computer system 600. The CPU 601, ROM 602, and RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
The following components are connected to the I/O interface 605: an input section 606 including a keyboard, a mouse, and the like; an output section 607 including a display such as a Cathode Ray Tube (CRT) or a liquid crystal display (LCD), a speaker, and the like; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card, a modem, or the like. The communication section 609 performs communication processing via a network such as the internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory is mounted on the drive 610 as necessary, so that a computer program read therefrom is installed into the storage section 608 as necessary.
In particular, according to the embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611. The computer program performs the above-described functions defined in the system of the present invention when executed by the Central Processing Unit (CPU) 601.
It should be noted that the computer readable medium shown in the present invention can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present invention, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in the embodiments of the present invention may be implemented by software or hardware. The described modules may also be provided in a processor, which may be described as: a processor includes an acquisition module and a processing module. Wherein the names of the modules do not in some cases constitute a limitation of the module itself.
As another aspect, the present invention also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments, or may exist separately without being assembled into the apparatus. The computer-readable medium carries one or more programs which, when executed by the device, cause the device to: monitor for a triggered test, call a dynamic proxy class, and collect data during the test; perform scene matching on the data based on a preset matching model according to a preset scene library to obtain the covered scenes; and invoke a preset coverage model to calculate the scene coverage of the test.
The technical scheme of the embodiment of the invention can solve the problems of high cost, low efficiency, non-transparent data, and unconvincing results in existing test scene coverage evaluation.
The above-described embodiments should not be construed as limiting the scope of the invention. Those skilled in the art will appreciate that various modifications, combinations, sub-combinations, and substitutions can occur, depending on design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (10)
1. A test evaluation method, comprising:
monitoring for a triggered test, calling a dynamic proxy class, and collecting data during the test;
according to a preset scene library, carrying out scene matching on the data based on a preset matching model to obtain a covered scene;
and calling a preset coverage model, and calculating to obtain the scene coverage of the test.
2. The method of claim 1, wherein collecting data during the test comprises:
according to a data request in the test process, acquiring a service field in the data request to generate a corresponding service identifier, thereby obtaining a test service identifier set, wherein all methods on the call chain of each data request correspond to the same service identifier;
according to a preset scene library, carrying out scene matching on the data based on a preset matching model to obtain a covered scene, wherein the scene matching comprises the following steps:
matching against a preset scene library according to the service identifier set to obtain the covered scenes, wherein the scene library comprises a mapping relation between service identifiers and scene information.
3. The method of claim 2, wherein obtaining the service field in the data request to generate the corresponding service identifier comprises:
acquiring the service field in the data request and judging whether it is empty; if it is empty, generating a universally unique identifier (UUID) as the service identifier from the data request; if not, generating the corresponding service identifier from the service field through a preset algorithm model.
4. The method of claim 2, wherein matching the covered scene based on a preset scene library according to the service identifier set comprises:
if a piece of scene information in the scene library corresponds to a single service identifier, matching that identifier against the service identifier set to obtain the corresponding scene information and marking it with a preset label; or
if a piece of scene information corresponds to a plurality of service identifiers, matching each of them against the service identifier set, and only when all of them are successfully matched, obtaining the corresponding scene information and marking it with the preset label.
5. The method of claim 4, wherein invoking a preset coverage model to calculate the tested scene coverage comprises:
counting the scene information marked with the preset label to obtain the number of covered scenes, and then calculating the ratio of the number of covered scenes to the total number of scenes in the scene library to obtain the tested scene coverage rate.
6. The method of claim 2, wherein the matching of the covered scenes based on a preset scene library according to the service identifier set further comprises:
if the service identifier set contains a service identifier that was not successfully matched in the scene library, invoking a static code analysis method to generate a call-relation directed graph for the data request corresponding to that service identifier, mapping the label of each node in the graph to a positive integer within a preset interval through hash coding, and then converting the mapped graph into an adjacency matrix;
matching this adjacency matrix against the adjacency matrices corresponding to the service identifiers in the scene library to find a successfully matched service identifier, and thereby determining the matched covered scene.
7. The method of any one of claims 1-6, wherein after calculating the scene coverage of the test, the method further comprises:
acquiring the unmatched scene information from the scene library, generating an uncovered scene set, and sending an early-warning message indicating insufficient testing.
8. A test evaluation device, comprising:
the acquisition module is used for monitoring for a triggered test, calling a dynamic proxy class, and collecting data during the test;
the processing module is used for carrying out scene matching on the data based on a preset matching model according to a preset scene library so as to obtain a covered scene; and calling a preset coverage model, and calculating to obtain the scene coverage of the test.
9. An electronic device, comprising:
one or more processors;
a storage device for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-7.
10. A computer-readable medium, on which a computer program is stored, which, when executed by a processor, carries out the method according to any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011090512.3A CN113760712A (en) | 2020-10-13 | 2020-10-13 | Test evaluation method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113760712A true CN113760712A (en) | 2021-12-07 |
Family
ID=78785959
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011090512.3A Pending CN113760712A (en) | 2020-10-13 | 2020-10-13 | Test evaluation method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113760712A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114328248A (en) * | 2021-12-30 | 2022-04-12 | 杭州笨马网络技术有限公司 | Software test scene coverage rate analysis method, device, equipment and storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170289008A1 (en) * | 2016-03-30 | 2017-10-05 | Ca, Inc. | Scenario coverage in test generation |
US10146668B1 (en) * | 2013-12-20 | 2018-12-04 | EMC IP Holding Company LLC | Modeling code coverage in software life cycle |
CN109871311A (en) * | 2017-12-04 | 2019-06-11 | 北京京东尚科信息技术有限公司 | A kind of method and apparatus for recommending test case |
CN110851343A (en) * | 2018-08-21 | 2020-02-28 | 北京京东尚科信息技术有限公司 | Test method and device based on decision tree |
CN111400189A (en) * | 2020-03-25 | 2020-07-10 | 平安银行股份有限公司 | Code coverage rate monitoring method and device, electronic equipment and storage medium |
Non-Patent Citations (1)
Title |
---|
ZHANG Yao; BAI Xiaoying; ZHANG Renwei; LU Hao: "A Model-Based Test Adequacy Evaluation Method", Computer Science, no. 02, 15 February 2013 (2013-02-15) *
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |