CN116185874A - Test report generation method, device, equipment and storage medium - Google Patents

Test report generation method, device, equipment and storage medium

Info

Publication number
CN116185874A
CN116185874A (Application CN202310290838.8A)
Authority
CN
China
Prior art keywords
code
test
difference
tested
difference code
Legal status
Pending
Application number
CN202310290838.8A
Other languages
Chinese (zh)
Inventor
陈嘉俊
Current Assignee
Guangdong Bozhong Intelligent Technology Investment Co ltd
Original Assignee
Guangdong Bozhong Intelligent Technology Investment Co ltd
Application filed by Guangdong Bozhong Intelligent Technology Investment Co ltd
Priority to CN202310290838.8A
Publication of CN116185874A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3692 Test management for test results analysis
    • G06F 11/3676 Test management for coverage analysis
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention discloses a test report generation method, device, equipment and storage medium. The method comprises the following steps: acquiring a code to be tested; determining a difference code according to the code to be tested; if the code to be tested contains a test case, testing the difference code to obtain a test result corresponding to the difference code; and generating a target test report according to the test result corresponding to the difference code and preset parameters. The technical scheme of the embodiments of the invention ensures the running efficiency of each unit test run, solves the problem of slow compilation caused by executing all unit tests in full each time, improves development quality, and reduces the workload of the test team; at the same time, missing test points can be found in the test report more efficiently and rapidly, solving the problem of missed test scenarios caused by black-box testing.

Description

Test report generation method, device, equipment and storage medium
Technical Field
The present invention relates to the field of code detection technologies, and in particular, to a method, an apparatus, a device, and a storage medium for generating a test report.
Background
Unit testing allows developers to self-test their own code, reduces the number of low-level bugs reaching the test stage, and improves test quality. At present, as software iterates continuously, software code becomes more and more complex, service scenarios become richer, and the challenges for developers and testers keep growing; how to perform unit testing efficiently and how to test accurately and efficiently are a major focus of current agile development.
Existing test methods are mainly manual testing, automated inspection and the like, which for testers amount to black-box testing. During a test team's automated or manual testing, the test scope is determined either by the tester's subjective experience or by full regression. With existing test coverage report tools, the generated report covers the full code base, yet the code a tester actually needs may be less than one thousandth of the whole report, so the report is severely redundant. For testers, therefore, such a report greatly increases workload; it can only serve as a reference, cannot truly achieve precise testing, and testing still relies on the tester's subjective experience or remains black-box testing. On the other hand, for unit testing, a unit test must be written for every method; with existing tools, testing can only proceed by running all unit tests in full each time, and as the code iterates and grows, the run becomes slower and slower.
Disclosure of Invention
The invention provides a test report generation method, device, equipment and storage medium, which are used for ensuring the running efficiency of each unit test run, solving the problem of slow compilation caused by executing all unit tests in full each time, improving development quality, and reducing the workload of the test team; at the same time, missing test points can be found in the test report more efficiently and rapidly, solving the problem of missed test scenarios caused by black-box testing.
According to an aspect of the present invention, there is provided a test report generating method, the method comprising:
acquiring a code to be tested;
determining a difference code according to the code to be tested;
if the code to be tested contains a test case, testing the difference code to obtain a test result corresponding to the difference code;
and generating a target test report according to the test result corresponding to the difference code and the preset parameters.
Optionally, if the code to be tested contains a test case, testing the difference code to obtain a test result corresponding to the difference code includes:
if the code to be tested is detected to contain the test case, acquiring the coverage percentage of the test case in the difference code;
and if the coverage percentage of the test cases in the difference codes is greater than or equal to a preset threshold value, testing the difference codes to obtain test results corresponding to the difference codes.
Optionally, the method further comprises:
and if the code to be tested is detected to not contain the test case, and/or the coverage percentage of the test case in the difference code is smaller than the preset threshold value, determining that compiling fails.
Optionally, after determining the difference code according to the code to be tested, the method further includes:
acquiring the number of the difference codes;
and adding a preset character string for each difference code.
Optionally, generating a target test report according to the test result corresponding to the difference code and the preset parameters includes:
acquiring preset parameters corresponding to the preset character strings;
and generating a target test report according to the test result corresponding to the difference code and the preset parameters.
Optionally, determining a difference code according to the code to be tested includes:
acquiring a code to be tested and a reference code;
and comparing the code to be tested with the reference code, and determining a code which is newly added and/or modified in the code to be tested compared with the reference code as a difference code.
Optionally, if the code to be tested contains a test case, testing the difference code to obtain a test result corresponding to the difference code includes:
and if the code to be tested contains the target character, testing the difference code to obtain a test result corresponding to the difference code.
According to another aspect of the present invention, there is provided a test report generating apparatus including:
the first acquisition module is used for acquiring a code to be tested;
the determining module is used for determining a difference code according to the code to be tested;
the test module is used for testing the difference code if the code to be tested contains the test case, so as to obtain a test result corresponding to the difference code;
and the generating module is used for generating a target test report according to the test result corresponding to the difference code and the preset parameters.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the test report generating method according to any one of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer readable storage medium storing computer instructions for causing a processor to execute a test report generating method according to any one of the embodiments of the present invention.
According to the technical scheme, the code to be tested is obtained; the difference code is determined according to the code to be tested; if the code to be tested is detected to contain a test case, the difference code is tested to obtain a test result corresponding to the difference code; and a target test report is generated according to the test result corresponding to the difference code and the preset parameters. The technical scheme of the embodiments of the invention ensures the running efficiency of each unit test run, solves the problem of slow compilation caused by executing all unit tests in full each time, improves development quality, and reduces the workload of the test team; at the same time, missing test points can be found in the test report more efficiently and rapidly, solving the problem of missed test scenarios caused by black-box testing.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a test report generation method according to a first embodiment of the present invention;
FIG. 2 is a schematic diagram of a test report generating device according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of an electronic device implementing a test report generating method according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art may better understand the present invention, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some embodiments of the present invention, not all of them. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "target," and the like in the description and claims of the present invention and in the above-described figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
Fig. 1 is a flowchart of a test report generating method according to a first embodiment of the present invention, which is applicable to scenarios in which a test report needs to be generated. The method may be performed by a test report generating device, which may be implemented in hardware and/or software and may be integrated in any electronic device that provides a test report generating function. As shown in fig. 1, the method includes:
s101, acquiring a code to be tested.
In this embodiment, the code to be tested may be a code to be subjected to unit testing.
Specifically, a code to be tested for a unit test is obtained.
S102, determining a difference code according to the code to be tested.
It should be explained that the difference code may be the code that differs between the code that underwent the previous unit test and the code, written on that basis, that is about to undergo the current unit test. Specifically, the code that underwent the previous unit test may serve as the reference code; that is, the difference code may be the code that differs between the reference code and the code to be tested. The type of a difference code may be, for example, newly added code, deleted code, or modified code.
Specifically, secondary development of the maven-surefire-plugin unit test plugin can be performed: the code to be tested and the reference code are compared through native git methods and a self-developed algorithm, the difference code is pulled, and added and modified unit test code is matched, so that when the unit tests are executed, only the method code corresponding to the additions and modifications is executed. This reduces unnecessary unit test execution and attends only to the added and modified unit tests, thereby achieving the beneficial effect of forcing unit tests to be written without overly affecting compilation efficiency.
In actual operation, first judge whether the type of the difference code is deleted code; if so, ignore it, since deleted code needs no attention, that is, no unit test is run after deletion.
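The diff-filtering step described above can be sketched as follows. This is a simplified illustrative rendering in Python; the patent's actual implementation is secondary development of the Java maven-surefire-plugin using jgit, and the function name and entry names here are hypothetical.

```python
# Classify diff entries and keep only the ones that require unit testing.
# In the patent's scheme, deleted code is ignored: no unit test runs after deletion.

def select_testable_diffs(diff_entries):
    """Keep only added and modified entries; drop deletions.

    Each entry is a (change_type, identifier) pair, where change_type
    is one of "added", "modified", "deleted".
    """
    return [(kind, name) for kind, name in diff_entries
            if kind in ("added", "modified")]

diffs = [
    ("added", "OrderService.create"),
    ("modified", "OrderService.cancel"),
    ("deleted", "OrderService.legacyExport"),
]
# The deleted entry is dropped; only the added and modified ones remain.
print(select_testable_diffs(diffs))
```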
And S103, if the code to be tested contains the test case, testing the difference code to obtain a test result corresponding to the difference code.
The test cases may be test cases written by a user for performing unit tests.
It should be noted that the test result may be the result of the unit tests forcibly run at compile time according to the difference code. For example, the test result may include the number of methods corresponding to the difference code, the start time of compilation, the end time of compilation, the total time consumed by compilation, whether compilation succeeded or failed, the number of methods that failed to compile, the specific line numbers of the code that failed to compile, and the like.
Specifically, whether the code to be tested contains a test case can be detected by checking whether it contains the name of a test case. The name of a test case may be the original class name plus "Test" (when the difference code is at class level) or the method name plus "Test" (when the difference code is at method level); the check can therefore reduce to detecting whether the code to be tested contains the word "Test". If the code to be tested contains a test case, the difference code is tested to obtain the test result corresponding to the difference code.
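The naming-convention check can be sketched as below, again as a hedged Python illustration of the rule stated in the text (class name or method name plus "Test"); the function and the example class names are hypothetical.

```python
# A unit test matches a difference entry when a name '<entry>Test' exists,
# per the agreed naming rule in the embodiment.

def has_matching_test(diff_name, test_names):
    """Return True if a test named '<diff_name>Test' exists."""
    return f"{diff_name}Test" in set(test_names)

print(has_matching_test("OrderService", ["OrderServiceTest", "UserServiceTest"]))  # True
print(has_matching_test("PaymentService", ["OrderServiceTest"]))                   # False
```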
S104, generating a target test report according to the test result corresponding to the difference code and the preset parameters.
It should be explained that the preset parameter may be a value used for exporting the target test report. The preset parameter may be a parameter added to the report-generation command; the newly added parameter takes the JSON string of the difference code as its value, and the required report code is obtained through intersection, thereby generating a test report of the difference code only. Specifically, the preset parameter may be the JSON string added in the difference code, or may be a number, letter, symbol or other parameter preset by the user according to the actual situation, which is not limited in this embodiment.
In this embodiment, the target test report may be the test report corresponding to the difference code. For example, the target test report may include the number of methods corresponding to the difference code, the start time of compilation, the end time of compilation, the total time consumed by compilation, whether compilation succeeded or failed, the number of methods that failed to compile, and the specific line numbers of the code that failed to compile.
Specifically, after the difference code runs normally and the test task is completed, the modified jacoco package is executed to manually generate a coverage statistics report. When the modified jacoco plugin generates the test report, the code to be tested and the reference code are compared through native git methods and the self-developed algorithm to obtain the difference code; a preset parameter is then added to the report-generation command, and the required report code is obtained through intersection, so that a coverage report of only the difference code is generated.
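The intersection step, i.e. keeping only the slice of the full coverage report that belongs to the difference code passed in as a JSON-string parameter, can be sketched as follows. The data structures are hypothetical simplifications; the real report is produced by the modified jacoco plugin.

```python
# Intersect a full coverage report with the difference classes so that only
# the diff coverage is kept. The diff arrives as a JSON string, mirroring the
# preset parameter added to the report-generation command.
import json

def filter_report(full_report, diff_json):
    """Return the sub-report restricted to the classes named in diff_json."""
    diff_classes = set(json.loads(diff_json))
    return {cls: cov for cls, cov in full_report.items() if cls in diff_classes}

full = {"OrderService": 0.82, "UserService": 0.91, "PaymentService": 0.40}
diff_param = json.dumps(["OrderService", "PaymentService"])
# UserService is untouched code, so it is filtered out of the report.
print(filter_report(full, diff_param))
```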
According to the technical scheme, the code to be tested is obtained; the difference code is determined according to the code to be tested; if the code to be tested is detected to contain a test case, the difference code is tested to obtain a test result corresponding to the difference code; and a target test report is generated according to the test result corresponding to the difference code and the preset parameters. The technical scheme of the embodiments of the invention ensures the running efficiency of each unit test run, solves the problem of slow compilation caused by executing all unit tests in full each time, improves development quality, and reduces the workload of the test team; at the same time, missing test points can be found in the test report more efficiently and rapidly, solving the problem of missed test scenarios caused by black-box testing.
Optionally, if the code to be tested is detected to contain a test case, testing the difference code to obtain a test result corresponding to the difference code, including:
if the code to be tested is detected to contain the test cases, the coverage percentage of the test cases in the difference code is obtained.
It should be explained that the coverage percentage may be the percentage of the number of test cases relative to the number of methods contained in the difference code. By way of example, where the difference code contains 10 methods and the number of test cases is 8, the coverage percentage of the test cases in the difference code is 80%.
Specifically, if the code to be tested is detected to contain test cases, the percentage of the number of test cases relative to the number of methods contained in the difference code is obtained. In actual operation, jacoco unit test coverage statistics can be performed on the difference code; if the unit test coverage cannot reach the preset ratio, compilation is blocked at the compile level, and compilation can proceed only after the unit test self-check passes.
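The coverage percentage as defined in this embodiment reduces to a simple ratio; a minimal sketch (function name hypothetical):

```python
# Coverage percentage: share of methods in the difference code that have a
# matching test case, as described in the embodiment.

def coverage_percent(method_count, test_case_count):
    """Percentage of methods in the diff covered by test cases."""
    if method_count == 0:
        return 100.0  # nothing to cover
    return 100.0 * test_case_count / method_count

print(coverage_percent(10, 8))  # 80.0, matching the 10-method / 8-case example above
```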
And if the coverage percentage of the test cases in the difference codes is greater than or equal to a preset threshold value, testing the difference codes to obtain test results corresponding to the difference codes.
The preset threshold may be a threshold preset by a user based on actual conditions, and the specific value of the preset threshold is not limited in this embodiment. By way of example, the preset threshold may be 60%.
Specifically, if the percentage of coverage of the test cases to the difference codes is detected to be greater than or equal to a preset threshold, the difference codes are tested, and a test result corresponding to the difference codes is obtained.
Optionally, the test report generating method further includes:
if it is detected that the code to be tested contains no test case, and/or the coverage percentage of the test cases in the difference code is smaller than the preset threshold, determining that compilation fails.
Specifically, if it is detected that the code to be tested contains no test case, compilation is determined to have failed, forcing a unit test to be written. Considering that some simple tool-type code does not need unit tests, such code can be ignored via a package path in the configuration. And/or, if the coverage percentage of the test cases in the difference code is smaller than the preset threshold, compilation is determined to have failed and an error is reported, forcing unit tests to be written and self-tested.
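The compile gate just described (fail when no test case exists or coverage is below the threshold, with an exemption path for simple tool-type packages) can be sketched as follows; the parameter names and the 60% default are assumptions for illustration.

```python
# Compile gate: compilation may proceed only when a test case exists and the
# coverage percentage meets the configured threshold; exempt packages skip
# the check entirely (e.g. simple utility code excluded via a package path).

def check_compile(has_test_case, coverage, threshold=60.0, exempt=False):
    """Return True if compilation may proceed."""
    if exempt:
        return True
    return has_test_case and coverage >= threshold

print(check_compile(True, 80.0))    # True: test case present, coverage above threshold
print(check_compile(False, 80.0))   # False: no test case, compilation fails
print(check_compile(True, 30.0))    # False: coverage below threshold
```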
Optionally, after determining the difference code according to the code to be tested, the method further includes:
the number of difference codes is obtained.
Specifically, after the code to be tested and the reference code are compared to determine the difference code, the number of difference codes is obtained. For example, each line of the difference code may be counted as one difference code, or each differing method may be counted as one difference code, which is not limited in this embodiment.
A preset string is added for each difference code.
In this embodiment, the preset character string may be a JSON string added in the difference code. Specifically, the preset character string may be added at any suitable position of the difference code, which is not limited in this embodiment.
Specifically, a preset character string may be added to each difference code as identification information of each difference code.
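Tagging each difference code with an identifying preset string can be sketched as below; the "diff-N" tag format is a hypothetical stand-in for the JSON string mentioned in the text.

```python
# Attach a preset (identifying) string to each difference code; the tag is
# later used to select the matching slice of the report.

def tag_diffs(diff_names):
    """Map each difference code name to a generated identification string."""
    return {name: f"diff-{i}" for i, name in enumerate(diff_names, start=1)}

print(tag_diffs(["OrderService", "PaymentService"]))
```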
Optionally, generating a target test report according to a test result and a preset parameter corresponding to the difference code includes:
and obtaining preset parameters corresponding to the preset character strings.
Specifically, a correspondence between the preset string and the preset parameter may be pre-established, for example, the preset string 1 corresponds to the preset parameter 1, the preset string 2 corresponds to the preset parameter 2, and the like, and the correspondence between the preset string and the preset parameter may be stored in a list form.
And generating a target test report according to the test result corresponding to the difference code and the preset parameters.
Specifically, a preset parameter may be input, and the target test report of the difference code bearing the preset string corresponding to that parameter is obtained through the correspondence between preset strings and preset parameters. In this way, only the coverage of the difference code the user wants to check is displayed; there is no need to search for code coverage inefficiently in a full unit test report, missing test points are found more efficiently and rapidly, and the problem of missed test scenarios caused by black-box testing can be solved.
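The pre-established correspondence between preset parameters and preset strings, stored as a lookup table as the text suggests, can be sketched like this; all keys and tags are hypothetical.

```python
# Pre-established correspondence: preset parameter -> preset string
# (e.g. preset string 1 corresponds to preset parameter 1, and so on).
PARAM_TO_STRING = {"param1": "diff-1", "param2": "diff-2"}

def select_diff(param, tagged_diffs):
    """Return the difference codes whose preset string matches the parameter.

    tagged_diffs maps difference code name -> preset string.
    """
    wanted = PARAM_TO_STRING.get(param)
    return [name for name, tag in tagged_diffs.items() if tag == wanted]

tagged = {"OrderService": "diff-1", "PaymentService": "diff-2"}
print(select_diff("param2", tagged))  # ['PaymentService']
```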
Optionally, determining the difference code according to the code to be tested includes:
and acquiring the code to be tested and the reference code.
The reference code may be the history code corresponding to the code to be tested, that is, the code that had already undergone unit testing before the code to be tested was written on its basis.
Specifically, the reference code, that is, the code that underwent the previous unit test, is obtained, and the code to be tested, that is, the code written on that basis that is about to undergo the current unit test, is obtained.
Compare the code to be tested with the reference code, and determine the code that is newly added and/or modified in the code to be tested compared with the reference code as the difference code.
Specifically, secondary development of the maven-surefire-plugin unit test plugin can be performed: the code to be tested and the reference code are compared through native git methods and the self-developed algorithm, the difference code is pulled, and added and modified unit test code is matched, so that only the added and modified methods are executed when the unit tests run, reducing unnecessary unit test execution. Only the added and modified unit tests are attended to, so unit test writing is forced without overly affecting compilation efficiency. Finally, jacoco unit test coverage statistics are performed on the difference code; if the unit test coverage cannot reach the preset ratio, compilation is forcibly blocked, and compilation can proceed only after the unit test self-check passes.
Optionally, if the code to be tested is detected to contain a test case, testing the difference code to obtain a test result corresponding to the difference code, including:
and if the code to be tested contains the target character, testing the difference code to obtain a test result corresponding to the difference code.
Wherein the target character may be a character for identifying a test case. By way of example, the target character may be the word "Test".
Specifically, whether the code to be tested contains a test case can be detected by checking whether the code to be tested contains the target character, for example the word "Test"; if the code to be tested contains the target character, the difference code is tested to obtain the test result corresponding to the difference code.
As an exemplary description of an embodiment of the present invention, a specific implementation procedure of a test report generating method may include the following operations:
step one: using an open source framework: the maven-surafire-plug in unit test framework, the git open source package jgit, jacoco test coverage tool.
Step two: configure 6 custom parameters: the repository address of the project; the branch name of the code to be tested (may be left unconfigured); the branch name of the reference code (may be left unconfigured); whether to run the full set of test cases (true runs the full set, false runs the incremental test cases; defaults to false); the login user name of the repository (may be left unconfigured); the login password of the repository (may be left unconfigured).
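The six custom parameters of step two can be rendered as a configuration sketch. The key names are hypothetical; in the described implementation they would be plugin configuration entries in the Maven POM rather than a Python dictionary.

```python
# The six custom parameters from step two, as an illustrative configuration map.
config = {
    "repositoryUrl": "https://git.example.com/project.git",  # repository address (required)
    "testBranch": None,     # branch name of the code to be tested; may be unconfigured
    "baseBranch": None,     # branch name of the reference code; may be unconfigured
    "runFullTests": False,  # true = run full set; false = incremental (the default)
    "username": None,       # repository login user name; may be unconfigured
    "password": None,       # repository login password; may be unconfigured
}
print(config["runFullTests"])  # False: incremental test cases by default
```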
Step three: at the start of compilation, launch the corresponding interface. First, pull all unit test class paths according to the maven-surefire-plugin unit test framework and scan out the information of all unit test classes. Based on the configured git repository path, code to be tested, reference code, account, password and other information, obtain all the code-to-be-tested information through the api methods provided by jgit, and obtain the corresponding reference code and code-to-be-tested information by comparison, thereby obtaining the corresponding branch details. Compare the difference between the reference code branch and the code-to-be-tested branch through jgit methods, store the difference code information, and then traverse the information of the difference code classes. First judge whether a difference code is a deleted class; if so, ignore it, since deleted code needs no attention and no unit test is run after deletion. If it is not deleted, judge whether a test case exists in the code to be tested; the agreed rule is that the class name of a test case must be the original class name plus "Test", and if the unit test is found to have been modified, the test case needs to be run again. If no unit test matches the difference, it is a new test case and also needs to be run again. If a unit test fails, compilation fails; compilation also fails if the number of unit tests, as a percentage of all methods, is less than the preset threshold. If all checks pass, compilation succeeds.
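The compile-time flow of step three can be condensed into one sketch: drop deleted entries, require a matching "<Class>Test" for every remaining change, and gate on the coverage ratio. All names are illustrative; the real flow uses jgit and the maven-surefire-plugin, not this Python rendering.

```python
# End-to-end sketch of the step-three compile check.

def compile_check(diffs, test_classes, threshold=60.0):
    """diffs: (change_type, class_name) pairs; test_classes: set of test class names."""
    live = [name for kind, name in diffs if kind != "deleted"]   # deletions ignored
    tested = [name for name in live if f"{name}Test" in test_classes]
    coverage = 100.0 * len(tested) / len(live) if live else 100.0
    # Compilation succeeds only when every live change has a test and the
    # coverage ratio meets the threshold.
    return coverage >= threshold and len(tested) == len(live)

diffs = [("modified", "OrderService"), ("deleted", "LegacyService")]
print(compile_check(diffs, {"OrderServiceTest"}))  # True: the deleted entry is ignored
```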
Step four: after a successful start and once testing is finished, the test coverage report needs to be manually exported. First, call the server interface according to the above scheme to obtain the class paths of the difference code. Then call the modified jacoco plugin, input the preset parameters, and pass them as parameters to the report export command: first export the full test coverage report, then compare the difference code according to the input preset parameters, compare the code differences, judge whether each change is an addition, deletion or modification, and display only the coverage of the correspondingly modified code. The difference code coverage report, i.e. the target test report, is generated and returned for download.
The above method only achieves class-level precision for difference code. If finer, method-level precision is needed, an agreed rule can be used: name each unit test method as the method name plus "Test". It can then be checked exactly which method changed, the corresponding unit test method is executed precisely, and running efficiency is further optimized. The implementation of the method-level difference code is the same as that of the class-level difference code, and the detailed process is not repeated here.
According to the technical scheme provided by this embodiment of the invention, by identifying the difference code and executing only the correspondingly added or modified code, the running efficiency of each unit test round is ensured, and the problem of slow compilation caused by executing the full set of unit tests every time is solved. If a unit test is missing or fails, this is detected and reported at compile time and compilation cannot proceed, which forces developers to write and self-run unit tests, greatly improving development quality and reducing the workload of the test team. After the code runs successfully and testing is completed, the coverage report produced by the preceding tests can be exported manually, showing only the coverage of the difference code; there is no need to search the full unit test report for the coverage of the changed code, missed test points can be located more efficiently, and the problem of missed test scenarios brought by black-box testing is alleviated.
Example two
Fig. 2 is a schematic structural diagram of a test report generating device according to a second embodiment of the present invention. As shown in fig. 2, the apparatus includes: a first acquisition module 201, a determination module 202, a test module 203 and a generation module 204.
The first acquiring module 201 is configured to acquire a code to be tested;
a determining module 202, configured to determine a difference code according to the code to be tested;
the test module 203 is configured to test the difference code if the code to be tested is detected to contain a test case, so as to obtain a test result corresponding to the difference code;
and the generating module 204 is configured to generate a target test report according to the test result corresponding to the difference code and a preset parameter.
Optionally, the test module 203 includes:
the first acquisition unit is used for acquiring the coverage percentage of the test cases in the difference code if the test cases are detected to be contained in the code to be tested;
and the first test unit is used for testing the difference code if the coverage percentage of the test case in the difference code is greater than or equal to a preset threshold value, so as to obtain a test result corresponding to the difference code.
Optionally, the test report generating device further includes:
and the first determining unit is used for determining compiling failure if the code to be tested does not contain a test case and/or the coverage percentage of the test case in the difference code is smaller than the preset threshold value.
Optionally, the test report generating device further includes:
the second acquisition module is used for acquiring the number of the difference codes after the difference code is determined according to the code to be tested;
and the adding module is used for adding a preset character string for each difference code after the difference code is determined according to the code to be tested.
Optionally, the generating module 204 includes:
the second acquisition unit is used for acquiring preset parameters corresponding to the preset character strings;
and the generating unit is used for generating a target test report according to the test result corresponding to the difference code and the preset parameters.
Optionally, the determining module 202 includes:
the third acquisition unit is used for acquiring the code to be tested and the reference code;
and the second determining unit is used for comparing the code to be tested with the reference code, and determining code which is newly added and/or modified in the code to be tested compared with the reference code as the difference code.
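The comparison the second determining unit performs can be illustrated with a minimal stand-in, using Python's `difflib` in place of JGit's branch diff; the function name and line-based granularity are assumptions for illustration only.

```python
import difflib

def difference_lines(reference_lines, tested_lines):
    """Return the lines present in the code to be tested but not in the
    reference code, i.e. the newly added and modified lines that make up the
    difference code."""
    diff = difflib.ndiff(reference_lines, tested_lines)
    return [line[2:] for line in diff if line.startswith('+ ')]
```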
Optionally, the test module 203 includes:
and the second test unit is used for testing the difference code if the code to be tested contains the target character, so as to obtain a test result corresponding to the difference code.
The test report generating device provided by the embodiment of the invention can execute the test report generating method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the executing method.
Example III
Fig. 3 shows a schematic diagram of an electronic device 30 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital processing devices, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be exemplary only, and are not meant to limit implementations of the invention described and/or claimed herein.
As shown in fig. 3, the electronic device 30 includes at least one processor 31, and a memory, such as a Read Only Memory (ROM) 32, a Random Access Memory (RAM) 33, etc., communicatively connected to the at least one processor 31, wherein the memory stores a computer program executable by the at least one processor, and the processor 31 can perform various suitable actions and processes according to the computer program stored in the Read Only Memory (ROM) 32 or the computer program loaded from the storage unit 38 into the Random Access Memory (RAM) 33. In the RAM 33, various programs and data required for the operation of the electronic device 30 may also be stored. The processor 31, the ROM 32 and the RAM 33 are connected to each other via a bus 34. An input/output (I/O) interface 35 is also connected to bus 34.
Various components in electronic device 30 are connected to I/O interface 35, including: an input unit 36 such as a keyboard, a mouse, etc.; an output unit 37 such as various types of displays, speakers, and the like; a storage unit 38 such as a magnetic disk, an optical disk, or the like; and a communication unit 39 such as a network card, modem, wireless communication transceiver, etc. The communication unit 39 allows the electronic device 30 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 31 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 31 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 31 performs the various methods and processes described above, such as the test report generation method:
acquiring a code to be tested;
determining a difference code according to the code to be tested;
if the code to be tested contains a test case, testing the difference code to obtain a test result corresponding to the difference code;
and generating a target test report according to the test result corresponding to the difference code and the preset parameters.
In some embodiments, the test report generation method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as storage unit 38. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 30 via the ROM 32 and/or the communication unit 39. When the computer program is loaded into RAM 33 and executed by processor 31, one or more steps of the test report generating method described above may be performed. Alternatively, in other embodiments, the processor 31 may be configured to perform the test report generating method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs, which may be executed and/or interpreted on a programmable system including at least one programmable processor; the programmable processor, which may be special-purpose or general-purpose, may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local area networks (LANs), wide area networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and a server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system and overcomes the defects of difficult management and weak service scalability in traditional physical hosts and VPS services.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved, and the present invention is not limited herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. A test report generation method, comprising:
acquiring a code to be tested;
determining a difference code according to the code to be tested;
if the code to be tested contains a test case, testing the difference code to obtain a test result corresponding to the difference code;
and generating a target test report according to the test result corresponding to the difference code and the preset parameters.
2. The method according to claim 1, wherein if the code to be tested is detected to contain a test case, testing the difference code to obtain a test result corresponding to the difference code, including:
if the code to be tested is detected to contain the test case, acquiring the coverage percentage of the test case in the difference code;
and if the coverage percentage of the test cases in the difference codes is greater than or equal to a preset threshold value, testing the difference codes to obtain test results corresponding to the difference codes.
3. The method as recited in claim 2, further comprising:
and if the code to be tested is detected to not contain the test case, and/or the coverage percentage of the test case in the difference code is smaller than the preset threshold value, determining that compiling fails.
4. The method of claim 1, further comprising, after determining a difference code from the code under test:
acquiring the number of the difference codes;
and adding a preset character string for each difference code.
5. The method of claim 4, wherein generating a target test report according to the test result and the preset parameter corresponding to the difference code comprises:
acquiring preset parameters corresponding to the preset character strings;
and generating a target test report according to the test result corresponding to the difference code and the preset parameters.
6. The method of claim 1, wherein determining a difference code from the code under test comprises:
acquiring a code to be tested and a reference code;
and comparing the code to be tested with the reference code, and determining code which is newly added and/or modified in the code to be tested compared with the reference code as a difference code.
7. The method according to claim 1, wherein if the code to be tested is detected to contain a test case, testing the difference code to obtain a test result corresponding to the difference code, including:
and if the code to be tested contains the target character, testing the difference code to obtain a test result corresponding to the difference code.
8. A test report generating apparatus, comprising:
the first acquisition module is used for acquiring the code to be tested;
the determining module is used for determining a difference code according to the code to be tested;
the test module is used for testing the difference code if the code to be tested contains the test case, so as to obtain a test result corresponding to the difference code;
and the generating module is used for generating a target test report according to the test result corresponding to the difference code and the preset parameters.
9. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the test report generating method of any one of claims 1-7.
10. A computer readable storage medium storing computer instructions for causing a processor to perform the test report generating method of any one of claims 1-7.
CN202310290838.8A 2023-03-22 2023-03-22 Test report generation method, device, equipment and storage medium Pending CN116185874A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310290838.8A CN116185874A (en) 2023-03-22 2023-03-22 Test report generation method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310290838.8A CN116185874A (en) 2023-03-22 2023-03-22 Test report generation method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116185874A true CN116185874A (en) 2023-05-30

Family

ID=86438573

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310290838.8A Pending CN116185874A (en) 2023-03-22 2023-03-22 Test report generation method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116185874A (en)

Similar Documents

Publication Publication Date Title
CN113821433A (en) Method, device, equipment, medium and product for testing cloud mobile phone application program
CN112988578A (en) Automatic testing method and device
CN113778849A (en) Method, apparatus, device and storage medium for testing code
CN116185880A (en) Automatic test method, device, equipment and medium for embedded system
CN116185874A (en) Test report generation method, device, equipment and storage medium
CN115061921A (en) Automatic test method, device, electronic equipment and readable storage medium
CN114693116A (en) Method and device for detecting code review validity and electronic equipment
CN114546799A (en) Point burying log checking method and device, electronic equipment, storage medium and product
CN112631930B (en) Dynamic system testing method and related device
CN115374010A (en) Function testing method, device, equipment and storage medium
CN117271373B (en) Automatic construction method and device for test cases, electronic equipment and storage medium
KR102519639B1 (en) Method for providing code inspection interface, and apparatus implementing the same method
CN116991737A (en) Software testing method, system, electronic equipment and storage medium
CN117609070A (en) Service traversal testing method and device, electronic equipment and storage medium
CN116521536A (en) Code coverage rate determining method and device, electronic equipment and storage medium
CN115794525A (en) BMC (baseboard management controller) pressure testing method, device, equipment and storage medium
CN117951031A (en) Graphic interface automatic test method, graphic interface automatic test device, electronic equipment and medium
CN116150024A (en) Method and device for evaluating program to be tested and electronic equipment
CN117609087A (en) Code processing method, device, equipment and medium
CN117371506A (en) Model training method, model testing device, electronic equipment and storage medium
CN117648252A (en) Function test method and device for software application, electronic equipment and storage medium
CN114238149A (en) Batch testing method of accounting system, electronic device and storage medium
CN115374012A (en) Automatic regression testing method and device, electronic equipment and storage medium
CN117632726A (en) Test case validity detection method and device, electronic equipment and storage medium
CN118277275A (en) Interface testing method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination