CN113641573A - Revision log-based automatic testing method and system for program analysis software - Google Patents
Revision log-based automatic testing method and system for program analysis software
- Publication number: CN113641573A (application number CN202110844566.2A)
- Authority: CN (China)
- Prior art keywords: analysis result, version, minimum code, code rule, rule unit
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3672—Test management
- G06F11/368—Test management for test version control, e.g. updating test cases to a new software version
- G06F11/3692—Test management for test results analysis
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Quality & Reliability (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Debugging And Monitoring (AREA)
- Stored Programmes (AREA)
Abstract
The invention discloses a revision log-based automated testing method and system for program analysis software. The method comprises the following steps: acquiring the minimum code rule units in the software code; acquiring a first version and a second version of the software code, and acquiring the specific change type of each minimum code rule unit that is changed in the second version compared with the first version; executing the first version and the second version of the program analysis software code to obtain a first analysis result and a second analysis result respectively; judging whether the second analysis result falls within a preset normal result range corresponding to the minimum code rule unit, and whether the change of the second analysis result relative to the first analysis result falls within the normal change range caused by the specific change type; and taking each minimum code rule unit whose judgment result is negative as key troubleshooting content. The method and system improve the rationality, efficiency, accuracy and degree of automation with which software test troubleshooting is divided into key and non-key content, and improve test efficiency.
Description
Technical Field
The invention relates to the technical field of program testing, in particular to a revision log-based automatic testing method and system for program analysis software.
Background
Program analysis software is composed of a plurality of program analysis rules. Each rule is relatively independent, yet different rules often reuse the same supporting code. Because of this characteristic, the impact of a code modification can reach far beyond the range a developer expects, so testing only the newly added or repaired rules cannot guarantee the functional integrity and stability of a new version. At the same time, program analysis software usually contains a huge number of rules; testing every rule one by one creates an enormous workload, and many of those test tasks turn out to be ineffective.
With the continuous development and maturation of software engineering, more and more companies adopt iterative, agile development instead of the traditional waterfall model. Development cycles therefore become shorter and shorter, and the testing that must be done before each release becomes more and more frequent. For different links in the process and different test focuses, the software must be tested in many ways, such as smoke testing, regression testing, system testing and performance testing. As the number of rules grows with each iteration of the program analysis software and its functions become richer, the testing workload of every test link keeps increasing. Balancing the testing work, reducing invalid and repeated testing by testers, and improving test efficiency across the whole development cycle of the program analysis software therefore become more and more important.
The existing test scheme for program analysis software works as follows: in each development cycle, testers test manually, assisted by some automated testing tools; they manually collect the revision logs of the new version and extract from them the list of changed rules, which is used as the set of key troubleshooting objects for the current test, while all other, unchanged rules are treated as non-key troubleshooting objects. The key troubleshooting rules are tested first and examined in more detail against more software projects for a more comprehensive test; the non-key troubleshooting rules still have to be examined against a subset of software projects to verify that the changes have not introduced new problems into them. Once all key and non-key troubleshooting rules have been examined, the rule-oriented test work of the program analysis software is complete.
The existing test scheme cannot effectively distinguish which rules are unchanged and unaffected by the code modifications of the development cycle, which adds a large amount of invalid test work. Nor can it effectively identify, among the repaired and newly added rules, which ones require key examination and which do not. The result is a huge testing workload and low testing efficiency. In addition, the scheme makes little use of automated testing technology, so the repetitiveness of the test work cannot be reduced and test efficiency cannot be improved.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a revision log-based automated testing method and system for program analysis software, which effectively identify which rules require key examination and which do not, reduce the testing workload of testers, and improve the degree of test automation and the test efficiency.
To solve the above problems, the following solutions are proposed:
the application discloses in a first aspect a revision log-based automated program analysis software testing method, comprising:
acquiring a minimum code rule unit in the program analysis software code;
acquiring a first version and a second version of the program analysis software code, and acquiring a specific change type of a minimum code rule unit changed in the second version compared with the first version, wherein the second version is a modified version of the first version;
executing a first version and a second version of the program analysis software code based on a benchmark test set, and respectively obtaining a first analysis result and a second analysis result of each minimum code rule unit in each version;
judging whether a second analysis result conforms to a preset normal result range corresponding to the minimum code rule unit or not and whether the change of the second analysis result compared with the first analysis result conforms to a normal change range caused by the specific change type or not;
and when the judgment result is negative, taking the corresponding minimum code rule unit as key investigation content, and when the judgment result is positive, taking the corresponding minimum code rule unit as non-key investigation content.
As a further optimization of the above scheme, for the non-key investigation content, the method further includes: if the corresponding minimum code rule unit belongs to the minimum code rule units that are changed compared with the first version, taking it as light investigation content; otherwise, not investigating it.
As a further optimization of the above scheme, the minimum code rule unit is obtained by dividing the program analysis software code by using 1 rule as a basic unit.
As a further optimization of the above solution, the minimum code rule units changed in the second version compared with the first version include additions, modifications and deletions of minimum code rule units, and the specific change types of a changed minimum code rule unit include: repairing false alarms (false positives), repairing missed reports (false negatives), repairing performance, and code reconstruction only, with no impact on false alarms, missed reports or performance.
As a further optimization of the above solution, the obtaining a first analysis result and a second analysis result of each minimum code rule unit in each version includes:
obtaining first and second execution results obtained by executing the first and second versions of the program analysis software code based on a benchmark test set;
acquiring alarm quantity parameters, alarm occurrence position parameters and execution time consumption of each minimum code rule unit in the execution results based on the first execution results and the second execution results;
and obtaining a first analysis result and a second analysis result of each minimum code rule unit in each version based on the alarm quantity parameter and the execution time consumption of each minimum code rule unit.
As a further optimization of the above scheme, each alarm occurrence position parameter includes a number of a minimum code rule unit in which each alarm occurrence position is located.
As a further optimization of the above scheme, determining whether a second analysis result meets a preset normal result range corresponding to the minimum code rule unit and whether a change of the second analysis result compared with a first analysis result meets a normal change range caused by the specific change type includes:
(1) acquiring preset normal result range data of all minimum code rule units in the second version;
(2) judging whether a second analysis result of the minimum code rule unit is matched with the preset normal result range data or not, if so, entering the step (3), and otherwise, entering the step (4);
(3) for a minimum code rule unit that is unchanged in the second version compared with the first version, entering step (6); for a minimum code rule unit that is changed in the second version compared with the first version, acquiring the specific change type of the change and the change data of the second analysis result compared with the first analysis result, and entering step (4) if the change data do not match the preset normal change range data corresponding to the specific change type, otherwise entering step (5);
(4) taking the minimum code rule unit as key investigation content;
(5) taking the minimum code rule unit as light investigation content;
(6) and taking the minimum code rule unit as the non-investigation content.
As a further optimization of the above scheme, the preset normal variation range data corresponding to the specific change type includes:
if the specific change type is repairing false alarms, the normal change range data are: the alarm quantity in the second analysis result is less than that in the first analysis result, the alarm information in the second analysis result is within the alarm information range of the first analysis result, and the analysis time consumption of the second analysis result is not obviously different from that of the first analysis result;
if the specific change type is repairing missed reports, the normal change range data are: the alarm quantity in the second analysis result is greater than that in the first analysis result, the alarm information in the second analysis result completely covers the alarm information in the first analysis result, and the analysis time consumption of the second analysis result is not obviously different from that of the first analysis result;
if the specific change type is repairing performance, the normal change range data are: the analysis time consumption of the second analysis result is less than that of the first analysis result, and the alarm information of the second analysis result is consistent with that of the first analysis result;
if the specific change type is code reconstruction only, the normal change range data are: the analysis time consumption of the second analysis result is not obviously different from that of the first analysis result, and the alarm information of the second analysis result is consistent with that of the first analysis result;
if the specific change type is a mixture of repairing false alarms and repairing missed reports, the normal change range data are: the second analysis result contains alarm information not included in the first analysis result, and the first analysis result contains alarm information not included in the second analysis result;
if the specific change type is a mixture of multiple change types, the normal change range data are the combination of the normal change range data corresponding to each change type.
As a further optimization of the above scheme, the revision log-based program analysis software automated testing method further comprises:
and generating test report data based on the minimum code rule unit in the key investigation content, the light investigation content and the non-investigation content, wherein the test report comprises data in text and diagram forms.
The second aspect of the present application discloses a revision log-based program analysis software automation test system, comprising:
a minimum code rule unit acquisition unit, configured to acquire the minimum code rule units in the program analysis software code;
a new version revision log obtaining unit, configured to obtain a first version and a second version of the program analysis software code, and obtain a specific change type of a minimum code rule unit in the second version that is changed compared with the first version, where the second version is a modified version of the first version;
an analysis result obtaining unit, configured to execute a first version and a second version of the program analysis software code based on a benchmark test set, and obtain a first analysis result and a second analysis result of each minimum code rule unit in each version respectively;
the analysis result analysis unit is used for judging whether a second analysis result conforms to a preset normal result range corresponding to the minimum code rule unit or not and whether the change of the second analysis result compared with the first analysis result conforms to a normal change range caused by the specific change type or not;
and the investigation content classification unit is used for taking the corresponding minimum code rule unit as key investigation content when the output of the analysis result analysis unit is negative, and taking the corresponding minimum code rule unit as non-key investigation content when the judgment result is positive.
The revision log-based program analysis software automatic testing method and system have the following beneficial effects:
1. The versions of the program analysis software code are managed at the granularity of the minimum code rule unit, and the changed content between the new and old versions of the code is managed as independent minimum units. A preliminary comparison and judgment of the analysis results is made based on the comparison between the second analysis result of the new version and the first analysis result of the old version. This effectively narrows the test scope of the program analysis software, classifies the rules to be examined more reasonably, makes the testers' focus more reasonable, and improves the overall test effect.
2. By comparing the alarm quantity and execution time parameters in the execution results of the new and old versions on the benchmark test set, it is determined which code of the new version of the program analysis software needs key examination, which needs only light examination and which needs no examination at all. This effectively narrows the test scope of the rule set that must be examined and divides the rule set within that scope more reasonably: minimum code rule units with abnormal comparison results are placed in the key investigation scope, and minimum code rule units with normal comparison results are placed in the non-key investigation scope. The testing workload of testers is thereby greatly reduced and test efficiency is improved. At the same time, testing human resources are allocated more reasonably, the test cycle is shortened, and the test effect and quality are improved.
3. A code revision management specification is established for the programming stage of each new-version development cycle of the software code. When code is revised and submitted, a revision note is provided in a specific standard format; the note contains the revised rule numbers and the corresponding function points. All code changes are submitted as independent minimum units together with revision notes in the specified format, which provides data support for the subsequent automated tests.
Drawings
FIG. 1 is a flowchart of a revision log-based automated testing method for program analysis software according to an embodiment of the present application;
FIG. 2 is a block diagram of a revision log-based automated testing system for program analysis software according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The embodiment of the application discloses a revision log-based automatic program analysis software testing method, which specifically comprises the following steps:
acquiring a minimum code rule unit in the program analysis software code, wherein the minimum code rule unit is obtained by dividing the program analysis software code by taking 1 rule as a basic unit, and preferably, the minimum code rule unit comprises 1 rule;
acquiring a first version and a second version of a program analysis software code, and acquiring a specific change type of a minimum code rule unit changed in the second version compared with the first version, wherein the second version is a modified version of the first version;
executing a first version and a second version of the program analysis software code based on a benchmark test set, and respectively obtaining a first analysis result and a second analysis result of each minimum code rule unit in each version;
judging whether a second analysis result conforms to a preset normal result range corresponding to the minimum code rule unit or not and whether the change of the second analysis result compared with the first analysis result conforms to a normal change range caused by the specific change type or not;
and when the judgment result is negative, taking the corresponding minimum code rule unit as key investigation content, and when the judgment result is positive, taking the corresponding minimum code rule unit as non-key investigation content.
In this embodiment, in the programming implementation phase of the new version development cycle of the software code, the revision of the software code complies with a preset code revision management specification, which includes:
1. managing code changes through a code version control tool;
2. all code revisions need to be split into the smallest possible units, i.e., refined to a revision of one rule and a few specific function points, including: rule implementation (newly added), false alarm repair, missed report repair, performance repair, or code reconstruction;
3. when a code modification is submitted, a revision note is provided in a specific standard format; the note contains the modified rule numbers and the corresponding function points, as illustrated by the sketch below.
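By way of illustration only, a revision note might look as follows and can be parsed back into (rule number, function point) pairs by the automated testing tool; the bracketed annotation format, the rule-number scheme and the `parse_revision_note` helper are assumptions of this example, not a format prescribed by the specification.

```python
import re
from typing import List, Tuple

# Assumed annotation format (illustrative, not prescribed by this embodiment):
#   [RULE-<number>][<function point>] free-text description
# where <function point> is one of: new, fix-false-positive, fix-missed-report,
# fix-performance, refactor-only.
NOTE_PATTERN = re.compile(r"\[RULE-(\d+)\]\[([a-z0-9\-]+)\]")


def parse_revision_note(note: str) -> List[Tuple[str, str]]:
    """Extract (rule number, function point) pairs from one revision note."""
    return NOTE_PATTERN.findall(note)


if __name__ == "__main__":
    note = ("[RULE-101][fix-false-positive] narrow the null-dereference check; "
            "[RULE-207][new] implement array bounds rule")
    print(parse_revision_note(note))   # [('101', 'fix-false-positive'), ('207', 'new')]
```

Because every submission carries such a note, the union of the notes in one development cycle is exactly the revision log that the automated testing tool consumes in the next step.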
At the automated testing tool end, based on the code modification and submission specification of the implementation phase, the procedure is as follows:
1. acquiring a revision log of the program analysis software in the current development cycle, wherein the revision log comprises:
1.1 the rule numbers of newly added rules;
1.2 the rule numbers of rules whose false alarms were repaired;
1.3 the rule numbers of rules whose missed reports were repaired;
1.4 the rule numbers of rules whose performance was repaired;
1.5 the rule numbers of rules that only underwent code reconstruction, with no effect on false alarms, missed reports or performance;
2. executing the program analysis software of the current version and the previous version on the benchmark test set, and collecting the analysis results of the two runs (a data structure sketch for these results follows this list), the analysis results comprising:
2.1 the number of alarms per rule;
2.2 the position information of each piece of alarm information;
2.3 the time consumed by each rule.
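Following on from the list above, the collected analysis results can be represented per minimum code rule unit as in the sketch below. The JSON layout of the analyzer report and the `load_analysis_results` helper are assumptions of this example; any output of the program analysis software that exposes the per-rule alarm quantity, alarm positions and elapsed time would serve the same purpose.

```python
import json
from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class RuleAnalysisResult:
    """Analysis result of one minimum code rule unit on the benchmark test set."""
    rule_id: str
    alarm_count: int = 0
    # Each alarm position: (file path, line number) inside the benchmark test set.
    alarm_locations: List[Tuple[str, int]] = field(default_factory=list)
    elapsed_seconds: float = 0.0


def load_analysis_results(report_path: str) -> Dict[str, RuleAnalysisResult]:
    """Build per-rule analysis results from one run's report (assumed JSON layout)."""
    with open(report_path, encoding="utf-8") as fh:
        raw = json.load(fh)
    results: Dict[str, RuleAnalysisResult] = {}
    for entry in raw["rules"]:
        result = RuleAnalysisResult(rule_id=str(entry["id"]),
                                    elapsed_seconds=float(entry["elapsed_seconds"]))
        for alarm in entry["alarms"]:
            result.alarm_locations.append((alarm["file"], int(alarm["line"])))
        result.alarm_count = len(result.alarm_locations)
        results[result.rule_id] = result
    return results


# first_results = load_analysis_results("report_old_version.json")    # first analysis results
# second_results = load_analysis_results("report_new_version.json")   # second analysis results
```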
In this embodiment, at the code repository end, on the basis of the code revision management specification of the version control tool, all code changes are submitted as independent minimum units together with revision notes in the specified format, which provides data support for the subsequent automated tests;
through the version-control-based code revision management specification for program analysis software, a set of test methods is formed that, based on the revision log and the comparison of the analysis results of the new and old versions, determines the scope of rules to be tested, classifies the test priorities, and filters and displays the comparison results.
in the embodiment, version management is performed on the program analysis software codes through the minimum code rule unit, the change contents in the new version and the old version of the codes are managed through the independent minimum unit, the comparison is performed on the analysis results preliminarily based on the comparison between the second analysis result and the first analysis result of the new version and the old version, the test range of the program analysis software is effectively reduced, the rules to be detected are more reasonably classified, the test emphasis of testers is more reasonable, and the overall test effect is improved.
Compared with the prior art, in which testers' manual testing combined with partial automated testing tools roughly treats the changed rules as key troubleshooting objects and all other unchanged rules as non-key troubleshooting objects, this method reduces manual work and further improves the automation of the whole testing process.
Optionally, for the non-key investigation content, the method further includes: if the corresponding minimum code rule unit belongs to the minimum code rule units that are changed compared with the first version, taking it as light investigation content; otherwise, not investigating it.
In the embodiment, the non-key investigation content is further divided, so that the invalid test work of the program analysis software code is further reduced, and the test efficiency is further improved.
In the embodiment, the program analysis software code is divided by taking 1 rule as a basic unit to obtain the independent minimum unit, so that the revision management efficiency of each new version of the program analysis software code is improved, the subsequent classification of the key points and non-key points of the independent minimum unit is facilitated, and the division of the search range in the new version of the program analysis software is facilitated.
Optionally, the minimum code rule units changed in the second version compared with the first version include additions, modifications and deletions of minimum code rule units, and the specific change types of a changed minimum code rule unit include: repairing false alarms, repairing missed reports, repairing performance, and code reconstruction only, with no impact on false alarms, missed reports or performance.
In this embodiment, the specific change types of a minimum code rule unit include: adding a minimum code rule unit, repairing its function (repairing false alarms and repairing missed reports), repairing its performance, code reconstruction only, and the like.
Optionally, in the method, obtaining a first analysis result and a second analysis result of each minimum code rule unit in each version includes:
obtaining first and second execution results obtained by executing the first and second versions of the program analysis software code based on a benchmark test set;
acquiring alarm quantity parameters, alarm occurrence position parameters and execution time consumption of each minimum code rule unit in the execution result based on the first execution result and the second execution result, wherein each alarm occurrence position parameter comprises the number of the minimum code rule unit where each alarm occurrence position is located;
and obtaining a first analysis result and a second analysis result of each minimum code rule unit in each version based on the alarm quantity parameter and the execution time consumption of each minimum code rule unit.
In this embodiment, by comparing the alarm quantity and execution time parameters in the execution results of the new version (second version) and the old version (first version) on the benchmark test set, it is determined which code of the new version of the program analysis software needs key examination, which needs only light examination and which needs no examination at all. This effectively narrows the test scope of the rule set that must be examined and divides the rule set within that scope more reasonably: minimum code rule units with abnormal comparison results are placed in the key investigation scope, and minimum code rule units with normal comparison results are placed in the non-key investigation scope. The testing workload of testers is thereby greatly reduced and test efficiency is improved. At the same time, testing human resources are allocated more reasonably, the test cycle is shortened, and the test effect and quality are improved.
In the above method, determining whether the second analysis result meets a preset normal result range corresponding to the minimum code rule unit and whether a change of the second analysis result compared with the first analysis result meets a normal change range caused by the specific change type includes:
(1) acquiring preset normal result range data of all minimum code rule units in the second version, wherein all minimum code rule units comprise newly-added minimum code rule units in the second version and original minimum code rule units in the first version;
(2) judging whether a second analysis result of the minimum code rule unit is matched with the preset normal result range data or not, if so, entering the step (3), and otherwise, entering the step (4);
(3) for a minimum code rule unit that is unchanged in the second version compared with the first version, entering step (6); for a minimum code rule unit that is changed in the second version compared with the first version, acquiring the specific change type of the change and the change data of the second analysis result compared with the first analysis result, and entering step (4) if the change data do not match the preset normal change range data corresponding to the specific change type, otherwise entering step (5);
(4) taking the minimum code rule unit as key investigation content;
(5) taking the minimum code rule unit as light investigation content;
(6) taking the minimum code rule unit as non-investigation content; an illustrative sketch of steps (1) to (6) is given below.
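The sketch assumes the `RuleAnalysisResult` structure from the earlier sketch; the two predicate callables stand in for the preset normal result range data and the preset normal change range data, and their names, like the `Investigation` labels, are illustrative only.

```python
from enum import Enum
from typing import Callable, Optional


class Investigation(Enum):
    KEY = "key investigation"
    LIGHT = "light investigation"
    NONE = "no investigation"


def classify_rule(rule_id: str,
                  second,                      # RuleAnalysisResult of the second version
                  first,                       # RuleAnalysisResult of the first version, or None
                  change_type: Optional[str],  # None if the rule unit was not changed
                  within_normal_result_range: Callable,
                  within_normal_change_range: Callable) -> Investigation:
    """Classify one minimum code rule unit of the second version, steps (1)-(6)."""
    # Step (2): does the second analysis result match the preset normal result range?
    if not within_normal_result_range(rule_id, second):
        return Investigation.KEY                       # step (4)
    # Step (3): unchanged rule units go straight to step (6); changed rule units are
    # checked against the normal change range of their specific change type.
    if change_type is None:
        return Investigation.NONE                      # step (6)
    if not within_normal_change_range(change_type, first, second):
        return Investigation.KEY                       # step (4)
    return Investigation.LIGHT                         # step (5)
```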
In this embodiment, the second analysis result of the second version is analyzed twice, in steps (2) and (3), so that all minimum code rule units in the program analysis software code are classified into effective investigation types more comprehensively and more reasonably. In the prior art, testers' manual testing combined with partial automated testing tools roughly treats the changed rules as the key investigation objects of the current test and all other unchanged rules as non-key objects, and the non-key rules still have to be examined against part of the software projects to verify that the changes have not introduced new problems. In this embodiment, according to the matching results of steps (2) and (3), the minimum code rule units that are unchanged in the second version are divided into key investigation content and non-investigation content, and the changed minimum code rule units are divided into key investigation content and light investigation content, which improves both the speed and the accuracy of classifying the investigation content into key and non-key parts.
The preset normal variation range data corresponding to the specific change type includes:
if the specific change type is repairing false alarms, the normal change range data are: the alarm quantity in the second analysis result is less than that in the first analysis result, the alarm information in the second analysis result is within the alarm information range of the first analysis result (that is, the first analysis result includes all the alarm information of the second analysis result, which can be determined by comparing the position information of all the alarm information in the two results), and the analysis time consumption of the second analysis result is not obviously different from that of the first analysis result;
if the specific change type is repairing missed reports, the normal change range data are: the alarm quantity in the second analysis result is greater than that in the first analysis result, the alarm information in the second analysis result completely covers the alarm information in the first analysis result, and the analysis time consumption of the second analysis result is not obviously different from that of the first analysis result;
if the specific change type is repairing performance, the normal change range data are: the analysis time consumption of the second analysis result is less than that of the first analysis result, and the alarm information of the second analysis result is consistent with that of the first analysis result;
if the specific change type is code reconstruction only, the normal change range data are: the analysis time consumption of the second analysis result is not obviously different from that of the first analysis result, and the alarm information of the second analysis result is consistent with that of the first analysis result;
if the specific change type is a mixture of repairing false alarms and repairing missed reports, the normal change range data are: the second analysis result contains alarm information not included in the first analysis result, and the first analysis result contains alarm information not included in the second analysis result (that is, the intersection of the alarm information of the two results is not empty, and their union is larger than the alarm information of either result alone);
if the specific change type is a mixture of multiple change types, the normal change range data are the combination of the normal change range data corresponding to each change type. A sketch of these checks as executable conditions is given below.
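The 15% relative tolerance used for "not obviously different" analysis times and the change-type labels in the sketch are assumptions of this example, not values prescribed by the embodiment.

```python
def no_obvious_time_difference(first, second, tolerance=0.15):
    """Treat a relative difference in analysis time within `tolerance` as
    'not obviously different' (the 15% threshold is an assumed value)."""
    if first.elapsed_seconds == 0:
        return second.elapsed_seconds == 0
    return abs(second.elapsed_seconds - first.elapsed_seconds) / first.elapsed_seconds <= tolerance


def within_normal_change_range(change_type, first, second):
    """Check a changed minimum code rule unit against the preset normal change
    range data of its specific change type (type labels are illustrative)."""
    old_alarms = set(first.alarm_locations)
    new_alarms = set(second.alarm_locations)
    if change_type == "fix-false-positive":
        return (second.alarm_count < first.alarm_count
                and new_alarms <= old_alarms
                and no_obvious_time_difference(first, second))
    if change_type == "fix-missed-report":
        return (second.alarm_count > first.alarm_count
                and new_alarms >= old_alarms
                and no_obvious_time_difference(first, second))
    if change_type == "fix-performance":
        return (second.elapsed_seconds < first.elapsed_seconds
                and new_alarms == old_alarms)
    if change_type == "refactor-only":
        return (new_alarms == old_alarms
                and no_obvious_time_difference(first, second))
    if change_type == "fix-false-positive+fix-missed-report":
        # Mixed repair: each result contains alarm information the other lacks.
        return bool(new_alarms - old_alarms) and bool(old_alarms - new_alarms)
    # Unknown or other mixed types: treat as not matching, i.e. key investigation.
    return False
```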
The revision log-based program analysis software automated testing method further includes:
and generating test report data based on the minimum code rule unit in the key investigation content, the light investigation content and the non-investigation content, wherein the test report comprises data in text and diagram forms.
In this embodiment, based on the classification of the minimum code rule units into the various types of investigation content, the problem type (that is, the specific mismatch that occurred in steps (2) and (3) above) and the details of the differing results are output, and a corresponding test report file and charts are generated, which is convenient for testers, developers and project managers to consult and track. Generating a simple, clear test report with one click allows research, development, design and product/project management personnel to take part in the test work, so that product problems are exposed earlier and more completely and repaired in time, which greatly strengthens the quality assurance of the program analysis software product.
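A sketch of such a one-click report, built from the classification produced by the earlier sketches, is shown below; the plain-text layout is illustrative only, and chart generation is omitted.

```python
from collections import Counter


def write_test_report(classification, path="revision_test_report.txt"):
    """Write a plain-text summary of key / light / no investigation content.

    `classification` maps a rule id to an Investigation value (see the earlier
    sketch); the report layout used here is illustrative only.
    """
    counts = Counter(verdict.value for verdict in classification.values())
    lines = ["Revision-log based test report", "=" * 32]
    for label in ("key investigation", "light investigation", "no investigation"):
        lines.append(f"{label}: {counts.get(label, 0)} rule unit(s)")
    lines.append("")
    for rule_id, verdict in sorted(classification.items()):
        lines.append(f"RULE-{rule_id}: {verdict.value}")
    with open(path, "w", encoding="utf-8") as fh:
        fh.write("\n".join(lines) + "\n")
    return path
```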
Based on the revision log-based program analysis software automated testing method provided by the embodiment of the application, the embodiment of the application correspondingly discloses a revision log-based program analysis software automated testing system, which comprises:
a minimum code rule unit acquisition unit, configured to acquire the minimum code rule units in the program analysis software code;
a new version revision log obtaining unit, configured to obtain a first version and a second version of the program analysis software code, and obtain a specific change type of a minimum code rule unit in the second version that is changed compared with the first version, where the second version is a modified version of the first version;
an analysis result obtaining unit, configured to execute a first version and a second version of the program analysis software code based on a benchmark test set, and obtain a first analysis result and a second analysis result of each minimum code rule unit in each version respectively;
the analysis result analysis unit is used for judging whether a second analysis result conforms to a preset normal result range corresponding to the minimum code rule unit or not and whether the change of the second analysis result compared with the first analysis result conforms to a normal change range caused by the specific change type or not;
and the investigation content classification unit is used for taking the corresponding minimum code rule unit as key investigation content when the output of the analysis result analysis unit is negative, and taking the corresponding minimum code rule unit as non-key investigation content when the judgment result is positive.
The specific principle and the execution process in the revision log based program analysis software automated testing system disclosed in the embodiment of the present application are the same as those in the revision log based program analysis software automated testing method disclosed in the embodiment of the present application, and reference may be made to corresponding parts in the revision log based program analysis software automated testing method disclosed in the embodiment of the present application, which are not described herein again.
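Purely as an illustration of how these units could fit together, the sketch below wires the helpers from the earlier sketches into a single system object; the constructor arguments and method names are assumptions of this example rather than a prescribed interface.

```python
class RevisionLogTestSystem:
    """Illustrative composition of the units of the automated test system."""

    def __init__(self, load_results, parse_revision_note,
                 classify_rule, write_report):
        self.load_results = load_results                # analysis result obtaining unit
        self.parse_revision_note = parse_revision_note  # new version revision log obtaining unit
        self.classify_rule = classify_rule              # analysis result analysis unit
        self.write_report = write_report                # investigation content classification / report

    def run(self, old_report, new_report, revision_notes,
            within_normal_result_range, within_normal_change_range):
        first = self.load_results(old_report)      # first analysis results per rule unit
        second = self.load_results(new_report)     # second analysis results per rule unit
        changes = {}                                # rule id -> specific change type
        for note in revision_notes:
            for rule_id, change_type in self.parse_revision_note(note):
                changes[rule_id] = change_type
        classification = {
            rule_id: self.classify_rule(rule_id, result, first.get(rule_id),
                                        changes.get(rule_id),
                                        within_normal_result_range,
                                        within_normal_change_range)
            for rule_id, result in second.items()
        }
        return self.write_report(classification)
```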
The present invention is not limited to the above-described embodiments; various modifications that those skilled in the art can make from the above conception without creative effort shall all fall within the protection scope of the present invention.
Claims (10)
1. A revision log-based program analysis software automated testing method is characterized by comprising the following steps:
acquiring a minimum code rule unit in the program analysis software code;
acquiring a first version and a second version of the program analysis software code, and acquiring a specific change type of a minimum code rule unit changed in the second version compared with the first version, wherein the second version is a modified version of the first version;
executing a first version and a second version of the program analysis software code based on a benchmark test set, and respectively obtaining a first analysis result and a second analysis result of each minimum code rule unit in each version;
judging whether a second analysis result conforms to a preset normal result range corresponding to the minimum code rule unit or not and whether the change of the second analysis result compared with the first analysis result conforms to a normal change range caused by the specific change type or not;
and when the judgment result is negative, taking the corresponding minimum code rule unit as key investigation content, and when the judgment result is positive, taking the corresponding minimum code rule unit as non-key investigation content.
2. The revision log-based program analysis software automation test method according to claim 1, further comprising, for the non-key investigation content, the steps of: if the corresponding minimum code rule unit belongs to the minimum code rule units that are changed compared with the first version, taking it as light investigation content; otherwise, not investigating it.
3. The revision log-based automated program analysis software testing method according to claim 2, wherein the minimum code rule unit is obtained by dividing the program analysis software code by 1 rule as a basic unit.
4. The revision log-based program analysis software automation test method according to claim 3, wherein the minimum code rule units changed in the second version compared with the first version include additions, modifications and deletions of minimum code rule units, and the specific change types of a changed minimum code rule unit include: repairing false alarms (false positives), repairing missed reports (false negatives), repairing performance, and code reconstruction only, with no impact on false alarms, missed reports or performance.
5. The revision log-based program analysis software automation test method according to claim 4, wherein the obtaining of the first analysis result and the second analysis result of each minimum code rule unit in each version comprises:
obtaining first and second execution results obtained by executing the first and second versions of the program analysis software code based on a benchmark test set;
acquiring alarm quantity parameters, alarm occurrence position parameters and execution time consumption of each minimum code rule unit in the execution results based on the first execution results and the second execution results;
and obtaining a first analysis result and a second analysis result of each minimum code rule unit in each version based on the alarm quantity parameter and the execution time consumption of each minimum code rule unit.
6. The revision log-based program analysis software automation test method according to claim 5, wherein each alarm occurrence location parameter includes a number of a minimum code rule unit in which each alarm occurrence location is located.
7. The method of claim 5, wherein determining whether the second analysis result meets a predetermined normal result range corresponding to the minimum code rule unit and whether the variation of the second analysis result compared with the first analysis result meets a normal variation range caused by the specific change type comprises:
(1) acquiring preset normal result range data of all minimum code rule units in the second version;
(2) judging whether a second analysis result of the minimum code rule unit is matched with the preset normal result range data or not, if so, entering the step (3), and otherwise, entering the step (4);
(3) for a minimum code rule unit that is unchanged in the second version compared with the first version, entering step (6); for a minimum code rule unit that is changed in the second version compared with the first version, acquiring the specific change type of the change and the change data of the second analysis result compared with the first analysis result, and entering step (4) if the change data do not match the preset normal change range data corresponding to the specific change type, otherwise entering step (5);
(4) taking the minimum code rule unit as key investigation content;
(5) taking the minimum code rule unit as light investigation content;
(6) and taking the minimum code rule unit as the non-investigation content.
8. The revision log-based program analysis software automation test method according to claim 7, wherein the preset normal variation range data corresponding to the specific change type includes:
if the specific change type is repairing false alarms, the normal change range data are: the alarm quantity in the second analysis result is less than that in the first analysis result, the alarm information in the second analysis result is within the alarm information range of the first analysis result, and the analysis time consumption of the second analysis result is not obviously different from that of the first analysis result;
if the specific change type is repairing missed reports, the normal change range data are: the alarm quantity in the second analysis result is greater than that in the first analysis result, the alarm information in the second analysis result completely covers the alarm information in the first analysis result, and the analysis time consumption of the second analysis result is not obviously different from that of the first analysis result;
if the specific change type is repairing performance, the normal change range data are: the analysis time consumption of the second analysis result is less than that of the first analysis result, and the alarm information of the second analysis result is consistent with that of the first analysis result;
if the specific change type is code reconstruction only, the normal change range data are: the analysis time consumption of the second analysis result is not obviously different from that of the first analysis result, and the alarm information of the second analysis result is consistent with that of the first analysis result;
if the specific change type is a mixture of repairing false alarms and repairing missed reports, the normal change range data are: the second analysis result contains alarm information not included in the first analysis result, and the first analysis result contains alarm information not included in the second analysis result;
if the specific change type is a mixture of multiple change types, the normal change range data are the combination of the normal change range data corresponding to each change type.
9. The revision log-based program analysis software automation test method according to claim 2, further comprising:
and generating test report data based on the minimum code rule unit in the key investigation content, the light investigation content and the non-investigation content, wherein the test report comprises data in text and diagram forms.
10. A revision log-based program analysis software automation test system, comprising:
a minimum code rule unit acquisition unit, configured to acquire the minimum code rule units in the program analysis software code;
a new version revision log obtaining unit, configured to obtain a first version and a second version of the program analysis software code, and obtain a specific change type of a minimum code rule unit in the second version that is changed compared with the first version, where the second version is a modified version of the first version;
an analysis result obtaining unit, configured to execute a first version and a second version of the program analysis software code based on a benchmark test set, and obtain a first analysis result and a second analysis result of each minimum code rule unit in each version respectively;
the analysis result analysis unit is used for judging whether a second analysis result conforms to a preset normal result range corresponding to the minimum code rule unit or not and whether the change of the second analysis result compared with the first analysis result conforms to a normal change range caused by the specific change type or not;
and the investigation content classification unit is used for taking the corresponding minimum code rule unit as key investigation content when the output of the analysis result analysis unit is negative, and taking the corresponding minimum code rule unit as non-key investigation content when the judgment result is positive.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110844566.2A CN113641573B (en) | 2021-07-26 | 2021-07-26 | Program analysis software automatic test method and system based on revision log |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113641573A true CN113641573A (en) | 2021-11-12 |
CN113641573B CN113641573B (en) | 2024-05-07 |
Family
ID=78418319
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110844566.2A Active CN113641573B (en) | 2021-07-26 | 2021-07-26 | Program analysis software automatic test method and system based on revision log |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113641573B (en) |
- 2021-07-26: CN CN202110844566.2A patent/CN113641573B/en, status Active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090187788A1 (en) * | 2008-01-17 | 2009-07-23 | International Business Machines Corporation | Method of automatic regression testing |
CN103425572A (en) * | 2012-05-24 | 2013-12-04 | 腾讯科技(深圳)有限公司 | Code analyzing method and system |
CN103309804A (en) * | 2013-04-08 | 2013-09-18 | 中国电子科技集团公司第十研究所 | Automatic code rule checking platform |
CN105302710A (en) * | 2014-07-03 | 2016-02-03 | 腾讯科技(深圳)有限公司 | Method and apparatus for determining test case in need of regression testing |
CN109857431A (en) * | 2019-01-11 | 2019-06-07 | 平安科技(深圳)有限公司 | Code revision method and device, computer-readable medium and electronic equipment |
CN110245081A (en) * | 2019-05-31 | 2019-09-17 | 厦门美柚信息科技有限公司 | Generate the method and device of minimum test scope |
CN110389896A (en) * | 2019-06-18 | 2019-10-29 | 中国平安人寿保险股份有限公司 | Code automated analysis and test method, device and computer readable storage medium |
Non-Patent Citations (1)
Title |
---|
WANG, Quan: "Standardized Self-Testing Method for Embedded Software" (规范化嵌入式软件自测试方法), Computer Engineering and Design (计算机工程与设计), No. 10 *
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023103690A1 (en) * | 2021-12-08 | 2023-06-15 | 华为云计算技术有限公司 | Unit testing generation method and apparatus, and related device |
Also Published As
Publication number | Publication date |
---|---|
CN113641573B (en) | 2024-05-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Karampatsis et al. | How often do single-statement bugs occur? the manysstubs4j dataset | |
US8312440B2 (en) | Method, computer program product, and hardware product for providing program individuality analysis for source code programs | |
Islam et al. | Bug replication in code clones: An empirical study | |
CN103092761A (en) | Method and device of recognizing and checking modifying code blocks based on difference information file | |
Mondal et al. | Bug propagation through code cloning: An empirical study | |
Yang et al. | Vuldigger: A just-in-time and cost-aware tool for digging vulnerability-contributing changes | |
Islam et al. | A comparative study of software bugs in micro-clones and regular code clones | |
CN117951128A (en) | Data quality inspection method based on artificial intelligence | |
CN112131116A (en) | Automatic regression testing method for embedded software | |
CN106933572B (en) | Measurement model based on LLVM intermediate representation program slice | |
CN111475408A (en) | Quality management method based on code inspection tool | |
CN113641573B (en) | Program analysis software automatic test method and system based on revision log | |
CN114490413A (en) | Test data preparation method and device, storage medium and electronic equipment | |
CN117851233A (en) | Software vulnerability reproduction method based on large language model | |
Yan et al. | Revisiting the correlation between alerts and software defects: A case study on myfaces, camel, and cxf | |
CN113791980A (en) | Test case conversion analysis method, device, equipment and storage medium | |
CN111309629A (en) | Processing and analyzing system for testing multiple items | |
CN111552639A (en) | Software test comprehensive control method and system | |
CN110990281B (en) | Automatic static analysis method | |
CN118445214B (en) | Code file modification-based method, device, equipment, medium and product for obtaining measurement | |
Gao et al. | Research on the causes of false positives in source code detection | |
CN118151941A (en) | Compiling optimization method, device, equipment and medium of electric power Internet of things operating system | |
CN118069465A (en) | Compliance detection system and method for software engineering development process | |
Rodríguez-Pérez et al. | What if a Bug has a Different Origin? Making Sense of Bugs Without an Explicit Bug Introducing Commit | |
CN118796660A (en) | Test method, test device, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |