CN108021498A - Test workload distribution method based on software reliability prediction - Google Patents

Test workload distribution method based on software reliability prediction

Info

Publication number
CN108021498A
CN108021498A
Authority
CN
China
Prior art keywords
defect
version
software
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610977069.9A
Other languages
Chinese (zh)
Inventor
周毓明
冯义洋
卢红敏
徐宝文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University
Original Assignee
Nanjing University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University filed Critical Nanjing University
Priority to CN201610977069.9A priority Critical patent/CN108021498A/en
Publication of CN108021498A publication Critical patent/CN108021498A/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/368Test management for test version control, e.g. updating test cases to a new software version
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Stored Programmes (AREA)

Abstract

The present invention provides a test workload distribution method based on software reliability prediction, comprising the following steps: 1) collection and processing of the software data set; 2) construction of a software defect prediction model; 3) construction of a software defect discovery model; 4) estimation of the V0-version parameters of the software defect discovery model; 5) estimation of the V1-version parameters of the software defect discovery model; 6) optimal test workload distribution for software version V1. The invention is a test workload allocation scheme that solves the problem of detecting the defects in a system with the greatest benefit when test resources are limited. The scheme makes full use of the information of the prior software version to allocate the test workload of the current version of the system, so that the largest number of accumulated defects can finally be found.

Description

Test workload distribution method based on software reliability growth model
Technical Field
The invention belongs to the technical field of software testing, and particularly relates to a test workload distribution method based on a software reliability growth model.
Background
Software testing is an exploratory activity aimed at helping software practitioners assess the quality of the software under test. Testing runs throughout the software development process, occupies a large share of a project's schedule, and requires substantial human and material resources. To ensure software quality, many techniques have been proposed to predict which modules are prone to defects; most of them select software modules for testing based on their defect probability, expected defect count, or defect density. However, the final goal, namely reducing the test workload needed to improve software quality, has received little study.
Distributing the test workload across the modules of a piece of software has always been an important part of the testing process. When test resources are limited, effectively identifying in advance which modules are defect-prone greatly helps workload distribution; the choice of distribution strategy directly affects the test effort, and a poor strategy can even cause the test workload to increase substantially. Traditional test workload distribution schemes have the following shortcomings: 1. few distribution schemes exist; 2. the criteria for selecting the modules to be tested are unclear; 3. there is no evaluation standard for comparing distribution strategies. Traditional methods increasingly fail to meet practical needs, so a distribution scheme is urgently required that, under limited test resources, fully exploits the information of the previous software version to distribute the test workload of the current version so that the maximum number of accumulated defects can finally be found.
Disclosure of Invention
The invention aims to provide a test workload distribution method based on a software reliability growth model. It solves the problem of detecting the defects in a system with the greatest benefit under limited test resources, thereby greatly improving the efficiency of software testing and better controlling product quality.
In order to achieve the above object, the present invention provides a method for distributing test workload based on a software reliability growth model. The method comprises the following steps:
1) Acquisition and processing of the software data set by means of defect patch reports. For software version V0, the data to be collected include defect information, test cases (collected and processed), and basic metrics; for software version V1, basic metrics, the total test workload to be distributed, etc.
2) Constructing a software defect prediction model. After step 1), the metric information of each module of V0 and V1 is organized into feature vectors, the defect count is used as the target variable, and a defect prediction model is established with prediction techniques such as random forest, linear regression, or support vector machines (trained on V0 and used to predict for V1);
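As a concrete illustration of step 2), the sketch below fits one of the prediction techniques the text names (linear regression, via ordinary least squares in plain Python) on hypothetical V0 module metrics and predicts defect counts for V1 modules. All metric names and numbers are invented for illustration; in practice a random forest or support vector machine from a machine-learning library could be substituted.

```python
def fit_linear_regression(X, y):
    """Ordinary least squares via the normal equations (X^T X) w = X^T y,
    with a bias column of ones appended; solved by Gaussian elimination."""
    Xb = [list(row) + [1.0] for row in X]
    n = len(Xb[0])
    A = [[sum(r[i] * r[j] for r in Xb) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * yk for r, yk in zip(Xb, y)) for i in range(n)]
    for col in range(n):                      # forward elimination with pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    w = [0.0] * n                             # back substitution
    for i in range(n - 1, -1, -1):
        w[i] = (b[i] - sum(A[i][j] * w[j] for j in range(i + 1, n))) / A[i][i]
    return w

def predict(w, x):
    return sum(wi * xi for wi, xi in zip(w, list(x) + [1.0]))

# Hypothetical V0 training data: [new/modified KLOC, cyclomatic complexity] -> defects
X_v0 = [[1.2, 10], [3.5, 25], [0.8, 5], [2.0, 18], [4.1, 30]]
y_v0 = [3, 9, 1, 5, 11]
w = fit_linear_regression(X_v0, y_v0)

# Predicted defect counts p_i for hypothetical V1 modules (clamped at zero)
X_v1 = [[2.5, 20], [1.0, 8]]
p_v1 = [max(0.0, predict(w, x)) for x in X_v1]
```

These predicted values play the role of the p_i values used later when estimating the initial defect numbers of modules that are new in V1.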
3) Constructing a software defect discovery model. The model estimates the number of defects discoverable for a given amount of test resources. The model used here is an extended exponential reliability growth model, also called the Goel-Okumoto model (Go model for short), chosen because it is the simplest NHPP model and has a constant defect detection rate at any test time, which makes its parameter estimation much easier than that of other SRGMs. The model represents the relationship between the test workload and the accumulated detected defects; the calculation formula is as follows:
Ĥ_i(t_i) = a_i * [1 - exp(-b_i * t_i)], b_i = b_0 / S_i (3.1)
where Ĥ_i(t_i) is the number of defects that can finally be found in module m_i given a certain test workload, t_i is the test workload assigned to m_i, b_i is the number of defects discoverable per unit test workload, a_i is the initial defect number of m_i, S_i is the size of m_i, and b_0 is a constant.
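The extended Go model can be exercised directly; the minimal sketch below evaluates the relation Ĥ_i(t_i) = a_i * (1 - exp(-b_i * t_i)) with b_i = b_0 / S_i for a single module (all numbers assumed for illustration):

```python
import math

def expected_defects_found(a_i, b_0, s_i, t_i):
    """Extended Goel-Okumoto model: expected cumulative defects found in
    module m_i after test workload t_i, with detection rate b_i = b_0 / S_i."""
    b_i = b_0 / s_i
    return a_i * (1.0 - math.exp(-b_i * t_i))

# Hypothetical module: a_i = 20 initial defects, size S_i = 2 KLOC, b_0 = 0.5
h8 = expected_defects_found(a_i=20.0, b_0=0.5, s_i=2.0, t_i=8.0)  # about 17.3
```

The curve rises monotonically with t_i and saturates at a_i, matching the intuition that additional workload yields diminishing returns on a nearly exhausted module.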
4) V0-version parameter estimation for the software defect discovery model. The V0-version parameters a_i and b_0 must be estimated from the existing data; the calculation formula is as follows:
a_i = H_i + R_1 * S_i^new / 1000 + R_2 * S_i^reused / 1000 (4.1)
where H_i is the number of defects actually detected in module m_i, S_i^new is the number of new/modified code lines of m_i, and S_i^reused is the number of reused code lines of m_i. R_1 and R_2 are residual defect rates: in new/modified code, R_1 defects still remain per 1 KLOC; in reused code, R_2 defects still remain per 1 KLOC.
b_0 is estimated next. It denotes the number of defects detectable per unit of test workload and is derived from the actually assigned test workload and the number of detected defects, as follows:
H_total = a_total * [1 - exp(-(b_0 / S_total) * t_total)] (4.2)
thus, there are:
b_0 = -S_total / t_total * log(1 - H_total / a_total) (4.3)
where S_total is the total size of all modules, t_total is the total actual test workload, H_total is the total number of actually detected defects, and a_total is the sum of the a_i values of all modules.
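The a_i estimate and the b_0 derivation of equations (4.2)-(4.3) combine into a small routine; because (4.3) is just (4.2) solved for b_0, the round trip reproduces H_total exactly. The module data and residual defect rates below are hypothetical:

```python
import math

def estimate_a_i(h_i, s_new, s_reused, r1, r2):
    """a_i = H_i + R_1 * S_new/1000 + R_2 * S_reused/1000
    (line counts in LOC, residual defect rates per KLOC)."""
    return h_i + r1 * s_new / 1000.0 + r2 * s_reused / 1000.0

def estimate_b0(s_total, t_total, h_total, a_total):
    """b_0 = -S_total / t_total * log(1 - H_total / a_total)."""
    return -s_total / t_total * math.log(1.0 - h_total / a_total)

# Hypothetical V0 modules: (H_i, new/modified LOC, reused LOC, size S_i in KLOC)
modules = [(12, 3000, 1000, 4.0), (4, 500, 2000, 2.5), (7, 1500, 500, 2.0)]
r1, r2 = 2.0, 0.5  # assumed residual defect rates per KLOC

a = [estimate_a_i(h, sn, sr, r1, r2) for h, sn, sr, _s in modules]
a_total = sum(a)                      # 18.5 + 6.0 + 10.25 = 34.75
s_total = sum(m[3] for m in modules)  # 8.5
h_total = sum(m[0] for m in modules)  # 23
b0 = estimate_b0(s_total, t_total=100.0, h_total=h_total, a_total=a_total)
```

Substituting b0 back into the total-level Go model recovers h_total, which is a quick sanity check on any implementation of this step.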
5) V1-version parameter estimation for the software defect discovery model. Because the actual defect count of version V1 is unknown, the method of step 4) cannot be used to estimate the V1-version parameters; instead they are estimated from the information of the previous version V0. Specifically, b_0 of V1 takes the same value as in V0, and the initial defect number a_i of each module m_i in V1 is estimated as follows:
case 1: m_i exists in V0.
A. If defects were detected in m_i in V0 (H_i > 0), suppose k defects of m_i in V0 have been repaired in V1; then:
a_i(v_i, m_i) = a_i(v_{i-1}, m_i) + R_1 * S_i^new / 1000 + R_2 * S_i^reused / 1000 - k (5.1)
B. If no defects were detected (H_i = 0), m_i has practically no defects, so the a_i value of m_i in V1 is reduced:
a_i(v_i, m_i) = a_i(v_{i-1}, m_i) / 10 (5.2)
case 2: m_i does not exist in V0, i.e. it is a module newly added in V1. The calculation is as follows:
a_i(v_i, m_i) = ApplyModel(model, m_i) + R_1 * S_i / 1000
where model is the defect prediction model established on V0, and ApplyModel(model, m_i) is the number of defects of m_i predicted by applying that model, used in place of the actually detected defect count.
6) Optimal test workload distribution for software version V1. Given the a_i and b_i values of each module m_i in V1 and the test workload t_total to be allocated, the optimal test workload distribution strategy of this version under the Go model can be obtained; allocating to each m_i the corresponding workload according to this strategy and substituting it into the defect discovery model, the maximum number of accumulated defects can finally be discovered. The calculation formulas are as follows:
A_i = a_i * b_i (i = 1, 2, ..., M) (6.1)
A_1 ≥ A_2 ≥ ... ≥ A_{k-1} ≥ A_k ≥ ... ≥ A_M (6.2)
Next, the optimal parameter λ* is selected from {λ_1, λ_2, ..., λ_M}:
(1) set k = 1;
(2) calculate λ_k from equation (6.3):
λ_k = exp[(Σ_{i=1..k} ln(A_i)/b_i - t_total) / (Σ_{i=1..k} 1/b_i)] (6.3)
(3) if A_k ≥ λ_k ≥ A_{k+1}, then λ* = λ_k (stop); otherwise set k = k + 1 and return to step (2).
Finally, t_i* can be calculated as follows:
t_i* = ln(A_i / λ*) / b_i for the k modules with A_i ≥ λ*, and t_i* = 0 otherwise (6.4)
where t_i* is the final test workload allocated to module m_i;
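The selection of λ* and the final allocation can be sketched as below. The λ_k formula and the t_i* expression used here follow the standard Lagrange-multiplier solution for maximizing the total Go-model defect count under a fixed workload budget, which is consistent with the stated stopping rule A_k ≥ λ_k ≥ A_{k+1}; the module data are hypothetical:

```python
import math

def allocate_workload(a, b, t_total):
    """Optimal Go-model test workload allocation: sort modules by
    A_i = a_i * b_i, search for the threshold lambda*, then give the top-k
    modules t_i = ln(A_i / lambda*) / b_i and all remaining modules zero."""
    M = len(a)
    order = sorted(range(M), key=lambda i: a[i] * b[i], reverse=True)
    A = [a[i] * b[i] for i in order]
    lam = None
    for k in range(1, M + 1):
        inv_b = sum(1.0 / b[order[i]] for i in range(k))
        lam_k = math.exp((sum(math.log(A[i]) / b[order[i]] for i in range(k))
                          - t_total) / inv_b)
        nxt = A[k] if k < M else 0.0
        if A[k - 1] >= lam_k > nxt:  # stopping rule A_k >= lambda_k >= A_{k+1}
            lam = lam_k
            break
    t = [0.0] * M
    for i in range(k):  # a feasible k always exists for t_total > 0
        t[order[i]] = math.log(A[i] / lam) / b[order[i]]
    return t

# Hypothetical V1 modules
a = [17.0, 0.6, 7.2]   # estimated initial defect numbers a_i
b = [0.1, 0.2, 0.25]   # per-module detection rates b_i = b_0 / S_i
t = allocate_workload(a, b, t_total=30.0)  # entire budget spent; module 1 gets 0
```

By construction the budget is spent exactly, and every funded module ends with the same marginal defect-discovery rate a_i * b_i * exp(-b_i * t_i) = λ*, which is why no reshuffling of workload can improve the total.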
further, the specific steps of the step 1) are as follows:
step 1) -1: an initial state;
step 1) -2: collect the defect information of V0 and count the actual number of defects of each module;
step 1) -3: process the test cases of V0 and count the actual test workload of V0; for each defect of V0, manually ensure that there is a test case that can trigger it; specifically, if no test case that triggers a given defect can be found, that defect is excluded from the statistics;
step 1) -4: count the code metric and process metric information of each module of V0;
step 1) -5: count the code metric and process metric information of each module of V1, and initialize the available test workload value of V1;
step 1) -6: the acquisition and processing of the software data set is finished.
Further, the specific steps of the step 2) are as follows:
step 2) -1: an initial state;
step 2) -2: organize the metric information of each module of V0 and V1 into feature vectors, with the defect count as the target variable;
step 2) -3: establish a prediction model with V0 as the training set;
step 2) -4: with V1 as the test set, output a defect prediction value for each module in V1;
step 2) -5: evaluating the prediction result;
step 2) -6: completing the construction of a software defect prediction model;
further, the specific steps of the step 3) are as follows:
step 3) -1: an initial state;
step 3) -2: selecting a Go model as a defect discovery model;
step 3) -3: carrying out parameter expansion on the Go model;
step 3) -4: completing the construction of a software defect discovery model;
further, the specific steps of the step 4) are as follows:
step 4) -1: an initial state;
step 4) -2: set the values of the residual defect rates R_1 and R_2;
step 4) -3: estimate the value of the parameter a_i for each module m_i in version V0;
step 4) -4: estimate the value of b_0 for version V0;
step 4) -5: the V0-version parameter estimation of the software defect discovery model is finished;
further, the specific steps of the step 5) are as follows:
step 5) -1: an initial state;
step 5) -2: read the a_i value and the defect prediction value p_i of each module m_i of version V0;
step 5) -3: judge whether each module M_i of version V1 exists in version V0; if so, execute step 5) -4, otherwise execute step 5) -5;
step 5) -4: read the defect repair count k of module M_i of version V1 relative to module m_i of version V0 together with the parameter a_i of m_i, and estimate the parameter a_i of module M_i of version V1 accordingly;
step 5) -5: read the defect prediction value p_i of module M_i of version V1 and estimate the parameter a_i of M_i accordingly;
step 5) -6: read the b_0 value of version V0 and the S_i value of each module M_i of version V1, and estimate the parameter b_i of each module M_i accordingly;
step 5) -7: the V1-version parameter estimation of the software defect discovery model is finished;
further, the specific steps of the step 6) are as follows:
step 6) -1: an initial state;
step 6) -2: read the a_i and b_i values of each module m_i of version V1;
step 6) -3: calculate the A_i value of each module m_i of version V1 and sort all modules by A_i in descending order;
step 6) -4: let k = 1;
step 6) -5: calculate the value of λ_k according to equation (6.3);
step 6) -6: if A_k ≥ λ_k ≥ A_{k+1}, then λ* = λ_k and step 6) -7 is executed; otherwise let k = k + 1 and return to step 6) -5;
step 6) -7: calculate the t_i* value for each module m_i of version V1;
step 6) -8: allocate a test workload to each module m_i of version V1 according to its t_i* value;
step 6) -9: the optimal test workload distribution of software version V1 is finished;
the invention relates to a test workload distribution method based on a software reliability growth model, which uses methods such as random forest, linear regression, support vector machine and the like to establish a prediction model, uses a Go model as a defect discovery model, and uses V in the parameter estimation process of a cross-version defect discovery model0Go model parameter of version to V1Estimating the parameters of the version, applying the estimated parameters to a defect discovery model to finally obtain V1The method further improves the software testing work efficiency, so that the most optimal testing workload distribution scheme of the version can be found under the condition of limited testing resourcesAnd the defects are more, so that the quality of the product is better controlled.
Drawings
Fig. 1 is a flowchart of a test workload distribution method based on a software reliability growth model according to an embodiment of the present invention.
FIG. 2 is a flow chart of the collection and processing of the software data set of FIG. 1.
FIG. 3 is a flow chart of the construction of the software defect prediction model of FIG. 1.
Fig. 4 is a flowchart of the construction of the software defect discovery model in fig. 1.
FIG. 5 is a flowchart of the V0-version parameter estimation of the software defect discovery model in FIG. 1.
FIG. 6 is a flowchart of the V1-version parameter estimation of the software defect discovery model in FIG. 1.
FIG. 7 is a flowchart of the optimal test workload distribution for software version V1 in FIG. 1.
Detailed Description
In order to better understand the technical content of the present invention, specific embodiments are described below with reference to the accompanying drawings.
Fig. 1 is a flowchart of a test workload distribution method based on a software reliability growth model according to an embodiment of the present invention.
A test workload distribution method based on a software reliability growth model comprises the following steps:
S101: acquisition and processing of the software data set. To distribute the test workload for the current system V1, the information of the previous version V0 must be collected, specifically defect information, test case processing statistics, and metrics; the information of V1 is also collected, specifically metrics, the setting of the available test workload, etc.;
S102: construction of the software defect prediction model. After step 1), the metric information of each module of V0 and V1 is organized into feature vectors with the defect count as the target variable, and a prediction model such as random forest, linear regression, or support vector machine is trained on V0 and used to predict for V1;
S103: construction of the software defect discovery model, which estimates the number of discoverable defects for a given test resource. The model used is the Go model: it is the simplest NHPP model and has a constant defect detection rate at any test time, which makes its parameter estimation easier than that of other SRGMs;
S104: V0-version parameter estimation for the software defect discovery model; a_i and b_0 must be estimated from the existing acceptance-test data;
S105: V1-version parameter estimation for the software defect discovery model; because the actual defect count of version V1 is unknown, the method of step 4) cannot be used to estimate the V1-version parameters, so they are estimated from the information of V0;
S106: optimal test workload distribution for software V1. For the Go model, given the a_i and b_i of each module m_i and the test workload t_total to be allocated for version V1, the optimal test workload distribution strategy can be obtained; allocating to each m_i the corresponding workload according to this strategy and substituting it into the defect discovery model, the most defects can finally be found;
FIG. 2 is a flowchart of the collection and processing of the software data set in FIG. 1. To distribute the test workload for the current system V1, the information of the previous version V0 must be collected, specifically defect information, test case processing, and metrics; the information of V1 is also collected, specifically metrics, the setting of the available test workload, etc. The specific steps are as follows:
step 1: initial state; step 2: collect the defect information of V0 and count the actual number of defects of each module; step 3: process the test cases of V0 to ensure that V0 can run; for each defect of V0, manually ensure that there is a test case that can trigger it; specifically, if no test case that triggers a given defect can be found, that defect is excluded from the statistics; step 4: count the code metric and process metric information of each module of V0; step 5: count the code metric and process metric information of each module of V1 and initialize the available test workload value of V1; step 6: the collection and processing of the software data set are finished;
FIG. 3 is a flowchart of the construction of the software defect prediction model in FIG. 1. After step 1), the metric information of each module of V0 and V1 is organized into feature vectors with the defect count as the target variable, and a prediction model such as random forest, linear regression, or support vector machine is trained on V0 and used to predict for V1. The specific steps are as follows:
step 1: initial state; step 2: organize the metric information of each module of V0 and V1 into feature vectors with the defect count as the target variable; step 3: establish a prediction model with V0 as the training set; step 4: with V1 as the test set, output a defect prediction value for each module in V1; step 5: evaluate the prediction results; step 6: the construction of the software defect prediction model is finished;
fig. 4 is a flow chart of the construction of the software defect discovery model of fig. 1, which can estimate the number of discoverable defects with respect to a given test resource, and the model we use is the Go model, which is the simplest NHPP model, with a constant defect detection rate at any test time, which means that the parameter estimation of the model is much easier than other SRGMs. The method comprises the following specific steps:
step 1: an initial state; step 2: selecting a Go model as a defect discovery model; and step 3: expanding a Go model; and 4, step 4: and finishing the construction of the software defect finding model.
FIG. 5 is a flowchart of the V0-version parameter estimation of the software defect discovery model in FIG. 1. The V0-version parameters a_i and b_0 are estimated from the existing data of version V0. The specific steps are as follows:
step 1: initial state; step 2: set the values of the residual defect rates R_1 and R_2; step 3: estimate the value of the parameter a_i for each module m_i in version V0; step 4: estimate the value of b_0 for version V0; step 5: the V0-version parameter estimation of the software defect discovery model is finished.
FIG. 6 is a flowchart of the V1-version parameter estimation of the software defect discovery model in FIG. 1. Because the actual defect count of version V1 is unknown, the method of step 4) cannot be used to estimate the V1-version parameters, so they are estimated from the information of V0. The specific steps are as follows:
step 1: initial state; step 2: read the a_i value and the defect prediction value p_i of each module m_i of version V0; step 3: judge whether each module M_i of version V1 exists in version V0; if so, execute step 4, otherwise execute step 5; step 4: read the defect repair count k of module M_i of version V1 relative to module m_i of version V0 together with the parameter a_i of m_i, and estimate the parameter a_i of module M_i of version V1 accordingly; step 5: read the defect prediction value p_i of module M_i of version V1 and estimate the parameter a_i of M_i accordingly; step 6: read the b_0 value of version V0 and the S_i value of each module M_i of version V1, and estimate the parameter b_i of each module M_i accordingly; step 7: the V1-version parameter estimation of the software defect discovery model is finished.
FIG. 7 is a flowchart of the optimal test workload distribution for software version V1 in FIG. 1. For the Go model, given the a_i and b_i of each module m_i and the test workload t_total to be allocated for version V1, the optimal test workload distribution strategy can be obtained; allocating to each m_i the corresponding workload according to this strategy and substituting it into the defect discovery model, the most defects can finally be found. The specific steps are as follows:
step 1: initial state; step 2: read the a_i and b_i values of each module m_i of version V1; step 3: calculate the A_i value of each module m_i of version V1 and sort all modules by A_i in descending order; step 4: calculate the value of λ*; step 5: calculate the t_i* value for each module m_i of version V1; step 6: allocate a test workload to each module m_i of version V1 according to its t_i* value; step 7: the optimal test workload distribution of version V1 is finished.
In summary, the invention solves the problem of testing out the defects in a system with the greatest benefit under limited test resources: first, a defect prediction model is used to predict defect counts; then cross-version parameter estimation is carried out with the help of the prediction results; finally, the optimal test workload distribution of the defect discovery model under the estimated parameters is solved.

Claims (7)

1. A test workload distribution method based on a software reliability growth model, characterized in that the data sets of the old version (V0) and the new version (V1) of the software are collected and processed by means of defect patch reports; then a defect prediction model is constructed with defect prediction techniques such as random forest, linear regression, and support vector machines (V0 as the training set, V1 as the test set); then the Goel-Okumoto model, one of the software reliability growth models, is used to build software defect discovery models independently for version V0 and version V1, and the parameters of the defect discovery models of the two versions are estimated respectively; finally, an optimal test workload (test case) allocation scheme for software version V1 is obtained; the method comprises the following steps:
1) Collecting and processing the software data set; for software version V0, the data to be collected include defect information, test cases (collected and processed), and basic metrics; for software version V1, basic metrics, the total test workload to be distributed, etc.;
2) constructing a software defect prediction model;
definition 1: the software defect prediction model comprises the following three key attributes: description, input, and output;
description: the defect prediction model reads a software training data set for training and tests on a software test set; common prediction models include random forest, linear regression, support vector machines, etc.;
input: code metrics, process metrics, and other information of the software;
output: the number of defects expected to occur in the software;
using the software data set collected and processed in step 1), version V0 and version V1 are taken as the training set and the test set respectively and organized according to the description/input/output format to carry out the software defect count prediction;
3) constructing a software defect discovery model;
definition 1: the software defect discovery model estimates the number of defects discoverable for a given test resource; the model used is an extended exponential reliability growth model, also called the Goel-Okumoto model (hereinafter the Go model); it has a constant defect detection rate at any test time and represents the relationship between the test workload and the accumulated detected defects; the calculation formula is as follows:
Ĥ_i(t_i) = a_i * [1 - exp(-b_i * t_i)], b_i = b_0 / S_i
where Ĥ_i(t_i) is the number of defects that can finally be found in module m_i given a certain test workload, t_i is the test workload assigned to m_i, b_i is the number of defects discoverable per unit test workload, a_i is the initial defect number of m_i, S_i is the size of m_i, and b_0 is a constant estimated in step 4).
4) V0-version parameter estimation for the software defect discovery model;
definition 1: the calculation formula of the V0-version parameter a_i of the software defect discovery model is as follows:
a_i = H_i + R_1 * S_i^new / 1000 + R_2 * S_i^reused / 1000
where H_i is the number of defects actually detected in module m_i, S_i^new is the number of added/modified code lines of m_i, and S_i^reused is the number of reused code lines of m_i. R_1 and R_2 are residual defect rates: in added/modified code, R_1 defects still remain per 1 KLOC; in reused code, R_2 defects still remain per 1 KLOC.
definition 2: the parameter b_0 denotes the number of defects detectable per unit of test workload; it is derived from the actually assigned test workload and the number of detected defects, and the calculation formula is as follows:
b_0 = -S_total / t_total * log(1 - H_total / a_total)
where S_total is the total size of all modules, t_total is the total actual test workload, H_total is the total number of actually detected defects, and a_total is the sum of the a_i values of all modules.
5) V1-version parameter estimation for the software defect discovery model;
case 1: m_i exists in V0.
A. If defects were detected in m_i in V0 (H_i > 0), suppose k defects of m_i in V0 have been repaired in V1; then:
a_i(v_i, m_i) = a_i(v_{i-1}, m_i) + R_1 * S_i^new / 1000 + R_2 * S_i^reused / 1000 - k
B. If no defects were detected (H_i = 0), m_i has practically no defects, so the a_i value of m_i in V1 is reduced:
a_i(v_i, m_i) = a_i(v_{i-1}, m_i) / 10
Case 2: m_i does not exist in V0, i.e. it is a module newly added in V1. The calculation is as follows:
a_i(v_i, m_i) = ApplyModel(model, m_i) + R_1 * S_i / 1000
where model is the defect prediction model built on version V0, and ApplyModel(model, m_i) is the number of defects in m_i predicted by applying that model, used in place of the number of defects actually detected.
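The two cases above can be sketched as small helper functions (an illustrative reading of the formulas; function and argument names are ours, not from the patent):

```python
def a_i_existing(prev_a, k, had_defects, s_new, s_reused, r1, r2):
    """Case 1: module m_i already exists in V0.

    prev_a      -- a_i(v_{i-1}, m_i), the estimate from the previous version
    k           -- number of m_i's V0 defects repaired in V1
    had_defects -- whether defects were actually detected in m_i
    s_new, s_reused -- new and reused code size in LOC
    r1, r2      -- residual defect rates per KLOC for new and reused code
    """
    if had_defects:
        # A: carry over the old estimate, add defects injected by new and
        #    reused code, and subtract the k repaired defects
        return prev_a + r1 * s_new / 1000.0 + r2 * s_reused / 1000.0 - k
    # B: practically defect-free, so lower the estimate by a factor of 10
    return prev_a / 10.0

def a_i_new(predicted, s_i, r1):
    """Case 2: module m_i is newly added in V1; 'predicted' stands for
    ApplyModel(model, m_i), the defect count predicted for m_i."""
    return predicted + r1 * s_i / 1000.0

# Illustrative values (not from the patent):
a_case1 = a_i_existing(10.0, 3, True, 2000.0, 500.0, 1.0, 0.5)
a_case2 = a_i_new(5.0, 3000.0, 1.0)
```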
6) Allocating the optimal test workload for software version V1;
Given the test workload t_total to be allocated to version V1 and the a_i and b_i values of each module m_i, the workload assigned to each m_i under the optimal test workload allocation strategy can be obtained according to the above steps; substituting the results into the defect discovery model yields the maximum cumulative number of defects that can be discovered.
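For the G-O model, step 6) amounts to maximizing sum_i a_i * (1 - exp(-b_i * t_i)) subject to sum_i t_i = t_total and t_i >= 0. A minimal sketch of this allocation, assuming the standard KKT/water-filling solution with bisection on the Lagrange multiplier (function name and example values are ours, not from the patent):

```python
import math

def allocate(a, b, t_total, iters=200):
    """Split t_total across modules to maximize the expected number of
    defects found under the G-O model: sum_i a_i * (1 - exp(-b_i * t_i)).
    Uses the KKT condition a_i * b_i * exp(-b_i * t_i) = lam and
    bisection on the multiplier lam."""
    def t_of(lam):
        # Optimal t_i for a given multiplier, clipped at zero
        return [max(0.0, math.log(a_i * b_i / lam) / b_i)
                for a_i, b_i in zip(a, b)]

    lo, hi = 1e-12, max(a_i * b_i for a_i, b_i in zip(a, b))
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if sum(t_of(mid)) > t_total:
            lo = mid  # total workload too large -> raise the multiplier
        else:
            hi = mid
    t = t_of((lo + hi) / 2.0)
    expected = sum(a_i * (1.0 - math.exp(-b_i * t_i))
                   for a_i, b_i, t_i in zip(a, b, t))
    return t, expected

# Two hypothetical modules: a = [10, 5] estimated defects,
# b = [0.1, 0.2] detection rates, 10 units of workload to split.
t, expected = allocate([10.0, 5.0], [0.1, 0.2], 10.0)
```

Modules with low a_i * b_i may receive zero workload, which is the expected behavior of the clipped KKT solution.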
2. The test workload distribution method based on a software reliability growth model according to claim 1, wherein in step 1), the test cases of version V0 are processed and the actual test workload of V0 is counted; for each defect of V0, it is manually verified that a test case exists that can trigger the defect; specifically, if no test case that can trigger a certain defect can be found, that defect is excluded from the statistics; then the code metrics and process metrics of each module of versions V0 and V1 are collected.
3. The test workload distribution method based on a software reliability growth model according to claim 1, wherein in step 2), the processed software data set is organized: versions V0 and V1 are used as the training set and the test set, respectively, organized in a description/input/output format, for the software defect number prediction process.
4. The method as claimed in claim 1, wherein in step 3), the G-O model is selected as the defect discovery model and is extended with additional parameters, because the G-O model is the simplest NHPP model and has a constant defect detection rate at any testing time.
5. The test workload distribution method based on a software reliability growth model according to claim 1, wherein in step 4), for version V0, given the residual defect rates R_1 and R_2 of the software, the values of a_i and b_0 can be estimated from the existing defect information and basic metric information of each module of version V0.
6. The method of claim 1, wherein in step 5), since the actual number of defects of version V1 is unknown, the method of step 4) cannot be used to estimate the parameters of version V1; instead, the parameters of V0 are used to estimate those of V1. Specifically: first, the a_i value and the defect prediction value p_i of each module m_i of version V0 are read; then, for each module M_i of version V1, it is judged whether M_i exists in V0: if so, the defect repair information (the k value) and the a_i value of the corresponding module m_i in V0 are read to estimate the parameter a_i of M_i; otherwise, the defect prediction value p_i of M_i is read to estimate its parameter a_i. Next, the b_i value of M_i is estimated by reading the b_0 value of version V0 (used as the b_0 value of V1) and the S_i value of M_i, and the parameter b_i of M_i is estimated accordingly.
7. The test workload distribution method based on a software reliability growth model according to claim 1, wherein in step 6), for the G-O model, given the test workload t_total to be allocated to version V1 and the a_i and b_i values of each module M_i, the workload allocated to M_i under the optimal test workload allocation strategy can be obtained; substituting it into the defect discovery model yields the maximum cumulative number of defects that can be discovered.
CN201610977069.9A 2016-11-04 2016-11-04 A kind of test job amount distribution method based on software reliability prediction Pending CN108021498A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610977069.9A CN108021498A (en) 2016-11-04 2016-11-04 A kind of test job amount distribution method based on software reliability prediction

Publications (1)

Publication Number Publication Date
CN108021498A true CN108021498A (en) 2018-05-11

Family

ID=62083637


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103744778A (en) * 2013-12-29 2014-04-23 哈尔滨工业大学 Change point based ISQ-FDEFCE software reliability growth model
CN103761183A (en) * 2013-12-29 2014-04-30 哈尔滨工业大学 FDE and FCE considered software reliability growth model establishing method based on ISQ
CN105205002A (en) * 2015-10-28 2015-12-30 北京理工大学 Modeling method of software safety defect discovering model based on test workload

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
AHMED E. HASSAN: "Predicting faults using the complexity of code changes", 《2009 IEEE 31ST INTERNATIONAL CONFERENCE ON SOFTWARE ENGINEERING》 *
LI Haifeng et al.: "A Software Reliability Model Considering Test Workload and Coverage", Journal of Software (《软件学报》) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111176961A (en) * 2019-12-05 2020-05-19 腾讯科技(深圳)有限公司 Application program testing method and device and storage medium
CN111176961B (en) * 2019-12-05 2022-03-29 腾讯科技(深圳)有限公司 Application program testing method and device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20180511