WO2023276253A1 - Analysis device and analysis method - Google Patents
Analysis device and analysis method
- Publication number
- WO2023276253A1 (PCT/JP2022/006786)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- test
- score
- conditional expression
- information
- analysis device
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3672—Test management
- G06F11/3692—Test management for test results analysis
- G06F11/3688—Test management for test execution, e.g. scheduling of test suites
Definitions
- The present invention relates to an analysis device and an analysis method.
- Patent Document 1 discloses a related-candidate generating apparatus for deliverables. For a defect, which is a deliverable in a defect management tool, a revision, which is a deliverable in a configuration management tool, and a test item, which is a deliverable in a test management tool, the apparatus receives predetermined management-tool data, including the test results for each deliverable, from those tools. It includes a period extracting unit that extracts, for the test results related to the defect, a period from failure to success or from success to failure, and a narrowing unit that narrows down the candidates for associating the defect with other related deliverables, using the attribute-change relationships between the deliverables within the extracted period.
- An analysis device according to a first aspect identifies the cause when a test of software under test fails in a test environment, which is the environment for executing the test. In the test, execution of the software under test and confirmation of the execution results are performed by a test script without human intervention, and the test data is data read into the software under test and the test script when the test is executed. The analysis device includes a score calculation unit that calculates a failure cause score based on the presence or absence of a change for each of the software under test, the test data, the test script, and the test environment, and an estimation unit that estimates, based on the failure cause scores, which of the software under test, the test data, the test script, and the test environment caused the test to fail.
- An analysis method according to a second aspect is a method in which a computer identifies the cause when a test of software under test fails in a test environment, which is the environment for executing the test. In the test, execution of the software under test and confirmation of the execution results are performed by a test script without human intervention, and the test data is data read into the software under test and the test script when the test is executed. The method includes calculating a failure cause score based on the presence or absence of a change for each of the software under test, the test data, the test script, and the test environment, and estimating, based on the failure cause scores, which of the software under test, the test data, the test script, and the test environment caused the test to fail.
- Functional configuration diagram of the analysis device in the first embodiment
- Hardware configuration diagram of the analysis device
- Diagram explaining the outline of the test analyzed by the analysis device
- Diagram showing an example of configuration information
- Diagram showing an example of test case information
- Diagram showing an example of score calculation rule information
- Diagram showing an example of change history information
- Diagram showing an example of test execution information
- Diagram showing an example of score total information
- Diagram showing an example of name correspondence information
- Diagram showing an example of output to the output device
- Flowchart showing the processing of the score calculation unit in the first embodiment
- Flowchart explaining the details of the score calculation process
- Functional configuration diagram of the analysis device in the second embodiment
- Diagram showing an example of extended test case information
- Diagram showing an example of score calculation extended rule information
- Diagram showing an example of extended score total information
- Diagram showing an example of element definition information
- Diagram showing an example of element-related information
- Diagram showing an example of output to the output device by the display generation unit
- Flowchart showing the processing of the rule generation unit
- Flowchart showing the processing of the test addition execution unit
- A first embodiment of an analysis device according to the present invention will be described below with reference to FIGS. 1 to 13.
- FIG. 1 is a functional configuration diagram of an analysis device 1 that analyzes software test results.
- The analysis device 1 includes a score calculation unit 151, an estimation unit 152, and a display generation unit 153 as its functions.
- The analysis device 1 also has a storage unit 84.
- The storage unit 84 stores test case information 101, score calculation rule information 102, change history information 103, test execution information 104, score total information 105, and name correspondence information 106.
- FIG. 2 is a hardware configuration diagram of the analysis device 1.
- The analysis device 1 includes a CPU 81, which is a central processing unit; a ROM 82, which is a read-only storage device; a RAM 83, which is a readable/writable storage device; the storage unit 84; and an input/output interface 85.
- The analysis device 1 is connected to an input device 2 and an output device 3 via the input/output interface 85.
- The input device 2 is a keyboard or a mouse.
- The output device 3 is a device for displaying images, such as a liquid crystal display. However, a touch panel or the like in which the input device 2 and the output device 3 are integrated may be used.
- The input device 2 transmits input from a person who uses the analysis device (hereinafter referred to as the "user") to the analysis device 1.
- The output device 3 presents the information output by the analysis device 1 to the user.
- The CPU 81 realizes the score calculation unit 151, the estimation unit 152, and the display generation unit 153 by loading the program stored in the ROM 82 into the RAM 83 and executing it.
- The score calculation unit 151, the estimation unit 152, and the display generation unit 153 may instead be realized by an FPGA (Field Programmable Gate Array), which is a rewritable logic circuit, or by an ASIC (Application Specific Integrated Circuit).
- The score calculation unit 151, the estimation unit 152, and the display generation unit 153 may also be realized by a combination of different configurations, for example, by a combination of the CPU 81, the ROM 82, the RAM 83, and an FPGA instead of the combination of the CPU 81, the ROM 82, and the RAM 83.
- The storage unit 84 is a non-volatile storage device such as a hard disk drive or flash memory. Most of the information stored in the storage unit 84 is created in advance; only the score total information 105 is created by the score calculation unit 151 and the estimation unit 152.
- The input/output interface 85 may be a general-purpose communication interface, or it may be a dedicated interface conforming to the specifications of the input device 2 and the output device 3.
- FIG. 3 is a diagram explaining the outline of the test analyzed by the analysis device 1.
- A test to be analyzed in the present embodiment targets software. However, since not only the software itself but also the conditions incidental to the software affect the test, these conditions are also treated as components of the test. Specifically, in the present embodiment, a test is treated as a combination of four components: the test target software Src, the test data Dat, the test script Ts, and the test environment Env. Although the specific conditions differ from test to test, all tests have in common that they combine these four components.
- The test target software Src, the test data Dat, the test script Ts, and the test environment Env may be referred to as the "types" of components.
- These components are conceptual, and concrete instances are used in practice.
- The software under test Src includes a plurality of resources, such as source code and icons, which are updated as development progresses, changing their version numbers.
- The test target software Src is essentially the software to be tested.
- The software under test Src is composed of, for example, one or more source codes.
- In the example shown, the software under test Src includes two source codes, but the number of source codes may increase or decrease as development progresses. Also, the version number of each source code changes whenever its content is changed. That is, each test may involve a different version of the source code or a different number of source codes.
- The software under test Src may also include, for example, statically linked libraries in addition to source code.
- The test data Dat is data read by the test target software Src and the test script Ts during test execution, and is, for example, a combination of a variable name and its value.
- The test data Dat is composed of one or more pieces of data. These data are prepared according to the software under test Src and are, for example, text data, image data, audio data, or a combination thereof.
- In the example shown, the test data Dat consists of one piece of sample data. The version number of each piece of data changes every time its content is changed.
- The test script Ts is a script file for executing the test target software Src and confirming the execution results without human intervention.
- The test script Ts is, for example, a shell script or a batch file.
- In the example shown, the test script Ts consists of one script. Each script changes its version number whenever its content changes.
- Although it is called a "script" for convenience, it need not literally be a script; a program may be used instead.
- The test environment Env is the hardware used for testing and the software used for testing, excluding the test target software Src.
- In the example shown, the test environment Env consists of an OS and a hardware configuration. Their versions are updated whenever the test environment is changed or updated.
- In the present embodiment, the version number changes every time a configuration is changed, but the name of the configuration may be changed instead. Details will be described later.
- FIG. 4 is a diagram showing an example of the configuration information 100.
- The configuration information 100 is information indicating the correspondence between a configuration ID and the specific configuration of each of the test target software Src, the test data Dat, the test script Ts, and the test environment Env. However, the configuration information 100 does not include the version number of each configuration. In other words, the configuration information 100 is information that, combined with the version information of each configuration, can specify the conditions of each test.
- A configuration ID is a combination of "K" and a number. For example, if the configuration ID of the software under test Src is "K1", it indicates that the software under test Src is composed of the "initial operation source code" and the "main processing source code".
- FIG. 4 likewise shows, for example, a configuration "K2" of the test target software Src, a configuration "K4" of the test data Dat, a configuration "K1" of the test script Ts, and a configuration "K2" of the test environment Env.
- FIG. 5 is a diagram showing an example of the test case information 101.
- The test case information 101 is information indicating the configuration of each test case.
- The test case information 101 consists of a plurality of records, and each record includes test case ID 1011 and configuration ID 1012 fields.
- The test case ID 1011 stores the identifier of the test case.
- A "test case" is a combination in which a configuration ID is designated for each of the test target software Src, test data Dat, test script Ts, and test environment Env that constitute the test.
- For example, the test case whose test case ID is "TC1" is a combination that designates, among others, configuration "K1" of the software under test Src.
- FIG. 6 is a diagram showing an example of the score calculation rule information 102.
- The score calculation rule information 102 is information on the rules for calculating failure cause scores.
- A failure cause score is a score for identifying which of the test script Ts, the test data Dat, the test target software Src, and the test environment Env caused a test failure.
- A failure cause score is calculated for each test for each of the test script Ts, the test data Dat, the software under test Src, and the test environment Env. The initial value is zero, and the one with the highest score among the four is determined to be the cause of the test failure.
- The score calculation rule information 102 consists of a plurality of records, and each record includes rule ID 1021, conditional expression 1022, and addition value 1023 fields.
- The rule ID 1021 is the identifier of a rule for calculating the score and is represented by a combination of "R" and a number.
- The conditional expression 1022 is the condition for adding to the score; when the conditional expression is satisfied, the value described in the addition value 1023 is added to the corresponding failure cause score. The addition value 1023 indicates the value added to the failure cause score when the conditional expression is satisfied.
- The conditional expression may include specially defined functions such as the following.
- The function "IsChangedV" outputs whether a particular type of test component has changed from a first version to a second version.
- The function "IsChangedV" outputs "true" if there is a change and "false" if there is no change.
- The function "IsChangedV" takes a first argument representing the type of test component, a second argument that is a numerical value indicating the first version, and a third argument that is a numerical value indicating the second version.
- The second and third arguments are integers: a value of "1" or more is a specific version number itself, "0" is the current version, "-1" is the previous version, "-2" is the version before that, and so on; the larger the absolute value of a negative number, the older the version.
- For example, IsChangedV(Ts_x, -1, 0) outputs whether the test script Ts has changed since the previous version.
- The third argument defaults to "0". That is, "IsChangedV(Ts_x, -1)" and "IsChangedV(Ts_x, -1, 0)" mean the same thing.
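- A minimal sketch of the "IsChangedV" semantics, assuming the change history of a component is available as a chronological list of version numbers (the representation is an assumption, not the patent's implementation):

```python
# Hypothetical: history[i] is the component's version number at its i-th
# recorded state, oldest first. Arguments >= 1 name a version directly;
# 0 means the current (last) entry, -1 the one before it, and so on.

def resolve(history: list[int], v: int) -> int:
    """Map the argument convention to a concrete version number."""
    return v if v >= 1 else history[len(history) - 1 + v]

def is_changed_v(history: list[int], first: int, second: int = 0) -> bool:
    """IsChangedV: True if the version differs between the two designated points."""
    return resolve(history, first) != resolve(history, second)

ts_history = [1, 2, 2]                 # hypothetical history of a test script Ts
print(is_changed_v(ts_history, -1))    # changed since the previous state? False
print(is_changed_v(ts_history, 1, 0))  # changed between version 1 and now? True
```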
- FIG. 7 is a diagram showing an example of the change history information 103.
- The change history information 103 is information indicating the change history of the test components.
- The change history information 103 is composed of a plurality of records, and each record has type 1031, configuration ID 1032, version 1033, and date and time of change 1034 fields.
- The type 1031 is information indicating one of the test target software Src, test data Dat, test script Ts, and test environment Env.
- The configuration ID 1032 is one of the configuration IDs 1001 shown in the configuration information 100 and is information specifying the configuration.
- The version 1033 is information specifying the version of the configuration.
- In this example, the resources within the same configuration share the same version number in order to simplify the explanation, but version numbers may differ even within the same configuration. In that case as well, the version 1033 is the information specifying each version number.
- The date and time of change 1034 is the date and time when the version was changed.
- For example, the first record shows that version "1" of the "automatic execution script", the test script Ts whose configuration ID is "K1", was created at 4:05:32 on February 3, 2021, Japan time.
- FIG. 8 is a diagram showing an example of the test execution information 104.
- The test execution information 104 is information for managing the configuration of each test and the execution of the test. Whereas a test case does not specify the versions of its configurations, the test execution information 104 specifies the versions as well.
- The test execution information 104 consists of a plurality of records, and each record has test execution ID 1041, test case ID 1042, configuration ID and version 1043, priority 1044, and test status 1045 fields.
- The test execution ID 1041 is an identifier that identifies the test and is represented by a combination of "E" and a number.
- The test case ID 1042 is the same as the test case ID 1011 in the test case information 101.
- The configuration ID and version 1043 are the configuration ID and version number of each configuration for each of the test script Ts, test data Dat, software under test Src, and test environment Env; specifically, a combination of the configuration ID 1012 in the test case information 101 and the version 1033 in the change history information 103. Since the combination of the test case ID and the configuration ID is already indicated in the test case information 101, the test case information 101 and the test execution information 104 overlap in this respect.
- The priority 1044 is the priority for executing tests.
- The test identified by the test execution ID of a record whose priority 1044 is set to "high" is executed before the test identified by the test execution ID of a record whose priority 1044 is set to "low".
- The test status 1045 is information indicating whether the test has been executed and, if so, its result.
- The test statuses "successful" and "failed" indicate that the test has been executed, and the test status "not started" indicates that the test has not been executed.
- A "successful" test status 1045 indicates that the executed test succeeded, and a "failed" test status 1045 indicates that the executed test failed.
- The success of a test means that the execution result of the test target software Src was as intended by the developer.
- A test failure means that the execution result of the test target software Src was not as intended by the developer.
- FIG. 9 is a diagram showing an example of the score total information 105.
- The score total information 105 is a table showing the aggregated failure cause scores for each test and the estimated cause of failure.
- The score total information 105 is generated by the score calculation unit 151 and the estimation unit 152.
- The score total information 105 consists of a plurality of records, and each record has test execution ID 1051, failure cause score 1052, rule ID 1053, and estimated failure cause 1054 fields.
- The test execution ID 1051 is the same as the test execution ID 1041 of the test execution information 104.
- The failure cause score 1052 is the value calculated by the score calculation unit 151 based on the score calculation rule information 102.
- The rule ID 1053 is a list of the rule IDs 1021 entered by the score calculation unit 151.
- The estimated failure cause 1054 is the cause of failure estimated by the estimation unit 152.
- FIG. 10 is a diagram showing an example of the name correspondence information 106.
- The name correspondence information 106 is a correspondence table of codes and names; specifically, it shows the correspondence between the type ID of a test failure factor and the failure factor type, that is, the configuration type.
- FIG. 11 is a diagram showing an output example to the output device 3.
- Specifically, FIG. 11 shows an example of the analysis result for one test execution result.
- In this example, the result for the test whose test execution ID is "E1" is shown, and the test execution ID and the ID of the test case used in the test are shown at the top.
- A list of failure factors, configuration IDs, versions, failure cause determinations, scores, and rule IDs is displayed in the center of the screen.
- The failure factor column stores the name of each of the test script Ts, test data Dat, test target software Src, and test environment Env.
- The configuration ID and version columns display the configuration ID and version of the target test, which in this example is the test whose test execution ID is "E1".
- "YES" is displayed for the item that is estimated to be the cause of failure.
- The score and rule ID columns display the failure cause score and the rule IDs that contributed to the score value.
- In this example, only the rule with rule ID "R1" applies, and only the failure cause score of the test script Ts is increased. Since the test script Ts therefore has the highest score among the four, "YES" is displayed in the failure cause column of the test script. If multiple rules apply, multiple rule IDs are entered in the rule ID column.
- FIG. 12 is a flowchart showing the processing of the score calculation unit 151.
- Execution of the processing of the score calculation unit 151 described below starts when the execution of one or more tests has been completed and one or more records of the test execution information 104 have been filled in. At this point, the test execution IDs described in the test execution information 104 have not yet been recorded in the score total information 105. For example, the processing of the score calculation unit 151 described below starts when three records E1 to E3 are recorded in the test execution information 104 and no information is yet recorded in the score total information 105.
- In step S301, the score calculation unit 151 adds a record to the score total information 105 and writes the test execution ID to be processed.
- In step S302, the score calculation unit 151 reads one record whose test status 1045 is "failed". If there are multiple records whose test status 1045 is "failed", any one record that has not yet been read is read.
- The record read in this step is hereinafter referred to as the "target test record".
- In step S303, the score calculation unit 151 reads from the score calculation rule information 102 a record whose usage flag 1024 is "TRUE". If there are multiple records whose usage flag 1024 is "TRUE", any one record that has not yet been read is read. The record read in this step is hereinafter called the "target rule record". In the subsequent step S304, the score calculation unit 151 executes the score calculation process using the target test record and the target rule record. The details of the score calculation process will be described later.
- In step S305, the score calculation unit 151 refers to the score calculation rule information 102 and determines whether there is a record with the usage flag 1024 of "TRUE" that has not yet been set as the target rule record. If another record with the usage flag 1024 of "TRUE" still exists, the process returns to step S303; if not, the process proceeds to step S306.
- In step S306, the score calculation unit 151 determines whether there is a record with the test status 1045 of "failed" that has not yet been set as the target test record. If another record with the test status 1045 of "failed" still exists, the process returns to step S302; if not, the processing shown in FIG. 12 ends. For example, when the test execution information 104 has three records with the test status 1045 of "failed" and the score calculation rule information 102 has five records with the usage flag 1024 of "TRUE", the score calculation process in step S304 is executed 15 times, the product of 3 and 5.
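- As an informal illustration (an assumption, not the patent's code), the loop structure of FIG. 12 and the per-rule scoring of FIG. 13 might look as follows in Python; all data structures and names are hypothetical:

```python
CAUSES = ("Ts", "Dat", "Src", "Env")

def run_score_calculation(test_execution_info, rules, score_totals):
    """FIG. 12: pair every failed test with every rule whose usage flag is TRUE."""
    for test in test_execution_info:
        if test["status"] != "failed":
            continue
        # step S301: add a record for this test execution ID to the score totals
        record = score_totals.setdefault(test["id"], {
            "score": dict.fromkeys(CAUSES, 0.0),
            "rule_ids": {c: [] for c in CAUSES},
        })
        for rule in rules:                           # steps S303-S305
            if rule["use_flag"]:
                calculate_score(test, rule, record)  # step S304

def calculate_score(test, rule, record):
    """FIG. 13: if the rule's conditional expression holds, add its values."""
    if not rule["condition"](test):                  # steps S311-S312
        return
    for cause in CAUSES:
        value = rule["add_values"][cause]
        if value != 0:                               # step S313: skip zero add values
            record["rule_ids"][cause].append(rule["id"])
        record["score"][cause] += value              # final step of FIG. 13
```

With 3 failed tests and 5 active rules, calculate_score runs 15 times, matching the example above.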
- FIG. 13 is a flowchart explaining the details of step S304 in FIG. 12, that is, the score calculation process.
- In step S311, the score calculation unit 151 evaluates the conditional expression in the target rule record. Specifically, the score calculation unit 151 calculates the value of the conditional expression and determines whether the condition is met.
- In step S312, the score calculation unit 151 determines whether the condition in the target rule record is satisfied according to the evaluation result of step S311. The score calculation unit 151 proceeds to step S313 if it determines that the condition is satisfied, and ends the processing shown in FIG. 13 if it determines that the condition is not satisfied.
- In step S313, the score calculation unit 151 adds the rule ID 1021 of the target rule record to the rule ID 1053 of the record added in step S301 of FIG. 12. However, the rule ID is not added for a failure cause whose addition value 1023 is zero.
- For example, suppose that the target test record is the record "E1" shown at the top of FIG. 8 and that the target rule record is the record "R1" shown at the top of FIG. 6. In this case, "R1" is entered only in the "Ts" column of the rule ID 1053 of the score total information 105 and is not entered in the other columns.
- In the final step, the score calculation unit 151 adds the addition value 1023 of the target rule record to the failure cause score 1052 of the score total information 105 and ends the processing shown in FIG. 13.
- As a result, the value of the failure cause score 1052 is the sum of the addition values 1023 of the applicable rules.
- The estimation unit 152 writes the estimated failure cause 1054 for each record of the score total information 105 generated by the score calculation unit 151. Specifically, the estimation unit 152 compares the values of the failure cause scores 1052 in each record of the score total information 105 and writes the name of the failure cause having the maximum value in the estimated failure cause 1054. If multiple failure causes share the maximum value, the estimation unit 152 writes all of them in the estimated failure cause 1054.
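- A minimal sketch of this estimation, assuming scores are held per failure cause (the representation is hypothetical):

```python
def estimate_failure_cause(scores: dict[str, float]) -> list[str]:
    """Return every failure cause whose score equals the maximum (ties kept)."""
    best = max(scores.values())
    return [cause for cause, score in scores.items() if score == best]

print(estimate_failure_cause({"Ts": 1.0, "Dat": 0.0, "Src": 0.0, "Env": 0.0}))
# ['Ts']
print(estimate_failure_cause({"Ts": 0.5, "Dat": 0.5, "Src": 0.0, "Env": 0.0}))
# ['Ts', 'Dat'] - both are written to the estimated failure cause 1054
```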
- The display generation unit 153 refers to the test execution information 104 and the score total information 105 to generate the image of the display screen shown in FIG. 11. The display generation unit 153 starts processing when the test execution ID of the test to be displayed is designated from the outside. It first writes the designated test execution ID, for example "E1", at the upper right of the screen.
- Next, the display generation unit 153 identifies the record of the test execution information 104 corresponding to the designated test execution ID and transcribes onto the screen the values of the test case ID 1042 and the configuration ID and version 1043 described in that record. Furthermore, the display generation unit 153 identifies the record of the score total information 105 corresponding to the designated test execution ID and transcribes the values of the failure cause score 1052 and the rule ID 1053 onto the screen. Finally, the display generation unit 153 writes "YES" in the failure cause column corresponding to the estimated failure cause 1054 and ends the process.
- As described above, the analysis device 1 identifies the cause when a test of the test target software Src fails in the test environment Env, which is the environment for executing the test. In this test, execution of the software under test and confirmation of the execution results are performed by the test script Ts without human intervention.
- The test data Dat is data read into the test target software Src and the test script Ts when the test is executed.
- The analysis device 1 includes the score calculation unit 151, which calculates a failure cause score 1052 based on the presence or absence of a change for each of the test target software Src, the test data Dat, the test script Ts, and the test environment Env, and the estimation unit 152, which identifies, based on the failure cause scores 1052, which of the software under test Src, the test data Dat, the test script Ts, and the test environment Env caused the test to fail. Therefore, the analysis device 1 can identify which of the test target software Src, the test data Dat, the test script Ts, and the test environment Env caused the test failure.
- A software test is performed for the purpose of improving the quality of the software under test Src, but it may fail because of the other elements that constitute the test, that is, the test data Dat, the test script Ts, and the test environment Env.
- An expert who is familiar with all aspects of a test, including the test environment Env, can quickly identify the cause of a test failure, but it is not easy for an inexperienced person to do so.
- With the analysis device 1, even a non-expert can easily identify the cause of a test failure, which improves the efficiency of software development.
- The analysis device 1 stores the score calculation rule information 102, which includes a plurality of combinations of a conditional expression 1022 and an addition value 1023 that is added to the failure cause score when the conditional expression 1022 is satisfied.
- The score calculation unit 151 calculates the failure cause score 1052 based on the score calculation rule information 102.
- The analysis device 1 includes the display generation unit 153, which generates video information, such as that shown in FIG. 11, including the failure cause score 1052 and the rule ID 1053, which indicates the conditional expressions used to calculate the failure cause score 1052, and the input/output interface 85, which outputs the video information generated by the display generation unit 153 to the output device 3. Therefore, the analysis device 1 can present to the user the grounds for identifying the cause of failure.
- The conditional expression 1022 relates to changes in the detailed components, which are the constituents of the software under test Src, the test data Dat, the test script Ts, and the test environment Env. Therefore, the analysis device 1 can identify the cause of failure using objective facts.
- In the above description, the analysis device 1 has been described as a single hardware device. However, it may instead be configured as an analysis system in which a plurality of hardware devices share and realize functions similar to those of the analysis device 1.
- A second embodiment of the analysis device will be described with reference to FIGS. 14 to 31.
- In the following description, the same components as those in the first embodiment are assigned the same reference numerals, and differences are mainly described. Points that are not specifically described are the same as in the first embodiment.
- This embodiment differs from the first embodiment mainly in that the score calculation rule information is generated.
- FIG. 14 is a functional configuration diagram of an analysis device 1A according to the second embodiment.
- Compared with the analysis device 1, the analysis device 1A according to the present embodiment further includes a rule generation unit 161, a test addition execution unit 162, a priority change unit 163, and a failure cause correction unit 164.
- The storage unit 84 further stores element definition information 111, element related information 112, conditional expression initial values 113, detailed configuration information 114, and detailed related information 115.
- The storage unit 84 also stores extended test case information 101A instead of the test case information 101, score calculation extended rule information 102A instead of the score calculation rule information 102, and extended score total information 105A instead of the score total information 105.
- The operations of the score calculation unit 151A and the display generation unit 153A differ from those of the first embodiment.
- The configuration information 100, change history information 103, name correspondence information 106, element definition information 111, element related information 112, conditional expression initial values 113, detailed configuration information 114, and detailed related information 115 are created in advance.
- Some records are added to the extended test case information 101A by the test addition execution unit 162.
- The score calculation extended rule information 102A is generated by the rule generation unit 161. However, some records may be recorded in the score calculation extended rule information 102A in advance.
- In the test execution information 104, some records are written by the test addition execution unit 162, and the priorities 1044 of some records are written by the priority change unit 163.
- Some fields of the extended score total information 105A are written by the failure cause correction unit 164.
- FIG. 15 is a diagram showing an example of extended test case information 101A.
- The extended test case information 101A adds a confirmation 1013 field to each record of the test case information 101 in the first embodiment. The confirmation 1013 indicates whether the computer or an operator judges the result of the test. For a test case whose confirmation 1013 is "automatic", the test script Ts determines whether the test succeeded or failed. For a test case whose confirmation 1013 is "manual", the operator determines whether the test succeeded or failed.
- FIG. 16 is a diagram showing an example of score calculation extended rule information 102A.
- The score calculation extended rule information 102A is obtained by adding detailing state 1025 and reliability 1026 fields to each record of the score calculation rule information 102 in the first embodiment.
- The detailing state 1025 is rewritten in the process of creating detailed rules based on existing rules. Specifically, the detailing state 1025 transitions from the initial state "not detailed" through "detailing" and "difference value not calculated" to the final state "completed".
- The reliability 1026 is a numerical value between 0 and 1.0 indicating the reliability of the rule; the initial value is "1.0", which indicates the highest reliability.
- The score calculation extended rule information 102A can include the following conditional expressions, which are not included in the score calculation rule information 102 of the first embodiment.
- The function "IsChangedT" outputs whether a particular type of test component has changed from a first time to a second time. It outputs "true" if there is a change and "false" if there is no change.
- The function "IsChangedT" takes a first argument representing the type of test component, a second argument that is a numerical value indicating the first time, and a third argument that is a numerical value indicating the second time.
- The function "IsSuccess" takes a conditional expression as its argument and outputs whether the evaluation result of that conditional expression is "true".
- FIG. 17 is a diagram showing an example of extended score total information 105A.
- The extended score total information 105A adds a user input 1055 field to each record of the score total information 105 in the first embodiment.
- The user input 1055 is the cause of the test failure input by the user, and is one of the test script Ts, test data Dat, test target software Src, and test environment Env. After a test is run, the estimated failure cause 1054 and/or user input 1055 fields are populated.
- FIG. 18 is a diagram showing an example of the element definition information 111.
- The element definition information 111 indicates element types obtained by subdividing the types of the test components. That is, as explained at the beginning of the first embodiment, each test consists of four components: the test script Ts, the test data Dat, the test target software Src, and the test environment Env. The element definition information 111 subdivides each of these types into element types.
- The element definition information 111 consists of a plurality of records, and each record has type 1111, element type 1112, and configuration name 1113 fields.
- The type 1111 indicates which of the test component types the record concerns.
- The element type 1112 is the type of the subdivided configuration.
- The configuration name 1113 is the name of the detailed configuration.
- An element type is also called a "detailed component".
- For example, Src is subdivided into the interface SrcIf, constants/variables SrcV, processing logic SrcLogic, comments SrcCmt, and components SrcComp.
- In the example shown, each type is subdivided into two or more element types, but it is not essential to subdivide every type.
- An element type is only a classification and does not indicate a specific configuration. Therefore, an element type is associated with a specific configuration by combining it with an individual ID.
- FIG. 19 is a diagram showing an example of the element-related information 112.
- The element related information 112 indicates the relationships between detailed components.
- The element related information 112 consists of one or more records, and each record has element A 1121, element B 1122, and relationship 1123 fields.
- The element A 1121 and element B 1122 fields each indicate a detailed component.
- The relationship 1123 indicates the relationship between the element A 1121 and the element B 1122. In the example shown in the first record of FIG. 19, the relationship 1123 is "one way from A to B", which indicates that changes in the element A 1121 affect the element B 1122 but changes in the element B 1122 do not affect the element A 1121.
- The relationship 1123 in the second record is "two-way", indicating that changing either one affects the other. Note that the designations A and B are for convenience; the same meaning is expressed if the elements A and B in the first record are interchanged and the relationship 1123 is set to the opposite direction, "one way from B to A".
- FIG. 20 is a diagram showing an example of the conditional expression initial value 113.
- The conditional expression initial values 113 are initial values read by the rule generation unit 161 to generate a plurality of conditional expressions 1022.
- The conditional expression initial values 113 consist of one or more records, each record including initial expression ID 1131 and conditional expression 1132 fields.
- The initial expression ID 1131 is the identifier of the initial expression.
- The conditional expression 1132 is an initial conditional expression. As will be described in detail later, the rule generation unit 161 uses the conditional expressions 1132 to create various conditional expressions.
- FIG. 21 is a diagram showing an example of the detailed configuration information 114.
- The detailed configuration information 114 is information indicating the correspondence between the configuration ID of each configuration type and the specific makeup of the detailed configuration, that is, a combination of an element type and an individual ID.
- The detailed configuration information 114 indicates the correspondence between the configuration 1141 and the detailed configuration 1142; specifically, the correspondence between the configuration ID of each type and the combinations of element types and individual IDs.
- FIG. 22 is a diagram showing an example of the detailed related information 115.
- The detailed related information 115 describes the relationships between specific configurations; it differs from the general relationships between element types shown in the element related information 112 in that it is concrete.
- The detailed related information 115 is composed of one or more records, and each record has element X 1151, element Y 1152, relationship 1153, and similarity 1154 fields.
- The element X 1151 and element Y 1152 are combinations of an element type and an individual ID that identify specific configurations.
- The relationship 1153 is the relationship between the element X 1151 and the element Y 1152 and is one of "similar", "X depends on Y", and "Y depends on X".
- The similarity 1154 is the degree of similarity between the element X 1151 and the element Y 1152 and is represented by a value between 0 and 1. A value is set in the similarity 1154 only when the relationship 1153 is "similar".
- FIG. 23 is a diagram showing an output example to the output device 3 by the display generation unit 153A. Specifically, it shows an analysis result for one test execution result and an example of input by the user.
- The display generation unit 153A performs the processing of the display generation unit 153 in the first embodiment and further reflects the user's input operations via the input device 2 on the output device 3.
- The difference from FIG. 11 in the first embodiment is that the failure cause column is subdivided into "calculation" and "user designation".
- The cause of failure estimated by the estimation unit 152, as in the first embodiment, is indicated by "YES" in the "calculation" column.
- A value is entered in the "user designation" column by the user's input.
- Radio buttons are arranged as indicated by reference symbol G21, and the user can select only one of the four.
- The value input by the user is entered into the analysis device 1A as the value of the user input 1055 of the extended score total information 105A.
- Unselected items are represented by white circles, and the selected item is represented by a black circle.
- In step S341 of FIG. 24, the rule generation unit 161 reads the conditional expression initial values 113, transcribes all the conditional expressions 1132 into the conditional expression 1022 column of the score calculation extended rule information 102A, and sets initial values in the detailing state 1025, the reliability 1026, and the usage flag 1024. Specifically, the detailing state 1025 is set to "not detailed", the reliability 1026 to "1.0", and the usage flag 1024 to "True", and the process proceeds to the next step S342.
- In step S342, the rule generation unit 161 arbitrarily selects one record in the score calculation extended rule information 102A whose detailing state 1025 is "not detailed".
- The record selected in step S342 is hereinafter referred to as the "detailing target record".
- In step S343, the rule generation unit 161 changes the detailing state 1025 of the detailing target record selected in step S342 to "detailing".
- In step S344, the rule generation unit 161 executes the conditional expression addition processing, described later, on the detailing target record, and in the subsequent step S345 changes the detailing state 1025 of the detailing target record to "difference value not calculated".
- In step S346, the rule generation unit 161 determines whether there is a record in the score calculation extended rule information 102A whose detailing state 1025 is "not detailed". If the rule generation unit 161 determines that such a record still exists, the process returns to step S342; if no record whose detailing state 1025 is "not detailed" remains, the process proceeds to step S350 in FIG. 25 via the connector A.
- In step S350 of FIG. 25, the rule generation unit 161 arbitrarily selects one record in the score calculation extended rule information 102A whose detailing state 1025 is "difference value not calculated".
- The record selected in step S350 is hereinafter referred to as the "difference target record".
- In the next step, the rule generation unit 161 changes the detailing state 1025 of the difference target record to "detailing".
- In step S352, the rule generation unit 161 executes the later-described difference value calculation on the difference target record, and in the subsequent step changes the detailing state 1025 of the difference target record to "completed".
- In step S354, the rule generation unit 161 determines whether there is a record in the score calculation extended rule information 102A whose detailing state 1025 is "difference value not calculated". If the rule generation unit 161 determines that such a record still exists, the process returns to step S350; if no record whose detailing state 1025 is "difference value not calculated" remains, the process proceeds to step S355.
- In step S355, the rule generation unit 161 changes to "False" the usage flag 1024 of any record in the score calculation extended rule information 102A in which all four addition values 1023 are less than a predetermined threshold, for example 0.5.
- In the subsequent step, the rule generation unit 161 changes to "False" the usage flag 1024 of any record in the score calculation extended rule information 102A whose reliability 1026 is less than a predetermined threshold, for example 0.4, and ends the processing shown in FIG. 25.
- FIGS. 26 and 27 are flowcharts showing the details of step S344 in FIG. 24, that is, the conditional expression addition processing. The conditional expressions generated in FIGS. 26 and 27 are added as new records to the score calculation extended rule information 102A, and the detailing state 1025 of such a record is set to "completed" from the beginning. Also, as before, the reliability 1026 is set to "1.0" and the usage flag 1024 to "True".
- In step S361, the rule generation unit 161 determines whether the conditional expression in the detailing target record includes any of the superordinate concepts, that is, Ts, Dat, Src, and Env. If the rule generation unit 161 determines that a superordinate concept is included, the process proceeds to step S362; if not, the process proceeds to step S363.
- In step S362, the rule generation unit 161 refers to the element definition information 111, adds conditional expressions in which the superordinate concept is replaced by each of its subordinate concepts as new records of the score calculation extended rule information 102A, and proceeds to step S363.
- For example, if the conditional expression of the detailing target record includes Src, five conditional expressions are added, each replacing Src in the conditional expression with one of SrcIf, SrcV, SrcLogic, SrcCmt, and SrcComp.
- The detailing state 1025 of each added record is set to its initial value, that is, the detailing state 1025 is set to "not detailed", the reliability 1026 to "1.0", and the usage flag 1024 to "True".
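- A sketch of this substitution step under the stated assumptions (the element-type table is a subset of FIG. 18; the expression syntax is illustrative):

```python
import re

# Superordinate concept -> detailed components, per the element definition
# information 111 (only the Src row of FIG. 18 is reproduced here).
ELEMENT_TYPES = {
    "Src": ["SrcIf", "SrcV", "SrcLogic", "SrcCmt", "SrcComp"],
}

def refine_expression(expr: str) -> list[str]:
    """Steps S361-S362: one new expression per detailed component."""
    new_exprs = []
    for concept, subtypes in ELEMENT_TYPES.items():
        if re.search(rf"\b{concept}\b", expr):   # bare concept, not e.g. "SrcIf"
            new_exprs += [re.sub(rf"\b{concept}\b", sub, expr) for sub in subtypes]
    return new_exprs

for e in refine_expression("IsChangedV(Src, -1)"):
    print(e)   # IsChangedV(SrcIf, -1), IsChangedV(SrcV, -1), ...
```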
- In step S363, the rule generation unit 161 determines whether the conditional expression in the detailing target record or any of the conditional expressions added in step S362 includes an element type included in the element related information 112. If the rule generation unit 161 determines that such an element type is included, the process proceeds to step S364; otherwise, the process proceeds to FIG. 27 via the connector B.
- The element type paired, in the element related information 112, with the element type determined in step S363 to be included is called a "correlation element". For example, if TsAs is included in the conditional expression of the detailing target record or in a conditional expression added in step S362, then, according to the example of FIG. 19, the element type paired with TsAs is called the correlation element.
- In FIG. 27, the rule generation unit 161 first determines, in step S371, whether the conditional expression in the detailing target record or any of the conditional expressions added in step S362 includes a time designation. If any conditional expression includes a time designation, the process proceeds to step S372; if none does, the process proceeds to step S373. In step S372, the rule generation unit 161 adds to the score calculation extended rule information 102A, as new records, conditional expressions in which the time designated in each expression that includes a time designation is replaced with its complement, and proceeds to step S373.
- In step S373, the rule generation unit 161 determines whether the conditional expression in the detailing target record or any of the conditional expressions added in step S362 includes a version designation. If any conditional expression includes a version designation, the process proceeds to step S374; if none does, the process proceeds to step S375. In step S374, the rule generation unit 161 adds to the score calculation extended rule information 102A, as new records, conditional expressions in which the version designated in each expression that includes a version designation is replaced with its complement, and proceeds to step S375.
- In step S375, the rule generation unit 161 determines whether the conditional expression in the detailing target record or any of the conditional expressions added in step S362 includes a time designation or a version designation. If any conditional expression includes a time designation or a version designation, the process proceeds to step S376; if none does, the processing shown in FIG. 27 ends.
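- The exact replacement rule for a designated time or version is not spelled out above, so the following is only a loose sketch of steps S373-S374 under the assumption that "replaced with the complement" means generating variants with the other candidate version designations:

```python
import re

def version_variants(expr: str, candidates=(-2, -1, 0, 1)) -> list[str]:
    """For an expression that designates a version, emit one variant per other
    candidate designation (assumed reading of steps S373-S374)."""
    m = re.fullmatch(r"IsChangedV\((\w+),\s*(-?\d+)\)", expr)
    if m is None:
        return []
    component, designated = m.group(1), int(m.group(2))
    return [f"IsChangedV({component}, {v})" for v in candidates if v != designated]

print(version_variants("IsChangedV(SrcLogic, -1)"))
# ['IsChangedV(SrcLogic, -2)', 'IsChangedV(SrcLogic, 0)', 'IsChangedV(SrcLogic, 1)']
```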
- FIG. 28 is a flowchart showing the details of the difference value calculation in step S352 of FIG. 25.
- In the first step, the rule generation unit 161 initializes the variables rTotal, rTs, rDat, rSrc, and rEnv to zero. These variables are counters used to count the total number of cases and the numbers of Ts, Dat, Src, and Env cases, respectively.
- In step S382, the rule generation unit 161 selects from the test execution information 104 one record whose test status 1045 is "success" or "failure". The record selected in this step is hereinafter called the "selected record".
- In step S383, the rule generation unit 161 refers to the detailed configuration information 114, the detailed related information 115, and the change history information 103 as necessary and determines whether the condition of the conditional expression is satisfied for the selected record. If the rule generation unit 161 determines that the conditional expression is satisfied, the process proceeds to step S384; if not, the process proceeds to step S386.
- In step S384, the rule generation unit 161 increments the value of the variable rTotal by "1", and in the subsequent step S385 increments the corresponding one of rTs, rDat, rSrc, and rEnv by "1" and proceeds to step S386.
- In step S386, the rule generation unit 161 determines whether all records whose test status 1045 is "success" or "failure" have been selected. If all records have been selected, the process proceeds to step S387; if there are records that have not been selected, the process returns to step S382.
- In step S387, the rule generation unit 161 divides each of rTs, rDat, rSrc, and rEnv by rTotal.
- In step S388, the rule generation unit 161 writes the calculation results of step S387 into the respective columns of the addition value 1023 of the score calculation extended rule information 102A and ends the difference value calculation.
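- A compact sketch of this calculation, assuming that each satisfied case is attributed to one of the four component types (the attribution rule itself is not detailed above):

```python
def difference_values(executed_tests, condition, attribute_type):
    """FIG. 28 sketch: normalize per-type counts by the total count rTotal."""
    r_total = 0
    counts = dict.fromkeys(("Ts", "Dat", "Src", "Env"), 0)  # rTs, rDat, rSrc, rEnv
    for test in executed_tests:
        if test["status"] not in ("success", "failure"):    # step S382 filter
            continue
        if condition(test):                                 # step S383
            r_total += 1                                    # step S384
            counts[attribute_type(test)] += 1               # step S385 (assumed)
    if r_total == 0:
        return counts                                       # nothing to normalize
    return {k: v / r_total for k, v in counts.items()}      # steps S387-S388
```

The resulting four ratios are written into the addition value 1023 columns, so a rule whose condition rarely points at any single component ends up with small addition values and can later be disabled by the thresholding in step S355.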
- FIGS. 29 and 30 are flowcharts showing the processing of the score calculation unit 151A in the second embodiment.
- The difference from the processing shown in FIGS. 12 and 13 of the first embodiment is that processing is added between steps S303 and S304.
- Specifically, the processing shown in FIG. 30 is performed via the connector C, and when that processing is completed, the processing from step S304 onward is performed.
- In step S391, the score calculation unit 151A determines whether all the tests necessary for evaluating the conditional expression have been executed. If the score calculation unit 151A determines that all the tests have been executed, it ends the processing of FIG. 30 and returns to FIG. 29; if it determines that even one test required for evaluating the conditional expression has not been completed, the process proceeds to step S392. In step S392, the score calculation unit 151A determines whether there is a test that is necessary for evaluating the conditional expression but is not scheduled to be executed. If the score calculation unit 151A determines that such a test exists, the process proceeds to step S393; if it determines that all the necessary tests are scheduled, the process proceeds to step S394.
- In step S393, the score calculation unit 151A causes the test addition execution unit 162 to generate a test item for executing each test that is necessary for evaluating the conditional expression but is not scheduled, and proceeds to step S394. At this time, the test addition execution unit 162 adds a new record with the priority 1044 set to "high" to the test execution information 104.
- In step S394, the score calculation unit 151A calls the test execution unit (not shown), waits for a predetermined waiting time, and then returns to step S391. This concludes the description of the processing shown in FIG. 30. Although the score calculation unit 151A calls the test execution unit in this flowchart, the test addition execution unit 162 may call the test execution unit instead.
- The test execution unit executes tests according to the description of the test execution information 104. However, since the content of the actual test execution is described in the corresponding test script Ts, a detailed description of the operation of the test execution unit is omitted.
- FIG. 31 is a flowchart showing the processing of the test addition execution unit 162.
- step S401 the test addition execution unit 162 reads one record of which the test status 1045 is "not started” in the test execution information 104 in descending order of priority 1044.
- FIG. The record read in this step is called an "execution record” in the following description of this flowchart.
- step S402 the test addition execution unit 162 executes the corresponding test script in order to execute the test of the execution record read in step S401. For example, when a record having a test execution ID of "E3" in FIG. 7 is read, a test script having a configuration ID of "K2" and a version of "V1" is executed.
- the test addition execution unit 162 refers to the extended test case information 101A and determines whether the value of the confirmation 1013 of the test to be executed is "automatic” or “manual”. If the test addition execution unit 162 determines that the confirmation 1013 is "automatic”, it proceeds to step S404, and if it determines that the confirmation 1013 is "manual”, it proceeds to step S405. In step S404, the test addition execution unit 162 records the test result output by the test script, that is, "success" or "failure” in the test status 1045 of the execution record in the test execution information 104, and proceeds to step S406.
- in step S405, the test addition execution unit 162 records the test result in the test status 1045 of the execution record in the test execution information 104 based on the user's input from the input device, and proceeds to step S406.
- in step S406, the test addition execution unit 162 determines whether or not any record whose test status 1045 is "not started" remains. If it determines that such a record remains, the process returns to step S401; if it determines that no "not started" record remains, the processing shown in FIG. 31 ends.
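- a minimal Python sketch of the FIG. 31 loop (steps S401 to S406) is given below; the record fields, the run_script helper, and the ask_user callback are assumptions made for illustration, not names taken from the disclosure.

    PRIORITY_ORDER = {"high": 0, "medium": 1, "low": 2}  # assumed ordering of priority 1044

    def run_added_tests(execution_records, extended_test_cases, run_script, ask_user):
        # S401-S406 of FIG. 31, over hypothetical data shapes.
        while True:
            pending = [r for r in execution_records if r["test_status"] == "not started"]
            if not pending:                                  # S406: no "not started" record remains
                return
            record = min(pending, key=lambda r: PRIORITY_ORDER[r["priority"]])  # S401
            result = run_script(record["config_id"], record["version"])        # S402
            case = extended_test_cases[record["test_case_id"]]
            if case["confirmation"] == "automatic":          # S403: confirmation 1013 is "automatic"
                record["test_status"] = result               # S404: "success" or "failure" from the script
            else:
                record["test_status"] = ask_user(record)     # S405: result entered via the input device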
- the conditional expression 1022 relates to changes of the detailed components, which are the components of each of the test target software Src, the test data Dat, the test script Ts, and the test environment Env, dependency relationships between the detailed components, and similarity relationships between the detailed components. Therefore, the analysis device 1 can identify the cause of a failure from multiple viewpoints.
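- purely as an illustration of such rules, condition/increment pairs could be encoded as in the following Python literal; the predicate names changed, depends_on, similar_to, and failed are invented here to make the three viewpoints concrete and do not appear in the disclosure.

    # Hypothetical encoding of conditional expressions 1022 with their score increments.
    score_calculation_rules = [
        # change of a detailed component
        {"condition": "changed('Src.moduleA')", "add": 10},
        # dependency between detailed components
        {"condition": "depends_on('Src.moduleA', 'Src.moduleB') and changed('Src.moduleB')", "add": 5},
        # similarity between detailed components
        {"condition": "similar_to('Ts.script1', 'Ts.script2') and failed('Ts.script2')", "add": 3},
    ]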
- the analysis device 1A further includes a rule generation unit 161 that generates new conditional expressions by replacing each of the test target software Src, the test data Dat, the test script Ts, and the test environment Env in the conditional expressions of the score calculation rule information with detailed components, and adds them to the score calculation extended rule information 102A. This processing is shown in steps S361 to S362 of FIG. Therefore, unlike the first embodiment, the user does not have to prepare all of the score calculation rule information 102.
- the rule generation unit 161 further adds new conditional expressions based on the relationships between the detailed components, as shown in steps S363 to S365 of FIG. Therefore, unlike the first embodiment, the user does not have to prepare all of the score calculation rule information 102.
- the rule generation unit 161 generates new conditional expressions in which the time or version included in the conditional expressions of the score calculation extended rule information 102A is changed, and adds them to the score calculation extended rule information 102A, as shown in steps S371 to S376 of FIG. Therefore, unlike the first embodiment, the user does not have to prepare all of the score calculation rule information 102.
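- the kinds of rule generation described above could be approximated as in the Python sketch below; the string-replacement substitution, the fixed component tuple, and the version literal "V1" are simplifying assumptions made for illustration, not the procedure of the disclosure.

    COMPONENTS = ("Src", "Dat", "Ts", "Env")

    def substitute_detailed_components(rules, detailed_components):
        # Roughly S361-S362: clone each rule once per detailed component of a matching coarse component.
        generated = []
        for rule in rules:
            for comp in COMPONENTS:
                if comp not in rule["condition"]:
                    continue
                for detail in detailed_components.get(comp, []):
                    generated.append({"condition": rule["condition"].replace(comp, detail),
                                      "add": rule["add"]})
        return generated

    def vary_version(rule, other_versions):
        # Roughly S371-S376: re-issue a rule with each alternative version bound.
        return [{"condition": rule["condition"].replace("V1", version), "add": rule["add"]}
                for version in other_versions]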
- the analysis device 1A includes a test addition execution unit 162 that newly generates and executes test items necessary for evaluating the conditional expression 1022. Therefore, additional tests necessary for evaluating conditional expressions can be performed.
- the configuration of the functional blocks is merely an example. Some functional configurations shown as separate functional blocks may be configured integrally, or a configuration represented by one functional block diagram may be divided into two or more functions. Further, a configuration may be adopted in which part of the functions of each functional block is provided in another functional block.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Quality & Reliability (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Debugging And Monitoring (AREA)
Abstract
Description
The analysis method according to the second aspect of the present invention is an analysis method in which a computer identifies the cause when a test of test target software fails in a test environment, which is an environment in which the test is executed, wherein, in the test, execution of the test target software and confirmation of the execution result are performed by a test script without human intervention, and test data is data read into the test target software and the test script when the test is executed, the method including: calculating, for each of the test target software, the test data, the test script, and the test environment, a failure cause score based on the presence or absence of changes; and estimating, based on the failure cause scores, which of the test target software, the test data, the test script, and the test environment is the cause of the failure of the test.
Hereinafter, a first embodiment of an analysis device according to the present invention will be described with reference to FIGS. 1 to 13.
(1) The analysis device 1 identifies the cause when a test of the test target software Src fails in the test environment Env, which is an environment in which the test is executed. In this test, execution of the test target software and confirmation of the execution result are performed by the test script Ts without human intervention. The test data Dat is data read into the test target software Src and the test script Ts when the test is executed. The analysis device 1 includes a score calculation unit 151 that calculates a failure cause score 1052 for each of the test target software Src, the test data Dat, the test script Ts, and the test environment Env based on the presence or absence of changes, and an estimation unit 152 that identifies, based on the failure cause scores 1052, which of the test target software Src, the test data Dat, the test script Ts, and the test environment Env is the cause of the failure of the test. Therefore, it is possible to identify whether the cause of the test failure is the test target software Src, the test data Dat, the test script Ts, or the test environment Env.
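Read minimally, the combination of the score calculation unit 151 and the estimation unit 152 amounts to computing one failure cause score per component and reporting the component with the highest score; the following Python sketch, with illustrative values only, is one such reading rather than the claimed implementation.

    def estimate_failure_cause(failure_cause_scores):
        # Return the component with the largest failure cause score 1052.
        return max(failure_cause_scores, key=failure_cause_scores.get)

    # Illustrative values only:
    scores = {"Src": 12, "Dat": 3, "Ts": 0, "Env": 5}
    print(estimate_failure_cause(scores))  # -> "Src"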
In the first embodiment described above, the analysis device 1 was described as a single hardware device. However, it may instead be configured as an analysis system in which a plurality of hardware devices share and realize the same functions as the analysis device 1.
A second embodiment of the analysis device will be described with reference to FIGS. 14 to 31. In the following description, the same reference numerals are given to the same components as in the first embodiment, and differences are mainly described. Points not specifically described are the same as in the first embodiment. This embodiment differs from the first embodiment mainly in that the score calculation rule information 102 is generated.
(4) The conditional expression 1022 relates to changes of the detailed components, which are the components of each of the test target software Src, the test data Dat, the test script Ts, and the test environment Env, dependency relationships between the detailed components, and similarity relationships between the detailed components. Therefore, the analysis device 1 can identify the cause of a failure from multiple viewpoints.
84…storage unit
85…input/output interface
101…test case information
101A…extended test case information
102…score calculation rule information
102A…score calculation extended rule information
103…change history information
104…test execution information
105…score aggregation information
105A…extended score aggregation information
106…name correspondence information
151, 151A…score calculation unit
152…estimation unit
153, 153A…display generation unit
161…rule generation unit
162…test addition execution unit
163…priority change unit
164…failure cause correction unit
Claims (8)
- An analysis device that identifies the cause when a test of test target software fails in a test environment, which is an environment in which the test is executed, wherein, in the test, execution of the test target software and confirmation of the execution result are performed by a test script without human intervention, and test data is data read into the test target software and the test script when the test is executed, the analysis device comprising: a score calculation unit that calculates, for each of the test target software, the test data, the test script, and the test environment, a failure cause score based on the presence or absence of changes; and an estimation unit that estimates, based on the failure cause scores, which of the test target software, the test data, the test script, and the test environment is the cause of the failure of the test.
- The analysis device according to claim 1, further comprising a storage unit that stores score calculation rule information including a plurality of combinations of a conditional expression and an addition value of the failure cause score to be added when the conditional expression is satisfied, wherein the score calculation unit calculates the failure cause score based on the score calculation rule information, the analysis device further comprising: a display generation unit that generates video information including the failure cause score and information indicating the conditional expression used to calculate the failure cause score; and an input/output interface that outputs the video information generated by the display generation unit to a display unit.
- The analysis device according to claim 1, further comprising a storage unit that stores score calculation rule information including a plurality of combinations of a conditional expression and an addition value of the failure cause score to be added when the conditional expression is satisfied, wherein the conditional expression relates to changes of detailed components, which are the components of each of the test target software, the test data, the test script, and the test environment, dependency relationships between the detailed components, and similarity relationships between the detailed components.
- The analysis device according to claim 3, further comprising a rule generation unit that adds, to the score calculation rule information, a new conditional expression in which each of the test target software, the test data, the test script, and the test environment in the conditional expression of the score calculation rule information is replaced with the detailed components.
- The analysis device according to claim 4, wherein the rule generation unit further adds a new conditional expression based on the relationships between the detailed components.
- The analysis device according to claim 4, wherein the rule generation unit generates a new conditional expression in which the time or version included in the conditional expression of the score calculation rule information is changed, and adds the new conditional expression to the score calculation rule information.
- The analysis device according to claim 3, further comprising a test addition execution unit that newly generates and executes a test item necessary for evaluating the conditional expression.
- An analysis method in which a computer identifies the cause when a test of test target software fails in a test environment, which is an environment in which the test is executed, wherein, in the test, execution of the test target software and confirmation of the execution result are performed by a test script without human intervention, and test data is data read into the test target software and the test script when the test is executed, the analysis method comprising: calculating, for each of the test target software, the test data, the test script, and the test environment, a failure cause score based on the presence or absence of changes; and estimating, based on the failure cause scores, which of the test target software, the test data, the test script, and the test environment is the cause of the failure of the test.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22832420.8A EP4365746A1 (en) | 2021-06-28 | 2022-02-18 | Analysis device and analysis method |
US18/564,824 US20240289263A1 (en) | 2021-06-28 | 2022-02-18 | Analysis Device and Analysis Method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-107100 | 2021-06-28 | ||
JP2021107100A JP2023005300A (ja) | 2021-06-28 | 2021-06-28 | 解析装置、解析方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023276253A1 true WO2023276253A1 (ja) | 2023-01-05 |
Family
ID=84692257
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/006786 WO2023276253A1 (ja) | 2021-06-28 | 2022-02-18 | 解析装置、解析方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240289263A1 (ja) |
EP (1) | EP4365746A1 (ja) |
JP (1) | JP2023005300A (ja) |
WO (1) | WO2023276253A1 (ja) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014142872A (ja) * | 2013-01-25 | 2014-08-07 | Ntt Docomo Inc | 試験装置 |
JP2015069615A (ja) * | 2013-10-01 | 2015-04-13 | 株式会社東芝 | 成果物の関連候補生成装置 |
JP2018173703A (ja) * | 2017-03-31 | 2018-11-08 | 日本電気株式会社 | 障害解析装置、障害解析方法および障害解析プログラム |
WO2020072701A1 (en) * | 2018-10-02 | 2020-04-09 | Cser Tamas | Software testing |
US20210064518A1 (en) * | 2019-08-27 | 2021-03-04 | Shield34 LTD. | Methods Circuits Devices Systems and Functionally Associated Machine Executable Code For Automatic Failure Cause Identification in Software Code Testing |
2021
- 2021-06-28 JP JP2021107100A patent/JP2023005300A/ja active Pending
2022
- 2022-02-18 WO PCT/JP2022/006786 patent/WO2023276253A1/ja active Application Filing
- 2022-02-18 EP EP22832420.8A patent/EP4365746A1/en active Pending
- 2022-02-18 US US18/564,824 patent/US20240289263A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US20240289263A1 (en) | 2024-08-29 |
EP4365746A1 (en) | 2024-05-08 |
JP2023005300A (ja) | 2023-01-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8589884B2 (en) | Method and system for identifying regression test cases for a software | |
CN110928772B (zh) | 一种测试方法及装置 | |
JP5648584B2 (ja) | ソフトウェアアプリケーションのプロファイリング方法及び装置 | |
JP4876511B2 (ja) | ロジック抽出支援装置 | |
US20120272220A1 (en) | System and method for display of software quality | |
US20070214173A1 (en) | Program, method, and apparatus for supporting creation of business process model diagram | |
US10942840B2 (en) | System and method for managing a code repository | |
US20090271661A1 (en) | Status transition test support device, status transition test support method, and recording medium | |
KR20170052668A (ko) | 데이터 구동 테스트 프레임워크 | |
US20080155514A1 (en) | Method and System for Graphical User Interface Testing | |
Yahya et al. | Domain-driven actionable process model discovery | |
US9098638B2 (en) | System and method for automatic test level generation | |
JP2014241021A (ja) | ソフトウェア評価装置および方法 | |
JP7077909B2 (ja) | デッドコード解析プログラム、デッドコード解析方法及びデッドコード解析装置 | |
JPWO2016174743A1 (ja) | ソースコード等価性検証装置、および、ソースコード等価性検証方法 | |
EP4318244A1 (en) | Software testing with reliability metric | |
JP2016126552A (ja) | テスト選択プログラム、テスト選択方法、及びテスト選択装置 | |
JP5460629B2 (ja) | 表形式ソフトウェア仕様作成支援方法、及び装置 | |
WO2023276253A1 (ja) | 解析装置、解析方法 | |
CN111241766B (zh) | 测试方法与测试系统 | |
JP6622938B1 (ja) | 相関性抽出方法および相関性抽出プログラム | |
US7620937B2 (en) | System and method for debugging programs | |
JP2007304846A (ja) | 仕様書作成支援方法および仕様書作成支援装置 | |
CN115795479A (zh) | 一种智能合约的漏洞检测方法、电子设备和存储介质 | |
Rapos et al. | SimPact: Impact analysis for simulink models |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22832420 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 18564824 Country of ref document: US |
WWE | Wipo information: entry into national phase | Ref document number: 2022832420 Country of ref document: EP |
NENP | Non-entry into the national phase | Ref country code: DE |
ENP | Entry into the national phase | Ref document number: 2022832420 Country of ref document: EP Effective date: 20240129 |