WO2019242868A1 - Software testing device, software testing method, and software testing program - Google Patents

Software testing device, software testing method, and software testing program

Info

Publication number
WO2019242868A1
Authority
WO
WIPO (PCT)
Prior art keywords
validation
software
generation unit
detection rule
test case
Prior art date
Application number
PCT/EP2018/084616
Other languages
French (fr)
Inventor
Tsunato NAKAI
Koichi Shimizu
Nobuhiro Kobayashi
Benoit BOYER
David Mentre
Original Assignee
Mitsubishi Electric Corporation
Mitsubishi Electric R&D Centre Europe Bv
Priority date
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation, Mitsubishi Electric R&D Centre Europe Bv filed Critical Mitsubishi Electric Corporation
Priority to PCT/EP2018/084616 priority Critical patent/WO2019242868A1/en
Priority to JP2019571278A priority patent/JP6765554B2/en
Publication of WO2019242868A1 publication Critical patent/WO2019242868A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management

Definitions

  • the detection rule generation unit 703 inputs the communication model 701 after model inspection, and generates the detection rule 101 that defines normal communication data.
  • the detection rule generation unit 703 extracts specifications of a communication function contained in the communication model 701, and defines the extracted communication function as a definition item of the detection rule 101.
  • FIG. 11 is an illustration of Step S4000, which is an example of an operation of the software testing device 100A, and the operation flow of the software testing device 100A may not necessarily be as illustrated in FIG. 11.
  • Step S4001 the communication model 701 is input to the software testing device 100A.
  • Step S4002 the model inspection unit 702 performs format validation on the communication model 701 input in Step S4001 by model inspection.
  • the detection rule generation unit 703 generates the detection rule 101 from the communication model 701 subjected to model inspection in Step S4002.
  • An example of the generated detection rule 101 is, for example, the one illustrated in FIG. 5.
  • the detection rule 101 generated in Step S4003 is input to the validation software generation unit 102.
  • Processing of from Step S4004 to Step S4011 of FIG. 11 is similar to the processing of from Step S1002 to Step S1009 in the first embodiment illustrated in FIG. 3, respectively, and thus a description thereof is omitted here.
  • the validity of the validation target software 106 can be automatically determined merely by inputting the communication model 701 and the validation target software 106. Therefore, an effect similar to that of the first embodiment described above can be obtained.
  • the processor 201 is an integrated circuit (IC) configured to perform processing.
  • the processor 201 is, for example, a central processing unit (CPU) or a digital signal processor (DSP) .
  • the auxiliary storage device 202 is, for example, a read only memory (ROM), a flash memory, or a hard disk drive (HDD).
  • the memory 203 is, for example, a random access memory (RAM).
  • the display device 204 is, for example, a display, a lamp, or an operator by sound.
  • the operation device 205 is, for example, a mouse, a keyboard, or a touch panel.
  • An operating system (OS) is also stored in the auxiliary storage device 202. Then, at least a part of the OS is loaded into the memory 203.
  • the processor 201 executes the OS while executing a program for implementing the function of each of the validation software generation unit 102, the test case generation unit 104, the validation execution unit 107, the model inspection unit 702, and the detection rule generation unit 703.
  • the processor 201 executes the OS to perform, for example, task management, memory management, file management, and communication control.
  • the software testing devices 100 and 100A may include a plurality of processors replacing the processor 201. Those processors execute the programs for implementing the respective functions of the validation software generation unit 102, the test case generation unit 104, the validation execution unit 107, the model inspection unit 702, and the detection rule generation unit 703 in a distributed manner. Each of those processors is an IC configured to perform processing, similarly to the processor 201.
  • information and data representing results of processing of the validation software generation unit 102, the test case generation unit 104, the validation execution unit 107, the model inspection unit 702, and the detection rule generation unit 703 are stored in the memory 203, the auxiliary storage device 202, or the register or cache memory of the processor 201 as files.
  • the programs for implementing respective functions of the validation software generation unit 102, the test case generation unit 104, the validation execution unit 107, the model inspection unit 702, and the detection rule generation unit 703 may be stored in a portable storage medium such as a magnetic disk, a flexible disk, an optical disc, a compact disc, a Blu-ray (trademark) disc, or a DVD.
  • each of the validation software generation unit 102, the test case generation unit 104, the validation execution unit 107, the model inspection unit 702, and the detection rule generation unit 703 may be replaced with "circuit", "procedure", "processing step", or "processing".
  • the software testing device 100 or 100A may be implemented by an electronic circuit such as a logic integrated circuit (IC), a gate array (GA), an application specific integrated circuit (ASIC), or a field-programmable gate array (FPGA).
  • each of the validation software generation unit 102, the test case generation unit 104, the validation execution unit 107, the model inspection unit 702, and the detection rule generation unit 703 is implemented as a part of the electronic circuit.
  • the processor and the above-mentioned electronic circuit are also collectively referred to as "processing circuitry".
  • 102 validation software generation unit, 103 validation software, 104 test case generation unit, 105 test case, 106 validation target software, 107 validation execution unit, 108 validation result, 201 processor, 202 auxiliary storage device, 203 memory, 204 display device, 205 operation device, 701 communication model, 702 model inspection unit, 703 detection rule generation unit

Abstract

Provided is a software testing device including: a validation software generation unit (102) configured to use a detection rule (101) defining normal communication data, to thereby generate validation software (103) for generating an expected value of an execution result to be output by the validation target software (106); a test case generation unit (104) configured to input a test pattern into the validation software (103) to generate the expected value of the execution result as a test case (105) by executing the validation software (103); and a validation execution unit (107) configured to input another test pattern that is the same as the test pattern into the validation target software (106), compare an execution result obtained through execution of the validation target software with the expected value of the execution result serving as the test case (105), and determine validity of the validation target software (106).

Description

SOFTWARE TESTING DEVICE, SOFTWARE TESTING METHOD, AND SOFTWARE TESTING PROGRAM
Technical Field
[0001] The present invention relates to a software testing device, a software testing method, and a software testing program.
Background Art
[0002] In recent years, an increasing number of control devices have been connected to a network. Further, Internet of Things (IoT) devices have been widely used. As a result, an increasing number of control devices and IoT devices are targeted by cyber-attacks.
[0003] In general, the control device and the IoT device communicate limited types of data to/from each other. Thus, it is relatively easy to set in advance a detection rule that defines normal communication data for the control device and the IoT device. In view of this, a method of detecting a cyber-attack by whitelisting, in which a detection rule is set in advance, has attracted attention. This method sets a detection rule in advance and determines that received communication data is abnormal when the communication data does not match the detection rule, to thereby detect a cyber-attack.
[0004] The performance and quality of a product that uses the method of detecting a cyber-attack by whitelisting largely depend on the variety of types of applicable detection rules. In developing highly reliable attack detection software, implementation of attack detection software that depends on a detection rule different for each system is required to be tested before introduction of the attack detection software into an actual system. Thus, highly reliable test data that depends on each detection rule to be used at the time of implementation testing is required to be created.
[0005] In the related art, test data has been manually created. However, because the test data that depends on the detection rule is generated manually, there may be an omission or error in the test data. Therefore, highly reliable test data without an omission or error is required to be generated.
[0006] In Patent Literature 1, it is proposed to automatically create test data instead of manually creating the test data. In Patent Literature 1, the test data is automatically generated from a design model such as a unified modeling language (UML) class diagram or a UML activity diagram.
[0007] In Patent Literature 2, it is proposed to automatically create exhaustive test data. In Patent Literature 2, the test data is automatically generated from an actual operation rule relating to traveling of trains.
[0008] In Patent Literature 3, it is proposed to automatically generate test data in model-based design of safety-critical software. In Patent Literature 3, the test data is automatically generated from a specification model by model inspection or other format analysis technologies.
Citation List
Patent Literature
[0009] [PTL 1] JP 2010-267023 A
[PTL 2] JP 2014-046800 A
[PTL 3] JP 2017-033562 A
Summary of Invention
Technical Problem
[0010] In Patent Literature 1, it is proposed to automatically generate the test data from a design model. However, in Patent Literature 1, there is a problem in that the design model for the test data is required to be created.
[0011] In Patent Literature 2, it is proposed to automatically create the exhaustive test data. However, in Patent Literature 2, an expected value is not generated for the test data, and thus there is a problem in that the test result cannot be automatically determined and is required to be manually determined.
[0012] In Patent Literature 3, it is proposed to automatically generate the test data from a specification model by using format analysis technologies. However, in Patent Literature 3, there is a problem in that the specification model for the test data is required to be created.
[0013] The present invention has an object to provide a software testing device, a software testing method, and a software testing program, which are capable of automatically obtaining a validation result of validation target software merely by simple input without requiring generation of test data that depends on a detection rule to be used at the time of implementation testing.
Solution to Problem
[0014] The present invention provides a software testing device including: a validation software generation unit configured to use a detection rule defining normal communication data on communication to be performed by a device into which validation target software is introduced, to thereby generate validation software for generating an expected value of an execution result to be output by the validation target software; a test case generation unit configured to input a test pattern into the validation software to generate the expected value of the execution result as a test case by executing the validation software; and a validation execution unit configured to input another test pattern that is the same as the test pattern into the validation target software, compare an execution result obtained through execution of the validation target software with the expected value of the execution result serving as the test case, and determine validity of the validation target software.
Advantageous Effects of Invention
[0015] With the software testing device according to one embodiment of the present invention, a validation result of validation target software can be automatically obtained merely by simple input.
Brief Description of Drawings
[0016] FIG. 1 is a block diagram for illustrating an example of an entire configuration of a software testing device according to a first embodiment of the present invention.
FIG. 2 is a block diagram for illustrating an example of a hardware configuration of the software testing device according to the first embodiment of the present invention.
FIG. 3 is a flowchart for illustrating an example of an operation of the software testing device according to the first embodiment of the present invention.
FIG. 4 is a flowchart for illustrating an example of an operation of a validation software generation unit in the software testing device according to the first embodiment of the present invention.
FIG. 5 is a table for showing an example of a detection rule in the first embodiment of the present invention.
FIG. 6 is a diagram for illustrating an example of a rule list tree to be generated by the software testing device according to the first embodiment of the present invention.
FIG. 7 is a diagram for illustrating an example of a decision tree to be generated by the software testing device according to the first embodiment of the present invention.
FIG. 8 is a diagram for illustrating an example of validation software to be generated by the software testing device according to the first embodiment of the present invention.
FIG. 9 is a flowchart for illustrating an example of an operation of the validation software generation unit in a software testing device according to a second embodiment of the present invention.
FIG. 10 is a block diagram for illustrating an example of an entire configuration of a software testing device according to a third embodiment of the present invention.
FIG. 11 is a flowchart for illustrating an example of an operation of the software testing device according to the third embodiment of the present invention.
Description of Embodiments
[0017] Now, a software testing device according to embodiments of the present invention is described with reference to the drawings.
[0018] First Embodiment
FIG. 1 is a diagram for illustrating an example of an entire configuration of a software testing device 100 according to a first embodiment of the present invention. As illustrated in FIG. 1, the software testing device 100 includes a validation software generation unit 102, a test case generation unit 104, and a validation execution unit 107. The software testing device 100 is a device configured to validate validation target software 106 for determination of validity of the validation target software 106. Now, a description is given of the validation target software 106 by taking, as an example, attack detection software for detecting abnormal communication data due to, for example, a cyber-attack. The validation target software 106 is software to be introduced into a device such as a control device or an IoT device for determination of whether communication data obtained through communication performed by the device is normal.
[0019] The validation execution unit 107 receives a test case 105 and the validation target software 106 as input to output a validation result 108.
[0020] The validation software generation unit 102 receives a detection rule 101 as input to generate validation software 103. The validation software 103 is software to be used as a test oracle that handles only the defined communication data as normal data.
[0021] The test case generation unit 104 generates the test case 105 by inputting a test pattern into the validation software 103 and executing the validation software 103.
[0022] At the time of testing the validation target software 106, an expected value of the execution result is required as a criterion for determining whether the execution result is normal when a test pattern is input to the validation target software 106. The test case 105 is data serving as the expected value of the execution result. Thus, when a test pattern is input to the validation target software 106 and the execution result matches the execution result of the test case 105, it can be determined that the validation target software 106 is operating normally.
[0023] Details of each unit of the software testing device 100 are described later.
[0024] FIG. 2 is a block diagram for illustrating an example of a hardware configuration of the software testing device 100 illustrated in FIG. 1. As illustrated in FIG. 2, the software testing device 100 according to the first embodiment includes a computer 200.
[0025] The computer 200 includes a processor 201, an auxiliary storage device 202, a memory 203, a display device 204, and an operation device 205 in terms of hardware.
[0026] The auxiliary storage device 202 stores programs for implementing respective functions of the validation software generation unit 102, the test case generation unit 104, and the validation execution unit 107 illustrated in FIG. 1. Further, the auxiliary storage device 202 stores the detection rule 101 and the validation target software 106 to be input to the software testing device 100. Further, the auxiliary storage device 202 stores the validation software 103, the test case 105, and the validation result 108 to be output from the respective units of the software testing device 100.
[0027] In this manner, the respective functions of the validation software generation unit 102, the test case generation unit 104, and the validation execution unit 107 illustrated in FIG. 1 are implemented by the stored programs. The programs for implementing the respective functions of the validation software generation unit 102, the test case generation unit 104, and the validation execution unit 107 are loaded into the memory 203 to be executed by the processor 201.
[0028] FIG. 2 is a diagram for schematically illustrating a state in which the processor 201 is executing the programs for implementing the respective functions of the validation software generation unit 102, the test case generation unit 104, and the validation execution unit 107.
[0029] The display device 204 is configured to display data on the validation result 108 and to assist operations performed with the operation device 205.
[0030] The operation device 205 is used for executing an operation of inputting each piece of data such as the detection rule 101 or the validation target software 106.
[0031] Next, a description is given of details of the validation software generation unit 102, the test case generation unit 104, and the validation execution unit 107 illustrated in FIG. 1.
[0032] The validation software generation unit 102 generates the validation software 103 for determining only the defined communication data as normal data based on the detection rule 101 defining normal communication data.
[0033] The test case generation unit 104 inputs an arbitrary test pattern based on a boundary value or probabilistic input into the validation software 103, and executes the validation software 103 to obtain an execution result. The test case generation unit 104 generates a pair of a test pattern input to the validation software 103 and an execution result as the test case 105. The execution result contained in the test case 105 is an expected value of the execution result obtained when the same test pattern is input to the validation target software 106. Thus, the execution result in the test case 105 is hereinafter referred to as "expected value of the execution result" or "expected value".
[0034] The validation execution unit 107 inputs another test pattern that is the same as the test pattern input to the validation software 103 into the validation target software 106, and compares the execution result with an expected value in the test case 105 to output a comparison result as the validation result 108.
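For illustration only, the interplay of the three units described in paragraphs [0032] to [0034] can be sketched in Python as follows; the identifiers (make_test_cases, validate, oracle, target) are hypothetical and are not taken from the disclosure, and the stand-in software is deliberately trivial.

    # Minimal sketch (not the disclosed implementation) of the oracle-based flow.
    def make_test_cases(oracle, test_patterns):
        # Test case generation unit 104: pair each test pattern with the
        # expected value produced by the validation software (the oracle).
        return [(pattern, oracle(pattern)) for pattern in test_patterns]

    def validate(target, test_cases):
        # Validation execution unit 107: feed the same patterns to the
        # validation target software and compare against the expected values.
        results = []
        for pattern, expected in test_cases:
            actual = target(pattern)
            results.append("match" if actual == expected else "mismatch")
        return results  # corresponds to the validation result 108

    # Trivial stand-ins for the validation software 103 and the target software 106.
    oracle = lambda data: "normal" if data.get("length", 0) <= 64 else "abnormal"
    target = lambda data: "normal" if data.get("length", 0) <= 64 else "abnormal"
    cases = make_test_cases(oracle, [{"length": 10}, {"length": 100}])
    print(validate(target, cases))  # ['match', 'match']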
[0035] Next, a description is given of an example of an operation of the software testing device 100 with reference to FIG. 3. FIG. 3 is an illustration of Step S1000, which is an example of an operation flow of the software testing device 100, and the operation flow of the software testing device 100 may not necessarily be as illustrated in FIG. 3.
[0036] In FIG. 3, first, in Step S1001, the detection rule 101 is input to the software testing device 100.
[0037] Next, in Step S1002, the validation software generation unit 102 generates the validation software 103 from the detection rule input in Step S1001. The detailed processing flow of a validation software generation operation is described later.
[0038] Next, in Step S1003, the test case generation unit 104 inputs a test pattern into the validation software 103 generated in Step S1002.
[0039] Next, in Step S1004, the test case generation unit 104 executes the validation software 103, and generates a pair of the execution result and the test pattern input in Step S1003 as the test case 105. The execution result contained in the test case 105 is used as the "expected value of the execution result" as described above.
[0040] Next, in Step S1005, the test case generation unit 104 determines whether all the test cases 105 have been generated in Step S1004. When it is determined that all the test cases 105 have been generated in Step S1004, the processing advances to Step S1006. On the other hand, when the generation of all the test cases 105 is ongoing, the processing returns to Step S1003. The processing of from Step S1003 to Step S1005 is repeated until completion of generation of all the test cases 105.
[0041] In Step S1006, the validation execution unit 107 inputs the test pattern in the test case 105, which has been generated in the processing of from Step S1003 to Step S1005, into the validation target software 106.
[0042] In Step S1007, the validation execution unit 107 compares the execution result obtained through execution of the validation target software 106 and the expected value of the execution result in the test case 105. When the execution result of the validation target software 106 and the expected value of the execution result in the test case 105 match each other, the comparison result is output as a "match", while when the execution result and the expected value do not match each other, the comparison result is output as a "mismatch".
[0043] In Step S1008, the validation execution unit 107 determines whether validation in Step S1007 has been executed for all the test cases 105. When it is determined that the validation has been executed for all the test cases 105, the processing advances to Step S1009. On the other hand, when the execution of the validation is ongoing, the processing returns to Step S1006. The processing of from Step S1006 to Step S1008 is repeated until the execution of the validation for all the test cases 105 is complete.
[0044] In Step S1009, the validation execution unit 107 outputs all the comparison results as the validation result 108, and the processing of the operation flow of FIG. 3 is ended.
[0045] Next, a description is given of the detailed processing flow of the operation of "validation software generation" in Step S1002 illustrated in FIG. 3 with reference to FIG. 4.
[0046] In FIG. 4, in Step S2001, the detection rule 101 is input to the validation software generation unit 102. An example of the detection rule 101 is shown in FIG. 5. The detection rule 101 shown in FIG. 5 contains information such as transmission source information, transmission destination information, a data length, and a payload as definition items for defining normal communication data. In the example of FIG. 5, the detection rule 101 contains N detection rules of from a rule 1 to a rule N. The validation software 103 and the validation target software 106 determine the N pieces of communication data contained in the detection rule 101 as normal data, and determine the other pieces of data as abnormal data.
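Purely as an illustration, the detection rule 101 of FIG. 5 could be held in memory as a list of records, one per rule, each carrying the four definition items; the field names (sender, receiver, length, payload) and the values used here are assumed placeholders, not the actual rule contents.

    # Hypothetical in-memory form of the detection rule 101 (cf. FIG. 5).
    detection_rule = [
        {"sender": "src_1", "receiver": "dst_1", "length": 16, "payload": "payload_1"},  # rule 1
        {"sender": "src_2", "receiver": "dst_2", "length": 32, "payload": "payload_2"},  # rule 2
        # ... up to rule N
    ]

    def is_normal(data, rules=detection_rule):
        # Communication data is normal only if it matches one of the N rules exactly.
        return any(all(data.get(key) == value for key, value in rule.items()) for rule in rules)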
[0047] In Step S2002, the validation software generation unit 102 performs detection rule analysis on the detection rule 101 input in Step S2001 to create a rule list tree 400. An example of the rule list tree is illustrated in FIG. 6. The rule list tree 400 illustrated in FIG. 6 represents a rule list with a definition item being a leaf and each rule being a tree.
[0048] In Step S2003, the validation software generation unit 102 combines definition items of all the trees for each common item from the rule list tree 400 generated in Step S2002.
[0049] In Step S2004, the validation software generation unit 102 constructs a decision tree 500. An example of the decision tree 500 is illustrated in FIG. 7. The decision tree 500 illustrated in FIG. 7 is obtained by converting the rule list tree 400 in which definition items are combined in Step S2003 into a decision tree structure, and adding a leaf indicating "mismatch" to each branch.
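A minimal sketch of Steps S2002 to S2004, assuming the rule list is represented by dictionaries as in the example above: the rules are merged item by item into a nested decision structure, and any value not covered by a rule falls through to "mismatch". The fixed item order and the dictionary-based tree are illustrative assumptions, not the disclosed data structures.

    # Hypothetical construction and evaluation of a decision tree 500 (Steps S2002-S2004).
    ITEM_ORDER = ["sender", "receiver", "length", "payload"]

    def build_decision_tree(rules):
        tree = {}
        for rule in rules:
            node = tree
            for item in ITEM_ORDER:
                node = node.setdefault(rule[item], {})  # common items share one branch
            node["result"] = "match"  # leaf reached only when every item matched a rule
        return tree

    def evaluate(tree, data):
        node = tree
        for item in ITEM_ORDER:
            node = node.get(data.get(item))
            if node is None:
                return "mismatch"  # the "mismatch" leaf added to every branch
        return node.get("result", "mismatch")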
[0050] In Step S2005, the validation software generation unit 102 generates a program serving as the validation software 103 from the decision tree 500 generated in Step S2004. An example of a program 600 serving as the validation software 103 is illustrated in FIG. 8.
[0051] In Step S2006, the validation software generation unit 102 outputs the program 600 generated in Step S2005 as the validation software 103 to end the processing.
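The generated program 600 of FIG. 8 is not reproduced here; purely as an illustration, code emitted from such a decision tree could take the following shape, where each nested condition corresponds to one definition item and any path that matches no rule ends in "mismatch". The constants are placeholders.

    # Hypothetical shape of a generated validation program (cf. program 600 in FIG. 8).
    def validation_software(data):
        if data["sender"] == "src_1":
            if data["receiver"] == "dst_1":
                if data["length"] == 16:
                    if data["payload"] == "payload_1":
                        return "match"  # rule 1: normal communication data
        if data["sender"] == "src_2":
            if data["receiver"] == "dst_2":
                if data["length"] == 32:
                    if data["payload"] == "payload_2":
                        return "match"  # rule 2
        return "mismatch"  # no rule matched: possible cyber-attack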
[0052] Now, a description is given of the validation software 103 formed of the program 600 illustrated in FIG. 8. The validation software 103 determines the test data as normal data when the test data matches any one of the N detection rules of from the rule 1 to the rule N shown in FIG. 5. On the other hand, the validation software 103 determines the test data as abnormal data when the test data does not match any of those N detection rules, and determines that there is a possibility of a cyber-attack. Specifically, when the transmission source information, transmission destination information, data length, and payload of the test data are transmission source information 1, transmission destination information 1, a data length 1, and a payload 1, respectively, the test data matches the rule 1, and thus the test data can be determined as normal data.
[0053] As described above, in the first embodiment, the validation software generation unit 102 generates the validation software 103 illustrated in FIG. 8 from the detection rule 101. The test case generation unit 104 inputs M test patterns A, B, C, ..., and M into the validation software 103, and obtains the respective execution results as expected values Aout, Bout, Cout, ..., and Mout. The validation execution unit 107 inputs the same test patterns A, B, C, ..., and M into the validation target software 106 to be tested, and obtains the respective execution results Aout', Bout', Cout', ..., and Mout'. The validation execution unit 107 compares the expected values Aout, Bout, Cout, ..., and Mout with the execution results Aout', Bout', Cout', ..., and Mout', respectively, determines the validation target software 106 to be valid when the number of matches among those M pairs is equal to or larger than a threshold value set in advance, and determines the validation target software 106 to be invalid when the number of matches is smaller than the threshold value. The threshold value may be set to M, or may be set to a value smaller than M.
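A minimal sketch of the threshold-based decision in paragraph [0053], assuming the M expected values and M execution results have already been collected; the function name and the string results are hypothetical.

    # Hypothetical sketch of the validity decision of paragraph [0053].
    def determine_validity(expected_values, actual_results, threshold):
        # expected_values: Aout ... Mout from the validation software 103
        # actual_results:  Aout' ... Mout' from the validation target software 106
        matches = sum(1 for e, a in zip(expected_values, actual_results) if e == a)
        return "valid" if matches >= threshold else "invalid"

    # With the threshold set to M, a single mismatch renders the target software invalid.
    print(determine_validity(["match", "mismatch"], ["match", "mismatch"], threshold=2))  # valid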
[0054] As described above, with the software testing device 100 according to the first embodiment, testing of the validation target software 106 can be executed fully automatically by simply inputting the detection rule 101 and the validation target software 106. Therefore, a test case is not required to be generated manually as in the related art, and the period of time required for testing can be reduced.
[0055] The software testing device 100 according to the first embodiment generates a test case fully automatically, and thus it is possible to reduce the possibility of occurrence of an omission or error in the test case due to intervention by a person.
[0056] The software testing device 100 according to this embodiment generates the test case 105 from the detection rule 101, and executes validation testing of the validation target software 106. Therefore, it is possible to perform validation testing even when details of the validation target software 106 are a black box and unclear.
[0057] The software testing device 100 according to this embodiment can obtain the test case 105, which contains an expected value of the execution result, by using the validation software 103, and thus can automatically obtain the result of determining the validity of the validation target software 106. Therefore, the validity determination is not required to be performed manually.
[0058] Second Embodiment
Now, a description is given of a software testing device according to a second embodiment of the present invention with reference to FIG. 9. FIG. 9 is a flowchart for illustrating a detailed processing flow of an operation of "validation software generation" in Step S1002 illustrated in FIG. 3 described above.
[0059] The entire configuration of the software testing device according to the second embodiment is the same as the configuration of the software testing device 100 according to the first embodiment illustrated in FIG. 1. Further, an example of the hardware configuration of the second embodiment is the same as the hardware configuration of the first embodiment illustrated in FIG. 2. The operation of the software testing device according to the second embodiment is basically the same as that of the first embodiment illustrated in FIG. 3. The difference from the first embodiment is that the operation flow of FIG. 9 is performed instead of that of FIG. 4. A format validation property generated in this flow is used for validating a match between communication data to be inspected and the detection rule 101 of FIG. 5. The format validation property describes the N detection rules defined in the detection rule 101 of FIG. 5, converted in terms of format. For example, the format validation property "rule 1 == sender == transmission source information 1 && receiver == transmission destination information 1 && length == data length 1 && command == payload 1" is created from the rule 1 of the detection rule 101 of FIG. 5. In this case, the sender, receiver, length, and command are details of the communication data to be inspected.
[0060] In FIG. 9, Step S3001, Step S3002, Step S3003, Step S3005, Step S3006, and Step S3008 are the same as Step S2001, Step S2002, Step S2003, Step S2004, Step S2005, and Step S2006 of FIG. 4 in the first embodiment, respectively, and thus a description thereof is omitted here. That is, in FIG. 9, a step of "format validation property generation" of Step S3004 is added to FIG. 4. The format validation property is a property for validating the operation of the validation software 103 in terms of format.
[0061] In FIG. 9, in Step S3004, the validation software generation unit 102 generates the format validation property that depends on the detection rule 101 from the rule list tree 400 of FIG. 6 generated in Step S3002.
[0062] In Step S3007, regarding the program 600 generated in Step S3006, the validation software generation unit 102 uses the format validation property generated in Step S3004 to validate whether the detection rule 101 of FIG. 5 is correctly reflected in the program 600.
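As a rough illustration of Steps S3004 and S3007, a property derived from rule 1 can be stated as a predicate and evaluated against the generated program (here the validation_software sketch shown earlier); an actual flow would discharge such properties with a formal verification tool rather than by sampling, and all names and values below are assumptions.

    # Hypothetical validation property derived from rule 1 (cf. Steps S3004 and S3007).
    def property_rule_1(data, result):
        # If the communication data equals rule 1 exactly, the program must answer "match".
        is_rule_1 = (data["sender"] == "src_1" and data["receiver"] == "dst_1"
                     and data["length"] == 16 and data["payload"] == "payload_1")
        return (not is_rule_1) or (result == "match")

    sample = {"sender": "src_1", "receiver": "dst_1", "length": 16, "payload": "payload_1"}
    assert property_rule_1(sample, validation_software(sample))  # property holds on this sample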
[0063] As described above, also in the second embodiment, the validity of the validation target software 106 can be automatically determined merely by inputting the detection rule 101 and the validation target software 106 similarly to the first embodiment. Therefore, an effect similar to that of the first embodiment described above can be obtained.
[0064] Further, in the second embodiment, the validation software generation unit 102 generates the format validation property that depends on the detection rule 101. The format is validated by using the format validation property to prove that there is no bug in the validation software 103. Therefore, the highly reliable test case 105 can be generated.
[0065] Third Embodiment
FIG. 10 is a diagram for illustrating an example of an entire configuration of a software testing device 100A according to a third embodiment of the present invention.
[0066] As illustrated in FIG. 10, the software testing device 100A is obtained by adding a model inspection unit 702 and a detection rule generation unit 703 to the configuration of the first embodiment illustrated in FIG. 1.
[0067] The model inspection unit 702 inputs the communication model 701 to perform model inspection.
[0068] The detection rule generation unit 703 outputs the detection rule 101 generated from the communication model after model inspection, which is output from the model inspection unit 702.
[0069] Details of the model inspection unit 702 and the detection rule generation unit 703 are described later.
[0070] An example of the hardware configuration of the software testing device 100A according to a third embodiment of the present invention is a configuration obtained by adding the functions of the model inspection unit 702 and the detection rule generation unit 703 to the hardware configuration of the software testing device 100 according to the first embodiment illustrated in FIG. 2. The model inspection unit 702 and the detection rule generation unit 703 are implemented by programs similarly to the validation software generation unit 102. In this manner, the example of the hardware configuration of the software testing device 100A according to the third embodiment is basically the same as the hardware configuration of the software testing device 100 according to the first embodiment illustrated in FIG. 2, and thus a description thereof is omitted here. The communication model 701 is stored in the auxiliary storage device 202 or the memory 203.
[0071] Next, a detailed description is given of the communication model 701, the model inspection unit 702, and the detection rule generation unit 703 in the software testing device 100A according to the third embodiment illustrated in FIG. 10. Other components are the same as those of the first embodiment, and thus a description thereof is omitted here.
[0072] The communication model 701 is a design model of communication data communicated by a device such as a control device or an IoT device into which the validation target software 106 is introduced. The communication model 701 is described by, for example, a block diagram or a state transition diagram, and contains information required for implementation of a communication function.
[0073] The model inspection unit 702 performs model inspection as to whether there is an error in the communication model 701 in accordance with a model inspection rule set in advance. For example, the model inspection unit 702 performs format validation by model inspection as to whether deadlock occurs in a communication sequence.
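As one concrete reading of the deadlock check, a communication sequence can be viewed as a state transition graph in which every reachable non-final state must have an outgoing transition. The following sketch assumes such a dictionary representation of the communication model purely for illustration; an actual model inspection unit may instead rely on a model checker.

```python
# Minimal deadlock check over a state transition view of the communication
# model; the dictionary representation and state names are assumptions made
# only for this sketch.

from collections import deque

def find_deadlocks(transitions, initial, final_states):
    # Collect states reachable from the initial state, then report reachable
    # non-final states that have no outgoing transition (potential deadlocks).
    reachable, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        for nxt in transitions.get(state, []):
            if nxt not in reachable:
                reachable.add(nxt)
                queue.append(nxt)
    return [s for s in reachable if s not in final_states and not transitions.get(s)]

# Example: "wait_ack" is reachable, not final, and has no outgoing transition.
model = {"idle": ["request"], "request": ["wait_ack"], "wait_ack": []}
print(find_deadlocks(model, "idle", final_states={"done"}))  # ['wait_ack']
```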
[0074] The detection rule generation unit 703 inputs the communication model 701 after model inspection, and generates the detection rule 101 that defines normal communication data. The detection rule generation unit 703 extracts specifications of a communication function contained in the communication model 701, and defines the extracted communication function as a definition item of the detection rule 101.
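The following sketch illustrates one way the extracted communication-function specifications could be mapped to definition items of the detection rule; the list-of-dictionaries representation of the extracted specifications and the key names are assumptions made for this illustration.

```python
# Sketch of mapping communication-function specifications extracted from the
# communication model 701 to detection rule entries in the style of FIG. 5;
# the input representation is an assumption made for illustration.

def generate_detection_rule(message_specs):
    rules = []
    for i, spec in enumerate(message_specs, start=1):
        rules.append({
            "rule": f"rule {i}",
            "sender": spec["source"],        # transmission source information
            "receiver": spec["destination"], # transmission destination information
            "length": spec["data_length"],   # data length
            "command": spec["payload"],      # payload
        })
    return rules

specs = [{"source": 0x01, "destination": 0x02, "data_length": 8, "payload": 0x10}]
print(generate_detection_rule(specs))
```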
[0075] Next, a description is given of an operation of the software testing device 100A with reference to FIG. 11. FIG. 11 is an illustration of Step S4000, which is an example of an operation of the software testing device 100A, and the operation flow of the software testing device 100A may not necessarily be as illustrated in FIG. 11.
[0076] In FIG. 11, first, in Step S4001, the communication model 701 is input to the software testing device 100A.
[0077] Next, in Step S4002, the model inspection unit 702 performs format validation on the communication model 701 input in Step S4001 by model inspection.
[0078] Next, in Step S4003, the detection rule generation unit 703 generates the detection rule 101 from the communication model 701 subjected to model inspection in Step S4002. An example of the generated detection rule 101 is the one illustrated in FIG. 5.
[0079] The detection rule 101 generated in Step S4003 is input to the validation software generation unit 102.
[0080] The processing from Step S4004 to Step S4011 of FIG. 11 is similar to the processing from Step S1002 to Step S1009 in the first embodiment illustrated in FIG. 3, respectively, and thus a description thereof is omitted here.
[0081] As described above, also in the third embodiment, the validity of the validation target software 106 can be automatically determined merely by inputting the communication model 701 and the validation target software 106. Therefore, an effect similar to that of the first embodiment described above can be obtained.
[0082] Further, in the third embodiment, the detection rule 101 is generated from the communication model 701 after execution of model inspection, and thus it is possible to implement generation of the detection rule 101 and validation testing of the validation target software 106 from the communication model 701 by a fully automatic and highly reliable test scheme.
[0083] Lastly, a supplementary description is given of the hardware configurations of the software testing devices 100 and 100A of FIG. 2.
[0084] The processor 201 is an integrated circuit (IC) configured to perform processing. The processor 201 is, for example, a central processing unit (CPU) or a digital signal processor (DSP).
[0085] The auxiliary storage device 202 is, for example, a read only memory (ROM), a flash memory, or a hard disk drive (HDD).
[0086] The memory 203 is, for example, a random access memory (RAM).
[0087] The display device 204 is, for example, a display, a lamp, or a device that gives notification by sound.
[0088] The operation device 205 is, for example, a mouse, a keyboard, or a touch panel.
[0089] An operating system (OS) is also stored in the auxiliary storage device 202. Then, at least a part of the OS is loaded into the memory 203. The processor 201 executes the OS while executing a program for implementing the function of each of the validation software generation unit 102, the test case generation unit 104, the validation execution unit 107, the model inspection unit 702, and the detection rule generation unit 703.
[0090] The processor 201 executes the OS to perform, for example, task management, memory management, file management, and communication control.
[0091] Further, the software testing devices 100 and 100A may include a plurality of processors replacing the processor 201. The plurality of processors execute the programs for implementing the respective functions of the validation software generation unit 102, the test case generation unit 104, the validation execution unit 107, the model inspection unit 702, and the detection rule generation unit 703 in a distributed manner. Those processors are ICs configured to perform processing similarly to the processor 201.
[0092] Further, information and data representing results of processing of the validation software generation unit 102, the test case generation unit 104, the validation execution unit 107, the model inspection unit 702, and the detection rule generation unit 703 are stored in the memory 203, the auxiliary storage device 202, or the register or cache memory of the processor 201 as files.
[0093] Further, the programs for implementing respective functions of the validation software generation unit 102, the test case generation unit 104, the validation execution unit 107, the model inspection unit 702, and the detection rule generation unit 703 may be stored in a portable storage medium such as a magnetic disk, a flexible disk, an optical disc, a compact disc, a Blu-ray (trademark) disc, or a DVD.
[0094] Further, the "unit" of each of the validation software generation unit 102, the test case generation unit 104, the validation execution unit 107, the model inspection unit 702, and the detection rule generation unit 703 may be replaced with "circuit", "procedure", "processing step", or "processing".
[0095] Further, the software testing device 100 or 100A may be implemented by an electronic circuit such as a logic integrated circuit (IC), a gate array (GA), an application specific integrated circuit (ASIC), or a field-programmable gate array (FPGA). In that case, each of the validation software generation unit 102, the test case generation unit 104, the validation execution unit 107, the model inspection unit 702, and the detection rule generation unit 703 is implemented as a part of the electronic circuit.
[0096] The processor and the above-mentioned electronic circuit are also collectively referred to as "processing circuitry".
Reference Signs List
[0097] 100, 100A software testing device, 101 detection rule,
102 validation software generation unit, 103 validation software, 104 test case generation unit, 105 test case, 106 validation target software, 107 validation execution unit, 108 validation result, 201 processor, 202 auxiliary storage device, 203 memory, 204 display device, 205 operation device, 701 communication model, 702 model inspection unit, 703 detection rule generation unit

Claims

[Claim 1] A software testing device, comprising:
a validation software generation unit configured to use a detection rule defining normal communication data on communication to be performed by a device into which validation target software is introduced, to thereby generate validation software for generating an expected value to be output by the validation target software as an execution result;
a test case generation unit configured to input a test pattern into the validation software to generate the expected value as a test case by executing the validation software; and
a validation execution unit configured to input another test pattern that is the same as the test pattern into the validation target software, compare an actual execution result obtained through execution of the validation target software with the expected value serving as the test case, and determine validity of the validation target software.
[Claim 2] The software testing device according to claim 1, further comprising :
a model inspection unit configured to perform model inspection as to whether there is an error in a communication model, which is a design model of the normal communication data, in accordance with a model inspection rule set in advance; and
a detection rule generation unit configured to generate the detection rule based on the communication model after the model inspection.
[Claim 3] The software testing device according to claim 1 or 2, wherein the validation software generation unit is configured to: input the detection rule containing at least one of transmission source information, transmission destination information, a data length, or a payload as a definition item for defining the normal communication data;
perform detection rule analysis on the detection rule to generate a rule list tree with the definition item being a leaf and each rule being a tree;
combine definition items of all the trees for each common item from the rule list tree to generate a decision tree; and
generate the validation software from the decision tree.
[Claim 4] The software testing device according to claim 3, wherein the validation software generation unit is configured to:
generate a format validation property that depends on the detection rule from the rule list tree; and
use the format validation property to validate the validation software generated from the decision tree in terms of format.
[Claim 5] A software testing method, comprising:
a validation software generation step of using a detection rule defining normal communication data on communication to be performed by a device into which validation target software is introduced, to thereby generate validation software for generating an expected value to be output by the validation target software as an execution result;
a test case generation step of inputting a test pattern into the validation software to generate the expected value as a test case by executing the validation software; and
a validation execution step of inputting another test pattern that is the same as the test pattern into the validation target software, comparing an actual execution result obtained through execution of the validation target software with the expected value serving as the test case, and determining validity of the validation target software.
[Claim 6] A software testing program for causing a computer to execute:
a validation software generation step of using a detection rule defining normal communication data on communication to be performed by a device into which validation target software is introduced, to thereby generate validation software for generating an expected value to be output by the validation target software as an execution result;
a test case generation step of inputting a test pattern into the validation software to generate the expected value as a test case by executing the validation software; and
a validation execution step of inputting another test pattern that is the same as the test pattern into the validation target software, comparing an actual execution result obtained through execution of the validation target software with the expected value serving as the test case, and determining validity of the validation target software.
PCT/EP2018/084616 2018-12-12 2018-12-12 Software testing device, software testing method, and software testing program WO2019242868A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/EP2018/084616 WO2019242868A1 (en) 2018-12-12 2018-12-12 Software testing device, software testing method, and software testing program
JP2019571278A JP6765554B2 (en) 2018-12-12 2018-12-12 Software test equipment, software test methods, and software test programs

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2018/084616 WO2019242868A1 (en) 2018-12-12 2018-12-12 Software testing device, software testing method, and software testing program

Publications (1)

Publication Number Publication Date
WO2019242868A1 true WO2019242868A1 (en) 2019-12-26

Family

ID=64870443

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/084616 WO2019242868A1 (en) 2018-12-12 2018-12-12 Software testing device, software testing method, and software testing program

Country Status (2)

Country Link
JP (1) JP6765554B2 (en)
WO (1) WO2019242868A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010267023A (en) 2009-05-13 2010-11-25 Nippon Telegr & Teleph Corp <Ntt> Test data generation method, device and program
US20120030761A1 (en) * 2010-08-02 2012-02-02 Yokogawa Electric Corporation Improper communication detection system
JP2014046800A (en) 2012-08-31 2014-03-17 Hitachi Ltd Test data covering generation device and method
JP2017033562A (en) 2015-08-05 2017-02-09 ゼネラル・エレクトリック・カンパニイ System and method for model based technology and process for safety-critical software development
US9874869B2 (en) * 2013-03-29 2018-01-23 Hitachi, Ltd. Information controller, information control system, and information control method
US20180024914A1 (en) * 2016-07-20 2018-01-25 International Business Machines Corporation Generating test scripts for testing a network-based application

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111858298A (en) * 2020-05-29 2020-10-30 卡斯柯信号有限公司 Software testing method based on 3V model
CN111858298B (en) * 2020-05-29 2022-08-30 卡斯柯信号有限公司 Software testing method based on 3V model
CN112749084A (en) * 2020-12-17 2021-05-04 中国农业银行股份有限公司 Test case generation method and device
CN113918474A (en) * 2021-12-15 2022-01-11 杭银消费金融股份有限公司 Test case management method and device based on data mode
CN113918474B (en) * 2021-12-15 2022-03-11 杭银消费金融股份有限公司 Test case management method and device based on data mode

Also Published As

Publication number Publication date
JP6765554B2 (en) 2020-10-07
JP2020524862A (en) 2020-08-20

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2019571278

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18825937

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18825937

Country of ref document: EP

Kind code of ref document: A1