US20180196739A1 - System and method for safety-critical software automated requirements-based test case generation - Google Patents
System and method for safety-critical software automated requirements-based test case generation
- Publication number
- US20180196739A1 (Application No. US15/916,660)
- Authority
- US
- United States
- Prior art keywords
- test
- strategy
- model
- software architecture
- requirements
- Legal status
- Abandoned
Classifications
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/2273—Test methods
- G06F11/3684—Test management for test design, e.g. generating new test cases
- G06F11/3608—Software analysis for verifying properties of programs using formal methods, e.g. model checking, abstract interpretation
- G06F11/3676—Test management for coverage analysis
- G06F11/368—Test management for test version control, e.g. updating test cases to a new software version
- G06F8/20—Software design
- G06F8/35—Creation or generation of source code model driven
- G06F11/3668—Software testing
- G06F11/3672—Test management
Abstract
An automated requirements-based test case generation method includes constructing a software architecture model derived from software design model architectural information, allocating requirement models into blocks/operators of the software architecture model, and generating component-level requirements-based test cases from the software architecture model that are configured to be executable at different levels in the software architecture. The component-level requirements-based test case generation method includes receiving a software architecture model along with allocated requirement models represented as a hierarchical data flow diagram, selecting one of the software components, building an intermediate test model based on the selected component by automatically attaching at least one of test objectives or constraints to the corresponding software architecture model blocks/operators based on the selected test strategy, and generating human- and machine-readable test cases with the test generator for further automatic conversion to test executables and test review artifacts. A system and a non-transitory computer-readable medium for implementing the method are also disclosed.
Description
- This patent application claims the benefit of priority as a continuation-in-part, under 35 U.S.C. § 120, to U.S. patent application Ser. No. 14/947,633, filed Nov. 20, 2015, titled “SYSTEM AND METHOD FOR SAFETY-CRITICAL SOFTWARE AUTOMATED REQUIREMENTS-BASED TEST CASE GENERATION” (now U.S. Pat. No. TBD; issued MONTH DD, 2018), the entire disclosure of which is incorporated herein by reference.
- Safety-critical software, such as aviation software, is required by certification standards (e.g., DO-178B/C for aviation software) to be strictly verified against certification objectives. Testing is an essential part of the verification process. Manual test case generation from the requirements is hard and time-consuming, especially with complex, large software.
- Automatically generated test cases and/or test procedures derived from the high-level software requirements can help reduce the cost introduced by manual test case generation and review activities. Those test cases and/or test procedures generated from the specifications can be executed on the associated low-level design implementations through a test conductor.
- Conventional test tools and/or models are not able to generate requirements-based test cases at different levels in the design model. The generated test cases produced by conventional tools cannot be directly executed on components at multi-levels in the design.
- FIG. 1 depicts a system for automated requirements-based test case generation in accordance with embodiments;
- FIG. 2 depicts a process for software testing in accordance with embodiments;
- FIGS. 3A-3D depict a software architecture model automatically derived from a software design in accordance with embodiments;
- FIG. 4 depicts a process for component-level test case generation in accordance with embodiments;
- FIG. 5 depicts a requirement model in accordance with embodiments;
- FIG. 6 depicts a requirement model with attached test objectives in accordance with embodiments;
- FIG. 7 depicts a logic condition coverage model with attached test objectives in accordance with embodiments;
- FIGS. 8A-8B depict an input masking model with attached test objectives in accordance with embodiments;
- FIG. 9 depicts a test script generation and review process in accordance with embodiments;
- FIG. 10 depicts an exemplary software architecture design and the associated requirements traceability information in accordance with embodiments;
- FIG. 11 depicts an exemplary unit level test script in accordance with embodiments; and
- FIG. 12 depicts an exemplary integration level test script in accordance with embodiments.
- In accordance with embodiments, systems and methods automatically create a software architecture model from the software design architecture along with requirement models to automate multi-level architectural requirements-based test case generation based on the proposed software architecture model.
- In accordance with embodiments, the software architecture model and its requirements allocation are constructed using a model-based development (MBD) tool with the representation of a hierarchical data flow diagram. As opposed to conventional MBD tools, which are traditionally used for low-level design, an embodying MBD tool automatically creates the software architecture model from the software design architecture and generates corresponding test cases for the system-level or high-level requirements.
- Embodying systems and methods can implement component-level requirements-based test case generation to automatically generate test cases for components at different levels in the software architecture.
- FIG. 1 depicts system 100 for automated requirements-based test case generation in accordance with embodiments. System 100 includes control processor 110, which executes computer instructions 131 to control the operation of the system and its components. The executable instructions can be stored in data store 130, which can also store data accessed and generated by the system. Control processor 110 can be located in a computer or a server and interconnected to the various components via communication link 120. The communication link can be an internal bus, an electronic communication network, or the like. System 100 can generate test cases from system and/or high-level requirements 132 of the safety-critical software.
- FIG. 2 depicts process 200 for automated software testing in accordance with embodiments. Process 200 derives software architecture model 134 from the software design model. The software architecture model is constructed, step 205, in the model-based development tool 150 environment based on the design model's architectural information. This software architecture model can be automatically created in accordance with some implementations. Requirement models are allocated, step 210, into different modules in the software architecture model by connecting the corresponding monitored/controlled variables (e.g., FIG. 1: VAR1-VAR19) with the input/output ports of the component. In accordance with some embodiments, at step 205 input ports for all output variables in the requirement models can be added. By adding input ports for all output variables, the test case generator can generate inputs and expected outputs for the test cases in one run.
- The component-level test case generator unit 140 can use the software architecture model with allocated requirements to generate, step 215, unit/module-level requirements-based test cases. The test case generator unit 140 can also generate, step 220, integration-level test cases to verify whether the code component or integration complies with the allocated requirements.
- FIGS. 3A-3D depict software architecture model 300 built as a hierarchical data flow diagram based on the software design architecture in accordance with embodiments. Each component Component1, Component2, Component3, Component4 in the software design is a block/operator in the software architecture model with input and output ports. Blocks/operators in the software architecture model are connected with each other and can have multiple layers of sub-blocks/sub-operators. Requirement models REQ12, REQ13, REQ14, REQ15 are allocated into the software architecture model and connected to the input and output ports. The building process is automatic, systematic, and modular. The hierarchical data flow provides good visualization and easy traceability from the requirements to the software design.
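- For illustration, the hierarchy just described can be pictured as a small in-memory data model. This is a minimal sketch under assumed names (Block, RequirementModel, and allocate are invented here, not taken from the patent), showing how a requirement model is allocated to a component by matching its monitored/controlled variables against the component's ports:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class RequirementModel:
        req_id: str              # e.g., "REQ12"
        monitored: List[str]     # input-side (monitored) variable names
        controlled: List[str]    # output-side (controlled) variable names

    @dataclass
    class Block:
        name: str                # e.g., "Component2"
        in_ports: List[str] = field(default_factory=list)
        out_ports: List[str] = field(default_factory=list)
        sub_blocks: List["Block"] = field(default_factory=list)  # hierarchy layers
        requirements: List[RequirementModel] = field(default_factory=list)

    def allocate(block: Block, req: RequirementModel) -> None:
        """Allocate a requirement model to a block by connecting its monitored/
        controlled variables to the block's input/output ports."""
        missing = set(req.monitored + req.controlled) \
            - set(block.in_ports + block.out_ports)
        if missing:
            raise ValueError(f"{req.req_id} references unmapped variables: {missing}")
        block.requirements.append(req)

    # Port and variable names below are invented for this example.
    comp2 = Block("Component2", in_ports=["VAR5", "VAR6"], out_ports=["VAR7"])
    allocate(comp2, RequirementModel("REQ12", ["VAR5", "VAR6"], ["VAR7"]))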
- FIG. 4 depicts process 400 for component-level test case generation in accordance with embodiments. Process 400 is based on a received, step 405, automatically created software architecture model in the form of a hierarchical data flow diagram with requirement models allocated. One, or more, of the software components in the software design is selected, step 410, based on the level of test generation, and the corresponding software architecture model block/operator is used for test case generation. An intermediate test model is built, step 415, based on the selected component by attaching the test constraints and test objectives to the corresponding software architecture model block/operator. The test objectives are attached to satisfy certain coverage criteria at the requirements level, such as requirements coverage (e.g., all requirements are activated), logic condition coverage (e.g., logic conditions in the requirements are activated at a certain level), etc. The test constraints are attached to the model to constrain the range of the monitored/controlled variables and to ensure that the generated test cases do not violate the requirements.
- The automatic test case generation strategies (i.e., to attach the test objectives and the constraints) can be based on the general form of a requirement. In natural structural English language, the form of a requirement can be expressed as:
- <antecedent expression> implies <consequent expression>,
- where <antecedent expression> is a logic expression on monitored variables;
- and <consequent expression> is a logic expression on controlled variables.
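- As a minimal sketch of this form, encoded with the Z3 SMT solver's Python API (the variables and the example requirement are hypothetical, not from the patent):

    from z3 import Bool, Real, And, Implies

    # Monitored (input) variables -- invented for this example.
    engine_on = Bool("engine_on")
    temp = Real("temp")
    # Controlled (output) variable.
    cooling_cmd = Bool("cooling_cmd")

    antecedent = And(engine_on, temp > 90)  # logic expression on monitored variables
    consequent = cooling_cmd                # logic expression on controlled variables
    requirement = Implies(antecedent, consequent)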
- FIG. 5 depicts requirement model 500 (when expressed in Simulink) in accordance with embodiments. The requirement model includes antecedent expression 510 and consequent expression 520. These expressions are provided to logic block 530, which implies output signal 540 based on the expressions' states (i.e., A==>B). In order for the requirement to hold, the output signal must be 1 or "true". The automatic test case generator unit 140 receives such a model and, from it, generates test cases according to one or more strategies (e.g., requirements coverage strategy, logic condition coverage strategy, input masking strategy, data completeness analysis strategy, event strategy, list strategy, decomposition and equation strategy, equivalence class strategy, boundary value analysis strategy, robustness strategy, timing strategy, etc.). Embodiments are not so limited, and other strategies for test case generation are within the contemplation of this disclosure.
- A requirements coverage strategy includes, for each requirement, generating one test case in which the requirement must be satisfied with the antecedent expression being true. This is done by inserting test objectives and constraints and running a test generation engine that can drive the input sequences to achieve the test objectives.
- By way of example, the insertion of a test objective can be done using test objective and test condition blocks from a commercial design verifier block library in the selected model-based development tool (e.g., Simulink Design Verifier blocks). The test generation engine can be used to drive the inputs to achieve the test objectives.
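- A sketch of the requirements coverage strategy over the hypothetical requirement above, with an SMT solver standing in for the commercial test generation engine (illustrative only, not the patented mechanism):

    from z3 import Bool, Real, And, Implies, Solver, sat

    engine_on, cooling_cmd = Bool("engine_on"), Bool("cooling_cmd")
    temp = Real("temp")
    antecedent = And(engine_on, temp > 90)

    s = Solver()
    s.add(Implies(antecedent, cooling_cmd))  # test condition: requirement holds (output true)
    s.add(antecedent)                        # test objective: activate the requirement
    if s.check() == sat:
        print(s.model())  # one assignment of inputs and expected outputs = one test case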
- FIG. 6 depicts requirement model 600 with attached test objectives in accordance with embodiments. Test Objective block 610 (notated with an "O" within a circle) is analyzed by the design verifier to find a combination of variable value assignments VAR21, VAR22 that cause its antecedent expression, which is of Boolean type, to be true. Test Condition block 620 (notated with a "C" within a circle) causes the design verifier to keep the output of the "implies" block true. A true output signal of the "implies" block is an indication that the requirement REQ27 is satisfied. The value assignments to the monitored and controlled variables are generated automatically by the design verifier.
-
TABLE 1

  Boolean       Test Case (T = True, F = False)
  operator      TT      TF      FT      FF
  a AND b       ✓       ✓       ✓       x
  a OR b        x       ✓       ✓       ✓
  a NAND b      ✓       ✓       ✓       x
  a NOR b       x       ✓       ✓       ✓
  a XOR b       ✓       ✓       ✓       ✓
  a XNOR b      ✓       ✓       ✓       ✓
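- The rows of Table 1 can be read as a map from Boolean operator to the input combinations that each require a test case. A minimal sketch of that lookup (the encoding and function name are invented for illustration):

    # (a, b) combinations requiring a test case per Table 1 (True/False = T/F).
    LCC_CASES = {
        "AND":  [(True, True), (True, False), (False, True)],
        "OR":   [(True, False), (False, True), (False, False)],
        "NAND": [(True, True), (True, False), (False, True)],
        "NOR":  [(True, False), (False, True), (False, False)],
        "XOR":  [(True, True), (True, False), (False, True), (False, False)],
        "XNOR": [(True, True), (True, False), (False, True), (False, False)],
    }

    def lcc_objectives(operator: str, a_name: str, b_name: str):
        """Yield one test objective (an input assignment) per required combination."""
        for a, b in LCC_CASES[operator]:
            yield {a_name: a, b_name: b}

    for objective in lcc_objectives("AND", "VAR21", "VAR22"):
        print(objective)   # {'VAR21': True, 'VAR22': True}, ...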
- FIG. 7 depicts logic condition coverage model 700 with attached test objectives in accordance with embodiments. This LCC model is based on requirement model 600 with a pattern of additional test objective and condition blocks 710, 720, 730. In order to generate the test cases, the test objective blocks are attached according to the Boolean operators used and the respective cases in Table 1. After running the test generation engine, a set of test cases is generated to satisfy logic condition coverage. Each generated test case is also examined in order to discover which requirements it "activates"; activation means that the output signal of Satisfiability port 740 must be 1 or "true".
-
FIGS. 8A-8B depicts input maskingmodel strategy 800 with attached test objectives in accordance with embodiments. Each sub-expression depicted ininput masking model 800 corresponds to a signal/block path that starts at an input condition of the antecedent expression, involving a monitored variable VAR23, VAR24, VAR25, and ends at the signal that represents the result of the monitored expression. From this model, test cases can be automatically generated, associated to requirements REQ28 by automatic testcase generator unit 140 and translated to an output test script. - The input masking test generation strategy attaches test objectives according to the following steps:
- For each basic proposition (input condition) of the antecedent expression, obtain the set S of all sub-expressions which contain this proposition, except the proposition itself. Then, for each expression in set S:
  (1) if the top-level operation of the sub-expression is an OR gate, substitute this expression by its negation in S;
  (2) create an expression e which is the conjunction of all expressions in S and the basic proposition above; and
  (3) create a test objective which must make expression e true.
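- One way to read these steps in code (a sketch; it assumes S collects, for each operator on the path from the condition of interest up to the antecedent's root, that operator's other operands, negated when the operator is an OR gate so that they cannot mask the condition; the expression and names are invented):

    from z3 import Bool, And, Not, Solver, sat

    # Hypothetical antecedent: (VAR23 AND VAR24) OR VAR25
    v23, v24, v25 = Bool("VAR23"), Bool("VAR24"), Bool("VAR25")

    def masking_terms(path):
        """path: (operator, [sibling expressions]) pairs from the condition of
        interest up to the root. AND siblings must hold; OR siblings are negated."""
        terms = []
        for op, siblings in path:
            for sib in siblings:
                terms.append(Not(sib) if op == "OR" else sib)
        return terms

    # Expression e for VAR23: the condition itself plus its masking terms.
    e = And(v23, *masking_terms([("AND", [v24]), ("OR", [v25])]))

    s = Solver()
    s.add(e)              # test objective: make expression e true
    if s.check() == sat:
        print(s.model())  # e.g., VAR23=True, VAR24=True, VAR25=False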
- A data completeness analysis strategy can analyze one or more variables that appear in the requirement and select test objectives to test different points within the physical or functional range of the particular variables. The selection of test objectives can be based on, for example, the variable type. In one implementation, numeric variables could be tested on their minimum, maximum, boundaries, etc.; and enumerated and Boolean variables can be tested on all possible values and/or states.
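- A sketch of such type-driven selection (the variable descriptor format is invented for this example):

    def completeness_points(var: dict):
        """Pick test points for one variable based on its declared type."""
        if var["type"] == "numeric":
            lo, hi = var["range"]
            return [lo, lo + 1, (lo + hi) / 2, hi - 1, hi]  # min, boundaries, mid, max
        if var["type"] in ("boolean", "enum"):
            return list(var["values"])                      # every value/state once
        raise ValueError(f"unsupported type: {var['type']}")

    print(completeness_points({"type": "numeric", "range": (0, 100)}))
    print(completeness_points({"type": "enum", "values": ("OFF", "STANDBY", "ON")}))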
- An event strategy can ensure that each event can be triggered at least once. This strategy can also ascertain that one event is not continuously triggered. The generated event test cases and procedures can trigger particular events and verify the outputs while other input conditions remain constant.
- A list strategy can analyze list variables and operators that appear in the requirement and select test objectives to test different properties of lists. For example, this strategy can determine whether list operations take place at different positions of the lists, and ensure that each list variable is tested at least at a minimum and a maximum list length.
- A decomposition and equation strategy can analyze functions and/or equations inside of the requirement. These functions and/or equations can be undefined in some implementations. Test objectives can be selected by analyzing the input and/or output parameters of these functions or equations, and error-prone points in the defined functions or equations.
- An equivalence class strategy, a boundary value analysis strategy, and a robustness strategy can each analyze inequalities in the requirement and select test objectives based on the equivalence class partitions that can be induced by the inequalities. An equivalence class strategy can select one or more test objectives for each normal equivalence class; a boundary value analysis strategy can select one or more test objectives at the boundaries between every two equivalence classes; and a robustness strategy can select one or more test objectives for the abnormal equivalence classes.
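- For example, for a hypothetical requirement inequality x > 50 over a physical range of 0 to 100 (values invented for illustration), the three strategies might select:

    PHYSICAL_RANGE = (0, 100)
    # Classes induced by "x > 50": normal classes x <= 50 and x > 50.
    equivalence_points = [25, 75]       # one representative per normal class
    boundary_points    = [49, 50, 51]   # around the boundary between the classes
    robustness_points  = [-1, 101]      # abnormal classes outside the physical range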
- A timing strategy can analyze one or more timing operators in the requirement and select test objectives to test different points in the time span, such as, for example, a leading trigger and/or a lagging trigger. Events can be taken into consideration so that events are not always triggered in the time span.
- With reference again to FIG. 4, after attachment of the test constraints and test objectives, process 400 continues with test case generator unit 140 generating test cases 136, step 420. The test case generator unit can perform model-checking, theorem proving, constraint solving, and reachability resolution methods on the intermediate test model to generate the test cases so as to satisfy the test objectives and/or detect unreachable test objectives.
- The model-checking and theorem proving methods can each utilize formal methods tools that are respectively based on model-checking or theorem proving techniques, which check the satisfaction of the negation of the test objectives against the requirements. If the negation is not satisfied, a counterexample can be generated, which can be used to generate test cases; if it is satisfied, the particular test objective is unreachable.
- The constraint-solving method can use constraint solvers and/or optimization tools to solve the constraints in the test objective to find a feasible solution as a test case. If the constraints are identified as infeasible, the corresponding test objective is unreachable.
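- A sketch of the constraint-solving path, with an off-the-shelf SMT solver standing in for the constraint solvers and optimization tools (the ranges are invented for illustration):

    from z3 import Real, And, Solver, sat

    x = Real("x")
    s = Solver()
    s.add(And(x >= 0, x <= 100))   # test constraints: allowed variable range
    s.add(x > 200)                 # test objective under consideration
    if s.check() == sat:
        print("test case:", s.model())
    else:
        print("test objective unreachable")   # constraints are infeasible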
- The reachability resolution method can model a set of requirements as a hybrid model which combines discrete transitions and dynamics (if requirements are stateful). The model can then be analyzed to find a feasible path from initial conditions to reach the test objective, where the dynamics are approximated and/or analytically solved during the path finding. If a feasible path is identified, it can be used to generate test cases; if no feasible paths can reach the test objective, the test objective is identified as unreachable.
- With reference again to FIG. 4, the generated test cases are translated, step 425, into test scripts for test execution and into test review artifacts for certification. The advantage of the component-level test generation method is that it can flexibly and automatically generate requirements-based test cases for components at different levels in the software architecture to achieve appropriate requirements-level coverage criteria. In accordance with embodiments, test cases can be generated that are applicable to both unit/module-level testing and integration-level testing.
- FIG. 9 depicts process 900 for test script generation and test artifact review in accordance with embodiments. Process 900 operates on the component-level test cases described above. An intermediate format is generated, step 905, from the test cases; this intermediate format can be readable by humans and/or machines. The intermediate format can indicate the input and expected output information. It can also indicate the requirements to which the test case traces back, the test objective that the test case is satisfying, and the reference from which the test objective is derived. The intermediate information can be used to manually, or automatically, conduct test reviews. Certification artifacts are generated, step 910, from the intermediate information. The intermediate information can be used to generate, step 915, executable test scripts suitable to execute in different test environments. The test scripts can also automatically be written back, step 920, to requirements and test management tools (e.g., IBM® Rational® DOORS®).
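- A hypothetical intermediate-format record (the field names are invented, not the patent's schema), capturing inputs, expected outputs, and traceability in one human- and machine-readable entry:

    test_case = {
        "id": "TC_component2_001",
        "traces_to": ["REQ28"],            # requirement(s) the case traces back to
        "objective": "input masking: VAR23 has independent effect",
        "objective_source": "MC/DC coverage criterion",
        "inputs":   {"VAR23": True, "VAR24": True, "VAR25": False},
        "expected": {"VAR26": True},       # expected (controlled) output, name invented
    }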
- Collectively, FIGS. 10-12 depict an illustration of an end-to-end implementation in accordance with embodiments. FIG. 10 depicts exemplary software architecture design model 1000 and the associated requirements traceability information in accordance with embodiments. The software architecture model can be constructed (FIG. 2, step 205) as a Simulink model (FIGS. 3A-3D). Each block in the software architecture design is converted to a block in the software architecture model in Simulink with the same interface and architectural information. Each block in the software architecture model is also allocated a set of requirements models based on the requirements traceability information of FIG. 10. For example, in FIG. 3D, four requirements models (1010) are allocated to component2 based on the information in FIG. 10. Similarly, block 1020 indicates the requirements traceability information for component1; block 1030 indicates the requirements traceability information for component3; and block 1040 indicates the requirements traceability information for component4. The software architecture model depicted in FIGS. 3A-3D is then used to generate requirements-based test cases at different levels in the software architecture.
- A user can select the "component2" block (FIG. 4, step 410) to generate test cases at this unit level and select the input masking test strategy. In accordance with embodiments, test objectives and constraints blocks will automatically be attached to all the requirements models inside the "component2" block at step 415. After calling Simulink Design Verifier at step 420 and translating test cases at step 425, the test cases that satisfy all the test objectives and constraints for the input masking test strategy will be generated.
- FIG. 11 depicts exemplary unit level test script 1100 in accordance with embodiments. This unit level test script is an example of generated test cases at the unit level for "component2." The test case is generated to be executable in the SCADE test environment on the "component2" block in the design. A user can alternatively select an integration-level block that includes components 1-4 at FIG. 4, step 410 to generate integration-level test cases. In accordance with embodiments, test objectives and constraints blocks are automatically attached to all the requirements models inside the integration-level block at step 415. After calling Simulink Design Verifier at step 420 and translating test cases at step 425, the test cases that satisfy all the test objectives and constraints for the input masking test strategy will be generated.
- FIG. 12 depicts exemplary integration level test script 1200 in accordance with embodiments. This test script is one example of the generated integration level test cases. The test case is generated to be executable in the SCADE test environment on the integration-level block in the design.
- In accordance with embodiments, a hierarchical data flow diagram (i.e., a software architecture model along with requirement models) is automatically created to capture requirements and design information. This hierarchical data flow diagram is used to generate requirements-based test cases at different levels in the software architecture. In accordance with embodiments, system design information is used to build the hierarchical data flow diagram, where requirements models are allocated inside modules of the hierarchical data flow diagram. The requirements allocations are based on the requirements-module traceability information from the design information. Test objectives and constraints can be attached to the software architecture model according to a user-selected test strategy. Automatic test case generation is based on the hierarchical data flow diagram to generate requirements-based test cases at different levels in the design architecture that satisfy the test objectives and constraints. The generated test cases can be directly executed on components at multi-levels in the design.
- In accordance with some embodiments, a computer program application stored in non-volatile memory or computer-readable medium (e.g., register memory, processor cache, RAM, ROM, hard drive, flash memory, CD ROM, magnetic media, etc.) may include code or executable instructions that when executed may instruct and/or cause a controller or processor to perform methods discussed herein, such as for automated requirements-based test case generation, as described above.
- The computer-readable medium may be a non-transitory computer-readable medium, including all forms and types of memory and all computer-readable media except for a transitory, propagating signal. In one implementation, the non-volatile memory or computer-readable medium may be external memory.
- Although specific hardware and methods have been described herein, note that any number of other configurations may be provided in accordance with embodiments of the invention. Thus, while there have been shown, described, and pointed out fundamental novel features of the invention, it will be understood that various omissions, substitutions, and changes in the form and details of the illustrated embodiments, and in their operation, may be made by those skilled in the art without departing from the spirit and scope of the invention. Substitutions of elements from one embodiment to another are also fully intended and contemplated. The invention is defined solely with regard to the claims appended hereto, and equivalents of the recitations therein.
Claims (20)
1. A method for automated requirements-based test case generation, the method comprising:
constructing a software architecture model in a model based development tool, the software architecture model automatically derived from architectural information of a software design model;
allocating requirement models into different components of a software architecture model;
a test case generator unit generating component level requirements-based test cases from one or more levels of the software architecture model; and
the generated requirements-based test cases configured to be executable at different levels in the software architecture.
2. The method of claim 1, including allocating the requirement models by connecting corresponding monitored or controlled variables with at least one of an input port and an output port of respective ones of the different modules.
3. The method of claim 1, including the test case generator unit generating integration level test cases, and applying the integration level test cases to verify if a code module complies with the allocated requirements.
4. The method of claim 1, including:
receiving the software architecture model in the form of a hierarchical data flow diagram derived from the software design along with the allocated requirement models, the hierarchical data flow diagram including one or more blocks/operators mapping to corresponding components in the software design;
selecting one of the software components from the software design for test case generation; and
building an intermediate test model based on the selected component by automatically attaching at least one of test objectives and test constraints to the corresponding software architecture model block/operator.
5. The method of claim 4, including selecting the software component based on a level of test generation.
6. The method of claim 1, including generating the test cases according to at least one strategy selected from the list of a requirements coverage strategy, a logic condition coverage strategy, an input masking strategy, a data completeness analysis strategy, an event strategy, a list strategy, a decomposition and equation strategy, an equivalence class strategy, a boundary value analysis strategy, a robustness strategy, and a timing strategy.
7. The method of claim 4, including:
generating requirements-based test cases by performing at least one of model-checking, theorem proving, constraint solving, and reachability resolution methods on the intermediate test model; and
translating the generated test cases into test scripts for test execution, and into test artifacts for review.
8. A non-transitory computer-readable medium having stored thereon instructions which when executed by a processor cause the processor to perform a method for automated requirements-based test case generation, the method comprising:
constructing a software architecture model, the software architecture model automatically derived from architectural information of a software design model;
allocating requirement models into different blocks/operators of a software architecture model;
generating component level requirements-based test cases from one or more levels of the software architecture model; and
the generated requirements-based test cases configured to be executable at different levels in the software architecture.
9. The non-transitory computer-readable medium of claim 8, including instructions to cause the processor to allocate the requirement models by connecting corresponding monitored or controlled variables with an input port or an output port of respective ones of the different blocks/operators.
10. The non-transitory computer-readable medium of claim 8, including instructions to cause the processor to generate integration-level test cases, and apply the integration-level test cases to verify whether a code module complies with the allocated requirements.
11. The non-transitory computer-readable medium of claim 8, including instructions to cause the processor to:
receive the software architecture model in the form of a hierarchical data flow diagram derived from the software design along with the allocated requirement models, the hierarchical data flow diagram including one or more blocks/operators mapping to corresponding components in the software design;
select one of the software components from the software design for test case generation; and
build an intermediate test model based on the selected component by automatically attaching at least one test objective and at least one test constraint to the corresponding software architecture model block/operator.
12. The non-transitory computer-readable medium of claim 10, including instructions to cause the processor to generate the test cases according to at least one strategy selected from the list consisting of a requirements coverage strategy, a logic condition coverage strategy, an input masking strategy, a data completeness analysis strategy, an event strategy, a list strategy, a decomposition and equation strategy, an equivalence class strategy, a boundary value analysis strategy, a robustness strategy, and a timing strategy.
13. The non-transitory computer-readable medium of claim 11, including instructions to cause the processor to:
generate requirements-based test cases by performing at least one of model checking, theorem proving, constraint solving, and reachability resolution on the intermediate test model; and
translate the generated test cases into test scripts for test execution, and into test artifacts for review.
14. A system for automated requirements-based test case generation, the system comprising:
a model-based development tool including a control processor configured to execute instructions, the control processor connected to a communication link; and
a component-level test case generator unit configured to automatically generate test cases.
15. The system of claim 14, including the control processor configured to execute instructions that cause the control processor to perform the steps of:
deriving a software architecture model from a software design;
allocating requirement models into different blocks/operators of the software architecture model; and
generating component-level requirements-based test cases.
16. The system of claim 15, including the control processor configured to execute instructions that cause the control processor to generate integration-level test cases, and apply the integration-level test cases to verify whether a code module complies with the software architecture model and the allocated requirement models.
17. The system of claim 15, including the control processor configured to execute instructions that cause the control processor to:
receive the software architecture model in the form of a hierarchical data flow diagram derived from the software design along with the allocated requirement models, the hierarchical data flow diagram including one or more blocks/operators mapping to corresponding components in the software design;
select one of the software components from the software design for test case generation; and
build an intermediate test model based on the selected component by automatically attaching at least one test objective and at least one test constraint to the corresponding software architecture model block/operator.
18. The method of claim 6, the input masking strategy including masking Modified Condition/Decision Coverage (MC/DC), which allows more than one input of an input condition to change in an independent pair.
19. The non-transitory computer-readable medium of claim 12, the input masking strategy including masking Modified Condition/Decision Coverage (MC/DC), which allows more than one input of an input condition to change in an independent pair.
20. The system of claim 15, the generating of component-level requirements-based test cases including an input masking strategy that allows more than one input of an input condition to change in an independent pair.
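To make the method of claims 1, 2, and 4 concrete, the following is a minimal sketch of one possible realization: components of a software architecture model expose input and output ports, requirement models are allocated by connecting their monitored and controlled variables to those ports, and an intermediate test model is built by attaching test objectives and test constraints to a selected component. Every class and function name here is a hypothetical illustration, not taken from the patent or any particular tool.

```python
# Hypothetical sketch of claims 1, 2, and 4; none of these names come
# from the patent or a real model-based development tool.
from dataclasses import dataclass, field

@dataclass
class RequirementModel:
    rid: str
    monitored: list[str]    # variables the requirement observes
    controlled: list[str]   # variables the requirement constrains

@dataclass
class Component:
    """A block/operator in the hierarchical data flow diagram."""
    name: str
    inputs: list[str]
    outputs: list[str]
    requirements: list[RequirementModel] = field(default_factory=list)

def allocate(req: RequirementModel, comp: Component) -> None:
    """Allocate a requirement model to a component by connecting its
    monitored/controlled variables to the component's ports (claim 2)."""
    if set(req.monitored) <= set(comp.inputs) and set(req.controlled) <= set(comp.outputs):
        comp.requirements.append(req)
    else:
        raise ValueError(f"{req.rid} does not match the ports of {comp.name}")

@dataclass
class IntermediateTestModel:
    """A selected component plus attached test objectives and constraints (claim 4)."""
    component: Component
    objectives: list[str]   # conditions a generated test case must reach
    constraints: list[str]  # conditions every generated test case must respect

def build_test_model(comp: Component) -> IntermediateTestModel:
    # One objective per allocated requirement; input-port ranges as constraints.
    objectives = [f"cover({r.rid})" for r in comp.requirements]
    constraints = [f"in_range({p})" for p in comp.inputs]
    return IntermediateTestModel(comp, objectives, constraints)

comp = Component("speed_monitor", inputs=["speed"], outputs=["overspeed"])
allocate(RequirementModel("REQ_042", ["speed"], ["overspeed"]), comp)
print(build_test_model(comp).objectives)   # ['cover(REQ_042)']
```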
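Claims 6 and 12 enumerate the selectable test generation strategies. As an illustration of how two of them could drive input selection, the sketch below implements boundary value analysis and an equivalence class strategy for a bounded integer input; the generator bodies and the dispatch table are simple textbook stand-ins, not the patented algorithms.

```python
# Illustrative strategy dispatch for claims 6/12; the generators are
# textbook versions, not the patent's implementations.

def boundary_values(lo: int, hi: int) -> list[int]:
    """Boundary value analysis: at, just inside, and just outside each bound."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

def equivalence_classes(lo: int, hi: int) -> list[int]:
    """Equivalence class strategy: one representative per class
    (below range, in range, above range)."""
    return [lo - 10, (lo + hi) // 2, hi + 10]

STRATEGIES = {
    "boundary_value_analysis": boundary_values,
    "equivalence_class": equivalence_classes,
}

def generate_inputs(strategy: str, lo: int, hi: int) -> list[int]:
    return STRATEGIES[strategy](lo, hi)

print(generate_inputs("boundary_value_analysis", 0, 100))
# [-1, 0, 1, 99, 100, 101]
```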
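Claims 7 and 13 end with translating generated test cases into test scripts for execution and test artifacts for review. The sketch below shows one plausible rendering of an abstract test case (an input vector plus expected outputs, traced to a requirement) into a pytest-style script and a plain-text review line; the TestCase shape and both output formats are assumptions, not formats defined by the patent.

```python
# Hypothetical translation step for claims 7/13: abstract test case to an
# executable script and a review artifact. Formats are invented examples.
from dataclasses import dataclass

@dataclass
class TestCase:
    rid: str                   # requirement this case traces to
    inputs: dict[str, float]
    expected: dict[str, float]

def to_pytest(tc: TestCase, func_name: str) -> str:
    """Render an executable pytest-style test script."""
    args = ", ".join(f"{k}={v}" for k, v in tc.inputs.items())
    checks = "\n".join(f"    assert out['{k}'] == {v}" for k, v in tc.expected.items())
    return f"def test_{tc.rid.lower()}():\n    out = {func_name}({args})\n{checks}\n"

def to_review_artifact(tc: TestCase) -> str:
    """Render a human-readable line for requirements-based review."""
    return f"{tc.rid}: given {tc.inputs}, expect {tc.expected}"

tc = TestCase("REQ_042", {"speed": 250.0}, {"overspeed": 1.0})
print(to_pytest(tc, "speed_monitor"))
print(to_review_artifact(tc))
```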
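Claims 18 through 20 turn on the difference between unique-cause and masking MC/DC: in a masking independent pair, conditions other than the target may also change between the two test vectors, provided their changes are masked out of the decision. The sketch below applies one common operational criterion for this (the target condition changes, the decision changes, and flipping the target alone flips the decision at both vectors); the decision expression and the test pair are illustrative, not taken from the patent.

```python
# Masking MC/DC illustration for claims 18-20. The decision and the
# independence check are textbook examples, not the patent's method.
from typing import Callable, Sequence

def decision(a: bool, b: bool, c: bool, d: bool) -> bool:
    return (a or b) and (c or d)

def is_masking_pair(f: Callable[..., bool],
                    t1: Sequence[bool], t2: Sequence[bool], i: int) -> bool:
    """True when (t1, t2) shows condition i independently affects f:
    i changes, the outcome changes, and flipping i alone flips the
    outcome at both vectors, so other changed inputs are masked."""
    flip = lambda t: [*t[:i], not t[i], *t[i + 1:]]
    return (t1[i] != t2[i]
            and f(*t1) != f(*t2)
            and f(*flip(t1)) != f(*t1)
            and f(*flip(t2)) != f(*t2))

# Unique-cause MC/DC would reject this pair for condition a because c and
# d change too; masking MC/DC accepts it, since (c or d) stays True and
# the outcome change is attributable to a alone.
t1 = (True, False, True, False)    # decision(t1) -> True
t2 = (False, False, False, True)   # decision(t2) -> False
print(is_masking_pair(decision, t1, t2, 0))  # True
```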
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/916,660 US20180196739A1 (en) | 2015-11-20 | 2018-03-09 | System and method for safety-critical software automated requirements-based test case generation |
CA3035176A CA3035176A1 (en) | 2015-11-20 | 2019-02-28 | System and method for safety-critical software automated requirements-based test case generation |
EP19160803.3A EP3572945A1 (en) | 2018-03-09 | 2019-03-05 | System and method for safety-critical software automated requirements-based test case generation |
CN201910175783.XA CN110245067B (en) | 2015-11-20 | 2019-03-08 | System and method for automatically generating test case based on requirement of safety key software |
BR102019004568-0A BR102019004568A2 (en) | 2015-11-20 | 2019-03-08 | METHOD FOR AUTOMATIC GENERATION OF REQUIRED BASED TEST CASES, NON-TRANSITIONAL COMPUTER LEGIBLE MEANS AND SYSTEM FOR AUTOMATIC REQUIRED TEST CASES |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/947,633 US9940222B2 (en) | 2015-11-20 | 2015-11-20 | System and method for safety-critical software automated requirements-based test case generation |
US15/916,660 US20180196739A1 (en) | 2015-11-20 | 2018-03-09 | System and method for safety-critical software automated requirements-based test case generation |
Related Parent Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/947,633 Continuation US9940222B2 (en) | 2015-11-20 | 2015-11-20 | System and method for safety-critical software automated requirements-based test case generation |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180196739A1 (en) | 2018-07-12 |
Family
ID=58699700
Family Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/947,633 Active US9940222B2 (en) | 2015-11-20 | 2015-11-20 | System and method for safety-critical software automated requirements-based test case generation |
US15/916,660 Abandoned US20180196739A1 (en) | 2015-11-20 | 2018-03-09 | System and method for safety-critical software automated requirements-based test case generation |
Family Applications Before (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/947,633 Active US9940222B2 (en) | 2015-11-20 | 2015-11-20 | System and method for safety-critical software automated requirements-based test case generation |
Country Status (7)
Country | Link |
---|---|
US (2) | US9940222B2 (en) |
JP (1) | JP6307140B2 (en) |
CN (3) | CN114996115A (en) |
BR (2) | BR102016026988A2 (en) |
CA (2) | CA2948250C (en) |
FR (1) | FR3044126B1 (en) |
GB (1) | GB2545800A (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10705800B2 (en) * | 2017-11-30 | 2020-07-07 | The Mathworks, Inc. | Systems and methods for evaluating compliance of implementation code with a software architecture specification |
EP3722942A1 (en) * | 2019-04-10 | 2020-10-14 | The Boeing Company | Automatic generation of integration tests from unit tests |
US10915422B2 (en) | 2017-12-13 | 2021-02-09 | The Mathworks, Inc. | Automatic setting of multitasking configurations for a code-checking system |
US11200069B1 (en) | 2020-08-21 | 2021-12-14 | Honeywell International Inc. | Systems and methods for generating a software application |
US20220269593A1 (en) * | 2021-02-24 | 2022-08-25 | The Boeing Company | Automatic generation of integrated test procedures using system test procedures |
Families Citing this family (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9940222B2 (en) * | 2015-11-20 | 2018-04-10 | General Electric Company | System and method for safety-critical software automated requirements-based test case generation |
US10025696B2 (en) | 2016-02-09 | 2018-07-17 | General Electric Company | System and method for equivalence class analysis-based automated requirements-based test case generation |
US20180165180A1 (en) * | 2016-12-14 | 2018-06-14 | Bank Of America Corporation | Batch File Creation Service |
JP6556786B2 (en) | 2017-05-17 | 2019-08-07 | 矢崎総業株式会社 | Terminal |
CN107729226A (en) * | 2017-07-13 | 2018-02-23 | 中科院合肥技术创新工程院 | Automatic generating test case system and method based on Business Stream |
CN107729256A (en) * | 2017-11-16 | 2018-02-23 | 郑州云海信息技术有限公司 | The correlating method and system of a kind of project demands and test case |
EP3572945A1 (en) * | 2018-03-09 | 2019-11-27 | General Electric Company | System and method for safety-critical software automated requirements-based test case generation |
CN108595318B (en) * | 2018-03-31 | 2021-05-14 | 西安电子科技大学 | Difference test method for digital certificate verification module in RFC-guided SSL/TLS implementation |
CN108427554B (en) * | 2018-05-14 | 2023-09-08 | 华南理工大学 | Automatic construction method and system for cloud mode software driven by table |
WO2020015837A1 (en) * | 2018-07-20 | 2020-01-23 | Siemens Aktiengesellschaft | Method and arrangement for providing, checking and optimizing an automation program |
US10585779B2 (en) | 2018-07-30 | 2020-03-10 | General Electric Company | Systems and methods of requirements chaining and applications thereof |
CN109344059B (en) * | 2018-09-20 | 2023-04-11 | 天津龙拳风暴科技有限公司 | Server pressure testing method and device |
CN109783354A (en) * | 2018-12-14 | 2019-05-21 | 深圳壹账通智能科技有限公司 | Function test method, terminal device and the medium of application system |
US11138089B2 (en) | 2018-12-19 | 2021-10-05 | International Business Machines Corporation | Performance benchmark generation |
CN109815147B (en) * | 2019-01-21 | 2022-06-24 | 深圳乐信软件技术有限公司 | Test case generation method, device, server and medium |
CN110389898A (en) * | 2019-06-19 | 2019-10-29 | 深圳壹账通智能科技有限公司 | Acquisition methods, device, terminal and the computer readable storage medium of Test Strategy |
US12056917B2 (en) | 2019-10-02 | 2024-08-06 | Intelligent Dynamics, Llc | Distributed management and control in autonomous conveyances |
US11620454B2 (en) * | 2020-02-05 | 2023-04-04 | Hatha Systems, LLC | System and method for determining and representing a lineage of business terms and associated business rules within a software application |
CN111539099A (en) * | 2020-04-17 | 2020-08-14 | 北京航空航天大学 | Simulink model verification method based on program variation |
CN111679941A (en) * | 2020-05-31 | 2020-09-18 | 西南电子技术研究所(中国电子科技集团公司第十研究所) | Method for automatically identifying instrument model mapping instrument instruction to realize instrument interchange |
US11144436B1 (en) | 2020-10-19 | 2021-10-12 | Bank Of America Corporation | System for testing an application with dynamically linked security tests |
CN112231164B (en) * | 2020-12-11 | 2021-08-27 | 鹏城实验室 | Processor verification method, device and readable storage medium |
CN113010152B (en) * | 2021-03-24 | 2024-09-20 | 中广核工程有限公司 | Nuclear power plant security level software design system and method |
CN112905486B (en) * | 2021-03-26 | 2022-07-08 | 建信金融科技有限责任公司 | Service integration test method, device and system |
CN113282498B (en) * | 2021-05-31 | 2024-04-05 | 深圳赛安特技术服务有限公司 | Method, device, equipment and storage medium for generating test cases |
CN115525532A (en) * | 2021-06-25 | 2022-12-27 | 华为云计算技术有限公司 | Test case selection method and related device |
CN114036041B (en) * | 2021-10-21 | 2024-09-17 | 南京航空航天大学 | Automatic generation method for SCADE model combination verification environment hypothesis based on machine learning |
CN117555812B (en) * | 2024-01-11 | 2024-05-17 | 北京捷科智诚科技有限公司 | Cloud platform automatic testing method and system |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070038977A1 (en) * | 2005-08-10 | 2007-02-15 | Capital One Financial Corporation | Software development tool using a structured format to generate software code |
Family Cites Families (45)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5390325A (en) | 1992-12-23 | 1995-02-14 | Taligent, Inc. | Automated testing system |
US7437304B2 (en) * | 1999-11-22 | 2008-10-14 | International Business Machines Corporation | System and method for project preparing a procurement and accounts payable system |
US6681383B1 (en) | 2000-04-04 | 2004-01-20 | Sosy, Inc. | Automatic software production system |
US7272752B2 (en) | 2001-09-05 | 2007-09-18 | International Business Machines Corporation | Method and system for integrating test coverage measurements with model based test generation |
CA2393043A1 (en) | 2002-07-11 | 2004-01-11 | Luiz Marcelo Aucelio Paternostro | Formal test case definitions |
US20050043913A1 (en) | 2003-08-19 | 2005-02-24 | Rex Hyde | Method of determining the level of structural coverage testing of test cases which are written for a program that does not provide for structural coverage testing |
US7478365B2 (en) | 2004-01-13 | 2009-01-13 | Symphony Services Corp. | Method and system for rule-based generation of automation test scripts from abstract test case representation |
US7392509B2 (en) | 2004-04-13 | 2008-06-24 | University Of Maryland | Method for domain specific test design automation |
JP2006024006A (en) | 2004-07-08 | 2006-01-26 | Denso Corp | Test case generation device, test case generation program, model base development program, device and program for diagnosing validity of source code generation, and method for developing model base |
US7865339B2 (en) | 2004-07-12 | 2011-01-04 | Sri International | Formal methods for test case generation |
US7979849B2 (en) | 2004-10-15 | 2011-07-12 | Cisco Technology, Inc. | Automatic model-based testing |
US8392873B2 (en) | 2005-01-26 | 2013-03-05 | Tti Inventions C Llc | Methods and apparatus for implementing model-based software solution development and integrated change management |
KR101222860B1 (en) | 2005-09-01 | 2013-01-16 | 삼성전자주식회사 | Optical pickup device |
US7716254B2 (en) | 2005-09-12 | 2010-05-11 | Infosys Technologies Ltd. | System for modeling architecture for business systems and methods thereof |
US7853906B2 (en) | 2006-03-22 | 2010-12-14 | Nec Laboratories America, Inc. | Accelerating high-level bounded model checking |
US20080056210A1 (en) | 2006-06-14 | 2008-03-06 | Toshiba America Research, Inc. | Moving Networks Information Server |
DE102006050112A1 (en) | 2006-10-25 | 2008-04-30 | Dspace Digital Signal Processing And Control Engineering Gmbh | Requirement description e.g. test specification, creating method for embedded system i.e. motor vehicle control device, involves automatically representing modules, and assigning to classes in particular unified modeling language classes |
US7644334B2 (en) | 2006-11-27 | 2010-01-05 | Honeywell International, Inc. | Requirements-based test generation |
US8041554B1 (en) | 2007-06-06 | 2011-10-18 | Rockwell Collins, Inc. | Method and system for the development of high-assurance microcode |
US9189757B2 (en) * | 2007-08-23 | 2015-11-17 | International Business Machines Corporation | Monitoring and maintaining balance of factory quality attributes within a software factory environment |
US8307342B2 (en) | 2008-05-14 | 2012-11-06 | Honeywell International Inc. | Method, apparatus, and system for automatic test generation from statecharts |
US8423879B2 (en) | 2008-05-14 | 2013-04-16 | Honeywell International Inc. | Method and apparatus for test generation from hybrid diagrams with combined data flow and statechart notation |
EP2286339B1 (en) | 2008-05-19 | 2018-09-26 | Johnson Controls Technology Company | Method of automatically formulating test cases for verifying at least one part of a piece of software |
JP2009294846A (en) * | 2008-06-04 | 2009-12-17 | Denso Corp | Test case generator, and test case generation program and method |
US8260479B2 (en) | 2008-12-09 | 2012-09-04 | Honeywell International Inc. | Modular software architecture for an unmanned aerial vehicle |
US20100192128A1 (en) | 2009-01-27 | 2010-07-29 | Honeywell International Inc. | System and methods of using test points and signal overrides in requirements-based test generation |
US20110083121A1 (en) | 2009-10-02 | 2011-04-07 | Gm Global Technology Operations, Inc. | Method and System for Automatic Test-Case Generation for Distributed Embedded Systems |
US9298598B2 (en) * | 2010-03-22 | 2016-03-29 | Red Hat, Inc. | Automated visual testing |
US8849626B1 (en) | 2010-06-23 | 2014-09-30 | Iowa State University Research Foundation, Inc. | Semantic translation of stateflow diagrams into input/output extended finite automata and automated test generation for simulink/stateflow diagrams |
WO2012049816A1 (en) | 2010-10-14 | 2012-04-19 | 日本電気株式会社 | Model checking device, method, and program |
CN102136047A (en) | 2011-02-25 | 2011-07-27 | 天津大学 | Software trustworthiness engineering method based on formalized and unified software model |
JP5814603B2 (en) * | 2011-04-21 | 2015-11-17 | 株式会社東芝 | Test specification creation support apparatus, method and program |
US8645924B2 (en) | 2011-06-06 | 2014-02-04 | Fujitsu Limited | Lossless path reduction for efficient symbolic execution and automatic test generation |
US8893087B2 (en) | 2011-08-08 | 2014-11-18 | Ca, Inc. | Automating functionality test cases |
US9063673B2 (en) * | 2011-08-30 | 2015-06-23 | Uniquesoft, Llc | System and method for implementing application code from application requirements |
US9360853B2 (en) | 2011-09-19 | 2016-06-07 | Dspace Gmbh | Exchange of files and meta-information between system design tools and behavior modeling tools and/or simulators for the creation of ECU software |
CN102693134B (en) | 2012-05-25 | 2014-11-19 | 南京邮电大学 | Sensor network software modeling platform development method based on unified modeling language |
US9971676B2 (en) | 2012-08-30 | 2018-05-15 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methods for state based test case generation for software validation |
KR101408870B1 (en) | 2012-11-06 | 2014-06-17 | 대구교육대학교산학협력단 | Apparatus and method for multi level tast case generation based on multiple condition control flow graph from unified modeling language sequence diagram |
US9747079B2 (en) | 2014-12-15 | 2017-08-29 | General Electric Company | Method and system of software specification modeling |
CN104991863B (en) * | 2015-07-14 | 2017-11-03 | 株洲南车时代电气股份有限公司 | A kind of method that test case is automatically generated based on FBD test model |
CN104978275B (en) * | 2015-07-16 | 2017-09-29 | 北京航空航天大学 | A kind of target verification and evidence model extracting method towards DO 178C software test procedures |
CN105068927A (en) * | 2015-08-04 | 2015-11-18 | 株洲南车时代电气股份有限公司 | Keyword drive-based automatic test method of urban rail drive control units |
US9940222B2 (en) * | 2015-11-20 | 2018-04-10 | General Electric Company | System and method for safety-critical software automated requirements-based test case generation |
CN107727411B (en) * | 2017-10-30 | 2019-09-27 | 青岛慧拓智能机器有限公司 | A kind of automatic driving vehicle assessment scene generation system and method |
- 2015
  - 2015-11-20 US US14/947,633 patent/US9940222B2/en active Active
- 2016
  - 2016-11-09 JP JP2016218486A patent/JP6307140B2/en not_active Expired - Fee Related
  - 2016-11-14 CA CA2948250A patent/CA2948250C/en not_active Expired - Fee Related
  - 2016-11-16 GB GB1619371.6A patent/GB2545800A/en not_active Withdrawn
  - 2016-11-17 FR FR1661158A patent/FR3044126B1/en active Active
  - 2016-11-18 BR BR102016026988-1A patent/BR102016026988A2/en not_active Application Discontinuation
  - 2016-11-18 CN CN202210276124.7A patent/CN114996115A/en active Pending
  - 2016-11-18 CN CN201611015902.8A patent/CN107066375B/en active Active
- 2018
  - 2018-03-09 US US15/916,660 patent/US20180196739A1/en not_active Abandoned
- 2019
  - 2019-02-28 CA CA3035176A patent/CA3035176A1/en not_active Abandoned
  - 2019-03-08 CN CN201910175783.XA patent/CN110245067B/en active Active
  - 2019-03-08 BR BR102019004568-0A patent/BR102019004568A2/en not_active IP Right Cessation
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070038977A1 (en) * | 2005-08-10 | 2007-02-15 | Capital One Financial Corporation | Software development tool using a structured format to generate software code |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10705800B2 (en) * | 2017-11-30 | 2020-07-07 | The Mathworks, Inc. | Systems and methods for evaluating compliance of implementation code with a software architecture specification |
US10915422B2 (en) | 2017-12-13 | 2021-02-09 | The Mathworks, Inc. | Automatic setting of multitasking configurations for a code-checking system |
EP3722942A1 (en) * | 2019-04-10 | 2020-10-14 | The Boeing Company | Automatic generation of integration tests from unit tests |
CN111813649A (en) * | 2019-04-10 | 2020-10-23 | 波音公司 | Automatically generating integration tests from unit tests |
US11055207B2 (en) | 2019-04-10 | 2021-07-06 | The Boeing Company | Automatic generation of integration tests from unit tests |
US11200069B1 (en) | 2020-08-21 | 2021-12-14 | Honeywell International Inc. | Systems and methods for generating a software application |
US20220269593A1 (en) * | 2021-02-24 | 2022-08-25 | The Boeing Company | Automatic generation of integrated test procedures using system test procedures |
EP4050489A1 (en) * | 2021-02-24 | 2022-08-31 | The Boeing Company | Automatic generation of integrated test procedures using system test procedures |
US11960385B2 (en) * | 2021-02-24 | 2024-04-16 | The Boeing Company | Automatic generation of integrated test procedures using system test procedures |
Also Published As
Publication number | Publication date |
---|---|
GB2545800A (en) | 2017-06-28 |
CN107066375A (en) | 2017-08-18 |
JP2017097862A (en) | 2017-06-01 |
BR102019004568A2 (en) | 2019-10-01 |
CN110245067B (en) | 2023-09-22 |
CA2948250A1 (en) | 2017-05-20 |
CN114996115A (en) | 2022-09-02 |
CA3035176A1 (en) | 2019-09-09 |
JP6307140B2 (en) | 2018-04-04 |
CN107066375B (en) | 2022-03-01 |
FR3044126A1 (en) | 2017-05-26 |
CN110245067A (en) | 2019-09-17 |
FR3044126B1 (en) | 2022-01-14 |
US9940222B2 (en) | 2018-04-10 |
BR102016026988A2 (en) | 2017-07-18 |
US20170147482A1 (en) | 2017-05-25 |
CA2948250C (en) | 2020-05-26 |
Similar Documents
Publication | Title | Publication Date |
---|---|---|
US20180196739A1 (en) | System and method for safety-critical software automated requirements-based test case generation | |
US10437713B2 (en) | System and method for equivalence class analysis-based automated requirements-based test case generation | |
US9792204B2 (en) | System and method for coverage-based automated test case augmentation for design models | |
EP2381367A1 (en) | Method and apparatus for the performing unit testing of software modules in software systems | |
US20100192128A1 (en) | System and methods of using test points and signal overrides in requirements-based test generation | |
US20160266952A1 (en) | Automated Qualification of a Safety Critical System | |
JP2024507532A (en) | Systems and methods for determining program code defects and acceptability of use | |
EP3572945A1 (en) | System and method for safety-critical software automated requirements-based test case generation | |
US20160063162A1 (en) | System and method using pass/fail test results to prioritize electronic design verification review | |
US10769332B2 (en) | Automatic simulation failures analysis flow for functional verification | |
US10394688B2 (en) | Method for detecting computer module testability problems | |
Falah et al. | Taxonomy dimensions of complexity metrics | |
US11847393B2 (en) | Computing device and method for developing a system model utilizing a simulation assessment module | |
US20240241816A1 (en) | Automated test generation | |
Mwambe et al. | Selection and application of software testing techniques to specific conditions of software projects | |
Sharma et al. | Designing control logic for cockpit display systems using model-based design | |
Mor | Software Testing Techniques for Faults/Errors Detection | |
Trout | Testing safety-critical systems using Model-based Systems Engineering (MBSE) | |
Nanda et al. | Quantitative metrics for improving software performance for an integrated tool platform | |
US10223486B2 (en) | Static modelling of an electronic device | |
Baar | Towards measuring the abstractness of state machines based on mutation testing | |
Poonam | Software testing strategies and methodologies | |
Ackermann et al. | Integrating Functional and Non-Functional Design Verification for Embedded Software Systems | |
Selyunin et al. | Applying High-Level Synthesis for Synthesizing Hardware Runtime STL Monitors of Mission-Critical Properties | |
Anderson et al. | Dependability of Component Based Systems (Dagstuhl Seminar 02451) |
Legal Events
Code | Title | Description
---|---|---
AS | Assignment | Owner name: GENERAL ELECTRIC COMPANY, NEW YORK. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LI, MENG;DURLING, MICHAEL RICHARD;SIU, KIT YAN;AND OTHERS;SIGNING DATES FROM 20180306 TO 20180308;REEL/FRAME:045158/0477
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER
STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION