EP2238540A1 - Selective code instrumentation for software verification - Google Patents
Selective code instrumentation for software verification
Info
- Publication number
- EP2238540A1 (application EP08751038A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- code
- computer code
- verification analysis
- computer
- dynamic
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3664—Environments for testing or debugging software
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3672—Test management
- G06F11/3688—Test management for test execution, e.g. scheduling of test suites
Definitions
- Software products can potentially be very large and complex.
- Software testing is the process used to assess the quality of developed computer software. Quality may be judged based on a number of metrics, such as correctness, completeness, reliability, number of bugs found, efficiency, compatibility, etc.
- Static verification is an analysis performed without executing the software. Static verification of software code can prove, for example, which operations are free of run-time errors such as numeric overflows, divisions by zero, buffer overflows, or pointer issues, and identify where run-time errors will or might occur.
- Static verification may be used to classify the code into categories.
- The categories may include code determined to be good or safe or correct, code determined to have errors, code determined not to be accessible (e.g., "dead code" or "deactivated code"), and code for which a determination could not be made (e.g., "don't know"). Code classified as unknown or unproven may be selected for the additional analysis described below.
- Fig. 1 is an exemplary diagram of a system in which concepts described herein may be implemented;
- Fig. 2 is a diagram of an exemplary device corresponding to one of the workstations or servers shown in Fig. 1;
- Fig. 3 is a flow chart illustrating exemplary operations that may be performed by the verification tool shown in Fig. 1;
- Fig. 4 is a diagram illustrating an exemplary graphical interface in which static verification results may be presented to a user;
- Figs. 5A and 5B are diagrams illustrating exemplary code before and after instrumentation;
- Fig. 6 is a diagram illustrating an exemplary graphical interface in which dynamic verification results may be presented to a user; and
- Fig. 7 is a diagram conceptually illustrating components of a verification tool in an exemplary implementation in which the verification tool is implemented using a client-server model.
- Implementations described herein relate to an automated software testing tool in which the results of a static verification analysis technique are used to select portions of the code for additional analysis.
- the selected portions may include the code determined by the static verification analysis to have an unknown or unproven error condition.
- These selected portions may be modified (instrumented) for dynamic analysis and then executed in conjunction with input test values to perform a dynamic verification analysis of the code.
- the results of the dynamic analysis can provide additional verification information about the code that was determined to be unproven by the static verification testing.
- Static verification analysis generally refers to an analysis of computer code that is performed without executing the code. For example, static code analysis may examine code using abstract interpretation techniques to verify all possible execution paths of a program.
- Dynamic verification analysis refers to verification of software performed by or during the execution of the software. Dynamic verification may involve, for example, executing the software with a set of test input values.
- Fig. 1 is an exemplary diagram of a system 100 in which concepts described herein may be implemented.
- the system may include one or more personal computers or workstations 110, one or more servers 120, and a network 130.
- software verification tool 105 may be executed by one or more of servers 120 and workstations 110 to assist in software verification.
- Workstations 110 may include computing devices, such as desktop or laptop computers, that may be used for general computing tasks.
- users of workstations 110 may be software developers.
- the users may use verification tool 105 to assist in verifying their developed software code.
- verification tool 105 may include client-side components and server-side components.
- the client-side components may be executed at the user's workstation 110 while the server-side components may execute at one or more of servers 120.
- verification tool 105 may execute exclusively at the user's workstation 110.
- workstations 110 may execute a technical computing environment (TCE) that presents a user with an interface that enables efficient analysis and generation of technical applications.
- the TCE may provide a numerical and/or symbolic computing environment that allows for matrix manipulation, plotting of functions and data, implementation of algorithms, creation of user interfaces, and/or interfacing with programs in other languages.
- the TCE may be textual and/or graphical.
- Servers 120 may each include a device, such as a computer or another type of computation or communication device, a thread or process running on one of these devices, and/or an object executable by one of these devices.
- Server device 120 may generally provide services to other devices (e.g., workstations 110) connected to network 130.
- one or more of server devices 120 may include server components of software verification tool 105.
- Fig. 2 is a diagram of an exemplary device 200 corresponding to one of workstations 110 or servers 120.
- device 200 may include a bus 210, a processing unit 220, a main memory 230, a read-only memory (ROM) 240, a storage device 250, an input device 260, an output device 270, and/or a communication interface 280.
- Bus 210 may include a path that permits communication among the components of workstation 110.
- Processing unit 220 may include a processor, microprocessor, or other types of processing logic that may interpret and execute instructions.
- Main memory 230 may include a random access memory (RAM) or another type of dynamic storage device that may store information and instructions for execution by processing unit 220.
- ROM 240 may include a ROM device or another type of static storage device that may store static information and/or instructions for use by processing unit 220.
- Storage device 250 may include a magnetic and/or optical recording medium and its corresponding drive.
- Input device 260 may include a mechanism that permits an operator to input information to workstation 110, such as a keyboard, a mouse, a pen, a microphone, a touch-sensitive display, voice recognition and/or biometric mechanisms, etc.
- Output device 270 may include a mechanism that outputs information to the operator, including a display, a printer, a speaker, etc.
- Communication interface 280 may include any transceiver-like mechanism that enables workstation 110 to communicate with other devices and/or systems.
- communication interface 280 may include mechanisms for communicating with another device or system via a network.
- workstation 110 may perform certain operations in response to processing unit 220 executing software instructions contained in a computer-readable medium, such as main memory 230.
- a computer-readable medium may be defined as a physical or logical memory device.
- the software instructions may be read into main memory 230 from another computer-readable medium, such as storage device 250, or from another device via communication interface 280.
- the software instructions contained in main memory 230 may cause processing unit 220 to perform processes that will be described later.
- hardwired circuitry may be used in place of or in combination with software instructions to implement processes described herein. Thus, implementations described herein are not limited to any specific combination of hardware circuitry and software.
- device 200 may contain fewer, different, or additional components than depicted in Fig. 2.
- one or more components of device 200 may perform one or more tasks performed by one or more other components of device 200.
- verification tool 105 may be used to measure the quality of developed computer software. Verification tool 105 may perform both static and dynamic software verification. The dynamic software verification may be selectively implemented so that only portions of the code determined to be unproven by the static verification will be instrumented for dynamic analysis.
- verification tool 105 may be used in the context of a technical computing environment.
- a "technical computing environment,” as the term is used herein, is to be broadly interpreted to include any hardware and/or software based logic that provides a computing environment that allows users to perform tasks related to disciplines, such as, but not limited to, mathematics, science, engineering, medicine, business, etc., more efficiently than if the tasks were performed in another type of computing environment, such as an environment that required the user to develop code in a conventional programming language, such as C++, C, Ada, Java, Javascript, Perl, Ruby, Fortran, Pascal, etc.
- a technical computing environment may additionally provide mathematical functions and/or graphical tools or blocks (e.g., for creating plots, surfaces, images, volumetric representations, etc.).
- a technical computing environment is not limited to performing disciplines discussed above, and may be used to perform computations relating to fields of endeavor not typically associated with mathematics, science or business.
- Verification tool 105 may operate as a component in a technical computing environment to verify code created with the technical computing environment. For example, the technical computing environment may give the user an option to create graphical models. The technical computing environment may then compile the created graphical model for execution on a target system. Verification tool 105 may be used to verify the code that embodies the graphical model. Alternatively, verification tool 105 may be used to verify generated code or the graphical model itself.
- Although verification tool 105 may be used in the context of a technical computing environment, verification tool 105 may be used with any software development project.
- For example, verification tool 105 may analyze code written in a conventional programming language, such as C++, C, or Ada, produced manually by a developer with or without the use of a technical computing environment.
- Fig. 3 is a flow chart illustrating exemplary operations that may be performed by verification tool 105.
- Verification tool 105 may initially receive the software code that is to be verified (block 305). For example, a user at one of workstations 110 may use verification tool 105 to select one or more files that contain the software code that is to be verified.
- the software code may be textual source code, or code describing a graphical model created using a technical computing environment, or an intermediate representation of the textual and/or graphical source code.
- the user may additionally provide verification tool 105 with input parameters for the software verification (block 310).
- the input parameters may relate to, for example, options or parameters applicable to a static verification analysis performed by verification tool 105.
- These static verification options may include, for example, the names of the files or procedures that are to be analyzed, the specific static verification algorithms to use, and/or the options relating to the static verification algorithms that are to be used.
- Verification tool 105 may perform a static verification analysis to generate initial classifications for the code (block 315).
- static verification analysis may involve analysis of the software code without execution of the code.
- the static verification may involve analysis of the code to construct a model of the code (i.e., an abstract representation of the code).
- the model can be used for matching commonly occurring error patterns to the code.
- the model may also be used to perform some kind of data-flow analysis of the code to infer the possible values that variables might have at certain points in the program.
- Data-flow analysis can be used for vulnerability checking.
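- As an illustrative sketch only (the interval representation, names, and verdict categories below are assumptions, not taken from this patent), a value-range data-flow analysis might flag a division whose divisor interval contains zero as a possible, rather than certain, run-time error:

```c
#include <stdio.h>

/* Hypothetical interval abstraction: the set of values a variable may take. */
typedef struct { int lo, hi; } Interval;

typedef enum { PROVED_SAFE, POSSIBLE_ERROR, DEFINITE_ERROR } Verdict;

/* Classify a division "x / d" given the inferred interval of the divisor d. */
static Verdict check_division(Interval d)
{
    if (d.lo == 0 && d.hi == 0)
        return DEFINITE_ERROR;              /* divisor is always zero      */
    if (d.lo <= 0 && 0 <= d.hi)
        return POSSIBLE_ERROR;              /* zero is one possible value  */
    return PROVED_SAFE;                     /* zero cannot occur           */
}

int main(void)
{
    Interval k = { -3, 5 };   /* e.g., inferred for "k" at the division point */
    static const char *names[] = { "proved safe", "possible error", "definite error" };
    printf("x / k : %s\n", names[check_division(k)]);   /* prints "possible error" */
    return 0;
}
```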
- Static verification analysis may be used to prove which operations are free of run-time errors or to find possible errors. Errors that may be found include: overflows and underflows; divisions by zero and other arithmetic errors; out-of-bounds array access; illegally dereferenced pointers; read-only access to non-initialized data; dangerous type conversions; dead code; access to null pointers; dynamic errors related to object programming and inheritance; errors related to exception handling; and non-initialized class members in the C++ language.
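- For illustration, the hypothetical C fragment below (not code from the patent) deliberately packs several of the error classes listed above into one function; each commented line contains a defect that a static verification analysis could report:

```c
#include <limits.h>

void error_catalogue(int n, int *p)
{
    int buf[4];
    int uninit;

    int overflow = INT_MAX + 1;     /* signed overflow                       */
    int quotient = 100 / n;         /* division by zero if n can be 0        */
    buf[4]       = quotient;        /* out-of-bounds write (valid: 0..3)     */
    int deref    = *p;              /* illegal dereference if p is NULL      */
    int use      = uninit + deref;  /* read of non-initialized data          */
    (void)overflow; (void)use;
}
```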
- verification tool 105 may classify the code into classifications that relate to possible errors in the code.
- Verification tool 105 may output the verification results from the static analysis (block 320). For example, the results may be saved to a file, shown to a user on a display, stored in a database, etc.
- The classification may include classifying each possible failure point in the source code into classes that define: (1) code that has no errors, (2) code that might have errors (unknown or unproven conditions), (3) code that definitely has errors, or (4) code that cannot be reached.
- the classifications may be presented to the user in a number of possible ways, such as by changing the appearance of the code (e.g., font type, font size, font color, etc.) based on its classification.
- the code may be presented using color codes.
- the code may be shown to the user as GREEN code (code that has no errors), RED (code that definitely has errors), GRAY (code that cannot be reached), or ORANGE (unknown or unproven error conditions). This color-identifier assignment is for illustration purposes only, and in alternative embodiments, other colors or other visual or non-visual schemes for identifying different types of verified code may be used, as determined by one of skill in the art.
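- As a hedged example (the function and its assumed precondition are invented for illustration), the fragment below shows how individual statements might fall into the four classes; the comments give the classification a static verifier could assign:

```c
void classify_example(int x)   /* suppose prior analysis established x >= 0 */
{
    int a[8];

    a[0] = 1;                  /* GREEN:  index 0 is always in bounds          */
    a[8] = x;                  /* RED:    index 8 is always out of bounds      */
    a[x] = 1;                  /* ORANGE: in bounds only if x <= 7 -- unproven */

    if (x < 0)
        a[0] = -1;             /* GRAY:   unreachable because x >= 0           */
}
```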
- Fig. 4 is a diagram illustrating an exemplary graphical interface 400 in which verification results may be presented to a user, such as the initial verification results (Fig. 3, block 320).
- Graphical interface 400 may be presented to a user, for example, by verification tool 105 on a display of one of workstations 110.
- Graphical interface 400 may include an entities list section 405 and an indicator list section 410.
- Indicator list section 410 may include a number of columns (numbered 411-415 in Fig. 4), where each column provides information about the corresponding entity shown in entities list section 405.
- Entities list section 405 may display the names of the files and underlying functions that have been analyzed. Each file and function may be visually distinguished based on its error classification.
- the text of each file may be color coded (e.g., RED, ORANGE, GREEN, or GRAY) to indicate its error classification.
- the "color" codes are shown using underlined, italic, and bold fonts, in which RED code is shown in italic, GREEN code is shown underlined, and ORANGE code is shown in bold.
- Each file and underlying function may be colorized according to the most critical error found.
- the file “polyspace main.c,” labeled as file 430, is GREEN, indicating no errors were found.
- the file "example.c," labeled as file 431, is RED, indicating errors were found.
- the user has chosen to drill-down into “example.c” by selecting the "+” icon, which shows the functions 432 contained in “example.c".
- Functions 432 may be similarly color-coded. For example, the function "close to zero" is shown as GREEN code, the function "pointer arithmetic" is shown as RED code, and the function "recursion" is shown as ORANGE code.
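- One way to realize "colorized according to the most critical error found" is a simple severity rollup over a file's or function's individual checks; the C sketch below is illustrative only, and the relative ordering of the categories is an assumption:

```c
/* An assumed ordering: a larger value means "more critical". */
typedef enum { CODE_GREEN = 0, CODE_GRAY = 1, CODE_ORANGE = 2, CODE_RED = 3 } Color;

/* A file or function takes the color of its most critical check. */
static Color rollup(const Color *checks, int n)
{
    Color worst = CODE_GREEN;
    for (int i = 0; i < n; i++)
        if (checks[i] > worst)
            worst = checks[i];
    return worst;
}
```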
- indicator list section 410 may include a column 411 that displays the software reliability of that code, where 100% indicates complete reliability of the code for the code category and 0% means no reliability.
- Column 412 may indicate the number of RED code segments, column 413 may indicate the number of GRAY segments, column 414 may indicate the number of ORANGE segments, and column 415 may indicate the number of GREEN segments.
- ORANGE unproven code tends to be particularly troubling for developers. Because ORANGE code represents an unknown error state, ORANGE code may need to be manually verified by the software developer. For a large project, even if only five or ten percent of the code is classified as ORANGE, the manual effort required to review such code can be extensive.
- verification tool 105 may automatically instrument ORANGE code for further dynamic verification analysis.
- the dynamic verification analysis can provide further clarification on whether code initially classified as ORANGE code needs to be manually reviewed.
- "Instrumenting code” may generally refer to modifying the code to embed test or verification statements in the code. These statements may be used to monitor or catch conditions that result in errors or other problems or properties of the code.
- verification tool 105 may instrument the ORANGE code for dynamic testing (block 322).
- Figs. 5A and 5B are diagrams illustrating exemplary code before and after instrumentation.
- a section of code 500 may include two statements, a GREEN statement 501 and an ORANGE statement 502.
- Fig. 5B shows code 500 after an exemplary automated code instrumentation procedure.
- the ORANGE statement 502 is instrumented, while GREEN statement 501 has not been modified. Selective instrumentation of the code in this manner (e.g., only instrumenting ORANGE statements) may reduce the time required to test the code.
- Modified statement 502 may include an initial "if" statement 510 that tests whether the denominator (i.e., the variable "K") in Fig. 5A is equal to or close to zero. If it is not, the original statement is executed at statement 511. If the denominator is equal to or close to zero, however, various error or warning messages may be raised. In this example, the warning messages may vary based on flags that define the environment in which the code is executing. Whether an error is generated may depend on the values of the input test cases. That is, an error will be generated only when the input test cases cause the variable K to be zero when statement 502 is executed.
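- In C terms, the transformation described for Figs. 5A and 5B might resemble the sketch below; the variable names, the epsilon threshold, and the reporting helper are illustrative assumptions rather than the patent's actual instrumentation:

```c
#include <math.h>
#include <stdio.h>

#define EPS 1e-9f

/* Hypothetical reporting hook inserted by the instrumentation step. */
static void report_possible_error(const char *file, int line, const char *msg)
{
    fprintf(stderr, "%s:%d: %s\n", file, line, msg);
}

/* Before instrumentation (ORANGE: K might be zero):
 *     result = num / K;
 */
float divide_instrumented(float num, float K)
{
    float result = 0.0f;
    if (fabsf(K) < EPS) {                       /* inserted guard (statement 510) */
        report_possible_error(__FILE__, __LINE__,
                              "division by zero or near-zero denominator");
    } else {
        result = num / K;                       /* original statement (511)      */
    }
    return result;
}
```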
- the user may additionally provide verification tool 105 with input parameters for dynamic verification of the software (block 325).
- the input parameters may relate to, for example, options or parameters applicable to a dynamic verification analysis performed by verification tool 105.
- a particular type of input parameter may define ranges or values used for "test cases" for the software.
- the test cases may be applicable to software that receives external values during execution.
- the software may be designed to act on input received from users, external sensors, or external applications or computing devices.
- the variables needed to describe the external stimuli may be used to define the test cases.
- the user may define a range that is to be covered for each variable to define all of the test cases. For example, consider a software system in which two integer variables, X and Y, represent all of the external inputs received by the system. The designer may decide that a satisfactory test can be obtained when X is constrained to the range of zero to five and Y is constrained to the range of zero to two.
- test cases for this example system would be [(0,0), (0,1), (0,2), (1,0), (1,1), (1,2), (2,0), ..., (5,2)].
- verification tool 105 may randomly select 10 of these pairs of values, which may be used as the set of test cases that are used to test the software.
- Other techniques for specifying test cases may be used. For example, instead of having test cases automatically generated based on a range of variable values entered by the user, the user may specify specific test cases to be used, such as by manually entering the specific test cases.
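- A minimal sketch of the range-based scheme described above (using the X and Y ranges and the sample size of 10 from the example; the structure and helper names are assumptions) could enumerate the Cartesian product of the variable ranges and then draw a random subset:

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

typedef struct { int x, y; } TestCase;

int main(void)
{
    enum { X_MIN = 0, X_MAX = 5, Y_MIN = 0, Y_MAX = 2, SAMPLE = 10 };
    TestCase all[(X_MAX - X_MIN + 1) * (Y_MAX - Y_MIN + 1)];
    int total = 0;

    /* Enumerate the full Cartesian product: (0,0), (0,1), ..., (5,2). */
    for (int x = X_MIN; x <= X_MAX; x++)
        for (int y = Y_MIN; y <= Y_MAX; y++)
            all[total++] = (TestCase){ x, y };

    /* Shuffle and keep the first SAMPLE cases as the test set. */
    srand((unsigned)time(NULL));
    for (int i = total - 1; i > 0; i--) {
        int j = rand() % (i + 1);
        TestCase tmp = all[i]; all[i] = all[j]; all[j] = tmp;
    }

    for (int i = 0; i < SAMPLE && i < total; i++)
        printf("test case %d: (%d, %d)\n", i + 1, all[i].x, all[i].y);
    return 0;
}
```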
- Verification tool 105 may next perform the dynamic verification analysis (block 330).
- the dynamic verification analysis may be performed by compiling the code and then executing it multiple times, with each execution using a different test case from the set of defined test cases.
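- Conceptually, this execution phase can be pictured as the small driver below; the function under test, the test-case table, and the global error counter are assumptions made for illustration, whereas the actual tool would compile and run the real instrumented program:

```c
#include <stdio.h>

typedef struct { int x, y; } TestCase;

/* Counter that the inserted instrumentation would increment. */
static int runtime_errors_detected = 0;

/* Stand-in for the instrumented code under test. */
static void instrumented_program(TestCase tc)
{
    if (tc.y == 0) {
        runtime_errors_detected++;        /* guard caught a division by zero */
        return;
    }
    (void)(tc.x / tc.y);                  /* original computation            */
}

int main(void)
{
    const TestCase cases[] = { {0, 0}, {1, 2}, {3, 1}, {5, 0}, {4, 2} };
    const int n = (int)(sizeof cases / sizeof cases[0]);

    for (int i = 0; i < n; i++)           /* one execution per test case */
        instrumented_program(cases[i]);

    printf("tests run: %d, run-time errors detected: %d\n",
           n, runtime_errors_detected);
    return 0;
}
```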
- verification tool 105 may output additional verification results that provide additional verification information about the code that has been instrumented for dynamic verification analysis (block 335). The additional verification information can be used to reduce the number of "false positives" (i.e., uncertain output conditions) generated by the initial static verification analysis.
- Fig. 6 is a diagram illustrating an exemplary graphical interface 600 in which the additional verification results may be presented to a user.
- Graphical interface 600 may be presented to a user, for example, by verification tool 105 on a display of a workstation 110.
- Graphical interface 600 may include a variable list section 605, a test summary section 610, and an error log section 615.
- Variable list section 605 may contain information about each of the variables used to define the test cases. As shown, variable list section 605 may include a variable name field 620, a variable type field 625, and a variable values field 630.
- Each item in variable name field 620 may be the name of an input variable of the software, or an output parameter of an external function that is stubbed by verification tool 105.
- Variable type field 625 may list the type of each variable. For example, the variable returned from the function "random float" is a variable of type "float32.”
- Variable values field 630 may list the range of values, such as the minimum and maximum value that was assigned to the variable.
- Test summary section 610 may provide summary information or configuration information relating to the dynamic verification analysis. As shown in Fig. 6, the summary information may include a configuration section 640 and a results section 645. In the configuration section 640, the user may enter configuration information relating to the dynamic analysis.
- Results section 645 may display results relating to the dynamic verification analysis.
- the results may be dynamically updated as the dynamic verification analysis executes.
- the displayed results may include, for example, the number of completed tests, the number of tests in which no run-time error was detected in instrumented code sections, and the number of tests in which run-time errors were detected in instrumented code sections.
- Error log section 615 may provide a detailed list of all of the errors that occurred during the dynamic verification analysis. For example, as shown in Fig. 6, for each error, error log section 615 may display the name of the file in which the error occurred, the line number and column of the error, a description of the error, and the number of test cases in which the error occurred.
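- The record behind such an error log entry might resemble the C structure below; the field names are assumptions derived from the columns described above:

```c
typedef struct {
    const char *file;          /* file in which the error occurred        */
    int         line;          /* line number of the error                */
    int         column;        /* column of the error                     */
    const char *description;   /* e.g., "division by zero"                */
    int         occurrences;   /* number of test cases that hit the error */
} ErrorLogEntry;
```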
- a user may monitor and/or control the dynamic verification analysis performed on selected code. Based on the dynamic verification results, the user can increase their confidence in the reliability of the code. As the tests performed may not be exhaustive, they may not definitively prove that the code is run-time error free. But a set of performed tests without any run-time errors may increase the user's confidence in the code's reliability.
- verification tool 105 may provide results of the dynamic verification analysis in conjunction with the color codes generated by the static verification analysis. For example, one or more additional categories may also be created.
- a "DYNAMIC VERIFICATION GREEN" category may be created, which may be associated with a different color, such as a shade of green different than the green used to present the static verification analysis GREEN code.
- the categories for the dynamic verification and static verification may be presented using an interface similar to that shown in Fig. 4.
- Fig. 7 is a diagram conceptually illustrating components of verification tool 105 in an exemplary implementation in which verification tool 105 is implemented using a client-server model.
- verification tool 105 may include components 710 that execute on a server, such as one or more of servers 120, and components 715 that execute on a client, such as one of workstations 110. More particularly, server components 710 may include static code verification component 711 and results 712. Client components 715 may include source code 720, result viewer component 721, potential bug list 722, test case generator component 723, test cases 724, dynamic instrumentation component 725, instrumented code 726, executable component 727, and potential bug list 728. Client components 715 may be associated with, for example, one of workstations 110.
- a user may load or otherwise generate source code 720.
- the source code may be transmitted to server 120 via a network, such as network 130.
- Server 120 may function as a static code verification component for a number of users, such as for all of the software developers for a particular company.
- Static code verification component 711 may analyze the source code using static verification analysis techniques, such as using the techniques discussed previously with respect to block 315 (Fig. 3). The initial verification results may be stored as results 712. Results 712 may be transmitted back to client 110 for viewing and/or analysis.
- Client components 715 may particularly include result viewer 721.
- Result viewer 721 may present results 712 to the user.
- Result viewer 721 may, for example, contain a graphical user interface, such as the one shown in Fig. 4, that uses color-codes to illustrate the error status of different sections of code.
- Result viewer 721 may additionally output or store results 712 as potential bug list 722.
- Client components 715 may additionally include test case generator component 723 to generate test cases 724.
- Test case generator component 723 may, for example, present an interface such as variable list section 605 of graphical interface 600 (Fig. 6). Through this graphical interface, users may define, for instance, value ranges that are to be tested for each input variable of the software. Based on this information, test case generator component 723 may generate test cases 724.
- Client components 715 may additionally include dynamic instrumentation component 725 to instrument source code 720 based on results 712. As previously discussed, this instrumentation may be performed such that only code segments that were determined by the static verification analysis to have an unknown or unproven error condition (e.g., ORANGE code) are instrumented.
- Source code 720, after instrumentation by dynamic instrumentation component 725, is stored as instrumented code 726.
- Instrumented code 726 may be compiled and run in conjunction with test cases 724 as executable component 727 and may generate potential bug list 728 during the execution.
- Although Fig. 7 illustrates an implementation of verification tool 105 using a client-server model, in an alternative implementation, all or substantially all of the functionality of verification tool 105 may be implemented on a single computer, such as a single workstation 110.
- the dynamic verification analysis techniques may be automatically applied to selected portions of the code.
- The selected portions of the code may correspond to the sections of the code that the static verification analysis determined to be unproven.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Quality & Reliability (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Debugging And Monitoring (AREA)
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IB2008/001327 WO2009095741A1 (en) | 2008-02-01 | 2008-02-01 | Selective code instrumentation for software verification |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2238540A1 true EP2238540A1 (en) | 2010-10-13 |
Family
ID=39773027
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP08751038A Ceased EP2238540A1 (en) | 2008-02-01 | 2008-02-01 | Selective code instrumentation for software verification |
Country Status (2)
Country | Link |
---|---|
EP (1) | EP2238540A1 (en) |
WO (1) | WO2009095741A1 (en) |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8336030B1 (en) | 2009-09-11 | 2012-12-18 | The Mathworks, Inc. | System and method for coding standard testing |
US8566800B2 (en) | 2010-05-11 | 2013-10-22 | Ca, Inc. | Detection of method calls to streamline diagnosis of custom code through dynamic instrumentation |
US8782612B2 (en) | 2010-05-11 | 2014-07-15 | Ca, Inc. | Failsafe mechanism for dynamic instrumentation of software using callbacks |
US8473925B2 (en) | 2010-05-11 | 2013-06-25 | Ca, Inc. | Conditional dynamic instrumentation of software in a specified transaction context |
US8386504B1 (en) | 2010-07-06 | 2013-02-26 | The Mathworks, Inc. | System and method for file differencing with importance ranking |
US8938729B2 (en) | 2010-10-12 | 2015-01-20 | Ca, Inc. | Two pass automated application instrumentation |
US8752015B2 (en) | 2011-12-05 | 2014-06-10 | Ca, Inc. | Metadata merging in agent configuration files |
US9411616B2 (en) | 2011-12-09 | 2016-08-09 | Ca, Inc. | Classloader/instrumentation approach for invoking non-bound libraries |
US9971896B2 (en) | 2011-12-30 | 2018-05-15 | International Business Machines Corporation | Targeted security testing |
WO2014016649A1 (en) * | 2012-07-27 | 2014-01-30 | Freescale Semiconductor, Inc. | Method and apparatus for implementing instrumentation code |
EP2713277B1 (en) * | 2012-09-28 | 2021-11-17 | Accenture Global Services Limited | Latent defect identification |
IN2013MU03461A (en) | 2013-10-31 | 2015-07-17 | Tata Consultancy Services Ltd | |
US10002252B2 (en) | 2014-07-01 | 2018-06-19 | Fireeye, Inc. | Verification of trusted threat-aware microvisor |
US10108529B2 (en) | 2015-10-13 | 2018-10-23 | International Business Machines Corporation | Dynamic instrumentation based on detected errors |
US10423518B2 (en) | 2016-04-27 | 2019-09-24 | The Mathworks, Inc. | Systems and methods for analyzing violations of coding rules |
US10592678B1 (en) | 2016-09-09 | 2020-03-17 | Fireeye, Inc. | Secure communications between peers using a verified virtual trusted platform module |
US10025691B1 (en) | 2016-09-09 | 2018-07-17 | Fireeye, Inc. | Verification of complex software code using a modularized architecture |
DE102019211037A1 (en) * | 2019-07-25 | 2021-01-28 | Robert Bosch Gmbh | Method of testing a system |
KR102126960B1 (en) * | 2020-04-24 | 2020-06-25 | 한화시스템 주식회사 | Reliability testing integrated-platform of weapon system software for next-generation naval ship |
2008
- 2008-02-01 EP EP08751038A patent/EP2238540A1/en not_active Ceased
- 2008-02-01 WO PCT/IB2008/001327 patent/WO2009095741A1/en active Search and Examination
Non-Patent Citations (3)
Title |
---|
KOREL B ET AL: "Assertion-oriented automated test data generation", SOFTWARE ENGINEERING, 1996, PROCEEDINGS OF THE 18TH INTERNATIONAL CONFERENCE ON, BERLIN, GERMANY, 25-30 MARCH 1996, LOS ALAMITOS, CA, USA, IEEE COMPUT. SOC, US, 25 March 1996 (1996-03-25), pages 71 - 80, XP010158231, ISBN: 978-0-8186-7247-7, DOI: 10.1109/ICSE.1996.493403 * |
RAMAMOORTHY ET AL: "Design and construction of an automated software evaluation system", RECORD OF THE 1973 IEEE SYMPOSIUM ON COMPUTER SOFTWARE RELIABILITY, IEEE, US, 1 January 1973 (1973-01-01), pages 28 - 37, XP008118454 * |
See also references of WO2009095741A1 * |
Also Published As
Publication number | Publication date |
---|---|
WO2009095741A1 (en) | 2009-08-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2238540A1 (en) | Selective code instrumentation for software verification | |
US9535821B1 (en) | Displaying violated coding rules in source code | |
EP2718820B1 (en) | Identifying and triaging software bugs through backward propagation of under-approximated values and empiric techniques | |
Pradel et al. | A framework for the evaluation of specification miners based on finite state machines | |
Carrington et al. | A tale of two paradigms: Formal methods and software testing | |
US9665350B1 (en) | Automatic test generation for model-based code coverage | |
US8694958B1 (en) | Marking up objects in code generation | |
US8997065B2 (en) | Automatic modularization of source code | |
Schroeder et al. | Generating expected results for automated black-box testing | |
US9058427B2 (en) | Iterative generation of symbolic test drivers for object-oriented languages | |
Kadry | A new proposed technique to improve software regression testing cost | |
Chowdhury et al. | CyFuzz: A differential testing framework for cyber-physical systems development environments | |
Podelski et al. | Classifying bugs with interpolants | |
WO2000072145A1 (en) | Analyzing an extended finite state machine system model | |
Hayes | Testing of object-oriented programming systems (OOPS): A fault-based approach | |
WO2000072147A1 (en) | Analyzing an extended finite state machine system model | |
Jones et al. | A formal methods-based verification approach to medical device software analysis | |
US9442701B1 (en) | Verifying models for exceptional behavior | |
Alves et al. | Static estimation of test coverage | |
Perez | Dynamic code coverage with progressive detail levels | |
Arantes et al. | On proposing a test oracle generator based on static and dynamic source code analysis | |
Singh et al. | The review: Lifecycle of object-oriented software testing | |
Yousaf et al. | Efficient Identification of Race Condition Vulnerability in C code by Abstract Interpretation and Value Analysis | |
US11782682B2 (en) | Providing metric data for patterns usable in a modeling environment | |
Odeh | Software Source Code: Theoretical Analyzing and Practical Reviewing Model |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| 17P | Request for examination filed | Effective date: 20080331 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR |
| AX | Request for extension of the european patent | Extension state: AL BA MK RS |
| DAX | Request for extension of the european patent (deleted) | |
| 17Q | First examination report despatched | Effective date: 20121109 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
| APBK | Appeal reference recorded | Free format text: ORIGINAL CODE: EPIDOSNREFNE |
| APBN | Date of receipt of notice of appeal recorded | Free format text: ORIGINAL CODE: EPIDOSNNOA2E |
| APBR | Date of receipt of statement of grounds of appeal recorded | Free format text: ORIGINAL CODE: EPIDOSNNOA3E |
| APAF | Appeal reference modified | Free format text: ORIGINAL CODE: EPIDOSCREFNE |
| APBX | Invitation to file observations in appeal sent | Free format text: ORIGINAL CODE: EPIDOSNOBA2E |
| RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: THE MATHWORKS, INC. |
| APAF | Appeal reference modified | Free format text: ORIGINAL CODE: EPIDOSCREFNE |
| REG | Reference to a national code | Ref country code: DE; Ref legal event code: R003 |
| APBT | Appeal procedure closed | Free format text: ORIGINAL CODE: EPIDOSNNOA9E |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
| 18R | Application refused | Effective date: 20210326 |