WO2009095741A1 - Selective code instrumentation for software verification - Google Patents

Selective code instrumentation for software verification

Info

Publication number
WO2009095741A1
WO2009095741A1 (PCT/IB2008/001327)
Authority
WO
WIPO (PCT)
Prior art keywords
code
computer code
verification analysis
computer
dynamic
Prior art date
Application number
PCT/IB2008/001327
Other languages
French (fr)
Inventor
Patrick Munier
Haibin Yin
Cyril Preve
Original Assignee
The Mathworks, Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The Mathworks, Inc filed Critical The Mathworks, Inc
Priority to PCT/IB2008/001327 (WO2009095741A1)
Priority to EP08751038A (EP2238540A1)
Publication of WO2009095741A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 - Error detection; Error correction; Monitoring
    • G06F11/36 - Preventing errors by testing or debugging software
    • G06F11/3664 - Environments for testing or debugging software
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 - Error detection; Error correction; Monitoring
    • G06F11/36 - Preventing errors by testing or debugging software
    • G06F11/3668 - Software testing
    • G06F11/3672 - Test management
    • G06F11/3688 - Test management for test execution, e.g. scheduling of test suites

Definitions

  • Software products can potentially be very large and complex.
  • Software testing is the process used to assess the quality of developed computer software. Quality may be judged based on a number of metrics, such as correctness, completeness, reliability, number of bugs found, efficiency, compatibility, etc.
  • Static verification is an analysis performed without executing the software. Static verification of software code can prove, for example, which operations are free of run-time errors such as numeric overflows, divisions by zero, buffer overflows, or pointer issues, and identify where run-time errors will or might occur.
  • static verification is used to classify the code into categories.
  • the categories may include code determined to be good or safe or correct, code determined to have errors, code determined not to be accessible (e.g., "dead code" or "deactivated code"), and code for which a determination could not be made (e.g., "don't know"). Code classified as "don't know" represents code that the static verification system could not conclusively determine as including an error.
  • Fig. 1 is an exemplary diagram of a system in which concepts described herein may be implemented;
  • Fig. 2 is a diagram of an exemplary device corresponding to one of the workstations or servers shown in Fig. 1;
  • Fig. 3 is a flow chart illustrating exemplary operations that may be performed by the verification tool shown in Fig. 1;
  • Fig. 4 is a diagram illustrating an exemplary graphical interface in which static verification results may be presented to a user;
  • Figs. 5A and 5B are diagrams illustrating exemplary code before and after instrumentation;
  • Fig. 6 is a diagram illustrating an exemplary graphical interface in which dynamic verification results may be presented to a user; and
  • Fig. 7 is a diagram conceptually illustrating components of a verification tool in an exemplary implementation in which a verification tool is implemented using a client-server model.
  • Implementations described herein relate to an automated software testing tool in which the results of a static verification analysis technique are used to select portions of the code for additional analysis.
  • the selected portions may include the code determined by the static verification analysis to have an unknown or unproven error condition.
  • These selected portions may be modified (instrumented) for dynamic analysis and then executed in conjunction with input test values to perform a dynamic verification analysis of the code.
  • the results of the dynamic analysis can provide additional verification information about the code that was determined to be unproven by the static verification testing.
  • Static verification analysis generally refers to an analysis of computer code that is performed without executing the code. For example, static code analysis may examine code using abstract interpretation techniques to verify all possible execution paths of a program.
  • Dynamic verification analysis refers to verification of software performed by or during the execution of the software. Dynamic verification may involve, for example, executing the software with a set of test input values.
  • Fig. 1 is an exemplary diagram of a system 100 in which concepts described herein may be implemented.
  • the system may include one or more personal computers or workstations 110, one or more servers 120, and a network 130.
  • software verification tool 105 may be executed by one or more of servers 120 and workstations 110 to assist in software verification.
  • Workstations 110 may include computing devices, such as desktop or laptop computers, that may be used for general computing tasks.
  • users of workstations 110 may be software developers.
  • the users may use verification tool 105 to assist in verifying their developed software code.
  • verification tool 105 may include client-side components and server-side components.
  • the client-side components may be executed at the user's workstation 110 while the server-side components may execute at one or more of servers 120.
  • verification tool 105 may execute exclusively at the user's workstation 110.
  • workstations 110 may execute a technical computing environment (TCE) that presents a user with an interface that enables efficient analysis and generation of technical applications.
  • the TCE may provide a numerical and/or symbolic computing environment that allows for matrix manipulation, plotting of functions and data, implementation of algorithms, creation of user interfaces, and/or interfacing with programs in other languages.
  • the TCE may be textual and/or graphical.
  • Servers 120 may each include a device, such as a computer or another type of computation or communication device, a thread or process running on one of these devices, and/or an object executable by one of these devices.
  • Server device 120 may generally provide services to other devices (e.g., workstations 110) connected to network 130.
  • one or more of server devices 120 may include server components of software verification tool 105.
  • Fig. 2 is a diagram of an exemplary device 200 corresponding to one of workstations 110 or servers 120.
  • device 200 may include a bus 210, a processing unit 220, a main memory 230, a read-only memory (ROM) 240, a storage device 250, an input device 260, an output device 270, and/or a communication interface 280.
  • Bus 210 may include a path that permits communication among the components of workstation 110.
  • Processing unit 220 may include a processor, microprocessor, or other types of processing logic that may interpret and execute instructions.
  • Main memory 230 may include a random access memory (RAM) or another type of dynamic storage device that may store information and instructions for execution by processing unit 220.
  • ROM 240 may include a ROM device or another type of static storage device that may store static information and/or instructions for use by processing unit 220.
  • Storage device 250 may include a magnetic and/or optical recording medium and its corresponding drive.
  • Input device 260 may include a mechanism that permits an operator to input information to workstation 110, such as a keyboard, a mouse, a pen, a microphone, a touch-sensitive display, voice recognition and/or biometric mechanisms, etc.
  • Output device 270 may include a mechanism that outputs information to the operator, including a display, a printer, a speaker, etc.
  • Communication interface 280 may include any transceiver-like mechanism that enables workstation 110 to communicate with other devices and/or systems.
  • communication interface 280 may include mechanisms for communicating with another device or system via a network.
  • workstation 110 may perform certain operations in response to processing unit 220 executing software instructions contained in a computer-readable medium, such as main memory 230.
  • a computer-readable medium may be defined as a physical or logical memory device.
  • the software instructions may be read into main memory 230 from another computer-readable medium, such as storage device 250, or from another device via communication interface 280.
  • the software instructions contained in main memory 230 may cause processing unit 220 to perform processes that will be described later.
  • hardwired circuitry may be used in place of or in combination with software instructions to implement processes described herein. Thus, implementations described herein are not limited to any specific combination of hardware circuitry and software.
  • device 200 may contain fewer, different, or additional components than depicted in Fig. 2.
  • one or more components of device 200 may perform one or more tasks performed by one or more other components of device 200.
  • verification tool 105 may be used to measure the quality of developed computer software. Verification tool 105 may perform both static and dynamic software verification. The dynamic software verification may be selectively implemented so that only portions of the code determined to be unproven by the static verification will be instrumented for dynamic analysis.
  • verification tool 105 may be used in the context of a technical computing environment.
  • a "technical computing environment,” as the term is used herein, is to be broadly interpreted to include any hardware and/or software based logic that provides a computing environment that allows users to perform tasks related to disciplines, such as, but not limited to, mathematics, science, engineering, medicine, business, etc., more efficiently than if the tasks were performed in another type of computing environment, such as an environment that required the user to develop code in a conventional programming language, such as C++, C, Ada, Java, Javascript, Perl, Ruby, Fortran, Pascal, etc.
  • a technical computing environment may additionally provide mathematical functions and/or graphical tools or blocks (e.g., for creating plots, surfaces, images, volumetric representations, etc.).
  • a technical computing environment is not limited to performing disciplines discussed above, and may be used to perform computations relating to fields of endeavor not typically associated with mathematics, science or business.
  • Verification tool 105 may operate as a component in a technical computing environment to verify code created with the technical computing environment. For example, the technical computing environment may give the user an option to create graphical models. The technical computing environment may then compile the created graphical model for execution on a target system. Verification tool 105 may be used to verify the code that embodies the graphical model. Alternatively, verification tool 105 may be used to verify generated code or the graphical model itself.
  • although verification tool 105 may be used in the context of a technical computing environment, verification tool 105 may be used with any software development project.
  • verification tool 105 may analyze code written in a conventional programming language, such as C++, C, Ada, etc., and which is produced manually by a developer with or without the use of a technical computing environment.
  • Fig. 3 is a flow chart illustrating exemplary operations that may be performed by verification tool 105.
  • Verification tool 105 may initially receive the software code that is to be verified (block 305). For example, a user at one of workstations 110 may use verification tool 105 to select one or more files that contain the software code that is to be verified.
  • the software code may be textual source code, or code describing a graphical model created using a technical computing environment, or an intermediate representation of the textual and/or graphical source code.
  • the user may additionally provide verification tool 105 with input parameters for the software verification (block 310).
  • the input parameters may relate to, for example, options or parameters applicable to a static verification analysis performed by verification tool 105.
  • These static verification options may include, for example, the names of the files or procedures that are to be analyzed, the specific static verification algorithms to use, and/or the options relating to the static verification algorithms that are to be used.
  • Verification tool 105 may perform a static verification analysis to generate initial classifications for the code (block 315).
  • static verification analysis may involve analysis of the software code without execution of the code.
  • the static verification may involve analysis of the code to construct a model of the code (i.e., an abstract representation of the code).
  • the model can be used for matching commonly occurring error patterns to the code.
  • the model may also be used to perform some kind of data-flow analysis of the code to infer the possible values that variables might have at certain points in the program.
  • Data-flow analysis can be used for vulnerability checking.
  • Static verification analysis may be used to prove which operations are free of run-time errors or to find possible errors. Errors that may be found include: overflows and underflows; divisions by zero and other arithmetic errors; out-of-bounds array access; illegally dereferenced pointers; read-only access to non-initialized data; dangerous type conversions; dead code; access to null pointers; dynamic errors related to object programming and inheritance; errors related to exception handling; and non-initialized class members in the C++ language.
  • verification tool 105 may classify the code into classifications that relate to possible errors in the code.
  • Verification tool 105 may output the verification results from the static analysis (block 320). For example, the results may be saved to a file, shown to a user on a display, stored in a database, etc.
  • the classification may include classifying each possible failure point in the source code into classes that define: (1) code that has no errors, (2) code that might have errors (unknown or unproven conditions), (3) code that definitely has errors, or (4) code that cannot be reached.
  • the classifications may be presented to the user in a number of possible ways, such as by changing the appearance of the code (e.g., font type, font size, font color, etc.) based on its classification.
  • the code may be presented using color codes.
  • the code may be shown to the user as GREEN code (code that has no errors), RED (code that definitely has errors), GRAY (code that cannot be reached), or ORANGE (unknown or unproven error conditions). This color-identifier assignment is for illustration purposes only, and in alternative embodiments, other colors or other visual or non-visual schemes for identifying different types of verified code may be used, as determined by one of skill in the art.
  • FIG. 4 is a diagram illustrating an exemplary graphical interface 400 in which verification results may be presented to a user, such as the initial verification results (Fig. 3, block 320).
  • Graphical interface 400 may be presented to a user, for example, by verification tool 105 on a display of one of workstations 110.
  • Graphical interface 400 may include an entities list section 405 and an indicator list section 410.
  • Indicator list section 410 may include a number of columns (numbered 411-415 in Fig. 4), where each column provides information about the corresponding entity shown in section 405.
  • Entities list section 405 may display the names of the files and underlying functions that have been analyzed. Each file and function may be visually distinguished based on its error classification.
  • the text of each file may be color coded (e.g., RED, ORANGE, GREEN, or GRAY) to indicate its error classification.
  • the "color" codes are shown using underlined, italic, and bold fonts, in which RED code is shown in italic, GREEN code is shown underlined, and ORANGE code is shown in bold.
  • Each file and underlying function may be colorized according to the most critical error found.
  • the file “polyspace main.c,” labeled as file 430, is GREEN, indicating no errors were found.
  • the file “example.c,” labeled as file 431, is red, indicating errors were found.
  • the user has chosen to drill-down into “example.c” by selecting the "+” icon, which shows the functions 432 contained in “example.c".
  • Functions 432 may be similarly color-coded. For example, the function "close to zero” is shown as GREEN code, the function “pointer arithmetic” is shown as RED code, and the function “recursion” is shown as orange code.
  • indicator list section 410 may include a column 411 that displays the software reliability of that code, where 100% indicates complete reliability of the code for the code category and 0% means no reliability.
  • Column 412 may indicate the number of RED code segments
  • column 413 may indicate the number of GRAY segments
  • column 414 may indicate the number of ORANGE segments
  • column 415 may indicate the number of GREEN segments.
  • ORANGE unproven code tends to be particularly troubling for developers. Because ORANGE code represents an unknown error state, ORANGE code may need to be manually verified by the software developer. For a large project, even if only five or ten percent of the code is classified as ORANGE, the manual effort required to review such code can be extensive.
  • verification tool 105 may automatically instrument ORANGE code for further dynamic verification analysis.
  • the dynamic verification analysis can provide further clarification on whether code initially classified as ORANGE code needs to be manually reviewed.
  • "Instrumenting code” may generally refer to modifying the code to embed test or verification statements in the code. These statements may be used to monitor or catch conditions that result in errors or other problems or properties of the code.
  • verification tool 105 may instrument the ORANGE code for dynamic testing (block 322).
  • Figs. 5A and 5B are diagrams illustrating exemplary code before and after instrumentation.
  • a section of code 500 may include two statements, a GREEN statement 501 and an ORANGE statement 502.
  • Fig. 5B shows code 500 after an exemplary automated code instrumentation procedure.
  • the ORANGE statement 502 is instrumented, while GREEN statement 501 has not been modified. Selective instrumentation of the code in this manner (e.g., only instrumenting ORANGE statements) may reduce the time required to test the code.
  • modified statement 502 may include an initial "if" statement 510 that tests whether the denominator (i.e., the variable "K") in Fig. 5A is equal to or close to zero. If it is not, the original statement is executed at statement 511. If the denominator is equal to or close to zero, however, various error or warning messages may be raised. In this example, the warning messages may vary based on flags that define the environment in which the code is executing. Whether an error is generated may depend on the values of the input test cases. That is, an error will be generated only when the input test cases cause the variable K to be zero when statement 502 is executed.
  • the user may additionally provide verification tool 105 with input parameters for dynamic verification of the software (block 325).
  • the input parameters may relate to, for example, options or parameters applicable to a dynamic verification analysis performed by verification tool 105.
  • a particular type of input parameter may define ranges or values used for "test cases" for the software.
  • the test cases may be applicable to software that receives external values during execution.
  • the software may be designed to act on input received from users, external sensors, or external applications or computing devices.
  • the variables needed to describe the external stimuli may be used to define the test cases.
  • the user may define a range that is to be covered for each variable to define all of the test cases. For example, consider a software system in which two integer variables, X and Y, represent all of the external inputs received by the system. The designer may decide that a satisfactory test can be obtained when X is constrained to the range of zero to five and Y is constrained to the range of zero to two.
  • test cases for this example system would be [(0,0), (0,1), (0,2), (1, 0), (1,1), (1,2), (2,0) ... (5, 2)].
  • verification tool 105 may randomly select 10 of these pairs of values, which may be used as the set of test cases that are used to test the software.
  • Other techniques for specifying test cases may be used. For example, instead of having test cases automatically generated based on a range of variable values entered by the user, the user may specify specific test cases to be used, such as by manually entering the specific test cases.
  • Verification tool 105 may next perform the dynamic verification analysis (block 330).
  • the dynamic verification analysis may be performed by compiling the code and then executing it multiple times, with each execution using a different test case from the set of defined test cases.
  • verification tool 105 may output additional verification results that provide additional verification information about the code that has been instrumented for dynamic verification analysis (block 335). The additional verification information can be used to reduce the number of "false positives" (i.e., uncertain output conditions) generated by the initial static verification analysis.
  • FIG. 6 is a diagram illustrating an exemplary graphical interface 600 in which the additional verification results may be presented to a user.
  • Graphical interface 600 may be presented to a user, for example, by verification tool 105 on a display of a workstation 110.
  • Graphical interface 600 may include a variable list section 605, a test summary section 610, and an error log section 615.
  • Variable list section 605 may contain information about each of the variables used to define the test cases. As shown, variable list section 605 may include a variable name field 620, a variable type field 625, and a variable values field 630.
  • Each item in variable name field 620 may be the name of an input variable of the software, or an output parameter of an external function that is stubbed by verification tool 105.
  • Variable type field 625 may list the type of each variable. For example, the variable returned from the function "random float" is a variable of type "float32.”
  • Variable values field 630 may list the range of values, such as the minimum and maximum value that was assigned to the variable.
  • Test summary section 610 may provide summary information or configuration information relating to the dynamic verification analysis. As shown in Fig. 6, the summary information may include a configuration section 640 and a results section 645. In the configuration section 640, the user may enter configuration information relating to the dynamic analysis.
  • Results section 645 may display results relating to the dynamic verification analysis.
  • the results may be dynamically updated as the dynamic verification analysis executes.
  • the displayed results may include, for example, the number of completed tests, the number of tests in which no run-time error was detected in instrumented code sections, and the number of tests in which run-time errors were detected in instrumented code sections.
  • Error log section 615 may provide a detailed list of all of the errors that occurred during the dynamic verification analysis. For example, as shown in Fig. 6, for each error, error log section 615 may display the name of the file in which the error occurred, the line number and column of the error, a description of the error, and the number of test cases in which the error occurred.
  • a user may monitor and/or control the dynamic verification analysis performed on selected code. Based on the dynamic verification results, the user can increase their confidence in the reliability of the code. As the tests performed may not be exhaustive, they may not definitively prove that the code is run-time error free. But a set of performed tests without any run-time errors may increase the user's confidence in the code's reliability.
  • verification tool 105 may provide results of the dynamic verification analysis in conjunction with or together with the color codes generated by the static verification analysis. For example, one or more additional categories may also be created.
  • a "DYNAMIC VERIFICATION GREEN" category may be created, which may be associated with a different color, such as a shade of green different than the green used to present the static verification analysis GREEN code.
  • a "DYNAMIC VERIFICATION GREEN” category may be created, which may be associated with a different color, such as a shade of green different than the green used to present the static verification analysis GREEN code.
  • the categories for the dynamic verification and static verification may be presented using an interface similar to that shown in Fig. 4.
  • Fig. 7 is a diagram conceptually illustrating components of verification tool 105 in an exemplary implementation in which verification tool 105 is implemented using a client-server model.
  • verification tool 105 may include components 710 that execute on a server, such as one or more of servers 120, and components 715 that execute on a client, such as one of workstations 110. More particularly, server components 710 may include static code verification component 711 and results 712. Client components 715 may include source code 720, result viewer component 721, potential bug list 722, test case generator component 723, test cases 724, dynamic instrumentation component 725, instrumented code 726, executable component 727, and potential bug list 728. Client components 715 may be associated with, for example, one of workstations 110.
  • a user may load or otherwise generate source code 720.
  • the source code may be transmitted to server 120 via a network, such as network 130.
  • Server 120 may function as a static code verification component for a number of users, such as for all of the software developers for a particular company.
  • Static code verification component 711 may analyze the source code using static verification analysis techniques, such as using the techniques discussed previously with respect to block 315 (Fig. 3). The initial verification results may be stored as results 712. Results 712 may be transmitted back to client 110 for viewing and/or analysis.
  • Client components 715 may particularly include result viewer 721.
  • Result viewer 721 may present results 712 to the user.
  • Result viewer 721 may, for example, contain a graphical user interface, such as the one shown in Fig. 4, that uses color-codes to illustrate the error status of different sections of code.
  • Result viewer 721 may additionally output or store results 712 as potential bug list 722.
  • Client components 715 may additionally include test case generator component 723 to generate test cases 724.
  • Test case generator component 723 may, for example, present an interface such as variable list section 605 of graphical interface 600 (Fig. 6). Through this graphical interface, users may define, for instance, value ranges that are to be tested for each input variable of the software. Based on this information, test case generator component 723 may generate test cases 724.
  • Client components 715 may additionally include dynamic instrumentation component 725 to instrument source code 720 based on results 712. As previously discussed, this instrumentation may be performed such that code segments that were determined by the static verification analysis to have an unknown or unproven error condition (e.g., ORANGE code) may be instrumented.
  • Source code 720, after instrumentation by dynamic instrumentation component 725, is stored as instrumented code 726.
  • Instrumented code 726 may be compiled and run in conjunction with test cases 724 as executable component 727 and may generate potential bug list 728 during the execution.
  • although Fig. 7 illustrates an implementation of verification tool 105 using a client-server model, in an alternative implementation, all or substantially all of the functionality of verification tool 105 may be implemented on a single computer, such as a single workstation 110.
  • the dynamic verification analysis techniques may be automatically applied to selected portions of the code.
  • the selected portions of the code may correspond to portions of the code that were determined to be unproven sections of the code.

Abstract

A software testing tool may selectively modify (instrument) portions of a software project for dynamic verification analysis based on the results of a static verification analysis. In one implementation, the tool may perform a static verification analysis (315) of the computer code and selectively instrument portions of the computer code for dynamic verification analysis, the selected portion of the computer code including a portion of the computer code that was determined by the static verification analysis to include an unproven error condition. The tool may further execute the computer code to perform a dynamic verification analysis of the selectively instrumented portion of the computer code and store output information describing potential errors identified by the dynamic verification analysis.

Description

SELECTIVE CODE INSTRUMENTATION FOR SOFTWARE VERIFICATION
BACKGROUND
[0001] Software products can potentially be very large and complex. Software testing is the process used to assess the quality of developed computer software. Quality may be judged based on a number of metrics, such as correctness, completeness, reliability, number of bugs found, efficiency, compatibility, etc.
[0002] The amount of testing required for a particular software project frequently depends on the target for the deployed software. A developer of game software intended for personal computers, for example, may devote relatively few resources to formal testing of the software.
In contrast, the developer of a mission critical application in the healthcare, automotive, or utility industry may require a much more rigorous level of software testing.
[0003] One technique for testing software is based on the concept of static verification of the software code. In general, static code verification is an analysis performed without executing the software. Static verification of software code can prove, for example, which operations are free of run-time errors such as numeric overflows, divisions by zero, buffer overflows, or pointer issues, and identify where run-time errors will or might occur.
[0004] In one existing system, static verification is used to classify the code into categories. The categories may include code determined to be good or safe or correct, code determined to have errors, code determined not to be accessible (e.g., "dead code" or "deactivated code") and code for which a determination could not be made (e.g., "don't know"). Code classified as
"don't know" represents code that the static verification system could not conclusively determine as including an error. A developer faced with "don't know" code may be required to manually review the code for errors. Manual review of code can be highly labor intensive. BRIEF DESCRIPTION OF THE DRAWINGS
[0005] The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate one or more implementations described herein and, together with the description, explain these implementations. In the drawings:
[0006] Fig. 1 is an exemplary diagram of a system in which concepts described herein may be implemented;
[0007] Fig. 2 is a diagram of an exemplary device corresponding to one of the workstations or servers shown in Fig. 1 ; [0008] Fig. 3 is a flow chart illustrating exemplary operations that may be performed by the verification tool shown in Fig. 1 ;
[0009] Fig. 4 is a diagram illustrating an exemplary graphical interface in which static verification results may be presented to a user;
[0010] Figs. 5A and 5B are diagrams illustrating exemplary code before and after instrumentation;
[0011] Fig. 6 is a diagram illustrating an exemplary graphical interface in which dynamic verification results may be presented to a user; and
[0012] Fig. 7 is a diagram conceptually illustrating components of a verification tool in an exemplary implementation in which a verification tool is implemented using a client-server model.
DETAILED DESCRIPTION
[0013] The following detailed description refers to the accompanying drawings. The same reference numbers in different drawings may identify the same or similar elements. Also, the following detailed description does not limit the invention.
OVERVIEW
[0014] Implementations described herein relate to an automated software testing tool in which the results of a static verification analysis technique are used to select portions of the code for additional analysis. The selected portions may include the code determined by the static verification analysis to have an unknown or unproven error condition. These selected portions may be modified (instrumented) for dynamic analysis and then executed in conjunction with input test values to perform a dynamic verification analysis of the code. The results of the dynamic analysis can provide additional verification information about the code that was determined to be unproven by the static verification testing.
DEFINITIONS
[0015] "Static verification analysis," "static verification testing," or "static analysis techniques," as the phrases are used herein, generally refer to an analysis of computer code that is performed without executing the code. For example, static code analysis may examine code using abstract interpretation techniques to verify all possible execution paths of a program. [0016] "Dynamic verification analysis," "dynamic testing," or "dynamic analysis techniques," as the phrases are used herein, refer to verification of software performed by or during the execution of the software. Dynamic verification may involve, for example, executing the software with a set of test input values. SYSTEM DESCRIPTION
[0017] Fig. 1 is an exemplary diagram of a system 100 in which concepts described herein may be implemented. The system may include one or more personal computers or workstations 110, one or more servers 120, and a network 130. Consistent with aspects described herein, software verification tool 105 may be executed by one or more of servers 120 and workstations 110 to assist in software verification.
[0018] Workstations 110 may include computing devices, such as desktop or laptop computers, that may be used for general computing tasks. In general, users of workstations 110 may be software developers. The users may use verification tool 105 to assist in verifying their developed software code. In some implementations, as shown in Fig. 1, verification tool 105 may include client-side components and server-side components. The client-side components may be executed at the user's workstation 110 while the server-side components may execute at one or more of servers 120. In an alternative implementation, and possibly under certain restrictions on use such as the size of the code to be verified, verification tool 105 may execute exclusively at the user's workstation 110.
[0019] In some implementations, workstations 110 may execute a technical computing environment (TCE) that presents a user with an interface that enables efficient analysis and generation of technical applications. For example, the TCE may provide a numerical and/or symbolic computing environment that allows for matrix manipulation, plotting of functions and data, implementation of algorithms, creation of user interfaces, and/or interfacing with programs in other languages. The TCE may be textual and/or graphical.
[0020] Servers 120 may each include a device, such as a computer or another type of computation or communication device, a thread or process running on one of these devices, and/or an object executable by one of these devices. Server device 120 may generally provide services to other devices (e.g., workstations 110) connected to network 130. In one embodiment, one or more of server devices 120 may include server components of software verification tool 105. [0021] Fig. 2 is a diagram of an exemplary device 200 corresponding to one of workstations 110 or servers 120. As illustrated, device 200 may include a bus 210, a processing unit 220, a main memory 230, a read-only memory (ROM) 240, a storage device 250, an input device 260, an output device 270, and/or a communication interface 280. Bus 210 may include a path that permits communication among the components of workstation 110.
[0022] Processing unit 220 may include a processor, microprocessor, or other types of processing logic that may interpret and execute instructions. Main memory 230 may include a random access memory (RAM) or another type of dynamic storage device that may store information and instructions for execution by processing unit 220. ROM 240 may include a ROM device or another type of static storage device that may store static information and/or instructions for use by processing unit 220. Storage device 250 may include a magnetic and/or optical recording medium and its corresponding drive.
[0023] Input device 260 may include a mechanism that permits an operator to input information to workstation 110, such as a keyboard, a mouse, a pen, a microphone, a touch- sensitive display, voice recognition and/or biometric mechanisms, etc. Output device 270 may include a mechanism that outputs information to the operator, including a display, a printer, a speaker, etc. Communication interface 280 may include any transceiver-like mechanism that enables workstation 110 to communicate with other devices and/or systems. For example, communication interface 280 may include mechanisms for communicating with another device or system via a network.
[0024] As will be described in detail below, workstation 110 may perform certain operations in response to processing unit 220 executing software instructions contained in a computer- readable medium, such as main memory 230. A computer-readable medium may be defined as a physical or logical memory device. The software instructions may be read into main memory 230 from another computer-readable medium, such as storage device 250, or from another device via communication interface 280. The software instructions contained in main memory 230 may cause processing unit 220 to perform processes that will be described later. Alternatively, hardwired circuitry may be used in place of or in combination with software instructions to implement processes described herein. Thus, implementations described herein are not limited to any specific combination of hardware circuitry and software. [0025] Although Fig. 2 shows exemplary components of device 200, in other implementations device 200 may contain fewer, different, or additional components than depicted in Fig. 2. In still other implementations, one or more components of device 200 may perform one or more tasks performed by one or more other components of device 200.
SOFTWARE VERIFICATION TOOL
[0026] As previously mentioned, verification tool 105 may be used to measure the quality of developed computer software. Verification tool 105 may perform both static and dynamic software verification. The dynamic software verification may be selectively implemented so that only portions of the code determined to be unproven by the static verification will be instrumented for dynamic analysis.
[0027] In one implementation, verification tool 105 may be used in the context of a technical computing environment. A "technical computing environment," as the term is used herein, is to be broadly interpreted to include any hardware and/or software based logic that provides a computing environment that allows users to perform tasks related to disciplines, such as, but not limited to, mathematics, science, engineering, medicine, business, etc., more efficiently than if the tasks were performed in another type of computing environment, such as an environment that required the user to develop code in a conventional programming language, such as C++, C, Ada, Java, Javascript, Perl, Ruby, Fortran, Pascal, etc. A technical computing environment may additionally provide mathematical functions and/or graphical tools or blocks (e.g., for creating plots, surfaces, images, volumetric representations, etc.). A technical computing environment, however, is not limited to performing disciplines discussed above, and may be used to perform computations relating to fields of endeavor not typically associated with mathematics, science or business. Verification tool 105 may operate as a component in a technical computing environment to verify code created with the technical computing environment. For example, the technical computing environment may give the user an option to create graphical models. The technical computing environment may then compile the created graphical model for execution on a target system. Verification tool 105 may be used to verify the code that embodies the graphical model. Alternatively, verification tool 105 may be used to verify generated code or the graphical model itself. [0028] More generally, although verification tool 105 may be used in the context of a technical computing environment, verification tool 105 may be used with any software development project. For example, verification tool 105 may analyze code written in conventional programming language, such as C++, C and Ada, etc., and which is produced manually by a developer with or without the use of a technical computing environment. [0029] Fig. 3 is a flow chart illustrating exemplary operations that may be performed by verification tool 105.
[0030] Verification tool 105 may initially receive the software code that is to be verified (block 305). For example, a user at one of workstations 110 may use verification tool 105 to select one or more files that contain the software code that is to be verified. The software code may be textual source code, or code describing a graphical model created using a technical computing environment, or an intermediate representation of the textual and/or graphical source code.
[0031] The user may additionally provide verification tool 105 with input parameters for the software verification (block 310). The input parameters may relate to, for example, options or parameters applicable to a static verification analysis performed by verification tool 105. These static verification options may include, for example, the names of the files or procedures that are to be analyzed, the specific static verification algorithms to use, and/or the options relating to the static verification algorithms that are to be used. [0032] Verification tool 105 may perform a static verification analysis to generate initial classifications for the code (block 315). As previously mentioned, static verification analysis may involve analysis of the software code without execution of the code. For example, the static verification may involve analysis of the code to construct a model of the code (i.e., an abstract representation of the code). The model can be used for matching commonly occurring error patterns to the code. The model may also be used to perform some kind of data-flow analysis of the code to infer the possible values that variables might have at certain points in the program. Data-flow analysis can be used for vulnerability checking. Static verification analysis may be used to prove which operations are free of run-time errors or to find possible errors. Errors that may be found include: overflows and underflows; divisions by zero and other arithmetic errors; out-of-bounds array access; illegally dereferenced pointers; read-only access to non-initialized data; dangerous type conversions; dead code; access to null pointers; dynamic errors related to object programming and inheritance; errors related to exception handling; and non-initialized class members in the C++ language. [0033] As a result of the static analysis, verification tool 105 may classify the code into classifications that relate to possible errors in the code. Verification tool 105 may output the verification results from the static analysis (block 320). For example, the results may be saved to a file, shown to a user on a display, stored in a database, etc. [0034] In one implementation, the classification may include classifying each possible failure point in the source code into classes that define: (1) code that has no errors, (2) code that might have errors (unknown or unproven conditions), (3) code that definitely has errors, or (4) code that cannot be reached. The classifications may be presented to the user in a number of possible ways, such as by changing the appearance of the code (e.g., font type, font size, font color, etc.) based on its classification. In one particular implementation, the code may be presented using color codes. For example, the code may be shown to the user as GREEN code (code that has no errors), RED (code that definitely has errors), GRAY (code that cannot be reached), or ORANGE (unknown or unproven error conditions). This color-identifier assignment is for illustration purposes only, and in alternative embodiments, other colors or other visual or non-visual schemes for identifying different types of verified code may be used, as determined by one of skill in the art.
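As a purely illustrative sketch of these classifications, the C fragment below shows the kinds of statements that a static verification analysis of this sort might label GREEN, ORANGE, GRAY, or RED. The function and variable names are hypothetical and are not taken from the patent; the comments indicate only how such an analysis could reason about each statement.

```c
/* Hypothetical fragment annotated with the classifications that a static
 * verification analysis (e.g., one based on abstract interpretation) might
 * assign to each operation. */
int classification_example(int input, int divisor)
{
    int result;
    int zero = 0;

    result = 10 / 2;           /* GREEN: the divisor is a non-zero constant,
                                  so the division is provably error free    */

    result = input / divisor;  /* ORANGE: the divisor comes from outside the
                                  function, so the analysis cannot prove
                                  whether a division by zero can occur      */

    if (0) {
        result = result + 1;   /* GRAY: the condition is always false, so
                                  this statement can never be reached       */
    }

    result = input / zero;     /* RED: the divisor is provably zero, so this
                                  division always fails at run time         */

    return result;
}
```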
[0035] Fig. 4 is a diagram illustrating an exemplary graphical interface 400 in which verification results may be presented to a user, such as the initial verification results (Fig. 3, block 320). Graphical interface 400 may be presented to a user, for example, by verification tool 105 on a display of one of workstations 110.
[0036] Graphical interface 400 may include an entities list section 405 and an indicator list section 410. Indicator list section 410 may include a number of columns (numbered 411-415 in Fig. 4), where each column provides information about the corresponding entity shown in section 405.
[0037] Entities list section 405 may display the names of the files and underlying functions that have been analyzed. Each file and function may be visually distinguished based on its error classification. In one implementation, as mentioned above, the text of each file may be color coded (e.g., RED, ORANGE, GREEN, or GRAY) to indicate its error classification. In this example, however, the "color" codes are shown using underlined, italic, and bold fonts, in which RED code is shown in italic, GREEN code is shown underlined, and ORANGE code is shown in bold. [0038] Each file and underlying function may be colorized according to the most critical error found. For example, the file "polyspace main.c," labeled as file 430, is GREEN, indicating no errors were found. The file "example.c," labeled as file 431, is red, indicating errors were found. In this example, the user has chosen to drill-down into "example.c" by selecting the "+" icon, which shows the functions 432 contained in "example.c". Functions 432 may be similarly color-coded. For example, the function "close to zero" is shown as GREEN code, the function "pointer arithmetic" is shown as RED code, and the function "recursion" is shown as orange code.
[0039] For each file or function, indicator list section 410 may include a column 411 that displays the software reliability of that code, where 100% indicates complete reliability of the code for the code category and 0% means no reliability. Column 412 may indicate the number of RED code segments, column 413 may indicate the number of GRAY segments, column 414 may indicate the number of ORANGE segments, and column 415 may indicate the number of GREEN segments. [0040] As shown in Fig. 4, the results of a static verification analysis can be concisely presented to a user in a format in which the user can quickly gauge which code merits further attention.
[0041] In practice, ORANGE (unproven) code tends to be particularly troubling for developers. Because ORANGE code represents an unknown error state, ORANGE code may need to be manually verified by the software developer. For a large project, even if only five or ten percent of the code is classified as ORANGE, the manual effort required to review such code can be extensive.
[0042] Consistent with embodiments described herein, verification tool 105 may automatically instrument ORANGE code for further dynamic verification analysis. The dynamic verification analysis can provide further clarification on whether code initially classified as ORANGE code needs to be manually reviewed. "Instrumenting code," as the phrase is used herein, may generally refer to modifying the code to embed test or verification statements in the code. These statements may be used to monitor or catch conditions that result in errors or other problems or properties of the code. Referring back to Fig. 3, verification tool 105 may instrument the ORANGE code for dynamic testing (block 322).
[0043] Figs. 5A and 5B are diagrams illustrating exemplary code before and after instrumentation. As shown in Fig. 5A, a section of code 500 may include two statements, a GREEN statement 501 and an ORANGE statement 502. Fig. 5B shows code 500 after an exemplary automated code instrumentation procedure. Here, the ORANGE statement 502 is instrumented, while GREEN statement 501 has not been modified. Selective instrumentation of the code in this manner (e.g., only instrumenting ORANGE statements) may reduce the time required to test the code. [0044] As shown in Fig. 5B, modified statement 502 may include an initial "if" statement 510 that tests whether the denominator (i.e., the variable "K") in Fig. 5A is equal to or close to zero. If it is not, the original statement is executed at statement 511. If the denominator is equal to or close to zero, however, various error or warning messages may be raised. In this example, the warning messages may vary based on flags that define the environment in which the code is executing. Whether an error is generated may depend on the values of the input test cases. That is, an error will be generated only when the input test cases cause the variable K to be zero when statement 502 is executed.
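Because Figs. 5A and 5B are not reproduced in this text, the following C sketch illustrates the same idea under stated assumptions: only the ORANGE division is wrapped with an embedded check, while the GREEN statement is left untouched. The helper function, flag, and messages are invented for illustration and are not the patent's actual instrumentation code.

```c
#include <stdio.h>
#include <stdlib.h>

/* Before instrumentation (in the spirit of Fig. 5A):
 *     total = total + 5;      GREEN statement 501, provably safe
 *     ratio = total / K;      ORANGE statement 502, K may be zero
 *
 * After selective instrumentation (in the spirit of Fig. 5B), only the
 * ORANGE statement is modified. */

#define STOP_ON_RUNTIME_ERROR 1   /* illustrative environment flag */

static void report_division_by_zero(const char *file, int line)
{
    fprintf(stderr, "%s:%d: run-time error: division by zero\n", file, line);
    if (STOP_ON_RUNTIME_ERROR) {
        exit(EXIT_FAILURE);
    }
}

int instrumented_example(int total, int K)
{
    int ratio = 0;

    total = total + 5;                        /* GREEN: not instrumented  */

    if (K == 0) {                             /* inserted guard, cf. 510  */
        report_division_by_zero(__FILE__, __LINE__);
    } else {
        ratio = total / K;                    /* original statement, 511  */
    }

    return ratio;
}
```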
[0045] Referring back to Fig. 3, the user may additionally provide verification tool 105 with input parameters for dynamic verification of the software (block 325). The input parameters may relate to, for example, options or parameters applicable to a dynamic verification analysis performed by verification tool 105.
[0046] A particular type of input parameter may define ranges or values used for "test cases" for the software. The test cases may be applicable to software that receives external values during execution. For example, the software may be designed to act on input received from users, external sensors, or external applications or computing devices. The variables needed to describe the external stimuli may be used to define the test cases. The user may define a range that is to be covered for each variable to define all of the test cases. For example, consider a software system in which two integer variables, X and Y, represent all of the external inputs received by the system. The designer may decide that a satisfactory test can be obtained when X is constrained to the range of zero to five and Y is constrained to the range of zero to two.
Further, the designer may decide that 10 dynamic tests should be run. The complete set of (X, Y) test cases for this example system would be [(0,0), (0,1), (0,2), (1,0), (1,1), (1,2), (2,0), ..., (5,2)]. Of these, verification tool 105 may randomly select 10 pairs of values, which may be used as the set of test cases that are used to test the software. [0047] Other techniques for specifying test cases may be used. For example, instead of having test cases automatically generated based on a range of variable values entered by the user, the user may specify specific test cases to be used, such as by manually entering the specific test cases. Additionally, in some implementations, test cases generated by multiple different sources may be combined or imported for use in the final set of test cases. This can potentially provide the advantage of improving the structural coverage and overall quality of the dynamic testing. [0048] Verification tool 105 may next perform the dynamic verification analysis (block 330). The dynamic verification analysis may be performed by compiling the code and then executing it multiple times, with each execution using a different test case from the set of defined test cases. [0049] As a result of the dynamic analysis, verification tool 105 may output additional verification results that provide additional verification information about the code that has been instrumented for dynamic verification analysis (block 335). The additional verification information can be used to reduce the number of "false positives" (i.e., uncertain output conditions) generated by the initial static verification analysis.
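Continuing the X and Y example above, the short C program below shows one plausible way the complete set of (X, Y) test cases could be enumerated and 10 of them selected at random. The patent does not specify a selection algorithm, so the shuffle-based sampling here is purely an assumption for illustration.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define X_MIN 0
#define X_MAX 5
#define Y_MIN 0
#define Y_MAX 2
#define NUM_TESTS 10

typedef struct { int x; int y; } test_case;

int main(void)
{
    /* Enumerate the complete set of (X, Y) pairs: 6 * 3 = 18 cases. */
    test_case all[(X_MAX - X_MIN + 1) * (Y_MAX - Y_MIN + 1)];
    int total = 0;

    for (int x = X_MIN; x <= X_MAX; x++) {
        for (int y = Y_MIN; y <= Y_MAX; y++) {
            all[total].x = x;
            all[total].y = y;
            total++;
        }
    }

    /* Shuffle (Fisher-Yates) and keep the first NUM_TESTS pairs. */
    srand((unsigned)time(NULL));
    for (int i = total - 1; i > 0; i--) {
        int j = rand() % (i + 1);
        test_case tmp = all[i];
        all[i] = all[j];
        all[j] = tmp;
    }

    for (int i = 0; i < NUM_TESTS && i < total; i++) {
        printf("test case %d: X=%d Y=%d\n", i + 1, all[i].x, all[i].y);
    }
    return 0;
}
```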
[0050] Fig. 6 is a diagram illustrating an exemplary graphical interface 600 in which the additional verification results may be presented to a user. Graphical interface 600 may be presented to a user, for example, by verification tool 105 on a display of a workstation 110. [0051] Graphical interface 600 may include a variable list section 605, a test summary section 610, and an error log section 615. Variable list section 605 may contain information about each of the variables used to define the test cases. As shown, variable list section 605 may include a variable name field 620, a variable type field 625, and a variable values field 630. Each item in variable name field 620 may be the name of an input variable of the software, or an output parameter of an external function that is stubbed by verification tool 105. Variable type field 625 may list the type of each variable. For example, the variable returned from the function "random float" is a variable of type "float32." Variable values field 630 may list the range of values, such as the minimum and maximum value that was assigned to the variable. [0052] Test summary section 610 may provide summary information or configuration information relating to the dynamic verification analysis. As shown in Fig. 6, the summary information may include a configuration section 640 and a results section 645. In the configuration section 640, the user may enter configuration information relating to the dynamic analysis. For example, the user may enter the number of tests to be executed, the number of loops to run before determining that a loop is an infinite loop, and the timeout period associated with each test. Results section 645 may display results relating to the dynamic verification analysis. In one implementation, the results may be dynamically updated as the dynamic verification analysis executes. The displayed results may include, for example, the number of completed tests, the number of tests in which no run-time error was detected in instrumented code sections, and the number of tests in which run-time errors were detected in instrumented code sections.
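A dynamic verification run of the kind summarized in results section 645 could be driven by a simple test harness along the lines sketched below in C. The harness, its error codes, and the sample test cases are hypothetical; they merely show how each test case can be executed against instrumented code and how the completed/no-error/error counts can be accumulated.

```c
#include <stdio.h>

/* Hypothetical instrumented function: returns 0 on success and a non-zero
 * error code when an embedded check detects a run-time error. */
static int run_one_test(int x, int y, int *result)
{
    if (y == 0) {           /* instrumentation guard on an ORANGE division */
        return 1;           /* run-time error: division by zero            */
    }
    *result = x / y;
    return 0;
}

int main(void)
{
    /* A small, fixed set of (X, Y) test cases, as might be produced by
     * the test case generator in the previous sketch. */
    const int cases[][2] = { {0, 0}, {1, 2}, {3, 1}, {5, 0}, {4, 2} };
    const int num_cases = (int)(sizeof(cases) / sizeof(cases[0]));

    int completed = 0, passed = 0, failed = 0;

    for (int i = 0; i < num_cases; i++) {
        int result = 0;
        int status = run_one_test(cases[i][0], cases[i][1], &result);
        completed++;
        if (status == 0) {
            passed++;       /* no run-time error in instrumented code      */
        } else {
            failed++;       /* run-time error detected by instrumentation  */
            fprintf(stderr, "test %d: error %d (X=%d, Y=%d)\n",
                    i + 1, status, cases[i][0], cases[i][1]);
        }
    }

    /* Counts corresponding to the figures shown in results section 645. */
    printf("completed: %d, no error: %d, errors: %d\n",
           completed, passed, failed);
    return 0;
}
```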
[0053] Error log section 615 may provide a detailed list of all of the errors that occurred during the dynamic verification analysis. For example, as shown in Fig. 6, for each error, error log section 615 may display the name of the file in which the error occurred, the line number and column of the error, a description of the error, and the number of test cases in which the error occurred.
[0054] Through graphical interface 600, a user may monitor and/or control the dynamic verification analysis performed on the selected code. Because the tests performed may not be exhaustive, they may not definitively prove that the code is free of run-time errors. A set of tests completed without any run-time errors, however, may increase the user's confidence in the reliability of the code.
[0055] In one implementation, verification tool 105 may provide the results of the dynamic verification analysis in conjunction with the color codes generated by the static verification analysis. For example, one or more additional categories may be created, such as a "DYNAMIC VERIFICATION GREEN" category, which may be associated with a different color, such as a shade of green different than the green used to present the static verification analysis GREEN code. In this manner, the user can easily distinguish between the GREEN code generated by the static verification analysis and the "GREEN" code generated by the dynamic verification analysis. The categories for the dynamic verification and static verification may be presented using an interface similar to that shown in Fig. 4 (an illustrative sketch of such a set of combined categories appears below).
[0056] Fig. 7 is a diagram conceptually illustrating components of verification tool 105 in an exemplary implementation in which verification tool 105 is implemented using a client-server model.
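Before turning to the components of Fig. 7, the combined static and dynamic categories discussed above could be represented, for example, as a single enumeration, as in the following minimal C sketch. The sketch is an assumption for illustration: the GREEN and ORANGE names follow the description herein, while the remaining identifiers and the exact set of categories are hypothetical.

/* Combined result categories: the static verification classes plus an
   additional class for code whose unproven operations ran without error
   in every executed dynamic test. */
typedef enum {
    VERIF_STATIC_GREEN,   /* proven free of run-time errors by static analysis */
    VERIF_STATIC_RED,     /* proven to contain a run-time error                */
    VERIF_STATIC_GRAY,    /* unreachable (dead or deactivated) code            */
    VERIF_ORANGE,         /* unproven error condition after static analysis    */
    VERIF_DYNAMIC_GREEN   /* ORANGE code that passed all dynamic tests         */
} verification_category;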
[0057] As shown in Fig. 7, verification tool 105 may include components 710 that execute on a server, such as one or more of servers 120, and components 715 that execute on a client, such as one of workstations 110. More particularly, server components 710 may include static code verification component 711 and results 712. Client components 715 may include source code 720, result viewer component 721, potential bug list 722, test case generator component 723, test cases 724, dynamic instrumentation component 725, instrumented code 726, executable component 727, and potential bug list 728. Client components 715 may be associated with, for example, one of workstations 110.
[0058] In operation, a user may load or otherwise generate source code 720. The source code may be transmitted to server 120 via a network, such as network 130. Server 120 may function as a static code verification component for a number of users, such as for all of the software developers for a particular company.
[0059] Static code verification component 711 may analyze the source code using static verification analysis techniques, such as using the techniques discussed previously with respect to block 315 (Fig. 3). The initial verification results may be stored as results 712. Results 712 may be transmitted back to client 110 for viewing and/or analysis.
[0060] Client components 715 may particularly include result viewer 721. Result viewer 721 may present results 712 to the user. Result viewer 721 may, for example, provide a graphical user interface, such as the one shown in Fig. 4, that uses color codes to illustrate the error status of different sections of the code. Result viewer 721 may additionally output or store results 712 as potential bug list 722.
[0061] Client components 715 may additionally include test case generator component 723 to generate test cases 724. Test case generator component 723 may, for example, present an interface such as variable list section 605 of graphical interface 600 (Fig. 6). Through this graphical interface, users may define, for instance, value ranges that are to be tested for each input variable of the software. Based on this information, test case generator component 723 may generate test cases 724.
[0062] Client components 715 may additionally include dynamic instrumentation component 725 to instrument source code 720 based on results 712. As previously discussed, this instrumentation may be performed such that code segments that were determined by the static verification analysis to have an unknown or unproven error condition (e.g., ORANGE code) are instrumented (an illustrative sketch of such an inserted check appears below). Source code 720, after instrumentation by dynamic instrumentation component 725, is stored as instrumented code 726.
[0063] Instrumented code 726 may be compiled and run in conjunction with test cases 724 as executable component 727 and may generate potential bug list 728 during the execution.
[0064] Although Fig. 7 illustrates an implementation of verification tool 105 using a client-server model, in an alternative implementation, all or substantially all of the functionality of verification tool 105 may be implemented on a single computer, such as a single workstation 110.
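As an illustration of the kind of check that dynamic instrumentation component 725 might insert around code with an unproven error condition, the following minimal C sketch guards a division whose divisor could not be proven non-zero. The scale function, the reporting helper, and the policy of skipping the unsafe operation are assumptions for illustration; they are not taken from Figs. 5A and 5B.

#include <stdio.h>

/* Hypothetical reporting hook inserted by the instrumentation step; in a
   complete implementation, the reported errors could feed a list such as
   potential bug list 728. */
static void report_runtime_error(const char *file, int line, const char *msg)
{
    printf("run-time error at %s:%d: %s\n", file, line, msg);
}

/* The divisor of this operation could not be proven non-zero by the static
   verification analysis, so a check is inserted before the operation runs. */
static int scale(int value, int divisor)
{
    if (divisor == 0) {                                  /* inserted check */
        report_runtime_error(__FILE__, __LINE__, "division by zero");
        return 0;                         /* skip the failing operation    */
    }
    return value / divisor;               /* original, unmodified statement */
}

int main(void)
{
    printf("%d\n", scale(10, 2));   /* passes the inserted check   */
    printf("%d\n", scale(10, 0));   /* triggers the inserted check */
    return 0;
}

In this sketch the inserted check merely reports the violation and skips the unsafe operation, so that the remaining test cases can continue to execute and be logged.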
CONCLUSION
[0065] Techniques were described herein for testing code using both static and dynamic verification analysis techniques. The dynamic verification analysis techniques may be automatically applied to selected portions of the code. The selected portions may correspond to sections of the code that the static verification analysis determined to be unproven.
[0066] The foregoing description of implementations provides illustration and description, but is not intended to be exhaustive or to limit the invention to the precise form disclosed. Modifications and variations are possible in light of the above teachings or may be acquired from practice of the invention.
[0067] For example, while a series of acts has been described with regard to Fig. 3, the order of the acts may be modified in other implementations. Further, non-dependent acts may be performed in parallel.
[0068] Also, the term "user" has been used herein. The term "user" is intended to be broadly interpreted to include a workstation or a user of a workstation.
[0069] It will be apparent that embodiments, as described herein, may be implemented in many different forms of software, firmware, and hardware in the implementations illustrated in the figures. The actual software code or specialized control hardware used to implement embodiments described herein is not limiting of the invention. Thus, the operation and behavior of the embodiments were described without reference to the specific software code, it being understood that one would be able to design software and control hardware to implement the embodiments based on the description herein.
[0070] Further, certain portions of the invention may be implemented as "logic" that performs one or more functions. This logic may include hardware, such as an application specific integrated circuit or a field programmable gate array, software, or a combination of hardware and software.
[0071] No element, act, or instruction used in the present application should be construed as critical or essential to the invention unless explicitly described as such. Also, as used herein, the article "a" is intended to include one or more items. Where only one item is intended, the term "one" or similar language is used. Further, the phrase "based on" is intended to mean "based, at least in part, on" unless explicitly stated otherwise.

Claims

WHAT IS CLAIMED IS:
1. A computing device-implemented method, comprising: receiving computer code; performing a static verification analysis of the computer code; selectively instrumenting a portion of the computer code for dynamic verification analysis, the selected portion of the computer code including a portion of the computer code that has been determined by the static verification analysis to include an unproven error condition; executing the computer code to perform a dynamic verification analysis of the selectively instrumented portion of the computer code; and storing output information describing potential errors identified by the dynamic verification analysis.
2. The computing device-implemented method of claim 1, wherein the static verification analysis detects one or more errors relating to one or a combination of the following: overflows and underflows; divisions by zero; out-of-bounds array access; illegally dereferenced pointers; read-only access to non-initialized data; dangerous type conversions; dead code; or access to null pointers.
3. The computing device-implemented method of claim 1, wherein the computer code is classified, based on the static verification analysis, into classes that include code that has an unproven error condition, code that has no errors, and code that is known to have errors.
4. The computing device-implemented method of claim 3, wherein the classes additionally include a class for unreachable code.
5. The computing device-implemented method of claim 3, further comprising: presenting the output information using color codes to represent the classifications of the computer code.
6. The computing device-implemented method of claim 1, further comprising: generating test cases that define sets of values for variables used by the computer code; and performing the dynamic verification analysis using the test cases.
7. The computing device-implemented method of claim 6, where generating the test cases includes automatically generating the test cases based on input parameters or manually entering test cases.
8. The computing device-implemented method of claim 1, wherein the computer code is code manually generated by a developer or automatically generated based on a model created in a technical computing environment.
9. A computer-readable medium containing programming instructions for execution by one or more processing devices, the computer-readable medium comprising: instructions to receive computer code; instructions to perform a static verification analysis of the computer code; instructions to classify, based on the static verification analysis, the computer code into diagnostic categories that relate to reliability of the computer code; instructions to instrument portions of the computer code for a dynamic verification analysis, the instrumented portions corresponding to computer code that was categorized into selected diagnostic categories; instructions to perform the dynamic verification analysis on the computer code to identify potential errors in the computer code; and instructions to store output information describing the identified potential errors.
10. The computer-readable medium of claim 9, wherein the static verification analysis detects one or more errors relating to overflows and underflows; division by zero; out-of-bounds array access; illegally dereferenced pointers; read-only access to non-initialized data; dangerous type conversions; dead code; or access to null pointers.
11. The computer-readable medium of claim 9, wherein the instructions to instrument portions of the computer code further include: instructions to instrument portions of the computer code that were categorized into a diagnostic category that includes code having an unproven error condition.
12. The computer-readable medium of claim 9, further comprising: instructions to present the output information using color codes to represent the diagnostic categories.
13. The computer-readable medium of claim 9, wherein the computer code represents code generated based on a model created in a technical computing environment.
14. A computing device comprising: a component to receive classifications for computer code that are based on potential error conditions, identified via static verification analysis techniques, in the computer code; a dynamic instrumentation component to selectively instrument portions of the computer code for dynamic verification of the computer code, the portions of the computer code selected for instrumentation being based on the classifications of the computer code; a test case generator to generate test cases for input to the computer code; and an execution component to execute the selectively instrumented computer code in conjunction with the test cases to perform a dynamic verification analysis of the computer code.
15. The device of claim 14, where the dynamic instrumentation component selectively instruments the portions of the computer code that are classified as having an unknown error condition.
16. The device of claim 14, where the potential error conditions include code that has an unproven error condition, code that has no errors, and code that is known to have errors.
17. The device of claim 14, where the computing device additionally presents output information describing potential errors identified by the dynamic verification analysis.
18. The device of claim 17, where the output information uses color codes to represent the classifications of the computer code.
19. The device of claim 14, wherein the computer code represents code generated based on a model created in a technical computing environment.
PCT/IB2008/001327 2008-02-01 2008-02-01 Selective code instrumentation for software verification WO2009095741A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/IB2008/001327 WO2009095741A1 (en) 2008-02-01 2008-02-01 Selective code instrumentation for software verification
EP08751038A EP2238540A1 (en) 2008-02-01 2008-02-01 Selective code instrumentation for software verification

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB2008/001327 WO2009095741A1 (en) 2008-02-01 2008-02-01 Selective code instrumentation for software verification

Publications (1)

Publication Number Publication Date
WO2009095741A1 true WO2009095741A1 (en) 2009-08-06

Family

ID=39773027

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2008/001327 WO2009095741A1 (en) 2008-02-01 2008-02-01 Selective code instrumentation for software verification

Country Status (2)

Country Link
EP (1) EP2238540A1 (en)
WO (1) WO2009095741A1 (en)

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8336030B1 (en) 2009-09-11 2012-12-18 The Mathworks, Inc. System and method for coding standard testing
US8745586B1 (en) 2009-09-11 2014-06-03 The Mathworks, Inc. System and method for coding standard testing
US8782612B2 (en) 2010-05-11 2014-07-15 Ca, Inc. Failsafe mechanism for dynamic instrumentation of software using callbacks
US8473925B2 (en) 2010-05-11 2013-06-25 Ca, Inc. Conditional dynamic instrumentation of software in a specified transaction context
US8566800B2 (en) 2010-05-11 2013-10-22 Ca, Inc. Detection of method calls to streamline diagnosis of custom code through dynamic instrumentation
US8386504B1 (en) 2010-07-06 2013-02-26 The Mathworks, Inc. System and method for file differencing with importance ranking
US9104743B1 (en) 2010-07-06 2015-08-11 The Mathworks, Inc. System and method for file differencing with importance ranking
US8938729B2 (en) 2010-10-12 2015-01-20 Ca, Inc. Two pass automated application instrumentation
US8752015B2 (en) 2011-12-05 2014-06-10 Ca, Inc. Metadata merging in agent configuration files
US9411616B2 (en) 2011-12-09 2016-08-09 Ca, Inc. Classloader/instrumentation approach for invoking non-bound libraries
US9971897B2 (en) 2011-12-30 2018-05-15 International Business Machines Corporation Targeted security testing
US9971896B2 (en) 2011-12-30 2018-05-15 International Business Machines Corporation Targeted security testing
WO2014016649A1 (en) * 2012-07-27 2014-01-30 Freescale Semiconductor, Inc. Method and apparatus for implementing instrumentation code
CN103713995A (en) * 2012-09-28 2014-04-09 埃森哲环球服务有限公司 Latent defect identification
EP2713277A1 (en) * 2012-09-28 2014-04-02 Accenture Global Services Limited Latent defect identification
US9244821B2 (en) 2012-09-28 2016-01-26 Accenture Global Services Limited Latent defect indication
CN103713995B (en) * 2012-09-28 2018-04-06 埃森哲环球服务有限公司 Method and apparatus for determining test data
US9201765B2 (en) 2013-10-31 2015-12-01 Tata Consultancy Services Limited System and method to facilitate a user interface enabled review of static analysis warnings
US10002252B2 (en) 2014-07-01 2018-06-19 Fireeye, Inc. Verification of trusted threat-aware microvisor
US11244056B1 (en) 2014-07-01 2022-02-08 Fireeye Security Holdings Us Llc Verification of trusted threat-aware visualization layer
US10108529B2 (en) 2015-10-13 2018-10-23 International Business Machines Corporation Dynamic instrumentation based on detected errors
US10114730B2 (en) 2015-10-13 2018-10-30 International Business Machines Corporation Dynamic instrumentation based on detected errors
US10599554B2 (en) 2015-10-13 2020-03-24 International Business Machines Corporation Dynamic instrumentation based on detected errors
US10423518B2 (en) 2016-04-27 2019-09-24 The Mathworks, Inc. Systems and methods for analyzing violations of coding rules
US10025691B1 (en) 2016-09-09 2018-07-17 Fireeye, Inc. Verification of complex software code using a modularized architecture
US10592678B1 (en) 2016-09-09 2020-03-17 Fireeye, Inc. Secure communications between peers using a verified virtual trusted platform module
US11669438B2 (en) * 2019-07-25 2023-06-06 Robert Bosch Gmbh Method for testing a system
KR102126960B1 (en) * 2020-04-24 2020-06-25 한화시스템 주식회사 Reliability testing integrated-platform of weapon system software for next-generation naval ship

Also Published As

Publication number Publication date
EP2238540A1 (en) 2010-10-13

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 2008751038

Country of ref document: EP

DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08751038

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE