EP1782196A1 - Testing packages - Google Patents

Testing packages

Info

Publication number
EP1782196A1
Authority
EP
European Patent Office
Prior art keywords
testing
packages
package
computer
data structure
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP04780083A
Other languages
German (de)
French (fr)
Other versions
EP1782196A4 (en)
Inventor
John Wesley Walker
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Corp
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp
Publication of EP1782196A1
Publication of EP1782196A4
Legal status: Withdrawn

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/34Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3409Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
    • G06F11/3414Workload generation, e.g. scripts, playback
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L43/00Arrangements for monitoring or testing data switching networks
    • H04L43/50Testing arrangements


Abstract

A system, method, and data structure for testing packages are described. Tests may be combined into testing packages at testing packages user interface (301) to make testing of computer systems easier for developers. Testing packages may be sets of one or more testing procedures that may be applied to computer systems from a testing packages user interface (301). Testing packages may inherit information based on their relationship to other testing packages.

Description

Testing Packages
BACKGROUND OF THE INVENTION
FIELD OF THE INVENTION
[01] Aspects of the present invention relate to computer systems. More particularly, aspects of the present invention relate to testing of computer systems.
DESCRIPTION OF RELATED ART
[02] Computer system developers desire to release bug-free systems and/or applications. Be it hardware, software, or firmware, all computer products undergo some level of testing. Conventional testing systems require test operators to individually specify each test to be run on a system. To run a test, the operator needs to determine the elements required for the test, instantiate the elements or enable the elements, then run a selected test. This combination of steps, while ideally simple, becomes exceedingly complex with complex software and/or hardware.
[03] Figure 2 shows a conventional testing system including a server 201 providing a list of test processes to agent 202. A test developer at agent 202 selects the specific tests to be run and runs them as shown by processes 1-5 203-207. For 1000 or more tests to be run, the developer needs to specify each individual test as well as ensure that the requirements for each test have been met prior to running each test. A mistake in not enabling or instantiating a required test element directly translates into the loss of a test run, as expected tests cannot be completed.
BRIEF SUMMARY OF THE INVENTION
[04] Aspects of the present invention address one or more of the issues described above, thereby providing an improved testing method and system for developers.
BRIEF DESCRIPTION OF THE DRAWINGS
[05] Aspects of the present invention are illustrated by way of example and not by way of limitation in the accompanying figures, in which like reference numerals indicate similar elements and in which:
[06] Figure 1 shows a general-purpose computing environment in accordance with aspects of the present invention.
[07] Figure 2 shows a conventional testing system.
[08] Figure 3 shows processes for assembling and/or deploying test packages in accordance with aspects of the present invention.
[09] Figure 4 shows an illustrative user interface in accordance with aspects of the present invention.
[10] Figure 5 shows various alternative processes for assembling and/or deploying test packages in accordance with aspects of the present invention.
[11] Figure 6 shows yet another process for assembling and/or deploying test packages in accordance with aspects of the present invention.
[12] Figure 7 shows an illustrative example of relationships between testing packages and related components in accordance with aspects of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
[13] Aspects of the present invention relate to defining and/or deploying test packages to make testing of computer systems (hardware, software, firmware, and the like) easier for computer developers.
[14] The following description is separated into the following sections: general purpose computing environment; automated and manual testing; testing packages; and testing package relationships.
General Purpose Computing Environment
[15] With reference to Figure 1, an exemplary system for implementing the invention includes a computing device, such as computing device 100. In its most basic configuration, computing device 100 typically includes at least one processing unit 102 and memory 104. Depending on the exact configuration and type of computing device, memory 104 may be volatile (such as RAM), non-volatile (such as ROM, flash memory, etc.) or some combination of the two. This most basic configuration is illustrated in Figure 1 by dashed line 106. Additionally, device 100 may also have additional features/functionality. For example, device 100 may also include additional storage (removable and/or non-removable) including, but not limited to, magnetic or optical disks or tape. Such additional storage is illustrated in Figure 1 by removable storage 108 and non-removable storage 110. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Memory 104, removable storage 108 and non-removable storage 110 are all examples of computer storage media. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by device 100. Any such computer storage media may be part of device 100.
[16] Device 100 may also contain communications connection(s) 112 that allow the device to communicate with other devices. Communications connection(s) 112 is an example of communication media. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. The term computer readable media as used herein includes both storage media and communication media.
[17] Device 100 may also have input device(s) 114 such as a keyboard, mouse, pen, voice input device, touch input device, etc. Output device(s) 116 such as a display, speakers, printer, etc. may also be included. All these devices are well known in the art and need not be discussed at length here.
Automated And Manual Testing
[18] Testing of computer systems can be a time-consuming and tedious process. Two types of testing exist: automated testing and manual testing. Automated testing requires the running of an application on a test machine. The test application and any dependencies have to be preconfigured on a test machine before the test is executed. These dependencies include files, environment variable settings, registry settings, and commands. There can be a significant number of dependencies, of which failing to enable one will jeopardize the validity of a test run.
[19] Manual testing is another commonly used testing system. Manual testing includes having a user physically control a system to approach a desired condition and then monitoring the condition. For instance, this may include a game developer controlling a game to reach a desired point then evaluate performance or rendering of the game. Consistently being able to reach the same predefined location may be jeopardized by modifications to the environment, thereby making consistent testing difficult.
[20] A modified version of automated and manual testing may also be used. Here, "semi-automated testing" may be used to automate some portion of the testing process (e.g., system configuration), with manual verification and logging by a developer to follow.
Testing Packages
[21] To make testing easier, testing packages may be used to help define, organize, and/or pre-configure tests for deployment. In at least some aspects of the present invention, related testing packages may be combined in which they may inherit properties from one another. Multiple tests may inherit from any one package. Each test may be associated with a package, which includes the inheritable information.
[22] One advantage of packaging tests together is that dependency descriptions may be related between tests. Here, one package may be built upon other packages. A change in one package may then cascade through all dependent packages and associated tests and update them in the process.
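The inheritance and cascading behavior described above can be sketched in a few lines of Python. This is an illustrative sketch only, not the patented implementation; the `Package` class, its method name, and the sample values are all invented for the example:

```python
class Package:
    """A hypothetical testing package that inherits settings from a parent."""

    def __init__(self, name, parent=None, environment=None):
        self.name = name
        self.parent = parent                    # package this one derives from
        self.environment = environment or {}    # this package's own settings

    def effective_environment(self):
        """Merge settings from the root package down; a child package
        overrides any value also set by an ancestor."""
        chain = []
        node = self
        while node is not None:
            chain.append(node)
            node = node.parent
        merged = {}
        for pkg in reversed(chain):             # root (_Global) first, self last
            merged.update(pkg.environment)
        return merged


glob = Package("_Global", environment={"LOG_DIR": r"C:\Logs"})
ui_main = Package("UIMain", parent=glob, environment={"TEST_CMD": "TestUI.exe"})
ui1 = Package("UI1", parent=ui_main, environment={"LOG_DIR": r"C:\UI1Logs"})

print(ui1.effective_environment())
# {'LOG_DIR': 'C:\\UI1Logs', 'TEST_CMD': 'TestUI.exe'}
```

Because each lookup walks the chain at call time, editing a parent package's settings cascades to every dependent package, which is the advantage the text describes.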
[23] Computer scripts may be generated ahead of time or as needed. One advantage of generating the computer scripts on a just-in-time basis is the reduction in administrative overhead.
[24] Figure 3 shows a process for creating and deploying the test packages. A developer uses a testing packages user interface 301 to perform a number of operations with server 302. The developer may perform the following operations:
a. Define testing packages
b. Define test cases
c. Associate test case with package
d. Define test pass and
e. Deploy package.
[25] Next, in a client environment, an agent 303 on a test machine queries server 302 for packages associated with a test case. The server may then start with the topmost package, walk through the package inheritance chain, and return the contents to the agent 303. The agent 303 then generates a script using the dependency descriptions, as shown in box 306. Next, agent 303 executes the generated script, or may store it, or cause it to be stored, for later execution (locally or remotely).
[26] The script is then invoked by the agent 303 to pre-configure the test machines. The script may perform a number of other operations, including setting environment variables, setting registry entries, copying any required files, monitoring relevant applications, and scheduling any cleanup.
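A minimal sketch of this script-generation step might look as follows. The dictionary layout and the batch-style command syntax are assumptions for illustration, not the format used by the patented system:

```python
def generate_setup_script(package):
    """Turn a package's dependency descriptions (environment variables,
    registry entries, file copies) into one batch-style command per item."""
    lines = []
    for name, value in package.get("environment", {}).items():
        lines.append(f"set {name}={value}")
    for entry in package.get("registry", []):
        lines.append(
            f"reg add {entry['root']}\\{entry['path']}"
            f" /t {entry['type']} /d {entry['data']} /f"
        )
    for f in package.get("files", []):
        lines.append(f"copy {f['source']} {f['target']}")
    return "\n".join(lines)


# Hypothetical package contents, loosely modeled on the Figure 4 regions.
pkg = {
    "environment": {"TESTDIR": r"C:\Tests"},
    "registry": [{"root": "HKLM", "path": r"Software\Test",
                  "type": "REG_SZ", "data": "1"}],
    "files": [{"source": r"\\server1\Tests\Test.vbs", "target": r"C:\Tests"}],
}
print(generate_setup_script(pkg))
```

The agent would run such a generated script once per package in the dependency chain before invoking the test application itself.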
[27] Package-specific processes 304-306 may then be run on one or more test machines. For instance, package-specific processes 304 may be run on a first test machine while package-specific process 306 is run on a second test machine.
[28] The script may or may not perform simple cleanup operations after each test has been completed or when the collective set of tests has been completed.
[29] Figure 4 shows an illustrative user interface for configuring test packages in accordance with aspects of the present invention. Alternative user interfaces may include more or fewer sections or may span multiple interfaces.
[30] Figure 4 includes user interface 401. The user interface provides a developer with the ability to define a package and position it within a package hierarchy. The various packages are shown in region 402. Here, the top-level package is "Main". Subordinate packages include UIMain, UI1 (with subordinate packages UI1A and UI1B), UI2, UI3, UI4, and UI5. TextBoxMain includes subordinate packages TextBox 1, TextBox 2, and TextBox 3 (with subordinate packages TextBox 3A and TextBox 3B). Other packages may also be included. Various add and delete buttons may be used to increase and decrease the files/environment variables, command lines, registry entries, and the like that may be associated with a testing package.
[31] As shown in region 402, testing package UIMain has been selected. Information relating to testing package UIMain is shown in regions 403-407 of Figure 4. Region 403 shows the file or files referenced by testing package UIMain (possibly including the file name, source, destination, and whether the file needs to be registered (e.g., as a COM/DLL)). Another file, Test.vbs, is also shown. Region 404 shows environment variables related to testing package UIMain (possibly including a name associated with the environment variable and its setting). Region 405 shows a command line or lines associated with Test.vbs (including the command line as well as an indication whether it should be invoked during cleanup). Region 406 shows registry entries (including a root registry entry, a path, a registry type, and data associated with the registry entry). Region 407 shows a listing of related packages. Here, for instance, package UIMain is related to and derived from a global package (referenced as _Global) and OS_Main_IP. Dependent packages are derived from UIMain. These dependent packages include the Workspace, Parsing, Stylus Input, Text Box Automation, Mouse Input, and Keyboard Input testing packages. Region 407 also includes regions that allow one to specify new dependencies for packages and to reorder package dependencies (for instance, the order in which they may be executed).
[32] The following pseudo-code example represents a script that may be created by agent 303 running on a test machine after the agent 303 has retrieved packages from server 302. The following script example configures the test machine with related dependencies and then runs the test application.
Dim p As Package  ' an arbitrary package utilized in a test pass

Function GetPackageDependencies(Package x)
    ' Returns a list of packages that "x" is dependent on (including "x" itself).
    ' The list may be sorted from least dependent to most dependent package.
    ' The first package in this list is the "_Global" package.
    ' The last package in this list is "x" itself.
End Function

For Each RequiredPackage In GetPackageDependencies(p)
    SetEnvironmentVariables(RequiredPackage)
    SetRegistryValues(RequiredPackage)
    CopyFiles(RequiredPackage)
    InvokeExecutables(RequiredPackage)
Next RequiredPackage

For Each RequiredPackage In GetPackageDependencies(p)
    CleanupPackage(RequiredPackage)
Next RequiredPackage
[33] It is noted that the cleanup of testing procedures is optional in this example. The cleanup may occur after each test is performed, after some tests have been performed, or after all tests have been performed, if at all.
[34] Figure 5 shows various processes that may be used to deploy testing packages. In step 501, test packages are defined. Before, after, or simultaneously, test cases may be defined in step 502. In some situations, test cases may be defined well ahead of associating tests with test packages. In step 503, a test case or cases may be associated with a package or packages. In step 504, a test pass is defined. In step 505, the testing package or packages are deployed. Step 506 shows the automated reporting of results.
[35] In one example, automated testing may be combined with manual testing to ease burdens on developers needing to visually inspect aspects of a computer system that may be tedious to arrive at consistently. This may be referred to as semi-automated testing. Step 507 shows the reporting of semi-automated results.
[36] Step 508 shows the optional logging of manual results associated with manual verifications made in association with the test pass.
[37] Figure 6 shows a manual implementation of testing packages. In step 601, test cases are defined. In step 602, a test pass or test passes are defined. Either step 601 or step 602 may occur first; alternatively, both steps may occur at the same time. In step 603, a user logs manual results from running the testing system. The use of testing packages may allow a developer to more quickly arrive at a testing environment during running of a computer system.
Testing Package Relationships
[38] Figure 7 shows various relationships between objects, thereby creating a hierarchy of tests and testing packages. Testing package 701 includes package identification, a package name, and an indication of the owner of the package. Testing package 701 is associated with package dependency 702. Package dependency 702 includes host package identification, a guest package identification, and the order of the package dependency. Two references are shown from package dependency 702 to package 701. The first relates to the host package ID (naming the package itself) and the second relates to the guest package ID (the package from which the host package inherits).
[39] Also related to package 701 is environment entry 703. Environment entry 703 includes an environment entry identification, a package identification, an environment variable name, and a setting. [40] Further related to package 701 is registry entry 704. Registry entry 704 includes a registry entry identification, a package identification, a root indication for the registry entry, a path, type, and data fields.
[41] Package 701 further relates to file 705. File 705 includes a file identification, a package identification, a server location, file name, local target, and an indication whether the file needs to be registered (including but not limited to a COM executable or DLL file).
[42] Finally package 701 includes an indication of a command-line 706. Command-line 706 includes a command-line identification, a package identification, a cleanup operation, and command text.
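The Figure 7 schema described in paragraphs [38]-[42] can be sketched as a set of record types. The field names below are assumptions inferred from the descriptions above, and the sample values are hypothetical; this is not a normative rendering of the patent's data structures.

```python
# Sketch of the Figure 7 object relationships as Python dataclasses.
from dataclasses import dataclass, field

@dataclass
class PackageDependency:          # Figure 7, element 702
    host_package_id: int          # the package itself
    guest_package_id: int         # the package from which it inherits
    order: int

@dataclass
class EnvironmentEntry:           # element 703
    entry_id: int
    package_id: int
    variable_name: str
    setting: str

@dataclass
class RegistryEntry:              # element 704
    entry_id: int
    package_id: int
    root: str                     # e.g. "HKLM"
    path: str
    type: str                     # e.g. "REG_SZ", "REG_DWORD"
    data: str

@dataclass
class FileEntry:                  # element 705
    file_id: int
    package_id: int
    server_location: str
    file_name: str
    local_target: str
    needs_registration: bool      # e.g. a COM executable or DLL

@dataclass
class CommandLine:                # element 706
    command_id: int
    package_id: int
    is_cleanup: bool
    command_text: str

@dataclass
class TestingPackage:             # element 701
    package_id: int
    name: str
    owner: str
    dependencies: list = field(default_factory=list)

# Hypothetical instance: package 1 inherits from package 2.
pkg = TestingPackage(1, "ParserTests", "jwalker")
pkg.dependencies.append(
    PackageDependency(host_package_id=1, guest_package_id=2, order=0))
```

Storing dependencies as (host, guest, order) rows, as element 702 does, lets one package inherit from several others in a defined order while keeping each row a simple reference back to element 701.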
[43] The following provides sample entries that may be used in conjunction with the package 701 and related objects of Figure 7.
a. Sample environment entries:
1. TestUI_CMD = %RHINO_JOB%\TestUI.exe /xml:TestCases.xml /a /c /coverage /filter:"all+ fxcop- setup- monitor+"
2. TestUI_CMD = %RHINO_JOB%\TestUI.exe /xml:TestCases.xml /a /c /filter:"all+ fxcop- setup- monitor+"
3. PARSER_ACTIONSCRIPT_FILES = \\server1\ScriptData\Platform\Parsing\ParserTest\ActionScriptFiles\API-integration
4. SDK15_InstallPoint = \\server1\Builds\ArchivedBlds\TPGv1.5-RTM-2600.xxxx.3027.0\Retail\WispSetup
5. PARSER_METRIC_CONFIG_FILES = \\server1\ScriptData\Platform\Parsing\ParserTest\MetricConfig\API-integration
6. PARSER_LOG_FILES = \\server1\ScriptData\Platform\Parsing\ParserTest\AutomationLogs\M8Logs
7. PARSER_ACTIONSCRIPT_FILES = \\server1\ScriptData\Platform\Parsing\ParserTest\ActionScriptFiles
8. PARSER_ACTIONSCRIPT_FILES = \\server1\ScriptData\Platform\Parsing\ParserTest\ActionScriptFiles
9. PARSER_METRIC_CONFIG_FILES = \\server1\ScriptData\Platform\Parsing\ParserTest\MetricConfig
10. PARSER_ENGINE_CONFIG_FILES = \\server1\ScriptData\Platform\Parsing\ParserTest\EngineConfig
11. PARSER_ENGINE_CONFIG_FILES = \\server1\ScriptData\Platform\Parsing\ParserTest\EngineConfig
12. PARSER_METRIC_CONFIG_FILES = \\server1\ScriptData\Platform\Parsing\ParserTest\MetricConfig
13. VSPATH = %programfiles%\Microsoft Visual Studio .NET 2003\Common7\IDE
b. Sample Command Lines:
1. %RHINO_JOB%\cpparsertest.exe -logcon -c Provider=sqloledb;Server=SERVER1;database=InkLogs;Trusted_Connection=yes
2. cscript.exe %RHINO_JOB%\MUISetupBVT.vbs x:\\server1\osinst\MuirpBVT t:2600.1106.2201.0 w:2600.1106.2201.0
3. xcopy /Y %TESTSRC%\Test\Personalization\MHC\MHCResources\TestInkFiles\* .\MHCResources\TestInkFiles\
4. xcopy /Y %TESTSRC%\Test\Personalization\MHC\MHCResources\TestScripts\* .\MHCResources\TestScripts\
5. "%programfiles%\Microsoft Visual Studio\Common\MSDev98\Bin\msdev" .\inkeditautotest.vb6.exe
6. "%programFiles%\Internet Explorer\iexplore.exe" http://server1/windowsapplication8.exe
7. xcopy %SERVER1_DATA_FILES%\Systems\ContextE2E\DataUs\*google*.pif
8. cscript.exe %RHINO_JOB%\SetupBVTV2.vbs x:\\server1\osinst\V2setupBVT t:2600.1081.2201.0
9. cscript.exe %RHINO_JOB%\SetupBVT-Lonestar-removed.vbs x:%RHINO_JOB% t:2600.1081.2201.0
10. cscript.exe %RHINO_JOB%\SetupBVT.vbs x:\\server1\osinst\V1setupBVT t:2600.1081.2201.0
11. xcopy %TESTSRC%\TabletPC\Test\ProgrammaticScenarioCases*.xml .
12. xcopy %SERVER1_DATA_FILES%\Systems\ContextE2E\DataUs\*.pif .
13. %RHINO_JOB%\signaturesafe -s(signatures.txt) -d(master_sig.txt) -r(sig_cmp_rpt.txt)
14. xcopy %SERVER1_DATA_FILES%\Systems\ContextE2E\DataITA\*.*
15. xcopy %SERVER1_DATA_FILES%\Systems\ContextE2E\DataESP\*.*
16. xcopy %SERVER1_DATA_FILES%\Systems\ContextE2E\DataITA\*.*
17. xcopy %SERVER1_DATA_FILES%\Systems\ContextE2E\DataESP\*.*
18. xcopy %SERVER1_DATA_FILES%\Systems\TIPScenario\ESP\*.pif
19. %RHINO_JOB%\idlewriter -s -i(%ENLIST_DRIVE%\\%COM_AUT_DIR%\\*.idl) -o(autsig.txt)
20. %RHINO_JOB%\idlewriter -s -i(%ENLIST_DRIVE%\\%COM_DLL_DIR%\\*.idl) -o(dllsig.txt)
21. %RHINO_JOB%\LaunchIt.bat ManagedInheritance.html ManagedInheritance_Control.dll
c. Sample File Entries (including a server location, local target and filename if applicable):
1. \\server1\ScriptData\Platform\Parsing\ParserTest\WritingDrawingClassification\BVT&Acceptance WritingDrawing classification BVT data_Labeled.xml
2. \\Server1\public\TabletPC\retail\test\APIsControls recoa_vb6.exe
3. %TPG_TESTROOT%\ManifestAPI\Performance\ PerfApp.exe
4. %TPG_TEST_APISCONTROLS% GestureSemiCircleLeft.dat
5. %TPG_TEST_APISCONTROLS% GestureTap.dat
d. Sample Registry Entries (including a root path, a type, and data):
1. HKLM SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon\AutoAdminLogon REG_DWORD 1
2. HKLM SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon\AutoLogonCount REG_DWORD 1
3. HKLM SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon\DefaultDomainName REG_SZ REDMOND
4. HKLM SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon\DefaultUserName REG_SZ Guest
[44] The present invention has been described in terms of preferred and exemplary embodiments thereof. Numerous other embodiments, modifications and variations within the scope and spirit of the appended claims will occur to persons of ordinary skill in the art from a review of this disclosure.

Claims

We claim:
1. A system for creating testing packages comprising: a user interface that receives instructions linking two or more testing programs into a testing package; a storage that receives an association between the testing package and said two or more testing programs; a client computer that processes said testing package to generate testing procedures for a computer system.
2. The system according to claim 1, wherein said client computer dynamically generates said testing procedures from said testing package.
3. A computer-implemented method for creating testing packages comprising the steps of: defining testing packages; defining test cases; associating test cases with said testing packages; defining a test pass; and deploying said testing packages to a storage.
4. The computer-implemented method according to claim 3, further comprising the step of generating executable code from said packages.
5. The computer-implemented method according to claim 3, further comprising the step of running code generated from said testing packages to test a computer system.
6. The computer-implemented method according to claim 5, further comprising the step of logging manual results.
7. A computer readable medium having a program stored thereon for creating testing packages, said program comprising the steps of: defining testing packages; defining test cases; associating test cases with said testing packages; defining a test pass; and deploying said testing packages to a storage.
8. The computer readable medium according to claim 7, said program further comprising the step of generating executable code from said packages.
9. The computer readable medium according to claim 7, said program further comprising the step of running code generated from said testing packages to test a computer system.
10. The computer readable medium according to claim 9, said program further comprising the step of logging manual results.
11. In a computing system, a user interface for receiving user designations of testing packages comprising: a first region that displays testing packages; a second region that allows a user to modify relationships between packages.
12. The user interface according to claim 11, further comprising: a third region listing files associated with one of the testing packages.
13. The user interface according to claim 11, further comprising: a third region listing environment variables associated with a file that is associated with one of the testing packages.
14. The user interface according to claim 11, further comprising: a third region listing registry settings associated with a file that is associated with one of the testing packages.
15. The user interface according to claim 11, further comprising: a third region listing clean up settings associated with a file that is associated with one of the testing packages.
16. The user interface according to claim 11, further comprising: a third region providing the user with the ability to add a new testing package or file.
17. A computer-readable medium having a data structure stored thereon, said data structure relating a testing package, said data structure comprising: a first data structure storing a testing package identification; a second data structure referencing said testing package identification; and a third data structure referencing a child testing package.
18. The computer-readable medium according to claim 17, wherein said testing package identified in said first data structure inherits from said child testing package.
19. The computer-readable medium according to claim 17, further comprising: a fourth data structure that provides one or more registry entries associated with said testing package.
20. The computer-readable medium according to claim 17, further comprising: a fourth data structure that provides one or more environment variables associated with a file associated with said testing package.
21. The computer-readable medium according to claim 17, further comprising: a fourth data structure that references a file associated with said testing package.
22. The computer-readable medium according to claim 17, further comprising: a fourth data structure that describes a command line usable with a file associated with said testing package.
EP04780083A 2004-08-06 2004-08-06 Testing packages Withdrawn EP1782196A4 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2004/025184 WO2006022681A1 (en) 2004-08-06 2004-08-06 Testing packages

Publications (2)

Publication Number Publication Date
EP1782196A1 true EP1782196A1 (en) 2007-05-09
EP1782196A4 EP1782196A4 (en) 2010-08-25

Family

ID=35967774

Family Applications (1)

Application Number Title Priority Date Filing Date
EP04780083A Withdrawn EP1782196A4 (en) 2004-08-06 2004-08-06 Testing packages

Country Status (5)

Country Link
EP (1) EP1782196A4 (en)
JP (1) JP2008509472A (en)
KR (1) KR101038877B1 (en)
CN (1) CN101019103A (en)
WO (1) WO2006022681A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102300118B (en) * 2011-09-06 2015-03-25 博威科技(深圳)有限公司 Testing system and testing method for monitoring system
CN104956326A (en) 2013-02-01 2015-09-30 惠普发展公司,有限责任合伙企业 Test script creation based on abstract test user controls

Citations (2)

Publication number Priority date Publication date Assignee Title
US20040073662A1 (en) * 2001-01-26 2004-04-15 Falkenthros Henrik Bo System for providing services and virtual programming interface
US20040143830A1 (en) * 2003-01-17 2004-07-22 Gupton Kyle P. Creation of application system installer

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
US5751941A (en) * 1996-04-04 1998-05-12 Hewlett-Packard Company Object oriented framework for testing software
US6662217B1 (en) * 1999-01-19 2003-12-09 Microsoft Corporation Distributed and automated test administration system for administering automated tests on server computers over the internet
US20040148590A1 (en) 2003-01-27 2004-07-29 Sun Microsystems, Inc., A Delaware Corporation Hierarchical test suite

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
US20040073662A1 (en) * 2001-01-26 2004-04-15 Falkenthros Henrik Bo System for providing services and virtual programming interface
US20040143830A1 (en) * 2003-01-17 2004-07-22 Gupton Kyle P. Creation of application system installer

Non-Patent Citations (1)

Title
See also references of WO2006022681A1 *

Also Published As

Publication number Publication date
WO2006022681A1 (en) 2006-03-02
KR101038877B1 (en) 2011-06-02
EP1782196A4 (en) 2010-08-25
KR20070040412A (en) 2007-04-16
CN101019103A (en) 2007-08-15
JP2008509472A (en) 2008-03-27

Similar Documents

Publication Publication Date Title
EP3769223B1 (en) Unified test automation system
Gallaba et al. Use and misuse of continuous integration features: An empirical study of projects that (mis) use Travis CI
US8516464B2 (en) Computer system and method for resolving dependencies in a computer system
US9063725B2 (en) Portable management
CN105657191B (en) Application increment upgrading method and system based on Android system
US9152403B2 (en) Virtual software application deployment configurations
US8805804B2 (en) Configuring an application program in a computer system
US20080127175A1 (en) Packaging software products as single-file executables containing scripting logic
US9329841B2 (en) Software asset packaging and consumption
US20040088397A1 (en) System and method for management of software applications
US20090287643A1 (en) Context based script generation
US11481245B1 (en) Program inference and execution for automated compilation, testing, and packaging of applications
CN110908670A (en) Method and device for automatically publishing service
EP1782196A1 (en) Testing packages
Springer et al. Accelerating SCA compliance testing with advanced development tools
US9389844B2 (en) Solution for a computer system
US20050278694A1 (en) Describing Runtime Components of a Solution for a Computer System
US11360805B1 (en) Project discovery for automated compilation, testing, and packaging of applications
Elia et al. ITWS: An extensible tool for interoperability testing of web services
CN117215965B (en) Test case identification-based test method and device, electronic equipment and medium
Pasala et al. An approach for test suite selection to validate applications on deployment of COTS upgrades
Zielińska Framework for Extensible Application Testing
Kulkarni Provenance Issues in {Platform-as-a-Service} Model of Cloud Computing
Soundarrajan et al. Using black-box persistent state manifest for dependency management in patching and upgrading J2EE based applications
Schaefer JBoss 3.0: Quick Start Guide

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20070116

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PL PT RO SE SI SK TR

DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20100726

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20120918