EP1236108A1 - A system and method for testing and validating devices having an embedded operating system - Google Patents

A system and method for testing and validating devices having an embedded operating system

Info

Publication number
EP1236108A1
Authority
EP
European Patent Office
Prior art keywords
operating system
testing
test
computer
target device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP00909951A
Other languages
German (de)
French (fr)
Inventor
Peter R. Gregory
James Floyd Walters
Janardhana Rao Dikkala
Ian Sample
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bsquare Corp
Original Assignee
Bsquare Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bsquare Corp
Publication of EP1236108A1 (en)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/22 Detection or location of defective computer hardware by testing during standby operation or during idle time, e.g. start-up testing
    • G06F 11/26 Functional testing
    • G06F 11/263 Generation of test inputs, e.g. test vectors, patterns or sequences; with adaptation of the tested hardware for testability with external testers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/362 Software debugging
    • G06F 11/366 Software debugging using diagnostics

Definitions

  • TECHNICAL FIELD This invention relates to product quality assurance and to test systems and methods for validating operating systems provided in computerized products. More particularly, the present invention relates to product quality assurance and to test systems and methods for validating operating systems during the development of computerized products. Even more particularly, the present invention relates to product quality assurance and to test systems and methods for validating operating systems, such as Windows CE, manufactured and sold by Microsoft Corporation of Redmond, Washington, typically provided in computerized products. BACKGROUND OF THE INVENTION Increasingly, developers are embedding operating systems, such as Windows CE, into many different types of computerized products, including set-top boxes, gaming systems, bar-code scanners, and factory automation systems.
  • the present invention, to be commercially available under the applicant's assignee's trademark CEValidator™, is an operating system validator (herein also referred to as O/S Validator, and designated in the Figures with the numeral 1), which solves the foregoing problems by providing a test system encompassing an automated test suite method for testing a port of an operating system, such as Windows CE, in a target device's newly developed hardware and/or software.
  • the O/S Validator comprises a comprehensive code base, specifically developed to purposefully stress the O/S, device driver, OEM Adaptation Layer (OAL), and hardware interaction.
  • the provided test suites focus on identifying three primary defects: hardware design, hardware programming (drivers/OAL), and operating system interaction.
  • the test suites comprise nearly 1500 tests which include system stress-testing routines, as well as feature-and-function tests, providing a complete analysis of a Windows CE port. These tests are grouped by the O/S Validator according to the O/S subsystem being verified.
  • the O/S Validator includes both test source codes and executable programs for all tests.
  • an intuitive user interface for the O/S Validator host component, such as a standard Windows application leveraging the Microsoft Windows user interface, is utilized.
  • the O/S Validator distributes test suites as a client/server application.
  • a graphical user interface interacts with a small application, CEHarness.exe, which is running on a target device. Because this communication may occur over Ethernet, at least one host may run suites against at least one target device.
  • the O/S Validator generates useful error information when a target device fails a test.
  • results are displayed in a plurality of dynamically created log windows as well as in a configuration's summary tab.
  • the logging windows contain the full text of a given test's results. Failures are color-coded red to ease identification. Navigation buttons in the logging window allow you to quickly move from one failure to another.
  • the logging APIs in the tests also cause a prolog and an epilog to be generated in each result file. Information such as concurrently running processes, battery power level, and execution date and time is automatically recorded in the results file and displayed in the log window. Useful summary information such as loss of program memory, loss of storage memory, or total test execution time is provided in a log window tab.
  • the summary information for a given test result is also collected and displayed in a Summary tab of the configuration window.
  • the summary tab reports the number of PASS and FAIL test cases in real time. Breakout PASS and FAIL numbers for individual suites are also displayed.
  • the configuration window's Summary tab facilitates quick navigation to an individual failure among perhaps thousands of test results.
  • the exact source file and line number corresponding to a logged failure are automatically reported by the O/S Validator's logging APIs. Since O/S Validator provides the source code for all of its executables, being able to go directly to the source code reporting an error is a powerful adjunct to the textual descriptions of the failure.
  • Figure 1.0 is a schematic diagram representative of a computerized product presently being provided with an embedded operating system in a control unit.
  • Figure 2.0 is a manufacturing flow diagram illustrating quality assurance testing on a computerized product provided with an embedded operating system, in accordance with the present invention.
  • Figure 3.0 is a block diagram showing the primary components of the operating system validator of the present invention, including a graphical user interface, an engine, a plurality of test suites and a logging library.
  • Figure 4.0 is a block diagram showing the present invention executing a plurality of test suite configurations from a host device for testing a plurality of target devices provided with an embedded operating system, in accordance with the present invention.
  • Figure 5.0 is a block diagram showing the present invention essentially as depicted in Figure 4, except showing communication by the target device with the O/S Validator 1 at the host via Ethernet means.
  • Figure 5a illustrates an arrangement where communications between a plurality of host and target devices may occur over Ethernet.
  • Figure 6 shows yet another arrangement for a test suite execution situation.
  • Figure 7.0 is a table listing of functional areas of specific APIs tested in automatic and manual test suite execution.
  • Figures 8A, 8B, and 8C are a table listing a comprehensive list of functional areas and their APIs which may be tested in automatic or manual mode.
  • Figure 9.0 is a table listing of selected APIs for use in building automation scripts.
  • Figure 10.0 is a schematic diagram representative of the concept of the present invention providing source code for all executable programs.
  • Figure 11.0 is a block diagram representation of a window showing the test suite selection options as well as other related summary functions.
  • Figure 12.0 is a block diagram representation of a logging window showing tabs for test results, test failures, and related test summary options.
  • Figure 13 illustrates in graph form the relationship of test cycle time versus the number of test devices being concurrently tested.
  • Figure 14.0 is a block diagram representation of a configuration window showing tabs for executing a variety of configuration-related functions.
  • Figures 15A, 15B, and 15C show a table listing testing details of the operating system components, in accordance with the present invention.
  • Figure 1.0 shows a computerized product 1000 (9), typical of computerized products such as computer workstations, set-top boxes, gaming systems, bar-code scanners, and factory automation systems presently being provided with an embedded operating system, depicted by the numeral 1001a.
  • product 1000 may comprise a typical target device 9 provided with an operating system 1001a, such as an embedded Windows CE operating system (O/S).
  • the computerized product 1000 may function as a standalone device, having an installation of the present invention, the O/S Validator 1, for testing and validating its own operating system 1001a.
  • the standalone testing facilitates new test development and debugging of reported defects, since elaborate knowledge of the O/S Validator infrastructure is not required.
  • product 1000 may function in a manufacturing quality assurance testing environment M as a host computer 4 having an installation of the present invention, the O/S Validator 1, for testing target devices 9 provided with an operating system 1001a.
  • a computerized product 1000 may comprise, for example, several sub-components, including a control unit 1020 with a plurality of input/output ports 1021, a keyboard 1009, a printer 1010, a mouse 1011, and a monitor 1012.
  • the sub-components 1009, 1010, 1011, 1012 themselves may be testable target devices.
  • the typical control unit 1020 itself comprises several sub-components, including a central processing unit 1001, storage devices such as a hard disk drive 1004, other memory components including RAM 1002, a ROM 1003, a compact disc 1005, an audio component 1006, a network/server card 1007, and a modem 1008. Included in the control unit, of necessity, is an operating system 1001a, to make product 1000 functional as a useful device.
  • Figure 3.0 shows the primary components of the O/S Validator 1 including a graphical user interface (GUI) 2, an Engine 3, a plurality of Test Suites 11, and a Logging Library 12.
  • the GUI 2 and the Engine 3 communicate internally, in both directions, through a component called HarnessLink.dll, designated by the numeral 7 within O/S Validator 1, which is discussed in more detail below.
  • Figure 4.0 illustrates a host computer 4 provided with O/S Validator 1. As illustrated, a plurality of target devices 9, are provided with an O/S 1001a for being tested in accordance with the present invention.
  • the O/S Validator 1 has capabilities of generating testing configurations, such as a plurality of test configurations 21a, 21b, and 21c, for testing a particular function under control of OS 1001a within target devices 9.
  • a device side component termed CEHarness 8 communicates with Engine 3 in O/S Validator 1.
  • CEHarness 8 may also communicate with Engine 3 in O/S Validator 1 via Ethernet means 4a.
  • Figure 5a illustrates that because communication may occur over Ethernet, a plurality of hosts 4 may run suites against a plurality of target devices 9.
  • CEHarness 8 may also communicate with Engine 3 in O/S Validator 1 via suite execution connection 1021a, where host computer 4 may comprise an NT host computer, the logging library 12 is also provided in a target device 9, and the test results are provided to host computer 4 via socket connections 1021b.
  • the O/S Validator 1 tests and validates target devices 9 provided with an embedded operating system, by example a Windows CE Operating System.
  • the O/S Validator 1 functions by (1) validating the Windows CE port to a target device, (2) by providing stress and performance testing, (3) by logging and analyzing results, (4) by testing a plurality of functional areas, including a plurality of application program interfaces (API), see Table 1.0 and Table 2.0 in Figures 7, 8A, 8B, and 8C, respectively, (5) by executing a plurality of pass/fail tests in a plurality of test suites, (6) by facilitating customization of tests in the automated suites, (7) by providing a host-side graphical test harness, (8) by stressing and evaluating memory performance, (9) by providing means for building test automation, including a plurality of APIs, see Table 3.0 in Figure 9.0, and (10) by providing a results analysis tool, termed CEAnalyzer.
  • O/S Validator 1 includes, for all tests, both test source codes SC and executable programs EP (also referred to as test executables).
  • the sole implementation requirement for a test executable EP is that test case "passes" and "failures" be reported using two specific APIs: WRITETESTPASS() and WRITETESTFAIL(). These macros have signatures similar to the well-known printf() function, but their use generates a standard test case result format in the "results" file amenable to automated summarization and integrates the reporting with the host user interface.
  • the O/S Validator 1 method further comprises mediating between the GUI 2 and the test cases encapsulated in the test executables EP and providing a means by which the GUI 2 distributes and coordinates the execution of tests.
  • the test suites 11 comprise text files 5 composed of suite language commands (e.g. PUT, RUN, WAIT, and DELETE) which are the direct expressions of the tasks needed to distribute and execute the test.
  • the O/S Validator 1 method includes organizing the test suite files 5 by their hierarchical placement within a directory structure on the host computer 4. As shown in Figure 11, test suites 11 are divided at the top level as being either automatic Au test suites, manual Ma test suites, or stress SS test suites. Automatic suites Au are tests which do not require any user intervention during their execution. In contrast, manual suites Ma do require user intervention (e.g. keyboard and touch panel suites).
  • the O/S Validator 1 method includes stressing the system, through the stress suites SS, by pushing input/output (I/O) throughput to the limit, operating with little or no available program or object store memory, and possibly bringing down a custom file system with multi-threaded concurrent stressing.
  • the O/S Validator 1 method includes arranging the test suites 11 by the functional areas, see generally Figure 7.
  • Figure 3.0 shows the primary components of the O/S Validator 1 including a graphical user interface (GUI) 2, an Engine 3, a plurality of Test Suites 11, and a Logging Library 12.
  • the GUI 2 design is based on the concept of interfacing the user input with underlying components.
  • the Engine 3 holds the core of the functionality on the host 4.
  • the Engine 3 reads a plurality of suite files 5, parses them, and executes the commands.
  • the Engine 3 obtains the information from the GUI 2 and uses the information to set-up a variety of execution options.
  • HarnessLink.dll 7 is an ActiveX control. HarnessLink.dll 7 is instantiated and called from the GUI 2 with a variety of information, which is passed to the Engine 3 before it begins to execute. Dll link 7 also functions to communicate between the Engine 3 and the GUI 2 during the execution, to relay information, relay error messages, and to relay some dynamic run-time commands.
  • the target device 9 comprises a device-side (as opposed to a host-side) component called CEHarness 8.
  • CEHarness is a C/C++ program residing on the target device 9 and, as shown in Figure 4, communicates nearly exclusively with the Engine 3, unless broadcasting target information on the network, with the GUI 2 receiving such information and passing it to the Engine 3, see Figures 5 and 5a.
  • CEHarness 8 is an event-driven application, where the Engine 3 sends the events and CEHarness 8 responds.
  • the two remaining components, test suites 11 and logging library 12, are intertwined since the test suites 11 are written using a plurality of application program interfaces (API) 13 that are part of the logging library 12, see Figures 8A, 8B and 8C.
  • the logging library 12 has a simple functionality concept, communicating the test results 14 by creating log files 15 by either logging TCP/IP information 16 back to the GUI 2, see Figure 5.0, or by writing results 14 and log files 15 directly to the device 9, see Figure 6.0.
  • a logging window LW shows the test results 14 in test files 15, failures F and a summary tab SumT which facilitates user-access to program memory, passes, failures, and timing information.
  • the plurality of test suites 11 comprise the indispensable component of the O/S Validator 1.
  • Figure 13 illustrates in graph form that the test cycle time CT decreases as the number of concurrently tested devices increases.
  • the GUI 2 is complex code due to the degree of functionality required for handling its layer of components.
  • GUI 2 provides a "wizard" whose primary function is walking new users through the various selectable settings and listing the default settings.
  • GUI 2 also provides a configuration window CW, as the means for executing a test run which comprises a single pass through a set of selected suites 11 on a target device 9.
  • a plurality of configurations 21a, 21b, and 21c may be run to simulate a variety of scenarios.
  • the contents of a configuration window CW comprises a plurality of tabs for user control.
  • suite tab S provides a tree view of the suite files directory under the O/S Validator directory. This tree view is organized to provide meaningful distinctions between the types of tests 11 the user selects.
  • Test suites 11 are scripts of commands that the Engine 3 reads and then performs actions corresponding to the commands in the script.
  • the suite files 5 generally start with a series of comments. These comments appear in the suite file information section of the file. If the word "Manual" appears at the top of a suite file 5, the file is deemed to require manual execution and is, therefore, assigned a different icon.
  • In the test suite section, as illustrated in Figure 11, the user may reorder the test suite files 5 in any way. Still referring to Figure 14, the logging tab contains considerable valuable information.
  • the user may select three methods of logging, namely LH for logging to the host 4, LTD for logging to the target device 9, or LHD for logging to both host 4 and target device 9.
  • the log information is then stored in a configurable directory listed in an edit box. All this information is sent through the DLL 7 to the Engine 3 and then to CEHarness 8 in target device 9. Subsequently, the information is acquired by the logging library 12 upon running a test 11.
  • Other tabs in the configuration window CW include a stress condition tab SC, a thread tab T for selecting high-priority threads during suite execution, tabs PM and SM for reducing program and storage memory, a tab SRT for selecting run time, and a STOP tab for stopping the run.
  • the user can utilize infinite loop tab Hoop for finding memory leaks in the system.
  • Useful summary information such as loss of program memory, loss of storage memory, or total test execution time is provided in a summary tab SumT.
  • the summary information for a given test result is also collected and displayed in a Summary tab SumT.
  • the summary tab reports the number of PASS and FAIL test cases in real time. Breakout PASS and FAIL numbers for individual suites are also displayed.
  • the configuration window's Summary tab facilitates quick navigation to an individual failure among perhaps thousands of test results.
  • the exact source file and line number corresponding to a logged failure are automatically reported by the O/S Validator's logging APIs. Since O/S Validator provides the source code for all of its executables, being able to go directly to the source code reporting an error is a powerful adjunct to the textual descriptions of the failure.
  • the logging options vary dramatically in their implementation and effects.
  • the first option presumes that whenever the user runs a test suite 11 resulting in a "pass", a set of log files 15, summarizing these test results 14, automatically issues.
  • the summary files 15 are created by either CEHarness 8 or the Engine 3. Basically, the Engine 3 traverses all the log files 15 in the logging directory, so the user may receive a log files list not corresponding to the test 11 being run.
  • the user can delete the log directory before running, or the user can select a logging option to select 'Return only the newest summary results' which causes the Engine 3 to traverse only one log file 15 for each test 11.
  • a window termed Available Targets, shows the active devices on the network that are broadcasting information to the GUI 2.
  • the active devices send a large volume of information, some of which is displayed in the Available Targets window.
  • the user may view the information by selecting a View/Available Targets menu. Another window must be accessed to obtain the complete set of broadcast information. This broadcast information is valuable, because it is used to initialize the connection from the Engine 3 to a particular CEHarness 8 in a test target device 9.
  • the Stress Suites 11 are manifested in a variety of forms; however, their fundamental purpose is stressing the target device 9 by running a very long test, running multiple iterations of a short test, or selecting a wide range of parameters in one test. These remain specific to their own test 11 area; for example, Database Stress Suites only stress the Database functionality of an O/S, such as Windows CE.
  • the Stress Test Options are distinguishable from Stress Suites 11.
  • the Stress Test Options are different, because they target functionality to provide more broadband stress scenarios equivalent to real world usage models. These scenarios run in conjunction with any user-selected set of test suite files 5.
  • the Stress Test Options can and should be run both conjointly and separately as, in so doing, a significant boost to the range of any test plan is provided.
  • the first two Stress Test Options are related to the memory on the target device 9.
  • the first Stress Test Option is the Low Virtual Memory Option, which greatly reduces the amount of virtual memory of the target device before running the selected tests 11. This simulates the realistic harsh circumstances that might occur when a user has opened fifteen applications, effecting a malfunction.
  • the second Stress Test Option is the Low Storage Memory option. When selected, this second Stress Test Option fills the storage memory of the target device 9 to its maximum capacity in order to experience the target device
  • This second Stress Test Option is also good for testing application programs as contained within the target device 9 as they may depend on non-existent storage memory.
  • the next three Stress Test Options are execution options.
  • the first executable stress option is the infinite loop, which is ideal for long test cycles. A common problem in many device drivers is malfunction under long, intense, stressful situations.
  • This infinite loop stress test provides a test for determining a possible breakdown. This infinite loop test runs the selected suites 11 until the user manually hits the Stop button.
  • the next stress execution option is the configurable CPU cycle deprivation test, available as a text file called Data.txt in O/S Validator\Tests\TestInputFiles. Two examples are provided in the file which a user may copy, reuse, or modify.
  • the text file Data.txt controls the number of threads and their attributes that the user may include in his test run. In other words, the user can run his tests while other processes are consuming the CPU time, helping to eliminate many problems, including timing problems.
  • the last Stress Test Option is the Random Execution. When the user selects this option, the GUI 2 will reorder the list of test suites 11 at run time so that they run in a different order. This option is ideal, because it facilitates the diagnosis of interaction problems with various components.
  • the remaining Test Run options are generic activities that the user may control.
  • the first option, "Use Selected Target Exclusively," is important, because, when a target device 9 is connected over the Ethernet, other users in a subnet can access that target device 9 through the O/S Validator 1 available target devices window. This helps to create stress on the target device 9. In the event that the user wishes to isolate a problem, extra stress should not be applied. In that situation, the user should have exclusive access to the target device 9.
  • the last Test Run Option is Set Target Time, which prompts the Engine 3 to send the time from the host computer 4 to the target device 9, thereby synchronizing the target device 9 system time to the host computer 4 time.
  • Synchronization is advantageous as the log files return with a date and time stamp related to the date and time on the target device 9.
  • the last tab before running a test is the Environment Settings tab which contains valuable information for the various selectable environment variables.
  • These environment settings are designed to extend and abstract the test suite files 5 by allowing the test suite files to contain environment variables instead of hard-coded information.
  • the Serial Tests take an available com-port as a parameter. If the com-port is not entered as an environment variable, the test fails, because it is unable to open a com-port. All the environment variables used in the suites are provided; however, any additional environment variables may be user-added to the user-input suites.
  • a Test Status is available for obtaining status information for the current test run.
  • the information is dynamically being updated.
  • a Suites Run section window lists the selected suites that have started. The user may open any log file from this window by selecting a desired test icon. The other icons in this control provide failure information. The failure icons are shown, by example, as a stylized beaker crossed-out with a red "X.”
  • a Test Run Summary Information section keeps track of the number of test suite files run, suites selected, suites with a failure, and percentage of suites with a failure.
  • the user may select a Configure Failed Suites tab prompting the appearance of a new configuration window selecting all the failed suites in the current test run, facilitating regression testing.
  • The remaining two sections are called Test Details.
  • One of these Test Details sections monitors the individual test cases that pass, as well as fail, a section which is valuable for gauging the value of a test run.
  • the remaining Test Details section is the Failed Suites section where all selected suites with a fail are listed by suite name, showing the number of corresponding passing and failing test cases. All this information gives the user a very good idea of the limits of his target device 9 during a test run (i.e. what passes, and more importantly what fails).
  • the primary object of the present invention is to properly test a port of an O/S 1001a, such as Windows CE.
  • hundreds of suite tests 11 are needed and are provided by the O/S Validator 1.
  • nearly 1500 tests, organized in test suites 11 and grouped by the verified O/S subsystem, are provided.
  • the O/S Validator 1 includes both source codes and executable codes for all tests 11, covering the major O/S 1001a subsystems and the common adaptation drivers, with special emphasis on subsystems historically exhibiting the most problems.
  • OS subsystem components tested by the O/S Validator 1 include: Ethernet/NDIS, serial port driver, display driver, touch panel driver, mouse driver, keyboard driver, OEM Adaptation Layer, and PC Card adapter driver.
  • Figures 15A, 15B, and 15C show a table listing testing details of these system components.
  • HarnessLink.dll 7 provides a set of functions on the GUI 2 for calling (i.e. affecting) the Engine 3.
  • HarnessLink.dll 7 function-calls set up some parameters for the Engine 3's command line. All the initial information is passed to the Engine 3 through this command line, ensuring readiness of the Engine 3 for specified execution.
  • a plurality of GUI-related functions provide information on the command line. The information on the command line corresponds to the GUI 2 information.
  • HarnessLink.dll 7 also serves to communicate activity during the running of the Engine 3 by opening a named pipe.
  • the Engine 3 communicates through the named pipe.
  • a named pipe ensures that the communication between the Engine 3 and a specific Harness link is direct, accurate, and without any duplication problems if a plurality of Engines 3 are running.
  • When the HarnessLink.dll 7 receives a message from the pipe, it signals the appropriate VB event which, in turn, causes the GUI 2 to take the information and process it accordingly.
  • Engine 3 communicates with one HarnessLink.dll 7, which, in turn, communicates with one CEHarness 8.
  • the Engine 3 execution is simple: a command line is received and processed, establishing the execution socket connection to the target device, opening the pipe for communication with the GUI 2, reading the test suite files 5, and subsequently executing the tests in three phases, PreExecution, Execution, and PostExecution.
  • the PreExecution stage establishes the error socket connection between the target device 9 and the host 4. Relevant data such as logging paths and styles, various test run information, and stress scenarios are sent during the PreExecution stage.
  • the Execution stage involves a response to each sequential suite command.
  • a suite command is generally sent by the host 4 and is processed by the CEHarness 8 which, in turn, responds with a socket message when execution of the command is completed.
  • CEHarness 8 in test target device 9 is a significantly more complicated component than the Engine 3. This complexity is because each device 9 has one instance of CEHarness 8 at any given time; however, that device can handle a plurality of simultaneous connections, a vitally important feature to the testing methodology provided by the O/S Validator 1.
  • When the user starts CEHarness 8, it creates two threads that endure through the entire execution time: a broadcast thread and an execution thread. This broadcast thread updates information such as the device IP, connection type, and available com-ports every ten seconds, sending a broadcast message at the same rate to the network.
  • the connection type will change to PPP_PEER. If this occurs, the broadcast message is sent only to the host 4 from which the target device 9 is directly connected. If the user changes the connection at some point during execution, the message is updated. Meanwhile, the execution thread waits for a connection attempt from an Engine 3. When the execution thread receives the connection, it spawns another thread, a main execution thread, that performs the various functions required. The main execution thread starts another socket for sending any error or memory information. Thus, the execution thread is event-driven, receiving a command and responding appropriately. Each connection attempt spawns its own execution thread; therefore, a single CEHarness 8 may have many active connections, extending the test functionality by running a plurality of configurations simultaneously on one target device 9, and thereby creating a more realistic stress situation.
  • the Logging library 12 shown in Figure 3.0 is a complex tool integrated into the O/S Validator 1 in various ways. Primarily, the Logging library 12 is integrated into the source files for the tests. The tests make calls to the library APIs whereupon the library handles all the details regarding capture and recordation of test results.
  • the Logging library 12 also supports a variety of communication options. The recommended option is through TCP, allowing the user to see the readout of the log files as they migrate from the TCP connection. Another communication option is direct logging to files on a device. This can be advantageous, for example, if the user wants to log to a PCMCIA card but does not want extra information to be broadcast over the network. A sketch of this choice between the two options appears after this list.
  • the Logging library 12 acts as the device-side component to the TCP communications
  • a host 4 component acts as the GUI 2-side component to the communications.
  • This aspect of the logging library 12 provides a log window on the host 4 and color coding for the test results.
  • failure messages are displayed as a red line.
  • the name and location of the source code file, as well as the line number where the message was generated, are included in the logging library 12 messages.
  • a detailed error message is provided describing current events in the program.
  • Each log window has a summary tab which facilitates user-access to program memory, passes, failures, and timing information. Another important feature of the log files is that they capture a large volume of information at the beginning and at the end of a test.
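The choice referenced above, between streaming results over TCP back to the host GUI and writing them directly to a file on the device, can be pictured with a short C++ sketch. The function names, the transport stand-in, and the device log path below are hypothetical illustrations under stated assumptions; the actual logging library interfaces are not reproduced here.

```cpp
#include <cstdio>
#include <string>

// Hypothetical stand-in for the device-side TCP send; the real logging
// library's transport and message framing are not described in this document.
bool SendLineToHost(const std::string& line)
{
    std::printf("[to host over TCP] %s\n", line.c_str());
    return true;
}

// Sketch of the two communication options: either stream a result line back
// to the host GUI over TCP, or append it to a log file kept on the device
// (for example on a PCMCIA card).
void LogResultLine(const std::string& line, bool log_over_tcp,
                   const std::string& device_log_path)
{
    if (log_over_tcp) {
        SendLineToHost(line);
        return;
    }
    std::FILE* log = std::fopen(device_log_path.c_str(), "a");
    if (log) {
        std::fprintf(log, "%s\n", line.c_str());
        std::fclose(log);
    }
}

int main()
{
    // Stream one result to the host, write another to a local (hypothetical) path.
    LogResultLine("FAIL [serialtest.cpp:42] could not open COM1", true, "");
    LogResultLine("PASS [serialtest.cpp:57] opened COM2 at 9600 baud", false,
                  "\\Storage Card\\validator.log");
    return 0;
}
```

Logging to a local file keeps test traffic off the network, while TCP logging lets the host display and color-code failures as they arrive.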

Abstract

A system and method for improving quality assurance, imparting many person-months of time and cost savings, and streamlining the product development process of target devices utilizing commercially available operating systems, such as Windows CE, includes a testing device which applies a suite of comprehensive validation test programs. This system and method, an O/S Validator (1), provides a fully automated design verification package for commercially available operating systems (1000a), such as Windows CE, by utilizing a host graphical user interface harness (12), host-to-target communications (3), at least one test suite (11), and a result capture methodology (12). The O/S Validator (1) provides a faster and more accurate automated test suite technology for testing a port of a commercially available operating system (1000a), such as Windows CE, to target hardware (1000, 9). In addition, O/S Validator (1) includes a comprehensive code base developed specifically to purposefully stress the operating system (O/S), device driver, OEM Adaptation Layer (OAL), and hardware interaction. The test suites (11) focus on identifying three primary defects including hardware design, hardware programming (drivers/OAL), and operating system interaction. Special diagnostic emphasis is placed on operating subsystems which have historically demonstrated the most problems.

Description

PATENT APPLICATION
A SYSTEM AND METHOD FOR TESTING AND VALIDATING DEVICES HAVING AN EMBEDDED OPERATING SYSTEM
CROSS-REFERENCE TO RELATED APPLICATIONS
This patent application is related to U. S. Provisional Patent Application Serial No. 60/116,824, entitled: CE VALIDATOR TEST SUITE, filed January 21, 1999, and to U. S. Provisional Patent Application Serial No. 60/137,629, entitled: PROTOCOL ACKNOWLEDGEMENT BETWEEN HOMOGENEOUS SYSTEMS, filed June 4, 1999.
TECHNICAL FIELD
This invention relates to product quality assurance and to test systems and methods for validating operating systems provided in computerized products. More particularly, the present invention relates to product quality assurance and to test systems and methods for validating operating systems during the development of computerized products. Even more particularly, the present invention relates to product quality assurance and to test systems and methods for validating operating systems, such as Windows CE, manufactured and sold by Microsoft Corporation of Redmond, Washington, typically provided in computerized products.
BACKGROUND OF THE INVENTION
Increasingly, developers are embedding operating systems, such as Windows CE, into many different types of computerized products, including set-top boxes, gaming systems, bar-code scanners, and factory automation systems. As Windows CE has grown, so too has the need for "off-the-shelf" software development tools. Although many tools and "off-the-shelf" software kits have been on the market for saving device design time, leading to speedy device development, no fast testing system or method existed to verify compatibility of these new products, especially at the final stages of device development.
Traditionally, only two operating system device testing options have been available: (1) in-house writing of the test code, or (2) out-sourcing the custom code development to another firm. To complete the testing project in-house, Original Equipment Manufacturers (OEMs) must spend months training their staff, more months developing the test codes, and yet even more months of preparation before their product can be tested using such codes. Likewise, an out-source custom code development house would spend months writing the code. Thus, both options are time-consuming, and therefore, costly.
The foregoing time consuming operating system device testing options have created a long-felt need for a consolidated testing system and method for improving product quality, imparting time and cost savings of many person-months, and streamlining of the product development process. In particular, a system and method for testing and validating devices having an embedded Windows CE operating system installed is needed to overcome the foregoing problem and thus provide a system and method which improves product quality, imparts time and cost savings of many person-months, and streamlines the product development process resulting in a fully automated design verification package for device designers.
BRIEF SUMMARY OF THE INVENTION
Accordingly, the present invention, to be commercially available under the applicant's assignee's trademark of CEValidator™, is an operating system validator (herein also referred to as O/S Validator, and designated in the Figures with the numeral 1), which solves the foregoing problems by providing a test system encompassing an automated test suite method for testing a port of an operating system, such as Windows CE, in a target device's newly developed hardware and/or software. The O/S Validator comprises a comprehensive code base, specifically developed to purposefully stress the O/S, device driver, OEM Adaptation Layer (OAL), and hardware interaction. The provided test suites focus on identifying three primary defects: hardware design, hardware programming (drivers/OAL), and operating system interaction. Special diagnostic emphasis is placed on Windows CE subsystems which have historically shown the most problems. The test suites comprise nearly 1500 tests which include system stress-testing routines, as well as feature-and-function tests, providing a complete analysis of a Windows CE port. These tests are grouped by the O/S Validator according to the O/S subsystem being verified. The O/S Validator includes both test source codes and executable programs for all tests.
To simplify execution of test suites and collection of logging results, an intuitive user interface for the O/S Validator host component, such as a standard Windows application leveraging the Microsoft Windows user interface, is utilized. The O/S Validator distributes test suites as a client/server application. A graphical user interface (GUI) interacts with a small application, CEHarness.exe, which is running on a target device. Because this communication may occur over Ethernet, at least one host may run suites against at least one target device. The O/S Validator generates useful error information when a target device fails a test.
While the suites are running, results are displayed in a plurality of dynamically created log windows as well as in a configuration's summary tab. The logging windows contain the full text of a given test's results. Failures are color-coded red to ease identification. Navigation buttons in the logging window allow you to quickly move from one failure to another. The logging APIs in the tests also cause a prolog and an epilog to be generated in each result file. Information such as concurrently running processes, battery power level, and execution date and time is automatically recorded in the results file and displayed in the log window. Useful summary information such as loss of program memory, loss of storage memory, or total test execution time is provided in a log window tab. The summary information for a given test result is also collected and displayed in a Summary tab of the configuration window. The summary tab reports the number of PASS and FAIL test cases in real time. Breakout PASS and FAIL numbers for individual suites are also displayed. The configuration window's Summary tab facilitates quick navigation to an individual failure among perhaps thousands of test results. The exact source file and line number corresponding to a logged failure are automatically reported by the O/S Validator's logging APIs. Since O/S Validator provides the source code for all of its executables, being able to go directly to the source code reporting an error is a powerful adjunct to the textual descriptions of the failure.
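As an illustration of how a logging API can stamp each result with its source location, the following C++ sketch defines printf-style pass/fail macros in the spirit of the WRITETESTPASS() and WRITETESTFAIL() APIs described later in this document; the helper function, the result-line format, and the example test are assumptions for illustration only, not the actual O/S Validator implementation.

```cpp
#include <cstdarg>
#include <cstdio>

// Hypothetical helper: prepend a PASS/FAIL tag plus the calling source file
// and line, then append the formatted message to a results file.
static void WriteTestResult(const char* tag, const char* file, int line,
                            const char* fmt, ...)
{
    std::FILE* results = std::fopen("results.log", "a");
    if (!results) return;
    std::fprintf(results, "%s [%s:%d] ", tag, file, line);
    va_list args;
    va_start(args, fmt);
    std::vfprintf(results, fmt, args);
    va_end(args);
    std::fprintf(results, "\n");
    std::fclose(results);
}

// printf-style macros in the spirit of WRITETESTPASS()/WRITETESTFAIL():
// __FILE__ and __LINE__ are expanded at the call site inside the test.
#define WRITETESTPASS(...) WriteTestResult("PASS", __FILE__, __LINE__, __VA_ARGS__)
#define WRITETESTFAIL(...) WriteTestResult("FAIL", __FILE__, __LINE__, __VA_ARGS__)

// Example test case reporting through the macros.
int main()
{
    bool opened = true;  // placeholder for a real COM-port open attempt
    if (opened)
        WRITETESTPASS("serial port opened at %d baud", 9600);
    else
        WRITETESTFAIL("could not open serial port");
    return opened ? 0 : 1;
}
```

Because __FILE__ and __LINE__ are expanded at the call site, every failure line carries the exact source file and line number, which is what lets a reader of the results jump straight to the offending test code.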
Other features of the present invention are disclosed or are apparent in the section entitled, "DETAILED DESCRIPTION OF THE INVENTION."
BRIEF DESCRIPTION OF DRAWINGS
For a better understanding of the present invention, reference is made to the accompanying drawings wherein:
Figure 1.0 is a schematic diagram representative of a computerized product presently being provided with an embedded operating system in a control unit. Figure 2.0 is a manufacturing flow diagram illustrating quality assurance testing on a computerized product provided with an embedded operating system, in accordance with the present invention.
Figure 3.0 is a block diagram showing the primary components of the operating system validator of the present invention, including a graphical user interface, an engine, a plurality of test suites and a logging library.
Figure 4.0 is a block diagram showing the present invention executing a plurality of test suite configurations from a host device for testing a plurality of target devices provided with an embedded operating system, in accordance with the present invention.
Figure 5.0 is a block diagram showing the present invention essentially as depicted in Figure 4, except showing communication by the target device with the O/S Validator 1 at the host via Ethernet means.
Figure 5a illustrates an arrangement where communications between a plurality of host and target devices may occur over Ethernet.
Figure 6 shows yet another arrangement for a test suite execution situation. Figure 7.0 is a table listing of functional areas of specific APIs tested in automatic and manual test suite execution.
Figures 8A, 8B, and 8C are a table listing a comprehensive list of functional areas and their APIs which may be tested in automatic or manual mode.
Figure 9.0 is a table listing of selected APIs for use in building automation scripts. Figure 10.0 is a schematic diagram representative of the concept of the present invention providing source code for all executable programs. Figure 11.0 is a block diagram representation of a window showing the test suite selection options as well as other related summary functions.
Figure 12.0 is a block diagram representation of a logging window showing tabs for test results, test failures, and related test summary options. Figure 13 illustrates in graph form the relationship of test cycle time versus the number of test devices being concurrently tested.
Figure 14.0 is a block diagram representation of a configuration window showing tabs for executing a variety of configuration-related functions.
Figures 15A, 15B, and 15C show a table listing testing details of the operating system components, in accordance with the present invention.
DETAILED DESCRIPTION OF THE INVENTION
Figure 1.0 shows a computerized product 1000 (9), typical of computerized products such as computer workstations, set-top boxes, gaming systems, bar-code scanners, and factory automation systems presently being provided with an embedded operating system, depicted by the numeral 1001a. As illustrated, and as described herein, product 1000 may comprise a typical target device 9 provided with an operating system 1001a, such as an embedded Windows CE operating system (O/S). The computerized product 1000 may function as a standalone device, having an installation of the present invention, the O/S Validator 1, for testing and validating its own operating system 1001a. The standalone testing facilitates new test development and debugging of reported defects, since elaborate knowledge of the O/S Validator infrastructure is not required. However, in a more likely application, as shown in Figure 2.0, product 1000 may function in a manufacturing quality assurance testing environment M as a host computer 4 having an installation of the present invention, the O/S Validator 1, for testing target devices 9 provided with an operating system 1001a. Referring back to Figure 1.0, a computerized product 1000 may comprise, for example, several sub-components, including a control unit 1020 with a plurality of input/output ports 1021, a keyboard 1009, a printer 1010, a mouse 1011, and a monitor 1012. The sub-components 1009, 1010, 1011, 1012 themselves may be testable target devices. The typical control unit 1020 itself comprises several sub-components, including a central processing unit 1001, storage devices such as a hard disk drive 1004, other memory components including RAM 1002, a ROM 1003, a compact disc 1005, an audio component 1006, a network/server card 1007, and a modem 1008. Included in the control unit, of necessity, is an operating system 1001a, to make product 1000 functional as a useful device.
Figure 3.0 shows the primary components of the O/S Validator 1 including a graphical user interface (GUI) 2, an Engine 3, a plurality of Test Suites 11, and a Logging Library 12. The GUI 2 and the Engine 3 communicate internally, in both directions, through a component called HarnessLink.dll, designated by the numeral 7 within O/S Validator 1, which is discussed in more detail below. Figure 4.0 illustrates a host computer 4 provided with O/S Validator 1. As illustrated, a plurality of target devices 9 are provided with an O/S 1001a for being tested in accordance with the present invention. The O/S Validator 1 has capabilities of generating testing configurations, such as a plurality of test configurations 21a, 21b, and 21c, for testing a particular function under control of O/S 1001a within target devices 9. A device-side component, termed CEHarness 8, communicates with Engine 3 in O/S Validator 1. As depicted in Figures 5 and 5a, CEHarness 8 may also communicate with Engine 3 in O/S Validator 1 via Ethernet means 4a. Figure 5a illustrates that because communication may occur over Ethernet, a plurality of hosts 4 may run suites against a plurality of target devices 9. In yet another alternative, and as depicted in Figure 6 for a test suite execution situation, CEHarness 8 may also communicate with Engine 3 in O/S Validator 1 via suite execution connection 1021a, where host computer 4 may comprise an NT host computer, the logging library 12 is also provided in a target device 9, and the test results are provided to host computer 4 via socket connections 1021b.
In operation, the O/S Validator 1 tests and validates target devices 9 provided with an embedded operating system, for example a Windows CE Operating System. Broadly stated, the O/S Validator 1 functions by (1) validating the Windows CE port to a target device, (2) by providing stress and performance testing, (3) by logging and analyzing results, (4) by testing a plurality of functional areas, including a plurality of application program interfaces (API), see Table 1.0 and Table 2.0 in Figures 7, 8A, 8B, and 8C, respectively, (5) by executing a plurality of pass/fail tests in a plurality of test suites, (6) by facilitating customization of tests in the automated suites, (7) by providing a host-side graphical test harness, (8) by stressing and evaluating memory performance, (9) by providing means for building test automation, including a plurality of APIs, see Table 3.0 in Figure 9.0, and (10) by providing a results analysis tool, termed CEAnalyzer. As previously stated, and as depicted in Figure 10, O/S Validator 1 includes, for all tests, both test source codes SC and executable programs EP (also referred to as test executables). The sole implementation requirement for a test executable EP is that test case "passes" and "failures" be reported using two specific APIs: WRITETESTPASS() and WRITETESTFAIL(). These macros have signatures similar to the well-known printf() function, but their use generates a standard test case result format in the "results" file amenable to automated summarization and integrates the reporting with the host user interface. The O/S Validator 1 method further comprises mediating between the GUI 2 and the test cases encapsulated in the test executables EP and providing a means by which the GUI 2 distributes and coordinates the execution of tests. The test suites 11 comprise text files 5 composed of suite language commands (e.g. PUT, RUN, WAIT, and DELETE) which are the direct expressions of the tasks needed to distribute and execute the test. Other supported suite commands are GET for retrieving a file from a target device 9, RUNHOST for running an executable program EP on the host computer 4, which is useful for client/server style tests, WAITHOST for waiting for the termination of a process on the host computer 4, which is also useful for client/server style tests, PUTSYSTEM for putting a file in the device's system directory (/Windows), SLEEP for basic timing when all else fails, MSGBOX for displaying a message box on the host machine, SETREG for adding or changing a registry setting on the device, and DELREG for removing a registry setting from the device. Initial suite file comments, in addition to providing internal suite documentation, are presented as the suite descriptions in GUI 2. The O/S Validator 1 method includes organizing the test suite files 5 by their hierarchical placement within a directory structure on the host computer 4. As shown in Figure 11, test suites 11 are divided at the top level as being either automatic Au test suites, manual Ma test suites, or stress SS test suites. Automatic suites Au are tests which do not require any user intervention during their execution. In contrast, manual suites Ma do require user intervention (e.g. keyboard and touch panel suites).
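To make the suite-language idea concrete, the following C++ sketch tokenizes a small, hypothetical suite file and dispatches a few of the commands named above (PUT, RUN, WAIT, SLEEP, DELETE). The sample suite text, the comment syntax, and the handler functions are illustrative assumptions, not the actual O/S Validator engine or suite format.

```cpp
#include <iostream>
#include <sstream>
#include <string>

// Hypothetical suite text; real suites are plain text files on the host.
const char* kSampleSuite =
    "# Serial driver feature tests (illustrative only)\n"
    "PUT serialtest.exe\n"
    "RUN serialtest.exe -port COM1\n"
    "WAIT serialtest.exe\n"
    "SLEEP 2000\n"
    "DELETE serialtest.exe\n";

// Placeholder handlers; a real engine would copy files to the target and
// start or wait on processes over its harness connection.
void Put(const std::string& file)    { std::cout << "copy to target:   " << file << "\n"; }
void Run(const std::string& cmd)     { std::cout << "run on target:    " << cmd << "\n"; }
void Wait(const std::string& proc)   { std::cout << "wait for exit:    " << proc << "\n"; }
void Sleep(const std::string& ms)    { std::cout << "sleep (ms):       " << ms << "\n"; }
void Delete(const std::string& file) { std::cout << "delete on target: " << file << "\n"; }

int main()
{
    std::istringstream suite(kSampleSuite);
    std::string line;
    while (std::getline(suite, line)) {
        if (line.empty() || line[0] == '#') continue;   // leading comments document the suite
        std::istringstream words(line);
        std::string cmd, args;
        words >> cmd;
        std::getline(words, args);
        if (!args.empty() && args[0] == ' ') args.erase(0, 1);
        if      (cmd == "PUT")    Put(args);
        else if (cmd == "RUN")    Run(args);
        else if (cmd == "WAIT")   Wait(args);
        else if (cmd == "SLEEP")  Sleep(args);
        else if (cmd == "DELETE") Delete(args);
        else std::cout << "unrecognized command: " << cmd << "\n";
    }
    return 0;
}
```

The point of the sketch is only that each suite line expresses one distribution or execution task, which is why the engine can treat a suite file as a script to be replayed against any connected target.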
The O/S Validator 1 method includes stressing the system, through the stress suites SS, by pushing input/output (I/O) throughput to the limit, operating with little or no available program or object store memory, and possibly bringing down a custom file system with multi-threaded concurrent stressing. Below this top level of hierarchy, the O/S Validator 1 method includes arranging the test suites 11 by the functional areas, see generally Figure 7. As noted above, Figure 3.0 shows the primary components of the O/S Validator 1 including a graphical user interface (GUI) 2, an Engine 3, a plurality of Test Suites 11, and a Logging Library 12. O/S Validator 1 utilizes GUI 2 as a Visual Basic component since it works well for any level of user. The GUI 2 design is based on the concept of interfacing the user input with underlying components. The Engine 3 holds the core of the functionality on the host 4. The Engine 3 reads a plurality of suite files 5, parses them, and executes the commands. The Engine 3 obtains the information from the GUI 2 and uses the information to set up a variety of execution options. The Engine 3 is written in the C/C++ language. The Engine 3 is linked to the GUI 2 via a component termed HarnessLink.dll 7, which is an ActiveX control. HarnessLink.dll 7 is instantiated and called from the GUI 2 with a variety of information, which is passed to the Engine 3 before it begins to execute. Dll link 7 also functions to communicate between the Engine 3 and the GUI 2 during the execution, to relay information, relay error messages, and to relay some dynamic run-time commands. The target device 9 comprises a device-side (as opposed to a host-side) component called CEHarness 8. CEHarness is a C/C++ program residing on the target device 9 and, as shown in Figure 4, communicates nearly exclusively with the Engine 3, unless broadcasting target information on the network, with the GUI 2 receiving such information and passing it to the Engine 3, see Figures 5 and 5a. CEHarness 8 is an event-driven application, where the Engine 3 sends the events and CEHarness 8 responds. The two remaining components, test suites 11 and logging library 12, are intertwined since the test suites 11 are written using a plurality of application program interfaces (API) 13 that are part of the logging library 12, see Figures 8A, 8B and 8C. These APIs 13 have substantial functionality and are somewhat dependent on the information passed through the component chain. The logging library 12 has a simple functionality concept, communicating the test results 14 by creating log files 15 by either logging TCP/IP information 16 back to the GUI 2, see Figure 5.0, or by writing results 14 and log files 15 directly to the device 9, see Figure 6.0. As depicted in Figure 12, a logging window LW shows the test results 14 in test files 15, failures F, and a summary tab SumT which facilitates user-access to program memory, passes, failures, and timing information. The plurality of test suites 11 comprise the indispensable component of the O/S Validator 1. Figure 13 illustrates in graph form that the test cycle time CT decreases as the number of concurrently tested devices increases. In further detail, the GUI 2 is complex code due to the degree of functionality required for handling its layer of components. The GUI 2 provides a "wizard" whose primary function is walking new users through the various selectable settings and listing the default settings.
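The multi-threaded concurrent stressing mentioned above can be illustrated with a minimal C++ sketch in which several worker threads hammer the file system at once. This is only a generic example of the kind of load a stress suite might generate, not code from the O/S Validator test suites, and the file names and iteration counts are arbitrary.

```cpp
#include <cstdio>
#include <string>
#include <thread>
#include <vector>

// Each worker repeatedly creates, writes, reads back, and deletes its own
// file so that several threads exercise the file system concurrently.
void FileSystemWorker(int id, int iterations)
{
    const std::string name = "stress_" + std::to_string(id) + ".tmp";
    char block[4096] = {0};
    for (int i = 0; i < iterations; ++i) {
        std::FILE* f = std::fopen(name.c_str(), "wb");
        if (!f) return;                               // e.g. object store exhausted
        std::fwrite(block, 1, sizeof(block), f);
        std::fclose(f);

        f = std::fopen(name.c_str(), "rb");
        if (f) {
            (void)std::fread(block, 1, sizeof(block), f);
            std::fclose(f);
        }
        std::remove(name.c_str());
    }
}

int main()
{
    std::vector<std::thread> workers;
    for (int id = 0; id < 8; ++id)                    // eight concurrent stress threads
        workers.emplace_back(FileSystemWorker, id, 1000);
    for (auto& t : workers) t.join();
    return 0;
}
```

Concurrent create/write/read/delete cycles of this kind are what tend to expose driver and file-system defects that single-threaded feature tests miss.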
As shown in Figure 14, GUI 2 also provides a configuration window CW as the means for executing a test run, which comprises a single pass through a set of selected suites 11 on a target device 9. As shown in Figure 4, a plurality of configurations 21a, 21b, and 21c may be run to simulate a variety of scenarios. The contents of a configuration window CW comprise a plurality of tabs for user control. By example, suite tab S provides a tree view of the suite files directory under the O/S Validator directory. This tree view is organized to provide meaningful distinctions between the types of tests 11 the user selects. Additionally, this tree view is created as the user opens the configuration window CW, allowing the user to extend the O/S Validator 1 by adding new user-input suites to the suite file directory created by the O/S Validator 1 installation program. Test suites 11 are scripts of commands that the Engine 3 reads and then performs actions corresponding to the commands in the script. The suite files 5 generally start with a series of comments. These comments appear in the suite file information section of the file. If the word "Manual" appears at the top of a suite file 5, the file is deemed to require manual execution and is, therefore, assigned a different icon. In the test suite section, as illustrated in Figure 11, the user may reorder the test suite files 5 in any way. Still referring to Figure 14, the logging tab contains considerable valuable information. The user may select three methods of logging, namely LH for logging to the host 4, LTD for logging to the target device 9, or LHD for logging to both host 4 and target device 9. The log information is then stored in a configurable directory listed in an edit box. All this information is sent through the DLL 7 to the Engine 3 and then to CEHarness 8 in target device 9. Subsequently, the information is acquired by the logging library 12 upon running a test 11. Other tabs in the configuration window CW include a stress condition tab SC, a thread tab T for selecting high-priority threads during suite execution, tabs PM and SM for reducing program and storage memory, a tab SRT for selecting run time, and a STOP tab for stopping the run. The user can utilize infinite loop tab Hoop for finding memory leaks in the system. Useful summary information such as loss of program memory, loss of storage memory, or total test execution time is provided in a summary tab SumT. The summary information for a given test result is also collected and displayed in a Summary tab SumT. The summary tab reports the number of PASS and FAIL test cases in real time. Breakout PASS and FAIL numbers for individual suites are also displayed. The configuration window's Summary tab facilitates quick navigation to an individual failure among perhaps thousands of test results. The exact source file and line number corresponding to a logged failure are automatically reported by the O/S Validator's logging APIs. Since O/S Validator provides the source code for all of its executables, being able to go directly to the source code reporting an error is a powerful adjunct to the textual descriptions of the failure. The logging options vary dramatically in their implementation and effects. The first option presumes that whenever the user runs a test suite 11 resulting in a "pass", a set of log files 15, summarizing these test results 14, automatically issues.
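The infinite-loop tab described above is essentially a leak hunt: run the same suites repeatedly and watch whether free program memory keeps shrinking from pass to pass. The C++ sketch below illustrates that idea; QueryFreeProgramMemory() and RunSelectedSuitesOnce() are hypothetical stand-ins rather than O/S Validator functions, and a real device-side query would use the operating system's memory status facilities.

```cpp
#include <cstdio>

// Hypothetical stand-ins for illustration only: a real device-side query
// would use the operating system's memory status APIs, and a real run would
// invoke the selected suites through the harness.
long QueryFreeProgramMemory() { return 32L * 1024 * 1024; }
bool RunSelectedSuitesOnce()  { return true; }

int main()
{
    const int kPasses = 100;   // the real infinite-loop option runs until the user presses Stop
    const long baseline = QueryFreeProgramMemory();
    for (int pass = 1; pass <= kPasses; ++pass) {
        if (!RunSelectedSuitesOnce()) break;
        const long now = QueryFreeProgramMemory();
        std::printf("pass %d: free program memory %ld bytes (delta %ld)\n",
                    pass, now, now - baseline);
        if (now < baseline - 1024L * 1024L)   // steady loss across passes suggests a leak
            std::printf("warning: possible memory leak after %d passes\n", pass);
    }
    return 0;
}
```

A one-shot run rarely shows a small leak; repeating the same suites many times is what makes a slow, steady loss of program memory visible.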
Dependent upon the chosen logging method, the summary files 15 are created by either CEHarness 8 or the Engine 3. Basically, the Engine 3 traverses all the log files 15 in the logging directory, so the user may receive a log file list that does not correspond to the tests 11 being run. In order to make the summary files 15 more indicative of the tests 11 actually run, the user can delete the log directory before running, or the user can select the 'Return only the newest summary results' logging option, which causes the Engine 3 to traverse only one log file 15 for each test 11 (see the illustrative sketch below). This means that if the user ran a file system test thirty times on a given day, there would be only one entry in the summary log for that test, corresponding to the most recent execution. The summary log would have only one entry for each test 11 whose log file 15 still resides in the logging directory.

The other two options are handled in the GUI 2. If the user logs to the host 4 via a TCP/IP connection 4a, an entry goes into a log file 15 created on the user's host 4 while also appearing in a log window within the GUI 2. This allows the user to examine the log files 15 within the context of the O/S Validator 1, a great advantage since the user can immediately monitor the passes and, more importantly, the failures. However, in some circumstances, there may be an excessive number of log files 15 due to the size of the test 11 run; therefore, closing the log window LW, and not reopening it, conserves the memory of the host 4 and also provides a clear viewing area for the O/S Validator 1. It is also advantageous to close all log files 15 without failures. Failures indicate to product design personnel that the target device 9 will require further development. The user may keep all the log windows F with failures open by selecting an option in the logging window to keep the F windows open.

When operating the present invention as shown in Figure 5a, a window termed Available Targets shows the active devices on the network that are broadcasting information to the GUI 2. The active devices send a large volume of information, some of which is displayed in the Available Targets window. The user may view the information by selecting the View/Available Targets menu. Another window must be accessed to obtain the complete set of broadcast information. This broadcast information is valuable because it is used to initialize the connection from the Engine 3 to a particular CEHarness 8 in a test target device 9.
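As a minimal sketch of the 'Return only the newest summary results' idea, the following Win32-style C++ fragment keeps, for each test, only the most recently written log file before a summary is built. The "*.log" file pattern, the use of the file name stem as the test key, and the traversal mechanism are assumptions for illustration, not the mechanism the Engine 3 actually employs.

    #include <windows.h>
    #include <map>
    #include <string>

    /* For each test (keyed, as an assumption, by the log file name without its */
    /* extension), keep only the most recently written log file in the logging  */
    /* directory, so the summary contains one entry per test rather than one    */
    /* entry per historical run.                                                 */
    static std::map<std::string, std::string>
    NewestLogPerTest(const std::string& logDir)
    {
        std::map<std::string, std::string> newestName;
        std::map<std::string, FILETIME>    newestTime;

        WIN32_FIND_DATAA fd;
        HANDLE find = FindFirstFileA((logDir + "\\*.log").c_str(), &fd);
        if (find == INVALID_HANDLE_VALUE)
            return newestName;

        do {
            if (fd.dwFileAttributes & FILE_ATTRIBUTE_DIRECTORY)
                continue;
            std::string name(fd.cFileName);
            std::string test = name.substr(0, name.rfind('.'));   // strip extension

            std::map<std::string, FILETIME>::iterator it = newestTime.find(test);
            if (it == newestTime.end() ||
                CompareFileTime(&fd.ftLastWriteTime, &it->second) > 0) {
                newestTime[test] = fd.ftLastWriteTime;
                newestName[test] = logDir + "\\" + name;
            }
        } while (FindNextFileA(find, &fd));

        FindClose(find);
        return newestName;
    }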
Referring to Figure 11, the Stress Test Settings SS are now further described. The Stress Suites 11 are manifested in a variety of forms; however, their fundamental purpose is to stress the target device 9 by running a very long test, running multiple iterations of a short test, or selecting a wide range of parameters in one test. These remain specific to their own test 11 area; for example, the Database Stress Suites only stress the database functionality of an O/S, such as Windows CE. The Stress Test Options are distinguishable from the Stress Suites 11. The Stress Test Options differ because they target functionality to provide broader stress scenarios equivalent to real-world usage models. These scenarios run in conjunction with any user-selected set of test suite files 5. The Stress Test Options can and should be run both conjointly and separately as, in so doing, a significant boost to the range of any test plan is provided.

The first two Stress Test Options relate to the memory on the target device 9. The first Stress Test Option is the Low Virtual Memory option, which greatly reduces the amount of virtual memory of the target device before running the selected tests 11. This simulates the realistic harsh circumstances that might occur when a user has opened fifteen applications, potentially causing a malfunction. The second Stress Test Option is the Low Storage Memory option. When selected, this second Stress Test Option fills the storage memory of the target device 9 to its maximum capacity in order to observe the target device 9 response to low storage memory conditions. In some cases, this second Stress Test Option is also good for testing application programs contained within the target device 9, as they may depend on non-existent storage memory.

The next three Stress Test Options are execution options. The first executable stress option is the infinite loop, which is ideal for long test cycles. A common problem in many device drivers is malfunction under long, intense, stressful situations. This infinite loop stress test helps determine whether such a breakdown will occur. The infinite loop test runs the selected suites 11 until the user manually hits the Stop button. The next stress execution option is the configurable CPU cycle deprivation test, controlled by a text file called Data.txt located at O/S Validator\Tests\TestInputFiles. Two examples are provided in the file, which a user may copy, reuse, or modify. The text file, Data.txt, controls the number of threads, and their attributes, that the user may include in his test run (a sketch of this idea follows below). In other words, the user can run his tests while other processes are consuming the CPU time, helping to eliminate many problems, including timing problems. The last Stress Test Option is the Random Execution. When the user selects this option, the GUI 2 reorders the list of test suites 11 at run time so that they run in a different order. This option is ideal because it facilitates the diagnosis of interaction problems with various components.
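As an illustration of the CPU cycle deprivation concept, the following Win32-style C++ sketch spawns a configurable number of busy threads at a given priority before the selected tests run. In the actual tool the thread count and attributes would come from Data.txt; the function names, the spin loop, and the decision to leave the handles open are assumptions for illustration rather than the shipped implementation.

    #include <windows.h>
    #include <vector>

    /* Busy-loop worker that consumes CPU time for the duration of the test run. */
    static DWORD WINAPI CpuBurner(LPVOID)
    {
        volatile unsigned long counter = 0;
        for (;;)
            ++counter;          // never returns; threads end with the process
    }

    /* Spawn 'count' competing threads at 'priority'                          */
    /* (e.g. THREAD_PRIORITY_ABOVE_NORMAL), simulating other processes        */
    /* consuming CPU time while the tests execute.                            */
    static void StartCpuDeprivation(int count, int priority)
    {
        std::vector<HANDLE> threads;
        for (int i = 0; i < count; ++i) {
            HANDLE h = CreateThread(NULL, 0, CpuBurner, NULL, 0, NULL);
            if (h != NULL) {
                SetThreadPriority(h, priority);
                threads.push_back(h);
            }
        }
        // Handles are intentionally left open for the lifetime of the run.
    }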
The remaining Test Run options are generic activities that the user may control. The first option, "Use Selected Target Exclusively," is important because, when a target device 9 is connected over the Ethernet, other users in a subnet can access that target device 9 through the O/S Validator 1 available target devices window. This helps to create stress on the target device 9. In the event that the user wishes to isolate a problem, extra stress should not be applied; in that situation, the user should have exclusive access to the target device 9. The last Test Run Option is Set Target Time, which prompts the Engine 3 to send the time from the host computer 4 to the target device 9, thereby synchronizing the target device 9 system time with the host computer 4 time. Synchronization is advantageous as the log files return with a date and time stamp related to the date and time on the target device 9. In order to keep these accurate, the user should set the target device 9 date and time.

The last tab before running a test is the Environment Settings tab, which contains valuable information for the various selectable environment variables. These environment settings are designed to extend and abstract the test suite files 5 by allowing the test suite files to contain environment variables instead of hard-coded information (an illustrative substitution sketch follows below). For example, the Serial Tests take an available com-port as a parameter. If the com-port is not entered as an environment variable, the test fails because it is unable to open a com-port. All the environment variables used in the suites are provided; however, any additional environment variables may be user-added to the user-input suites.

After running a test, a Test Status is available for obtaining status information for the current test run. The information is dynamically updated. A Suites Run section window lists the selected suites that have started. The user may open any log file from this window by selecting a desired test icon. The other icons in this control provide failure information. The failure icons are shown, by example, as a stylized beaker crossed out with a red "X." A Test Run Summary Information section keeps track of the number of test suite files run, suites selected, suites with a failure, and the percentage of suites with a failure. On completion of a test run, the user may select a Configure Failed Suites tab, prompting the appearance of a new configuration window with all the failed suites in the current test run selected, thereby facilitating regression testing.
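To illustrate how environment settings might abstract the suite files 5, the following C++ sketch substitutes tokens such as %COMPORT% in a suite command from a table of user-entered environment variables. The %NAME% token syntax and the variable names are assumptions chosen for the example, not the O/S Validator's actual notation.

    #include <map>
    #include <string>

    /* Replace %NAME% tokens in a suite command with values entered on the  */
    /* Environment Settings tab, so suite files can avoid hard-coded values */
    /* such as a specific com-port.                                          */
    static std::string ExpandEnvironment(std::string command,
                                         const std::map<std::string, std::string>& env)
    {
        for (std::map<std::string, std::string>::const_iterator it = env.begin();
             it != env.end(); ++it) {
            const std::string token = "%" + it->first + "%";
            std::string::size_type pos;
            while ((pos = command.find(token)) != std::string::npos)
                command.replace(pos, token.size(), it->second);
        }
        return command;
    }

    /* Example: with {"COMPORT", "COM2"} in the table, the command          */
    /* "SerialTest %COMPORT% 19200" becomes "SerialTest COM2 19200".        */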
The remaining two sections are called Test Details. One of these Test Details sections monitors the individual test cases that pass, as well as those that fail; this section is valuable for gauging the value of a test run. The remaining Test Details section is the Failed Suites section, where all selected suites with a failure are listed by suite name, showing the number of corresponding passing and failing test cases. All this information gives the user a very good idea of the limits of his target device 9 during a test run (i.e., what passes, and more importantly, what fails).
The primary object of the present invention is to properly test a port of an O/S 1001a, such as Windows CE. To accomplish this task, hundreds of suite tests 11 are needed and are provided by the O/S Validator 1. By example, nearly 1500 test suites 11, grouped by the verified O/S subsystem, are provided. As indicated in Figure 10.0, the O/S Validator 1 includes both source code and executable code for all tests 11, covering the major O/S 1001a subsystems and the common adaptation drivers, with special emphasis on subsystems historically exhibiting the most problems. O/S subsystem components tested by the O/S Validator 1 include: Ethernet/NDIS, serial port driver, display driver, touch panel driver, mouse driver, keyboard driver, OEM Adaptation Layer, and PC Card adapter driver. Figures 15A, 15B, and 15C show a table listing testing details of these system components.
As discussed above, and shown in Figure 3, the Engine 3 is linked to the GUI 2 via a component termed HarnessLink.dll 7, which is an ActiveX control. Specifically, HarnessLink.dll 7 provides a set of functions on the GUI 2 for calling (i.e., affecting) the Engine 3. A majority of the HarnessLink.dll 7 function calls set up parameters for the Engine 3's command line. All the initial information is passed to the Engine 3 through this command line, ensuring readiness of the Engine 3 for the specified execution. A plurality of GUI-related functions provide information on the command line. The information on the command line corresponds to the GUI 2 information. The other major function that HarnessLink.dll 7 serves is to communicate activity during the running of the Engine 3 by opening a named pipe (a sketch of such pipe communication follows this paragraph). If an error occurs, or a need arises to transmit some memory information, the Engine 3 communicates through the named pipe. A named pipe ensures that the communication between the Engine 3 and a specific HarnessLink is direct, accurate, and without any duplication problems if a plurality of Engines 3 are running. When the HarnessLink.dll 7 receives a message from the pipe, it signals the appropriate VB event which, in turn, causes the GUI 2 to take the information and process it accordingly. As described in relation to Figure 5.0, the Engine 3 communicates with one HarnessLink.dll 7, which, in turn, communicates with one CEHarness 8.

The Engine 3 execution is simple: a command line is received and processed, establishing the execution socket connection to the target device, opening the pipe for communication with the GUI 2, reading the test suite files 5, and subsequently executing the tests in three phases: PreExecution, Execution, and PostExecution. The PreExecution stage establishes the error socket connection between the target device 9 and the host 4. Relevant data, such as logging paths and styles, various test run information, and stress scenarios, are sent during the PreExecution stage. The Execution stage involves a response to each sequential suite command. A suite command is generally sent by the host 4 and is processed by the CEHarness 8 which, in turn, responds with a socket message when execution of the command is completed. The PostExecution stage primarily involves reducing the log information and generating the summary logs. Upon completion of these summary logs, the Engine 3 exits.

CEHarness 8 in the test target device 9 is a significantly more complicated component than the Engine 3. This complexity arises because each device 9 has one instance of CEHarness 8 at any given time; however, that device can handle a plurality of simultaneous connections, a feature vitally important to the testing methodology provided by the O/S Validator 1. When the user starts CEHarness 8, it creates two threads that endure through the entire execution time: a broadcast thread and an execution thread. The broadcast thread updates information such as the device IP, connection type, and available com-ports every ten seconds, sending a broadcast message at the same rate to the network (a sketch of such a broadcast loop appears after the description of the execution thread below). If the device 9 is connected to the host computer 4 via Windows CE Services (i.e., on the NT side) and to either remnet or repllog (i.e., on the device side), the connection type will change to PPP_PEER. If this occurs, the broadcast message is sent only to the host 4 to which the target device 9 is directly connected. If the user changes the connection at some point during execution, the message is updated.
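Before turning to the execution thread, the named pipe communication mentioned above can be illustrated. The following Win32-style C++ fragment is a minimal sketch, assuming that HarnessLink.dll 7 has already created a named pipe under an agreed-upon name; the pipe name, the message format, and the open-write-close pattern shown here are hypothetical and are not the actual Engine 3 protocol.

    #include <windows.h>
    #include <string>

    /* Open the (assumed) named pipe created by the host-side DLL and write a    */
    /* status message, so the GUI can raise the corresponding Visual Basic event. */
    static bool ReportToGui(const std::string& message)
    {
        HANDLE pipe = CreateFileA("\\\\.\\pipe\\osvalidator_engine",  // hypothetical name
                                  GENERIC_WRITE, 0, NULL,
                                  OPEN_EXISTING, 0, NULL);
        if (pipe == INVALID_HANDLE_VALUE)
            return false;

        DWORD written = 0;
        BOOL ok = WriteFile(pipe, message.c_str(),
                            (DWORD)message.size(), &written, NULL);
        CloseHandle(pipe);
        return ok && written == message.size();
    }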
Meanwhile, the execution thread of CEHarness 8 waits for a connection attempt from an Engine 3. When the execution thread receives the connection, it spawns another thread, a main execution thread, that performs the various functions required. The main execution thread starts another socket for sending any error or memory information. Thus, the main execution thread is event-driven, receiving a command and responding appropriately. Each connection attempt spawns its own execution thread; therefore, a single CEHarness 8 may have many active connections, extending the test functionality by running a plurality of configurations simultaneously on one target device 9 and thereby creating a more realistic stress situation.
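A minimal Winsock C++ sketch of the broadcast idea described above appears below: every ten seconds a datagram carrying device information is sent to the local network so that a GUI 2 instance can populate its Available Targets window. The port number, the plain-text payload, and the setup details are assumptions for illustration only.

    #include <winsock2.h>
    #include <string>

    /* Broadcast-thread body: periodically announce this target on the network. */
    /* Port 9999 and the plain-text payload are illustrative assumptions.        */
    static DWORD WINAPI BroadcastThread(LPVOID)
    {
        WSADATA wsa;
        if (WSAStartup(MAKEWORD(2, 2), &wsa) != 0)
            return 1;

        SOCKET s = socket(AF_INET, SOCK_DGRAM, IPPROTO_UDP);
        BOOL enable = TRUE;
        setsockopt(s, SOL_SOCKET, SO_BROADCAST, (const char*)&enable, sizeof(enable));

        sockaddr_in dest = {};
        dest.sin_family = AF_INET;
        dest.sin_port = htons(9999);                       // hypothetical port
        dest.sin_addr.s_addr = htonl(INADDR_BROADCAST);

        const std::string info = "CEHarness;ip=192.168.0.5;conn=Ethernet"; // sample payload
        for (;;) {
            sendto(s, info.c_str(), (int)info.size(), 0,
                   (sockaddr*)&dest, sizeof(dest));
            Sleep(10000);                                  // broadcast every ten seconds
        }
    }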
The Logging library 12, shown in Figure 3.0, is a complex tool integrated into the O/S Validator 1 in various ways. Primarily, the Logging library 12 is integrated into the source files for the tests. The tests make calls to the library APIs, whereupon the library handles all the details regarding capture and recordation of test results. The Logging library 12 also supports a variety of communication options. The recommended option is logging through TCP, allowing the user to see the readout of the log files as they arrive over the TCP connection. Another communication option is direct logging to files on a device. This can be advantageous, for example, if the user wants to log to a PCMCIA card but does not want extra information to be broadcast over the network. While the Logging library 12 acts as the device-side component of the TCP communications, a host 4 component acts as the GUI 2-side component of the communications. This aspect of the logging library 12 provides a log window on the host 4 and color coding for the test results. By example, failure messages are displayed as a red line. The name and location of the source code file, as well as the line number where the message was generated, are included in the logging library 12 messages. Also, very importantly, a detailed error message is provided describing current events in the program. Each log window has a summary tab which facilitates user access to program memory, passes, failures, and timing information. Another important feature of the log files is that they capture a large volume of information at the beginning and at the end of a test. This information provides a snapshot of the memory, system, power, and other valuable information.

The present invention has been particularly shown and described with respect to certain preferred embodiments and features thereof. However, it should be readily apparent to those of ordinary skill in the art that various changes and modifications in form, material, and design detail may be made without departing from the spirit and scope of the invention as set forth in the appended claims.

Claims

CLAIMS What is claimed is:
1. A computer-based system for testing and validating an embedded operating system within a target device, comprising:
a. a host computer;
b. a target device provided with an operating system; and
c. an operating system testing and validating software program, said program being provided in said host computer, wherein said program comprises a graphical user interface program means for interfacing with a user, an engine means for communicating with said target device and responding to commands from said graphical user interface, a plurality of test suites comprising at least one test for testing and validating at least one component of said operating system; and
d. a logging library means for manipulating and storing test related information as generated by said operating system testing and validating software program.
2. The computer-based system, as recited in Claim 1, wherein: said operating system testing and validating software program further comprises a set of functions on the graphical user interface for calling the engine means.
3. The computer-based system, as recited in Claim 1, wherein: said target device further comprises an event-driven application for communicating with said engine means, wherein said engine means sends the events and said event driven application responds.
4. The computer-based system, as recited in Claim 1, wherein: said operating system in said target device comprises a Windows CE operating system.
5. The computer-based system, as recited in Claim 1, wherein: said test suites comprise at least one system stress-testing routine; and at least one feature-and-function test.
6. The computer-based system, as recited in Claim 5, wherein: said system stress-testing routine comprises a code base for stress testing said at least one component of said operating system, said at least one component of said operating system being selected from a group of operating system components comprising an Ethernet/NDIS, PCMCIA, a memory, a file system, a serial port, a video system having a plurality of application program interfaces, an infrared system, an original equipment manufacturer adaptation layer, a touch panel, a mouse, a keyboard, and an audio/wave system, said test identifying at least three defects, namely, hardware design, hardware programming, and operating system interaction, and being executed in automatic or manual mode.
7. The computer-based system, as recited in Claim 1, wherein: said target device comprises a stand-alone unit provided with said operating system testing and validating software program for conducting validation and stress testing independent of said host computer.
8. The computer-based system, as recited in Claim 1, further comprising: an Ethernet connection coupled to said host and to said target device for conducting testing and validation tasks.
9. The computer-based system, as recited in Claim 1, wherein: said logging library means comprises at least one pass test result file using a WRITETESTPASS application program interface.
10. The computer-based system, as recited in Claim 9, wherein: said at least one pass test file resides in said target device.
11. The computer-based system, as recited in Claim 1, wherein: said logging library means comprises at least one fail test result file using a WRITETESTFAIL application program interface.
12. The computer-based system, as recited in Claim 11, wherein said at least one fail test file resides in said target device.
13. A computer-based method for testing and validating an embedded operating system within a target device, comprising the steps of:
(a) providing a host computer;
(b) providing a target device having an operating system; and
(c) providing an operating system testing and validating software program, said program being provided in said host computer, wherein said program comprises a graphical user interface program means for interfacing with a user, an engine means for communicating with said target device and responding to commands from said graphical user interface, a plurality of test suites comprising at least one test for testing and validating at least one component of said operating system;
(d) providing a logging library means for manipulating and storing test related information as generated by said operating system testing and validating software program;
(e) executing said operating system testing and validating software program on said target device and testing and validating said operating system; and
(f) generating pass and fail test results.
14. A computer-based method for testing and validating an embedded operating system, as described in claim 13 further comprising the step of: providing said operating system testing and validating software program with a set of functions on the graphical user interface for calling the engine means.
15. A computer-based method for testing and validating an embedded operating system, as described in claim 13 further comprising the step of: providing said target device with an event-driven application for communicating with said engine means, wherein said engine means sends the events and said event driven application responds.
16. A computer-based method for testing and validating an embedded operating system, as described in claim 13 further comprising the step of: providing said target device with a Windows CE operating system.
17. A computer-based method for testing and validating an embedded operating system, as described in claim 13 further comprising the step of: providing test suites in the form of a system stress-testing routine comprising a code base for stress testing said at least one component of said operating system, said at least one component of said operating system being selected from a group of operating system components comprising an Ethernet/NDIS, PCMCIA, a memory, a file system, a serial port, a video system having a plurality of application program interfaces, an infrared system, an original equipment manufacturer adaptation layer, a touch panel, a mouse, a keyboard, and an audio/wave system, said test identifying at least three defects, namely, hardware design, hardware programming, and operating system interaction, and being executed in automatic or manual mode.
18. A computer-based method for testing and validating an embedded operating system, as described in claim 13 further comprising the step of: providing an Ethernet connection coupled to said host and to said target device for conducting testing and validation tasks on said target device.
19. A computer-based method for testing and validating an embedded operating system, as described in claim 13 further comprising the step of: providing said logging library means with at least one pass test result file using a WRITETESTPASS application program interface.
20. A computer-based method for testing and validating an embedded operating system, as described in claim 13 further comprising the step of: providing said logging library means with at least one fail test result file using a WRITETESTFAIL application program interface.
EP00909951A 1999-01-21 2000-01-21 A system and method for testing and validating devices having an embedded operating system Withdrawn EP1236108A1 (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US11682499P 1999-01-21 1999-01-21
US116824P 1999-01-21
US13762999P 1999-06-04 1999-06-04
US137629P 1999-06-04
PCT/US2000/001583 WO2000043880A1 (en) 1999-01-21 2000-01-21 A system and method for testing and validating devices having an embedded operating system

Publications (1)

Publication Number Publication Date
EP1236108A1 true EP1236108A1 (en) 2002-09-04

Family

ID=26814666

Family Applications (1)

Application Number Title Priority Date Filing Date
EP00909951A Withdrawn EP1236108A1 (en) 1999-01-21 2000-01-21 A system and method for testing and validating devices having an embedded operating system

Country Status (7)

Country Link
EP (1) EP1236108A1 (en)
JP (1) JP2002535773A (en)
KR (1) KR20010112250A (en)
CN (1) CN1359492A (en)
AU (1) AU3212300A (en)
BR (1) BR0009008A (en)
WO (1) WO2000043880A1 (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100367238C (en) * 2001-08-22 2008-02-06 深圳市索普卡软件开发有限公司 X-86 serial compatible machine and generation method for its operation system
CN100456043C (en) * 2003-02-14 2009-01-28 爱德万测试株式会社 Method and apparatus for testing integrated circuits
CN1316358C (en) * 2004-03-05 2007-05-16 英业达股份有限公司 Information platform test environment automatic construction method and system
CN100403701C (en) * 2004-08-09 2008-07-16 华为技术有限公司 Goal device service realization testing method and system
CN100375058C (en) * 2004-12-24 2008-03-12 北京中星微电子有限公司 Software development method for flush type products
CN100440162C (en) * 2005-04-08 2008-12-03 环达电脑(上海)有限公司 Embedded apparatus debugging method
CN100356738C (en) * 2005-07-29 2007-12-19 杭州华三通信技术有限公司 Automatization testing frame system and method
FI118578B (en) * 2006-01-23 2007-12-31 Mika Pollari Testing apparatus and method for testing the apparatus
CN101452415B (en) * 2007-11-30 2011-05-04 鸿富锦精密工业(深圳)有限公司 Auxiliary device and method for testing embedded system
JP2012503819A (en) 2008-09-25 2012-02-09 エルエスアイ コーポレーション Method and / or apparatus for authenticating an out-of-band management application in an external storage array
US9003370B2 (en) 2010-03-04 2015-04-07 Nec Corporation Application modification portion searching device and application modification portion searching method
US10318477B2 (en) * 2010-05-26 2019-06-11 Red Hat, Inc. Managing and archiving system and application log files
CN102339248A (en) * 2010-07-20 2012-02-01 上海闻泰电子科技有限公司 On-line debugging system and method for embedded terminal
CN102916848B (en) * 2012-07-13 2014-12-10 北京航空航天大学 Automatic test method of Ethernet interface equipment based on script technology
CN105445644A (en) * 2015-11-18 2016-03-30 南昌欧菲生物识别技术有限公司 Multi-type chip test plate, test system and test machine bench
CN106201765B (en) * 2016-07-21 2019-03-15 中国人民解放军国防科学技术大学 Task stack area data check restoration methods based on μ C/OS-II operating system
WO2018073395A1 (en) * 2016-10-20 2018-04-26 Y Soft Corporation, A.S. Universal automated testing of embedded systems
CN109960590B (en) * 2019-03-26 2021-05-18 北京简约纳电子有限公司 Method for optimizing diagnostic printing of embedded system
CN110221974A (en) * 2019-05-22 2019-09-10 深圳壹账通智能科技有限公司 Service platform system self checking method, device, computer equipment and storage medium
CN112306888B (en) * 2020-11-13 2022-05-10 武汉天喻信息产业股份有限公司 Test system and method based on equipment library file interface
CN113590475A (en) * 2021-07-13 2021-11-02 北京快乐茄信息技术有限公司 Joint debugging test method and joint debugging test device for online development platform
CN116743990B (en) * 2023-08-16 2023-10-27 北京智芯微电子科技有限公司 Video stream testing method and video stream testing processing method of embedded equipment

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE69415600T2 (en) * 1993-07-28 1999-07-15 Koninkl Philips Electronics Nv Microcontroller with hardware troubleshooting support based on the boundary scan method
US5724505A (en) * 1996-05-15 1998-03-03 Lucent Technologies Inc. Apparatus and method for real-time program monitoring via a serial interface

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO0043880A1 *

Also Published As

Publication number Publication date
BR0009008A (en) 2002-02-13
JP2002535773A (en) 2002-10-22
WO2000043880A1 (en) 2000-07-27
AU3212300A (en) 2000-08-07
CN1359492A (en) 2002-07-17
KR20010112250A (en) 2001-12-20

Similar Documents

Publication Publication Date Title
US6182246B1 (en) Protocol acknowledgment between homogeneous system
EP1236108A1 (en) A system and method for testing and validating devices having an embedded operating system
AU2004233548B2 (en) Method for Computer-Assisted Testing of Software Application Components
US8589886B2 (en) System and method for automatic hardware and software sequencing of computer-aided design (CAD) functionality testing
US6408403B1 (en) Method for integrating automated software testing with software development
US8356282B1 (en) Integrated development environment for the development of electronic signal testing strategies
US9098635B2 (en) Method and system for testing and analyzing user interfaces
US9477581B2 (en) Integrated system and method for validating the functionality and performance of software applications
AU748588B2 (en) Method for defining durable data for regression testing
US8074204B2 (en) Test automation for business applications
US7134049B2 (en) System and method for sequencing and performing very high speed software downloads concurrent with system testing in an automated production environment
US7222265B1 (en) Automated software testing
US20030070120A1 (en) Method and system for managing software testing
US20040153774A1 (en) Generating standalone MIDlets from a testing harness
US20080270841A1 (en) Test case manager
CN110362490B (en) Automatic testing method and system for integrating iOS and Android mobile applications
US7143361B2 (en) Operator interface controls for creating a run-time operator interface application for a test executive sequence
EP1224551A1 (en) Protocol acknowledgment between homogeneous systems
US20050049814A1 (en) Binding a GUI element to a control in a test executive application
CN114265769A (en) Test system and method based on python script test case
Bland et al. Design and implementation of a menu based oscar command line interface
JPH09223040A (en) System test supporting device for software and test scenario generator to be used for the same
CN111639029B (en) Reliability test method and system for soft AC product
JPH09223042A (en) System test device for software
CN117472742A (en) Test tool deployment method, device, equipment and storage medium

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20010808

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20030801