WO2000043880A1 - A system and method for testing and validating devices having an embedded operating system - Google Patents
- Publication number: WO2000043880A1 (PCT/US2000/001583)
- Authority: WIPO (PCT)
- Prior art keywords
- operating system
- testing
- test
- computer
- target device
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06F—ELECTRIC DIGITAL DATA PROCESSING
      - G06F11/00—Error detection; Error correction; Monitoring
        - G06F11/36—Preventing errors by testing or debugging software
          - G06F11/3668—Software testing
            - G06F11/3672—Test management
              - G06F11/3688—Test management for test execution, e.g. scheduling of test suites
          - G06F11/362—Software debugging
            - G06F11/366—Software debugging using diagnostics
        - G06F11/22—Detection or location of defective computer hardware by testing during standby operation or during idle time, e.g. start-up testing
          - G06F11/26—Functional testing
            - G06F11/263—Generation of test inputs, e.g. test vectors, patterns or sequences; with adaptation of the tested hardware for testability with external testers
Definitions
- TECHNICAL FIELD This invention relates to product quality assurance and to test systems and methods for validating operating systems provided in computerized products. More particularly, the present invention relates to product quality assurance and to test systems and methods for validating an operating system during the development of computerized products. Even more particularly, the present invention relates to product quality assurance and to test systems and methods for validating operating systems, such as Windows CE, manufactured and sold by Microsoft, Incorporated of Redmond, WA, typically provided in computerized products. BACKGROUND OF THE INVENTION Increasingly, developers are embedding operating systems, such as Windows CE, into many different types of computerized products, including set-top boxes, gaming systems, bar-code scanners, and factory automation systems.
- the present invention, to be commercially available under the applicant's assignee's trademark CEValidator™, is an operating system validator (herein also referred to as O/S Validator, and designated in the Figures with the numeral 1), which solves the foregoing problems by providing a test system encompassing an automated test suite method for testing a port of an operating system, such as Windows CE, to a target device's hardware and/or software being newly developed.
- the O/S Validator comprises a comprehensive code base, specifically developed to purposefully stress the O/S, device driver, OEM Adaptation Layer (OAL), and hardware interaction.
- the provided test suites focus on identifying three primary defects: hardware design, hardware programming (drivers/OAL), and operating system interaction.
- the test suites comprise nearly 1500 tests which include system stress-testing routines, as well as feature-and-function tests, providing a complete analysis of a Windows CE port. These tests are grouped by the O/S Validator by the verified O/S subsystem.
- the O/S Validator includes both test source codes and executable programs for all tests.
- an intuitive user interface for the O/S Validator host component, such as a standard Windows application leveraging the Microsoft Windows user interface, is utilized.
- the O/S Validator distributes test suites as a client/server application.
- a graphical user interface interacts with a small application, CEHarness.exe, which is running on a target device. Because this communication may occur over Ethernet, at least one host may run suites against at least one target device.
- the O/S Validator generates useful error information when a target device fails a test.
- results are displayed in a plurality of dynamically created log windows as well as in a configuration's summary tab.
- the logging windows contain the full text of a given test's results. Failures are color-coded red to ease identification. Navigation buttons in the logging window allow the user to move quickly from one failure to another.
- the logging APIs in the tests also cause a prolog and an epilog to be generated in each result file. Information such as concurrently running processes, battery power level, and execution date and time is automatically recorded in the results file and displayed in the log window. Useful summary information such as loss of program memory, loss of storage memory, or total test execution time is provided in a log window tab.
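The prolog/epilog behavior described above can be sketched as follows. This is an illustrative sketch only: the description lists the kinds of information recorded (running processes, battery level, date and time, memory loss, total execution time), but the field names and file layout shown here are assumptions.

```python
from datetime import datetime

# Illustrative sketch only: field names and layout are assumptions.
def write_prolog(log, processes, battery_pct, start):
    # Automatically recorded before any test case result is written.
    log.append(f"start: {start.isoformat()}")
    log.append(f"battery: {battery_pct}%")
    log.append("running processes: " + ", ".join(processes))

def write_epilog(log, start, end, mem_before_kb, mem_after_kb):
    # Summary information appended after the last test case result.
    log.append(f"end: {end.isoformat()}")
    log.append(f"program memory lost: {mem_before_kb - mem_after_kb} KB")
    log.append(f"total execution time: {int((end - start).total_seconds())} s")

log = []
start = datetime(2000, 1, 21, 9, 0, 0)
end = datetime(2000, 1, 21, 9, 2, 30)
write_prolog(log, ["shell32.exe", "ceharness.exe"], 87, start)
write_epilog(log, start, end, 4096, 4032)
```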
- the summary information for a given test result is also collected and displayed in a Summary tab of the configuration window.
- the summary tab reports the number of PASS and FAIL test cases in real time. Breakout PASS and FAIL numbers for individual suites are also displayed.
- the configuration window's Summary tab facilitates quick navigation to an individual failure among perhaps thousands of test results.
- the exact source file and line number corresponding to a logged failure are automatically reported by the O/S Validator's logging APIs. Since O/S Validator provides the source code for all of its executables, being able to go directly to the source code reporting an error is a powerful adjunct to the textual descriptions of the failure.
- Figure 1.0 is a schematic diagram representative of a computerized product presently being provided with an embedded operating system in a control unit.
- Figure 2.0 is a manufacturing flow diagram illustrating quality assurance testing on a computerized product provided with an embedded operating system, in accordance with the present invention.
- Figure 3.0 is a block diagram showing the primary components of the operating system validator of the present invention, including a graphical user interface, an engine, a plurality of test suites and a logging library.
- Figure 4.0 is a block diagram showing the present invention executing a plurality of test suite configurations from a host device for testing a plurality of target devices provided with an embedded operating system, in accordance with the present invention.
- Figure 5.0 is a block diagram showing the present invention essentially as depicted in Figure 4, except that it shows the target device communicating with the O/S Validator 1 at the host via Ethernet means.
- Figure 5a illustrates an arrangement where communications between a plurality of host and target devices may occur over Ethernet.
- Figure 6 shows yet another arrangement for a test suite execution situation.
- Figure 7.0 is a table listing of functional areas of specific APIs tested in automatic and manual test suite execution.
- Figures 8A, 8B, and 8C are a table presenting a comprehensive list of functional areas and their APIs which may be tested in automatic or manual mode.
- Figure 9.0 is a table listing of selected APIs for use in building automation scripts.
- Figure 10.0 is a schematic diagram representative of the concept of the present invention providing source code for all executable programs.
- Figure 11.0 is a block diagram representation of a window showing the test suites selection options as well as other related summary functions.
- Figure 12.0 is a block diagram representation of a logging window showing tabs for test results, test failures and related test summary options.
- Figure 13 illustrates in graph form the relationship of test cycle time versus the number of test devices being concurrently tested.
- Figure 14.0 is a block diagram representation of a configuration window showing tabs for executing a variety of configuration related functions.
- Figures 15A, 15B, and 15C show a table listing testing details of the operating system components, in accordance with the present invention.
- Figure 1.0 shows a computerized product 1000, (9), typical of computerized products such as computer workstations, set-top boxes, gaming systems, bar-code scanners, and factory automation systems presently being provided with an embedded operating system, depicted by the numeral 1001a.
- product 1000 may comprise a typical target device 9 provided with an operating system 1001a, such as an embedded Windows CE operating system (O/S).
- the computerized product 1000 may function as a standalone device, having an installation of the present invention, the O/S Validator 1, for testing and validating its own operating system 1001a.
- the standalone testing facilitates new test development and debugging of reported defects, as elaborate knowledge of the O/S Validator infrastructure is not necessitated.
- product 1000 may function in a manufacturing quality assurance testing environment M as a host computer 4 having an installation of the present invention, the O/S Validator 1, for testing target devices 9 provided with an operating system 1001a.
- a computerized product 1000 may comprise, by example, several sub-components, including a control unit 1020, including a plurality of input/output ports 1021, a keyboard 1009, a printer 1010, a mouse 1011, and a monitor 1012.
- the sub-components 1009, 1010, 1011, 1012 themselves, may be testable target devices.
- the typical control unit 1020 itself, comprises several sub-components, including a central processing unit 1001, storage devices such as a hard disk drive 1004, other memory components including RAM, 1002, a ROM 1003, a compact disc 1005, an audio component 1006, a network/server card 1007, and a modem 1008. Included in the control unit, of necessity, is an operating system 1001a, to make product 1000 functional as a useful device.
- Figure 3.0 shows the primary components of the O/S Validator 1 including a graphical user interface (GUI) 2, an Engine 3, a plurality of Test Suites 11, and a Logging Library 12.
- the GUI 2 and the Engine 3 communicate internally, in both directions, through a component called HarnessLink.dll, designated by the numeral 7 within the O/S Validator 1, which is discussed in more detail below.
- Figure 4.0 illustrates a host computer 4 provided with O/S Validator 1. As illustrated, a plurality of target devices 9, are provided with an O/S 1001a for being tested in accordance with the present invention.
- the O/S Validator 1 has capabilities of generating testing configurations, such as a plurality of test configurations 21a, 21b, and 21c, for testing a particular function under control of OS 1001a within target devices 9.
- a device side component termed CEHarness 8 communicates with Engine 3 in O/S Validator 1.
- CEHarness 8 may also communicate with Engine 3 in O/S Validator 1 via Ethernet means 4a.
- Figure 5a illustrates that, because communication may occur over Ethernet, a plurality of hosts 4 may run suites against a plurality of target devices 9.
- CEHarness 8 may also communicate with Engine 3 in O/S Validator 1 via suite execution connection 1021a, where host computer 4 may comprise an NT host computer, the logging library 12 is also provided in a target device 9, and the test results are provided to Host computer 4 via socket connections 1021b.
- the O/S Validator 1 tests and validates target devices 9 provided with an embedded operating system, by example a Windows CE Operating System.
- the O/S Validator 1 functions by (1) validating the Windows CE port to a target device, (2) by providing stress and performance testing, (3) by logging and analyzing results, (4) by testing a plurality of functional areas, including a plurality of applications program interfaces (API), see Table 1.0 and Table 2.0 in Figures 7, 8A, 8B, and 8C, respectively, (5) by executing a plurality of pass/fail tests in a plurality of test suites, (6) by facilitating customization of tests in the automated suites, (7) by providing a host side graphical test harness, (8) by stressing and evaluating memory performance, (9) by providing means for building test automation, including a plurality of APIs, see Table 3.0 in Figure 9.0, and (10) by providing a results analysis tool, termed CEAnalyzer.
- O/S Validator 1 includes, for all tests, both test source codes SC and executable programs EP (also referred to as test executable).
- the sole implementation requirement for a test executable EP is that test case "passes" and "failures" be reported using two specific APIs: WRITETESTPASS() and WRITETESTFAIL(). These macros have signatures similar to the well-known printf() function, but their use generates a standard test case result format in the "results" file amenable to automated summarization and integrates the reporting with the host user interface.
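The real WRITETESTPASS()/WRITETESTFAIL() macros are C-side, printf()-like APIs; the following Python sketch only illustrates the idea of a fixed, machine-summarizable result-line format. The exact line format, file names, and messages shown are assumptions.

```python
# Hypothetical sketch of a standard result-line format; the real macros are C
# macros on the target device, and this exact format is an assumption.
def write_result(results, verdict, source_file, line, fmt, *args):
    # One line per test case: verdict, source location, formatted message.
    results.append(f"{verdict} {source_file}({line}): {fmt % args}")

def summarize(results):
    # The fixed format is what makes automated summarization possible.
    passes = sum(1 for r in results if r.startswith("PASS"))
    fails = sum(1 for r in results if r.startswith("FAIL"))
    return passes, fails

results = []
write_result(results, "PASS", "serial.c", 120, "opened com-port %d", 1)
write_result(results, "FAIL", "serial.c", 154, "write timed out after %d ms", 500)
```

Recording the source file and line with each verdict is also what lets the host UI jump straight from a logged failure to the offending source line.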
- the O/S Validator 1 method further comprises mediating between the GUI 2 and the test cases encapsulated in the test executables EP and providing a means by which the GUI 2 distributes and coordinates the execution of tests.
- the test suites 11 comprise text files 5 composed of suite language commands (e.g. PUT, RUN, WAIT, and DELETE) which are the direct expressions of the tasks needed to distribute and execute the test.
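A minimal sketch of how such a suite file might be parsed into executable steps. The command set (PUT, RUN, WAIT, DELETE) comes from the text above; the apostrophe comment marker, one-argument syntax, and file contents are assumptions.

```python
# Sketch of suite-file parsing; comment marker and syntax are assumptions.
SUITE_COMMANDS = {"PUT", "RUN", "WAIT", "DELETE"}

def parse_suite(text):
    steps = []
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("'"):  # skip blanks and header comments
            continue
        verb, _, arg = line.partition(" ")
        if verb not in SUITE_COMMANDS:
            raise ValueError(f"unknown suite command: {verb}")
        steps.append((verb, arg.strip()))
    return steps

suite_text = """' Serial port automatic suite
PUT serialtest.exe
RUN serialtest.exe
WAIT 30
DELETE serialtest.exe
"""
steps = parse_suite(suite_text)
```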
- the O/S Validator 1 method includes organizing the test suite files 5 by their hierarchical placement within a directory structure on the host computer 4. As shown in Figure 11, test suites 11 are divided at the top level as being either automatic Au test suites, manual Ma test suites, or stress SS test suites. Automatic suites Au are tests which do not require any user intervention during their execution. In contrast, manual suites Ma do require user intervention (e.g. keyboard and touch panel suites).
- the O/S Validator 1 method includes stressing the system, through the stress suites SS, by pushing input/output (I/O) throughput to the limit, operating with little or no available program or object store memory, and possibly bringing down a custom file system with multi-threaded concurrent stressing.
- the O/S Validator 1 method includes arranging the test suites 11 by the functional areas, see generally Figure 7.
- the GUI 2 design is based on the concept of interfacing the user input with underlying components.
- the Engine 3 holds the core of the functionality on the host 4.
- the Engine 3 reads a plurality of suite files 5, parses them, and executes the commands.
- the Engine 3 obtains the information from the GUI 2 and uses the information to set-up a variety of execution options.
- HarnessLink.dll 7 is an ActiveX control. HarnessLink.dll 7 is instantiated and called from the GUI 2 by a variety of information, which is passed to the Engine 3 before it begins to execute. Dll link 7 also functions to communicate between the Engine 3 and the GUI 2 during the execution, to relay information, relay error messages, and to relay some dynamic run-time commands.
- the target device 9 comprises a device-side (as opposed to a host-side) component called CEHarness 8.
- CEHarness is a C/C++ program residing on the target device 9 and, as shown in Figure 4, communicates nearly exclusively with the Engine 3, except when broadcasting target information on the network, in which case the GUI 2 receives such information and passes it to the Engine 3, see Figures 5 and 5a.
- CEHarness 8 is an event-driven application, where the Engine 3 sends the events and CEHarness 8 responds.
- the two remaining components, test suites 11 and logging library 12, are intertwined since the test suites 11 are written using a plurality of application program interfaces (API) 13 that are part of the logging library 12, see Figures 8A, 8B and 8C.
- API 13 application program interfaces
- the logging library 12 has a simple functionality concept, communicating the test results 14 by creating log files 15 and either logging TCP/IP information 16 back to the GUI 2, see Figure 5.0, or writing results 14 and log files 15 directly to the device 9, see Figure 6.0.
- a logging window LW shows the test results 14 in test files 15, failures F and a summary tab SumT which facilitates user-access to program memory, passes, failures, and timing information.
- the plurality of test suites 11 comprises the indispensable component of the O/S Validator 1.
- Figure 13 illustrates in graph form that the test cycle time CT decreases as the number of concurrently tested devices increases.
- the GUI 2 comprises complex code due to the degree of functionality required for handling its layer of components.
- GUI 2 provides a "wizard" whose primary function is walking new users through the various selectable settings and listing the default settings.
- GUI 2 also provides a configuration window CW, as the means for executing a test run which comprises a single pass through a set of selected suites 11 on a target device 9.
- a plurality of configurations 21a, 21b, and 21c may be run to simulate a variety of scenarios.
- the contents of a configuration window CW comprise a plurality of tabs for user control.
- suite tab S provides a tree view of suite files directory under the O/S Validator directory. This tree view is organized to provide meaningful distinctions between the types of tests 11 the user selects.
- Test suites 11 are scripts of commands that the Engine 3 reads and then performs actions corresponding to the commands in the script.
- the suite files 5 generally start with a series of comments. These comments appear in the suite file information section of the file. If the word "Manual" appears at the top of a suite file 5, the file is deemed to require manual execution and is, therefore, assigned a different icon.
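The "Manual" check described above might look like the following sketch; the apostrophe comment marker and the rule of scanning only the leading comment block are assumptions.

```python
def is_manual_suite(text):
    # Scan only the leading comment block; the word "Manual" there marks the
    # suite for manual execution (the comment marker is an assumption).
    for raw in text.splitlines():
        line = raw.strip()
        if not line.startswith("'"):
            break
        if "Manual" in line:
            return True
    return False

manual_suite = "' Manual: touch panel suite\nRUN touchtest.exe\n"
auto_suite = "' Serial port suite\nRUN serialtest.exe\n"
```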
- in the test suite section, as illustrated in Figure 11, the user may reorder the test suite files 5 in any way. Still referring to Figure 14, the logging tab contains considerable valuable information.
- the user may select three methods of logging, namely LH for logging to the host 4, LTD for logging to the target device 9, or LHD for logging to both host 4 and target device 9.
- the log information is then stored in a configurable directory listed in an edit box. All this information is sent through the DLL 7 to the Engine 3 and then to CEHarness 8 in target device 9. Subsequently, the information is acquired by the logging library 12 upon running a test 11.
- Other tabs in the configuration window CW include a set stress condition tab SC, a thread tab T for selecting high priority threads during suite execution, tabs PM and SM for reducing program and storage memory, a tab SRT for selecting run time, and a tab STOP for stopping the run.
- the user can utilize infinite loop tab Hoop for finding memory leaks in the system.
- Useful summary information such as loss of program memory, loss of storage memory, or total test execution time is provided in a summary tab SumT.
- the summary information for a given test result is also collected and displayed in a Summary tab SumT.
- the summary tab reports the number of PASS and FAIL test cases in real time. Breakout PASS and FAIL numbers for individual suites are also displayed.
- the configuration window's Summary tab facilitates quick navigation to an individual failure among perhaps thousands of test results.
- the exact source file and line number corresponding to a logged failure are automatically reported by the O/S Validator's logging APIs. Since O/S Validator provides the source code for all of its executables, being able to go directly to the source code reporting an error is a powerful adjunct to the textual descriptions of the failure.
- the logging options vary dramatically in their implementation and effects.
- the first option presumes that whenever the user runs a test suite 11 resulting in a "pass", a set of log files 15, summarizing these test results 14, automatically issues.
- the summary files 15 are created by either CEHarness 8 or the Engine 3. Basically, the Engine 3 traverses all the log files 15 in the logging directory, so the user may receive a log files list not corresponding to the test 11 being run.
- the user can delete the log directory before running, or the user can select a logging option to select 'Return only the newest summary results' which causes the Engine 3 to traverse only one log file 15 for each test 11.
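The "Return only the newest summary results" behavior can be sketched as follows; the tuple shape of the log entries and the use of numeric timestamps are assumptions for illustration.

```python
def newest_logs(log_entries):
    # log_entries: (test_name, timestamp, path) tuples found in the logging
    # directory; keep only the newest entry per test so stale results from
    # earlier runs are not summarized. The tuple shape is an assumption.
    newest = {}
    for name, stamp, path in log_entries:
        if name not in newest or stamp > newest[name][0]:
            newest[name] = (stamp, path)
    return {name: path for name, (stamp, path) in newest.items()}

entries = [
    ("serial", 100, "serial_run1.log"),
    ("serial", 200, "serial_run2.log"),
    ("display", 150, "display_run1.log"),
]
```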
- a window termed Available Targets, shows the active devices on the network that are broadcasting information to the GUI 2.
- the active devices send a large volume of information, some of which is displayed in the Available Targets window.
- the user may view the information by selecting a View/Available Targets menu. Another window must be accessed to obtain the complete set of broadcast information. This broadcast information is valuable, because it is used to initialize the connection from the Engine 3 to a particular CEHarness 8 in a test target device 9.
- the Stress Suites 11 are manifested in a variety of forms; however, their fundamental purpose is for stressing the target device 9 by running a very long test, running multiple iterations of a short test, or electing a wide range of parameters in one test. These remain specific to their own test 11 area, for example, a Database Stress Suites only stress the Database functionality of an O/S, such as Windows CE.
- the Stress Test Options are distinguishable from Stress Suites 11.
- the Stress Test Options are different, because they target functionality to provide more broadband stress scenarios equivalent to real world usage models. These scenarios run in conjunction with any user-selected set of test suite files 5.
- the Stress Test Options can and should be run both conjointly and separately as, in so doing, a significant boost to the range of any test plan is provided.
- the first two Stress Test Options are related to the memory on the target device 9.
- the first Stress Test Option is the Low Virtual Memory Option which greatly reduces the amount of virtual memory of the target device before running the selected tests 11. This simulates the realistic harsh circumstances that might occur when a user has opened fifteen applications, effecting a malfunction.
- the second Stress Test Option is the Low Storage Memory option. When selected, this second Stress Test Option fills the storage memory of the target device 9 to its maximum capacity in order to exercise the target device 9 with no free storage memory.
- This second Stress Test Option is also good for testing application programs as contained within the target device 9 as they may depend on non-existent storage memory.
- the next three Stress Test Options are execution options.
- the first executable stress option is the infinite loop, which is ideal for long test cycles. A common problem in many device drivers is malfunction under long, intense, stressful situations.
- This infinite loop stress test provides a test for determining a possible breakdown. This infinite loop test runs the selected suites 11 until the user manually hits the Stop button.
- the next stress execution option is the configurable CPU cycle deprivation test, driven by a text file called Data.txt in O/S Validator\Tests\TestInputFiles. Two examples are provided in the file which a user may copy, reuse, or modify.
- the text file Data.txt controls the number of threads, and their attributes, that the user may include in his test run. In other words, the user can run his tests while other processes are consuming the CPU time, exposing many problems, including timing problems.
- the last Stress Test Option is the Random Execution. When the user selects this option, the GUI 2 will reorder the list of test suites 11 at run time so that they run in a different order. This option is ideal, because it facilitates the diagnosis of interaction problems with various components.
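The Random Execution option can be sketched as a run-time reorder of the selected suite list. The patent says only that the order changes, so the shuffle algorithm and the optional seed parameter here are assumptions.

```python
import random

def reorder_suites(suites, seed=None):
    # Reorder the selected suites at run time; the shuffle algorithm and the
    # seed (for reproducing a failing order) are assumptions.
    rng = random.Random(seed)
    reordered = list(suites)
    rng.shuffle(reordered)
    return reordered

suites = ["serial", "display", "keyboard", "database"]
run_order = reorder_suites(suites, seed=1)
```

A seed is a natural addition in practice, since an interaction failure found under one random order is only useful if that order can be replayed.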
- the remaining Test Run options are generic activities that the user may control.
- the first option, "Use Selected Target Exclusively," is important, because, when a target device 9 is connected over the Ethernet, other users in a subnet can access that target device 9 through the O/S Validator 1 available target devices window. This helps to create stress on the target device 9. In the event that the user wishes to isolate a problem, extra stress should not be applied. In such situations, the user should have exclusive access to the target device 9.
- the last Test Run Option is Set Target Time, which prompts the Engine 3 to send the time from the host computer 4 to the target device 9, thereby synchronizing the target device 9 system time to the host computer 4 time.
- Synchronization is advantageous as the log files return with a date and time stamp related to the date and time on the target device 9.
- the last tab before running a test is the Environment Settings tab which contains valuable information for the various selectable environment variables.
- These environment settings are designed to extend and abstract the test suite files 5 by allowing the test suite files to contain environment variables instead of hard-coded information.
- the Serial Tests take an available com-port as a parameter. If the com-port is not entered as an environment variable, the test fails, because it is unable to open a com-port. All the environment variables used in the suites are provided; however, any additional environment variables may be user-added to the user-input suites.
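The com-port lookup described above can be sketched as follows; the environment variable name "COMPORT" and the returned verdict/message pair are assumptions for illustration.

```python
def open_serial_test(env):
    # The serial tests take the com-port from an environment variable rather
    # than hard-coding it; the variable name "COMPORT" is an assumption.
    port = env.get("COMPORT")
    if port is None:
        # Without the variable the test cannot open a com-port, so it fails.
        return ("FAIL", "no com-port environment variable set")
    return ("PASS", f"using com-port {port}")
```

In a real run the environment would come from the Environment Settings tab; a plain dict stands in for it here.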
- a Test Status is available for obtaining status information for the current test run.
- the information is dynamically being updated.
- a Suites Run section window lists the selected suites that have started. The user may open any log file from this window by selecting a desired test icon. The other icons in this control provide failure information. The failure icons are shown, by example, as a stylized beaker crossed-out with a red "X."
- a Test Run Summary Information keeps track of the number of test suite files run, suites selected, suites with a failure, and percentage of suites with a failure.
- the user may select a Configure Failed Suites tab prompting the appearance of a new configuration window selecting all the failed suites in the current test run, facilitating regression testing.
- Test Details The remaining two sections are called Test Details.
- One of these Test Details sections monitors the individual test cases that pass, as well as fail, and is valuable for gauging the value of a test run.
- the remaining Test Details section is the Failed Suites section where all selected suites with a fail are listed by suite name, showing the number of corresponding passing and failing test cases. All this information gives the user a very good idea of the limits of his target device 9 during a test run (i.e. what passes, and more importantly what fails).
- the primary object of the present invention is to properly test a port of an O/S 1001a, such as Windows CE.
- hundreds of suite tests 11 are needed and are provided by the O/S Validator 1.
- nearly 1500 test suites 11, as grouped by the verified O/S subsystem, are provided.
- the O/S Validator 1 includes both source codes and executable codes for all tests 11, covering the major O/S 1001a subsystems and the common adaptation drivers, with special emphasis on subsystems historically exhibiting the most problems.
- OS subsystem components tested by the O/S Validator 1 include: Ethernet/NDIS, serial port driver, display driver, touch panel driver, mouse driver, keyboard driver, OEM Adaptation Layer, and PC Card adapter driver.
- Figures 15A, 15B, and 15C show a table listing testing details of these system components.
- HarnessLink.dll 7 provides a set of functions on the GUI 2 for calling (i.e. affecting) the Engine 3.
- HarnessLink.dll 7 function-calls set up some parameters for the Engine 3's command line. All the initial information is passed to the Engine 3 through this command line, ensuring readiness of the Engine 3 for specified execution.
- a plurality of GUI-related functions provide information on the command line. The information on the command line corresponds to the GUI 2 information.
- another function HarnessLink.dll 7 serves is to communicate activity during the running of the Engine 3 by opening a named pipe.
- the Engine 3 communicates through the named pipe.
- a named pipe ensures that the communication between the Engine 3 and a specific Harness link is direct, accurate, and without any duplication problems if a plurality of Engines 3 are running.
- when the HarnessLink.dll 7 receives a message from the pipe, it signals the appropriate VB event which, in turn, causes the GUI 2 to take the information and process it accordingly.
- Engine 3 communicates with one HarnessLink.dll 7, which, in turn, communicates with one CEHarness 8.
- the Engine 3 execution is simple: a command line is received and processed, establishing the execution socket connection to the target device, opening the pipe for communication with the GUI 2, reading the test suite files 5, and subsequently executing the tests in three phases, PreExecution, Execution, and PostExecution.
- the PreExecution stage establishes the error socket connection between the target device 9 and the host 4. Relevant data such as logging paths and styles, various test run information, and stress scenarios are sent during the PreExecution stage.
- the Execution stage involves a response to each sequential suite command.
- a suite command is generally sent by the host 4 and is processed by the CEHarness 8 which, in turn, responds with a socket message when execution of the command is completed.
- CEHarness 8 in test target device 9 is a significantly more complicated component than the Engine 3. This complexity is because each device 9 has one instance of CEHarness 8 at any given time; however, that device can handle a plurality of simultaneous connections, a vitally important feature to the testing methodology provided by the O/S Validator 1.
- when the user starts CEHarness 8, it creates two threads that endure through the entire execution time: a broadcast thread and an execution thread. The broadcast thread updates information such as the device IP, connection type, and available com-ports every ten seconds, sending a broadcast message at the same rate to the network.
- when the target device 9 is directly connected to a host rather than networked, the connection type will change to PPP_PEER. If this occurs, the broadcast message is sent only to the host 4 to which the target device 9 is directly connected. If the user changes the connection at some point during execution, the message is updated. Meanwhile, the execution thread waits for a connection attempt from an Engine 3. When the execution thread receives the connection, it spawns another thread, a main execution thread, that performs the various functions required. The main execution thread starts another socket for sending any error or memory information. Thus, the execution thread is event-driven, receiving a command and responding appropriately. Each connection attempt spawns its own execution thread; therefore, a single CEHarness 8 may have many active connections, extending the test functionality by running a plurality of configurations simultaneously on one target device 9, and thereby creating a more realistic stress situation.
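The per-connection threading model described above can be sketched as follows. This is a sketch only: queues stand in for the socket connections, the command strings are illustrative, and the real CEHarness is a C/C++ program.

```python
import queue
import threading

# Sketch of the CEHarness execution model: one instance accepts several
# simultaneous connections and spawns an execution thread per connection,
# each responding to commands as events. Queues replace sockets here.
def handle_connections(connections, responses):
    def execution_thread(conn_id, commands):
        for cmd in commands:
            # Event-driven: respond with a completion message per command.
            responses.put((conn_id, cmd, "done"))
    threads = [
        threading.Thread(target=execution_thread, args=(conn_id, commands))
        for conn_id, commands in connections
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

responses = queue.Queue()
handle_connections([(1, ["RUN a.exe"]), (2, ["RUN b.exe", "WAIT 5"])], responses)
```

Running several configurations concurrently against one device is exactly what makes this model a stress multiplier, rather than just a convenience.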
- the Logging library 12 shown in Figure 3.0 is a complex tool integrated into the O/S Validator 1 in various ways. Primarily, the Logging library 12 is integrated into the source files for the tests. The tests make calls to the library APIs whereupon the library handles all the details regarding capture and recordation of test results.
- the Logging library 12 also supports a variety of communication options. The recommended option is TCP, which allows the user to watch the log files in real time as they stream over the TCP connection. Another communication option is direct logging to files on a device. This can be advantageous, for example, if the user wants to log to a PCMCIA card but does not want extra information to be broadcast over the network.
- the Logging library 12 acts as the device-side component of the TCP communications
- a host 4 component acts as the GUI 2-side component of the communications.
- this aspect of the Logging library 12 provides a log window on the host 4 and color codes the test results.
- failure messages are displayed as a red line.
- the name and location of the source code file, as well as the line number where the message was generated, are included in the Logging library 12 messages.
- a detailed error message is provided describing current events in the program.
- each log window has a summary tab that gives the user quick access to program memory, passes, failures, and timing information. Another important feature of the log files is that they capture a large volume of information at both the beginning and the end of a test.
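The logging behavior described above (tests call library APIs; the library records source file and line for each message, tallies passes and failures, and exposes a summary) can be sketched as follows. This is a hypothetical sketch of such an API, not the actual Logging library 12 interface: the class name `TestLog` and the method names are assumptions:

```python
import inspect
import time

class TestLog:
    # Hypothetical sketch of a device-side logging library: each call
    # records the source file and line of the caller, pass/fail results
    # are tallied, and summary() reports passes, failures, and elapsed
    # time, like the log window's summary tab described above.
    def __init__(self, name):
        self.name = name
        self.records = []
        self.passes = 0
        self.failures = 0
        self.start = time.monotonic()

    def _record(self, level, message):
        # stack()[2] is the frame of the test code that called
        # log_pass()/log_fail(), giving its file name and line number.
        caller = inspect.stack()[2]
        self.records.append((level, message, caller.filename, caller.lineno))

    def log_pass(self, message):
        self.passes += 1
        self._record("PASS", message)

    def log_fail(self, message):
        # A GUI front end would render these records as red lines.
        self.failures += 1
        self._record("FAIL", message)

    def summary(self):
        elapsed = time.monotonic() - self.start
        return {
            "test": self.name,
            "passes": self.passes,
            "failures": self.failures,
            "seconds": round(elapsed, 3),
        }
```

For example, a test source file would call `log.log_pass("create file ok")` or `log.log_fail("delete file failed")` at each checkpoint, and the host-side viewer would read the accumulated records and the summary.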
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP00909951A EP1236108A1 (en) | 1999-01-21 | 2000-01-21 | A system and method for testing and validating devices having an embedded operating system |
KR1020017009191A KR20010112250A (en) | 1999-01-21 | 2000-01-21 | A system and method for testing and validating devices having an embedded operating system |
AU32123/00A AU3212300A (en) | 1999-01-21 | 2000-01-21 | A system and method for testing and validating devices having an embedded operating system |
BR0009008-5A BR0009008A (en) | 1999-01-21 | 2000-01-21 | System and method for testing and validation devices with an embedded operating system |
JP2000595240A JP2002535773A (en) | 1999-01-21 | 2000-01-21 | System and method for testing and validating a device having an embedded operating system |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11682499P | 1999-01-21 | 1999-01-21 | |
US60/116,824 | 1999-01-21 | ||
US13762999P | 1999-06-04 | 1999-06-04 | |
US60/137,629 | 1999-06-04 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2000043880A1 true WO2000043880A1 (en) | 2000-07-27 |
Family
ID=26814666
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2000/001583 WO2000043880A1 (en) | 1999-01-21 | 2000-01-21 | A system and method for testing and validating devices having an embedded operating system |
Country Status (7)
Country | Link |
---|---|
EP (1) | EP1236108A1 (en) |
JP (1) | JP2002535773A (en) |
KR (1) | KR20010112250A (en) |
CN (1) | CN1359492A (en) |
AU (1) | AU3212300A (en) |
BR (1) | BR0009008A (en) |
WO (1) | WO2000043880A1 (en) |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN100456043C (en) * | 2003-02-14 | 2009-01-28 | 爱德万测试株式会社 | Method and apparatus for testing integrated circuits |
CN1316358C (en) * | 2004-03-05 | 2007-05-16 | 英业达股份有限公司 | Information platform test environment automatic construction method and system |
CN100403701C (en) * | 2004-08-09 | 2008-07-16 | 华为技术有限公司 | Goal device service realization testing method and system |
CN100375058C (en) * | 2004-12-24 | 2008-03-12 | 北京中星微电子有限公司 | Software development method for flush type products |
CN100440162C (en) * | 2005-04-08 | 2008-12-03 | 环达电脑(上海)有限公司 | Embedded apparatus debugging method |
CN101452415B (en) * | 2007-11-30 | 2011-05-04 | 鸿富锦精密工业(深圳)有限公司 | Auxiliary device and method for testing embedded system |
JP2012503819A (en) | 2008-09-25 | 2012-02-09 | エルエスアイ コーポレーション | Method and / or apparatus for authenticating an out-of-band management application in an external storage array |
US10318477B2 (en) * | 2010-05-26 | 2019-06-11 | Red Hat, Inc. | Managing and archiving system and application log files |
CN102339248A (en) * | 2010-07-20 | 2012-02-01 | 上海闻泰电子科技有限公司 | On-line debugging system and method for embedded terminal |
CN105445644A (en) * | 2015-11-18 | 2016-03-30 | 南昌欧菲生物识别技术有限公司 | Multi-type chip test plate, test system and test machine bench |
CN106201765B (en) * | 2016-07-21 | 2019-03-15 | 中国人民解放军国防科学技术大学 | Task stack area data check restoration methods based on μ C/OS-II operating system |
CN110226095B (en) * | 2016-10-20 | 2022-06-17 | Y软股份公司 | Universal automated testing of embedded systems |
CN109960590B (en) * | 2019-03-26 | 2021-05-18 | 北京简约纳电子有限公司 | Method for optimizing diagnostic printing of embedded system |
CN110221974A (en) * | 2019-05-22 | 2019-09-10 | 深圳壹账通智能科技有限公司 | Service platform system self checking method, device, computer equipment and storage medium |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5590354A (en) * | 1993-07-28 | 1996-12-31 | U.S. Philips Corporation | Microcontroller provided with hardware for supporting debugging as based on boundary scan standard-type extensions |
US5724505A (en) * | 1996-05-15 | 1998-03-03 | Lucent Technologies Inc. | Apparatus and method for real-time program monitoring via a serial interface |
2000
- 2000-01-21 WO PCT/US2000/001583 patent/WO2000043880A1/en not_active Application Discontinuation
- 2000-01-21 AU AU32123/00A patent/AU3212300A/en not_active Abandoned
- 2000-01-21 KR KR1020017009191A patent/KR20010112250A/en not_active Application Discontinuation
- 2000-01-21 JP JP2000595240A patent/JP2002535773A/en active Pending
- 2000-01-21 BR BR0009008-5A patent/BR0009008A/en not_active Application Discontinuation
- 2000-01-21 CN CN 00802922 patent/CN1359492A/en active Pending
- 2000-01-21 EP EP00909951A patent/EP1236108A1/en not_active Withdrawn
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2003023611A1 (en) * | 2001-08-22 | 2003-03-20 | Mauchai Chan | The x86-compatible computer and the method of the generation of the operating system |
CN100367238C (en) * | 2001-08-22 | 2008-02-06 | 深圳市索普卡软件开发有限公司 | X-86 serial compatible machine and generation method for its operation system |
CN100356738C (en) * | 2005-07-29 | 2007-12-19 | 杭州华三通信技术有限公司 | Automatization testing frame system and method |
WO2007083001A1 (en) * | 2006-01-23 | 2007-07-26 | Mika Pollari | A test apparatus and a method for testing an apparatus |
US9003370B2 (en) | 2010-03-04 | 2015-04-07 | Nec Corporation | Application modification portion searching device and application modification portion searching method |
CN102916848A (en) * | 2012-07-13 | 2013-02-06 | 北京航空航天大学 | Automatic test method of Ethernet interface equipment based on script technology |
CN102916848B (en) * | 2012-07-13 | 2014-12-10 | 北京航空航天大学 | Automatic test method of Ethernet interface equipment based on script technology |
CN112306888A (en) * | 2020-11-13 | 2021-02-02 | 武汉天喻信息产业股份有限公司 | Test system and method based on equipment library file interface |
CN112306888B (en) * | 2020-11-13 | 2022-05-10 | 武汉天喻信息产业股份有限公司 | Test system and method based on equipment library file interface |
CN116743990A (en) * | 2023-08-16 | 2023-09-12 | 北京智芯微电子科技有限公司 | Video stream testing method and video stream testing processing method of embedded equipment |
CN116743990B (en) * | 2023-08-16 | 2023-10-27 | 北京智芯微电子科技有限公司 | Video stream testing method and video stream testing processing method of embedded equipment |
Also Published As
Publication number | Publication date |
---|---|
AU3212300A (en) | 2000-08-07 |
JP2002535773A (en) | 2002-10-22 |
KR20010112250A (en) | 2001-12-20 |
EP1236108A1 (en) | 2002-09-04 |
CN1359492A (en) | 2002-07-17 |
BR0009008A (en) | 2002-02-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6182246B1 (en) | Protocol acknowledgment between homogeneous system | |
WO2000043880A1 (en) | A system and method for testing and validating devices having an embedded operating system | |
AU2004233548B2 (en) | Method for Computer-Assisted Testing of Software Application Components | |
US8589886B2 (en) | System and method for automatic hardware and software sequencing of computer-aided design (CAD) functionality testing | |
US6408403B1 (en) | Method for integrating automated software testing with software development | |
US8356282B1 (en) | Integrated development environment for the development of electronic signal testing strategies | |
US9098635B2 (en) | Method and system for testing and analyzing user interfaces | |
US9477581B2 (en) | Integrated system and method for validating the functionality and performance of software applications | |
US7143310B2 (en) | Generating standalone MIDlets from a testing harness | |
AU748588B2 (en) | Method for defining durable data for regression testing | |
US8074204B2 (en) | Test automation for business applications | |
US7134049B2 (en) | System and method for sequencing and performing very high speed software downloads concurrent with system testing in an automated production environment | |
US7222265B1 (en) | Automated software testing | |
US20030070120A1 (en) | Method and system for managing software testing | |
US20080270841A1 (en) | Test case manager | |
US20040201627A1 (en) | Method and apparatus for analyzing machine control sequences | |
CN110362490B (en) | Automatic testing method and system for integrating iOS and Android mobile applications | |
US7143361B2 (en) | Operator interface controls for creating a run-time operator interface application for a test executive sequence | |
US20050049814A1 (en) | Binding a GUI element to a control in a test executive application | |
WO2000075783A1 (en) | Protocol acknowledgment between homogeneous systems | |
CN111639029B (en) | Reliability test method and system for soft AC product | |
CN114265769A (en) | Test system and method based on python script test case | |
Bland et al. | Design and implementation of a menu based oscar command line interface | |
JPH09223040A (en) | System test supporting device for software and test scenario generator to be used for the same | |
JPH09223042A (en) | System test device for software |
Legal Events
Code | Title | Description |
---|---|---|
WWE | Wipo information: entry into national phase | Ref document number: 00802922.9; Country of ref document: CN |
AK | Designated states | Kind code of ref document: A1; Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW |
AL | Designated countries for regional patents | Kind code of ref document: A1; Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | |
WWE | Wipo information: entry into national phase | Ref document number: IN/PCT/2001/00794/MU; Country of ref document: IN |
WWE | Wipo information: entry into national phase | Ref document number: 1020017009191; Country of ref document: KR |
ENP | Entry into the national phase | Ref document number: 2000 595240; Country of ref document: JP; Kind code of ref document: A |
WWE | Wipo information: entry into national phase | Ref document number: 2000909951; Country of ref document: EP |
REG | Reference to national code | Ref country code: DE; Ref legal event code: 8642 |
WWP | Wipo information: published in national office | Ref document number: 2000909951; Country of ref document: EP |
WWW | Wipo information: withdrawn in national office | Ref document number: 2000909951; Country of ref document: EP |