EP0685084A1 - Collecte automatique et affichage graphique des donnees de tests d'ergonomie - Google Patents
Collecte automatique et affichage graphique des donnees de tests d'ergonomieInfo
- Publication number
- EP0685084A1 (application EP95905852A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- cursor
- data
- user
- software
- useability
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
- G06F11/3409—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
- G06F11/3419—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment by assessing time
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
- G06F11/3409—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
- G06F11/3414—Workload generation, e.g. scripts, playback
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
- G06F11/3438—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment monitoring of user actions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
- G06F11/3466—Performance evaluation by tracing or monitoring
Definitions
- This invention relates to computer software testing, and more particularly to a system for developing and testing computer software which includes the automatic tracking and graphical display of the software useability.
- Program useability describes the ease with which an untrained or modestly trained user can maneuver about within the software program and operate the various program functions. This useability includes such considerations as the logical arrangement of functions and menus, the layout of the various graphical features on the computer screen, and the hierarchical arrangement in which various functions are ordered.
- The traditional method of testing the useability of computer software is to videotape the actual usage of the software. This generally entails setting up a testing station in which a number of users sequentially perform specific tasks on the software-under-test. The various keystrokes and user inputs are recorded on videotape.
- The testing sessions are then reviewed by a software test engineer, and various relevant data relating to the useability and use patterns contained in the videotape are recorded.
- Measuring software useability in this way is inconvenient in a number of respects.
- First, since the user must be videotaped, maintaining more than one station can be an expensive process, as there must be one taping machine per station.
- Second, having a video camera, and especially a cameraman, present during the software testing session can skew the useability results.
- Third, evaluation of the videotape following the test sessions can be an extremely burdensome process, since each of the video sessions must be reviewed by the software test engineer.
- Fourth, because the test sessions are videotaped by user (i.e., a first user is taped performing multiple tasks, followed by a second user performing multiple tasks), they do not lend themselves to being conveniently reviewed by task.
- What is needed is a method to automatically record software utilization in such a way that not only the results achieved by the software user can be measured, but also the process followed by the user in achieving the software results.
- The system consists of programmed computer instructions implemented on a general purpose computer comprising a central processing unit for performing the programmed instructions, a video display for displaying stored data, a keyboard and pointing device for enabling user input, a clock generator for timing each function tested, and memory for data storage.
- The memory includes program memory, test memory, mass storage and software-under-test memory.
- The test memory contains a number of registers, including a path array for keeping track of the movements of a pointing device, a click register for locating the position of the pointing device when a specific graphical interface selection is made, and a statistics register for keeping track of various user statistics such as total test time elapsed, user ID number, user name and familiarity data.
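The three test-memory registers described above can be sketched as simple in-memory structures. This is an illustrative sketch only; the names `path_array`, `click_register` and `statistics` are assumptions for exposition, not terms from the patent.

```python
from dataclasses import dataclass, field

# Hypothetical layout of the test memory; field names are illustrative.
@dataclass
class TestMemory:
    path_array: list = field(default_factory=list)      # (t, x, y) cursor samples
    click_register: list = field(default_factory=list)  # (x, y) at each mouse click
    statistics: dict = field(default_factory=dict)      # elapsed time, ID, name, familiarity

mem = TestMemory()
mem.path_array.append((0.1, 40, 25))        # cursor at (40, 25) after 0.1 s
mem.click_register.append((40, 25))         # the user clicked at that position
mem.statistics.update({"user_id": 1, "user_name": "Andy", "elapsed": 5.8})
```

Keeping the path samples, click locations and per-user statistics in separate structures mirrors the separate registers the description enumerates.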
- The method of the present invention is initiated when the user is asked to perform a specific function in the software-under-test. Upon initiating the task, a user ID is established and a clock generator begins to track the time of the user function. The system then traces the path followed by the user in manipulating the pointing device or keyboard.
- When the function is complete, the total time elapsed is determined, the user ID and path data are stored in the statistics register, and a new user or new task is selected.
- Analysis of the test data is performed by replaying the paths of selected users and displaying the paths and click entry locations in a playback mode on the video screen.
- A single user's test data may be analyzed as the path traced is replayed in real time, or a composite graph comprising a plurality of charted lines may be displayed for comparison of the test data of selected users.
- FIG. 1 is a block diagram showing the system of the present invention
- FIG. 2(a) is a block diagram showing the various components of the test memory of FIG. 1
- FIG. 2(b) is a block diagram showing the various components of the statistics register of FIG. 2(a)
- FIG. 3 is a flow diagram showing the various method steps of the present invention
- FIG. 4 shows an example of the video display screen used in collecting information in the method of the present invention
- FIG. 5 shows an example of the video display screen showing data collected for a single user following execution of the example of FIG. 4
- FIG. 6 shows an example of the video display screen showing various statistics collected for multiple users following execution of the example of FIG. 4.
- Referring to FIG. 1, a system 10 is shown for implementing the automatic gathering and graphical display of useability test data of the present invention.
- The preferred embodiment of the present invention consists of programmed computer instructions implemented on a general purpose computer such as an IBM PC, Intel 80486-based machine. Execution of the programmed instructions is performed in a Central Processing Unit (CPU) 12. Connected to the CPU 12 is a video display 14 for displaying stored data to the user. User input is enabled through a keyboard 16 and a pointing device 18, which are both connected to the CPU 12.
- The CPU 12 can optionally be connected to a printer 20 for generating hard copies of data manipulated by the CPU 12, a Local Area Network (LAN) 22 for communicating with other computers, and a modem 24 for receiving and transmitting data through a telephone network. Also connected to the CPU 12 is a data bus 26 for connecting the CPU 12 to other peripheral devices, such as a clock register 27, and to various computer memories, including a program memory 28, test memory 30, software-under-test memory 32 and mass storage 34.
- The program memory 28 stores the programmed instructions for controlling the operation of the CPU 12 in executing the method steps of the present invention. Following the instructions stored in the program memory 28, the CPU 12 prompts the user via the video display 14 to perform specific tasks on the software being tested, using the keyboard 16 and the pointing device 18.
- The software-under-test memory 32 is used to store the software components being analyzed by the instructions stored in program memory 28.
- A typical software component being analyzed would be a dialog box in which the user is asked to read a set of instructions and pick one or several choices from a plurality of possible selections.
- Statistical data, such as the amount of time required for the user to complete the task displayed in the dialog box, is recorded and stored in the test memory 30.
- The mass storage 34, such as a hard disk or the like, is included in the system 10 for long-term storage of a multiplicity of test results.
- The clock register 27 provides a means for determining the time that a user spends performing a specific function as identified by instructions stored in the software-under-test memory 32. Increments of time are generated by the internal clock of the CPU 12, and the CPU periodically increments the contents of the clock register 27. The periodic updating of the clock register 27 preferably occurs in tenth-of-a-second increments, although other increments may be equally useful.
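The tenth-of-a-second counting scheme can be illustrated with a toy counter. This is a sketch, not the patent's implementation; in the patent the CPU's internal clock drives the increments, which are simulated here by direct calls.

```python
class ClockRegister:
    """Toy model of the clock register: counts tenth-of-a-second ticks."""

    def __init__(self):
        self.ticks = 0

    def tick(self):
        # In the described system the CPU periodically increments the
        # register; here the periodic increments are simulated directly.
        self.ticks += 1

    def seconds(self):
        return self.ticks / 10.0

clk = ClockRegister()
for _ in range(58):   # 58 tenth-second increments, as in the FIG. 5 example
    clk.tick()
```

With 58 increments the register represents 5.8 seconds, matching the elapsed time shown for the example user in FIG. 5.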
- The clock register 27 is used by the CPU 12 for measuring and tracking functions as required by the programmed instructions of the present invention.
- Referring now to FIG. 2(a), a block diagram shows the various components of the test memory 30. The data bus 26 connects a path array 36, a click register 38 and a statistics register 40 to the CPU 12.
- The path array 36 selectively stores the location of the cursor as it is moved about the video display 14 by the user.
- Each user is assigned an identification number by the CPU 12 that is used to identify the user's test data for a specific function.
- The identification numbers are stored in the statistics register 40.
- The CPU 12 utilizes the identification number it generates, and not the user's name, to store and retrieve the test data for that user.
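The ID scheme might look like the following sketch. The sequential numbering and the helper name are assumptions; the patent does not specify how the CPU generates the numbers.

```python
def assign_user_ids(names):
    """Assign a numeric ID to each distinct user name, in order of first appearance."""
    ids = {}
    for name in names:
        if name not in ids:
            ids[name] = len(ids) + 1
    return ids

# Test data is later stored and retrieved by these numeric IDs, not by name.
user_ids = assign_user_ids(["Andy", "Beth", "Andy"])
```

A repeated name maps to the same ID, so one user's data from several functions stays keyed to a single identifier.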
- The CPU 12 periodically requests the user to make selections from among various menu items. The user makes these selections by moving the cursor to the area of the video display 14 containing the desired information and either pressing a predetermined keyboard key or pressing an enter key on the pointing device 18.
- The statistics register 40 stores a variety of statistical data relating to the user's interaction with the software-under-test. This stored data includes the elapsed time calculated by the CPU 12 for performing specific functions, the user name entered at the beginning of the test, the user ID generated by the CPU 12 for identifying specific test data, and user familiarity data. User familiarity data are answers to multiple choice questions prompted by the test program to ascertain the level of familiarity the user has with the software-under-test or like products.
- Referring now to FIG. 2(b), a block diagram shows the components comprising the statistics register 40, which include an elapsed time memory 42, a user name memory 44, a user ID number memory 46 and a familiarity data memory 48.
- The elapsed time memory 42 stores the total time that a user spends performing a specific function, as determined by the clock register 27 and the CPU 12.
- The user name memory 44 stores the name that the user input in response to a prompt by the CPU 12 at the beginning of a test cycle.
- The user ID number memory 46 stores the CPU-generated ID number for each user.
- The familiarity data memory 48 stores answers to multiple choice questions prompted by the CPU 12 at the beginning of the test cycle, where the user is asked specific questions to gauge the level of familiarity the user has with the software-under-test.
- Referring now to FIG. 3, a flow diagram shows the various steps comprising the method 50 of automatically gathering test data in accordance with the present invention.
- The method starts in step 51 with a prompt from the CPU 12 requesting that the name of the user be entered via the keyboard 16.
- In the preferred embodiment the user's name is used; however, any other identification system may also be used.
- The user's name is entered 53 and is stored in the user name memory 44.
- The CPU 12 generates a separate ID number for each user name input and stores the ID numbers in the user ID number memory 46.
- The CPU 12 utilizes the user ID number as the primary ID source for purposes of identifying data, but preferably only the user's name is displayed on the video display 14.
- Familiarity data is entered 55 by the user in response to multiple choice questions that gauge the user's familiarity with like software products. For example, an appropriate multiple choice question might ask the user to choose whether he is more familiar with IBM-type personal computers or Apple Macintosh-type computers. Another question might ask the user if he has previously used a particular software package.
- The familiarity level of the user is important in the proper analysis of the software-under-test, as heightened familiarity may skew the test results and cause inaccurate analysis. A computer-literate user will most likely have an easier time working with new programs than a computer novice.
- The familiarity data entries 55 are stored in the familiarity data memory 48.
- The useability evaluation of the software-under-test is initiated by displaying 57 a function task, comprising a written set of instructions presented on the video display 14.
- The instructions provide the user with directions about the task to be performed.
- The instructions also direct the user to acknowledge that the directions have been read and to initiate the evaluation, preferably by pressing a certain key or by entering a specific combination of keystrokes.
- Alternatively, the user positions the cursor on an acknowledgment button displayed on the video display 14 and initiates the evaluation by pressing the enter key on the pointing device 18.
- The test screen associated with the first function is then displayed on the video display 14, and incrementing of the clock register 27 is started 59.
- The clock register 27 continues to increment until a final input from the user signals the completion of the specific function being evaluated.
- The movement of the cursor is traced 61, and the cursor position at the time of each click is stored.
- The X and Y coordinates corresponding to the position of the cursor are ascertained by the CPU 12 at periodic time increments and are stored in the path array memory 36. From this stored cursor data, the path 61 of the cursor as it is moved about the video display 14 by the user can later be retraced.
- The position of the cursor each time the mouse is clicked is also stored by its corresponding X and Y coordinates.
- The position data and quantity of clicks are stored in the click register 38.
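The periodic sampling described above amounts to pairing each sampled (X, Y) position with its time increment. The sketch below assumes a fixed 0.1 s sampling interval; the function name and interval are illustrative, not from the patent.

```python
def sample_path(positions, interval=0.1):
    """Attach a timestamp to each periodically sampled cursor position."""
    return [(round(i * interval, 1), x, y) for i, (x, y) in enumerate(positions)]

# Three cursor samples taken 0.1 s apart as the user moves toward a button,
# plus the cursor position recorded at the moment of the mouse click.
path = sample_path([(0, 0), (10, 5), (20, 12)])
clicks = [(20, 12)]
```

Storing timestamps alongside coordinates is what later allows the path to be replayed in real time, pauses included.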
- When the specific function being tested is completed, the user indicates 63 the completion by entering a "done" command using either the keyboard 16 or the pointing device 18.
- The CPU 12 tests for a done indication in step 63.
- When the done indication is received, the time stored in the clock register 27 is read by the CPU 12, and the total time elapsed for the function tested is determined and stored in the elapsed time memory 42.
- A software useability test typically comprises a plurality of functions. Where a second function is to be tested, the method repeats by displaying 57 a new set of instructions. As with the instructions for the first function, the user is asked to start the function timer by entering a predetermined combination of keystrokes. The total elapsed time is determined by the CPU 12 for each function performed.
- Test data is preferably stored primarily by user ID number and secondarily by function code in the statistics register 40.
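The two-level keying (user ID first, function code second) can be sketched as a nested mapping. The helper name and record fields are illustrative assumptions.

```python
def record_result(store, user_id, dialog_id, data):
    """File test data primarily by user ID, secondarily by function (dialogid) code."""
    store.setdefault(user_id, {})[dialog_id] = data

results = {}
record_result(results, 1, "new", {"elapsed": 5.8, "clicks": 3})
record_result(results, 1, "open", {"elapsed": 3.1, "clicks": 2})
record_result(results, 2, "new", {"elapsed": 4.2, "clicks": 3})
```

This layout makes both access patterns cheap: all functions for one user (`results[1]`), or one function across users (look up the same `dialog_id` under each user key).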
- When all functions have been tested, the CPU 12 asks 65 the user if a new user is to be tested. If the answer is 'yes', the new user's name is entered 53 and processing iterates once again through the function evaluation steps as discussed. If no further users are found in step 65, the method ends 67.
- Referring to FIG. 4, a test mode video display screen 70 is shown, concomitant with the starting 59 of the clock register incrementation.
- The test memory 30 is now prepared to receive information from the user via the CPU 12.
- The user is prompted to begin the test by responding to a first instruction 78.
- The test screen 70 includes an instruction box 72 in the upper right-hand corner of the test screen 70 and a test box 74, entitled "new" in the example shown, in the middle left-hand side of the test screen 70.
- The instruction box 72 provides direction to the user and prompts the user to begin testing by entering a combination of initiating keystrokes 80.
- The test box 74 is a video display window that contains the graphical components being evaluated for useability by the present invention.
- In actual use, the test box 74 appears much bigger and partially covers the initiating keystrokes 80 prompt, but its size has been reduced in FIG. 4 for illustration purposes.
- In the example shown, the instructions direct the user to "show the titles and left justify them on the page."
- To perform this task, the cursor 76 would first be moved to the box marked "show titles" 82. When the cursor 76 is within the frame of the show titles box 82, the mouse is clicked to select "show titles". The cursor 76 is then moved to the box marked "align text" 84, centered within the "left" bubble 86, and the mouse is clicked to select left alignment. Next, the cursor 76 is moved to the box marked "OK" 88 and the mouse is clicked to complete the test function and stop the incrementing clock register 27.
- The path of the cursor 76 is determined by the location of the cursor 76 at selected time intervals.
- Each user's path is stored in the path array memory 36, and the location of each cursor click is stored in the click register 38.
- The user ID, user name, familiarity data and the elapsed function time for the series of functions comprising an event are stored in the statistics register 40.
- The data stored in the path array 36, the click register 38, and the statistics register 40 can be recalled to recreate a test function performed by a user, in order to analyze the useability of the software-under-test.
- The cursor path and cursor click location data for any user can be recalled by the CPU 12 and displayed on the test screen 70 as a charted line representing the cursor path and a series of bullets denoting the locations of the cursor clicks.
- To draw the charted line, the time increment/cursor location data from the path array 36 is recalled to the test screen 70. Because the data is stored as a series of points having X and Y coordinates, drawing a line between subsequent points to connect them creates the charted line.
- The X and Y coordinates relating to the cursor click locations are recalled to the test screen 70, and bullets or other such demarcations are added to the display at each location.
- Both the charted line and the bullets can be recreated simultaneously where the "real time" path (i.e., the exact path, including pauses) taken by the chosen user is desired for the analysis of the software-under-test.
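Connecting subsequent stored points into a charted line reduces to pairing each point with its successor. A minimal sketch, with the function name an illustrative assumption:

```python
def charted_line(path):
    """Turn stored (t, x, y) samples into the line segments that retrace the cursor path."""
    points = [(x, y) for _, x, y in path]
    return list(zip(points, points[1:]))   # each segment joins consecutive samples

segments = charted_line([(0.0, 0, 0), (0.1, 10, 5), (0.2, 20, 12)])
```

Drawing one segment per sampling interval, rather than all at once, is what lets the replay run at the recorded pace.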
- FIG. 5 illustrates a playback mode video display screen 89, showing the path array and cursor click data collected for a single user following the execution of the function displayed in FIG. 4.
- The playback screen 89 includes the instruction box 72 and test box 74 as seen by the user during the test, and further includes an analysis box 90 shown below the test box 74.
- The same test environment is presented in the playback mode as in the test mode, so that an accurate examination and analysis of the test results can be made.
- In the example shown, a user named Andy 91 utilized a total time of 58 one-tenth-second increments 92 (or 5.8 seconds) to complete the example task.
- Andy's path 98 is drawn in graphic form and click locations 100a through 100c are shown as bullets.
- Andy started at the upper left-hand corner 102 of the test box 74, moved to the show titles box 82, clicked the mouse at location 100a, moved toward the right, dipping slightly at point 104, until the cursor was within the align text left bubble 86, clicked the mouse at location 100b, then moved to the "OK" box 88 and clicked the mouse at location 100c to finalize the task.
- All existing paths are first cleared by pressing the clear path button 106.
- The name of the user whose test is to be analyzed is then input 108.
- While the user identification box 109 is shown as comprising a single row of data, in an alternative embodiment a plurality of user identifications are listed, and the list of users may be scrolled in order to select one or more users.
- The speed at which the charted line is drawn is changed by moving the speed box 110 to the right for faster line creation, or to the left for slower line creation.
- By default, the chart is drawn in real time to simulate the exact path, including pauses, of the chosen user.
- The draw speed can be increased for instantaneous line creation or decreased for slow-motion line creation, as desired.
- The "dialogid" box 111 indicates which function of an event is being recreated. As the various function tasks comprising the software test are created, each function is coded with a dialogid number 113. Test data is stored in the statistics register 40, preferably primarily by the user ID number and secondarily by dialogid number 113.
- Referring to FIG. 6, a plurality of users are analyzed simultaneously in an example of a playback screen 89 showing the paths 112 and cursor click data 114 for multiple users.
- A plurality of sets of preliminary data 116 (user name, time elapsed and familiarity data answers) are shown for multiple users in generally tabular form. Alternatively, only one set of preliminary data 116 can be shown at a time.
- The preliminary data 116 can be sorted by user name, by total time elapsed or by the sequence of answers to the familiarity data.
- The paths of selected users are preferably drawn sequentially in overlapping fashion, with the line that is in the process of being drawn highlighted, shown as a darker line 118, or drawn in a different color.
- A composite of the charted lines for a multiplicity of users can be displayed simultaneously for comparison. For instance, to create a composite of the charted lines for only the five fastest users, the preliminary data is sorted by total elapsed time and the paths taken by the first five users are selected to be instantaneously drawn. Further, an analysis can be limited to only the cursor clicks by choosing to regenerate the corresponding data from the click register and having the click locations drawn on the playback screen.
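Sorting the preliminary data by elapsed time and keeping the first entries, as in the five-fastest-users composite just described, can be sketched as follows. The record fields are illustrative assumptions.

```python
records = [
    {"name": "Andy", "elapsed": 5.8},
    {"name": "Beth", "elapsed": 4.2},
    {"name": "Carl", "elapsed": 7.1},
]

# Sort the preliminary data by total elapsed time and keep up to the
# five fastest users for the composite display.
fastest = sorted(records, key=lambda r: r["elapsed"])[:5]
fastest_names = [r["name"] for r in fastest]
```

The same pattern covers the other sort orders mentioned above: sorting by name or by the familiarity answers is just a different `key` function.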
- The automatic gathering and graphical display system of the present invention thus provides a software useability analyzer with a means for viewing the paths and click locations of one or more users simultaneously, to assist in the analysis of the particular software-under-test. By considering the elapsed time data and gauging the familiarity level of each user, in addition to examining the traced paths, the useability of the tested software can be accurately assessed.
Abstract
A device for recording software useability data comprises a central processing unit (CPU) (12), a video display (14), a keyboard (16), a clock register (27), a program memory (28), a test memory (30), and a storage disk (34). The test memory (30) includes a path array (36) that keeps track of the movements of a pointing device (18), a click register (38) that locates the pointing device (18), and a statistics register (40) that keeps track of certain statistical data, such as: a user identification (ID) number identifying the successive users, the time (42) taken by each user to perform a function, the user name (44), and familiarity data (48) established from answers to questions concerning the user's level of experience with software of the same type. The method of the present invention begins with a request (57) that the user perform a specific function in the software. The system then tracks (61) the cursor and determines the elapsed time. The data are stored in the test memory (30), and the program then either ends or a new task or new user is selected (65). Analysis is performed by replaying and displaying on the screen (14) the paths of selected users.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16950693A | 1993-12-17 | 1993-12-17 | |
US169506 | 1993-12-17 | ||
PCT/US1994/013853 WO1995016949A1 (fr) | 1993-12-17 | 1994-12-06 | Collecte automatique et affichage graphique des donnees de tests d'ergonomie |
Publications (1)
Publication Number | Publication Date |
---|---|
EP0685084A1 true EP0685084A1 (fr) | 1995-12-06 |
Family
ID=22615984
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP95905852A Withdrawn EP0685084A1 (fr) | 1993-12-17 | 1994-12-06 | Collecte automatique et affichage graphique des donnees de tests d'ergonomie |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP0685084A1 (fr) |
CA (1) | CA2156058A1 (fr) |
WO (1) | WO1995016949A1 (fr) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
IL119746A (en) * | 1996-12-03 | 2000-06-01 | Ergolight Ltd | Computerized apparatus and methods for identifying usability problems of a computerized system |
EP0951706A4 (fr) | 1996-12-03 | 2000-02-23 | Ergolight Ltd | Appareil informatise et procedes d'identification des problemes d'utilisation d'un systeme informatise |
US5872976A (en) * | 1997-04-01 | 1999-02-16 | Landmark Systems Corporation | Client-based system for monitoring the performance of application programs |
US6526526B1 (en) | 1999-11-09 | 2003-02-25 | International Business Machines Corporation | Method, system and program for performing remote usability testing |
WO2002008903A2 (fr) * | 2000-07-25 | 2002-01-31 | Vividence Corporation | Systeme d'analyse automatique de l'efficacite de pages web dans un contexte de taches |
JP2004102564A (ja) * | 2002-09-09 | 2004-04-02 | Fuji Xerox Co Ltd | ユーザビリティ評価支援装置 |
JP2004110548A (ja) | 2002-09-19 | 2004-04-08 | Fuji Xerox Co Ltd | ユーザビリティ評価支援装置および方法 |
WO2005045673A2 (fr) | 2003-11-04 | 2005-05-19 | Kimberly-Clark Worldwide, Inc. | Appareil de test comprenant une matrice de tracabilite multidimensionnelle automatique permettant de mettre en place et de valider des systemes logiciels complexes |
CN107193984A (zh) * | 2017-05-25 | 2017-09-22 | 上海喆之信息科技有限公司 | 一种高质量的用户推荐系统 |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4637797A (en) * | 1985-01-11 | 1987-01-20 | Access Learning Technology Corporation | Software training system |
US4845665A (en) * | 1985-08-26 | 1989-07-04 | International Business Machines Corp. | Simulation of computer program external interfaces |
US5086393A (en) * | 1986-03-10 | 1992-02-04 | International Business Machines Corp. | System for testing human factors and performance of a system program |
US4772206A (en) * | 1986-03-10 | 1988-09-20 | International Business Machines Corporation | Multi-mode teaching simulator |
US4941829A (en) * | 1987-12-23 | 1990-07-17 | International Business Machines Corporation | Method for providing a dynamic tutorial display |
US5211564A (en) * | 1989-07-19 | 1993-05-18 | Educational Testing Service | Computerized figural response testing system and method |
-
1994
- 1994-12-06 EP EP95905852A patent/EP0685084A1/fr not_active Withdrawn
- 1994-12-06 CA CA002156058A patent/CA2156058A1/fr not_active Abandoned
- 1994-12-06 WO PCT/US1994/013853 patent/WO1995016949A1/fr not_active Application Discontinuation
Non-Patent Citations (1)
Title |
---|
See references of WO9516949A1 * |
Also Published As
Publication number | Publication date |
---|---|
CA2156058A1 (fr) | 1995-06-22 |
WO1995016949A1 (fr) | 1995-06-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US4696003A (en) | System for testing interactive software | |
US9342437B2 (en) | Backward post-execution software debugger | |
US5086393A (en) | System for testing human factors and performance of a system program | |
US8914777B2 (en) | Forward post-execution software debugger | |
US4772206A (en) | Multi-mode teaching simulator | |
US7653899B1 (en) | Post-execution software debugger with performance display | |
US8924912B2 (en) | Method of recording and replaying call frames for a test bench | |
Kieras | Model-based evaluation | |
Kieras | A guide to GOMS model usability evaluation using NGOMSL | |
US8584097B2 (en) | Post-execution software debugger with event display | |
US5220658A (en) | System for testing a performance of user interactive-commands using an emulator-overlay for determining the progress of the user timing response | |
CN111143200A (zh) | 触摸事件的录制及回放方法、装置、存储介质及设备 | |
Cheng | Trace-driven system modeling | |
US8015552B1 (en) | Post-execution software debugger with coverage display | |
US5513316A (en) | Method and apparatus for exercising an integrated software system | |
EP0685084A1 (fr) | Collecte automatique et affichage graphique des donnees de tests d'ergonomie | |
US8078590B2 (en) | Data processing system | |
CN114897296A (zh) | Rpa流程标注方法、执行过程回放方法及存储介质 | |
Schulman | Hardware measurement device for IBM system/360 time sharing evaluation | |
EP0236746B1 (fr) | Système et méthode de test de facteurs humains | |
CA1293057C (fr) | Systeme de test des facteurs humains et de la performance d'un programme informatique | |
JP2530841B2 (ja) | プログラム性能測定方法 | |
JPH02195448A (ja) | 命令トレース装置 | |
Haag | A Logic State Analyzer for Evaluating Complex State Flow | |
JPH04123237A (ja) | 入力履歴記録再生方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN |
|
17P | Request for examination filed |
Effective date: 19950911 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LI LU MC NL PT SE |
|
18W | Application withdrawn |
Withdrawal date: 19951107 |