WO2022180863A1 - User operation recording device and user operation recording method - Google Patents

User operation recording device and user operation recording method

Info

Publication number
WO2022180863A1
WO2022180863A1 (PCT application PCT/JP2021/007580)
Authority
WO
WIPO (PCT)
Prior art keywords
user operation
delay time
unit
estimation
screen
Prior art date
Application number
PCT/JP2021/007580
Other languages
French (fr)
Japanese (ja)
Inventor
史拓 横瀬
公雄 土川
佐也香 八木
有記 卜部
美沙 深井
晴夫 大石
Original Assignee
日本電信電話株式会社
Priority date
Filing date
Publication date
Application filed by 日本電信電話株式会社 filed Critical 日本電信電話株式会社
Priority to PCT/JP2021/007580 priority Critical patent/WO2022180863A1/en
Publication of WO2022180863A1 publication Critical patent/WO2022180863A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/34Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment

Definitions

  • the present invention relates to a user operation recording device and a user operation recording method.
  • Software that records user operations on a PC for purposes such as business analysis is known (see Non-Patent Document 1). Such software records the time, user name, application name, etc., together with a screen capture at the time of operation, and these records are used to analyze the operation status.
  • However, the time from when the mouse or keyboard is operated until the GUI display changes depends on the performance of the terminal, the execution status of the application, and, in the case of remote desktop, network delays and the like.
  • In particular, the amount of delay varies greatly depending on the communication quality. Conventionally, this timing was adjusted manually, but since manual adjustment cannot follow changes in the situation, screen captures at the time of operation could not be acquired properly.
  • The present invention has been made in view of the above, and an object thereof is to provide a user operation recording device and a user operation recording method capable of acquiring screen captures at appropriate timing.
  • To solve the above problem, the user operation recording device of the present invention includes: a detection unit that detects a user operation on an information processing device; an estimation unit that selects, according to the type of application executed by the information processing device, an estimation method for estimating a delay time from when the user operation is detected by the detection unit until the screen of the information processing device changes due to the user operation, and estimates the delay time using the selected estimation method; and an acquisition unit that acquires image data by capturing the screen based on the delay time estimated by the estimation unit.
  • A user operation recording method of the present invention is a user operation recording method executed by a user operation recording device, comprising: a detection step of detecting a user operation on an information processing device; an estimation step of selecting, according to the type of application executed by the information processing device, an estimation method for estimating a delay time from when the user operation is detected by the detection step until the screen of the information processing device changes due to the user operation, and estimating the delay time using the selected estimation method; and an acquisition step of acquiring image data by capturing the screen based on the delay time estimated by the estimation step.
  • FIG. 1 is a diagram illustrating the configuration of a system including a user operation recording device of this embodiment.
  • FIG. 2 is a block diagram illustrating a schematic configuration of the user operation recording device of this embodiment.
  • FIG. 3 is a diagram illustrating an example of a table stored in an estimation method storage unit.
  • FIG. 4 is a diagram for explaining estimation processing by an estimation unit.
  • FIG. 5 is a diagram for explaining estimation processing by the estimation unit.
  • FIG. 6 is a diagram for explaining estimation processing by the estimation unit.
  • FIG. 7 is a diagram for explaining statistical processing by an estimating unit.
  • FIG. 8 is a flowchart illustrating an example of a processing procedure of user operation recording processing.
  • FIG. 9 is a diagram illustrating an example of a computer that executes a user operation recording program.
  • FIG. 1 is a diagram illustrating the configuration of a system including a user operation recording device of this embodiment.
  • a system including a user operation recording device 10 has a user operation recording device 10 and a server 20, and the user operation recording device 10 and the server 20 are connected via a network.
  • In the following, a case in which a user performs work by operating the in-house server 20 from a local terminal (the user operation recording device 10) will be described as an example; however, the present invention is not limited to a remote environment. Note that the configuration shown in FIG. 1 is merely an example, and the specific configuration and the number of each device are not particularly limited.
  • the user operation recording device 10 is an information processing device (user terminal) used by the user.
  • The user operation recording device 10 may be any type of information processing device, including client devices such as smartphones, desktop PCs, notebook PCs, and tablet PCs.
  • The user operation recording device 10, which is a user terminal, has a function for performing the user operation recording processing. In addition, a function that transmits information input through the user's input devices, such as a mouse and keyboard, to the server 20 and displays the screen display information received from the server 20 on the screen is in operation.
  • This function is generally implemented as an application within the user terminal, and is displayed as a window like other applications within the user terminal.
  • the screen display information received from the server 20 may be the entire virtual desktop on the server 20 side, or may be a window unit within the virtual desktop.
  • When the server 20 receives the operation information of the mouse, keyboard, etc., it performs the corresponding operation on the application actually used by the user, and transmits the screen display information generated on the virtual desktop in the server to the user operation recording device 10.
  • the user operation recording device 10 estimates the response delay time of the GUI operation, and acquires GUI information such as screen capture at the optimum timing using the estimated response delay time.
  • FIG. 2 is a block diagram illustrating a schematic configuration of the user operation recording device of this embodiment.
  • The user operation recording device 10 of this embodiment is realized by a general-purpose computer such as a personal computer, and includes an input unit 11, an output unit 12, a communication control unit 13, a storage unit 14, and a control unit 15.
  • the input unit 11 is implemented using an input device such as a keyboard and a mouse, and inputs various instruction information such as processing start to the control unit 15 in response to input operations by the operator.
  • the output unit 12 is implemented by a display device such as a liquid crystal display, a printing device such as a printer, or the like.
  • The output unit 12 displays the screen on which user operations are reflected, which is the target of the user operation recording processing described later.
  • the communication control unit 13 is realized by a NIC (Network Interface Card) or the like, and controls communication between an external device and the control unit 15 via an electrical communication line such as a LAN (Local Area Network) or the Internet.
  • the communication control unit 13 controls communication between the control unit 15 and a management device or the like that manages various types of information regarding a user operation detection method, application processing, and the like.
  • Note that the screen on which the user's operation is reflected may be implemented in hardware different from the user operation recording device 10. In that case, the user operation recording device 10 communicates with the terminal operated by the user via the communication control unit 13.
  • the storage unit 14 stores data and programs necessary for various processes by the control unit 15, and has an estimation method storage unit 14a, an image data storage unit 14b, and a log storage unit 14c.
  • the storage unit 14 is a semiconductor memory device such as RAM (Random Access Memory) or flash memory, or a storage device such as a hard disk or optical disk.
  • the estimation method storage unit 14a stores an estimation method for estimating the delay time from when a user operation is detected until the screen changes due to the user operation.
  • the estimation method storage unit 14a stores a table indicating which estimation method is to be applied for each window application.
  • FIG. 3 is a diagram illustrating an example of a table stored in the estimation method storage unit.
  • As illustrated in FIG. 3, the estimation method storage unit 14a stores, in association with one another, a "serial number", a "process name" that is the name of the application (process) displaying a window, a "window title" that is the title of the window, and a "detection method type" indicating the type of delay time estimation method to be applied.
  • the table data shown in FIG. 3 is prepared so that the estimation unit 15b, which will be described later, can select an estimation method. Note that this method is the simplest method, and other more sophisticated methods may be used.
  • the detection method (1) here is a method of estimating the delay time by measuring the processing time of the GUI event.
  • the detection method (2) is a method of estimating the delay time by reading changes in the screen display caused by a dedicated application executed on the remote desktop server 20 side.
  • the detection method (3) is a method of estimating the delay time by reading changes in the screen display at the clicked position.
  • The process name "RemoteDesktopClient.exe" illustrated in FIG. 3 identifies a window displayed by the remote desktop application, and it is assumed that the screen information generated on the remote server 20 is displayed in that window. Also, as shown in FIG. 3, detection method (2) assumes that a dedicated application is running on the remote server, so the IP address of the connection destination is determined from the window title; it is assumed that the address of the connection destination is displayed in the window title.
  • "*" in the table means a wild card.
  • For serial number 3 in the example of FIG. 3, both the process name and the window title match all patterns, so detection method (1) is determined to be optimal for all windows that do not match the first and second rows.
  • In this embodiment, one of the detection methods (1) to (3) is used whenever possible, but there may be windows to which none of the detection methods applies. In that case, the response delay time is regarded as constant.
  • The constant value is either no delay (0 ms) or a value given by a setting file or the like.
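The first-match table lookup described above can be sketched in a few lines. This is a hypothetical reproduction, not the patent's implementation: the concrete row contents (the window-title pattern with an IP-address fragment) and the use of `fnmatch`-style wildcard matching are illustrative assumptions.

```python
import fnmatch

# Hypothetical contents modeled on the FIG. 3 description; the concrete
# patterns (e.g. the IP-address fragment) are illustrative assumptions.
ESTIMATION_TABLE = [
    # (serial number, process-name pattern, window-title pattern, method type)
    (1, "RemoteDesktopClient.exe", "*192.168.*", 2),
    (2, "RemoteDesktopClient.exe", "*", 3),
    (3, "*", "*", 1),   # "*" is a wildcard; this row matches everything else
]

def select_estimation_method(process_name: str, window_title: str):
    """Compare rows in order of serial number and return the detection-method
    type of the first row whose two patterns both match, or None."""
    for _serial, proc_pat, title_pat, method in ESTIMATION_TABLE:
        if (fnmatch.fnmatchcase(process_name, proc_pat)
                and fnmatch.fnmatchcase(window_title, title_pat)):
            return method
    return None   # no row matched: fall back to a constant response delay

method = select_estimation_method("RemoteDesktopClient.exe", "192.168.0.5 - Remote")
```

Because rows are scanned in serial-number order, the catch-all last row plays the role of the wildcard row in FIG. 3: any window not matched earlier is assigned detection method (1).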
  • the image data storage unit 14b stores image data acquired by screen capture by the acquisition unit 15c, which will be described later.
  • the log storage unit 14c stores log information in which an operation log including details of user operations and the like is associated with image data in which changes due to the user operations are reflected.
  • the control unit 15 has an internal memory for storing programs defining various processing procedures and required data, and executes various processing using these.
  • the control unit 15 has a detection unit 15a, an estimation unit 15b, an acquisition unit 15c, and a log creation unit 15d.
  • the control unit 15 is an electronic circuit such as a CPU (Central Processing Unit) or MPU (Micro Processing Unit), or an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array).
  • the detection unit 15a detects user operations on the user operation recording device 10 (information processing device). Specifically, the detection unit 15a detects the user's operation using various detection methods. Further, the detection unit 15a assigns an ID or the like for individually identifying user operations detected among the processing units.
  • the detection unit 15a uses the OS function to detect mouse and keyboard operations. This method of detecting mouse and keyboard operations is generally a high-speed method with very little delay, but it is not possible to determine how the OS and applications are affected by the operation.
  • the detection unit 15a monitors GUI messages, which are exchanges between a GUI system having a window mechanism and an application, to detect user operations. For example, the detection unit 15a detects appearance, disappearance, frontmost (activation), size change, and the like of a window. The detection unit 15a also detects that a button, text box, or the like for standard GUI control in the GUI system has been clicked.
  • the detection unit 15a detects a user operation, for example, using an API uniquely provided by the application.
  • the detection unit 15a can generally detect a button click or the like that occurs in a web page displayed on a web browser that has an API capable of detecting user operations. This detection method generally has a large delay.
  • the detection unit 15a detects a user operation by monitoring the content of communication between an application and a server or the like via a network, for example.
  • the operation detection method executed by the detection unit 15a is not limited to the above, and other more advanced operation detection methods may be used.
  • The estimation unit 15b selects, according to the type of application executed by the user operation recording device 10 (information processing device), an estimation method for estimating the delay time from when the user operation is detected by the detection unit 15a until the screen of the user operation recording device 10 (information processing device) changes due to the user operation, and estimates the delay time using the selected estimation method.
  • Specifically, the estimation unit 15b refers to the table stored in the estimation method storage unit 14a and identifies the estimation method using the process name of the application (process) displaying the window and the window title of that window. These two pieces of information are available in many OSs and can be easily obtained by calling APIs provided by the OS. In the example of the table shown in FIG. 3, the estimation unit 15b performs comparisons in order from the row with the lowest serial number and adopts the first matching row. Delay estimation processing by the detection methods (1) to (3) will be described below with reference to FIGS. 4 to 6, respectively. FIGS. 4 to 6 are diagrams for explaining the estimation processing by the estimation unit.
  • In detection method (1), the estimation unit 15b generates a GUI event for a window on the screen, measures the time from generation of the GUI event until processing of the GUI event is completed, and estimates this time as the delay time.
  • the application has an event loop (also called a message loop) that processes GUI events.
  • the event loop is processed on an application basis, a window basis, a thread basis, or the like, but in this embodiment, it is assumed that there is an event loop for each window. Even in the case of a GUI system in which event loops are prepared for each application and each thread, it is often considered that events are processed for each window.
  • Mouse and keyboard inputs are also notified to each window as GUI events, and the GUI display on the screen is updated in response.
  • the event loop is usually single-threaded, and if response delays occur, it also affects the processing of events other than mouse and keyboard events.
  • Using an API of the OS, it is possible to generate a GUI event (message) from another process (application) for a window on the desktop.
  • the estimating unit 15b uses this to measure the time from the generation of some GUI event to the completion of processing of the event for the window whose response delay is to be checked, and sets it as the response delay time.
  • For example, the estimation unit 15b generates a GUI event by causing the response delay estimation process to transmit an event to the target process (application) using the API of the OS. The estimation unit 15b then measures the time until it receives the event processing result indicating that processing of the event has been completed, and estimates the measured time as the delay time.
  • the estimation unit 15b applies the detection method (1) when an application other than a remote desktop application is the target application.
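Detection method (1) can be illustrated with a small, platform-neutral simulation. A real implementation would post a message to the target window via an OS API (e.g., something like `SendMessageTimeout` on Windows); here, as an assumption of this sketch, a Python queue and a worker thread stand in for the window's single-threaded event loop.

```python
import queue
import threading
import time

def run_event_loop(event_queue: queue.Queue) -> None:
    """Stand-in for a window's single-threaded event loop: events are
    processed one at a time, so a backlog delays every later event."""
    while True:
        handler = event_queue.get()
        if handler is None:          # sentinel: shut the loop down
            break
        handler()

def measure_gui_event_delay(event_queue: queue.Queue) -> float:
    """Post a no-op probe event and measure the time until the event loop
    has processed it -- the idea behind detection method (1)."""
    done = threading.Event()
    start = time.perf_counter()
    event_queue.put(done.set)        # the probe "GUI event"
    done.wait()                      # returns once the loop ran the probe
    return time.perf_counter() - start

q: queue.Queue = queue.Queue()
threading.Thread(target=run_event_loop, args=(q,), daemon=True).start()
q.put(lambda: time.sleep(0.05))      # simulate a slow event ahead of the probe
delay = measure_gui_event_delay(q)   # roughly 0.05 s or more
q.put(None)                          # stop the simulated event loop
```

Because the loop is single-threaded, the probe cannot complete until the backlog ahead of it drains; the measured time therefore reflects the window's current responsiveness, as the description above requires.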
  • As another estimation method, the estimation unit 15b targets, for example, a server 20 that executes the user's operations on a virtual desktop and transmits screen display information as the execution result. In detection method (2), the estimation unit 15b measures the time from when the detection unit 15a detects the user operation until the screen changes, and estimates the measured time as the delay time.
  • In detection method (2), an application that monitors mouse and keyboard operation information received on the server 20 side, encodes the detected operation information each time it is detected, and reflects it in part of the screen display information sent from the server 20 is executed on the desktop of the server 20. The operation information received by the server 20 can be monitored by the same method as that used by the detection unit 15a of the user operation recording device 10.
  • The image displayed on the screen uses, for example, a two-dimensional code. Since a two-dimensional code occupies an area and stands out, a color bar code using only the pixels at the edges of the screen display information transmitted from the server 20, or a digital watermark, may be used instead so as not to hide the display that the user needs to see. This image is updated each time the server 20 receives operation information.
  • the two-dimensional code includes coordinate information, button type and up/down information for a mouse, and key type and up/down information for a keyboard.
  • The two-dimensional code may contain not only information about the latest operation but also information about multiple operations as an operation history. This is because, when the user performs a plurality of operations in succession within a short period, the two-dimensional code would otherwise change frequently and might not be read properly.
  • For example, when the two-dimensional code includes information about a plurality of operations as an operation history, it may also include the sequence number attached to each operation and information about the time difference between the server 20 detecting each operation and the corresponding update of the two-dimensional code.
  • The program for detecting the delay time in the user operation recording device 10 detects mouse and keyboard operation information and accumulates the history along with the detection time. Furthermore, the user operation recording device 10 periodically monitors the display of the two-dimensional code on the remote desktop application. Then, the user operation recording device 10 detects, from the information included in the two-dimensional code, the mouse and keyboard operation information received by the server 20, compares it with the history of the operation information generated in the user operation recording device 10, and estimates the delay. That is, the user operation recording device 10 compares the detection time of each operation accumulated as a history with the time at which a change in the display of the two-dimensional code is detected for the same operation, calculates the difference between the two, and estimates the difference as the delay time.
  • At this time, the user operation recording device 10 needs to monitor changes in the image on the remote desktop application at sufficiently short intervals. Since the display area of the two-dimensional code or the like to be monitored is sufficiently small, this can be realized with a small processing load. According to the mechanism of detection method (2), the response delay from when the user operates the mouse or keyboard, through the information being sent to the server 20, until it is actually reflected on the screen display of the user operation recording device 10 can be estimated.
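The history comparison at the heart of detection method (2) can be sketched as follows. The sequence numbers and timestamps are hypothetical, and decoding of the two-dimensional code itself is assumed to happen elsewhere; this class only matches a decoded operation against the local history and takes the timestamp difference.

```python
class RemoteDelayEstimator:
    """Sketch of detection method (2): keep a local history of detected
    operations and, when the same operation reappears in the 2D code
    rendered by the server, take the timestamp difference as the delay."""

    def __init__(self) -> None:
        self._history: dict[int, float] = {}   # sequence number -> local detect time

    def on_local_operation(self, seq: int, detect_time: float) -> None:
        """Record a mouse/keyboard operation detected on the local terminal."""
        self._history[seq] = detect_time

    def on_code_update(self, seq: int, read_time: float):
        """Called when the monitored 2D code shows that the server received
        operation `seq`; returns the estimated response delay, or None."""
        sent = self._history.pop(seq, None)
        if sent is None:
            return None                        # decoded an unknown operation
        return read_time - sent                # estimated response delay

est = RemoteDelayEstimator()
est.on_local_operation(seq=1, detect_time=10.00)
delay = est.on_code_update(seq=1, read_time=10.35)   # about 0.35 s
```

Carrying the sequence number through the code, as the description above suggests, is what makes the pairing unambiguous even when several operations arrive in quick succession.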
  • In detection method (3), when the detection unit 15a detects a click operation as a user operation, the estimation unit 15b monitors changes in an area within a predetermined range of the click position, measures the time from detection of the click operation by the detection unit 15a until a change occurs in the area, and estimates the measured time as the delay time.
  • Detection method (3) is a method capable of estimating the delay time without adding any special function on the server 20 side. It uses only click information, utilizing the property that, in many cases, the display changes at the clicked position when a click operation is performed.
  • When the user operation recording device 10 detects a click operation on the remote desktop application within its own device, it monitors a small area centered on the clicked position, measures the time from the moment of the click until the image changes, and uses this as the response delay time. The user operation recording device 10 needs to monitor changes in the image on the remote desktop application at sufficiently short intervals, but since the area to be monitored is sufficiently small, this can be implemented with a small processing load.
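Detection method (3) reduces to snapshot-and-poll logic, sketched below under the assumption that a caller-supplied `grab_region` function returns the contents of the small monitored area (in a real implementation this would come from a screen-capture API; the function name and the simulated region are illustrative).

```python
import time

def measure_click_delay(grab_region, poll_interval: float = 0.005,
                        timeout: float = 3.0):
    """Snapshot the small area around the click position, then poll until
    its contents change; the elapsed time is one response-delay sample."""
    baseline = grab_region()                 # image at the moment of the click
    start = time.perf_counter()
    while True:
        elapsed = time.perf_counter() - start
        if grab_region() != baseline:
            return elapsed                   # the display changed at the click position
        if elapsed >= timeout:
            return None                      # no change observed: discard this sample
        time.sleep(poll_interval)

# Simulated screen region: its content "changes" 50 ms after the click.
_click_time = time.perf_counter()
delay = measure_click_delay(lambda: time.perf_counter() - _click_time > 0.05)
```

The timeout corresponds to abandoning a single measurement that exceeds the threshold, which matches the statistical treatment described later: a click that causes no visible change simply yields no usable sample.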
  • the user operation recording device 10 may perform statistical processing to estimate more accurate values because there is a possibility that an image will not change due to a click or that an image will change regardless of the click.
  • Statistical processing may be performed not only for the detection method (3) but also for the detection methods (1) and (2). In other words, in any detection method, a more appropriate value can be estimated by statistical processing because variations occur in single measurements.
  • the estimation unit 15b may aggregate a plurality of delay times estimated by the same estimation method for each predetermined time range, and estimate the median value of the time range with the highest frequency as the delay time.
  • FIG. 7 is a diagram for explaining statistical processing by an estimating unit.
  • the estimating unit 15b subjects data within a certain time range from the current time to the past for statistical processing. For example, the measurement results for the past 10 minutes are statistically processed to obtain the current response delay time. Then, the estimating unit 15b removes a single delay time measurement value that is equal to or greater than a threshold value. For example, measured values of 3 seconds or longer are removed as singular values. This is the same even if the measurement is discontinued when the threshold value is exceeded in the single measurement process.
  • the estimation unit 15b aggregates the remaining measured values for each time range. For example, the measured values are aggregated with a time width of 0.2 seconds. Then, as illustrated in FIG. 7, the estimation unit 15b estimates that the true value of the response delay time is included in the time range with the highest frequency, and adopts the median value of that time range as the delay time. Note that the estimation unit 15b is not limited to the above processing, and may perform more advanced statistical processing.
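The statistical processing just described (singular-value removal at a threshold, aggregation into 0.2-second bins, and adoption of the median of the most frequent time range) can be written compactly. The sample values are made up for illustration, and the "median of the time range" is interpreted here as the midpoint of the winning bin.

```python
def estimate_delay(samples, outlier_threshold: float = 3.0,
                   bin_width: float = 0.2):
    """Drop singular measurements at or above the threshold, aggregate the
    rest into fixed-width time bins, and return the midpoint (median) of
    the most frequent bin as the delay estimate."""
    kept = [s for s in samples if s < outlier_threshold]
    if not kept:
        return None                          # no usable measurements
    counts: dict[int, int] = {}
    for s in kept:
        b = int(s // bin_width)              # index of the bin containing s
        counts[b] = counts.get(b, 0) + 1
    best = max(counts, key=counts.get)       # time range with the highest frequency
    return (best + 0.5) * bin_width          # median (midpoint) of that range

samples = [0.41, 0.45, 0.47, 0.52, 1.9, 4.2]   # 4.2 s removed as a singular value
delay = estimate_delay(samples)                 # modal bin is 0.4-0.6 s
```

Taking the modal bin rather than a mean is what suppresses both the occasional click that changes nothing and the occasional screen change unrelated to the click, since those samples scatter across many bins.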
  • Note that a click may not affect the screen display at all; in this case, the measured value is removed by the threshold.
  • changes in the screen display that are not caused by mouse or keyboard operations may be erroneously detected, but since the changes vary widely over time, the effects can be eliminated by selecting a time range with a high frequency.
  • the estimating unit 15b may store the delay time for each application in the storage unit 14 after performing statistical processing to estimate the delay time.
  • the acquisition unit 15c acquires image data by capturing the screen based on the delay time estimated by the estimation unit 15b. For example, the acquiring unit 15c acquires the image data of the screen after the delay time estimated by the estimating unit 15b has elapsed since the detection unit 15a detected the user's operation. Then, the acquisition unit 15c stores the acquired image data in the image data storage unit 14b.
  • The acquisition unit 15c may also acquire image data by capturing the screen operated by the user at predetermined intervals. In this case, the acquisition unit 15c selects, from the acquired image data, the image data of the screen after the delay time estimated by the estimation unit 15b has elapsed since the detection unit 15a detected the user operation, and stores it in the image data storage unit 14b. Further, when the delay time for each application is stored in the storage unit 14, the acquisition unit 15c may read the delay time according to the target application whose window is being displayed, acquire the image data, and store it in the image data storage unit 14b.
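One way to realize the periodic-capture variant described above is to keep timestamped frames and pick the first frame captured at or after the operation time plus the estimated delay. The function name and the sample data below are illustrative.

```python
def select_capture(captures, operation_time: float, delay: float):
    """From periodically captured frames [(timestamp, image), ...] in
    chronological order, pick the first frame taken at or after
    operation_time + estimated delay -- the post-change screen image."""
    target = operation_time + delay
    for timestamp, image in captures:
        if timestamp >= target:
            return image
    return None   # no frame captured late enough yet

captures = [(0.0, b"before"), (0.3, b"still-before"), (0.6, b"after")]
img = select_capture(captures, operation_time=0.1, delay=0.4)   # target time 0.5
```

Selecting from an existing frame buffer like this covers the "looking back in the past" case; waiting until `operation_time + delay` before capturing covers the future-acquisition case mentioned later in the description.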
  • the log creation unit 15d associates the selected image data with the operation log and outputs them.
  • Specifically, the log creation unit 15d stores, as a log, a text file in which each line corresponds to each detected operation in the log storage unit 14c of the storage unit 14, and also stores each piece of image data corresponding to each detected operation as an image file in the log storage unit 14c, with identification information added to the file name or the like so that the correspondence between each line of the text file and each image file is known.
  • Note that the log creation unit 15d may output the operation log and the image data as single binary data. Further, instead of storing the log in the log storage unit 14c of the storage unit 14, the log creation unit 15d may transmit it to another server or the like via the communication control unit 13.
  • FIG. 8 is a flowchart illustrating an example of a processing procedure of user operation recording processing.
  • When the detection unit 15a detects a user operation, the estimation unit 15b selects, according to the type of application displaying the screen, an estimation method for estimating the delay time from detection of the user operation until the screen changes due to the user operation (step S102).
  • Next, the estimation unit 15b estimates the delay time using the selected estimation method (step S103).
  • the acquiring unit 15c acquires image data by capturing the screen operated by the user based on the delay time estimated by the estimating unit 15b (step S104).
  • Then, the log creation unit 15d associates and records the selected image data and the operation log (step S105). For example, the log creation unit 15d stores, as a log, a text file in which each line corresponds to each detected operation in the log storage unit 14c of the storage unit 14, and also stores each piece of image data corresponding to each detected operation as an image file in the log storage unit 14c, with identification information added to the file name or the like so that the correspondence between each line of the text file and each image file is known.
  • As described above, the user operation recording device 10 detects a user operation, selects, according to the type of application displaying the screen, an estimation method for estimating the delay time from detection of the user operation until the screen changes due to the user operation, and estimates the delay time using the selected estimation method. Based on the estimated delay time, the user operation recording device 10 acquires image data by capturing the screen operated by the user. Therefore, the user operation recording device 10 can acquire screen captures at appropriate timing.
  • Note that acquiring screen captures is not limited to looking back into the past; there are also cases where it is necessary to acquire an image in the future, that is, after waiting for a certain amount of time after an operation is detected.
  • Besides screen captures, various other types of GUI information (window titles, etc.) can be acquired in the same manner.
  • the user operation recording device 10 can estimate the response delay time of the GUI operation according to the state change of the computer or network at that time. By using this estimated delay time, acquisition of various GUI information including screen capture can be performed at optimum timing.
  • The process of obtaining the delay time of each window may always be performed in parallel for all windows on the desktop; however, since many OSs have only one window that accepts mouse and keyboard input (called the active window or the like), it is often sufficient to process only the active window. Therefore, in the present embodiment, processing may be performed only for the active window.
  • The user operation recording device 10 may retain delay information for windows other than the active window in a memory or the like. For example, the user operation recording device 10 secures an area for holding information about a window at the timing when the window is newly displayed, obtains the response delay while the window is active, and rewrites the information in the held area. When the window is destroyed, its area may be released accordingly.
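The per-window bookkeeping described here maps naturally onto a small cache keyed by window handle; the class and method names below are hypothetical, and the default delay stands in for the constant response delay used when no estimate exists.

```python
class WindowDelayCache:
    """Sketch of per-window delay retention: an entry is allocated when a
    window appears, updated while the window is active, and freed when
    the window is destroyed."""

    def __init__(self, default_delay: float = 0.0) -> None:
        self._delays: dict[int, float] = {}    # window handle -> last estimate
        self._default = default_delay

    def on_window_created(self, hwnd: int) -> None:
        self._delays.setdefault(hwnd, self._default)

    def on_delay_measured(self, hwnd: int, delay: float) -> None:
        if hwnd in self._delays:               # only while the window exists
            self._delays[hwnd] = delay

    def on_window_destroyed(self, hwnd: int) -> None:
        self._delays.pop(hwnd, None)           # release the held area

    def delay_for(self, hwnd: int) -> float:
        return self._delays.get(hwnd, self._default)

cache = WindowDelayCache(default_delay=0.1)
cache.on_window_created(1001)
cache.on_delay_measured(1001, 0.25)
```

When the capture logic needs a delay for an inactive window, `delay_for` returns the last value measured while that window was active, falling back to the constant default otherwise.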
  • Each component of the user operation recording device 10 shown in FIG. 1 is functionally conceptual and does not necessarily need to be physically configured as shown. That is, the specific form of distribution and integration of the functions of the user operation recording device 10 is not limited to the illustrated one, and all or part of them can be functionally or physically distributed or integrated in arbitrary units.
  • each process performed in the user operation recording device 10 may be realized entirely or in part by a CPU and a program that is analyzed and executed by the CPU. Further, each process performed in the user operation recording device 10 may be implemented as hardware based on wired logic.
  • the user operation recording device 10 can be implemented by installing a user operation recording program for executing the user operation recording process as package software or online software in a desired computer.
  • the information processing device can function as the user operation recording device 10 by causing the information processing device to execute the user operation recording program.
  • the information processing apparatus referred to here includes a desktop or notebook personal computer.
  • information processing devices include smart phones, mobile communication terminals such as mobile phones and PHSs (Personal Handyphone Systems), and slate terminals such as PDAs (Personal Digital Assistants).
  • the functions of the user operation recording device 10 may be implemented in a cloud server.
  • FIG. 9 is a diagram showing an example of a computer that executes a user operation recording program.
  • Computer 1000 includes, for example, a memory 1010, a CPU 1020, a hard disk drive interface 1030, a disk drive interface 1040, a serial port interface 1050, a video adapter 1060, and a network interface 1070. These units are connected by a bus 1080.
  • The memory 1010 includes a ROM (Read Only Memory) 1011 and a RAM 1012.
  • The ROM 1011 stores a boot program such as a BIOS (Basic Input Output System).
  • The hard disk drive interface 1030 is connected to a hard disk drive 1031.
  • The disk drive interface 1040 is connected to a disk drive 1041.
  • A removable storage medium such as a magnetic disk or an optical disk is inserted into the disk drive 1041, for example.
  • A mouse 1051 and a keyboard 1052 are connected to the serial port interface 1050, for example.
  • A display 1061 is connected to the video adapter 1060.
  • The hard disk drive 1031 stores, for example, an OS 1091, application programs 1092, program modules 1093, and program data 1094. Each piece of information described in the above embodiment is stored in the hard disk drive 1031 or the memory 1010, for example.
  • the user operation recording program is stored in the hard disk drive 1031 as a program module 1093 in which commands to be executed by the computer 1000 are described, for example.
  • the hard disk drive 1031 stores a program module 1093 that describes each process executed by the user operation recording apparatus 10 described in the above embodiment.
  • Data used for information processing by the user operation recording program is stored as program data 1094 in the hard disk drive 1031, for example. Then, the CPU 1020 reads out the program module 1093 and the program data 1094 stored in the hard disk drive 1031 to the RAM 1012 as necessary, and executes each procedure described above.
  • program modules 1093 and program data 1094 related to the user operation recording program are not limited to being stored in the hard disk drive 1031.
  • For example, they may be stored in a removable storage medium and read out by the CPU 1020 via the disk drive 1041 or the like.
  • Alternatively, the program module 1093 and program data 1094 related to the user operation recording program may be stored in another computer connected via a network such as a LAN or WAN (Wide Area Network) and read out by the CPU 1020 via the network interface 1070.


Abstract

This user operation recording device (10) detects a user operation, selects, according to the type of application being executed, an estimation method for estimating the delay time from when the user operation is detected until the screen changes due to the user operation, and estimates the delay time using the estimation method. The user operation recording device (10) then captures the screen operated by the user on the basis of the estimated delay time and acquires image data.

Description

User operation recording device and user operation recording method
 The present invention relates to a user operation recording device and a user operation recording method.
 Software that records user operations on a PC for purposes such as business analysis is known (see Non-Patent Document 1). Such software records the time, the user name, the application name, and the like, together with a screen capture at the time of the operation, and these records are used for analyzing the operation status.
 However, with the conventional technology, the screen is captured immediately after a user operation is detected, so it can be difficult to obtain an appropriate screen capture of the operation, for example, in situations where it takes time from when the user performs the operation until the accompanying screen change occurs.
 Large delays between the user's operation of an input device such as a mouse or keyboard and the resulting screen change, such as a change on the GUI, can occur, for example, when the processing load on the PC is high, when the operated application is performing heavy processing other than GUI processing, or when communication delays affect a thin client or remote desktop (hereinafter collectively referred to as remote desktop).
 In such cases, the time from when the mouse or keyboard is operated until the GUI display changes (the response delay time) varies greatly depending on the performance of the terminal, the execution state of the application, and, for remote desktop, the communication quality such as network delay. Conventionally, this timing was adjusted manually, but manual adjustment cannot follow changes in the situation, so an appropriate screen capture at the time of the operation could not be obtained.
 The present invention has been made in view of the above, and an object of the present invention is to provide a user operation recording device and a user operation recording method capable of acquiring screen captures at appropriate timing.
 In order to solve the above-described problems and achieve the object, a user operation recording device of the present invention includes: a detection unit that detects a user operation on an information processing device; an estimation unit that selects, according to the type of application executed on the information processing device, an estimation method for estimating a delay time from when the user operation is detected by the detection unit until the screen of the information processing device changes due to the user operation, and that estimates the delay time using the estimation method; and an acquisition unit that performs a screen capture of the screen on the basis of the delay time estimated by the estimation unit to acquire image data.
 Further, a user operation recording method of the present invention is a user operation recording method executed by a user operation recording device, and includes: a detection step of detecting a user operation on an information processing device; an estimation step of selecting, according to the type of application executed on the information processing device, an estimation method for estimating a delay time from when the user operation is detected in the detection step until the screen of the information processing device changes due to the user operation, and of estimating the delay time using the estimation method; and an acquisition step of performing a screen capture of the screen on the basis of the delay time estimated in the estimation step to acquire image data.
 According to the present invention, it is possible to acquire screen captures at appropriate timing.
FIG. 1 is a diagram illustrating the configuration of a system including the user operation recording device of the present embodiment. FIG. 2 is a block diagram illustrating a schematic configuration of the user operation recording device of the present embodiment. FIG. 3 is a diagram showing an example of a table stored in the estimation method storage unit. FIG. 4 is a diagram for explaining estimation processing by the estimation unit. FIG. 5 is a diagram for explaining estimation processing by the estimation unit. FIG. 6 is a diagram for explaining estimation processing by the estimation unit. FIG. 7 is a diagram for explaining statistical processing by the estimation unit. FIG. 8 is a flowchart showing an example of the processing procedure of the user operation recording process. FIG. 9 is a diagram showing an example of a computer that executes a user operation recording program.
 Embodiments of a user operation recording device and a user operation recording method according to the present application will be described in detail below with reference to the drawings. Note that the present invention is not limited to the embodiments described below.
[System configuration including user operation recording device]
 FIG. 1 is a diagram illustrating the configuration of a system including the user operation recording device of the present embodiment. As shown in FIG. 1, the system includes the user operation recording device 10 and a server 20, and the user operation recording device 10 and the server 20 are connected via a network. In the example of FIG. 1, a case is described in which a local terminal (the user operation recording device 10) at home or the like is remotely connected to the in-house server 20 to carry out work, but the invention is not limited to a remote environment. Note that the configuration shown in FIG. 1 is merely an example, and the specific configuration and the number of devices are not particularly limited.
 The user operation recording device 10 is an information processing device (user terminal) used by a user. Note that the user terminal may be any type of information processing device, including a client device such as a smartphone, desktop PC, notebook PC, or tablet PC. Also, in the example of FIG. 1, the user operation recording device 10, which is the user terminal, has the function of performing the user operation recording process, but the user operation recording device 10 may instead be provided as a device separate from the user terminal.
 In the user operation recording device 10, a function operates that transmits the information the user inputs through an input device such as a mouse or keyboard to the server 20 and displays the screen display information received from the server 20 on the screen. This function is generally implemented as an application within the user terminal and is displayed as a window like the other applications within the user terminal. The screen display information received from the server 20 may be the entire virtual desktop on the server 20 side, or may be an individual window within the virtual desktop.
 When the server 20 receives operation information for the mouse, keyboard, or the like, it applies the operation corresponding to the operation information to the application the user is actually using, and transmits the screen display information generated on the virtual desktop within the server to the user operation recording device 10.
 With such a remote desktop, a large response delay may occur compared to an application executed within the user terminal. For example, operation delays may occur due to processing delays in communication paths, virtual servers, gateways, and the like. For this reason, the user operation recording device 10 estimates the response delay time of GUI operations and uses the estimated response delay time to acquire GUI information, such as screen captures, at the optimum timing.
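The overall flow just described — detect an operation, estimate the per-window response delay, wait it out, then capture — can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function arguments and the dictionary-shaped log entry are hypothetical stand-ins for the detection unit 15a, estimation unit 15b, and acquisition unit 15c described later.

```python
import time

def record_operation(detect_op, estimate_delay_ms, capture_screen, log):
    """Wait out the estimated response delay between detecting a user
    operation and capturing the screen, so the capture shows the
    operation's effect rather than the pre-operation state."""
    op = detect_op()                    # e.g. a mouse click or key press
    delay_ms = estimate_delay_ms(op)    # per-window estimate (methods (1)-(3))
    time.sleep(delay_ms / 1000.0)       # let the GUI finish updating
    image = capture_screen()            # capture after the delay has elapsed
    entry = {"operation": op, "delay_ms": delay_ms, "image": image}
    log.append(entry)
    return entry
```

With stub callables in place of real detection, estimation, and capture, each call appends one log entry pairing an operation with the capture taken after the estimated delay.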
[Configuration of user operation recording device]
 FIG. 2 is a block diagram illustrating a schematic configuration of the user operation recording device of the present embodiment. As illustrated in FIG. 2, the user operation recording device 10 of the present embodiment is realized by a general-purpose computer such as a personal computer, and includes an input unit 11, an output unit 12, a communication control unit 13, a storage unit 14, and a control unit 15.
 The input unit 11 is implemented using an input device such as a keyboard or a mouse, and inputs various instruction information, such as an instruction to start processing, to the control unit 15 in response to input operations by the operator. The output unit 12 is implemented by a display device such as a liquid crystal display, a printing device such as a printer, or the like. For example, the output unit 12 displays the screen on which user operations are reflected, which is the target of the user operation recording process described later.
 The communication control unit 13 is realized by a NIC (Network Interface Card) or the like, and controls communication between the control unit 15 and external devices via an electric communication line such as a LAN (Local Area Network) or the Internet. For example, the communication control unit 13 controls communication between the control unit 15 and a management device or the like that manages various types of information regarding the user operation detection methods, application processing, and so on. Note that the screen on which user operations are reflected may be implemented on hardware different from the user operation recording device 10. In that case, the user operation recording device 10 communicates with the terminal operated by the user via the communication control unit 13.
 The storage unit 14 stores the data and programs necessary for the various processes performed by the control unit 15, and has an estimation method storage unit 14a, an image data storage unit 14b, and a log storage unit 14c. For example, the storage unit 14 is a semiconductor memory element such as a RAM (Random Access Memory) or a flash memory, or a storage device such as a hard disk or an optical disk.
 The estimation method storage unit 14a stores estimation methods for estimating the delay time from when a user operation is detected until the screen changes due to that user operation. For example, the estimation method storage unit 14a stores a table indicating which estimation method to apply for the application of each window.
 Here, the table stored in the estimation method storage unit will be described using the example of FIG. 3. FIG. 3 is a diagram showing an example of the table stored in the estimation method storage unit. As illustrated in FIG. 3, the estimation method storage unit 14a stores, in association with one another, a serial number, a process name, which is the name of the application (process) displaying a window, a window title, which is the title of that window, and a detection method type, which indicates the type of delay time estimation method to apply. In other words, because the optimal delay detection method differs depending on the application of the window, preparing table data such as that in FIG. 3 allows the estimation unit 15b, described later, to select an estimation method. Since this is the simplest approach, other more sophisticated methods may also be used.
 Regarding the delay time estimation methods: detection method (1) is a method of estimating the delay time by measuring the processing time of a GUI event; detection method (2) is a method of estimating the delay time by reading changes in the screen display produced by a dedicated application executed on the server 20 side of the remote desktop; and detection method (3) is a method of estimating the delay time by reading changes in the screen display at the clicked position.
 For example, the process name "RemoteDesktopClient.exe" illustrated in FIG. 3 identifies a window displayed by the remote desktop application, and it is assumed that screen information generated on the remote server 20 is displayed in that window. Also, as illustrated in FIG. 3, detection method (2) presupposes that a dedicated application is running on the remote server, so the IP address of the connection destination is determined from the window title; it is assumed that the address of the connection destination is displayed in the window title.
 Also, "*" in the table means a wildcard. Therefore, for example, serial number 3 in the example of FIG. 3 matches every pattern for both the process name and the window title, so detection method (1) is determined to be optimal for all windows that did not match the first and second rows. In the example of FIG. 3, one of detection methods (1) to (3) is always used, but there may also be windows for which no detection method is used. In that case, the response delay time is always regarded as constant, and the constant value is either no delay (0 ms) or is given by some configuration file or the like.
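The first-match-wins wildcard lookup over the FIG. 3 table can be sketched as below. The concrete rows, including the IP-address pattern, are illustrative placeholders rather than values taken from the patent.

```python
from fnmatch import fnmatchcase

# Rows mirror the FIG. 3 table: (process-name pattern, window-title pattern,
# detection method). "*" is a wildcard; rows are checked in serial-number
# order and the first match wins, so the catch-all row must come last.
RULES = [
    ("RemoteDesktopClient.exe", "192.168.10.*", "detection method (2)"),
    ("RemoteDesktopClient.exe", "*",            "detection method (3)"),
    ("*",                       "*",            "detection method (1)"),
]

def select_detection_method(process_name, window_title):
    """Return the delay estimation method for a window, or None when no
    rule matches (in which case the delay is treated as a fixed constant)."""
    for proc_pat, title_pat, method in RULES:
        if fnmatchcase(process_name, proc_pat) and fnmatchcase(window_title, title_pat):
            return method
    return None
```

Because the catch-all ("*", "*") row is present here, every window resolves to one of the three methods, matching the behavior described for the FIG. 3 example.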
 The image data storage unit 14b stores the image data acquired by screen capture by the acquisition unit 15c, described later. The log storage unit 14c stores log information in which an operation log, including the details of user operations and the like, is associated with the image data in which the changes caused by those user operations are reflected.
 The control unit 15 has an internal memory for storing programs that define various processing procedures and the required data, and executes various processes using them. For example, the control unit 15 has a detection unit 15a, an estimation unit 15b, an acquisition unit 15c, and a log creation unit 15d. Here, the control unit 15 is an electronic circuit such as a CPU (Central Processing Unit) or an MPU (Micro Processing Unit), or an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
 The detection unit 15a detects user operations on the user operation recording device 10 (information processing device). Specifically, the detection unit 15a detects user operations using various detection methods. The detection unit 15a also assigns an ID or the like for individually identifying each detected user operation across the processing units.
 For example, the detection unit 15a uses OS functions to detect mouse and keyboard operations. This method of detecting mouse and keyboard operations is generally a fast method with very little delay, but it cannot determine how the OS or the application was affected by the operation.
 Alternatively, the detection unit 15a detects user operations by monitoring GUI messages, which are the exchanges between applications and the GUI system that provides the window mechanism. For example, the detection unit 15a detects the appearance, disappearance, activation (bringing to the front), and resizing of windows. The detection unit 15a also detects that a button, text box, or other standard GUI control of the GUI system has been clicked.
 Alternatively, the detection unit 15a detects user operations using, for example, an API provided by the application itself. For example, targeting a Web browser, which generally provides an API capable of detecting user operations, the detection unit 15a can detect button clicks and the like occurring within a Web page displayed in the Web browser. This detection method generally has a large delay.
 Alternatively, the detection unit 15a detects user operations by, for example, monitoring the content of communication over the network between the application and a server or the like. Note that the operation detection methods executed by the detection unit 15a are not limited to the above, and other more advanced operation detection methods may be used.
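Whatever detection method is used, the detection unit tags each detected operation with an ID so that the other processing units can refer to the same operation individually. A minimal sketch under assumed data shapes; the class name and record fields are hypothetical, not from the patent.

```python
import itertools
import time

class DetectionUnit:
    """Assign each detected operation a unique ID and keep a history,
    so downstream units (estimation, acquisition, logging) can refer
    to one specific operation."""
    def __init__(self):
        self._ids = itertools.count(1)   # monotonically increasing IDs
        self.history = []

    def on_operation(self, kind, detail):
        """Record one detected operation (e.g. kind="mouse" with click
        coordinates, or kind="keyboard" with the key and up/down state)."""
        op = {"id": next(self._ids), "time": time.time(),
              "kind": kind, "detail": detail}
        self.history.append(op)
        return op
```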
 The estimation unit 15b selects, according to the type of application executed on the user operation recording device 10 (information processing device), an estimation method for estimating the delay time from when a user operation is detected by the detection unit 15a until the screen of the user operation recording device 10 (information processing device) changes due to that user operation, and estimates the delay time using the selected estimation method.
 To explain using the example of FIG. 3: the estimation unit 15b refers to the table stored in the estimation method storage unit 14a and identifies the estimation method using the process name of the application (process) displaying the window and the window title of that window. These two pieces of information are available in many OSs and can be obtained easily by calling APIs provided by the OS. In the table illustrated in FIG. 3, the estimation unit 15b compares the rows in order, starting from the lowest serial number, and if a matching row is found, the detection method type specified in that row becomes the delay estimation method for that window. Below, the delay estimation processing of detection methods (1) to (3) is described with reference to FIGS. 4 to 6, respectively. FIGS. 4 to 6 are diagrams for explaining the estimation processing by the estimation unit.
 First, detection method (1) in FIG. 3 will be described. For example, as one estimation method, the estimation unit 15b generates a GUI event for a window on the screen, measures the time from the generation of the GUI event until the processing of that GUI event is completed, and takes the measured time as the estimated delay time.
 In the GUI system provided by the OS, an application has an event loop (also called a message loop or the like) that processes GUI events. Depending on the type of OS, the event loop is handled per application, per window, or per thread, but the present embodiment assumes that there is an event loop for each window. Even for a GUI system in which event loops are provided per application or per thread, it is often acceptable to regard events as being processed per window.
 Mouse and keyboard input is also delivered to each window as GUI events, and the GUI display on the screen is updated in response. The event loop is usually single-threaded, so when a response delay occurs, it also affects the processing of events other than mouse and keyboard events. Using the OS API, a GUI event (message) can be generated for a window on the desktop from another process (application).
 The estimation unit 15b makes use of this: for the window whose response delay is to be examined, it generates some GUI event, measures the time until the processing of that event is completed, and takes this as the response delay time. To explain using the example of FIG. 4, the estimation unit 15b uses the OS API to have the response delay estimation process send an event to the target process (application), thereby generating a GUI event, measures the time from the generation of the event until an event processing result indicating that the processing of the event has been completed is received, and takes the measured time as the estimated delay time.
 Here, the GUI event to be generated is ideally one that causes no change in the target application. For example, if an event that is simply ignored (a NULL event or NULL message) exists, the estimation unit 15b uses it. In a GUI system in which no such event exists, an event with little effect may be selected and used. The estimation unit 15b applies this detection method (1), for example, when the target application is an application other than a remote desktop application.
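The round-trip measurement of detection method (1) can be sketched as below. The send_null_event callable stands in for synchronous delivery of a no-op event to the target window's event loop (on Windows this could, for instance, wrap SendMessageTimeout with WM_NULL — an assumption, since the patent does not name a specific API), and the busy_window_sender that simulates a 50 ms event-loop backlog is purely illustrative.

```python
import time

def measure_response_delay_ms(send_null_event, timeout_s=2.0):
    """Return the round-trip time, in milliseconds, of a no-op GUI event,
    or None if the target window's event loop did not respond in time.

    `send_null_event` must synchronously deliver a NULL-style event to the
    target window and return True once the event loop has processed it
    (False, or a TimeoutError, on failure)."""
    start = time.perf_counter()
    try:
        if not send_null_event(timeout_s):
            return None
    except TimeoutError:
        return None
    return (time.perf_counter() - start) * 1000.0

# Stand-in sender simulating an event loop with a 50 ms backlog.
def busy_window_sender(timeout_s):
    time.sleep(0.05)
    return True
```

Because the event is ignored by the target, the elapsed time reflects only how long the event waited in, and passed through, the window's event loop — which is exactly the response delay being estimated.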
 Next, detection method (2) in FIG. 3 will be described. For example, as one estimation method, for the server 20, which executes the user's screen operations on a virtual desktop and transmits information about the resulting screen display, and which executes an application that changes the screen display information whenever a user's screen operation is detected, the estimation unit 15b measures the time from when the user operation is detected by the detection unit 15a until the screen changes, and takes the measured time as the estimated delay time.
 In this case, an application that monitors and detects the mouse and keyboard operation information received on the server 20 side and, each time such information is detected, encodes the detected operation information and reflects it in part of the screen display information transmitted from the server 20, is executed on the desktop of the server 20. The operation information received on the server 20 side can be monitored in the same way as by the detection unit 15a of the user operation recording device 10. As illustrated in FIG. 5, the image displayed on the screen uses, for example, a two-dimensional code. Since a two-dimensional code occupies area and is conspicuous, in order not to hide the display the user should see, a color barcode using only the edge pixels of the screen display information transmitted from the server 20, or a digital watermark, may be used instead. This image is updated each time operation information is received on the server 20 side. For a mouse, the two-dimensional code contains coordinate information, the button type, and up/down information; for a keyboard, it contains the key type and up/down information.
 Note that the two-dimensional code may contain not only information about the most recent operation but also information about multiple operations as an operation history. That is, when the user performs multiple operations in rapid succession, the two-dimensional code changes frequently and may not be read correctly. In such a case, the two-dimensional code may include information about multiple operations as an operation history, together with a sequence number assigned to each operation and information about the time difference between the server 20 detecting an operation and updating the two-dimensional code.
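The payload carried by the two-dimensional code described above (an operation history with sequence numbers and update-time offsets) can be sketched as follows. This is an illustrative assumption, not taken from the patent: the field names and the use of JSON as the intermediate encoding are invented for the example, and an actual implementation would render the payload with a QR or color-barcode library.

```python
import json

def encode_operation_history(history, max_entries=5):
    """Encode the most recent operations as the 2D-code payload.

    Each entry carries a sequence number, the operation details, and
    the offset (ms) between the server detecting the operation and
    updating the code, as suggested in the text above.
    """
    recent = history[-max_entries:]
    return json.dumps({"ops": recent})

def decode_operation_history(payload):
    return json.loads(payload)["ops"]

# Example: two mouse events received in quick succession on the server.
history = [
    {"seq": 41, "dev": "mouse", "x": 120, "y": 80,
     "button": "left", "updown": "down", "update_offset_ms": 12},
    {"seq": 42, "dev": "mouse", "x": 120, "y": 80,
     "button": "left", "updown": "up", "update_offset_ms": 9},
]
payload = encode_operation_history(history)
ops = decode_operation_history(payload)
print(len(ops), ops[-1]["seq"])  # 2 42
```

Keeping several entries plus sequence numbers lets the reader side recover operations even when the code is updated faster than it is polled.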
 A program in the user operation recording device 10 for detecting the delay time detects mouse and keyboard operation information and accumulates a history together with the detection times. Furthermore, the user operation recording device 10 periodically monitors the display of the two-dimensional code in the remote desktop application. The user operation recording device 10 then detects, from the information contained in the two-dimensional code, the mouse and keyboard operation information received by the server 20, and compares it with the history of operation information generated within the user operation recording device 10 to estimate the delay. That is, the user operation recording device 10 compares the detection time of each operation accumulated in the history with the time at which the corresponding change in the two-dimensional code display was detected, calculates the difference between the two, and estimates that difference as the delay time.
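The comparison just described — matching each locally detected operation against the time its echo appears in the two-dimensional code — might look like the following sketch. The data layout (sequence-numbered records keyed by timestamp) is an assumption for illustration.

```python
def estimate_delays(local_history, code_observations):
    """local_history: {seq: detection_time_sec} recorded by device 10.
    code_observations: {seq: time_sec at which the 2D code was seen
    to reflect that operation}. Returns one delay estimate per
    operation observed on both sides.
    """
    delays = []
    for seq, detected_at in sorted(local_history.items()):
        seen_at = code_observations.get(seq)
        if seen_at is not None:
            delays.append(round(seen_at - detected_at, 3))
    return delays

local = {1: 10.00, 2: 10.05, 3: 10.90}     # device-side detection times
observed = {1: 10.21, 2: 10.27, 3: 11.12}  # times the code reflected each op
print(estimate_delays(local, observed))  # [0.21, 0.22, 0.22]
```

Operations whose echo was never observed (a missed poll, for example) simply contribute no sample and are left to the statistical processing described later.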
 The user operation recording device 10 must monitor image changes in the remote desktop application at sufficiently short intervals. Because the display area to be monitored, such as the two-dimensional code, is sufficiently small, this can be realized with a small processing load. With this detection method (2), the response delay from when the user operates the mouse or keyboard, through the transmission of that information to the server 20, until it is actually reflected in the screen display of the user operation recording device 10 can be estimated.
 Next, detection method (3) in FIG. 3 will be described. As one of the estimation methods, when the detection unit 15a detects a click operation as the user operation, the estimation unit 15b monitors changes in an area within a predetermined range of the position where the click was performed, measures the time from when the detection unit 15a detects the click operation until a change occurs in that area, and estimates the measured time as the delay time.
 Detection method (2) requires a dedicated application to be running inside the remote desktop server 20, which is not feasible in every environment. Detection method (3) therefore estimates the delay time without adding any special function on the server 20 side. Detection method (3) uses only click information, exploiting the fact that a click operation usually causes a display change at the clicked position.
 When the user operation recording device 10 detects a click operation on the remote desktop application within its own device, it monitors a small area centered on the clicked position, measures the time from the moment of the click until the image changes, and uses that time as the response delay time. The user operation recording device 10 must monitor image changes in the remote desktop application at sufficiently short intervals, but because the area to be monitored is sufficiently small, this can be realized with a small processing load.
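The click-region monitoring described above reduces to comparing successive captures of a small region against the image at the moment of the click. The sketch below is a minimal illustration under assumed names; a real implementation would grab the region with a screen-capture API at each polling interval.

```python
def measure_click_delay(frames, poll_interval, max_wait=3.0):
    """frames: successive captures of the small region around the click,
    taken every poll_interval seconds; frames[0] is the image at the
    moment of the click. Returns the delay until the region first
    changes, or None if no change occurs within max_wait (such samples
    are later discarded as outliers by the statistical processing).
    """
    baseline = frames[0]
    for i, frame in enumerate(frames[1:], start=1):
        elapsed = i * poll_interval
        if elapsed > max_wait:
            break
        if frame != baseline:
            return elapsed
    return None

# The region stays unchanged for four polls, then the display reacts.
frames = ["AAAA", "AAAA", "AAAA", "AAAA", "ABAA", "ABAA"]
print(measure_click_delay(frames, poll_interval=0.05))  # 0.2
```

Because only a small region is compared at each poll, the per-interval cost stays low even at short polling intervals, matching the small-processing-load claim above.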
 Note that because a click may cause no image change, or an image change may occur independently of the click, the user operation recording device 10 may perform statistical processing to estimate a more accurate value. Statistical processing may be applied not only to detection method (3) but also to detection methods (1) and (2). In any detection method, single measurements vary, so a more reasonable value can be estimated through statistical processing.
 For example, the estimation unit 15b may aggregate a plurality of delay times estimated by the same estimation method into predetermined time ranges and estimate the median of the most frequent time range as the delay time. Here, the statistical processing will be described using the example of FIG. 7. FIG. 7 is a diagram for explaining the statistical processing performed by the estimation unit.
 For example, the estimation unit 15b subjects the data in a fixed time range extending from the present into the past to the statistical processing; for instance, the measurement results of the past 10 minutes are statistically processed to obtain the current response delay time. The estimation unit 15b then removes individual delay time measurements at or above a threshold; for example, measurements of 3 seconds or more are removed as outliers. The result is the same if a single measurement is simply aborted once the threshold is exceeded.
 Next, the estimation unit 15b aggregates the remaining measurements into time ranges, for example with a bin width of 0.2 seconds. As illustrated in FIG. 7, the estimation unit 15b then estimates that the true value of the response delay time lies in the most frequent time range and adopts the median of that range as the delay time. Note that the estimation unit 15b is not limited to the above processing and may perform more advanced statistical processing.
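The statistical procedure described above (drop outliers at or above a threshold, bin the remaining samples into fixed-width time ranges, take the median of the densest bin) can be sketched as follows; the 3-second threshold and 0.2-second bin width follow the examples given in the text.

```python
from collections import defaultdict
from statistics import median

def estimate_delay(measurements, threshold=3.0, bin_width=0.2):
    # Remove outliers at or above the threshold.
    kept = [m for m in measurements if m < threshold]
    # Aggregate the remaining values into fixed-width time ranges.
    bins = defaultdict(list)
    for m in kept:
        bins[int(m // bin_width)].append(m)
    # Assume the true delay lies in the most frequent range and
    # adopt the median of that range.
    densest = max(bins.values(), key=len)
    return median(densest)

# Mostly ~0.5 s responses, one unrelated late change, one outlier.
samples = [0.45, 0.52, 0.48, 0.51, 1.9, 0.47, 3.4, 0.55, 0.12]
print(estimate_delay(samples))  # 0.495
```

The lone 1.9 s sample (a screen change unrelated to any click) lands in a sparse bin and is ignored, and the 3.4 s sample is removed by the threshold, which is exactly the robustness the text attributes to this processing.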
 For example, in detection method (3) there are cases where the screen display is unaffected by a click, but such measurements are removed by the threshold. Changes in the screen display not caused by mouse or keyboard operations may also be detected erroneously, but because such changes are widely scattered in time, their influence can be eliminated by selecting the most frequent time range. In addition, after performing the statistical processing to estimate the delay time, the estimation unit 15b may store the delay time for each application in the storage unit 14.
 Returning to the description of FIG. 2, the acquisition unit 15c captures the screen and acquires image data based on the delay time estimated by the estimation unit 15b. For example, the acquisition unit 15c acquires the image data of the screen after the delay time estimated by the estimation unit 15b has elapsed since the detection unit 15a detected the user operation. The acquisition unit 15c then stores the acquired image data in the image data storage unit 14b.
 Alternatively, the acquisition unit 15c may capture the screen operated by the user at predetermined intervals and acquire image data. In this case, the acquisition unit 15c selects, from the acquired image data, the image data of the screen after the delay time estimated by the estimation unit 15b has elapsed since the user operation was detected by the detection unit 15a, and stores it in the image data storage unit 14b. When the delay time for each application is stored in the storage unit 14, the acquisition unit 15c may read out the delay time corresponding to the application displaying the window, acquire the image data after that delay time has elapsed, and store it in the image data storage unit 14b.
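One way to realize the "capture periodically, then select" variant described above is sketched below. The in-memory list of timestamped captures and the function name are assumptions for illustration.

```python
def select_capture(captures, op_time, delay):
    """captures: list of (timestamp_sec, image) taken at a fixed
    interval. Returns the first capture taken at or after
    op_time + delay, i.e. the screen after the estimated delay
    has elapsed since the operation was detected.
    """
    target = op_time + delay
    for ts, image in captures:
        if ts >= target:
            return image
    return None  # delay has not yet elapsed; keep capturing

captures = [(10.0, "img_a"), (10.5, "img_b"), (11.0, "img_c"), (11.5, "img_d")]
print(select_capture(captures, op_time=10.2, delay=0.6))  # img_c
```

Selecting from periodic captures trades some storage for robustness: the capture exists even if the delay estimate is revised after the fact.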
 The log creation unit 15d associates the selected image data with the operation log and outputs them. For example, the log creation unit 15d stores in the log storage unit 14c of the storage unit 14 a text file in which each detected operation corresponds to one line as the log, and also stores each image corresponding to each detected operation as an image file, adding identification information to the file names or the like so that the correspondence between each line of the text file and each image file is clear. Alternatively, the log creation unit 15d may output the operation log and the image data as a single piece of binary data. Instead of storing them in the log storage unit 14c of the storage unit 14, the log creation unit 15d may also transmit them to another server or the like via the communication control unit 13.
[Processing Procedure of the User Operation Recording Device]
 Next, an example of the processing procedure of the user operation recording process executed by the user operation recording device 10 will be described with reference to FIG. 8. FIG. 8 is a flowchart illustrating an example of that procedure.
 As illustrated in FIG. 8, when the detection unit 15a of the user operation recording device 10 detects a user operation on the screen (Yes at step S101), the estimation unit 15b selects, according to the type of application displaying the screen, an estimation method for estimating the delay time from when the detection unit 15a detects the user operation until the screen changes due to that user operation (step S102).
 The estimation unit 15b then estimates the delay time using the selected estimation method (step S103). Subsequently, the acquisition unit 15c captures the screen operated by the user and acquires image data based on the delay time estimated by the estimation unit 15b (step S104).
 After that, the log creation unit 15d records the selected image data and the operation log in association with each other (step S105). For example, the log creation unit 15d stores in the log storage unit 14c of the storage unit 14 a text file in which each detected operation corresponds to one line as the log, and also stores each image corresponding to each detected operation as an image file, adding identification information to the file names or the like so that the correspondence between each line of the text file and each image file is clear.
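Putting steps S101 to S105 together, the overall flow might be sketched as follows. The per-application dispatch table and the stub estimators are illustrative assumptions; in the device, each estimator would correspond to one of detection methods (1) to (3).

```python
def record_operation(op, app_type, estimators, capture, write_log):
    """S101: op has already been detected. S102: select the estimation
    method by application type. S103: estimate the delay. S104: capture
    the screen after the delay. S105: record image and operation
    together in the log.
    """
    estimate = estimators[app_type]          # S102
    delay = estimate(op)                     # S103
    image = capture(op["time"] + delay)      # S104
    entry = {"op": op, "image": image}       # S105
    write_log(entry)
    return entry

log = []
estimators = {
    "native": lambda op: 0.05,   # e.g. detection method (1)
    "remote": lambda op: 0.30,   # e.g. detection method (2) or (3)
}
capture = lambda t: f"capture@{t:.2f}"       # stand-in for a screen grab
entry = record_operation({"time": 12.0, "kind": "click"}, "remote",
                         estimators, capture, log.append)
print(entry["image"], len(log))  # capture@12.30 1
```

The point of the dispatch table is that the capture timing adapts per application: a local window is captured almost immediately, while a remote desktop window is captured only after its measured response delay.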
[Effects of the Embodiment]
 As described above, the user operation recording device 10 according to the embodiment detects a user operation, selects an estimation method for estimating the delay time from the detection of the user operation until the screen changes due to that user operation according to the type of application displaying the screen, and estimates the delay time using the selected estimation method. Based on the estimated delay time, the user operation recording device 10 then captures the screen operated by the user and acquires image data. The user operation recording device 10 can therefore acquire screen captures at appropriate timing.
 That is, screen capture is not limited to looking back into the past; it may be necessary to acquire an image in the future, that is, after waiting for some time once an operation is detected. The same applies not only to screen capture but also to acquiring various GUI information (window titles and the like), which must be acquired after the operation has been reflected. Waiting becomes necessary when, for example, there is a large delay between detecting a mouse or keyboard operation and the resulting change on the GUI. Even in such a case, the user operation recording device 10 can estimate the response delay time of a GUI operation in accordance with the changing state of the computer and network at that time. By using this estimated delay time, various GUI information, including screen captures, can be acquired at the optimum timing.
 For example, the user operation recording device 10 can acquire screen captures at appropriate timing by switching, according to the application type, to the optimum detection method among "measuring the processing time of a GUI event," "installing a dedicated application on the server 20 side and reading its response from the image," and "reading the change in the screen display at the clicked position," and quantitatively estimating the GUI response delay time.
 In the user operation recording device 10, the process of obtaining the delay time of each window may always be performed in parallel for all windows on the desktop. However, because most operating systems have only one window that accepts mouse and keyboard input (called the active window or the like), processing only the active window is often sufficient. In the present embodiment, the processing may therefore target only the active window.
 In this case, however, the active window generally switches frequently, so the user operation recording device 10 may hold information on the delay of windows other than the active window in a memory or the like. For example, the user operation recording device 10 secures an area for holding information about a window at the timing when the window is newly displayed, obtains the response delay while the window is active, and rewrites the information in the held area. When the window is destroyed, the corresponding area may be destroyed as well.
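The per-window bookkeeping described above (allocate an entry when a window appears, update its delay while it is active, discard it when the window is destroyed) amounts to a small cache keyed by window handle. The dictionary-based sketch below is an illustrative assumption, not the patent's implementation.

```python
class WindowDelayCache:
    def __init__(self):
        self._delays = {}

    def on_window_created(self, hwnd, default_delay=0.1):
        # Secure an area holding the new window's delay information.
        self._delays[hwnd] = default_delay

    def on_delay_measured(self, hwnd, delay):
        # Rewrite the held value while the window is active.
        if hwnd in self._delays:
            self._delays[hwnd] = delay

    def on_window_destroyed(self, hwnd):
        # Discard the area together with the window.
        self._delays.pop(hwnd, None)

    def delay_for(self, hwnd):
        return self._delays.get(hwnd)

cache = WindowDelayCache()
cache.on_window_created(0x1234)
cache.on_delay_measured(0x1234, 0.25)
print(cache.delay_for(0x1234))  # 0.25
cache.on_window_destroyed(0x1234)
print(cache.delay_for(0x1234))  # None
```

Because delays are retained across focus changes, a window that becomes active again can reuse its last estimate instead of starting from scratch.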
[System Configuration of the Embodiment]
 Each component of the user operation recording device 10 shown in FIG. 1 is functionally conceptual and need not be physically configured as illustrated. That is, the specific form of distribution and integration of the functions of the user operation recording device 10 is not limited to the illustrated one; all or part of them can be functionally or physically distributed or integrated in arbitrary units according to various loads, usage conditions, and the like.
 All or any part of each process performed in the user operation recording device 10 may be realized by a CPU and a program analyzed and executed by the CPU, or may be realized as hardware using wired logic.
 Of the processes described in the embodiment, all or part of the processes described as being performed automatically can also be performed manually, and all or part of the processes described as being performed manually can be performed automatically by known methods. In addition, the processing procedures, control procedures, specific names, and information including various data and parameters described above and illustrated can be changed as appropriate unless otherwise specified.
[Program]
 It is also possible to create a program in which the processing executed by the user operation recording device 10 according to the above embodiment is described in a computer-executable language. As one embodiment, the user operation recording device 10 can be implemented by installing, on a desired computer, a user operation recording program that executes the above user operation recording process as packaged software or online software. For example, an information processing device can be made to function as the user operation recording device 10 by causing it to execute the above user operation recording program. The information processing device referred to here includes desktop and notebook personal computers. The category also includes mobile communication terminals such as smartphones, mobile phones, and PHS (Personal Handyphone System) devices, as well as slate terminals such as PDAs (Personal Digital Assistants). The functions of the user operation recording device 10 may also be implemented in a cloud server.
 FIG. 9 is a diagram showing an example of a computer that executes the user operation recording program. A computer 1000 has, for example, a memory 1010, a CPU 1020, a hard disk drive interface 1030, a disk drive interface 1040, a serial port interface 1050, a video adapter 1060, and a network interface 1070. These units are connected by a bus 1080.
 The memory 1010 includes a ROM (Read Only Memory) 1011 and a RAM 1012. The ROM 1011 stores a boot program such as a BIOS (Basic Input Output System). The hard disk drive interface 1030 is connected to a hard disk drive 1031. The disk drive interface 1040 is connected to a disk drive 1041, into which a removable storage medium such as a magnetic disk or an optical disk is inserted. A mouse 1051 and a keyboard 1052, for example, are connected to the serial port interface 1050, and a display 1061, for example, is connected to the video adapter 1060.
 The hard disk drive 1031 stores, for example, an OS 1091, an application program 1092, a program module 1093, and program data 1094. Each piece of information described in the above embodiment is stored, for example, in the hard disk drive 1031 or the memory 1010.
 The user operation recording program is stored in the hard disk drive 1031 as, for example, a program module 1093 in which the commands to be executed by the computer 1000 are described. Specifically, a program module 1093 describing each process executed by the user operation recording device 10 explained in the above embodiment is stored in the hard disk drive 1031.
 Data used for information processing by the user operation recording program is stored as program data 1094, for example, in the hard disk drive 1031. The CPU 1020 reads the program module 1093 and the program data 1094 stored in the hard disk drive 1031 into the RAM 1012 as necessary and executes each of the procedures described above.
 The program module 1093 and the program data 1094 relating to the user operation recording program are not limited to being stored in the hard disk drive 1031; for example, they may be stored in a removable storage medium and read by the CPU 1020 via the disk drive 1041 or the like. Alternatively, the program module 1093 and the program data 1094 relating to the user operation recording program may be stored in another computer connected via a network such as a LAN or a WAN (Wide Area Network) and read by the CPU 1020 via the network interface 1070.
 Although an embodiment to which the invention made by the present inventors is applied has been described above, the present invention is not limited by the description and drawings forming part of the disclosure of the present invention according to this embodiment. That is, other embodiments, examples, operational techniques, and the like made by those skilled in the art based on this embodiment are all included in the scope of the present invention.
10 user operation recording device
11 input unit
12 output unit
13 communication control unit
14 storage unit
14a estimation method storage unit
14b image data storage unit
14c log storage unit
15 control unit
15a detection unit
15b estimation unit
15c acquisition unit
15d log creation unit

Claims (6)

  1.  A user operation recording device comprising:
     a detection unit that detects a user operation on an information processing device;
     an estimation unit that selects, according to a type of application executed on the information processing device, an estimation method for estimating a delay time from when the user operation is detected by the detection unit until a screen of the information processing device changes due to the user operation, and estimates the delay time using the estimation method; and
     an acquisition unit that captures the screen and acquires image data based on the delay time estimated by the estimation unit.
  2.  The user operation recording device according to claim 1, wherein, as one of the estimation methods, for a server that executes the user's operations on the screen on a virtual desktop and transmits information on the resulting screen display, the server executing an application that changes the information on the screen display when the user's operation on the screen is detected, the estimation unit measures the time from when the user operation is detected by the detection unit until the screen changes, and estimates the measured time as the delay time.
  3.  The user operation recording device according to claim 1, wherein, as one of the estimation methods, when a click operation is detected as the user operation by the detection unit, the estimation unit monitors changes in an area within a predetermined range of the click position where the click operation was performed, measures the time from when the click operation is detected by the detection unit until a change occurs in the area, and estimates the measured time as the delay time.
  4.  The user operation recording device according to claim 1, wherein, as one of the estimation methods, the estimation unit generates a GUI event for a window on the screen, measures the time from the generation of the GUI event until the processing of the GUI event is completed, and estimates the measured time as the delay time.
  5.  The user operation recording device according to claim 1, wherein the estimation unit aggregates a plurality of delay times estimated by the same estimation method into predetermined time ranges and estimates, as the delay time, the median of the delay times included in the time range with the largest aggregate value.
  6.  A user operation recording method executed by a user operation recording device, the method comprising:
     a detection step of detecting a user operation on an information processing device;
     an estimation step of selecting, according to a type of application executed on the information processing device, an estimation method for estimating a delay time from when the user operation is detected in the detection step until a screen of the information processing device changes due to the user operation, and estimating the delay time using the estimation method; and
     an acquisition step of capturing the screen and acquiring image data based on the delay time estimated in the estimation step.
PCT/JP2021/007580 2021-02-26 2021-02-26 User operation recording device and user operation recording method WO2022180863A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/007580 WO2022180863A1 (en) 2021-02-26 2021-02-26 User operation recording device and user operation recording method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/007580 WO2022180863A1 (en) 2021-02-26 2021-02-26 User operation recording device and user operation recording method

Publications (1)

Publication Number Publication Date
WO2022180863A1 true WO2022180863A1 (en) 2022-09-01

Family

ID=83048763

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/007580 WO2022180863A1 (en) 2021-02-26 2021-02-26 User operation recording device and user operation recording method

Country Status (1)

Country Link
WO (1) WO2022180863A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013190607A1 * 2012-06-18 2013-12-27 Hitachi, Ltd. Screen information collecting computer, screen information collecting method, and computer-readable storage medium
JP2015153210A * 2014-02-17 2015-08-24 Nippon Telegraph and Telephone Corporation User operation log recording method, its program, and device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013190607A1 * 2012-06-18 2013-12-27 Hitachi, Ltd. Screen information collecting computer, screen information collecting method, and computer-readable storage medium
JP2015153210A * 2014-02-17 2015-08-24 Nippon Telegraph and Telephone Corporation User operation log recording method, its program, and device

Similar Documents

Publication Publication Date Title
US10321342B2 (en) Methods and systems for performance monitoring for mobile applications
US9531614B1 (en) Network aware distributed business transaction anomaly detection
AU2011340789B2 (en) End-user performance monitoring for mobile applications
US7831661B2 (en) Measuring client interactive performance using a display channel
US9214004B2 (en) Watermarking and scalability techniques for a virtual desktop planning tool
US20200267203A1 (en) Determining end times for single page applications
JP2018022520A (en) Determination and monitoring performance of computer resource service
US20160224400A1 (en) Automatic root cause analysis for distributed business transaction
US9529691B2 (en) Monitoring and correlating a binary process in a distributed business transaction
US9652357B2 (en) Analyzing physical machine impact on business transaction performance
US10067862B2 (en) Tracking asynchronous entry points for an application
US20170126580A1 (en) Tracking Contention in a Distributed Business Transaction
JP2015153210A (en) User operation log recording method, its program, and device
US10223407B2 (en) Asynchronous processing time metrics
CN112005207A (en) Creating statistical analysis of data for transmission to a server
JP7127525B2 (en) DETECTION DEVICE, DETECTION METHOD, AND DETECTION PROGRAM
US11250100B2 (en) Cause-based event correlation to virtual page transitions in single page applications
US10432490B2 (en) Monitoring single content page application transitions
US10191844B2 (en) Automatic garbage collection thrashing monitoring
WO2022180863A1 (en) User operation recording device and user operation recording method
JP7338791B2 (en) User operation recording device and user operation recording method
US9935856B2 (en) System and method for determining end user timing
JP2022072116A (en) Support system, support method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21927962

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21927962

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP