WO2022034672A1 - Data processing device and data processing method - Google Patents

Data processing device and data processing method

Info

Publication number
WO2022034672A1
Authority
WO
WIPO (PCT)
Prior art keywords
event
screen
data processing
time
data
Prior art date
Application number
PCT/JP2020/030801
Other languages
French (fr)
Japanese (ja)
Inventor
有記 卜部
公雄 土川
晴夫 大石
史拓 横瀬
佐也香 八木
Original Assignee
日本電信電話株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電信電話株式会社 filed Critical 日本電信電話株式会社
Priority to PCT/JP2020/030801 priority Critical patent/WO2022034672A1/en
Priority to JP2022542553A priority patent/JP7439934B2/en
Publication of WO2022034672A1 publication Critical patent/WO2022034672A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/30 Monitoring
    • G06F 11/34 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment

Definitions

  • the present invention relates to a data processing apparatus and a data processing method.
  • Work may be carried out by remotely connecting from a local terminal (for example, at home) to an in-house terminal or virtual terminal.
  • In such cases, operation delays may occur due to communication delays or processing delays in the virtual server, gateway, and so on, which can increase operator stress and reduce productivity.
  • Known approaches include a method of measuring the load on an in-house server or network (see Non-Patent Document 1) and a method of measuring the time taken for operations on a screen accessible from a terminal (see Non-Patent Document 2).
  • However, the method of measuring the load on the in-house server or network cannot capture the processing delay actually experienced by the worker. Further, the method of measuring the time required for operations on a screen accessible from the terminal requires time and effort from the operator and has problems with the reproducibility of the operations.
  • To solve the above problems, the data processing device of the present invention includes a setting unit that receives setting information indicating which of the events occurring in the connection source terminal of a remote connection is to be detected.
  • It further includes an output processing unit that, based on the setting information and screen captures acquired at predetermined time intervals, outputs, as the operation delay time, the value of the difference between the occurrence time of the event and the acquisition time of the screen capture in which the corresponding change appears.
  • FIG. 1 is a diagram for explaining an outline of operation of a data processing device in the system of the first embodiment.
  • FIG. 2 is a diagram showing a configuration example of the system of the first embodiment.
  • FIG. 3 is a flowchart showing an example of the processing procedures of processing 1 and processing 2 performed by the system shown in FIG.
  • FIG. 4 is a flowchart showing an example of the processing procedure of the processing 3 performed by the system shown in FIG.
  • FIG. 5A is a diagram for explaining an outline of operation of the data processing device in the system of the second embodiment.
  • FIG. 5B is a flowchart showing an example of the processing procedure of processing 3 performed by the system of the second embodiment.
  • FIG. 6 is a diagram showing a configuration example of the system of the third embodiment.
  • FIG. 7 is a flowchart showing an example of the processing procedure of processing 3 performed by the system of the third embodiment.
  • FIG. 8 is a flowchart showing an example of the processing procedure of processing 4 performed by the system of the second embodiment.
  • FIG. 9 is a diagram showing a configuration example of a computer that executes a data processing program.
  • the terminal device is a local terminal (connection source terminal) that connects to another terminal device by remote connection.
  • a terminal device is connected to another terminal device via a VDI (Virtual Desktop Infrastructure) server.
  • The data processing device calculates the EtoE processing time (from when an operation is issued on the local terminal until the corresponding screen change appears on the local terminal) during a remote connection from the above terminal device to another terminal device, treating it as the operation delay time experienced by the operator.
  • the data processing device receives in advance the setting information of the operation event (event to be detected) of the terminal device used for calculating the above-mentioned EtoE processing time.
  • the setting information is described by, for example, a combination of an operation event to be detected and an image of a portion that changes on the screen of the terminal device due to the operation event.
  • the data processing device stores an operation event (input event) of the terminal device in the input event database.
  • For example, the data processing device stores the input event "A" and the occurrence time of that input event in the input event database in association with each other.
  • the data processing device acquires a screen capture of the terminal device at predetermined time intervals, and acquires a change portion by comparing the screen captures before and after. Then, the data processing device stores the changed part of the acquired screen capture and the acquisition time of the screen capture in the screen capture database.
  • the data processing device stores the screen capture, the change point of the screen capture (for example, the coordinate area of the change point "A"), and the acquisition time of the screen capture in the screen capture database.
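The change-detection step described above, comparing consecutive screen captures and recording the changed portion, can be sketched as follows. This is a minimal illustration, not the patented implementation: captures are modeled as 2D lists of pixel values, and the changed portion is reduced to a bounding box. The function name `changed_region` is hypothetical.

```python
def changed_region(prev, curr):
    """Compare two equally sized screen captures (2D lists of pixel
    values) and return the bounding box of the changed portion as
    (top, left, bottom, right), or None if nothing changed."""
    rows = [y for y in range(len(prev)) if prev[y] != curr[y]]
    if not rows:
        return None
    cols = [x for y in rows
            for x in range(len(prev[y]))
            if prev[y][x] != curr[y][x]]
    return (min(rows), min(cols), max(rows), max(cols))

# Example: a 3x4 capture in which a 1x2 area changes.
before = [[0, 0, 0, 0],
          [0, 0, 0, 0],
          [0, 0, 0, 0]]
after  = [[0, 0, 0, 0],
          [0, 1, 1, 0],
          [0, 0, 0, 0]]
print(changed_region(before, after))  # (1, 1, 1, 2)
```

In the described system, the returned coordinate area and the capture's acquisition time would be what gets stored in the screen capture database.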
  • The data processing device acquires the input event to be detected (for example, the input event "A") from the input event database. Further, the data processing device acquires the screen capture corresponding to that input event from the screen capture database. For example, the data processing device retrieves from the screen capture database the screen capture data whose acquisition time is after the occurrence time of the input event to be detected and whose changed portion contains the image "A".
  • The data processing device determines whether screen capture data in the screen capture database is a capture containing the changed portion corresponding to the input event to be detected as follows. If the image set in the setting information and the image of the changed portion can be regarded as matching in both size and font, the data processing device converts each of the two images to grayscale, binarizes them, and calculates a similarity using template matching. If the calculated similarity is equal to or higher than a set threshold, the data processing device determines that the screen capture corresponds to the event to be detected.
  • Otherwise, the data processing device converts both the image set in the setting information and the image of the changed portion to grayscale, extracts features using SIFT (Scale-Invariant Feature Transform), and, if the degree of matching of the features is equal to or higher than the threshold, determines that the capture corresponds to the input event to be detected. The data processing device then calculates the difference between the acquisition time of the screen capture and the occurrence time of the input event. That is, the data processing device calculates the EtoE processing time by treating the acquisition time of the screen capture as the time the screen is updated after the operation delay.
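The grayscale/binarize/compare step can be illustrated with a toy stand-in for the template-matching score. This sketch assumes images are 2D lists of 0-255 grayscale values and uses a simple pixel-agreement ratio instead of a real template-matching or SIFT implementation; the names `binarize` and `similarity` are hypothetical.

```python
def binarize(gray, threshold=128):
    """Binarize a grayscale image (2D list of 0-255 values) to 0/1."""
    return [[1 if px >= threshold else 0 for px in row] for row in gray]

def similarity(a, b):
    """Fraction of agreeing pixels between two equally sized binary
    images (a toy stand-in for a template-matching score)."""
    total = sum(len(row) for row in a)
    same = sum(1 for ra, rb in zip(a, b)
               for pa, pb in zip(ra, rb) if pa == pb)
    return same / total

template = binarize([[200, 30], [30, 200]])   # image set in setting info
candidate = binarize([[190, 40], [35, 210]])  # changed portion of capture
print(similarity(template, candidate))            # 1.0
print(similarity(template, candidate) >= 0.9)     # True -> treat as a match
```

In practice the text describes OpenCV-style template matching or SIFT feature matching; the thresholded-score decision shown here mirrors that logic.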
  • the system includes a terminal device 10 and a data processing device 20.
  • the number of terminal devices 10 and data processing devices 20 is not limited to the number shown in FIG.
  • the terminal device 10 is a local device that connects to another terminal device by remote connection.
  • the terminal device 10 includes an input acquisition unit 11, a screen acquisition unit 12, a screen comparison unit 13, an output unit 14, and a communication control unit 15.
  • the input acquisition unit 11 acquires an input event by an input device (for example, a keyboard, a mouse, etc.) of the terminal device 10.
  • the screen acquisition unit 12 acquires a screen capture of the terminal device 10 at predetermined time intervals.
  • the screen comparison unit 13 acquires a change portion on the screen by using a screen capture acquired by the screen acquisition unit 12 at predetermined time intervals.
  • the output unit 14 outputs the EtoE processing time output from the data processing device 20.
  • the communication control unit 15 controls communication between the terminal device 10 and the data processing device 20.
  • the data processing device 20 includes a communication control unit 21, a control unit (output processing unit) 22, and a storage unit 23.
  • the communication control unit 21 controls communication between the data processing device 20 and the terminal device 10.
  • the control unit 22 controls the entire data processing device 20.
  • the control unit 22 includes a data acquisition unit 221, a determination unit 222, a setting unit 223, a search unit 224, and a measurement unit 225.
  • The data acquisition unit 221 acquires, from the terminal device 10 via the communication control unit 21, the input events of the terminal device 10 and the changed portions on the screen obtained from its screen captures. Then, the data acquisition unit 221 stores the acquired input event data in the input event database of the storage unit 23.
  • the input event data is information indicating the input event and the occurrence time of the input event (see FIG. 1).
  • The data acquisition unit 221 stores the acquired screen capture data, including the changed portion on the screen, in the screen capture database of the storage unit 23.
  • This screen capture data is information indicating the screen capture, the change points on the screen obtained from the screen capture, and the acquisition time of the screen capture (see FIG. 1).
  • the input event data and the screen capture data may be held in the memory of the data processing device 20, respectively.
  • The determination unit 222 determines whether the input event of the terminal device 10 and the changed portion of the screen obtained from the screen capture satisfy the content of the setting information set by the setting unit 223.
  • the setting unit 223 accepts the input of setting information.
  • the setting information is described by, for example, a combination of an input event generated in the terminal device and an image (recognition target image) of a portion changed on the screen of the terminal device due to the input event.
  • the setting information is input, for example, via an input unit (not shown) of the data processing device 20, and is stored in a predetermined area of the storage unit 23.
  • the search unit 224 accesses the input event database and the screen capture database, and searches for the screen capture corresponding to the input event.
  • the measurement unit 225 measures the value of the difference between the occurrence time of the input event searched by the search unit 224 and the acquisition time of the screen capture corresponding to the input event. Then, the measurement unit 225 outputs the measured difference value as the EtoE processing time.
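The measurement unit's computation is a simple time difference. As a hedged sketch (the function name `etoe_delay` is illustrative, not from the source), the EtoE processing time is the screen-capture acquisition time minus the input-event occurrence time:

```python
from datetime import datetime

def etoe_delay(event_time: datetime, capture_time: datetime) -> float:
    """EtoE processing time in milliseconds: screen-capture acquisition
    time minus input-event occurrence time."""
    return (capture_time - event_time).total_seconds() * 1000.0

t_event = datetime(2020, 8, 13, 10, 0, 0, 0)
t_capture = datetime(2020, 8, 13, 10, 0, 0, 120000)  # 120 ms later
print(etoe_delay(t_event, t_capture))  # 120.0
```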
  • the storage unit 23 includes an input event database and a screen capture database in a predetermined area.
  • the input event database is a database that stores input events acquired by the data acquisition unit 221.
  • The screen capture database is a database that stores the screen capture data, including the changed portions, acquired by the data acquisition unit 221.
  • The function of the data processing device 20 may be installed in the terminal device 10. In this case, the communication control unit 21 of the data processing device 20 and the communication control unit 15 of the terminal device 10 are unnecessary.
  • the process 1 is a process in which the system stores the input event acquired by the terminal device 10 in the input event database.
  • the process 2 is a process in which the system stores the screen capture data acquired by the terminal device 10 in the screen capture database.
  • the process 3 is a process in which the system obtains the EtoE processing time based on the input event data and the screen capture data obtained in the processes 1 and 2.
  • Process 1 and process 2 are processes performed in parallel in the background of the terminal device 10. Further, the process 3 is started, for example, at the timing instructed by the user of the data processing device 20.
  • It is assumed that the setting unit 223 of the data processing device 20 has accepted the input of the setting information of the event to be detected in advance.
  • the terminal device 10 acquires an input event of the terminal device 10 (S11). For example, the terminal device 10 acquires the key input in the terminal device 10 and the time when the input occurs as an input event. After that, when the data processing device 20 acquires the input event of the terminal device 10, the data processing device 20 stores the data (input event data) indicating the acquired input event in the input event database (S12).
  • After that, if the user of the terminal device 10 does not stop the process and the terminal device 10 has not shut down (No in S13), the process returns to S11. On the other hand, if the user has stopped the process or the terminal device 10 has shut down (Yes in S13), the process ends.
  • The terminal device 10 captures its screen at predetermined time intervals (for example, every 50 ms). Then, the terminal device 10 compares the acquired screen captures, and if there is a changed portion between the previous and current screens, acquires that portion (S21). After that, when the data processing device 20 acquires the screen capture data, including the changed portion acquired in S21, from the terminal device 10, it stores the acquired screen capture data in the screen capture database (S22). After that, if the user of the terminal device 10 does not stop the process and the terminal device 10 has not shut down (No in S23), the process returns to S21. On the other hand, if the user has stopped the process or the terminal device 10 has shut down (Yes in S23), the process ends.
  • the data processing device 20 accesses the input event database and acquires the occurrence time of the input event to be detected.
  • the data processing device 20 accesses the screen capture database and searches for the screen capture data of the detection target that first occurs after the occurrence time of the input event of the detection target.
  • the data processing device 20 obtains a value of the difference between the occurrence time of the input event to be detected and the acquisition time of the screen capture in the searched screen capture data.
  • the data processing device 20 outputs the value of the difference as the processing time of EtoE.
  • the data processing device 20 accesses the input event database, selects the target input event data (S31), and determines whether or not the selected target input event data is an input event to be detected (S32). For example, the data processing device 20 selects one input event from the input event database, and determines whether or not the selected input event is an input event to be detected indicated in the setting information.
  • Next, the data processing device 20 accesses the screen capture database and selects the target screen capture event (S33). For example, the data processing device 20 selects one screen capture data entry from the screen capture database.
  • The data processing device 20 determines whether the target screen capture event selected in S33 is the screen capture data to be detected and occurred after the occurrence time of the target input event (S34). For example, the data processing device 20 determines whether the image of the changed portion in the screen capture data selected in S33 matches the image to be detected (recognition target image), and whether the acquisition time of the screen capture in that data is after the occurrence time of the target input event.
  • When the data processing device 20 determines that the target screen capture data selected in S33 is the screen capture data to be detected and occurred after the occurrence time of the target input event (Yes in S34), it calculates the value of the difference between the occurrence time of the target input event and the acquisition time of the target screen capture event (S35). Then, the data processing device 20 outputs the value of the difference calculated in S35 (S36). After that, when the data processing device 20 determines that all the input events in the input event database have been confirmed (Yes in S38), the process ends. On the other hand, when any input event in the input event database is unconfirmed (No in S38), the process returns to S31.
  • When the data processing device 20 determines that the target screen capture data selected in S33 is not the screen capture data to be detected, or that it occurred before the occurrence time of the target input event (No in S34), the process proceeds to S37. Then, when the data processing device 20 determines that all the screen capture data in the screen capture database has been confirmed (Yes in S37), the process proceeds to S38. On the other hand, when any of the screen capture data in the screen capture database is unconfirmed (No in S37), the process returns to S33.
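The S31-S38 loop above can be sketched as a nested search: for each detection-target input event, find the first detection-target capture acquired after it and output the time difference. This is a simplified model, assuming events and captures are dicts with integer timestamps and that captures are ordered by acquisition time; all names here (`find_etoe_times`, the dict keys) are illustrative.

```python
def find_etoe_times(input_events, captures, is_target_event, is_target_capture):
    """For each detection-target input event, find the first matching
    screen capture acquired after it (captures assumed time-ordered)
    and collect the time difference, mirroring steps S31-S38."""
    results = []
    for event in input_events:
        if not is_target_event(event):        # S32
            continue
        for cap in captures:                  # S33-S34
            if is_target_capture(cap) and cap["time"] > event["time"]:
                results.append(cap["time"] - event["time"])  # S35-S36
                break
    return results

events = [{"key": "A", "time": 100}, {"key": "B", "time": 300}]
captures = [{"change": "A", "time": 150}, {"change": "B", "time": 420}]
delays = find_etoe_times(events, captures,
                         lambda e: e["key"] == "A",
                         lambda c: c["change"] == "A")
print(delays)  # [50]
```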
  • the system can calculate the processing time of EtoE in the terminal device 10 to be remotely connected.
  • The case where the determination of whether the input event and the screen capture data are detection targets is performed in process 3 has been described above, but the present invention is not limited to this.
  • the system may determine whether or not the input event is the detection target in the process 1, and may determine whether or not the screen capture data is the detection target in the process 2. In this case, the system does not need to perform the determination process of whether or not the input event and the screen capture data are the detection targets in the process 3.
  • the system may calculate the above-mentioned EtoE processing time by using an event related to a change in the appearance of a button due to a mouse click of the terminal device 10.
  • the embodiment in this case will be described as the second embodiment.
  • the same configurations as those of the first embodiment are designated by the same reference numerals, and the description thereof will be omitted.
  • In the second embodiment, an input event related to a mouse operation (for example, a mouse click) is set as the event to be detected in the setting information.
  • the data processing device 20 acquires the screen capture of the terminal device 10 at predetermined time intervals, and acquires the changed portion by comparing the screen captures before and after. Then, the data processing device 20 stores the changed part of the acquired screen capture and the acquisition time of the screen capture in the screen capture database.
  • For example, when the appearance of a "Save" button changes, the data processing device 20 stores the screen capture at that time, the changed portion of that screen capture (for example, the coordinate area of the Save button), and the acquisition time of the screen capture in the screen capture database.
  • the data processing device 20 finds the input event to be detected shown in the setting information from the input event database. Then, the data processing device 20 acquires the place where the input event occurs and the time when the input event occurs.
  • the data processing device 20 accesses the screen capture database and searches for the screen capture data in which the screen change occurs before and after the occurrence time of the input event to be detected. Then, the data processing device 20 determines whether or not the screen capture data corresponds to the input event to be detected. For example, the data processing device 20 determines whether or not the location where the input event (mouse click) to be detected is generated is included in the region of the change location in the searched screen capture data.
  • When the data processing device 20 determines that the location where the mouse click occurred is included in the region of the changed portion in the retrieved screen capture data (for example, the coordinate region of the Save button), it determines that the screen capture data corresponds to the input event to be detected.
  • the data processing device 20 acquires the time when the screen capture occurs in the screen capture data. After that, the data processing device 20 calculates the value of the difference between the occurrence time of the input event and the acquisition time of the screen capture. Then, the data processing device 20 outputs the value of the difference as the processing time of EtoE.
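The click-to-change correspondence test described above reduces to a point-in-rectangle check. As a hedged sketch (coordinates and the `click_matches_change` name are illustrative assumptions, not from the source):

```python
def click_matches_change(click_xy, region):
    """Return True if the mouse-click coordinates fall inside the
    changed region of a screen capture, given as (left, top, right,
    bottom) in screen coordinates."""
    x, y = click_xy
    left, top, right, bottom = region
    return left <= x <= right and top <= y <= bottom

save_button_region = (40, 200, 120, 230)  # hypothetical coordinate area
print(click_matches_change((65, 215), save_button_region))  # True
print(click_matches_change((10, 10), save_button_region))   # False
```

When the check succeeds, the capture's acquisition time is paired with the click's occurrence time to obtain the EtoE processing time, as described in the surrounding text.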
  • a cursor coordinate area in the click operation may be set in the setting information as an event to be detected.
  • In this case, the data processing device 20 searches the input event database for an input event related to a click operation within the cursor coordinate area indicated in the setting information as the input event to be detected. Then, the data processing device 20 acquires the coordinate position of the cursor in the input event and the occurrence time of the input event. Subsequent processing is the same as process 3 described above, so its description is omitted.
  • the data processing device 20 accesses the input event database, selects the target input event data (S41), and determines whether or not the selected target input event data is an input event to be detected (S42). For example, the data processing device 20 selects one input event from the input event database, and determines whether or not the selected input event is an input event to be detected indicated in the setting information.
  • Next, the data processing device 20 accesses the screen capture database and acquires the screen captures that occurred before and after the target input event (S43).
  • For example, the data processing device 20 acquires from the screen capture database the screen capture data whose acquisition times are immediately before and after the occurrence time of the target input event.
  • the data processing device 20 determines whether or not the target input event is included in the change points of the screen capture before and after the target input event (S44). For example, the data processing device 20 determines whether or not the coordinate position of the cursor in the target input event is included in the area of the change portion in the screen capture data acquired in S43.
  • When the data processing device 20 determines that the cursor coordinates of the target input event are included in the changed portions of the screen captures before and after the target input event (Yes in S44), it calculates the difference between the occurrence time of the target input event and the acquisition time of the target screen capture event (S45).
  • The acquisition time of this target screen capture event is, for example, the acquisition time of the screen capture in the later of the screen capture data entries determined in S44 to include the target input event in their changed portions.
  • the data processing device 20 outputs the value of the difference calculated in S45 (S46). After that, when it is determined that the data processing device 20 has confirmed all the input events in the input event database (Yes in S47), the processing ends. On the other hand, when the data processing device 20 determines that any input event in the input event database is unconfirmed (No in S47), the process returns to S41.
  • When the data processing device 20 determines that the target input event is not an input event to be detected (No in S42), the process proceeds to S47. Further, when the data processing device 20 determines that the cursor coordinate position of the target input event is not included in the changed portions of the screen captures before and after the target input event (No in S44), the process proceeds to S47.
  • the system can calculate the EtoE processing time using the event related to the mouse operation of the terminal device 10.
  • The case where the determination of whether the input event is a detection target is performed in process 3 has been described above, but the present invention is not limited to this.
  • the system may determine whether or not the input event is a detection target in the process 1. In this case, the system does not need to perform the determination process of whether or not the input event is the detection target in the process 3.
  • the system may calculate the EtoE processing time by using an event related to a change in the window state of the local terminal as an event to be detected.
  • an event related to a change in the window state to be detected (for example, the active window is closed, etc.) is set in the setting information used by the system.
  • the embodiment in this case will be described as a third embodiment.
  • the same configurations as those of the above-described embodiments are designated by the same reference numerals, and the description thereof will be omitted.
  • the system includes a terminal device 10, a terminal device 10a, and a data processing device 20a.
  • the terminal device 10a is a terminal device (connection destination terminal) that is remotely connected from the terminal device 10 and provides a remote environment to the terminal device 10.
  • the terminal device 10a includes a window state acquisition unit 16.
  • the window state acquisition unit 16 acquires information on the window and process of the terminal device 10 in the remote environment at predetermined time intervals. Then, if there is a change in the window or process, the window state acquisition unit 16 acquires the window state at the time of the change and the time when the change occurs.
  • the control unit 22a of the data processing device 20a includes a data acquisition unit 221a, a determination unit 222a, a setting unit 223a, a search unit 224a, and a measurement unit 225a.
  • The data acquisition unit 221a acquires input events and changed portions on the screen obtained from screen captures from the terminal device 10 via the communication control unit 21. Further, the data acquisition unit 221a acquires, from the terminal device 10a via the communication control unit 21, input events, changed portions on the screen obtained from screen captures, and the window state data acquired by the window state acquisition unit 16 (the window state at the time of the change and the time when the change occurred).
  • The data acquisition unit 221a stores the input event data of the terminal device 10 and the terminal device 10a in the input event database of the storage unit 23. Further, the data acquisition unit 221a stores the screen capture data, including the changed portions on the screen, in the screen capture database of the storage unit 23. Further, the data acquisition unit 221a stores the window state data in the window state database.
  • the determination unit 222a determines whether or not the window state of the terminal device 10a satisfies the content of the setting information set by the setting unit 223. For example, the determination unit 222a determines whether or not the active window is closed by the terminal device 10a.
  • the setting unit 223a accepts the input of setting information.
  • An event in the window state to be detected (for example, the active window is closed, etc.) is set in the setting information.
  • the search unit 224a accesses the input event database and the screen capture database, and searches for the input event of the terminal device 10 and the terminal device 10a corresponding to the window state and the screen capture of the terminal device 10 and the terminal device 10a.
  • the measurement unit 225a measures the value of the difference between the occurrence time of the input event searched by the search unit 224a and the acquisition time of the screen capture corresponding to the input event. Then, the measurement unit 225a outputs the measured difference as the EtoE processing time.
  • the storage unit 23a includes an input event database, a screen capture database, and a window state database in a predetermined area.
  • the input event database is a database that stores the input event data of the terminal device 10 and the terminal device 10a acquired by the data acquisition unit 221a.
  • The screen capture database is a database that stores the screen capture data of the terminal device 10 and the terminal device 10a acquired by the data acquisition unit 221a.
  • the window state database is a database that stores the window state data acquired from the terminal device 10a by the data acquisition unit 221a.
  • Process 1 is a process in which the system stores input events acquired from each of the terminal device 10 and the terminal device 10a in the input event database.
  • the process 2 is a process in which the system stores the screen capture data acquired from each of the terminal device 10 and the terminal device 10a in the screen capture database.
  • the processes 1 and 2 are processes corresponding to the processes 1 and 2 described with reference to FIG.
  • Process 3 is a process in which the system stores the window state data acquired by the terminal device 10a in the window state database.
  • the process 4 is a process in which the system calculates the EtoE processing time based on the input event data, the screen capture data, and the window state data obtained in the processes 1, 2, and 3.
  • Processes 1 and 2 are processes performed in parallel in the background of the terminal device 10.
  • the processes 1, 2, and 3 are processes performed in parallel in the background of the terminal device 10a.
  • the process 4 is started, for example, at the timing instructed by the user of the data processing device 20a.
  • the setting unit 223a of the data processing device 20a shall accept the input of the setting information of the event to be detected in advance.
  • in process 3, the system acquires window and process information from the terminal device 10a to which the terminal device 10 is connected at predetermined time intervals, and if there is a change in a window or process, stores window state data indicating the change in the window state database. For example, if the states of the active windows of the terminal device 10a differ before and after an acquisition and the process of the previous active window no longer exists, the system determines that the window state of the terminal device 10a is "the state in which the active window is closed". The system then stores the above window state and the acquisition time of the window state in the window state database as window state data.
  • the window state acquisition unit 16 of the terminal device 10a acquires information on the active window and its process at predetermined time intervals (S51 in FIG. 7), and if the information has changed between acquisitions (Yes in S52), acquires window state data. When the data acquisition unit 221a of the data processing device 20a acquires the window state data via the communication control unit 21, it stores the acquired window state data in the window state database (S53). After that, if the user of the terminal device 10a has not stopped the process and the terminal device 10a has not been shut down (No in S54), the process returns to S51. On the other hand, if the user of the terminal device 10a has stopped the process or the terminal device 10a has been shut down (Yes in S54), the process is terminated.
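For illustration only, the S51-S54 polling loop above can be sketched as follows. Window snapshots are modeled here as dicts mapping a window id to its process name; the snapshot representation, helper names, and record layout are assumptions introduced for this sketch, not part of the embodiment.

```python
from datetime import datetime

def detect_window_state(prev, curr):
    """Classify the change between two snapshots; None means no change (S52: No)."""
    if set(prev) - set(curr):
        # A previously present window (and its process entry) is gone:
        # the embodiment's example state "the active window is closed".
        return "active window closed"
    if set(curr) - set(prev):
        return "window opened"
    return None

def poll_once(prev, curr, window_state_db):
    """One S51-S53 iteration: compare snapshots and store any change with its time."""
    state = detect_window_state(prev, curr)
    if state is not None:
        window_state_db.append({"state": state,
                                "time": datetime.now().isoformat()})
    return window_state_db

db = []
poll_once({"w1": "notepad.exe"}, {}, db)  # the active window w1 disappeared
print(db[0]["state"])  # active window closed
```

A real acquisition unit would obtain the snapshots from the OS window manager and repeat this check until the user stops the process (S54).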
  • the data processing device 20a accesses the window state database and acquires the window state data of the window state to be detected (target window state data). Based on the acquired target window state data, the data processing device 20a then acquires, from the input event database, the input event data of the input event of the terminal device 10a (the target input event on the connection destination terminal side) that occurred immediately before the acquisition time of the window state to be detected.
  • the data processing device 20a acquires, from the screen capture database, the screen capture data of the screen capture event of the terminal device 10a (the target screen capture event on the connection destination terminal side) that occurs immediately after the acquisition time of the window state shown in the target window state data.
  • the data processing device 20a acquires the input event data of the target input event on the connection source terminal side corresponding to the target input event on the connection destination terminal side from the input event database.
  • based on the time-series transition order of the input events of the terminal device 10 and the terminal device 10a in the input event database, the data processing device 20a acquires the input event data of the target input event on the connection source terminal side (terminal device 10 side) corresponding to the target input event on the connection destination terminal side (terminal device 10a side).
  • from the screen capture data of the terminal device 10 in the screen capture database, the data processing device 20a finds the screen capture data whose acquisition time is after the occurrence time of the target input event on the terminal device 10 side and which is most similar to (or has a similarity equal to or greater than a threshold with) the screen capture data of the target screen capture event on the terminal device 10a side. The data processing device 20a treats the found screen capture data as the screen capture data of the target screen capture event on the terminal device 10 side. Finally, the data processing device 20a outputs, as the EtoE processing time, the value of the difference between the occurrence time of the target input event on the terminal device 10 side and the acquisition time of the target screen capture event on the terminal device 10 side.
  • the data processing device 20a can calculate the EtoE processing time by using the change in the window state of the terminal device 10a as an event to be detected.
  • the data processing device 20a accesses the window state database, selects window state data (S61), and determines whether it represents a detection target event (S62). For example, the data processing device 20a selects one piece of window state data from the window state database and determines whether the window state indicated by the selected window state data is the window state to be detected indicated in the setting information. When the data processing device 20a determines that it is a detection target event (Yes in S62), it acquires, from the input event database, the input event data of the input event on the connection destination terminal side (terminal device 10a side) that occurred immediately before the detection target event (S63). The input event on the connection destination terminal side that occurs immediately before this detection target event is called the target input event on the connection destination terminal side.
  • the data processing device 20a finds the input event data of the input event on the local terminal side (terminal device 10 side) corresponding to the target input event on the connection destination terminal side (S64).
  • based on the time-series transition order of the input events of the terminal device 10 and the terminal device 10a in the input event database, the data processing device 20a finds the input event data of the input event corresponding to the target input event on the terminal device 10a side. For example, the data processing device 20a may find, in the input event database, the input event data of the input event on the terminal device 10 side whose occurrence time is immediately after the occurrence time of the target input event on the terminal device 10a side.
  • the data processing device 20a acquires, from the screen capture database, the screen capture data on the connection destination terminal side generated immediately after the target window state data (that is, immediately after the acquisition time of the window state shown in the target window state data) (S65).
  • from the screen capture data of the terminal device 10 in the screen capture database, the data processing device 20a finds the screen capture data of a screen capture on the local terminal side (terminal device 10 side) having a similarity equal to or higher than a threshold with the screen capture data on the connection destination terminal side acquired in S65 (S66).
  • the data processing device 20a determines whether the acquisition time of the screen capture in the screen capture data on the local terminal side found in S66 is after the occurrence time of the input event in the input event data on the local terminal side found in S64 (S67: is the acquisition time of the screen capture on the local terminal side after the occurrence time of the input event on the local terminal side?).
  • when the data processing device 20a determines in S67 that the acquisition time of the screen capture in the screen capture data on the local terminal side found in S66 is after the occurrence time of the input event in the input event data on the local terminal side found in S64 (Yes in S67), the process proceeds to S68. The data processing device 20a then obtains the value of the difference between the occurrence time of the input event on the local terminal side found in S64 and the acquisition time of the screen capture on the local terminal side found in S66 (S68), outputs the value of the difference obtained in S68 (S69), and proceeds to S70. On the other hand, when the data processing device 20a determines in S67 that the acquisition time of the screen capture on the local terminal side found in S66 is before the occurrence time of the input event on the local terminal side found in S64 (No in S67), the process proceeds to S70.
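For illustration only, the S61-S70 flow can be condensed into the following sketch, with in-memory lists standing in for the window state, input event, and screen capture databases. The record layouts, the numeric timestamps, and the similarity helper are assumptions introduced for this sketch, not values from the embodiment.

```python
def eto_e_times(window_states, target_state, remote_events, local_events,
                remote_caps, local_caps, similar, threshold=0.9):
    """Return EtoE processing times for every detection target event found."""
    results = []
    for ws in window_states:                                            # S61
        if ws["state"] != target_state:                                 # S62: No
            continue
        # S63: last connection-destination input event before the window state's time
        remote_ev = max((e for e in remote_events if e["time"] < ws["time"]),
                        key=lambda e: e["time"], default=None)
        if remote_ev is None:
            continue
        # S64: first local input event occurring after the remote one
        local_ev = min((e for e in local_events if e["time"] > remote_ev["time"]),
                       key=lambda e: e["time"], default=None)
        # S65: first connection-destination capture after the window state's time
        remote_cap = min((c for c in remote_caps if c["time"] > ws["time"]),
                         key=lambda c: c["time"], default=None)
        if local_ev is None or remote_cap is None:
            continue
        # S66: local capture whose similarity to the remote capture clears the threshold
        local_cap = next((c for c in local_caps
                          if similar(c["image"], remote_cap["image"]) >= threshold),
                         None)
        # S67/S68: the capture must come after the local event; the difference is EtoE
        if local_cap and local_cap["time"] > local_ev["time"]:
            results.append(local_cap["time"] - local_ev["time"])        # S69
    return results

demo = eto_e_times(
    [{"state": "active window closed", "time": 5.0}], "active window closed",
    [{"time": 4.0}], [{"time": 4.5}],
    [{"time": 6.0, "image": "X"}], [{"time": 7.0, "image": "X"}],
    lambda a, b: 1.0 if a == b else 0.0)
print(demo)  # [2.5]
```

In the demo, the local input event at t=4.5 and the matching local capture at t=7.0 yield an EtoE processing time of 2.5.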
  • the data processing device 20a can calculate the EtoE processing time by using the change in the window state of the terminal device 10a as an event to be detected.
  • the connection destination terminal device may be a physically provided terminal device or a virtual machine provided by VDI.
  • the terminal device of the connection destination is, for example, a terminal device installed in the company.
  • the connection from the terminal device 10 to the terminal device of the connection destination may use, for example, an in-house LAN (Local Area Network) via a VPN (Virtual Private Network) or the like.
  • each component of each of the illustrated devices is a functional concept, and does not necessarily have to be physically configured as shown in the figures. That is, the specific form of distribution and integration of each device is not limited to the one shown in the figures, and all or part of each device can be functionally or physically distributed or integrated in arbitrary units according to various loads and usage conditions. Further, each processing function performed by each device can be realized, in whole or in arbitrary part, by a CPU (Central Processing Unit) and a program executed by the CPU, or as hardware by wired logic.
  • the system of each of the above-described embodiments can be implemented by installing a data processing program on a desired computer as packaged software or online software.
  • the information processing apparatus can function as the system of each embodiment.
  • the information processing device referred to here includes a desktop type or notebook type personal computer.
  • information processing devices include smartphones, mobile communication terminals such as mobile phones and PHS (Personal Handyphone System), and slate terminals such as PDAs (Personal Digital Assistants).
  • the data processing device of each embodiment can also be implemented as a server device that treats the terminal device used by the user as a client and provides the client with services related to the above processing.
  • the server device may be implemented as a Web server, or may be implemented as a cloud that provides services related to the above processing by outsourcing.
  • FIG. 9 is a diagram showing an example of a computer that executes a data processing program.
  • the computer 1000 has, for example, a memory 1010 and a CPU 1020.
  • the computer 1000 also has a hard disk drive interface 1030, a disk drive interface 1040, a serial port interface 1050, a video adapter 1060, and a network interface 1070. Each of these parts is connected by a bus 1080.
  • the memory 1010 includes a ROM (Read Only Memory) 1011 and a RAM (Random Access Memory) 1012.
  • the ROM 1011 stores, for example, a boot program such as a BIOS (Basic Input Output System).
  • the hard disk drive interface 1030 is connected to the hard disk drive 1090.
  • the disk drive interface 1040 is connected to the disk drive 1100.
  • a removable storage medium such as a magnetic disk or an optical disk is inserted into the disk drive 1100.
  • the serial port interface 1050 is connected to, for example, a mouse 1110 and a keyboard 1120.
  • the video adapter 1060 is connected to, for example, the display 1130.
  • the hard disk drive 1090 stores, for example, the OS 1091, the application program 1092, the program module 1093, and the program data 1094. That is, the program that defines each process executed by the system of each embodiment is implemented as a program module 1093 in which a code that can be executed by a computer is described.
  • the program module 1093 is stored in, for example, the hard disk drive 1090.
  • a program module 1093 for executing a process similar to the functional configuration in the system of each embodiment is stored in the hard disk drive 1090.
  • the hard disk drive 1090 may be replaced by an SSD.
  • the setting information used in the processing of the above-described embodiment is stored as program data 1094 in, for example, a memory 1010 or a hard disk drive 1090. Then, the CPU 1020 reads the program module 1093 and the program data 1094 stored in the memory 1010 and the hard disk drive 1090 into the RAM 1012 and executes them as needed.
  • the program module 1093 and the program data 1094 are not limited to those stored in the hard disk drive 1090, but may be stored in, for example, a removable storage medium and read by the CPU 1020 via the disk drive 1100 or the like. Alternatively, the program module 1093 and the program data 1094 may be stored in another computer connected via a network (LAN (Local Area Network), WAN (Wide Area Network), etc.). Then, the program module 1093 and the program data 1094 may be read from another computer by the CPU 1020 via the network interface 1070.


Abstract

The present invention provides a data processing device that receives setting information indicating an event to be detected among events that occur in the terminal device that is the connection source of a remote connection. Thereafter, when the data processing device determines, on the basis of the setting information and screen captures of the terminal device acquired at predetermined time intervals, that the event to be detected has caused a change in the screen of the terminal device, the data processing device outputs, as an operation delay time, the value of the difference between the time at which the event occurred and the acquisition time of the screen capture in which the change occurred.

Description

Data processing device and data processing method
 The present invention relates to a data processing device and a data processing method.
 In order to improve work styles and as a measure against infectious diseases, work may be carried out by remotely connecting from a local terminal, such as one at home, to an in-house terminal or a virtual terminal. In such remote connections, operation delays may occur due to communication delays and the processing delays of virtual servers, gateways, and the like, which can increase operator stress and reduce productivity.
 Here, in order to detect the occurrence of operation delay, there are a method of measuring the load on an in-house server or network (see Non-Patent Document 1) and a method of measuring the time taken for an operation on a screen accessible from a terminal (see Non-Patent Document 2).
 However, the method of measuring the load on an in-house server or network cannot grasp the processing delay experienced by the worker. Further, the method of measuring the time taken for an operation on a screen accessible from a terminal requires effort from the operator and has a problem with the reproducibility of the operation.
 Therefore, it is an object of the present invention to solve the above-mentioned problems and to automatically measure the operation delay time experienced by the operator in a remote connection.
 In order to solve the above-mentioned problems, the present invention comprises: a setting unit that receives setting information indicating an event to be detected among events that occur in the connection source terminal of a remote connection; and an output processing unit that, when it is determined, on the basis of the setting information and captures of the screen of the connection source terminal acquired at predetermined time intervals, that the event to be detected has caused a change in the screen of the connection source terminal, outputs the value of the difference between the occurrence time of the event and the acquisition time of the screen capture in which the change occurred, as an operation delay time.
 According to the present invention, it is possible to automatically measure the operation delay time experienced by the operator in a remote connection.
FIG. 1 is a diagram for explaining an outline of the operation of the data processing device in the system of the first embodiment.
FIG. 2 is a diagram showing a configuration example of the system of the first embodiment.
FIG. 3 is a flowchart showing an example of the processing procedures of process 1 and process 2 performed by the system shown in FIG. 2.
FIG. 4 is a flowchart showing an example of the processing procedure of process 3 performed by the system shown in FIG. 2.
FIG. 5A is a diagram for explaining an outline of the operation of the data processing device in the system of the second embodiment.
FIG. 5B is a flowchart showing an example of the processing procedure of process 3 performed by the system of the second embodiment.
FIG. 6 is a diagram showing a configuration example of the system of the third embodiment.
FIG. 7 is a flowchart showing an example of the processing procedure of process 3 performed by the system of the third embodiment.
FIG. 8 is a flowchart showing an example of the processing procedure of process 4 performed by the system of the second embodiment.
FIG. 9 is a diagram showing a configuration example of a computer that executes a data processing program.
 Hereinafter, embodiments for carrying out the present invention will be described with reference to the drawings. The present invention is not limited to these embodiments.
[First Embodiment]
[Overview]
 An outline of the operation of the data processing device in the system of the first embodiment will be described with reference to FIG. 1.
 In the following description, the terminal device is a local terminal (connection source terminal) that connects to another terminal device by remote connection. For example, the terminal device is connected to another terminal device via a VDI (Virtual Desktop Infrastructure) server. The data processing device calculates the EtoE processing time (from an operation issued on the local terminal until the corresponding screen change occurs on that local terminal) when the terminal device remotely connects to another terminal device, as the operation delay time experienced by the operator of the local device.
 Here, the data processing device receives in advance the setting information of the operation event of the terminal device (the event to be detected) used for calculating the above EtoE processing time. The setting information is described by, for example, a combination of an operation event to be detected and an image of the portion that changes on the screen of the terminal device due to the operation event.
 Here, as an example, a case will be described in which the key event "A" and the images "A" and "あ" shown in FIG. 1 are set in the setting information as the event to be detected. Since the appearance (for example, display size, font, and color) of the image corresponding to a key event changes depending on the application used on the terminal device 10, it is preferable to set the various images that can be expected.
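For illustration only, the setting information described above might take a shape like the following; the event key and the image file names are hypothetical placeholders introduced for this sketch, not values from the embodiment.

```python
# Hypothetical shape of the detection-target setting information: an input
# event paired with the candidate images it may produce on screen.
detection_settings = [
    {
        "event": "key:A",
        # Several renderings are registered because display size, font, and
        # color vary with the application in use on the terminal device.
        "images": ["A_mincho_12pt.png", "A_gothic_10pt.png", "hiragana_a.png"],
    },
]
print(detection_settings[0]["event"])  # key:A
```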
<Process 1>
 For example, the data processing device stores operation events (input events) of the terminal device in the input event database. For example, when "A" is input to the terminal device, the data processing device stores the input event of "A" and the occurrence time of the input event in the input event database in association with each other.
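Process 1 can be sketched, for illustration only, as follows; SQLite and the (event, occurred_at) table layout are assumptions introduced here, not part of the embodiment.

```python
import sqlite3
from datetime import datetime

# An in-memory table standing in for the input event database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE input_events (event TEXT, occurred_at TEXT)")

def store_input_event(conn, event, occurred_at=None):
    """Store one input event together with its occurrence time."""
    ts = occurred_at or datetime.now().isoformat(timespec="milliseconds")
    conn.execute("INSERT INTO input_events VALUES (?, ?)", (event, ts))
    conn.commit()
    return ts

store_input_event(conn, "A", "2020-08-13T10:00:00.000")
row = conn.execute("SELECT event, occurred_at FROM input_events").fetchone()
print(row)  # ('A', '2020-08-13T10:00:00.000')
```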
<Process 2>
 Further, for example, the data processing device acquires a screen capture of the terminal device at predetermined time intervals and obtains the changed portion by comparing successive screen captures. The data processing device then stores the changed portion of the acquired screen capture and the acquisition time of the screen capture in the screen capture database. For example, the data processing device stores the screen capture, the changed portion of the screen capture (for example, the coordinate region of the changed portion "A"), and the acquisition time of the screen capture in the screen capture database.
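The comparison step of process 2 can be sketched as follows; captures are modeled as 2-D lists of pixel values purely for illustration, and real captures would be bitmap images.

```python
def changed_region(before, after):
    """Return (top, left, bottom, right) of the differing pixels, or None."""
    rows = [r for r in range(len(before)) if before[r] != after[r]]
    if not rows:
        return None  # no change between the two captures
    cols = [c for r in rows for c in range(len(before[r]))
            if before[r][c] != after[r][c]]
    return (min(rows), min(cols), max(rows), max(cols))

before = [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
after  = [[0, 0, 0], [0, 1, 1], [0, 0, 0]]
print(changed_region(before, after))  # (1, 1, 1, 2)
```

The returned box corresponds to the "coordinate region of the changed portion" stored with the capture and its acquisition time.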
<Process 3>
 After that, the data processing device acquires the input event to be detected (for example, the input event of "A") from the input event database. The data processing device also acquires, from the screen capture database, the screen capture corresponding to the input event to be detected (the screen capture of the detection target). For example, the data processing device acquires, from the screen capture database, screen capture data whose acquisition time is at or after the occurrence time of the input event to be detected and whose changed portion shows the image "A" or "あ".
 The data processing device can determine whether screen capture data in the screen capture database contains the changed portion corresponding to the input event to be detected as follows. When the image set in the setting information and the image of the changed portion are expected to match in both size and font, the data processing device converts each of them into a grayscale image, binarizes them, and calculates their similarity using template matching; if the calculated similarity is equal to or greater than a set threshold, the data processing device judges the screen capture to correspond to the event to be detected. When the image set in the setting information and the image of the changed portion are expected to match in font but differ in size, the data processing device converts both into grayscale, extracts features using SIFT (Scale-Invariant Feature Transform), and judges the screen capture to correspond to the input event to be detected if the degree of feature matching is equal to or greater than a threshold.
 The data processing device then calculates the difference between the acquisition time of the acquired screen capture and the occurrence time of the input event. That is, the data processing device treats the acquisition time of the screen capture as the screen reflection time after the operation delay, and calculates the EtoE processing time.
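The binarize-then-compare idea of process 3 can be sketched with standard-library Python as follows. A practical implementation would use template matching or SIFT from an image library such as OpenCV; the pixel-grid representation, the 128 binarization threshold, and the 0.9 similarity threshold here are illustrative assumptions.

```python
def binarize(gray, threshold=128):
    """Convert a 2-D list of grayscale values to 0/1 at the given threshold."""
    return [[1 if px >= threshold else 0 for px in row] for row in gray]

def match_score(a, b):
    """Fraction of pixels equal between two same-sized binary images."""
    total = sum(len(row) for row in a)
    equal = sum(pa == pb for ra, rb in zip(a, b) for pa, pb in zip(ra, rb))
    return equal / total

# Image registered in the setting information vs. the capture's changed portion.
target  = binarize([[200, 10], [10, 200]])
changed = binarize([[255, 0], [0, 255]])
score = match_score(target, changed)
print(score)         # 1.0
print(score >= 0.9)  # True: the capture is judged to correspond to the event
```

Once a capture clears the threshold, the EtoE processing time is simply the capture's acquisition time minus the input event's occurrence time.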
 This allows the data processing device to automatically calculate the operation delay time experienced by the operator of the terminal device when the terminal device makes a remote connection.
[Configuration example]
 Next, a configuration example of a system including the data processing device will be described with reference to FIG. 2. The system includes a terminal device 10 and a data processing device 20. The numbers of terminal devices 10 and data processing devices 20 are not limited to those shown in FIG. 2.
<Terminal device>
 First, the terminal device 10 will be described. As described above, the terminal device 10 is a local device that connects to another terminal device by remote connection. The terminal device 10 includes an input acquisition unit 11, a screen acquisition unit 12, a screen comparison unit 13, an output unit 14, and a communication control unit 15.
 The input acquisition unit 11 acquires input events from an input device (for example, a keyboard or a mouse) of the terminal device 10.
 The screen acquisition unit 12 acquires a screen capture of the terminal device 10 at predetermined time intervals. The screen comparison unit 13 obtains changed portions on the screen using the screen captures acquired by the screen acquisition unit 12 at predetermined time intervals.
 The output unit 14 outputs the EtoE processing time output from the data processing device 20. The communication control unit 15 controls communication between the terminal device 10 and the data processing device 20.
<Data processing device>
 Next, the data processing device 20 will be described. The data processing device 20 includes a communication control unit 21, a control unit (output processing unit) 22, and a storage unit 23.
 The communication control unit 21 controls communication between the data processing device 20 and the terminal device 10. The control unit 22 controls the data processing device 20 as a whole. The control unit 22 includes a data acquisition unit 221, a determination unit 222, a setting unit 223, a search unit 224, and a measurement unit 225.
 The data acquisition unit 221 acquires, from the terminal device 10 via the communication control unit 21, input events in the terminal device 10 and the changed portions on the screen obtained from the screen captures of the terminal device 10. The data acquisition unit 221 then stores the data of the acquired input events (input event data) in the input event database of the storage unit 23. The input event data is information indicating an input event and the occurrence time of the input event (see FIG. 1).
 The data acquisition unit 221 also stores the screen capture data containing the acquired changed portions on the screen in the screen capture database of the storage unit 23. This screen capture data is information indicating a screen capture, the changed portion on the screen obtained from the screen capture, and the acquisition time of the screen capture (see FIG. 1). The input event data and the screen capture data may each be held in the memory of the data processing device 20.
 The determination unit 222 determines whether the input events of the terminal device 10 and the changed portions of the screen obtained from the screen captures satisfy the content of the setting information set by the setting unit 223.
 The setting unit 223 accepts input of the setting information. As described above, the setting information is described by, for example, a combination of an input event occurring in the terminal device and an image of the portion that changes on the screen of the terminal device due to the input event (a recognition target image). The setting information is input, for example, via an input unit (not shown) of the data processing device 20 and stored in a predetermined area of the storage unit 23.
 The search unit 224 accesses the input event database and the screen capture database, and searches for the screen capture corresponding to an input event.
 The measurement unit 225 measures the value of the difference between the occurrence time of the input event found by the search unit 224 and the acquisition time of the screen capture corresponding to that input event. The measurement unit 225 then outputs the measured difference value as the EtoE processing time.
 The storage unit 23 includes, in a predetermined area, the input event database and the screen capture database. The input event database stores the input events acquired by the data acquisition unit 221. The screen capture database stores the screen capture data, including the changed portions, acquired by the data acquisition unit 221.
 The functions of the data processing device 20 may also be built into the terminal device 10. In that case, the communication control unit 21 of the data processing device 20 and the communication control unit 21 of the terminal device 10 are unnecessary.
<Example of processing procedure>
 Next, the processes executed by the system (Processes 1, 2, and 3) will be described with reference to FIGS. 3 and 4. In Process 1, the system stores the input events acquired by the terminal device 10 in the input event database. In Process 2, the system stores the screen capture data acquired by the terminal device 10 in the screen capture database. In Process 3, the system obtains the EtoE processing time based on the input event data and screen capture data obtained in Processes 1 and 2.
 Processes 1 and 2 are performed in parallel in the background of the terminal device 10. Process 3 is started, for example, at a timing instructed by the user of the data processing device 20. It is assumed that the setting unit 223 of the data processing device 20 has accepted input of the setting information for the detection target event in advance.
<Process 1>
 First, Process 1 will be described with reference to FIG. 3. The terminal device 10 acquires an input event of the terminal device 10 (S11). For example, the terminal device 10 acquires, as an input event, the key entered on the terminal device 10 and the time at which the input occurred. When the data processing device 20 then acquires the input event from the terminal device 10, it stores data indicating the acquired input event (input event data) in the input event database (S12).
 Thereafter, if the user of the terminal device 10 has not stopped the process and the terminal device 10 has not shut down (No in S13), the flow returns to S11. On the other hand, if the user of the terminal device 10 has stopped the process or the terminal device 10 has shut down (Yes in S13), the processing ends.
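The loop of S11 through S13 can be sketched as follows. Because real keyboard hooks are OS-specific, the event source is stubbed out with a callable, and the in-memory list standing in for the input event database is an assumption for illustration:

```python
from typing import Callable, Dict, List, Optional

# Hypothetical in-memory stand-in for the input event database.
input_event_db: List[Dict] = []

def run_process_1(next_event: Callable[[], Optional[Dict]]) -> None:
    """Sketch of Process 1: repeatedly acquire an input event (S11) and
    store it in the input event database (S12); a None event stands in
    for the user stopping the process or the terminal shutting down (S13)."""
    while True:
        event = next_event()
        if event is None:
            break
        input_event_db.append(event)

# Usage with a stubbed event source (a real implementation would hook
# the operating system's keyboard input):
fake_events = iter([{"key": "A", "time": 0.00}, {"key": "B", "time": 0.05}])
run_process_1(lambda: next(fake_events, None))
print(len(input_event_db))  # → 2
```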
<Process 2>
 Process 2 will also be described with reference to FIG. 3. The terminal device 10 captures its screen at predetermined intervals (for example, every 50 ms). The terminal device 10 then compares the acquired screen captures and, if there is a changed portion between the preceding and following screens, acquires that changed portion (S21). When the data processing device 20 then acquires from the terminal device 10 the screen capture data including the changed portion acquired in S21, it stores the acquired screen capture data in the screen capture database (S22). Thereafter, if the user of the terminal device 10 has not stopped the process and the terminal device 10 has not shut down (No in S23), the flow returns to S21. On the other hand, if the user of the terminal device 10 has stopped the process or the terminal device 10 has shut down (Yes in S23), the processing ends.
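The comparison of consecutive captures in S21 can be sketched as follows. Representing a capture as a small pixel grid is an illustrative assumption; a real implementation would compare bitmaps from the OS screen capture API:

```python
from typing import List, Optional, Tuple

Bitmap = List[List[int]]  # toy stand-in for a screen capture (pixel grid)

def changed_region(prev: Bitmap, curr: Bitmap) -> Optional[Tuple[int, int, int, int]]:
    """Compare consecutive captures (S21) and return the bounding box
    (top, left, bottom, right) of the changed pixels, or None if the
    preceding and following screens are identical."""
    changed = [(y, x)
               for y in range(len(curr))
               for x in range(len(curr[0]))
               if prev[y][x] != curr[y][x]]
    if not changed:
        return None
    ys = [y for y, _ in changed]
    xs = [x for _, x in changed]
    return (min(ys), min(xs), max(ys), max(xs))

before = [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
after  = [[0, 0, 0], [0, 1, 1], [0, 0, 0]]
print(changed_region(before, after))  # → (1, 1, 1, 2)
```

A capture with a non-None changed region would then be stored, together with its acquisition time, as one screen capture record (S22).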
<Process 3>
 Next, Process 3 will be described. In Process 3, the data processing device 20 first accesses the input event database and acquires the occurrence time of the detection target input event. Next, the data processing device 20 accesses the screen capture database and searches for the detection target screen capture data that first occurred after the occurrence time of that detection target input event. The data processing device 20 then obtains the difference between the occurrence time of the detection target input event and the acquisition time of the screen capture in the retrieved screen capture data, and outputs that difference as the EtoE processing time.
 An example of the above Process 3 will now be described with reference to FIG. 4. The data processing device 20 accesses the input event database, selects target input event data (S31), and determines whether the selected target input event data is a detection target input event (S32). For example, the data processing device 20 selects one input event from the input event database and determines whether the selected input event is the detection target input event indicated in the setting information.
 Next, the data processing device 20 accesses the screen capture database and selects a target screen capture event (S33). For example, the data processing device 20 selects one piece of screen capture data from the screen capture database.
 The data processing device 20 then determines whether the target screen capture event selected in S33 is detection target screen capture data and occurred after the occurrence time of the target input event (S34). For example, the data processing device 20 determines whether the image of the changed portion in the screen capture data selected in S33 matches the detection target image (recognition target image) and whether the acquisition time of the screen capture in that screen capture data is later than the occurrence time of the target input event.
 If the data processing device 20 determines that the target screen capture data selected in S33 is detection target screen capture data and occurred after the occurrence time of the target input event (Yes in S34), it calculates the difference between the occurrence time of the target input event and the acquisition time of the target screen capture event (S35). The data processing device 20 then outputs the difference calculated in S35 (S36). Thereafter, if the data processing device 20 determines that all input events in the input event database have been checked (Yes in S38), the processing ends. On the other hand, if the data processing device 20 determines that any input event in the input event database has not yet been checked (No in S38), the flow returns to S31.
 If the target screen capture data selected in S33 is not detection target screen capture data, or if the data processing device 20 determines that the target screen capture data occurred at or before the occurrence time of the target input event (No in S34), the flow proceeds to S37. If the data processing device 20 then determines that all screen capture data in the screen capture database has been checked (Yes in S37), the flow proceeds to S38. On the other hand, if the data processing device 20 determines that any screen capture data in the screen capture database has not yet been checked (No in S37), the flow returns to S33.
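The flow of S31 through S38 can be sketched as a pair of nested loops. The record layouts, and the assumption that the screen capture database is scanned in acquisition-time order so that the first match is the earliest one, are illustrative:

```python
from typing import Dict, List

def process_3(input_events: List[Dict], captures: List[Dict],
              target_key: str, target_image: str) -> List[float]:
    """Sketch of FIG. 4: for each detection target input event (S31-S32),
    scan the screen capture database (S33) for capture data whose changed
    portion matches the recognition target image and whose acquisition
    time is later than the event's occurrence time (S34), then output the
    time difference (S35-S36)."""
    results = []
    for ev in input_events:                                    # S31 / S38
        if ev["key"] != target_key:                            # S32
            continue
        for cap in sorted(captures, key=lambda c: c["time"]):  # S33 / S37
            if cap["image"] == target_image and cap["time"] > ev["time"]:  # S34
                results.append(cap["time"] - ev["time"])       # S35, S36
                break
    return results

events = [{"key": "Enter", "time": 1.00}, {"key": "A", "time": 2.00}]
caps = [{"image": "dialog", "time": 1.12}, {"image": "other", "time": 2.05}]
print(process_3(events, caps, "Enter", "dialog"))  # one result, ~0.12 s
```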
 In this way, the system can calculate the EtoE processing time on the remotely connected terminal device 10.
 In the example above, the system determines within Process 3 whether the input event and the screen capture data are detection targets, but the present invention is not limited to this. For example, the system may determine within Process 1 whether an input event is a detection target and within Process 2 whether screen capture data is a detection target. In that case, the system need not perform those determinations in Process 3.
[Second Embodiment]
 The system may also calculate the above EtoE processing time using an event relating to a change in the appearance of a button caused by a mouse click on the terminal device 10. This case will be described as the second embodiment. Configurations that are the same as in the first embodiment are given the same reference numerals, and their description is omitted.
[Overview]
 First, an overview of the operation of the data processing device in the system of the second embodiment will be described with reference to FIG. 5A.
 In the second embodiment, an input event relating to a mouse operation is set in the setting information as the detection target event. For example, a mouse click operation (mouse click) is set as the detection target event.
<Process 1>
 The system of the second embodiment also performs Processes 1, 2, and 3 described above. In Process 1, however, the terminal device 10 acquires, as the input event, the mouse click on the terminal device 10, the coordinate position of the cursor when the mouse click occurred (the location of the mouse click), the occurrence time of the mouse click, and the like. The data processing device 20 then stores the input event of the terminal device 10 in the input event database.
<Process 2>
 In Process 2 of the system of the second embodiment as well, the data processing device 20, for example, acquires a screen capture of the terminal device 10 at predetermined intervals and obtains the changed portion by comparing consecutive screen captures. The data processing device 20 then stores the changed portion of the acquired screen capture and the acquisition time of that screen capture in the screen capture database. For example, when a mouse click on the terminal device 10 presses a save button on the screen and the appearance of the save button changes, the data processing device 20 stores the screen capture at that time, the changed portion of that screen capture (for example, the coordinate area of the save button), and the acquisition time of that screen capture in the screen capture database.
<Process 3>
 In Process 3 of the system of the second embodiment as well, the data processing device 20 finds the detection target input event indicated in the setting information from the input event database. The data processing device 20 then acquires the location and the occurrence time of that input event.
 The data processing device 20 then accesses the screen capture database and searches for screen capture data in which a screen change occurred around the occurrence time of the detection target input event. The data processing device 20 then determines whether that screen capture data corresponds to the detection target input event. For example, the data processing device 20 determines whether the location of the detection target input event (the mouse click) is contained in the changed-portion area of the retrieved screen capture data. If the data processing device 20 determines that the location of the mouse click is contained in the changed-portion area of the retrieved screen capture data (for example, the coordinate area of the save button), it determines that the screen capture data corresponds to the detection target input event. The data processing device 20 then acquires the acquisition time of the screen capture in that screen capture data, calculates the difference between the occurrence time of the input event and the acquisition time of the screen capture, and outputs that difference as the EtoE processing time.
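The correspondence test described above reduces to a point-in-rectangle check. The function name and the (top, left, bottom, right) region layout are illustrative assumptions:

```python
from typing import Tuple

def click_in_changed_region(click_pos: Tuple[int, int],
                            region: Tuple[int, int, int, int]) -> bool:
    """Correspondence test of the second embodiment: the screen capture
    data corresponds to the mouse click input event when the click
    coordinates fall inside the changed-portion area (for example, the
    coordinate area of the save button). region = (top, left, bottom, right)."""
    y, x = click_pos
    top, left, bottom, right = region
    return top <= y <= bottom and left <= x <= right

save_button = (100, 200, 130, 280)  # hypothetical coordinate area
print(click_in_changed_region((115, 240), save_button))  # → True
print(click_in_changed_region((50, 240), save_button))   # → False
```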
 In the second embodiment, for example, a mouse click operation (mouse click) and a coordinate area of the cursor for that click operation may be set in the setting information as the detection target event.
 In that case, in Process 3 of the system of the second embodiment, the data processing device 20 finds, from the input event database, an input event relating to a click operation within the cursor coordinate area indicated in the setting information as the detection target input event. The data processing device 20 then acquires the coordinate position of the cursor for that input event and the occurrence time of that input event. The subsequent processing is the same as in Process 3 described above, so its description is omitted.
 An example of Process 3 in the system of the second embodiment will now be described with reference to FIG. 5B. The data processing device 20 accesses the input event database, selects target input event data (S41), and determines whether the selected target input event data is a detection target input event (S42). For example, the data processing device 20 selects one input event from the input event database and determines whether the selected input event is the detection target input event indicated in the setting information.
 Next, the data processing device 20 accesses the screen capture database and acquires the screen captures that occurred before and after the target input event (S43). For example, the data processing device 20 acquires, from the screen capture database, the screen capture data whose screen capture acquisition times precede and follow the occurrence time of the target input event.
 The data processing device 20 then determines whether the target input event is contained in the changed portions of the screen captures before and after the target input event (S44). For example, the data processing device 20 determines whether the coordinate position of the cursor for the target input event is contained in the changed-portion area of the screen capture data acquired in S43.
 If the data processing device 20 determines that the cursor coordinates of the target input event are contained in the changed portions of the screen captures before and after the target input event (Yes in S44), it calculates the difference between the occurrence time of the target input event and the acquisition time of the target screen capture event (S45). The acquisition time of the target screen capture event here is, for example, the acquisition time of the screen capture in the later of the pieces of screen capture data determined in S44 to contain the target input event in their changed portions. The data processing device 20 then outputs the difference calculated in S45 (S46). Thereafter, if the data processing device 20 determines that all input events in the input event database have been checked (Yes in S47), the processing ends. On the other hand, if the data processing device 20 determines that any input event in the input event database has not yet been checked (No in S47), the flow returns to S41.
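The flow of S41 through S47 can be sketched as follows. For simplicity, the sketch checks the click position against the later capture's changed region only, and the record layouts are assumptions for illustration:

```python
from typing import Dict, List

def process_3_click(click_events: List[Dict], captures: List[Dict]) -> List[float]:
    """Sketch of FIG. 5B: for each mouse click event (S41-S42), take the
    first capture acquired after the click (S43); if the click coordinates
    fall inside its changed-portion area (S44), output the difference
    between the click time and that later capture's acquisition time
    (S45-S46)."""
    results = []
    caps = sorted(captures, key=lambda c: c["time"])
    for ev in click_events:
        later = next((c for c in caps if c["time"] > ev["time"]), None)
        if later is None:
            continue
        y, x = ev["pos"]
        top, left, bottom, right = later["region"]
        if top <= y <= bottom and left <= x <= right:
            results.append(later["time"] - ev["time"])
    return results

clicks = [{"pos": (115, 240), "time": 5.00}]
caps = [{"region": (100, 200, 130, 280), "time": 5.08}]
print(process_3_click(clicks, caps))  # one result, ~0.08 s
```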
 If the target input event data selected in S41 is not a detection target input event (No in S42), the data processing device 20 proceeds to S47. Likewise, if the data processing device 20 determines that the cursor coordinate position of the target input event is not contained in the changed portions of the screen captures before and after the target input event (No in S44), it proceeds to S47.
 In this way, the system can calculate the EtoE processing time using events relating to mouse operations on the terminal device 10.
 In the example above, the system determines within Process 3 whether the input event is a detection target, but the present invention is not limited to this. For example, the system may make that determination within Process 1. In that case, the system need not determine within Process 3 whether the input event is a detection target.
[Third Embodiment]
 The system may also calculate the EtoE processing time using an event relating to a change in the window state of the local terminal as the detection target event. In this case, an event relating to the window state change to be detected (for example, the active window being closed) is set in the setting information used by the system. This case will be described as the third embodiment. Configurations that are the same as in the preceding embodiments are given the same reference numerals, and their description is omitted.
[Configuration example]
 A configuration example of the system of the third embodiment will be described with reference to FIG. 6. The system includes a terminal device 10, a terminal device 10a, and a data processing device 20a.
 The terminal device 10a is a terminal device (connection destination terminal) to which the terminal device 10 remotely connects and which provides a remote environment to the terminal device 10. The terminal device 10a includes a window state acquisition unit 16. The window state acquisition unit 16 acquires information on the windows and processes of the terminal device 10 in the above remote environment at predetermined intervals. If there is a change in a window or a process, the window state acquisition unit 16 acquires the window state at the time of the change and the occurrence time of the change.
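The change detection performed by the window state acquisition unit 16 can be sketched as follows. The snapshot layout (active window title plus the set of live process IDs) is an illustrative assumption; a real implementation would query the OS window manager:

```python
from typing import Dict, List

def detect_window_state_changes(snapshots: List[Dict]) -> List[Dict]:
    """Sketch of the window state acquisition unit 16: given periodic
    snapshots of the active window and the set of live processes, record
    a window state entry whenever either differs from the previous
    snapshot, together with the time of the change."""
    changes = []
    prev = None
    for snap in snapshots:
        if prev is not None and (snap["active"] != prev["active"]
                                 or snap["pids"] != prev["pids"]):
            changes.append({"time": snap["time"], "active": snap["active"],
                            "pids": snap["pids"]})
        prev = snap
    return changes

snaps = [
    {"time": 0.0, "active": "Editor", "pids": {10, 11}},
    {"time": 0.5, "active": "Editor", "pids": {10, 11}},   # no change
    {"time": 1.0, "active": "Browser", "pids": {10, 11}},  # active window changed
]
print(detect_window_state_changes(snaps))  # one entry, at time 1.0
```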
 The control unit 22a of the data processing device 20a includes a data acquisition unit 221a, a determination unit 222a, a setting unit 223a, a search unit 224a, and a measurement unit 225a.
 The data acquisition unit 221a acquires, from the terminal device 10 via the communication control unit 21, the input events and the changed portions of the screen obtained from the screen captures. The data acquisition unit 221a also acquires, from the terminal device 10a via the communication control unit 21, the input events, the changed portions of the screen obtained from the screen captures, and the window state data acquired by the window state acquisition unit 16 (the window state at the time of a change and the occurrence time of that change).
 The data acquisition unit 221a then stores the input event data of the terminal device 10 and the terminal device 10a in the input event database of the storage unit 23a. The data acquisition unit 221a also stores the screen capture data including the changed portions of the screen in the screen capture database, and stores the window state data in the window state database.
 The determination unit 222a determines whether the window state of the terminal device 10a satisfies the content of the setting information set by the setting unit 223a. For example, the determination unit 222a determines whether the active window has been closed on the terminal device 10a.
 The setting unit 223a accepts input of the setting information. An event relating to the window state to be detected (for example, the active window being closed) is set in the setting information.
 The search unit 224a accesses the input event database and the screen capture database and searches for the input events of the terminal device 10 and the terminal device 10a corresponding to the window state, and for the screen captures of the terminal device 10 and the terminal device 10a.
 The measurement unit 225a measures the difference between the occurrence time of the input event found by the search unit 224a and the acquisition time of the screen capture corresponding to that input event, and outputs the measured difference as the EtoE processing time.
 The storage unit 23a includes, in a predetermined area, an input event database, a screen capture database, and a window state database.
 The input event database stores the input event data of the terminal device 10 and the terminal device 10a acquired by the data acquisition unit 221a. The screen capture database stores the screen capture data of the terminal device 10 and the terminal device 10a acquired by the data acquisition unit 221a. The window state database stores the window state data acquired from the terminal device 10a by the data acquisition unit 221a.
<Example of processing procedure>
 Next, the processes executed by the system of the third embodiment (Processes 1, 2, 3, and 4) will be described.
 In Process 1, the system stores the input events acquired from each of the terminal device 10 and the terminal device 10a in the input event database. In Process 2, the system stores the screen capture data acquired from each of the terminal device 10 and the terminal device 10a in the screen capture database. Processes 1 and 2 correspond to Processes 1 and 2 described with reference to FIG. 3.
 In Process 3, the system stores the window state data acquired by the terminal device 10a in the window state database. In Process 4, the system calculates the EtoE processing time based on the input event data, screen capture data, and window state data obtained in Processes 1, 2, and 3.
 Processes 1 and 2 are performed in parallel in the background of the terminal device 10. Processes 1, 2, and 3 are performed in parallel in the background of the terminal device 10a. Process 4 is started, for example, at a timing instructed by the user of the data processing device 20a. It is assumed that the setting unit 223a of the data processing device 20a has accepted input of the setting information for the detection target event in advance.
 Processes 1 and 2 are substantially the same as in the preceding embodiments, so their description is omitted, and Process 3 is described below.
<Process 3>
 In Process 3, the system acquires window and process information at predetermined intervals from the terminal device 10a to which the terminal device 10 connects, and, if there is a change in a window or a process, stores window state data indicating the change in the window state database. For example, if the active window differs between consecutive snapshots of the terminal device 10a and the process of the previous active window no longer exists, the system determines that the window state of the terminal device 10a is "active window closed". The system then stores that window state and its acquisition time in the window state database as window state data.
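The "active window closed" judgment described above can be sketched as follows. Representing a snapshot as the active window's (title, owning process ID) pair plus the set of live process IDs is an assumed layout for illustration:

```python
from typing import Dict

def active_window_closed(prev: Dict, curr: Dict) -> bool:
    """Judgment sketched in Process 3: the window state is 'active window
    closed' when the active window differs between consecutive snapshots
    AND the process that owned the previous active window no longer
    exists."""
    prev_title, prev_pid = prev["active"]
    return curr["active"] != prev["active"] and prev_pid not in curr["pids"]

before = {"active": ("Report.docx", 42), "pids": {42, 43}}
after  = {"active": ("Desktop", 43), "pids": {43}}  # process 42 is gone
print(active_window_closed(before, after))  # → True
```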
 An example of the procedure of Process 3 will now be described with reference to FIG. 7. The window state acquisition unit 16 of the terminal device 10a acquires information on the active window and the processes at predetermined intervals (S51 in FIG. 7) and, if the information has changed since the previous acquisition (Yes in S52), acquires window state data. When the data acquisition unit 221a of the data processing device 20a then acquires the window state data via the communication control unit 21, it stores the acquired window state data in the window state database (S53). Thereafter, if the user of the terminal device 10a has not stopped the process and the terminal device 10a has not shut down (No in S54), the flow returns to S51. On the other hand, if the user of the terminal device 10a has stopped the process or the terminal device 10a has shut down (Yes in S54), the processing ends.
<Process 4>
 In Process 4, the data processing device 20a accesses the window state database and acquires the window state data of the window state to be detected (target window state data). Then, based on the acquired target window state data, the data processing device 20a acquires from the input event database the input event data of the input event of the terminal device 10a (the target input event on the connection destination terminal side) that occurred immediately before the acquisition time of the window state to be detected.
 The data processing device 20a also acquires, from the screen capture database, the screen capture data of the screen capture event of the terminal device 10a (the target screen capture event on the connection destination terminal side) that occurred immediately after the acquisition time of the window state indicated in the target window state data.
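As a simplified illustration of how "the capture immediately after the acquisition time" might be looked up (assuming, hypothetically, that capture records are kept sorted by acquisition time):

```python
import bisect

def first_capture_after(captures, t):
    """Return the earliest capture record whose acquisition time is
    strictly after time t, or None if there is none.
    `captures` is a list of (acquisition_time, data) sorted by time."""
    times = [c[0] for c in captures]
    i = bisect.bisect_right(times, t)
    return captures[i] if i < len(captures) else None
```

The binary search keeps the lookup cheap even when the screen capture database holds many records per session.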
 Then, the data processing device 20a acquires from the input event database the input event data of the target input event on the connection source terminal side that corresponds to the target input event on the connection destination terminal side. For example, the data processing device 20a acquires the input event data of the target input event on the connection source terminal side (terminal device 10 side) that corresponds to the target input event on the connection destination terminal side (terminal device 10a side), based on the chronological order of the input events of the terminal device 10 and the terminal device 10a in the input event database.
 Next, from the screen capture data of the terminal device 10 in the screen capture database, the data processing device 20a finds the screen capture data whose acquisition time is at or after the occurrence time of the target input event on the terminal device 10 side and whose screen capture is most similar to (or has a similarity equal to or greater than a predetermined value with) the screen capture data of the target screen capture event on the terminal device 10a side. The data processing device 20a takes the found screen capture data as the screen capture data of the target screen capture event on the terminal device 10 side. Finally, the data processing device 20a outputs the value of the difference between the occurrence time of the target input event on the terminal device 10 side and the acquisition time of the target screen capture event on the terminal device 10 side as the EtoE processing time.
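This matching-and-difference step can be sketched as follows. The sketch is illustrative only: the similarity function is left abstract, since the embodiment does not fix a particular image comparison method, and the threshold value and function names are assumptions.

```python
def eto_e_time(local_captures, target_capture, input_time,
               similarity, threshold=0.9):
    """Among local captures taken at or after input_time, pick the one
    most similar to target_capture (requiring at least `threshold`),
    and return its acquisition time minus the input event time."""
    best_time, best_score = None, None
    for t, img in local_captures:
        if t < input_time:
            continue                  # only captures after the input event
        score = similarity(img, target_capture)
        if score >= threshold and (best_score is None or score > best_score):
            best_time, best_score = t, score
    return None if best_time is None else best_time - input_time
```

Returning `None` when no capture reaches the threshold corresponds to the case where no matching local capture is found and no EtoE time can be output.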
 In this way, the data processing device 20a can calculate the EtoE processing time using a change in the window state of the terminal device 10 as the event to be detected.
 An example of Process 4 will now be described with reference to FIG. 8. For example, the data processing device 20a accesses the window state database, selects window state data (S61), and determines whether it corresponds to a detection target event (S62). For example, the data processing device 20a selects one piece of window state data from the window state database and determines whether the window state indicated by the selected window state data is the detection target window state indicated in the setting information. If the data processing device 20a determines that it is a detection target event (Yes in S62), it acquires from the input event database the input event data of the input event on the connection destination terminal side (terminal device 10a side) that occurred immediately before the detection target event (S63). The input event on the connection destination terminal side that occurred immediately before the detection target event is referred to as the target input event on the connection destination terminal side.
 Thereafter, the data processing device 20a finds the input event data of the input event on the local terminal side (terminal device 10 side) that corresponds to the target input event on the connection destination terminal side (S64). For example, the data processing device 20a finds the input event data of the input event on the terminal device 10 side that corresponds to the target input event on the terminal device 10a side, based on the chronological order of the input events of the terminal device 10 and the terminal device 10a in the input event database. Alternatively, the data processing device 20a may find, in the input event database, the input event data of the input event on the terminal device 10 side whose occurrence time is immediately after the occurrence time of the target input event on the terminal device 10a side.
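One possible reading of S64 — pairing the connection-destination event with the local event whose occurrence time is immediately after it — can be sketched as follows (illustrative names, not from the specification):

```python
def corresponding_local_event(local_events, remote_time):
    """Return the (time, event) pair on the local terminal whose
    occurrence time is the earliest one strictly after remote_time,
    or None if no later local event exists."""
    later = [ev for ev in local_events if ev[0] > remote_time]
    return min(later) if later else None
```

A fuller implementation would presumably also use the overall chronological transition order of both terminals' events, as the paragraph above describes, rather than the single timestamp comparison used here.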
 Next, the data processing device 20a acquires from the screen capture database the screen capture data on the connection destination terminal side generated immediately after the target window state data (that is, immediately after the acquisition time of the window state indicated in the target window state data) (S65).
 Thereafter, from the screen capture data of the terminal device 10 in the screen capture database, the data processing device 20a finds the screen capture data of a screen capture on the local terminal side (terminal device 10 side) whose similarity to the connection-destination-side screen capture data acquired in S65 is equal to or greater than a threshold (S66).
 Then, the data processing device 20a determines whether the acquisition time of the screen capture in the local-terminal-side screen capture data found in S66 is after the occurrence time of the input event in the local-terminal-side input event data found in S64 (S67: is the acquisition time of the screen capture on the local terminal side after the occurrence time of the input event on the local terminal side?).
 If the data processing device 20a determines in S67 that the acquisition time of the screen capture in the local-terminal-side screen capture data found in S66 is after the occurrence time of the input event in the local-terminal-side input event data found in S64 (Yes in S67), the process proceeds to S68. The data processing device 20a then obtains the value of the difference between the occurrence time of the local-terminal-side input event found in S64 and the acquisition time of the local-terminal-side screen capture found in S66 (S68), and outputs the obtained difference value (S69). The process then proceeds to S70.
 In S70, if the data processing device 20a determines that all the window state data in the window state database have been checked (Yes in S70), the process ends. On the other hand, if the data processing device 20a determines that any window state data in the window state database remains unchecked (No in S70), the process returns to S61.
 If the data processing device 20a determines in S62 that the window state indicated by the window state data selected in S61 is not a detection target event (No in S62), the process proceeds to S70. Likewise, if the data processing device 20a determines in S67 that the acquisition time of the local-terminal-side screen capture found in S66 is at or before the occurrence time of the local-terminal-side input event found in S64 (No in S67), the process proceeds to S70.
 In this way, the data processing device 20a can calculate the EtoE processing time using a change in the window state of the terminal device 10 as the event to be detected.
 In each embodiment, the case where the terminal device 10 remotely connects to a connection destination terminal device (for example, the terminal device 10a) via a VDI server has been described, but the connection destination terminal device may be a physical terminal device or a virtual machine provided by VDI. Further, when the connection destination terminal device is a terminal device installed within a company, the connection from the terminal device 10 to the connection destination terminal device may use, for example, a connection to an in-house LAN (Local Area Network) via a VPN (Virtual Private Network) or the like.
[System configuration, etc.]
 The components of each illustrated device are functional and conceptual, and need not be physically configured as shown. That is, the specific form of distribution and integration of each device is not limited to the illustrated form; all or part of each device can be functionally or physically distributed or integrated in arbitrary units according to various loads, usage conditions, and the like. Furthermore, all or any part of each processing function performed by each device may be realized by a CPU (Central Processing Unit) and a program executed by the CPU, or may be realized as hardware by wired logic.
 Of the processes described in each embodiment, all or part of the processes described as being performed automatically can also be performed manually, and all or part of the processes described as being performed manually can also be performed automatically by a known method. In addition, the processing procedures, control procedures, specific names, and information including various data and parameters shown in the above description and drawings can be changed arbitrarily unless otherwise specified.
[Program]
 The system of each embodiment described above can be implemented by installing a data processing program on a desired computer as packaged software or online software. For example, by causing an information processing apparatus to execute the above data processing program, the information processing apparatus can be made to function as the system of each embodiment. The information processing apparatus referred to here includes desktop and notebook personal computers. In addition, mobile communication terminals such as smartphones, mobile phones, and PHS (Personal Handyphone System) devices, as well as slate terminals such as PDAs (Personal Digital Assistants), also fall into this category.
 The data processing device of each embodiment can also be implemented as a server device that takes a terminal device used by a user as a client and provides the client with services related to the above processing. In this case, the server device may be implemented as a Web server, or may be implemented as a cloud that provides services related to the above processing by outsourcing.
 FIG. 9 is a diagram showing an example of a computer that executes the data processing program. The computer 1000 has, for example, a memory 1010 and a CPU 1020. The computer 1000 also has a hard disk drive interface 1030, a disk drive interface 1040, a serial port interface 1050, a video adapter 1060, and a network interface 1070. These units are connected by a bus 1080.
 The memory 1010 includes a ROM (Read Only Memory) 1011 and a RAM (Random Access Memory) 1012. The ROM 1011 stores, for example, a boot program such as a BIOS (Basic Input Output System). The hard disk drive interface 1030 is connected to a hard disk drive 1090. The disk drive interface 1040 is connected to a disk drive 1100. A removable storage medium such as a magnetic disk or an optical disk is inserted into the disk drive 1100. The serial port interface 1050 is connected to, for example, a mouse 1110 and a keyboard 1120. The video adapter 1060 is connected to, for example, a display 1130.
 The hard disk drive 1090 stores, for example, an OS 1091, an application program 1092, a program module 1093, and program data 1094. That is, a program that defines each process executed by the system of each embodiment is implemented as the program module 1093, in which computer-executable code is described. The program module 1093 is stored in, for example, the hard disk drive 1090. For example, the program module 1093 for executing processing similar to the functional configuration of the system of each embodiment is stored in the hard disk drive 1090. The hard disk drive 1090 may be replaced by an SSD.
 The setting information used in the processing of the above-described embodiments is stored as the program data 1094 in, for example, the memory 1010 or the hard disk drive 1090. The CPU 1020 reads the program module 1093 and the program data 1094 stored in the memory 1010 or the hard disk drive 1090 into the RAM 1012 and executes them as needed.
 The program module 1093 and the program data 1094 are not limited to being stored in the hard disk drive 1090; they may be stored in, for example, a removable storage medium and read by the CPU 1020 via the disk drive 1100 or the like. Alternatively, the program module 1093 and the program data 1094 may be stored in another computer connected via a network (a LAN (Local Area Network), a WAN (Wide Area Network), etc.) and read from the other computer by the CPU 1020 via the network interface 1070.
10, 10a Terminal device
11 Input acquisition unit
12 Screen acquisition unit
13 Screen comparison unit
14 Output unit
15, 21 Communication control unit
20, 20a Data processing device
22, 22a Control unit
23, 23a Storage unit
221, 221a Data acquisition unit
222, 222a Determination unit
223, 223a Setting unit
224, 224a Search unit
225, 225a Measurement unit

Claims (5)

  1.  A data processing device comprising:
     an input unit that receives setting information indicating an event to be detected among events occurring on a connection source terminal of a remote connection; and
     an output processing unit that, when it determines, based on the setting information and captures of a screen of the connection source terminal acquired at predetermined time intervals, that a change in the screen of the connection source terminal has occurred due to the event to be detected, outputs a value of a difference between an occurrence time of the event and an acquisition time of the capture of the screen in which the change occurred, as an operation delay time.
  2.  The data processing device according to claim 1, wherein
     the output processing unit uses, as the acquisition time of the capture of the screen in which the change occurred due to the event to be detected, the acquisition time of the first screen capture at or after the occurrence time of the event.
  3.  The data processing device according to claim 1, wherein
     the event to be detected in the setting information is defined by a combination of an operation event of the connection source terminal and content of a screen change caused by the operation event, and
     the output processing unit, when the screen change indicated in the setting information occurs due to the operation event indicated in the setting information, outputs a value of a difference between the occurrence time of the event and the acquisition time of the capture of the screen in which the screen change occurred, as the operation delay time.
  4.  The data processing device according to claim 1, wherein
     the event to be detected in the setting information is an event related to a change in a window state of a connection destination terminal, and
     the output processing unit, when it determines, based on the setting information and the window state of the connection destination terminal, which is a terminal remotely connected to the connection source terminal and whose window state is acquired at predetermined time intervals, that a change in the screen of the connection source terminal has occurred due to the change in the window state to be detected, executes:
     a first process of identifying an operation event at the connection destination terminal that occurred immediately before the time at which the change in the window state occurred, and identifying an operation event of the connection source terminal corresponding to the identified operation event;
     a second process of identifying a capture of the screen of the connection destination terminal taken immediately after the occurrence time of the change in the window state, identifying a capture of the screen of the connection source terminal whose similarity to the identified screen capture is equal to or greater than a predetermined value, and identifying an acquisition time of the identified capture of the screen of the connection source terminal; and
     a third process of, when the acquisition time of the capture of the screen of the connection source terminal identified by the second process is later than the occurrence time of the operation event of the connection source terminal identified by the first process, outputting a value of a difference between the occurrence time of the operation event and the acquisition time of the capture of the screen, as the operation delay time.
  5.  A data processing method executed by a data processing device, the method comprising:
     a step of receiving setting information indicating an event to be detected among events occurring on a connection source terminal of a remote connection; and
     a step of, when it is determined, based on the setting information and captures of a screen of the connection source terminal acquired at predetermined time intervals, that a change in the screen of the connection source terminal has occurred due to the event to be detected, outputting a value of a difference between an occurrence time of the event and an acquisition time of the capture of the screen in which the change occurred, as an operation delay time.
PCT/JP2020/030801 2020-08-13 2020-08-13 Data processing device and data processing method WO2022034672A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2020/030801 WO2022034672A1 (en) 2020-08-13 2020-08-13 Data processing device and data processing method
JP2022542553A JP7439934B2 (en) 2020-08-13 2020-08-13 Data processing device and data processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/030801 WO2022034672A1 (en) 2020-08-13 2020-08-13 Data processing device and data processing method

Publications (1)

Publication Number Publication Date
WO2022034672A1 true WO2022034672A1 (en) 2022-02-17

Family

ID=80247107

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/030801 WO2022034672A1 (en) 2020-08-13 2020-08-13 Data processing device and data processing method

Country Status (2)

Country Link
JP (1) JP7439934B2 (en)
WO (1) WO2022034672A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004038619A (en) * 2002-07-04 2004-02-05 Hitachi Ltd Automatic response time measuring method, and device therefor
JP2015011653A (en) * 2013-07-02 2015-01-19 富士通株式会社 Performance measuring method, performance measuring program, and performance measuring apparatus
JP2016029543A (en) * 2014-07-25 2016-03-03 株式会社三菱東京Ufj銀行 Information processing apparatus and program

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016018260A (en) 2014-07-04 2016-02-01 日本電気株式会社 Client server system, control method, and control program
CN111208960B (en) 2019-12-26 2023-05-09 杭州顺网科技股份有限公司 Remote display delay reduction method based on frame extraction control and time synchronization algorithm

Also Published As

Publication number Publication date
JP7439934B2 (en) 2024-02-28
JPWO2022034672A1 (en) 2022-02-17


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 20949534; Country of ref document: EP; Kind code of ref document: A1
ENP Entry into the national phase
    Ref document number: 2022542553; Country of ref document: JP; Kind code of ref document: A
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 20949534; Country of ref document: EP; Kind code of ref document: A1