CN107450793B - Data processing apparatus and data processing method

Data processing apparatus and data processing method

Info

Publication number
CN107450793B
Authority
CN
China
Prior art keywords
data processing
operation screen
display
processing apparatus
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611110124.0A
Other languages
Chinese (zh)
Other versions
CN107450793A (en
Inventor
末泽庆人
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujifilm Business Innovation Corp
Original Assignee
Fujifilm Business Innovation Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujifilm Business Innovation Corp
Publication of CN107450793A
Application granted
Publication of CN107450793B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0483Interaction with page-structured environments, e.g. book metaphor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/1438Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display using more than one graphics controller
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00Aspects of the architecture of display systems
    • G09G2360/06Use of more than one graphics processor to process data before displaying to one or more screens

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Information Transfer Between Computers (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The present application relates to a data processing apparatus and a data processing method. Specifically, the data processing apparatus includes: a first display controller that displays, on a first display area of a display, an operation screen for an operation of another apparatus connected to the data processing apparatus via a network, and displays, on a second display area of the display, an operation screen for operating a first object selected from a set of at least one object related to the operation; a storage controller that stores, on a memory at a predetermined timing, an image of the operation screen displayed on the first display area in association with the first object; and a second display controller that displays the image associated with the first object on the display when a predetermined operation is performed on the first object or the set.

Description

Data processing apparatus and data processing method
Technical Field
The invention relates to a data processing apparatus and a data processing method.
Background
Japanese unexamined patent application publication No. 2010-79342 discloses a technique of automatically generating a draft of an operation procedure manual from a completed web application. Japanese unexamined patent application publication No. 2009-53845 discloses a technique of saving the time and labor of calculating the arrangement intervals between a plurality of screen objects arranged on a data input screen.
Disclosure of Invention
An object of the present invention is to facilitate resumption of a job that a user has suspended while using a service through communication via a network.
According to a first aspect of the present invention, a data processing apparatus is provided. The data processing apparatus includes: a first display controller that displays, on a first display area of a display, an operation screen for an operation of another apparatus connected to the data processing apparatus via a network, and displays, on a second display area of the display, an operation screen for operating a first object selected from a set of at least one object related to the operation; a storage controller that stores, on a memory at a predetermined timing, an image of the operation screen displayed on the first display area in association with the first object; and a second display controller that displays the image associated with the first object on the display when a predetermined operation is performed with respect to the first object or the set.
In the data processing apparatus of a second aspect of the present invention, in view of the first aspect, the storage controller stores the image of the operation screen displayed on the first display area on the memory in association with the first object when the operation screen is closed on the second display area.
In the data processing apparatus of a third aspect of the present invention, in view of the first aspect, the storage controller stores the image of the operation screen on the memory in association with the first object when the operation screen is closed on the first display area.
In the data processing apparatus of a fourth aspect of the present invention, in view of the first aspect, the storage controller stores the image of the operation screen displayed on the first display area on the memory in association with the first object at each predetermined timing while the operation screen is displayed on the second display area.
In the data processing device of the fifth aspect of the present invention, in view of one of the first to fourth aspects, the set is stored on the memory.
In the data processing apparatus of a sixth aspect of the present invention, in view of the first to fifth aspects, the second display controller displays the image associated with the first object on the display when the first object is selected from the set.
In the data processing device of the seventh aspect of the present invention, in view of the first to sixth aspects, the set includes image objects indicating the operation to be performed on the objects.
In the data processing apparatus of an eighth aspect of the present invention, in view of the first to seventh aspects, the storage controller stores, in association with the first object, an address corresponding to the operation screen displayed on the first display area. The second display controller displays the first object, the image, and an operation screen obtained by accessing the address associated with the first object.
In the data processing apparatus of a ninth aspect of the present invention, in view of the fourth aspect, the storage controller stores, on the memory, the image with time information attached and information indicating an update time of the first object. The second display controller displays, on the display, an image whose attached time information differs from the update time of the first object by no more than a predetermined threshold.
In the data processing apparatus of a tenth aspect of the present invention, in view of the fourth aspect, the storage controller stores, on the memory, the image with time information attached and information indicating an update time of each page of the first object. The second display controller displays, on the display, an image whose attached time information differs from the update time of the page to be displayed in the first object by no more than a predetermined threshold.
A data processing method of a data processing apparatus of an eleventh aspect of the present invention includes the steps of: displaying, on a first display area of a display, an operation screen for an operation of another apparatus connected to the data processing apparatus via a network, and displaying, on a second display area of the display, an operation screen for operating a first object selected from a set of at least one object related to the operation; storing, on a memory, an image of the operation screen displayed on the first display area in association with the first object at a predetermined timing; and displaying the image associated with the first object on the display when a predetermined operation is performed on the first object or the set.
According to the first and eleventh aspects of the present invention, a job that a user has suspended while using a service through communication via a network can be easily resumed.
According to the second aspect of the present invention, a user who has paused a job by closing the operation screen on the second display area can easily resume the job.
According to the third aspect of the present invention, a user who has paused a job by closing the operation screen on the first display area can easily resume the job.
According to the fourth aspect of the present invention, a job that a user has suspended while using a service through communication via a network can be easily resumed.
According to the fifth aspect of the present invention, a job suspended by a user who uses a local environment as a work area can be easily resumed.
According to the sixth aspect of the present invention, once the job has been suspended, it can be easily resumed by selecting the first object.
According to the seventh aspect of the present invention, a job that a user has suspended while using a service through communication via a network can be easily resumed.
According to the eighth aspect of the present invention, when the user resumes the job, the operation screen can be presented to the user together with the object of the job.
According to the ninth aspect of the present invention, a job that a user has suspended while using a service through communication via a network can be easily resumed.
According to the tenth aspect of the present invention, the job can be easily resumed for the page to be displayed in the first object.
Drawings
Exemplary embodiments of the invention will be described in detail based on the following drawings, in which:
FIG. 1 illustrates the configuration of a system;
FIG. 2 illustrates a functional configuration of a client device;
FIG. 3 illustrates a hardware configuration of a client device;
FIG. 4 illustrates an example of a software configuration of a client device;
FIG. 5 illustrates an example of a local operation screen;
FIG. 6 illustrates an example of a system operation screen;
FIG. 7 illustrates an example of a screen displayed on a User Interface (UI) unit;
FIG. 8 is a flowchart illustrating a storing process of a screenshot;
FIG. 9 illustrates an example of the contents of the properties of a screenshot;
FIG. 10 is a flowchart illustrating a storing process of a file;
FIG. 11 illustrates an example of the contents of the properties of a file;
FIG. 12 is a flowchart illustrating a display process of a screenshot;
FIG. 13 illustrates an example of a screen displayed on the UI unit;
FIG. 14 is a flowchart illustrating a storing process of a file; and
FIG. 15 is a flowchart illustrating a display process of a screenshot.
Detailed Description
Fig. 1 is a block diagram illustrating the configuration of a system 1 of an exemplary embodiment. The system 1 includes client devices 10a, 10b, 10c, ... and a server 20. The client devices 10a, 10b, 10c, ... are operated by users and may be, for example, personal computers. The client devices 10a, 10b, 10c, ... are collectively referred to as "client devices 10" if they do not need to be distinguished from one another. The server 20 provides various services, such as storing data. Fig. 1 illustrates a single server 20, but multiple servers may provide the services. In the following discussion, the system providing the service of the server 20 is referred to as a "cooperation destination system". The communication network 2 includes at least one of the internet, a mobile communication network, and a telephone network. The communication network 2 connects the client devices 10 to the server 20.
Fig. 2 illustrates a functional configuration of the client device 10 (an example of a data processing device). The client device 10 includes a first display controller 11, a storage controller 12, and a second display controller 13. The first display controller 11 displays, on a first display area of the display 14, an operation screen for an operation of another device connected to the client device 10 via a network, and displays, on a second display area of the display 14, an operation screen for operating a first object selected from a set of at least one object related to the operation. At a predetermined timing, the storage controller 12 stores the image of the operation screen displayed on the first display area in association with the first object on the memory 15. When a predetermined operation is performed on the first object or set, the second display controller 13 displays an image associated with the first object on the display 14.
Fig. 3 illustrates an example of the hardware configuration of the client device 10. The client device 10 includes a Central Processing Unit (CPU)151, a Read Only Memory (ROM)152, a Random Access Memory (RAM)153, a storage 154, a communication interface 155, and a User Interface (UI) unit 156. The CPU 151 is a control device (processor) that controls elements of the client apparatus 10. The ROM 152 is a nonvolatile storage device that stores programs and data. The RAM 153 is a volatile storage device serving as a work area when the CPU 151 executes programs. Storage 154 is a secondary non-volatile storage device that stores programs and data. The communication interface 155 communicates via the communication network 2. In this case, the communication interface 155 specifically communicates with the server 20. The UI unit 156 includes, for example, a touch screen and keys.
The CPU 151 implements the functions illustrated in fig. 2 by executing programs stored on the storage 154. The CPU 151 that executes the program is an example of each of the first display controller 11, the storage controller 12, and the second display controller 13. The UI unit 156 is an example of the display 14. Storage 154 is an example of memory 15.
Fig. 4 illustrates an example of the software configuration of the client device 10. As illustrated in fig. 4, the cooperation destination system client 101 and the document processing system 102 are applications preinstalled on the client device 10, and operate on an Operating System (OS) 110. The cooperation destination system client 101 is an application that displays an operation screen of the cooperation destination system (an operation screen for operating the server 20) on the UI unit 156, and may be, for example, a browser. The document processing system 102 is an application that manages files and folders. The document processing system 102 includes a workspace portion 103 and a digital document converter 104. The workspace portion 103 is a folder that stores files and tools. The workspace portion 103 stores not only files and folders but also tools for performing operations (such as file format conversion or character recognition) on the files or folders. The workspace portion 103 is an example of a set of at least one object (for example, a set including files and folders) related to the cooperation destination system.
The digital document converter 104 is an element that captures screens of the cooperation destination system and converts them into digital documents. The digital document converter 104 includes a link destination extractor 105, a screenshot generator 107, and a property information setter 108. The link destination extractor 105 performs an operation of extracting a link destination, such as a Uniform Resource Locator (URL), of a screen of the cooperation destination system. The screenshot generator 107 generates a screenshot of an operation screen of the cooperation destination system. The property information setter 108 updates the property information of each screenshot and each file.
Operation example 1
The user executes a job on the cooperation destination system using the client device 10. In this case, the user performs the job using a file or folder stored on the workspace portion 103 in the client device 10. The user first starts the document processing system 102 using the UI unit 156. The OS 110 implemented by the CPU 151 starts the document processing system 102 according to information output from the UI unit 156. The document processing system 102 displays an operation screen of the document processing system 102 on the UI unit 156. In the following discussion, for convenience of explanation, the operation screen of the document processing system 102 is referred to as a "local operation screen".
Fig. 5 illustrates an example of the local operation screen G2. The local operation screen G2 is an example of an operation screen on which an object selected from the workspace portion 103 is operated. The local operation screen G2 includes an image (such as an icon) for accessing the operation screen of the cooperation destination system, and icons indicating files or folders included in the workspace portion 103. The local operation screen G2 also includes icons (image objects) I1, I2, ... indicating operations to be performed on the files or folders in the workspace portion 103. When the user performs, on the local operation screen G2, an operation of selecting the image for accessing the operation screen of the cooperation destination system, the document processing system 102 requests the OS 110 to start the cooperation destination system client 101 and access the cooperation destination system. In response to the request from the document processing system 102, the OS 110 starts the cooperation destination system client 101. The cooperation destination system client 101 accesses the address of the cooperation destination system, and displays the operation screen of the cooperation destination system on the UI unit 156. In the following discussion, for convenience of explanation, the operation screen of the cooperation destination system is referred to as a "system operation screen".
Fig. 6 illustrates an example of the system operation screen G1. The system operation screen G1 includes text boxes T1, T2, ... into which the user enters text, and a radio button R1 with which the user selects one of several items. The user can input text or select an item on the system operation screen using the UI unit 156. According to the exemplary embodiment, whenever the user enters text or selects an item and the system operation screen is updated in response to a screen transition, such changes are notified to the document processing system 102 via the OS 110.
The user inputs various information on the system operation screen, and executes the job while opening files or folders stored on the workspace portion 103. For example, the user can open a document file included in the workspace portion 103 and input text while referring to the content of the document file. The user can also perform, on the system operation screen, an operation of transmitting a file included in the workspace portion 103 to the server 20.
Fig. 7 illustrates an example of a screen displayed on the UI unit 156. In the example of fig. 7, a system operation screen G1, a local operation screen G2, and a document file G3 are displayed on the UI unit 156. The display area on which the system operation screen G1 is displayed is an example of the first display area in the exemplary embodiment of the present invention. The display area on which the partial operation screen G2 is displayed is an example of the second display area in the exemplary embodiment of the present invention.
In the exemplary embodiment, if the cooperation destination system client 101 has been started from the operation screen of the document processing system 102, the OS 110 notifies the document processing system 102 of updates to the system operation screen and of information related to the system operation screen (such as the display position and the display size of the operation screen).
Fig. 8 is a flowchart illustrating the storing process of a screenshot to be executed by the document processing system 102. In step S101, the document processing system 102 determines whether an update of the system operation screen has been detected. Upon determining that an update has been detected, the document processing system 102 proceeds to step S102. If no update is detected, the document processing system 102 returns to step S101 and stands by until an update is detected.
In step S102, the document processing system 102 acquires a screenshot of the system operation screen (hereinafter simply referred to as the "screenshot"). According to the exemplary embodiment, the document processing system 102 acquires, from the OS 110, information on the display area of the operation screen of the cooperation destination system client 101 that the document processing system 102 started, and requests the OS 110 to capture a screenshot of the display area of the system operation screen using the acquired information. In response to the request from the document processing system 102, the OS 110 generates a screenshot of the system operation screen and then hands the screenshot over to the document processing system 102.
In step S103, the document processing system 102 acquires the production time and date of the screenshot. In step S104, the document processing system 102 acquires an address corresponding to the system operation screen. According to the exemplary embodiment, the document processing system 102 obtains the Uniform Resource Locator (URL) of the system operation screen. In step S105, the document processing system 102 stores, in the properties of the screenshot, time information indicating the production time and date acquired in step S103 and the URL acquired in step S104.
Fig. 9 illustrates an example of the contents of the properties of the screenshot. Referring to fig. 9, the properties of the screenshot include "production time and date" and "URL". The "production time and date" is time information indicating the time and date at which the screenshot was produced. The "URL" is the URL of the screen from which the screenshot was captured. The screenshots acquired in steps S102 to S105 of fig. 8 are temporarily stored on a predetermined storage area of the storage 154.
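The flow of steps S102 to S105 can be sketched in a few lines of Python. This is only an illustration: the patent does not say how the screenshot is captured or where the fig. 9 properties live, so the sketch assumes Pillow's ImageGrab for the capture and a sidecar JSON file for the properties, and every name below is hypothetical.

import json
from datetime import datetime
from pathlib import Path

from PIL import ImageGrab  # Pillow; one possible way to capture a region of the screen


def capture_system_screen(display_area, url, temp_dir=Path("temp_screenshots")):
    """Steps S102 to S105: capture the display area of the system operation screen and
    record the fig. 9 properties (production time and date, URL) next to the image.

    display_area is (left, top, right, bottom) in screen coordinates, standing in for the
    display-area information that the document processing system 102 obtains from the OS 110.
    """
    temp_dir.mkdir(parents=True, exist_ok=True)
    image = ImageGrab.grab(bbox=display_area)        # S102: acquire the screenshot
    produced_at = datetime.now()                     # S103: production time and date
    stem = produced_at.strftime("%Y%m%d_%H%M%S_%f")
    image.save(temp_dir / f"{stem}.png")
    properties = {                                   # S104-S105: properties of fig. 9
        "production_time_and_date": produced_at.isoformat(),
        "url": url,
    }
    (temp_dir / f"{stem}.json").write_text(json.dumps(properties, indent=2))
    return temp_dir / f"{stem}.png"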
In step S106, the document processing system 102 determines whether the user has completed the job. For example, when the system operation screen is closed, the document processing system 102 may determine that the user has completed the job. If the document processing system 102 determines in step S106 that the user has completed the job, the document processing system 102 ends the processing. If the document processing system 102 determines that the user has not completed the job, the document processing system 102 returns to step S101.
Fig. 10 is a flowchart illustrating the storing process of a file to be executed by the document processing system 102. The process of fig. 10 and the process of fig. 8 are performed in parallel by the document processing system 102. In step S201, the document processing system 102 determines whether storage of a file has been detected. Upon determining that the storage of a file has been detected, the document processing system 102 proceeds to step S202. Upon determining that no storage of a file has been detected, the document processing system 102 returns to step S201 and stands by until the storage of a file is detected.
In step S202, the document processing system 102 updates the properties of the stored file with time information indicating the update time and date attached.
Fig. 11 illustrates an example of the contents of the properties of a file. Referring to fig. 11, time information including a plurality of times and dates is included in the properties of the file. According to the exemplary embodiment, each time a file stored on the workspace portion 103 is updated, time information indicating the update time and date is added to the properties of the file.
Returning to fig. 10, the document processing system 102 determines in step S203 whether the user has completed the job. For example, when the local operation screen is closed, the document processing system 102 may determine that the user has completed the job. In another example, when a file stored on the workspace portion 103 of the document processing system 102 is closed, the document processing system 102 may determine that the user has completed the job. In yet another example, the document processing system 102 may determine that the user has completed the job when, with a document file included in the workspace portion 103 displayed on the UI unit 156, the displayed page of the document file is changed to another page. Upon determining in step S203 that the user has completed the job, the document processing system 102 proceeds to step S204. Upon determining in step S203 that the user has not completed the job, the document processing system 102 returns to step S201.
In step S204, the document processing system 102 stores the set of screenshots acquired in steps S102 to S105 of fig. 8 on the storage area allocated to the workspace portion 103. The screenshots stored on the workspace portion 103 and the objects in the workspace portion 103 are associated with each other via the relationship between the production times and dates of the screenshots and the update times and dates of the objects. More specifically, according to the exemplary embodiment, when the document processing system 102 determines that the user has completed the job in response to an operation such as closing the local operation screen, the captured screenshots are associated with the object and then stored on the workspace portion 103. Storing the screenshot and the object in association with each other means storing at least one of them together with information indicating the relationship between them (the update times and dates in the above example). After the operation in step S204, the document processing system 102 ends the processing.
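Steps S202 and S204 might then look like the following sketch, which keeps the sidecar-JSON and plain-folder assumptions of the previous sketch; the function names and layout are not taken from the patent.

import json
import shutil
from datetime import datetime
from pathlib import Path


def record_file_update(properties_path: Path) -> None:
    """Step S202: append the update time and date to the file's properties (fig. 11)."""
    props = json.loads(properties_path.read_text()) if properties_path.exists() else {"update_times": []}
    props["update_times"].append(datetime.now().isoformat())
    properties_path.write_text(json.dumps(props, indent=2))


def store_screenshots_on_workspace(temp_dir: Path, workspace_dir: Path) -> None:
    """Step S204: move the temporarily stored screenshots to the workspace storage area.

    The screenshots and the objects are associated only through the relationship between
    the screenshot production times and the object update times recorded above.
    """
    workspace_dir.mkdir(parents=True, exist_ok=True)
    for item in temp_dir.iterdir():
        shutil.move(str(item), str(workspace_dir / item.name))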
For some reason, the user may pause a job that uses the cooperation destination system. In this case, the user closes the system operation screen or the local operation screen. If the user later resumes the job, the user opens the system operation screen or the local operation screen again. In other words, the user selects one of the files or folders stored on the workspace portion 103 using the local operation screen.
Fig. 12 is a flowchart illustrating the processing executed when a job is resumed. The process of fig. 12 is triggered by opening a file in the workspace portion 103. In step S301, the document processing system 102 compares the production times and dates of the screenshots stored on the workspace portion 103 with the update time and date of the opened file, and selects a screenshot whose time difference from the update time and date of the file satisfies a certain condition. More specifically, the document processing system 102 may select the screenshot whose production time and date differs least from the update time and date of the file. Alternatively, the document processing system 102 may select a screenshot whose time difference from the update time and date of the file is equal to or less than a predetermined threshold.
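Step S301 reduces to a small selection function. The sketch below assumes the sidecar-JSON layout used in the earlier sketches and shows both rules described above (smallest time difference, with an optional threshold); it is an illustration rather than the patented implementation.

import json
from datetime import datetime, timedelta
from pathlib import Path
from typing import Optional


def select_screenshot(workspace_dir: Path, file_update_time: datetime,
                      threshold: Optional[timedelta] = None):
    """Step S301: pick the stored screenshot whose production time and date is closest to
    the file's update time and date, optionally requiring the gap to stay within a threshold."""
    candidates = []
    for props_path in workspace_dir.glob("*.json"):
        props = json.loads(props_path.read_text())
        if "production_time_and_date" not in props:
            continue                                  # skip JSON files that are not screenshot properties
        produced_at = datetime.fromisoformat(props["production_time_and_date"])
        diff = abs(produced_at - file_update_time)
        candidates.append((diff, props_path.with_suffix(".png"), props["url"]))
    if not candidates:
        return None
    diff, screenshot_path, url = min(candidates, key=lambda c: c[0])  # smallest time difference
    if threshold is not None and diff > threshold:
        return None                                   # nothing is close enough in time
    return screenshot_path, url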
In step S302, the document processing system 102 displays the screenshot selected in step S301 on the UI unit 156, while also displaying, on the UI unit 156, a system operation screen obtained by accessing the URL attached to the selected screenshot. In other words, in the exemplary embodiment, when the user performs an operation of selecting one of the files or folders stored on the workspace portion 103, the document processing system 102 displays the screenshot associated with the selected file or folder on the UI unit 156.
Fig. 13 illustrates an example of a screen displayed on the UI unit 156. As illustrated in fig. 13, when the user performs an operation of opening a file, the document processing system 102 opens the file G11 selected by the user while displaying the screenshot G12 associated with the file G11. The document processing system 102 also displays, on the UI unit 156, the system operation screen G13 obtained by accessing the URL attached to the screenshot.
According to the exemplary embodiment described above, when the user pauses the job by closing a file in the workspace portion 103, the screenshot of the system operation screen at the time of the pause is associated with the file and then stored. When the user resumes the job by opening the file again, the screenshot associated with the file is displayed. In this way, the display state at the time the user was executing the job using the file is reproduced, and the user can recall what job he or she was executing, and on what operation screen, using the file.
When the document processing system 102 displays a screenshot in the exemplary embodiment, the system operation screen obtained by accessing the URL attached to the screenshot is also displayed. In this way, when resuming the job, the user does not have to perform the operation of displaying the system operation screen again.
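As a usage sketch of the resume step S302, the flow could look like the following, reusing select_screenshot from the sketch above; the webbrowser module merely stands in for the cooperation destination system client 101, and the file and folder names are illustrative.

import webbrowser
from datetime import datetime
from pathlib import Path

workspace = Path("workspace_103")                     # hypothetical workspace folder
opened_file = workspace / "report.docx"               # the file the user reopens (illustrative)

update_time = datetime.fromtimestamp(opened_file.stat().st_mtime)
selected = select_screenshot(workspace, update_time)  # from the sketch above
if selected is not None:
    screenshot_path, url = selected
    webbrowser.open(screenshot_path.resolve().as_uri())  # show the stored screenshot (G12)
    webbrowser.open(url)                                  # reopen the system operation screen (G13)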
Operation example 2
Operation example 2 is described below. In operation example 2, the process of fig. 14 is executed instead of the process of fig. 10 of operation example 1, and the process of fig. 15 is executed instead of the process of fig. 12 of operation example 1.
FIG. 14 is a flowchart illustrating a process of storing a file to be executed by the document processing system 102. The process of fig. 14 differs from the process of fig. 10 in that the operation in step S201 is replaced with the operation in step S401, and the operation in step S202 is replaced with the operation in step S402. In step S401, the document processing system 102 determines whether storage of a file or transition between pages in a file has been detected. More specifically, the user opens a file (such as a document file) in the workspace portion 103, and the document processing system 102 detects a transition whenever a page in the opened file is changed to another page. If the storage of the file or the transition between pages in the file has been detected, the document processing system 102 proceeds to step S402.
In step S402, the document processing system 102 stores the properties of the file with time information indicating the update time and date attached. When a transition between pages is detected, the document processing system 102 stores information indicating the page before the transition in association with time information indicating the update time and date.
Fig. 15 is a flowchart illustrating the processing performed when a job is resumed. The process of fig. 15 is triggered by opening a file in the workspace portion 103. The process of fig. 15 is also executed when a displayed page of the file is changed to another page. The process of fig. 15 differs from the process of fig. 12 in that the operation in step S301 is replaced with the operation in step S501. In step S501, the document processing system 102 compares the update time and date of the displayed page of the opened file with the production times and dates of the screenshots stored on the workspace portion 103, and selects a screenshot whose time difference satisfies a predetermined condition. For example, the document processing system 102 may select the screenshot whose production time and date differs least from the update time and date of the displayed page.
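The only change from operation example 1 is the time against which the screenshots are compared. A minimal sketch, assuming the per-page update times of step S402 are kept as a dictionary in the file's properties (an illustrative layout):

import json
from datetime import datetime
from pathlib import Path


def update_time_of_page(properties_path: Path, page_number: int) -> datetime:
    """Step S402 records an update time per page; step S501 reads back the time of the
    currently displayed page and compares it with the screenshot production times."""
    props = json.loads(properties_path.read_text())
    return datetime.fromisoformat(props["page_update_times"][str(page_number)])

# The selection itself is unchanged from operation example 1: reuse select_screenshot with
# the displayed page's update time instead of the file's update time, e.g.
#   select_screenshot(workspace, update_time_of_page(properties_path, displayed_page))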
In this operation example, when the user opens a file in the workspace portion 103 to resume the job, a screenshot corresponding to the displayed page of the file is displayed. When the displayed page of the file is changed to another page, the displayed screenshot is updated along with the page transition. In this way, the display state in which the user executed the job is reproduced for each opened page of the file.
Modifications of the invention
The exemplary embodiment is one of exemplary embodiments of the present invention, and may be modified as described below. The exemplary embodiments and the modifications described below may be combined as appropriate.
First modification
According to the exemplary embodiment, the document processing system 102 takes a screenshot when an update of the system operation screen has been detected. The timing of capturing the screenshot is not limited to this. For example, the document processing system 102 may obtain the screenshot when the system operation screen is closed. In another example, the document processing system 102 may take a screenshot each time a transition from one screen to another on the system operation screen is detected. In yet another example, the document processing system 102 may obtain a screenshot of the system operation screen when an update of the local operation screen is detected. More specifically, when a specific button of a toolbar displayed on the local operation screen is pressed, or when a predetermined comment is added to a document file included in the workspace portion 103, the document processing system 102 may acquire a screenshot of the system operation screen.
While the local operation screen is displayed on the UI unit 156, the document processing system 102 may acquire a screenshot and store it on the workspace portion 103 at each predetermined timing. More specifically, the document processing system 102 periodically polls the system operation screen to obtain screenshots.
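Such periodic acquisition might be sketched as follows, reusing capture_system_screen from the earlier sketch; the interval is an arbitrary illustrative value.

import threading


def poll_screenshots(display_area, url, interval_seconds=30.0, stop_event=None):
    """First modification: capture the system operation screen at each predetermined timing
    while the local operation screen is displayed."""
    stop_event = stop_event or threading.Event()

    def tick():
        if stop_event.is_set():
            return
        capture_system_screen(display_area, url)         # from the earlier sketch
        threading.Timer(interval_seconds, tick).start()  # schedule the next capture

    tick()
    return stop_event  # set() this event when the local operation screen is closed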
Second modification
In the exemplary embodiment, the document processing system 102 stores, in the properties of the screenshot, time information indicating the production time and date and a URL. The method of storing the time information and the URL is not limited to this. For example, a database configured to store the production time and date and the URL of each screenshot may be provided, and the document processing system 102 may store the production time and date and the URL on the database.
According to the exemplary embodiment, the document processing system 102 includes time information indicating the update time and date in the properties of the document file. The method of storing information about the update time and date of the file is not limited to this. For example, a database configured to store the update times and dates of files may be provided, and the document processing system 102 may store information indicating the update time and date on the database.
Third modification
Referring to step S204 of fig. 10, the document processing system 102 stores the set of screenshots on the storage area allocated to the workspace portion 103. The timing of storing the screenshots is not limited to this. For example, each time a screenshot is captured, the document processing system 102 may store the captured screenshot on the workspace portion 103 instead of storing it on a temporary storage area.
Fourth modification
According to the exemplary embodiment, the update time and date of the file is compared with the production time and date of the screenshot to determine the correspondence between them. The method of associating the file with the screenshot is not limited to this. For example, the document processing system 102 may store an identifier identifying the file and an identifier identifying the screenshot in association with each other on a predetermined database. Any storage method is acceptable as long as the file is associated with the screenshot.
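The identifier-based association mentioned above could be kept in a small table, for example with sqlite3; the table and column names are assumptions made only for this sketch.

import sqlite3
from typing import List


def associate(db_path: str, file_id: str, screenshot_id: str) -> None:
    """Fourth modification: record that a file and a screenshot belong to the same job."""
    with sqlite3.connect(db_path) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS association "
                     "(file_id TEXT, screenshot_id TEXT)")
        conn.execute("INSERT INTO association VALUES (?, ?)", (file_id, screenshot_id))


def screenshots_for(db_path: str, file_id: str) -> List[str]:
    """Look up the screenshots associated with a file when the job is resumed."""
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute("SELECT screenshot_id FROM association WHERE file_id = ?",
                            (file_id,)).fetchall()
    return [row[0] for row in rows]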
Fifth modification
According to the exemplary embodiment, a screenshot is displayed when a file in the workspace portion 103 is opened. The timing of displaying the screenshot is not limited to this. For example, the document processing system 102 may display the screenshot and the file when the local operation screen is opened. In another example, the document processing system 102 may display the screenshot and the file when a folder included in the workspace portion 103 is opened.
Sixth modification
According to the exemplary embodiment, when displaying the screenshot, the document processing system 102 also displays the actual system operation screen by accessing the URL. Alternatively, only the screenshot and the file may be displayed. In other words, the system operation screen obtained by accessing the URL does not necessarily have to be displayed.
Seventh modification
According to the exemplary embodiment, the number of files associated with a screenshot is not limited to one; a plurality of files may be associated with a single screenshot. Nor is the association limited to files; one or more folders may be associated with a screenshot.
If there are multiple workspaces, the sets of screenshots may be stored collectively on a common storage area instead of being stored separately on each workspace. More specifically, the screenshots need not be stored on a per-workspace basis. In this case, the document processing system 102 stores the storage destination and the file name of each screenshot on the workspace portion 103, instead of performing the operation of storing the set of screenshots on the workspace portion 103 (the operation in step S204 of fig. 10). To store the storage destination and the file name, the document processing system 102 may describe them in a predetermined settings file. Alternatively, a database configured to store the storage destination and the file name may be provided.
In this modification, when the user restarts the job, the document processing system 102 acquires the storage destination and the file name of the screen shot associated with the file or folder selected by the user, and reads and displays the screen shot using the acquired storage destination and file name.
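A minimal sketch of this modification, assuming the pointers are kept in a JSON settings file (the file name and layout are illustrative):

import json
from pathlib import Path

SETTINGS = Path("screenshot_locations.json")  # hypothetical settings file


def register_screenshot(workspace_object: str, storage_destination: str, file_name: str) -> None:
    """Seventh modification: keep only the storage destination and file name of a screenshot
    instead of storing the screenshot itself on the workspace portion 103."""
    entries = json.loads(SETTINGS.read_text()) if SETTINGS.exists() else {}
    entries.setdefault(workspace_object, []).append(
        {"storage_destination": storage_destination, "file_name": file_name})
    SETTINGS.write_text(json.dumps(entries, indent=2))


def locate_screenshots(workspace_object: str) -> list:
    """On resume, resolve the screenshots associated with the selected file or folder."""
    entries = json.loads(SETTINGS.read_text()) if SETTINGS.exists() else {}
    return [Path(e["storage_destination"]) / e["file_name"]
            for e in entries.get(workspace_object, [])]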
Eighth modification
According to the exemplary embodiment, the digital document converter 104 in the document processing system 102 takes the screenshot of the system operation screen. The element configured to take the screenshot is not limited to the digital document converter 104. For example, the workspace portion 103 may take the screenshot of the system operation screen.
According to the exemplary embodiment, the document processing system 102 obtains a screenshot of the display area of the system operation screen. The display area of which the screenshot is taken is not limited to this area. For example, a screenshot of the entire display area of the UI unit 156 may be acquired.
Ninth modification
According to the exemplary embodiment, the program executed by the CPU 151 in the client device 10 may be downloaded via a communication network (such as the internet). The program may also be distributed in recorded form on a computer-readable recording medium, such as a magnetic recording medium (a magnetic tape or a magnetic disk), an optical recording medium (an optical disc), a magneto-optical recording medium, or a semiconductor memory.
The foregoing description of the exemplary embodiments of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. It is evident that many modifications and variations will be apparent to those skilled in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to understand the invention for various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.

Claims (29)

1. A data processing apparatus, the data processing apparatus comprising:
a first display controller that displays, on a first display area of a display, an operation screen for an operation of another apparatus connected to the data processing apparatus via a network, and displays, on a second display area of the display, an operation screen for operating a first object selected from a set of at least one object related to the operation;
a storage controller that stores, on a memory, an image of the operation screen displayed on the first display area in association with the first object at a predetermined timing; and
a second display controller that displays the image associated with the first object on the display when a predetermined operation is performed with respect to the first object or the set.
2. The data processing apparatus according to claim 1, wherein the storage controller stores the image of the operation screen displayed on the first display area on the memory in association with the first object when the operation screen is closed on the second display area.
3. The data processing apparatus according to claim 1, wherein the storage controller stores the image of the operation screen on the memory in association with the first object when the operation screen is closed on the first display area.
4. The data processing apparatus according to claim 1, wherein the storage controller stores the image of the operation screen displayed on the first display area on the memory in association with the first object at respective predetermined timings while the operation screen is displayed on the second display area.
5. The data processing device of claim 1, wherein the set is stored on the memory.
6. The data processing device of claim 2, wherein the set is stored on the memory.
7. The data processing device of claim 3, wherein the set is stored on the memory.
8. The data processing device of claim 4, wherein the set is stored on the memory.
9. The data processing apparatus according to claim 1, wherein the second display controller displays the image associated with the first object on the display when the first object is selected from the set.
10. The data processing apparatus according to claim 2, wherein the second display controller displays the image associated with the first object on the display when the first object is selected from the set.
11. The data processing apparatus according to claim 3, wherein the second display controller displays the image associated with the first object on the display when the first object is selected from the set.
12. The data processing apparatus according to claim 4, wherein the second display controller displays the image associated with the first object on the display when the first object is selected from the set.
13. The data processing apparatus according to claim 5, wherein the second display controller displays the image associated with the first object on the display when the first object is selected from the set.
14. The data processing device of claim 1, wherein the set includes image objects indicative of the operations to be performed on the objects.
15. The data processing apparatus according to claim 2, wherein the set comprises image objects indicative of the operations to be performed on the objects.
16. The data processing apparatus according to claim 3, wherein the set comprises image objects indicative of the operations to be performed on the objects.
17. The data processing apparatus of claim 4, wherein the set comprises image objects indicative of the operations to be performed on the objects.
18. The data processing apparatus according to claim 5, wherein the set comprises image objects indicative of the operations to be performed on the objects.
19. The data processing apparatus of claim 6, wherein the set comprises image objects indicative of the operations to be performed on the objects.
20. The data processing apparatus according to claim 1, wherein the storage controller stores an address in response to the operation screen displayed on the first display area in association with the first object, and
wherein the second display controller displays an operation screen obtained by accessing the address associated with the first object, and the first object and the image.
21. The data processing apparatus according to claim 2, wherein the storage controller stores an address in response to the operation screen displayed on the first display area in association with the first object, and
wherein the second display controller displays an operation screen obtained by accessing the address associated with the first object, and the first object and the image.
22. The data processing apparatus according to claim 3, wherein the storage controller stores an address in response to the operation screen displayed on the first display area in association with the first object, and
wherein the second display controller displays an operation screen obtained by accessing the address associated with the first object, and the first object and the image.
23. The data processing apparatus according to claim 4, wherein the storage controller stores an address in response to the operation screen displayed on the first display area in association with the first object, and
wherein the second display controller displays an operation screen obtained by accessing the address associated with the first object, and the first object and the image.
24. The data processing apparatus according to claim 5, wherein the storage controller stores an address in response to the operation screen displayed on the first display area in association with the first object, and
wherein the second display controller displays an operation screen obtained by accessing the address associated with the first object, and the first object and the image.
25. The data processing apparatus according to claim 6, wherein the storage controller stores an address in response to the operation screen displayed on the first display area in association with the first object, and
wherein the second display controller displays an operation screen obtained by accessing the address associated with the first object, and the first object and the image.
26. The data processing apparatus according to claim 7, wherein the storage controller stores an address in response to the operation screen displayed on the first display area in association with the first object, and
wherein the second display controller displays an operation screen obtained by accessing the address associated with the first object, and the first object and the image.
27. The data processing apparatus according to claim 4, wherein the storage controller stores the image with attached time information and information indicating an update time of the first object on the memory, and
wherein the second display controller displays an image attached with the time information on the display, a time difference of the time information from the update time of the first object being equal to or shorter than a predetermined threshold.
28. The data processing apparatus according to claim 4, wherein the storage controller stores the image with attached time information and information indicating update time of each page of the first object on the memory, and
wherein the second display controller displays an image attached with the time information having a time difference from the update time of the page to be displayed in the first object on the display, the time difference being equal to or shorter than a predetermined threshold.
29. A data processing method of a data processing apparatus, the data processing method comprising the steps of:
displaying, on a first display area of a display, an operation screen of an operation of another apparatus connected to the data processing apparatus via a network, and displaying, on a second display area of the display, an operation screen of operating a first object selected from a set of at least one object related to the operation;
storing, on a memory, an image of the operation screen displayed on the first display area in association with the first object at a predetermined timing; and
displaying the image associated with the first object on the display while performing a predetermined operation on the first object or the set.
CN201611110124.0A 2016-06-01 2016-12-06 Data processing apparatus and data processing method Active CN107450793B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-109805 2016-06-01
JP2016109805A JP6668953B2 (en) 2016-06-01 2016-06-01 Data processing device and program

Publications (2)

Publication Number Publication Date
CN107450793A CN107450793A (en) 2017-12-08
CN107450793B true CN107450793B (en) 2021-10-22

Family

ID=60482217

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611110124.0A Active CN107450793B (en) 2016-06-01 2016-12-06 Data processing apparatus and data processing method

Country Status (3)

Country Link
US (1) US10365880B2 (en)
JP (1) JP6668953B2 (en)
CN (1) CN107450793B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11553316B2 (en) * 2017-05-16 2023-01-10 Angel Pena Method and apparatus for storing and sending a computer location
JP7167476B2 (en) * 2018-04-18 2022-11-09 富士フイルムビジネスイノベーション株式会社 Information processing system, information processing device and program
CN110209324B (en) * 2019-04-30 2020-11-10 维沃移动通信有限公司 Display method and terminal equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102279791A (en) * 2010-06-11 2011-12-14 微软公司 User interface inventory
CN103049477A (en) * 2012-11-19 2013-04-17 腾讯科技(深圳)有限公司 Method and system for sharing streetscape views to social network site
CN103168299A (en) * 2010-06-18 2013-06-19 特拉克180公司 Information display
CN103428489A (en) * 2012-05-15 2013-12-04 奇扬网科股份有限公司 Sending device and method for sharing screen capture
CN103780652A (en) * 2012-10-23 2014-05-07 腾讯科技(深圳)有限公司 Micro-blog resource sharing method and system thereof
CN105204844A (en) * 2015-08-20 2015-12-30 上海斐讯数据通信技术有限公司 Application interface storage and recovery method and system, and electronic equipment

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4835118B2 (en) * 2005-11-16 2011-12-14 コニカミノルタビジネステクノロジーズ株式会社 Operation information recording method for image forming apparatus and image forming apparatus
JP2008217585A (en) * 2007-03-06 2008-09-18 Casio Comput Co Ltd Client device for server base computing system, and client control program
JP4718530B2 (en) 2007-08-24 2011-07-06 東芝テック株式会社 Screen creation device and screen creation program
JP5066499B2 (en) 2008-09-24 2012-11-07 株式会社日立ソリューションズ Web application operation procedure manual generation system
JP2013161302A (en) * 2012-02-06 2013-08-19 Pfu Ltd Information processing device, information processing method, and program
JP6127690B2 (en) * 2013-04-25 2017-05-17 富士ゼロックス株式会社 Information processing apparatus and program
JP6179177B2 (en) * 2013-05-07 2017-08-16 株式会社リコー Information processing program, information processing apparatus, and display control program
JP6337810B2 (en) * 2014-03-14 2018-06-06 オムロン株式会社 Information processing apparatus, information processing method, and program

Also Published As

Publication number Publication date
US10365880B2 (en) 2019-07-30
JP2017215827A (en) 2017-12-07
CN107450793A (en) 2017-12-08
JP6668953B2 (en) 2020-03-18
US20170351477A1 (en) 2017-12-07

Similar Documents

Publication Publication Date Title
CN103092665B (en) Immediate updating device and immediate updating method
JP6900557B2 (en) Information processing equipment and programs
CN107450793B (en) Data processing apparatus and data processing method
KR20160125401A (en) Inline and context aware query box
US20170272265A1 (en) Information processing apparatus, system, and information processing method
US20120041946A1 (en) Data search apparatus, control method thereof and computer readable storage medium
US20140172909A1 (en) Apparatus and method for providing service application using robot
KR101777035B1 (en) Method and device for filtering address, program and recording medium
WO2021233115A1 (en) Method and apparatus for modifying file name, and storage medium
CN112148395A (en) Page display method, device, equipment and storage medium
CN107341234B (en) Page display method and device and computer readable storage medium
US9537850B2 (en) Information processing apparatus, information processing method, and storage medium
JPWO2014097380A1 (en) Information processing apparatus, work environment cooperation method, and work environment cooperation program
JP6313987B2 (en) File management program, file management method, and file management system
CN112379800A (en) Electronic manual implementation method and device, readable storage medium and computer equipment
CN107766018B (en) Information processing apparatus, information processing method, and computer program
CN107544750B (en) Terminal device
JP6572679B2 (en) Information processing apparatus and program
JP2016091092A (en) Browser, control method of browser, and program
JP6540121B2 (en) Information management device and program
JP6024115B2 (en) Request management apparatus, request management method, and request management program
KR101545653B1 (en) Apparatus and method for providing searching service
JP6683042B2 (en) Data processing device, system and program
JP2018088199A (en) Information processing device, control method, and program
JP4698295B2 (en) Operation history storage system and method, program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
Address after: Tokyo, Japan
Applicant after: Fuji film business innovation Co.,Ltd.
Address before: Tokyo, Japan
Applicant before: Fuji Xerox Co.,Ltd.
GR01 Patent grant